LLM Tool
Learn how to configure LLM tool calling.
LLM tool calling works with OpenAI’s Function Calling and can be set up in the `llm` layer. It allows AI agents to trigger functions based on user speech during a conversation.
Defining a Tool
Top-Level Fields
Field | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Must be "function" to enable tool calling. |
function | object | ✅ | Defines the function that can be called by the LLM. Contains metadata and a strict schema for arguments. |
function
Field | Type | Required | Description |
---|---|---|---|
name | string | ✅ | A unique identifier for the function. Must be in `snake_case`. The model uses this to refer to the function when calling it. |
description | string | ✅ | A natural language explanation of what the function does. Helps the LLM decide when to call it. |
parameters | object | ✅ | A JSON Schema object that describes the expected structure of the function’s input arguments. |
function.parameters
Field | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Always "object". Indicates the expected input is a structured object. |
properties | object | ✅ | Defines each expected parameter and its corresponding type, constraints, and description. |
required | array of strings | ✅ | Specifies which parameters are mandatory for the function to execute. |
Each parameter should be included in the `required` list, even if it might seem optional in your code.
function.parameters.properties
Each key inside `properties` defines a single parameter the model must supply when calling the function.
Field | Type | Required | Description |
---|---|---|---|
<parameter_name> | object | ✅ | Each key is a named parameter (e.g., `location`). The value is a schema for that parameter. |
Optional subfields for each parameter:
Subfield | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Data type (e.g., `string`, `number`, `boolean`). |
description | string | ❌ | Explains what the parameter represents and how it should be used. |
enum | array | ❌ | Defines a strict list of allowed values for this parameter. Useful for categorical choices. |
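For instance, a categorical parameter can pair a description with an enum so the model can only supply one of a fixed set of values. The `format` parameter below is purely illustrative:

```json
{
  "format": {
    "type": "string",
    "description": "Clock format to use when reporting the time.",
    "enum": ["12h", "24h"]
  }
}
```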
Example Configuration
Here’s an example of tool calling in the `llm` layer:
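The sketch below is a minimal `tools` entry, assuming the array sits under `layers.llm` in the persona configuration; it defines the `get_current_time` function with a single `location` argument, and other `llm` settings are omitted for brevity.

```json
{
  "layers": {
    "llm": {
      "tools": [
        {
          "type": "function",
          "function": {
            "name": "get_current_time",
            "description": "Get the current local time for a given location.",
            "parameters": {
              "type": "object",
              "properties": {
                "location": {
                  "type": "string",
                  "description": "The city to get the time for, e.g. New York."
                }
              },
              "required": ["location"]
            }
          }
        }
      ]
    }
  }
}
```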
Best Practices:
- Use clear, specific function names to reduce ambiguity.
- Add detailed `description` fields to improve selection accuracy.
How Tool Calling Works
Tool calling is triggered during an active conversation when the LLM needs to invoke a function. Here’s how the process works:
This example explains the `get_current_time` function from the example configuration above.
Input Detected
The AI processes real-time speech input.
Example: The user says, “What time is it now in New York?”
Tool Matching
The LLM analyzes the input and identifies that the user’s question matches the purpose of the `get_current_time` function, which expects a `location` argument.
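Once matched, the model emits a tool call containing the function name and JSON-encoded arguments, following OpenAI’s Function Calling shape. The `id` value below is illustrative:

```json
{
  "id": "call_abc123",
  "type": "function",
  "function": {
    "name": "get_current_time",
    "arguments": "{\"location\": \"New York\"}"
  }
}
```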
Modify Existing Tools
You can update `tools` definitions using the Update Persona API.
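As a rough sketch, assuming the Update Persona API accepts JSON Patch-style operations and that the tool definitions live at `/layers/llm/tools`, a request body replacing them might look like the following; confirm the exact endpoint, operation, and path in the Update Persona API reference:

```json
[
  {
    "op": "replace",
    "path": "/layers/llm/tools",
    "value": [
      {
        "type": "function",
        "function": {
          "name": "get_current_time",
          "description": "Get the current local time for a given location.",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city to get the time for, e.g. New York."
              }
            },
            "required": ["location"]
          }
        }
      }
    ]
  }
]
```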