Tool calling is defined in the llm layer. It allows an AI agent to trigger functions based on user speech during a conversation.
You can use tool calling with our hosted models or any OpenAI-compatible custom LLM.
Defining a Tool
Top-Level Fields
Field | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Must be "function" to enable tool calling. |
function | object | ✅ | Defines the function that can be called by the LLM. Contains metadata and a strict schema for arguments. |
function
Field | Type | Required | Description |
---|---|---|---|
name | string | ✅ | A unique identifier for the function. Must be in snake_case. The model uses this to refer to the function when calling it. |
description | string | ✅ | A natural language explanation of what the function does. Helps the LLM decide when to call it. |
parameters | object | ✅ | A JSON Schema object that describes the expected structure of the function’s input arguments. |
function.parameters
Field | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Always "object". Indicates the expected input is a structured object. |
properties | object | ✅ | Defines each expected parameter and its corresponding type, constraints, and description. |
required | array of strings | ✅ | Specifies which parameters are mandatory for the function to execute. |
Each parameter should be included in the required list, even if it might seem optional in your code.
function.parameters.properties
Each key inside properties defines a single parameter the model must supply when calling the function.
Field | Type | Required | Description |
---|---|---|---|
<parameter_name> | object | ✅ | Each key is a named parameter (e.g., location). The value is a schema for that parameter. |
Each parameter schema supports the following subfields:
Subfield | Type | Required | Description |
---|---|---|---|
type | string | ✅ | Data type (e.g., string, number, boolean). |
description | string | ❌ | Explains what the parameter represents and how it should be used. |
enum | array | ❌ | Defines a strict list of allowed values for this parameter. Useful for categorical choices. |
Example Configuration
Here’s an example of tool calling in the llm layer.
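The snippet below is a minimal sketch rather than a complete persona configuration: the get_current_time function, its timezone and format parameters, and the layers.llm nesting are illustrative assumptions, so substitute your own function names, descriptions, and parameter schemas.
```json
{
  "layers": {
    "llm": {
      "tools": [
        {
          "type": "function",
          "function": {
            "name": "get_current_time",
            "description": "Returns the current time for a given timezone so the agent can answer time-related questions.",
            "parameters": {
              "type": "object",
              "properties": {
                "timezone": {
                  "type": "string",
                  "description": "IANA timezone name, e.g. America/New_York."
                },
                "format": {
                  "type": "string",
                  "enum": ["12h", "24h"],
                  "description": "Clock format to use when reporting the time."
                }
              },
              "required": ["timezone", "format"]
            }
          }
        }
      ]
    }
  }
}
```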
Best Practices:
- Use clear, specific function names to reduce ambiguity.
- Add detailed description fields to improve selection accuracy.
For more details on configuring this layer, see the LLM Layer documentation.
How Tool Calling Works
Tool calling is triggered during an active conversation when the LLM needs to invoke a function. Here’s how the process works, using the get_current_time function from the example configuration above.
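As a rough illustration (assuming an OpenAI-compatible tool-call payload; the id and argument values are hypothetical, and the timezone and format parameters come from the sketch configuration above), the model might emit something like this when a user asks for the time:
```json
{
  "id": "call_abc123",
  "type": "function",
  "function": {
    "name": "get_current_time",
    "arguments": "{\"timezone\": \"America/New_York\", \"format\": \"24h\"}"
  }
}
```
Your application would then parse the arguments string, run the matching function, and act on the result.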
Modify Existing Tools
You can update tools definitions using the Update Persona API.
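As a rough sketch of such an update (the endpoint path, persona_id placeholder, header name, and JSON Patch body shape are assumptions rather than the documented request format; check the Update Persona API reference for the exact details), the request might look like this:
```bash
# Hypothetical sketch: replace the tools array on an existing persona.
# The endpoint path, x-api-key header, and JSON Patch body are assumptions;
# consult the Update Persona API reference for the exact request shape.
curl --request PATCH \
  --url https://<api_base>/personas/<persona_id> \
  --header "Content-Type: application/json" \
  --header "x-api-key: <api_key>" \
  --data '[
    {
      "op": "replace",
      "path": "/layers/llm/tools",
      "value": [
        {
          "type": "function",
          "function": {
            "name": "get_current_time",
            "description": "Returns the current time for a given timezone.",
            "parameters": {
              "type": "object",
              "properties": {
                "timezone": {
                  "type": "string",
                  "description": "IANA timezone name, e.g. America/New_York."
                }
              },
              "required": ["timezone"]
            }
          }
        }
      ]
    }
  ]'
```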
Replace <api_key> with your actual API key. You can generate one in the Developer Portal.