Conversational Use Cases
Customer Service Agent
Engage in real-time customer support conversations that adapt to user emotions and behavior.
Customer Service Agent Configuration
This predefined persona is configured to provide real-time, empathetic customer support. It includes:
- Persona Identity: A professional customer service agent that helps users with real product or service issues. The agent speaks clearly and responds with empathy, adjusting based on how the user sounds or looks.
- Full Pipeline Mode: Enables the full Tavus conversational pipeline, including Perception, STT, LLM, and TTS.
- System Prompt: Tells the agent to act professionally and respond helpfully, while being aware of the user’s emotional state.
- Context: Describes a real customer support situation. The agent listens to the user’s issue, helps resolve it, and changes its tone or pace if the user seems frustrated or confused.
- Persona Layers:
  - LLM Layer: Uses the `resolve_customer_issue` tool to gather:
    - `product`: what the issue is about
    - `issue_description`: a short explanation of the problem
    - `urgency`: how serious the issue is (`low`, `medium`, or `high`)
  - Perception Layer: Uses the `raven-0` model to watch for signs like fidgeting, slouching, or facial expressions. If the user appears upset, it calls the `user_emotional_state` tool with:
    - `emotional_state`: what the user seems to feel (e.g., frustrated, calm)
    - `indicator`: what was observed (e.g., sighing, avoiding eye contact)
  - TTS Layer: Employs the `cartesia` voice engine with emotion control.
  - STT Layer: Uses the `tavus-advanced` engine with smart turn detection for seamless real-time conversations.
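The LLM-layer tool described above can be sketched as part of the persona configuration. This is a hedged example, not copied from this guide: the endpoint (`https://tavusapi.com/v2/personas`), the `pipeline_mode` field, and the OpenAI-style function schema are assumptions based on common Tavus persona conventions, and `<api_key>` is a placeholder for your own key.

```shell
# Sketch: creating a persona whose LLM layer defines the
# resolve_customer_issue tool. Endpoint and schema are assumed.
curl --request POST \
  --url https://tavusapi.com/v2/personas \
  --header "Content-Type: application/json" \
  --header "x-api-key: <api_key>" \
  --data '{
    "persona_name": "Customer Service Agent",
    "pipeline_mode": "full",
    "layers": {
      "llm": {
        "tools": [
          {
            "type": "function",
            "function": {
              "name": "resolve_customer_issue",
              "description": "Gather details about the customer issue",
              "parameters": {
                "type": "object",
                "properties": {
                  "product": { "type": "string", "description": "What the issue is about" },
                  "issue_description": { "type": "string", "description": "A short explanation of the problem" },
                  "urgency": { "type": "string", "enum": ["low", "medium", "high"] }
                },
                "required": ["product", "issue_description", "urgency"]
              }
            }
          }
        ]
      }
    }
  }'
```

The `enum` on `urgency` keeps the model's output constrained to the three severity levels the agent expects.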
Create a Conversation with the Customer Service Agent Persona
- Use the following cURL request body example:
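A minimal sketch of the request, assuming the standard Tavus conversation-creation endpoint (`https://tavusapi.com/v2/conversations`) and the `x-api-key` header; the `conversation_name` value is illustrative, and the two placeholders are explained below.

```shell
# Sketch: start a conversation with the Customer Service Agent persona.
# <api_key> and <customer_service_persona_id> are placeholders.
curl --request POST \
  --url https://tavusapi.com/v2/conversations \
  --header "Content-Type: application/json" \
  --header "x-api-key: <api_key>" \
  --data '{
    "persona_id": "<customer_service_persona_id>",
    "conversation_name": "Customer Support Session"
  }'
```

The response should include a `conversation_url` field, which is the link referenced in the final step.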
- Replace `<api_key>` with your actual API key. You can generate one by following the steps in the Quickstart guide.
- Replace `<customer_service_persona_id>` with the ID of the persona you created using the Customer Service Agent configuration.
- Click the link in the `conversation_url` field to join the conversation.