# Observability
Playbooks AI provides comprehensive observability capabilities to help you monitor, debug, and optimize your AI agents. This is essential for building reliable and trustworthy AI systems, especially in production environments.
## Overview
The observability features in Playbooks AI enable you to:
- Monitor playbook execution and performance
- Track LLM interactions and token usage
- Debug complex agent workflows
- Gain insights into agent decision-making processes
- Measure and optimize costs
- Ensure compliance and auditability
## LangFuse Integration
Playbooks AI integrates with LangFuse, an open-source observability platform specifically designed for LLM applications. LangFuse provides tracing, evaluation, and analytics for your AI agents.
### Setting Up LangFuse
To enable LangFuse integration, you need to:
- Deploy your own LangFuse instance using Docker Compose, or set up an account on LangFuse Cloud.
- Configure your environment variables in the `.env` file.
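A minimal `.env` sketch is shown below. The values are placeholders, and the variable names follow LangFuse's standard client conventions; check the Playbooks AI documentation for the exact names it reads.

```shell
# Illustrative .env entries for LangFuse (placeholder values).
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
# Point at your self-hosted instance, or at https://cloud.langfuse.com
# if you are using LangFuse Cloud.
LANGFUSE_HOST=http://localhost:3000
```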
### What Playbooks AI Traces
When LangFuse integration is enabled, Playbooks AI automatically traces:
- Playbook Executions: Each markdown and Python playbook run with timing and context
- LLM Interactions: Prompts, completions, tokens, and latency
- User Interactions: Messages and responses
- Agent State Changes: Variables and context updates
- Errors and Exceptions: Problems encountered during execution
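Because each LLM interaction is traced with its token counts, you can roll usage up per playbook to measure and optimize costs. The sketch below assumes a simplified, hypothetical record shape (the real schema comes from LangFuse) and illustrative prices:

```python
from collections import defaultdict

# Hypothetical trace records: one per LLM interaction, tagged with the
# playbook that triggered it. Field names are illustrative, not the
# actual LangFuse schema.
traces = [
    {"playbook": "Greeting", "prompt_tokens": 120, "completion_tokens": 45},
    {"playbook": "Greeting", "prompt_tokens": 200, "completion_tokens": 80},
    {"playbook": "Checkout", "prompt_tokens": 340, "completion_tokens": 150},
]

# Assumed prices in USD per 1K tokens; substitute your model's rates.
PRICE_PER_1K_PROMPT = 0.0025
PRICE_PER_1K_COMPLETION = 0.01

def summarize(records):
    """Roll up token counts and estimated cost per playbook."""
    summary = defaultdict(lambda: {"prompt": 0, "completion": 0, "cost": 0.0})
    for rec in records:
        s = summary[rec["playbook"]]
        s["prompt"] += rec["prompt_tokens"]
        s["completion"] += rec["completion_tokens"]
        s["cost"] += (rec["prompt_tokens"] / 1000) * PRICE_PER_1K_PROMPT
        s["cost"] += (rec["completion_tokens"] / 1000) * PRICE_PER_1K_COMPLETION
    return dict(summary)

report = summarize(traces)
```

The same aggregation is available interactively in the LangFuse dashboard; a script like this is useful when you want cost numbers in CI checks or periodic reports.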
### Viewing and Analyzing Data
Once the integration is configured, run a Playbooks program and then view its traces in the LangFuse dashboard.