Observability & Debugging¶
Playbooks AI provides comprehensive observability capabilities to help you monitor, debug, and optimize your AI agents. This is essential for building reliable and trustworthy AI systems, especially in production environments.
Overview¶
The observability features in Playbooks AI enable you to:
- Monitor playbook execution and performance
- Track LLM interactions and token usage
- Debug complex agent workflows
- Gain insights into agent decision-making processes
- Measure and optimize costs
- Ensure compliance and auditability
VSCode Debugger¶
Attach from VSCode using the Playbooks debug configuration (see VSCode Extension). You can set breakpoints in .pb files, step through execution, and inspect the call stack.
Session logs and events¶
Playbooks emits structured session logs and an event bus stream:
- Session logs: step execution, variable updates, playbook calls, LLM requests/responses
- Event bus: subscribe to `*` to print all events in a custom app (as sketched below); the Web Server and Playground consume this stream to visualize execution in real time
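To illustrate the wildcard-subscription pattern, here is a minimal, self-contained sketch: a handler registered on `*` receives and prints every event. The `EventBus` class below is a stand-in written for illustration only; the real Playbooks event bus exposes its own subscription API, so check the Playbooks source (for example, how the Web Server consumes the stream) for the actual calls.

```python
# Minimal sketch of a wildcard ("*") event subscription.
# NOTE: this EventBus is a hypothetical stand-in for illustration; the actual
# Playbooks event bus API may differ.
from collections import defaultdict
from typing import Any, Callable


class EventBus:
    """Toy event bus supporting topic-specific and "*" (all events) handlers."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[str, Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str, Any], None]) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Wildcard subscribers see every event, regardless of topic.
        for handler in self._handlers[topic] + self._handlers["*"]:
            handler(topic, payload)


bus = EventBus()
bus.subscribe("*", lambda topic, payload: print(f"[{topic}] {payload}"))

bus.publish("playbook.start", {"playbook": "CheckOrderStatus"})
bus.publish("llm.response", {"tokens": 412})
```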
Web Server & Playground¶
Start the server and open the HTML Playground to visualize agent messages and logs live. See Applications > Web Server and Applications > HTML Playground for details.
LangFuse Integration¶
Playbooks AI integrates with LangFuse, an open-source observability platform specifically designed for LLM applications. LangFuse provides tracing, evaluation, and analytics for your AI agents.
Setting Up LangFuse¶
To enable LangFuse integration, you need to:
- Deploy your own LangFuse instance using docker compose or set up an account on LangFuse Cloud.
- Configure your environment variables in the .env file (see the sketch after this list)
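As a quick sanity check, the sketch below loads the .env file and verifies that the standard LangFuse SDK variables (LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST) are set. It assumes the SDK's default variable names; confirm the exact names Playbooks expects against its integration docs.

```python
# Sanity-check LangFuse settings from .env (requires python-dotenv).
# Assumes the standard LangFuse SDK variable names; the names Playbooks reads
# may differ -- verify against the Playbooks integration docs.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

required = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY", "LANGFUSE_HOST")
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing LangFuse settings in .env: {', '.join(missing)}")
print("LangFuse environment variables are set.")
```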
What Playbooks AI Traces¶
When LangFuse integration is enabled, Playbooks AI automatically traces:
- Playbook Executions: Each markdown and Python playbook run with timing and context
- LLM Interactions: Prompts, completions, tokens, and latency
- User Interactions: Messages and responses
- Agent State Changes: Variables and context updates
- Errors and Exceptions: Problems encountered during execution
Viewing and analyzing data¶
After integrating with LangFuse, run a Playbooks program and then view the traces in the LangFuse dashboard.
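Traces can also be pulled programmatically. Below is a minimal sketch, assuming the LangFuse Python SDK's 2.x fetch_traces helper (the 3.x SDK exposes trace listing differently, so check your installed version); credentials are read from the environment variables configured above.

```python
# List recent traces via the LangFuse Python SDK (2.x).
# Credentials come from LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY /
# LANGFUSE_HOST in the environment.
from langfuse import Langfuse

langfuse = Langfuse()

# fetch_traces is the 2.x helper; adjust if you are on a different SDK version.
traces = langfuse.fetch_traces(limit=10)
for trace in traces.data:
    print(trace.id, trace.name, trace.timestamp)
```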