What Gets Tracked
AgentMark automatically collects:

Inference Spans - Full prompt execution lifecycle:
- Token usage and costs
- Response times
- Model information
- Completion status

Tool Call Spans:
- Tool name and parameters
- Execution duration
- Success/failure status
- Return values

Streaming Metrics:
- Time to first token
- Tokens per second
- Total streaming duration

Traces:
- Organize by user interaction
- Track multi-step workflows
- Monitor batch processing
- Analyze performance patterns
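The streaming metrics above are straightforward derivations from token arrival times. A minimal sketch of how they can be computed (the function and types here are illustrative, not AgentMark's API):

```typescript
interface StreamingMetrics {
  timeToFirstTokenMs: number;
  tokensPerSecond: number;
  totalDurationMs: number;
}

// Derive streaming metrics from the request start time and the arrival
// timestamp (ms) of each streamed token. Illustrative only.
function computeStreamingMetrics(
  requestStartMs: number,
  tokenTimestampsMs: number[],
): StreamingMetrics {
  if (tokenTimestampsMs.length === 0) {
    throw new Error("no tokens streamed");
  }
  const firstMs = tokenTimestampsMs[0];
  const lastMs = tokenTimestampsMs[tokenTimestampsMs.length - 1];
  const totalDurationMs = lastMs - requestStartMs;
  return {
    timeToFirstTokenMs: firstMs - requestStartMs,
    tokensPerSecond:
      totalDurationMs > 0
        ? (tokenTimestampsMs.length / totalDurationMs) * 1000
        : 0,
    totalDurationMs,
  };
}
```

For example, five tokens arriving at 100 ms intervals after a request at t=0 yield a 100 ms time to first token and 10 tokens per second.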
Quick Start
Enable telemetry when formatting your prompts:
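A sketch of enabling telemetry at format time. The option names (`isEnabled`, `functionId`, `metadata`) and the prompt variable are assumptions based on my reading of AgentMark's telemetry options; verify them against the current API reference:

```typescript
// Sketch only: `prompt` is a previously loaded AgentMark prompt, and the
// telemetry option names are assumptions — check the AgentMark docs.
const result = await prompt.format({
  props: { userMessage: "What is 2 + 2?" },
  telemetry: {
    isEnabled: true,
    functionId: "support-chat",       // label spans from this prompt
    metadata: { userId: "user-123" }, // custom attributes on each span
  },
});
```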
Setup
Initialize tracing in your application:
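Assuming AgentMark's spans and traces are OpenTelemetry-compatible (the span/trace vocabulary above suggests this, but confirm in the docs), a standard Node.js OpenTelemetry bootstrap might look like this; the collector URL is a placeholder:

```typescript
// Assumption: AgentMark emits OpenTelemetry-compatible spans, so a standard
// Node OTel SDK setup can export them. Replace the URL with your collector.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "https://otel-collector.example.com/v1/traces",
  }),
});

sdk.start(); // call once at startup, before formatting any prompts
```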
When to Use
Development:
- Debug prompt behavior
- Optimize token usage
- Understand execution flow
- Test different approaches
Production:
- Monitor performance
- Track costs
- Debug user issues
- Analyze usage patterns