
What are Annotations?
Annotations are manual evaluations that you can add to any span in your traces. Unlike automated evaluations that run programmatically, annotations are created by team members directly in the AgentMark dashboard. Each annotation contains:
- Name: A short title describing what you’re evaluating
- Label: A categorical assessment (e.g., “correct”, “incorrect”, “regression”)
- Score: A numeric value representing quality or performance
- Reason: Detailed explanation of why you assigned this score and label
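The four fields above can be pictured as a simple record. This is a hypothetical sketch for illustration only; the field names mirror the list above, but the types and structure are assumptions, not AgentMark’s actual schema:

```typescript
// Hypothetical shape of an annotation record. Field names follow the
// four fields described above; the types are illustrative assumptions.
interface Annotation {
  name: string;   // short title describing what is being evaluated
  label: string;  // categorical assessment, e.g. "correct" or "regression"
  score: number;  // numeric quality value (decimals allowed)
  reason: string; // explanation for the score and label
}

const example: Annotation = {
  name: "response-accuracy",
  label: "incorrect",
  score: 0.25,
  reason: "The agent cited a deprecated API in its answer.",
};

console.log(example.label, example.score);
```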
Use Cases
Quality Assessment
Review production traces to identify issues and track improvements.
Adding Annotations
From the Traces View
- Navigate to the Traces page in your AgentMark dashboard
- Click on any trace to open the trace details drawer
- Select a span from the trace tree
- Click on the Evaluation tab
- Click the Add annotation button
- Fill in the annotation fields:
- Name: Short identifier for this annotation
- Label: Category or classification
- Score: Numeric value (can be decimal)
- Reason: Detailed explanation
- Click Save
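Before clicking Save, the fields from step 6 can be sanity-checked. The sketch below is a hypothetical client-side check, not AgentMark’s actual validation logic; the rules (non-empty text fields, finite numeric score) are assumptions:

```typescript
// Hypothetical pre-save check for the annotation fields in step 6.
// The rules here are illustrative assumptions, not AgentMark's
// actual validation.
function isValidAnnotation(fields: {
  name: string;
  label: string;
  score: number;
  reason: string;
}): boolean {
  return (
    fields.name.trim().length > 0 &&
    fields.label.trim().length > 0 &&
    Number.isFinite(fields.score) && // decimals such as 0.5 are allowed
    fields.reason.trim().length > 0
  );
}

console.log(
  isValidAnnotation({
    name: "latency",
    label: "regression",
    score: 0.5,
    reason: "p95 latency doubled after the last deploy.",
  })
); // true
```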
Viewing Annotations
Annotations appear in the Evaluation tab alongside automated evaluation scores. They are distinguished by:
- A filled badge labeled “annotation” (vs. “eval” for automated scores)
- Different visual styling to make them easily identifiable
Learn More
- Traces and Logs - Understanding trace data
- Automated Evaluations - Setting up automated scoring
- Datasets - Creating test datasets from traces
- Metrics - Monitoring system performance
Have Questions?
We’re here to help! Choose the best way to reach us:
- Join our Discord community for quick answers and discussions
- Email us at hello@agentmark.co for support
- Schedule an Enterprise Demo to learn about our business solutions