Annotations

Annotations allow you to manually add scores, labels, and contextual information to your traces and spans. This is useful for human-in-the-loop evaluation, debugging, and creating training datasets from production data.

What are Annotations?

Annotations are manual evaluations that you can add to any span in your traces. Unlike automated evaluations that run programmatically, annotations are created by team members directly in the AgentMark dashboard. Each annotation contains:
  • Name: A short title describing what you’re evaluating
  • Label: A categorical assessment (e.g., “correct”, “incorrect”, “regression”)
  • Score: A numeric value representing quality or performance
  • Reason: Detailed explanation of why you assigned this score and label
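To make the field meanings concrete, here is a minimal sketch of how an annotation could be represented as a data shape. The TypeScript interface below is illustrative only; the type name and field names are assumptions drawn from the list above, not AgentMark's actual data model.

```typescript
// Illustrative sketch only -- not AgentMark's actual data model.
// The fields mirror the four annotation fields described above.
interface Annotation {
  name: string;   // short title for what is being evaluated, e.g. "Response Quality"
  label: string;  // categorical assessment, e.g. "correct", "incorrect", "regression"
  score: number;  // numeric quality/performance value; decimals are allowed
  reason: string; // detailed explanation for the score and label
}
```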

Use Cases

Quality Assessment

Review production traces to identify issues and track improvements:
Name: Response Quality
Label: good
Score: 0.85
Reason: The response was accurate and well-formatted, but could have been more concise.
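Expressed as a structured record (useful if you later export annotated traces as a training dataset), the same assessment could look like the sketch below. The shape restates the illustrative interface from earlier and remains an assumption, not AgentMark's export format.

```typescript
// Illustrative only -- the Annotation shape here restates the sketch above.
type Annotation = { name: string; label: string; score: number; reason: string };

const responseQuality: Annotation = {
  name: "Response Quality",
  label: "good",
  score: 0.85,
  reason: "The response was accurate and well-formatted, but could have been more concise.",
};
```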

Adding Annotations

From the Traces View

  1. Navigate to the Traces page in your AgentMark dashboard
  2. Click on any trace to open the trace details drawer
  3. Select a span from the trace tree
  4. Click on the Evaluation tab
  5. Click the Add annotation button
  6. Fill in the annotation fields:
    • Name: Short identifier for this annotation
    • Label: Category or classification
    • Score: Numeric value (can be decimal)
    • Reason: Detailed explanation
  7. Click Save
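If you draft annotations in your own tooling before entering them in the dashboard, a small pre-save check can catch common mistakes. This is a generic sketch under assumed conventions: which fields are required is an assumption, the helper is not part of AgentMark, and the dashboard performs its own validation.

```typescript
// Hypothetical pre-save check, mirroring the fields in step 6.
// Not part of AgentMark; which fields are required is an assumption.
interface AnnotationDraft {
  name: string;
  label: string;
  score: number;
  reason: string;
}

function validateDraft(draft: AnnotationDraft): string[] {
  const problems: string[] = [];
  if (draft.name.trim() === "") problems.push("Name is required");
  if (draft.label.trim() === "") problems.push("Label is required");
  if (!Number.isFinite(draft.score)) problems.push("Score must be a number (decimals are allowed)");
  if (draft.reason.trim() === "") problems.push("Reason is required");
  return problems;
}

// Example: an empty reason is reported before saving.
console.log(validateDraft({ name: "Response Quality", label: "good", score: 0.85, reason: "" }));
// -> ["Reason is required"]
```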

Viewing Annotations

Annotations appear in the Evaluation tab alongside automated evaluation scores. They are distinguished by:
  • A filled badge labeled “annotation” (vs. “eval” for automated scores)
  • Different visual styling to make them easily identifiable
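If you work with evaluation results outside the dashboard (for example, to build a review queue or assemble a training dataset), it helps to separate manual annotations from automated scores. A minimal sketch follows, assuming each result carries a kind field matching the badge labels described above ("annotation" vs. "eval"); the shape is illustrative, not AgentMark's actual export format.

```typescript
// Illustrative shape for a span's evaluation results; "kind" mirrors the
// badge labels in the UI ("annotation" for manual, "eval" for automated).
// This is an assumption, not AgentMark's actual export format.
interface EvaluationResult {
  kind: "annotation" | "eval";
  name: string;
  label?: string;
  score?: number;
  reason?: string;
}

// Keep only the human-written annotations from a mixed list of results.
function manualAnnotations(results: EvaluationResult[]): EvaluationResult[] {
  return results.filter((r) => r.kind === "annotation");
}
```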
