AgentMark provides full Python support with multiple adapter options. Python projects benefit from the same prompt management, testing, and observability features as TypeScript projects.

Getting Started

The fastest way to create a Python project:
npm create agentmark@latest my-app
# Select "Python" when prompted for language
# Select your adapter: Pydantic AI or Claude Agent SDK

Project Structure

A Python AgentMark project has this structure:
my-app/
├── agentmark/                    # Prompt files
│   └── prompts/
│       └── party-planner.prompt.mdx
├── agentmark_client.py           # Python client configuration
├── main.py                       # Application entry point
├── .agentmark/
│   └── dev_server.py             # Auto-generated dev server
├── pyproject.toml                # Python dependencies
├── agentmark.json                # AgentMark configuration
└── .env                          # Environment variables
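
The generated .env file holds credentials for your model provider; load_dotenv() in agentmark_client.py reads it at startup. A minimal sketch, assuming OpenAI or Anthropic models (set only the keys your chosen provider requires):
.env
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...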

Supported Adapters

Choose the adapter that fits your use case:
Adapter          | Best For           | Features
Pydantic AI      | Type-safe LLM apps | Multi-model support, structured output, streaming
Claude Agent SDK | Agentic tasks      | Tool use, multi-turn reasoning, budget controls

Client Configuration

All Python projects require an agentmark_client.py file that configures the AgentMark client:
agentmark_client.py
from pathlib import Path
from dotenv import load_dotenv
from agentmark.prompt_core import FileLoader
from agentmark_pydantic_ai_v0 import (
    create_pydantic_ai_client,
    create_default_model_registry,
    PydanticAIToolRegistry,
)

# Load provider API keys and other settings from .env
load_dotenv()

# Default model registry and a tool registry for the Pydantic AI adapter
model_registry = create_default_model_registry()
tool_registry = PydanticAIToolRegistry()

# Resolve prompt files relative to this file's directory (the project root)
loader = FileLoader(base_dir=str(Path(__file__).parent.resolve()))

# Wire the registries and loader into the AgentMark client
client = create_pydantic_ai_client(
    model_registry=model_registry,
    tool_registry=tool_registry,
    loader=loader,
)

Running the Dev Server

Start the development server:
agentmark dev
The CLI automatically detects a Python project by the presence of:
  • pyproject.toml
  • agentmark_client.py
  • .agentmark/dev_server.py
See Dev Server for configuration options.

Python SDK

The agentmark-sdk package provides observability utilities for Python applications:
pip install agentmark-sdk

SDK Exports

from agentmark_sdk import (
    # Core SDK class
    AgentMarkSDK,

    # Tracing utilities
    trace,              # Trace an async operation
    trace_context,      # Context manager for tracing
    TraceOptions,       # Configuration for traces
    TraceContext,       # Context passed to traced functions
    TraceResult,        # Result with trace_id

    # Sampler for filtering spans
    AgentmarkSampler,
)

Initializing Tracing

from agentmark_sdk import AgentMarkSDK, trace, TraceOptions

# Initialize the SDK
sdk = AgentMarkSDK(
    api_key="sk-...",
    app_id="app_123",
)
sdk.init_tracing()

# Trace an operation
result = await trace(
    TraceOptions(name="my-operation", user_id="user-1"),
    my_async_function,
)

# Submit a score
await sdk.score(
    resource_id=result.trace_id,
    name="accuracy",
    score=0.95,
)
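
The callable passed to trace receives the active TraceContext, so it can attach data to its own span. A minimal sketch of such a function (the exact signature is an assumption based on the TraceContext export above; check your SDK version):
from agentmark_sdk import TraceContext

async def my_async_function(ctx: TraceContext) -> str:
    # Hypothetical workload; record a checkpoint event on the active span
    ctx.add_event("checkpoint", {"step": 1})
    return "done"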

TraceOptions

Configure tracing with these options:
Option                  | Type           | Description
name                    | str            | Name of the trace/span (required)
metadata                | dict[str, str] | Additional key-value pairs
session_id              | str            | Session identifier for grouping
session_name            | str            | Human-readable session name
user_id                 | str            | User identifier
dataset_run_id          | str            | Dataset run ID (for experiments)
dataset_run_name        | str            | Dataset run name
dataset_item_name       | str            | Dataset item name
dataset_expected_output | str            | Expected output for comparison
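
For example, a trace tagged for a dataset experiment and grouped under a session (all values are illustrative; my_async_function stands in for your own coroutine):
from agentmark_sdk import trace, TraceOptions

result = await trace(
    TraceOptions(
        name="summarize-article",
        user_id="user-1",
        session_id="session-42",
        session_name="Nightly eval",
        metadata={"env": "staging"},
        dataset_run_id="run_001",
        dataset_run_name="baseline-vs-v2",
        dataset_item_name="article-17",
        dataset_expected_output="A two-sentence summary of the article.",
    ),
    my_async_function,
)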

Context Manager API

For more control over the span lifecycle:
from agentmark_sdk import trace_context, TraceOptions

async with trace_context(TraceOptions(name="my-operation")) as ctx:
    print(f"Trace ID: {ctx.trace_id}")

    # Add custom attributes
    ctx.set_attribute("custom_key", "value")

    # Add events
    ctx.add_event("checkpoint", {"step": 1})

    # Create child spans
    async with ctx.span("sub-operation") as child:
        result = await perform_step()
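
Prefer trace_context when you need to set attributes, record events, or create child spans mid-operation; the plain trace helper is simpler when you only need the result and its trace_id.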

Requirements

  • Python 3.12 or higher
  • Node.js (for CLI and dev server orchestration)
  • API keys for your chosen model provider
