AgentMark is configured through two files: agentmark.json for project-level settings, and agentmark.client.ts (or agentmark_client.py) for runtime configuration like models, tools, and loaders.
agentmark.json
The agentmark.json file lives at your project root and configures your AgentMark application. It is read by both the CLI and the platform.
Basic Example
agentmark.json
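A minimal configuration using the default values described below might look like this sketch (not generated output):

```json
{
  "agentmarkPath": "/",
  "version": "2.0.0",
  "mdxVersion": "1.0"
}
```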
Configuration Properties
$schema (optional)
Points to the JSON Schema for editor autocompletion and validation.
agentmarkPath (required)
The base directory where AgentMark looks for prompts, components, and datasets. Default is "/", meaning the agentmark/ directory at your project root.
version (required)
The AgentMark configuration version. Use "2.0.0" for new projects.
mdxVersion (optional)
The prompt format version. Use "1.0" for the current format.
builtInModels (optional)
An array of model names that are available for use in prompts. These models are pre-configured with pricing and settings in the platform. Use the pull-models CLI command to interactively add models from supported providers.
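For instance, builtInModels is a flat list of model name strings; the names below are illustrative placeholders, not a verified list of supported models:

```json
{
  "builtInModels": ["gpt-4o", "claude-sonnet-4"]
}
```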
evals (optional)
An array of evaluation function names that correspond to evaluations registered in your client’s EvalRegistry. Listing them here makes them available for selection in the platform editor.
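As a sketch, each entry is simply the name under which the evaluation was registered; the names below are hypothetical:

```json
{
  "evals": ["answer-relevancy", "toxicity"]
}
```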
modelSchemas (optional)
Define custom model configurations with settings, pricing, and UI controls. Use this for models not covered by builtInModels, or to customize settings for existing models.
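A hedged sketch of the shape this might take; every field name below (settings, pricing, and their children) is an assumption, so consult the JSON Schema referenced by $schema for the real structure:

```json
{
  "modelSchemas": {
    "my-custom-model": {
      "settings": { "temperature": { "default": 0.7 } },
      "pricing": { "inputPerMTokens": 1.0, "outputPerMTokens": 3.0 }
    }
  }
}
```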
mcpServers (optional)
Configure Model Context Protocol (MCP) servers that your prompts can use as tools. Servers listed here are available for selection in the platform editor.
- URL / SSE
- Stdio
For remote MCP servers accessible via HTTP:
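A sketch following common MCP configuration conventions; the server name and the url field are illustrative assumptions:

```json
{
  "mcpServers": {
    "my-remote-server": {
      "url": "https://example.com/mcp"
    }
  }
}
```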
Full Example
agentmark.json
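Combining the properties above, a fuller configuration might look like the following sketch; the model, eval, and server names are placeholders:

```json
{
  "agentmarkPath": "/",
  "version": "2.0.0",
  "mdxVersion": "1.0",
  "builtInModels": ["gpt-4o"],
  "evals": ["toxicity"],
  "mcpServers": {
    "my-remote-server": { "url": "https://example.com/mcp" }
  }
}
```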
Client Configuration
The client configuration file (agentmark.client.ts or agentmark_client.py) defines your runtime setup: which models to use, what tools are available, how to load prompts, and which evaluations to run.
This file is auto-generated by npm create agentmark@latest and can be customized for your project.
- Cloud Mode
- Self-Hosted Mode
In cloud mode, prompts are loaded from the AgentMark API in production and from your local dev server during development:
agentmark.client.ts
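As an illustrative sketch only: the package name and createClient identifier below are assumptions, not the verified SDK API. The file generated by npm create agentmark@latest is the authoritative version.

```typescript
// Hypothetical sketch -- the package name and identifiers are assumptions.
// In cloud mode the client reads credentials from the environment and loads
// prompts from the AgentMark API (or the local dev server in development).
import { createClient } from "@agentmark/sdk";

export const client = createClient({
  apiKey: process.env.AGENTMARK_API_KEY, // from platform settings
  appId: process.env.AGENTMARK_APP_ID,   // from platform settings
});
```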
Environment Variables
| Variable | Required | Description |
|---|---|---|
| AGENTMARK_API_KEY | Cloud mode | API key from AgentMark platform settings |
| AGENTMARK_APP_ID | Cloud mode | App ID from AgentMark platform settings |
| AGENTMARK_BASE_URL | No | Override the local dev server URL (default: http://localhost:9418) |
| OPENAI_API_KEY | Depends on adapter | OpenAI API key for AI SDK, Mastra, or Pydantic AI adapters |
| ANTHROPIC_API_KEY | Depends on adapter | Anthropic API key for Claude Agent SDK adapter |
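For cloud mode these are typically exported in your shell or placed in a .env file; the values shown are placeholders:

```shell
export AGENTMARK_API_KEY="your-api-key"   # from platform settings
export AGENTMARK_APP_ID="your-app-id"     # from platform settings
export OPENAI_API_KEY="your-openai-key"   # only if your adapter requires it
```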
Have Questions?
We’re here to help! Choose the best way to reach us:
- Join our Discord community for quick answers and discussions
- Email us at hello@agentmark.co for support
- Schedule an Enterprise Demo to learn about our business solutions