The default (fallback) adapter transforms prompts into AgentMark's raw configuration format without targeting a specific AI SDK. This lets you map the output directly to your preferred provider.
With the default adapter, you can map the parameters directly to your preferred provider (e.g., OpenAI, Ollama) or your own SDK, so you stay flexible while still using AgentMark's interface.
Installation
Usage
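As an illustration of what the loader consumes, here is a minimal sketch that reads a compiled prompt JSON straight from the build output. The directory layout and file names are assumptions for this sketch; in a real app, FileLoader handles this step for you:

```typescript
import { readFileSync, writeFileSync, mkdirSync } from "node:fs";
import { join } from "node:path";

// Read a pre-compiled prompt JSON from the build output directory.
// The path convention here is an illustrative assumption.
function loadCompiledPrompt(distDir: string, name: string): unknown {
  return JSON.parse(readFileSync(join(distDir, `${name}.json`), "utf8"));
}

// Demo: simulate build output, then load it back.
mkdirSync("./dist/agentmark", { recursive: true });
writeFileSync(
  "./dist/agentmark/greeting.prompt.json",
  JSON.stringify({ name: "greeting" })
);
const prompt = loadCompiledPrompt("./dist/agentmark", "greeting.prompt") as {
  name: string;
};
```

The key point the sketch demonstrates: the loader works on the compiled JSON artifacts, not on your source prompt files.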
FileLoader takes the output directory from npx agentmark build (typically ./dist/agentmark), not your source prompt directory. Prompts are pre-compiled to JSON by build before FileLoader reads them.
What it returns
The fallback adapter returns the raw prompt configuration as-is, without transforming it for a specific SDK. The output includes:
- Model configuration — model name, temperature, max tokens, and other parameters from your prompt's frontmatter
- Messages — the rendered system/user/assistant messages after template processing
- Tool references — tool names referenced in the prompt frontmatter
- Schema — for object prompts, the output schema
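Taken together, the pieces above can be sketched as a TypeScript type, along with one way to map the raw output onto a provider call. The field names are assumptions for illustration, not AgentMark's published types:

```typescript
// Illustrative shape of the default adapter's output. Field names are
// assumptions for this sketch, not AgentMark's published types.
type RawPromptConfig = {
  model: { name: string; temperature?: number; maxTokens?: number };
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  tools?: string[]; // tool names referenced in the frontmatter
  schema?: Record<string, unknown>; // output schema for object prompts
};

// Map the raw config onto OpenAI-style chat-completion parameters.
function toOpenAIParams(raw: RawPromptConfig) {
  return {
    model: raw.model.name,
    temperature: raw.model.temperature,
    max_tokens: raw.model.maxTokens,
    messages: raw.messages,
  };
}

// Example raw config as the adapter might produce it.
const raw: RawPromptConfig = {
  model: { name: "gpt-4o-mini", temperature: 0.2, maxTokens: 256 },
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello!" },
  ],
};

const params = toOpenAIParams(raw);
// params can now be passed to a chat-completions call
```

Because the adapter hands you the configuration untransformed, the mapping function is the only provider-specific code you need to write.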
When to use
- Unsupported SDK — Your AI SDK doesn’t have a dedicated AgentMark adapter yet
- Custom provider — You’re calling a provider API directly without an SDK
- Inspection/debugging — You want to see the raw config AgentMark produces before sending it to a provider
- Adapter development — You’re building a new adapter and want to understand the input format
Next steps
- AI SDK: recommended adapter for most apps
- Custom adapter: build your own adapter
- Prompts: learn about prompt syntax
- All integrations: see all available adapters
Have Questions?
We’re here to help! Choose the best way to reach us:
- Email us at hello@agentmark.co for support
- Schedule an Enterprise Demo to learn about our business solutions