The AI SDK adapter allows you to use AgentMark prompts with Vercel AI SDK’s generation functions. This is the recommended adapter for most TypeScript/JavaScript applications.
AgentMark provides two versions of this adapter — one for AI SDK v4 and one for AI SDK v5. Both share the same API surface (VercelAIModelRegistry, VercelAIToolRegistry, McpServerRegistry, createAgentMarkClient), so switching between them only requires changing the package import.
Choosing a Version
| | AI SDK v4 Adapter | AI SDK v5 Adapter |
|---|---|---|
| Package | @agentmark-ai/ai-sdk-v4-adapter | @agentmark-ai/ai-sdk-v5-adapter |
| AI SDK peer | ai ^4.0.0 | ai ^5.0.52 |
| MCP imports | Built into ai package | Separate @ai-sdk/mcp peer dependency |
| Tool definition | parameters field | inputSchema field (wrapped via jsonSchema) |
| Status | Stable — use if your project is on AI SDK v4 | Recommended — use for new projects |
If you’re starting a new project, use the v5 adapter. The v4 adapter is provided for projects that haven’t yet upgraded to AI SDK v5.
Installation
Install the adapter, the ai core package, and the provider package(s) for the models you want to use. Provider packages must be compatible with your ai core version.
AI SDK v5 (Recommended)
AI SDK v4
```bash
# Core
npm install @agentmark-ai/ai-sdk-v5-adapter ai@^5

# Provider packages (install the ones you need)
npm install @ai-sdk/openai     # OpenAI / GPT models
npm install @ai-sdk/anthropic  # Anthropic / Claude models
npm install @ai-sdk/google     # Google / Gemini models

# MCP server support (optional)
npm install @ai-sdk/mcp
```
```bash
# Core
npm install @agentmark-ai/ai-sdk-v4-adapter ai@^4

# Provider packages — use v4-compatible versions
npm install @ai-sdk/openai@^1     # OpenAI / GPT models
npm install @ai-sdk/anthropic@^1  # Anthropic / Claude models
npm install @ai-sdk/google@^1     # Google / Gemini models
```
AI SDK v4 uses @ai-sdk/ provider packages at v1.x. AI SDK v5 uses v2.x+. Make sure you install the version that matches your ai core package.
Setup
Create your AgentMark client with a model registry. Use .registerModels() to map model names to AI SDK provider instances — supports exact names or regex patterns:
```typescript
import { createAgentMarkClient, VercelAIModelRegistry } from "@agentmark-ai/ai-sdk-v5-adapter";
import { anthropic } from "@ai-sdk/anthropic";
import { openai } from "@ai-sdk/openai";

const modelRegistry = new VercelAIModelRegistry()
  .registerModels(["claude-3-5-sonnet-20241022"], (name) => anthropic(name))
  .registerModels(["gpt-4o", "gpt-4o-mini"], (name) => openai(name))
  .registerModels([/^gpt-/], (name) => openai(name)); // regex pattern

export const client = createAgentMarkClient({
  loader: fileLoader,
  modelRegistry,
});
```
```typescript
import { createAgentMarkClient, VercelAIModelRegistry } from "@agentmark-ai/ai-sdk-v4-adapter";
import { anthropic } from "@ai-sdk/anthropic";
import { openai } from "@ai-sdk/openai";

const modelRegistry = new VercelAIModelRegistry()
  .registerModels(["claude-3-5-sonnet-20241022"], (name) => anthropic(name))
  .registerModels(["gpt-4o", "gpt-4o-mini"], (name) => openai(name))
  .registerModels([/^gpt-/], (name) => openai(name)); // regex pattern

export const client = createAgentMarkClient({
  loader: fileLoader,
  modelRegistry,
});
```
The setup is identical — only the import path changes.
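To build intuition for how exact names and regex patterns coexist in a registry, here is a simplified, dependency-free sketch of a name-to-provider lookup. This is illustrative only, not the adapter's actual implementation; the string-returning factory stands in for a real provider instance.

```typescript
// Simplified sketch of registry resolution: exact names are checked first,
// then regex patterns in registration order. Illustrative only.
type ModelFactory = (name: string) => string; // stand-in for a provider model instance

class SketchRegistry {
  private exact = new Map<string, ModelFactory>();
  private patterns: Array<[RegExp, ModelFactory]> = [];

  registerModels(keys: Array<string | RegExp>, factory: ModelFactory): this {
    for (const key of keys) {
      if (typeof key === "string") this.exact.set(key, factory);
      else this.patterns.push([key, factory]);
    }
    return this;
  }

  resolve(name: string): string {
    const exactHit = this.exact.get(name);
    if (exactHit) return exactHit(name);
    for (const [pattern, factory] of this.patterns) {
      if (pattern.test(name)) return factory(name);
    }
    throw new Error(`No provider registered for model "${name}"`);
  }
}

const sketch = new SketchRegistry()
  .registerModels(["gpt-4o"], (n) => `openai:${n}`)
  .registerModels([/^gpt-/], (n) => `openai:${n}`)
  .registerModels([/^claude-/], (n) => `anthropic:${n}`);

console.log(sketch.resolve("gpt-4o"));       // exact match
console.log(sketch.resolve("gpt-4o-mini"));  // matched by the /^gpt-/ pattern
```

Regex patterns are handy for covering a whole model family with one registration while still letting exact names pin specific models.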
Running Prompts
Load and run prompts with generateText():
```typescript
import { client } from "./agentmark.client";
import { generateText } from "ai";

const prompt = await client.loadTextPrompt("greeting.prompt.mdx");
const input = await prompt.format({
  props: { name: "Alice" },
});

const result = await generateText(input);
console.log(result.text);
```
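Conceptually, format() merges the prompt template with your props to produce a ready-to-pass input. A rough, dependency-free sketch of the interpolation step (hypothetical shapes; AgentMark's real format() also resolves the model configuration from frontmatter):

```typescript
// Hypothetical sketch of {props.*} interpolation into template messages.
// Not AgentMark's implementation; shapes are simplified for illustration.
type Message = { role: "system" | "user"; content: string };

function formatTemplate(template: Message[], props: Record<string, string>): Message[] {
  return template.map((m) => ({
    ...m,
    // Replace each {props.key} placeholder with the matching prop value
    content: m.content.replace(/\{props\.(\w+)\}/g, (_, key) => props[key] ?? ""),
  }));
}

const template: Message[] = [
  { role: "system", content: "You are a friendly greeter." },
  { role: "user", content: "Say hello to {props.name}." },
];

const messages = formatTemplate(template, { name: "Alice" });
console.log(messages[1].content); // "Say hello to Alice."
```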
Object Generation
For structured output, use generateObject():
```typescript
import { client } from "./agentmark.client";
import { generateObject } from "ai";
import { z } from "zod";

const prompt = await client.loadObjectPrompt("extract.prompt.mdx", {
  schema: z.object({
    sentiment: z.enum(["positive", "negative", "neutral"]),
    confidence: z.number(),
  }),
});

const input = await prompt.format({
  props: { text: "This product is amazing!" },
});

const result = await generateObject(input);
console.log(result.object);
// { sentiment: 'positive', confidence: 0.95 }
```
Streaming
Stream responses for real-time output with streamText or streamObject:
```typescript
import { streamText } from "ai";

const prompt = await client.loadTextPrompt("story.prompt.mdx");
const input = await prompt.format({
  props: { topic: "space exploration" },
});

const result = streamText(input);
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
For streaming structured objects:
```typescript
import { streamObject } from "ai";

const prompt = await client.loadObjectPrompt("extract.prompt.mdx", {
  schema: z.object({ summary: z.string(), tags: z.array(z.string()) }),
});

const input = await prompt.format({ props: { text: "..." } });

const result = streamObject(input);
for await (const partial of result.partialObjectStream) {
  console.log(partial);
}
```
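Each value from the partial-object stream is a progressively deeper snapshot of the final object, which is what makes it suitable for incremental UI rendering. The following self-contained sketch mimics that pattern with a fake stream; it is illustrative only, not the AI SDK's internals:

```typescript
// Sketch of the partial-object pattern: each yielded value is a deeper
// snapshot of the final object. Fake stream for illustration only.
type Snapshot = { summary?: string; tags?: string[] };

async function* fakePartialObjectStream(): AsyncGenerator<Snapshot> {
  yield {};
  yield { summary: "A short summary" };
  yield { summary: "A short summary", tags: ["ai"] };
  yield { summary: "A short summary", tags: ["ai", "sdk"] };
}

async function consume(): Promise<Snapshot> {
  let last: Snapshot = {};
  for await (const partial of fakePartialObjectStream()) {
    last = partial; // e.g. re-render the UI with each snapshot
  }
  return last; // the final snapshot is the complete object
}

consume().then((obj) => console.log(JSON.stringify(obj)));
```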
Image Generation
Generate images using experimental_generateImage:
```typescript
import { experimental_generateImage } from "ai";

const prompt = await client.loadImagePrompt("avatar.prompt.mdx");
const input = await prompt.format({
  props: { description: "A futuristic city skyline" },
});

const result = await experimental_generateImage(input);
```
Speech Generation
Generate speech using experimental_generateSpeech:
```typescript
import { experimental_generateSpeech } from "ai";

const prompt = await client.loadSpeechPrompt("narration.prompt.mdx");
const input = await prompt.format({
  props: { text: "Welcome to AgentMark." },
});

const result = await experimental_generateSpeech(input);
```
Tools
Configure tools using VercelAIToolRegistry:
```typescript
import { createAgentMarkClient, VercelAIModelRegistry, VercelAIToolRegistry } from "@agentmark-ai/ai-sdk-v5-adapter";
import { tool } from "ai";
import { z } from "zod";

const weatherTool = tool({
  description: "Get current weather for a location",
  inputSchema: z.object({
    location: z.string(),
  }),
  execute: async ({ location }) => {
    return `The weather in ${location} is sunny and 72°F`;
  },
});

export const client = createAgentMarkClient({
  loader: fileLoader,
  modelRegistry,
  tools: new VercelAIToolRegistry({
    weather: weatherTool,
  }),
});
```
With the v4 adapter (and ai@^4), define the tool's schema under parameters instead of inputSchema.
Then reference tools in your prompts:
```mdx
---
name: weather
text_config:
  model_name: claude-3-5-sonnet-20241022
  tools:
    - weather
---

<System>You are a helpful weather assistant.</System>
<User>What's the weather in {props.location}?</User>
```
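To illustrate the relationship between the frontmatter's tools list and the registry, here is a minimal sketch of resolving tool names against a registered map. The names and shapes are hypothetical; this is not the adapter's implementation:

```typescript
// Hypothetical sketch: a prompt's `tools` list selects entries from the
// registered tool map by name, failing fast on unknown names.
type ToolDef = {
  description: string;
  execute: (input: { location: string }) => Promise<string>;
};

const registered: Record<string, ToolDef> = {
  weather: {
    description: "Get current weather for a location",
    execute: async ({ location }) => `The weather in ${location} is sunny and 72°F`,
  },
};

function resolveTools(names: string[]): Record<string, ToolDef> {
  const selected: Record<string, ToolDef> = {};
  for (const name of names) {
    const def = registered[name];
    if (!def) throw new Error(`Tool "${name}" is not registered`);
    selected[name] = def;
  }
  return selected;
}

// The frontmatter's `tools: [weather]` selects from the registry:
const tools = resolveTools(["weather"]);
console.log(Object.keys(tools)); // the names handed to the model
```

Failing fast on an unregistered name surfaces prompt/registry mismatches at load time rather than mid-conversation.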
MCP Servers
AgentMark supports Model Context Protocol servers for extended capabilities. Configure both stdio and URL-based servers:
```typescript
import { createAgentMarkClient, VercelAIModelRegistry, McpServerRegistry } from "@agentmark-ai/ai-sdk-v5-adapter";

export const client = createAgentMarkClient({
  loader: fileLoader,
  modelRegistry,
  mcpServers: new McpServerRegistry({
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
    },
    remote: {
      url: "https://mcp.example.com/sse",
    },
  }),
});
```
The v5 adapter imports MCP support from @ai-sdk/mcp (installed separately).
```typescript
import { createAgentMarkClient, VercelAIModelRegistry, McpServerRegistry } from "@agentmark-ai/ai-sdk-v4-adapter";

export const client = createAgentMarkClient({
  loader: fileLoader,
  modelRegistry,
  mcpServers: new McpServerRegistry({
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
    },
    remote: {
      url: "https://mcp.example.com/sse",
    },
  }),
});
```
The v4 adapter uses MCP support built into the ai package.
Observability
Enable telemetry to track performance and debug issues:
```typescript
const input = await prompt.format({
  props: { name: "Alice" },
  telemetry: {
    isEnabled: true,
    functionId: "greeting-handler",
    metadata: {
      userId: "user-123",
      sessionId: "session-abc",
    },
  },
});

const result = await generateText(input);
```
Learn more in the Observability documentation.
Migrating from v4 to v5
If you’re upgrading from the v4 adapter to v5:
1. Update packages — upgrade the adapter, the ai core package, and your provider packages, and add @ai-sdk/mcp if you use MCP servers:

   ```bash
   npm uninstall @agentmark-ai/ai-sdk-v4-adapter
   npm install @agentmark-ai/ai-sdk-v5-adapter ai@^5 @ai-sdk/mcp

   # Also upgrade your provider packages to v5-compatible versions
   npm install @ai-sdk/openai@latest @ai-sdk/anthropic@latest
   ```

2. Update imports — change the package path in your agentmark.client.ts:

   ```typescript
   // Before
   import { createAgentMarkClient, VercelAIModelRegistry } from "@agentmark-ai/ai-sdk-v4-adapter";

   // After
   import { createAgentMarkClient, VercelAIModelRegistry } from "@agentmark-ai/ai-sdk-v5-adapter";
   ```

3. Update tool definitions — if you define tools with the ai package's tool() helper, rename the parameters field to inputSchema.

4. No other AgentMark changes required — the VercelAIModelRegistry, VercelAIToolRegistry, McpServerRegistry, and createAgentMarkClient APIs are unchanged, and all prompt .format() calls and AI SDK generation functions (generateText, generateObject, streamText, etc.) work as before.