The Vercel AI SDK adapter lets you run AgentMark prompts through the Vercel AI SDK's functions, including `generateText`, `generateObject`, and `streamText`.
## Installation

```bash
npm install @agentmark/ai-sdk-v4-adapter ai @ai-sdk/openai @ai-sdk/anthropic
```
## Setup

Create your AgentMark client with Vercel AI SDK models:
```ts
import { createAgentMarkClient } from '@agentmark/sdk';
import { VercelAIModelRegistry } from '@agentmark/ai-sdk-v4-adapter';
import { anthropic } from '@ai-sdk/anthropic';
import { openai } from '@ai-sdk/openai';

export const client = createAgentMarkClient({
  models: new VercelAIModelRegistry({
    'claude-3-5-sonnet-20241022': anthropic('claude-3-5-sonnet-20241022'),
    'gpt-4o': openai('gpt-4o'),
  }),
});
```
## Running Prompts

Load and run prompts with `generateText()`:
```ts
import { client } from './agentmark.client';
import { generateText } from 'ai';

const prompt = await client.loadTextPrompt('greeting.prompt.mdx');
const input = await prompt.format({
  props: { name: 'Alice' },
});

const result = await generateText(input);
console.log(result.text);
```
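For reference, a `greeting.prompt.mdx` for this example might look like the following. This is an illustrative sketch that follows the frontmatter format shown in the Tools section below; treat the exact prompt body as hypothetical.

```mdx
---
# Illustrative example; adjust the model and body to your needs.
name: greeting
text_config:
  model_name: gpt-4o
---
<System>You are a friendly assistant.</System>
<User>Say hello to {props.name}.</User>
```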
## Object Generation

For structured output, use `generateObject()`:
```ts
import { client } from './agentmark.client';
import { generateObject } from 'ai';
import { z } from 'zod';

const prompt = await client.loadObjectPrompt('extract.prompt.mdx', {
  schema: z.object({
    sentiment: z.enum(['positive', 'negative', 'neutral']),
    confidence: z.number(),
  }),
});

const input = await prompt.format({
  props: { text: 'This product is amazing!' },
});

const result = await generateObject(input);
console.log(result.object);
// { sentiment: 'positive', confidence: 0.95 }
```
## Streaming

Stream responses for real-time output:
```ts
import { client } from './agentmark.client';
import { streamText } from 'ai';

const prompt = await client.loadTextPrompt('story.prompt.mdx');
const input = await prompt.format({
  props: { topic: 'space exploration' },
});

const result = streamText(input);
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
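Object prompts can stream partial results too. The following is a minimal sketch assuming the input produced by an object prompt's `format()` is also accepted by the AI SDK's `streamObject()`, as it is by `generateObject()`:

```ts
import { client } from './agentmark.client';
import { streamObject } from 'ai';
import { z } from 'zod';

const prompt = await client.loadObjectPrompt('extract.prompt.mdx', {
  schema: z.object({
    sentiment: z.enum(['positive', 'negative', 'neutral']),
    confidence: z.number(),
  }),
});
const input = await prompt.format({
  props: { text: 'This product is amazing!' },
});

// Assumes the formatted input carries the schema in the shape streamObject expects.
const result = streamObject(input);
for await (const partial of result.partialObjectStream) {
  console.log(partial); // progressively refined partial object
}
```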
## Tools

Configure tools in your client:
```ts
import { createAgentMarkClient } from '@agentmark/sdk';
import { VercelAIModelRegistry, VercelAIToolRegistry } from '@agentmark/ai-sdk-v4-adapter';
import { tool } from 'ai';
import { z } from 'zod';

const weatherTool = tool({
  description: 'Get current weather for a location',
  parameters: z.object({
    location: z.string(),
  }),
  execute: async ({ location }) => {
    return `The weather in ${location} is sunny and 72°F`;
  },
});

export const client = createAgentMarkClient({
  models: new VercelAIModelRegistry({ /* ... */ }),
  tools: new VercelAIToolRegistry({
    weather: weatherTool,
  }),
});
```
Then use tools in your prompts:

```mdx
---
name: weather
text_config:
  model_name: claude-3-5-sonnet-20241022
  tools:
    - weather
---
<System>You are a helpful weather assistant.</System>
<User>What's the weather in {props.location}?</User>
```
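Running a tool-enabled prompt looks the same as any other text prompt. Here is a minimal sketch, assuming the prompt above is saved as `weather.prompt.mdx` (a hypothetical file name) and that the adapter carries the registered tools through `format()` so `generateText()` can invoke them:

```ts
import { client } from './agentmark.client';
import { generateText } from 'ai';

// Hypothetical file name; adjust to where your prompt lives.
const prompt = await client.loadTextPrompt('weather.prompt.mdx');
const input = await prompt.format({
  props: { location: 'San Francisco' },
});

const result = await generateText(input);
console.log(result.text);
console.log(result.toolResults); // inspect any weather tool invocations
```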
## MCP Servers

AgentMark supports Model Context Protocol (MCP) servers for extended capabilities:
```ts
import { createAgentMarkClient } from '@agentmark/sdk';
import { VercelAIModelRegistry, McpServerRegistry } from '@agentmark/ai-sdk-v4-adapter';

export const client = createAgentMarkClient({
  models: new VercelAIModelRegistry({ /* ... */ }),
  mcpServers: new McpServerRegistry({
    filesystem: {
      command: 'npx',
      args: ['-y', '@modelcontextprotocol/server-filesystem', '/path/to/files'],
    },
  }),
});
```
## Observability

Enable telemetry to track performance and debug issues:
```ts
import { client } from './agentmark.client';
import { generateText } from 'ai';

const prompt = await client.loadTextPrompt('greeting.prompt.mdx');
const input = await prompt.format({
  props: { name: 'Alice' },
  telemetry: {
    isEnabled: true,
    functionId: 'greeting-handler',
    metadata: {
      userId: 'user-123',
      sessionId: 'session-abc',
    },
  },
});

const result = await generateText(input);
```
Learn more in the Observability documentation.