Create your AgentMark client with a MastraModelRegistry. Use .registerProviders() to register AI SDK provider packages; Mastra uses the same @ai-sdk/* providers under the hood, so model IDs such as "openai/gpt-4o" resolve automatically:
agentmark.client.ts
```typescript
import {
  createAgentMarkClient,
  MastraModelRegistry,
} from "@agentmark-ai/mastra-v0-adapter";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";

const modelRegistry = new MastraModelRegistry();
modelRegistry.registerProviders({ openai, anthropic });

// Or register models explicitly:
// modelRegistry.registerModels(["gpt-4o"], (name) => openai(name));

export const client = createAgentMarkClient({
  loader,
  modelRegistry,
});
```
Running a Mastra prompt takes four steps: formatAgent() returns an AgentConfig, you construct a new Agent(agentConfig), formatMessages() (async) returns a [messages, options] tuple, and finally agent.generate(messages, options) runs the prompt:
```typescript
import { client } from "./agentmark.client";
import { Agent } from "@mastra/core/agent";

const prompt = await client.loadTextPrompt("greeting.prompt.mdx");

const agentConfig = await prompt.formatAgent({
  props: { name: "Alice" },
});
const agent = new Agent(agentConfig);

const [messages, options] = await agentConfig.formatMessages();
const result = await agent.generate(messages, options);
console.log(result.text);
```
formatMessages() is async — always await it. The returned tuple is [messages, options], both required by agent.generate() / agent.stream().
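Streaming takes the same tuple via agent.stream(). A minimal sketch, assuming the stream result exposes a textStream async iterable of text deltas (as in the AI SDK); check the Mastra docs for the exact stream result shape:

```typescript
import { client } from "./agentmark.client";
import { Agent } from "@mastra/core/agent";

const prompt = await client.loadTextPrompt("greeting.prompt.mdx");
const agentConfig = await prompt.formatAgent({ props: { name: "Alice" } });
const agent = new Agent(agentConfig);

// Same [messages, options] tuple as generate()
const [messages, options] = await agentConfig.formatMessages();
const stream = await agent.stream(messages, options);

// textStream as an async iterable is an assumption here
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
```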
For structured output, use object prompts. The schema lives in your .prompt.mdx frontmatter (object_config.schema), not as a second argument to loadObjectPrompt():
```typescript
import { client } from "./agentmark.client";
import { Agent } from "@mastra/core/agent";

const prompt = await client.loadObjectPrompt("sentiment.prompt.mdx");

const agentConfig = await prompt.formatAgent({
  props: { text: "This product is amazing!" },
});
const agent = new Agent(agentConfig);

const [messages, options] = await agentConfig.formatMessages();
const result = await agent.generate(messages, options);
console.log(result.object);
// { sentiment: 'positive', confidence: 0.95 }
```
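The frontmatter for such a prompt might look like the sketch below. The schema keys shown (sentiment, confidence) are illustrative, and the exact object_config.schema syntax should be verified against the AgentMark prompt format reference:

```mdx
---
name: sentiment
object_config:
  model_name: openai/gpt-4o
  schema:
    type: object
    properties:
      sentiment:
        type: string
        enum: [positive, negative, neutral]
      confidence:
        type: number
    required:
      - sentiment
      - confidence
---

<User>Classify the sentiment of: {props.text}</User>
```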
Mastra tools use the ai v4 tool() helper with parameters: (not inputSchema: — that’s AI SDK v5). This matches Mastra’s internals:
agentmark.client.ts
```typescript
import {
  createAgentMarkClient,
  MastraModelRegistry,
} from "@agentmark-ai/mastra-v0-adapter";
import { tool } from "ai";
import { z } from "zod";

const weatherTool = tool({
  description: "Get current weather for a location",
  parameters: z.object({
    location: z.string(),
  }),
  execute: async ({ location }) => {
    return `The weather in ${location} is sunny and 72°F`;
  },
});

// loader and modelRegistry configured as shown earlier
export const client = createAgentMarkClient({
  loader,
  modelRegistry,
  tools: {
    weather: weatherTool,
  },
});
```
Then reference tools in your prompts:
weather.prompt.mdx
```mdx
---
name: weather
text_config:
  model_name: openai/gpt-4o
  tools:
    - weather
---

<System>You are a helpful weather assistant.</System>
<User>What's the weather in {props.location}?</User>
```
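Running a tool-enabled prompt then follows the same four steps. A sketch, assuming the weather tool registered above; Mastra executes the tool call during generation and folds the result into the final answer:

```typescript
import { client } from "./agentmark.client";
import { Agent } from "@mastra/core/agent";

const prompt = await client.loadTextPrompt("weather.prompt.mdx");
const agentConfig = await prompt.formatAgent({
  props: { location: "Tokyo" },
});
const agent = new Agent(agentConfig);

const [messages, options] = await agentConfig.formatMessages();
// The model may call the registered weather tool while generating
const result = await agent.generate(messages, options);
console.log(result.text);
```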