MCP Integration

AgentMark supports calling Model Context Protocol (MCP) tools directly from your prompts. You declare tools in .prompt.mdx using mcp://{server}/{tool} (or mcp://{server}/*) and configure MCP servers when creating your AgentMark client via your chosen adapter (e.g., Vercel AI v4).

What you’ll build

  • Configure MCP servers (SSE URL or stdio)
  • Reference MCP tools in prompts
  • Run via the Vercel AI v4 adapter
  • Combine inline tools with MCP tools

1) Configure MCP servers

Define servers when creating your AgentMark client. Two server types are supported:
  • URL/SSE: url, optionally headers
  • stdio: command, optional args, cwd, env
You can interpolate environment variables anywhere using env("VAR_NAME").
import { createAgentMarkClient, VercelAIModelRegistry, VercelAIToolRegistry } from "@agentmark/vercel-ai-v4-adapter";
import { FileLoader } from "@agentmark/agentmark-core";

const loader = new FileLoader(process.cwd());
const modelRegistry = new VercelAIModelRegistry();
modelRegistry.registerModels("gpt-4o-mini", (name) => ({ name } as any));

const tools = new VercelAIToolRegistry().register("summarize", ({ text, maxSentences = 2 }: { text: string; maxSentences?: number }) => {
  const sentences = String(text).split(/(?<=[.!?])\s+/).slice(0, maxSentences);
  return { summary: sentences.join(" ") };
});

const agentMark = createAgentMarkClient({
  loader,
  modelRegistry,
  toolRegistry: tools,
  mcpServers: {
    docs: {
      url: "env(AGENTMARK_MCP_SSE_URL)",
      headers: { Authorization: "Bearer env(MCP_TOKEN)" },
    },
    local: {
      command: "npx",
      args: ["-y", "@mastra/mcp-docs-server"],
      env: { NODE_ENV: "production" },
    },
  },
});
Type guard behavior (from core; see the sketch after this list):
  • URL servers accept only url (and adapter-allowed headers).
  • stdio servers accept only command, args, cwd, env.
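For orientation, the two accepted shapes roughly correspond to the TypeScript types below. This is an illustrative sketch, not the types exported by @agentmark/agentmark-core; the field names simply mirror the config above.
// Sketch only; not AgentMark's exported type definitions.
type UrlMcpServer = {
  url: string;                      // SSE endpoint
  headers?: Record<string, string>; // where the adapter allows them
};

type StdioMcpServer = {
  command: string;                  // executable to spawn
  args?: string[];
  cwd?: string;
  env?: Record<string, string>;
};

type McpServerConfig = UrlMcpServer | StdioMcpServer;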

Environment interpolation

Use env("VAR") anywhere in the config; values resolve from process.env.VAR:
mcpServers: {
  docs: { url: "env(DOCS_MCP_URL)", headers: { Authorization: "Bearer env(MCP_TOKEN)" } },
  local: { command: "env(NODE_BIN)", args: ["-y", "@mastra/mcp-docs-server"] },
}
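Conceptually, interpolation is a plain string substitution against process.env. The helper below is a hypothetical sketch of equivalent behavior, not AgentMark's resolver; it accepts both the quoted and unquoted forms shown in this guide.
// Illustrative sketch: replaces env(VAR) or env("VAR") with process.env.VAR.
function interpolateEnv(value: string): string {
  return value.replace(/env\(\s*"?([A-Za-z_][A-Za-z0-9_]*)"?\s*\)/g, (_, name) => {
    const resolved = process.env[name];
    if (resolved === undefined) {
      throw new Error(`Missing environment variable: ${name}`);
    }
    return resolved;
  });
}
// interpolateEnv("Bearer env(MCP_TOKEN)") -> "Bearer <value of MCP_TOKEN>"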

2) Reference MCP tools in prompts

Declare MCP tools in your prompt frontmatter. You can mix MCP tools with inline tool definitions.
mcp-example.prompt.mdx
---
name: mcp-example
text_config:
  model_name: gpt-4o-mini
  tools:
    search_docs: mcp://docs/web-search
    summarize:
      description: Summarize a block of text
      parameters:
        type: object
        properties:
          text: { type: string }
          maxSentences: { type: number }
        required: [text]
---

<System>
Use the search_docs tool to look up relevant documentation when needed.
Use the summarize tool to condense content into a short summary.
</System>

<User>
Find the page that explains MCP integration and summarize it in 2 sentences.
</User>
  • search_docs resolves to the web-search tool on the docs MCP server.
  • summarize is an inline tool provided via the adapter's tool registry.

Wildcard: include all tools from a server

Include every tool exported by a server using *:
---
text_config:
  tools:
    all: mcp://docs/*
---
  • The alias key (all) is ignored; server tools are added by their original names.
  • If a tool name collides with an existing inline tool, the later-added tool overwrites the earlier one (see the sketch below).
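Collision handling is ordinary last-write-wins map merging; the contrived sketch below only illustrates the effect and is not core code.
// Illustrative only: whichever definition is registered later wins the name.
const addedFirst = { summarize: "earlier definition" };
const addedLater = { summarize: "later definition (e.g. pulled in via mcp://docs/*)" };
const resolvedTools = { ...addedFirst, ...addedLater };
// resolvedTools.summarize === "later definition (e.g. pulled in via mcp://docs/*)"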

3) Format and run (Vercel AI v4 example)

import { createAgentMarkClient, VercelAIModelRegistry, VercelAIToolRegistry } from "@agentmark/vercel-ai-v4-adapter";
import { FileLoader } from "@agentmark/agentmark-core";

const loader = new FileLoader(process.cwd());
const modelRegistry = new VercelAIModelRegistry();
modelRegistry.registerModels("gpt-4o-mini", (name) => ({ name } as any));

const tools = new VercelAIToolRegistry().register("summarize", ({ text, maxSentences = 2 }: { text: string; maxSentences?: number }) => {
  const sentences = String(text).split(/(?<=[.!?])\s+/).slice(0, maxSentences);
  return { summary: sentences.join(" ") };
});

const agentMark = createAgentMarkClient({
  loader,
  modelRegistry,
  toolRegistry: tools,
  mcpServers: {
    test: { command: "npx", args: ["-y", "@mastra/mcp-docs-server"] },
  },
});

(async () => {
  const prompt = await agentMark.loadTextPrompt("./mcp-text.prompt.mdx");
  const vercelInput = await prompt.format();
  // Pass vercelInput to the AI SDK, e.g. generateText(vercelInput); see the sketch below
})();
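To actually execute the prompt, pass the formatted input to the AI SDK. The lines below are a sketch that continues inside the async block above; they assume the object returned by prompt.format() maps onto generateText's options, and that your model registry returns a real provider model (the ({ name } as any) placeholder will not reach a provider).
import { generateText } from "ai"; // add at the top of the file

// ...inside the async block, after prompt.format():
const result = await generateText(vercelInput);
console.log(result.text);      // the model's reply
console.log(result.toolCalls); // any MCP or inline tool calls the model made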

Notes and best practices

  • Keep server configs minimal to pass type guards.
  • Prefer environment interpolation for portability and secrets hygiene.
  • Use wildcard import to quickly expose a server’s full tool surface; be mindful of name collisions.
