AgentMark provides webhook helpers that streamline handling incoming webhook events. This is particularly useful when working with the Vercel AI SDK: the helper simplifies running prompts and datasets received from AgentMark webhooks.

Vercel AI SDK Helper

The @agentmark/vercel-ai-v4-webhook-helper package is designed to work seamlessly with the Vercel AI SDK. It abstracts away the complexity of handling different prompt types, allowing you to use a single method to run any prompt.

How It Works

The WebhookHelper class inspects the incoming prompt’s configuration (text_config, object_config, etc.) and automatically calls the appropriate handler (runTextPrompt, runObjectPrompt, etc.). This means you don’t have to write repetitive boilerplate code to handle each prompt type manually.
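As a rough illustration, the dispatch the helper performs looks something like the sketch below. The property checks and handler calls are assumptions based on the description above, not the package's actual source:

// Illustrative sketch only: the exact payload shape and handler signatures
// are assumptions; the real dispatch logic lives inside WebhookHelper.
async function dispatchPrompt(helper: WebhookHelper, promptData: any) {
  if (promptData.text_config) {
    // Text prompts are routed to the text handler
    return helper.runTextPrompt(promptData);
  }
  if (promptData.object_config) {
    // Structured-output prompts are routed to the object handler
    return helper.runObjectPrompt(promptData);
  }
  // ...and so on for the other prompt types (image, speech)
  throw new Error("Unsupported prompt configuration");
}

With the helper, you never write this branching yourself; it inspects the configuration and picks the right handler for you.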

Installation

To get started, install the necessary packages:

npm install ai @ai-sdk/openai @agentmark/agentmark-core @agentmark/sdk @agentmark/shared-utils @agentmark/vercel-ai-v4-adapter @agentmark/vercel-ai-v4-webhook-helper

Setup

Here’s how to set up your webhook endpoint using the helper in a Next.js app:

  1. Initialize AgentMark SDK and Vercel AI Adapter

    First, configure the AgentMark SDK and create a model registry for the Vercel AI SDK. Then, create an AgentMark client, which will be passed to the WebhookHelper.

    import { AgentMarkSDK } from "@agentmark/sdk";
    import { createAgentMarkClient, VercelAIModelRegistry } from "@agentmark/vercel-ai-v4-adapter";
    import { openai } from "@ai-sdk/openai";
    
    const sdk = new AgentMarkSDK({
      apiKey: process.env.AGENTMARK_API_KEY,
      appId: process.env.AGENTMARK_APP_ID,
    });
    sdk.initTracing({ disableBatch: true });
    
    const modelRegistry = new VercelAIModelRegistry();
    
    // Register all the models you intend to use
    modelRegistry.registerModels(
      ["gpt-4o", "gpt-4-turbo", "gpt-3.5-turbo"],
      (name: string) => openai(name)
    );
    modelRegistry.registerModels(
      ["dall-e-3", "dall-e-2"],
      (name: string) => openai.image(name)
    );
    
    const agentmark = createAgentMarkClient({
      modelRegistry,
      loader: sdk.getFileLoader(),
    });
    
  2. Use the WebhookHelper in Your Endpoint

    Once you have the agentmark client, you can instantiate the WebhookHelper and use it to process incoming webhook events. The example below shows a complete Next.js App Router endpoint.

    import { WebhookHelper } from "@agentmark/vercel-ai-v4-webhook-helper";
    import { NextRequest, NextResponse } from "next/server";
    import { verifySignature } from "@agentmark/shared-utils";
    
    // ... (setup from the previous step)
    
    export async function POST(request: NextRequest) {
      try {
        // 1. Verify the signature
        const payload = await request.json();
        const xAgentmarkSign = request.headers.get("x-agentmark-signature-256");
        if (!xAgentmarkSign || !(await verifySignature(
            process.env.AGENTMARK_WEBHOOK_SECRET!,
            xAgentmarkSign,
            JSON.stringify(payload)
        ))) {
          return NextResponse.json({ message: "Invalid signature" }, { status: 401 });
        }
    
        // 2. Initialize the helper and process the event
        const event = payload.event;
        const webhookHelper = new WebhookHelper(agentmark);
    
        switch (event.type) {
          case "prompt-run": {
            const response = await webhookHelper.runPrompt(event.data, {
              shouldStream: true,
            });
            if (response.type === "stream") {
              return new Response(response.stream, {
                headers: { ...response.streamHeader },
              });
            }
            return NextResponse.json(response);
          }
          
          case "dataset-run": {
            const response = await webhookHelper.runDataset(event.data);
            return new Response(response.stream, {
              headers: {
                ...response.streamHeaders,
              },
            });
          }
    
          case "alert": {
            // Alerts are not handled by the helper, so you can process them manually
            console.log("Alert received:", event.data);
            return NextResponse.json({ message: "Alert processed successfully" });
          }
    
          default:
            return NextResponse.json({ message: "Unknown event type" }, { status: 400 });
        }
      } catch (error) {
        console.error("Webhook error:", error);
        return NextResponse.json({ message: "Internal server error" }, { status: 500 });
      }
    }
    

The runPrompt Method

The runPrompt method does the heavy lifting for single prompt runs. It takes two arguments:

  1. event.data: The prompt data from the webhook payload.
  2. options (optional): An object to configure the prompt execution.

Options

The options object allows you to specify how the prompt should be run.

  • shouldStream (boolean): Defaults to true. When true, the method returns a ReadableStream to stream the response back to the AgentMark platform. This is the recommended approach for real-time feedback. Set to false to receive the full response as a single JSON object after the model has finished generating.
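
For example, to wait for the complete result rather than streaming it, pass shouldStream: false and return the resolved response as JSON (this mirrors the prompt-run case in the endpoint above):

case "prompt-run": {
  // Ask the helper to wait for the full generation instead of streaming it.
  const response = await webhookHelper.runPrompt(event.data, {
    shouldStream: false,
  });
  // Non-stream responses (text, object, image, speech) go back as plain JSON.
  return NextResponse.json(response);
}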

Return Value & Handling

The runPrompt method returns a promise that resolves to a WebhookResponse. Your webhook must handle the response correctly to ensure proper communication with the AgentMark platform.

  • If response.type is stream: You must return a new Response object. The body should be the response.stream and the headers must include response.streamHeader. This is critical for streaming the result back to AgentMark correctly.

    if (response.type === "stream") {
      return new Response(response.stream, {
        headers: { ...response.streamHeader },
      });
    }
    
  • If response.type is anything else (text, object, image, speech): You should return the entire response object as a JSON response.

    return NextResponse.json(response);
    

The runDataset Method

The runDataset method handles running a prompt against every item in a dataset. Your webhook doesn't need to assemble the results itself: the helper processes the entire dataset, streams the results back to AgentMark, and sends telemetry data for each item.

Parameters

The method takes a single argument, event.data, which is an object containing the paths to the prompt and dataset, along with run metadata.
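
As a rough sketch, the data you forward looks something like the type below. The field names here are hypothetical and only illustrate the idea; rely on the actual webhook payload and the event documentation for the real shape:

// Hypothetical shape for illustration only; the field names are assumptions,
// not the documented AgentMark payload.
type DatasetRunData = {
  promptPath: string;  // which prompt to run (assumed name)
  datasetPath: string; // which dataset to iterate over (assumed name)
  // ...plus run metadata attached by AgentMark
};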

Return Value & Handling

The runDataset method returns a Promise<RunDatasetResponse> that contains a streaming response. The helper processes the entire dataset and streams the results back to AgentMark in real-time.

case "dataset-run": {
  const response = await webhookHelper.runDataset(event.data);
  return new Response(response.stream, {
    headers: {
      ...response.streamHeaders,
    },
  });
}

The runDataset method currently supports prompts that generate text and objects. Image and speech generation prompts are not supported for dataset runs.

Manual Implementation

If you prefer not to use the WebhookHelper or are not using the Vercel AI SDK, you can implement the event handling logic yourself. Please refer to our detailed event documentation for guidance.
