Documentation Index

Fetch the complete documentation index at: https://docs.agentmark.co/llms.txt

Use this file to discover all available pages before exploring further.

AgentMark generates text with prompts that declare a text_config in frontmatter. Text prompts use message-role tags (<System>, <User>, <Assistant>) and return a string.

Example configuration

example.prompt.mdx
---
name: example
text_config:
  model_name: gpt-4o-mini
---

<System>You are a math tutor that can perform calculations.</System>
<User>What's 235 * 18?</User>
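A `.prompt.mdx` file is YAML frontmatter (between `---` delimiters) followed by the message body. A minimal sketch of splitting the two parts, purely to illustrate the file layout; this is not the AgentMark parser:

```typescript
// Illustrative frontmatter splitter for the example file above.
// The regex captures everything between the first two `---` lines
// as frontmatter and the rest as the message body.
const promptFile = `---
name: example
text_config:
  model_name: gpt-4o-mini
---

<System>You are a math tutor that can perform calculations.</System>
<User>What's 235 * 18?</User>
`;

function splitPrompt(source: string): { frontmatter: string; body: string } {
  const match = source.match(/^---\n([\s\S]*?)\n---\n([\s\S]*)$/);
  if (!match) throw new Error("missing frontmatter block");
  return { frontmatter: match[1], body: match[2].trim() };
}

const { frontmatter, body } = splitPrompt(promptFile);
console.log(frontmatter.includes("model_name: gpt-4o-mini")); // true
console.log(body.startsWith("<System>")); // true
```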

Tags

| Tag | Description |
| --- | --- |
| `<System>` | System-level instructions |
| `<User>` | User message |
| `<Assistant>` | Assistant message (optional; include for few-shot examples or prior-turn context) |
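Because `<Assistant>` tags carry prior-turn context, a few-shot prompt interleaves them with `<User>` turns. A sketch (the worked example shown is illustrative, not from the docs):

```mdx
<System>You are a math tutor that can perform calculations.</System>
<User>What's 12 * 9?</User>
<Assistant>12 * 9 = 108</Assistant>
<User>What's 235 * 18?</User>
```

The model sees the `<Assistant>` turn as its own earlier answer and tends to match its format in the next response.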

Available configuration

| Property | Type | Description | Required |
| --- | --- | --- | --- |
| `model_name` | string | The name of the model to use for text generation. | Yes |
| `max_tokens` | number | Maximum number of tokens to generate. | No |
| `temperature` | number | Controls the randomness of the output; higher values are more random. | No |
| `max_calls` | number | Maximum number of LLM calls allowed (for agent workflows). | No |
| `top_p` | number | Cumulative probability for nucleus sampling. | No |
| `top_k` | number | Limits next-token selection to the top k tokens. | No |
| `presence_penalty` | number | Penalizes tokens based on presence in the text so far, encouraging new topics. | No |
| `frequency_penalty` | number | Penalizes tokens based on frequency in the text so far, reducing verbatim repetition. | No |
| `stop_sequences` | string[] | Strings that, if encountered, stop generation. | No |
| `seed` | number | Random-number seed for reproducibility. | No |
| `max_retries` | number | Maximum number of retries on failure. | No |
| `tool_choice` | `"auto" \| "none" \| "required" \| { type: "tool", tool_name: string }` | Controls how tools are used during generation. | No |
| `tools` | string[] | List of tool names or MCP URIs available to the model. Tools resolve from the tools passed to `createAgentMarkClient`. | No |

Running a text prompt

See Running prompts → Text generation for the SDK code patterns for generateText (TypeScript) and run_text_prompt (Python).
