How do I call grok-3-fast from my code?
Use the OpenAI or Anthropic SDK and point `baseURL` at `https://synapse.garden/api/v1`. Set `model: 'xai/grok-3-fast'` and supply your Synapse Garden API key. No code changes are needed beyond the base URL.
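Any OpenAI-compatible client works. As a dependency-free illustration, here is the same call with plain `fetch` — a sketch assuming the gateway exposes the standard OpenAI-style `/chat/completions` route under the base URL; `buildChatRequest` and `ask` are illustrative names, not part of any SDK:

```ts
// Base URL from the docs above; the route shape is the standard
// OpenAI chat-completions protocol (an assumption for this sketch).
const BASE_URL = 'https://synapse.garden/api/v1'

// Build the HTTP request separately so it is easy to inspect.
function buildChatRequest(prompt: string, apiKey: string) {
  return {
    url: `${BASE_URL}/chat/completions`,
    init: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: 'xai/grok-3-fast',
        messages: [{ role: 'user', content: prompt }],
      }),
    },
  }
}

// Send the request and return the assistant's reply text.
async function ask(prompt: string): Promise<string> {
  const { url, init } = buildChatRequest(prompt, process.env.MG_KEY ?? '')
  const res = await fetch(url, init)
  const data = await res.json()
  return data.choices[0].message.content
}
```

The OpenAI SDK does the same thing under the hood, which is why pointing its `baseURL` at the gateway is the only change required.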
xai/grok-3-fast is xAI's flagship model, excelling at enterprise use cases like data extraction, coding, and text summarization. It possesses deep domain knowledge in finance, healthcare, law, and science. The fast variant is served on faster infrastructure, offering response times significantly lower than the standard model's. The increased speed comes at a higher cost per output token.
```ts
// Drop-in OpenAI-compatible client via the Vercel AI SDK
import { generateText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

// Point the OpenAI-compatible provider at Synapse Garden
const synapse = createOpenAI({
  baseURL: 'https://synapse.garden/api/v1',
  apiKey: process.env.MG_KEY,
})

const { text } = await generateText({
  model: synapse('xai/grok-3-fast'),
  prompt: 'Why is the sky blue?',
})
```
| Rate | Price per million tokens (USD) |
|---|---|
| Input | $5.50 |
| Output | $27.50 |
| Cache read | $1.38 |
Input: $5.50 per million tokens. Output: $27.50 per million tokens. The free tier includes one million tokens every month at no cost.
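At these rates, estimating the cost of a request is simple arithmetic. A minimal sketch using the prices above (`requestCostUSD` is an illustrative name):

```ts
// Published per-million-token rates for xai/grok-3-fast (USD)
const RATES = { input: 5.5, output: 27.5, cacheRead: 1.38 }

// Cost in USD for a single request, given token counts.
// Cached input tokens are billed at the cache-read rate.
function requestCostUSD(
  inputTokens: number,
  outputTokens: number,
  cachedTokens = 0,
): number {
  return (
    (inputTokens * RATES.input +
      outputTokens * RATES.output +
      cachedTokens * RATES.cacheRead) /
    1_000_000
  )
}
```

For example, a request with 10,000 input tokens and 1,000 output tokens costs (10,000 × $5.50 + 1,000 × $27.50) / 1,000,000 = $0.0825.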
grok-3-fast supports a context window of 131.1K tokens, with a maximum output of 131.1K tokens.
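Because the maximum output equals the context window, a long prompt eats into the output budget. A sketch of the budgeting arithmetic, assuming "131.1K" means 131,072 tokens and that input and output share the window (`remainingOutputBudget` is an illustrative name):

```ts
// Assumed exact value behind the documented "131.1K" figure
const CONTEXT_WINDOW = 131_072

// Largest max-output value you can safely request
// once the prompt's token count is accounted for.
function remainingOutputBudget(promptTokens: number): number {
  return Math.max(0, CONTEXT_WINDOW - promptTokens)
}
```

So a 100,000-token prompt leaves at most 31,072 tokens for the response.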
No. Synapse Garden is the single API surface — one key gives you OpenAI, Anthropic, Google, Meta, Mistral, DeepSeek, xAI, Cohere, and more. Billing, rate limits, and audit logs are unified.
Sign up, create a key, and drop our base URL into your existing client. The free tier includes a million tokens every month — no credit card required.