meta · Single modality · Released 2024-07-23

llama-3.1-70b

Model ID: meta/llama-3.1-70b

An update to Meta Llama 3 70B Instruct that includes an expanded 128K context length, multilinguality and improved reasoning capabilities.

Type: Tool use

Use llama-3.1-70b
// Drop-in OpenAI-compatible client
import { createOpenAI } from '@ai-sdk/openai'
import { generateText } from 'ai'

// Point the provider at the Synapse Garden base URL
const synapse = createOpenAI({
  baseURL: 'https://synapse.garden/api/v1',
  apiKey: process.env.MG_KEY,
})

const { text } = await generateText({
  model: synapse('meta/llama-3.1-70b'),
  prompt: 'Why is the sky blue?',
})
Context window: 128K
Max output: 8.2K
Input: $0.792 per million tokens
Output: $0.792 per million tokens
PRICING

List prices, every modality.

Rate · Per million tokens · USD
Input: $0.792
Output: $0.792
MORE FROM META

Other meta models

See all 9
FAQ · LLAMA-3.1-70B

Frequently asked

01 / 04

How do I call llama-3.1-70b from my code?

Use the OpenAI or Anthropic SDK and point baseURL at https://synapse.garden/api/v1. Set model: 'meta/llama-3.1-70b' and supply your Synapse Garden API key. No code changes beyond the base URL.
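The answer above can be sketched without any SDK at all, using plain fetch against the OpenAI-compatible surface. This is a minimal sketch, assuming the standard /chat/completions path and the usual choices/message response shape; the chat function and requestBody name are illustrative, not part of the documented API.

```typescript
// Hypothetical sketch: an OpenAI-style chat completion via plain fetch.
// Assumes the endpoint path /chat/completions and the standard response shape.
const requestBody = {
  model: 'meta/llama-3.1-70b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
};

async function chat(): Promise<string> {
  const res = await fetch('https://synapse.garden/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.MG_KEY}`,
    },
    body: JSON.stringify(requestBody),
  });
  const data: any = await res.json();
  return data.choices[0].message.content;
}
```

Because the surface is OpenAI-compatible, the same request body works unchanged with the official OpenAI SDK once its baseURL is overridden.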

02 / 04

How much does llama-3.1-70b cost?

Input: $0.792 per million tokens. Output: $0.792 per million tokens. The free tier includes a million tokens every month at no cost.
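The per-token arithmetic is straightforward; the sketch below estimates a single request's cost from the list prices above. The function name and example token counts are illustrative.

```typescript
// Sketch: estimating request cost from list prices (USD per million tokens).
// Input and output rates are both $0.792/M for this model.
const INPUT_PER_M = 0.792;
const OUTPUT_PER_M = 0.792;

function estimateCostUSD(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_PER_M +
    (outputTokens / 1_000_000) * OUTPUT_PER_M
  );
}

// e.g. a 10,000-token prompt with a 2,000-token reply costs about $0.0095
const cost = estimateCostUSD(10_000, 2_000);
```

At these rates, even the full 128K context costs roughly ten cents per request on input.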

03 / 04

What's the context window for llama-3.1-70b?

llama-3.1-70b supports a context window of 128K tokens, with a maximum output of 8.2K tokens.

04 / 04

Do I need a separate Anthropic or OpenAI account?

No. Synapse Garden is the single API surface — one key gives you OpenAI, Anthropic, Google, Meta, Mistral, DeepSeek, xAI, Cohere, and more. Billing, rate limits, and audit logs are unified.

READY

Try llama-3.1-70b in three minutes.

Sign up, create a key, drop our base URL into your existing client. The free tier includes a million tokens every month — no credit card.