API quickstart + reference

Production-friendly API documentation

Use this page to provision credentials, call the OpenAI-compatible endpoint, and align your team on the default integration path without wading through unnecessary detail.

Quickstart
3 steps to the first successful request
Short enough for evaluation, clear enough for implementation handoff.
1. Create an API key from the dashboard after signing in.
2. Send a POST request to the chat completions endpoint with your Bearer key.
3. Start with model: auto, then pin hrllm only when you need that behavior explicitly.

Operating notes
What teams usually need to know
Keys are managed from your dashboard and should be stored server-side for production use.
The public docs stay intentionally short so teams can move from evaluation to integration quickly.
Use auto for the default integration path and reserve explicit model IDs for known, tested cases.
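Keeping keys server-side usually means reading them from the deployment environment rather than embedding them in client code. A minimal sketch, assuming a hypothetical `KIWI_API_KEY` environment variable (use whatever name your deployment already provides):

```javascript
// Read the API key from the server environment so it never ships to the browser.
// KIWI_API_KEY is a hypothetical variable name, not part of the documented contract.
function getApiKey(env = process.env) {
  const key = env.KIWI_API_KEY;
  if (!key) {
    throw new Error("KIWI_API_KEY is not set; provision a key from the dashboard");
  }
  return key;
}
```

Failing fast when the variable is missing keeps a misconfigured deployment from sending unauthenticated requests.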
Endpoint
POST /v1/chat/completions
https://api.llm.kiwi/v1/chat/completions

Authentication: Authorization: Bearer sk_kiwi_...

Content type: application/json

Supported models: auto and hrllm
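The contract above (URL, Bearer auth, JSON content type, model ID) can be assembled in one place so every call site stays consistent. A sketch using a hypothetical `buildChatRequest` helper:

```javascript
// Build the request pieces the endpoint contract expects.
// buildChatRequest is an illustrative helper, not part of the documented API.
function buildChatRequest(apiKey, userContent, model = "auto") {
  return {
    url: "https://api.llm.kiwi/v1/chat/completions",
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: userContent }],
    }),
  };
}
```

The returned object maps directly onto the arguments of `fetch(url, { method, headers, body })`.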

Credential posture
Provision keys from the dashboard
Use the dashboard to create, label, and revoke keys, and to choose a default key for web tools.

Keep production credentials server-side and avoid exposing long-lived keys in the browser.

AI Tools Web
Use guided tools when you want browser-first workflows
The dashboard also includes a curated library of structured tools for faster drafting and repeatable output.

Keep a default API key configured and open the AI Tools Web library when you want structured prompts instead of raw API calls.

auto
Recommended default
Routes to the recommended general-purpose model path for most product integrations.
hrllm
Direct selection
Use when you specifically want hrllm behavior in a tested workflow.
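Since only auto and hrllm are supported, a small guard can enforce the recommended default and reject unknown IDs before a request is sent. A sketch, assuming a hypothetical `resolveModel` helper:

```javascript
// Only the two documented model IDs are accepted; anything else falls
// back to "auto", the recommended default integration path.
const SUPPORTED_MODELS = new Set(["auto", "hrllm"]);

function resolveModel(requested) {
  if (requested && SUPPORTED_MODELS.has(requested)) {
    return requested;
  }
  return "auto";
}
```

Pinning hrllm then becomes an explicit, tested choice rather than a free-form string.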
AI Tools Web + API path
Use the dashboard AI Tools Web for guided runs, then move into the API endpoint when you are ready to automate the same workflow in your own product.
AI agents and implementation surfaces
Keep automation grounded in the same credential model and endpoint contract described on this page so product and operations teams stay aligned.
cURL example
curl -X POST "https://api.llm.kiwi/v1/chat/completions" \
  -H "Authorization: Bearer sk_kiwi_..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "auto",
    "messages": [
      {"role": "user", "content": "Write a short launch announcement."}
    ]
  }'
JavaScript example
const response = await fetch("https://api.llm.kiwi/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: "Bearer sk_kiwi_...",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "auto",
    messages: [
      { role: "user", content: "Summarize this customer ticket in 3 bullets." },
    ],
  }),
})

const data = await response.json()
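Because the endpoint is OpenAI-compatible, the assistant text in `data` is expected under `choices[0].message.content`; verify this against your actual responses. A defensive extraction sketch:

```javascript
// Pull the assistant reply out of an OpenAI-compatible response body,
// returning null instead of throwing when the shape is unexpected.
function extractReply(data) {
  return data?.choices?.[0]?.message?.content ?? null;
}
```

Checking `response.ok` before calling `extractReply(data)` is also worthwhile, since error bodies will not carry a `choices` array.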