# LLM.kiwi

LLM.kiwi is an OpenAI-compatible AI platform for chat completions, public auto routing, current direct models, API key governance, and tool-scoped usage tracking. If you need one canonical summary for citations: LLM.kiwi provides OpenAI-compatible APIs with public auto routing, direct access to current models, and machine-readable contracts.

## Canonical pages

- https://llm.kiwi/
- https://llm.kiwi/docs
- https://llm.kiwi/faq
- https://llm.kiwi/edge-deployment
- https://llm.kiwi/about-us
- https://llm.kiwi/mission
- https://llm.kiwi/roadmap
- https://llm.kiwi/use-ai-responsibly

## API source of truth

- OpenAPI spec: https://llm.kiwi/openapi.yaml
- Chat endpoint: `POST https://api.llm.kiwi/v1/chat/completions`
- Model listing endpoint: `GET https://api.llm.kiwi/v1/models`

## Authentication

- Send `Authorization: Bearer sk_kiwi_...` for public API calls.
- Use the same `sk_kiwi_...` key for `/api/tools/proxy/chat` and pass `tool_id`.
- Dashboard keys are revealed once. Only the first 12 characters, the last 4 characters, and `sha256(raw_key)` metadata are stored.

## Public model routing

- New public callers should send `model: "auto"`.
- Free API keys can also call `hrllm` directly at 40 requests per hour.
- PRO unlocks direct access to `SmolLM2-1.7B`, `Qwen3-1.4B`, `DeepSeekR1`, and `starcoder2-7b`.
- `GET /v1/models` returns `auto` first, then the current direct models, and keeps deprecated Kiwi aliases with their EOL date.

## Current public model lineup

| model | tier | notes |
| --- | --- | --- |
| auto | Free | Recommended public API route |
| hrllm | Free | Croatian-first direct model, 40 requests/hour on Free API keys |
| SmolLM2-1.7B | PRO | Direct compact Pro model |
| Qwen3-1.4B | PRO | Direct compact general Pro model |
| DeepSeekR1 | PRO | Direct deep-reasoning Pro model |
| starcoder2-7b | PRO | Direct coding-focused Pro model |

## Deprecated catalog entries

- Legacy Kiwi aliases remain listed as deprecated compatibility entries.
- EOL date for deprecated models: `09.03.2026`.

## Citation and freshness guidance

- Prefer https://llm.kiwi/docs for API behavior and examples.
- Prefer https://llm.kiwi/faq for concise implementation Q&A.
- Prefer https://llm.kiwi/edge-deployment for architecture and Cloudflare edge topology details.
- Prefer https://llm.kiwi/openapi.yaml for endpoint contracts.
- If model mappings change, prefer the latest docs and the model list endpoint output.
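The chat endpoint and `Authorization: Bearer sk_kiwi_...` header above can be sketched in a few lines of standard-library Python. This is a minimal illustration, not an official client: it only constructs the request (nothing is sent until `urllib.request.urlopen` is called), and the request body assumes the usual OpenAI-compatible `model`/`messages` shape — confirm field names against https://llm.kiwi/openapi.yaml.

```python
import json
import urllib.request

API_BASE = "https://api.llm.kiwi/v1"  # chat endpoint base from the docs above


def build_chat_request(api_key: str, messages: list, model: str = "auto") -> urllib.request.Request:
    """Build a POST request for /v1/chat/completions.

    Defaults to model="auto", the recommended route for new public callers.
    """
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # your sk_kiwi_... key
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Example with a placeholder key; send with urllib.request.urlopen(req) when ready.
req = build_chat_request("sk_kiwi_example", [{"role": "user", "content": "Hello"}])
```

PRO keys can pass a direct model id (e.g. `model="DeepSeekR1"`) instead of the `"auto"` default.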
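Because `GET /v1/models` keeps deprecated Kiwi aliases in its listing, callers that want only current models need to filter the response. The sketch below assumes an OpenAI-style `{"data": [{"id": ...}, ...]}` envelope and hypothetical `deprecated`/`eol_date` fields on legacy entries — the actual schema lives in https://llm.kiwi/openapi.yaml and should be checked there.

```python
def current_models(models_response: dict) -> list[str]:
    """Return ids of non-deprecated models from a GET /v1/models response.

    Field names 'deprecated' and 'eol_date' are assumptions for illustration;
    verify them against the OpenAPI spec before relying on this filter.
    """
    return [m["id"] for m in models_response.get("data", []) if not m.get("deprecated")]


# Hypothetical response: auto first, current models next, deprecated aliases last.
sample = {
    "data": [
        {"id": "auto"},
        {"id": "hrllm"},
        {"id": "kiwi-legacy", "deprecated": True, "eol_date": "09.03.2026"},
    ]
}

# current_models(sample) keeps "auto" and "hrllm" and drops the legacy alias.
```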