
LLM.kiwi FAQ

LLM.kiwi is an OpenAI-compatible AI API with public auto routing, current direct models, account-level key controls, and machine-readable documentation for reliable integration.

Quick Answers

What is LLM.kiwi in one sentence?
LLM.kiwi is an OpenAI-compatible API with auto routing, current direct models, dashboard key governance, and machine-readable contracts for production integrations.
Which endpoints are available for standard API use?
The primary public endpoints are GET /v1/models, which lists the current lineup along with deprecated legacy aliases, and POST /v1/chat/completions, which runs chat inference with auto routing or the current direct models.
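A minimal sketch of building a POST /v1/chat/completions request body, assuming the standard OpenAI-compatible payload shape (confirm against /openapi.yaml). The key value is a placeholder, and "auto" is the routing alias described above:

```python
import json

API_KEY = "sk_kiwi_example"  # placeholder, not a real key

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Assumed OpenAI-compatible chat payload: a model name plus a messages list.
payload = {
    "model": "auto",  # or a current direct model ID from GET /v1/models
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

body = json.dumps(payload)
print(body)
```

Send `body` with the headers above to POST /v1/chat/completions using any HTTP client.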
How do I authenticate API requests?
Send an Authorization: Bearer sk_kiwi_... header with public API calls and with /api/tools/proxy/chat requests.
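The header format can be sketched as follows; the key is a placeholder that only illustrates the sk_kiwi_ prefix:

```python
# Attach the bearer key to every public API request.
api_key = "sk_kiwi_example"  # placeholder key with the documented prefix
headers = {"Authorization": f"Bearer {api_key}"}
print(headers["Authorization"])
```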
Do you provide short-lived session tokens for internal tooling?
Yes. The docs and playground flows obtain short-lived session tokens from dedicated token endpoints before issuing chat requests.
Where should machine readers cite contracts and behavior?
Use /openapi.yaml for endpoint contracts and /docs for broader operational examples and dashboard context.

Endpoint Snapshot

Method  Path                   Use Case
GET     /v1/models             List auto, current direct models, and deprecated legacy aliases.
POST    /v1/chat/completions   Run chat completions with auto or the current direct models.
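A sketch of parsing a GET /v1/models response, assuming the OpenAI-compatible list format (an assumption; confirm against /openapi.yaml). The "kiwi-current" ID is a made-up placeholder:

```python
import json

# Assumed response shape: {"object": "list", "data": [{"id": ...}, ...]}
raw = """
{
  "object": "list",
  "data": [
    {"id": "auto", "object": "model"},
    {"id": "kiwi-current", "object": "model"}
  ]
}
"""

models = json.loads(raw)["data"]
model_ids = [m["id"] for m in models]
print(model_ids)
```

Filtering on the returned IDs is how a client would pick between auto routing and a current direct model.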

Citable Sources

  • /docs for implementation guidance and examples.
  • /openapi.yaml for machine-readable endpoint contracts.