
API Reference

The LLM Router API is fully compatible with the OpenAI API specification, so you can switch from OpenAI (or any other provider) with minimal or zero code changes.

Base URL
https://api.llmrouter.app/v1
All requests must be authenticated with your LLM Router API key, sent as a Bearer token in the Authorization header:
Authorization: Bearer sk_llmr_your_api_key_here
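As a sketch, an authenticated request to the chat completions endpoint can be built with Python's standard library (the API key below is the placeholder from above, and sending the request is left to the caller):

```python
import json
import urllib.request

BASE_URL = "https://api.llmrouter.app/v1"

def build_chat_request(api_key: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request to /chat/completions."""
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "sk_llmr_your_api_key_here",  # placeholder key
    {
        "model": "anthropic/claude-opus-4.6",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
# urllib.request.urlopen(req) would perform the actual call.
```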

Supported Endpoints

Endpoint              Method  Description
/chat/completions     POST    Main chat completions endpoint (most used)
/messages             POST    Simplified message-based conversations
/embeddings           POST    Generate embeddings for text
/images/generations   POST    Generate images (DALL·E, Flux, etc.)
/models               GET     List all available models
/models/{id}          GET     Get details about a specific model
/responses            POST    OpenAI-compatible Responses API
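For example, /models can be queried with a simple authenticated GET. Since the API follows the OpenAI specification, the response is assumed here to use the OpenAI-style list shape, with models wrapped in a data array; the sample response below is illustrative, not captured from the live API:

```python
import json
import urllib.request

BASE_URL = "https://api.llmrouter.app/v1"

def fetch_models(api_key: str) -> dict:
    """GET /models with the bearer-token header; return the parsed JSON body."""
    req = urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def extract_model_ids(models_response: dict) -> list[str]:
    """Pull the model ids out of an OpenAI-style list response."""
    return [m["id"] for m in models_response.get("data", [])]

# Illustrative response shape (assumed, not captured from the live API):
sample = {"object": "list", "data": [{"id": "anthropic/claude-opus-4.6"}]}
print(extract_model_ids(sample))
```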

Core Features

LLM Router extends the standard OpenAI format with a powerful gateway object that enables:
  • Intelligent model routing using tags
  • Automatic Skill injection (enableAutoSearch)
  • Zero Data Retention (ZDR)
  • Context compression & pruning
  • PII redaction
  • Custom routing rules

Example Request with Advanced Features

{
  "model": "anthropic/claude-opus-4.6",
  "messages": [
    { "role": "user", "content": "Write a new billing webhook handler" }
  ],
  "gateway": {
    "zdr": true,
    "skills": {
      "skillIds": ["stripe-api", "node-best-practices"],
      "enableAutoSearch": true
    },
    "tags": ["coding", "backend"]
  }
}
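On the wire, gateway is simply an extra top-level key alongside the standard OpenAI fields, so a client can attach it to any existing payload before serializing. A minimal sketch (the helper name is hypothetical):

```python
import json

def with_gateway(payload: dict, gateway: dict) -> dict:
    """Return a copy of a standard chat payload with the gateway extension attached."""
    return {**payload, "gateway": gateway}

payload = with_gateway(
    {
        "model": "anthropic/claude-opus-4.6",
        "messages": [
            {"role": "user", "content": "Write a new billing webhook handler"}
        ],
    },
    {
        "zdr": True,
        "skills": {
            "skillIds": ["stripe-api", "node-best-practices"],
            "enableAutoSearch": True,
        },
        "tags": ["coding", "backend"],
    },
)
body = json.dumps(payload)  # ready to POST to /chat/completions
```

Because the extension lives under a single key, a provider that ignores unknown fields still receives a valid standard request.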

Philosophy

LLM Router gives you the best of both worlds:
  • Full OpenAI compatibility → Drop-in replacement
  • Powerful extensions → Smart routing, Skills, privacy, and optimization
You can start simple (just change the base URL) and gradually unlock advanced features as needed.