API Reference
The LLM Router API is fully compatible with the OpenAI API specification, allowing you to switch from OpenAI (or any other provider) with minimal or zero code changes.
Base URL
Supported Endpoints
| Endpoint | Method | Description |
|---|---|---|
| /chat/completions | POST | Main chat completions endpoint (most used) |
| /messages | POST | Simplified message-based conversations |
| /embeddings | POST | Generate embeddings for text |
| /images/generations | POST | Generate images (DALL·E, Flux, etc.) |
| /models | GET | List all available models |
| /models/{id} | GET | Get details about a specific model |
| /responses | POST | OpenAI-compatible Responses API |
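Since the API mirrors OpenAI's, any OpenAI-compatible client can target it by overriding the base URL. A minimal sketch of how the endpoint paths above compose with a deployment's base URL (the URL shown is a placeholder, not a real endpoint):

```python
from urllib.parse import urljoin

# Placeholder base URL; substitute your actual LLM Router deployment.
BASE_URL = "https://your-router.example.com/v1/"

# Build the full URL for each supported endpoint from the table above.
endpoints = [
    "chat/completions",
    "messages",
    "embeddings",
    "images/generations",
    "models",
    "responses",
]
urls = {ep: urljoin(BASE_URL, ep) for ep in endpoints}

print(urls["chat/completions"])
```

Because only the base URL differs from a direct OpenAI integration, existing client code typically needs no other changes.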
Core Features
LLM Router extends the standard OpenAI format with a powerful `gateway` object that enables:
- Intelligent model routing using tags
- Automatic Skill injection (`enableAutoSearch`)
- Zero Data Retention (ZDR)
- Context compression & pruning
- PII redaction
- Custom routing rules
Example Request with Advanced Features
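A sketch of what a request body combining these features might look like: the standard OpenAI chat fields plus the `gateway` extension object. Apart from `enableAutoSearch`, the `gateway` field names below are illustrative assumptions, not the documented schema.

```python
import json

# Hypothetical /chat/completions payload: standard OpenAI fields plus the
# LLM Router `gateway` extension. Field names other than `enableAutoSearch`
# are illustrative placeholders.
payload = {
    "model": "auto",
    "messages": [
        {"role": "user", "content": "Summarize this contract clause."}
    ],
    "gateway": {
        "tags": ["legal", "high-accuracy"],  # intelligent model routing
        "enableAutoSearch": True,            # automatic Skill injection
        "zdr": True,                         # Zero Data Retention
        "compressContext": True,             # context compression & pruning
        "redactPII": True,                   # PII redaction
    },
}

print(json.dumps(payload, indent=2))
```

Because the extra fields live under a single `gateway` key, the rest of the payload stays byte-for-byte compatible with an ordinary OpenAI chat completion request.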
Philosophy
LLM Router gives you the best of both worlds:
- Full OpenAI compatibility → Drop-in replacement
- Powerful extensions → Smart routing, Skills, privacy, and optimization