Documentation Index
Fetch the complete documentation index at: https://docs.llmrouter.app/llms.txt. Use this file to discover all available models, tags, and features before exploring further.
Vercel AI SDK Integration
The Vercel AI SDK is one of the most popular ways to build AI features in React and Next.js. Since LLM Router is fully OpenAI-compatible, you can use it seamlessly with streamText, generateText, generateObject, useChat, and other AI SDK tools.
Installation
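This section lists no commands in the source; a typical setup installs the AI SDK core plus the OpenAI-compatible provider package (shown with npm; pnpm and yarn work equally well):

```shell
npm install ai @ai-sdk/openai
```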
Basic Configuration
Create a custom provider pointing to LLM Router:
Usage Example (Route Handler)
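A minimal sketch of both pieces is below: a custom provider created with createOpenAI, and a Next.js route handler that streams through it. The base URL (https://api.llmrouter.app/v1) and the LLMROUTER_API_KEY env var name are assumptions; substitute the values from your LLM Router dashboard. The handler uses the AI SDK v5 streaming helpers; earlier SDK versions differ slightly.

```typescript
// lib/llmrouter.ts — custom provider pointing at LLM Router.
import { createOpenAI } from '@ai-sdk/openai';

export const llmrouter = createOpenAI({
  baseURL: 'https://api.llmrouter.app/v1', // assumed endpoint; check your dashboard
  apiKey: process.env.LLMROUTER_API_KEY,   // assumed env var name
});

// app/api/chat/route.ts — streaming chat route handler.
import { streamText, convertToModelMessages, type UIMessage } from 'ai';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: llmrouter('anthropic/claude-sonnet-4.6'),
    messages: convertToModelMessages(messages),
  });

  return result.toUIMessageStreamResponse();
}
```

Because LLM Router speaks the OpenAI wire protocol, only the baseURL and apiKey change compared with a stock OpenAI setup; the rest of the AI SDK code is unchanged.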
Recommended Models
You can use any of these models with LLM Router:
- anthropic/claude-opus-4.6
- anthropic/claude-sonnet-4.6
- openai/gpt-5.4
- google/gemini-3.1-pro
- xai/grok-4.20
Advanced Features Available
Through providerOptions.openai.gateway, you can leverage:
- Intelligent Tag Routing
- Automatic Skill Injection (enableAutoSearch)
- Zero Data Retention (ZDR)
- Context Pruning & Compression
- PII Redaction
- Custom Model Routing Rules
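A sketch of passing gateway options per request. Only enableAutoSearch is named in the list above; the commented-out keys are purely illustrative assumptions for the other features, and the endpoint and env var name are placeholders — consult the LLM Router docs for the exact option names.

```typescript
import { generateText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

const llmrouter = createOpenAI({
  baseURL: 'https://api.llmrouter.app/v1', // assumed endpoint
  apiKey: process.env.LLMROUTER_API_KEY,   // assumed env var name
});

const { text } = await generateText({
  model: llmrouter('openai/gpt-5.4'),
  prompt: 'Summarize the latest release notes.',
  providerOptions: {
    openai: {
      gateway: {
        enableAutoSearch: true, // Automatic Skill Injection (named above)
        // tags: ['fast', 'cheap'], // hypothetical key: Intelligent Tag Routing
        // zdr: true,               // hypothetical key: Zero Data Retention
      },
    },
  },
});
```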
Using with React Hooks (useChat)
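A minimal client component sketch using useChat, assuming the /api/chat route handler from the configuration section (the hook posts to /api/chat by default). This follows the AI SDK v5 API (useChat from @ai-sdk/react, message parts, sendMessage); older SDK versions expose a different hook shape.

```typescript
// app/chat/page.tsx
'use client';

import { useChat } from '@ai-sdk/react';
import { useState } from 'react';

export default function Chat() {
  const [input, setInput] = useState('');
  const { messages, sendMessage } = useChat();

  return (
    <div>
      {/* Render each message's text parts */}
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}:{' '}
          {m.parts.map((p) => (p.type === 'text' ? p.text : '')).join('')}
        </div>
      ))}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          sendMessage({ text: input }); // streams the reply via LLM Router
          setInput('');
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} />
      </form>
    </div>
  );
}
```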
Pro Tip: Configure default routing rules, tags, and Skills directly in the LLM Router Dashboard. This way, even if you don't pass gateway options in every request, LLM Router will still apply your optimized settings.