Documentation Index

Fetch the complete documentation index at:
https://docs.llmrouter.app/llms.txt
Use this file to discover all available models, tags, and features before exploring further.

Vercel AI SDK Integration

The Vercel AI SDK is one of the most popular ways to build AI features in React and Next.js. Since LLM Router is fully OpenAI-compatible, you can use it seamlessly with streamText, generateText, generateObject, useChat, and other AI SDK tools.

Installation

npm install ai @ai-sdk/openai

Basic Configuration

Create a custom provider pointing to LLM Router:
import { createOpenAI } from "@ai-sdk/openai";

export const llmRouter = createOpenAI({
  baseURL: "https://api.llmrouter.app/v1",
  apiKey: process.env.LLM_ROUTER_API_KEY!,
  name: "llm-router",
});

Usage Example (Route Handler)

import { streamText } from "ai";
import { llmRouter } from "@/lib/llm-router";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: llmRouter("anthropic/claude-opus-4.6"),
    messages,
    providerOptions: {
      // Pass LLM Router specific features here
      openai: {
        gateway: {
          // Smart Skill Injection
          skills: {
            skillIds: ["sk_company-style", "sk_stripe-integration"],
            enableAutoSearch: true,
          },
          // Zero Data Retention
          zdr: true,
          // Context Optimization
          chatHistoryCompression: {
            enabled: true,
            score: 0.65,
          },
        },
      },
    },
  });

  return result.toDataStreamResponse();
}

You can use any of these models with LLM Router:
  • anthropic/claude-opus-4.6
  • anthropic/claude-sonnet-4.6
  • openai/gpt-5.4
  • google/gemini-3.1-pro
  • xai/grok-4.20
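
If you want compile-time safety for these model IDs, one option (purely illustrative, not part of the AI SDK or LLM Router) is to capture them in a typed constant:

```typescript
// Illustrative only: a typed list of the LLM Router model IDs shown above.
// The `as const` assertion lets TypeScript derive a string-literal union,
// so a mistyped model ID fails at compile time rather than at request time.
export const LLM_ROUTER_MODELS = [
  "anthropic/claude-opus-4.6",
  "anthropic/claude-sonnet-4.6",
  "openai/gpt-5.4",
  "google/gemini-3.1-pro",
  "xai/grok-4.20",
] as const;

export type LlmRouterModel = (typeof LLM_ROUTER_MODELS)[number];

// Runtime guard for model IDs arriving from user input or config files.
export function isRouterModel(id: string): id is LlmRouterModel {
  return (LLM_ROUTER_MODELS as readonly string[]).includes(id);
}
```

You could then type the argument to your provider call as `LlmRouterModel`, e.g. `llmRouter(model satisfies LlmRouterModel)`, to catch typos early.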

Advanced Features Available

Through providerOptions.openai.gateway, you can leverage:
  • Intelligent Tag Routing
  • Automatic Skill Injection (enableAutoSearch)
  • Zero Data Retention (ZDR)
  • Context Pruning & Compression
  • PII Redaction
  • Custom Model Routing Rules

Using with React Hooks (useChat)

import { useChat } from "ai/react";

export default function Chat() {
  // The model is selected server-side in the /api/chat route handler,
  // so the client only needs to point at that endpoint.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat",
  });

  // ... rest of your component
}

Pro Tip: Configure default routing rules, tags, and Skills directly in the LLM Router Dashboard. This way, even if you don’t pass gateway options in every request, LLM Router will still apply your optimized settings.