
Open WebUI & LLM Router

Open WebUI is an incredibly popular, extensible, and user-friendly self-hosted web interface for interacting with Large Language Models. Because web-based chat interfaces encourage long, meandering conversations, token usage balloons and threads quickly become expensive. By connecting Open WebUI to LLM Router, you automatically apply Chat History Optimization to your long threads, dropping irrelevant older messages and substantially reducing your input-token spend.

Step 1: Create an LLM Router API Key

Before configuring Open WebUI, you need an API key from LLM Router:
  1. Log into the LLM Router Dashboard.
  2. Navigate to the API Keys section.
  3. Generate a new key and copy it (e.g., sk-router-...).
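Before wiring the key into Open WebUI, you can sanity-check it directly against the router's OpenAI-compatible API. A minimal sketch using only the Python standard library (the base URL matches the one configured in Step 2; the key shown is a placeholder for your own):

```python
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET /models request for an OpenAI-compatible endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request("https://api.llmrouter.app/v1", "sk-router-...")
# Sending it with urllib.request.urlopen(req) should return a JSON list of
# available models if the key is valid (the request is only built here, not sent).
print(req.full_url)  # https://api.llmrouter.app/v1/models
```

If this request returns an HTTP 401 when sent, regenerate the key in the dashboard before continuing.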

Step 2: Configure LLM Router in Open WebUI

Open WebUI natively supports adding multiple OpenAI-compatible API endpoints. We will add LLM Router as a new connection.
1. Access Admin Settings

Log into your Open WebUI instance as an administrator. Click on your Profile Icon in the bottom left corner, then navigate to Admin Settings.
2. Navigate to Connections

In the Admin Settings menu, click on the Connections tab. This is where you configure external API providers.
3. Add LLM Router Connection

Under the OpenAI API section, click the + (Add Connection) button and enter the following details:
  - Base URL: https://api.llmrouter.app/v1
  - API Key: Paste your sk-router-... key here.
4. Verify Connection

Click the Verify button next to your new connection. If successful, Open WebUI will fetch the list of all available models supported by LLM Router. Click Save to apply the configuration.
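Once saved, every chat Open WebUI routes through this connection is a standard OpenAI-style chat completion sent to the router. A hedged stdlib sketch of what that request looks like (the model name is a placeholder; use one from the list Open WebUI fetched during Verify, and the request is only constructed, not sent):

```python
import json
import urllib.request

# Placeholder model name; substitute a model returned by the Verify step.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "https://api.llmrouter.app/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer sk-router-...",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would return an OpenAI-style completion response;
# with Chat History Optimization, the router may prune older messages from
# long threads before forwarding them upstream.
```

Because the connection is OpenAI-compatible, any client that can target a custom base URL can reuse the same key and endpoint outside Open WebUI.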