# Migrate from OpenRouter
LangRouter works just like OpenRouter, with the same API format and model names, plus built-in analytics. Migration is a two-line code change.
## Quick Migration
Change your base URL and API key:
```diff
- const baseURL = "https://openrouter.ai/api/v1";
- const apiKey = process.env.OPENROUTER_API_KEY;
+ const baseURL = "https://api.langrouter.ai/v1";
+ const apiKey = process.env.LANGROUTER_API_KEY;
```

## Migration Steps
### Get Your LangRouter API Key
Sign up at langrouter.ai/signup and create an API key from your dashboard.
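Before changing application code, you can sanity-check the new key with a single request. The sketch below builds the same OpenAI-compatible chat-completions request used throughout this guide; the `buildVerifyRequest` helper name is ours, not part of any SDK:

```typescript
// Sketch: construct a minimal request to verify a LangRouter API key.
// Endpoint and payload shape match the fetch example later in this guide.
function buildVerifyRequest(apiKey: string) {
  return {
    url: "https://api.langrouter.ai/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "gpt-5.2", // any model from the models page works here
        messages: [{ role: "user", content: "ping" }],
        max_tokens: 1, // keep the test call as cheap as possible
      }),
    },
  };
}

// Send it with fetch; a 401 response means the key is wrong or missing:
// const { url, init } = buildVerifyRequest(process.env.LANGROUTER_API_KEY!);
// const res = await fetch(url, init); // res.ok is true when the key is valid
```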
### Update Environment Variables
```bash
# Remove OpenRouter credentials
# OPENROUTER_API_KEY=sk-or-...

# Add LangRouter credentials
LANGROUTER_API_KEY=lgrouter_your_key_here
```

### Update Your Code
#### Using fetch/axios
```typescript
// Before (OpenRouter)
const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-5.2",
    messages: [{ role: "user", content: "Hello!" }],
  }),
});
```

```typescript
// After (LangRouter)
const response = await fetch("https://api.langrouter.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.LANGROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-5.2",
    messages: [{ role: "user", content: "Hello!" }],
  }),
});
```

#### Using the OpenAI SDK
```typescript
import OpenAI from "openai";

// Before (OpenRouter)
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});
```

```typescript
import OpenAI from "openai";

// After (LangRouter)
const client = new OpenAI({
  baseURL: "https://api.langrouter.ai/v1",
  apiKey: process.env.LANGROUTER_API_KEY,
});

// Usage remains the same
const completion = await client.chat.completions.create({
  model: "anthropic/claude-3-5-sonnet-20241022",
  messages: [{ role: "user", content: "Hello!" }],
});
```

#### Using the Vercel AI SDK
Both OpenRouter and LangRouter have native AI SDK providers, making migration straightforward:
```typescript
import { generateText } from "ai";
// Before (OpenRouter AI SDK provider)
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const { text } = await generateText({
  model: openrouter("gpt-5.2"),
  prompt: "Hello!",
});
```

```typescript
import { generateText } from "ai";
// After (LangRouter AI SDK provider)
import { createLangRouter } from "@langrouter/ai-sdk-provider";

const langrouter = createLangRouter({
  apiKey: process.env.LANGROUTER_API_KEY,
});

const { text } = await generateText({
  model: langrouter("gpt-5.2"),
  prompt: "Hello!",
});
```

## Model Name Mapping
Most model names are compatible, but here are some common mappings:
| OpenRouter Model | LangRouter Model |
|---|---|
| `openai/gpt-5.2` | `gpt-5.2` or `openai/gpt-5.2` |
| `gemini/gemini-3-flash-preview` | `gemini-3-flash-preview` or `google-ai-studio/gemini-3-flash-preview` |
| `bedrock/claude-opus-4-5-20251101` | `claude-opus-4-5-20251101` or `aws-bedrock/claude-opus-4-5-20251101` |
Check the models page for the full list of available models.
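If you route many model names programmatically, the mappings in the table above can be expressed as a small helper. This is an illustrative sketch only: `toLangRouterModel` is our name, the prefix map covers just the providers listed above, and unknown prefixes fall back to the bare model name (which LangRouter accepts for most models):

```typescript
// Sketch: remap OpenRouter-style model IDs to LangRouter equivalents,
// following the mapping table above. Not an exhaustive provider list.
const PROVIDER_PREFIX_MAP: Record<string, string> = {
  openai: "openai",
  gemini: "google-ai-studio",
  bedrock: "aws-bedrock",
};

function toLangRouterModel(openRouterModel: string): string {
  const slash = openRouterModel.indexOf("/");
  if (slash === -1) return openRouterModel; // already a bare model name
  const prefix = openRouterModel.slice(0, slash);
  const model = openRouterModel.slice(slash + 1);
  const mapped = PROVIDER_PREFIX_MAP[prefix];
  // Unknown prefix: fall back to the bare model name.
  return mapped ? `${mapped}/${model}` : model;
}

// Example: toLangRouterModel("bedrock/claude-opus-4-5-20251101")
// returns "aws-bedrock/claude-opus-4-5-20251101"
```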
## Streaming Support
LangRouter supports streaming responses identically to OpenRouter:
```typescript
const stream = await client.chat.completions.create({
  model: "anthropic/claude-3-5-sonnet-20241022",
  messages: [{ role: "user", content: "Write a story" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}
```

## Need Help?
- Browse available models at langrouter.ai/models
- Read the API documentation
- Contact support at [email protected]