Migrate from Vercel AI Gateway
Keep your Vercel AI SDK code, add response caching, detailed analytics, and smart routing. One provider for all models.
Quick Migration
Swap your provider imports; your AI SDK code stays the same:

```diff
- import { openai } from "@ai-sdk/openai";
- import { anthropic } from "@ai-sdk/anthropic";
+ import { createLangRouter } from "@langrouter/ai-sdk-provider";
  import { generateText } from "ai";

+ const langrouter = createLangRouter({
+   apiKey: process.env.LANGROUTER_API_KEY,
+ });

  const { text } = await generateText({
-   model: openai("gpt-5.2"),
+   model: langrouter("gpt-5.2"),
    prompt: "Hello!",
  });
```

The key difference: one provider, one API key, all models, with caching and analytics built in.
Migration Steps
Get Your LangRouter API Key
Sign up at langrouter.ai/signup and create an API key from your dashboard.
Install the LangRouter AI SDK Provider
Install the native LangRouter provider for the Vercel AI SDK:
```shell
pnpm add @langrouter/ai-sdk-provider
```

This package provides full compatibility with the Vercel AI SDK and supports all LangRouter features.
Update Your Code
Basic Text Generation
```typescript
// Before (Vercel AI Gateway with native providers)
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { generateText } from "ai";

const { text: openaiText } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello!",
});

const { text: claudeText } = await generateText({
  model: anthropic("claude-3-5-sonnet-20241022"),
  prompt: "Hello!",
});
```

```typescript
// After (LangRouter: a single provider for all models)
import { createLangRouter } from "@langrouter/ai-sdk-provider";
import { generateText } from "ai";

const langrouter = createLangRouter({
  apiKey: process.env.LANGROUTER_API_KEY,
});

const { text: openaiText } = await generateText({
  model: langrouter("openai/gpt-4o"),
  prompt: "Hello!",
});

const { text: claudeText } = await generateText({
  model: langrouter("anthropic/claude-3-5-sonnet-20241022"),
  prompt: "Hello!",
});
```

Streaming Responses
```typescript
import { createLangRouter } from "@langrouter/ai-sdk-provider";
import { streamText } from "ai";

const langrouter = createLangRouter({
  apiKey: process.env.LANGROUTER_API_KEY,
});

const { textStream } = await streamText({
  model: langrouter("anthropic/claude-3-5-sonnet-20241022"),
  prompt: "Write a poem about coding",
});

for await (const text of textStream) {
  process.stdout.write(text);
}
```

Using in Next.js API Routes
```typescript
// app/api/chat/route.ts
import { createLangRouter } from "@langrouter/ai-sdk-provider";
import { streamText } from "ai";

const langrouter = createLangRouter({
  apiKey: process.env.LANGROUTER_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: langrouter("openai/gpt-4o"),
    messages,
  });

  return result.toDataStreamResponse();
}
```

Alternative: Using OpenAI SDK Adapter
If you prefer not to install a new package, you can use @ai-sdk/openai with a custom base URL:
```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const langrouter = createOpenAI({
  baseURL: "https://api.langrouter.ai/v1",
  apiKey: process.env.LANGROUTER_API_KEY,
});

const { text } = await generateText({
  model: langrouter("openai/gpt-4o"),
  prompt: "Hello!",
});
```

Update Environment Variables
```shell
# Remove individual provider keys (optional: you can keep them as a backup)
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...

# Add your LangRouter key
export LANGROUTER_API_KEY=lgrouter_your_key_here
```

Model Name Format
LangRouter supports two model ID formats:
Root Model IDs (without a provider prefix) use smart routing to automatically select the best provider based on uptime, throughput, price, and latency:

```
gpt-4o
claude-3-5-sonnet-20241022
gemini-1.5-pro
```

Provider-Prefixed Model IDs route to a specific provider, with automatic failover if that provider's uptime drops below 90%:

```
openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
google-ai-studio/gemini-1.5-pro
```

For more details on routing behavior, see the routing documentation.
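To make the distinction concrete, here is a small sketch (a hypothetical helper, not part of the LangRouter SDK) that splits a model ID into the parts the router acts on:

```typescript
// Hypothetical helper (not part of the LangRouter SDK): split a model ID
// into an optional provider prefix and the root model name.
function parseModelId(id: string): { provider: string | null; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) {
    // Root model ID: smart routing selects the provider.
    return { provider: null, model: id };
  }
  // Provider-prefixed ID: routes to that provider, with failover.
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

console.log(parseModelId("gpt-4o")); // { provider: null, model: 'gpt-4o' }
console.log(parseModelId("openai/gpt-4o")); // { provider: 'openai', model: 'gpt-4o' }
```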
Model Mapping Examples
| Vercel AI SDK | LangRouter |
|---|---|
| `openai("gpt-4o")` | `langrouter("gpt-4o")` or `langrouter("openai/gpt-4o")` |
| `anthropic("claude-3-5-sonnet-20241022")` | `langrouter("claude-3-5-sonnet-20241022")` or `langrouter("anthropic/claude-3-5-sonnet-20241022")` |
| `google("gemini-1.5-pro")` | `langrouter("gemini-1.5-pro")` or `langrouter("google-ai-studio/gemini-1.5-pro")` |
Check the models page for the full list of available models.
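If you are migrating many model references, the table above can be mechanized. This helper is a hypothetical sketch (not part of either SDK); the provider slugs it uses are the ones shown in the mapping table, including the `google` to `google-ai-studio` rename:

```typescript
// Hypothetical mapping helper: prefix a native provider's model name with
// its LangRouter provider slug, following the mapping table above.
const providerSlugs: Record<string, string> = {
  openai: "openai",
  anthropic: "anthropic",
  google: "google-ai-studio", // "google" becomes the "google-ai-studio" slug
};

function toLangRouterId(provider: string, model: string): string {
  const slug = providerSlugs[provider];
  if (!slug) throw new Error(`Unknown provider: ${provider}`);
  return `${slug}/${model}`;
}

console.log(toLangRouterId("google", "gemini-1.5-pro")); // google-ai-studio/gemini-1.5-pro
```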
Tool Calling
LangRouter supports tool calling through the AI SDK:
```typescript
import { createLangRouter } from "@langrouter/ai-sdk-provider";
import { generateText, tool } from "ai";
import { z } from "zod";

const langrouter = createLangRouter({
  apiKey: process.env.LANGROUTER_API_KEY,
});

const { text, toolResults } = await generateText({
  model: langrouter("openai/gpt-4o"),
  tools: {
    weather: tool({
      description: "Get the weather for a location",
      parameters: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => {
        // Stub implementation; replace with a real weather lookup.
        return { temperature: 72, condition: "sunny" };
      },
    }),
  },
  prompt: "What's the weather in San Francisco?",
});
```

Need Help?
- Browse available models at langrouter.ai/models
- Read the API documentation
- Contact support at [email protected]