Environment-variable-driven provider for the Vercel AI SDK. Switch AI providers and models without code changes.
```bash
pnpm add ai-sdk-provider-env
```

```bash
# .env
OPENAI_API_KEY=sk-xxx
```

```ts
import { generateText } from 'ai'
import { envProvider } from 'ai-sdk-provider-env'

const provider = envProvider()
const { text } = await generateText({
  model: provider.languageModel('openai/gpt-4o'),
  prompt: 'Hello!',
})
```

The config set name `openai` auto-matches the built-in preset — only an API key is needed.
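Because the model ID is a plain string, one pattern (an assumption here, not part of the library's API) is to read it from an env var as well, so both the provider and the model can change with no code edits:

```typescript
// Hypothetical helper: read a "{configSet}/{modelId}" string from an
// AI_MODEL env var (a name chosen for this sketch), with a default fallback.
function modelFromEnv(
  env: Record<string, string | undefined> = process.env,
): string {
  return env.AI_MODEL ?? 'openai/gpt-4o'
}

// Usage: model: provider.languageModel(modelFromEnv())
```

Setting `AI_MODEL=smart/gpt-4o` in `.env` then redirects every call without touching the source.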
Any valid env var prefix becomes a config set. Two endpoints, zero code changes:

```bash
# .env
FAST_BASE_URL=https://fast-api.example.com/v1
FAST_API_KEY=key-fast
SMART_BASE_URL=https://smart-api.example.com/v1
SMART_API_KEY=key-smart
```

```ts
provider.languageModel('fast/llama-3-8b')
provider.languageModel('smart/gpt-4o')
```

Model ID format: `{configSet}/{modelId}`. The config set maps to an uppercased env var prefix.
Config set names must match `[A-Za-z_][A-Za-z0-9_-]*` — ASCII letters, digits, underscores, and hyphens. Hyphens are automatically normalized to underscores for env var lookup:

```bash
# Config set "my-api" → reads MY_API_* env vars
MY_API_BASE_URL=https://api.example.com/v1
MY_API_API_KEY=sk-xxx
```

```ts
provider.languageModel('my-api/some-model') // reads MY_API_* env vars
provider.languageModel('my_api/some-model') // same env vars
```

For config set names outside these rules (e.g. Unicode, dots), use the `configs` option instead.
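The normalization above can be sketched in a few lines. This is an illustration of the documented mapping (uppercase the name, turn hyphens into underscores), not the library's actual internals:

```typescript
// Map a config set name to its env var prefix, per the documented rules:
// uppercase, hyphens normalized to underscores.
function envPrefix(configSet: string): string {
  return configSet.replace(/-/g, '_').toUpperCase()
}

// envPrefix('my-api') and envPrefix('my_api') both yield 'MY_API',
// which is why both model ID spellings read the same MY_API_* vars.
```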
| Variable | Required | Description |
|---|---|---|
| `{PREFIX}_API_KEY` | Yes | API key |
| `{PREFIX}_BASE_URL` | Unless preset matches | API base URL |
| `{PREFIX}_PRESET` | No | Built-in preset name (e.g. `openai`) |
| `{PREFIX}_COMPATIBLE` | No | `openai` · `anthropic` · `gemini` · `openai-compatible` (default) |
| `{PREFIX}_HEADERS` | No | Custom HTTP headers (JSON) |
| `{PREFIX}_NATIVE_ROUTING` | No | Enable/disable native model routing (`true`/`false`) |
When `_PRESET` is set or auto-detected, `_BASE_URL` and `_COMPATIBLE` fall back to preset defaults.
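The `{PREFIX}_HEADERS` value is a JSON string. As a minimal sketch (the accepted shape — a flat string-to-string object — is an assumption here, not confirmed library behavior), such a value could be parsed like this:

```typescript
// Parse a JSON headers value such as '{"X-Org":"acme"}' into a header map.
// Missing/empty values yield no extra headers.
function parseEnvHeaders(raw: string | undefined): Record<string, string> {
  if (!raw) return {}
  const parsed: unknown = JSON.parse(raw)
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    throw new Error('_HEADERS must be a JSON object')
  }
  return parsed as Record<string, string>
}
```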
| Value | SDK | Fallback |
|---|---|---|
| `openai` | `@ai-sdk/openai` | `@ai-sdk/openai-compatible` if not installed |
| `anthropic` | `@ai-sdk/anthropic` | None |
| `gemini` | `@ai-sdk/google` | None |
| `openai-compatible` | `@ai-sdk/openai-compatible` (default) | — |
Install provider SDKs as needed: `pnpm add @ai-sdk/openai @ai-sdk/anthropic @ai-sdk/google`

When the config set name matches a preset, it auto-applies — only `_API_KEY` is needed:
```bash
DEEPSEEK_API_KEY=sk-xxx  # config set "deepseek" matches the preset
```

```ts
provider.languageModel('deepseek/deepseek-chat') // just works
```

| Preset | Base URL | Compatible |
|---|---|---|
| `openai` | `https://api.openai.com/v1` | `openai` |
| `anthropic` | `https://api.anthropic.com` | `anthropic` |
| `google` | `https://generativelanguage.googleapis.com/v1beta` | `gemini` |
| `opencode-zen` | `https://opencode.ai/zen/v1` | `openai-compatible` (nativeRouting enabled) |
| `opencode-go` | `https://opencode.ai/zen/go/v1` | `openai-compatible` |
| `deepseek` | `https://api.deepseek.com` | `openai-compatible` |
| `groq` | `https://api.groq.com/openai/v1` | `openai-compatible` |
| `together` | `https://api.together.xyz/v1` | `openai-compatible` |
| `fireworks` | `https://api.fireworks.ai/inference/v1` | `openai-compatible` |
| `mistral` | `https://api.mistral.ai/v1` | `openai-compatible` |
| `moonshot` | `https://api.moonshot.ai/v1` | `openai-compatible` |
| `moonshot-china` | `https://api.moonshot.cn/v1` | `openai-compatible` |
| `perplexity` | `https://api.perplexity.ai` | `openai-compatible` |
| `openrouter` | `https://openrouter.ai/api/v1` | `openai-compatible` |
| `siliconflow` | `https://api.siliconflow.com/v1` | `openai-compatible` |
| `siliconflow-china` | `https://api.siliconflow.cn/v1` | `openai-compatible` |
| `xai` | `https://api.x.ai/v1` | `openai-compatible` |
| `zai` | `https://api.z.ai/api/paas/v4` | `openai-compatible` |
| `zhipu` | `https://open.bigmodel.cn/api/paas/v4` | `openai-compatible` |
To disable auto-detection: `envProvider({ presetAutoDetect: false })`. See Advanced Usage for details.
When a config set uses a gateway that exposes multiple AI providers (like `opencode-zen`), nativeRouting auto-detects the model family from the model ID prefix and routes it to the appropriate native SDK:

- `claude-*` → `@ai-sdk/anthropic`
- `gemini-*` → `@ai-sdk/google`
- `gpt-*` → `@ai-sdk/openai`
- Other models fall back to the config set's default `compatible` mode
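The routing table above can be sketched as a simple prefix check. This is an illustration of the documented rules, not the library's actual implementation:

```typescript
type Route = 'anthropic' | 'google' | 'openai' | 'openai-compatible'

// Pick a native SDK from the model ID prefix, falling back to the
// config set's default compatible mode for everything else.
function routeModel(modelId: string, fallback: Route = 'openai-compatible'): Route {
  if (modelId.startsWith('claude-')) return 'anthropic'
  if (modelId.startsWith('gemini-')) return 'google'
  if (modelId.startsWith('gpt-')) return 'openai'
  return fallback
}
```

Note how this also explains the known limitation below: `o1-*` and similar OpenAI model IDs carry no `gpt-` prefix, so a purely prefix-based check cannot route them.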
```bash
# opencode-zen preset has nativeRouting enabled — just set API key
OPENCODE_ZEN_API_KEY=zen-xxx
```

```ts
// Routes to native Anthropic SDK automatically
provider.languageModel('opencode-zen/claude-sonnet-4-20250514')
// Routes to native Google SDK automatically
provider.languageModel('opencode-zen/gemini-3-flash')
// Other models use openai-compatible
provider.languageModel('opencode-zen/minimax-m2.5')
```

To disable: `OPENCODE_ZEN_NATIVE_ROUTING=false`
Known limitation: `o1-*`, `o3-*`, `chatgpt-*` models are not automatically routed. Use `{PREFIX}_COMPATIBLE=openai` explicitly for these.
- API Reference — `envProvider()` options, types, model ID format
- Advanced Usage — Code-based configs, custom fetch/headers, custom separator, provider registry
- Bundler Usage — For `bun build`, `vite build`, and other bundlers