An opencode provider plugin that exposes Trae models through Trae raw-chat SSE or an OpenAI-compatible HTTP endpoint.
It is intended to make Trae act as an LLM backend for opencode. For coding workflows, use OAuth/PAT plus a provider base URL; the legacy `traecli` subprocess path is disabled by default.
- Adds a `trae` provider to opencode.
- Prefers direct Trae raw-chat SSE streaming when `pat` is configured.
- Supports generic OpenAI-compatible streaming when `openaiBaseURL` and `openaiApiKey` are configured.
- Disables the legacy `traecli` fallback by default to avoid CLI-internal tool restrictions leaking into OpenCode.
- Exposes common Trae cloud models, including `GLM-5.1`, `Doubao-Seed-2.0-Code`, `DeepSeek-V3.2`, `Qwen3-Coder-Next`, and more.
- Reads the current model from `~/.trae/trae_cli.yaml` / `~/.trae/traecli.yaml` when available.
- Provides stable text-first generation and an optional experimental tool-call bridge for coding workflows.
- Supports local `file://` plugin installs for development.
- Node.js >= 20
- opencode >= 1.14
- A Trae enterprise/PAT token or an OpenAI-compatible endpoint token
From npm:

```sh
opencode plugin opencode-trae-cli-auth
```

For local development from this repository:

```sh
npm install
npm run build
opencode plugin file:///absolute/path/to/opencode-trae-cli-auth/dist/index.js
```

You can also add it manually to an opencode config:
```json
{
  "plugin": [
    "opencode-trae-cli-auth"
  ],
  "model": "trae/GLM-5.1"
}
```

Local file example:
```json
{
  "plugin": [
    "file:///Users/you/dev/opencode-trae-cli-auth/dist/index.js"
  ],
  "model": "trae/GLM-5.1"
}
```

List models after installing:
```sh
opencode models trae
```

Built-in model ids currently include:

- `trae/Doubao-Seed-Code`
- `trae/GLM-5.1`
- `trae/MiniMax-M2.7`
- `trae/Kimi-K2.6`
- `trae/DeepSeek-V4-Pro`

Run a quick end-to-end check:

```sh
opencode run --agent build --model trae/GLM-5.1 "reply with 'ok'"
```

This package can use these backend transports:
- Direct Trae raw-chat HTTP: enabled by `pat`; posts to Trae enterprise raw chat and consumes real SSE output.
- Direct OpenAI-compatible HTTP: enabled by `openaiBaseURL` + `openaiApiKey`; supports SSE text streaming and OpenAI-style streamed `tool_calls`.
- Legacy Trae CLI fallback: disabled by default. Enable it only with `allowCliFallback: true` for debugging or migration.
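Taken together, the preference order is simple to state in code. The following is an illustrative sketch, not the plugin's actual source; `pickTransport` and the trimmed `Options` type are hypothetical names, though the option fields mirror `TraePluginOptions` below.

```ts
// Illustrative sketch of the transport preference order; not the plugin's real code.
type Options = {
  pat?: string
  openaiBaseURL?: string
  openaiApiKey?: string
  allowCliFallback?: boolean
}

type Transport = "trae-raw-chat" | "openai-compatible" | "cli-fallback"

function pickTransport(o: Options): Transport | null {
  if (o.pat) return "trae-raw-chat" // 1. direct Trae raw-chat SSE
  if (o.openaiBaseURL && o.openaiApiKey) return "openai-compatible" // 2. OpenAI-compatible HTTP
  if (o.allowCliFallback) return "cli-fallback" // 3. legacy traecli subprocess, opt-in only
  return null // nothing usable is configured
}
```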
The provider uses a single model-first configuration path. Users select a model directly, and behavior changes only through explicit options.
Tool execution still belongs to the OpenCode runtime (permissions, sandbox, command execution); Trae CLI is not used as a standalone tool runtime.
When loading the plugin programmatically, the intended user-facing options are:
```ts
type TraePluginOptions = {
  pat?: string
  openaiBaseURL?: string
  openaiApiKey?: string
  modelName?: string
  enableToolCalling?: boolean
  allowCliFallback?: boolean
  cliPath?: string
}
```

- `pat`: explicit Trae PAT/OAuth token for the direct raw-chat transport. It is intentionally read only from `provider.trae.options.pat`, never from environment variables.
- `openaiBaseURL`: optional OpenAI-compatible base URL. When set together with `openaiApiKey`, this transport is used before the CLI fallback.
- `openaiApiKey`: bearer token for the OpenAI-compatible endpoint.
- `modelName`: force a Trae `model.name` regardless of the opencode model id. Leave unset to use the selected opencode model id directly.
- `enableToolCalling`: defaults to `true`; when `true`, the provider forwards Trae `function` tool calls to OpenCode.
- `allowCliFallback`: defaults to `false`. Keep it false for real OpenCode usage; set it to true only to debug the legacy `traecli` subprocess path.
- `cliPath`: legacy only; overrides the `traecli` binary path when `allowCliFallback=true`.
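For example, a direct raw-chat setup could be expressed as the options object below. This is a sketch only: whether the package actually exports the `TraePluginOptions` type is an assumption, and the token value is a placeholder.

```ts
// Sketch: assumes the type above is exported by the package; the token is a placeholder.
import type { TraePluginOptions } from "opencode-trae-cli-auth"

const options: TraePluginOptions = {
  pat: "your-token",       // explicit by design; the plugin never reads token env vars
  modelName: "GLM-5.1",    // optional: pin the upstream Trae model.name
  enableToolCalling: true, // default
  allowCliFallback: false, // default; keep false outside of debugging
}
```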
- Tool execution depends on OpenCode runtime permissions and sandbox policy.
- Experimental mode: `enableToolCalling=true` supports forwarding streamed function tool calls observed from direct HTTP transports.
- In experimental tool-calling mode, common tool input aliases are normalized (`file_path` -> `filePath`, `old_string`/`new_string` -> `oldString`/`newString`, etc.); see the sketch after this list.
- Usage/token counts may be zero when the upstream transport does not emit usage metadata.
- Legacy CLI mode can leak `traecli` internal tool restrictions into model behavior and is not recommended for coding agents.
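The normalization mentioned above amounts to a key-renaming pass over the tool input. A minimal sketch, assuming a flat string-keyed input object and covering only the aliases named above (the real table may be longer):

```ts
// Sketch of tool-input alias normalization; the alias table here lists only the
// examples named above, so the real implementation may cover more keys.
const ALIASES: Record<string, string> = {
  file_path: "filePath",
  old_string: "oldString",
  new_string: "newString",
}

function normalizeToolInput(input: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(input)) {
    out[ALIASES[key] ?? key] = value // rename known aliases, pass everything else through
  }
  return out
}
```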
Development workflow:

```sh
bun install
bun run test
bun run build
bun pm pack --dry-run
```

Smoke check a local Trae CLI and OpenCode install:

```sh
traecli "reply with ok" -p --json
opencode run --agent build --model trae/GLM-5.1 "reply with ok"
```

Recommended local config examples:
Direct Trae raw-chat transport:
```json
{
  "provider": {
    "trae": {
      "options": {
        "pat": "your-token"
      }
    }
  },
  "model": "trae/Kimi-K2.6"
}
```

`pat` is explicit by design. The plugin does not read `TRAE_RAW_API_KEY`, `TRAECLI_PERSONAL_ACCESS_TOKEN`, or other token environment variables.
Direct OpenAI-compatible transport:
```json
{
  "provider": {
    "trae": {
      "options": {
        "openaiBaseURL": "https://your-enterprise-openai-compatible-host/v1",
        "openaiApiKey": "your-token"
      }
    }
  },
  "model": "trae/DeepSeek-V4-Pro"
}
```

OpenAI-compatible credentials are also explicit config only; the plugin does not read token environment variables.
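For reference, consuming an OpenAI-compatible stream boils down to reading `data:` SSE lines and concatenating `choices[0].delta.content`. A minimal sketch, assuming Node 20's built-in `fetch`, with error handling and `tool_calls` deltas omitted; the model name is a placeholder:

```ts
// Minimal sketch: stream text deltas from an OpenAI-compatible endpoint.
// Assumes Node 20 fetch/ReadableStream; error handling and tool_calls deltas omitted.
async function streamText(baseURL: string, apiKey: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseURL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model: "GLM-5.1", // placeholder model name
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  })
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let buffer = ""
  let text = ""
  for (;;) {
    const { done, value } = await reader.read()
    if (done || !value) break
    buffer += decoder.decode(value, { stream: true })
    const lines = buffer.split("\n")
    buffer = lines.pop() ?? "" // keep any trailing partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith("data:")) continue
      const payload = line.slice(5).trim()
      if (payload === "[DONE]") continue
      const delta = JSON.parse(payload).choices?.[0]?.delta
      if (delta?.content) text += delta.content // concatenate streamed text deltas
    }
  }
  return text
}
```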
Legacy CLI debug example:
```json
{
  "provider": {
    "trae": {
      "options": {
        "allowCliFallback": true
      }
    }
  },
  "model": "trae/GLM-5.1"
}
```

Optional soak test (success rate + latency summary):
```sh
bun run soak -- --model trae/GLM-5.1 --runs 12 --concurrency 3
```

Tool-calling smoke (reports whether tool-call events are observed):

```sh
bun run smoke:tools -- --model trae/GLM-5.1
```

Strict mode (non-zero exit if no tool-call event):

```sh
bun run smoke:tools -- --model trae/GLM-5.1 --strict
```

OpenCode read/write tool smoke against the real provider path:

```sh
bun run smoke:opencode:rw -- --model trae/GLM-5.1
```

This creates an isolated temporary workspace and verifies real OpenCode Bash, Read, Glob, Grep, Write, and Edit execution through `opencode run --agent build --format json`.

Include a heavier project-scaffolding case when you want to verify real coding setup behavior:

```sh
bun run smoke:opencode:rw -- --model trae/GLM-5.1 --includeScaffold
```

Overnight agentic run (prompt-file driven, no built-in demo prompts):
```sh
bun run overnight -- \
  --model trae/GLM-5.1 \
  --hours 8 \
  --concurrency 2 \
  --timeoutMs 180000 \
  --promptsFile /absolute/path/to/prompts.txt
```

`prompts.txt` format: one real task prompt per line; lines starting with `#` are ignored.

Results are written to `artifacts/overnight/*.jsonl` plus a `*.summary.json`.

`--maxRuns` is optional; omit it for true overnight runs, and use it only for short verification runs.
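A hypothetical `prompts.txt` in this format (the tasks themselves are invented examples):

```
# overnight coding tasks; lines starting with # are ignored
Refactor the retry helper to use exponential backoff and update its tests.
Add input validation to the config loader and document the new errors.
```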
This repository includes `scripts/trae-cli-env.sh`, a sourceable shell helper for machines where Trae CLI is already installed.
Usage:
```sh
PAT=<your-token> source scripts/trae-cli-env.sh
trae-cli --print "say hello"
```

The helper creates the minimal Trae config files only when missing:

- `~/.trae/traecli.yaml`
- `~/.trae/trae_cli.yaml`

It exports `TRAECLI_PERSONAL_ACCESS_TOKEN` and `SEC_TOKEN_PATH`, and adds the detected `trae-cli` binary directory to `PATH`.
MIT