Replies: 3 comments
@mistercrunch here, answering from Agor.

TL;DR: Yes! There are two viable approaches to supporting local Ollama. Both achieve the same goal (local LLM support) via different technical paths:

Approach 1: OpenCode + Ollama ✅

OpenCode DOES support Ollama locally! Your semi-functional OpenCode integration can work.

How it works:
Setup:

```shell
# 1. Fix Ollama's context window (CRITICAL - the default 4k breaks tool calling!)
ollama pull qwen2.5-coder:32b
ollama run qwen2.5-coder:32b
>>> /set parameter num_ctx 32768
>>> /save qwen2.5-coder:32b-32k
>>> /bye
```
2. Configure OpenCode (~/.config/opencode/config.json):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen2.5-coder:32b-32k": {
          "tools": true
        }
      }
    }
  }
}
```
3. Start the OpenCode server:

```shell
opencode serve --port 4096
```

Effort: ~1 week to polish the existing integration

Pros:
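Before wiring OpenCode to Ollama, it's worth a quick sanity check that Ollama's OpenAI-compatible endpoint is actually serving the saved model (a sketch; adjust the model tag if you saved it under a different name):

```shell
# List the models Ollama exposes on its OpenAI-compatible API
curl -s http://localhost:11434/v1/models

# Smoke-test a chat completion against the 32k-context variant
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen2.5-coder:32b-32k",
        "messages": [{"role": "user", "content": "Say hi"}]
      }'
```

If the second call returns a completion, OpenCode's `@ai-sdk/openai-compatible` provider should be able to talk to the same `baseURL`.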
Cons:
Approach 2: Claude Code Router via Env Vars ✅

Agor ALREADY has the infrastructure! I found that Agor already:
How it works:
Setup:

```shell
# Terminal 1: Start Ollama
ollama serve
ollama pull qwen2.5-coder:32b
```

In the Agor UI: User Settings → Environment Variables

```shell
ANTHROPIC_BASE_URL=http://localhost:11434/v1
ANTHROPIC_AUTH_TOKEN=dummy
# Done! Claude sessions route to Ollama
```

Effort: ~3-5 days (UI polish + docs)

Pros:
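A quick way to confirm the override target is reachable before starting a Claude session (a sketch; Ollama ignores the Authorization header, the token just needs to be non-empty for clients that require one):

```shell
# Verify the base URL that ANTHROPIC_BASE_URL will point at is up
curl -s http://localhost:11434/v1/models \
  -H "Authorization: Bearer dummy"
```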
Cons:
Recommended Path

Ship Approach 2 first (env vars) for quick wins, then enhance Approach 1 (OpenCode) for power users.

Best Local Models (2025):
Critical Setup Tip: Always increase Ollama's context window:

```shell
ollama run <model>
>>> /set parameter num_ctx 32768  # Default 4k breaks tool calling!
>>> /save <model>-32k
```

Question: Do you want:
My vote: C 🚀
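As a non-interactive alternative to the `/set` + `/save` flow for the context-window fix, a Modelfile can bake in `num_ctx` (a sketch; the model tag is an example, swap in whichever model you use):

```shell
# Write a Modelfile that derives a 32k-context variant from the base model
cat > Modelfile <<'EOF'
FROM qwen2.5-coder:32b
PARAMETER num_ctx 32768
EOF

# Register the variant; it then appears in `ollama list` like any other model
ollama create qwen2.5-coder:32b-32k -f Modelfile
```

This is easier to script into setup docs or a bootstrap script than the interactive REPL commands.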
OpenCode support is super early/experimental, but I want to make it super well supported. It opens up TONS of cool models, and it's open source, which is super great. Hoping to bring the integration to perfection (or as close as the SDK will allow) over the next week or two.
Just wanna hear your thoughts 😊