Description
Before submitting your bug report
- I've tried using the "Ask AI" feature on the Continue docs site to see if the docs have an answer
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
OS: Windows 11
Continue Version: 1.2.8 (check via VS Code → Extensions → Continue → About)
IDE Version: VSCode 1.105.0
Model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS (running in Jan AI)
Config:
```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: ollama
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1/chat/completions
    apiKey: Bearer janai
    roles:
      - chat
      - edit
      - apply
      - summarize
```
Description
To Reproduce
Steps to reproduce the behavior:
1. Start the Jan.ai application.
2. Download and load the DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS model.
3. Start the local server from within Jan.ai (it runs on http://localhost:1337).
4. Attempt to connect using the configuration below in continue.dev.
5. The connection fails.
Expected behavior
continue.dev should successfully connect to the local Jan.ai server and be able to query the DeepSeek-Coder-V2-Lite-Instruct model.
Environment:
Tool: continue.dev
Local Server: Jan.ai
Server Version: [e.g., v0.2.1] (if you know it)
Model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
Configuration File
Here is my current config.yaml:
```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: ollama # <-- I suspect this is the issue
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1/chat/completions
    apiKey: Bearer janai
    roles:
      - chat
      - edit
      - apply
      - summarize
```
Additional context
The Jan.ai server is confirmed to be running and accessible. I can interact with it via its built-in Swagger UI at http://localhost:1337/v1/chat/completions.
My hypothesis is that the provider: ollama setting is incorrect for this use case. Since Jan.ai provides an OpenAI-compatible API endpoint, the provider might need to be set to openai. I am unsure of the correct syntax or provider name to use for a local, OpenAI-compatible server.
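For reference, a sketch of what the corrected config might look like under that hypothesis (untested; this assumes the openai provider accepts a custom apiBase pointing at the base URL rather than the full /chat/completions route, and that the apiKey should be the bare token rather than a "Bearer "-prefixed string, since the client normally adds that header prefix itself):

```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: openai # assumption: use the OpenAI-compatible provider for Jan.ai
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1 # assumption: base URL only, no /chat/completions suffix
    apiKey: janai # assumption: bare key, without the "Bearer " prefix
    roles:
      - chat
      - edit
      - apply
      - summarize
```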
Could you please provide guidance on the correct configuration to connect continue.dev to a local Jan.ai server?
Thank you