
Config.yaml Issue: Connecting to local Jan.ai Server #8246

@swapnil0545


Before submitting your bug report

Relevant environment info

OS: Windows 11
Continue Version: 1.2.8 (check via VS Code → Extensions → Continue → About)
IDE Version: VSCode 1.105.0
Model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS (running in Jan AI)
Config:
```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: ollama
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1/chat/completions
    apiKey: Bearer janai
    roles:
      - chat
      - edit
      - apply
      - summarize
```
  

  
OR link to agent in Continue hub:

Description

To Reproduce
Steps to reproduce the behavior:

1. Start the Jan.ai application.
2. Download and load the DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS model.
3. Start the local server from within Jan.ai (which runs on http://localhost:1337).
4. Attempt to use the configuration below in continue.dev.
5. The connection fails.
Expected behavior
continue.dev should successfully connect to the local Jan.ai server and be able to query the DeepSeek-Coder-V2-Lite-Instruct model.

Screenshots / Logs
If applicable, add screenshots or logs to help explain your problem.

Environment (please complete the following information):
Tool: continue.dev
Local Server: Jan.ai
Server Version: [e.g., v0.2.1] (if you know it)
Model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
Configuration File
Here is my current config.yaml:

```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: ollama # <-- I suspect this is the issue
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1/chat/completions
    apiKey: Bearer janai
    roles:
      - chat
      - edit
      - apply
      - summarize
```

My hypothesis is that the provider: ollama setting is incorrect for this use case. Since Jan.ai provides an OpenAI-compatible API endpoint, the provider might need to be set to openai. I am unsure of the correct syntax or provider name to use for a local, OpenAI-compatible server.
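
For reference, here is a sketch of what I am guessing the corrected block might look like. I have not verified these values against the Continue documentation, so the provider name, the apiBase path, and the apiKey format are all assumptions on my part:

```yaml
name: My Continue Configuration
version: 1.0.0
schema: https://continue.dev/schemas/config.json
models:
  - name: DeepSeek-Coder-V2-Lite-Instruct
    provider: openai                   # guess: OpenAI-compatible provider instead of ollama
    model: DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS
    apiBase: http://localhost:1337/v1  # guess: base URL only, without /chat/completions
    apiKey: janai                      # guess: key only, without the "Bearer " prefix
    roles:
      - chat
      - edit
      - apply
      - summarize
```

The only changes from my current config are the provider, dropping the /chat/completions suffix from apiBase, and removing the "Bearer " prefix from apiKey.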

Could you please provide guidance on the correct configuration to connect continue.dev to a local Jan.ai server?

Thank you

To reproduce

No response

Log output

No response


Labels

area:configuration, ide:vscode, kind:question, os:windows
