Support configurable thinking output formats #5992

Open
@sestinj

Description

Validations

  • I believe this is a way to improve. I'll try to join the Continue Discord for questions
  • I'm not able to find an open issue that requests the same enhancement

Problem

Some LLM providers, such as vLLM, let you change the output format of thinking (reasoning) tokens. Right now Continue expects only the standard formats from OpenAI, Anthropic, etc., and has no way to support arbitrary thinking tag formats.

For reference: https://docs.vllm.ai/en/latest/features/reasoning_outputs.html#streaming-chat-completions
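To illustrate what "arbitrary thinking tag formats" could mean in practice, here is a minimal sketch of a streaming parser whose open/close delimiters are configurable rather than hard-coded to one provider's format. The class name, option names, and defaults are hypothetical, not Continue's actual API:

```python
# Hypothetical sketch: split a streamed completion into reasoning vs. content
# chunks using configurable thinking tags (e.g. "<think>"/"</think>").
# Names and defaults here are illustrative, not Continue's real implementation.

class ThinkingTagParser:
    def __init__(self, open_tag="<think>", close_tag="</think>"):
        self.open_tag = open_tag
        self.close_tag = close_tag
        self.buffer = ""
        self.in_thinking = False

    def feed(self, chunk):
        """Consume one streamed chunk; return [("reasoning"|"content", text), ...]."""
        self.buffer += chunk
        out = []
        while self.buffer:
            tag = self.close_tag if self.in_thinking else self.open_tag
            idx = self.buffer.find(tag)
            if idx == -1:
                # Hold back a possible partial tag split across chunks.
                keep = self._partial_suffix(tag)
                cut = len(self.buffer) - keep
                emit, self.buffer = self.buffer[:cut], self.buffer[cut:]
                if emit:
                    out.append(("reasoning" if self.in_thinking else "content", emit))
                break
            before = self.buffer[:idx]
            if before:
                out.append(("reasoning" if self.in_thinking else "content", before))
            self.buffer = self.buffer[idx + len(tag):]
            self.in_thinking = not self.in_thinking
        return out

    def _partial_suffix(self, tag):
        # Longest proper prefix of `tag` that is a suffix of the buffer.
        for n in range(len(tag) - 1, 0, -1):
            if self.buffer.endswith(tag[:n]):
                return n
        return 0
```

A config option (e.g. per-model `thinkingOpenTag`/`thinkingCloseTag`, names hypothetical) could then feed into such a parser, covering vLLM's various reasoning parsers without Continue needing to know each format ahead of time.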

Solution

No response

Metadata

Assignees

Labels

  • area:configuration — Relates to configuration options
  • good-first-issue — Suggested issue for new contributors
  • kind:enhancement — Indicates a new feature request, improvement, or extension

Type

No type

Projects

Status

Todo

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests