
Alternate backends #10

Open
nikomatsakis opened this issue Jun 9, 2024 · 0 comments

Comments

@nikomatsakis
Collaborator

We are using the VS Code language model API to communicate with the LLM. That API is currently limited to Copilot with GPT-3.5 or GPT-4 (and it requires VS Code Insiders for a few more months), so right now only GPT models are supported. Hopefully that will change -- but either way I'd like you to be able to use different models. In my ideal world we would eventually use local, custom models that do not communicate externally and can be distributed as part of the Rust distribution, so that this command is available to everyone and doesn't require a Copilot subscription or anything else.
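For context, here is a minimal sketch of what going through the VS Code language model API looks like from an extension. This is not the extension's actual code, just an illustration of where the coupling to Copilot/GPT comes from: the model is picked via a vendor/family selector, and the family names below (`gpt-4`) are illustrative.

```ts
import * as vscode from 'vscode';

// Sketch: select a chat model exposed by VS Code and stream a response.
// Today the only vendor that ships models is Copilot, which is why the
// extension is effectively tied to GPT until other backends appear.
async function askModel(prompt: string, token: vscode.CancellationToken): Promise<string> {
    // Ask VS Code for a matching chat model; returns [] if none is available
    // (e.g. no Copilot subscription or the API is not yet enabled).
    const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4' });
    if (!model) {
        throw new Error('No language model available');
    }

    const messages = [vscode.LanguageModelChatMessage.User(prompt)];
    const response = await model.sendRequest(messages, {}, token);

    // The response body arrives as a stream of text fragments.
    let out = '';
    for await (const fragment of response.text) {
        out += fragment;
    }
    return out;
}
```

An alternate-backend design would swap the `selectChatModels` call (or the whole function) behind an interface, so a local model or another provider could satisfy the same request/response shape.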
