We are using the VS Code language model API to communicate with the LLM. That API is currently limited to Copilot with GPT-3.5 or GPT-4 (and it requires VS Code Insiders for a few more months), so right now only GPT models are supported. Hopefully that will change -- but either way I'd like you to be able to use different models. In my ideal world we would eventually use local, custom models that do not communicate externally and can be distributed as part of the Rust distribution, so that this command is available to everyone and doesn't require a Copilot subscription or anything else.
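For context, a rough sketch of what calling that API from an extension looks like (this is an illustrative fragment, not our actual code -- the `askModel` name and the prompt handling are made up here, while `vscode.lm.selectChatModels`, `LanguageModelChatMessage`, and `sendRequest` are the real API surface as of recent VS Code releases):

```typescript
import * as vscode from 'vscode';

// Hypothetical helper showing the language model API flow.
// selectChatModels only returns models the user already has
// access to -- which today effectively means Copilot + GPT.
async function askModel(
  prompt: string,
  token: vscode.CancellationToken
): Promise<string> {
  const [model] = await vscode.lm.selectChatModels({
    vendor: 'copilot',
    family: 'gpt-4',
  });
  if (!model) {
    throw new Error('No chat model available (Copilot not installed or not signed in?)');
  }

  const messages = [vscode.LanguageModelChatMessage.User(prompt)];
  const response = await model.sendRequest(messages, {}, token);

  // The reply arrives as a stream of text fragments.
  let result = '';
  for await (const fragment of response.text) {
    result += fragment;
  }
  return result;
}
```

The model choice is isolated in the `selectChatModels` filter, so in principle supporting a different vendor or a local model would mean changing only the selector, not the request/response flow.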