Replies: 1 comment 1 reply
Did you make some progress? I have a similar issue. The prompt goes to the server, but I get zero response.
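If it helps narrow this down: one way to rule out the server side is to query the backend directly and see whether it answers at all. This is only a minimal sketch, assuming a plain Ollama server at `http://localhost:11434` (adjust if your extension points at an Open WebUI proxy instead):

```python
import requests

# Hypothetical base URL; replace with whatever address the extension is configured to use.
BASE_URL = "http://localhost:11434"

# Ollama's /api/tags lists the installed models. If this answers, the server is
# reachable and the empty responses are more likely an extension-side issue
# (wrong endpoint path, wrong model name, or a request format the server ignores).
resp = requests.get(f"{BASE_URL}/api/tags", timeout=10)
print(resp.status_code, [m["name"] for m in resp.json().get("models", [])])
```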
I have successfully connected the extension to my self-hosted Ollama Open WebUI instance, and the model is active too. I am trying to get code auto-completion working in the Visual Studio Code editor, but it always results in a greeting message like `Hello, how may I help you today?` and things like that. The `Chat` feature works wonderfully, though :)

My configuration is as follows:
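For anyone hitting the same symptom: a greeting instead of a code completion usually means the request is being answered as a chat turn by an instruct-tuned model rather than as a raw completion. One way to see the difference is to call the backend's completion and chat endpoints directly and compare the output. The sketch below is only illustrative; it assumes a plain Ollama server at `http://localhost:11434` and a FIM-capable base model (`codellama:7b-code` here), which may not match the poster's Open WebUI setup or model.

```python
import requests

OLLAMA_URL = "http://localhost:11434"   # assumed local Ollama; adjust for your own setup
MODEL = "codellama:7b-code"             # assumed base/code model, not necessarily the poster's

# Raw completion request: this is the kind of call an autocomplete feature relies on.
gen = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": MODEL, "prompt": "def fibonacci(n):", "stream": False},
    timeout=60,
)
print("generate ->", gen.json().get("response", "")[:200])

# Chat-style request: an instruct/chat model treats the snippet as conversation,
# which is where replies like "Hello, how may I help you today?" come from.
chat = requests.post(
    f"{OLLAMA_URL}/api/chat",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "def fibonacci(n):"}],
        "stream": False,
    },
    timeout=60,
)
print("chat ->", chat.json().get("message", {}).get("content", "")[:200])
```

If `/api/generate` returns code but the editor still shows greetings, the model or endpoint selected for autocomplete in the extension settings is the more likely culprit (for example, a chat/instruct model configured where a base code model is expected).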