How are you running AnythingLLM?
AnythingLLM desktop app on macOS (MacBook Pro).
What happened?
I want to connect AnythingLLM to Ollama through the generic OpenAI-compatible endpoint. I configured it as described in Ollama's API docs for OpenAI compatibility.
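For reference, this is the kind of setup those docs describe (a minimal sketch with placeholder values: Ollama serves its OpenAI-compatible API at http://localhost:11434/v1 by default, the API key is required by the SDK but ignored by Ollama, and the model name below is only an example):

```ts
// Minimal sketch: pointing an OpenAI-compatible client at Ollama.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible API
  apiKey: "ollama",                     // required by the SDK, ignored by Ollama
});

async function main() {
  const res = await client.chat.completions.create({
    model: "llama3", // placeholder: any locally pulled model
    messages: [{ role: "user", content: "Hello" }],
  });
  console.log(res.choices[0].message.content);
}

main();
```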
However, when I create a new workspace, chats are broken.
Yes, it is possible to configure AnythingLLM to talk to Ollama directly via its Ollama provider, but that is also broken for me, and I would prefer to use a generic OpenAI-compatible endpoint.
Are there known steps to reproduce?
To reproduce:
Yeah, that's because the generic OpenAI connector handles stream chunks differently than the Ollama provider does. Just use the Ollama provider, since it is fully supported as an LLM anyway - then you won't have any issues!
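A rough sketch of that difference in stream chunk handling (this is not AnythingLLM's actual provider code, just the two stream shapes: Ollama's native /api/chat emits newline-delimited JSON, while an OpenAI-compatible stream emits SSE delta chunks; the model name is a placeholder):

```ts
// Sketch only: two different ways a chat stream arrives from Ollama.
import OpenAI from "openai";

// OpenAI-style streaming (what a generic OpenAI-compatible consumer expects):
// SSE chunks with the text under choices[0].delta.content.
async function streamOpenAIStyle(): Promise<void> {
  const client = new OpenAI({ baseURL: "http://localhost:11434/v1", apiKey: "ollama" });
  const stream = await client.chat.completions.create({
    model: "llama3",
    messages: [{ role: "user", content: "Hello" }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

// Ollama's native streaming (/api/chat): newline-delimited JSON objects with
// the text under message.content and a final object marked done: true.
async function streamOllamaNative(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: "Hello" }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buf = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buf += decoder.decode(value, { stream: true });
    let nl;
    while ((nl = buf.indexOf("\n")) >= 0) {
      const line = buf.slice(0, nl).trim();
      buf = buf.slice(nl + 1);
      if (line) process.stdout.write(JSON.parse(line).message?.content ?? "");
    }
  }
}

// Run both to see that each format needs its own chunk handling.
streamOpenAIStyle()
  .then(() => console.log())
  .then(() => streamOllamaNative());
```

The text arrives under message.content in one format and under choices[0].delta.content in the other, so chunk-handling code written for one format does not carry over to the other.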