Ollama Embeddings Not Loading #7073
Embedding support was removed from Trilium in version 0.95.0, so the 'Embedding' section in the UI is now just a leftover artifact and won't work as expected, even if you configure Ollama or specify a model such as mxbai-embed-large. The AI chat / LLM features are still available but experimental, and they do not use embeddings. For Ollama integration, you can try the trilium-chat plugin, which supports local Ollama models and customizable prompts. If you need vector search or embeddings, you'll have to use external tools or an older Trilium version; those features are no longer supported in current releases.
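A minimal sketch of what "external tools" could look like: calling Ollama's `/api/embeddings` endpoint directly from a script, outside Trilium. This assumes Ollama is running on its default port (11434) with the mxbai-embed-large model already pulled; the function names here are illustrative, not part of any Trilium or Ollama client library.

```python
# Sketch: query Ollama's embeddings endpoint directly, since Trilium
# no longer does this for you. Assumes a local Ollama instance on the
# default port with mxbai-embed-large pulled (`ollama pull mxbai-embed-large`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # default Ollama endpoint


def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body that /api/embeddings expects."""
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")


def embed(text: str, model: str = "mxbai-embed-large") -> list[float]:
    """POST the text to Ollama and return the embedding vector."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, text),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]


if __name__ == "__main__":
    vector = embed("Trilium note text to embed")
    print(len(vector))
```

You could store the returned vectors in an external index (e.g. SQLite or a small vector database) and search them yourself, which is effectively what the removed Trilium feature did internally.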
I'm running Trilium via Docker, as well as Ollama with "mxbai-embed-large" model.
Any idea why I'm never getting the 'Embedding' section to load with the expected functionality?