Releases · TabbyML/tabby
v0.25.1
nightly
docs(changelog): add v0.25.1 release notes (#3900) Signed-off-by: Wei Zhang <[email protected]>
v0.25.1-rc.0
v0.25.0
⚠️ Notice
This release includes significant changes; please review them and adjust your configuration to fit your use case.
- The default parallelism has been increased from 1 to 4, which might increase VRAM usage. (#3832)
- Introduce a new embedding kind `llama.cpp/before_b4356_embedding` for llamafile or other embedding services that use the legacy llama.cpp embedding API (a configuration sketch follows below). (#3828)
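For context, here is a minimal sketch of how the new kind might be selected in `~/.tabby/config.toml`. The `[model.embedding.http]` table follows Tabby's documented HTTP embedding configuration; the endpoint below is a placeholder, not a recommended value.

```toml
# Sketch: point Tabby at a llamafile-style embedding server that still
# speaks the pre-b4356 llama.cpp embedding API.
[model.embedding.http]
kind = "llama.cpp/before_b4356_embedding"  # the new embedding kind from #3828
api_endpoint = "http://localhost:8080"     # placeholder endpoint
```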
🚀 Features
- Expose the Answer Engine's thinking process in thread message answers. (#3785) (#3672)
- Enable the Answer Engine to access the repository's directory file list as needed. (#3796)
- Enable the use of `@` to mention a symbol in the Chat Sidebar. (#3778)
- Provide repository-aware default question recommendations in the Answer Engine. (#3815)
🧰 Fixes and Improvements
- Provide a configuration option to truncate text content before dispatching it to the embedding service. (#3816)
- Bump llama.cpp version to b4651. (#3798)
- Automatically retry embedding requests when the service fails intermittently due to llama.cpp issues (see the sketch after this list). (#3805)
- Enhance the Answer Engine user interface. (#3845) (#3794)
- Resolve a deserialization issue with `finish_reason` in chat responses from the LiteLLM Proxy Server. (#3882)
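As an illustration of the retry behavior mentioned above, here is a minimal sketch of the retry-with-backoff pattern in Rust. It is not Tabby's actual implementation: `embed_once` is a hypothetical stand-in for the real embedding request.

```rust
use std::thread::sleep;
use std::time::Duration;

// Hypothetical stand-in for the HTTP call to the embedding service.
fn embed_once(text: &str) -> Result<Vec<f32>, String> {
    // ... issue the embedding request here ...
    Err(format!("transient failure embedding {} bytes", text.len()))
}

/// Retry a flaky embedding call with exponential backoff.
fn embed_with_retry(text: &str, max_attempts: u32) -> Result<Vec<f32>, String> {
    let mut delay = Duration::from_millis(100);
    let mut last_err = String::new();
    for attempt in 1..=max_attempts {
        match embed_once(text) {
            Ok(v) => return Ok(v),
            Err(e) => {
                last_err = e;
                if attempt < max_attempts {
                    sleep(delay);
                    delay *= 2; // double the wait before the next attempt
                }
            }
        }
    }
    Err(last_err)
}

fn main() {
    match embed_with_retry("hello world", 3) {
        Ok(v) => println!("embedded into {} dims", v.len()),
        Err(e) => eprintln!("giving up: {e}"),
    }
}
```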
💫 New Contributors
@zhanba made their first contribution in #3675
@faceCutWall made their first contribution in #3812
Full Changelog: v0.24.0...v0.25.0
v0.25.0-rc.5
[email protected]
chore(intellij): bump intellij plugin version to 1.11.0.
[email protected]
chore(intellij): bump intellij plugin version to 1.11.0-rc.0.
[email protected]
chore(intellij): bump intellij plugin version to 1.10.1.
v0.25.0-rc.4
v0.25.0-rc.3