
chore(deps): update dependency mudler/localai to v2.27.0 #76


Merged
merged 2 commits into from
Apr 1, 2025

Conversation

@M0Rf30 M0Rf30 commented Apr 1, 2025

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| mudler/LocalAI | minor | `2.26.0` -> `2.27.0` |

Release Notes

mudler/LocalAI (mudler/LocalAI)

v2.27.0

Compare Source

🚀 LocalAI v2.27.0


Welcome to LocalAI v2.27.0! We've been working hard to bring you a fresh WebUI experience and a host of improvements under the hood.

🔥 AIO Images Updates

Check out the updated models we're now shipping with our All-in-One images:

CPU All-in-One:

  • Text-to-Text: llama3.1
  • Embeddings: granite-embeddings
  • Vision: minicpm

GPU All-in-One:

  • Text-to-Text: localai-functioncall-qwen2.5-7b-v0.5 (our tiniest flagship model!)
  • Embeddings: granite-embeddings
  • Vision: minicpm
💻 WebUI Overhaul!

We've given the WebUI a brand-new look and feel. Have a look at the stunning new interface:

[Screenshots: Talk interface · Generate audio (voice-en-us-ryan-low) · Models overview · Generate images (flux.1-dev) · Chat (localai-functioncall-qwen2.5-7b-v0.5) · API overview · Login · P2P dashboard (Swarm)]
How to Use

To get started with LocalAI, you can use our container images. Here’s how to run them with Docker:

```bash
# CPU only image:
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu

# Nvidia GPU:
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

# CPU and GPU image (bigger size):
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

# AIO images (pre-downloads a set of models ready for use, see https://localai.io/basics/container/):
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
```

Check out our Documentation for more information.
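Once a container is running, LocalAI serves an OpenAI-compatible API on the mapped port. As a minimal sketch (the model name `llama3.1` is just the AIO CPU default listed above, and the prompt is a placeholder), the request body for the chat-completions endpoint can be built like this:

```python
import json

def chat_request(model: str, prompt: str, temperature: float = 0.7) -> str:
    """Serialize an OpenAI-style chat-completions request body as JSON."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return json.dumps(body)

# Build a request for the default AIO CPU text model.
payload = chat_request("llama3.1", "Hello!")
print(payload)
```

POST the resulting payload to `http://localhost:8080/v1/chat/completions` with a `Content-Type: application/json` header, e.g. via curl or any OpenAI client pointed at the local base URL.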

Key Highlights:
  • Complete WebUI Redesign: A fresh, modern interface with enhanced navigation and visuals.
  • Model Gallery Improvements: Easier exploration with improved pagination and filtering.
  • AIO Image Updates: Smoother deployments with updated models.
  • Stability Fixes: Critical bug fixes in model initialization, embeddings handling, and GPU offloading.
What’s New 🎉
  • Chat Interface Enhancements: Cleaner layout, model-specific UI tweaks, and custom reply prefixes.
  • Smart Model Detection: Automatically links to relevant model documentation based on use.
  • Performance Tweaks: GGUF models now auto-detect context size, and Llama.cpp handles batch embeddings and SIGTERM gracefully.
  • VLLM Config Boost: Added options to disable logging, set dtype, and enforce per-prompt media limits.
  • New model architectures supported: Gemma 3, Mistral, Deepseek
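For the VLLM backend options mentioned above, a model config sketch might look like the following. The key names (`disable_log_stats`, `dtype`, `limit_mm_per_prompt`) and the model id are illustrative assumptions, not verified against this release; check the LocalAI backend documentation for the exact keys in your version.

```yaml
# Hypothetical LocalAI model config for the vLLM backend.
name: my-vllm-model
backend: vllm
parameters:
  model: Qwen/Qwen2.5-7B-Instruct  # placeholder model id
disable_log_stats: true            # assumed key: silence per-request logging
dtype: float16                     # assumed key: override tensor dtype
limit_mm_per_prompt:               # assumed key: per-prompt media limits
  image: 2
```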
Bug Fixes 🐛
  • Resolved model icon display inconsistencies.
  • Ensured proper handling of generated artifacts without API key restrictions.
  • Optimized CLIP offloading and Llama.cpp process termination.
Stay Tuned!

We have some incredibly exciting features and updates lined up for you. While we can't reveal everything just yet, keep an eye out for our upcoming announcements – you won't want to miss them!


Do you like the new WebUI? Let us know in the GitHub discussions!

Enjoy 🚀

Full changelog 👇

Full Changelog: mudler/LocalAI@v2.26.0...v2.27.0


Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  - [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.

@M0Rf30 M0Rf30 force-pushed the renovate/mudler-localai-2.x branch from f91b47f to 9ba6965 Compare April 1, 2025 04:21
@M0Rf30 M0Rf30 merged commit 3b1d234 into main Apr 1, 2025
@M0Rf30 M0Rf30 deleted the renovate/mudler-localai-2.x branch April 1, 2025 10:15
M0Rf30 added a commit that referenced this pull request Apr 1, 2025
* chore(deps): update dependency mudler/localai to v2.27.0

* Apply automatic changes

---------

Co-authored-by: Renovate Bot <[email protected]>