

🚀 LocalAI 3.2.0

Welcome to LocalAI 3.2.0! This release refactors our architecture to be more flexible and lightweight.

The core is now separated from all the backends, making LocalAI faster to download, easier to manage, more portable, and much smaller.

TL;DR – What’s New in LocalAI 3.2.0 🎉

Note: CI is still building all the backends for this release and they will be available soon. If you hit any issue, please try again in a little while, thanks for your understanding!
Note: Some parts of the documentation and the installation scripts (which download the release binaries) have yet to be adapted to the latest changes and might not reflect the current state.

A New Modular Architecture 🧩

The biggest change in v3.2.0 is the complete separation of inference backends from the core LocalAI binary. Backends like llama.cpp, whisper.cpp, piper, and stablediffusion-ggml are no longer bundled in.

This fundamental shift is what makes LocalAI faster to download, easier to manage, and more portable.

Smart, Automatic Backend Installation 🤖

To make the new modular system seamless, LocalAI now features automatic backend installation.

When you install a model from the gallery (or a YAML file), LocalAI intelligently detects the required backend and your system's capabilities, then downloads the correct version for you. Whether you're running on a standard CPU, an NVIDIA GPU, an AMD GPU, or an Intel GPU, LocalAI handles it automatically.
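
For example, installing a model from the gallery is enough to trigger the backend download. A minimal sketch (the model name below is illustrative):

# Install a model from the gallery; LocalAI detects the required backend
# and pulls the build matching your hardware (CPU, NVIDIA, AMD or Intel GPU).
# The model name is only an example.
local-ai models install qwen3-4b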

For advanced use cases, or to override auto-detection, you can set the LOCALAI_FORCE_META_BACKEND_CAPABILITY environment variable to force a specific capability.
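
For example, a minimal sketch (the capability value shown is illustrative; check the documentation for the values supported by your build):

# Skip auto-detection and force a specific backend capability at startup.
# The value below is an example only.
LOCALAI_FORCE_META_BACKEND_CAPABILITY=nvidia local-ai run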

The Backend Gallery & CLI Control 🖼️

You are in full control. You can browse, install, and manage all available backends directly from the WebUI or using the new CLI commands:

# List all available backends in the gallery
local-ai backends list

# Install a specific backend (e.g., llama-cpp)
local-ai backends install llama-cpp

# Uninstall a backend
local-ai backends uninstall llama-cpp

For development, or for offline and air-gapped environments, you can now also install backends directly from a local OCI tar file:

local-ai backends install "ocifile://<PATH_TO_TAR_FILE>"
Other Key Improvements

🚨 Important Note for Upgrading

Due to the new modular architecture, if you have existing models installed with a version prior to 3.2.0, they might not have a specific backend assigned.

After upgrading, you may need to install the required backend manually for these models to work. You can do this easily from the WebUI or via the CLI: local-ai backends install <backend_name>.

The Complete Local Stack for Privacy-First AI

LocalAI

The free, Open Source OpenAI alternative. Acts as a drop-in replacement REST API compatible with OpenAI specifications for local AI inferencing. No GPU required.

Link: https://github.com/mudler/LocalAI
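
For instance, any OpenAI-style client can be pointed at a running instance. A minimal sketch, assuming the default port 8080 (the model name is illustrative):

# Chat completion request against the OpenAI-compatible endpoint
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen3-4b", "messages": [{"role": "user", "content": "Hello!"}]}'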

LocalAGI

A powerful Local AI agent management platform. Serves as a drop-in replacement for OpenAI's Responses API, supercharged with advanced agentic capabilities and a no-code UI.

Link: https://github.com/mudler/LocalAGI

LocalRecall

A RESTful API and knowledge base management system providing persistent memory and storage capabilities for AI agents. Designed to work alongside LocalAI and LocalAGI.

Link: https://github.com/mudler/LocalRecall

Thank you! ❤️

A massive THANK YOU to our incredible community and our sponsors! LocalAI has over 34,100 stars, and LocalAGI has already rocketed past 900+ stars!

As a reminder, LocalAI is real FOSS (Free and Open Source Software), and its sibling projects are community-driven and not backed by VCs or a company. We rely on contributors donating their spare time and on sponsors providing the hardware! If you love open-source, privacy-first AI, please consider starring the repos, contributing code, reporting bugs, or spreading the word!

👉 Check out the reborn LocalAGI v2 today: https://github.com/mudler/LocalAGI


Full Changelog: v3.1.1...v3.2.0

