LocalAI
Drop-in OpenAI replacement
Ettore Di Giacinto
LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs and generate images and audio locally on consumer-grade hardware, supporting multiple model families and architectures.

⚠️ Note: Before running a model, make sure your device has enough free RAM to support it. Attempting to run a model that exceeds your available memory could cause your device to crash or become unresponsive. Always check the model requirements before downloading or starting it.
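Because LocalAI mirrors the OpenAI REST API, existing OpenAI client code can usually be pointed at the local server just by changing the base URL. Below is a minimal sketch using the official `openai` Python client; the port (8080 is LocalAI's upstream default) and the model name are assumptions and will depend on your installation.

```python
# Minimal sketch: query a LocalAI server with the standard OpenAI Python client.
# Assumptions: LocalAI is listening on localhost:8080 (its upstream default) and
# a chat model is already installed; adjust the URL and model name to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # point the client at LocalAI instead of api.openai.com
    api_key="not-needed",                 # LocalAI does not require a real API key by default
)

response = client.chat.completions.create(
    model="qwen3-4b",  # hypothetical model name; use one installed in your LocalAI instance
    messages=[{"role": "user", "content": "Say hello from LocalAI."}],
)
print(response.choices[0].message.content)
```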
This release includes several improvements and new features:
- Added support for multilingual capabilities in Chatterbox
- Improved reranking models for the llama.cpp backend
- Added support for L4T devices in Kokoro
- Introduced new models to the gallery, including Qwen image edit and IBM Granite variants
- Enhanced chat completion and function calling
- Improved backend runtime capability detection

Full release notes can be found at https://github.com/mudler/LocalAI/releases
- Version: v3.6.0
- Category: AI
- Source code: Public
- Developed by: Ettore Di Giacinto
- Submitted by: highghlow
- Compatible with: umbrelOS 0.5 or later