We're excited to announce that our AIO image has been upgraded to the latest LLM model, llama3, delivering more accurate and dynamic responses. Behind the scenes it uses https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF, which is ready for function calling, yay!
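If you want to try the new function-calling support right away, here's a minimal sketch against LocalAI's OpenAI-compatible API. It assumes the AIO image is running on localhost:8080 and exposes the bundled LLM under the default `gpt-4` alias; the tool definition itself is purely illustrative:

```python
# Minimal sketch: function calling against a local AIO image.
# Assumptions: LocalAI listening on http://localhost:8080, OpenAI-compatible
# /v1 endpoint enabled, and the bundled LLM exposed under the "gpt-4" alias.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# Hypothetical tool, defined only for illustration.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's the weather in Rome?"}],
    tools=tools,
)

# With a function-call-capable model, the reply should contain a tool call.
print(resp.choices[0].message.tool_calls)
```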
💬 WebUI enhancements: Updates in Chat, Image Generation, and TTS
(Screenshots: Chat, TTS, and Image generation)
Our interfaces for Chat, Text-to-Speech (TTS), and Image Generation have finally landed. Enjoy streamlined and simple interactions thanks to the efforts of our team, led by @​mudler, who have worked tirelessly to enhance your experience. The WebUI serves as a quick way to debug and assess the models loaded in LocalAI - there is much to improve, but we now have a small, hackable interface!
🖼️ Many new models in the model gallery!
The model gallery has received a substantial upgrade with numerous new models, including Einstein v6.1, SOVL, and several specialized Llama3 iterations. These additions are designed to cater to a broader range of tasks, making LocalAI more versatile than ever. Kudos to @​mudler for spearheading these exciting updates - now you can select the model you like with a couple of clicks!
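If you prefer scripting over the WebUI, here's a hedged sketch of installing a gallery model programmatically, assuming LocalAI's gallery endpoints (`/models/apply` and `/models/jobs/<uuid>`) are enabled on localhost:8080; the model id below is illustrative - use the id shown in the gallery:

```python
# Sketch only: install a model from the gallery via the HTTP API.
# Assumptions: LocalAI on http://localhost:8080 with the gallery enabled;
# the model id is illustrative - pick the id shown in the gallery/WebUI.
import time
import requests

BASE = "http://localhost:8080"

# Request installation of a gallery model; LocalAI returns a job uuid.
job = requests.post(f"{BASE}/models/apply",
                    json={"id": "hermes-2-pro-llama3"}).json()

# Poll the job until the download and setup are done.
while True:
    status = requests.get(f"{BASE}/models/jobs/{job['uuid']}").json()
    if status.get("processed"):
        break
    time.sleep(2)

print("model installed:", status.get("message", status))
```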
🛠️ Robust Fixes and Optimizations
This update brings a series of crucial bug fixes and security enhancements to ensure our platform remains secure and efficient. Special thanks to @​dave-gray101, @​cryptk, and @​fakezeta for their diligent work in rooting out and resolving these issues :hugs:
✨ OpenVINO and more
We're introducing OpenVINO acceleration, along with many OpenVINO models in the gallery. You can now enjoy fast-as-hell speeds on Intel CPUs and GPUs. Applause to @​fakezeta for the contributions!
📚 Documentation and Dependency Upgrades
We've updated our documentation and dependencies to keep you equipped with the latest tools and knowledge. These updates ensure that LocalAI remains a robust and dependable platform.
👥 A Community Effort
A special shout-out to our new contributors, @​QuinnPiers and @​LeonSijiaLu, who have enriched our community with their first contributions. Welcome aboard, and thank you for your dedication and fresh insights!
Each update in this release not only enhances our platform's capabilities but also ensures a safer and more user-friendly experience. We are excited to see how our users leverage these new features in their projects - feel free to drop us a line on Twitter or any other social network, we'd be happy to hear how you use LocalAI!
📣 Spread the word!
First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!
And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.
Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @​LocalAI_OSS and @​mudler_it or joining our sponsors can make a big difference.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Bumps go-skynet/local-ai from v2.12.4-ffmpeg-core to v2.14.0-ffmpeg-core.
Release notes
Sourced from go-skynet/local-ai's releases.
... (truncated)
Commits
- b58274b feat(ui): support multiline and style ul (#2226)
- a31d00d feat(aio): switch to llama3-based for LLM (#2225)
- 2cc1bd8 :arrow_up: Update ggerganov/llama.cpp (#2224)
- 2c5a46b feat(ux): Add chat, tts, and image-gen pages to the WebUI (#2222)
- f7f8b48 models(gallery): Add Hermes-2-Pro-Llama-3-8B-GGUF (#2218)
- e5bd9a7 models(gallery): add wizardlm2 (#2209)
- 4690b53 feat: user defined inference device for CUDA and OpenVINO (#2212)
- 6a7a799 :arrow_up: Update ggerganov/llama.cpp (#2213)
- 962ebba models(gallery): fixup phi-3 sha
- f90d56d :arrow_up: Update ggerganov/llama.cpp (#2203)