szaimen / aio-local-ai

GNU Affero General Public License v3.0

Bump go-skynet/local-ai from v2.9.0-ffmpeg-core to v2.10.0-ffmpeg-core #27

Closed: dependabot[bot] closed this pull request 3 months ago

dependabot[bot] commented 3 months ago

Bumps go-skynet/local-ai from v2.9.0-ffmpeg-core to v2.10.0-ffmpeg-core.

Release notes

Sourced from go-skynet/local-ai's releases.

v2.10.0

LocalAI v2.10.0 Release Notes

Excited to announce the release of LocalAI v2.10.0! This version introduces significant changes, including breaking changes, numerous bug fixes, exciting new features, dependency updates, and more. Here's a summary of what's new:

Breaking Changes 🛠

  • The trust_remote_code setting in the model's YAML config file is now enforced as a security measure for the AutoGPTQ and transformers backends as well, thanks to @​dave-gray101's contribution (#1799). If your model relied on the old behavior and you are sure of what you are doing, set trust_remote_code: true in the YAML config file.
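For models that need the old behavior, the opt-in looks roughly like this in the model's YAML config. This is a hedged sketch: the model name and repo fields are placeholders, and the exact field layout may differ between LocalAI versions, so consult the LocalAI model configuration docs before copying it.

```yaml
# Hypothetical model config sketch; only trust_remote_code is the
# setting discussed above. All other fields are illustrative placeholders.
name: my-model            # placeholder model name
backend: transformers
parameters:
  model: some-org/some-hf-repo   # placeholder Hugging Face repo
trust_remote_code: true   # explicitly opt back in to executing remote model code
```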

Bug Fixes 🐛

  • Various fixes have been implemented to enhance the stability and performance of LocalAI:
    • SSE no longer omits empty finish_reason fields for better compatibility with the OpenAI API, fixed by @​mudler (#1745).
    • Functions now correctly handle scenarios with no results, also addressed by @​mudler (#1758).
    • A Command Injection Vulnerability has been fixed by @​ouxs-19 (#1778).
    • OpenCL-based builds for llama.cpp have been restored, thanks to @​cryptk's efforts (#1828, #1830).
    • An issue with OSX build default.metallib has been resolved, which should now allow running the llama-cpp backend on Apple arm64, fixed by @​dave-gray101 (#1837).
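The SSE fix above can be illustrated with a small sketch. The chunk shapes follow the OpenAI streaming format; the content values are invented for illustration. After the fix, every streamed chunk carries a finish_reason field (null until the final chunk) instead of omitting it, which strict OpenAI clients expect.

```python
# Sketch of OpenAI-style streaming (SSE) chunks. Every chunk includes
# finish_reason: the value is None (serialized as null) for intermediate
# deltas and a string such as "stop" on the final chunk.
import json

chunks = [
    {"choices": [{"delta": {"content": "Hel"}, "finish_reason": None}]},
    {"choices": [{"delta": {"content": "lo"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]

# Serialize as SSE "data:" lines, as an OpenAI-compatible server would.
sse_lines = ["data: " + json.dumps(c) for c in chunks]

for line in sse_lines:
    # The field is present on every chunk, never omitted.
    assert "finish_reason" in line
```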

Exciting New Features 🎉

  • LocalAI continues to evolve with several new features:
    • Ongoing implementation of the assistants API, making great progress thanks to community contributions, including an initial implementation by @​christ66 (#1761).
    • Addition of diffusers/transformers support for Intel GPU - now you can generate images and use the transformer backend also on Intel GPUs, implemented by @​mudler (#1746).
    • Introduction of Bitsandbytes quantization for transformer backend enhancement and a fix for transformer backend error on CUDA by @​fakezeta (#1823).
    • Compatibility layers for Elevenlabs and OpenAI TTS: LocalAI can now serve text-to-speech through both APIs, thanks to @​mudler (#1834).
    • vLLM now supports stream: true! This feature was introduced by @​golgeek (#1749).
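The new vLLM streaming support can be sketched as a request payload against an OpenAI-compatible chat completions endpoint. This is an assumption-laden example: the model name and server URL are placeholders, not taken from this PR.

```python
# Hedged sketch: build a streaming chat-completions request for a
# LocalAI (vLLM-backed) OpenAI-compatible server. A client would POST
# this body to e.g. http://localhost:8080/v1/chat/completions (assumed
# URL) and then read the response as server-sent events, line by line.
import json

payload = {
    "model": "some-vllm-model",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,              # the newly supported streaming mode
}
body = json.dumps(payload)
```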

Dependency Updates 👒

  • Our continuous effort to keep dependencies up-to-date includes multiple updates to ggerganov/llama.cpp, donomii/go-rwkv.cpp, mudler/go-stable-diffusion, and others, ensuring that LocalAI is built on the latest and most secure libraries.

Other Changes

  • Several internal changes have been made to improve the development process and documentation, including updates to integration guides, stress reduction on self-hosted runners, and more.

Details of What's Changed


... (truncated)

Commits
  • 8967ed1 ⬆️ Update ggerganov/llama.cpp (#1840)
  • 5826fb8 ⬆️ Update mudler/go-piper (#1844)
  • 89351f1 feat(embeddings): do not require to be configured (#1842)
  • ae2e4fc docs(transformers): add docs section about transformers (#1841)
  • db199f6 fix: osx build default.metallib (#1837)
  • 44adbd2 ⬆️ Update go-skynet/go-llama.cpp (#1835)
  • 20136ca feat(tts): add Elevenlabs and OpenAI TTS compatibility layer (#1834)
  • 45d520f fix: OSX Build Files for llama.cpp (#1836)
  • 3882130 feat: Add Bitsandbytes quantization for transformer backend enhancement #1775...
  • a6b5407 fix: missing OpenCL libraries from docker containers during clblas docker bui...
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
dependabot[bot] commented 3 months ago

The following labels could not be found: 3. to review, dependencies.

dependabot[bot] commented 3 months ago

Superseded by #28.