abetlen/llama-cpp-python (llama_cpp_python)
### [`v0.2.57`](https://togithub.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#0257)
[Compare Source](https://togithub.com/abetlen/llama-cpp-python/compare/v0.2.56...v0.2.57)
- feat: Update llama.cpp to [ggerganov/llama.cpp@`ac9ee6a`](https://togithub.com/ggerganov/llama.cpp/commit/ac9ee6a4ad740bc1ee484ede43e9f92b5af244c1)
- fix: set default embedding pooling type to unspecified by [@abetlen](https://togithub.com/abetlen) in [`4084aab`](https://togithub.com/abetlen/llama-cpp-python/commit/4084aabe867b8ec2aba1b22659e59c9318b0d1f3)
- fix: Fix and optimize functionary chat handler by [@jeffrey-fong](https://togithub.com/jeffrey-fong) in [#1282](https://togithub.com/abetlen/llama-cpp-python/issues/1282)
- fix: json mode for basic chat formats by [@abetlen](https://togithub.com/abetlen) in [`20e6815`](https://togithub.com/abetlen/llama-cpp-python/commit/20e6815252d0efd9f015f7adbf108faaf36e3f3c)
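The JSON-mode fix above touches llama-cpp-python's OpenAI-compatible chat API. As a minimal sketch, this is roughly the request shape a caller would use to ask for JSON output (the `Llama(...)` construction is commented out because it needs a local GGUF model file; parameter names follow the upstream docs):

```python
# Sketch of a JSON-mode chat request for llama-cpp-python's
# OpenAI-compatible API. The model load and call are commented out
# since they require a local GGUF model file.
messages = [
    {"role": "system", "content": "You are an assistant that replies in JSON."},
    {"role": "user", "content": "List three facts about llamas."},
]
response_format = {"type": "json_object"}  # enables JSON mode

# With a model available, the call would look roughly like:
# from llama_cpp import Llama
# llm = Llama(model_path="model.gguf")
# out = llm.create_chat_completion(
#     messages=messages,
#     response_format=response_format,
# )

print(response_format["type"])
```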
**Configuration**

- 📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
- 🚦 **Automerge**: Enabled.
- ♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
- 🔕 **Ignore**: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR has been generated by Mend Renovate. View repository job log here.
This PR contains the following updates: `llama_cpp_python` `==0.2.56` -> `==0.2.57`