This PR contains the following updates:

| Package | Change |
|---|---|
| llama_cpp_python | `==0.2.43` -> `==0.2.44` |

### Release Notes

abetlen/llama-cpp-python (llama_cpp_python)
### [`v0.2.44`](https://togithub.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#0244)
[Compare Source](https://togithub.com/abetlen/llama-cpp-python/compare/v0.2.43...v0.2.44)
- feat: Update llama.cpp to [ggerganov/llama.cpp@`4524290`](https://togithub.com/ggerganov/llama.cpp/commit/4524290e87b8e107cc2b56e1251751546f4b9051)
- fix: create_embedding broken response for input type str (see the usage sketch after this list) by [@abetlen](https://togithub.com/abetlen) in [`0ce66bc`](https://togithub.com/abetlen/llama-cpp-python/commit/0ce66bc080fe537590b05b24bf442480bf2dd045)
- fix: Use '\n' separator for EventSourceResponse by [@khimaros](https://togithub.com/khimaros) in [#1188](https://togithub.com/abetlen/llama-cpp-python/issues/1188)
- fix: Incorporate embedding pooling layer fixes by [@iamlemec](https://togithub.com/iamlemec) in [#1194](https://togithub.com/abetlen/llama-cpp-python/issues/1194)
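As context for the create_embedding fix, here is a minimal sketch of calling it with a plain string input, the case that previously returned a malformed response. The model path is hypothetical; the `Llama(..., embedding=True)` constructor and `create_embedding` call follow the public llama-cpp-python API.

```python
from llama_cpp import Llama

# Hypothetical local GGUF model path; enable embedding mode on load.
llm = Llama(model_path="./models/example.gguf", embedding=True, verbose=False)

# Passing a single str should now yield the same response shape as a list input:
# {"object": "list", "data": [{"object": "embedding", "embedding": [...], "index": 0}], ...}
resp = llm.create_embedding("hello world")
vector = resp["data"][0]["embedding"]
print(len(vector))
```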
### Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR has been generated by Mend Renovate. View repository job log here.