abetlen/llama-cpp-python (llama_cpp_python)
### [`v0.2.75`](https://togithub.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#0275)
[Compare Source](https://togithub.com/abetlen/llama-cpp-python/compare/v0.2.74...v0.2.75)
- feat: Update llama.cpp to [ggerganov/llama.cpp@`13ad16a`](https://togithub.com/ggerganov/llama.cpp/commit/13ad16af1231ab2d245d35df3295bcfa23de1305)
- fix: segfault for models without eos / bos tokens by [@abetlen](https://togithub.com/abetlen) in [`d99a6ba`](https://togithub.com/abetlen/llama-cpp-python/commit/d99a6ba607a4885fb00e63e967964aa41bdbbbcb)
- feat: add MinTokensLogitProcessor and min_tokens argument to server by [@twaka](https://togithub.com/twaka) in [#1333](https://togithub.com/abetlen/llama-cpp-python/issues/1333)
- misc: Remove unnecessary metadata lookups by [@CISC](https://togithub.com/CISC) in [#1448](https://togithub.com/abetlen/llama-cpp-python/issues/1448)
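The `min_tokens` addition in [#1333](https://togithub.com/abetlen/llama-cpp-python/issues/1333) suppresses the end-of-sequence token until a minimum number of tokens has been generated. A minimal, self-contained sketch of that idea (illustrative only; the names and signature here are not llama-cpp-python's actual API):

```python
import math

# Illustrative sketch of a min-tokens logits processor: while fewer than
# `min_tokens` tokens have been generated, force the EOS logit to -inf so
# the sampler can never pick it and end generation early.
def make_min_tokens_processor(min_tokens: int, eos_token_id: int):
    def process(generated_ids: list[int], logits: list[float]) -> list[float]:
        if len(generated_ids) < min_tokens:
            logits = logits.copy()
            logits[eos_token_id] = -math.inf
        return logits
    return process

proc = make_min_tokens_processor(min_tokens=3, eos_token_id=0)
masked = proc([5, 7], [1.0, 0.5, 0.2])  # only 2 tokens so far: EOS blocked
print(masked[0])                        # -inf
```

Once `min_tokens` tokens exist, the processor leaves the logits untouched and EOS can be sampled normally.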
### Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR has been generated by Mend Renovate. View repository job log here.
This PR contains the following updates:

| Package | Change |
| --- | --- |
| [llama_cpp_python](https://togithub.com/abetlen/llama-cpp-python) | `==0.2.74` -> `==0.2.75` |