allenporter / llama-cpp-server

Docker images for easier running of llama-cpp-python server
Apache License 2.0

Update dependency llama_cpp_python to v0.2.78 #80

Closed · renovate[bot] closed this 3 months ago

renovate[bot] commented 3 months ago

Mend Renovate

This PR contains the following updates:

| Package | Change |
|---|---|
| llama_cpp_python (changelog) | `==0.2.77` -> `==0.2.78` |

Release Notes

abetlen/llama-cpp-python (llama_cpp_python)

### [`v0.2.78`](https://togithub.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#0278)

[Compare Source](https://togithub.com/abetlen/llama-cpp-python/compare/v0.2.77...v0.2.78)

- feat: Update llama.cpp to [ggerganov/llama.cpp@`fd5ea0f`](https://togithub.com/ggerganov/llama.cpp/commit/fd5ea0f897ecb3659d6c269ef6f3d833e865ead7)
- fix: Avoid duplicate special tokens in chat formats by [@CISC](https://togithub.com/CISC) in [#1439](https://togithub.com/abetlen/llama-cpp-python/issues/1439)
- fix: fix logprobs when BOS is not present by [@ghorbani](https://togithub.com/ghorbani) in [#1471](https://togithub.com/abetlen/llama-cpp-python/issues/1471)
- feat: adding rpc_servers parameter to Llama class by [@chraac](https://togithub.com/chraac) in [#1477](https://togithub.com/abetlen/llama-cpp-python/issues/1477)
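Applying this update in a downstream image amounts to bumping the version pin in the Python requirements file, which is what this PR automates; a minimal sketch (the exact requirements filename used by this repo is an assumption):

```
# requirements.txt (hypothetical path) — dependency pin after this PR
llama_cpp_python==0.2.78
```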

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Enabled.

Rebasing: Whenever the PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.



This PR has been generated by Mend Renovate. View repository job log here.