This PR contains the following updates:

| Package | Change |
| --- | --- |
| [llama_cpp_python](https://togithub.com/abetlen/llama-cpp-python) | `==0.2.78` -> `==0.2.79` |

Release Notes

abetlen/llama-cpp-python (llama_cpp_python)
### [`v0.2.79`](https://togithub.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#0279)
[Compare Source](https://togithub.com/abetlen/llama-cpp-python/compare/v0.2.78...v0.2.79)
- feat: Update llama.cpp to [ggerganov/llama.cpp@`9c77ec1`](https://togithub.com/ggerganov/llama.cpp/commit/9c77ec1d74874ee22bdef8f110e8e8d41389abf2)
- feat(ci): Update workflows and pre-built wheels by [@Smartappli](https://togithub.com/Smartappli) in [#1416](https://togithub.com/abetlen/llama-cpp-python/issues/1416)
- feat: Add .close() method to Llama class to explicitly free model from memory by [@jkawamoto](https://togithub.com/jkawamoto) in [#1513](https://togithub.com/abetlen/llama-cpp-python/issues/1513) (see the usage sketch after this list)
- feat: Support SPM infill by [@CISC](https://togithub.com/CISC) in [#1492](https://togithub.com/abetlen/llama-cpp-python/issues/1492)
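
A minimal usage sketch for the new `Llama.close()` added in [#1513](https://togithub.com/abetlen/llama-cpp-python/issues/1513). The model path and prompt below are placeholders, not part of this release's changelog:

```python
from llama_cpp import Llama

# Hypothetical model path -- substitute any local GGUF file.
llm = Llama(model_path="./models/example.Q4_K_M.gguf")
try:
    out = llm("Q: What does llama.cpp do? A:", max_tokens=32)
    print(out["choices"][0]["text"])
finally:
    # New in 0.2.79 (#1513): explicitly free the model from memory
    # instead of waiting for garbage collection.
    llm.close()
```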
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR has been generated by Mend Renovate. View repository job log here.