Release Notes

abetlen/llama-cpp-python (llama-cpp-python)
### [`v0.3.1`](https://redirect.github.com/abetlen/llama-cpp-python/blob/HEAD/CHANGELOG.md#031)
[Compare Source](https://redirect.github.com/abetlen/llama-cpp-python/compare/v0.3.0...v0.3.1)
- feat: Update llama.cpp to [ggerganov/llama.cpp@`c919d5d`](https://redirect.github.com/ggerganov/llama.cpp/commit/c919d5db39c8a7fcb64737f008e4b105ee0acd20)
- feat: Expose libggml in internal APIs by [@abetlen](https://redirect.github.com/abetlen) in [#1761](https://redirect.github.com/abetlen/llama-cpp-python/issues/1761)
- fix: Fix speculative decoding by [@abetlen](https://redirect.github.com/abetlen) in [`9992c50`](https://redirect.github.com/abetlen/llama-cpp-python/commit/9992c5084a3df2f533e265d10f81d4269b97a1e6) and [`e975dab`](https://redirect.github.com/abetlen/llama-cpp-python/commit/e975dabf74b3ad85689c9a07719cbb181313139b)
- misc: Rename all_text to remaining_text by [@xu-song](https://redirect.github.com/xu-song) in [#1658](https://redirect.github.com/abetlen/llama-cpp-python/issues/1658)
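For context on the `all_text` → `remaining_text` rename: in streaming generation, the renamed variable tracks text that has not yet been emitted, which `remaining_text` conveys and `all_text` did not. A hypothetical, self-contained sketch (not the library's actual code) of the pattern:

```python
def stream_chunks(full_text: str, sizes):
    """Yield successive chunks of text; `remaining_text` holds what
    has not been emitted yet, shrinking as chunks are yielded."""
    remaining_text = full_text
    for n in sizes:
        chunk, remaining_text = remaining_text[:n], remaining_text[n:]
        yield chunk
    if remaining_text:  # flush whatever is left over
        yield remaining_text

print(list(stream_chunks("hello world", [5, 1])))
# -> ['hello', ' ', 'world']
```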
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR contains the following updates:

| Package | Change |
| --- | --- |
| llama-cpp-python | `==0.3.0` -> `==0.3.1` |
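The change above is a bump of the exact-version (`==`) pin in the dependency file. A minimal sketch of how such a pin could be rewritten programmatically (the function name and input string are illustrative, not part of Renovate):

```python
import re

def bump_exact_pin(requirement: str, new_version: str) -> str:
    """Replace the version in an exact (==) pin,
    e.g. 'pkg==0.3.0' -> 'pkg==0.3.1'."""
    return re.sub(r"==[\w.]+$", f"=={new_version}", requirement)

print(bump_exact_pin("llama-cpp-python==0.3.0", "0.3.1"))
# -> llama-cpp-python==0.3.1
```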
This PR was generated by Mend Renovate. View the repository job log.