edgenai / llama_cpp-rs

High-level, optionally asynchronous Rust bindings to llama.cpp
Apache License 2.0

Building with latest version of llama.cpp #90

Open · cooperll opened this issue 3 months ago

cooperll commented 3 months ago

After adding llama_cpp-rs to my Cargo.toml, the bundled llama.cpp seems to be locked to an older version. I'm trying to use Phi-3 128k in a project and can't, because the PR it depends on was only merged into llama.cpp about two weeks ago.

Is there an easy way to make sure that llama_cpp-rs is using the latest llama.cpp commit?
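For context, a minimal sketch of what the dependency looks like when pulled from crates.io; the version number is illustrative, and the llama.cpp revision it builds against is pinned by whatever the published crate bundles, so it only moves forward with a new crate release:

```toml
# Cargo.toml (sketch; version number is illustrative)
[dependencies]
llama_cpp = "0.3"
```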

vargad commented 3 months ago

I just opened a PR: https://github.com/edgenai/llama_cpp-rs/pull/91

In the meantime, if you want to try it out, just put this in your Cargo.toml: llama_cpp = { git = "https://github.com/vargad/llama_cpp-rs.git", branch = "bump_3038" }
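Spelled out as a Cargo.toml sketch (the fork URL and branch are the ones from the comment above):

```toml
[dependencies]
# Pull the crate from the fork with the bumped llama.cpp until the PR lands upstream.
llama_cpp = { git = "https://github.com/vargad/llama_cpp-rs.git", branch = "bump_3038" }
```

An alternative form, if you'd rather keep the crates.io dependency and override its source (e.g. when llama_cpp is also pulled in transitively), is a `[patch.crates-io]` entry; this assumes the fork's crate version still satisfies the version requirement:

```toml
[dependencies]
llama_cpp = "0.3"  # illustrative version requirement

[patch.crates-io]
llama_cpp = { git = "https://github.com/vargad/llama_cpp-rs.git", branch = "bump_3038" }
```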

I only tested the Phi-3 4K model; it works with the change above.

cooperll commented 3 months ago

This is great. Thank you.