Originally posted by **bmtwl** February 10, 2024
Hello, I'm working on a private branch of llama.cpp to add some features for an eventual PR, but I'd like to use it in oobabooga before any PR goes in, both as a kind of regression test and because I'd like to use my feature early : )
I didn't find anything about this in previous discussions, the wiki, the README, or anywhere else I've been able to search.
Is it possible, and if so, is there a documented procedure?
Thanks!
I think there are a variety of reasons someone might want to use a local compile of llama.cpp.
Maybe there should be an official guide with steps?
Discussed in https://github.com/oobabooga/text-generation-webui/discussions/5479
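One possible approach, sketched below with heavy caveats: text-generation-webui loads GGUF models through the llama-cpp-python bindings, which vendor llama.cpp as a git submodule, so you can point that submodule at your own branch and reinstall the package into the webui's environment. The paths (`/path/to/your/llama.cpp`) and branch name (`your-feature-branch`) here are placeholders for your own checkout; this is not an officially documented procedure.

```shell
# Sketch, not an official procedure: rebuild llama-cpp-python against a
# local llama.cpp branch and install it into the webui's environment.

# 1. Clone the bindings with their vendored llama.cpp submodule.
git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python
cd llama-cpp-python/vendor/llama.cpp

# 2. Swap the vendored source for your private branch.
#    "/path/to/your/llama.cpp" and "your-feature-branch" are placeholders.
git remote add local /path/to/your/llama.cpp
git fetch local
git checkout local/your-feature-branch

# 3. Reinstall the bindings from source into the webui's environment
#    (activate that environment first, e.g. the conda env the one-click
#    installer created). --force-reinstall ensures the rebuilt wheel
#    replaces the prebuilt one the webui shipped with.
cd ../..
pip install . --force-reinstall --no-cache-dir
```

The main caveat is API drift: if your branch diverges from the llama.cpp revision the bindings were written against, the build or the Python-facing API may break, so keeping your branch rebased on a recent upstream commit helps.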