liltom-eth / llama2-webui

Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
MIT License

[FEATURE] support for ctransformers #47

Closed — touchtop closed this issue 1 year ago

touchtop commented 1 year ago

Thank you for your outstanding work. I have already incorporated it as a primary tool for LLM research. Additionally, I noticed that some backends for GGML models use ctransformers. Could we consider adding support for ctransformers in llama2-webui in the future? Thanks again.
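For context, a minimal sketch of what a ctransformers-based GGML backend could look like. The model path, `max_new_tokens` value, and the `generate_with_ctransformers` helper are illustrative, not part of llama2-webui; the loading function is defined but not called here because it needs a downloaded model file.

```python
# Hedged sketch: calling ctransformers for a GGML Llama 2 model.
# ctransformers must be installed separately (pip install ctransformers),
# and the model path below is illustrative.

def build_llama2_prompt(user_message: str) -> str:
    """Wrap a user message in the Llama 2 chat template ([INST] ... [/INST])."""
    return f"[INST] {user_message} [/INST]"

def generate_with_ctransformers(model_path: str, prompt: str) -> str:
    """Load a GGML model via ctransformers and run one completion.

    Not executed here because it requires a local model file.
    """
    from ctransformers import AutoModelForCausalLM

    # model_type="llama" selects the llama GGML backend
    llm = AutoModelForCausalLM.from_pretrained(model_path, model_type="llama")
    return llm(build_llama2_prompt(prompt), max_new_tokens=256)

# The prompt builder itself is pure Python and can be checked directly:
print(build_llama2_prompt("Hello"))
```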

touchtop commented 1 year ago

I found that using llama_cpp directly works well.
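The comment above can be sketched with llama-cpp-python. The model path, `n_ctx`, and the `run_llama_cpp` helper are illustrative assumptions; the loading function is defined but not invoked since it needs a local GGML model file.

```python
# Hedged sketch: using llama-cpp-python directly instead of ctransformers.
# Requires pip install llama-cpp-python; the model path is illustrative.

def extract_completion(response: dict) -> str:
    """Pull the generated text out of a llama-cpp-python completion response."""
    return response["choices"][0]["text"]

def run_llama_cpp(model_path: str, prompt: str) -> str:
    """Load a GGML model with llama-cpp-python and return one completion.

    Not executed here because it requires a local model file.
    """
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=2048)
    return extract_completion(llm(prompt, max_tokens=128))

# The response-parsing helper can be exercised on a mock response dict:
sample = {"choices": [{"text": " Paris."}]}
print(extract_completion(sample))
```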