Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local Llama 2 backend for Generative Agents/Apps.
Thank you for your outstanding work. I have already adopted it as a primary tool for researching LLMs. Additionally, I noticed that some of the backends for the GGML models use ctransformers. Could we consider adding support for ctransformers in llama2-webui in the future? Thanks again.