liltom-eth/llama2-webui
Run any Llama 2 model locally with a Gradio UI on GPU or CPU (Linux/Windows/Mac). Use `llama2-wrapper` as your local Llama 2 backend for generative agents and apps.
MIT License · 1.97k stars · 202 forks
[FEATURE] llama2-wrapper: unify arguments, change initialization method (#38)
Status: Closed. liltom-eth closed this issue 1 year ago.
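
The request is to collapse the wrapper's setup into a single, uniformly named set of constructor arguments rather than backend-specific initialization paths. Below is a minimal sketch of what such a unified initializer could look like; the class name `LLAMA2_WRAPPER`, the parameters `model_path`, `backend_type`, and `max_tokens`, and the call style are illustrative assumptions, not details taken from this issue.

```python
# Hypothetical sketch of a unified llama2-wrapper initializer.
# All names below (LLAMA2_WRAPPER, model_path, backend_type, max_tokens)
# are assumptions for illustration, not confirmed by the issue text.
from llama2_wrapper import LLAMA2_WRAPPER

llama2 = LLAMA2_WRAPPER(
    model_path="./models/llama-2-7b-chat.ggmlv3.q4_0.bin",  # local model file
    backend_type="llama.cpp",  # one arg selects the backend (CPU or GPU)
    max_tokens=4000,           # shared generation/context setting
)

# Hypothetical generation call once the backend is initialized.
completion = llama2("What can I do in Amsterdam?")
print(completion)
```

The point of the unification is that switching backends (e.g. llama.cpp on CPU versus a GPU backend) would only change the value of a single argument, with the rest of the calling code left untouched.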