liltom-eth / llama2-webui

Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local Llama 2 backend for generative agents and apps.
MIT License
1.96k stars · 201 forks
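The description above pitches `llama2-wrapper` as a local Llama 2 backend. A minimal usage sketch follows, assuming the interface shown in the project's README (a `LLAMA2_WRAPPER` class plus a `get_prompt` chat-template helper); treat the exact names, defaults, and model download behavior as assumptions rather than a verified API.

```python
# Minimal sketch, not a verified example: assumes `pip install llama2-wrapper`
# and that the package exposes LLAMA2_WRAPPER and get_prompt as in the README.
from llama2_wrapper import LLAMA2_WRAPPER, get_prompt

# Default configuration; the backend is expected to load (or download)
# a quantized Llama 2 chat model on first use.
llama2 = LLAMA2_WRAPPER()

# Wrap a user question in the Llama 2 chat prompt template,
# then run local inference to get the completion text.
prompt = get_prompt("Hi, do you know PyTorch?")
answer = llama2(prompt)
print(answer)
```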

[DOCUMENT] update license, intro #8

Closed · AndyW-llm closed this 1 year ago

AndyW-llm commented 1 year ago

Modified the README to update the license and intro sections.

liltom-eth commented 1 year ago

Thanks for contributing! Leave your Twitter name here if you'd like to be mentioned in some posts later.