liltom-eth / llama2-webui

Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
MIT License

[FEATURE] docker support #20

Open liltom-eth opened 1 year ago

rainfall-datta commented 1 year ago

When we try to run llama2-webui in a Docker container, we face the following issue: the local URL cannot be exposed outside the container, and if we use the gradio share link we cannot identify the route or the associated IP and port.

Running on local URL: http://127.0.0.1:7861
Running on public URL: https://748b3f0963e1e60211.gradio.live

Can someone help with this? We are configuring a llama2 server.

rudism commented 1 year ago

Set GRADIO_SERVER_NAME=0.0.0.0 in your docker container to have gradio bind to an address that can be exposed outside the container.
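A minimal sketch of what that looks like when starting the container; the image name `llama2-webui` and port `7861` are assumptions here, adjust them to match your build and the port shown in your logs:

```shell
# Bind gradio to all interfaces inside the container (instead of 127.0.0.1)
# and publish the app port to the host.
# NOTE: image name "llama2-webui" and port 7861 are assumptions; adjust to your setup.
docker run -it \
  -e GRADIO_SERVER_NAME=0.0.0.0 \
  -p 7861:7861 \
  llama2-webui
```

With `GRADIO_SERVER_NAME=0.0.0.0`, gradio listens on all interfaces inside the container, so the `-p`-published port is reachable from the host at `http://localhost:7861` and you no longer need the gradio.live share link.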