liltom-eth / llama2-webui

Run any Llama 2 locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
MIT License

Gradio Memory Leak Issue #85

Open ruizcrp opened 9 months ago

ruizcrp commented 9 months ago

Hi, I experienced a memory leak that is probably related to Gradio and to the issue discussed here: https://github.com/gradio-app/gradio/issues/3321. The latest messages there suggest the leak might be fixed in Gradio 4.x; I haven't been able to try that yet, and it also hasn't been confirmed in that thread.

You can probably reproduce the memory leak by keeping the server running for a while and sending many requests to it without restarting; memory usage keeps growing over time. A sketch of such a reproduction script is below.
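
For reference, a minimal sketch of how one could drive the server with repeated requests while watching its memory in another terminal (e.g. with `top` or `nvidia-smi`). The URL, `api_name`, and prompt are assumptions and need to be adjusted to the actual llama2-webui instance:

```python
# Hypothetical reproduction script: send many requests to the Gradio app and
# observe the server process's memory growth externally. Endpoint details are
# placeholders -- use client.view_api() to find the real api_name.
import time

from gradio_client import Client

client = Client("http://localhost:7860")  # default Gradio port, adjust if needed

for i in range(500):
    # api_name="/chat" is an assumption; check client.view_api() for the
    # endpoint actually exposed by the running app.
    result = client.predict("Tell me a short story.", api_name="/chat")
    if i % 25 == 0:
        print(f"request {i} done, response length: {len(str(result))}")
    time.sleep(1)  # small pause so the server isn't saturated
```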