liltom-eth / llama2-webui

Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
MIT License
1.96k stars · 201 forks

When I was running app.py, I encountered some errors #48

Closed Nerva05251228 closed 1 year ago

Nerva05251228 commented 1 year ago

This is my problem, I need help!

```
root@autodl-container-66e0119cac-10ae5257:~/autodl-tmp/llama2-webui# python app.py
Traceback (most recent call last):
  File "app.py", line 8, in <module>
    from llama2_wrapper import LLAMA2_WRAPPER
  File "/root/autodl-tmp/llama2-webui/llama2_wrapper/__init__.py", line 1, in <module>
    from .model import LLAMA2_WRAPPER, get_prompt
  File "/root/autodl-tmp/llama2-webui/llama2_wrapper/model.py", line 7, in <module>
    class LLAMA2_WRAPPER:
  File "/root/autodl-tmp/llama2-webui/llama2_wrapper/model.py", line 163, in LLAMA2_WRAPPER
    chat_history: list[tuple[str, str]] = [],
TypeError: 'type' object is not subscriptable
```
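For context: this error is characteristic of evaluating built-in generic annotations like `list[tuple[str, str]]` on Python older than 3.9, where `list` and `tuple` are not subscriptable. A minimal sketch of both the failure mode and a backward-compatible workaround using `typing.List`/`typing.Tuple` (the `ChatWrapper` class name here is a hypothetical stand-in, not the repo's actual `LLAMA2_WRAPPER`):

```python
from typing import List, Tuple

# On Python < 3.9, a default-argument annotation written as
#   chat_history: list[tuple[str, str]] = []
# raises "TypeError: 'type' object is not subscriptable" at class
# definition time. The typing module's generics work on 3.5+.
# (Alternatively, "from __future__ import annotations" at the top of
# the module defers annotation evaluation on Python 3.7+.)

class ChatWrapper:
    def run(self, chat_history: List[Tuple[str, str]] = []) -> int:
        # Return the number of (user, assistant) turns received.
        return len(chat_history)

print(ChatWrapper().run([("hi", "hello")]))  # → 1
```

Re-cloning fixed it here most likely because the repo had already replaced the offending annotation, or the environment's Python version changed.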

liltom-eth commented 1 year ago

Hi, can you try re-cloning the repo and re-installing llama2-wrapper? This shouldn't happen. You can also try `python benchmark.py` to test whether the model is running correctly.

Nerva05251228 commented 1 year ago

> Hi, can you try re-clone the repo and re-install llama2-wrapper? It shouldn't happen. And you also can try python benchmark.py to test if model is correctly running.

Thank you! I have already solved this problem!