Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

Cannot import a module #153

Open bao-xiaoyi opened 10 months ago

bao-xiaoyi commented 10 months ago

File "/home/LLaMA2-Accessory/accessory/model/meta.py", line 29, in __init__
    model_module = importlib.import_module(f"accessory.model.LLM.{llama_type}")
File "/home/.conda/envs/llama-accessory/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
File "", line 1050, in _gcd_import
File "", line 1027, in _find_and_load
File "", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'accessory.model.LLM.mistral'
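For reference, the failing line in meta.py builds a module path from the `llama_type` string and loads it dynamically. The sketch below mirrors that pattern with a stdlib module (`json`) so it runs anywhere; `load_backend` and its error message are hypothetical, not LLaMA2-Accessory's actual code.

```python
import importlib


def load_backend(name: str):
    """Dynamically import a module by dotted path, mirroring the
    importlib.import_module(f"accessory.model.LLM.{llama_type}") call in meta.py."""
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError as e:
        # A ModuleNotFoundError here usually means the checkout/installation
        # is missing the file (e.g. an outdated clone without mistral.py).
        raise ValueError(
            f"no module {name!r}: check that your LLaMA2-Accessory checkout contains it"
        ) from e


# Works for any importable dotted path:
mod = load_backend("json")
assert mod.loads("[1, 2]") == [1, 2]
```

In this issue, the fix was updating the repository so that `accessory/model/LLM/mistral.py` actually exists on disk.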

bao-xiaoyi commented 10 months ago

There is another symptom: model.generate outputs a string of question marks, like this: [image]. For now I can only get somewhat normal results via model.stream_generate.
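One common cause of "question mark" output (an assumption here, not confirmed for this issue) is decoding multi-byte UTF-8 characters fragment by fragment: each incomplete byte sequence becomes a replacement character. The minimal sketch below reproduces that failure mode with plain Python bytes, independent of any model code:

```python
# Chinese text encodes to 3 bytes per character in UTF-8.
text = "你好"
data = text.encode("utf-8")  # 6 bytes total

# Decoding each byte in isolation (as a naive per-token detokenizer might)
# turns every byte into U+FFFD, which many terminals render as "?".
broken = "".join(bytes([b]).decode("utf-8", errors="replace") for b in data)
assert broken == "\ufffd" * 6

# Decoding the accumulated bytes once recovers the original text.
assert data.decode("utf-8") == "你好"
```

This would also be consistent with stream_generate looking "more normal" if it buffers bytes differently before decoding.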

bao-xiaoyi commented 10 months ago

The module import problem has been resolved.

ChrisLiu6 commented 9 months ago

> There is another symptom: model.generate outputs a string of question marks, like this: [image]. For now I can only get somewhat normal results via model.stream_generate.

Thank you for the feedback. Please make sure that you have pulled the latest version of LLaMA2-Accessory. If the problem still exists, could you please share the code you are running?