Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

Warning instead of Error #83

Closed yeonju7kim closed 10 months ago

yeonju7kim commented 10 months ago

Hello. I'm currently using this repo a lot. Thank you for your good work.

https://github.com/Alpha-VLLM/LLaMA2-Accessory/blob/a76a29070a718e1aad4d4e493503d278b29d5c6a/accessory/configs/global_configs.py#L6C1-L6C28

I think the line above should catch a broader exception than `except ModuleNotFoundError` — perhaps just a bare `except`.

ChrisLiu6 commented 10 months ago

You are right, thank you! `ModuleNotFoundError` has now been changed to the more general `ImportError`, see https://github.com/Alpha-VLLM/LLaMA2-Accessory/commit/448bcaf3695c955e0de9079a1606f2c605ae16cf
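For context on why `ImportError` is the better choice here: `ModuleNotFoundError` is a subclass of `ImportError`, so it only covers the case where the module isn't installed at all. An installed package can still fail to import (e.g. a broken compiled extension or a missing shared library), and that raises a plain `ImportError`. A minimal sketch of the optional-import guard pattern (the `try_import` helper is hypothetical, not part of the repo):

```python
# ModuleNotFoundError is a subclass of ImportError, so catching
# ImportError also handles the module-not-found case, plus failures
# raised while importing a package that *is* installed.
print(issubclass(ModuleNotFoundError, ImportError))  # True

def try_import(name: str) -> bool:
    """Return True if `name` imports cleanly, False on any import failure."""
    try:
        __import__(name)
        return True
    except ImportError:
        # Covers both "not installed" and "installed but broken".
        return False

print(try_import("json"))            # True: stdlib module
print(try_import("no_such_module"))  # False: ModuleNotFoundError is caught
```

Catching `ImportError` (rather than a bare `except`) keeps unrelated errors, such as a `KeyboardInterrupt` or a genuine bug inside the guarded block, from being silently swallowed.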