Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

TypeError: 'type' object is not subscriptable. How can this problem be solved? #150

Open zhanghaobucunzai opened 8 months ago

zhanghaobucunzai commented 8 months ago

1. Running `bash exps/finetune/mm/alpacaLlava_llamaQformerv2_13B.sh ./LLaMA2-Accessory ./13B_params.json ./tokenizer.model` fails with the following error:

```
Traceback (most recent call last):
  File "main_finetune.py", line 46, in <module>
    from accessory.util.tensor_type import default_tensor_type, promote_trainable_params_to_fp32
  File "/home/LLaMA2-Accessory-main/accessory/util/tensor_type.py", line 7, in <module>
    class default_tensor_type:
  File "/home/LLaMA2-Accessory-main/accessory/util/tensor_type.py", line 37, in default_tensor_type
    exc_type: Optional[type[BaseException]],
TypeError: 'type' object is not subscriptable
```

2. As a temporary workaround, I changed line 37 of `LLaMA2-Accessory-main/accessory/util/tensor_type.py` from `exc_type: Optional[type[BaseException]]` to `exc_type: Optional[type([BaseException])]` and continued (see the sketch after this list for a more portable rewrite).

3. I then hit a new error:

```
Traceback (most recent call last):
  File "main_finetune.py", line 47, in <module>
    from accessory.model.meta import MetaModel
  File "/home/LLaMA2-Accessory-main/accessory/model/meta.py", line 14, in <module>
    from accessory.util import misc, tensor_parallel
  File "/home//LLaMA2-Accessory-main/accessory/util/tensor_parallel.py", line 86, in <module>
    ) -> OrderedDict[str, torch.Tensor]:
TypeError: 'type' object is not subscriptable
```
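Both tracebacks have the same cause: subscripting built-in classes in annotations (`type[BaseException]`, `collections.OrderedDict[str, torch.Tensor]`) was only added in Python 3.9 (PEP 585), so the annotations fail to evaluate on Python 3.8. Note that the step-2 edit sidesteps the exception only because `type([BaseException])` evaluates to `list` at runtime, which silently changes the annotation to `Optional[list]`. Below is a minimal sketch, not the repository's actual code, of how the two annotations from the tracebacks could be written so they also parse on 3.8; the surrounding signatures and the helper name `load_state_dict_sketch` are illustrative assumptions.

```python
# Sketch only: Python 3.8-compatible spellings of the annotations seen in
# the tracebacks. Function bodies are placeholders, not project code.
from __future__ import annotations  # defers annotation evaluation, so type[...] is tolerated as a string

from typing import Optional, Type, OrderedDict as OrderedDictT
import torch


class default_tensor_type:
    def __exit__(
        self,
        exc_type: Optional[Type[BaseException]],  # typing.Type works on 3.8; type[...] needs 3.9+
        exc_val: Optional[BaseException],
        exc_tb: object,
    ) -> None:
        ...


def load_state_dict_sketch(path: str) -> OrderedDictT[str, torch.Tensor]:
    # typing.OrderedDict[str, torch.Tensor] is valid on 3.8,
    # unlike collections.OrderedDict[str, torch.Tensor].
    return torch.load(path, map_location="cpu")
```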

ChrisLiu6 commented 8 months ago

Are you using Python 3.8? Updating to Python 3.10 should fix it.
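For context, a quick interpreter check (generic Python, not project code) shows the version dependence being pointed to here:

```python
# Subscripting built-in generics was added in Python 3.9 (PEP 585), so the
# same expression that works on 3.10 raises TypeError on 3.8 when evaluated
# eagerly.
import sys
from collections import OrderedDict

print(sys.version_info)

try:
    alias = type[BaseException]       # OK on 3.9+, TypeError on 3.8
    mapping = OrderedDict[str, int]   # likewise
    print("PEP 585 generics available:", alias, mapping)
except TypeError as err:
    print("Running on <3.9, got:", err)
```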

zhanghaobucunzai commented 8 months ago

Yes, I was using Python 3.8. Later I also ran into a `RuntimeError: Tensors must be CUDA and dense` error.
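That message is typically raised by `torch.distributed` collectives when the NCCL backend receives a CPU (or sparse) tensor; whether that is what happens in this training script is an assumption, not a diagnosis. A generic illustration of the condition, with a hypothetical helper name `safe_all_reduce`:

```python
# Assumed cause, not the repository's code: NCCL collectives such as
# all_reduce require dense CUDA tensors, so passing a CPU tensor raises
# "RuntimeError: Tensors must be CUDA and dense".
import torch
import torch.distributed as dist


def safe_all_reduce(t: torch.Tensor) -> torch.Tensor:
    # Requires an initialized default process group.
    # Move the tensor onto the current GPU before the collective when the
    # process group uses the NCCL backend.
    if dist.get_backend() == "nccl" and not t.is_cuda:
        t = t.cuda(torch.cuda.current_device())
    dist.all_reduce(t)
    return t
```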