Open xhzheng1895 opened 4 weeks ago
phi-2 likewise fails with a different error depending on the transformers version. With transformers < 4.37.0:
Traceback (most recent call last):
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1404, in <module>
llm_exporter = llm_models[model_type](args)
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1142, in __init__
super().__init__(args)
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 121, in __init__
self.load_hf(args.path)
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 137, in load_hf
self.model = AutoModel.from_pretrained(model_path, trust_remote_code=True).float().eval()
File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 525, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 1050, in from_pretrained
config_class = CONFIG_MAPPING[config_dict["model_type"]]
File "/home/xinzhe02/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 748, in __getitem__
raise KeyError(key)
KeyError: 'phi'
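The KeyError comes from CONFIG_MAPPING: transformers only registered the 'phi' model_type natively in 4.37.0, so older releases cannot resolve the checkpoint's config at all. A minimal version-gate sketch (the helper name `supports_phi` is mine, not from llm_export.py):

```python
def supports_phi(tf_version: str) -> bool:
    """True if this transformers release has a native 'phi' entry in
    CONFIG_MAPPING (the model type was added in 4.37.0)."""
    major, minor = (int(x) for x in tf_version.split(".")[:2])
    return (major, minor) >= (4, 37)
```

Checking this before calling `AutoModel.from_pretrained` turns the opaque `KeyError: 'phi'` into an actionable "upgrade transformers" message.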
With transformers >= 4.37.0:
Traceback (most recent call last):
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1404, in <module>
llm_exporter = llm_models[model_type](args)
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1142, in __init__
super().__init__(args)
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 122, in __init__
self.load_model()
File "/data_sdb/demos/mnn-llm/models/llm-export/llm_export.py", line 1147, in load_model
transformer = self.model.transformer
File "/home/xinzhe02/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1709, in __getattr__
raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'PhiForCausalLM' object has no attribute 'transformer'
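This second failure is an attribute rename: the old remote-code Phi implementation exposed its backbone as `model.transformer`, while the native `PhiForCausalLM` shipped in 4.37+ exposes it as `model.model`. A hedged fallback sketch (the helper name `get_backbone` is hypothetical; the two attribute names match the layouts above):

```python
def get_backbone(model):
    """Return the transformer backbone whether the loaded model uses the
    pre-4.37 remote-code layout (.transformer) or the HF 4.37+ native
    layout (.model)."""
    for name in ("transformer", "model"):
        backbone = getattr(model, name, None)
        if backbone is not None:
            return backbone
    raise AttributeError(
        "no known backbone attribute on " + type(model).__name__)
```

Using a lookup like this in `load_model` would let one code path handle both transformers generations.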
My guess is that the corresponding .py files under llm_models need to replace the implementations in hf transformers.
I ran both Qwen1.5-1.8B-Chat and Qwen-1_8B-Chat and hit similar problems. Taking Qwen1.5-1.8B-Chat as an example, with transformers==4.31.0:
With the latest transformers (version==4.42.3):
The ONNX output directory does contain files here, so the failure likely happens during the ONNX-to-MNN conversion.
cmdline:
python3 llm_export.py --type Qwen1_5-1_8B-Chat \
    --path ../Qwen1.5-1.8B-Chat \
    --export --export_mnn \
    --onnx_path ./Qwen1_5-1_8B-Chat-Int4-S-ONNX \
    --mnn_path ./Qwen1_5-1_8B-Chat-Int4-S-MNN
python version: 3.9.19
All other packages except transformers follow requirement.txt.
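To narrow down whether the ONNX export itself is intact before blaming the MNN step, one can at least check that the export directory contains non-empty artifacts. A small sketch (the function name is mine; a real validation pass would additionally run onnx.checker from the onnx package):

```python
import os

def list_onnx_artifacts(onnx_dir):
    """Map each file in the ONNX export directory to its size in bytes,
    so an empty or truncated export is caught before MNN conversion."""
    sizes = {}
    for name in sorted(os.listdir(onnx_dir)):
        path = os.path.join(onnx_dir, name)
        if os.path.isfile(path):
            sizes[name] = os.path.getsize(path)
    return sizes
```

A zero-byte entry here would point at the export step rather than the converter.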