Reminder
System Info
llamafactory version: 0.7.2.dev0

Reproduction
Command:

llamafactory-cli eval examples/lora_single_gpu/llama3_lora_eval.yaml

Error:

Traceback (most recent call last):
File "/home/ps/miniconda3/envs/llamafactory/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1029, in from_pretrained
config_class = CONFIG_MAPPING[config_dict["model_type"]]
File "/home/ps/miniconda3/envs/llamafactory/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 731, in __getitem__
raise KeyError(key)
KeyError: 'llava_qwen2'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ps/miniconda3/envs/llamafactory/bin/llamafactory-cli", line 8, in <module>
sys.exit(main())
File "/home/ps/proj/LLaMA-Factory/src/llamafactory/cli.py", line 71, in main
run_eval()
File "/home/ps/proj/LLaMA-Factory/src/llamafactory/eval/evaluator.py", line 122, in run_eval
Evaluator().eval()
File "/home/ps/proj/LLaMA-Factory/src/llamafactory/eval/evaluator.py", line 27, in __init__
self.model = load_model(self.tokenizer, self.model_args, finetuning_args)
File "/home/ps/proj/LLaMA-Factory/src/llamafactory/model/loader.py", line 115, in load_model
config = load_config(model_args)
File "/home/ps/proj/LLaMA-Factory/src/llamafactory/model/loader.py", line 101, in load_config
return AutoConfig.from_pretrained(model_args.model_name_or_path, **init_kwargs)
File "/home/ps/miniconda3/envs/llamafactory/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1031, in from_pretrained
raise ValueError(
ValueError: The checkpoint you are trying to load has model type llava_qwen2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Expected behavior
The model I want to evaluate is huatuo. Does this project support it? If so, how should I download it correctly, or is it possible to change the model type of the checkpoint being evaluated?
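For reference, the `model_type` that Transformers complains about is read from the checkpoint's `config.json`. A minimal sketch for inspecting what the checkpoint actually declares (assuming a standard checkpoint layout; the directory path is a placeholder):

```python
import json
from pathlib import Path

def read_model_type(checkpoint_dir: str) -> str:
    """Return the model_type declared in the checkpoint's config.json."""
    config_path = Path(checkpoint_dir) / "config.json"
    config = json.loads(config_path.read_text())
    return config["model_type"]

# Placeholder usage:
# read_model_type("/path/to/huatuo/checkpoint")  # would return 'llava_qwen2' here
```

Note that simply renaming `model_type` in `config.json` is unlikely to help by itself, since Transformers still needs a registered architecture for whatever name it finds; the usual options are upgrading `transformers` to a version that knows the architecture, or loading with `trust_remote_code=True` if the checkpoint repository ships its own modeling code.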
Others
No response