Closed Zmeo closed 3 months ago
Hi,
Thank you for your feedback! To ensure that the script runs correctly, you need to add the following parameter setting to your script:
LLM_TYPE="llama3"
Additionally, make sure to include the --llm_type argument in your torchrun command, like this:
torchrun $DISTRIBUTED_ARGS finetune.py \
--model_name_or_path $MODEL \
--llm_type $LLM_TYPE \
... \
Please include the above line in your script, making sure it is set before execution. If you have any other questions, feel free to reach out.
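For reference, a minimal sketch of how the two pieces fit together in the launch script. The model path and `$DISTRIBUTED_ARGS` are placeholders carried over from the thread, not a complete working configuration:

```shell
# Sketch only: define LLM_TYPE alongside the other script variables.
MODEL="openbmb/MiniCPM-Llama3-V-2_5"  # assumed checkpoint path; replace with yours
LLM_TYPE="llama3"                     # must match the base LLM of the checkpoint

torchrun $DISTRIBUTED_ARGS finetune.py \
    --model_name_or_path $MODEL \
    --llm_type $LLM_TYPE \
    ...
```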
Thanks!
Thank you for your reply. After adding the parameter, a new error appears:
File "/home/kas/.conda/envs/MiniCPMV/lib/python3.10/site-packages/transformers/hf_argparser.py", line 347, in parse_args_into_dataclasses
    raise ValueError(f"Some specified arguments are not used by the HfArgumentParser: {remaining_args}")
ValueError: Some specified arguments are not used by the HfArgumentParser: ['--llm_type', 'llama3']
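(For context: `HfArgumentParser` builds on the stdlib `argparse` and raises `ValueError` when any flags are left unconsumed after parsing, which is what happens when the script's dataclasses have no `llm_type` field. A minimal stdlib sketch of the same failure mode, not actual MiniCPM code:)

```python
import argparse

# A parser that knows --model_name_or_path but not --llm_type,
# mirroring a finetune.py whose argument dataclasses lack an llm_type field.
parser = argparse.ArgumentParser()
parser.add_argument("--model_name_or_path")

# parse_known_args() separates recognized args from leftovers;
# HfArgumentParser raises ValueError when this leftover list is non-empty.
args, remaining = parser.parse_known_args(
    ["--model_name_or_path", "m", "--llm_type", "llama3"]
)
print(remaining)  # ['--llm_type', 'llama3']
```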
This looks like a model weight dtype mismatch. You can try adding the line model = model.to(device='cuda', dtype=torch.bfloat16)
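A minimal sketch of that cast. The `nn.Linear` here is a stand-in for the loaded model, since in the real script the object comes from the finetune code's `from_pretrained` call; the device argument is dropped so it runs on CPU:

```python
import torch
import torch.nn as nn

# Stand-in for the loaded model; in the real script this would be
# the MiniCPM-V model object.
model = nn.Linear(8, 8)

# Cast all parameters to bfloat16; add device='cuda' when a GPU is available,
# e.g. model.to(device='cuda', dtype=torch.bfloat16).
model = model.to(dtype=torch.bfloat16)
print(model.weight.dtype)  # torch.bfloat16
```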
Error message:
finetune script:
Data: