zejunwang1 / LLMTuner

Instruction-tuning toolkit for large language models (supports FlashAttention)

An error occurred #12

Open iwaitu opened 6 months ago

iwaitu commented 6 months ago

04/26/2024 09:39:08 - INFO - llmtuner.model.utils.checkpointing - Gradient checkpointing enabled.
04/26/2024 09:39:08 - INFO - llmtuner.model.utils.attention - Using torch SDPA for faster training and inference.
04/26/2024 09:39:08 - INFO - llmtuner.model.adapter - Fine-tuning method: LoRA
saves/Qwen1.5-0.5B/lora/qwen_fc
Exception in thread Thread-10 (run_exp):
Traceback (most recent call last):
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/site-packages/peft/config.py", line 197, in _get_peft_type
    config_file = hf_hub_download(
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 111, in _inner_fn
    validate_repo_id(arg_value)
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 159, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': 'saves/Qwen1.5-0.5B/lora/qwen_fc'. Use repo_type argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/iwaitu/unsloth/LLaMA-Factory/src/llmtuner/train/tuner.py", line 33, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
  File "/home/iwaitu/unsloth/LLaMA-Factory/src/llmtuner/train/sft/workflow.py", line 34, in run_sft
    model = load_model(tokenizer, model_args, finetuning_args, training_args.do_train)
  File "/home/iwaitu/unsloth/LLaMA-Factory/src/llmtuner/model/loader.py", line 136, in load_model
    model = init_adapter(config, model, model_args, finetuning_args, is_trainable)
  File "/home/iwaitu/unsloth/LLaMA-Factory/src/llmtuner/model/adapter.py", line 137, in init_adapter
    model = PeftModel.from_pretrained(
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/site-packages/peft/peft_model.py", line 328, in from_pretrained
    PeftConfig._get_peft_type(
  File "/home/iwaitu/anaconda3/envs/unsloth_env/lib/python3.10/site-packages/peft/config.py", line 203, in _get_peft_type
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at 'saves/Qwen1.5-0.5B/lora/qwen_fc'
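The traceback shows PEFT failing to find `adapter_config.json` in the local directory `saves/Qwen1.5-0.5B/lora/qwen_fc` and then, as a fallback, trying to interpret that path as a Hugging Face Hub repo id, which fails validation. A quick pre-check on the adapter directory can tell the two cases apart before calling `PeftModel.from_pretrained`. The helper below (`has_lora_adapter` is a hypothetical name for illustration, not part of LLMTuner or PEFT) is a minimal diagnostic sketch:

```python
import os

def has_lora_adapter(adapter_dir: str) -> bool:
    """Return True if adapter_dir looks like a saved PEFT LoRA adapter.

    PEFT loads from a local directory only when adapter_config.json is
    present there; otherwise it treats the string as a Hub repo id, which
    raises HFValidationError for a multi-segment path such as
    'saves/Qwen1.5-0.5B/lora/qwen_fc'.
    """
    return os.path.isfile(os.path.join(adapter_dir, "adapter_config.json"))

# Example: check the path from the traceback before loading.
if not has_lora_adapter("saves/Qwen1.5-0.5B/lora/qwen_fc"):
    print("No adapter_config.json found; run the LoRA training step first "
          "or point adapter_name_or_path at the directory it wrote.")
```

If the check fails, the usual cause is that the LoRA training run never completed (so no adapter was saved) or that the configured adapter path does not match the training output directory.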