Facico / Chinese-Vicuna

Chinese-Vicuna: A Chinese Instruction-following LLaMA-based Model - a low-resource Chinese llama+lora recipe, with a structure modeled on alpaca
https://github.com/Facico/Chinese-Vicuna
Apache License 2.0

ValueError: Can't find 'adapter_config.json' at './lora-Vicuna/checkpoint-final' #185

Closed · adaaaaaa closed this issue 1 year ago

adaaaaaa commented 1 year ago

[nano@archlinux Chinese-Vicuna]$ python interaction.py
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to:
https://github.com/TimDettmers/bitsandbytes/issues

The tokenizer class you load from this checkpoint is not the same type as the class this function is called from. It may result in unexpected tokenization. The tokenizer class you load from this checkpoint is 'LLaMATokenizer'. The class this function is called from is 'LlamaTokenizer'.
normalizer.cc(51) LOG(INFO) precompiled_charsmap is empty. use identity normalization.
./lora-Vicuna/checkpoint-final/adapter_model.bin
./lora-Vicuna/checkpoint-final/pytorch_model.bin
Loading checkpoint shards: 100%|████████████| 33/33 [00:10<00:00, 3.01it/s]

Traceback (most recent call last):

  /home/nano/.local/lib/python3.10/site-packages/peft/utils/config.py:99 in from_pretrained
      96 │   │   │   config_file = os.path.join(pretrained_model_name_or_path, CONFIG_NAME)
      97 │   │   else:
      98 │   │   │   try:
  ❱   99 │   │   │   │   config_file = hf_hub_download(pretrained_model_name_or_path, CONFIG_NAME
     100 │   │   │   except Exception:
     101 │   │   │   │   raise ValueError(f"Can't find '{CONFIG_NAME}' at '{pretrained_model_name
     102

  /home/nano/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py:112 in _inner_fn
     109 │   │   │   kwargs.items(),  # Kwargs values
     110 │   │   ):
     111 │   │   │   if arg_name in ["repo_id", "from_id", "to_id"]:
  ❱  112 │   │   │   │   validate_repo_id(arg_value)
     113 │
     114 │   │   │   elif arg_name == "token" and arg_value is not None:
     115 │   │   │   │   has_token = True

  /home/nano/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py:160 in validate_repo_id
     157 │   │   raise HFValidationError(f"Repo id must be a string, not {type(repo_id)}: '{repo
     158 │
     159 │   if repo_id.count("/") > 1:
  ❱  160 │   │   raise HFValidationError(
     161 │   │   │   "Repo id must be in the form 'repo_name' or 'namespace/repo_name':"
     162 │   │   │   f" '{repo_id}'. Use repo_type argument if needed."
     163 │   │   )

HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './lora-Vicuna/checkpoint-final'. Use repo_type argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  /data/Chinese-Vicuna/interaction.py:57
      54 │   │   torch_dtype=torch.float16,
      55 │   │   device_map="auto",
      56 │   )
  ❱   57 │   model = PeftModel.from_pretrained(
      58 │   │   model,
      59 │   │   LORA_WEIGHTS,
      60 │   │   torch_dtype=torch.float16,

  /home/nano/.local/lib/python3.10/site-packages/peft/peft_model.py:137 in from_pretrained
     134 │   │   from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING, PEFT_TYPE_TO_CONFIG_MAPPI
     135 │   │
     136 │   │   # load the config
  ❱  137 │   │   config = PEFT_TYPE_TO_CONFIG_MAPPING[PeftConfig.from_pretrained(model_id).peft_t
     138 │   │
     139 │   │   if getattr(model, "hf_device_map", None) is not None:
     140 │   │   │   remove_hook_from_submodules(model)

  /home/nano/.local/lib/python3.10/site-packages/peft/utils/config.py:101 in from_pretrained
      98 │   │   │   try:
      99 │   │   │   │   config_file = hf_hub_download(pretrained_model_name_or_path, CONFIG_NAME
     100 │   │   │   except Exception:
  ❱  101 │   │   │   │   raise ValueError(f"Can't find '{CONFIG_NAME}' at '{pretrained_model_name
     102 │
     103 │   │   loaded_attributes = cls.from_json_file(config_file)
     104 │

ValueError: Can't find 'adapter_config.json' at './lora-Vicuna/checkpoint-final'
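The second traceback points at the root cause: PeftConfig.from_pretrained first looks for adapter_config.json inside ./lora-Vicuna/checkpoint-final, and only because that file is missing does it fall back to treating the path as a Hub repo id, which then fails validation. Below is a minimal sketch of how one might check the checkpoint directory and, if only the weights were saved, recreate the config locally; the LoraConfig hyperparameters are placeholders and would have to match the values actually used for training.

```python
# Sketch only: verify what is actually inside the checkpoint directory.
import os

ckpt = "./lora-Vicuna/checkpoint-final"
print(os.listdir(ckpt))  # expected: adapter_config.json and adapter_model.bin

# If adapter_config.json is missing but adapter_model.bin exists, it can be
# recreated by hand. These hyperparameters are illustrative placeholders;
# they must match the settings the LoRA adapter was trained with.
from peft import LoraConfig

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
config.save_pretrained(ckpt)  # writes adapter_config.json next to the weights
```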

jingtian11 commented 1 year ago

Has this problem been solved?

Facico commented 1 year ago

We have uploaded these files to Hugging Face. You can replace "./lora-Vicuna/checkpoint-final" with this: "https://huggingface.co/Chinese-Vicuna/Chinese-Vicuna-lora-7b-belle-and-guanaco".
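For anyone hitting the same error, here is a minimal sketch of how that suggestion plugs into the loading code shown in the traceback. The variable names and the base-model repo id below are assumptions for illustration; note that PeftModel.from_pretrained takes the Hub repo id "Chinese-Vicuna/Chinese-Vicuna-lora-7b-belle-and-guanaco" rather than the full URL or the missing local path.

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

# Assumption: whichever base LLaMA weights the adapter was trained against.
BASE_MODEL = "decapoda-research/llama-7b-hf"
# Hub repo id for the adapter (not a URL, not the local checkpoint path).
LORA_WEIGHTS = "Chinese-Vicuna/Chinese-Vicuna-lora-7b-belle-and-guanaco"

tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map="auto",
)
# PEFT downloads adapter_config.json and adapter_model.bin from the Hub here,
# instead of raising because the local directory lacks adapter_config.json.
model = PeftModel.from_pretrained(
    model,
    LORA_WEIGHTS,
    torch_dtype=torch.float16,
)
```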