yuanzhoulvpi2017 / zero_nlp

Chinese NLP solutions (LLMs, data, models, training, inference)
MIT License
3.04k stars 369 forks

I keep getting this error. What counts as a local folder? I have already downloaded the model locally. chatglm-6b is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' #116

Open cat88hzh opened 1 year ago

cat88hzh commented 1 year ago
---------------------------------------------------------------------------
HTTPError                                 Traceback (most recent call last)
File [c:\Python310\lib\site-packages\huggingface_hub\utils\_errors.py:259](file:///C:/Python310/lib/site-packages/huggingface_hub/utils/_errors.py:259), in hf_raise_for_status(response, endpoint_name)
    258 try:
--> 259     response.raise_for_status()
    260 except HTTPError as e:

File [c:\Python310\lib\site-packages\requests\models.py:1021](file:///C:/Python310/lib/site-packages/requests/models.py:1021), in Response.raise_for_status(self)
   1020 if http_error_msg:
-> 1021     raise HTTPError(http_error_msg, response=self)

HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/chatglm-6b/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

RepositoryNotFoundError                   Traceback (most recent call last)
File [c:\Python310\lib\site-packages\transformers\utils\hub.py:409](file:///C:/Python310/lib/site-packages/transformers/utils/hub.py:409), in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    407 try:
    408     # Load from URL or cache if already cached
--> 409     resolved_file = hf_hub_download(
    410         path_or_repo_id,
    411         filename,
    412         subfolder=None if len(subfolder) == 0 else subfolder,
    413         revision=revision,
    414         cache_dir=cache_dir,
...
    434         f"'https://huggingface.co/{path_or_repo_id}' for available revisions."
    435     )

OSError: chatglm-6b is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
Output is truncated.
shaoqing404 commented 1 year ago

I'm running into this problem too.

yuanzhoulvpi2017 commented 1 year ago

You are both getting this error because you never actually downloaded the model I provided, chatglm6b-dddd.