THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0

Explicitly passing a `revision` #1482

Open KaiQiang-Liu1994 opened 3 months ago

KaiQiang-Liu1994 commented 3 months ago

Is there an existing issue for this?

Current Behavior

D:\anaconda\envs\buxueguLLM\python.exe D:\boxuegu_code\finance\finance_ie.py
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.

Process finished with exit code -1073741819 (0xC0000005)

The program terminates with these messages as soon as it starts; none of the output I wanted to print ever appears.
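
For context: these warnings come from `transformers` whenever `from_pretrained` is called with `trust_remote_code=True` but no pinned `revision`. A minimal loading script of the kind `finance_ie.py` presumably contains (the model path and prompt here are assumptions, not taken from the issue; this follows the usage shown in the ChatGLM-6B README):

```python
# Minimal ChatGLM-6B loading sketch (assumed to resemble finance_ie.py).
# Without revision=..., transformers prints the "Explicitly passing a
# `revision` is encouraged" warning once per custom-code load.
from transformers import AutoTokenizer, AutoModel

MODEL_PATH = "THUDM/chatglm-6b"  # assumption: a local checkpoint path may be used instead

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True).half().cuda()
model = model.eval()

response, history = model.chat(tokenizer, "Hello", history=[])
print(response)
```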

Expected Behavior

I hope someone can explain this problem to me.

Steps To Reproduce

Same command and output as under Current Behavior above.

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :
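
The fields above were left blank; a quick way to collect them is a short script using only standard introspection (a sketch, nothing here is specific to this issue):

```python
# Print the environment details requested by the issue template.
import platform
import sys

import torch
import transformers

print("OS:", platform.platform())
print("Python:", sys.version.split()[0])
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("CUDA Support:", torch.cuda.is_available())
```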

Anything else?

_No response_

GuHugo95 commented 1 month ago

I'm seeing this too!

Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards:   0%|          | 0/8 [00:00<?, ?it/s]D:\ProgramData\anaconda3\envs\*******\lib\site-packages\transformers\modeling_utils.py:415: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  return torch.load(checkpoint_file, map_location="cpu")
Loading checkpoint shards: 100%|██████████| 8/8 [00:47<00:00,  5.94s/it]

Process finished with exit code -1073741819 (0xC0000005)

GuHugo95 commented 1 month ago

I tried it myself; adding `revision="main"` resolves one of the warnings: tokenizer = AutoTokenizer.from_pretrained(**, trust_remote_code=True, revision="main")

GuHugo95 commented 1 month ago

Solution (worked for me): add `, revision="main"`: tokenizer = AutoTokenizer.from_pretrained(******, trust_remote_code=True, revision="main")
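
Applying the same pinning to every `from_pretrained` call should silence all three warnings; a sketch (the model path is a placeholder for the elided path above, and for real reproducibility a specific commit hash is a safer pin than "main"):

```python
# Pin the revision for both tokenizer and model so no custom-code load
# falls back to the moving default branch.
from transformers import AutoTokenizer, AutoModel

MODEL_PATH = "THUDM/chatglm-6b"  # placeholder for the path elided in the comment above

tokenizer = AutoTokenizer.from_pretrained(
    MODEL_PATH, trust_remote_code=True, revision="main"
)
model = AutoModel.from_pretrained(
    MODEL_PATH, trust_remote_code=True, revision="main"
)
```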

Reinstall an older torch version: pip install torch==1.12.0
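
Exit code -1073741819 (0xC0000005) is a Windows access violation in native code, which is why swapping the torch build can help even though the Python-level logs look clean. After reinstalling, it is worth confirming which build the environment actually resolves to (a sketch; note that the default pip wheel for torch 1.12.0 on Windows is CPU-only):

```python
# Confirm the torch build now in use after the downgrade.
import torch

print(torch.__version__)          # expect 1.12.0 after the reinstall
print(torch.cuda.is_available())  # False if the default CPU-only wheel was installed
```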