zjunlp / DeepKE

[EMNLP 2022] An Open Toolkit for Knowledge Graph Extraction and Construction
http://deepke.zjukg.cn/
MIT License
3.52k stars 682 forks

Bro, please help #471

Closed Yanglemin closed 6 months ago

Yanglemin commented 6 months ago

Bro, run.py just won't start for me and keeps printing the output below. I had ChatGPT take a look and it seems the model can't be downloaded. Is the site down or something? And nothing has been updated either, bro.

```
(deepke) C:\Users\Administrator\DeepKE\example\re\standard>python run.py
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\core\utils.py:207: UserWarning: Using config_path to specify the config name is deprecated, specify the config name via config_name
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/config_path_changes
  warnings.warn(category=UserWarning, message=msg)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive hydra/output/custom.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive preprocess.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive train.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive embedding.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive predict.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\hydra\plugins\config_source.py:190: UserWarning: Missing @package directive model/lm.yaml in file://C:\Users\Administrator\DeepKE\example\re\standard\conf.
See https://hydra.cc/docs/next/upgrades/0.11_to_1.0/adding_a_package_directive
  warnings.warn(message=msg, category=UserWarning)
C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\omegaconf\basecontainer.py:225: UserWarning: cfg.pretty() is deprecated and will be removed in a future version. Use OmegaConf.to_yaml(cfg)
  warnings.warn(
[2024-04-19 11:35:31,713][__main__][INFO] - cwd: C:\Users\Administrator\DeepKE\example\re\standard
use_wandb: false
preprocess: true
data_path: data/origin
out_path: data/out
chinese_split: true
replace_entity_with_type: true
replace_entity_with_scope: true
min_freq: 3
pos_limit: 30
seed: 1
use_gpu: true
gpu_id: 0
epoch: 50
batch_size: 32
learning_rate: 0.0003
lr_factor: 0.7
lr_patience: 3
weight_decay: 0.001
early_stopping_patience: 6
train_log: true
log_interval: 10
show_plot: false
only_comparison_plot: false
plot_utils: matplot
predict_plot: false
use_multi_gpu: false
gpu_ids: 0,1
vocab_size: ???
word_dim: 60
pos_size: 62
pos_dim: 10
dim_strategy: sum
num_relations: 11
fp: xxx/checkpoints/2019-12-03_17-35-30/cnn_epoch21.pth
model_name: lm
lm_file: bert-base-chinese
num_hidden_layers: 1
type_rnn: LSTM
input_size: 768
hidden_size: 100
num_layers: 1
dropout: 0.3
bidirectional: true
last_layer_hn: true

[2024-04-19 11:35:31,713][__main__][INFO] - device: cpu
[2024-04-19 11:35:31,717][deepke.relation_extraction.standard.tools.preprocess][INFO] - ===== start preprocess data =====
[2024-04-19 11:35:31,722][deepke.relation_extraction.standard.tools.preprocess][INFO] - load raw files...
[2024-04-19 11:35:31,722][utils.ioUtils][INFO] - load csv from C:\Users\Administrator\DeepKE\example\re\standard\data/origin\train.csv
[2024-04-19 11:35:31,734][utils.ioUtils][INFO] - load csv from C:\Users\Administrator\DeepKE\example\re\standard\data/origin\valid.csv
[2024-04-19 11:35:31,742][utils.ioUtils][INFO] - load csv from C:\Users\Administrator\DeepKE\example\re\standard\data/origin\test.csv
[2024-04-19 11:35:31,746][utils.ioUtils][INFO] - load csv from C:\Users\Administrator\DeepKE\example\re\standard\data/origin\relation.csv
[2024-04-19 11:35:31,746][deepke.relation_extraction.standard.tools.preprocess][INFO] - clean data...
[2024-04-19 11:35:31,750][deepke.relation_extraction.standard.tools.preprocess][INFO] - convert relation into index...
[2024-04-19 11:35:31,750][deepke.relation_extraction.standard.tools.preprocess][INFO] - verify whether use pretrained language models...
[2024-04-19 11:35:31,754][deepke.relation_extraction.standard.tools.preprocess][INFO] - use pretrained language models serialize sentence...
[2024-04-19 11:35:31,759][deepke.relation_extraction.standard.tools.preprocess][INFO] - use bert tokenizer...
'HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-chinese/resolve/main/vocab.txt (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000025C83F4A8B0>, 'Connection to huggingface.co timed out. (connect timeout=10)'))' thrown while requesting HEAD https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt
[2024-04-19 11:35:41,774][huggingface_hub.utils._http][WARNING] - 'HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-chinese/resolve/main/vocab.txt (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000025C83F4A8B0>, 'Connection to huggingface.co timed out. (connect timeout=10)'))' thrown while requesting HEAD https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt
Traceback (most recent call last):
  File "run.py", line 61, in main
    preprocess(cfg)
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\deepke-2.2.7-py3.8.egg\deepke\relation_extraction\standard\tools\preprocess.py", line 208, in preprocess
    _lm_serialize(train_data, cfg)
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\deepke-2.2.7-py3.8.egg\deepke\relation_extraction\standard\tools\preprocess.py", line 120, in _lm_serialize
    tokenizer = BertTokenizer.from_pretrained(cfg.lm_file)
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\transformers\tokenization_utils_base.py", line 1763, in from_pretrained
    resolved_vocab_files[file_id] = cached_file(
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\transformers\utils\hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\huggingface_hub\utils\_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\deepke\lib\site-packages\huggingface_hub\file_download.py", line 1148, in hf_hub_download
    with open(ref_path) as f:
FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\Administrator/.cache\huggingface\hub\models--bert-base-chinese\refs\main'

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.

(deepke) C:\Users\Administrator\DeepKE\example\re\standard>
```

zxlzr commented 6 months ago

You can get around the network restriction to download the model, e.g. use the https://hf-mirror.com/ proxy mirror, or download the model locally from ModelScope or wisemodel and then run.
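For the mirror route, one common approach (assuming a recent enough huggingface_hub that honors the `HF_ENDPOINT` variable) is to point the library at the mirror before launching:

```shell
# Route Hugging Face downloads through the hf-mirror.com mirror.
# HF_ENDPOINT is read by huggingface_hub at import time.
export HF_ENDPOINT=https://hf-mirror.com    # Linux / macOS
# On Windows cmd the equivalent is:  set HF_ENDPOINT=https://hf-mirror.com
python run.py
```

This avoids editing any config files; the tokenizer download simply goes through the mirror instead of huggingface.co.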

orangeper commented 6 months ago

Download the PyTorch version of the BERT pretrained model, put the three files into the pretrained folder as described in the README there, then change the lm_file path in lm.yaml to point to that folder.
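Concretely, once config.json, pytorch_model.bin, and vocab.txt sit in a local folder, the lm_file entry in conf/model/lm.yaml is switched from the hub id to that folder; the directory path below is only an example:

```yaml
# conf/model/lm.yaml (excerpt) -- lm_file changed from the hub id
# "bert-base-chinese" to a local directory holding the three files
model_name: lm
lm_file: C:/Users/Administrator/DeepKE/pretrained/bert-base-chinese
```

With a local path, `BertTokenizer.from_pretrained(cfg.lm_file)` never touches the network, so the connection timeout disappears.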

xxupiano commented 6 months ago

Hello, the slashes in `'C:\Users\Administrator/.cache\huggingface\hub\models--bert-base-chinese\refs\main'` are inconsistent. If you are on Windows, use `\`; we recommend using Linux, where paths use `/`.
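The mixed separators come from gluing a Windows home directory onto POSIX-style path fragments as plain strings. A small illustration of the difference (not DeepKE code, just a sketch):

```python
from pathlib import PureWindowsPath

# Plain string concatenation reproduces the mixed path from the traceback:
home = r"C:\Users\Administrator"
mixed = home + "/.cache" + r"\huggingface"
print(mixed)   # C:\Users\Administrator/.cache\huggingface

# pathlib joins the pieces with the right separator for the platform:
clean = PureWindowsPath(home, ".cache", "huggingface")
print(clean)   # C:\Users\Administrator\.cache\huggingface
```

Windows itself usually tolerates the mixed form, which is why the real failure here is the missing cache file, not the slashes, but normalized paths make tracebacks like this far easier to read.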

zxlzr commented 6 months ago

Has your issue been resolved?