goodbai-nlp / AMRBART

Code for our paper "Graph Pre-training for AMR Parsing and Generation" (ACL 2022)
MIT License

Excuse me, I meet with this problem #1

Closed cathyry closed 2 years ago

cathyry commented 2 years ago

When I run `bash eval_AMRbart_amrparsing.sh /path/to/fine-tuned/AMRBART/ gpu_id`, I get this error:

```
Global seed set to 42
Traceback (most recent call last):
  File "/home/jsj201-4/mount1/jym/AMR-Parser/AMRBART-main/fine-tune/run_amrparsing.py", line 157, in <module>
    main(args)
  File "/home/jsj201-4/mount1/jym/AMR-Parser/AMRBART-main/fine-tune/run_amrparsing.py", line 91, in main
    raw_graph=False,
  File "/home/jsj201-4/mount1/jym/AMR-Parser/AMRBART-main/spring/spring_amr/tokenization_bart.py", line 44, in from_pretrained
    inst = super().from_pretrained(pretrained_model_path, *args, **kwargs)
  File "/home/jsj201-4/.conda/envs/kinyum/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1708, in from_pretrained
    raise EnvironmentError(msg)
OSError: Can't load tokenizer for '../../../data/pretrained-model/bart-large'. Make sure that:
```

So what should I do?

goodbai-nlp commented 2 years ago

Hi,

You can replace `"../../../data/pretrained-model/bart-large"` with `"facebook/bart-large"` if your device has Internet access; otherwise, replace it with the local directory of the pre-trained model (it can be downloaded here).
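The path swap described above could also be automated. Below is a minimal sketch (not part of the AMRBART codebase; `resolve_bart_path` is a hypothetical helper) that prefers the local pre-trained model directory and falls back to the Hugging Face Hub ID when it is missing:

```python
import os

# Hypothetical helper: use the local BART directory if it exists on disk,
# otherwise fall back to the Hub ID, which requires Internet access.
def resolve_bart_path(local_path: str, hub_id: str = "facebook/bart-large") -> str:
    return local_path if os.path.isdir(local_path) else hub_id

# In the fine-tuning script one could then load the tokenizer via
# (assumes the transformers package and AMRBART's tokenizer class):
#   tokenizer = AMRBartTokenizer.from_pretrained(
#       resolve_bart_path("../../../data/pretrained-model/bart-large"))
```

This keeps the offline workflow (a downloaded model directory) working unchanged while making the online fallback automatic.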