goodbai-nlp / AMRBART

Code for our paper "Graph Pre-training for AMR Parsing and Generation" at ACL 2022
MIT License
92 stars · 28 forks

Question about fine-tuned models for AMR parsing #3

Closed cws7777 closed 2 years ago

cws7777 commented 2 years ago

Hi, first of all, thank you for your great work! I recently got interested in AMR and found your work among the accepted papers at ACL.

I have a question about the fine-tuned models that you shared for AMR parsing. Can those models be used on plain text to generate AMR graphs?? Also, do you provide code for parsing plain text into AMR graphs, like the SPRING model does?

Thanks! :) Hope you have a good one!

goodbai-nlp commented 2 years ago

Hi, thanks for your interest! Our fine-tuned models can be used for parsing plain text. Please follow the instructions here to prepare your data and run the parsing scripts.

cws7777 commented 2 years ago

Thanks a lot!!! :) I'll try! :D

cws7777 commented 2 years ago

Hi, I'm trying to download the pre-trained AMR parsing models (2.0 & 3.0) from the OneDrive links you shared in the README.md. After downloading around 1.2 GB, the transfer is interrupted with a "server/network error", and I have to start over from the beginning. Did you run into any problem when downloading the models?

cylnlp commented 2 years ago

Hi @cws7777, it works fine on my side. Did you try multiple times, and did they all return the same error?

cws7777 commented 2 years ago

Hm.. maybe our internet connection isn't that stable then.. :'(

I tried to download the 2.0 model (zip file) first last Friday, and it gave me the error. So I tried the 3.0 model, but it gave me the same error.. :'( I tried again this morning, and it happened again..

I'll try a few more times and let you know! :) Thanks for the quick reply!

Ah, I have one more question, about running inference on my own data. Since the pre-trained model already includes a config.json, should I just run the command bash inference_amr.sh /path/to/fine-tuned/AMRBART/ gpu_id with the directory of the pre-trained model? Or do I also need to edit the inference_amr.sh file, e.g. the tokenizer and the train/eval/test_data_file paths?
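For context, these are the kinds of settings I mean; the variable names below are my guesses from the README, not necessarily the script's exact ones:

```shell
# Hypothetical sketch of inference_amr.sh settings -- the names here are
# guesses for illustration, not the script's actual variables.
MODEL=$1                        # path or Hugging Face id of the fine-tuned AMRBART model
GPUID=$2                        # id of the GPU to run on
Tokenizer=facebook/bart-large   # assumed tokenizer, since AMRBART builds on BART
DataPath=data/examples          # directory holding the *.jsonl input files
```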

cylnlp commented 2 years ago

Hi @cws7777, that's weird, but don't worry: @muyeby is now uploading the models to Google Drive. Please give us some time, and you can try again later.

cws7777 commented 2 years ago

Oh.. wow..! Thank you so much for helping me out! :) I really appreciate it!!

goodbai-nlp commented 2 years ago

Hi @cws7777, we have uploaded the models to Hugging Face and updated the README, so you can try again. To run inference on your own data, you should edit the inference_amr.sh file and prepare your file in the same format as the examples.
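For example, a minimal input file might look like the sketch below; the field names ("sent", "amr") are illustrative guesses, so mirror whatever examples/data4parsing.jsonl actually contains:

```shell
# Write a one-sentence input file in the (assumed) jsonl format, then
# sanity-check that every line parses as JSON before running the parser.
cat > my_corpus.jsonl <<'EOF'
{"sent": "The boy wants to go.", "amr": ""}
EOF
python3 -c "import json; [json.loads(l) for l in open('my_corpus.jsonl')]; print('ok')"
```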

xu1998hz commented 2 years ago

Hi @muyeby, I also have a question about generating graphs for our own text data. I saw the models you uploaded for the Transformers library. I tried running the code on the example data to inspect the output. Specifically, I ran bash inference_amr.sh xfbai/AMRBART-large-finetuned-AMR2.0-AMRParsing 0 and modified datacate=examples. However, I don't seem to be able to generate graphs for data in the format of data/data4parsing.jsonl. Could you help me generate outputs for data/data4parsing.jsonl?

cws7777 commented 2 years ago

@muyeby Thanks for uploading! I will check it out and try! :)

cws7777 commented 2 years ago

@xu1998hz What kind of output are you getting?? Did you use the same format as the examples??

xu1998hz commented 2 years ago

@cws7777, yes, I used the example file to try to produce the outputs. Were you able to get any output?

xu1998hz commented 2 years ago

I don't get any output.

cws7777 commented 2 years ago

@xu1998hz No, not yet. I just downloaded the fine-tuned models from Hugging Face!

xu1998hz commented 2 years ago

Cool, please let me know if you can successfully generate text!

cws7777 commented 2 years ago

Did you just modify datacate and run the command bash inference_amr.sh xfbai/AMRBART-large-finetuned-AMR2.0-AMRParsing 0?

xu1998hz commented 2 years ago

I also moved the examples folder under data.

cws7777 commented 2 years ago

@xu1998hz Ah ha, didn't it give you an error? Did you modify the Tokenizer in the inference_amr.sh file as well??

cws7777 commented 2 years ago

@muyeby Can you give us an example of the detailed settings in inference_amr.sh? I don't know why, but it keeps giving me an error..! :'(

goodbai-nlp commented 2 years ago

Hi @cws7777 @xu1998hz, I have updated the scripts. Now you can modify the tokenizer path and run the command bash inference_amr.sh xfbai/AMRBART-large-finetuned-AMR2.0-AMRParsing 0 to get the output for examples/data4parsing.jsonl. If you still get any errors, please post them here.

xu1998hz commented 2 years ago

Thanks a lot! It works perfectly. @muyeby

cws7777 commented 2 years ago

Thanks for the updated scripts! :) I think the script runs through fine now, but I think my CUDA setup is the problem..! :'(

When I ran conda env update --name <env> --file requirements.yml to create the environment, it gave me a ResolvePackageNotFound error.

So I just installed the packages in requirements.yml one by one, skipping nvidia::cudatoolkit=11.1.1 and pytorch::pytorch=1.8.1=py3.8_cuda11.1_cudnn8.0.5_0,

and now PyTorch can't find any GPUs on my machine....

I think I have to figure out the right versions somehow..
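For reference, this is roughly what I tried by hand; the channel and version strings are copied from requirements.yml, and it assumes the local NVIDIA driver supports CUDA 11.1:

```shell
# Install the two packages that failed to resolve, pinned to the same
# versions/channels that requirements.yml lists (assumes a CUDA 11.1 driver).
conda install -c nvidia cudatoolkit=11.1.1
conda install -c pytorch pytorch=1.8.1=py3.8_cuda11.1_cudnn8.0.5_0
# Then check whether PyTorch can actually see a GPU:
python -c "import torch; print(torch.cuda.is_available())"
```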

cws7777 commented 2 years ago

@xu1998hz Did you follow the same steps as in the README.md for creating the env & installing requirements.yml?

cws7777 commented 2 years ago

@muyeby @xu1998hz I figured out the problem above, and everything went through fine.

Thanks for the help. :)