PlusLabNLP / story-gen-BART


Ready-to-use Checkpoint of this model #7

Open DaehanKim opened 2 years ago

DaehanKim commented 2 years ago

Dear Authors, could you provide a ready-to-use checkpoint of this model for inference with inference_plot.py and inference_story.py? I only see instructions for the pretrained BART and tokenizer checkpoints. If I'm missing something, please let me know. Thank you.

figlang2022sharedtask commented 2 years ago

https://drive.google.com/drive/folders/1cOouBxVsORnNdQJuZlH9fu3ACc7p9CwG?usp=sharing

I think this folder has everything you need.
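
For anyone following along, a minimal sketch for mirroring that Drive folder locally with the gdown package (my own suggestion, not part of the repo; any Drive download method works):

```python
# Sketch: download the shared Google Drive folder of checkpoints (pip install gdown).
import gdown

gdown.download_folder(
    "https://drive.google.com/drive/folders/1cOouBxVsORnNdQJuZlH9fu3ACc7p9CwG",
    output="story-gen-checkpoints",  # local target directory (name is arbitrary)
    quiet=False,
)
# Point inference_plot.py / inference_story.py at the downloaded directory;
# the exact path arguments depend on those scripts' argument parsers.
```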

tuhinjubcse commented 2 years ago

I don't think the event discriminators are there, but could you please train them? Sorry, I don't have them with me in the Drive. @seraphinatarrant can help more.

DaehanKim commented 2 years ago

Thank you for the reply. Where can I find a script for coreference resolution and semantic role labeling? If there's no such script, how can I process raw story text into silver-standard plots as in the paper? Also, do we use the same plot model for revising plots? Thanks in advance.

tuhinjubcse commented 2 years ago

https://github.com/PlusLabNLP/story-gen-BART/tree/master/srl_plot_preprocessing
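
For a rough picture of what that preprocessing does, here is a sketch (not the repo's actual script) that runs coreference resolution and semantic role labeling over raw story text with off-the-shelf AllenNLP predictors; the model archives below are the public AllenNLP ones and may differ from the versions used for the paper:

```python
# Sketch: extract silver-standard plot material (verb frames and entity clusters)
# from raw story text using AllenNLP coreference and SRL predictors.
from allennlp.predictors.predictor import Predictor

coref = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2021.03.10.tar.gz"
)
srl = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/structured-prediction-srl-bert.2020.12.15.tar.gz"
)

story = "Anna found a key. She opened the old door."

# Coreference clusters group repeated mentions so they can later be replaced
# by entity placeholders when the plot is assembled.
print(coref.predict(document=story)["clusters"])

# SRL produces one frame per verb (V, ARG0, ARG1, ...); these predicate-argument
# structures are the raw material for the plot lines.
for sentence in story.split(". "):
    for frame in srl.predict(sentence=sentence)["verbs"]:
        print(frame["description"])
```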

DaehanKim commented 2 years ago

Thank you!

DaehanKim commented 2 years ago

Hi, I'm trying to generate plots using the two rescorers (entity and relevance). This requires input0.dict.txt and label.dict.txt in the RoBERTa checkpoint folder, but I can't find these files in the Drive link above. Should I retrain all of these models?
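
In case it helps anyone hitting the same error: those file names follow fairseq's classification setup, where the input0 and label dictionaries produced during preprocessing must sit alongside the rescorer checkpoint. A sketch of loading such a RoBERTa classifier with fairseq's documented API, using hypothetical directory names rather than the authors' exact layout:

```python
# Sketch: load a fairseq RoBERTa classification checkpoint of the kind used as a rescorer.
# from_pretrained() reads the input0/label dictionaries from data_name_or_path, which is
# why inference fails when those dict files are missing.
from fairseq.models.roberta import RobertaModel

rescorer = RobertaModel.from_pretrained(
    "rescorer_checkpoints/entity",          # hypothetical checkpoint directory
    checkpoint_file="checkpoint_best.pt",
    data_name_or_path="rescorer-data-bin",  # hypothetical dir holding the dictionaries
)
rescorer.eval()

tokens = rescorer.encode("plot line to score")
# 'sentence_classification_head' is fairseq's default classification head name.
print(rescorer.predict("sentence_classification_head", tokens))
```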