zjunlp / KnowPrompt

[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction

Hi! How do I run this code with the BART model? Would you give an example of parameter settings? Thank you so much!! #17

Closed Drizze999 closed 1 year ago

Drizze999 commented 1 year ago

Hi! I used the parameters below, but the results were very bad. Maybe I ran it the wrong way; would you help me?

--max_epochs 30 --num_workers 8 --model_name_or_path facebook/bart-base --accumulate_grad_batches 1 --batch_size 16 --data_dir dataset/semeval/k-shot/8-1 --check_val_every_n_epoch 1 --data_class BartREDataset --max_seq_length 256 --model_class BartRE --wandb --litmodel_class BartRELitModel --task_name wiki80 --lr 2e-5

njcx-ai commented 1 year ago

Hi. Thanks for your attention. We did not run experiments on BART models, so we have no accumulated experience tuning parameters for BART. Still, KnowPrompt can in principle be adapted to BART. We suggest feeding the sentence into the encoder and the prompt sequence into the decoder. Moreover, the prompt template can be adjusted to something like "the relation between [sub] Apple [sub] and [obj] Steve Jobs [obj] is [mask]". Hope this helps.
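For illustration, a rough sketch of that encoder/decoder split with Hugging Face's facebook/bart-base could look like the following; the [sub]/[obj] marker tokens, the example sentence, and the way the mask logits are read out are illustrative assumptions, not code from this repository:

```python
# Sketch: sentence -> BART encoder, prompt template -> BART decoder.
# Marker tokens, example text, and the mask-logit readout are assumptions.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Register entity markers; BART already provides a <mask> token for the relation slot.
tokenizer.add_tokens(["[sub]", "[obj]"], special_tokens=True)
model.resize_token_embeddings(len(tokenizer))

sentence = "Steve Jobs founded Apple in 1976."
prompt = "the relation between [sub] Apple [sub] and [obj] Steve Jobs [obj] is <mask>"

enc = tokenizer(sentence, return_tensors="pt")   # fed to the encoder
dec = tokenizer(prompt, return_tensors="pt")     # fed to the decoder

with torch.no_grad():
    outputs = model(
        input_ids=enc.input_ids,
        attention_mask=enc.attention_mask,
        decoder_input_ids=dec.input_ids,
    )

# The decoder is autoregressive: logits at step p predict token p+1, so the
# step just before <mask> scores candidates for the relation slot. KnowPrompt
# would map these logits onto its learnable virtual answer words per relation.
mask_pos = (dec.input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
slot_logits = outputs.logits[0, mask_pos - 1]    # shape: (1, vocab_size)
print(slot_logits.shape)
```

Training would then attach a relation-classification loss on those slot logits (or on virtual answer-word embeddings) instead of the plain language-modeling loss.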

Drizze999 commented 1 year ago

Thank you for your help! I will try to tune the parameters. But I want to confirm that BartREDataset in data/dialogue.py, BartRELitModel in lit_models/transformer.py, and BartRE in models/init.py are the classes intended for experiments with BART. If so, I would modify them as a starting point. Looking forward to your reply.

CheaSim commented 1 year ago

Sure, you can run experiments with the BART model on the relation extraction datasets based on our code, but a seq2seq model may perform poorly on RE datasets. If you find something interesting, you are welcome to start a discussion in a new issue.