UBC-NLP / araT5

AraT5: Text-to-Text Transformers for Arabic Language Understanding

Shouldn't we use Prefixes? #13

Open Moustafa-Banbouk opened 1 year ago

Moustafa-Banbouk commented 1 year ago

Thanks a lot for the great code. Just wondering: if I want to fine-tune for closed-book question answering, do I need to specify a particular prefix before fine-tuning, or are plain question–answer pairs enough?
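For context, here is a minimal sketch of what I mean by adding a prefix when preparing the fine-tuning pairs. The prefix string `"qa: "` and the helper name are my own illustration, not something taken from the AraT5 code:

```python
# Sketch: prepend a task prefix to closed-book QA examples before
# seq2seq fine-tuning. The prefix "qa: " is an assumption for
# illustration, not a prefix mandated by the AraT5 repo.

def build_example(question: str, answer: str, prefix: str = "qa: "):
    """Return an (input_text, target_text) pair for fine-tuning."""
    return prefix + question, answer

pairs = [
    ("What is the capital of Lebanon?", "Beirut"),
    ("Who wrote the Muqaddimah?", "Ibn Khaldun"),
]

examples = [build_example(q, a) for q, a in pairs]
print(examples[0][0])  # prints "qa: What is the capital of Lebanon?"
```

The alternative would be passing the question text through unchanged and relying on the question–answer pairs alone.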