UBC-NLP / araT5

AraT5: Text-to-Text Transformers for Arabic Language Understanding

title generation output on fine-tuned AraT5 #14

Open · hadikhamoud opened this issue 1 year ago

hadikhamoud commented 1 year ago

Hello contributors! Thank you for the amazing project; wonderful work. I am working on a text classification task and found AraT5 very useful for my case. I went through Fine_tuning_AraT5.ipynb in the examples directory, trained the model on my own data following your "best results" instructions (22 epochs, etc.), and got class predictions that do not appear in the original set of classification labels. I then tried training on the sample data provided in the notebook (ARGEn_title_genration_sample_train.tsv) and got the following results:

[screenshot: generated predictions]

Please note that the predictions were mostly the same regardless of the training dataset (mine or the one provided).
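For context, here is a minimal sketch of how I run inference on the fine-tuned checkpoint. `MODEL_DIR`, `LABELS`, and the generation parameters are placeholders from my own setup, not taken from the notebook:

```python
# Minimal inference sketch; MODEL_DIR and LABELS are placeholders
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_DIR = "path/to/my-finetuned-arat5"   # my fine-tuned checkpoint
LABELS = {"positive", "negative"}          # my classification label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_DIR)
model.eval()

def predict(text: str) -> str:
    # Encode the input and generate a short output sequence,
    # which is then decoded back to a label string
    inputs = tokenizer(text, return_tensors="pt",
                       truncation=True, max_length=512)
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_length=8, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

pred = predict("some input text")
# The generated string is often not in my label set at all
print(pred, pred in LABELS)
```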

Am I missing something? Could you please help with this? It would be much appreciated.