Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Jointly train Question Answering and Multiple Choice #291

Open tmquan opened 2 years ago

tmquan commented 2 years ago

🚀 Feature: Multi task NLP model

In an IELTS exam paper, there are several question types, such as Question Answering and Multiple Choice. The current implementation of lightning-transformers handles a single task well, but I wonder whether there is a way to jointly train two tasks at the same time. Because the context is shared between the two tasks, sharing the encoder would be beneficial.
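To make the shared-encoder idea concrete (this is not part of lightning-transformers; it is a minimal, untested sketch), a joint model could be a single `LightningModule` with one backbone and two heads. The batch format here is an assumption: each batch carries a `"task"` key, QA batches carry `start_positions`/`end_positions`, and multiple-choice batches are pre-flattened to `(batch * num_choices, seq_len)` with a `num_choices` entry.

```python
import torch
import pytorch_lightning as pl
from transformers import AutoModel


class JointQAMultipleChoiceModel(pl.LightningModule):
    """One shared encoder, two task heads: extractive QA and multiple choice."""

    def __init__(self, model_name: str = "bert-base-uncased", lr: float = 3e-5):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared between tasks
        hidden = self.encoder.config.hidden_size
        self.qa_head = torch.nn.Linear(hidden, 2)  # start/end logits per token
        self.mc_head = torch.nn.Linear(hidden, 1)  # one score per choice
        self.lr = lr

    def training_step(self, batch, batch_idx):
        # Assumed batch format: a dict tagged with a "task" key ("qa" or "mc").
        hidden = self.encoder(
            input_ids=batch["input_ids"],
            attention_mask=batch["attention_mask"],
        ).last_hidden_state
        if batch["task"] == "qa":
            # Token-level start/end classification, as in extractive QA models.
            start_logits, end_logits = self.qa_head(hidden).split(1, dim=-1)
            loss = (
                torch.nn.functional.cross_entropy(
                    start_logits.squeeze(-1), batch["start_positions"]
                )
                + torch.nn.functional.cross_entropy(
                    end_logits.squeeze(-1), batch["end_positions"]
                )
            ) / 2
        else:
            # Multiple choice: inputs were flattened to (batch * num_choices,
            # seq_len) upstream; score each choice from its [CLS] embedding.
            scores = self.mc_head(hidden[:, 0]).view(-1, batch["num_choices"])
            loss = torch.nn.functional.cross_entropy(scores, batch["labels"])
        self.log(f"train_loss_{batch['task']}", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)
```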

Alternatives

I found a reference that does this directly with Hugging Face Transformers, but I don't know how to structure it to fit lightning-transformers: https://colab.research.google.com/github/zphang/zphang.github.io/blob/master/files/notebooks/Multi_task_Training_with_Transformers_NLP.ipynb#scrollTo=xW8bnTgCsx5c
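For the data side, the pattern in that notebook is to keep one dataloader per task and sample between them, tagging each batch with its task. A minimal sketch of that sampling loop (the `"task"` key matches the assumption in the model sketch above; in Lightning you would typically wrap this in an `IterableDataset` or a combined loader):

```python
import random
from typing import Dict, Iterator

from torch.utils.data import DataLoader


def multitask_batches(loaders: Dict[str, DataLoader]) -> Iterator[dict]:
    """Yield task-tagged batches by sampling a task uniformly each step.

    `loaders` maps a task name ("qa", "mc") to its DataLoader; iteration
    stops once every loader is exhausted. Proportional or temperature-based
    sampling are easy variations on this loop.
    """
    iters = {task: iter(dl) for task, dl in loaders.items()}
    while iters:
        task = random.choice(list(iters))
        try:
            batch = next(iters[task])
        except StopIteration:
            del iters[task]  # this task is done for the epoch
            continue
        batch["task"] = task  # assumes dict-style batches
        yield batch
```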

uakarsh commented 2 years ago

Hi @tmquan, from your question, I guess this is not directly available in lightning-transformers as of now, but from the Colab notebook you shared and the lightning-transformers code base, I think the following approach could help:

  1. You can inherit from the Seq2SeqTransformer class as mentioned here, and modify the arguments during initialization and the common step.
  2. Modifying this step with some pre-processing, model definition, metric calculation, and post-processing would make it easy to jointly train Question Answering and Multiple Choice (a multi-task NLP model); see the rough sketch after this list.
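I have not tested this, but a rough sketch of that suggestion might look as follows. The import path, the `common_step(prefix, batch)` signature, and the `"task"` key on each batch are assumptions based on the description above; check them against the lightning-transformers version you have installed, and note that batches must already include labels so the underlying HF model returns a loss.

```python
from typing import Any

import torch
from lightning_transformers.core.seq2seq.model import Seq2SeqTransformer


class JointTaskTransformer(Seq2SeqTransformer):
    """Variant whose shared step dispatches on the task carried in the batch."""

    def common_step(self, prefix: str, batch: Any) -> torch.Tensor:
        # Assumes the joint dataloader tags each batch with a "task" key
        # ("qa" or "multiple_choice") and that batches are pre-processed into
        # the format (including labels) the underlying HF model expects.
        task = batch.pop("task")
        outputs = self.model(**batch)  # shared backbone forward pass
        loss = outputs.loss
        # Log per-task losses so the balance between tasks stays visible.
        self.log(f"{prefix}_{task}_loss", loss)
        return loss
```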

I hope this helps.