🚀 Feature: Multi task NLP model

Open · tmquan opened 2 years ago
In an IELTS exam paper there are several types of questions, such as Question Answering and Multiple Choice. The current implementation of lightning-transformers works well for a single task, but I wonder whether there is a way to jointly train two tasks at the same time. Because the context is shared between the two tasks, sharing the encoder would be beneficial.

Alternatives

I found a reference that does this directly with Hugging Face Transformers, but I don't know how to structure it to work with lightning-transformers: https://colab.research.google.com/github/zphang/zphang.github.io/blob/master/files/notebooks/Multi_task_Training_with_Transformers_NLP.ipynb#scrollTo=xW8bnTgCsx5c
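For concreteness, here is a minimal sketch of the shared-encoder idea in plain PyTorch Lightning and Hugging Face Transformers, independent of lightning-transformers. The two head modules and the per-batch `task` tag are hypothetical stand-ins for the real data plumbing:

```python
# A minimal sketch of one encoder shared by two task heads. The heads and
# the "task" tag in each batch are illustrative assumptions, not library API.
import pytorch_lightning as pl
import torch
import torch.nn.functional as F
from torch import nn
from transformers import AutoModel


class MultiTaskTransformer(pl.LightningModule):
    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared by both tasks
        hidden = self.encoder.config.hidden_size
        self.qa_head = nn.Linear(hidden, 2)  # start/end logits for span QA
        self.mc_head = nn.Linear(hidden, 1)  # one score per answer choice

    def training_step(self, batch, batch_idx):
        task = batch.pop("task")  # assumed: each batch is tagged with its task
        hidden_states = self.encoder(
            input_ids=batch["input_ids"], attention_mask=batch["attention_mask"]
        ).last_hidden_state
        if task == "qa":
            start_logits, end_logits = self.qa_head(hidden_states).unbind(dim=-1)
            loss = (
                F.cross_entropy(start_logits, batch["start_positions"])
                + F.cross_entropy(end_logits, batch["end_positions"])
            ) / 2
        else:  # multiple choice: inputs flattened to (batch * num_choices, seq_len)
            pooled = hidden_states[:, 0]  # [CLS] representation
            logits = self.mc_head(pooled).view(batch["labels"].size(0), -1)
            loss = F.cross_entropy(logits, batch["labels"])
        self.log(f"train_loss_{task}", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=3e-5)
```

Feeding the model could then be done by interleaving batches from one dataloader per task, which is roughly what the colab notebook above does.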
Hi @tmquan, from the question, I guess this is not directly available as of now in lightning-transformers, but from the colab notebook that you shared and the lightning-transformers code base, I think the following approach could help: subclass Seq2SeqTransformer as mentioned here, and modify the arguments during initialization and the common step. I hope this helps.
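For illustration, a rough sketch of what that subclass might look like. The import path, the `common_step` signature, and the assumption that `self.model` is the underlying Hugging Face model are guesses based on the lightning-transformers code base at the time, so check them against the installed version:

```python
# Rough sketch of the suggested subclass. Import path, common_step signature,
# and self.model being the HF backbone are assumptions, not confirmed API.
from typing import Any, Dict, Optional

import torch
from lightning_transformers.core.seq2seq.model import Seq2SeqTransformer  # assumed path


class MultiTaskSeq2SeqTransformer(Seq2SeqTransformer):
    def __init__(self, *args: Any, task_loss_weights: Optional[Dict[str, float]] = None, **kwargs: Any):
        # Modify the arguments during initialization, e.g. to weight the tasks.
        super().__init__(*args, **kwargs)
        self.task_loss_weights = task_loss_weights or {"qa": 1.0, "mc": 1.0}

    def common_step(self, prefix: str, batch: Any) -> torch.Tensor:
        # Assumes each batch is tagged with the task it came from, so the
        # shared model is updated with a per-task weighted loss.
        task = batch.pop("task")
        outputs = self.model(**batch)  # assumed: self.model is the HF model
        loss = outputs.loss * self.task_loss_weights[task]
        self.log(f"{prefix}_{task}_loss", loss)
        return loss
```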