This paper presents a multitask question answering network (MQAN) that jointly learns ten different NLP tasks by casting every task as question answering.
The model encodes the input with dual coattention and multi-head self-attention, and its decoder is based on a pointer-generator network that can copy words from the context or the question, or generate words from an external vocabulary. No explicit supervision is needed for choosing among these mechanisms.
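The copy-or-generate idea can be illustrated as a mixture of three distributions: a vocabulary softmax plus attention-weighted copy scores over the context and question tokens. The sketch below is a simplified, hypothetical parameterization (the names `gammas`, `output_distribution` are mine, not the paper's) assuming the three mixture weights already sum to one.

```python
import numpy as np

def output_distribution(p_vocab, ctx_attn, q_attn,
                        context_ids, question_ids, gammas):
    """Mix generation and copying into one distribution over the vocabulary.

    p_vocab: softmax over the external vocabulary (1-D array).
    ctx_attn / q_attn: attention weights over context / question tokens.
    context_ids / question_ids: vocabulary id of each attended token.
    gammas: (g_vocab, g_ctx, g_q) mixture weights, assumed to sum to 1.
    Illustrative sketch, not the paper's exact formulation.
    """
    g_vocab, g_ctx, g_q = gammas
    p = g_vocab * np.asarray(p_vocab, dtype=float).copy()
    # Scatter attention mass onto the vocabulary ids it points at,
    # so a word present in the context/question can be "copied".
    for weight, tok in zip(ctx_attn, context_ids):
        p[tok] += g_ctx * weight
    for weight, tok in zip(q_attn, question_ids):
        p[tok] += g_q * weight
    return p

# Tiny example: vocab of 3 words, one context token, two question tokens.
p = output_distribution(
    p_vocab=[0.5, 0.3, 0.2],
    ctx_attn=[1.0], q_attn=[0.6, 0.4],
    context_ids=[0], question_ids=[1, 2],
    gammas=(0.5, 0.3, 0.2),
)
```

Because each component is a proper distribution and the weights sum to one, the mixture is itself a valid distribution over words.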
The model can perform zero-shot classification because an unseen task is posed as a question, and the unseen class labels can be copied directly from that question (a form of meta-learning).
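Concretely, if the question itself lists the candidate classes (e.g. "Is this review positive or negative?"), a pointer decoder can answer by copying whichever label receives the most attention mass. The helper below is hypothetical (its name and signature are mine), assuming we already have one copy weight per question token:

```python
def zero_shot_classify(question_tokens, copy_scores, class_labels):
    """Pick the class label with the highest total copy weight
    from the question. Illustrative sketch of copying an unseen
    class label out of the question, not the paper's actual code.
    """
    def label_score(label):
        # Sum attention mass over every occurrence of the label token.
        return sum(w for tok, w in zip(question_tokens, copy_scores)
                   if tok == label)
    return max(class_labels, key=label_score)

# Example: the decoder attends mostly to "negative" in the question.
question = ["is", "the", "review", "positive", "or", "negative", "?"]
scores   = [0.02, 0.02, 0.05, 0.21, 0.05, 0.60, 0.05]
print(zero_shot_classify(question, scores, ["positive", "negative"]))
# → negative
```

The key point is that the class inventory never needs to be baked into the model: it arrives at inference time inside the question.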