Interesting paper showing a simple technique for zero/few-shot cross-lingual transfer. The proposed technique relies on fine-tuning on small amounts of data from other (auxiliary) languages before evaluating on the target language.
Pros: interesting related literature and experimental setup. Cons: the method they follow is quite straightforward, IMO its connections with meta-learning are not very well established, and the gains over the multilingual BERT baseline are quite minor.
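Since the discussion keeps coming back to what "fine-tune on small auxiliary-language sets, then evaluate on the target language" actually amounts to, here is a rough sketch of one way to set it up. This is a first-order Reptile-style loop, not the paper's exact MAML-based procedure, and `model` / `aux_language_loaders` are hypothetical placeholders:

```python
import copy
import itertools

import torch
from torch import nn


def meta_finetune(model, aux_language_loaders, inner_lr=2e-5, meta_lr=0.1, inner_steps=3):
    """Reptile-style sketch: adapt to each small auxiliary-language set, then
    nudge the shared weights towards the adapted weights."""
    loss_fn = nn.CrossEntropyLoss()
    meta_state = copy.deepcopy(model.state_dict())

    for loader in aux_language_loaders:        # one small labelled set per auxiliary language
        model.load_state_dict(meta_state)
        opt = torch.optim.SGD(model.parameters(), lr=inner_lr)

        # a few inner fine-tuning steps on this auxiliary language
        for inputs, labels in itertools.islice(loader, inner_steps):
            opt.zero_grad()
            loss_fn(model(inputs), labels).backward()
            opt.step()

        # meta-update: move the shared weights a fraction of the way
        # towards the language-adapted weights
        adapted = model.state_dict()
        for k, v in meta_state.items():
            if v.is_floating_point():
                meta_state[k] = v + meta_lr * (adapted[k] - v)

    model.load_state_dict(meta_state)
    return model  # then evaluate zero-shot (or few-shot) on the target language
```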
Some refs we mentioned today:
Cross-lingual word embeddings for zero-shot transfer: https://www.samtalksml.net/aligning-vector-representations/
Unsupervised MT: https://github.com/facebookresearch/UnsupervisedMT
ACL 2019 tutorial on the same topic: https://ruder.io/unsupervised-cross-lingual-learning/
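For the embedding-alignment idea behind the first link, a minimal orthogonal Procrustes sketch (the function name and the seed-dictionary setup are my own, not taken from the refs): given source/target word vectors paired by a small bilingual dictionary, the best orthogonal map comes from an SVD.

```python
import numpy as np


def procrustes_align(X, Y):
    """Find the orthogonal matrix W minimising ||X @ W - Y||_F.

    X, Y: (n_pairs, dim) row-aligned embeddings for a seed bilingual dictionary
    (row i of X and row i of Y are translations of each other).
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt


# usage sketch: map the full source vocabulary into the target space,
# then retrieve translations by nearest-neighbour search there
# W = procrustes_align(X_dict, Y_dict)
# X_mapped = X_all @ W
```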
Changing the random seed affects results quite a lot (in the context of BERT fine-tuning): https://arxiv.org/abs/2002.06305
The technique mentioned is shallow, but it looks nice and effective in some cases, although there are a lot of questions about whether this really is meta-learning or not.
Join us in our discussion. The paper's abstract is:
We'll be meeting on Google hangouts: https://hangouts.google.com/group/kUxBAunjGittAkBUA
URL: https://arxiv.org/abs/2003.02739