alexanderkoller opened 5 years ago
We are not allowed to use it for the shared task though: http://svn.nlpl.eu/mrp/2019/public/resources.txt After the deadline I'd be happy to integrate it. By then, AllenNLP will probably already contain a module, I hope.
Ah yes. Damn.
Let's do it anyway, right after the shared task deadline, and report the numbers in our system description.
XLNet, which just came out, beats BERT by an incredible margin on a number of nontrivial tasks: https://arxiv.org/abs/1906.08237
Once our basic infrastructure for the shared task is up and running, we should integrate XLNet into the am-parser and see what this buys us.
In an ideal world, AllenNLP will already include a module for pretrained XLNet embeddings. Otherwise, it doesn't seem very hard to write our own: the BERT class in AllenNLP is basically a thin wrapper around the PyTorch BERT library from huggingface, and they have already implemented XLNet. So it should just be a matter of replacing the core bits of the AllenNLP class with calls to the huggingface class.
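To make the wrapper idea concrete, here is a minimal sketch of what such a thin embedder class could look like, assuming huggingface's `transformers` package and its `XLNetModel`/`XLNetConfig` classes. The `XLNetEmbedder` name and the `get_output_dim` method (mirroring AllenNLP's token-embedder convention) are our own assumptions, not existing API; a tiny randomly initialised config stands in for the pretrained weights so the sketch runs self-contained.

```python
import torch
from transformers import XLNetConfig, XLNetModel  # huggingface's XLNet implementation


class XLNetEmbedder(torch.nn.Module):
    """Hypothetical thin wrapper exposing XLNet's contextual embeddings,
    analogous in spirit to AllenNLP's BERT token embedder (sketch only)."""

    def __init__(self, xlnet_model: XLNetModel):
        super().__init__()
        self.xlnet = xlnet_model

    def get_output_dim(self) -> int:
        # hidden size of the contextual embeddings
        return self.xlnet.config.d_model

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # last hidden states, shape (batch_size, seq_len, d_model)
        return self.xlnet(input_ids)[0]


# In real use one would load pretrained weights with
# XLNetModel.from_pretrained("xlnet-base-cased"); a tiny random
# config keeps this illustration download-free.
config = XLNetConfig(vocab_size=100, d_model=32, n_layer=2, n_head=4, d_inner=64)
embedder = XLNetEmbedder(XLNetModel(config))

ids = torch.randint(0, 100, (1, 6))  # one sentence of six word-piece ids
embeddings = embedder(ids)           # shape: (1, 6, 32)
```

The actual integration would additionally need XLNet's SentencePiece tokenizer (`XLNetTokenizer`) and the word-piece-to-token alignment that AllenNLP's BERT indexer handles, which this sketch leaves out.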