Open nashid opened 2 years ago
@mommi84 any idea?
That would be a very useful feature.
This would be very useful, since contextual embeddings have become the norm nowadays. Wondering if anyone has implemented this, or managed to run this model with any contextualized embeddings?
Is it possible to use BERT word embeddings along with this NMT implementation?
The goal is to use a pre-trained BERT language model so that its contextualized embeddings can be leveraged.
I am wondering whether anyone has implemented this, or has been able to run this model with any other contextualized embeddings such as ELMo or BERT.
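In case it helps anyone exploring this: a minimal sketch (not part of this repo) of how one might extract contextual token embeddings from a pre-trained BERT via the Hugging Face `transformers` library, which could then be fed into an NMT encoder in place of (or concatenated with) its learned embedding lookup. The model name and the wiring into the NMT model are assumptions, not something this implementation supports out of the box:

```python
# Sketch: extract BERT contextual embeddings for an input sentence.
# Assumes the Hugging Face `transformers` and `torch` packages are installed;
# "bert-base-uncased" is just an example checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Contextual embeddings depend on the surrounding words."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per subword token: shape (batch, seq_len, hidden_size).
# These vectors (hidden_size = 768 for bert-base) would replace the NMT
# encoder's static embedding table; note BERT uses WordPiece tokenization,
# so the NMT side must either adopt the same subword vocabulary or pool
# subword vectors back into word-level embeddings.
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

One practical caveat: freezing BERT and treating the embeddings as fixed features is the simplest integration; fine-tuning BERT jointly with the NMT objective is also possible but changes memory and training-time requirements substantially.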