nlp-wh opened 5 years ago
Hi. That is correct. The ELMo embeddings are not updated; this is also how it is implemented in AllenNLP.
If you have a specific domain, you could update the language model for that domain. In the ELMo paper they ran some experiments on fine-tuning the language model on domain data.
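A minimal PyTorch sketch of what "not updated" means in practice: the pre-trained embedder's parameters are frozen so the optimizer never touches them. A plain `nn.Embedding` stands in for the ELMo module here (loading the real AllenNLP `Elmo` class needs its options/weight files, so this is an assumption-labeled stand-in, not this repo's actual code).

```python
import torch
import torch.nn as nn

# Stand-in for the pre-trained ELMo module (hypothetical substitute:
# in practice you would load AllenNLP's Elmo with its weight files).
embedder = nn.Embedding(num_embeddings=100, embedding_dim=16)

# Freeze the embedder so its weights are never updated during training,
# mirroring how the fixed ELMo embeddings are used.
for p in embedder.parameters():
    p.requires_grad = False

# Only trainable parameters (e.g. a task-specific head) go to the optimizer.
head = nn.Linear(16, 2)
optimizer = torch.optim.Adam(
    [p for p in head.parameters() if p.requires_grad], lr=1e-3
)

frozen = all(not p.requires_grad for p in embedder.parameters())
trainable = all(p.requires_grad for p in head.parameters())
print(frozen, trainable)
```

Fine-tuning the language model on domain text, as discussed above, would instead leave (some of) those parameters trainable.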
Thank you very much. I am reading the ELMo paper and considering fine-tuning.
Hello!
As I understand it, the code here directly calls the officially provided pre-trained ELMo model without any parameter updates. Since it was trained on unlabeled general-domain corpora, is it suitable for a specific domain such as biomedicine?
Best Regards