Open chenkejin opened 4 years ago
Hi, the pre-trained Chinese ELMo model is available at https://share.weiyun.com/5HxvL3Q
, and it was trained with --encoder bilstm and --target bilm.
But we haven't finished training this model yet.
In addition, the above model is based on Chinese characters, so we did not add the sub-encoder.
We recommend using the following two LSTM pre-trained models:
Thank you very much for your work! I need a pre-trained Chinese ELMo model for a downstream task. Could you provide one? Thank you very much.
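For readers wondering how to feed such character-level contextual embeddings into a downstream task, here is a minimal sketch. It assumes the embedder returns one vector per character for each sentence (as character-based ELMo-style models typically do); the function name and the toy data are illustrative, not part of the released model's API. It mean-pools the per-character vectors into a fixed-size sentence feature that a downstream classifier can consume:

```python
import numpy as np

def mean_pool(char_vectors):
    """Average per-character contextual vectors of shape (n_chars, dim)
    into a single fixed-size sentence vector of shape (dim,)."""
    arr = np.asarray(char_vectors, dtype=np.float32)
    return arr.mean(axis=0)

# Toy stand-in for embedder output: 3 characters, 4-dimensional vectors.
fake_embedding = np.array([[1.0, 2.0, 3.0, 4.0],
                           [3.0, 2.0, 1.0, 0.0],
                           [2.0, 2.0, 2.0, 2.0]])

sentence_vec = mean_pool(fake_embedding)
print(sentence_vec)  # [2. 2. 2. 2.]
```

Mean pooling is only one option; max pooling or feeding the full per-character sequence into a task-specific LSTM are common alternatives when token order matters.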