kentonl / e2e-coref

End-to-end Neural Coreference Resolution
Apache License 2.0

Is this model suitable for Chinese? #81

Open · Waste-Wood opened this issue 4 years ago

Waste-Wood commented 4 years ago

I have seen some files like "char_vocab_chinese.txt", so is there a Chinese pretrained model?

henryhust commented 4 years ago

That's OK; you may need to train a model of your own.

Waste-Wood commented 4 years ago

> That's OK; you may need to train a model of your own.

Apart from the GloVe embeddings, which I should replace with my own Chinese embeddings, is there any other file I should replace with my own? Also, I have seen that Chinese training data is included in these files!
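
From poking around, the pieces that seem to need Chinese counterparts are the character vocabulary, the word embeddings, and the train/dev jsonlines files, all of which are wired together in experiments.conf. A hypothetical Chinese entry, just to make the question concrete (key names copied from the English `best` configuration; every file name below is a placeholder, and details may differ across versions of the code):

```
# Hypothetical Chinese word-embedding entry; point it at your own vectors,
# filtered down to the vocabulary of your jsonlines data.
zh_vectors_300d_filtered {
  path = zh.vectors.300d.txt.filtered
  size = 300
}

# Hypothetical Chinese experiment inheriting everything else from "best".
chinese_best = ${best} {
  char_vocab_path = "char_vocab_chinese.txt"       # the file mentioned above, or one built from your data
  context_embeddings = ${zh_vectors_300d_filtered}
  head_embeddings = ${zh_vectors_300d_filtered}
  train_path = train.chinese.jsonlines
  eval_path = dev.chinese.jsonlines
  conll_eval_path = dev.chinese.v4_gold_conll
  genres = ["bc", "bn", "mz", "nw", "tc", "wb"]    # match the genre prefixes in your own data
  # The released ELMo cache (lm_path) is English-only, so the contextual
  # embeddings would also need a Chinese replacement or have to be disabled.
}
```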

mjj1094 commented 4 years ago

I hope that someone has trained this model on Chinese and would like to share their GitHub link here.

viviqi commented 4 years ago

I am trying to train the model on the Chinese dataset from OntoNotes 5.0, but it seems that the model cannot converge. If anyone has done the same work, advice would be welcome.

mjj1094 commented 4 years ago

Thanks for your reply! I referenced https://github.com/mandarjoshi90/coref for the conversion of the Chinese data; maybe you can try it.
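
Whichever script does the conversion, the end result should be one JSON object per document in the jsonlines format the training code reads (the BERT-based repo adds a few extra fields on top of this, if I remember correctly). A minimal sketch with invented Chinese content, just to show the shape:

```python
# Minimal sketch of the jsonlines document format (one JSON object per line)
# read by the coref training code; the example content is invented.
import json

doc = {
    # genre prefix ("bc", "nw", ...) + document id, as in the CoNLL doc key
    "doc_key": "nw/xinhua/00/chtb_0001_0",
    # one list of tokens per sentence
    "sentences": [["张三", "来", "了", "。"], ["他", "很", "高兴", "。"]],
    # one speaker string per token, aligned with "sentences"
    "speakers": [["spk1"] * 4, ["spk1"] * 4],
    # coreference clusters as inclusive [start, end] indices into the
    # flattened token sequence: "张三" (token 0) corefers with "他" (token 4)
    "clusters": [[[0, 0], [4, 4]]],
}

with open("train.chinese.jsonlines", "w", encoding="utf-8") as f:
    f.write(json.dumps(doc, ensure_ascii=False) + "\n")
```

Note that the cluster spans are inclusive token offsets into the document with all sentences concatenated, which is an easy thing to get wrong during conversion.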


viviqi commented 4 years ago

Thanks for your advice. Yes, I am trying the bert-coref model now; the performance is not good yet, but I am still fine-tuning it. If you are training it as well, maybe we can share our results.

mjj1094 commented 4 years ago

I am training (not yet finished) the Chinese coref-BERT model from https://github.com/mandarjoshi90/coref, which is an extension of the e2e-coref model. If you want to communicate further, you can contact me by e-mail: mjjblcu@126.com.
