joongbo / tta

Repository for the paper "Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning"
Apache License 2.0

I have done some experiments on Chinese using bert-base config, the results are not promising #6

Open yyht opened 3 years ago

yyht commented 3 years ago

Hi, I have done pretraining on a Chinese dataset (50 GB) and run downstream fine-tuning on the Chinese CLUE benchmark. The hyperparameters are the same as the BERT-base defaults: learning_rate 3e-5, 3 or 5 epochs. The fine-tuning results on the benchmark are worse than the official Chinese BERT-base released by Google.
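For reference, the fine-tuning settings described above can be sketched as a small config grid. This is a minimal illustration only: the dict keys and the `finetune_configs` helper are hypothetical names, not part of the TTA repository, and the batch size and sequence length are assumed common BERT-base defaults rather than values reported in this thread.

```python
# Hypothetical fine-tuning hyperparameter grid matching the settings
# mentioned above. Only learning_rate and the epoch counts come from
# the report; the other values are assumed BERT-base-style defaults.
BASE_FINETUNE = {
    "learning_rate": 3e-5,   # from the report
    "train_batch_size": 32,  # assumption
    "max_seq_length": 128,   # assumption
}

def finetune_configs(epochs=(3, 5)):
    """Yield one config per epoch setting (3 or 5, per the report)."""
    for n in epochs:
        yield dict(BASE_FINETUNE, num_train_epochs=n)
```

Sweeping both epoch settings (and ideally a few learning rates) per CLUE task is the usual way to rule out hyperparameter sensitivity before concluding the pretrained model itself is weaker.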

zhu143xin commented 3 years ago

Hi, I want to use TTA for some work on Chinese spelling error correction. Have you done any experiments on that?

yyht commented 3 years ago

No. I could provide you with my pretrained traditional-Chinese base model for spelling correction, or we could work on it together.
