monologg / R-BERT

PyTorch implementation of R-BERT: "Enriching Pre-trained Language Model with Entity Information for Relation Classification"
Apache License 2.0

The F1-score decreased after the last update #10

Open iimlearning opened 3 years ago

iimlearning commented 3 years ago

This is good work, thank you. I remember the F1-score was above 88 a few days ago, but the run I did last night came out below 88. Is that because the entity fully-connected layers share the same weights after the update?

Could you also share how to tune hyperparameters such as the learning rate to get better results? I want to use R-BERT on my own dataset, but the results are not very good. Thanks.
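For context, the question is whether the two entity-span representations pass through one shared fully-connected layer or two separate ones. A minimal sketch of the difference, assuming a hypothetical `FCLayer` (dropout + tanh + linear); the repository's exact layer may differ:

```python
import torch
import torch.nn as nn

class FCLayer(nn.Module):
    """Dropout + tanh + linear, a common shape for R-BERT-style entity heads
    (hypothetical; the repository's exact layer may differ)."""
    def __init__(self, in_dim: int, out_dim: int, dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(torch.tanh(self.dropout(x)))

hidden = 768
e1 = torch.randn(4, hidden)  # averaged hidden states of entity-1 span
e2 = torch.randn(4, hidden)  # averaged hidden states of entity-2 span

# Variant A: one FC layer applied to both entities (weights tied)
shared_fc = FCLayer(hidden, hidden)
h1_a, h2_a = shared_fc(e1), shared_fc(e2)

# Variant B: a separate FC layer per entity (independent weights)
fc_e1, fc_e2 = FCLayer(hidden, hidden), FCLayer(hidden, hidden)
h1_b, h2_b = fc_e1(e1), fc_e2(e2)

print(h1_a.shape, h1_b.shape)
```

Tying the weights halves the parameter count of the entity heads but can change the score slightly, which is one plausible explanation for a small F1 shift after such an update.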

wenyu332 commented 3 years ago


Hello, have you been able to reproduce 89.25? The highest I have managed so far is 88.39. What is the highest you can reach?
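The numbers being compared in this thread (e.g. 88.39 vs. the paper's 89.25) are macro-averaged F1 on SemEval-2010 Task 8, which by convention excludes the "Other" class. A simplified pure-Python sketch of that metric (the official scorer additionally handles relation directionality; label names below are illustrative):

```python
def macro_f1(y_true, y_pred, exclude=("Other",)):
    """Macro-averaged F1 over relation classes, excluding 'Other'
    (a simplified version of the SemEval-2010 Task 8 convention)."""
    labels = sorted((set(y_true) | set(y_pred)) - set(exclude))
    f1s = []
    for lbl in labels:
        tp = sum(t == lbl and p == lbl for t, p in zip(y_true, y_pred))
        fp = sum(t != lbl and p == lbl for t, p in zip(y_true, y_pred))
        fn = sum(t == lbl and p != lbl for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s) if f1s else 0.0

y_true = ["Cause-Effect", "Other", "Component-Whole", "Cause-Effect"]
y_pred = ["Cause-Effect", "Cause-Effect", "Component-Whole", "Other"]
print(macro_f1(y_true, y_pred))  # -> 0.75
```

Because "Other" is excluded from the average, two runs with identical accuracy can still report different macro-F1, which makes small gaps like 88.39 vs. 89.25 sensitive to per-class shifts.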

iimlearning commented 3 years ago


Hello, I haven't tuned the hyperparameters for this. We can discuss it; add me on QQ: 276431860.

duan348733684 commented 3 years ago


May I ask, did you train it on Chinese data?

iimlearning commented 3 years ago


No, I just ran this code as-is, using the SemEval-2010 dataset of course.

duan348733684 commented 3 years ago


Heh, is there a similar Chinese dataset anywhere? Thanks.

iimlearning commented 3 years ago


I previously collected some material on traditional Chinese medicine datasets, though I haven't looked through it closely; you can take a look:

http://cips-chip.org.cn/2020/eval2
https://github.com/GanjinZero/awesome_Chinese_medical_NLP
https://kns.cnki.net/kcms/detail/detail.aspx?dbcode=CMFD&dbname=CMFD202001&filename=1019668145.nh&v=PoGN%25mmd2FoPjZYSA%25mmd2Bk7VryERaP9ChrEabWA0giTnbdc%25mmd2FfyZZ5hDJj1Jw9gwEErKppD5Y
https://github.com/xiaopangxia/TCM-Ancient-Books
https://github.com/yao8839836/PTM
https://github.com/yao8839836/CEMRClass

songhanyu commented 3 years ago


Hello, I ran this code and only got 87.95. How did you tune it up to 88.39? Would you mind sharing your method? Thanks!

wenyu332 commented 3 years ago


Just try different random seeds. I got that score by setting the seed to 2333; you can give it a try.
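Fixing the seed across every RNG source is what makes a single-seed result like 88.39 reproducible at all. A typical helper for PyTorch training scripts (a sketch of the usual pattern; the exact function in this repo may differ):

```python
import random

import numpy as np
import torch

def set_seed(seed: int = 2333) -> None:
    """Seed every RNG source commonly used during training."""
    random.seed(seed)           # Python's built-in RNG
    np.random.seed(seed)        # NumPy (data shuffling, sampling)
    torch.manual_seed(seed)     # PyTorch CPU ops
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)  # all GPU devices

set_seed(2333)
first_draw = torch.rand(3)  # deterministic given the seed
```

Note that even with all seeds fixed, some CUDA kernels are non-deterministic, so run-to-run variance of a few tenths of a point in F1 is still normal.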

songhanyu commented 3 years ago

Great, thank you very much!!


DongPoLI commented 3 years ago

Mul-BERT reaches 90.72 (Macro-F1) on the SemEval 2010 Task 8 dataset. The method is very simple; I recommend it: https://github.com/DongPoLI/Mul-BERT