jind11 / MMM-MCQA

Source code for our "MMM" paper at AAAI 2020

Could the pre-training code for NLI be shared? #4

Open MingjieWang0606 opened 3 years ago

MingjieWang0606 commented 3 years ago

I want to train more models like ALBERT and will share them later~

jind11 commented 3 years ago

Wow, super thanks! Look forward to them!

MingjieWang0606 commented 3 years ago

> Wow, super thanks! Look forward to them!

Hello~ Could you share the NLI code? I may have expressed myself poorly earlier~ I will retrain ALBERT and other models and then merge them into your git~ Or, if you already have the code, you can also send it to me directly~ I'm very glad to see such excellent work~ I can provide machine support~

MingjieWang0606 commented 3 years ago

Here are my WeChat and email~ Really looking forward to your reply! wmj745000 xiaowangiii@yahoo.com

jind11 commented 3 years ago

I see, I need to find the code for NLI and will post it within the next two days. Thank you for your patience!

MingjieWang0606 commented 3 years ago

Thank you for your quick reply! I hope the code has not been eaten by mice~

> I see, I need to find the code for NLI and will post it within the next two days. Thank you for your patience!

jind11 commented 3 years ago

Hi, I am so sorry for the delay; I have been busy with a conference deadline. I just spent half an hour digging through my hard drive for the NLI fine-tuning code, but I could not find it, which is strange. I do recall that I used the huggingface code for training the NLI model, which you can refer to here: https://github.com/abidlabs/pytorch-transformers. I am sorry that I cannot offer the exact code, but I believe it is easy to adapt the code in that link to fine-tune an ALBERT model on NLI. Of course, you can also use the latest Huggingface transformers source code. Let me know if you have more questions.
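For reference, the examples in repositories of that lineage (and in later Huggingface transformers releases) ship a `run_glue.py` script whose supported tasks include MNLI, so the suggested adaptation could be sketched roughly as below. This is a hedged sketch, not the authors' actual command: the exact script path, flag names, and availability of the `albert` model type depend on the library version you install.

```shell
# Sketch: fine-tune ALBERT on the MNLI (NLI) task using the Huggingface
# GLUE example script. Flag names follow the (pytorch-)transformers
# examples of that era; check `python run_glue.py --help` for your version.
python run_glue.py \
  --model_type albert \
  --model_name_or_path albert-base-v2 \
  --task_name MNLI \
  --do_train \
  --do_eval \
  --data_dir ./glue_data/MNLI \
  --max_seq_length 128 \
  --per_gpu_train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3.0 \
  --output_dir ./albert-mnli
```

The MNLI data directory is assumed to have been fetched beforehand (e.g. with the GLUE download script that accompanies the examples); the resulting checkpoint in `--output_dir` would then serve as the NLI-pre-trained starting point for MCQA fine-tuning.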