yangzhch6 / InterMWP

dataset & code of "LogicSolver: Towards Interpretable Math Word Problem Solving with Logical Prompt-enhanced Learning" in Findings of EMNLP 2022

chinese-bert-wwm #2

Closed Cherrypower closed 10 months ago

Cherrypower commented 11 months ago

Hello, what does `parser.add_argument('--bert_path', type=str, default="/data1/yangzhicheng/Data/models/chinese-bert-wwm")` refer to?

yangzhch6 commented 11 months ago

This is a Chinese version of BERT trained with the Whole Word Masking method; its GitHub repo is here: chinese-bert-wwm

If there are any other questions, feel free to contact me.

Besides, we welcome your attention to our new work, Speak Like a Native: Prompting Large Language Models in a Native Style, which proposes an easy-to-use in-context learning method to improve LLMs' reasoning ability, especially mathematical reasoning.