This is a Chinese BERT model trained with the Whole Word Masking (WWM) method; its GitHub repo is here: chinese-bert-wwm. The `--bert_path` argument simply points to a directory containing a copy of that pretrained model.
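For illustration, here is a minimal sketch of how such a `--bert_path` argument is typically consumed, assuming the Hugging Face `transformers` library; `hfl/chinese-bert-wwm` is the model's Hugging Face Hub ID, used here in place of a local path:

```python
import argparse
from transformers import BertModel, BertTokenizer

parser = argparse.ArgumentParser()
# Either a local directory holding the pretrained Chinese BERT-wwm files,
# or the Hub ID "hfl/chinese-bert-wwm" to download them automatically.
parser.add_argument('--bert_path', type=str, default="hfl/chinese-bert-wwm")
args = parser.parse_args()

# Load the tokenizer and encoder weights from the given path or Hub ID.
tokenizer = BertTokenizer.from_pretrained(args.bert_path)
model = BertModel.from_pretrained(args.bert_path)
```

If you have already downloaded the checkpoint, set `--bert_path` to the directory where you saved it.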
If you have any other questions, feel free to contact me.
We also welcome your attention to our new work, Speak Like a Native: Prompting Large Language Models in a Native Style, which proposes an easy-to-use in-context learning method to improve LLMs' reasoning ability, especially mathematical reasoning.
Hello, what does `parser.add_argument('--bert_path', type=str, default="/data1/yangzhicheng/Data/models/chinese-bert-wwm")` refer to?