ymcui/Chinese-BERT-wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0 · 9.56k stars · 1.38k forks
Issues
#192 How can I obtain word embeddings? (c9412600, closed 3 years ago, 2 comments)
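For how-to questions like #192 (and #178/#155 below), here is a minimal sketch of pulling contextual token vectors from the released checkpoints via Hugging Face Transformers; the hub name hfl/chinese-bert-wwm-ext is the repo's published mirror, and attribute-style outputs assume transformers v4+:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm-ext")
model.eval()

inputs = tokenizer("使用语言模型来预测下一个词的概率。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state: (batch, seq_len, hidden). These are contextual token
# vectors, not static word embeddings; pool them (e.g. mean) for a
# sentence-level vector.
token_vectors = outputs.last_hidden_state[0]
print(token_vectors.shape)
```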
#191 tokenizer (hongjianyuan, closed 3 years ago, 0 comments)
#190 TF2 cannot load hfl/chinese-roberta-wwm-ext (kscp123, closed 3 years ago, 3 comments)
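A sketch of the usual TF2 workaround for #190 (also relevant to #156, #159, and #164 below): the hub checkpoints ship as PyTorch weights, so converting on load with from_pt=True works, assuming torch is also installed. Per the repo README, the RoBERTa-wwm checkpoints use BERT's architecture and vocab, so they load with the Bert* classes, not the Roberta* ones:

```python
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
# from_pt=True converts the published PyTorch weights to TF2 on the fly.
model = TFBertModel.from_pretrained("hfl/chinese-roberta-wwm-ext", from_pt=True)

inputs = tokenizer("中文测试句子。", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)
```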
#189 Why is the RoBERTa version slower to train? (sunyilgdx, closed 3 years ago, 3 comments)
#188 Why does Whole Word Masking (wwm) improve performance? (guotong1988, closed 3 years ago, 1 comment)
#187 I suspect Whole Word Masking (wwm) is really just a more efficient approach; could character-level masking reach the same final result with more pre-training time and randomness? (guotong1988, closed 3 years ago, 3 comments)
#186 About the results of training on THUCNews with bert_base (vencentDebug, closed 3 years ago, 4 comments)
#185 How can I download the models from the command line? (chaohuang, closed 3 years ago, 0 comments)
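On #185, one programmatic route is sketched below; using huggingface_hub.snapshot_download here is my own assumption (the library postdates some of these issues), and from_pretrained itself also downloads and caches on first use:

```python
from huggingface_hub import snapshot_download

# Fetches every file in the model repo into the local cache and returns
# the local directory path.
local_dir = snapshot_download(repo_id="hfl/chinese-bert-wwm-ext")
print(local_dir)
```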
#184 Some questions about fill-mask (yooopan, closed 3 years ago, 3 comments)
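A minimal fill-mask sketch for #184, using the transformers pipeline API; BERT-style models expect the [MASK] placeholder:

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/chinese-bert-wwm-ext")

# Each candidate dict carries the filled-in token and its probability.
for candidate in fill_mask("今天天气很[MASK]。"):
    print(candidate["token_str"], round(candidate["score"], 4))
```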
#183 Is the model code not provided? (MKaxie, closed 3 years ago, 0 comments)
#182 How were the three layers of RBTL3 and RBT3 selected? (MingFL, closed 3 years ago, 6 comments)
#181 RoBERTa pre-training data (Daemon-ser, closed 3 years ago, 4 comments)
#180 Could a poor word segmenter have a negative impact on Chinese-BERT-wwm? (guotong1988, closed 3 years ago, 2 comments)
#179 How much gain could a better word segmenter bring to Chinese-BERT-wwm? (guotong1988, closed 3 years ago, 2 comments)
#178 Word embeddings (Goku17, closed 3 years ago, 4 comments)
#177 Was rbt4 not further pre-trained with MLM? (SysuCharon, closed 3 years ago, 2 comments)
#176 The exact English and Chinese pre-training data identical to the BERT paper's pre-training data (guotong1988, closed 3 years ago, 3 comments)
#175 Roughly how many GB is 1B tokens? (ChaooMa, closed 3 years ago, 2 comments)
#174 Could you provide a download link for the EXT data? (bbbxixixixi, closed 3 years ago, 2 comments)
#173 Is named entity recognition supported? (yooopan, closed 3 years ago, 1 comment)
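On #173: the released checkpoints are plain encoders, so NER is a downstream fine-tuning task. A sketch with a token-classification head, where the label count is an illustrative assumption:

```python
from transformers import BertForTokenClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("hfl/chinese-bert-wwm-ext")
# num_labels=7 is hypothetical, e.g. a BIO scheme with three entity types
# plus "O". The head is randomly initialized and must be fine-tuned on
# labeled NER data before it is useful.
model = BertForTokenClassification.from_pretrained(
    "hfl/chinese-bert-wwm-ext", num_labels=7
)
```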
#172 Same configuration and code, but different results across repeated runs? (sirlb, closed 3 years ago, 9 comments)
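On #172, run-to-run variance typically comes from unseeded RNGs, randomly initialized task heads, and nondeterministic GPU kernels; a common PyTorch-side mitigation sketch:

```python
import random

import numpy as np
import torch

def set_seed(seed: int = 42) -> None:
    """Seed every RNG the training loop touches."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # cuDNN benchmarking picks different kernels per run; disabling it
    # trades speed for reproducibility (results may still not be bit-exact).
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(42)
```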
#171 The paper says create_pretraining_data.py is in the git repo, but I don't see this data-processing script in the directory; which repo is it in? (PearlW1, closed 3 years ago, 4 comments)
#170 For a wwm-trained model, how do I use BERT's tokenizer? (nathinal, closed 3 years ago, 9 comments)
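On #170 (and #149 below): whole word masking only changes which tokens are masked during pre-training; the vocab and WordPiece tokenization are unchanged, so the standard BertTokenizer applies and no external word segmenter is needed at inference time:

```python
from transformers import BertTokenizer

# Identical tokenization to vanilla Chinese BERT: single characters for
# Chinese, WordPiece for non-Chinese tokens.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
print(tokenizer.tokenize("使用语言模型来预测下一个词的概率。"))
```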
#169 Could the rbt4 download on iFLYTEK Cloud be made available? (zhusleep, closed 3 years ago, 2 comments)
#168 Which pre-trained language models are clearly better than BERT and RoBERTa? (guotong1988, closed 3 years ago, 2 comments)
#167 Loading the model with transformers fails (CoinCheung, closed 3 years ago, 2 comments)
#166 Were emoji taken into account during training? (lzy318, closed 3 years ago, 2 comments)
#165 About the TensorFlow version (KuLee-1, closed 3 years ago, 2 comments)
#164 Is a TF version available for download? (callzhang, closed 3 years ago, 2 comments)
#163 Is there a cased chinese-bert-wwm model that distinguishes upper- and lower-case English? (sunxiaojie99, closed 3 years ago, 5 comments)
#162 BART (CharlieLzy, closed 3 years ago, 2 comments)
#161 Error no file named ['pytorch_model.bin', 'tf_model.h5'] found in directory chinese-bert-wwm (wangbf, closed 3 years ago, 3 comments)
#160 Suggestion: add an example demo for quick onboarding (zeng8280, closed 3 years ago, 2 comments)
#159 The models are integrated with transformers; is TensorFlow supported too? (shulongdeng22, closed 3 years ago, 3 comments)
#158 Computing embeddings for sequences longer than 512 tokens (FaxinZ, closed 3 years ago, 3 comments)
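For #158: BERT's absolute position embeddings cap inputs at 512 tokens, so longer texts are usually split into (optionally overlapping) windows whose vectors are pooled. A sketch under that assumption; the window and stride values are illustrative:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm-ext")
model.eval()

def embed_long_text(text: str, window: int = 510, stride: int = 255) -> torch.Tensor:
    """Mean-pool overlapping <=512-token windows into one vector."""
    ids = tokenizer.encode(text, add_special_tokens=False)
    vectors = []
    with torch.no_grad():
        for start in range(0, max(len(ids), 1), stride):
            # window=510 leaves room for [CLS] and [SEP].
            chunk = ids[start:start + window]
            input_ids = torch.tensor(
                [[tokenizer.cls_token_id] + chunk + [tokenizer.sep_token_id]]
            )
            out = model(input_ids)
            vectors.append(out.last_hidden_state[0].mean(dim=0))
    return torch.stack(vectors).mean(dim=0)
```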
#157 Sentence word segmentation (rucieryi369, closed 3 years ago, 2 comments)
#156 How to load the model in TensorFlow 2 (archersama, closed 3 years ago, 3 comments)
#155 Can fixed embeddings be extracted from the wwm models? (1535966643, closed 3 years ago, 6 comments)
#154 Was the PyTorch model pre-trained with PyTorch code, or pre-trained in TF and then converted to PyTorch? (daniellibin, closed 3 years ago, 3 comments)
#153 Questions about implementing WWM in transformers to fine-tune BERT (wlhgtc, closed 3 years ago, 7 comments)
#152 The pooler_fc_size parameter in chinese-roberta-wwm-ext-large's config.json is incorrect (JiaqiYao, closed 3 years ago, 2 comments)
#151 RoBERTa needs vocab.json in Hugging Face Transformers (MrRace, closed 3 years ago, 1 comment)
#150 Which one is MacBERT? (21157651, closed 3 years ago, 1 comment)
#149 Do the pre-trained models in Hugging Face Transformers require pre-segmented text? (fanlinbo, closed 3 years ago, 1 comment)
#148 When will CLM pre-trained language models like GPT-2 be provided? (piekey1994, closed 3 years ago, 1 comment)
#147 The TensorFlow AdamWeightDecayOptimizer has no effect when fine-tuning BERT; have you encountered this? (guotong1988, closed 3 years ago, 15 comments)
#146 Could the MLM weights be added to RoBERTa-wwm-ext-large? (bojone, closed 3 years ago, 1 comment)
#145 Chinese Wikipedia dataset (liuwei1206, closed 3 years ago, 2 comments)
#144 transformers 2.2.2 fails to load the weights (RoacherM, closed 3 years ago, 4 comments)
#143 Question about pre-trained model convergence (LeoWood, closed 3 years ago, 2 comments)