ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

Question: since RoBERTa was trained without NSP, is pooler_output still meaningful? #201

Closed · rmbone closed this issue 2 years ago

rmbone commented 2 years ago

I am using the Hugging Face API to load RoBERTa-wwm-large. Does the pooler_output returned by this BertModel still have any meaning?
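For concreteness, a minimal sketch of the setup being asked about (the `hfl/chinese-roberta-wwm-ext-large` Hub id and the sample sentence are assumptions for illustration; this repo's models are loaded with the BERT classes, not the RoBERTa ones):

```python
# Minimal sketch, assuming the hfl/chinese-roberta-wwm-ext-large checkpoint
# on the Hugging Face Hub and an arbitrary example sentence.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext-large")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext-large")
model.eval()

inputs = tokenizer("今天天气很好。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# pooler_output is the [CLS] hidden state passed through a Linear + tanh head.
# In original BERT that head is trained by the NSP objective; if NSP is dropped
# during pre-training, its weights carry no pre-training signal, so a common
# alternative is to pool last_hidden_state directly:
cls_vec = outputs.last_hidden_state[:, 0]                 # raw [CLS] vector
mask = inputs["attention_mask"].unsqueeze(-1).float()
mean_vec = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)  # mean pooling
```

In other words, the question is whether `outputs.pooler_output` above is usable as a sentence representation, or whether one should fall back to `cls_vec` / `mean_vec` style pooling.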

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 2 years ago

Closing the issue, since no updates have been observed. Feel free to re-open if you need any further assistance.