yym6472 / ConSERT

Code for our ACL 2021 paper - ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer

'BertModel' object has no attribute 'set_flag' #26

Open zhihao-chen opened 2 years ago

zhihao-chen commented 2 years ago

The exact error is:

```
File "/data2/work2/chenzhihao/NLP/nlp/sentence_transformers/SentenceTransformer.py", line 594, in fit
    loss_value = loss_model(features, labels)
File "/root/anaconda3/envs/NLP_py39/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
File "/data2/work2/chenzhihao/NLP/nlp/sentence_transformers/losses/AdvCLSoftmaxLoss.py", line 775, in forward
    rep_a_view1 = self._data_aug(sentence_feature_a, self.data_augmentation_strategy_final_1,
File "/data2/work2/chenzhihao/NLP/nlp/sentence_transformers/losses/AdvCLSoftmaxLoss.py", line 495, in _data_aug
    self.model[0].auto_model.set_flag("data_aug_cutoff", True)
File "/root/anaconda3/envs/NLP_py39/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1185, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'BertModel' object has no attribute 'set_flag'
```

I'm loading the hfl/chinese-roberta-wwm-ext model.
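
A quick check to confirm the mismatch (a sketch; `model` here is assumed to be the SentenceTransformer instance built in the training script):

```python
# If this prints False, the loaded BERT comes from the official transformers
# package and lacks the patched set_flag method that AdvCLSoftmaxLoss expects.
print(hasattr(model[0].auto_model, "set_flag"))
```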

yym6472 commented 2 years ago

Judging from the traceback, the code is not using the transformers package bundled in this repo but the one installed under site-packages. You can try removing the installed package (pip uninstall transformers), or manually replace the modeling_bert.py file of your installed transformers package with the repo's version (https://github.com/yym6472/ConSERT/blob/master/transformers/modeling_bert.py).
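
One quick way to confirm which package is actually imported (a minimal sketch; run it in the same environment and working directory as the training script):

```python
# Check which transformers package Python resolves. If the printed path points
# into site-packages rather than the ConSERT repo's transformers/ directory,
# the patched modeling_bert.py (which defines set_flag) is not being used.
import transformers
print(transformers.__file__)
```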

zhihao-chen commented 2 years ago

Thanks a lot. Are there any plans to support the official sentence_transformers and transformers packages?

yym6472 commented 2 years ago

Since the data augmentation is applied at the embedding layer, the code inside BERT's forward pass in transformers has to be modified. I haven't yet figured out how to implement this without changing the official package.
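
For intuition, here is a hypothetical sketch (not the repo's actual code) of the flag mechanism that the patched modeling_bert.py adds, and why a vanilla BertModel raises the AttributeError: the loss calls set_flag right before the forward pass, and the forward pass reads the flag to apply a cutoff-style augmentation on the embedding output. The class name, method bodies, and the 0.2 cutoff ratio below are illustrative only.

```python
import torch
import torch.nn as nn

class FlaggedEncoderSketch(nn.Module):
    """Toy stand-in for the patched BertModel: supports set_flag()."""

    def __init__(self, vocab_size=21128, hidden_size=768):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, hidden_size)
        self.flags = {}

    def set_flag(self, name, value):
        # Called by the loss before forward, e.g. set_flag("data_aug_cutoff", True)
        self.flags[name] = value

    def forward(self, input_ids):
        emb = self.embeddings(input_ids)  # (batch, seq_len, hidden)
        if self.flags.pop("data_aug_cutoff", False):  # one-shot flag
            # Token cutoff: randomly zero out whole token embeddings.
            keep = (torch.rand(emb.size(0), emb.size(1), 1, device=emb.device) > 0.2).float()
            emb = emb * keep
        # ... a real model would continue through the BERT encoder layers ...
        return emb
```

The official BertModel has no such set_flag method, hence the error when AdvCLSoftmaxLoss calls self.model[0].auto_model.set_flag("data_aug_cutoff", True).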

zhihao-chen commented 2 years ago

Got it, thanks for your reply.