Open comchobo opened 1 year ago
Hi, you could try changing 'Roberta' to 'DebertaV2' in models.py. For example, change
from transformers.models.roberta.modeling_roberta import RobertaPreTrainedModel, RobertaModel, RobertaLMHead
to
from transformers.models.deberta_v2.modeling_deberta_v2 import DebertaV2PreTrainedModel, DebertaV2Model, DebertaV2LMHead
and then create a corresponding DebertaForCL class.
Hope it works.
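A minimal sketch of what such a class could look like. The class name `DebertaForCL`, the [CLS] pooling, and the in-batch contrastive loss below are illustrative assumptions, not the repository's actual implementation; also note the exact LM-head class name in transformers may differ from `DebertaV2LMHead`, so check modeling_deberta_v2 before copying the import above verbatim.

```python
# Hypothetical DebertaForCL, modeled loosely on a RobertaForCL-style class.
# Pooling and loss details are assumptions for illustration only.
import torch
import torch.nn as nn
from transformers.models.deberta_v2.modeling_deberta_v2 import (
    DebertaV2Model,
    DebertaV2PreTrainedModel,
)

class DebertaForCL(DebertaV2PreTrainedModel):
    def __init__(self, config, temp=0.05):
        super().__init__(config)
        self.deberta = DebertaV2Model(config)
        self.temp = temp  # temperature for the contrastive loss
        self.post_init()

    def forward(self, input_ids, attention_mask=None):
        # input_ids: (batch, 2, seq_len) -- two views of each sentence,
        # as in SimCSE-style training where each sentence is encoded twice.
        bs, num_sent, seq_len = input_ids.shape
        flat_ids = input_ids.view(-1, seq_len)
        flat_mask = (
            attention_mask.view(-1, seq_len) if attention_mask is not None else None
        )
        out = self.deberta(input_ids=flat_ids, attention_mask=flat_mask)
        # Simple [CLS]-token pooling over the last hidden state.
        pooled = out.last_hidden_state[:, 0].view(bs, num_sent, -1)
        z1, z2 = pooled[:, 0], pooled[:, 1]
        # In-batch contrastive loss: matching pairs are positives,
        # all other in-batch pairs are negatives.
        sim = nn.functional.cosine_similarity(
            z1.unsqueeze(1), z2.unsqueeze(0), dim=-1
        ) / self.temp
        labels = torch.arange(bs, device=sim.device)
        loss = nn.functional.cross_entropy(sim, labels)
        return loss, z1
```

With a tiny randomly initialized DebertaV2Config this runs end to end without downloading weights, which is a convenient way to smoke-test the forward pass before wiring it into the training script.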
I think that, because of the attention mechanism in DeBERTaV2, it is not easy to implement a DeBERTaV2 prefix-based model. It is also not supported in PEFT, which implements most lightweight training methods, including prefix-based ones.
Hello. I'm trying to use your method for sentence embeddings. DeBERTaV3 seems like a really strong BERT-style model, so I tried to implement it with your code, but it has proven a bit tough for me. Do you have any solution or plan for supporting DeBERTaV3 in your code, e.g., implementing a DebertaForCL class in models.py?