YJiangcm / PromCSE

Code for "Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning (EMNLP 2022)"
https://arxiv.org/abs/2203.06875v2

unsup-bert-base evaluation error #5

Closed khs1126 closed 1 year ago

khs1126 commented 1 year ago

Hi,

I downloaded my-unsup-bert-base-uncased and used this command

python evaluation.py --model_name_or_path my-unsup-promcse-bert-base-uncased --pooler_type cls_before_pooler --task_set sts --mode test --pre_seq_len 10

to reproduce your results, but got this error:

RuntimeError: Error(s) in loading state_dict for BertForCL: size mismatch for prefix_encoder.embedding.weight: copying a param with shape torch.Size([16, 18432]) from checkpoint, the shape in current model is torch.Size([10, 18432]).

Could you please help?

YJiangcm commented 1 year ago

Hi,

Since we trained the unsup-bert-base-uncased model with 16 soft prompts, you need to set "pre_seq_len" to 16 when evaluating.
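For anyone hitting the same error: the shape mismatch follows directly from how a P-tuning-style prefix encoder is sized. Its embedding table has one row per soft prompt token and, for bert-base (12 layers, hidden size 768, separate keys and values per layer), 12 × 2 × 768 = 18432 columns, which matches the 18432 in the error message. A minimal sketch of the arithmetic (the helper name is illustrative, not from the repo):

```python
# Why the checkpoint and the freshly built model disagree:
# the prefix embedding has one row per soft prompt token.
# bert-base-uncased: 12 layers, hidden size 768, keys + values per layer.
num_layers = 12
hidden_size = 768
prefix_width = num_layers * 2 * hidden_size  # 18432, as in the error message

def prefix_embedding_shape(pre_seq_len):
    """Hypothetical helper: shape of prefix_encoder.embedding.weight."""
    return (pre_seq_len, prefix_width)

checkpoint_shape = prefix_embedding_shape(16)  # model was trained with 16 soft prompts
eval_shape = prefix_embedding_shape(10)        # --pre_seq_len 10 at evaluation time

print(checkpoint_shape)  # (16, 18432)
print(eval_shape)        # (10, 18432) -> size mismatch on load
```

So rerunning the evaluation command with --pre_seq_len 16 instead of --pre_seq_len 10 makes the two shapes agree and the checkpoint loads cleanly.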