Closed khs1126 closed 1 year ago
Hi,
I downloaded my-unsup-promcse-bert-base-uncased and ran this command
python evaluation.py --model_name_or_path my-unsup-promcse-bert-base-uncased --pooler_type cls_before_pooler --task_set sts --mode test --pre_seq_len 10
to reproduce your results, but got this error:
RuntimeError: Error(s) in loading state_dict for BertForCL: size mismatch for prefix_encoder.embedding.weight: copying a param with shape torch.Size([16, 18432]) from checkpoint, the shape in current model is torch.Size([10, 18432]).
Could you please help?
Since we trained the unsup-promcse-bert-base-uncased model with 16 soft prompts, you need to set --pre_seq_len to 16 when evaluating. The error message reflects exactly this mismatch: the checkpoint's prefix_encoder.embedding.weight has shape [16, 18432] (16 soft prompt tokens), while --pre_seq_len 10 builds a model expecting [10, 18432].
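Concretely, the evaluation command should look like this (same flags as above, only the prompt length changed to match the checkpoint):

```
python evaluation.py --model_name_or_path my-unsup-promcse-bert-base-uncased --pooler_type cls_before_pooler --task_set sts --mode test --pre_seq_len 16
```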