YJiangcm / PromCSE

Code for "Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning (EMNLP 2022)"
https://arxiv.org/abs/2203.06875v2

Reproduce supervised results #4

Closed — vinayak1 closed this issue 1 year ago

vinayak1 commented 1 year ago

Hi, I'm trying to reproduce the repo's supervised training results using the following environment:

- GPU: V100 (p3.2xlarge)
- Transformers 4.2.1
- PyTorch 1.12
- CUDA 11.3

When using these settings I get the following results:

BERT-base-uncased (test):

| STS12 | STS13 | STS14 | STS15 | STS16 | STSBenchmark | SICKRelatedness | Avg. |
|-------|-------|-------|-------|-------|--------------|-----------------|------|
| 74.46 | 83.81 | 79.25 | 86.02 | 81.07 | 84.00 | 80.65 | 81.32 |

RoBERTa-base (test):

| STS12 | STS13 | STS14 | STS15 | STS16 | STSBenchmark | SICKRelatedness | Avg. |
|-------|-------|-------|-------|-------|--------------|-----------------|------|
| 76.23 | 86.07 | 80.71 | 86.72 | 83.32 | 85.95 | 80.24 | 82.75 |

The results look slightly off compared to the reported ones. Could you please share the PyTorch and CUDA versions used for training?

Thanks!

YJiangcm commented 1 year ago

Hi, this is the environment we used for training:

- NVIDIA RTX 3090 Ti GPUs
- Transformers 4.2.1
- PyTorch 1.10
- CUDA 11.4

Results can vary based on different hardware/software.
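
For anyone comparing setups against the versions above, here is a small generic snippet (not part of this repo) that prints the relevant library and GPU versions:

```python
# Quick environment check: print the versions that matter for reproducing results.
import torch
import transformers

print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("CUDA (torch build):", torch.version.cuda)
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```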

vinayak1 commented 1 year ago

Thanks!

azareln commented 1 year ago

Are you able to release the checkpoint for sup-PromCSE-RoBERTa-base? I presume I cannot reproduce the results due to hardware differences. Thanks!

YJiangcm commented 1 year ago

Hi, the checkpoint for sup-PromCSE-RoBERTa-base has been uploaded to Hugging Face (https://huggingface.co/YuxinJiang/sup-promcse-roberta-base). I hope this helps you :)
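
For reference, a minimal sketch of fetching the released weights with the standard `transformers` API. Note this is only a download/sanity check: plain `AutoModel` loads just the RoBERTa backbone, while the learned soft prompts and prompt-based pooling from the paper live in this repo's code, so the repo's evaluation script should be used for faithful STS numbers.

```python
# Minimal sketch: pull the released sup-promcse-roberta-base weights from the Hub.
# Caveat: AutoModel loads only the RoBERTa backbone; the soft-prompt parameters
# and pooling logic from the PromCSE repo are needed to reproduce the paper's
# sentence embeddings, so treat this as a rough sanity check only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("YuxinJiang/sup-promcse-roberta-base")
model = AutoModel.from_pretrained("YuxinJiang/sup-promcse-roberta-base")
model.eval()

sentences = ["A man is playing a guitar.", "Someone plays an instrument."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rough embedding via the first (<s>) token; the repo's own tooling applies
# the prompt-based pooling described in the paper instead.
embeddings = outputs.last_hidden_state[:, 0]
cos = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print("cosine similarity:", cos.item())
```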