HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

Training using Supcon loss is slower than cross_entropy loss #151

Open yinbing668 opened 2 months ago

yinbing668 commented 2 months ago

I find that in the same setup (same dataloader, same network architecture), accuracy improves more slowly when training with the SupCon loss than with the cross-entropy loss. Do you observe the same behavior? Also, I cannot run the code; it reports a segmentation fault.
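For context, the supervised contrastive loss being compared here can be sketched as follows. This is a minimal NumPy illustration of the formulation in the SupCon paper (Khosla et al., 2020), not the repository's actual `SupConLoss` implementation, which additionally handles multiple augmented views per sample and numerical-stability details:

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive loss over a batch of embeddings.

    features: (N, D) array of embeddings; labels: (N,) integer class labels.
    Each anchor is pulled toward all same-class samples (its positives)
    and pushed away from everything else in the batch.
    """
    # L2-normalize so dot products are cosine similarities
    z = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)  # exclude self-contrast (exp(-inf) = 0)

    # log softmax over each anchor's row: log p(a | i) for all a != i
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))

    # positives: same label, excluding the anchor itself
    n = len(labels)
    pos_mask = (labels[:, None] == labels[None, :]) & ~np.eye(n, dtype=bool)

    # mean log-likelihood over each anchor's positives, averaged over anchors
    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1) / pos_mask.sum(axis=1)
    return per_anchor.mean()
```

Note that this loss only trains the encoder: unlike cross-entropy, it produces no classifier, so in the repo's two-stage recipe a linear head is trained afterwards (with cross-entropy) on the frozen features, which can make accuracy appear to rise more slowly early in training.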