neurodata / ProgLearn

NeuroData's package for exploring and using progressive learning algorithms
https://proglearn.neurodata.io

Supervised Contrastive Loss #518

Open waleeattia opened 2 years ago

waleeattia commented 2 years ago

Reference issue

#426

Type of change

- Implementing supervised contrastive loss
- Adding a plotting script to compare accuracies and transfer efficiencies

What does this implement/fix?

The supervised contrastive loss explicitly learns the progressive learning network's transformer by penalizing samples of different classes that lie close to one another in the embedding space. The new script enables two DNN algorithms to be compared by plotting the difference between their accuracies and transfer efficiencies. The supervised contrastive loss version improves accuracy by 6 percent compared to the PL network trained with categorical cross-entropy.
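For reference, a minimal NumPy sketch of a supervised contrastive loss in the style of Khosla et al. (2020): each sample is an anchor, samples sharing its label are positives, and the loss pushes positives together relative to all other samples. This is an illustrative standalone implementation, not the code from this PR; the function name, temperature default, and shapes are assumptions.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch.

    embeddings: (n, d) array of representations
    labels: (n,) integer class labels
    """
    # L2-normalize so pairwise dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # (n, n) similarity logits
    n = labels.shape[0]

    # exclude self-similarity from both numerator and denominator
    not_self = ~np.eye(n, dtype=bool)
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & not_self

    # numerically stable log-softmax over all non-self samples
    sim_max = np.max(np.where(not_self, sim, -np.inf), axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * not_self
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))

    # average log-probability of positives per anchor, then negate
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0  # skip anchors with no positive in the batch
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()
```

With this formulation, a batch whose same-class embeddings cluster together yields a lower loss than one whose same-class embeddings are spread apart, which is the behavior being used here to shape the transformer.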

Additional information

NDD 2021

codecov[bot] commented 2 years ago

Codecov Report

Merging #518 (43b05f7) into staging (634d4d1) will not change coverage. The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff            @@
##           staging     #518   +/-   ##
========================================
  Coverage    90.09%   90.09%           
========================================
  Files            7        7           
  Lines          404      404           
========================================
  Hits           364      364           
  Misses          40       40           

Continue to review full report at Codecov.


jdey4 commented 2 years ago

@rflperry Does this PR help your query about contrastive loss?

rflperry commented 2 years ago

Yeah, this seems to match my results: transfer ability goes down, which I find interesting, though I'm still intrigued by the reason why. Is it really worth adding if it's always worse? I forget why I got different results with different labels.

rflperry commented 2 years ago

My takeaways/summary:

waleeattia commented 2 years ago

@PSSF23 fixed!

waleeattia commented 2 years ago

@PSSF23 Perfect, just made those changes. Thank you!

waleeattia commented 2 years ago

@PSSF23 Sorry I missed that, it should be good now.