neurodata / ProgLearn

NeuroData's package for exploring and using progressive learning algorithms
https://proglearn.neurodata.io

Contrastive learning transformer for PL Network #426

Open rflperry opened 3 years ago

rflperry commented 3 years ago

Background

Currently, the progressive learning network transformer is learned as a byproduct of optimizing a softmax cross-entropy loss for classification accuracy.

A contrastive loss (reference 1, reference 2) learns the transformer explicitly, penalizing embeddings of different classes that lie close to one another (see also margin loss). Such a representation may be better suited to the downstream kNN voter, and has shown state-of-the-art accuracy; a rough sketch of one variant follows below.

See official implementation here.
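For concreteness, here is a minimal sketch of a batch-wise supervised contrastive objective in TensorFlow, assuming the embeddings come from the output layer of a Keras backbone like the one the existing `NetworkTransformer` trains. The function name, temperature default, and batch construction are illustrative, not part of ProgLearn or the linked implementation.

```python
import tensorflow as tf

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Batch-wise supervised contrastive loss (illustrative sketch).

    embeddings: (batch, dim) float tensor from the transformer's output layer.
    labels:     (batch,) int tensor of class labels.
    """
    # Cosine similarities between all pairs, scaled by temperature.
    z = tf.math.l2_normalize(embeddings, axis=1)
    logits = tf.matmul(z, z, transpose_b=True) / temperature
    # Subtract the row max for numerical stability (softmax is shift-invariant).
    logits = logits - tf.stop_gradient(tf.reduce_max(logits, axis=1, keepdims=True))

    batch_size = tf.shape(z)[0]
    labels = tf.reshape(labels, (-1, 1))
    same_class = tf.cast(tf.equal(labels, tf.transpose(labels)), tf.float32)
    not_self = 1.0 - tf.eye(batch_size)
    positives = same_class * not_self  # same label, excluding self-pairs

    # Log-softmax over all non-self pairs for each anchor.
    exp_logits = tf.exp(logits) * not_self
    log_prob = logits - tf.math.log(tf.reduce_sum(exp_logits, axis=1, keepdims=True))

    # Mean log-probability of the positive pairs, averaged over anchors.
    n_pos = tf.maximum(tf.reduce_sum(positives, axis=1), 1.0)
    return tf.reduce_mean(-tf.reduce_sum(positives * log_prob, axis=1) / n_pos)
```

Pulling same-class embeddings together and pushing different-class embeddings apart is exactly the geometry a kNN voter exploits, which is the intuition for why this objective might pair better with kNN voting than a softmax head does.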

Proposed feature: implement a contrastive loss for the network transformer.

Validate by comparing classification accuracy first and then transfer efficiency (see the sketch below), and determine which form of contrastive loss works best.
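For the validation step: transfer efficiency in the ProgLearn papers is the ratio of the single-task learner's generalization error on task t to the progressive learner's error on the same task, so values above 1 indicate beneficial transfer. Comparing the softmax and contrastive transformers then reduces to comparing these ratios. A trivial sketch, with illustrative function name and placeholder error values:

```python
import numpy as np

def transfer_efficiency(single_task_error, progressive_error):
    """Transfer efficiency: error of a learner trained on task t alone,
    divided by the error of the progressive learner on task t.
    TE > 1 means the transferred representation helped; TE < 1 means it hurt.
    """
    return np.asarray(single_task_error) / np.asarray(progressive_error)

# Illustrative comparison across repeated trials (all numbers are placeholders).
te_softmax = transfer_efficiency([0.20, 0.22], [0.18, 0.19])
te_contrastive = transfer_efficiency([0.20, 0.22], [0.16, 0.17])
print(te_softmax.mean(), te_contrastive.mean())
```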

Prior experiments

I had fiddled with this a bit. See my attempted contrastive learning implementation here and preliminary transfer efficiency benchmarks for variations on the contrastive loss layers.

Dante-Basile commented 2 years ago

Hi, I am looking into addressing this issue for NDD 2021-2022.

waleeattia commented 2 years ago

Hello, I'm also looking into this issue for NDD 2021.