KevinMusgrave / pytorch-metric-learning

The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
https://kevinmusgrave.github.io/pytorch-metric-learning/
MIT License

Neural representation similarities #683

Open domenicoMuscill0 opened 9 months ago

domenicoMuscill0 commented 9 months ago

Would you consider adding some similarity measures between neural network representations to this library? For example, CCA or CKA.

KevinMusgrave commented 9 months ago

Can you provide links to the relevant papers or code?

I'm wondering if it's out of the scope of this library, or if there is an existing library that already serves this purpose well.

domenicoMuscill0 commented 9 months ago

CKA and CCA similarities have been implemented by jayroxis and moskomule. The second link, however, is a library with other neural network representation similarities. I am not sure it is still maintained, but it has tests and PyTorch implementations that the tests of a future implementation here could be compared against.

KevinMusgrave commented 8 months ago

Does it make sense to apply CKA and CCA to embeddings?

domenicoMuscill0 commented 8 months ago

I think it depends on the task. They could be used in some applications of NAS, or in other models that could benefit from these similarities.

KevinMusgrave commented 8 months ago

I mean, can they be used as a drop-in replacement for any of the existing losses, like the contrastive loss?

domenicoMuscill0 commented 8 months ago

Probably not. For CKA, one takes the data matrix X (num_examples x num_features), computes a Gram matrix from it (in the link they use either linear or RBF kernels, but I think any other similarity between embeddings could be adopted), and then compares it with the Gram matrix of another representation of the same examples, e.g. from a different network or layer. I think it qualifies best as a network comparison method rather than a similarity measure in the sense intended by this library.
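
For concreteness, here is a minimal sketch of the linear-kernel variant of that procedure. The function name `linear_cka`, the dense centering matrix, and the random test tensors are illustrative assumptions, not part of this library or the linked repos:

```python
import torch

def linear_cka(X, Y):
    """Linear CKA between two representations of the same n examples.

    X: (n, p1) activations from one network/layer
    Y: (n, p2) activations from another network/layer
    """
    n = X.shape[0]
    # Gram matrices; a linear kernel here, but an RBF (or any other
    # similarity between embeddings) could be swapped in.
    K = X @ X.T
    L = Y @ Y.T
    # Centering matrix H = I - (1/n) * 11^T (subtracting the scalar 1/n
    # from the identity gives exactly this matrix elementwise).
    H = torch.eye(n, dtype=X.dtype, device=X.device) - 1.0 / n
    Kc = H @ K @ H
    Lc = H @ L @ H
    # HSIC-style Frobenius inner products, then normalize.
    hsic_kl = (Kc * Lc).sum()
    hsic_kk = (Kc * Kc).sum()
    hsic_ll = (Lc * Lc).sum()
    return hsic_kl / (hsic_kk.sqrt() * hsic_ll.sqrt())

# Example: compare two embeddings of the same batch of 128 examples.
X = torch.randn(128, 64)
Y = torch.randn(128, 256)
print(linear_cka(X, Y))  # scalar in [0, 1]; values near 1 mean highly similar representations
```

The Gram-matrix form is shown because it matches the description above; with a linear kernel, the same value can also be computed directly from the centered feature matrices, which is cheaper when num_examples is large.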

KevinMusgrave commented 8 months ago

OK, let's leave it out for now, unless other people express interest.