google-research / simclr

SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
https://arxiv.org/abs/2006.10029
Apache License 2.0

Pretrain weights and selecting intermediate outputs #96

Closed: adrian-dalessandro closed this issue 3 years ago

adrian-dalessandro commented 3 years ago

Hello! I'm very interested in accessing the pretrained SimCLR weights. Specifically, I'm interested in working with some intermediate outputs. Is there an easy way to access and process the outputs of the sublayers of the network? I'd like to do some simple explorations with Grad-CAM, and it's not clear how to get the necessary gradients from the TF Hub models.

chentingpc commented 3 years ago

There are several options here:

1. You can take gradients directly through the TF Hub module, but only with respect to the anchor points the module exposes. See In [7] of https://github.com/google-research/simclr/blob/master/colabs/load_and_inference.ipynb for how to list those anchor points; a sketch is given below.
2. If you want to take gradients with respect to an arbitrary layer, you need to load the checkpoint into a network you define yourself. You can do this in TensorFlow, or in a PyTorch network (see the README for the community-contributed scripts that convert the checkpoints); a sketch of this is given below as well.
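
For option 1, here is a minimal sketch in TF1 (compat) mode, as in the colab, where the hub module's graph is imported into the current graph so gradients can flow between the exposed anchor points. The hub path and the endpoint names (`block_group4`, `logits_sup`) are assumptions, not verified against a specific module; print the output dict keys as in In [7] of the colab to see which anchor points your module actually exposes.

```python
# Sketch only: gradients w.r.t. an exposed anchor point of a SimCLR hub module.
# The hub path and endpoint names below are assumed for illustration.
import numpy as np
import tensorflow.compat.v1 as tf
import tensorflow_hub as hub

tf.disable_eager_execution()

hub_path = 'gs://simclr-checkpoints/simclrv2/hub/r50_1x_sk0/'  # hypothetical path
module = hub.Module(hub_path, trainable=False)

images = tf.placeholder(tf.float32, [None, 224, 224, 3])
outputs = module(images, as_dict=True)  # dict of the module's anchor points

feature_map = outputs['block_group4']   # assumed name: last ResNet block output
logits = outputs['logits_sup']          # assumed name: supervised-head logits
top_class_score = tf.reduce_max(logits, axis=-1)

# Gradient of the class score w.r.t. the chosen anchor point (Grad-CAM style).
# This only works for tensors the module actually exposes as outputs.
grads = tf.gradients(top_class_score, feature_map)[0]

batch = np.random.rand(1, 224, 224, 3).astype(np.float32)  # stand-in; use real preprocessed images
with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    grad_values = sess.run(grads, feed_dict={images: batch})
    print(grad_values.shape)
```

The limitation is that you can only differentiate with respect to tensors the module returns as anchor points; anything deeper in the network requires option 2.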
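
For option 2, a minimal sketch assuming a SimCLRv2 checkpoint has been downloaded locally (the path and step number below are made up). Inspecting the checkpoint first makes it easier to map its variables onto a network defined with the repo's model code, or onto a converted PyTorch model as described in the README:

```python
# Sketch only: inspect a downloaded SimCLRv2 checkpoint before restoring it into
# a self-defined network. The checkpoint path/step below is hypothetical.
import tensorflow as tf

ckpt_path = '/tmp/simclrv2/r50_1x_sk0/model.ckpt-250228'  # hypothetical path and step

# List every variable stored in the checkpoint along with its shape.
reader = tf.train.load_checkpoint(ckpt_path)
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    print(name, shape)

# After defining the network with the model code in this repo, restore it, e.g.:
#   ckpt = tf.train.Checkpoint(model=my_model)   # my_model: your instantiated network
#   ckpt.restore(ckpt_path).expect_partial()
# Once restored, any layer's activations and gradients are accessible directly,
# e.g. with tf.GradientTape around a forward pass of my_model.
```

How the variables map onto your own network depends on how the checkpoint was written (object-based vs. name-based), so expect some name matching; that mapping is essentially what the community-contributed PyTorch converters mentioned in the README do.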