google-research / simclr

SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
https://arxiv.org/abs/2006.10029
Apache License 2.0
4.09k stars 624 forks

Could you please kindly provide the self-distilled model(ResNet152-3x-sk1) for us #70

Closed YFWDZWS closed 4 years ago

YFWDZWS commented 4 years ago

Hi, I am a researcher working on self-supervised learning and I really appreciate your work on SimCLRv2. Could you kindly provide the self-distilled model (ResNet152-3x-sk1)? Thank you so much!

YFWDZWS commented 4 years ago

Also the ResNet-50 that is distilled from a big model (e.g. ResNet152-3x-sk1).

chentingpc commented 4 years ago

We are considering open-sourcing the distilled models, but it could take a few days/weeks. We'll post here once they're available.

chentingpc commented 4 years ago

The distilled model checkpoints have been released at gs://simclr-checkpoints/simclrv2. Let me know if there are any issues.
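For anyone looking to fetch these, the bucket can be browsed and copied with the Google Cloud SDK's `gsutil` tool. A minimal sketch (the subdirectory layout under `simclrv2/` and the destination path are illustrative, not confirmed by this thread):

```shell
# List the released SimCLRv2 checkpoints in the public bucket.
gsutil ls gs://simclr-checkpoints/simclrv2/

# Recursively copy the checkpoints to a local directory
# (destination name is arbitrary; -m parallelizes the transfer).
gsutil -m cp -r gs://simclr-checkpoints/simclrv2/ ./simclrv2_checkpoints/
```

The bucket can also be browsed in a web browser via the Cloud Storage console link below, without installing any tools.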

YFWDZWS commented 4 years ago

Thank you so much for your update!

Best,
Yifei Wu
