vkinakh / scatsimclr

Official implementation of paper "ScatSimCLR: self-supervised contrastive learning with pretext task regularization for small-scale datasets", accepted at ICCV 2021 2nd Visual Inductive Priors for Data-Efficient Deep Learning Workshop

pretext task regularizer #1

Closed: juingzhou closed this issue 2 years ago

juingzhou commented 3 years ago

Hello, I want to train with the pretext task regularizer.

vkinakh commented 3 years ago

Hello! For pretext task regularizer training, use the PretextTaskTrainer.

You can do it by filling in the config file, as shown in the example.

Then run the command: python main.py -m=pretext -c <path to your config>
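
For example, a full invocation might look like the sketch below; the config path is only a hypothetical placeholder, so substitute the path to your own filled-in config file:

```bash
# Launch pretext-task-regularized training with your own config.
# configs/pretext_config.yaml is a placeholder path, not a file shipped with the repo.
python main.py -m=pretext -c configs/pretext_config.yaml
```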

Good luck!