Hello, thanks for Semi-supervised-learning!

In `semilearn/core/utils/build.py:217`, `def get_cosine_schedule_with_warmup`:
I want to know why `num_cycles` is set to `7. / 16`. Is there a reference for this value?

I also checked the original cosine annealing algorithm, and it seems to be slightly different from the version in the code. Is it possible that TorchSSL does not use the original algorithm, but instead a custom learning-rate decay of its own? Or did I find the wrong reference paper?
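For context, here is a minimal sketch of the kind of schedule being asked about: linear warmup followed by `cos(pi * num_cycles * progress)` decay, in the style of the FixMatch learning-rate rule (a reconstruction for illustration, not necessarily the repository's exact code). The key effect of `num_cycles = 7/16` is that the multiplier ends at `cos(7*pi/16) ≈ 0.195` rather than 0, so the learning rate never fully decays:

```python
import math

def cosine_schedule_sketch(current_step,
                           num_training_steps,
                           num_warmup_steps=0,
                           num_cycles=7.0 / 16.0):
    """Learning-rate multiplier: linear warmup, then cosine decay.

    Sketch of a FixMatch-style rule, eta = eta0 * cos(7*pi*k / (16*K));
    not necessarily the exact code in semilearn/core/utils/build.py.
    """
    if current_step < num_warmup_steps:
        # Linear warmup from 0 to 1.
        return current_step / max(1, num_warmup_steps)
    # Fraction of the post-warmup training completed, in [0, 1].
    progress = (current_step - num_warmup_steps) / max(
        1, num_training_steps - num_warmup_steps)
    # With num_cycles = 7/16 this ends at cos(7*pi/16), not 0.
    return max(0.0, math.cos(math.pi * num_cycles * progress))

# At the last step the multiplier is cos(7*pi/16), roughly 0.195:
print(round(cosine_schedule_sketch(1000, 1000), 3))
```

With the standard half-cycle setting (`num_cycles = 0.5`) the multiplier would instead reach `cos(pi/2) = 0` at the final step, which is the discrepancy the question is about.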
And the TorchSSL version is:

Looking forward to your reply, thank you!