YU1ut / MixMatch-pytorch

Code for "MixMatch - A Holistic Approach to Semi-Supervised Learning"
MIT License

Importance of updating lambda slowly #9

Closed wang3702 closed 5 years ago

wang3702 commented 5 years ago

Hi, I have tested both your original code and my reproduced code. I found that accuracy increases more smoothly when, as Google suggested, lambda is ramped up to 72 over 1024 epochs. Here is a comparison of our results over roughly the first 100 epochs:

[Figure: mixmatch_yu — my training result, accuracy comparison; mixmatch_yu1 — my training result, loss comparison]

JongMokKim commented 5 years ago

Hello! What does 'update lambda to 72 with 1024 epochs' mean? Do you mean the maximum lambda value is 72, and it gradually increases over the whole run (1024 epochs)? Thank you in advance!

wang3702 commented 5 years ago

@JongMokKim Yes, exactly. Updating it slowly is much better for stable training.

sanyouwu commented 5 years ago

> @JongMokKim Yes. Exactly. Slowly update is much better for stable training

So, how should the function that controls the lambda schedule be designed?

YU1ut commented 5 years ago

I updated the calculation of lambda so that it gradually increases over the whole training run.
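For anyone landing here later, the schedule being discussed can be sketched as a simple linear ramp-up: the unsupervised-loss weight grows from 0 to its maximum over a fixed number of epochs, then stays constant. This is a hedged sketch, not the repo's exact code: the function name `linear_rampup` and the defaults (`rampup_length=1024` from this thread, `lambda_u_max=75.0` from the MixMatch paper; the comment above uses 72) are assumptions you should adjust to your setup.

```python
import numpy as np

def linear_rampup(current_epoch, rampup_length=1024, lambda_u_max=75.0):
    """Linearly ramp the unsupervised-loss weight lambda_u from 0 up to
    lambda_u_max over the first `rampup_length` epochs, then hold it there.

    Defaults are illustrative: rampup_length follows this thread's 1024 epochs,
    lambda_u_max follows the paper's value for CIFAR-10 (the thread uses 72).
    """
    if rampup_length == 0:
        return lambda_u_max
    # Fraction of the ramp completed, clipped to [0, 1] so the weight
    # saturates at lambda_u_max after rampup_length epochs.
    ratio = float(np.clip(current_epoch / rampup_length, 0.0, 1.0))
    return lambda_u_max * ratio
```

In the training loop you would then compute the total loss as `loss = loss_x + linear_rampup(epoch) * loss_u`, so the unlabeled term only dominates once the model's guessed labels have had time to stabilize.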