Spijkervet / BYOL

Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning

Logistic regression on a model with no pre-training also achieves 83+ on CIFAR-10 #11

Open yanghu819 opened 1 year ago

yanghu819 commented 1 year ago

Before pre-training, I saved a checkpoint, whose representations should be very weak (screenshot attached).
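For reference, a minimal sketch of how such an untrained checkpoint could be produced: just save the randomly initialized encoder's weights before any BYOL training steps. The actual model class and checkpoint format used by this repo may differ, so treat this only as an illustration.

```python
# Hypothetical sketch: save a randomly initialized encoder as "model-no-train.pt".
# The repo's real save format may differ; this is illustrative only.
import torch
from torchvision import models

encoder = models.resnet18(weights=None)  # random init, no pre-training
torch.save(encoder.state_dict(), "./model-no-train.pt")
```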

I then ran logistic regression with:

python3 logistic_regression.py --model_path=./model-no-train.pt

The output was:

Epoch [296/300]: Loss/train: 0.4051439380645752 Accuracy/train: 0.8583268229166667
Epoch [297/300]: Loss/train: 0.4050535774230957 Accuracy/train: 0.8583854166666667
Epoch [298/300]: Loss/train: 0.4049636995792389 Accuracy/train: 0.8583333333333333
Epoch [299/300]: Loss/train: 0.40487425565719604 Accuracy/train: 0.8583528645833333

Calculating final testing performance

Final test performance: Accuracy/test: 0.8356724330357143
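To make the setup being reported explicit, here is a standalone sketch of the same kind of evaluation: features from a frozen, randomly initialized ResNet-18 with a single linear (logistic regression) classifier trained on top of them for CIFAR-10. This is not the repo's logistic_regression.py; model choice, transforms, and hyperparameters are assumptions made for illustration.

```python
# Minimal sketch: linear probe on a frozen, randomly initialized ResNet-18 (CIFAR-10).
# Not the repo's script; names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Randomly initialized encoder (no pre-training); drop the fc head -> 512-d features.
encoder = models.resnet18(weights=None)
encoder.fc = nn.Identity()
encoder.eval().to(device)

transform = transforms.ToTensor()
train_set = datasets.CIFAR10("./data", train=True, download=True, transform=transform)
test_set = datasets.CIFAR10("./data", train=False, download=True, transform=transform)

def extract_features(dataset):
    """Run the frozen encoder once over a dataset and cache features on CPU."""
    loader = DataLoader(dataset, batch_size=256, shuffle=False, num_workers=2)
    feats, labels = [], []
    with torch.no_grad():
        for x, y in loader:
            feats.append(encoder(x.to(device)).cpu())
            labels.append(y)
    return torch.cat(feats), torch.cat(labels)

X_train, y_train = extract_features(train_set)
X_test, y_test = extract_features(test_set)

# Logistic regression = one linear layer trained on the frozen features.
clf = nn.Linear(X_train.shape[1], 10).to(device)
opt = torch.optim.Adam(clf.parameters(), lr=3e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(300):
    perm = torch.randperm(len(X_train))
    for i in range(0, len(X_train), 256):
        idx = perm[i:i + 256]
        xb, yb = X_train[idx].to(device), y_train[idx].to(device)
        opt.zero_grad()
        loss = criterion(clf(xb), yb)
        loss.backward()
        opt.step()

with torch.no_grad():
    preds = clf(X_test.to(device)).argmax(dim=1).cpu()
print("Accuracy/test:", (preds == y_test).float().mean().item())
```

The point of the sketch is that the probe only ever sees pooled features from the frozen encoder, so whatever accuracy it reaches comes entirely from the (here untrained) random convolutional features plus the linear classifier.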