vaseline555 / Federated-Learning-in-PyTorch

Handy PyTorch implementation of Federated Learning (for your painless research)
MIT License

The accuracy on Cifar10 may be low. #6

Closed ADAM0064 closed 1 year ago

ADAM0064 commented 2 years ago

I ran the FedAvg code with the CNN2 model from model.py on the CIFAR-10 dataset. I also excluded the model initialization in server.py, and all of the clients (only 10) were set to update and upload their models to the server. However, over about 100 rounds the accuracy only rises to around 70% and does not improve afterwards. Is there anything I have missed or misunderstood? Could anyone please offer me some advice?
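For reference, the server-side step being described is FedAvg aggregation. Below is a minimal, hypothetical sketch of that step, with parameters shown as plain lists of floats instead of PyTorch tensors so it stays self-contained (function and variable names are illustrative, not the repo's actual API):

```python
# Minimal sketch of the FedAvg aggregation step (McMahan et al., 2016).
# Each client's parameters are weighted by its local dataset size.
def fedavg_aggregate(client_params, client_sizes):
    """Weighted average of client parameter vectors by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    global_params = [0.0] * dim
    for params, n in zip(client_params, client_sizes):
        weight = n / total  # fraction of total data held by this client
        for i, p in enumerate(params):
            global_params[i] += weight * p
    return global_params

# With equally-sized clients, every client contributes equally:
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [50, 50]
print(fedavg_aggregate(clients, sizes))  # [2.0, 3.0]
```

If all 10 clients participate every round, as in the setup above, this reduces to a plain (data-size-weighted) average of the 10 uploaded models.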

Weixiang-Han commented 2 years ago

Hello, I am also trying to train on CIFAR-10, but my training accuracy improves very slowly with lr = 0.001. I have tried several different learning rates, and sometimes there is no change at all. I don't know where the problem is. Can you share some suggestions or working parameters? Thank you.

otouat commented 1 year ago

This CNN model may not be the best fit for this dataset; you could try another CNN architecture (such as VGG) to get better performance.

vaseline555 commented 1 year ago

Sorry for the super late reply. To reproduce the best performance on the CIFAR-10 dataset reported in the original paper (McMahan et al., 2016), you should modify the hyperparameter settings.

According to Figure 4 and the CIFAR experiments section of the paper, with 100 clients you need E=5, B=50, lr >= 0.05, at least R > 500 rounds, and a learning rate decay of 0.99 per round. Please kindly check the original paper and run the experiment again.
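Those settings can be sketched as a small config plus a per-round decay schedule. This is a hypothetical illustration of the numbers above (the names are not the repo's actual arguments), kept dependency-free:

```python
# Hyperparameters suggested for CIFAR-10 FedAvg (McMahan et al., 2016, Fig. 4).
NUM_CLIENTS = 100
LOCAL_EPOCHS = 5    # E: local epochs per round
BATCH_SIZE = 50     # B: local batch size
INIT_LR = 0.05      # initial learning rate (>= 0.05)
NUM_ROUNDS = 500    # R: at least 500 communication rounds
LR_DECAY = 0.99     # multiplicative decay applied once per round

def lr_at_round(round_idx, init_lr=INIT_LR, decay=LR_DECAY):
    """Learning rate used at communication round `round_idx` (0-based)."""
    return init_lr * (decay ** round_idx)
```

With a 0.99 per-round decay the learning rate falls to roughly a third of its initial value after 100 rounds, which is why short runs at a fixed small lr (e.g. 0.001) tend to plateau well below the paper's reported accuracy.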

FYI, I wrote a paper on personalized federated learning, called SuPerFed (published at the 28th ACM SIGKDD 2022 conference). You may find more refined implementations of FedAvg, FedProx, etc. in that repo. Thank you.