FedML-AI / FedNLP

FedNLP: An Industry and Research Integrated Platform for Federated Learning in Natural Language Processing, backed by FedML, Inc. The previous research version was accepted to NAACL 2022.

Optimizer suggestion for federated learning experiments #1

Closed chaoyanghe closed 3 years ago

chaoyanghe commented 3 years ago

1. Client optimizer (SGD) + server optimizer (Adam).

Pros: well suited to cross-device FL, since devices do not need to synchronize optimizer states. Cons: accuracy may drop somewhat, though this is uncertain; to our knowledge, nobody has explored this combination with Transformers, so it could be a contribution of our paper.

2. Client optimizer (Adam) + server optimizer (Adam). Pros: better accuracy. Cons: only works in the cross-silo setting, where clients are stateful and can keep their Adam optimizer states across rounds.
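The first combination can be sketched as a FedOpt-style loop: clients run plain SGD locally, and the server treats the difference between the global weights and the averaged client weights as a pseudo-gradient for a server-side Adam update. This is a minimal toy sketch on a linear least-squares problem, not the FedNLP implementation; all names, shapes, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def client_sgd(w, X, y, lr=0.1, steps=5):
    """Local SGD on mean squared error ||Xw - y||^2 / n (the client side)."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

class ServerAdam:
    """Server-side Adam that consumes the averaged pseudo-gradient
    (w_global - mean of client weights) instead of a true gradient."""
    def __init__(self, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = self.v = None
        self.t = 0

    def step(self, w_global, client_ws):
        pseudo_grad = w_global - np.mean(client_ws, axis=0)
        if self.m is None:
            self.m = np.zeros_like(w_global)
            self.v = np.zeros_like(w_global)
        self.t += 1
        # Standard Adam moment updates with bias correction.
        self.m = self.b1 * self.m + (1 - self.b1) * pseudo_grad
        self.v = self.b2 * self.v + (1 - self.b2) * pseudo_grad**2
        m_hat = self.m / (1 - self.b1**self.t)
        v_hat = self.v / (1 - self.b2**self.t)
        return w_global - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])  # hypothetical target weights
w = np.zeros(3)
server = ServerAdam()
for _ in range(50):                  # communication rounds
    client_ws = []
    for _ in range(4):               # 4 clients with locally sampled data
        X = rng.normal(size=(16, 3))
        y = X @ w_true + 0.1 * rng.normal(size=16)
        client_ws.append(client_sgd(w, X, y))
    w = server.step(w, client_ws)
```

The key point for cross-device FL is that `ServerAdam` keeps all Adam state (`m`, `v`, `t`) on the server; clients are stateless between rounds and only ever run SGD, so sampled devices need no optimizer-state synchronization.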

We will discuss optimizer performance in our benchmarking paper, which is intended as a field guide for users of the benchmark.