FedNLP: An Industry and Research Integrated Platform for Federated Learning in Natural Language Processing, Backed by FedML, Inc. The previous research version was accepted to NAACL 2022.
I want to know how you maintain the parameters of each large model (such as BERT) during federated learning, e.g. with the FedAvg algorithm. Before server aggregation, if you run federated learning locally, you need to keep many sets of model parameters in memory.
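To illustrate the memory concern in the question: a minimal sketch (not FedNLP's actual implementation) of FedAvg aggregation where the server folds each client update into a running weighted average, so at most one client's full set of parameters needs to be resident at a time instead of all of them. Here model parameters are stood in by dicts of plain float lists; the function name `fedavg_incremental` and the data layout are assumptions for the sketch.

```python
# Hypothetical sketch of incremental FedAvg aggregation (not FedNLP's code).
# Instead of holding every client's model in memory before averaging,
# the server consumes client updates one at a time and keeps only a
# running sample-weighted sum, dividing by the total sample count at the end.

def fedavg_incremental(client_updates):
    """Aggregate client models with sample-weighted FedAvg.

    client_updates: iterable of (state, num_samples) pairs, where `state`
    maps parameter names to lists of floats (a stand-in for tensors).
    Returns the sample-weighted average of all client parameters.
    """
    avg, total = None, 0
    for state, n in client_updates:
        total += n
        if avg is None:
            # First client: initialize the running weighted sum.
            avg = {k: [v * n for v in vals] for k, vals in state.items()}
        else:
            # Subsequent clients: accumulate in place, then discard `state`.
            for k, vals in state.items():
                for i, v in enumerate(vals):
                    avg[k][i] += v * n
    # Normalize by the total number of samples across clients.
    return {k: [v / total for v in vals] for k, vals in avg.items()}
```

Usage: with client A holding `{"w": [1.0, 2.0]}` over 1 sample and client B holding `{"w": [3.0, 4.0]}` over 3 samples, the aggregate is `{"w": [2.5, 3.5]}`. Because the generator can load each client's checkpoint from disk lazily, peak memory stays at roughly two models (the running average plus the current client) rather than one per client.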