diaoenmao / HeteroFL-Computation-and-Communication-Efficient-Federated-Learning-for-Heterogeneous-Clients

[ICLR 2021] HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients

Optimizer discrepancy #12

Closed: AmirEstiri closed this issue 9 months ago

AmirEstiri commented 9 months ago

Hi, thank you for your work. After studying it in detail, we have found that the make_optimizer() function uses the optimizer specified in config.yml even when we pass the optimizer name as a command-line argument. In the config file the optimizer is Adam, but the paper never mentions Adam and states that all experiments were run with SGD. Could you please clarify this discrepancy? Which optimizer was actually used for the experiments?
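To make the behavior concrete, here is a minimal sketch of what we believe is happening. The exact signature, config keys, and defaults are our guess based on config.yml, not the repo's verbatim code; the point is only that the optimizer name appears to come from the config rather than from the command line.

```python
import torch

def make_optimizer(model, cfg):
    # Sketch (our assumption): the optimizer name is read from cfg,
    # which is loaded from config.yml, so the name passed on the
    # command line never reaches this branch.
    name = cfg['optimizer_name']  # 'Adam' in the shipped config.yml
    if name == 'SGD':
        return torch.optim.SGD(model.parameters(), lr=cfg['lr'],
                               momentum=cfg.get('momentum', 0.9),
                               weight_decay=cfg.get('weight_decay', 5e-4))
    elif name == 'Adam':
        return torch.optim.Adam(model.parameters(), lr=cfg['lr'],
                                weight_decay=cfg.get('weight_decay', 0.0))
    raise ValueError(f'Unknown optimizer name: {name}')
```

For now we work around this by editing config.yml to set the optimizer to SGD before running, but we would like to confirm which setting matches the paper's experiments.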