krishnakanthnakka opened this issue 1 year ago
In https://github.com/alibaba/FederatedScope/blob/6dfe8d4474451c07cd7d69690a2df33a1f455f19/federatedscope/autotune/fedex/client.py#L30
the sampled hyperparams at the client are written into self.trainer.cfg, but they don't seem to take actual effect in the client-side trainer.
For example, the dropout in the model (at https://github.com/alibaba/FederatedScope/blob/6dfe8d4474451c07cd7d69690a2df33a1f455f19/federatedscope/cv/model/cnn.py#L45C15-L45C22) is held in self.dropout, but this attribute is never updated during FedEx training; instead model_config.dropout is updated, and that value is never read again after the initialization stage.
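A minimal sketch of the pattern, simplified from the linked cnn.py (layer sizes and defaults here are illustrative, not the real model): the dropout rate is copied into self.dropout at construction time, so later edits to the config never reach the live model.

```python
import torch.nn.functional as F
from torch import nn


class ConvNet(nn.Module):
    """Simplified version of the pattern in cv/model/cnn.py."""

    def __init__(self, in_channels=3, hidden=64, n_classes=10, dropout=0.5):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, hidden, kernel_size=5)
        self.fc = nn.Linear(hidden, n_classes)
        self.dropout = dropout  # the rate is captured once, at __init__

    def forward(self, x):
        x = F.relu(self.conv(x))
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        # F.dropout reads self.dropout, not the (possibly updated) cfg
        x = F.dropout(x, p=self.dropout, training=self.training)
        return self.fc(x)


# FedEx writes the sampled value into the config ...
model_config = {'dropout': 0.1}
model = ConvNet(dropout=model_config['dropout'])

# ... but a later config update never reaches the live model:
model_config['dropout'] = 0.9
print(model.dropout)  # still 0.1
```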
In other words, I think that, similar to trainer.update() at https://github.com/alibaba/FederatedScope/blob/6dfe8d4474451c07cd7d69690a2df33a1f455f19/federatedscope/autotune/fedex/client.py#L46,
we should have some function in torch_trainer.py that updates these dynamic hyperparameters (a rough sketch of such a helper appears after the edit below).
Please correct me if I missed something.
Edit:
The optimizer-related hyperparams are indeed updated at https://github.com/alibaba/FederatedScope/blob/6dfe8d4474451c07cd7d69690a2df33a1f455f19/federatedscope/core/trainers/torch_trainer.py#L199, but kindly check whether the model-related hyperparams (e.g., dropout) are updated anywhere...
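As a starting point, here is a minimal sketch of the kind of helper I have in mind; the name update_model_hyperparams, the model_cfg argument, and the attribute handling are illustrative assumptions, not existing FederatedScope code.

```python
from torch import nn


def update_model_hyperparams(model: nn.Module, model_cfg) -> None:
    """Hypothetical helper: push cfg-level hyperparams into a live model.

    Handles both the attribute style used in cv/model/cnn.py (a float
    stored in model.dropout) and regular nn.Dropout modules.
    """
    new_p = getattr(model_cfg, 'dropout', None)
    if new_p is None:
        return
    # attribute-style dropout, e.g. self.dropout used with F.dropout(...)
    if isinstance(getattr(model, 'dropout', None), float):
        model.dropout = new_p
    # module-style dropout: update the probability in place
    for module in model.modules():
        if isinstance(module, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            module.p = new_p
```

Such a function could be called from the FedEx client right after the sampled values are merged into self.trainer.cfg, analogous to the existing trainer.update() call.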
Thanks for the suggestion. We noticed that when the cfg changes, the correct implementation is to re-instantiate the model (and the corresponding optimizer). We will fix it ASAP.
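A hedged sketch of that fix direction, assuming a generic build_model callable and attribute names such as trainer.ctx.model and cfg.train.optimizer.lr (these are assumptions for illustration, not the exact FederatedScope API):

```python
import copy

import torch


def rebuild_from_cfg(trainer, build_model, new_cfg):
    """Re-instantiate the model (and its optimizer) when the sampled cfg
    changes, instead of only mutating trainer.cfg."""
    old_state = trainer.ctx.model.state_dict()           # keep the learned weights
    new_model = build_model(new_cfg.model)                # fresh module with the new dropout etc.
    new_model.load_state_dict(old_state, strict=False)    # carry the weights over
    trainer.ctx.model = new_model
    # the optimizer must be re-created so it tracks the new parameters
    trainer.ctx.optimizer = torch.optim.SGD(
        new_model.parameters(), lr=new_cfg.train.optimizer.lr)
    trainer.cfg = copy.deepcopy(new_cfg)
```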