diaoenmao / HeteroFL-Computation-and-Communication-Efficient-Federated-Learning-for-Heterogeneous-Clients

[ICLR 2021] HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients
MIT License
154 stars 33 forks

Dynamic Model #1

Closed Zihao-Kevin closed 3 years ago

Zihao-Kevin commented 3 years ago

Hello. When I use the dynamic assignments (i.e., the example 'Train CIFAR10 dataset (Non-IID 2 classes) with ResNet model, 10 users, active rate 0.1, model split 'Dynamic', model split mode 'a-b-c (uniform)', GroupNorm, Scaler (False), Masked CrossEntropy (False)'), some problems arise.

First of all, the example has 10 users but the split mode is 'a-b-c', so does this mean each model complexity level gets 3.33 users?

Secondly, line 131 in 'utils.py' has a bug: with 'a-b-c' there are no characters after the complexity letter 'm' (unlike 'a2' or 'b8'), so 'm[1:]' is empty.

Can you help to answer? Thanks a lot.

diaoenmao commented 3 years ago

Thank you for pointing out the bug. It was a typo in the README, and I have fixed it from 'a-b-c' to 'a1-b1-c1'. In this case the proportion for each of 'a', 'b', and 'c' is 1/(1+1+1) = 1/3. The model rate for each client is sampled in line 17 of fed.py.
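To make the fix concrete, here is a minimal sketch of how a split string like 'a1-b1-c1' might be parsed into model rates and mixing proportions. The `MODEL_RATE` mapping and the function name are illustrative assumptions, not the repo's actual code; the point is that the digits after each complexity letter are required, which is why 'a-b-c' breaks (`m[1:]` is empty and `int('')` raises a `ValueError`).

```python
# Assumed mapping from complexity label to model rate (illustrative only;
# the real values live in the repo's config, not here).
MODEL_RATE = {'a': 1.0, 'b': 0.5, 'c': 0.25}

def parse_model_split(mode):
    """Parse a split string such as 'a1-b1-c1' into (rates, proportions)."""
    rates, counts = [], []
    for m in mode.split('-'):
        rates.append(MODEL_RATE[m[0]])  # first char: complexity letter -> rate
        counts.append(int(m[1:]))       # remaining chars: relative weight
                                        # ('a-b-c' fails here: int('') raises
    total = sum(counts)
    proportions = [c / total for c in counts]
    return rates, proportions

rates, props = parse_model_split('a1-b1-c1')
print(rates)  # [1.0, 0.5, 0.25]
print(props)  # each proportion is 1/(1+1+1) = 1/3
```

Each client's model rate would then be sampled from `rates` with probabilities `props` every round, which matches the 1/3 proportion described above for 'a1-b1-c1'.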