wfwf10 / Feature-based-Federated-Transfer-Learning

Communication-Efficient Feature-based Federated Transfer Learning

question #1

Open twythebest opened 2 months ago

twythebest commented 2 months ago

Sorry for bothering you. In your paper, only a fraction 1.28 × 10^(−3) of the clients joins the training in each iteration, but in your code you feed the whole train_loader into each training iteration. Where in your code can I find the part that makes only a 1.28 × 10^(−3) fraction of the clients join the training?

wfwf10 commented 2 months ago

Thanks for asking! In the experiment on CIFAR10 - VGG16, we have 50000 training images. We assume there are 6250 clients in total, and each client has 8 different images. In the paper, we said "each iteration takes a fraction C = 1.28 × 10^(−3) of all clients", so in each iteration we randomly pick 6250 × (1.28 × 10^(−3)) = 8 clients out of all 6250 clients. That means we are randomly picking 8 clients × 8 images/client = 64 images in each global iteration and using their average gradient to update the central parameters. In both Python files for the CIFAR10 - VGG16 experiments, you will find "batch_size = 64" in the code.
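
For readers who want to check that arithmetic, here is a minimal sketch of the client-sampling calculation and how it reduces to the batch size (the variable names and the plain CIFAR10 DataLoader setup are illustrative assumptions, not the repository's actual code):

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Assumed setup mirroring the CIFAR10 - VGG16 experiment described above:
# 50000 training images split evenly across 6250 clients, 8 images each.
num_clients = 6250
images_per_client = 8
C = 1.28e-3  # fraction of clients participating per global iteration

clients_per_iter = round(num_clients * C)          # 6250 * 1.28e-3 = 8
batch_size = clients_per_iter * images_per_client  # 8 * 8 = 64

# Randomly picking 8 clients (8 images each) per iteration amounts to
# drawing one shuffled batch of 64 images, which is why the code can
# simply set batch_size = 64 in the DataLoader.
train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())
train_loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
```

Averaging the gradient over one such 64-image batch then corresponds to averaging the per-client gradients of the 8 sampled clients, matching the "batch_size = 64" the author points to in both Python files.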