Closed josebummer closed 3 years ago
Hi, I have a question about the train method in the clients. Normally, each epoch iterates over the complete dataset in mini-batches, but I have seen in your code that in each local epoch only a single batch is trained. Is this correct? Best regards.

Hi, following the algorithm and using stochastic sampling, we draw only one mini-batch per local epoch. When the number of local epochs is large enough, the sampled batches cover the whole training set, so varying the number of local epochs is effectively similar to one full pass over the complete data.

Thank you very much for your prompt response. Perfect, now I understand what you are saying.
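For anyone else reading this, a minimal sketch of the sampling scheme described above, with plain Python standing in for the actual training loop. The names `local_train` and `step_fn` are hypothetical; `step_fn` is a placeholder for a single SGD update on one batch:

```python
import random

def local_train(data, num_local_epochs, batch_size, step_fn):
    # Hypothetical sketch: each local epoch samples ONE random
    # mini-batch, rather than iterating over the whole dataset.
    for _ in range(num_local_epochs):
        batch = random.sample(data, batch_size)  # one mini-batch per epoch
        step_fn(batch)  # stand-in for a single SGD update

# Illustration: with enough local epochs, the sampled batches end up
# covering (almost) the entire training set, which is why this is
# similar to one pass over the complete data.
random.seed(0)
data = list(range(100))
seen = set()
local_train(data, num_local_epochs=200, batch_size=10,
            step_fn=lambda batch: seen.update(batch))
```

After 200 local epochs of batch size 10 over 100 examples, `seen` covers essentially the whole dataset.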