lunan0320 / Federated-Learning-Knowledge-Distillation

A repository implementing several federated learning methods: FedAvg, FedMD, FedProto, FedProx, and FedHKD.

Question about ServerFedHKD #4

Open Fruit0218 opened 2 days ago

Fruit0218 commented 2 days ago

Hello, author! Thank you very much for providing the code. While reading it, I noticed that the global_knowledge_aggregation method in serverFedHKD is never executed: the final global knowledge (global_features and globalsoft_prediction) is taken from the local knowledge (local_features and localsoft_prediction) of the last user in idxs_users, rather than from an aggregation over the local knowledge of all users in idxs_users. Is this intentional? I'm not very familiar with this part, and I look forward to your reply.

lunan0320 commented 2 days ago

Thank you for your question. In the first round it does behave as you describe, but from the second round onward the global knowledge is iteratively updated, which is an alternative aggregation scheme in federated learning; it is simply not aggregated in a single explicit step in this implementation. If you want explicit aggregation, you can refer to the ServerFedMD part.