mahdibeit / FedPFT

Parametric Feature Transfer: One-shot Federated Learning with Foundation Models
Apache License 2.0

questions #1

Open chenrxi opened 2 months ago

chenrxi commented 2 months ago

Hi @mahdibeit,

  1. Your proposed baseline is similar to CCVR [No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data]. Why don't you compare against it?
  2. Could you provide the training code for the U-Net-based conditional denoising diffusion models?
mahdibeit commented 2 months ago

Hi @chenrxi,

Thank you so much for reaching out and showing interest in our work. Please find below the answers to your questions.

  1. Yes, our work is similar to CCVR. In our latest version (which has not yet been uploaded to arXiv), we include a direct comparison with CCVR, and the results show that FedPFT outperforms CCVR by 7 percent because it allows GMMs with k > 1 (see the sketch after this list). You can also infer this result from Figure 6 (left) of the paper.

  2. We have not yet open-sourced our code. However, for the reconstruction, we utilized the implementation of the inversion attack of Teterwak, Zhang, Krishnan, and Mozer (ICML'21): Understanding Invariance via Feedforward Inversion of Discriminatively Trained Classifiers.
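
For context on point 1, here is a minimal sketch of the GMM-based feature transfer being discussed. The function names, feature dimensions, component count, and the use of scikit-learn with a logistic-regression head are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch (not the authors' released code): each client fits a
# per-class GMM with k components to its frozen foundation-model features,
# and the server trains a classifier from scratch on features sampled from
# the collected GMMs. The GMM parameters are the only thing communicated,
# which is what makes the scheme one-shot.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

def client_fit_gmms(features, labels, k=3):
    """Fit one k-component GMM per class on a single client's features."""
    gmms = {}
    for c in np.unique(labels):
        gmm = GaussianMixture(n_components=k, covariance_type="diag")
        gmm.fit(features[labels == c])
        gmms[c] = gmm
    return gmms

def server_train_classifier(all_client_gmms, samples_per_class=500):
    """Sample synthetic features from every client's GMMs and train a
    classifier on the pooled synthetic dataset."""
    xs, ys = [], []
    for gmms in all_client_gmms:
        for c, gmm in gmms.items():
            synth, _ = gmm.sample(samples_per_class)
            xs.append(synth)
            ys.append(np.full(len(synth), c))
    X, y = np.concatenate(xs), np.concatenate(ys)
    return LogisticRegression(max_iter=1000).fit(X, y)

# Toy usage with random "features" standing in for foundation-model
# embeddings (shapes are illustrative only):
rng = np.random.default_rng(0)
feats = rng.normal(size=(500, 64)).astype(np.float32)
labs = rng.integers(0, 10, size=500)
clf = server_train_classifier([client_fit_gmms(feats, labs, k=2)])
```

With k = 1 this roughly collapses to one Gaussian per class, which is essentially the virtual-representation setup that CCVR calibrates with; allowing k > 1 lets the GMM capture multi-modal class features, which is what the 7-percent gain above is attributed to.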

chenrxi commented 2 months ago

Hi @mahdibeit, thank you for your kind reply.

  1. That resolves my concern. However, in my opinion, the performance gain may come from training the classifier from scratch, as opposed to CCVR's calibration approach.
  2. Yes, I understand why the code is not open-source. In Section E.3 (Experimental details), you mention that you trained U-Net-based conditional denoising diffusion models on the CIFAR-10 training set. Are they also based on that implementation? If possible, could you provide more details? Also, the results in Table 6 and Figure 11 may not relate to it.