jiayunz / RecipFL

Code for WWW 2024 paper "How Few Davids Improve One Goliath: Federated Learning in Resource-Skewed Edge Computing Environments"

I need some help to replay this experiment #1

Open jacazjx opened 1 week ago

jacazjx commented 1 week ago

First of all, thank you for sharing your code. I have successfully replayed your experiments with ResNet and DenseNet, but I have run into trouble replaying BERT.

  1. I created the conda environment from requirements.txt, but I cannot import AutoAdapterModel from transformers. However, after installing the library named "adapter-transformers" and importing AutoAdapterModel from adapters, it works.
  2. I downloaded the weights of bert-base and distilbert from Hugging Face (https://huggingface.co/distilbert/distilbert-base-uncased/tree/main). But when I run the code on the MNLI dataset, there is a bug in ./utils/hypernetwork/graph/_named_modules: it keeps reporting that the length of self.model.named_parameters() does not equal the length of modules. I debugged the issue and found that self.model.named_parameters() is always missing one layer, such as heads.default.0.weight or heads.default.3.weight, but I cannot fix it. I hope the authors can help me resolve this. Really, thanks. 🤝🤝🤝
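To pin down which entry is missing, a small helper can diff the two name lists instead of only comparing their lengths. This is a hypothetical debugging sketch (the name `diff_names` and its arguments are mine, not from RecipFL); in the actual code you would pass it `[n for n, _ in self.model.named_parameters()]` and the module-name list built in `_named_modules`:

```python
# Hypothetical helper: report which names appear in one list but not the
# other, so the single missing layer (e.g. a "heads.default.*" weight)
# can be identified instead of just seeing a length mismatch.
def diff_names(param_names, module_names):
    params, modules = set(param_names), set(module_names)
    missing_from_params = sorted(modules - params)  # expected by the graph, absent from named_parameters()
    extra_in_params = sorted(params - modules)      # present in named_parameters(), unknown to the graph
    return missing_from_params, extra_in_params
```

Printing both lists at the point where the length check fails usually makes the cause obvious, e.g. an adapter prediction head that was registered as a module but whose parameters were not (or vice versa).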
jiayunz commented 1 week ago

Hi,

Could you install adapter-transformers==3.1.0 and see if it works?
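For context (my understanding, not confirmed by the authors): adapter-transformers is a fork that installs under the `transformers` package name, so with it pinned, `from transformers import AutoAdapterModel` should resolve directly; the separate newer `adapters` package uses a different import path. A minimal environment fix under that assumption:

```shell
# Replace the incompatible packages with the pinned fork; it installs
# as the `transformers` package, providing AutoAdapterModel there.
pip uninstall -y adapters transformers
pip install adapter-transformers==3.1.0
```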