pmichel31415 / are-16-heads-really-better-than-1

Code for the paper "Are Sixteen Heads Really Better than One?"

Is the code still able to run? #9

Open bing0037 opened 3 years ago

bing0037 commented 3 years ago

Hi,

I am trying to reproduce your BERT results. I followed the prerequisites:

# Pytorch pretrained BERT
git clone https://github.com/pmichel31415/pytorch-pretrained-BERT
cd pytorch-pretrained-BERT
git checkout paul
cd ..
# Install pytorch-pretrained-BERT:
cd pytorch-pretrained-BERT
pip install .
cd ..
# Run the code:
bash experiments/BERT/heads_ablation.sh MNLI

But got this error:

02:06:57-INFO: Weights of BertForSequenceClassification not initialized from pretrained model: ['classifier.weight', 'classifier.bias']
02:06:57-INFO: Weights from pretrained model not used in BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']
Traceback (most recent call last):
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 582, in <module>
    main()
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 275, in main
    model.bert.mask_heads(to_prune)
  File "/home/guest/anaconda3/envs/huggingface_env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 594, in __getattr__
    type(self).__name__, name))
AttributeError: 'DataParallel' object has no attribute 'bert'

1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or suggestions?

pmichel31415 commented 3 years ago

Hi @bing0037, I haven't run this code in a while, but it used to work. My first guess would be an incompatibility with a newer version of PyTorch. Can you try again in an environment with PyTorch 1.0 or 1.1?

If that doesn't solve it, then I'm not too sure... I wasn't using DataParallel in the code, so I'm not sure why it shows up in the error message. Let me know how changing the PyTorch version goes.
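
For reference, the error itself is generic DataParallel behavior: the wrapper only forwards nn.Module machinery, not arbitrary attributes of the wrapped model, so anything like model.bert has to go through model.module. A minimal sketch reproducing the failure mode (the Toy class is hypothetical, standing in for BertForSequenceClassification):

import torch.nn as nn

# Hypothetical stand-in for BertForSequenceClassification
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.bert = nn.Linear(4, 4)  # plays the role of the `bert` submodule

wrapped = nn.DataParallel(Toy())
print(hasattr(wrapped, "bert"))         # False: DataParallel does not forward attributes
print(hasattr(wrapped.module, "bert"))  # True: the original model lives in .module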

caidongqi commented 2 years ago

I met the same problem today and solved it by adding the following line in pytorch-pretrained-BERT/examples/run_classifier.py:

# around line 260
model = torch.nn.DataParallel(model)
+ model = model.module  # unwrap DataParallel so attributes like model.bert resolve
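
One caveat with this fix: assigning model = model.module discards the DataParallel wrapper entirely, so multi-GPU execution is lost. A gentler variant is to unwrap only at the attribute access, following the hasattr(model, 'module') idiom the example scripts use elsewhere (e.g., when saving the model). A minimal sketch, with hypothetical toy classes standing in for the real model:

import torch.nn as nn

class ToyBert(nn.Module):
    # Hypothetical stand-in for the BERT encoder and its mask_heads method
    def mask_heads(self, to_prune):
        print("masking heads:", to_prune)

class ToyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.bert = ToyBert()

model = nn.DataParallel(ToyClassifier())

# Unwrap only where model-specific attributes are needed
inner = model.module if hasattr(model, "module") else model
inner.bert.mask_heads({0: [1, 2]})  # resolves; model.bert.mask_heads would raise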

Hope it helps.

Reference:

  1. https://blog.csdn.net/weixin_41990278/article/details/105127101
  2. https://zhuanlan.zhihu.com/p/92759707

vrunm commented 1 year ago

Hi @caidongqi, I tried changing the run_classifier.py file as you described, but I ran into the same errors as @bing0037. I was also trying to reproduce the BERT results. I am using Python 3.8 and PyTorch 1.8.0.

1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or solutions to this?