princeton-nlp / CoFiPruning

[ACL 2022] Structured Pruning Learns Compact and Accurate Models https://arxiv.org/abs/2204.00408
MIT License

Error in finetune with pruned model -- AttributeError: 'NoneType' object has no attribute 'forward' #33

Closed: zoetu closed this issue 1 year ago

zoetu commented 2 years ago

Hello, @xiamengzhou! When I use your script to fine-tune the pruned model, I run into the error below, but I have no idea what causes it. What's wrong with my setup?

[screenshot: traceback ending in AttributeError: 'NoneType' object has no attribute 'forward']

TASK=MRPC
SUFFIX=sparsity0.95
EX_CATE=CoFi
SPARSITY=0.95
DISTILL_LAYER_LOSS_ALPHA=0.9
DISTILL_CE_LOSS_ALPHA=0.1
LAYER_DISTILL_VERSION=4
SPARSITY_EPSILON=0.01
DISTILLATION_PATH=/home/tt6232/KdQuant/teacher-model/bert-base-uncased/

PRUNED_MODEL_PATH=./out/$TASK/$EX_CATE/${TASK}_${SUFFIX}/best
PRUNING_TYPE=None # Setting the pruning type to be None for standard fine-tuning.
LEARNING_RATE=3e-5

bash scripts/run_CoFi.sh $TASK $SUFFIX $EX_CATE $PRUNING_TYPE $SPARSITY $DISTILLATION_PATH $DISTILL_LAYER_LOSS_ALPHA $DISTILL_CE_LOSS_ALPHA $LAYER_DISTILL_VERSION $PRUNED_MODEL_PATH $LEARNING_RATE &
zhangzhenyu13 commented 2 years ago

This is because the l0 module is not initialized in fine-tuning (FT) mode; the experiment scripts are manual and do not check which mode is active. To address the problem, modify the project's trainer by adding an if-condition so that l0.forward is only called when the l0 module is not None.
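
A minimal sketch of that guard, assuming the trainer keeps the module on self.l0_module and calls its forward during each training step (the exact attribute name, call site, and arguments in this repo's trainer may differ):

zs = None
if self.l0_module is not None:
    # The L0 module only exists when pruning is enabled. With
    # PRUNING_TYPE=None (standard fine-tuning) it is never initialized,
    # so calling forward on it raises:
    # AttributeError: 'NoneType' object has no attribute 'forward'
    zs = self.l0_module.forward(training=True)
# Any downstream code should likewise tolerate zs being None in FT mode.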

zoetu commented 2 years ago

> This is because the l0 module is not initialized in fine-tuning (FT) mode; the experiment scripts are manual and do not check which mode is active. To address the problem, modify the project's trainer by adding an if-condition so that l0.forward is only called when the l0 module is not None.

Thank you! I will try it.

xiamengzhou commented 1 year ago

@zhangzhenyu13 Thanks! Yes, that is correct.