THUDM / SwissArmyTransformer

SwissArmyTransformer is a flexible and powerful library to develop your own Transformer variants.
https://THUDM.github.io/SwissArmyTransformer
Apache License 2.0
951 stars 90 forks

fix lost bias when quantize from pre-trained model parameters #135

Closed jimmieliu closed 9 months ago

jimmieliu commented 12 months ago

Previously, if you quantized a model from pre-trained parameters, the bias parameters of the Linear layers were lost and you would end up with an all-zero bias. This PR fixes the issue where the Linear bias is dropped during quantization.
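For context, a minimal sketch of the kind of fix described here: when a quantized linear module is built from a pre-trained `nn.Linear`, the pre-trained bias should be copied over rather than left at its fresh (zero) initialization. The class name, arguments, and quantization scheme below are illustrative assumptions, not the actual SwissArmyTransformer API.

```python
import torch
import torch.nn as nn


class QuantizedLinear(nn.Module):
    """Sketch of an int8 weight-quantized Linear layer (hypothetical names)."""

    def __init__(self, pretrained_linear: nn.Linear, bit_width: int = 8):
        super().__init__()
        weight = pretrained_linear.weight.data
        # Per-output-channel symmetric quantization of the weight matrix.
        scale = weight.abs().max(dim=1, keepdim=True).values / (2 ** (bit_width - 1) - 1)
        self.register_buffer("weight_int", torch.round(weight / scale).to(torch.int8))
        self.register_buffer("scale", scale)

        # The fix described in this PR: carry the pre-trained bias over
        # instead of leaving an all-zero bias on the quantized module.
        if pretrained_linear.bias is not None:
            self.bias = nn.Parameter(pretrained_linear.bias.data.clone())
        else:
            self.register_parameter("bias", None)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Dequantize on the fly; real kernels would fuse this step.
        weight = self.weight_int.to(x.dtype) * self.scale.to(x.dtype)
        return nn.functional.linear(x, weight, self.bias)
```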

1049451037 commented 9 months ago

@Sleepychord Does this bug exist? Could it be the real cause of the problems with quantizing a ViT?

Sleepychord commented 9 months ago

Thank you for catching this bug!

jimmieliu commented 9 months ago

You're welcome, and thank you all for your great work :)