microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI
MIT License
19.62k stars 2.5k forks

Unable to get similar results using the absmean method mentioned in the paper #1524

Open Tylersuard opened 5 months ago

Tylersuard commented 5 months ago

I am applying the same absmean quantization method from your paper to my own trained neural network, and it drops accuracy from 95% to 10%, even though I followed all the tips in your follow-up paper. I also realized that a ternary network can only output signed sums of its inputs (each weight being -1, 0, or 1), so to get a particular output value I have to hope the right input values are present to produce it mathematically.
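For context, the absmean quantization described in the BitNet b1.58 paper scales each weight matrix by its mean absolute value, then rounds and clips to the ternary set {-1, 0, 1}. Here is a minimal NumPy sketch of that formula as I understand it (the function name and the epsilon value are my own, not from the paper):

```python
import numpy as np

def absmean_quantize(W, eps=1e-6):
    """Quantize a weight matrix to {-1, 0, 1} via the absmean rule.

    gamma = mean(|W|); W_q = clip(round(W / (gamma + eps)), -1, 1).
    Returns the ternary matrix and the scale gamma.
    """
    gamma = np.abs(W).mean()
    Wq = np.clip(np.round(W / (gamma + eps)), -1, 1)
    return Wq, gamma

# Example: weights near the scale round to +/-1, small weights snap to 0.
W = np.array([[0.1, -2.0], [0.5, 1.5]])
Wq, gamma = absmean_quantize(W)
```

Note that this is post-hoc quantization of a trained network; my understanding is that the paper trains with the quantization in the loop (quantization-aware training with a straight-through estimator), which may explain part of the accuracy gap I am seeing.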

Please release your code so the rest of us can verify your findings.

Thank you, Tyler Suard