I am applying the same absmean quantization method from your paper to my own trained neural network. Quantization drops the network's accuracy from 95% to 10%, even though I followed all the tips in your follow-up paper. I also realized that a ternary network can only output multiples of its inputs, so if I want a particular output, I have to hope a suitable input exists to produce it mathematically.
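For reference, this is the absmean step as I understand it from the paper: scale the weight matrix by its mean absolute value, then round and clip to the ternary set {-1, 0, +1}. A minimal NumPy sketch (the epsilon and the exact rounding mode are my assumptions, not taken from the paper):

```python
import numpy as np

def absmean_quantize(W, eps=1e-6):
    """Ternary (absmean) quantization: scale W by its mean absolute
    value, then round each entry and clip to {-1, 0, +1}."""
    gamma = np.mean(np.abs(W)) + eps           # absmean scale (eps avoids div-by-zero)
    W_q = np.clip(np.round(W / gamma), -1, 1)  # ternary weight matrix
    return W_q, gamma                          # gamma dequantizes: W ~ gamma * W_q

W = np.array([[0.9, -0.05, 0.4],
              [-1.2, 0.02, 0.7]])
W_q, gamma = absmean_quantize(W)
print(W_q)  # [[ 1.  0.  1.]
            #  [-1.  0.  1.]]
```

Note how entries near zero collapse to 0 outright; this is where my accuracy loss seems to come from when many weights are small relative to the absmean scale.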
Please release your code so the rest of us can verify your findings.
Thank you, Tyler Suard