question 1: Are all Q-series layers implemented in C++?
question 2: A QActivation layer with backward_only=True can back-propagate the quantization gradient. Does that mean the gradient of the quantization function is not computed separately in QConvolution and QFullyConnected?
Hi Peng,
to your questions:
Q1: Yes, the implementation is in C++; otherwise we could not create the binarized models. You can check the model size here.
Q2: Yes.
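Since the Q2 answer is just "yes", a minimal sketch may help illustrate the idea: the quantization layer itself owns both the forward binarization and the backward pass, so the convolution and fully-connected layers never need a quantization gradient of their own. The sketch below assumes a sign quantizer with a clipped straight-through estimator (STE), a technique commonly used in binarized networks; the function names are illustrative, not the actual BMXNet/C++ API.

```python
import numpy as np

def quantize_forward(x):
    # Binarize to {-1, +1}; zero is mapped to +1 by convention here.
    return np.where(x >= 0, 1.0, -1.0)

def quantize_backward(x, grad_output):
    # STE: d(sign)/dx is zero almost everywhere, so instead pass the
    # incoming gradient straight through, clipped to the region |x| <= 1.
    return grad_output * (np.abs(x) <= 1.0)

x = np.array([-1.5, -0.3, 0.0, 0.7, 2.0])
g = np.ones_like(x)

print(quantize_forward(x))      # [-1. -1.  1.  1.  1.]
print(quantize_backward(x, g))  # [0. 1. 1. 1. 0.]
```

With a layer like this in the graph, the preceding layers simply receive the STE-filtered gradient, which matches the description of QActivation with backward_only=True handling the quantization gradient on its own.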
Hi guys,
May I ask some questions?
question 1: Are all Q-series layers implemented in C++?
question 2: A QActivation layer with backward_only=True can back-propagate the quantization gradient. Does that mean the gradient of the quantization function is not computed separately in QConvolution and QFullyConnected?
Thanks a lot.
Best Regards, Peng