mohdumar644 closed this issue 5 years ago
We're planning to release the training scripts by the end of this week - I hope this will answer all of your questions.
Can you link a research paper, if you have implemented that?
Yes, it is this one. The small change is that we do not allow the largest negative value to be represented by the weights.
I.e., for 2-bit weights we allow the values [-1, 0, 1]
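For anyone else reading along, a minimal sketch of what that symmetric scheme could look like (my own reconstruction, not the actual training code; the function name is mine):

```python
import numpy as np

def quantize_weights(w, bits=2):
    # Symmetric uniform quantization that drops the largest negative
    # level: for 2-bit weights the allowed values are [-1, 0, 1]
    # (i.e. -2 is never used). Assumes w is already in [-1, 1].
    n = 2 ** (bits - 1) - 1          # n = 1 for 2-bit weights
    return np.clip(np.round(w * n), -n, n) / n

print(quantize_weights(np.array([-1.0, -0.4, 0.2, 0.9])))
```

With 2 bits this maps everything onto the three levels mentioned above; with 3 bits it would give the seven levels [-3/3, ..., 3/3].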
Sorry for the delay, we will update the repo with this training code, but some other priorities have taken over in the short term.
Thanks, but your paper does not define the quantization function exactly, and I could not decipher it from the finnthesizer. Are there any more hints?
What do the s0.5 and s0.25 imply in the given .npz file names?
There is actually no (relevant) meaning to those. It's related to the topological description of CNV & LFC - these values are simply hardcoded into the cnv/lfc.py files respectively.
Apologies again for the delay, we plan to upload the training scripts for these this week.
Thanks. I was able to train a W1A2 and a W1A4 network in Theano, using thresholds of -0.5/0.5 and equidistant thresholds respectively. I would still like to see your scripts, though.
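In case it helps the discussion, here is roughly what I mean by equidistant thresholds for the multi-bit activations (a sketch under my own assumptions about the value range, not code from BNN-PYNQ):

```python
import numpy as np

def threshold_activation(x, bits=2):
    # Map pre-activations in [-1, 1] to integer levels 0..2^bits - 1
    # by counting how many equidistant thresholds each value crosses.
    levels = 2 ** bits
    # Interior thresholds, equally spaced in [-1, 1]:
    # for bits=2 these are [-0.5, 0.0, 0.5].
    thresholds = np.linspace(-1.0, 1.0, levels + 1)[1:-1]
    return np.sum(x[..., None] >= thresholds, axis=-1)

print(threshold_activation(np.array([-0.9, -0.2, 0.4, 0.9])))
```

For W1A2 that gives four activation levels; W1A4 works the same way with fifteen interior thresholds.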
I noticed an update to the BNN-PYNQ library adding support for WnAn.
Can you please give a pointer as to which quantization scheme you are using, and in what framework?
From the finnthesizer, it does not look like you are using DoReFa-Net as you did in QNN-MO-PYNQ.
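For reference, this is the DoReFa-Net k-bit weight quantization from the paper that I was comparing against (shown only for comparison; this is not the BNN-PYNQ scheme):

```python
import numpy as np

def dorefa_weights(w, bits=2):
    # DoReFa-Net k-bit weight quantization:
    # w_q = 2 * quantize_k( tanh(w) / (2 * max|tanh(w)|) + 0.5 ) - 1,
    # where quantize_k rounds onto 2^k - 1 uniform levels in [0, 1].
    def quantize_k(x, k):
        n = 2 ** k - 1
        return np.round(x * n) / n
    t = np.tanh(w)
    return 2 * quantize_k(t / (2 * np.abs(t).max()) + 0.5, bits) - 1
```

Note that, unlike the scheme described earlier in this thread, DoReFa's 2-bit levels are [-1, -1/3, 1/3, 1] and do include the extremes on both sides.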