Xilinx / QNN-MO-PYNQ

About the Tinier YOLO #27

Closed. BinhBa closed this issue 5 years ago.

BinhBa commented 5 years ago

Hi, I currently want to train new weights for the Tinier-YOLO topology, so I trained a network with the same topology as the one you provide in the example and tried to match your work using the BNN weight generator:

  1. I used Keras/TensorFlow to train the network and converted the weights to the Keras/Theano format (a rough sketch of the kernel reordering I mean is at the end of this comment).
  2. I then used the BNNWeightReader to read them and generate the weights. A few things are still unclear to me:
  3. The topology and the Jupyter notebook show that your network has 9 layers, numbered 0 to 8, but in the Tinier-YOLO params folder there are 10 sets of weights, numbered 1 to 10. Did you remove layer 0, or just shift the numbering from 0-9 to 1-10?
  4. As I read in other issues, the way you calculate the peCounts and simdCounts maps has not been made public yet. Could you please provide the peCounts and simdCounts maps you use for this topology? I have tried many values and none of them produces weights of the same size as the originals.
  5. If I want to add another conv layer, is it enough to append it to tinier-yolo-layers.json and put the binparams into the binparam-tinier-yolo-nopool folder? Hoping to hear from you soon. Thanks,
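
For reference, this is roughly the kernel reordering I attempt in step 1 (plain numpy; the axis order and the optional spatial flip are my own assumptions about the TensorFlow-to-Theano conversion, not something taken from this repo):

```python
import numpy as np

def tf_kernel_to_theano(kernel_tf, flip=True):
    """Reorder a conv kernel from TensorFlow layout (kh, kw, in_ch, out_ch)
    to Theano layout (out_ch, in_ch, kh, kw). The spatial flip accounts for
    Theano using true convolution instead of cross-correlation; depending on
    the Keras version it may not be needed."""
    kernel_th = np.transpose(kernel_tf, (3, 2, 0, 1))
    if flip:
        kernel_th = kernel_th[:, :, ::-1, ::-1]
    return kernel_th

# Example: a 3x3 conv with 128 input and 256 output channels
w_tf = np.random.randn(3, 3, 128, 256).astype(np.float32)
print(tf_kernel_to_theano(w_tf).shape)  # (256, 128, 3, 3)
```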
mohdumar644 commented 5 years ago
  1. The framework does not matter. Have you incorporated quantization into your training?
  2. You have to modify the finnthesizer from BNN, as one is not available here.
  3. There are 10 sets because the maximum number of output channels per layer is 256, so layers with 512 output channels are split into 2 layers of 256 each (2 iterations). Hence the 10 sets represent 64, 64, 128, 256, 256/256, 256/256, 256/256 (see the sketch after this list).
  4. For QNN the architecture is fixed: PE = 32, SIMD = 64 for all layers.
  5. Yes, and also modify the output buffer sizes in your notebook.
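
To illustrate points 3 and 4, here is a rough sketch of the fixed folding and the output-channel splitting (a hypothetical helper, not the actual finnthesizer code):

```python
import numpy as np

# Point 4: the folding is the same for every layer, so no per-layer
# peCounts/simdCounts map is needed.
PE, SIMD = 32, 64
MAX_OUT_CH = 256  # point 3: per-iteration output-channel limit

def split_layer(weights):
    """Split an (out_ch, in_ch, kh, kw) weight tensor into chunks of at most
    256 output channels, so a 512-channel conv becomes two 256-channel sets."""
    out_ch = weights.shape[0]
    return [weights[i:i + MAX_OUT_CH] for i in range(0, out_ch, MAX_OUT_CH)]

# Output channels of the offloaded conv layers (the 512-channel ones appear
# as 256/256 in the list above); the weight shapes here are dummies.
out_channels = [64, 64, 128, 256, 512, 512, 512]
param_sets = []
for ch in out_channels:
    dummy = np.zeros((ch, 3, 3, 3), dtype=np.int8)
    param_sets.extend(split_layer(dummy))

print(len(param_sets))  # 10 sets of weights, matching params indices 1..10
```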
BinhBa commented 5 years ago

Hi @mohdumar644, thank you for your explanation, and thank you again. The problem is probably that I don't split the layers; for the quantization I use a binary layer for Keras with W1A1.
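
By W1A1 I mean roughly the following deterministic sign binarization of weights and activations; this snippet is only my own illustration, not the exact Keras layer I use:

```python
import numpy as np

def binarize(x):
    """Deterministic 1-bit quantization: map real values to {-1, +1},
    with zeros mapped to +1 (the same rule applied to activations gives A1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

w = np.random.randn(2, 4).astype(np.float32)
print(binarize(w))
```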

KaTaiHo commented 5 years ago

@LovExtreme were you ever able to get it up and running? If so, could you share your script? :D