AlexeyAB / yolo2_light

Light version of convolutional neural network Yolo v3 & v2 for objects detection with a minimum of dependencies (INT8-inference, BIT1-XNOR-inference)
MIT License

INT8 cudnn with 1st layer return error #54

Open jasonwu1977 opened 5 years ago

jasonwu1977 commented 5 years ago

Currently yolo2_light supports quantization only for certain layers: not the 1st layer, not the layer right before a yolo (region) layer, and not size=1 (1x1) layers.
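The skip rules above can be sketched as a small predicate. This is a minimal illustration of the described behavior, not the actual yolo2_light code; the type and function names here are hypothetical.

```c
#include <assert.h>

/* Illustrative layer model (hypothetical, not yolo2_light's structs). */
typedef enum { CONV, YOLO, REGION, OTHER } layer_type;

typedef struct {
    layer_type type;
    int size;   /* convolution kernel size */
} layer;

/* Return 1 if layer i of an n-layer network would be INT8-quantized
 * under the rules described above, 0 otherwise. */
int should_quantize(const layer *net, int n, int i) {
    if (net[i].type != CONV) return 0;   /* only convolutional layers */
    if (i == 0) return 0;                /* never the 1st layer */
    if (net[i].size == 1) return 0;      /* never 1x1 convolutions */
    /* never the layer feeding yolo/region (coordinates stay fp32) */
    if (i + 1 < n && (net[i + 1].type == YOLO || net[i + 1].type == REGION))
        return 0;
    return 1;
}
```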

I tried to do a full INT8 quantization, but when I changed the 1st layer of YOLOv3-tiny, it fails with:

CUDNN failure
Error: 3 - CUDNN_STATUS_BAD_PARAM

My questions are:

  1. Is it possible to quantize all the layers? I don't care about mAP loss at this stage.
  2. Is it possible to fix the CUDNN error?
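One plausible cause of the BAD_PARAM failure on the 1st layer (an assumption on my part, not confirmed in this thread): cuDNN's INT8 convolution paths require the input and output channel counts to be multiples of 4, and the first convolution of YOLOv3-tiny takes a 3-channel RGB image, so its descriptors would be rejected. A trivial check of that constraint:

```c
#include <assert.h>

/* Assumption: cuDNN INT8 convolutions need in/out channels
 * divisible by 4 (the INT8x4-style vectorized layouts). */
int int8_conv_supported(int in_channels, int out_channels) {
    return (in_channels % 4 == 0) && (out_channels % 4 == 0);
}
```

Under that assumption, `int8_conv_supported(3, 16)` is false for the first layer, which would explain why only the 1st layer breaks.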
AlexeyAB commented 5 years ago

The last layer must be fp32, since it produces the object coordinates. Maybe it could be int8 only if we used many more anchors (initial w,h defined with a lower step) and a higher network resolution (initial x,y defined with a lower step).
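A rough illustration of the "lower step" point (my own arithmetic, assuming a uniform int8 quantizer with 256 levels over one grid cell's coordinate range): at 416x416 input with a 13x13 grid, one cell spans 32 px, so an int8-coded x/y offset can land on only 256 positions inside the cell. A finer grid or higher resolution shrinks the pixel span each level covers.

```c
/* Pixel step of an int8-quantized in-cell coordinate offset:
 * cell width (net_w / grid_w) divided by the number of levels. */
double coord_step_px(int net_w, int grid_w, int levels) {
    double cell_px = (double)net_w / grid_w;
    return cell_px / levels;
}
```

For example, going from a 13x13 to a 26x26 grid at the same 416x416 input halves the step, which is the sense in which more anchors / higher resolution could make int8 coordinates tolerable.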

  1. It seems you changed something incorrectly.