GallonDeng opened this issue 5 years ago
@AllenDun
Have you checked the inference accuracy for the converted yolov3 model?
Unfortunately, I have not checked it in detail yet, but in my experience it feels noticeably worse. For now, MobileNet-SSD is more accurate but a bit slow. Incidentally, it is the same situation as when I implemented tiny-YoloV2.
It works fine in the darknet, but the inference accuracy drops a lot.
I am in the same situation as you.
Have you tried other yolov3-based models, especially your own model?
No, I have not tried that yet. My first priority was establishing a working procedure.
BTW, I use https://github.com/mystic123/tensorflow-yolo-v3.git
I am using the same repository.
Hi! I also tried the regular YOLO v3 on an Intel Compute Stick 1 by following the guide for creating the .pb file and then using this conversion command:
mo_tf.py --input_model yolo_v3.pb --tensorflow_use_custom_operations_config yolo_v3.json --input_shape "(1,413,413,3)" --data_type=FP16
I also converted it without --data_type=FP16 and ran that version on the CPU.
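For reference, the .pb mentioned above is typically produced with the convert_weights_pb.py script from the mystic123 repository, so the full Darknet-to-IR sequence looks roughly like the following (flag names and file names are from memory and may differ between repository and OpenVINO versions, so verify them against the respective READMEs):

python3 convert_weights_pb.py --class_names coco.names --weights_file yolov3.weights --data_format NHWC
mo_tf.py --input_model frozen_darknet_yolov3_model.pb --tensorflow_use_custom_operations_config yolo_v3.json --input_shape "(1,416,416,3)" --data_type=FP16

The yolo_v3.json here is the custom operations config shipped with the OpenVINO Model Optimizer; using the one that matches your OpenVINO version matters for the output layout.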
There is a difference in accuracy for sure, especially in the confidence scores. For example, I saw "sports ball" detected at 98% confidence with FP32 vs 54% with FP16. The bounding box was also less accurate with FP16, but still OK. Some FP16 detections drop to around 10%, so the results really do differ depending on where you run the inference!
@AllenDun, @deblauwetom
Thank you for the information. I have not found a working solution so far. :cry:
Are you all using https://github.com/mystic123/tensorflow-yolo-v3 to convert from Darknet to IR? There are a few open issues there that may explain the accuracy drop ( https://github.com/mystic123/tensorflow-yolo-v3/issues )
Have you tried another way to convert Darknet?
Cheers,
nikos
Have you tried another way to convert Darknet?
I have not tried another way yet. I will look for one.
I have started retraining the model, using the following repository as a reference. So far, the loss is decreasing steadily.
https://github.com/khanh1412/tiny-yolo-tensorflow.git
Hello everyone. I solved the low-accuracy problem. There was a mistake in the preprocessing and postprocessing logic.
I have the same problem. Have you modified the code of "openvino_tiny-yolov3_test.py"? My weights file performs well on the CPU, but on a Raspberry Pi with an Intel Neural Compute Stick v1 the accuracy is pretty poor. Still, thanks for your work!
Hello everyone. I solved the low-accuracy problem. There was a mistake in the preprocessing and postprocessing logic.
Hi, thanks for your work! How did you fix this preprocessing and postprocessing bug?
Have you really solved this problem? Someone says that darknet resizes with a letterbox, but OpenVINO has no such operation, and I have no idea how to handle it.
Have you really solved this problem? Someone says that darknet resizes with a letterbox, but OpenVINO has no such operation, and I have no idea how to handle it.
Yes, after reading darknet's source code I used a letter-box resize for inference preprocessing (a minimal sketch is below). I haven't looked into the postprocessing code yet, but your postprocessing probably also has to match darknet's at inference time.
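A minimal letter-box sketch, assuming a 416x416 network input and darknet-style gray padding; this is an illustration of the idea, not the exact code from the repository:

import cv2
import numpy as np

def letterbox(image, target=(416, 416)):
    # Resize while keeping the aspect ratio, then pad with gray (128) like darknet.
    ih, iw = image.shape[:2]
    th, tw = target
    scale = min(tw / iw, th / ih)
    nw, nh = int(iw * scale), int(ih * scale)
    resized = cv2.resize(image, (nw, nh))
    canvas = np.full((th, tw, 3), 128, dtype=np.uint8)
    dx, dy = (tw - nw) // 2, (th - nh) // 2
    canvas[dy:dy + nh, dx:dx + nw] = resized
    return canvas, scale, dx, dy

# Boxes predicted on the 416x416 letterboxed frame must be mapped back to the
# original image in postprocessing, e.g. x_orig = (x - dx) / scale and
# y_orig = (y - dy) / scale, otherwise the detections shift and the scores drop.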
@Jucjiaswiss After using the letter-box resize, have you solved it?
Hi, great work. I'm also trying to use yolov3 in openvino. Have you checked the inference accuracy for the converted yolov3 model? I trained my own dataset with a new model (based on the structure of yolov3). It works fine in the darknet, but the inference accuracy drops a lot. I found that the conversion from darknet to tensorflow may have some problems: the converted tensorflow model performs badly at inference, and the accuracy drops a lot. Have you tried other yolov3-based models, especially your own model?