Loading the model takes most of the time
However, for me, model loading took 0.4 sec and inference took 115.5 sec. Does that seem unusual to you?
Loading the model and constructing the graph take most of the time; you should time just the `sess.run(...)` call.
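For reference, a minimal sketch of that kind of split timing (TF 1.x API; the single conv layer and the tensor names here are just hypothetical stand-ins for the real YOLO graph built in demo.py):

```python
import time
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the real network, only to make the timing concrete.
x = tf.placeholder(tf.float32, [1, 448, 448, 3])
y = tf.layers.conv2d(x, 16, 3)
image = np.zeros((1, 448, 448, 3), dtype=np.float32)

t0 = time.time()
sess = tf.Session()
sess.run(tf.global_variables_initializer())  # graph setup / weight loading
t1 = time.time()
sess.run(y, feed_dict={x: image})            # forward pass only
t2 = time.time()

print('setup:    %.3f s' % (t1 - t0))
print('sess.run: %.3f s' % (t2 - t1))
```

Timing the two stages separately should show whether the 115.5 sec is really spent in the forward pass or in graph construction and weight loading.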
Thanks for sharing your work. I tried demo.py and it seems to work fine; I think I understand YOLO better now.
However, I am curious why it takes so long. A single image query took 116 sec (model load 0.4 sec, inference 115.5 sec, post-processing ~0 sec). I see 9 conv + 3 fc layers, and it should be a pure feed-forward pass. Does a network of this size usually take that long? Do you have any idea? Is it a TensorFlow problem?
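One way to narrow this down is to run the same forward pass several times in a row: if the first call is slow but later calls are fast, the 115.5 sec is one-off warm-up/setup cost rather than steady-state inference time. A self-contained sketch, again using a hypothetical single conv layer in place of the real 9-conv + 3-fc network:

```python
import time
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [1, 448, 448, 3])
y = tf.layers.conv2d(x, 16, 3)
image = np.zeros((1, 448, 448, 3), dtype=np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Repeated runs separate first-call overhead from per-image cost.
    for i in range(5):
        t = time.time()
        sess.run(y, feed_dict={x: image})
        print('forward pass %d: %.3f s' % (i, time.time() - t))
```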