Hi John Lambert,
Edit: nevermind, it just happens later than I expected. Ignore this post
It appears that ImageNet normalization is never applied when running the network with the universaldemo.py script. Is this intentional?
This happens for all cases: single image (render_single_img_pred in inference_task.py), video (execute_on_video in inference_task.py), and even on a folder of images using create_test_loader. A comment in create_test_loader suggests normalization should happen on the fly, but it appears this was never added.
Even without normalization, though, the results look good.
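For reference, here is a minimal sketch of what the on-the-fly normalization mentioned in the create_test_loader comment would presumably look like (the function name and exact placement are my assumptions, not the repo's API; the mean/std values are the standard ImageNet RGB statistics):

```python
import numpy as np

# Standard ImageNet channel statistics (RGB order)
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def normalize_on_the_fly(img_uint8: np.ndarray) -> np.ndarray:
    """Hypothetical helper: scale an HxWx3 uint8 RGB image to [0, 1],
    then subtract the ImageNet mean and divide by the std per channel."""
    img = img_uint8.astype(np.float32) / 255.0
    return (img - IMAGENET_MEAN) / IMAGENET_STD
```

Something along these lines could be applied per frame inside the loader (or later in the pipeline, which may be where it actually happens, per the edit above).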
Thanks for the great repo!