icoicqico opened 3 years ago
Could I have more details on the pretrained weights you are using, and also on the datasets? Thanks.
I have tried all the pretrained weights from the README page, testing on the left half of the image img_0071_heatpmap.png, but the result wasn't even close to the ground truth of 1236; it was around 500.
@icoicqico The img_0071_heatpmap.png in ./images was replotted and exported for visualization purposes. It is not supposed to be cropped and used as an input image.
If you want to test on an image from the UCF-QNRF dataset, please download the dataset here. Also, you should use the pretrained M_SFANet* trained on UCF-QNRF; download it here.
The actual img_0071.jpg is in the "Test" directory of the UCF-QNRF dataset. To preprocess UCF-QNRF before training and testing, please check the BL repo. At the moment, this repo only provides the Bayesian-preprocessed SHA and SHB datasets.
The attached file is the actual img_0071.jpg. img_0071.zip
Hope this helps, sorry for the confusion. Thanks.
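As a side note on why a cropped input cannot match the ground truth: with density-map crowd counters like this one, the predicted count is the sum over the output density map, so feeding only part of the image only recovers part of the count. A minimal NumPy sketch of that final counting step (the density map here is synthetic; in practice it would come from the model's forward pass):

```python
import numpy as np

def count_from_density(density_map: np.ndarray) -> float:
    """Crowd count is the integral (sum) of the predicted density map."""
    return float(np.sum(density_map))

# Illustrative synthetic map: two blobs whose mass sums to 3.0 "people".
density = np.zeros((4, 4), dtype=np.float32)
density[1, 1] = 1.5
density[2, 3] = 1.5

print(count_from_density(density))         # 3.0
# Cropping (e.g. keeping only the left half) drops part of the mass:
print(count_from_density(density[:, :2]))  # 1.5
```

This is why testing on "only the left side" of an image will undercount relative to the full-image ground truth.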
Thanks for your reply, I will try to download and test again.
Hello, I have tested the image again, but the result wasn't close to the ground truth.
@icoicqico Since I cannot see your code, I can't really tell what went wrong on your end. However, the model still works fine for me, even in a CPU-only environment. Hope this helps.
The test file -> Test on CPU.ipynb.zip
I'm unable to reproduce the result on the test image using the pretrained model. Does anyone know how to reproduce it? Thanks.