CommissarMa / Context-Aware_Crowd_Counting-pytorch

The implementation of Context-Aware Crowd Counting (CVPR 2019)
MIT License

about result #2

Closed: yanfangfangfang closed this issue 5 years ago

yanfangfangfang commented 5 years ago

Thanks for your code, but when I run it I can't get the result you reported (MAE = 62.3 at epoch 353). [screenshot] Can you give me some advice? I didn't change any code except the data paths.

CommissarMa commented 5 years ago

I have run the code about ten times. As I recall, the MAE drops below 100 by around the 10th epoch. I just ran the code again and got the following result: [screenshot]
Maybe you can try again; I can't think of any other possibility right now. Good luck! If you find the reason, please let me know, thank you.

ElaineXiaoke commented 5 years ago

> Thanks for your code, but when I run it I can't get the result you reported (MAE = 62.3 at epoch 353). Can you give me some advice? I didn't change any code except the data paths.

Did you solve the problem? I also tried the code but could only get a large MAE, as you did. Would you mind sharing your solution if you fix it?

CommissarMa commented 5 years ago

@yanfangfangfang @ElaineXiaoke I think the problem may be in the generation of the density maps. You can use data_preparation/k_nearest_gaussian_kernel.py to generate the correct density maps. I think that will work.

yanfangfangfang commented 5 years ago

[screenshot] In the code of data_preparation/k_nearest_gaussian_kernel.py, 'gt' is used without being defined; it should probably be 'points'.
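For reference, here is a minimal sketch of the geometry-adaptive Gaussian kernel that a script like k_nearest_gaussian_kernel.py typically implements, written with `points` (an N x 2 array of head coordinates) used consistently, as suggested above. The parameter names and defaults (`k`, `beta`) are assumptions, not necessarily the repo's exact values:

```python
import numpy as np
import scipy.spatial
from scipy.ndimage import gaussian_filter

def gaussian_filter_density(img_shape, points, k=3, beta=0.3):
    """Build a density map from head annotations with a geometry-adaptive
    Gaussian kernel: sigma is proportional to the mean distance of each
    head to its k nearest neighbouring heads."""
    density = np.zeros(img_shape, dtype=np.float32)
    if len(points) == 0:
        return density

    # distances[i] holds the point itself plus its nearest neighbours
    neighbours = min(k, len(points) - 1)
    if neighbours > 0:
        tree = scipy.spatial.KDTree(np.asarray(points), leafsize=2048)
        distances, _ = tree.query(points, k=neighbours + 1)

    for i, pt in enumerate(points):
        x = min(int(pt[0]), img_shape[1] - 1)
        y = min(int(pt[1]), img_shape[0] - 1)
        pt2d = np.zeros(img_shape, dtype=np.float32)
        pt2d[y, x] = 1.0
        if neighbours > 0:
            sigma = beta * np.mean(distances[i][1:])  # skip distance to itself
        else:
            sigma = np.mean(img_shape) / 4.0          # fallback for a single head
        density += gaussian_filter(pt2d, sigma, mode='constant')
    return density
```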

csonde commented 5 years ago

Hey!

Just putting this here, it might be related.

Thanks for this. I am using it on a custom dataset and it seems to be working, apart from some small issues; I can achieve nice results. I would like to point out two things, though. First, in the line below, an integer is divided by another integer.

https://github.com/CommissarMa/Context-Aware_Crowd_Counting-pytorch/blob/07f86a5d03664b87812557c429df1835cb2e3124/my_dataset.py#L35

In Python 2.7 this yields another integer, so in our case 0 or 1, and most likely 0. If someone is using 2.7, switching to 3.x could solve this, or change 255 to 255.0 (I recommend the latter). I also got awful results before doing so.
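A small illustration of the pitfall, using a toy NumPy array in place of the actual image loaded in my_dataset.py (the variable name `img` is just a placeholder):

```python
import numpy as np

img = np.array([[0, 128, 255]], dtype=np.uint8)   # toy "image"

# Under Python 2.7 (and integer NumPy arrays), `/` is floor division, so
# `img / 255` maps almost every pixel to 0 and the rest to 1.
# Dividing by a float (or running under Python 3) keeps the values in [0, 1]:
img = img.astype(np.float32) / 255.0
print(img)   # values now lie in [0, 1] instead of collapsing to 0 or 1
```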

The second thing concerns the validation part. In the dataset loader you apply a random flip, if I am not mistaken.

https://github.com/CommissarMa/Context-Aware_Crowd_Counting-pytorch/blob/07f86a5d03664b87812557c429df1835cb2e3124/my_dataset.py#L43-L45

I assume this is for data augmentation, which is nice and helps the training procedure, but in my opinion it is unfortunate during validation. Just a suggestion: it would be better not to flip any image, to flip all of them, or to validate on both the normal and flipped images; basically any option that avoids randomness during validation is fine. If one wants to compare results with other methods, the "no flipping" version is probably the most desirable, since it helps ensure that results are evaluated on the same set.
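A minimal sketch of the "flip only during training" option, assuming the image is an H x W x C NumPy array and the density map is H x W; the function and attribute names are placeholders rather than the repo's actual code:

```python
import random
import numpy as np

def maybe_flip(img, density, phase):
    """Apply the random horizontal flip only in the training phase so that
    validation is deterministic."""
    if phase == 'train' and random.random() > 0.5:
        img = np.ascontiguousarray(img[:, ::-1, :])      # flip along the width axis
        density = np.ascontiguousarray(density[:, ::-1])
    return img, density

# In the Dataset's __getitem__ one would call, for example:
#   img, density = maybe_flip(img, density, self.phase)  # self.phase: 'train' or 'val'
```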

yanfangfangfang commented 5 years ago

Many thanks for your advice. I do run the code with Python 2.7, so it should indeed be /255.0.

[screenshot]

CommissarMa commented 5 years ago

@csonde You are right. I will fix them. Thank you.

xinke-wang commented 5 years ago

Hi @yanfangfangfang, have you finally achieved the MAE of 62.3 reported in the paper for ShanghaiTech Part A? I can only get about 66 even after 1000 epochs. Many thanks.

yanfangfangfang commented 5 years ago

> Hi @yanfangfangfang, have you finally achieved the MAE of 62.3 reported in the paper for ShanghaiTech Part A? I can only get about 66 even after 1000 epochs. Many thanks.

No, I got the same result as you and couldn't find the problem. However, with the author's pretrained model I can reach the reported MAE (about 62). If you fix it, please contact me. Thank you very much.

xinke-wang commented 5 years ago

> > Hi @yanfangfangfang, have you finally achieved the MAE of 62.3 reported in the paper for ShanghaiTech Part A? I can only get about 66 even after 1000 epochs. Many thanks.
>
> No, I got the same result as you and couldn't find the problem. However, with the author's pretrained model I can reach the reported MAE (about 62). If you fix it, please contact me. Thank you very much.

Thank you for your prompt reply. I will post an update if I reach that MAE.

RanRan-Margaret commented 3 years ago

> > > Hi @yanfangfangfang, have you finally achieved the MAE of 62.3 reported in the paper for ShanghaiTech Part A? I can only get about 66 even after 1000 epochs. Many thanks.
> >
> > No, I got the same result as you and couldn't find the problem. However, with the author's pretrained model I can reach the reported MAE (about 62). If you fix it, please contact me. Thank you very much.
>
> Thank you for your prompt reply. I will post an update if I reach that MAE.

I have the same problem (MAE = 66). Have you fixed it?