cvlab-stonybrook / LearningToCountEverything

MIT License

Could you provide density map generator codes? #6

Closed s10133679 closed 2 years ago

s10133679 commented 3 years ago

Dear Authors,

Could you upload the code used to generate the density maps for training?

Thank you!

Viresh-R commented 3 years ago

Hey, we have provided the precomputed Gaussian GT maps used in all of our experiments in this repo. We have also provided the point annotations, and the details for generating the GT maps are given in the paper in case you'd like to generate them yourself.

Thanks, Viresh

tersekmatija commented 2 years ago

Hi @Viresh-R ,

I'll continue in this thread as it is a very relevant question. My team and I are participating in the ML Reproducibility Challenge 2021 and we selected your paper. As part of the challenge we decided to try re-generating the GT density maps ourselves. Following the process described in your paper, however, we encountered discrepancies in the generated density maps.

For a certain file (let's say image 4.jpg), we extract the points from annotation_FSC147_384.json and round them to the nearest integer. We then use scipy to compute the distance from each point to its nearest neighbour and average those distances to obtain avg. We use the matlab_style_gauss2D method to generate the Gaussian filter, with the window size set to (avg, avg) and sigma to avg/4, and then apply the filter with cv2.filter2D. We also tried computing the Gaussian filter with the cv2 library, and the matlab_style_gauss2D implementation seems fine; we tested another convolution function from scipy as well. However, the obtained GT is very different from the precomputed one.

For example, this is precomputed density map for this image: GT

and this is the generated GT with described process: Generated

We tested different sigma parameters, and we get the most similar result when sigma is the square root of avg: Generated2

However, it is still not the same: the sum of absolute pointwise differences is greater than 1 (which is a lot, considering the total sum is 8).

Do you have some insight into why we get different results? What is the proper way to generate the ground truth density maps?

A snippet of the code (we omit the imports; pts are taken from the JSON file and rounded using numpy):

# compute nearest neighbours and distances (k = 2 is used because
# with k = 1 each point matches itself at distance 0)
tree = scipy.spatial.KDTree(pts.copy(), leafsize=10)
dists, neighbours = tree.query(pts, k=2)
avg = np.average(dists[:, 1])

# empty GT map with annotated pixels set to 1
test = np.zeros(gt_computed.shape)
for i in range(pts.shape[0]):
    y = int(pts[i, 1])
    x = int(pts[i, 0])
    test[y, x] = 1

# compute the Gaussian filter with window size avg and sigma avg/4
filt = matlab_style_gauss2D((avg, avg), avg / 4)

# convolve (apply filter)
gt_generated = cv2.filter2D(test.copy(), -1, filt)
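For completeness, a minimal loader for the pts array used above might look like this; the "points" key is assumed from the FSC-147 annotation layout and should be checked against the actual file:

```python
import json

import numpy as np


def load_points(json_path, image_name):
    # Load the per-image point annotations; the "points" key is an
    # assumption based on the FSC-147 annotation layout.
    with open(json_path) as f:
        annotations = json.load(f)
    pts = np.array(annotations[image_name]["points"], dtype=np.float64)
    # Round to the nearest integer pixel, as described above.
    return np.round(pts).astype(int)
```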

Any insight would be greatly appreciated!

Thanks, Matija

Viresh-R commented 2 years ago

Hi @tersekmatija, the discrepancy between the GT maps we have shared and the ones you computed for the ML reproducibility challenge can most likely be attributed to the fact that the point annotations we shared correspond to the 384 image width, i.e. the size of the shared resized images used in the paper, whereas the GT maps we shared were generated using the original image sizes (which can be much larger than 384 in width) and were subsequently resized to 384 width.
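To make the resolution mismatch concrete, mapping the shared 384-width point annotations back to the original resolution amounts to a simple coordinate rescale; a minimal sketch (function and variable names are illustrative, not from the repo):

```python
import numpy as np


def upscale_points(pts_384, original_size, resized_size):
    # pts_384: (N, 2) array of (x, y) points annotated on the
    # 384-width resized image; map them back to the original image.
    orig_w, orig_h = original_size
    res_w, res_h = resized_size
    scale = np.array([orig_w / res_w, orig_h / res_h])
    return pts_384 * scale
```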

benedictflorance commented 2 years ago

Doing it the latter way, i.e. resizing a higher-resolution density map (generated using coordinates according to the original image dimensions) down to 384 width, also reduces the sum of the density map proportionately. How do you preserve the count while resizing?

Viresh-R commented 2 years ago

We multiply the new density map by a scaling factor (i.e. original count / count of resized map), which preserves the count.
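A minimal sketch of this count-preserving resize (using scipy.ndimage.zoom in place of an image-library resize; the 384 target width follows the thread):

```python
import numpy as np
from scipy.ndimage import zoom


def resize_density_map(density, target_w=384):
    # Downscale the full-resolution density map so its width becomes
    # target_w, then rescale so the sum (i.e. the count) is preserved.
    h, w = density.shape
    resized = zoom(density, target_w / w)
    original_count = density.sum()
    resized_count = resized.sum()
    if resized_count > 0:
        resized = resized * (original_count / resized_count)
    return resized
```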

ajkailash commented 2 years ago

Hi @Viresh-R

In line with your reply, I tried the following things:

  • I resized one of the images in the shared dataset (49.jpg) to its original dimensions.
  • Annotated each object instance in the image using CVAT.
  • Rounded the points and generated a density map (following @tersekmatija's reply, I used matlab_style_gauss2D to generate the Gaussian filter, with the same window size, sigma, and code).
  • Resized the density map to 384 width.
  • Multiplied the resized density map by the scaling factor to preserve the count (used np.sum() to calculate the sums).

However, in spite of this, there is a difference between the precomputed density map and the generated one.

precomputed density map for the image

computed-density-map-49

generated density map for the image

generated-density-map-49

Sharing the code blocks below (the code is borrowed).

def matlab_style_gauss2D(shape=(3, 3), sigma=0.5):
    # 2D Gaussian kernel matching MATLAB's fspecial('gaussian', ...)
    m, n = [(ss - 1.) / 2. for ss in shape]
    y, x = np.ogrid[-m:m + 1, -n:n + 1]
    h = np.exp(-(x * x + y * y) / (2. * sigma * sigma))
    h[h < np.finfo(h.dtype).eps * h.max()] = 0
    sumh = h.sum()
    if sumh != 0:
        h /= sumh
    return h

def gen_density_map(image_arr, pts):
    # average distance to the nearest neighbour (k=2 skips the point itself)
    tree = scipy.spatial.KDTree(pts.copy(), leafsize=10)
    distance, neighbours = tree.query(pts, k=2)
    avg = np.average(distance[:, 1])
    # place a unit impulse at each annotated point
    test = np.zeros(image_arr.shape)
    for i in range(len(pts)):
        y = int(pts[i][1])
        x = int(pts[i][0])
        test[y, x] = 1
    # smooth the impulses with a single global Gaussian kernel
    filt = matlab_style_gauss2D((avg, avg), avg / 4)
    gen_map = cv2.filter2D(test.copy(), -1, filt)
    return gen_map

Any insight into what I am doing wrong would be of great help. Thanks in advance for your answers.

Update: I used the density map generation code found in a crowd counting paper. I got similar results (low-level visual similarity) when I took the average distance between the neighbours.

crowd-counting-code-density-map
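For reference, the adaptive-kernel scheme common in crowd-counting codebases places one Gaussian per point, with a sigma proportional to that point's own nearest-neighbour distance. A sketch under that assumption (the beta factor is illustrative, not the authors' exact setting):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.spatial import KDTree


def adaptive_density_map(shape, pts, beta=0.25):
    # pts: (N, 2) array of (x, y) integer point annotations.
    # One Gaussian per point, sigma = beta * nearest-neighbour distance
    # (beta=0.25 is an illustrative choice, not a confirmed setting).
    tree = KDTree(pts)
    dists, _ = tree.query(pts, k=2)  # k=2: skip the point itself
    density = np.zeros(shape, dtype=np.float64)
    for (x, y), d in zip(pts, dists[:, 1]):
        single = np.zeros(shape, dtype=np.float64)
        single[int(y), int(x)] = 1.0
        # gaussian_filter uses a normalised kernel with reflect
        # boundaries, so each point contributes exactly 1 to the sum.
        density += gaussian_filter(single, sigma=max(beta * d, 1e-3))
    return density
```

With this scheme the total sum of the map equals the number of annotated points, and isolated objects get wider kernels than densely packed ones.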

cuizhiling commented 2 years ago

> Hi @Viresh-R
>
> In line with your reply, I tried the following things:
>
> • I resized one of the images in the shared dataset (49.jpg) to its original dimensions.
> • Annotated each object instance in the image using CVAT.
> • Rounded the points and generated a density map (following @tersekmatija's reply, I used matlab_style_gauss2D to generate the Gaussian filter, with the same window size, sigma, and code).
> • Resized the density map to 384 width.
> • Multiplied the resized density map by the scaling factor to preserve the count.
>
> […]

Dear Lakshmi Narayan, I have the same problem as you.

If it's convenient for you, could you send me your code for generating the density maps? My e-mail is 330384850@qq.com.

Thank you!

Viresh-R commented 2 years ago

Hey all, we have already shared, along with the dataset, the Gaussian density maps computed using adaptive kernel sizes. Please use those for your experiments; all the density map prediction results reported in the paper were computed using them.

Also, thanks to @ajkailash for sharing the density map generation code, which produces density maps similar to the ones we have shared.

@tersekmatija: please follow @ajkailash's comment to reproduce the density maps.