Closed m-zheng closed 2 years ago
Hi, the Gaussian kernel is normalized by its sum, so for any standard deviation the kernel sums to one. The standard deviation therefore does not affect the total count of the density map; it only affects how sharp or diffuse the map is, i.e. a small standard deviation results in a sharper density map. For our dataset, we found that setting the SD to a quarter of the window size works well. One motivation for this choice is that a Gaussian distribution contains around 95% of its mass within 2 * SD of the mean.
Thank you very much for the information.
Hello Viresh, thanks for the great work.
I have one question related to this issue.
In issue #27, you mentioned that we can use https://github.com/CommissarMa/MCNN-pytorch/blob/master/data_preparation/k_nearest_gaussian_kernel.py to generate Gaussian density maps.
However, in that script, a quarter-window-size standard deviation is not the default. Can you explain how you modified the script to generate the Gaussian density maps for the FSC-147 dataset? How should I change the following source code to use the window size?
```python
tree = scipy.spatial.KDTree(points.copy(), leafsize=leafsize)
# query kdtree
distances, locations = tree.query(points, k=4)
print('generate density...')
for i, pt in enumerate(points):
    pt2d = np.zeros(img_shape, dtype=np.float32)
    if int(pt[1]) < img_shape[0] and int(pt[0]) < img_shape[1]:
        pt2d[int(pt[1]), int(pt[0])] = 1.
    else:
        continue
    if gt_count > 1:
        sigma = (distances[i][1] + distances[i][2] + distances[i][3]) * 0.1
```
Thank you very much. Best regards.
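One way to adapt the quoted snippet is to replace the k-nearest-neighbor sigma with a quarter of the window size. The following is only a sketch under stated assumptions, not the authors' released code: `window_sizes` is a hypothetical per-point object size (e.g. estimated from the FSC-147 box annotations), and points are (x, y) as in the snippet above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_map_quarter_window(points, window_sizes, img_shape):
    """Sketch of a density map with sigma = window / 4 per point.

    NOT the authors' released code. `window_sizes` is a hypothetical
    per-point object size, e.g. estimated from box annotations.
    """
    density = np.zeros(img_shape, dtype=np.float32)
    for (x, y), w in zip(points, window_sizes):
        x, y = int(x), int(y)
        if not (0 <= y < img_shape[0] and 0 <= x < img_shape[1]):
            continue  # skip points outside the image
        pt2d = np.zeros(img_shape, dtype=np.float32)
        pt2d[y, x] = 1.0
        sigma = w / 4.0  # quarter of the window size (replaces the k-NN sigma)
        density += gaussian_filter(pt2d, sigma, mode='constant')
    return density
```

Compared with the k-nearest-neighbor sigma in the quoted snippet, this ties the kernel width to the annotated object size rather than to the local point density.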
Hi,
Can I ask why the standard deviation of the Gaussian kernel is set to a quarter of the window size, as described in the second paragraph of Section 3.2?
I understand that the whole Gaussian kernel design adapts to object sizes. But why use a quarter of the window size for the standard deviation specifically?
Any help would be highly appreciated.