Closed Arne-Thomsen closed 8 months ago
Great, thanks a lot, looks really good.
I think the sklearn dependency is fine, since it is a standard package for ML. However, isn't sklearn nowadays distributed as scikit-learn (the import name is still `sklearn`, but the pip command uses `scikit-learn`)? If so, please adjust the dependency.
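Concretely, the dependency entry would then look something like this (a sketch; the exact file depends on how the repo declares its requirements):

```
# requirements.txt / setup.py install_requires — use the PyPI name:
scikit-learn
# the import in the code stays:  import sklearn
```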
It would be good to keep the docstrings consistent in case we set up proper documentation at some point.
Thanks for the fast response and for pointing out the `scikit-learn` dependency, you're right. I've unified the format of the docstrings with the rest of the repo.
Based on an idea by Tomek, I've implemented a non-trainable layer to smooth healpy maps in real space. The smoothing is done in the following steps:

- `sklearn.neighbors.BallTree` is used to find the nearest `k` neighbors for every pixel/index in the HEALPix patch to be considered, where `k` is determined from a radius that is set as a number of standard deviations of the smoothing kernel
- a sparse matrix of shape `(n_pix, n_pix)` is built such that every row corresponds to the `k` nearest neighbors and their Gaussian smoothing weights with respect to the original pixel
- the smoothing is applied with `tf.sparse.sparse_dense_matmul` when the layer is called

This operation supports multiple smoothing scales for the different channels, which are assumed to sit in the last input tensor axis, i.e. inputs of shape `(n_batch, n_pix, n_channels)`.
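A rough sketch of these steps, with hypothetical names and two simplifications: random unit vectors stand in for the HEALPix pixel centres, and `scipy.sparse` stands in for the `tf.sparse` kernel that the actual layer multiplies with `tf.sparse.sparse_dense_matmul` (here a radius query replaces the fixed-`k` lookup):

```python
import numpy as np
from scipy import sparse
from sklearn.neighbors import BallTree

def build_smoothing_kernel(latlon, sigma, n_sigma=3):
    """Build a sparse (n_pix, n_pix) Gaussian smoothing matrix.

    latlon: (n_pix, 2) pixel centres as (lat, lon) in radians;
    sigma: kernel width in radians. Neighbours within n_sigma * sigma
    are kept, mirroring the radius-based neighbour choice described above.
    """
    n_pix = len(latlon)
    tree = BallTree(latlon, metric="haversine")  # great-circle distances
    ind, dist = tree.query_radius(latlon, r=n_sigma * sigma,
                                  return_distance=True)
    rows, cols, vals = [], [], []
    for i, (idx, d) in enumerate(zip(ind, dist)):
        w = np.exp(-0.5 * (d / sigma) ** 2)
        w /= w.sum()  # normalise each row so smoothing preserves the mean
        rows.extend([i] * len(idx))
        cols.extend(idx)
        vals.extend(w)
    return sparse.csr_matrix((vals, (rows, cols)), shape=(n_pix, n_pix))

# toy stand-in for HEALPix pixel centres: random points on the sphere
rng = np.random.default_rng(0)
lat = np.arcsin(rng.uniform(-1.0, 1.0, size=500))
lon = rng.uniform(-np.pi, np.pi, size=500)
latlon = np.stack([lat, lon], axis=1)

kernel = build_smoothing_kernel(latlon, sigma=0.2)
maps = rng.normal(size=(500, 2))  # (n_pix, n_channels)
smoothed = kernel @ maps          # the layer does this via tf.sparse.sparse_dense_matmul
```

Each row of the kernel sums to one, so applying it repeatedly stays well behaved.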
To save GPU memory (the sparse kernel matrix can become quite large for large smoothing scales), the different smoothing scales are all implemented with the same sparse kernel matrix: larger smoothing scales are achieved by smoothing repeatedly, according to the rule that the convolution of two Gaussians with `sigma_1` and `sigma_2` is a Gaussian with `sigma_3 = sqrt(sigma_1^2 + sigma_2^2)`.
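As a small numeric illustration of that rule (the helper name and the rounding direction are my assumptions; "conservative" is read here as "never over-smooth"): applying a base kernel of width `sigma_base` `n` times gives an effective width `sigma_base * sqrt(n)`, so the repeat count for a requested `sigma` is `(sigma / sigma_base)^2`, rounded down.

```python
import math

def n_repeats(sigma_target, sigma_base):
    # sigma_base * sqrt(n) <= sigma_target  =>  n <= (sigma_target / sigma_base)**2;
    # the small epsilon guards against floating-point round-off in the ratio
    return max(1, math.floor((sigma_target / sigma_base) ** 2 + 1e-9))

sigma_base = 0.1
for sigma_target in (0.1, 0.2, 0.25):
    n = n_repeats(sigma_target, sigma_base)
    print(sigma_target, n, sigma_base * math.sqrt(n))
```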
Since the number of times the smoothing is applied is restricted to integers, the layer automatically implements the closest conservative smoothing scale.

Two additional notes: