netneurolab / neuromaps

A toolbox for comparing brain maps
https://netneurolab.github.io/neuromaps

Issues with plotting neuromaps output and with higher resolution images #154

Closed Melissa1909 closed 1 month ago

Melissa1909 commented 5 months ago

Description of issue

Dear neuromaps developers, I have two questions regarding the functioning of the toolbox.

I used `nulls.moran(img1, atlas='mni152', density='3mm', n_perm=100, seed=1234, n_proc=15)` to compute a null distribution for the attached image. The output numpy array has dimensions [60, 72, 60, 100]: what do these dimensions mean? How can I visualize the null distribution? And why are so many of the values NaN?

Furthermore, I was not able to run the function on the same image at 2mm resolution: my computer always crashed. Even on a powerful server with 1 GB RAM, it didn't finish. How does the computational complexity scale with resolution? Should 2mm resolution be doable, i.e., is something going wrong here, or would I just need a more powerful system? Best, Melissa

MeanSZPVBM_z_3mm.nii.gz


justinehansen commented 4 months ago

Hi Melissa, sorry for the late response. `nulls.moran` took your image and generated `n_perm` new null maps with spatial autocorrelation similar to that of your original map. Your original map is a volume (a 3D array) of shape [60, 72, 60], so the output of `nulls.moran` is [60, 72, 60, n_perm], where n_perm=100 in your case. There shouldn't be NaNs within the brain volume, but perhaps they are appearing in the "background" voxels?
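As a quick sanity check, something like the sketch below should confirm the NaNs are confined to background voxels and let you plot one of the null maps (the input file name is your attachment; saving the nulls to `moran_nulls.npy` is just a placeholder for however you stored the output):

```python
import numpy as np
import nibabel as nib
from nilearn import plotting

img = nib.load('MeanSZPVBM_z_3mm.nii.gz')   # original map, shape (60, 72, 60)
data = img.get_fdata()
nulls = np.load('moran_nulls.npy')          # placeholder for the saved (60, 72, 60, 100) output

# Treat zero/non-finite voxels in the original map as background
mask = np.isfinite(data) & (data != 0)

# Boolean indexing over the first three axes -> array of shape (n_voxels, 100);
# should be 0 if the NaNs are all in the background
print('NaNs inside the brain mask:', np.isnan(nulls[mask]).sum())

# Wrap the first null map back into a NIfTI image to visualize it
null0 = nib.Nifti1Image(nulls[..., 0], img.affine, img.header)
plotting.plot_stat_map(null0, title='Moran null #1')
plotting.show()
```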

For your second question, it's probably just that 2mm resolution means too many voxels (computational cost scales as $O(n^2\log(n))$ :slightly_frowning_face:). The step that takes a lot of space (on disk) is computing the distance matrix, so the issue is likely storage space rather than RAM. See this part of the documentation for relevant info.
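To put rough numbers on that, here's a back-of-the-envelope sketch. It counts every voxel in the volume (ignoring the brain mask, so the real figures are smaller) and assumes a dense float32 distance matrix on standard MNI152 grid shapes:

```python
# Rough scaling estimate (assumptions: dense n x n float32 distance matrix,
# whole-volume voxel counts with no brain mask, standard MNI152 grid shapes)
import numpy as np

for label, shape in [('3mm', (60, 72, 60)), ('2mm', (91, 109, 91))]:
    n = int(np.prod(shape))          # voxels in the volume
    gb = n ** 2 * 4 / 1e9            # n x n float32 distances, in GB
    print(f'{label}: {n:,} voxels -> ~{gb:,.0f} GB for the distance matrix')
```

Masking to in-brain voxels cuts n by roughly a factor of 3-4, but with quadratic scaling the 2mm matrix still lands in the hundreds of GB, consistent with disk space being the bottleneck.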

Hope that helps! Justine