layerfMRI / LAYNII

Standalone fMRI software suite for layer-fMRI analyses.
BSD 3-Clause "New" or "Revised" License

Getting layers in EPI space #47

Closed · giobattistella closed this issue 1 year ago

giobattistella commented 2 years ago

Hi, I followed the steps described in https://layerfmri.com/2017/11/26/getting-layers-in-epi-space/ to extract layers from the MP2RAGE in EPI space. Everything seemed to work fine until the transformation of the pial surface from FreeSurfer to a volume with SUMA. It looks like the pial rim is "scattered" in some portions of the brain, which impedes the layering of the cortex. The WM rim looks fine, though. I am happy to share the data with you if that would help you understand the problem better.
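For reference, the surface-to-volume step I mean is roughly the following (a sketch only; the spec, surface, and EPI file names are placeholders for my actual files):

```bash
# Sketch of the SUMA/AFNI surface-to-volume step for the pial surface
# (file names are placeholders).
3dSurf2Vol \
    -spec std.subj_lh.spec \
    -surf_A pial \
    -sv anat_aligned_to_epi+orig \
    -grid_parent epi_mean.nii.gz \
    -map_func mask \
    -prefix pial_rim_lh.nii.gz
```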

Best, Giovanni

layerfMRI commented 2 years ago

Hi Giovanni,

Yes, I think I know what you mean. You mean an artifact like this? missing_voxels_artifact (artifact from @kenshukoiso) This is a feature of working in vertex space. It suggests that the vertex density is not high enough to capture all voxels. If you increase the vertex density, this effect will be reduced. E.g. for the MapIcosahedron command, you can use the option -ld 2000. Note that this will increase the file size of the surfaces.
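For example, re-running the mesh standardization with a denser icosahedron would look roughly like this (the spec file name is a placeholder for your subject's SUMA spec file):

```bash
# Re-run MapIcosahedron with a denser standard mesh (the default is -ld 64).
# "subj_lh.spec" is a placeholder; use your own SUMA spec file.
MapIcosahedron \
    -spec subj_lh.spec \
    -ld 2000 \
    -prefix std.2000.
```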

Note that the blog post you are referring to is from 2017. LayNii has improved quite a bit since then. I would recommend using the new program LN2_LAYERS instead in step 5.
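A minimal LN2_LAYERS call on the rim file would look roughly like this (the rim file name is a placeholder; the rim needs the inner and outer gray matter borders and the gray matter itself coded as integer labels, as described in the LayNii documentation):

```bash
# Sketch of step 5 with the newer LayNii program (rim file name is a placeholder).
LN2_LAYERS \
    -rim rim_epi.nii.gz \
    -nr_layers 3 \
    -equivol
```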

Best regards, Renzo

ofgulban commented 2 years ago

@giobattistella, what is your progress on this issue? It would be useful if you could let us know whether your issue is solved. Thanks in advance.

giobattistella commented 2 years ago

Hi everybody, I am really sorry for the delay in getting back to you, but I had to work urgently on something else over the past weeks. As Renzo suggested, increasing the vertex density improves the results. I tried -ld 1200 and the results are better than my original attempt at -ld 564, but still not good enough.

The problem is that when I try -ld 2000 the command crashes; I just get the message "Killed" and no outputs are stored. Could this be due to the size of the output files that would need to be written? Do you have any ideas on how to solve this? Thanks a lot for your help.

ofgulban commented 2 years ago

This sounds like a FreeSurfer problem and is therefore out of scope for us. If @layerfMRI does not have any suggestions, I will close the topic.

layerfMRI commented 2 years ago

@kenshukoiso has looked into the number of vertices a bit for a representative whole brain dataset with 0.4 mm voxels. This shows that 2000k (2,000,000) vertices per hemisphere (NOTE: the number refers to the -ld argument in SUMA; as far as I understand, it refers to the number of vertices per hemisphere in units of 1000, and the default is -ld 64) are necessary in order not to miss any voxels. This means that the mesh file is 47 GB in size!

And it takes a lot of time to work with it...

(attached figure: Mesh_with_parameter)

ofgulban commented 1 year ago

Closing due to no response. Feel free to reopen in the future.