layerfMRI / LAYNII

Standalone fMRI software suite for layer-fMRI analyses.
BSD 3-Clause "New" or "Revised" License

Column generation problem for LN_COLUMNAR_DIST #50

Closed: yyinghua closed this issue 2 years ago

yyinghua commented 2 years ago

Hi all,

I am trying to do slice-wise column generation with the newest LAYNII. I ran into trouble when running LN_COLUMNAR_DIST: the program does not extend the columns very well. Even when I give it a large value with the -vinc option, e.g. -vinc 400, the column generation seems to stop at about 230 voxels in the R->L direction. I attached the test data and masks. Would you take a look at the attached files? Any feedback and advice would be appreciated.

Best regards,
Yinghua

s1_LN_COLUMN_DIST.zip
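For reference, a guess at the failing invocation, using the file names that appear later in this thread for the attached archive (the exact names and argument order are an assumption, not confirmed by the original post):

```bash
# Hypothetical reconstruction of the call that stalls at ~230 voxels in R->L;
# file names are assumed from the attached s1_LN_COLUMN_DIST.zip.
LN_COLUMNAR_DIST \
    -layers s1_test1_rim_layers.nii \
    -landmarks s1_test1_landmarks.nii \
    -vinc 400
```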

layerfMRI commented 2 years ago

Dear Yinghua,

Thanks for the description and the data. I think I understand better now what the problem is. Could you let us know what the version was when this still worked? Was that a LayNii version from before the time of LayNii on GitHub?

As a short-term workaround, you could use the following series of commands to solve it:

fslswapdim s1_test1_rim_layers.nii y x z s1_test1_rim_layers_swap.nii
fslswapdim s1_test1_landmarks.nii y x z s1_test1_landmarks_swap.nii
LN_COLUMNAR_DIST -layers s1_test1_rim_layers_swap.nii -landmarks s1_test1_landmarks_swap.nii -vinc 100

Note, however, that you would need to swap all the other files into this space too :-/ If I have time, I will try to see how hard it would be to change the code of LN_COLUMNAR_DIST so that it works for either direction.
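As an illustration of that last caveat, a hedged sketch of what swapping an additional file into the same space could look like; the functional file name here is hypothetical and not part of the attached data:

```bash
# Hypothetical example: any other volume that should live in the same space as
# the swapped layers/landmarks (e.g. a functional or statistical map) needs the
# same y-x-z permutation. "s1_test1_func.nii" is a made-up name for illustration.
fslswapdim s1_test1_func.nii y x z s1_test1_func_swap.nii
```

Because exchanging x and y is its own inverse, applying the same y x z permutation a second time maps a volume back to the original orientation.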

Have you already tried LN_COLUMNAR_DIST?

As a long-term solution, we might need to include a -two_dim option in the newer, more comprehensive LN2_MULTILATERATE program.

Best regards, Renzo

ofgulban commented 2 years ago

@yyinghua maybe also try the LN2_COLUMNS program, if you just want to create an arbitrary number of columns and are not interested in e.g. flattening them.
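For illustration, a minimal sketch of what a LN2_COLUMNS call could look like; the flag names (-rim, -midgm, -nr_columns) and the input file names are my assumptions about the program's interface and should be checked against its own help output:

```bash
# Sketch only: verify the exact flags with the LN2_COLUMNS help text.
# The mid-GM file is assumed to come from a prior LN2_LAYERS run; both file
# names below are placeholders.
LN2_COLUMNS \
    -rim s1_test1_rim.nii \
    -midgm s1_test1_rim_midGM_equidist.nii \
    -nr_columns 100
```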

layerfMRI commented 2 years ago

If you want to use it for digit mapping in S1, you might also consider using LN_3DCOLUMNS, which works on axial slices (like yours):

LN_3DCOLUMNS -layers s1_test1_rim_layers.nii -landmarks s1_test1_landmarks_for_LN_3DCOLUMNS.nii

Note, however, that the landmark file needs three landmarks (see attached file). Note also the extra -jiajia option to include the borderlines. s1_test1_landmarks_for_LN_3DCOLUMNS.nii.zip
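Putting the pieces of this comment together, the full call with the border-line option would presumably look as follows; treating -jiajia as a simple on/off switch is an assumption and should be checked against the program's help text:

```bash
# Assumed combination of the command above with the -jiajia flag mentioned in
# the comment; -jiajia is treated here as a flag without an argument.
LN_3DCOLUMNS \
    -layers s1_test1_rim_layers.nii \
    -landmarks s1_test1_landmarks_for_LN_3DCOLUMNS.nii \
    -jiajia
```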

yyinghua commented 2 years ago

Dear all, 

I am sorry for the delay. Thanks a lot @layerfMRI and @ofgulban for your commands. I tried both LN_3DCOLUMNS and LN2_COLUMNS, and both worked well on my data. LN_3DCOLUMNS is indeed more suitable for seeing the digit mapping across slices, and the -jiajia option is very helpful. It completely solves my problem. Many thanks for your help.

In addition, regarding the previous version of LN_COLUMNAR_DIST: it worked well on my data when I used the version you shared in October 2018. The command is: LN_COLUMNAR_DIST -layer_file layers.nii -landmarks landmarks.nii -vinc 400. For the latest version of LN_COLUMNAR_DIST, I re-oriented the images of my data, but the problem is still there. s1_swapYXZ.zip
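For anyone comparing the two interfaces, the only difference between the 2018 call quoted above and the current one used earlier in this thread appears to be the name of the layers flag; this is inferred purely from the commands quoted in this thread, not from a changelog:

```bash
# 2018 version, as quoted in the comment above:
LN_COLUMNAR_DIST -layer_file layers.nii -landmarks landmarks.nii -vinc 400

# Current GitHub version, as used earlier in this thread:
LN_COLUMNAR_DIST -layers layers.nii -landmarks landmarks.nii -vinc 400
```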

Thanks again. Yinghua

layerfMRI commented 2 years ago

I gave it another look to better understand what has changed since the first versions of this program. I think I understand the problem now.

The vicinity looping structure had been changed and externalised from the for-loop scope itself. Now there are jx_start and jx_stop. These new parameters had a small bug that confused the different orientations.

This is fixed in the devel branch now, and it seems to work fine on the test data that you provided. This bug was specific to the case of asymmetric FOVs.

[Screenshot of the corrected column output, 2021-12-22]

In case you do not want to recompile it yourself from the devel branch, you can find a pre-compiled version (macOS) attached. LN_COLUMNAR_DIST.zip
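For completeness, a sketch of building from the devel branch instead of using the attached binary; the Makefile target and the help flag are assumptions based on the usual LAYNII build procedure, so check the repository README for the authoritative steps:

```bash
# Assumed build steps; consult the LAYNII README if any of these differ.
git clone https://github.com/layerfMRI/LAYNII.git
cd LAYNII
git checkout devel
make all                      # assumed target; plain "make" may also work
./LN_COLUMNAR_DIST -help      # assumed flag for printing the usage text
```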

yyinghua commented 2 years ago

Thank you for looking into the cause and sharing the executable file. I also tried this pre-compiled version of LN_COLUMNAR_DIST, and it works fine on my data now. Thank you for being so helpful!