noahbenson / neuropythy

A neuroscience library for Python, intended to complement the existing nibabel library.
GNU Affero General Public License v3.0

Minimal required files for Benson 2014 retinotopy command #26

Closed: N-HEDGER closed this issue 2 years ago

N-HEDGER commented 2 years ago

Hi there. Thanks for this excellent package.

I am working with remote data and would like to fit the Benson 2014 retinotopy model to it.

Since the subject FreeSurfer directories are quite large, I would rather not download all of the data. However, I am finding it difficult to determine from the scripts which FreeSurfer files this command actually requires.

I assume from reading the paper that only the 'mri' and 'surf' subdirectories are used, but could you give some direction as to which specific files from those directories are needed?

Many Thanks,

N

noahbenson commented 2 years ago

Hi there. I haven't written neuropythy to be very good at using partial FreeSurfer directories; it's intended as a tool for interpreting and interacting with FreeSurfer's output, not as a tool for individual operations on individual files. In fact, I believe it will fail to recognize the directory as a FreeSurfer directory and throw an error if certain files are missing.

Important files for the 2014 atlas include surf/lh.sphere.reg and surf/rh.sphere.reg; however, they are likely not sufficient, and the code is old enough that I don't know the answer off the top of my head. Figuring out the minimum set of files is possible, but it may require more hours of human time than the hours of computer time needed to download the FreeSurfer directories.

I'm currently traveling for the US holidays, but if you have a compelling reason for needing this answer beyond just saving computer time, let me know and I can give you some guidance on how one might figure it out! -Noah
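If you do want to chase it down yourself, one rough way to do it is to run the atlas against a single complete subject while logging every file that gets opened under that subject's directory. This is only a sketch: the path is a placeholder, the predict_retinotopy call is an assumption that may need adjusting for your neuropythy version, and anything read outside of Python's open() (e.g. memory-mapped files) won't be captured.

```python
# Sketch only: log which files under a FreeSurfer subject directory are opened
# while the Benson 2014 template is applied. Assumes a complete local copy of
# one subject at SUBJECT_DIR; the predict_retinotopy call below is an
# assumption and may differ between neuropythy versions.
import builtins
import os

SUBJECT_DIR = "/path/to/freesurfer/subjects/bert"  # placeholder path

accessed = set()
_real_open = builtins.open

def logging_open(file, *args, **kwargs):
    # Record any path under the subject directory before opening it.
    if not isinstance(file, int):
        path = os.path.abspath(os.fsdecode(os.fspath(file)))
        if path.startswith(os.path.abspath(SUBJECT_DIR)):
            accessed.add(os.path.relpath(path, SUBJECT_DIR))
    return _real_open(file, *args, **kwargs)

builtins.open = logging_open
try:
    import neuropythy as ny
    sub = ny.freesurfer_subject(SUBJECT_DIR)
    # Force the template interpolation so that lazily-loaded files get read.
    (lh_pred, rh_pred) = ny.vision.predict_retinotopy(sub)
finally:
    builtins.open = _real_open

for rel in sorted(accessed):
    print(rel)
```

Any file that never shows up in the log is a candidate for skipping, though files that are only checked for existence (rather than opened) would still be missed by this approach.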

N-HEDGER commented 2 years ago

Thanks Noah.

I can see why this might seem like an odd thing to want to do; I wouldn't normally try to use code in a way it wasn't intended to be used. The context is that I will be working on a dataset with thousands of subjects, so the time it takes to download the files scales up considerably. My idea was to configure datalad to download only the files that are absolutely necessary, while maintaining the FreeSurfer directory structure. So far this seems to produce good results when restricting the download to just the mri and surf subdirectories.
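Roughly, the datalad side of the idea looks something like this (just a sketch using datalad's Python API; the dataset URL and subject IDs are placeholders):

```python
# Sketch: fetch only the mri/ and surf/ subdirectories for each subject from a
# datalad dataset, leaving the rest of the FreeSurfer tree as unfetched annex
# pointers. The dataset URL and subject IDs below are placeholders.
import datalad.api as dl

ds = dl.install(source="https://example.org/freesurfer-dataset", path="fs-data")

for sub in ["sub-001", "sub-002"]:  # placeholder subject IDs
    # 'get' downloads content only for the requested paths; the directory
    # layout of the dataset itself is already present after install.
    ds.get([f"{sub}/mri", f"{sub}/surf"])
```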

Anyway, I just thought I'd try my luck to see if there was an 'off the top of the head' answer. I'll spend a little more time looking into this.

Have a good Christmas Break.

noahbenson commented 2 years ago

Yeah, sorry—the code is just a little too old at this point for me to be able to provide much guidance. I'm in the middle of a big cleanup/rewrite at the moment and will try to document some of this stuff over the next few months as it proceeds. (In the meantime, if you figure out what the minimum requirement is, please post it here, and I'll add it to the wiki!) Cheers!