canlab / MediationToolbox

Single-level and multi-level mediation analyses for any kind of data, with bootstrap-based significance testing. Neuroimaging-oriented functions allow for mediation effect parametric mapping (mapping of mediation effects across the brain) and multivariate mediation.
https://canlabweb.colorado.edu/wiki/doku.php/help/mediation/m3_mediation_fmri_toolbox

Possibility to parallelise? #13

Closed ethanknights closed 3 years ago

ethanknights commented 3 years ago

Great toolbox - thanks for making it available. Do you think there's a simple way to parallelise mediation_brain.m on an HPC cluster? (It's taking an, understandably, long time to run with n=650, bootstrapping, and 192 T1w slices.)

I've considered calling mediation_brain in a parfor loop (or even just a for loop that submits single jobs) with a different 'startslice' argument, e.g.

parfor slice = 1:nSlices
    % sliced cell output, so each worker's results are kept
    results{slice} = mediation_brain(X, Y, M, 'names', names, 'mask', mask, 'rank', 'startslice', slice);
end

And then, in mediation_brain.m, pin the statistics loop to the requested slice (rather than hardcoding nslices to 1, which would leave the startslice:nslices loop empty for every slice after the first):

    %nslices = max(z); %ORIGINAL
    nslices = startslice; % process only the requested slice

    str = display_string('Statistics.');
    spm_progress_bar('init');

    for i = startslice:nslices
        % calculates and writes images for this slice
        process_slice();

        save med_results med_results
        spm_progress_bar('set', i/nslices);
    end

I think that would update the relevant voxel rows in med_results.mat as each slice finishes.

But I'd appreciate any thoughts in case I'm being naive and this isn't a good idea (e.g. updating the sets of .img / .hdr files in parallel).
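If the concern is parallel writes to a shared med_results.mat (or the .img / .hdr pair), one way to sidestep it is to have each job save its own per-slice file and merge afterwards. A rough sketch only: it assumes the hypothetical 'startslice' option above, illustrative file names, and a small parsave helper (save cannot be called directly inside a parfor body because of MATLAB's transparency rules):

    parfor slice = 1:nSlices
        r = mediation_brain(X, Y, M, 'names', names, 'mask', mask, ...
            'rank', 'startslice', slice);
        % each worker writes its own file; no shared-file collisions
        parsave(sprintf('med_results_slice%03d.mat', slice), r);
    end

    % Merge the per-slice files once all jobs have finished
    all_results = cell(nSlices, 1);
    for slice = 1:nSlices
        tmp = load(sprintf('med_results_slice%03d.mat', slice));
        all_results{slice} = tmp.r;
    end

    function parsave(fname, r)
    % Wrapper so save() can be used from inside parfor
    save(fname, 'r');
    end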

torwager commented 3 years ago

Hi Ethan,

Yes, it should be fairly straightforward to parallelize using parfor at the slice level, or at the voxel level within slices. It would need a bit of love, I'm sure. I started doing this with some functions and considered it for mediation, but was a bit worried about long-term stability; at this point, it should be fine!
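For the voxel-level route, a bare-bones illustration (this assumes the toolbox's single-voxel mediation function and a layout with voxels as columns of Ymat and Mmat; the variable names here are illustrative, not the toolbox's actual internals):

    nvox = size(Ymat, 2);
    paths = cell(nvox, 1);
    stats = cell(nvox, 1);
    parfor v = 1:nvox
        % bootstrap one voxel's mediation model per iteration
        [paths{v}, stats{v}] = mediation(X, Ymat(:, v), Mmat(:, v), 'boot');
    end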

Best wishes, Tor


Tor Wager, Diana L. Taylor Distinguished Professor, Presidential Cluster in Neuroscience and Department of Psychological and Brain Sciences, Dartmouth College


ethanknights commented 3 years ago

Hi Tor,

After poking around more today, I believe I have a better understanding of how to approach this: