populse / populse_mia

Multiparametric Image Analysis

[Pipeline Manager] Dragging and dropping a nipype-SPM node into the pipeline manager takes a while, only outside of the BV singularity container. #229

Closed by servoz 11 months ago

servoz commented 2 years ago

Minimal steps to reproduce:

servoz commented 2 years ago

What is curious is that the problem simply seems to be shifted between the BV container and running directly on the host.

servoz commented 2 years ago

Without having investigated in depth, while working on something else, it seems that this slowness is caused by:

proc = Popen(
    cmdline,
    stdout=stdout,
    stderr=stderr,
    shell=True,
    cwd=runtime.cwd,
    env=env,
    close_fds=(not sys.platform.startswith("win")),
)

in the run_command() function of the nipype.utils.subprocess module.

The purpose of this command is to retrieve some information about SPM (make sure the path to SPM is known to matlab, etc.).
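If anyone wants to check this step in isolation, outside Mia, here is a minimal sketch (assuming nipype is installed; with a licensed matlab on the PATH it should trigger the same matlab launch) that times the SPM lookup nipype performs:

    import time

    from nipype.interfaces import spm

    # Ask nipype for the SPM information the same way it does when an SPM
    # interface is instantiated; with a licensed matlab on the PATH this
    # spawns "matlab -nodesktop -nosplash ..." and can take a long time.
    start = time.time()
    info = spm.Info.getinfo()
    print("SPM info:", info)
    print("lookup took %.1f s" % (time.time() - start))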

servoz commented 11 months ago

As mentioned in the previous post, when a nipype-SPM process is instantiated (which is what happens when we drag and drop a brick into the pipeline manager), nipype automatically tries to find certain information, such as the version of SPM.

Since we don't set environment variables for executables when using Mia (we want this to be handled mainly through the Mia preferences), nipype will try to make a system call with the matlab command (matlab -nodesktop -nosplash ...).

If the environment variables allow matlab (with a licence) to be launched, nipype will do so. From then on it remembers the SPM version and does not repeat the system call to determine it.

This is why we see this difference.
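The "remembering" mentioned above can be seen with a quick sketch (still assuming nipype is installed, and that nipype keeps the result of the first lookup as observed here): the first lookup may spawn matlab, the second returns almost immediately.

    import time

    from nipype.interfaces import spm

    def timed_lookup(label):
        # Time nipype's SPM version lookup.
        start = time.time()
        spm.Info.getinfo()
        print("%s: %.1f s" % (label, time.time() - start))

    timed_lookup("first lookup")   # slow if matlab has to be started
    timed_lookup("second lookup")  # fast: the result is already remembered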

The workstations where I observed this difference in duration have the directory containing matlab (with a licence) in their PATH, so the system call takes place when the SPM brick is dragged and dropped into the pipeline manager. However, as the result is known to nipype after the instantiation of the process, the command is not executed again at initialisation time.
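A quick way to check whether a workstation is in this situation (a licensed matlab reachable through the PATH, so the system call will fire on drag and drop) is simply to look the command up on the PATH, for example:

    import shutil

    # If this returns a path, nipype can launch matlab when an SPM interface
    # is instantiated, and the drag and drop pays for the version lookup;
    # if it returns None, that system call is skipped.
    print("matlab found on PATH:", shutil.which("matlab"))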

In BV, on the other hand, we generally don't use a matlab licence but rather MCR with SPM standalone. This means that when the process is instantiated, nipype does not launch the system call to determine the SPM version, so that step seems to take less time than in the case described above, directly on the host. At initialisation time, however, the step seems to take longer, because the SPM standalone / MCR parameters are known by then and the system call is made.
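For reference, the standalone / MCR situation corresponds to configuring nipype roughly as below (the paths are only examples and have to be adapted to the local installation):

    from nipype.interfaces import spm

    # Point nipype at SPM standalone run through the MATLAB Compiler Runtime
    # instead of a licensed matlab; the paths below are hypothetical.
    spm.SPMCommand.set_mlab_paths(
        matlab_cmd="/opt/spm12/run_spm12.sh /opt/mcr/v95 script",
        use_mcr=True,
    )
    print(spm.SPMCommand().version)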

So this ticket doesn't describe a real issue but rather a way of running nipype that we could do without, since the Mia / capsul configuration already knows the SPM version (especially as automatically determining the SPM version with the matlab licence makes no sense if we use SPM standalone later!).

I think this ticket can be closed now that we know why this slowness appears at different times on the host and in the BV singularity container.

manuegrx commented 4 months ago

Just to add my personal experience, in case anyone encounters the same trouble: I use POPULSE MIA outside the BV singularity container with SPM standalone (and with no matlab in my PATH), and in this configuration it works fine (the SPM bricks appear almost instantly in the editor).

I then installed a licensed Matlab on my computer (not for use in MIA) and added "matlab" to my PATH. In this case, in MIA, SPM bricks take a very long time to appear in the editor, even though I keep SPM standalone in the configuration. To go back to a "normal" situation I had to remove matlab from my PATH.
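If removing matlab from the login PATH is too intrusive, one possible workaround (only a sketch, the matlab directory below is an example) is to strip it from the environment of the shell or session used to launch MIA:

    import os

    # Hypothetical example: drop the matlab directory from PATH for this
    # process (and any child process, e.g. a MIA launched from here) so that
    # nipype cannot find a licensed matlab and skips the slow version lookup.
    MATLAB_DIR = "/usr/local/MATLAB/R2023b/bin"  # adapt to your installation
    entries = os.environ.get("PATH", "").split(os.pathsep)
    os.environ["PATH"] = os.pathsep.join(p for p in entries if p != MATLAB_DIR)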