Neurita / pypes

Reusable neuroimaging pipelines using nipype
http://neuro-pypes.readthedocs.io

NameError: ("Input formula couldn't be processed: ...) #39

Open mrahimpour opened 5 years ago

mrahimpour commented 5 years ago

Hi,

Thank you for sharing your code. I am trying to use your package for PET-MRI co-registration. Using the "spm_anat_preproc" and "spm_mrpet_preproc" functions, I am running the PETPVC pipeline on MR/PET images, but I am getting the following error.

ERROR:nipype.workflow:

```
Traceback (most recent call last):
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/plugins/multiproc.py", line 70, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
    result = self._run_interface(execute=True)
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
    return self._run_command(execute)
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/base/core.py", line 521, in run
    runtime = self._run_interface(runtime)
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/utility/wrappers.py", line 144, in _run_interface
    out = function_handle(*args)
  File "/home/masoomeh/PET/pypes/neuro_pypes/interfaces/nilearn/image.py", line 34, in wrapped
    res_img = f(args, **kwargs)
  File "<string>", line 26, in math_img
  File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nilearn-0.4.2-py3.5.egg/nilearn/image/image.py", line 793, in math_img
    result = eval(formula, data_dict)
  File "<string>", line 1, in <module>
NameError: ("Input formula couldn't be processed, you provided 'img / nan',", "name 'nan' is not defined")
```

I checked the code and could not find where to modify the formula and define the "val". I would be grateful if you could give me some clue to solving this error!
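For reference, the same error can be reproduced outside the pipeline with nilearn alone: math_img evaluates the formula string, so if the divisor has become NaN the formatted formula 'img / nan' refers to a name that does not exist. This is only a minimal sketch; the names val and pet_img are placeholders, not pypes internals.

```python
# Minimal reproduction of the error outside pypes: if the scaling value has
# become NaN, formatting it into the formula gives 'img / nan', and nilearn's
# math_img cannot evaluate the name 'nan'. 'val' is only a placeholder here.
import numpy as np
import nibabel as nib
from nilearn.image import math_img

pet_img = nib.Nifti1Image(np.random.rand(4, 4, 4), np.eye(4))

val = float('nan')                # e.g. a mean computed over an empty mask
formula = 'img / {}'.format(val)  # -> 'img / nan'
math_img(formula, img=pet_img)    # raises the NameError shown above
```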

alexsavio commented 5 years ago

Hi

Thanks for trying to use it.

Are you directly importing those functions and passing Nipype workflows to them?

Can I have a look at your code?

The val value is computed from your input image. If you're getting NaNs, it is either because a previous pre-processing step produced strange values, or because the function in that step has a bug or should be more robust.

Cheers Alex


mrahimpour commented 5 years ago

Hi alexsavio,

Thanks for your reply. I am trying to use the functions as you explained in the tutorial (https://neuro-pypes.readthedocs.io/en/latest/). The following is the code I am using:

```python
import os
import pdb
import warnings

from hansel import Crumb
from neuro_pypes.anat import attach_spm_anat_preprocessing
from neuro_pypes.pet import attach_spm_mrpet_preprocessing
from neuro_pypes.io import build_crumb_workflow
from neuro_pypes.config import update_config
from neuro_pypes.run import run_debug

"""Matlab-SPM path"""
import nipype.interfaces.matlab as mlab
mlab.MatlabCommand.set_default_paths('/usr/local/MATLAB/R2018a/toolbox/spm12')

warnings.filterwarnings("always")

cwd = os.getcwd()
#print(cwd)

base_dir = "/home/masoomeh/PET-Quantification/Data"
data_path = os.path.join(base_dir, "{subject_id}", "{modality}", "{image}")

data_crumb = Crumb(data_path, ignore_list=[".*"])
print(data_crumb)

subj_ids = data_crumb['subject_id']
print(subj_ids)

attach_functions = {"spm_anat_preproc": attach_spm_anat_preprocessing,
                    "spm_mrpet_preproc": attach_spm_mrpet_preprocessing}

crumb_arguments = {'anat': [('modality', 'anat_1'), ('image', 'MPRAGE.nii.gz')],
                   'pet':  [('modality', 'pet_1'), ('image', 'FET_DYN.nii.gz')]}

output_dir = os.path.join(os.path.dirname(base_dir), "out")
cache_dir = os.path.join(os.path.dirname(base_dir), "wd")

pdb.set_trace()

wf = build_crumb_workflow(attach_functions,
                          data_crumb=data_crumb,
                          in_out_kwargs=crumb_arguments,
                          output_dir=output_dir,
                          cache_dir=cache_dir)

pdb.set_trace()

run_debug(wf, plugin="Linear", n_cpus=2)
```

alexsavio commented 5 years ago

Hi!

Your code looks fine, thanks. The issue is probably in the petpvc_mask or intensity_norm functions in: https://github.com/Neurita/pypes/blob/master/neuro_pypes/pet/utils.py

Pypes calculates the mask and the intensity normalization based on the tissue segmentation provided by SPM12. If you can identify which subject is throwing this error, go to the working directory, find that subject, and look inside its 'tissues' node folder. The results there are probably bad. If you're getting NaN, it may be because you have an empty tissue mask; I am not sure, many things can go wrong there. Please have a look at the intermediate results in the wd folder.
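If it helps, here is a minimal sketch for checking whether one of those tissue maps is empty or contains NaNs, which would explain a NaN normalization value. The file path is only a placeholder for the actual node output in your wd folder.

```python
# A minimal sketch for inspecting a tissue map produced in the working
# directory; the path below is only a placeholder for the real node output.
import numpy as np
import nibabel as nib

tissue_path = 'wd/main_workflow/spm_mrpet_preproc/tissues/c1_tissue_map.nii'  # placeholder
data = nib.load(tissue_path).get_fdata()

print('non-zero voxels:', np.count_nonzero(data))       # 0 would mean an empty mask
print('contains NaNs:  ', bool(np.isnan(data).any()))
print('value range:    ', np.nanmin(data), np.nanmax(data))
```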

mrahimpour commented 5 years ago

You are totally right! Checking the working directory for this subject, I have no output for tissues, coreg_pet, etc. But it is weird, because before running the code for PET/MR preprocessing I ran "spm_anat_preproc" for MR-only tissue segmentation and it worked very well. In the current run I also have correct outputs for tissue segmentation in "wd/main_workflow/spm_anat_preproc/subject_id/new_segment". I am going through the code to find out where the error is happening; I hope I can find something. Could you please let me know whether it is possible to implement the PET/MR coregistration with your pipeline without the PETPVC step? Many thanks for your time.

alexsavio commented 5 years ago

Hi.

I am sorry, but I don't have a PET/MR workflow without PETPVC. It is actually a good feature request; it makes sense. Although this would be simple to implement directly with Nipype, I understand some features here are not in Nipype. It shouldn't be complicated to implement. I will give it a try soon, if you don't want to step in ;)

Please let me know if you can find the issue. Have you seen any error in the output log?

Thanks!

mrahimpour commented 5 years ago

Hi!

Running the code to find out what causes the "Input formula couldn't be processed: ..." error, I got another error: raise RuntimeError("Graph changed during iteration").

This happens once the nipype workflow has started to execute.

```
180825-20:40:01,680 nipype.workflow INFO:
    Generated workflow graph: /home/masoomeh/PET/wd/main_workflow/main_workflow_colored_workflow.svg (graph2use=colored, simple_form=True).
INFO:nipype.workflow:Generated workflow graph: /home/masoomeh/PET/wd/main_workflow/main_workflow_colored_workflow.svg (graph2use=colored, simple_form=True).
(<class 'RuntimeError'>, RuntimeError('Graph changed during iteration',), <traceback object at 0x7f9c33ffac88>)
/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/networkx/algorithms/dag.py(189)topological_sort()
-> raise RuntimeError("Graph changed during iteration")
```

I'll be grateful for any debugging advice or other insight on this.

alexsavio commented 5 years ago

Hi!

I haven't had time to test the newest versions of nipype.

However, why do you think the two errors are related? It looks more like the workflow SVG generation throws the error. Instead of using the run_debug function, you can take the workflow and call wf.run(), just to test whether you still get the error.
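For example (a minimal sketch, assuming wf is the workflow returned by build_crumb_workflow, as in your script above):

```python
# Minimal sketch: run the workflow object directly instead of through
# run_debug; 'wf' is the workflow built by build_crumb_workflow above.
wf.run(plugin='Linear')

# or, to keep multiprocessing:
# wf.run(plugin='MultiProc', plugin_args={'n_procs': 2})
```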

Have you checked whether your hansel.Crumb paths are correct? You can try it with the crumb CLI:

crumb ls <crumb_path>

I hope this helps.

Cheers Alex


mrahimpour commented 5 years ago

Hi,

I did not mean that these errors are related; I just asked for some clue. Thanks to your help, by using run_wf() I no longer get the "Graph changed during iteration" error!

In order to debug the former error (no output for the tissues node in the PETPVC workflow and the following nodes), I have checked all the nodes and connections but still found nothing. Can I ask which part I should focus on more, build_crumb_workflow or run_wf? And is it something related to the nipype pipeline or to your pipeline?

I also checked the hansel.Crumb paths with crumb ls, and it returned the path to my data; is that right?

Sorry for asking lots of questions, I really hope to run your pipeline successfully!

alexsavio commented 5 years ago

Hi,

crumb ls <data_crumb> checks whether the crumb path to your data is correct and whether it is fetching all your files. In your case it would be: crumb ls "<base_dir>/{subject_id}/{modality}/{image}"
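The same check can be done from Python with hansel; this is just a sketch using the paths from your script, so adjust the base directory to your own data.

```python
# A sketch of the crumb check from Python, using the path from the script
# earlier in this thread; adjust the base directory to your data location.
from hansel import Crumb

data_crumb = Crumb("/home/masoomeh/PET-Quantification/Data/{subject_id}/{modality}/{image}",
                   ignore_list=[".*"])

# Subject IDs the crumb actually finds on disk.
print(data_crumb['subject_id'])

# Every concrete file path the crumb matches; if this is empty or
# incomplete, the crumb pattern does not fit your folder layout.
for path in data_crumb.ls():
    print(path)
```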

If it is fetching your data correctly, you need to check which node was the last to run. For that, have a look at the .svg file with the workflow graph, check the order of the blocks, and go through your wd folder looking for reports.

Anyway, do you have any crash files?
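Something like this sketch can help locate crash files and node reports; the wd path is just the cache_dir from your script, so adjust it as needed.

```python
# A sketch for finding nipype crash files and node reports; adjust the paths.
# Crash files (crash-*.pklz) usually land in the directory the script was run
# from, while each node writes a _report/report.rst inside the wd folder.
import glob
import os

wd = '/home/masoomeh/PET/wd/main_workflow'   # the cache_dir from the script

crashes = (glob.glob('crash-*.pklz')
           + glob.glob(os.path.join(wd, '**', 'crash-*.pklz'), recursive=True))
reports = glob.glob(os.path.join(wd, '**', '_report', 'report.rst'), recursive=True)

print('crash files:', crashes)
print('node reports:', len(reports))
```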

mrahimpour commented 5 years ago

Hi,

I tried to simplify the workflow by removing the rbvpvc node and some others; it was easier to check the details in a simplified pipeline. Checking the order of the blocks, I found that the main problem is in the "coreg_pet" node. I get completely invalid outputs from this node, which causes errors in the following parts. I also checked SPM coregistration in Matlab and again got invalid outputs (it looks like the algorithm does not converge)! I think there is something wrong with the clinical PET/MR data.

alexsavio commented 5 years ago

Hi,

I am glad you found the error. If I were you, I would have a look at the files and try to run SPM separately on one or two subjects. If you can't find the issue, maybe I can help you; just let me know. If you paste your code with the simplified pipeline here, I could add it to the module. I can put you as co-author, as you wish.

Good luck!

mrahimpour commented 5 years ago

Sure, it would be nice if I could add something useful to your code. I am double-checking the simplified code to make sure I made no mistakes. If I can have your email address, I will share the simplified code and also all the other issues I encountered while running your code.

I also found the problem with the PET/MR data: their origins had not been adjusted correctly! I am looking for an automatic way to do it.
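One option I am considering is to reset the origin to the volume centre with nibabel before coregistration, so SPM starts from a reasonable initial alignment. This is only a sketch, not part of pypes, and the file names are placeholders.

```python
# A minimal sketch, not part of pypes: re-centre the NIfTI origin at the
# volume centre with nibabel so SPM's coregistration starts from a more
# sensible initial alignment. File names below are only placeholders.
import numpy as np
import nibabel as nib

def set_origin_to_center(in_file, out_file):
    img = nib.load(in_file)
    affine = img.affine.copy()
    # Voxel index of the geometric centre of the volume.
    center_voxel = (np.array(img.shape[:3]) - 1) / 2.0
    # Shift the translation so the centre voxel maps to world (0, 0, 0).
    affine[:3, 3] = -affine[:3, :3].dot(center_voxel)
    nib.save(nib.Nifti1Image(img.get_fdata(), affine, img.header), out_file)

set_origin_to_center('MPRAGE.nii.gz', 'MPRAGE_centered.nii.gz')
set_origin_to_center('FET_DYN.nii.gz', 'FET_DYN_centered.nii.gz')
```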

Thanks!

alexsavio commented 5 years ago

If you have time, please send the code to alexsavio at gmail .com. Thanks!

alexsavio commented 5 years ago

Hello! How are you doing? Need any help?