nipy / nipype

Workflows and interfaces for neuroimaging packages
https://nipype.readthedocs.org/en/latest/

no crashdumps - but instead nondescript AttributeErrors all over the place on latest HEAD #2269

Closed TheChymera closed 6 years ago

TheChymera commented 6 years ago

My main preprocessing workflow (rather comprehensive, difficult to reduce to a minimal example) worked just fine with the latest HEAD as of CET 19:10:11 21.08.2017. The current HEAD ( 557aad3 ), however, leads to some strange behaviour:

         [Job finished] jobname: s_bids_filename.a0.b06 jobid: 50                                                
171031-20:14:46,478 workflow INFO:                      
         Currently running 4 tasks, and 39 jobs ready. Free memory (GB): 54.50/55.30, Free processors: 6/10      
171031-20:14:46,494 workflow INFO:                      
         Executing node composite_work.s_biascorrect in dir: /home/chymera/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_biascorrect                                          
171031-20:14:46,506 workflow INFO:                      
         [Job finished] jobname: events_file.a0.b06 jobid: 57                                                    
171031-20:14:46,515 workflow INFO:                      
         Running node "s_biascorrect" ("nipype.interfaces.ants.segmentation.N4BiasFieldCorrection"), a CommandLine Interface with command:                                                                                        
N4BiasFieldCorrection --bspline-fitting [ 10, 4 ] -d 3 --input-image /mnt/data/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_bru2nii/6.nii --convergence [ 150x100x50x30, 1e-16 ] --output 6_corrected.nii --shrink-factor 2.         
171031-20:14:46,520 workflow INFO:                      
         [Job finished] jobname: dummy_scans.a0.b06 jobid: 61                                                    
171031-20:14:46,520 workflow INFO:                      
         Executing node composite_work.f_biascorrect in dir: /home/chymera/ni_data/test/preprocessing/composite_work/_subject_session_5699.ofMcF1/_scan_type_acq-EPI_CBV_trial-CogB/f_biascorrect                                 
171031-20:14:46,538 workflow INFO:                      
         Running node "f_biascorrect" ("nipype.interfaces.ants.segmentation.N4BiasFieldCorrection"), a CommandLine Interface with command:                                                                                        
N4BiasFieldCorrection --bspline-fitting [ 10, 4 ] -d 3 --input-image /mnt/data/ni_data/test/preprocessing/composite_work/_subject_session_5699.ofMcF1/_scan_type_acq-EPI_CBV_trial-CogB/temporal_mean/10_st_mean.nii.gz --convergence [ 150x100x50x30, 1e-11 ] --output 10_st_mean_corrected.nii.gz --shrink-factor 2.                             
171031-20:14:46,542 workflow INFO:                      
         [Job finished] jobname: bids_filename.a0.b10 jobid: 65                                                  
171031-20:14:46,550 workflow INFO:                      
         [Job finished] jobname: f_bru2nii.a0.b10 jobid: 66                                                      
171031-20:14:48,565 workflow INFO:                      
         Currently running 6 tasks, and 35 jobs ready. Free memory (GB): 54.10/55.30, Free processors: 4/10      
171031-20:14:48,569 workflow INFO:                      
         [Job finished] jobname: slicetimer.a0.b06 jobid: 62                                                     
171031-20:14:48,587 workflow INFO:                      
         [Job finished] jobname: get_f_scan.aI.a0.b07 jobid: 67                                                  
171031-20:14:48,611 workflow INFO:                      
         [Job finished] jobname: bids_stim_filename.a0.b10 jobid: 70                                             
171031-20:14:48,635 workflow INFO:                      
         [Job finished] jobname: bids_stim_filename.a0.b15 jobid: 72                                             
171031-20:14:50,643 workflow INFO:                      
         Currently running 6 tasks, and 37 jobs ready. Free memory (GB): 54.10/55.30, Free processors: 4/10      
171031-20:14:50,653 workflow INFO:                      
         Executing node composite_work.temporal_mean in dir: /home/chymera/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-EPI_CBV_trial-CogB/temporal_mean                                 
Traceback (most recent call last):                      
  File "<string>", line 1, in <module>                  
  File "development.py", line 93, in bids_test          
    verbose=False,                                      
  File "preprocessing.py", line 446, in bruker          
    workflow.run(plugin="MultiProc",  plugin_args={'n_procs' : n_procs})                                         
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 591, in run                
    runner.run(execgraph, updatehash=updatehash, config=self.config)                                             
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 182, in run                    
    self._send_procs_to_workers(updatehash=updatehash, graph=graph)                                              
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 260, in _send_procs_to_workers                                                                                                             
    if self._local_hash_check(jobid, graph):            
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 333, in _local_hash_check      
    hash_exists, _, _, _ = self.procs[jobid].hash_exists()                                                       
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 278, in hash_exists            
    hashed_inputs, hashvalue = self._get_hashval()      
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 442, in _get_hashval           
    self._get_inputs()                                  
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 497, in _get_inputs            
    output_value = results.outputs.get()[output_name]   
AttributeError: 'NoneType' object has no attribute 'get'                                                         
171031-20:14:50,661 workflow INFO:                      
         Running node "temporal_mean" ("nipype.interfaces.fsl.maths.MeanImage"), a CommandLine Interface with command:                                                                                                            
fslmaths /mnt/data/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-EPI_CBV_trial-CogB/slicetimer/8_st.nii.gz -Tmean /mnt/data/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-EPI_CBV_trial-CogB/temporal_mean/8_st_mean.nii.gz. 

I was able to make the workflow proceed somewhat further by setting ignore_exception on a number of nodes and by explicitly setting stop_on_first_crash and stop_on_first_rerun (see the configuration notes further down).

Still, the workflow just won't run to completion. I can keep re-starting it after each AttributeError and it seems to add more and more nodes, but only at a snail's pace. It is really frustrating that the traceback does not say which node specifically made the workflow fail (maybe it really is just one node?).

I also notice that the child processes keep running in the background until they finish. I don't know if this is new behaviour, but I have the feeling that nipype used to wait until all its children were done before exiting. I have tried to dig through the commits to find the culprit, but there have been just too many over the last two months.

My guess is that someone reworked the job management in a way that makes it less failsafe. Maybe that person is reading this?

Platform details:

{'nibabel_version': '2.1.0', 'sys_executable': '/usr/lib/python-exec/python2.7/python', 'networkx_version': '1.11', 'numpy_version': '1.13.3', 'sys_platform': 'linux2', 'sys_version': '2.7.14 (default, Oct 10 2017, 17:25:48) \n[GCC 6.4.0]', 'commit_source': u'installation', 'commit_hash': u'557aad337', 'pkg_path': '/usr/lib64/python2.7/site-packages/nipype', 'nipype_version': u'1.0.0-dev', 'traits_version': '4.6.0', 'scipy_version': '0.19.1'} 1.0.0-dev

Execution environment

Doeme commented 6 years ago

When we took the naïve approach of looking at every commit whose diff contains ignore_exception, 10d0f39e9728236ee35069d732d13b8243dd5a2c stood out the most in the relevant timespan.

oesteban commented 6 years ago

Could you give it a try with the Linear plugin? I'll double check how we recover from errors in MultiProc.
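
For reference, switching the execution plugin is a one-line change (a minimal sketch, assuming the same workflow object and n_procs variable as in the tracebacks above):

# Same workflow, executed serially in the main process; errors then surface
# as ordinary tracebacks/crashfiles instead of going through the scheduler.
workflow.run(plugin="Linear")

# For comparison, the parallel run that triggers the AttributeError above:
# workflow.run(plugin="MultiProc", plugin_args={'n_procs': n_procs})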

TheChymera commented 6 years ago

I will. One last thing before I move on to detailing that route of debugging: I have tried re-starting the workflow over and over again, and the number of nodes seems to have hit a ceiling at 369 (not that the value means anything, but there is a ceiling and the increments don't seem to be “asymptotical”):

198 → 227 → 266 → 279 → 354 → 368 → 369

Now I get this reproducible output:

/usr/lib64/python2.7/site-packages/nipype/interfaces/base.py:450: UserWarning: Input terminal_output in interface ApplyTransforms is deprecated. Will be removed or raise an error as of release 1.0.0                            
  warn(msg)                                             
171031-23:33:14,360 workflow INFO:                      
         Generated workflow graph: /home/chymera/ni_data/test/preprocessing/composite_work/graph.dot.png (graph2use=hierarchical, simple_form=True).                                                                              
171031-23:33:14,937 workflow INFO:                      
         Workflow composite_work settings: ['check', 'execution', 'logging']                                     
171031-23:33:15,289 workflow INFO:                      
         Running in parallel.                           
171031-23:33:15,301 workflow INFO:                      
         Currently running 0 tasks, and 32 jobs ready. Free memory (GB): 55.30/55.30, Free processors: 10/10     
171031-23:33:15,319 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b08 jobid: 0                                                   
171031-23:33:15,334 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b05 jobid: 1                                                   
171031-23:33:15,348 workflow INFO:                      
         [Job finished] jobname: get_f_scan.aI.a0.b09 jobid: 2                                                   
171031-23:33:15,362 workflow INFO:                      
         [Job finished] jobname: get_f_scan.aI.a0.b07 jobid: 5                                                   
171031-23:33:15,378 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b00 jobid: 8                                                   
171031-23:33:15,401 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b07 jobid: 10                                                  
171031-23:33:15,417 workflow INFO:                      
         [Job finished] jobname: get_f_scan.aI.a0.b15 jobid: 14                                                  
171031-23:33:15,431 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b10 jobid: 16                                                  
171031-23:33:15,445 workflow INFO:                      
         [Job finished] jobname: get_s_scan.aI.a0.b11 jobid: 21                                                  
171031-23:33:15,460 workflow INFO:                      
         [Job finished] jobname: get_f_scan.aI.a0.b10 jobid: 23                                                  
171031-23:33:17,474 workflow INFO:                      
         Currently running 0 tasks, and 46 jobs ready. Free memory (GB): 55.30/55.30, Free processors: 10/10     
171031-23:33:17,504 workflow INFO:                      
         [Job finished] jobname: s_bids_filename.a0.b08 jobid: 3                                                 
171031-23:33:17,529 workflow INFO:                      
         [Job finished] jobname: s_bids_filename.a0.b05 jobid: 4                                                 
Traceback (most recent call last):                      
  File "<string>", line 1, in <module>                  
  File "development.py", line 93, in bids_test          
    verbose=False,                                      
  File "preprocessing.py", line 446, in bruker          
    workflow.run(plugin="MultiProc",  plugin_args={'n_procs' : n_procs})                                         
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 591, in run                
    runner.run(execgraph, updatehash=updatehash, config=self.config)                                             
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 182, in run                    
    self._send_procs_to_workers(updatehash=updatehash, graph=graph)                                              
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 260, in _send_procs_to_workers                                                                                                             
    if self._local_hash_check(jobid, graph):            
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 333, in _local_hash_check      
    hash_exists, _, _, _ = self.procs[jobid].hash_exists()                                                       
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 278, in hash_exists            
    hashed_inputs, hashvalue = self._get_hashval()      
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 442, in _get_hashval           
    self._get_inputs()                                  
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 497, in _get_inputs            
    output_value = results.outputs.get()[output_name]   
AttributeError: 'NoneType' object has no attribute 'get' 

Interestingly, this does not seem to indicate that any one node constitutes a single point of failure, but rather that the workflow engine is somehow failing...

oesteban commented 6 years ago

Probably related: https://neurostars.org/t/fmriprep-error-during-lta-concat-and-freesurfer-brainmask-problem/942?u=oesteban

TheChymera commented 6 years ago

Are you sure? That seems to be a TypeError thrown by the FreeSurfer interface, whereas this is an AttributeError thrown by the nodes engine.

oesteban commented 6 years ago

In both cases the outputs object is None, isn't it?

oesteban commented 6 years ago

Question: are you reusing a working directory?

TheChymera commented 6 years ago

Generally I am not reusing a working directory. For the above debugging effort, where I re-ran the workflow until I got a reproducible error, I did.

Also, executing the workflow with the Linear plugin has managed to finish at least one of the infosource iterations. I assume the rest will also be successful, so most probably this is in fact an issue with the MultiProc plugin.

oesteban commented 6 years ago

Could you run nipype with all logging set to DEBUG? I will need some more information to investigate this.

TheChymera commented 6 years ago

@oesteban should I run this with the linear or multiproc plugin?

oesteban commented 6 years ago

MultiProc, let's see if we can get to the bottom of it.

effigies commented 6 years ago

The issue seems to be here:

https://github.com/nipy/nipype/blob/557aad3370f820e3765454ea6774f95826ed8fc1/nipype/interfaces/base.py#L1087-L1089

If _run_interface() raises an exception, the outputs variable is never set.
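
A simplified, self-contained sketch of the pattern being described (not the actual nipype code; InterfaceResult and the function names here are illustrative stand-ins): if the exception is swallowed, outputs stays None, and any downstream code that later calls results.outputs.get() hits exactly the AttributeError from the tracebacks above.

from collections import namedtuple

# Hypothetical stand-ins for nipype's internals; for illustration only.
InterfaceResult = namedtuple("InterfaceResult", ["outputs"])

def run_node_sketch(run_interface, aggregate_outputs, ignore_exception=True):
    # If run_interface() raises and the exception is swallowed,
    # 'outputs' is never assigned and stays None.
    outputs = None
    try:
        runtime = run_interface()              # may raise
        outputs = aggregate_outputs(runtime)   # skipped when an error occurred
    except Exception:
        if not ignore_exception:
            raise
    return InterfaceResult(outputs=outputs)

results = run_node_sketch(lambda: 1 / 0, lambda rt: {"out_file": "x.nii"})
print(results.outputs)     # None
# results.outputs.get()    # -> AttributeError: 'NoneType' object has no attribute 'get'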

oesteban commented 6 years ago

Yes, that could be the origin. But here the error happens because the inputs pkl file was found (the interface is potentially cached) and there is no outputs object in it (self._local_hash_check(jobid, graph)).

We may be caching errored interfaces? (cc @satra)

satra commented 6 years ago

@TheChymera, @oesteban - we should definitely add a simple test for this before we fix it.
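
A minimal sketch of what such a regression test could look like, using only nipype's public Workflow/Node API (node names, functions, and the base_dir are illustrative, and the expected behaviour is stated only in comments):

from nipype.pipeline import engine as pe
from nipype.interfaces.utility import Function

def failing_func():
    # Simulate a node whose interface raises during execution.
    raise RuntimeError("simulated node failure")

def downstream_func(in_value):
    return in_value

wf = pe.Workflow(name="repro", base_dir="/tmp/repro_wd")  # throwaway working dir
bad = pe.Node(Function(input_names=[], output_names=["out"],
                       function=failing_func), name="bad")
good = pe.Node(Function(input_names=["in_value"], output_names=["out"],
                        function=downstream_func), name="good")
wf.connect(bad, "out", good, "in_value")

# Expectation: this should fail with a crashfile pointing at node "bad",
# not with an AttributeError raised from _local_hash_check in the MultiProc
# plugin; re-running on the same (now stale) base_dir should behave the same.
wf.run(plugin="MultiProc", plugin_args={"n_procs": 2})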

oesteban commented 6 years ago

Totally agreed.

My suspicion is that the original working directory had some stale cache files. Besides enabling DEBUG logging, could you also run in a clean (new) directory, @TheChymera, without overwriting the old results that allow you to replicate the error?

TheChymera commented 6 years ago

So I cleaned up everything and ran the processing on a reduced dataset, with the same HEAD as before, using the following parameters for the workflow execution:

(In addition to not re-running the workflow in order to get a more complete result, I am also not enabling the ignore_exception attributes that were unnecessary for my workflow to execute with the late-August HEAD; I am also not setting stop_on_first_crash and stop_on_first_rerun explicitly.)

        workflow.config = {
                "execution": {
                        'crashdump_dir': path.join(measurements_base,'preprocessing/crashdump'),
                #       'stop_on_first_crash':'false',
                #       'stop_on_first_rerun':'false',
                        },
                "logging": {
                        'workflow_level':'DEBUG',
                        'utils_level':'DEBUG',
                        'interface_level':'DEBUG',
                        'filemanip_level':'DEBUG',
                        'log_to_file':'true',
                        }
                }
        workflow.write_graph(dotfilename=path.join(workflow.base_dir,workdir_name,"graph.dot"), graph2use="hierarchical", format="png")

        #workflow.run(plugin="MultiProc", plugin_args={'n_procs' : n_procs})
        workflow.run(plugin="MultiProc",)
        if not keep_work:

I was sadly unable to locate the logging file I believe I was supposed to get. In the terminal I get the following output:

171102-10:00:31,434 workflow INFO:
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").
171102-10:00:31,440 workflow INFO:
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").
171102-10:00:31,442 workflow INFO:
         Running node "get_s_scan" ("nipype.interfaces.utility.wrappers.Function").
171102-10:00:31,442 workflow INFO:
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").
171102-10:00:33,304 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b1 jobid: 0
171102-10:00:33,309 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b4 jobid: 1
171102-10:00:33,312 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b1 jobid: 2
171102-10:00:33,314 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b0 jobid: 3
171102-10:00:33,315 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b6 jobid: 6
171102-10:00:33,317 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b8 jobid: 7
171102-10:00:33,318 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b4 jobid: 11
171102-10:00:33,319 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b3 jobid: 12
171102-10:00:33,321 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a1.b0 jobid: 14
171102-10:00:33,322 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a1.b8 jobid: 20
171102-10:00:33,325 workflow INFO:
         Currently running 0 tasks, and 43 jobs ready. Free memory (GB): 55.30/55.30, Free processors: 10/10
171102-10:00:33,350 workflow INFO:
         Executing node composite_work.s_bids_filename in dir: /home/chymera/ni_data/test/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_bids_filename
171102-10:00:33,370 workflow INFO:
         Executing node composite_work.s_bids_filename in dir: /home/chymera/ni_data/test/preprocessing/composite_work/_subject_session_6262.ofMcF2/_scan_type_acq-TurboRARE/s_bids_filename
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "development.py", line 93, in bids_test
    verbose=True,
  File "preprocessing.py", line 455, in bruker
    workflow.run(plugin="MultiProc", plugin_args={'n_procs' : n_procs})
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 591, in run
    runner.run(execgraph, updatehash=updatehash, config=self.config)
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 182, in run
    self._send_procs_to_workers(updatehash=updatehash, graph=graph)
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 260, in _send_procs_to_workers
    if self._local_hash_check(jobid, graph):
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 333, in _local_hash_check
    hash_exists, _, _, _ = self.procs[jobid].hash_exists()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 278, in hash_exists
    hashed_inputs, hashvalue = self._get_hashval()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 442, in _get_hashval
    self._get_inputs()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 497, in _get_inputs
    output_value = results.outputs.get()[output_name]
AttributeError: 'NoneType' object has no attribute 'get'
171102-10:00:33,421 workflow INFO:
         Running node "s_bids_filename" ("nipype.interfaces.utility.wrappers.Function").
TheChymera commented 6 years ago

@satra @oesteban @effigies In the next week I will be rushing to meet the ISMRM submission deadline, so I do not expect I can be of much help before the 0.14.0 release date.

I went ahead and put together a “minimal” working/not-working example. To run it you will need to install the most recent commit of SAMRI. It seems the issue only arises if I iterate over multiple sessions.

This part may not make much sense to you, but I leave it here for completeness: my workflow graph is a bit complicated to explain (because it has to iterate through subject/scan/trial combinations where not all combinations are present). I actually can't figure out why iterating over more subject/session combinations would cause additional crashes, since the iteration list for that is constructed based on what data is really available. The step where I have to iterate over all possible (rather than available) combinations is the trials iteration.
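
For context, iteration of this kind is typically set up in nipype with an IdentityInterface node whose iterables are restricted to the combinations actually present in the data. A rough sketch (node names and field names are illustrative, not the actual SAMRI code), matching the "Iterating over:" list printed in the output below:

from nipype.pipeline import engine as pe
from nipype.interfaces.utility import IdentityInterface

# Built only from the data that is actually available
# (cf. the "Iterating over:" list in the output below).
subject_sessions = [['5694', 'ofMpF'], ['5692', 'ofMpF'], ['5694', 'ofMcF2']]

infosource = pe.Node(IdentityInterface(fields=['subject_session']),
                     name='infosource')
infosource.iterables = [('subject_session', subject_sessions)]

# Scan types (and trials), by contrast, are iterated over all possible values,
# even when a given subject/session combination lacks that scan type.
scan_types = ['acq-TurboRARE', 'acq-EPI_CBV_trial-CogB']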

Anyway, long story short, this is the output I get for no_worky.py, with all of the debugging enabled. I cannot find any log file being produced:

Data selection:
  acquisition contrast               measurement scan               scan_type  \                                                                                                                                                  
0   TurboRARE      T2w  20170710_185042_5694_1_6    8           acq-TurboRARE                                                                                                                                                     
1   TurboRARE      T2w  20170710_172213_5692_1_6    7           acq-TurboRARE                                                                                                                                                     
2   TurboRARE      T2w  20170623_224956_5694_1_4    6           acq-TurboRARE                                                                                                                                                     
0         EPI      cbv  20170710_185042_5694_1_6   10  acq-EPI_CBV_trial-CogB                                    
1         EPI      cbv  20170710_172213_5692_1_6    9  acq-EPI_CBV_trial-CogB                                    
2         EPI      cbv  20170623_224956_5694_1_4    8   acq-EPIcbv_trial-CogB                                    

  session subject trial                                                                                                                                                                                                           
0   ofMpF    5694   NaN                                                                                                                                                                                                           
1   ofMpF    5692   NaN                                                                                          
2  ofMcF2    5694   NaN                                                                                          
0   ofMpF    5694  CogB                                                                                          
1   ofMpF    5692  CogB                                                                                          
2  ofMcF2    5694  CogB                                                                                          
Iterating over:                                                                                                  
[['5694', 'ofMpF'], ['5692', 'ofMpF'], ['5694', 'ofMcF2']]                                                       
/usr/lib64/python2.7/site-packages/nipype/interfaces/base.py:450: UserWarning: Input terminal_output in interface ApplyTransforms is deprecated. Will be removed or raise an error as of release 1.0.0
  warn(msg)                                                                                                      
171102-12:29:38,267 workflow INFO:                                                                               
         Generated workflow graph: /home/chymera/workflow/preprocessing/composite_work/graph.dot.png (graph2use=hierarchical, simple_form=True).
171102-12:29:38,768 workflow INFO:                                                                               
         Workflow composite_work settings: ['check', 'execution', 'logging']                                     
171102-12:29:38,909 workflow INFO:                                                                                                                                                                    
         Running in parallel.                                                                                    
171102-12:29:38,917 workflow INFO:                                                                                                                                                          
         Currently running 0 tasks, and 9 jobs ready. Free memory (GB): 55.30/55.30, Free processors: 10/10      
171102-12:29:38,949 workflow INFO:                                                                                                                                                      
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5692.ofMpF/_scan_type_acq-EPI_CBV_trial-CogB/get_f_scan
171102-12:29:38,977 workflow INFO:                                                                                                                                          
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMpF/_scan_type_acq-EPI_CBV_trial-CogB/get_f_scan
171102-12:29:38,979 workflow INFO:                                                                                                                                         
         Executing node composite_work.get_s_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/get_s_scan
171102-12:29:38,992 workflow INFO:                                                                               
         Executing node composite_work.get_s_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMpF/_scan_type_acq-TurboRARE/get_s_scan
171102-12:29:39,2 workflow INFO:                                                                                 
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-EPI_CBV_trial-CogB/get_f_scan
171102-12:29:39,17 workflow INFO:                                                                                
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-EPIcbv_trial-CogB/get_f_scan
171102-12:29:39,23 workflow INFO:                                                                                
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,38 workflow INFO:                                                                                    
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5692.ofMpF/_scan_type_acq-EPIcbv_trial-CogB/get_f_scan
171102-12:29:39,42 workflow INFO:                                                                                
         Executing node composite_work.get_f_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMpF/_scan_type_acq-EPIcbv_trial-CogB/get_f_scan
171102-12:29:39,57 workflow INFO:                                                                                
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,58 workflow INFO:                                                                                
         Executing node composite_work.get_s_scan in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5692.ofMpF/_scan_type_acq-TurboRARE/get_s_scan
171102-12:29:39,84 workflow INFO:                                                                                
         Running node "get_s_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,88 workflow INFO:                                                                                
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,103 workflow INFO:                                                                               
         Running node "get_s_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,103 workflow INFO:                                                                               
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").                                                                                                   
171102-12:29:39,103 workflow INFO:                                                                               
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").                              
171102-12:29:39,118 workflow INFO:                                                                                                                                                   
         Running node "get_f_scan" ("nipype.interfaces.utility.wrappers.Function").  
171102-12:29:39,123 workflow INFO:
         Running node "get_s_scan" ("nipype.interfaces.utility.wrappers.Function").
171102-12:29:41,19 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b1 jobid: 0
171102-12:29:41,20 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b0 jobid: 3
171102-12:29:41,21 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b2 jobid: 4
171102-12:29:41,22 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b0 jobid: 7
171102-12:29:41,24 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a0.b2 jobid: 14
171102-12:29:41,25 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a1.b2 jobid: 21
171102-12:29:41,26 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a1.b1 jobid: 25
171102-12:29:41,27 workflow INFO:
         [Job finished] jobname: get_f_scan.aI.a1.b0 jobid: 35
171102-12:29:41,28 workflow INFO:
         [Job finished] jobname: get_s_scan.aI.a0.b1 jobid: 94
171102-12:29:41,30 workflow INFO:
         Currently running 0 tasks, and 24 jobs ready. Free memory (GB): 55.30/55.30, Free processors: 10/10
171102-12:29:41,52 workflow INFO:
         Executing node composite_work.bids_stim_filename in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5692.ofMpF/_scan_type_acq-EPI_CBV_trial-CogB/bids_stim_filename
171102-12:29:41,66 workflow INFO:
         Executing node composite_work.bids_filename in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5692.ofMpF/_scan_type_acq-EPI_CBV_trial-CogB/bids_filename
171102-12:29:41,89 workflow INFO:
         Executing node composite_work.s_bids_filename in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_bids_filename
171102-12:29:41,91 workflow INFO:
         Executing node composite_work.s_bru2nii in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_bru2nii
171102-12:29:41,93 workflow INFO:
         Executing node composite_work.s_bru2nii in dir: /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMpF/_scan_type_acq-TurboRARE/s_bru2nii
Traceback (most recent call last):
  File "no_worky.py", line 18, in <module>
    verbose=True,
  File "/home/chymera/src/SAMRI/samri/pipelines/preprocessing.py", line 456, in bruker
    workflow.run(plugin="MultiProc", plugin_args={'n_procs' : n_procs})
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 591, in run
    runner.run(execgraph, updatehash=updatehash, config=self.config)
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 182, in run
    self._send_procs_to_workers(updatehash=updatehash, graph=graph)
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 260, in _send_procs_to_workers
    if self._local_hash_check(jobid, graph):
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/plugins/base.py", line 333, in _local_hash_check
    hash_exists, _, _, _ = self.procs[jobid].hash_exists()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 278, in hash_exists
    hashed_inputs, hashvalue = self._get_hashval()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 442, in _get_hashval
    self._get_inputs()
  File "/usr/lib64/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 497, in _get_inputs
    output_value = results.outputs.get()[output_name]
AttributeError: 'NoneType' object has no attribute 'get'
171102-12:29:41,102 workflow INFO:
         Running node "bids_stim_filename" ("nipype.interfaces.utility.wrappers.Function").
171102-12:29:41,104 workflow INFO:
         Running node "s_bru2nii" ("nipype.interfaces.bru2nii.Bru2"), a CommandLine Interface with command:
Bru2 -a -f -o /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMcF2/_scan_type_acq-TurboRARE/s_bru2nii/6 /home/chymera/workflow/20170623_224956_5694_1_4/6.
171102-12:29:41,104 workflow INFO:
         Running node "s_bru2nii" ("nipype.interfaces.bru2nii.Bru2"), a CommandLine Interface with command:
Bru2 -a -f -o /home/chymera/workflow/preprocessing/composite_work/_subject_session_5694.ofMpF/_scan_type_acq-TurboRARE/s_bru2nii/8 /home/chymera/workflow/20170710_185042_5694_1_6/8.
171102-12:29:41,132 workflow INFO:
         Running node "bids_filename" ("nipype.interfaces.utility.wrappers.Function").
oesteban commented 6 years ago

Hi @TheChymera, could you confirm that this is not happening with Linear? I'm afraid that if @effigies is right, then all plugins should show this issue.

Could you run it with logging on DEBUG?

Of course we can wait until after the ISMRM deadline next week.

TheChymera commented 6 years ago

The above output is with all logging set to DEBUG (as seen in the minimal example). I can confirm that it all works with Linear.

effigies commented 6 years ago

Is there no way to reproduce this in a one-or-two node workflow?

oesteban commented 6 years ago

DEBUG must be off; the corresponding traces are missing.

oesteban commented 6 years ago

Hi @TheChymera, could you try this again with the current master? Even though we could not find the real source of this problem, the logic around it has changed significantly, and I would hope that refactoring fixes this issue.

Please, let me know if you would be able to give it a try anytime soon. Thanks very much!
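
For reference, a quick way to confirm which commit is actually picked up after upgrading; get_info() is what produced the platform-details dictionary quoted earlier in this thread (the pip command in the comment is only one possible way to install from master):

# e.g. after "pip install --upgrade git+https://github.com/nipy/nipype.git"
import nipype
info = nipype.get_info()
print(info['commit_hash'], info.get('commit_source'), nipype.__version__)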

TheChymera commented 6 years ago

@oesteban, we've currently frozen our nipype usage to the August version/commits which worked before this started happening. We need a stable-ish backbone while we migrate to BIDS data management internally. Depending on @damaggu, this may be finished this week, and then we can more confidently resume using a running-release/live nipype.

effigies commented 6 years ago

Hi @TheChymera, can you verify whether this is still an issue in 1.0.1 (released Tuesday)?

effigies commented 6 years ago

Closing for now. Please reopen if this is still an issue.