nipy / nipype

Workflows and interfaces for neuroimaging packages
https://nipype.readthedocs.org/en/latest/

traits.trait_errors.TraitError using SpecifySPMModel #3301

Open JohannesWiesner opened 3 years ago

JohannesWiesner commented 3 years ago

Summary

When using nipype.algorithms.modelgen.SpecifySPMModel, the respective node folder contains empty files (0 KB), which then raises a traits.trait_errors.TraitError

Actual behavior

I set up a workflow using the HCP dataset where, for each subject, I select the LR/RL files, standardize and smooth them (using MapNodes), and then pass them over to nipype.algorithms.modelgen.SpecifySPMModel (with concatenate_runs set to True). However, I am getting the following error message:

traits.trait_errors.TraitError: Each element of the 'functional_runs' trait of a SpecifySPMModelInputSpec instance must be a list of items which are a pathlike object or string representing an existing file or a pathlike object or string representing an existing file, but a value of '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_LR_smoothed.nii' <class 'str'> was specified.

When looking inside the node folder, one can see that the files have a size of 0 KB:

[Screenshot: hcp_SpecifySPMModel]

All prior nodes work as expected (the folder of the last node before SpecifySPMModel, the smoothing node, does contain both smoothed files).

Script/Workflow details

Here's the output from the crash.pklz file:

File: /home/neuro/nipype_tutorial/crash-20210213-130208-neuro-model_specifier.a0-08e59e65-3d30-4469-a7ca-66c066766d33.pklz
Node: nback_first_level_analysis.model_specifier
Working directory: /output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier

Node inputs:

bids_amplitude_column = <undefined>
bids_condition_column = trial_type
bids_event_file = ['/output/workflow_files/nback_first_level_analysis/_subject_id_952863/events_tsv_getter/mapflow/_events_tsv_getter0/sub-952863_tfMRI_WM_LR_events.tsv', '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/events_tsv_getter/mapflow/_events_tsv_getter1/sub-952863_tfMRI_WM_RL_events.tsv']
concatenate_runs = True
event_files = <undefined>
functional_runs = ['/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_LR_smoothed.nii', '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_RL_smoothed.nii']
high_pass_filter_cutoff = 200.0
input_units = secs
outlier_files = <undefined>
output_units = secs
parameter_source = SPM
realignment_parameters = <undefined>
subject_info = <undefined>
time_repetition = 0.72

Traceback: 
Traceback (most recent call last):
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/plugins/linear.py", line 46, in run
    node.run(updatehash=updatehash)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
    result = self._run_interface(execute=True)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
    return self._run_command(execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 741, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 419, in run
    runtime = self._run_interface(runtime)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 523, in _run_interface
    self._generate_design()
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 662, in _generate_design
    infolist = gen_info(self.inputs.event_files)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 195, in gen_info
    for i, event_files in enumerate(run_event_files):
TypeError: '_Undefined' object is not iterable

Rerunning node
210213-13:06:15,746 nipype.workflow INFO:
     [Node] Setting-up "nback_first_level_analysis.model_specifier" in "/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier".
210213-13:06:15,844 nipype.workflow WARNING:
     [Node] Error on "nback_first_level_analysis.model_specifier" (/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier)
Traceback (most recent call last):
  File "/opt/miniconda-latest/envs/neuro/bin/nipypecli", line 8, in <module>
    sys.exit(cli())
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/scripts/cli.py", line 96, in crash
    display_crash_file(crashfile, rerun, debug, dir)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/scripts/crash_files.py", line 81, in display_crash_file
    node.run()
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
    result = self._run_interface(execute=True)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
    return self._run_command(execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 715, in _run_command
    self._copyfiles_to_wd(execute=execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 817, in _copyfiles_to_wd
    setattr(self.inputs, info["key"], newfiles)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/interfaces/base/traits_extension.py", line 426, in validate
    value = super(MultiObject, self).validate(objekt, name, newvalue)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_types.py", line 2473, in validate
    return TraitListObject(self, object, name, value)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_list_object.py", line 582, in __init__
    notifiers=[self.notifier],
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_list_object.py", line 210, in __init__
    super().__init__(self.item_validator(item) for item in iterable)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_list_object.py", line 210, in <genexpr>
    super().__init__(self.item_validator(item) for item in iterable)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_list_object.py", line 862, in _item_validator
    return trait_validator(object, self.name, value)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_handlers.py", line 880, in validate
    return self.slow_validate(object, name, value)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/trait_handlers.py", line 888, in slow_validate
    self.error(object, name, value)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/traits/base_trait_handler.py", line 75, in error
    object, name, self.full_info(object, name, value), value
traits.trait_errors.TraitError: Each element of the 'functional_runs' trait of a SpecifySPMModelInputSpec instance must be a list of items which are a pathlike object or string representing an existing file or a pathlike object or string representing an existing file, but a value of '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_LR_smoothed.nii' <class 'str'> was specified.

Here's my script:

## Import necessary modules

import pandas as pd
from nipype.interfaces.utility import Rename
from nipype.interfaces.base import Bunch
from nipype.interfaces.io import SelectFiles, DataSink
from nipype import Workflow,Node,MapNode
from nipype.interfaces.utility import Function, IdentityInterface
from nipype.algorithms.misc import Gunzip
from os.path import join as opj

from nipype.interfaces.spm import Smooth,Level1Design, EstimateModel, EstimateContrast
from nipype.algorithms.modelgen import SpecifySPMModel

## Global specific settings that apply to all HCP first level analyses

# define a list of subjects
subject_list = ['952863']

# we need to concatenate both the LR and the RL phase encoding files, so we set up a list
# to iterate over the two functional images and their corresponding EVs/ folders where the task information is stored
phase_encodings = ['LR','RL']

# set up a smoothing kernel
smoothing_fwhm = [4,4,4]

## Task specific settings

# set task specific data directory
# NOTE: This has to end with a slash!
data_dir = '/data/'

# set up the names for all different event files
ev_txt_list = ['0bk_faces.txt','0bk_places.txt','0bk_body.txt','0bk_tools.txt',
               '2bk_faces.txt','2bk_places.txt','2bk_body.txt','2bk_tools.txt']

# provide a list of condition names corresponding to the event files
ev_names = ['0bk_faces','0bk_places','0bk_body','0bk_tools',
            '2bk_faces','2bk_places','2bk_body','2bk_tools']

# set a name for this specific workflow
workflow_name = 'nback_first_level_analysis'

## Data Preparation

# create a node that iterates over the subject(s)
subject_iterator = Node(IdentityInterface(fields=["subject_id"]),name="subject_iterator")
subject_iterator.iterables = [("subject_id",subject_list)]

## Select the LR and the RL files 

# create a MapNode in order to select either a LR or RL file for a subject
templates = {'func':'{subject_id}/MNINonLinear/Results/tfMRI_WM_{phase_encoding}/tfMRI_WM_{phase_encoding}.nii.gz'}

run_selector = MapNode(SelectFiles(templates,base_directory=data_dir),
                        iterfield=['phase_encoding'],
                        name='run_selector')

run_selector.inputs.phase_encoding = phase_encodings

## Unzip the LR and RL files

# create a node that unzips input files (this is necessary for SPM to run)
gunzipper = MapNode(Gunzip(),iterfield=['in_file'],name='gunzipper')

## Standardize the LR and the RL files

# we want to standardize each image individually
def standardize_img(in_file,subject_id,phase_encoding):

    from nilearn.image import clean_img
    import os

    in_file_standardized = clean_img(imgs=in_file,
                                     detrend=False,
                                     standardize=True)

    in_file_standardized.to_filename(f"sub-{subject_id}_tfMRI_WM_{phase_encoding}.nii")
    out_file = os.path.abspath(f"sub-{subject_id}_tfMRI_WM_{phase_encoding}.nii")

    return out_file

standardizer = MapNode(Function(input_names=['in_file','subject_id','phase_encoding'],
                                output_names=['out_file'],
                                function=standardize_img),
                       iterfield=['in_file','phase_encoding'],
                       name='standardizer')

standardizer.inputs.phase_encoding = phase_encodings

## Smooth the LR and RL files using SPM

def smooth_img(in_file,subject_id,phase_encoding,smoothing_fwhm):

    from nilearn.image import smooth_img
    import os

    in_file_smoothed = smooth_img(imgs=in_file,fwhm=smoothing_fwhm)

    in_file_smoothed.to_filename(f"sub-{subject_id}_tfMRI_WM_{phase_encoding}_smoothed.nii")
    out_file = os.path.abspath(f"sub-{subject_id}_tfMRI_WM_{phase_encoding}_smoothed.nii")

    return out_file

smoother = MapNode(Function(input_names=['in_file','subject_id','phase_encoding','smoothing_fwhm'],
                            output_names=['out_file'],
                            function=smooth_img),
                   iterfield=['in_file','phase_encoding'],
                   name='smoother')

smoother.inputs.smoothing_fwhm = smoothing_fwhm
smoother.inputs.phase_encoding = phase_encodings

## Create a .tsv file from the files in the EVs/ directory for the LR and RL files

# create a function that grabs the event files from a run and returns them as a single dataframe
# NOTE: we have to import the modules inside the function because nipype Function nodes run in closed environments
def get_events_tsv_file(data_dir,subject_id,phase_encoding,ev_txt_list,ev_names):

    import pandas as pd
    import os

    # create a path to the directory where the event files are stored based on the input subject id
    subject_dir = f"{data_dir}{subject_id}/MNINonLinear/Results/tfMRI_WM_{phase_encoding}/EVs/"

    # initialize an empty list where the upcoming data frames will be stored
    ev_dfs_list = []

    # read in the .txt file as pandas data frame,add a column 'trial_type' to give a description
    # and add the df to a list of dfs
    for idx,_ in enumerate(ev_txt_list):
        ev_df = pd.read_table(subject_dir + ev_txt_list[idx],names=['onset','duration','amplitude'])
        ev_df['trial_type'] = ev_names[idx]
        ev_dfs_list.append(ev_df)

    # concatenate all dfs
    ev_df = pd.concat(ev_dfs_list,axis=0)

    # save the data frame as .tsv file
    tsv_file = os.path.abspath(f"sub-{subject_id}_tfMRI_WM_{phase_encoding}_events.tsv")
    ev_df.to_csv(tsv_file,index=False,float_format='%.3f',sep='\t')

    return tsv_file

# define a MapNode 
events_tsv_getter = MapNode(Function(input_names=['data_dir','subject_id','phase_encoding','ev_txt_list','ev_names'],
                                     output_names=['tsv_file'],
                                     function=get_events_tsv_file),
                            iterfield=['phase_encoding'],
                            name='events_tsv_getter')

events_tsv_getter.inputs.data_dir = data_dir
events_tsv_getter.inputs.phase_encoding = phase_encodings
events_tsv_getter.inputs.ev_txt_list = ev_txt_list
events_tsv_getter.inputs.ev_names = ev_names

## Set up the First-Level-Model for SPM

# SpecifyModel - Generates SPM-specific Model
model_specifier = Node(SpecifySPMModel(concatenate_runs=True,
                                       input_units='secs',
                                       output_units='secs',
                                       time_repetition=0.720,
                                       high_pass_filter_cutoff=200),
                       name='model_specifier')

# Level1Design - Generates an SPM design matrix
first_level_design = Node(Level1Design(bases={'hrf': {'derivs': [1,0]}},
                                       timing_units='secs',
                                       interscan_interval=0.720,
                                       model_serial_correlations='FAST'),
                          name='first_level_design')

## Define a DataSink Node where outputs should be stored

# define a DataSink node where files should be stored
# NOTE: The boolean parameterization = False ensures that the files are directly saved
# in the datasink folder. Otherwise for each subject there would be another folder (e.g. "_subject_123") created.
datasink = Node(DataSink(base_directory='/output',
                         parameterization=False),
                name="datasink")

## Connect all nodes to a workflow

# define workflow
wf = Workflow(name=workflow_name)
wf.base_dir = '/output/workflow_files'

# pass the subject id over to all nodes that need it
wf.connect(subject_iterator,'subject_id',run_selector,'subject_id')
wf.connect(subject_iterator,'subject_id',standardizer,'subject_id')
wf.connect(subject_iterator,'subject_id',smoother,'subject_id')
wf.connect(subject_iterator,'subject_id',events_tsv_getter,'subject_id')
wf.connect(subject_iterator,'subject_id',datasink,'container')

# connect all nodes that deal with the functional images
wf.connect(run_selector,'func',gunzipper,'in_file')
wf.connect(gunzipper,'out_file',standardizer,'in_file')
wf.connect(standardizer,'out_file',smoother,'in_file')

# connect all nodes that deal with SPM
wf.connect(smoother,'out_file',model_specifier,'functional_runs')
wf.connect(events_tsv_getter,'tsv_file',model_specifier,'bids_event_file')
wf.connect(model_specifier,'session_info',first_level_design,'session_info')

wf_results = wf.run()

Platform details:

{'commit_hash': '%h',
 'commit_source': 'archive substitution',
 'networkx_version': '2.5',
 'nibabel_version': '3.2.0',
 'nipype_version': '1.6.0-dev',
 'numpy_version': '1.19.2',
 'pkg_path': '/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype',
 'scipy_version': '1.5.2',
 'sys_executable': '/opt/miniconda-latest/envs/neuro/bin/python',
 'sys_platform': 'linux',
 'sys_version': '3.7.8 | packaged by conda-forge | (default, Jul 31 2020, '
                '02:25:08) \n'
                '[GCC 7.5.0]',
 'traits_version': '6.1.1'}

Execution environment

Using Michael Notter's nipype_tutorial (most recent version, miykael/nipype_tutorial:2020) running as a Docker container on Windows 10

JohannesWiesner commented 3 years ago

I ran the same script using Singularity on a Linux server. Again, everything worked fine up to the SpecifySPMModel node. Strangely enough, this time I got a different error message. The SpecifySPMModel node folder was also empty (no files at all, which is probably equivalent to having 0 KB files).

File: /home/neuro/nipype_tutorial/crash-20210213-163639-johannes.wiesner-model_specifier.a0-4ab02ae5-ad19-4833-9304-6925b05da769.pklz
Node: nback_first_level_analysis.model_specifier
Working directory: /output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier

Node inputs:

bids_amplitude_column = <undefined>
bids_condition_column = trial_type
bids_event_file = ['/output/workflow_files/nback_first_level_analysis/_subject_id_952863/events_tsv_getter/mapflow/_events_tsv_getter0/sub-952863_tfMRI_WM_LR_events.tsv', '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/events_tsv_getter/mapflow/_events_tsv_getter1/sub-952863_tfMRI_WM_RL_events.tsv']
concatenate_runs = True
event_files = <undefined>
functional_runs = ['/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_LR_smoothed.nii', '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_RL_smoothed.nii']
high_pass_filter_cutoff = 200.0
input_units = secs
outlier_files = <undefined>
output_units = secs
parameter_source = SPM
realignment_parameters = <undefined>
subject_info = <undefined>
time_repetition = 0.72

Traceback: 
Traceback (most recent call last):
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/plugins/linear.py", line 46, in run
    node.run(updatehash=updatehash)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
    result = self._run_interface(execute=True)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
    return self._run_command(execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 741, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 419, in run
    runtime = self._run_interface(runtime)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 523, in _run_interface
    self._generate_design()
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 662, in _generate_design
    infolist = gen_info(self.inputs.event_files)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/algorithms/modelgen.py", line 195, in gen_info
    for i, event_files in enumerate(run_event_files):
TypeError: '_Undefined' object is not iterable

Rerunning node
210213-16:51:26,395 nipype.workflow INFO:
     [Node] Setting-up "nback_first_level_analysis.model_specifier" in "/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier".
210213-16:51:26,558 nipype.workflow WARNING:
     [Node] Error on "nback_first_level_analysis.model_specifier" (/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier)
Traceback (most recent call last):
  File "/opt/miniconda-latest/envs/neuro/bin/nipypecli", line 8, in <module>
    sys.exit(cli())
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/scripts/cli.py", line 96, in crash
    display_crash_file(crashfile, rerun, debug, dir)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/scripts/crash_files.py", line 81, in display_crash_file
    node.run()
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
    result = self._run_interface(execute=True)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
    return self._run_command(execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 715, in _run_command
    self._copyfiles_to_wd(execute=execute)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 811, in _copyfiles_to_wd
    infiles, [outdir], copy=info["copy"], create_new=True
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/utils/filemanip.py", line 518, in copyfiles
    destfile = copyfile(f, destfile, copy, create_new=create_new)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/site-packages/nipype/utils/filemanip.py", line 440, in copyfile
    shutil.copyfile(originalfile, newfile)
  File "/opt/miniconda-latest/envs/neuro/lib/python3.7/shutil.py", line 120, in copyfile
    with open(src, 'rb') as fsrc:
FileNotFoundError: [Errno 2] No such file or directory: '/output/workflow_files/nback_first_level_analysis/_subject_id_952863/model_specifier/sub-952863_tfMRI_WM_LR_smoothed.nii'
satra commented 3 years ago

@JohannesWiesner - this is because this interface was not updated when bids_event_files was introduced. you can use the standard event_files input - these would be FSL event files, which i believe already exist with the HCP data - or this interface needs to be updated to switch between the two types of event files (a PR is most welcome).

this is the place where an undefined set of files is being processed: https://github.com/nipy/nipype/blob/47fe00b38/nipype/algorithms/modelgen.py#L662
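As a rough illustration of the event_files route: the EVs/*.txt files referenced in the script above are already FSL-style 3-column files, so they could be collected into the nested list shape that event_files expects (one inner list per functional run). A minimal sketch; collect_ev_files is a hypothetical helper, and the directory layout is assumed from the paths in this issue, not nipype API:

```python
# Hypothetical helper: gather the FSL-style 3-column EV files that ship with
# HCP task data into the nested list shape SpecifySPMModel's `event_files`
# input expects (one inner list per run). Layout assumed from this issue.
import os

def collect_ev_files(data_dir, subject_id, phase_encodings, ev_txt_list):
    """Return [[EV files for run 1], [EV files for run 2], ...]."""
    event_files = []
    for pe in phase_encodings:
        ev_dir = os.path.join(
            data_dir, subject_id, "MNINonLinear", "Results",
            f"tfMRI_WM_{pe}", "EVs")
        event_files.append([os.path.join(ev_dir, txt) for txt in ev_txt_list])
    return event_files

# e.g. model_specifier.inputs.event_files = collect_ev_files(
#     '/data/', '952863', ['LR', 'RL'], ['0bk_faces.txt', '2bk_faces.txt'])
```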

satra commented 3 years ago

it would need to follow a pattern similar to this section:

https://github.com/nipy/nipype/blob/47fe00b38/nipype/algorithms/modelgen.py#L501
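In spirit, the pattern at that link is a dispatch over whichever event source is defined, instead of unconditionally iterating over event_files. A minimal standalone sketch of that idea (not nipype code; UNDEFINED/isdefined are stand-ins for nipype's traits machinery, and pick_event_source is a hypothetical name):

```python
# Standalone sketch of the dispatch _generate_design would need: prefer
# subject_info, then event_files, then bids_event_file. UNDEFINED/isdefined
# are stand-ins for nipype's traits machinery.
UNDEFINED = object()

def isdefined(value):
    return value is not UNDEFINED

def pick_event_source(subject_info=UNDEFINED,
                      event_files=UNDEFINED,
                      bids_event_file=UNDEFINED):
    """Return which event source is set, and its value."""
    if isdefined(subject_info):
        return "subject_info", subject_info
    if isdefined(event_files):
        return "event_files", event_files
    if isdefined(bids_event_file):
        return "bids_event_file", bids_event_file
    raise ValueError(
        "One of subject_info, event_files or bids_event_file must be set")
```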

JohannesWiesner commented 3 years ago

@satra Thanks for the answer! Okay, so if I got that right, one should use subject_info or event_files with SpecifySPMModel at the moment. This would also mean that the documentation for SpecifySPMModel with respect to bids_event_files is currently incorrect, right?

I can try to update _generate_design in SpecifySPMModel so that it can handle the bids_event_files attribute. But just to get on the same page here (because I am new to nipype): why should SpecifySPMModel add this functionality to _generate_design when the function is inherited from SpecifyModel, as one would also expect from reading the docs?:

Add SPM specific options to SpecifyModel

Adds:

concatenate_runs
output_units

Shouldn't everything 'event-related' be handled in SpecifyModel only, while (as expected from the docs) SpecifySPMModel only takes care of concatenate_runs and output_units?
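In the meantime, a possible workaround would be to parse the events.tsv produced by events_tsv_getter into the conditions/onsets/durations structure that subject_info expects. A minimal stdlib sketch (tsv_to_conditions is a hypothetical helper, and a plain dict stands in for nipype's Bunch):

```python
# Hypothetical workaround: parse a BIDS-style events.tsv (with onset,
# duration and trial_type columns, as written by events_tsv_getter above)
# into the conditions/onsets/durations structure that SpecifyModel's
# subject_info (one Bunch per run) expects. A dict stands in for Bunch.
import csv
from collections import defaultdict

def tsv_to_conditions(tsv_path):
    onsets = defaultdict(list)
    durations = defaultdict(list)
    with open(tsv_path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            cond = row["trial_type"]
            onsets[cond].append(float(row["onset"]))
            durations[cond].append(float(row["duration"]))
    conditions = sorted(onsets)
    return {
        "conditions": conditions,
        "onsets": [onsets[c] for c in conditions],
        "durations": [durations[c] for c in conditions],
    }
```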

satra commented 3 years ago

since SPM inherits from the base class, the field gets added to its inputspec, which is why the documentation lists it. indeed that is incorrect.

the generic model indeed handles most things, but concatenating information is a very specific option that only applies to SPM and it changes how the information needs to be structured for it. i haven't looked at those functions closely in a long time. i'm sure there are things that could be refactored.