# Errors and Crashes¶

This is probably the most important chapter in this section: how to handle errors and crashes, because at the beginning you will run into quite a few.

For example:

1. You specify filenames or paths that don't exist.
2. You give an interface a string as input where a float value is expected, or you specify a parameter that doesn't exist. Be sure to use the right input type and input name.
3. You give a list of inputs [func1.nii, func2.nii, func3.nii] to a node that expects only one input file. MapNode is your solution.
4. You run SPM's motion correction on compressed NIfTI files, i.e. *.nii.gz. SPM cannot handle that; Nipype's Gunzip interface can help.
5. You haven't set up all necessary environment variables, so Nipype, for example, doesn't find your MATLAB or SPM version.
6. You forget to specify a mandatory input field.
7. You try to connect a node to an input field that another node is already connected to.

Important note about crashfiles: crashfiles are only created when you run a workflow, not while you are building it. If, for example, you assign an invalid value or file path to an input field, the error happens during workflow building, not during runtime, and therefore no crashfile is created.

We will start by removing old crashfiles:

In [ ]:
%%bash
rm $(pwd)/crash-*

## Example Crash 1: File doesn't exist¶

When creating a new workflow, the initial errors are very often OSErrors, meaning Nipype cannot find the right files. For example, let's try to run a workflow on sub-11, a subject that doesn't exist in our dataset.

### Creating the crash¶

In [ ]:
from nipype import SelectFiles, Node, Workflow
from os.path import abspath as opap
from nipype.interfaces.fsl import MCFLIRT, IsotropicSmooth

# Create SelectFiles node
templates = {'func': '{subject_id}/ses-test/func/{subject_id}_ses-test_task-fingerfootlips_bold.nii.gz'}
sf = Node(SelectFiles(templates), name='selectfiles')
sf.inputs.base_directory = opap('/data/ds000114')
sf.inputs.subject_id = 'sub-11'

# Create Motion Correction Node
mcflirt = Node(MCFLIRT(mean_vol=True, save_plots=True), name='mcflirt')

# Create Smoothing node
smooth = Node(IsotropicSmooth(fwhm=4), name='smooth')

# Create a preprocessing workflow
wf = Workflow(name="preprocWF")
wf.base_dir = 'working_dir'

# Connect the three nodes to each other
wf.connect([(sf, mcflirt, [("func", "in_file")]),
            (mcflirt, smooth, [("out_file", "in_file")])])

# Let's run the workflow
try:
    wf.run()
except(RuntimeError) as err:
    print("RuntimeError:", err)
else:
    raise

180514-09:15:24,87 workflow INFO: Workflow preprocWF settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:15:24,92 workflow INFO: Running serially.
180514-09:15:24,93 workflow INFO: [Node] Setting-up "preprocWF.selectfiles" in "/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles".
180514-09:15:24,97 workflow INFO: [Node] Running "selectfiles" ("nipype.interfaces.io.SelectFiles")
180514-09:15:24,105 workflow WARNING: [Node] Error on "preprocWF.selectfiles" (/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles)
180514-09:15:24,108 workflow ERROR: Node selectfiles failed to run on host 7eb1beccba8f.
180514-09:15:24,111 workflow ERROR: Saving crash info to /home/neuro/nipype_tutorial/notebooks/crash-20180514-091524-neuro-selectfiles-648d7b9b-092e-479a-b79c-c04ce2ba5774.pklz
Traceback (most recent call last):
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 44, in run
node.run(updatehash=updatehash)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz

180514-09:15:24,112 workflow INFO: ***********************************
180514-09:15:24,113 workflow ERROR: could not run node: preprocWF.selectfiles
180514-09:15:24,114 workflow INFO: crashfile: /home/neuro/nipype_tutorial/notebooks/crash-20180514-091524-neuro-selectfiles-648d7b9b-092e-479a-b79c-c04ce2ba5774.pklz
180514-09:15:24,114 workflow INFO: ***********************************
RuntimeError: Workflow did not execute cleanly.
Check log for details

### Investigating the crash¶

Hidden in the log file, you can find the relevant information:

OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz
Interface SelectFiles failed to run.

170904-05:48:13,727 workflow INFO: ***********************************
170904-05:48:13,728 workflow ERROR: could not run node: preprocWF.selectfiles
170904-05:48:13,730 workflow INFO: crashfile: /repos/nipype_tutorial/notebooks/crash-20170904-054813-neuro-selectfiles-15f5400a-452e-4e0c-ae99-fc0d4b9a44f3.pklz
170904-05:48:13,731 workflow INFO: ***********************************

This part tells you that it's an OSError and that it looked for the file /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz. After the line ***********************************, you can additionally see that it's the node preprocWF.selectfiles that crashed, and where to find the corresponding crashfile.

### Reading the crashfile¶

To get the full picture of the error, we can read the content of the crashfile (which is in pklz format by default) with the bash command nipypecli crash. We will get the same information as above, but additionally we can also see directly the input values of the node that crashed.

In [ ]:
!nipypecli crash $(pwd)/crash-*selectfiles-*.pklz

File: /home/neuro/nipype_tutorial/notebooks/crash-20180514-091524-neuro-selectfiles-648d7b9b-092e-479a-b79c-c04ce2ba5774.pklz
Node: preprocWF.selectfiles
Working directory: /home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles

Node inputs:

base_directory = /data/ds000114
force_lists = False
ignore_exception = False
raise_on_empty = True
sort_filelist = True
subject_id = sub-11

Traceback:
Traceback (most recent call last):
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 44, in run
node.run(updatehash=updatehash)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz
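If you prefer to inspect a crashfile from within Python rather than through nipypecli, note that the default .pklz format is simply a gzip-compressed pickle. A minimal sketch of reading one directly (assuming the crashfile stores a dictionary with keys such as 'node' and 'traceback'; the exact keys may vary between Nipype versions):

```python
import gzip
import pickle

def read_crashfile(path):
    """Load a Nipype .pklz crashfile (a gzip-compressed pickle)
    and return the stored dictionary."""
    with gzip.open(path, 'rb') as f:
        return pickle.load(f)

# Hypothetical usage:
# crash = read_crashfile('crash-20180514-...-selectfiles-....pklz')
# print(''.join(crash['traceback']))
```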

nipypecli allows you to rerun the crashed node using an additional option -r.

In [ ]:
!nipypecli crash -r $(pwd)/crash-*selectfiles-*.pklz

File: /home/neuro/nipype_tutorial/notebooks/crash-20180514-091524-neuro-selectfiles-648d7b9b-092e-479a-b79c-c04ce2ba5774.pklz
Node: preprocWF.selectfiles
Working directory: /home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles

Node inputs:

base_directory = /data/ds000114
force_lists = False
ignore_exception = False
raise_on_empty = True
sort_filelist = True
subject_id = sub-11

Traceback:
Traceback (most recent call last):
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 44, in run
node.run(updatehash=updatehash)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz

Rerunning node
180514-09:15:27,681 workflow INFO: [Node] Setting-up "preprocWF.selectfiles" in "/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles".
180514-09:15:27,685 workflow INFO: [Node] Running "selectfiles" ("nipype.interfaces.io.SelectFiles")
180514-09:15:27,688 workflow WARNING: [Node] Error on "preprocWF.selectfiles" (/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles)
Traceback (most recent call last):
File "/opt/conda/envs/neuro/bin/nipypecli", line 11, in <module>
load_entry_point('nipype==1.0.4.dev0', 'console_scripts', 'nipypecli')()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/click/core.py", line 722, in __call__
return self.main(*args, **kwargs)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/click/core.py", line 697, in main
rv = self.invoke(ctx)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/click/core.py", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/scripts/cli.py", line 94, in crash
display_crash_file(crashfile, rerun, debug, dir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/scripts/crash_files.py", line 81, in display_crash_file
node.run()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz

When running in the terminal, you can also try options that enable the Python or IPython debugger when re-executing: -d or -i.

If you don't need the option to rerun the crashed workflow, you can change the format of the crashfile to a text format. You can either change this in a configuration file (you can read more here), or you can directly change the wf.config dictionary before running the workflow.

In [ ]:
wf.config['execution']['crashfile_format'] = 'txt'

try:
    wf.run()
except(RuntimeError) as err:
    print("RuntimeError:", err)
else:
    raise

180514-09:15:27,908 workflow INFO: Workflow preprocWF settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:15:27,916 workflow INFO: Running serially.
180514-09:15:27,917 workflow INFO: [Node] Setting-up "preprocWF.selectfiles" in "/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles".
180514-09:15:27,924 workflow INFO: [Node] Running "selectfiles" ("nipype.interfaces.io.SelectFiles")
180514-09:15:27,927 workflow WARNING: [Node] Error on "preprocWF.selectfiles" (/home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles)
180514-09:15:27,930 workflow ERROR: Node selectfiles failed to run on host 7eb1beccba8f.
180514-09:15:27,931 workflow ERROR: Saving crash info to /home/neuro/nipype_tutorial/notebooks/crash-20180514-091527-neuro-selectfiles-21e0b54b-5a6c-45fb-a996-92e803d9778c.txt
Traceback (most recent call last):
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 44, in run
node.run(updatehash=updatehash)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz

180514-09:15:27,932 workflow INFO: ***********************************
180514-09:15:27,933 workflow ERROR: could not run node: preprocWF.selectfiles
180514-09:15:27,933 workflow INFO: crashfile: /home/neuro/nipype_tutorial/notebooks/crash-20180514-091527-neuro-selectfiles-21e0b54b-5a6c-45fb-a996-92e803d9778c.txt
180514-09:15:27,934 workflow INFO: ***********************************
RuntimeError: Workflow did not execute cleanly. Check log for details

Now you should have a new text file with your crash report.

In [ ]:
!cat $(pwd)/crash-*selectfiles-*.txt
Node: preprocWF.selectfiles
Working directory: /home/neuro/nipype_tutorial/notebooks/working_dir/preprocWF/selectfiles

Node inputs:

base_directory = /data/ds000114
force_lists = False
ignore_exception = False
raise_on_empty = True
sort_filelist = True
subject_id = sub-11

Traceback (most recent call last):
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/plugins/linear.py", line 44, in run
node.run(updatehash=updatehash)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 521, in run
outputs = self.aggregate_outputs(runtime)
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/base/core.py", line 595, in aggregate_outputs
predicted_outputs = self._list_outputs()
File "/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/interfaces/io.py", line 1402, in _list_outputs
raise IOError(msg)
OSError: No files were found matching func template: /data/ds000114/sub-11/ses-test/func/sub-11_ses-test_task-fingerfootlips_bold.nii.gz

## Example Crash 2: Wrong Input Type or Typo in the parameter¶

Quite simply, if an interface expects a float as input but you give it a string, it will crash:

In [ ]:
from nipype.interfaces.fsl import IsotropicSmooth
try:
    smooth = IsotropicSmooth(fwhm='4')
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: The 'fwhm' trait of an IsotropicSmoothInput instance must be a float, but a value of '4' <class 'str'> was specified.

This will give you the error: TraitError: The 'fwhm' trait of an IsotropicSmoothInput instance must be a float, but a value of '4' <class 'str'> was specified.

To make sure that you are using the right input types, just check the help section of a given interface. There you can see fwhm: (a float).

In [ ]:
IsotropicSmooth.help()
Wraps command **fslmaths**

Use fslmaths to spatially smooth an image with a gaussian kernel.

Inputs::

[Mandatory]
fwhm: (a float)
fwhm of smoothing kernel [mm]
flag: -s %.5f, position: 4
mutually_exclusive: sigma
in_file: (an existing file name)
image to operate on
flag: %s, position: 2
sigma: (a float)
sigma of smoothing kernel [mm]
flag: -s %.5f, position: 4
mutually_exclusive: fwhm

[Optional]
args: (a unicode string)
flag: %s
environ: (a dictionary with keys which are a bytes or None or a value
of class 'str' and with values which are a bytes or None or a value
of class 'str', nipype default value: {})
Environment variables
ignore_exception: (a boolean, nipype default value: False)
Print an error message instead of throwing an exception in case the
interface fails to run
internal_datatype: ('float' or 'char' or 'int' or 'short' or 'double'
or 'input')
datatype to use for calculations (default is float)
flag: -dt %s, position: 1
nan2zeros: (a boolean)
change NaNs to zeros before doing anything
flag: -nan, position: 3
out_file: (a file name)
image to write
flag: %s, position: -2
output_datatype: ('float' or 'char' or 'int' or 'short' or 'double'
or 'input')
datatype to use for output (default uses input type)
flag: -odt %s, position: -1
output_type: ('NIFTI' or 'NIFTI_PAIR' or 'NIFTI_GZ' or
'NIFTI_PAIR_GZ')
FSL output type
terminal_output: ('stream' or 'allatonce' or 'file' or 'none')
Control terminal output: stream - displays to terminal immediately
(default), allatonce - waits till command is finished to display
output, file - writes output to file, none - output is ignored

Outputs::

out_file: (an existing file name)
image written after calculations

References::
BibTeX('@article{JenkinsonBeckmannBehrensWoolrichSmith2012,author={M. Jenkinson, C.F. Beckmann, T.E. Behrens, M.W. Woolrich, and S.M. Smith},title={FSL},journal={NeuroImage},volume={62},pages={782-790},year={2012},}', key='JenkinsonBeckmannBehrensWoolrichSmith2012')

In a similar way, you will also get an error message if the input type is correct but you have a typo in the value:

TraitError: The 'output_type' trait of an IsotropicSmoothInput instance must be 'NIFTI' or 'NIFTI_PAIR' or 'NIFTI_GZ' or 'NIFTI_PAIR_GZ', but a value of 'NIFTIiii' <class 'str'> was specified.
In [ ]:
from nipype.interfaces.fsl import IsotropicSmooth
try:
    smooth = IsotropicSmooth(output_type='NIFTIiii')
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: The 'output_type' trait of an IsotropicSmoothInput instance must be 'NIFTI' or 'NIFTI_PAIR' or 'NIFTI_GZ' or 'NIFTI_PAIR_GZ', but a value of 'NIFTIiii' <class 'str'> was specified.

## Example Crash 3: Giving an array as input where a single file is expected¶

As you can see in the MapNode example, if you try to feed an array as an input into a field that only expects a single file, you will get a TraitError.

In [ ]:
from nipype.algorithms.misc import Gunzip
from nipype import Node

# A list of functional images (the same file twice is enough to trigger the error)
files = ['/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz',
         '/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz']

gunzip = Node(Gunzip(), name='gunzip')

try:
    gunzip.inputs.in_file = files
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: The 'in_file' trait of a GunzipInputSpec instance must be an existing file name, but a value of ['/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz', '/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz'] <class 'list'> was specified.

This can be solved by using a MapNode:

In [ ]:
from nipype import MapNode
gunzip = MapNode(Gunzip(), name='gunzip', iterfield=['in_file'])
gunzip.inputs.in_file = files

Now, make sure that you specify files that actually exist, otherwise you will have a TraitError again:

In [ ]:
# This path is missing the 'ses-test' subfolder, so the file doesn't exist
files = ['/data/ds000114/sub-01/func/sub-01_task-fingerfootlips_bold.nii.gz']

try:
    gunzip.inputs.in_file = files
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: The trait 'in_file' of a DynamicTraitedSpec instance is an existing file name, but the path  '/data/ds000114/sub-01/func/sub-01_task-fingerfootlips_bold.nii.gz' does not exist.

By the way, note that those crashes don't create a crashfile, because they happened during workflow building, not during runtime.

## Example Crash 4: SPM doesn't like *.nii.gz files¶

SPM12 cannot handle compressed NIfTI files (*.nii.gz). If you try to run a node on them nonetheless, you can get different kinds of problems:

### SPM Problem 1 with *.nii.gz files¶

SPM12 has a problem with handling *.nii.gz files. To SPM, a compressed functional image has no temporal dimension and therefore seems to be just a 3D file. So if we pass a compressed file to an SPM interface such as Smooth, we get a TraitError.

In [ ]:
from nipype.interfaces.spm import Smooth

try:
    smooth = Smooth(in_files='/data/ds000114/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz')
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: /data/ds000114/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz is not included in allowed types: .img, .hdr, .nii

### SPM problem 2 with *.nii.gz files¶

Sometimes TraitError can be more misleading.

In [ ]:
from nipype.interfaces.spm import Realign

try:
    realign = Realign(in_files='/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz')
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: Each element of the 'in_files' trait of a RealignInputSpec instance must be an existing, uncompressed file (valid extensions: [.img, .hdr, .nii]) or a list of items which are an existing, uncompressed file (valid extensions: [.img, .hdr, .nii]), but a value of '/data/ds000114/sub-01/ses-test/func/sub-01_ses-test_task-fingerfootlips_bold.nii.gz' <class 'str'> was specified.

This issue can be solved by unzipping the compressed NIfTI file before passing it to an SPM node. You can either use the Gunzip interface from Nipype or, even better, if the input comes from an FSL interface, set that interface's output_type input field to 'NIFTI', which most FSL interfaces provide.
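If you are curious what such an unzipping step does, it essentially decompresses the file next to the original. A minimal stdlib sketch of the same idea (only an illustration; in a workflow you would use nipype.algorithms.misc.Gunzip as a node):

```python
import gzip
import shutil
from pathlib import Path

def gunzip_file(path):
    """Decompress e.g. 'bold.nii.gz' to 'bold.nii' and return the new path."""
    src = Path(path)
    dst = src.with_suffix('')  # drops the trailing '.gz'
    with gzip.open(src, 'rb') as f_in, open(dst, 'wb') as f_out:
        shutil.copyfileobj(f_in, f_out)
    return str(dst)
```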

## Example Crash 5: Nipype cannot find the right software¶

Especially at the beginning, just after installation, you sometimes forget to specify some environment variables. If you try to use an interface whose environment variables are not set, e.g. if you try to run:

from nipype.interfaces.freesurfer import MRIConvert
convert = MRIConvert(in_file='/data/ds000114/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz',
                     out_type='nii')

you might get an error such as:

IOError: command 'mri_convert' could not be found on host mnotter
Interface MRIConvert failed to run.
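A quick way to check whether the underlying command-line tool is reachable at all is to look it up on your $PATH; a small sketch, using 'mri_convert' from the error above as an example:

```python
import shutil

def command_available(cmd):
    """Return True if `cmd` can be found on $PATH."""
    return shutil.which(cmd) is not None

# e.g. command_available('mri_convert') stays False until FreeSurfer's
# environment (FREESURFER_HOME and PATH) is set up
```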

The same applies if you try to use SPM but forgot to tell Nipype where to find it. If you forgot to tell the system where to find MATLAB (or MCR), you will get the same kind of error as above. But if you forgot to specify which SPM version you want to use, you'll get the following RuntimeError:

Standard error:
MATLAB code threw an exception:
SPM not in matlab path

You can solve this issue by specifying the path to your SPM version:

from nipype.interfaces.matlab import MatlabCommand
MatlabCommand.set_default_paths('/opt/spm12-r7219/spm12_mcr/spm12')

## Example Crash 6: You forget mandatory inputs or use input fields that don't exist¶

Some of the simpler errors are the ones connected to input and output fields.

### Forgetting mandatory input fields¶

Let's see what happens if you forget a [Mandatory] input field.

In [ ]:
from nipype.interfaces.spm import Realign
realign = Realign(register_to_mean=True)

try:
    realign.run()
except(ValueError) as err:
    print("ValueError:", err)
else:
    raise
ValueError: Realign requires a value for input 'in_files'. For a list of required inputs, see Realign.help()

This gives you the error:

ValueError: Realign requires a value for input 'in_files'. For a list of required inputs, see Realign.help()

As described by the error text, if we use the help() function we can see which inputs are mandatory and which are optional.

In [ ]:
realign.help()
Use spm_realign for estimating within modality rigid body alignment

http://www.fil.ion.ucl.ac.uk/spm/doc/manual.pdf#page=25

Examples
--------

>>> import nipype.interfaces.spm as spm
>>> realign = spm.Realign()
>>> realign.inputs.in_files = 'functional.nii'
>>> realign.inputs.register_to_mean = True
>>> realign.run() # doctest: +SKIP

Inputs::

[Mandatory]
in_files: (a list of items which are an existing, uncompressed file
(valid extensions: [.img, .hdr, .nii]) or a list of items which are
an existing, uncompressed file (valid extensions: [.img, .hdr,
.nii]))
list of filenames to realign

[Optional]
fwhm: (a floating point number >= 0.0)
gaussian smoothing kernel width
ignore_exception: (a boolean, nipype default value: False)
Print an error message instead of throwing an exception in case the
interface fails to run
interp: (0 <= a long integer <= 7)
degree of b-spline used for interpolation
jobtype: ('estwrite' or 'estimate' or 'write', nipype default value:
estwrite)
one of: estimate, write, estwrite
matlab_cmd: (a unicode string)
matlab command to use
mfile: (a boolean, nipype default value: True)
Run m-code using m-file
out_prefix: (a string, nipype default value: r)
realigned output prefix
paths: (a list of items which are a directory name)
quality: (0.0 <= a floating point number <= 1.0)
0.1 = fast, 1.0 = precise
register_to_mean: (a boolean)
Indicate whether realignment is done to the mean image
separation: (a floating point number >= 0.0)
sampling separation in mm
use_mcr: (a boolean)
Run m-code using SPM MCR
use_v8struct: (a boolean, nipype default value: True)
Generate SPM8 and higher compatible jobs
weight_img: (an existing file name)
filename of weighting image
wrap: (a list of from 3 to 3 items which are an integer (int or
long))
Check if interpolation should wrap in [x,y,z]
write_interp: (0 <= a long integer <= 7)
degree of b-spline used for interpolation
write_which: (a list of items which are a value of class 'int',
nipype default value: [2, 1])
determines which images to reslice
write_wrap: (a list of from 3 to 3 items which are an integer (int or
long))
Check if interpolation should wrap in [x,y,z]

Outputs::

mean_image: (an existing file name)
Mean image file from the realignment
modified_in_files: (a list of items which are a list of items which
are an existing file name or an existing file name)
Copies of all files passed to in_files. Headers will have been
modified to align all images with the first, or optionally to first
do that, extract a mean image, and re-align to that mean image.
realigned_files: (a list of items which are a list of items which are
an existing file name or an existing file name)
If jobtype is write or estwrite, these will be the resliced files.
Otherwise, they will be copies of in_files that have had their
realignment_parameters: (a list of items which are an existing file
name)
Estimated translation and rotation parameters

References::
BibTeX('@book{FrackowiakFristonFrithDolanMazziotta1997,author={R.S.J. Frackowiak, K.J. Friston, C.D. Frith, R.J. Dolan, and J.C. Mazziotta},title={Human Brain Function},publisher={Academic Press USA},year={1997},}', key='FrackowiakFristonFrithDolanMazziotta1997')

### Using input fields that don't exist¶

Let's see what happens if we try to specify a parameter that doesn't exist as an input field:

In [ ]:
from nipype.interfaces.afni import Despike

try:
    despike = Despike(output_type='NIFTI')
except(Exception) as err:
    if "TraitError" in str(err.__class__):
        print("TraitError:", err)
    else:
        raise
else:
    raise
TraitError: Cannot set the undefined 'output_type' attribute of a 'DespikeInputSpec' object.

This results in the TraitError:

TraitError: Cannot set the undefined 'output_type' attribute of a 'DespikeInputSpec' object.

So what went wrong? If you use the help() function, you will see that the correct input field is called outputtype, not output_type.

## Example Crash 7: Trying to connect a node to an input field that is already occupied¶

Sometimes when you build a new workflow, you might forget that an input field is already connected, and you try to connect a new node to the already occupied field.

First, let's create a simple workflow:

In [ ]:
from nipype import SelectFiles, Node, Workflow
from os.path import abspath as opap
from nipype.interfaces.fsl import MCFLIRT, IsotropicSmooth

# Create SelectFiles node ('templates' as defined above)
sf = Node(SelectFiles(templates),
          name='selectfiles')
sf.inputs.base_directory = opap('/data/ds000114')
sf.inputs.subject_id = 'sub-01'

# Create Motion Correction Node
mcflirt = Node(MCFLIRT(mean_vol=True,
                       save_plots=True),
               name='mcflirt')

# Create Smoothing node
smooth = Node(IsotropicSmooth(fwhm=4),
              name='smooth')

# Create a preprocessing workflow
wf = Workflow(name="preprocWF")
wf.base_dir = 'working_dir'

# Connect the three nodes to each other
wf.connect([(sf, mcflirt, [("func", "in_file")]),
            (mcflirt, smooth, [("out_file", "in_file")])])

Now, let's create a new node and connect it to the already occupied input field in_file of the smooth node:

In [ ]:
# Create a new node
mcflirt_NEW = Node(MCFLIRT(mean_vol=True),
                   name='mcflirt_NEW')

# Connect it to an already connected input field
try:
    wf.connect([(mcflirt_NEW, smooth, [("out_file", "in_file")])])
except(Exception) as err:
    print("Exception:", err)
else:
    raise
Exception: Trying to connect preprocWF.mcflirt_NEW:out_file to preprocWF.smooth:in_file but input 'in_file' of node 'preprocWF.smooth' is already
connected.

This will lead to the error:

Exception:
Trying to connect preprocWF.mcflirt_NEW:out_file to preprocWF.smooth:in_file but input 'in_file' of node 'preprocWF.smooth' is already connected.