Some steps in a neuroimaging analysis are repetitive: running the same preprocessing on multiple subjects, or doing statistical inference on multiple files. To prevent the creation of multiple individual scripts, Nipype has an execution mechanism for Workflow, called iterables.

If you are interested in more advanced procedures, such as synchronizing multiple iterables or using conditional iterables, check out the synchronize and intersource sections in the JoinNode notebook.
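To preview the difference in plain Python (this is an illustration only, not the Nipype API): by default, multiple iterables are expanded as a Cartesian product of their values, while synchronized iterables are paired element-wise, like `zip`.

```python
from itertools import product

# Two hypothetical iterable fields with two values each.
fwhm_values = [4, 8]
sigma_values = ["a", "b"]

# Default expansion: every combination (Cartesian product) -> 4 subgraphs.
print(list(product(fwhm_values, sigma_values)))
# [(4, 'a'), (4, 'b'), (8, 'a'), (8, 'b')]

# Synchronized expansion: values are paired positionally -> 2 subgraphs.
print(list(zip(fwhm_values, sigma_values)))
# [(4, 'a'), (8, 'b')]
```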

Realistic example

Let's assume we have a workflow with two nodes: node (A) does simple skull stripping and is followed by a node (B) that does isotropic smoothing. Now, let's say we are curious about the effect of different smoothing kernels. Therefore, we want to run the smoothing node with FWHM set to 4mm, 8mm, and 16mm.

In [ ]:
from nipype import Node, Workflow
from nipype.interfaces.fsl import BET, IsotropicSmooth

# Initiate a skull stripping Node with BET
skullstrip = Node(BET(mask=True,
                      in_file='/data/ds000114/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz'),
                  name="skullstrip")

Create a smoothing Node with IsotropicSmooth

In [ ]:
isosmooth = Node(IsotropicSmooth(), name='iso_smooth')

Now, using iterables to smooth with different FWHM values is as simple as this:

In [ ]:
isosmooth.iterables = ("fwhm", [4, 8, 16])

To wrap it up, we need to create a workflow, connect the nodes, and finally run the workflow in parallel.

In [ ]:
# Create the workflow
wf = Workflow(name="smoothflow")
wf.base_dir = "/output"
wf.connect(skullstrip, 'out_file', isosmooth, 'in_file')

# Run it in parallel (one core for each smoothing kernel)
wf.run('MultiProc', plugin_args={'n_procs': 3})
180514-09:16:04,390 workflow INFO:
	 Workflow smoothflow settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:16:04,412 workflow INFO:
	 Running in parallel.
180514-09:16:04,417 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 3/3.
180514-09:16:04,463 workflow INFO:
	 [Job 0] Cached (smoothflow.skullstrip).
180514-09:16:06,418 workflow INFO:
	 [MultiProc] Running 0 tasks, and 3 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 3/3.
180514-09:16:06,463 workflow INFO:
	 [Job 1] Cached (smoothflow.iso_smooth).
180514-09:16:06,467 workflow INFO:
	 [Job 2] Cached (smoothflow.iso_smooth).
180514-09:16:06,471 workflow INFO:
	 [Job 3] Cached (smoothflow.iso_smooth).
Out[ ]:
<networkx.classes.digraph.DiGraph at 0x7f77bd100470>

Note that iterables is set on a specific node (isosmooth in this case), but the Workflow is needed to expand the graph into three subgraphs, each with a different version of the isosmooth node.
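Conceptually (this sketch is plain Python, not Nipype internals), the expansion turns each combination of iterable values into the parameter set of one subgraph copy:

```python
from itertools import product

# One iterable field with three values expands a single node into
# three independent copies, one per parameter set.
iterables = {"fwhm": [4, 8, 16]}
expansions = [dict(zip(iterables, combo))
              for combo in product(*iterables.values())]
print(expansions)
# [{'fwhm': 4}, {'fwhm': 8}, {'fwhm': 16}]
```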

If we visualize the graph with graph2use='exec', we can see where the parallelization actually takes place.

In [ ]:
# Visualize the detailed graph
from IPython.display import Image
wf.write_graph(graph2use='exec', format='png', simple_form=True)
180514-09:16:08,656 workflow INFO:
	 Generated workflow graph: /output/smoothflow/graph.png (graph2use=exec, simple_form=True).
Out[ ]:

If you look at the structure in the workflow directory, you can also see that a specific folder was created for each smoothing kernel, e.g. _fwhm_16.

In [ ]:
!tree /output/smoothflow -I '*txt|*pklz|report*|*.json|*js|*.dot|*.html'
├── _fwhm_16
│   └── iso_smooth
│       ├── _report
│       └── sub-01_ses-test_T1w_brain_smooth.nii.gz
├── _fwhm_4
│   └── iso_smooth
│       ├── _report
│       └── sub-01_ses-test_T1w_brain_smooth.nii.gz
├── _fwhm_8
│   └── iso_smooth
│       ├── _report
│       └── sub-01_ses-test_T1w_brain_smooth.nii.gz
├── graph_detailed.png
├── graph.png
└── skullstrip
    ├── _report
    └── sub-01_ses-test_T1w_brain.nii.gz

11 directories, 6 files
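The per-parameter directory names above follow a simple convention: an underscore-joined list of (field, value) pairs. A plain-Python sketch of the naming pattern (an illustration, not the Nipype implementation):

```python
# Build a working-directory name from a subgraph's iterable parameters,
# e.g. {"fwhm": 16} -> "_fwhm_16".
def param_dirname(params):
    return "_" + "_".join(f"{field}_{value}"
                          for field, value in sorted(params.items()))

print(param_dirname({"fwhm": 16}))  # _fwhm_16
print(param_dirname({"fwhm": 4}))   # _fwhm_4
```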

Now, let's visualize the results!

In [ ]:
from nilearn import plotting
%matplotlib inline
In [ ]:
plotting.plot_anat(
    '/data/ds000114/sub-01/ses-test/anat/sub-01_ses-test_T1w.nii.gz', title='original',
    display_mode='z', dim=-1, cut_coords=(-50, -35, -20, -5), annotate=False);
In [ ]:
plotting.plot_anat(
    '/output/smoothflow/skullstrip/sub-01_ses-test_T1w_brain.nii.gz', title='skullstripped',
    display_mode='z', dim=-1, cut_coords=(-50, -35, -20, -5), annotate=False);
In [ ]:
plotting.plot_anat(
    '/output/smoothflow/_fwhm_4/iso_smooth/sub-01_ses-test_T1w_brain_smooth.nii.gz', title='FWHM=4',
    display_mode='z', dim=-0.5, cut_coords=(-50, -35, -20, -5), annotate=False);
In [ ]:
plotting.plot_anat(
    '/output/smoothflow/_fwhm_8/iso_smooth/sub-01_ses-test_T1w_brain_smooth.nii.gz', title='FWHM=8',
    display_mode='z', dim=-0.5, cut_coords=(-50, -35, -20, -5), annotate=False);
In [ ]:
plotting.plot_anat(
    '/output/smoothflow/_fwhm_16/iso_smooth/sub-01_ses-test_T1w_brain_smooth.nii.gz', title='FWHM=16',
    display_mode='z', dim=-0.5, cut_coords=(-50, -35, -20, -5), annotate=False);

IdentityInterface (special use case of iterables)

We often want to start our workflow by creating subgraphs, e.g. for running preprocessing on all subjects. We can easily do this by setting iterables on an IdentityInterface. The IdentityInterface allows you to create Nodes that do simple identity mapping, i.e. Nodes that only work on parameters/strings.
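In plain Python terms (a conceptual sketch, not the Nipype interface), an identity mapping just passes its inputs through unchanged, which makes such a node a convenient attachment point for iterables:

```python
# A trivial identity mapping: the input parameter comes back out unchanged.
def identity(subject_id):
    return {"subject_id": subject_id}

# Fanning a parameter list out through the identity, one result per subject.
subject_list = ['01', '02', '03', '04', '05']
print([identity(s) for s in subject_list])
```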

For example, say you want to start your workflow by collecting anatomical files for 5 subjects.

In [ ]:
# First, let's specify the list of subjects
subject_list = ['01', '02', '03', '04', '05']

Now, we can create the IdentityInterface Node

In [ ]:
from nipype import IdentityInterface
infosource = Node(IdentityInterface(fields=['subject_id']),
                  name="infosource")
infosource.iterables = [('subject_id', subject_list)]

That's it. Now we can connect the output fields of this infosource node to SelectFiles and DataSink nodes.
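Before wiring things up, note that the `{subject_id}` placeholder used in SelectFiles templates behaves like an ordinary Python format field; conceptually, each iteration fills it in like this:

```python
from os.path import join as opj

# The same template string used below, filled in for one subject.
anat_template = opj('sub-{subject_id}', 'ses-test', 'anat',
                    'sub-{subject_id}_ses-test_T1w.nii.gz')
print(anat_template.format(subject_id='01'))
```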

In [ ]:
from os.path import join as opj
from nipype import SelectFiles, DataSink

anat_file = opj('sub-{subject_id}', 'ses-test', 'anat', 'sub-{subject_id}_ses-test_T1w.nii.gz')

templates = {'anat': anat_file}

selectfiles = Node(SelectFiles(templates,
                               base_directory='/data/ds000114'),
                   name="selectfiles")

# Datasink - creates output folder for important outputs
datasink = Node(DataSink(base_directory="/output",
                         container="datasink"),
                name="datasink")

wf_sub = Workflow(name="choosing_subjects")
wf_sub.connect(infosource, "subject_id", selectfiles, "subject_id")
wf_sub.connect(selectfiles, "anat", datasink, "anat_files")
wf_sub.run()
180514-09:16:36,65 workflow INFO:
	 Workflow choosing_subjects settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:16:36,88 workflow INFO:
	 Running serially.
180514-09:16:36,89 workflow INFO:
	 [Node] Setting-up "choosing_subjects.selectfiles" in "/tmp/tmptq0wihmm/choosing_subjects/_subject_id_05/selectfiles".
180514-09:16:36,94 workflow INFO:
	 [Node] Running "selectfiles" ("")
180514-09:16:36,100 workflow INFO:
	 [Node] Finished "choosing_subjects.selectfiles".
180514-09:16:36,102 workflow INFO:
	 [Node] Setting-up "choosing_subjects.datasink" in "/tmp/tmpahli7s3b/choosing_subjects/_subject_id_05/datasink".
180514-09:16:36,112 workflow INFO:
	 [Node] Running "datasink" ("")
180514-09:16:36,135 workflow INFO:
	 [Node] Finished "choosing_subjects.datasink".
180514-09:16:36,136 workflow INFO:
	 [Node] Setting-up "choosing_subjects.selectfiles" in "/tmp/tmpn4n7wuql/choosing_subjects/_subject_id_04/selectfiles".
180514-09:16:36,140 workflow INFO:
	 [Node] Running "selectfiles" ("")
180514-09:16:36,145 workflow INFO:
	 [Node] Finished "choosing_subjects.selectfiles".
180514-09:16:36,146 workflow INFO:
	 [Node] Setting-up "choosing_subjects.datasink" in "/tmp/tmpo5zqli58/choosing_subjects/_subject_id_04/datasink".
180514-09:16:36,153 workflow INFO:
	 [Node] Running "datasink" ("")
180514-09:16:36,158 workflow INFO:
	 [Node] Finished "choosing_subjects.datasink".
180514-09:16:36,159 workflow INFO:
	 [Node] Setting-up "choosing_subjects.selectfiles" in "/tmp/tmpjn4miyer/choosing_subjects/_subject_id_03/selectfiles".
180514-09:16:36,163 workflow INFO:
	 [Node] Running "selectfiles" ("")
180514-09:16:36,168 workflow INFO:
	 [Node] Finished "choosing_subjects.selectfiles".
180514-09:16:36,169 workflow INFO:
	 [Node] Setting-up "choosing_subjects.datasink" in "/tmp/tmp0nkil0_h/choosing_subjects/_subject_id_03/datasink".
180514-09:16:36,175 workflow INFO:
	 [Node] Running "datasink" ("")
180514-09:16:36,179 workflow INFO:
	 [Node] Finished "choosing_subjects.datasink".
180514-09:16:36,181 workflow INFO:
	 [Node] Setting-up "choosing_subjects.selectfiles" in "/tmp/tmpqfn0qf9r/choosing_subjects/_subject_id_02/selectfiles".
180514-09:16:36,186 workflow INFO:
	 [Node] Running "selectfiles" ("")
180514-09:16:36,191 workflow INFO:
	 [Node] Finished "choosing_subjects.selectfiles".
180514-09:16:36,192 workflow INFO:
	 [Node] Setting-up "choosing_subjects.datasink" in "/tmp/tmpk41d2ifu/choosing_subjects/_subject_id_02/datasink".
180514-09:16:36,198 workflow INFO:
	 [Node] Running "datasink" ("")
180514-09:16:36,204 workflow INFO:
	 [Node] Finished "choosing_subjects.datasink".
180514-09:16:36,205 workflow INFO:
	 [Node] Setting-up "choosing_subjects.selectfiles" in "/tmp/tmphi4x0wvy/choosing_subjects/_subject_id_01/selectfiles".
180514-09:16:36,210 workflow INFO:
	 [Node] Running "selectfiles" ("")
180514-09:16:36,216 workflow INFO:
	 [Node] Finished "choosing_subjects.selectfiles".
180514-09:16:36,217 workflow INFO:
	 [Node] Setting-up "choosing_subjects.datasink" in "/tmp/tmpmdti4c5d/choosing_subjects/_subject_id_01/datasink".
180514-09:16:36,224 workflow INFO:
	 [Node] Running "datasink" ("")
180514-09:16:36,464 workflow INFO:
	 [Node] Finished "choosing_subjects.datasink".
Out[ ]:
<networkx.classes.digraph.DiGraph at 0x7f77bc5fd710>

Now we can check that the five anatomical images are in the anat_files directory:

In [ ]:
! ls -lh /output/datasink/anat_files/
total 35M
-rw-r--r-- 1 neuro users 8.3M May  3 07:29 sub-01_ses-test_T1w.nii.gz
-rw-r--r-- 1 neuro users 9.6M May 13 22:11 sub-02_ses-test_T1w.nii.gz
-rw-r--r-- 1 neuro users 7.7M May 13 22:11 sub-03_ses-test_T1w.nii.gz
-rw-r--r-- 1 neuro users 9.3M May 13 22:11 sub-04_ses-test_T1w.nii.gz

This was just a simple example of using IdentityInterface; you can find a complete preprocessing workflow in the Preprocessing Example.

Exercise 1

Create a workflow to calculate various powers of 2 using two nodes: one an IdentityInterface with iterables, and one a Function interface that calculates the power of 2.

In [ ]:
# write your solution here
In [ ]:
# lets start from the Identity node
from nipype import Function, Node, Workflow
from nipype.interfaces.utility import IdentityInterface

iden = Node(IdentityInterface(fields=['number']), name="identity")
iden.iterables = [("number", range(8))]
In [ ]:
# the second node should use the Function interface
def power_of_two(n):
    return 2**n

# Create Node
power = Node(Function(input_names=["n"],
                      output_names=["pow"],
                      function=power_of_two),
             name='power')
In [ ]:
#and now the workflow
wf_ex1 = Workflow(name="exercise1")
wf_ex1.connect(iden, "number", power, "n")
res_ex1 = wf_ex1.run()

# we can print the results
for i in range(8):
    print(list(res_ex1.nodes())[i].result.outputs)
180514-09:16:37,175 workflow INFO:
	 Workflow exercise1 settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:16:37,188 workflow INFO:
	 Running serially.
180514-09:16:37,189 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpvozrai48/exercise1/_number_7/power".
180514-09:16:37,193 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,198 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,199 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpfybxi3e4/exercise1/_number_6/power".
180514-09:16:37,203 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,208 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,209 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpr7z0xy4u/exercise1/_number_5/power".
180514-09:16:37,213 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,219 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,220 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpy5n8vgsh/exercise1/_number_4/power".
180514-09:16:37,223 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,228 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,229 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpvjjj7j5n/exercise1/_number_3/power".
180514-09:16:37,232 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,237 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,238 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpkbo9hgu3/exercise1/_number_2/power".
180514-09:16:37,241 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,245 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,246 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmpsdb_4wyt/exercise1/_number_1/power".
180514-09:16:37,250 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,254 workflow INFO:
	 [Node] Finished "exercise1.power".
180514-09:16:37,255 workflow INFO:
	 [Node] Setting-up "exercise1.power" in "/tmp/tmprpl39ttq/exercise1/_number_0/power".
180514-09:16:37,259 workflow INFO:
	 [Node] Running "power" ("nipype.interfaces.utility.wrappers.Function")
180514-09:16:37,263 workflow INFO:
	 [Node] Finished "exercise1.power".

pow = 1

pow = 2

pow = 4

pow = 8

pow = 16

pow = 32

pow = 64

pow = 128
