Hands-on 1: How to create an fMRI preprocessing workflow

The purpose of this section is for you to set up a complete fMRI analysis workflow yourself, so that in the end you are able to perform the analysis from A to Z, i.e. from preprocessing to group analysis. This section covers the preprocessing part, and the section Hands-on 2: Analysis will handle the analysis part.

We will use this opportunity to show you some nice additional interfaces/nodes that might not be relevant to your usual analysis, but it's always good to know that they exist. And hopefully, this will encourage you to investigate all the other interfaces that Nipype puts at your fingertips.

Preparation

Before we can start with anything, we first need to download the data. For this hands-on, we will only use the right-handed subjects 2-4 and 7-9. This can be done very quickly with the following datalad command.

Note: This might take a while, as datalad needs to download ~200MB of data

In [ ]:
%%bash
datalad get -J 4 /data/ds000114/sub-0[234789]/ses-test/anat/sub-0[234789]_ses-test_T1w.nii.gz \
                /data/ds000114/sub-0[234789]/ses-test/func/*fingerfootlips*
get(notneeded): /data/ds000114/sub-02/ses-test/anat/sub-02_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-03/ses-test/anat/sub-03_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-04/ses-test/anat/sub-04_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-07/ses-test/anat/sub-07_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-08/ses-test/anat/sub-08_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-09/ses-test/anat/sub-09_ses-test_T1w.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-02/ses-test/func/sub-02_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-03/ses-test/func/sub-03_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-04/ses-test/func/sub-04_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-07/ses-test/func/sub-07_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-08/ses-test/func/sub-08_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
get(notneeded): /data/ds000114/sub-09/ses-test/func/sub-09_ses-test_task-fingerfootlips_bold.nii.gz (file) [already present]
action summary:
  get (notneeded: 12)
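As a side note, the `sub-0[234789]` bracket pattern in the datalad call above expands to exactly these six right-handed subjects. You can check this with Python's fnmatch (a quick illustration, not part of the workflow; the assumption is that ds000114 contains subjects sub-01 through sub-10):

```python
import fnmatch

# The ten subjects of ds000114 (assumption: sub-01 .. sub-10)
subjects = ['sub-%02d' % i for i in range(1, 11)]

# The same bracket pattern used in the datalad call above
right_handed = fnmatch.filter(subjects, 'sub-0[234789]')
print(right_handed)
# → ['sub-02', 'sub-03', 'sub-04', 'sub-07', 'sub-08', 'sub-09']
```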

Preprocessing Workflow Structure

So let's get our hands dirty. First things first: it's always good to know which interfaces you want to use in your workflow and in which order you want to execute them. For the preprocessing workflow, I recommend the following nodes:

 1. Gunzip (Nipype)
 2. Drop Dummy Scans (FSL)
 3. Slice Time Correction (SPM)
 4. Motion Correction (FSL)
 5. Artifact Detection
 6. Segmentation (SPM)
 7. Coregistration (FSL)
 8. Smoothing (FSL)
 9. Apply Binary Mask (FSL)
10. Remove Linear Trends (Nipype)

Note: This workflow might be overkill concerning data manipulation, but it hopefully serves as a good Nipype exercise.

Imports

It's always best to have all relevant module imports at the beginning of your script. So let's import what we most certainly need.

In [ ]:
# Get the Node and Workflow object
from nipype import Node, Workflow

# Specify which SPM to use
from nipype.interfaces.matlab import MatlabCommand
MatlabCommand.set_default_paths('/opt/spm12-r7219/spm12_mcr/spm12')

Note: Ideally you would also put the imports of all the interfaces that you use here at the top. But as we will develop the workflow step by step, we can also import the relevant modules as we go.

Create Nodes and Workflow connections

Let's create all the nodes that we need! Make sure to specify all relevant inputs and keep in mind which ones you will later need to connect in your pipeline.

Workflow

We recommend creating the workflow and establishing all its connections in one place later in your script; this keeps everything nicely together. But for this hands-on example, it makes sense to establish the connections between the nodes as we go.

And for this, we first need to create a workflow:

In [ ]:
# Create the workflow here
# Hint: use 'base_dir' to specify where to store the working directory
In [ ]:
preproc = Workflow(name='work_preproc', base_dir='/output/')

Gunzip

I've already created the Gunzip node as a template for the other nodes. We've also specified an in_file here so that we can directly test the nodes without worrying about the input/output data stream to the workflow. This will be taken care of in a later section.

In [ ]:
from nipype.algorithms.misc import Gunzip
In [ ]:
# Specify example input file
func_file = '/data/ds000114/sub-07/ses-test/func/sub-07_ses-test_task-fingerfootlips_bold.nii.gz'

# Initiate Gunzip node
gunzip_func = Node(Gunzip(in_file=func_file), name='gunzip_func')

Drop Dummy Scans

The functional images of this dataset were recorded with 4 dummy scans at the beginning (see the corresponding publication), but those dummy scans have not yet been removed from the functional images.

To better illustrate this, let's plot the time course of an example voxel of the func_file we just defined:

In [ ]:
%matplotlib inline
import pylab as plt
import nibabel as nb
plt.plot(nb.load(func_file).get_fdata()[32, 32, 15, :]);

In the figure above, we see extreme values at the very beginning, which hint that steady state wasn't reached yet. Therefore, we want to exclude the dummy scans from the original data. This can be achieved with FSL's ExtractROI.

In [ ]:
from nipype.interfaces.fsl import ExtractROI
In [ ]:
extract = Node(ExtractROI(t_min=4, t_size=-1, output_type='NIFTI'),
               name="extract")

This ExtractROI node can now be connected to the gunzip_func node from above. To do this, we use the following command:

In [ ]:
preproc.connect([(gunzip_func, extract, [('out_file', 'in_file')])])

Slice Time Correction

Now on to the next step. Let's use SPM's SliceTiming to correct for the slice-wise acquisition of the volumes. As a reminder, the tutorial dataset was recorded...

  • with a time repetition (TR) of 2.5 seconds
  • with 30 slices per volume
  • in an interleaved fashion, i.e. slice order is [1, 3, 5, 7, ..., 2, 4, 6, ..., 30]
  • with a time acquisition (TA) of 2.4167 seconds, i.e. TR-(TR/num_slices)
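The TA value quoted above follows directly from the TR and the number of slices:

```python
TR = 2.5          # repetition time in seconds
num_slices = 30   # slices per volume

# TA = TR - (TR / num_slices)
TA = TR - (TR / num_slices)
print(round(TA, 4))
# → 2.4167
```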
In [ ]:
from nipype.interfaces.spm import SliceTiming
In [ ]:
slice_order = list(range(1, 31, 2)) + list(range(2, 31, 2))
print(slice_order)
[1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30]
In [ ]:
# Initiate SliceTiming node here
In [ ]:
slicetime = Node(SliceTiming(num_slices=30,
                             ref_slice=15,
                             slice_order=slice_order,
                             time_repetition=2.5,
                             time_acquisition=2.5-(2.5/30)),
                 name='slicetime')

Now the next step is to connect the SliceTiming node to the rest of the workflow, i.e. the ExtractROI node.

In [ ]:
# Connect SliceTiming node to the other nodes here
In [ ]:
preproc.connect([(extract, slicetime, [('roi_file', 'in_files')])])

Motion Correction

To correct for motion in the scanner, we will be using FSL's MCFLIRT.

In [ ]:
from nipype.interfaces.fsl import MCFLIRT
In [ ]:
# Initiate MCFLIRT node here
In [ ]:
mcflirt = Node(MCFLIRT(mean_vol=True,
                       save_plots=True),
               name="mcflirt")

Connect the MCFLIRT node to the rest of the workflow.

In [ ]:
# Connect MCFLIRT node to the other nodes here
In [ ]:
preproc.connect([(slicetime, mcflirt, [('timecorrected_files', 'in_file')])])

Artifact Detection

We will use the really cool and useful ArtifactDetection tool from Nipype to detect motion and intensity outliers in the functional images. The interface is initiated as follows:

In [ ]:
from nipype.algorithms.rapidart import ArtifactDetect
In [ ]:
art = Node(ArtifactDetect(norm_threshold=2,
                          zintensity_threshold=2,
                          mask_type='spm_global',
                          parameter_source='FSL',
                          use_differences=[True, False],
                          plot_type='svg'),
           name="art")

The parameters above mean the following:

  • norm_threshold - Threshold to use to detect motion-related outliers when composite motion is being used
  • zintensity_threshold - Intensity Z-threshold used to detect images that deviate from the mean
  • mask_type - Type of mask that should be used to mask the functional data. spm_global uses an spm_global like calculation to determine the brain mask
  • parameter_source - Source of movement parameters
  • use_differences - Whether to use differences between successive motion (first element) and intensity (second element) parameter estimates to determine outliers

And this is how you connect this node to the rest of the workflow:

In [ ]:
preproc.connect([(mcflirt, art, [('out_file', 'realigned_files'),
                                 ('par_file', 'realignment_parameters')])
                 ])

Segmentation of anatomical image

Now let's work on the anatomical image. In particular, let's use SPM's NewSegment to create probability maps for gray matter, white matter, and CSF.

In [ ]:
from nipype.interfaces.spm import NewSegment
In [ ]:
# Use the following tissue specification to get a GM and WM probability map
tpm_img ='/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii'
tissue1 = ((tpm_img, 1), 1, (True,False), (False, False))
tissue2 = ((tpm_img, 2), 1, (True,False), (False, False))
tissue3 = ((tpm_img, 3), 2, (True,False), (False, False))
tissue4 = ((tpm_img, 4), 3, (False,False), (False, False))
tissue5 = ((tpm_img, 5), 4, (False,False), (False, False))
tissue6 = ((tpm_img, 6), 2, (False,False), (False, False))
tissues = [tissue1, tissue2, tissue3, tissue4, tissue5, tissue6]
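For reference, each tissue tuple above follows NewSegment's expected format (as I understand it): ((tpm_file, tissue_index), number_of_gaussians, (write_native, write_dartel), (write_unmodulated, write_modulated)). Unpacking the first one makes this explicit:

```python
# Format: ((tpm_file, tissue_index), n_gaussians,
#          (write_native, write_dartel), (write_unmodulated, write_modulated))
tissue1 = (('/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii', 1),
           1, (True, False), (False, False))

(tpm_file, tissue_index), n_gaussians, native, warped = tissue1
print(tissue_index, n_gaussians, native)
# → 1 1 (True, False)
```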
In [ ]:
# Initiate NewSegment node here
In [ ]:
segment = Node(NewSegment(tissues=tissues), name='segment')

We will again use a Gunzip node to unzip the anatomical image that we then want to use as input to the segmentation node. As before, we specify an example anatomical image here; this will later also be handled directly by the input/output stream.

In [ ]:
# Specify example input file
anat_file = '/data/ds000114/sub-07/ses-test/anat/sub-07_ses-test_T1w.nii.gz'

# Initiate Gunzip node
gunzip_anat = Node(Gunzip(in_file=anat_file), name='gunzip_anat')

Now we can connect the NewSegment node to the rest of the workflow.

In [ ]:
# Connect NewSegment node to the other nodes here
In [ ]:
preproc.connect([(gunzip_anat, segment, [('out_file', 'channel_files')])])

Compute Coregistration Matrix

As a next step, we will make sure that the functional images are coregistered to the anatomical image. For this, we will use FSL's FLIRT function. As we just created a white matter probability map, we can use it together with a Boundary-Based Registration (BBR) cost function to optimize the image coregistration. As some helpful notes...

  • use a degree of freedom of 6
  • specify the cost function as bbr
  • use the schedule='/usr/share/fsl/5.0/etc/flirtsch/bbr.sch'
In [ ]:
from nipype.interfaces.fsl import FLIRT
In [ ]:
# Initiate FLIRT node here
In [ ]:
coreg = Node(FLIRT(dof=6,
                   cost='bbr',
                   schedule='/usr/share/fsl/5.0/etc/flirtsch/bbr.sch',
                   output_type='NIFTI'),
             name="coreg")
In [ ]:
# Connect FLIRT node to the other nodes here
In [ ]:
preproc.connect([(gunzip_anat, coreg, [('out_file', 'reference')]),
                 (mcflirt, coreg, [('mean_img', 'in_file')])
                 ])

As mentioned above, the bbr routine can use the subject-specific white matter probability map to guide the coregistration. But for this, we first need to create a binary mask from the WM probability map. This can easily be done with FSL's Threshold interface.

In [ ]:
from nipype.interfaces.fsl import Threshold

# Threshold - Threshold WM probability image
threshold_WM = Node(Threshold(thresh=0.5,
                              args='-bin',
                              output_type='NIFTI'),
                name="threshold_WM")

Now, to select the WM probability map that the NewSegment node created, we need a small helper function. The output field native_class_images of the segmentation node gives us a list of lists, one per tissue class, i.e. [[GM_prob], [WM_prob], [CSF_prob], [], [], []]. Therefore, with the following function, we can select the WM probability map, i.e. the first file of the second element.

In [ ]:
# Select WM segmentation file from segmentation output
def get_wm(files):
    return files[1][0]

# Connecting the segmentation node with the threshold node
preproc.connect([(segment, threshold_WM, [(('native_class_images', get_wm),
                                           'in_file')])])
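To see what get_wm picks out, we can apply the same indexing to a mock native_class_images list (the file names below are hypothetical):

```python
def get_wm(files):
    return files[1][0]

# Mock segmentation output: one list of files per tissue class
mock_output = [['c1sub-07_T1w.nii'],   # GM
               ['c2sub-07_T1w.nii'],   # WM
               ['c3sub-07_T1w.nii'],   # CSF
               [], [], []]
print(get_wm(mock_output))
# → c2sub-07_T1w.nii
```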

Now we can just connect this Threshold node to the coregistration node from above.

In [ ]:
# Connect Threshold node to coregistration node above here
In [ ]:
preproc.connect([(threshold_WM, coreg, [('out_file', 'wm_seg')])])

Apply Coregistration Matrix to functional image

Now that we have the coregistration matrix to correctly overlay the functional mean image on the subject-specific anatomy, we need to apply the coregistration to the whole time series. This can be achieved with FSL's FLIRT as follows:

In [ ]:
# Specify the isometric voxel resolution you want after coregistration
desired_voxel_iso = 4

# Apply coregistration warp to functional images
applywarp = Node(FLIRT(interp='spline',
                       apply_isoxfm=desired_voxel_iso,
                       output_type='NIFTI'),
                 name="applywarp")

Important: As you can see above, we also specified a variable desired_voxel_iso. This is very important at this stage; otherwise FLIRT would transform your functional images to the resolution of the anatomical image, which would dramatically increase the file size (e.g. to 1-10GB per file). If you don't want to change the voxel resolution, use the additional parameter no_resample=True. Note that for this to work, you still need to define apply_isoxfm=desired_voxel_iso.

In [ ]:
# Connecting the ApplyWarp node to all the other nodes
preproc.connect([(mcflirt, applywarp, [('out_file', 'in_file')]),
                 (coreg, applywarp, [('out_matrix_file', 'in_matrix_file')]),
                 (gunzip_anat, applywarp, [('out_file', 'reference')])
                 ])

Smoothing

The next step is image smoothing. The simplest way to do this is to use FSL's or SPM's Smooth function. But for learning purposes, let's use FSL's SUSAN workflow as it is implemented in Nipype. Note that this time, we are importing a workflow instead of an interface.

In [ ]:
from nipype.workflows.fmri.fsl.preprocess import create_susan_smooth

If you type create_susan_smooth? you can see how to specify the input variables of the susan workflow. In particular, they are...

  • fwhm: set this value to 4 (or whichever value you want)
  • mask_file: will be created in a later step
  • in_file: will be handled while connecting to other nodes in the preproc workflow
In [ ]:
# Initiate SUSAN workflow here
In [ ]:
susan = create_susan_smooth(name='susan')
susan.inputs.inputnode.fwhm = 4
In [ ]:
# Connect the applywarp node to the susan workflow here
In [ ]:
preproc.connect([(applywarp, susan, [('out_file', 'inputnode.in_files')])])

Create Binary Mask

There are many possible approaches to masking your functional images: not masking at all, using a simple brain mask, or using a mask that only considers a certain kind of brain tissue, e.g. gray matter.

For the current example, we want to create a dilated gray matter mask. For this purpose we need to:

  1. Resample the gray matter probability map to the same resolution as the functional images
  2. Threshold this resampled probability map at a specific value
  3. Dilate this mask by some voxels to make the mask less conservative and more inclusive

The first step can be done in many ways (e.g. using freesurfer's mri_convert or nibabel), but in our case, we will use FSL's FLIRT. The trick is to use the probability mask as both the input file and the reference file.

In [ ]:
from nipype.interfaces.fsl import FLIRT

# Initiate resample node
resample = Node(FLIRT(apply_isoxfm=desired_voxel_iso,
                      output_type='NIFTI'),
                name="resample")

The second and third steps can luckily be done with just one node. We can take almost the same Threshold node as above; we just need to add one additional argument: -dilF, which applies a maximum filter to all voxels.

In [ ]:
from nipype.interfaces.fsl import Threshold

# Threshold - Threshold GM probability image
mask_GM = Node(Threshold(thresh=0.5,
                         args='-bin -dilF',
                         output_type='NIFTI'),
                name="mask_GM")

# Select GM segmentation file from segmentation output
def get_gm(files):
    return files[0][0]

Now we can connect the resample and gray matter mask nodes to the segmentation node and to each other.

In [ ]:
preproc.connect([(segment, resample, [(('native_class_images', get_gm), 'in_file'),
                                      (('native_class_images', get_gm), 'reference')
                                      ]),
                 (resample, mask_GM, [('out_file', 'in_file')])
                 ])

This should do the trick.

Apply the binary mask

Now we can connect this dilated gray matter mask to the susan node, as well as actually apply it to the resulting smoothed images.

In [ ]:
# Connect gray matter Mask node to the susan workflow here
In [ ]:
preproc.connect([(mask_GM, susan, [('out_file', 'inputnode.mask_file')])])

To apply the mask to the smoothed functional images, we will use FSL's ApplyMask interface.

In [ ]:
from nipype.interfaces.fsl import ApplyMask

Important: The susan workflow outputs a list of files, i.e. [smoothed_func.nii], instead of just the filename directly. If we used a normal Node for ApplyMask, this would lead to the following error:

TraitError: The 'in_file' trait of an ApplyMaskInput instance must be an existing file name, but a value of ['/output/work_preproc/susan/smooth/mapflow/_smooth0/asub-07_ses-test_task-fingerfootlips_bold_mcf_flirt_smooth.nii.gz'] <class 'list'> was specified.


To prevent this, we will use a MapNode and specify in_file as its iterfield. This way, the node can handle a list of inputs, as it knows that it has to apply itself iteratively to each element of the list.
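Conceptually, a MapNode behaves like Python's map over the iterfield. A plain-Python sketch (with a stand-in function and hypothetical file names, not the actual interface):

```python
# Stand-in for an interface that accepts a single file
def apply_mask(in_file):
    return in_file.replace('.nii', '_masked.nii')

# susan returns a list of smoothed files, even for a single input
smoothed_files = ['asub-07_bold_smooth.nii']

# A MapNode with iterfield=['in_file'] effectively does this:
masked_files = [apply_mask(f) for f in smoothed_files]
print(masked_files)
# → ['asub-07_bold_smooth_masked.nii']
```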

In [ ]:
from nipype import MapNode
In [ ]:
# Initiate ApplyMask node here
In [ ]:
mask_func = MapNode(ApplyMask(output_type='NIFTI'),
                    name="mask_func",
                    iterfield=["in_file"])
In [ ]:
# Connect smoothed susan output file to ApplyMask node here
In [ ]:
preproc.connect([(susan, mask_func, [('outputnode.smoothed_files', 'in_file')]),
                 (mask_GM, mask_func, [('out_file', 'mask_file')])
                 ])

Last but not least, let's use Nipype's TSNR module to remove linear and quadratic trends in the smoothed functional images. For this, you only have to specify the regress_poly parameter in the node initiation.

In [ ]:
from nipype.algorithms.confounds import TSNR
In [ ]:
# Initiate TSNR node here
In [ ]:
detrend = Node(TSNR(regress_poly=2), name="detrend")
In [ ]:
# Connect the detrend node to the other nodes here
In [ ]:
preproc.connect([(mask_func, detrend, [('out_file', 'in_file')])])

Data input with SelectFiles and iterables

This is all nice and well, but so far we still had to specify the input values for gunzip_anat and gunzip_func ourselves. How can we scale this up to multiple subjects and/or multiple functional images, and make the workflow take the input directly from the BIDS dataset?

For this, we need SelectFiles and iterables! It's rather simple: specify a template and fill in the placeholder variables.

In [ ]:
# Import the SelectFiles
from nipype import SelectFiles

# String template with {}-based strings
templates = {'anat': 'sub-{subject_id}/ses-{ses_id}/anat/'
                     'sub-{subject_id}_ses-{ses_id}_T1w.nii.gz',
             'func': 'sub-{subject_id}/ses-{ses_id}/func/'
                     'sub-{subject_id}_ses-{ses_id}_task-{task_id}_bold.nii.gz'}

# Create SelectFiles node
sf = Node(SelectFiles(templates,
                      base_directory='/data/ds000114',
                      sort_filelist=True),
          name='selectfiles')
sf.inputs.ses_id='test'
sf.inputs.task_id='fingerfootlips'

Now we can specify over which subjects the workflow should iterate. To test the workflow, let's start with just subject 7.

In [ ]:
subject_list = ['07']
sf.iterables = [('subject_id', subject_list)]
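Once the workflow runs through for one subject, you could extend the iterables to all six right-handed subjects downloaded at the beginning (a sketch; sf is the SelectFiles node from above):

```python
# All right-handed subjects of this hands-on
subject_list = ['02', '03', '04', '07', '08', '09']

# On the SelectFiles node, this would become:
# sf.iterables = [('subject_id', subject_list)]
print(len(subject_list))
# → 6
```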
In [ ]:
# Connect SelectFiles node to the other nodes here
In [ ]:
preproc.connect([(sf, gunzip_anat, [('anat', 'in_file')]),
                 (sf, gunzip_func, [('func', 'in_file')])])

Visualize the workflow

Now that we're done, let's look at the workflow that we just created.

In [ ]:
# Create preproc output graph
preproc.write_graph(graph2use='colored', format='png', simple_form=True)

# Visualize the graph
from IPython.display import Image
Image(filename='/output/work_preproc/graph.png', width=750)
180514-09:56:56,605 workflow INFO:
	 Generated workflow graph: /output/work_preproc/graph.png (graph2use=colored, simple_form=True).
Out[ ]:

Run the Workflow

Now we are ready to run the workflow! Be careful with the n_procs parameter when you run a workflow in 'MultiProc' mode. n_procs specifies the number of jobs/cores your computer will use to run the workflow. If this number is too high, your computer will try to execute too many things at once and will most likely crash.
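A common heuristic for choosing n_procs (an assumption on my part, not part of the tutorial) is to leave at least one core free for the system:

```python
import multiprocessing

# Leave one core free so the machine stays responsive
n_procs = max(1, multiprocessing.cpu_count() - 1)

# The run call would then be:
# preproc.run('MultiProc', plugin_args={'n_procs': n_procs})
print(n_procs >= 1)
# → True
```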

Note: If you're using a Docker container and FLIRT fails to run for no apparent reason, you might need to increase the memory settings in the Docker preferences (6 GB should be enough for this workflow).

In [ ]:
preproc.run('MultiProc', plugin_args={'n_procs': 8})
180514-09:56:56,693 workflow INFO:
	 Workflow work_preproc settings: ['check', 'execution', 'logging', 'monitoring']
180514-09:56:56,726 workflow INFO:
	 Running in parallel.
180514-09:56:56,730 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-09:56:56,803 workflow INFO:
	 [Node] Setting-up "work_preproc.selectfiles" in "/output/work_preproc/_subject_id_07/selectfiles".
180514-09:56:56,838 workflow INFO:
	 [Node] Running "selectfiles" ("nipype.interfaces.io.SelectFiles")
180514-09:56:56,860 workflow INFO:
	 [Node] Finished "work_preproc.selectfiles".
180514-09:56:58,732 workflow INFO:
	 [Job 0] Completed (work_preproc.selectfiles).
180514-09:56:58,735 workflow INFO:
	 [MultiProc] Running 0 tasks, and 2 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-09:56:58,792 workflow INFO:
	 [Node] Setting-up "work_preproc.gunzip_anat" in "/output/work_preproc/_subject_id_07/gunzip_anat".
180514-09:56:58,790 workflow INFO:
	 [Node] Setting-up "work_preproc.gunzip_func" in "/output/work_preproc/_subject_id_07/gunzip_func".
180514-09:56:58,822 workflow INFO:
	 [Node] Running "gunzip_anat" ("nipype.algorithms.misc.Gunzip")
180514-09:56:58,826 workflow INFO:
	 [Node] Running "gunzip_func" ("nipype.algorithms.misc.Gunzip")
180514-09:56:59,281 workflow INFO:
	 [Node] Finished "work_preproc.gunzip_anat".
180514-09:56:59,647 workflow INFO:
	 [Node] Finished "work_preproc.gunzip_func".
180514-09:57:00,733 workflow INFO:
	 [Job 1] Completed (work_preproc.gunzip_func).
180514-09:57:00,735 workflow INFO:
	 [Job 6] Completed (work_preproc.gunzip_anat).
180514-09:57:00,737 workflow INFO:
	 [MultiProc] Running 0 tasks, and 2 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-09:57:00,792 workflow INFO:
	 [Node] Setting-up "work_preproc.segment" in "/output/work_preproc/_subject_id_07/segment".
180514-09:57:00,802 workflow INFO:
	 [Node] Running "segment" ("nipype.interfaces.spm.preprocess.NewSegment")
180514-09:57:00,785 workflow INFO:
	 [Node] Setting-up "work_preproc.extract" in "/output/work_preproc/_subject_id_07/extract".
180514-09:57:00,820 workflow INFO:
	 [Node] Running "extract" ("nipype.interfaces.fsl.utils.ExtractROI"), a CommandLine Interface with command:
fslroi /output/work_preproc/_subject_id_07/gunzip_func/sub-07_ses-test_task-fingerfootlips_bold.nii /output/work_preproc/_subject_id_07/extract/sub-07_ses-test_task-fingerfootlips_bold_roi.nii 4 -1
180514-09:57:01,127 workflow INFO:
	 [Node] Finished "work_preproc.extract".
180514-09:57:02,736 workflow INFO:
	 [Job 2] Completed (work_preproc.extract).
180514-09:57:02,739 workflow INFO:
	 [MultiProc] Running 1 tasks, and 1 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.segment
180514-09:57:02,798 workflow INFO:
	 [Node] Setting-up "work_preproc.slicetime" in "/output/work_preproc/_subject_id_07/slicetime".
180514-09:57:02,806 workflow INFO:
	 [Node] Running "slicetime" ("nipype.interfaces.spm.preprocess.SliceTiming")
180514-09:57:04,738 workflow INFO:
	 [MultiProc] Running 2 tasks, and 0 jobs ready. Free memory (GB): 53.54/53.94, Free processors: 6/8.
                     Currently running:
                       * work_preproc.slicetime
                       * work_preproc.segment
180514-09:57:27,694 workflow INFO:
	 [Node] Finished "work_preproc.slicetime".
180514-09:57:28,762 workflow INFO:
	 [Job 3] Completed (work_preproc.slicetime).
180514-09:57:28,766 workflow INFO:
	 [MultiProc] Running 1 tasks, and 1 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.segment
180514-09:57:28,827 workflow INFO:
	 [Node] Setting-up "work_preproc.mcflirt" in "/output/work_preproc/_subject_id_07/mcflirt".
180514-09:57:28,837 workflow INFO:
	 [Node] Running "mcflirt" ("nipype.interfaces.fsl.preprocess.MCFLIRT"), a CommandLine Interface with command:
mcflirt -in /output/work_preproc/_subject_id_07/slicetime/asub-07_ses-test_task-fingerfootlips_bold_roi.nii -meanvol -out /output/work_preproc/_subject_id_07/mcflirt/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz -plots
180514-09:57:30,764 workflow INFO:
	 [MultiProc] Running 2 tasks, and 0 jobs ready. Free memory (GB): 53.54/53.94, Free processors: 6/8.
                     Currently running:
                       * work_preproc.mcflirt
                       * work_preproc.segment
180514-09:58:41,977 workflow INFO:
	 [Node] Finished "work_preproc.mcflirt".
180514-09:58:42,833 workflow INFO:
	 [Job 4] Completed (work_preproc.mcflirt).
180514-09:58:42,836 workflow INFO:
	 [MultiProc] Running 1 tasks, and 1 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.segment
180514-09:58:42,890 workflow INFO:
	 [Node] Setting-up "work_preproc.art" in "/output/work_preproc/_subject_id_07/art".
180514-09:58:42,897 workflow INFO:
	 [Node] Running "art" ("nipype.algorithms.rapidart.ArtifactDetect")
/opt/conda/envs/neuro/lib/python3.6/site-packages/nipype/algorithms/rapidart.py:542: UserWarning:
This call to matplotlib.use() has no effect because the backend has already
been chosen; matplotlib.use() must be called *before* pylab, matplotlib.pyplot,
or matplotlib.backends is imported for the first time.

The backend was *originally* set to 'module://ipykernel.pylab.backend_inline' by the following code:
  File "/opt/conda/envs/neuro/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/opt/conda/envs/neuro/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/ipykernel/__main__.py", line 3, in <module>
    app.launch_new_instance()
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/traitlets/config/application.py", line 658, in launch_instance
    app.start()
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 486, in start
    self.io_loop.start()
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/tornado/platform/asyncio.py", line 127, in start
    self.asyncio_loop.run_forever()
  File "/opt/conda/envs/neuro/lib/python3.6/asyncio/base_events.py", line 422, in run_forever
    self._run_once()
  File "/opt/conda/envs/neuro/lib/python3.6/asyncio/base_events.py", line 1432, in _run_once
    handle._run()
  File "/opt/conda/envs/neuro/lib/python3.6/asyncio/events.py", line 145, in _run
    self._callback(*self._args)
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/tornado/ioloop.py", line 759, in _run_callback
    ret = callback()
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/tornado/stack_context.py", line 276, in null_wrapper
    return fn(*args, **kwargs)
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 536, in <lambda>
    self.io_loop.add_callback(lambda : self._handle_events(self.socket, 0))
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 450, in _handle_events
    self._handle_recv()
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 480, in _handle_recv
    self._run_callback(callback, msg)
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 432, in _run_callback
    callback(*args, **kwargs)
  File "/opt/conda/envs/neuro/lib/python3.6/site-packages/tornado/stack_context.py", line 276, in null_wrapper
    return fn(*args, **kwargs)
[repeated matplotlib "This call to matplotlib.use() has no effect" warnings, triggered by nipype.algorithms.rapidart importing matplotlib after the inline backend was already set, omitted]
180514-09:58:44,76 workflow INFO:
	 [Node] Finished "work_preproc.art".
180514-09:58:44,835 workflow INFO:
	 [Job 5] Completed (work_preproc.art).
180514-09:58:44,838 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.segment
180514-10:00:06,119 workflow INFO:
	 [Node] Finished "work_preproc.segment".
180514-10:00:06,913 workflow INFO:
	 [Job 7] Completed (work_preproc.segment).
180514-10:00:06,921 workflow INFO:
	 [MultiProc] Running 0 tasks, and 2 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:00:06,991 workflow INFO:
	 [Node] Setting-up "work_preproc.resample" in "/output/work_preproc/_subject_id_07/resample".
180514-10:00:06,993 workflow INFO:
	 [Node] Setting-up "work_preproc.threshold_WM" in "/output/work_preproc/_subject_id_07/threshold_WM".
180514-10:00:07,2 workflow INFO:
	 [Node] Running "threshold_WM" ("nipype.interfaces.fsl.maths.Threshold"), a CommandLine Interface with command:
fslmaths /output/work_preproc/_subject_id_07/segment/c2sub-07_ses-test_T1w.nii -thr 0.5000000000 -bin /output/work_preproc/_subject_id_07/threshold_WM/c2sub-07_ses-test_T1w_thresh.nii
180514-10:00:07,2 workflow INFO:
	 [Node] Running "resample" ("nipype.interfaces.fsl.preprocess.FLIRT"), a CommandLine Interface with command:
flirt -in /output/work_preproc/_subject_id_07/segment/c1sub-07_ses-test_T1w.nii -ref /output/work_preproc/_subject_id_07/segment/c1sub-07_ses-test_T1w.nii -out c1sub-07_ses-test_T1w_flirt.nii -omat c1sub-07_ses-test_T1w_flirt.mat -applyisoxfm 4.000000

180514-10:00:07,538 workflow INFO:
	 [Node] Finished "work_preproc.threshold_WM".
180514-10:00:08,913 workflow INFO:
	 [Job 10] Completed (work_preproc.threshold_WM).
180514-10:00:08,916 workflow INFO:
	 [MultiProc] Running 1 tasks, and 1 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.resample
180514-10:00:08,979 workflow INFO:
	 [Node] Setting-up "work_preproc.coreg" in "/output/work_preproc/_subject_id_07/coreg".
180514-10:00:08,987 workflow INFO:
	 [Node] Running "coreg" ("nipype.interfaces.fsl.preprocess.FLIRT"), a CommandLine Interface with command:
flirt -in /output/work_preproc/_subject_id_07/mcflirt/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg.nii.gz -ref /output/work_preproc/_subject_id_07/gunzip_anat/sub-07_ses-test_T1w.nii -out asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg_flirt.nii -omat asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg_flirt.mat -cost bbr -dof 6 -schedule /usr/share/fsl/5.0/etc/flirtsch/bbr.sch -wmseg /output/work_preproc/_subject_id_07/threshold_WM/c2sub-07_ses-test_T1w_thresh.nii
180514-10:00:10,779 workflow INFO:
	 [Node] Finished "work_preproc.resample".
180514-10:00:10,915 workflow INFO:
	 [Job 8] Completed (work_preproc.resample).
180514-10:00:10,917 workflow INFO:
	 [MultiProc] Running 1 tasks, and 1 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.coreg
180514-10:00:10,966 workflow INFO:
	 [Node] Setting-up "work_preproc.mask_GM" in "/output/work_preproc/_subject_id_07/mask_GM".
180514-10:00:10,973 workflow INFO:
	 [Node] Running "mask_GM" ("nipype.interfaces.fsl.maths.Threshold"), a CommandLine Interface with command:
fslmaths /output/work_preproc/_subject_id_07/resample/c1sub-07_ses-test_T1w_flirt.nii -thr 0.5000000000 -bin -dilF /output/work_preproc/_subject_id_07/mask_GM/c1sub-07_ses-test_T1w_flirt_thresh.nii
180514-10:00:11,333 workflow INFO:
	 [Node] Finished "work_preproc.mask_GM".
180514-10:00:12,917 workflow INFO:
	 [Job 9] Completed (work_preproc.mask_GM).
180514-10:00:12,920 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.coreg
180514-10:01:35,185 workflow INFO:
	 [Node] Finished "work_preproc.coreg".
180514-10:01:36,995 workflow INFO:
	 [Job 11] Completed (work_preproc.coreg).
180514-10:01:37,3 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:37,69 workflow INFO:
	 [Node] Setting-up "work_preproc.applywarp" in "/output/work_preproc/_subject_id_07/applywarp".
180514-10:01:37,77 workflow INFO:
	 [Node] Running "applywarp" ("nipype.interfaces.fsl.preprocess.FLIRT"), a CommandLine Interface with command:
flirt -in /output/work_preproc/_subject_id_07/mcflirt/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz -ref /output/work_preproc/_subject_id_07/gunzip_anat/sub-07_ses-test_T1w.nii -out asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.nii -omat asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.mat -applyisoxfm 4.000000 -init /output/work_preproc/_subject_id_07/coreg/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg_flirt.mat -interp spline
180514-10:01:38,996 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.applywarp
180514-10:01:48,221 workflow INFO:
	 [Node] Finished "work_preproc.applywarp".
180514-10:01:49,7 workflow INFO:
	 [Job 12] Completed (work_preproc.applywarp).
180514-10:01:49,13 workflow INFO:
	 [MultiProc] Running 0 tasks, and 2 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:49,81 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.mask" in "/output/work_preproc/susan/_subject_id_07/mask".
180514-10:01:49,85 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.median" in "/output/work_preproc/susan/_subject_id_07/median".
180514-10:01:49,88 workflow INFO:
	 [Node] Setting-up "_mask0" in "/output/work_preproc/susan/_subject_id_07/mask/mapflow/_mask0".
180514-10:01:49,94 workflow INFO:
	 [Node] Setting-up "_median0" in "/output/work_preproc/susan/_subject_id_07/median/mapflow/_median0".
180514-10:01:49,94 workflow INFO:
	 [Node] Running "_mask0" ("nipype.interfaces.fsl.utils.ImageMaths"), a CommandLine Interface with command:
fslmaths /output/work_preproc/_subject_id_07/applywarp/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.nii -mas /output/work_preproc/_subject_id_07/mask_GM/c1sub-07_ses-test_T1w_flirt_thresh.nii /output/work_preproc/susan/_subject_id_07/mask/mapflow/_mask0/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask.nii.gz
180514-10:01:49,99 workflow INFO:
	 [Node] Running "_median0" ("nipype.interfaces.fsl.utils.ImageStats"), a CommandLine Interface with command:
fslstats /output/work_preproc/_subject_id_07/applywarp/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.nii -k /output/work_preproc/_subject_id_07/mask_GM/c1sub-07_ses-test_T1w_flirt_thresh.nii -p 50
180514-10:01:50,433 workflow INFO:
	 [Node] Finished "_median0".
180514-10:01:50,438 workflow INFO:
	 [Node] Finished "work_preproc.susan.median".
180514-10:01:51,8 workflow INFO:
	 [Job 15] Completed (work_preproc.susan.median).
180514-10:01:51,11 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.susan.mask
180514-10:01:52,397 workflow INFO:
	 [Node] Finished "_mask0".
180514-10:01:52,402 workflow INFO:
	 [Node] Finished "work_preproc.susan.mask".
180514-10:01:53,11 workflow INFO:
	 [Job 13] Completed (work_preproc.susan.mask).
180514-10:01:53,18 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:53,86 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.meanfunc2" in "/output/work_preproc/susan/_subject_id_07/meanfunc2".
180514-10:01:53,93 workflow INFO:
	 [Node] Setting-up "_meanfunc20" in "/output/work_preproc/susan/_subject_id_07/meanfunc2/mapflow/_meanfunc20".
180514-10:01:53,99 workflow INFO:
	 [Node] Running "_meanfunc20" ("nipype.interfaces.fsl.utils.ImageMaths"), a CommandLine Interface with command:
fslmaths /output/work_preproc/susan/_subject_id_07/mask/mapflow/_mask0/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask.nii.gz -Tmean /output/work_preproc/susan/_subject_id_07/meanfunc2/mapflow/_meanfunc20/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask_mean.nii.gz
180514-10:01:54,364 workflow INFO:
	 [Node] Finished "_meanfunc20".
180514-10:01:54,370 workflow INFO:
	 [Node] Finished "work_preproc.susan.meanfunc2".
180514-10:01:55,13 workflow INFO:
	 [Job 14] Completed (work_preproc.susan.meanfunc2).
180514-10:01:55,20 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:55,89 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.merge" in "/output/work_preproc/susan/_subject_id_07/merge".
180514-10:01:55,95 workflow INFO:
	 [Node] Running "merge" ("nipype.interfaces.utility.base.Merge")
180514-10:01:55,103 workflow INFO:
	 [Node] Finished "work_preproc.susan.merge".
180514-10:01:57,16 workflow INFO:
	 [Job 16] Completed (work_preproc.susan.merge).
180514-10:01:57,23 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:57,94 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.multi_inputs" in "/output/work_preproc/susan/_subject_id_07/multi_inputs".
180514-10:01:57,101 workflow INFO:
	 [Node] Running "multi_inputs" ("nipype.interfaces.utility.wrappers.Function")
180514-10:01:57,109 workflow INFO:
	 [Node] Finished "work_preproc.susan.multi_inputs".
180514-10:01:59,16 workflow INFO:
	 [Job 17] Completed (work_preproc.susan.multi_inputs).
180514-10:01:59,21 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:01:59,79 workflow INFO:
	 [Node] Setting-up "work_preproc.susan.smooth" in "/output/work_preproc/susan/_subject_id_07/smooth".
180514-10:01:59,108 workflow INFO:
	 [Node] Setting-up "_smooth0" in "/output/work_preproc/susan/_subject_id_07/smooth/mapflow/_smooth0".
180514-10:01:59,115 workflow INFO:
	 [Node] Running "_smooth0" ("nipype.interfaces.fsl.preprocess.SUSAN"), a CommandLine Interface with command:
susan /output/work_preproc/_subject_id_07/applywarp/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.nii 1046.2500000000 1.6986436006 3 1 1 /output/work_preproc/susan/_subject_id_07/meanfunc2/mapflow/_meanfunc20/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask_mean.nii.gz 1046.2500000000 /output/work_preproc/susan/_subject_id_07/smooth/mapflow/_smooth0/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_smooth.nii.gz
180514-10:02:01,18 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.susan.smooth
180514-10:02:38,356 workflow INFO:
	 [Node] Finished "_smooth0".
180514-10:02:38,362 workflow INFO:
	 [Node] Finished "work_preproc.susan.smooth".
180514-10:02:39,55 workflow INFO:
	 [Job 18] Completed (work_preproc.susan.smooth).
180514-10:02:39,63 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:02:39,133 workflow INFO:
	 [Node] Setting-up "work_preproc.mask_func" in "/output/work_preproc/_subject_id_07/mask_func".
180514-10:02:39,140 workflow INFO:
	 [Node] Setting-up "_mask_func0" in "/output/work_preproc/_subject_id_07/mask_func/mapflow/_mask_func0".
180514-10:02:39,145 workflow INFO:
	 [Node] Running "_mask_func0" ("nipype.interfaces.fsl.maths.ApplyMask"), a CommandLine Interface with command:
fslmaths /output/work_preproc/susan/_subject_id_07/smooth/mapflow/_smooth0/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_smooth.nii.gz -mas /output/work_preproc/_subject_id_07/mask_GM/c1sub-07_ses-test_T1w_flirt_thresh.nii /output/work_preproc/_subject_id_07/mask_func/mapflow/_mask_func0/asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_smooth_masked.nii
180514-10:02:40,163 workflow INFO:
	 [Node] Finished "_mask_func0".
180514-10:02:40,168 workflow INFO:
	 [Node] Finished "work_preproc.mask_func".
180514-10:02:41,55 workflow INFO:
	 [Job 19] Completed (work_preproc.mask_func).
180514-10:02:41,58 workflow INFO:
	 [MultiProc] Running 0 tasks, and 1 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
180514-10:02:41,108 workflow INFO:
	 [Node] Setting-up "work_preproc.detrend" in "/output/work_preproc/_subject_id_07/detrend".
180514-10:02:41,113 workflow INFO:
	 [Node] Running "detrend" ("nipype.algorithms.confounds.TSNR")
180514-10:02:43,58 workflow INFO:
	 [MultiProc] Running 1 tasks, and 0 jobs ready. Free memory (GB): 53.74/53.94, Free processors: 7/8.
                     Currently running:
                       * work_preproc.detrend
180514-10:02:46,341 workflow INFO:
	 [Node] Finished "work_preproc.detrend".
180514-10:02:47,63 workflow INFO:
	 [Job 20] Completed (work_preproc.detrend).
180514-10:02:47,69 workflow INFO:
	 [MultiProc] Running 0 tasks, and 0 jobs ready. Free memory (GB): 53.94/53.94, Free processors: 8/8.
Out[ ]:
<networkx.classes.digraph.DiGraph at 0x7f026449ba58>

Inspect output

What did we actually do? Let's look at all the data that was created.

In [ ]:
!tree /output/work_preproc/ -I '*js|*json|*pklz|_report|*dot|*html|*txt|*.m'
/output/work_preproc/
├── graph.png
├── _subject_id_07
│   ├── applywarp
│   │   ├── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.mat
│   │   └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt.nii
│   ├── art
│   │   ├── mask.asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz
│   │   └── plot.asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.svg
│   ├── coreg
│   │   ├── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg_flirt.mat
│   │   └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg_flirt.nii
│   ├── detrend
│   │   ├── detrend.nii.gz
│   │   ├── mean.nii.gz
│   │   ├── stdev.nii.gz
│   │   └── tsnr.nii.gz
│   ├── extract
│   │   └── sub-07_ses-test_task-fingerfootlips_bold_roi.nii
│   ├── gunzip_anat
│   │   └── sub-07_ses-test_T1w.nii
│   ├── gunzip_func
│   │   └── sub-07_ses-test_task-fingerfootlips_bold.nii
│   ├── mask_func
│   │   └── mapflow
│   │       └── _mask_func0
│   │           └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_smooth_masked.nii
│   ├── mask_GM
│   │   └── c1sub-07_ses-test_T1w_flirt_thresh.nii
│   ├── mcflirt
│   │   ├── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz
│   │   ├── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz_mean_reg.nii.gz
│   │   └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz.par
│   ├── resample
│   │   ├── c1sub-07_ses-test_T1w_flirt.mat
│   │   └── c1sub-07_ses-test_T1w_flirt.nii
│   ├── segment
│   │   ├── c1sub-07_ses-test_T1w.nii
│   │   ├── c2sub-07_ses-test_T1w.nii
│   │   └── c3sub-07_ses-test_T1w.nii
│   ├── selectfiles
│   ├── slicetime
│   │   └── asub-07_ses-test_task-fingerfootlips_bold_roi.nii
│   └── threshold_WM
│       └── c2sub-07_ses-test_T1w_thresh.nii
└── susan
    └── _subject_id_07
        ├── mask
        │   └── mapflow
        │       └── _mask0
        │           └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask.nii.gz
        ├── meanfunc2
        │   └── mapflow
        │       └── _meanfunc20
        │           └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_mask_mean.nii.gz
        ├── median
        │   └── mapflow
        │       └── _median0
        ├── merge
        ├── multi_inputs
        └── smooth
            └── mapflow
                └── _smooth0
                    └── asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_flirt_smooth.nii.gz

34 directories, 29 files

But what did we do specifically? Well, let's investigate.

Motion Correction and Artifact Detection

How much did the subject move in the scanner, and were there any outliers in the functional images?

In [ ]:
%matplotlib inline
In [ ]:
# Plot the motion parameters
import numpy as np
import matplotlib.pyplot as plt
par = np.loadtxt('/output/work_preproc/_subject_id_07/mcflirt/'
                 'asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.nii.gz.par')
fig, axes = plt.subplots(2, 1, figsize=(15, 5))
axes[0].set_ylabel('rotation (radians)')
axes[0].plot(par[:, :3])
axes[1].plot(par[:, 3:])
axes[1].set_xlabel('time (TR)')
axes[1].set_ylabel('translation (mm)');

The motion parameters seem to look okay. What about the detection of artifacts?
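Beyond eyeballing the curves, we can summarize head motion numerically, for example with framewise displacement (FD, in the style of Power et al.). The sketch below is a minimal, assumed implementation: it relies on MCFLIRT's `.par` column order (three rotations in radians, then three translations in mm) and the common 50 mm head-radius approximation for converting rotations to millimeters.

```python
import numpy as np

def framewise_displacement(par, radius=50.0):
    """Power-style framewise displacement from an MCFLIRT .par array.

    Assumes columns 0-2 are rotations (radians) and 3-5 are
    translations (mm); rotations are converted to arc length on a
    sphere of the given radius before summing absolute differences.
    """
    motion = np.hstack([par[:, :3] * radius, par[:, 3:6]])
    diffs = np.abs(np.diff(motion, axis=0)).sum(axis=1)
    return np.concatenate([[0.0], diffs])  # first volume has no predecessor

# with the `par` array loaded above, e.g.:
# fd = framewise_displacement(par); print(fd.max())
```

A common rule of thumb is to scrutinize volumes with FD above ~0.5 mm, though the appropriate cutoff depends on the study.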

In [ ]:
# Showing the artifact detection output
from IPython.display import SVG
SVG(filename='/output/work_preproc/_subject_id_07/art/'
    'plot.asub-07_ses-test_task-fingerfootlips_bold_roi_mcf.svg')
Out[ ]:

Which volumes are problematic?

In [ ]:
outliers = np.loadtxt('/output/work_preproc/_subject_id_07/art/'
                      'art.asub-07_ses-test_task-fingerfootlips_bold_roi_mcf_outliers.txt')
list(outliers.astype('int'))
Out[ ]:
[9, 21, 95, 96, 105, 120, 141, 156, 157]
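ART flags volumes whose intensity or motion exceeds the thresholds we configured. As a quick sanity check on the list above (indices copied from the output), we can count the flagged volumes and look for directly consecutive runs, which tend to indicate sustained movement rather than an isolated spike:

```python
# outlier volume indices, copied from the art output above
outlier_vols = [9, 21, 95, 96, 105, 120, 141, 156, 157]

n_outliers = len(outlier_vols)
# pairs of directly consecutive outlier volumes
consecutive = [(a, b) for a, b in zip(outlier_vols, outlier_vols[1:]) if b == a + 1]

print(n_outliers)   # 9
print(consecutive)  # [(95, 96), (156, 157)]
```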

Masks and Probability maps

Let's see what all the masks and probability maps look like. For this, we will use nilearn's plot_stat_map function.

In [ ]:
from nilearn import image as nli
from nilearn.plotting import plot_stat_map
%matplotlib inline
output = '/output/work_preproc/_subject_id_07/'

First, let's look at the tissue probability maps.

In [ ]:
anat = output + 'gunzip_anat/sub-07_ses-test_T1w.nii'
In [ ]:
plot_stat_map(
    output + 'segment/c1sub-07_ses-test_T1w.nii', title='GM prob. map',  cmap=plt.cm.magma,
    threshold=0.5, bg_img=anat, display_mode='z', cut_coords=range(-35, 15, 10), dim=-1);
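Recall that the workflow binarized such probability maps with `fslmaths -thr 0.5 -bin` (see the `threshold_WM` and `mask_GM` log lines above). The equivalent NumPy operation is just a comparison; here is a small sketch with toy values, not the actual image data:

```python
import numpy as np

# toy stand-in for a tissue probability map; fslmaths' -thr zeroes
# voxels below the threshold, and -bin sets the survivors to 1
prob = np.array([0.1, 0.4, 0.5, 0.9])
mask = (prob >= 0.5).astype(np.uint8)
print(mask)  # [0 0 1 1]
```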