
pet-surface - Projection of the PET signal onto the cortical surface

This pipeline extracts the PET signal and projects it onto the cortical surface. To do so, both an anatomical (T1-weighted MRI) and a functional (PET) acquisition are required. This is a vertex-wise approach. Results are obtained either on the subject's native cortical surface or on a common template (FsAverage) for subsequent analyses (machine learning, group comparison).

PET surface results
FDG PET SUVR projected onto the cortical surface (left hemisphere) for (from left to right) a cognitively normal subject (CN), a patient with Alzheimer’s disease (AD), a patient with semantic variant primary progressive aphasia (svPPA) and a patient with logopenic variant primary progressive aphasia (lvPPA). The first row is the projection in the subject’s space. The second row is the same signal for each subject, but warped to FsAverage after smoothing with a 20 mm Gaussian kernel.

Prerequisite

You need to have performed the t1-freesurfer-cross-sectional pipeline on your T1-weighted MRI images.

Dependencies

If you only installed the core of Clinica, this pipeline also requires FreeSurfer, SPM (along with MATLAB), FSL and PETPVC on your computer. Instructions for installing these third-party tools are available on the third-party page.

Running the pipeline

The pipeline can be run with the following command line:

clinica run pet-surface bids_directory caps_directory
where:

  • bids_directory is the input folder containing the dataset in a BIDS hierarchy.
  • caps_directory is the output folder containing the results in a CAPS hierarchy.

If you want to run the pipeline on a subset of your BIDS dataset, you can use the -tsv flag to provide a TSV file specifying the participants belonging to your subset.

Don't hesitate to run clinica run pet-surface (without parameters) to see all the optional parameters.

For example, you can specify which PET tracer you want to project onto the cortical surface using the flag --pet_tracer PET_TRACER, where PET_TRACER can be fdg (default) or av45. The normalization step differs between these two tracers: the SUVR for FDG-PET is obtained by dividing the whole volume by the mean value in the pons, whereas the SUVR for AV45-PET is computed using the mean value of the pons and the cerebellum.
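
As a rough illustration of what this normalization amounts to (a schematic sketch with placeholder arrays, not the pipeline's actual code), the SUVR is simply the PET volume divided by the mean uptake in the reference region:

import numpy as np

# Placeholder arrays: a PET volume and a binary mask of the reference
# region (pons for FDG, pons + cerebellum for AV45).
pet_volume = np.random.rand(91, 109, 91)
reference_mask = np.zeros(pet_volume.shape, dtype=bool)
reference_mask[40:50, 50:60, 30:40] = True

# SUVR: divide every voxel by the mean uptake in the reference region.
suvr_volume = pet_volume / pet_volume[reference_mask].mean()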

Outputs

Results are stored in the following folder of the CAPS hierarchy: subjects/sub-<participant_label>/ses-<session_label>/pet/surface

The files are the following (where * stands for sub-<participant_label>_ses-<session_label>):

  • atlas_statistics/*_task-<label>_acq-<label>_pet_space-<label>_pvc-iy_suvr-<label>_statistics.tsv: text files that display average PET values in various regions (of either the space-desikan or space-destrieux atlas). With the help of pandas (Python library), they can easily be parsed for machine learning purposes (see the sketch after this list).
  • *_hemi-<hemi_label>_midcorticalsurface: represents the surface at equal distance between the white matter/gray matter interface and the pial surface (one per hemisphere).
  • *_task-rest_acq-<label>_pet_space-<label>_suvr-<label>_pvc-iy_hemi-<label>_fwhm-<value>_projection.mgh: PET data that can be mapped onto meshes. If space-fsaverage is in the name, it can be mapped onto the white or pial surface of FsAverage. If space-native is in the name, it can be mapped onto the white or pial surface of the subject (*h.white and *h.pial files from the t1-freesurfer-cross-sectional pipeline).
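
Since the atlas statistics files are plain TSV files, they can be loaded directly with pandas. A minimal sketch, assuming a hypothetical file name following the pattern above:

import pandas as pd

# Hypothetical file name following the pattern described above.
stats = pd.read_csv(
    'sub-CLNC01_ses-M00_task-rest_acq-fdg_pet_space-destrieux_pvc-iy_suvr-pons_statistics.tsv',
    sep='\t',
)
print(stats.head())  # one row per region of the atlas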

How to manipulate outputs

The outputs of the pipeline are composed of two different types of files: surface files and MGH data files that are meant to be overlaid onto a surface.

Surface files

Surface files can be read using various tools. You can open them with freeview (the FreeSurfer viewer): freeview -f /path/to/your/surface/file.

You can also open them in MATLAB using SurfStat: mysurface = SurfStatReadSurf('/path/to/your/surface/file'). This returns a structure with a field coord (for coordinates), a list of coordinates for each vertex of the mesh, and a field tri (for triangles), a list of triplets for each triangle, where each number refers to the Nth vertex of the coord list. Below is an example to make things clearer (read with MATLAB):

PET surface file
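
If you prefer Python over MATLAB, the same surface files can be read with nibabel's FreeSurfer reader. A minimal sketch with a hypothetical path (the returned arrays correspond to the coord and tri fields described above):

import nibabel.freesurfer as fsio

# Hypothetical path to a surface file (e.g. a *_midcorticalsurface output).
coords, faces = fsio.read_geometry('/path/to/your/surface/file')
print(coords.shape)  # (n_vertices, 3): x, y, z coordinates of each vertex
print(faces.shape)   # (n_triangles, 3): vertex indices of each triangle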

Data files

The data files have the .mgh extension. Each file contains a single vector, where the value at the Nth position must be mapped onto the Nth vertex of the coord list to be correctly represented. You can access them either in MATLAB with the command:

mydata = SurfStatReadData('/path/to/your/file.mgh');
(you will get a single row vector)

Or in Python with the nibabel library:

import nibabel
mydata = nibabel.load('/path/to/your/mgh/file')
mydata will then be an MGHImage (more information here). Keep in mind that if you want to manipulate the data vector within this object, you will need to reshape it a bit. Indeed, if you do the following:

raw_data = mydata.get_fdata()
print(raw_data.shape)

The shape of your "raw" vector will probably look like this: (163842, 1, 1). Use the squeeze function from numpy to get a (163842,) shape (documentation here). The reverse operation ((163842,) back to a (163842, 1, 1) shape) can be achieved with reshape(-1, 1, 1); note that numpy's atleast_3d would instead produce a (1, 163842, 1) shape. This may come in handy when you need to create an MGHImage from scratch.
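
Putting this together, here is a minimal sketch (input and output paths are hypothetical) that reshapes the vector and builds a new MGHImage from it:

import nibabel
import numpy as np

mgh = nibabel.load('/path/to/your/mgh/file')
data = np.squeeze(mgh.get_fdata())  # (163842, 1, 1) -> (163842,)

# ... manipulate the vector here ...

# Restore the (163842, 1, 1) layout and wrap it in a new MGHImage.
new_image = nibabel.MGHImage(
    data.reshape(-1, 1, 1).astype(np.float32), mgh.affine, mgh.header
)
nibabel.save(new_image, '/path/to/output.mgh')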

Visualization of the results

After the execution of the pipeline, you can check the outputs of a subject by running the following command (for the data projected onto FsAverage):

freeview -f $SUBJECTS_DIR/fsaverage/surf/lh.pial:overlay=path/to/your/projected/pet/in/fsaverage/left/hemi \
 -f $SUBJECTS_DIR/fsaverage/surf/rh.pial:overlay=path/to/your/projected/pet/in/fsaverage/right/hemi
PET-Surface Freeview

You can also visualize the subject's cortical projection directly in their native space:

freeview -f path/to/midcortical/surface/left:overlay=path/to/your/projected/pet/in/nativespace/left/hemi \
 -f path/to/midcortical/surface/right:overlay=path/to/your/projected/pet/in/nativespace/right/hemi

You will need to adjust the colormap using the Configure button in the left panel, just below the Overlay section.

You can also visualize your surfaces using the SurfStat tool. Once the SurfStat installation folder has been added to your MATLAB path, you can display your surfaces with the following commands:

mydata = SurfStatReadData({'/path/to/left/data', '/path/to/right/data'});
mysurfaces = SurfStatReadSurf({'/path/to/left/surface', '/path/to/right/surface'});
figure, SurfStatViewData(mydata, mysurfaces, 'Title of figure');
This will produce the following figure:
PET-Surface SurfStat

Describing this pipeline in your paper

Example of paragraph:

These results have been obtained using the pet-surface pipeline of Clinica. More precisely, this pipeline uses the gtmseg automatic segmentation from FreeSurfer to generate the masks used by the Iterative Yang partial volume correction (PVC) method. The coregistered and partial-volume-corrected PET signal is then projected onto multiple surfaces at different fractions of the cortical thickness. The final vertex value is a weighted average of these projections, with weights following a normal distribution.
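
For illustration only, the depth-weighted average described above could be sketched as follows (a schematic example with placeholder data and weights, not the pipeline's actual implementation):

import numpy as np

# Placeholder stack of projections: one row per sampled fraction of the
# cortical thickness, one column per vertex.
depth_fractions = np.linspace(0.0, 1.0, 7)
projections = np.random.rand(len(depth_fractions), 163842)

# Gaussian weights centered on mid-thickness (the exact parameters used by
# the pipeline are not reproduced here).
weights = np.exp(-((depth_fractions - 0.5) ** 2) / (2 * 0.25 ** 2))
weights /= weights.sum()

# Weighted average across depths gives a single value per vertex.
final_projection = weights @ projections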