Longitudinal Analysis


There is some additional processing that needs to be carried out before you can run your longitudinal analysis (these steps are detailed on the FreeSurfer Wiki). You should have already completed the CROSS steps (with either the FreeSurfer or FastSurfer scripts found here). Scripts and instructions for the BASE and LONG steps can be found on this page.

You will need to edit the scripts if your timepoints do not all end in “_MRX” (where X is the number for the timepoint, for example “Control_001_MR1”).

The script for the BASE processing is below. You need to fill in the project name at the top and edit the line with export SUBJECTS_DIR so that this variable points to the directory containing your pre-processed subject folders. When submitting the script you need to create a list of all the BASE IDs (e.g. Control_001) rather than the timepoint IDs (e.g. Control_001_MR1).

#PBS -P
#PBS -N BASE_processing
#PBS -l select=1:ncpus=2:mem=32GB
#PBS -l walltime=10:00:00
#PBS -q defaultQ

module load python/3.6.5
module load openmpi-gcc/3.1.3
module load cuda/10.0.130
module load mrtrix3tissue/5.2.7
module load singularity/3.5.3
module load fsl/6.0.3
module load ants/2.3.2
module load mrtrix3/3.0.1
module load freesurfer/6.0.0

source $FREESURFER_HOME/SetUpFreeSurfer.sh
export SUBJECTS_DIR=/scratch/Project/Preprocessing

# CODE FOR TERMINAL
# list='XXX_XXX
# XXX_XXX
# XXX_XXX'
# for i in $list ; do qsub -v base=$i FS_BASE_Processing.pbs; done

echo ${base}
cd $SUBJECTS_DIR

# Count the timepoint folders for this subject and build the within-subject template accordingly
time_points=$(find ${SUBJECTS_DIR} -iname "${base}_MR*")
NumTPs=( $time_points )

if [[ ${#NumTPs[@]} = 2 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -all -openmp 4
elif [[ ${#NumTPs[@]} = 3 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -tp ${base}_MR3 -all -openmp 4
elif [[ ${#NumTPs[@]} = 4 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -tp ${base}_MR3 -tp ${base}_MR4 -all -openmp 4
elif [[ ${#NumTPs[@]} = 5 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -tp ${base}_MR3 -tp ${base}_MR4 -tp ${base}_MR5 -all -openmp 4
elif [[ ${#NumTPs[@]} = 6 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -tp ${base}_MR3 -tp ${base}_MR4 -tp ${base}_MR5 -tp ${base}_MR6 -all -openmp 4
elif [[ ${#NumTPs[@]} = 7 ]]
then
  recon-all -base $base -tp ${base}_MR1 -tp ${base}_MR2 -tp ${base}_MR3 -tp ${base}_MR4 -tp ${base}_MR5 -tp ${base}_MR6 -tp ${base}_MR7 -all -openmp 4
else
  # Either only one timepoint (no longitudinal data) or more than seven timepoints
  echo "There is either only one timepoint (no longitudinal data) or there are more than 7; please edit the script."
fi
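To build that list of BASE IDs, one option (a minimal sketch, assuming your pre-processed timepoint folders sit directly in SUBJECTS_DIR and follow the _MRX naming) is to strip the timepoint suffix from the folder names and de-duplicate before using the terminal loop shown in the script:

# Run from your SUBJECTS_DIR: strip the trailing "_MR<number>" from each
# timepoint folder name and keep each BASE ID once (assumes the _MRX naming)
cd /scratch/Project/Preprocessing
list=$(ls -d *_MR[0-9] | sed 's/_MR[0-9]*$//' | sort -u)
for i in $list ; do qsub -v base=$i FS_BASE_Processing.pbs ; done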

Check that the jobs have completed without error (use grep "Exit Status" * in the folder with your output files). Once all the BASE processing is completed you can submit the LONG processing for each timepoint using the script below. Again, you need to fill in the project name at the top and edit the line with export SUBJECTS_DIR. When submitting the script, this time you will create a list of all timepoint IDs (e.g. Control_001_MR1).

#PBS -P
#PBS -N LONG_processing
#PBS -l select=1:ncpus=2:mem=32GB
#PBS -l walltime=10:00:00
#PBS -q defaultQ

module load python/3.6.5
module load openmpi-gcc/3.1.3
module load cuda/10.0.130
module load mrtrix3tissue/5.2.7
module load singularity/3.5.3
module load fsl/6.0.3
module load ants/2.3.2
module load mrtrix3/3.0.1
module load freesurfer/6.0.0

source $FREESURFER_HOME/SetUpFreeSurfer.sh
export SUBJECTS_DIR=/scratch/Project/Preprocessing

# CODE FOR TERMINAL
# list='XXX_XXX_MRX
# XXX_XXX_MRX
# XXX_XXX_MRX'
# for i in $list ; do qsub -v ses=$i FS_LONG_Processing.pbs; done

cd $SUBJECTS_DIR
echo ${ses}

# Strip the 4-character "_MRX" suffix to recover the BASE ID
base=${ses%????}
recon-all -long ${ses} ${base} -all -openmp 4
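The list of timepoint IDs can be built in much the same way (a sketch, again assuming the _MRX naming; the pattern matches only the timepoint folders, not the BASE folders):

# Run from your SUBJECTS_DIR: submit one LONG job per timepoint folder
cd /scratch/Project/Preprocessing
list=$(ls -d *_MR[0-9])
for i in $list ; do qsub -v ses=$i FS_LONG_Processing.pbs ; done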

Again, check that the jobs have completed without error (use grep "Exit Status" * in the folder with your output files). You can also have a look at your data directory to make sure there is now a folder for each BASE ID and a long folder for each timepoint (called TIMEPOINTID.long.BASEID).
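If you prefer to script that check rather than inspect the directory by eye, a quick sketch (assuming the _MRX naming) run from SUBJECTS_DIR is:

cd $SUBJECTS_DIR
# Report any timepoint that does not yet have a TIMEPOINTID.long.BASEID folder
for tp in *_MR[0-9] ; do
  base=${tp%_MR*}
  [ -d "${tp}.long.${base}" ] || echo "Missing long output for ${tp}"
done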

Next, follow the instructions on the FreeSurfer wiki for Inspecting Longitudinal Data to check some of your data to make sure the pipeline is doing a good job.

*Note: The next steps depend on the design of your study. If you want to look at trajectories for individual groups (without comparing groups), create separate longitudinal qdec tables at this point and then repeat the statistical analysis steps for each group. If you would like to compare groups in your model, include all the cases and add grouping columns. The statistical analysis uses MATLAB, so you may need to request access to Argus. Alternatively, you can follow the steps from the FreeSurfer Wiki on a local installation of MATLAB.*

To get the longitudinal data ready for statistical analysis (LinearMixedEffectsModels) you need to create a table (a space-separated text file) in the following format:

fsid           fsid-base   yearsfromBL   ...
OAS2_0001_MR1  OAS2_0001   0
OAS2_0001_MR2  OAS2_0001   1.25
OAS2_0004_MR1  OAS2_0004   0
OAS2_0004_MR2  OAS2_0004   1.47
...

where the first column is called fsid (containing all timepoints of all subjects) and the second column, fsid-base, contains the within-subject template (base) name used to group timepoints within subject. You can have many more columns such as gender, age, group, etc. Make sure you include the yearsfromBL column with an accurate measure of the time from the baseline scan for each timepoint (NOT the time from the previous timepoint). You can use the Excel formula YEARFRAC to get the time difference as a year fraction. If you are interested in including other variables in the model, add them to your longitudinal qdec table. Make sure there are no spaces in the column headings, as the file is space separated.
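If you would rather compute yearsfromBL outside Excel, a rough shell sketch (assuming GNU date and scan dates in YYYY-MM-DD format; the dates below are made-up examples) is:

# Year fraction between a baseline scan date and a follow-up scan date
baseline=2018-03-01
scan=2019-06-15
echo "scale=2; ( $(date -d $scan +%s) - $(date -d $baseline +%s) ) / 31557600" | bc
# 31557600 seconds = 365.25 days, so this prints roughly 1.29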

Then assemble and smooth the data for each qdec table using a script similar to the one below. This example is for a single qdec file and smooths the thickness data at FWHM 15 and 20. You can repeat the mris_preproc and mri_surf2surf steps for multiple groups in the same script; make sure you label the output files with the group name if you have multiple qdec tables.

#PBS -P
#PBS -N assembleAndSmooth
#PBS -l select=1:ncpus=2:mem=32GB
#PBS -l walltime=02:00:00
#PBS -q defaultQ

module load python/3.6.5
module load openmpi-gcc/3.1.3
module load cuda/10.0.130
module load mrtrix3tissue/5.2.7
module load singularity/3.5.3
module load fsl/6.0.3
module load ants/2.3.2
module load mrtrix3/3.0.1
module load freesurfer/6.0.0

source $FREESURFER_HOME/SetUpFreeSurfer.sh
export SUBJECTS_DIR=/scratch/Project/Data

cd $SUBJECTS_DIR

# Assemble the longitudinal thickness data onto fsaverage for each hemisphere
mris_preproc --qdec-long ../qdec/long.qdec.bvFTD.table.dat --target fsaverage --hemi lh --meas thickness --out lh.thickness.long.bvFTD.mgh
mris_preproc --qdec-long ../qdec/long.qdec.bvFTD.table.dat --target fsaverage --hemi rh --meas thickness --out rh.thickness.long.bvFTD.mgh

# Smooth at FWHM 15
mri_surf2surf --hemi lh --s fsaverage --sval lh.thickness.long.bvFTD.mgh --tval lh.thickness.long.bvFTD_sm15.mgh --fwhm-trg 15 --cortex --noreshape
mri_surf2surf --hemi rh --s fsaverage --sval rh.thickness.long.bvFTD.mgh --tval rh.thickness.long.bvFTD_sm15.mgh --fwhm-trg 15 --cortex --noreshape

# Smooth at FWHM 20
mri_surf2surf --hemi lh --s fsaverage --sval lh.thickness.long.bvFTD.mgh --tval lh.thickness.long.bvFTD_sm20.mgh --fwhm-trg 20 --cortex --noreshape
mri_surf2surf --hemi rh --s fsaverage --sval rh.thickness.long.bvFTD.mgh --tval rh.thickness.long.bvFTD_sm20.mgh --fwhm-trg 20 --cortex --noreshape
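If you have several qdec tables, the same mris_preproc and mri_surf2surf calls can be wrapped in a loop instead of copied out by hand (a sketch; GroupA and GroupB are placeholders for your own qdec table names):

# Loop the assemble-and-smooth steps over each group's qdec table
for group in GroupA GroupB ; do
  for hemi in lh rh ; do
    mris_preproc --qdec-long ../qdec/long.qdec.${group}.table.dat --target fsaverage --hemi ${hemi} --meas thickness --out ${hemi}.thickness.long.${group}.mgh
    for fwhm in 15 20 ; do
      mri_surf2surf --hemi ${hemi} --s fsaverage --sval ${hemi}.thickness.long.${group}.mgh --tval ${hemi}.thickness.long.${group}_sm${fwhm}.mgh --fwhm-trg ${fwhm} --cortex --noreshape
    done
  done
done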

Once you have completed the smoothing you are ready to carry out the LinearMixedEffectsModels. Again, the analysis uses MATLAB so you may need to request access to Argus if you don’t have an installation of MATLAB with the required toolboxes. Alternatively you can follow the steps from the FreeSurfer Wiki on a local installation of MATLAB.
