1. Registration approach
Several parts of an fMRI analysis require spatially transforming images in some way, for example within-subject image matching (motion correction, functional-to-anatomical registration) or between-subject registration (needed for group analysis).
As is well known, fMRI data are 4D and are usually brought into standard space (commonly MNI) via the subject's own T1 anatomical image: the T1-to-standard (MNI) registration is combined with the BOLD-to-T1 co-registration to map the fMRI data into standard space, roughly as illustrated in the figure below.
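The key point is that the two transforms are estimated separately and then concatenated, so the 4D BOLD series is interpolated only once. As a rough illustration of the same idea using FSL alone (file names and the MNI template path are placeholders; the rest of this post implements it with FSL + ANTs + AFNI):
# 1) BOLD mean -> T1: rigid-body (6 DOF) co-registration
flirt -in bold_mean.nii.gz -ref t1_brain.nii.gz -dof 6 -omat bold2t1.mat
# 2) T1 -> MNI: affine initialisation followed by a nonlinear warp
flirt -in t1_brain.nii.gz -ref $FSLDIR/data/standard/MNI152_T1_2mm_brain.nii.gz -omat t12mni.mat
fnirt --in=t1.nii.gz --aff=t12mni.mat --cout=t12mni_warp --config=T1_2_MNI152_2mm
# 3) concatenate both transforms and apply them to the 4D BOLD in a single resampling step
applywarp --in=bold.nii.gz --ref=$FSLDIR/data/standard/MNI152_T1_2mm.nii.gz --premat=bold2t1.mat --warp=t12mni_warp --out=bold_mni.nii.gz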
2. Implementation
Data preparation (preprocessing)
To get a better registration result, the structural and functional images can first go through a series of preprocessing steps, mainly the following (a minimal AFNI sketch follows the list):
Slice-timing correction of the BOLD series (slice timing)
Motion correction
Removing obliquity (deoblique)
Computing the mean BOLD image (BOLD mean)
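These steps can all be done with standard AFNI tools; a minimal sketch with one possible ordering, assuming AFNI is installed and using placeholder file names (the full pipeline in section 3 uses the same commands, minus slice timing):
3dTshift -tzero 0 -prefix resting_st.nii resting.nii                                      # slice-timing correction
3dvolreg -zpad 1 -base 0 -cubic -1Dfile motion.1D -prefix resting_mc.nii resting_st.nii   # motion correction
3dWarp -deoblique -prefix resting_deo.nii resting_mc.nii                                  # remove oblique orientation
3dTstat -mean -prefix resting_mean.nii resting_deo.nii                                    # temporal mean of the BOLD series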
Co-registration
Co-registration between the subject's BOLD and T1 images is done with the flirt command from FSL (on a Unix platform). It performs an affine registration with a choice of 6, 7, 9, or 12 degrees of freedom (DOF), and the transform can be saved as a MAT file (the affine matrix). For detailed usage see 【FLIRT/UserGuide】.
flirt [options] -in <inputvol> -ref <refvol> -out <outputvol>
flirt [options] -in <inputvol> -ref <refvol> -omat <outputmatrix>
flirt [options] -in <inputvol> -ref <refvol> -applyxfm -init <matrix> -out <outputvol>
The concrete workflow is as follows:
FLIRT pre-alignment for BBR registration.
flirt -ref $T1_brain.nii -in $BOLD_mean.nii -dof 6 -omat $Bold2struc_1.mat
Estimating transformation matrix for BBR registration
flirt -ref $T1_brain.nii -in $BOLD_mean.nii -dof 6 -cost bbr -wmseg $STRUC_WM.nii -init $Bold2struc_1.mat -omat $Bold2struc_2.mat -schedule $FSLDIR/etc/flirtsch/bbr.sch
Apply BBR registration to the BOLD file.
applywarp -i $BOLD.nii -r $T1_brain.nii -o $BOLD_reg2_T1.nii --premat=$Bold2struc_2.mat --interp=spline
The code can be understood with the help of the figure below:
This step yields the fMRI data registered to Structural Space, so the next step only needs to estimate the Structural Space -> Standard Space transform and apply it to the fMRI data to obtain the final BOLD image in standard space.
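Before moving on, it is worth checking the BBR result visually; one quick option (a sketch, output file names are arbitrary) uses FSL's fslroi and slicer:
fslroi $BOLD_reg2_T1.nii bold_reg_vol0 0 1           # extract the first registered volume
slicer bold_reg_vol0 $T1_brain.nii -a bbr_check.png  # mid-slice image with T1 edges over the registered BOLD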
Structural image normalization
This step uses the antsRegistration command from the ANTs package to register the structural image; usage is as follows:
# define vars
prefix=          # prefix for the output files
fixed=           # the target (template/reference) image
moving=          # the image that is moved (registered) to the target
base_mask=none
in_mask=none
base_mask_SyN=   # mask image used in the SyN stage (e.g., a brain mask in the fixed space)
in_mask_SyN=none
# ANTs command
antsRegistration -d 3 --float 1 --verbose \
--output [ ${prefix}_, ${prefix}_fwd_warped.nii.gz, ${prefix}_inv_warped.nii.gz ] \
--interpolation LanczosWindowedSinc \
--collapse-output-transforms 1 \
--initial-moving-transform [ ${fixed}, ${moving}, 1 ] \
--winsorize-image-intensities [0.005,0.995] \
--use-histogram-matching 1 \
--transform translation[ 0.1 ] \
--metric mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
--convergence [ 1000x300x100, 1e-6, 10 ] \
--smoothing-sigmas 4x2x1vox \
--shrink-factors 8x4x2 \
--use-estimate-learning-rate-once 1 \
--masks [ ${base_mask}, ${in_mask} ] \
-t rigid[ 0.1 ] \
-m mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
-c [ 1000x300x100, 1e-6, 10 ] \
-s 4x2x1vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask}, ${in_mask} ] \
-t affine[ 0.1 ] \
-m mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
-c [ 1000x300x100, 1e-6, 10 ] \
-s 2x1x0vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask}, ${in_mask} ] \
-t SyN[ 0.1, 3, 0 ] \
-m mattes[ ${fixed}, ${moving}, 0.5 , 32 ] \
-m cc[ ${fixed}, ${moving}, 0.5 , 4 ] \
-c [ 500x500x100, 1e-8, 10 ] \
-s 1x0.5x0vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask_SyN}, ${in_mask_SyN} ]
To achieve an optimal registration, ANTs chains several transform stages in this call: translation, rigid, and affine (collectively, linear registration), followed by SyN (Symmetric Normalization), which is a nonlinear registration. For the command parameters and more information see 【Anatomy of an antsRegistration call】.
This process generates the forward and inverse warped images together with the corresponding transform files. A Python script (1_ants2afniMatrix.py, listed below) then converts the affine .mat file into a .1D file (the format AFNI uses): python 1_ants2afniMatrix.py -i mat_file.mat -o 1D_file.1D.
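If you would rather stay entirely within ANTs instead of converting the transforms for AFNI, the estimated warp and affine can be applied to any other image in the moving space with antsApplyTransforms. A minimal sketch, reusing the variables above (bold_4d.nii.gz is a hypothetical 4D BOLD file; -e 3 tells antsApplyTransforms to treat the input as a time series):
antsApplyTransforms -d 3 -e 3 \
  -i bold_4d.nii.gz -r ${fixed} -o bold_4d_in_fixed.nii.gz \
  -t ${prefix}_1Warp.nii.gz -t ${prefix}_0GenericAffine.mat \
  -n LanczosWindowedSinc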
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Convert the affine transform saved by antsRegistration (.mat) into an AFNI-style 3x4 .1D matrix.
import numpy as np
import scipy.io as scio
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Convert an ANTs affine .mat transform into an AFNI .1D matrix')
    parser.add_argument('-i', '--input', type=str, default='Input.mat', help='Transform matrix generated by ANTs')
    parser.add_argument('-o', '--output', type=str, default='Output.1D', help='Transform matrix used in AFNI')
    args = parser.parse_args()

    data = scio.loadmat(args.input)
    # ANTs stores the 3x3 matrix A and the translation t as a 12x1 vector
    # (key 'AffineTransform_float_3_3' because --float 1 was used), plus the fixed center c.
    mat = np.concatenate((data['AffineTransform_float_3_3'][0:9].reshape((3, 3)),
                          data['AffineTransform_float_3_3'][9:12]), axis=1)
    center = data['fixed']
    # ANTs applies x' = A*(x - c) + t + c; fold the center into the offset: t' = -A*c + t + c
    v = np.dot(mat[0:3, 0:3], -center) + mat[:, 3][:, np.newaxis] + center
    matt = np.concatenate((mat[0:3, 0:3], v), axis=1)
    np.savetxt(args.output, matt, delimiter='\t')
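A quick way to run the conversion and eyeball the result (file names assume the ANTs call was run with prefix=trans, as in the pipeline below):
python 1_ants2afniMatrix.py -i trans_0GenericAffine.mat -o trans_0GenericAffine.1D
1dcat trans_0GenericAffine.1D    # print the 3x4 affine and check that the values look sensible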
Applying the transform
This step is done with AFNI's 3dNwarpApply command; the files passed to its nwarp and master options were generated in the previous steps.
3dNwarpApply -prefix ${prefix}.nii.gz -source ${BD_name}.nii.gz \
-master ${prefix}_fwd_warped.nii.gz \
-nwarp "${prefix}_1Warp.nii.gz ${prefix}_0GenericAffine.1D"
At this point the registration is complete. Of course, this is only the general idea; there are many other ways to implement it.
3. Pipeline script
Prepare your data: resting.nii, t1.nii, 1_ants2afniMatrix.py (the script listed above)
#!/bin/bash
# 2022-04-14
# Registration batch script
# Deoblique
3dWarp -deoblique -prefix resting_deo.nii resting.nii
# volreg
3dvolreg -zpad 1 -base 0 -cubic -prefix resting_v.nii resting_deo.nii
# BOLD mean
3dTstat -mean -prefix resting_mean.nii resting_v.nii
# T1
3dWarp -deoblique -prefix t1_deo.nii t1.nii
bet t1_deo.nii t1_brain -f 0.3 -B -m
# ANTs
prefix=trans
fixed=t1_brain.nii.gz
moving=resting_mean.nii
base_mask=none
in_mask=none
# base_mask_SyN=../nodif_brain_mask.nii.gz
base_mask_SyN=t1_brain_mask.nii.gz
in_mask_SyN=none
antsRegistration -d 3 --float 1 --verbose \
--output [ ${prefix}_, ${prefix}_fwd_warped.nii.gz, ${prefix}_inv_warped.nii.gz ] \
--interpolation LanczosWindowedSinc \
--collapse-output-transforms 1 \
--initial-moving-transform [ ${fixed}, ${moving}, 1 ] \
--winsorize-image-intensities [0.005,0.995] \
--use-histogram-matching 1 \
--transform translation[ 0.1 ] \
--metric mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
--convergence [ 1000x300x100, 1e-6, 10 ] \
--smoothing-sigmas 4x2x1vox \
--shrink-factors 8x4x2 \
--use-estimate-learning-rate-once 1 \
--masks [ ${base_mask}, ${in_mask} ] \
-t rigid[ 0.1 ] \
-m mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
-c [ 1000x300x100, 1e-6, 10 ] \
-s 4x2x1vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask}, ${in_mask} ] \
-t affine[ 0.1 ] \
-m mattes[ ${fixed}, ${moving}, 1, 32, regular, 0.3 ] \
-c [ 1000x300x100, 1e-6, 10 ] \
-s 2x1x0vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask}, ${in_mask} ] \
-t SyN[ 0.1, 3, 0 ] \
-m mattes[ ${fixed}, ${moving}, 0.5 , 32 ] \
-m cc[ ${fixed}, ${moving}, 0.5 , 4 ] \
-c [ 500x500x100, 1e-8, 10 ] \
-s 1x0.5x0vox \
-f 4x2x1 -l 1 \
-x [ ${base_mask_SyN}, ${in_mask_SyN} ]
# Convert the ANTs .mat file into AFNI's .1D format
python ./1_ants2afniMatrix.py -i ${prefix}_0GenericAffine.mat -o ${prefix}_0GenericAffine.1D
# Apply the combined transforms to the motion-corrected BOLD series
3dNwarpApply -prefix epi2anat.nii.gz -source resting_v.nii -master ${prefix}_fwd_warped.nii.gz -nwarp "${prefix}_1Warp.nii.gz ${prefix}_0GenericAffine.1D"
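Once the pipeline has finished, it is worth checking the alignment visually, for example by overlaying epi2anat.nii.gz on t1_brain.nii.gz; a minimal sketch assuming FSLeyes is installed (the AFNI GUI works just as well):
fsleyes t1_brain.nii.gz epi2anat.nii.gz &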
Related links
- https://www.mrmikehart.com/tumour_analysis.html
- https://www.frontiersin.org/articles/10.3389/fninf.2019.00005/full
- https://www.nature.com/articles/s41592-018-0235-4
$\cdots$ end $\cdots$