Group-Wise FMRI Activation Detection on DICCCOL Landmarks

Jinglei Lv, Lei Guo, Dajiang Zhu, Tuo Zhang, Xintao Hu, Junwei Han, Tianming Liu

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Group-wise activation detection in task-based fMRI has been widely used because of its robustness to noise and its capacity to handle the variability of individual brains. However, current group-wise fMRI activation detection methods typically rely on co-registration of individual brains' fMRI images, which has difficulty dealing with the remarkable anatomical variation across brains. As a consequence, the resulting misalignments can significantly degrade the required inter-subject correspondences, substantially reducing the sensitivity and specificity of group-wise fMRI activation detection. To address these challenges, this paper presents a novel approach to detecting group-wise fMRI activation on our recently developed and validated Dense Individualized and Common Connectivity-based Cortical Landmarks (DICCCOL). The basic idea is that first-level general linear model (GLM) analysis is performed on the fMRI signal of each corresponding DICCCOL landmark in each individual brain's own space, and the estimated effect sizes of the same landmark from a group of subjects are then statistically assessed with a mixed-effect model at the group level. Finally, the consistently activated DICCCOL landmarks are determined and declared in a group-wise fashion in response to external block-based stimuli. Our experimental results demonstrate that the proposed approach can detect meaningful activations.
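The two-stage procedure described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it uses synthetic data, an unconvolved boxcar regressor, and a simplified random-effects group test (a one-sample t-test on per-subject effect sizes) in place of a full mixed-effect model. All names and parameters below are hypothetical.

```python
import numpy as np
from scipy import stats

def boxcar_design(n_scans, block_len):
    """Block-based task regressor (alternating off/on blocks) plus an
    intercept column. A real analysis would convolve the boxcar with a
    hemodynamic response function; omitted here for brevity."""
    task = (np.arange(n_scans) // block_len) % 2
    return np.column_stack([task.astype(float), np.ones(n_scans)])

def first_level_glm(y, X):
    """First-level GLM: OLS fit of the landmark's fMRI signal y (T,)
    against design matrix X (T, p), performed in the subject's own
    space. Returns the effect size (beta) of the task regressor."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0]

def group_activation(effect_sizes, alpha=0.05):
    """Simplified group-level random-effects test: one-sample t-test on
    the per-subject effect sizes of one DICCCOL landmark. The landmark
    is declared active if the group effect is significantly positive."""
    t, p = stats.ttest_1samp(effect_sizes, 0.0)
    return t, p, bool(p < alpha and t > 0)

# Synthetic illustration: 12 subjects, one landmark, 120 scans each.
rng = np.random.default_rng(0)
X = boxcar_design(n_scans=120, block_len=15)
betas = []
for _ in range(12):
    y = 0.8 * X[:, 0] + rng.normal(0.0, 1.0, 120)  # task effect + noise
    betas.append(first_level_glm(y, X))
t, p, active = group_activation(np.array(betas))
```

Because each subject's effect size is estimated in that subject's native space, the group-level test requires only landmark correspondence (supplied by DICCCOL), not voxel-wise image co-registration.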

Original language: English
Pages (from-to): 513-534
Number of pages: 22
Journal: Neuroinformatics
Volume: 12
Issue number: 4
DOIs
State: Published - 21 Oct 2014

Keywords

  • DICCCOL
  • DTI
  • fMRI
  • Group-wise activation detection
