Brain anatomical structure segmentation by hybrid discriminative/generative models

Zhuowen Tu, Katherine L. Narr, Piotr Dollar, Ivo Dinov, Paul M. Thompson, Arthur W. Toga

Research output: Contribution to journal › Article › peer-review

159 Scopus citations

Abstract

In this paper, a hybrid discriminative/generative model for brain anatomical structure segmentation is proposed. The learning aspect of the approach is emphasized. In the discriminative appearance models, various cues such as intensity and curvatures are combined to locally capture the complex appearances of different anatomical structures. A probabilistic boosting tree (PBT) framework is adopted to learn multiclass discriminative models that combine hundreds of features across different scales. On the generative model side, both global and local shape models are used to capture the shape information about each anatomical structure. The parameters to combine the discriminative appearance and generative shape models are also automatically learned. Thus, low-level and high-level information is learned and integrated in a hybrid model. Segmentations are obtained by minimizing an energy function associated with the proposed hybrid model. Finally, a grid-face structure is designed to explicitly represent the 3-D region topology. This representation handles an arbitrary number of regions and facilitates fast surface evolution. Our system was trained and tested on a set of 3-D magnetic resonance imaging (MRI) volumes and the results obtained are encouraging.
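The abstract describes combining a discriminative appearance model with generative shape models into a single energy that is then minimized. A minimal sketch of that idea, assuming per-voxel probability maps from each model (the arrays, weights `alpha`/`beta`, and function names here are hypothetical; in the paper the combination weights are learned, and segmentation uses surface evolution over a grid-face representation rather than a per-voxel argmin):

```python
# Illustrative sketch, not the authors' implementation: a hybrid energy that
# sums weighted negative log-likelihoods from a discriminative appearance
# model and a generative shape model, minimized per voxel.
import numpy as np

def hybrid_energy(labels, p_appearance, p_shape, alpha=1.0, beta=0.5):
    """Energy of a candidate labeling under the combined model."""
    idx = np.arange(labels.size)
    eps = 1e-12  # guard against log(0)
    app = p_appearance[idx, labels]  # discriminative appearance term
    shp = p_shape[idx, labels]       # generative shape term
    return -alpha * np.log(app + eps).sum() - beta * np.log(shp + eps).sum()

def greedy_segment(p_appearance, p_shape, alpha=1.0, beta=0.5):
    """Per-voxel argmin of the combined cost (ignores the topology-aware
    surface evolution used in the full method)."""
    eps = 1e-12
    cost = -alpha * np.log(p_appearance + eps) - beta * np.log(p_shape + eps)
    return cost.argmin(axis=1)

# Toy example: 4 voxels, 3 candidate anatomical structures.
p_app = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.2, 0.2, 0.6]])
p_shp = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.1, 0.8],
                  [0.3, 0.4, 0.3]])
seg = greedy_segment(p_app, p_shp)  # -> array([0, 1, 2, 2])
```

The per-voxel argmin here stands in for the energy minimization; the interesting part of the hybrid model is that the appearance and shape terms can disagree (as at voxel 2, where shape overrides a weak appearance cue) and the learned weights arbitrate between them.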

Original language: English
Article number: 4359071
Pages (from-to): 495-508
Number of pages: 14
Journal: IEEE Transactions on Medical Imaging
Volume: 27
Issue number: 4
DOIs
State: Published - Apr 2008
Externally published: Yes

Keywords

  • Brain anatomical structures
  • Discriminative models
  • Generative models
  • Probabilistic boosting tree (PBT)
  • Segmentation
