Segmentation of tissue boundary evolution from brain MR image sequences using multi-phase level sets

Corina S. Drapaca, Valerie Cardenas, Colin Studholme

Research output: Contribution to journal › Article › peer-review

24 Scopus citations

Abstract

In this paper, we focus on the automated extraction of the cerebrospinal fluid-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging segmentation problem because of the common occurrence of peri-ventricular lesions, which locally alter the appearance of white matter. We examine a level set approach which evolves a 4D description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal direction, improving the consistency of the extracted object over time. The 3D MR images of the entire brain are first aligned using global rigid registration. We then follow the approach proposed by Chan and Vese, which is based on the Mumford and Shah model and implemented using the Osher and Sethian level set method. We have extended this to the 4D case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than the image gradient. This model is then adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.
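To illustrate the region-based evolution the abstract refers to, here is a minimal 2D Chan-Vese sketch in NumPy. This is not the authors' 4D/5D implementation: it is the standard single-phase Chan-Vese update on a toy image, in which the contour is driven by the mean intensities inside and outside the zero level set rather than by the image gradient. All parameter values (`mu`, `dt`, `eps`) and the toy image are illustrative choices, not from the paper.

```python
import numpy as np

def chan_vese_step(phi, img, mu=0.2, dt=0.5, eps=1.0):
    """One explicit Euler step of the (2D, two-phase) Chan-Vese evolution."""
    # Smoothed Heaviside H(phi): soft membership of the "inside" region.
    H = 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))
    # Piecewise-constant fit: mean intensity inside and outside the contour.
    c_in = (img * H).sum() / (H.sum() + 1e-8)
    c_out = (img * (1.0 - H)).sum() / ((1.0 - H).sum() + 1e-8)
    # Smoothed Dirac delta localizes the update near the zero level set.
    delta = (eps / np.pi) / (eps**2 + phi**2)
    # Curvature (length penalty): divergence of the normalized gradient.
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx**2 + gy**2) + 1e-8
    div_y, _ = np.gradient(gy / norm)
    _, div_x = np.gradient(gx / norm)
    curvature = div_x + div_y
    # Region-based force: only intensity means, no image gradient anywhere.
    force = mu * curvature - (img - c_in) ** 2 + (img - c_out) ** 2
    return phi + dt * delta * force

# Toy example: a bright square on a dark background.
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0

# Initialize phi as a signed distance to a circle enclosing the square
# (positive inside, negative outside).
yy, xx = np.mgrid[0:64, 0:64]
phi = 20.0 - np.sqrt((xx - 32.0) ** 2 + (yy - 32.0) ** 2)

truth = img > 0.5
def dice(a, b):
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())

dice_init = dice(phi > 0, truth)
for _ in range(500):
    phi = chan_vese_step(phi, img)
dice_final = dice(phi > 0, truth)
```

The contour shrinks from the initial circle onto the square because the region force penalizes pixels whose intensity disagrees with the mean of the region they currently belong to. The paper's extension replaces this 2D `phi` with a 5D implicit function so that the same region forces act jointly across the time frames of the registered MR sequence.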

Original language: English (US)
Pages (from-to): 312-329
Number of pages: 18
Journal: Computer Vision and Image Understanding
Volume: 100
Issue number: 3
DOIs
State: Published - Dec 2005

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition

