Tissue-type discrimination in magnetic resonance images

David Y. Amamoto, Rangachar Kasturi, Alexander Mamourian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



A method for classifying each location in a set of magnetic resonance (MR) images by tissue type is described. Three MR images of a region of interest are acquired using spin-echo pulse sequences. The sequences used to acquire these images are specifically defined to allow the calculation of MR-related physical parameters from the image intensity data. After preprocessing operators are applied to the original images, the image intensity data are used to calculate three MR-related parameters for each location. Then, in a supervised training environment, this calculated data set is used together with the acquired image data set in a minimum-distance classifier to assign a class-specific color or gray level to each location in the image. Following the classification and formation of the tissue-map image, a set of edge detection routines is applied to generate tissue boundary images for all or a selected set of tissue types. Experimental results verify that the method accurately distinguishes between the major tissue types in a region of interest.
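The supervised minimum-distance classification step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes each pixel is represented by a feature vector of the three MR-derived parameters, that class prototypes are the mean feature vectors of labeled training pixels, and that distance is Euclidean. The function names and array shapes are hypothetical.

```python
import numpy as np

def train_minimum_distance(features, labels):
    """Compute one prototype (mean feature vector) per tissue class.

    features: (N, D) array of per-pixel feature vectors, e.g. the three
              MR-derived parameters for each training location.
    labels:   (N,) integer tissue-class labels from supervised training regions.
    Returns the class ids and a (K, D) array of class prototypes.
    """
    classes = np.unique(labels)
    prototypes = np.stack([features[labels == c].mean(axis=0) for c in classes])
    return classes, prototypes

def classify_minimum_distance(features, classes, prototypes):
    """Assign each pixel to the tissue class with the nearest prototype.

    Computes the (N, K) matrix of Euclidean distances from every pixel's
    feature vector to every class prototype, then picks the closest class.
    """
    dists = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

# Toy usage: two well-separated "tissue" classes in a 3-parameter space.
train_X = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.1],
                    [5.0, 5.0, 5.0], [5.1, 4.9, 5.0]])
train_y = np.array([0, 0, 1, 1])
cls, protos = train_minimum_distance(train_X, train_y)
pred = classify_minimum_distance(np.array([[0.05, 0.0, 0.0],
                                           [4.9, 5.0, 5.1]]), cls, protos)
```

In a full pipeline each predicted label would then be mapped to a class-specific color or gray level to form the tissue-map image, to which the edge-detection routines are applied.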

Original language: English (US)
Title of host publication: Proceedings - International Conference on Pattern Recognition
Publisher: Publ by IEEE
Number of pages: 5
ISBN (Print): 0818620625
State: Published - Dec 1 1990
Event: Proceedings of the 10th International Conference on Pattern Recognition - Atlantic City, NJ, USA
Duration: Jun 16 1990 - Jun 21 1990

Publication series

Name: Proceedings - International Conference on Pattern Recognition


Conference: Proceedings of the 10th International Conference on Pattern Recognition
City: Atlantic City, NJ, USA

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition

