Multimodal entity coreference for cervical dysplasia diagnosis

Dezhao Song, Edward Kim, Sharon Xiaolei Huang, Joseph Patruno, Héctor Muñoz-Avila, Jeff Heflin, L. Rodney Long, Sameer Antani

Research output: Contribution to journal › Article

24 Citations (Scopus)

Abstract

Cervical cancer is the second most common cancer among women. Existing screening programs for cervical cancer, such as the Pap smear, suffer from low sensitivity, so many diseased patients go undetected in the screening process. Using images of the cervix as an aid in cervical cancer screening has the potential to greatly improve sensitivity, and can be especially useful in resource-poor regions of the world. In this paper, we develop a data-driven computer algorithm for interpreting cervical images based on color and texture. We obtain 74% sensitivity and 90% specificity when differentiating high-grade cervical lesions from low-grade lesions and normal tissue. On the same dataset, the Pap test alone yields 37% sensitivity and 96% specificity, and the HPV test alone yields 57% sensitivity and 93% specificity. Furthermore, we develop a comprehensive algorithmic framework based on multimodal entity coreference for combining various tests to perform disease classification and diagnosis. When integrating multiple tests, we adopt information-gain and gradient-based approaches for learning the relative weights of the different tests. In our evaluation, we present a novel algorithm that integrates cervical images, Pap, HPV, and patient age, yielding 83.21% sensitivity and 94.79% specificity, a statistically significant improvement over using any single source of information alone.
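The sensitivity and specificity figures quoted above follow the standard screening-test definitions. As a quick illustration (not the authors' code, and with hypothetical confusion-matrix counts chosen only to reproduce round numbers), the two metrics can be computed as:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard screening-test metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # fraction of diseased cases correctly flagged
    specificity = tn / (tn + fp)  # fraction of healthy cases correctly cleared
    return sensitivity, specificity

# Hypothetical counts for illustration only (not from the paper's dataset):
sens, spec = sensitivity_specificity(tp=74, fn=26, tn=90, fp=10)
print(f"sensitivity={sens:.0%} specificity={spec:.0%}")  # → sensitivity=74% specificity=90%
```

The trade-off between the two is what the paper's multi-test integration improves on: each individual test (images, Pap, HPV) sits at a different point on this trade-off, and the learned weighting combines them.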

Original language: English (US)
Article number: 2352311
Pages (from-to): 229-245
Number of pages: 17
Journal: IEEE Transactions on Medical Imaging
Volume: 34
Issue number: 1
DOI: 10.1109/TMI.2014.2352311
ISSN: 0278-0062
PubMed ID: 25167547
State: Published - Jan 1 2015

Fingerprint

  • Uterine Cervical Dysplasia
  • Screening
  • Uterine Cervical Neoplasms
  • Papanicolaou Test
  • Sensitivity and Specificity
  • Textures
  • Early Detection of Cancer
  • Cervix Uteri
  • Tissue
  • Color
  • Learning
  • Weights and Measures
  • Neoplasms

All Science Journal Classification (ASJC) codes

  • Software
  • Radiological and Ultrasound Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering

Cite this

Song, D., Kim, E., Huang, S. X., Patruno, J., Muñoz-Avila, H., Heflin, J., ... Antani, S. (2015). Multimodal entity coreference for cervical dysplasia diagnosis. IEEE Transactions on Medical Imaging, 34(1), 229-245. [2352311]. https://doi.org/10.1109/TMI.2014.2352311
