Instrument tracking via online learning in retinal microsurgery

Yeqing Li, Chen Chen, Sharon Xiaolei Huang, Junzhou Huang

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

13 Citations (Scopus)

Abstract

Robust visual tracking of instruments is an important task in retinal microsurgery. In this context, the instruments undergo a large variety of appearance changes caused by illumination and other variations during a procedure, which makes the task very challenging. Most existing methods require a sufficient amount of labeled data for training, yet perform poorly on appearance changes that are unseen in the training data. To address these problems, we propose a new approach for robust instrument tracking. Specifically, we adopt an online learning technique that collects appearance samples of the instrument on the fly and gradually learns a target-specific detector. Online learning enables the detector to reinforce its model and become more robust over time. The performance of the proposed method has been evaluated on a fully annotated dataset of retinal instruments from in-vivo retinal microsurgery and on a laparoscopy image sequence. In all experiments, the proposed tracking approach shows superior performance compared to several state-of-the-art approaches.
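
A minimal sketch of the tracking-by-detection loop that the abstract describes: appearance samples of the instrument are gathered on the fly and folded back into a target-specific detector. This is not the authors' published algorithm; the raw-pixel patch features, the search window, the negative-sampling scheme, and the SGD-trained linear classifier are all assumptions made for illustration.

# Illustrative online tracking-by-detection sketch (NOT the paper's method).
# Assumptions: frames arrive as NumPy arrays, raw-pixel patches are used as
# features, and a linear classifier is updated by stochastic gradient descent.
import numpy as np
from sklearn.linear_model import SGDClassifier

PATCH = 32  # hypothetical patch size in pixels


def extract_feature(frame, cx, cy, size=PATCH):
    """Crop a square patch centered at (cx, cy) and return a normalized feature vector."""
    h, w = frame.shape[:2]
    x0 = int(np.clip(int(cx) - size // 2, 0, w - size))
    y0 = int(np.clip(int(cy) - size // 2, 0, h - size))
    patch = frame[y0:y0 + size, x0:x0 + size].astype(np.float32)
    patch = (patch - patch.mean()) / (patch.std() + 1e-6)  # simple photometric normalization
    return patch.ravel()


class OnlineInstrumentTracker:
    """Tracking-by-detection with a detector that is refined online, frame by frame."""

    def __init__(self, search_radius=40, step=4, n_neg=16):
        self.clf = SGDClassifier(loss="hinge", alpha=1e-4)
        self.search_radius = search_radius
        self.step = step
        self.n_neg = n_neg
        self.rng = np.random.default_rng(0)
        self.pos = None  # current (cx, cy) estimate of the instrument tip

    def init(self, frame, cx, cy):
        """Bootstrap the detector from the first annotated location."""
        self.pos = (cx, cy)
        self._update(frame, cx, cy)

    def track(self, frame):
        """Score candidate locations around the previous estimate, move to the best one,
        then reinforce the detector with samples collected from the new frame."""
        cx, cy = self.pos
        best_score, best = -np.inf, (cx, cy)
        for dx in range(-self.search_radius, self.search_radius + 1, self.step):
            for dy in range(-self.search_radius, self.search_radius + 1, self.step):
                f = extract_feature(frame, cx + dx, cy + dy)
                score = self.clf.decision_function(f[None, :])[0]
                if score > best_score:
                    best_score, best = score, (cx + dx, cy + dy)
        self.pos = best
        self._update(frame, *best)  # online update: the model reinforces itself over time
        return best

    def _update(self, frame, cx, cy):
        """Collect one positive sample at the estimated location plus several nearby negatives."""
        X, y = [extract_feature(frame, cx, cy)], [1]
        for _ in range(self.n_neg):
            dx, dy = self.rng.integers(-3 * PATCH, 3 * PATCH, size=2)
            if abs(dx) < PATCH and abs(dy) < PATCH:
                continue  # too close to the target to serve as a clean negative
            X.append(extract_feature(frame, cx + dx, cy + dy))
            y.append(0)
        self.clf.partial_fit(np.asarray(X), np.asarray(y), classes=[0, 1])

A real system would use stronger appearance features, gate the update on detection confidence to avoid drift, and estimate the instrument pose rather than a single point; the sketch only shows the core loop of predicting a location and then folding the new appearance back into the detector.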

Original language: English (US)
Title of host publication: Medical Image Computing and Computer-Assisted Intervention - MICCAI 2014 - 17th International Conference, Proceedings
Editors: Nobuhiko Hata, Christian Barillot, Joachim Hornegger, Polina Golland, Robert Howe
Publisher: Springer Verlag
Pages: 464-471
Number of pages: 8
ISBN (Electronic): 9783319104034
State: Published - Jan 1 2014
Event: 17th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2014 - Boston, United States
Duration: Sep 14 2014 - Sep 18 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8673
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 17th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2014
Country: United States
City: Boston
Period: 9/14/14 - 9/18/14

Fingerprint

  • Online Learning
  • Laparoscopy
  • Detector
  • Detectors
  • Visual Tracking
  • Image Sequence
  • Illumination
  • Lighting
  • Sufficient
  • Target
  • Experimental Results

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Li, Y., Chen, C., Huang, S. X., & Huang, J. (2014). Instrument tracking via online learning in retinal microsurgery. In N. Hata, C. Barillot, J. Hornegger, P. Golland, & R. Howe (Eds.), Medical Image Computing and Computer-Assisted Intervention - MICCAI 2014 - 17th International Conference, Proceedings (pp. 464-471). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 8673). Springer Verlag.