Novel deep learning architecture for optical fluence dependent photoacoustic target localization

Kerrick Johnstonbaugh, Sumit Agrawal, Deepit Abhishek, Matthew Homewood, Sri Phani Krisna Karri, Sri-Rajasekhar Kothapalli

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Photoacoustic imaging shows great promise for clinical environments where real-time position feedback is critical, including the guiding of minimally invasive surgery, drug delivery, stem cell transplantation, and the placement of metal implants such as stents, needles, staples, and brachytherapy seeds. Photoacoustic imaging techniques generate high contrast, label-free images of human vasculature, leveraging the high optical absorption characteristics of hemoglobin to generate measurable longitudinal pressure waves. However, the depth-dependent decrease in optical fluence and lateral resolution affects the visibility of deeper vessels or other absorbing targets. This poses a problem when the precise locations of vessels are critical for the application at hand, such as navigational tasks during minimally invasive surgery. To address this issue, a novel deep neural network was designed, developed, and trained to predict the location of circular chromophore targets in tissue mimicking a strong scattering background, given measurements of photoacoustic signals from a linear array of ultrasound elements. The network was trained on 16,240 samples of simulated sensor data and tested on a separate set of 4,060 samples. Both our training and test sets consisted of optical fluence-dependent photoacoustic signal measurements from point sources at varying locations. Our network was able to predict the location of point sources with a mean axial error of 4.3 μm and a mean lateral error of 5.8 μm.
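The abstract describes a learned mapping from linear-array photoacoustic sensor measurements to point-source coordinates. The paper's actual network architecture and training pipeline are not given in this record; the sketch below only illustrates the geometric forward model such a network must implicitly invert: per-element acoustic arrival times for a point absorber beneath a linear array, recovered here by a brute-force least-squares grid search rather than a neural network. The array geometry (128 elements, 0.3 mm pitch), speed of sound (1540 m/s), and search grid are hypothetical values chosen for illustration, not taken from the paper.

```python
import numpy as np

def arrival_times(src_x, src_z, n_elements=128, pitch=0.3e-3, c=1540.0):
    """Time of flight (s) from a point source at (src_x, src_z) in meters
    to each element of a linear array centered at x = 0 on the z = 0 surface."""
    # Lateral positions of the array elements.
    xs = (np.arange(n_elements) - (n_elements - 1) / 2) * pitch
    # Straight-ray propagation at an assumed uniform speed of sound.
    return np.sqrt((xs - src_x) ** 2 + src_z ** 2) / c

def localize(times, n_elements=128, pitch=0.3e-3, c=1540.0):
    """Brute-force least-squares inversion of the forward model: return the
    (x, z) grid point whose predicted arrival-time pattern best matches the
    measured one. A trained network replaces this search with a single pass."""
    best, best_err = None, np.inf
    for x in np.linspace(-5e-3, 5e-3, 101):      # lateral search, 0.1 mm step
        for z in np.linspace(1e-3, 20e-3, 191):  # axial search, 0.1 mm step
            err = np.sum((arrival_times(x, z, n_elements, pitch, c) - times) ** 2)
            if err < best_err:
                best, best_err = (x, z), err
    return best
```

For a noiseless source at (1.0 mm, 10.0 mm), `localize(arrival_times(1.0e-3, 10.0e-3))` recovers the position to within the grid step; the paper's network achieves micron-scale accuracy on fluence-dependent simulated data, which this geometric sketch does not model.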

Original language: English (US)
Title of host publication: Photons Plus Ultrasound
Subtitle of host publication: Imaging and Sensing 2019
Editors: Lihong V. Wang, Alexander A. Oraevsky
Publisher: SPIE
ISBN (Electronic): 9781510623989
DOIs: https://doi.org/10.1117/12.2511015
State: Published - Jan 1 2019
Event: Photons Plus Ultrasound: Imaging and Sensing 2019 - San Francisco, United States
Duration: Feb 3 2019 - Feb 6 2019

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 10878
ISSN (Print): 1605-7422

Conference

Conference: Photons Plus Ultrasound: Imaging and Sensing 2019
Country: United States
City: San Francisco
Period: 2/3/19 - 2/6/19

Fingerprint

Minimally Invasive Surgical Procedures
Photoacoustic effect
Photoacoustic Techniques
Learning
Fluence
Brachytherapy
Stem Cell Transplantation
Surgery
Needles
Stents
Seeds
Hemoglobins
Hand
Metals
Point sources
Vessels
Transplantation (surgical)
Imaging techniques

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Atomic and Molecular Physics, and Optics
  • Biomaterials
  • Radiology Nuclear Medicine and imaging

Cite this

Johnstonbaugh, K., Agrawal, S., Abhishek, D., Homewood, M., Krisna Karri, S. P., & Kothapalli, S-R. (2019). Novel deep learning architecture for optical fluence dependent photoacoustic target localization. In L. V. Wang, & A. A. Oraevsky (Eds.), Photons Plus Ultrasound: Imaging and Sensing 2019 [108781L] (Progress in Biomedical Optics and Imaging - Proceedings of SPIE; Vol. 10878). SPIE. https://doi.org/10.1117/12.2511015
Johnstonbaugh, Kerrick ; Agrawal, Sumit ; Abhishek, Deepit ; Homewood, Matthew ; Krisna Karri, Sri Phani ; Kothapalli, Sri-Rajasekhar. / Novel deep learning architecture for optical fluence dependent photoacoustic target localization. Photons Plus Ultrasound: Imaging and Sensing 2019. editor / Lihong V. Wang ; Alexander A. Oraevsky. SPIE, 2019. (Progress in Biomedical Optics and Imaging - Proceedings of SPIE).
@inproceedings{f14c768030eb4e8ba6a3ae46d79a1a3a,
title = "Novel deep learning architecture for optical fluence dependent photoacoustic target localization",
abstract = "Photoacoustic imaging shows great promise for clinical environments where real-time position feedback is critical, including the guiding of minimally invasive surgery, drug delivery, stem cell transplantation, and the placement of metal implants such as stents, needles, staples, and brachytherapy seeds. Photoacoustic imaging techniques generate high contrast, label-free images of human vasculature, leveraging the high optical absorption characteristics of hemoglobin to generate measurable longitudinal pressure waves. However, the depth-dependent decrease in optical fluence and lateral resolution affects the visibility of deeper vessels or other absorbing targets. This poses a problem when the precise locations of vessels are critical for the application at hand, such as navigational tasks during minimally invasive surgery. To address this issue, a novel deep neural network was designed, developed, and trained to predict the location of circular chromophore targets in tissue mimicking a strong scattering background, given measurements of photoacoustic signals from a linear array of ultrasound elements. The network was trained on 16,240 samples of simulated sensor data and tested on a separate set of 4,060 samples. Both our training and test sets consisted of optical fluence-dependent photoacoustic signal measurements from point sources at varying locations. Our network was able to predict the location of point sources with a mean axial error of 4.3 μm and a mean lateral error of 5.8 μm.",
author = "Kerrick Johnstonbaugh and Sumit Agrawal and Deepit Abhishek and Matthew Homewood and {Krisna Karri}, {Sri Phani} and Sri-Rajasekhar Kothapalli",
year = "2019",
month = "1",
day = "1",
doi = "10.1117/12.2511015",
language = "English (US)",
series = "Progress in Biomedical Optics and Imaging - Proceedings of SPIE",
publisher = "SPIE",
editor = "Wang, {Lihong V.} and Oraevsky, {Alexander A.}",
booktitle = "Photons Plus Ultrasound",
address = "United States",
}

Johnstonbaugh, K, Agrawal, S, Abhishek, D, Homewood, M, Krisna Karri, SP & Kothapalli, S-R 2019, Novel deep learning architecture for optical fluence dependent photoacoustic target localization. in LV Wang & AA Oraevsky (eds), Photons Plus Ultrasound: Imaging and Sensing 2019., 108781L, Progress in Biomedical Optics and Imaging - Proceedings of SPIE, vol. 10878, SPIE, Photons Plus Ultrasound: Imaging and Sensing 2019, San Francisco, United States, 2/3/19. https://doi.org/10.1117/12.2511015

Novel deep learning architecture for optical fluence dependent photoacoustic target localization. / Johnstonbaugh, Kerrick; Agrawal, Sumit; Abhishek, Deepit; Homewood, Matthew; Krisna Karri, Sri Phani; Kothapalli, Sri-Rajasekhar.

Photons Plus Ultrasound: Imaging and Sensing 2019. ed. / Lihong V. Wang; Alexander A. Oraevsky. SPIE, 2019. 108781L (Progress in Biomedical Optics and Imaging - Proceedings of SPIE; Vol. 10878).


TY - GEN
T1 - Novel deep learning architecture for optical fluence dependent photoacoustic target localization
AU - Johnstonbaugh, Kerrick
AU - Agrawal, Sumit
AU - Abhishek, Deepit
AU - Homewood, Matthew
AU - Krisna Karri, Sri Phani
AU - Kothapalli, Sri-Rajasekhar
PY - 2019/1/1
Y1 - 2019/1/1
N2 - Photoacoustic imaging shows great promise for clinical environments where real-time position feedback is critical, including the guiding of minimally invasive surgery, drug delivery, stem cell transplantation, and the placement of metal implants such as stents, needles, staples, and brachytherapy seeds. Photoacoustic imaging techniques generate high contrast, label-free images of human vasculature, leveraging the high optical absorption characteristics of hemoglobin to generate measurable longitudinal pressure waves. However, the depth-dependent decrease in optical fluence and lateral resolution affects the visibility of deeper vessels or other absorbing targets. This poses a problem when the precise locations of vessels are critical for the application at hand, such as navigational tasks during minimally invasive surgery. To address this issue, a novel deep neural network was designed, developed, and trained to predict the location of circular chromophore targets in tissue mimicking a strong scattering background, given measurements of photoacoustic signals from a linear array of ultrasound elements. The network was trained on 16,240 samples of simulated sensor data and tested on a separate set of 4,060 samples. Both our training and test sets consisted of optical fluence-dependent photoacoustic signal measurements from point sources at varying locations. Our network was able to predict the location of point sources with a mean axial error of 4.3 μm and a mean lateral error of 5.8 μm.
AB - Photoacoustic imaging shows great promise for clinical environments where real-time position feedback is critical, including the guiding of minimally invasive surgery, drug delivery, stem cell transplantation, and the placement of metal implants such as stents, needles, staples, and brachytherapy seeds. Photoacoustic imaging techniques generate high contrast, label-free images of human vasculature, leveraging the high optical absorption characteristics of hemoglobin to generate measurable longitudinal pressure waves. However, the depth-dependent decrease in optical fluence and lateral resolution affects the visibility of deeper vessels or other absorbing targets. This poses a problem when the precise locations of vessels are critical for the application at hand, such as navigational tasks during minimally invasive surgery. To address this issue, a novel deep neural network was designed, developed, and trained to predict the location of circular chromophore targets in tissue mimicking a strong scattering background, given measurements of photoacoustic signals from a linear array of ultrasound elements. The network was trained on 16,240 samples of simulated sensor data and tested on a separate set of 4,060 samples. Both our training and test sets consisted of optical fluence-dependent photoacoustic signal measurements from point sources at varying locations. Our network was able to predict the location of point sources with a mean axial error of 4.3 μm and a mean lateral error of 5.8 μm.
UR - http://www.scopus.com/inward/record.url?scp=85065388430&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85065388430&partnerID=8YFLogxK
U2 - 10.1117/12.2511015
DO - 10.1117/12.2511015
M3 - Conference contribution
AN - SCOPUS:85065388430
T3 - Progress in Biomedical Optics and Imaging - Proceedings of SPIE
BT - Photons Plus Ultrasound
A2 - Wang, Lihong V.
A2 - Oraevsky, Alexander A.
PB - SPIE
ER -

Johnstonbaugh K, Agrawal S, Abhishek D, Homewood M, Krisna Karri SP, Kothapalli S-R. Novel deep learning architecture for optical fluence dependent photoacoustic target localization. In Wang LV, Oraevsky AA, editors, Photons Plus Ultrasound: Imaging and Sensing 2019. SPIE. 2019. 108781L. (Progress in Biomedical Optics and Imaging - Proceedings of SPIE). https://doi.org/10.1117/12.2511015