Target localization in synthetic aperture sonar imagery using convolutional neural networks

Thibaud Berthomier, David P. Williams, Samantha Dugelay

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Automatic target recognition (ATR) in synthetic aperture sonar (SAS) is usually performed in two stages: object detection and target classification. The detector aims to localize all potential targets, whereas the classifier distinguishes real targets from false alarms. The probability of detection at the first stage must be as high as possible to ensure that targets are not missed. Unfortunately, this generally implies a significant false alarm rate. The challenge of the second stage, classification, is therefore to drastically reduce the number of false alarms while retaining the detected targets. Using a large database of SAS images, efficient CNN classifiers have been demonstrated for underwater target classification tasks. In this paper, we propose applying a pretrained classification CNN to localize targets in SAS images. In so doing, we show the feasibility of performing target detection and classification in one step using CNNs.
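One common way to repurpose a chip-level classification CNN for localization, in the spirit the abstract describes, is to express the classifier's fully connected head as a convolution so the network can be applied densely to a full scene, producing a target-probability heatmap. The sketch below is illustrative only: the `ChipClassifier` architecture, chip size (32x32), and layer widths are assumptions, not the network from the paper.

```python
import torch
import torch.nn as nn

class ChipClassifier(nn.Module):
    """Hypothetical small CNN trained on fixed-size 32x32 SAS chips
    (target vs. clutter); sizes are illustrative only."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # The "fully connected" head over the 8x8 feature map of a 32x32
        # chip, written as an 8x8 convolution so it slides over larger inputs.
        self.head = nn.Conv2d(16, 2, kernel_size=8)

    def forward(self, x):
        return self.head(self.features(x))

model = ChipClassifier().eval()

# A single 32x32 chip yields one 2-class score (shape 1 x 2 x 1 x 1) ...
chip = torch.randn(1, 1, 32, 32)
with torch.no_grad():
    chip_out = model(chip)

# ... while a full 256x256 SAS scene yields a dense score map: each
# spatial position corresponds to one 32x32 window of the scene,
# which is what turns the classifier into a localizer.
scene = torch.randn(1, 1, 256, 256)
with torch.no_grad():
    heatmap = model(scene).softmax(dim=1)[0, 1]  # P(target) per window
```

Thresholding or non-maximum suppression on the heatmap would then give candidate target positions, merging the detection and classification stages into a single forward pass.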

Original language: English (US)
Title of host publication: OCEANS 2019 MTS/IEEE Seattle, OCEANS 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9780578576183
DOIs
State: Published - Oct 2019
Event: 2019 OCEANS MTS/IEEE Seattle, OCEANS 2019 - Seattle, United States
Duration: Oct 27 2019 – Oct 31 2019

Publication series

Name: OCEANS 2019 MTS/IEEE Seattle, OCEANS 2019

Conference

Conference: 2019 OCEANS MTS/IEEE Seattle, OCEANS 2019
Country/Territory: United States
City: Seattle
Period: 10/27/19 – 10/31/19

All Science Journal Classification (ASJC) codes

  • Automotive Engineering
  • Ocean Engineering
  • Acoustics and Ultrasonics
  • Fluid Flow and Transfer Processes
  • Oceanography
