Fusion of multiple-look synthetic aperture radar images at data and image levels

Ram Mohan Narayanan, Zhixi Li, Scott Papson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Synthetic aperture radar (SAR) and inverse synthetic aperture radar (ISAR) have proven capabilities for non-cooperative target recognition (NCTR) applications. Multiple looks of the same target (at different aspect angles, frequencies, etc.) can be exploited to enhance target recognition by fusing the information from each look. Such fusion can be performed at the raw data level or at the processed image level, depending on what is available. At the data level, physics-based image fusion techniques can be developed by processing the raw data collected from multiple ISAR sensors, even if the individual images are at different resolutions. The technique maps multiple data sets collected by multiple radars with different system parameters onto the same spatial-frequency space. The composite image can then be reconstructed using the inverse 2-D Fourier transform over the separated multiple integration areas. An algorithm called the Matrix Fourier Transform (MFT) is proposed to realize such a complicated integral. At the image level, a persistence framework can be used to enhance target features in large, aspect-varying datasets. The model focuses on cases containing rich aspect data from a single depression angle. The goal is to replace the data's intrinsic viewing-geometry dependencies with target-specific dependencies. Both direct mapping functions and cost functions are presented for the data transformation. An intensity-only mapping function is realized to illustrate the persistence model in terms of a canonical example, visualization, and classification.
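The data-level fusion idea lends itself to a short illustration. The sketch below is not the authors' MFT algorithm; it is a minimal NumPy example of the general approach the abstract describes: each look's complex phase-history samples are placed on a common spatial-frequency (k-space) grid according to that radar's center frequency, bandwidth, and aspect-angle span, and a composite image is then formed with an inverse 2-D FFT over the combined support. All function names, parameter values, and the nearest-neighbour gridding step are illustrative assumptions.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def kspace_support(fc, bandwidth, aspect_deg, n_freq, n_angle):
    """Spatial-frequency (k-space) sample locations for one radar look.

    fc and bandwidth are in Hz; aspect_deg is a (start, stop) pair in degrees.
    Returns kx, ky arrays of shape (n_freq, n_angle). Illustrative only.
    """
    f = np.linspace(fc - bandwidth / 2, fc + bandwidth / 2, n_freq)
    theta = np.deg2rad(np.linspace(aspect_deg[0], aspect_deg[1], n_angle))
    k = 4 * np.pi * f / C  # two-way wavenumber
    return np.outer(k, np.cos(theta)), np.outer(k, np.sin(theta))

def fuse_looks(looks, grid_size=256):
    """Map several looks onto one k-space grid and reconstruct a composite image.

    `looks` is a list of (data, kx, ky) tuples, where `data` holds complex
    phase-history samples at the (kx, ky) locations. Nearest-neighbour
    gridding stands in here for the paper's Matrix Fourier Transform.
    """
    k_max = max(max(np.abs(kx).max(), np.abs(ky).max()) for _, kx, ky in looks)
    grid = np.zeros((grid_size, grid_size), dtype=complex)
    hits = np.zeros((grid_size, grid_size))
    for data, kx, ky in looks:
        ix = np.clip(((kx / k_max + 1) / 2 * (grid_size - 1)).round().astype(int), 0, grid_size - 1)
        iy = np.clip(((ky / k_max + 1) / 2 * (grid_size - 1)).round().astype(int), 0, grid_size - 1)
        np.add.at(grid, (ix, iy), data)  # accumulate this look's samples
        np.add.at(hits, (ix, iy), 1)
    grid[hits > 0] /= hits[hits > 0]     # average where supports overlap
    # Composite image: inverse 2-D FFT over the union of the integration areas.
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid)))

# Hypothetical example: two radars with different bands and aspect spans.
rng = np.random.default_rng(0)
kx1, ky1 = kspace_support(10.0e9, 500e6, (0.0, 3.0), 64, 64)
kx2, ky2 = kspace_support(9.5e9, 300e6, (3.0, 6.0), 48, 48)
look1 = rng.standard_normal(kx1.shape) + 1j * rng.standard_normal(kx1.shape)
look2 = rng.standard_normal(kx2.shape) + 1j * rng.standard_normal(kx2.shape)
composite = fuse_looks([(look1, kx1, ky1), (look2, kx2, ky2)])
```

Because the two hypothetical looks occupy different regions of k-space, the fused grid spans a larger combined support than either look alone, which is the sense in which the composite image draws on every look's data.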

Original language: English (US)
Title of host publication: 2008 5th International Conference on Electrical Engineering, Computing Science and Automatic Control, CCE 2008
Pages: 508-513
Number of pages: 6
ISBN (Print): 9781424424993
DOI: 10.1109/ICEEE.2008.4723463
State: Published - Dec 1 2008
Event: 2008 5th International Conference on Electrical Engineering, Computing Science and Automatic Control, CCE 2008 - Mexico City, Mexico
Duration: Nov 12 2008 – Nov 14 2008

Other

Other: 2008 5th International Conference on Electrical Engineering, Computing Science and Automatic Control, CCE 2008
Country: Mexico
City: Mexico City
Period: 11/12/08 – 11/14/08


All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Energy Engineering and Power Technology
  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

Narayanan, R. M., Li, Z., & Papson, S. (2008). Fusion of multiple-look synthetic aperture radar images at data and image levels. In 2008 5th International Conference on Electrical Engineering, Computing Science and Automatic Control, CCE 2008 (pp. 508-513). [4723463] https://doi.org/10.1109/ICEEE.2008.4723463