Fusing heterogeneous data: A case for remote sensing and social media

Han Wang, Erik Skau, Hamid Krim, Guido Cervone

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Data heterogeneity poses a great challenge to processing and systematically fusing low-level data from different modalities without recourse to heuristics or manual adjustments and refinements. In this paper, a new methodology is introduced for the fusion of measured data for detecting and predicting weather-driven natural hazards. The proposed research introduces a robust theoretical and algorithmic framework for the fusion of heterogeneous data in near real time. We establish a flexible information-based fusion framework with a target optimality criterion of choice, which, for illustration, is specialized to a maximum entropy principle and a least effort principle for semisupervised learning with noisy labels. We develop a methodology to account for multimodality data and a solution for addressing inherent sensor limitations. In our case study of interest, namely, that of flood density estimation, we further show that by fusing remote sensing and social media data, we can develop well-founded and actionable flood maps. This capability is valuable in situations where environmental hazards, such as hurricanes or severe weather, affect very large areas. Relative to the state of the art working with such data, our proposed information-theoretic solution is principled and systematic, while offering joint exploitation of any set of heterogeneous sensor modalities with minimal prior assumptions. This flexibility is coupled with the ability to quantitatively and clearly state the fusion principles at very reasonable computational cost. The proposed method is tested and substantiated with the multimodality data of the 2013 Boulder, Colorado, flood event.
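
As an illustration of the maximum entropy principle mentioned in the abstract (a minimal, generic sketch, not the authors' algorithm), the Python snippet below fits a maximum-entropy distribution over a grid of cells whose features stand in for a remote-sensing water index and a social-media report density. All names, sizes, and data are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical study area: N grid cells, each carrying K features.
N, K = 2500, 2
features = np.column_stack([
    rng.normal(size=N),               # stand-in for a remote-sensing water index
    rng.poisson(2.0, size=N) / 10.0,  # stand-in for a social-media report density
])

# Hypothetical (noisy) observations: indices of cells reported as flooded.
observed = rng.choice(N, size=120, replace=False)
target_means = features[observed].mean(axis=0)

# Maximum entropy principle: p_i proportional to exp(features_i . lam), with lam
# chosen so the model's expected features match the observed feature means.
lam = np.zeros(K)
for _ in range(500):
    logits = features @ lam
    logits -= logits.max()                         # numerical stability
    p = np.exp(logits)
    p /= p.sum()
    lam += 0.5 * (target_means - p @ features)     # moment-matching ascent step

density_map = p.reshape(50, 50)                    # per-cell flood density estimate
print("model feature means: ", p @ features)
print("target feature means:", target_means)

The sketch only shows the moment-matching form a maximum-entropy estimate takes; the paper's fusion framework, its least effort principle, and its treatment of noisy labels are more involved.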

Original language: English (US)
Article number: 8412269
Pages (from-to): 6956-6968
Number of pages: 13
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 56
Issue number: 12
DOIs: 10.1109/TGRS.2018.2846199
State: Published - Dec 1 2018


All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Earth and Planetary Sciences (all)

Cite this

@article{2ffa89ffeb3b477a8d5b9e4eee2cc1a6,
title = "Fusing heterogeneous data: A case for remote sensing and social media",
abstract = "Data heterogeneity can pose a great challenge to process and systematically fuse low-level data from different modalities with no recourse to heuristics and manual adjustments and refinements. In this paper, a new methodology is introduced for the fusion of measured data for detecting and predicting weather-driven natural hazards. The proposed research introduces a robust theoretical and algorithmic framework for the fusion of heterogeneous data in near real time. We establish a flexible information-based fusion framework with a target optimality criterion of choice, which for illustration, is specialized to a maximum entropy principle and a least effort principle for semisupervised learning with noisy labels. We develop a methodology to account for multimodality data and a solution for addressing inherent sensor limitations. In our case study of interest, namely, that of flood density estimation, we further show that by fusing remote sensing and social media data, we can develop well founded and actionable flood maps. This capability is valuable in situations where environmental hazards, such as hurricanes or severe weather, affect very large areas. Relative to the state of the art working with such data, our proposed information-theoretic solution is principled and systematic, while offering a joint exploitation of any set of heterogeneous sensor modalities with minimally assuming priors. This flexibility is coupled with the ability to quantitatively and clearly state the fusion principles with very reasonable computational costs. The proposed method is tested and substantiated with the multimodality data of a 2013 Boulder Colorado flood event.",
author = "Han Wang and Erik Skau and Hamid Krim and Guido Cervone",
year = "2018",
month = "12",
day = "1",
doi = "10.1109/TGRS.2018.2846199",
language = "English (US)",
volume = "56",
pages = "6956--6968",
journal = "IEEE Transactions on Geoscience and Remote Sensing",
issn = "0196-2892",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "12",

}
