DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data

Hongfei Xue, Wenjun Jiang, Chenglin Miao, Ye Yuan, Fenglong Ma, Xin Ma, Yijiang Wang, Shuochao Yao, Wenyao Xu, Aidong Zhang, Lu Su

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In recent years, significant research effort has been devoted to building intelligent and user-friendly IoT systems that enable a new generation of applications capable of performing complex sensing and recognition tasks. In many such applications, multiple different sensors monitor the same object. Each of these sensors can be regarded as an information source that provides a unique “view” of the observed object. Intuitively, if we can combine the complementary information carried by multiple sensors, we can improve sensing performance. Towards this end, we propose DeepFusion, a unified multi-sensor deep learning framework that learns informative representations of heterogeneous sensory data. DeepFusion combines different sensors’ information, weighted by the quality of their data, and incorporates cross-sensor correlations, and thus can benefit a wide spectrum of IoT applications. To evaluate the proposed DeepFusion model, we set up two real-world human activity recognition testbeds using commercial wearable and wireless sensing devices. Experiment results show that DeepFusion outperforms state-of-the-art human activity recognition methods.
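The core idea in the abstract — combining per-sensor representations weighted by an estimate of each sensor's data quality — can be sketched as a toy example. This is only an illustration of quality-weighted fusion in general, not the paper's actual DeepFusion architecture: the single-layer encoders, the softmax quality-scoring vector `v`, and all dimensions here are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Hypothetical per-sensor encoder: one linear layer with tanh,
    # mapping each sensor's raw input into a shared feature space.
    return np.tanh(x @ W)

def quality_weights(features, v):
    # Score each sensor's representation and normalize with softmax,
    # so lower-scoring (e.g. noisier) sensors contribute less.
    scores = np.array([f @ v for f in features])
    e = np.exp(scores - scores.max())
    return e / e.sum()

def fuse(sensor_inputs, encoders, v):
    feats = [encode(x, W) for x, W in zip(sensor_inputs, encoders)]
    w = quality_weights(feats, v)
    # Quality-weighted sum of the per-sensor representations.
    fused = sum(wi * f for wi, f in zip(w, feats))
    return fused, w

# Three heterogeneous sensors with different input dimensions,
# all projected into a shared 8-dimensional representation space.
dims = [6, 4, 10]
encoders = [rng.normal(size=(d, 8)) for d in dims]
v = rng.normal(size=8)                  # quality-scoring vector (assumed)
inputs = [rng.normal(size=d) for d in dims]

fused, w = fuse(inputs, encoders, v)
print(fused.shape, w)
```

In a trained model the encoder and scoring parameters would be learned end-to-end, so the weights adapt to each input rather than being fixed; this sketch only shows the fusion arithmetic.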

Original language: English (US)
Title of host publication: Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing
Publisher: Association for Computing Machinery
Pages: 151-160
Number of pages: 10
ISBN (Electronic): 9781450367646
DOI: 10.1145/3323679.3326513
State: Published - Jul 2 2019
Event: 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing, MobiHoc 2019 - Catania, Italy
Duration: Jul 2 2019 - Jul 5 2019

Publication series

Name: Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)

Conference

Conference: 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing, MobiHoc 2019
Country: Italy
City: Catania
Period: 7/2/19 - 7/5/19


All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Computer Networks and Communications
  • Software

Cite this

Xue, H., Jiang, W., Miao, C., Yuan, Y., Ma, F., Ma, X., ... Su, L. (2019). DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data. In Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing (pp. 151-160). (Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)). Association for Computing Machinery. https://doi.org/10.1145/3323679.3326513
Xue, Hongfei; Jiang, Wenjun; Miao, Chenglin; Yuan, Ye; Ma, Fenglong; Ma, Xin; Wang, Yijiang; Yao, Shuochao; Xu, Wenyao; Zhang, Aidong; Su, Lu. DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data. Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing. Association for Computing Machinery, 2019. pp. 151-160 (Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)).
@inproceedings{ba689d14d1c94867be75770cbbedef35,
title = "{DeepFusion}: A deep learning framework for the fusion of heterogeneous sensory data",
abstract = "In recent years, significant research effort has been devoted to building intelligent and user-friendly IoT systems that enable a new generation of applications capable of performing complex sensing and recognition tasks. In many such applications, multiple different sensors monitor the same object. Each of these sensors can be regarded as an information source that provides a unique ``view'' of the observed object. Intuitively, if we can combine the complementary information carried by multiple sensors, we can improve sensing performance. Towards this end, we propose DeepFusion, a unified multi-sensor deep learning framework that learns informative representations of heterogeneous sensory data. DeepFusion combines different sensors' information, weighted by the quality of their data, and incorporates cross-sensor correlations, and thus can benefit a wide spectrum of IoT applications. To evaluate the proposed DeepFusion model, we set up two real-world human activity recognition testbeds using commercial wearable and wireless sensing devices. Experiment results show that DeepFusion outperforms state-of-the-art human activity recognition methods.",
author = "Hongfei Xue and Wenjun Jiang and Chenglin Miao and Ye Yuan and Fenglong Ma and Xin Ma and Yijiang Wang and Shuochao Yao and Wenyao Xu and Aidong Zhang and Lu Su",
year = "2019",
month = jul,
day = "2",
doi = "10.1145/3323679.3326513",
isbn = "9781450367646",
language = "English (US)",
series = "Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)",
publisher = "Association for Computing Machinery",
pages = "151--160",
booktitle = "Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing",
}

Xue, H, Jiang, W, Miao, C, Yuan, Y, Ma, F, Ma, X, Wang, Y, Yao, S, Xu, W, Zhang, A & Su, L 2019, DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data. in Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing. Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc), Association for Computing Machinery, pp. 151-160, 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing, MobiHoc 2019, Catania, Italy, 7/2/19. https://doi.org/10.1145/3323679.3326513

DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data. / Xue, Hongfei; Jiang, Wenjun; Miao, Chenglin; Yuan, Ye; Ma, Fenglong; Ma, Xin; Wang, Yijiang; Yao, Shuochao; Xu, Wenyao; Zhang, Aidong; Su, Lu.

Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing. Association for Computing Machinery, 2019. p. 151-160 (Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY  - GEN
T1  - DeepFusion
T2  - A deep learning framework for the fusion of heterogeneous sensory data
AU  - Xue, Hongfei
AU  - Jiang, Wenjun
AU  - Miao, Chenglin
AU  - Yuan, Ye
AU  - Ma, Fenglong
AU  - Ma, Xin
AU  - Wang, Yijiang
AU  - Yao, Shuochao
AU  - Xu, Wenyao
AU  - Zhang, Aidong
AU  - Su, Lu
PY  - 2019/7/2
Y1  - 2019/7/2
N2  - In recent years, significant research effort has been devoted to building intelligent and user-friendly IoT systems that enable a new generation of applications capable of performing complex sensing and recognition tasks. In many such applications, multiple different sensors monitor the same object. Each of these sensors can be regarded as an information source that provides a unique “view” of the observed object. Intuitively, if we can combine the complementary information carried by multiple sensors, we can improve sensing performance. Towards this end, we propose DeepFusion, a unified multi-sensor deep learning framework that learns informative representations of heterogeneous sensory data. DeepFusion combines different sensors’ information, weighted by the quality of their data, and incorporates cross-sensor correlations, and thus can benefit a wide spectrum of IoT applications. To evaluate the proposed DeepFusion model, we set up two real-world human activity recognition testbeds using commercial wearable and wireless sensing devices. Experiment results show that DeepFusion outperforms state-of-the-art human activity recognition methods.
AB  - In recent years, significant research effort has been devoted to building intelligent and user-friendly IoT systems that enable a new generation of applications capable of performing complex sensing and recognition tasks. In many such applications, multiple different sensors monitor the same object. Each of these sensors can be regarded as an information source that provides a unique “view” of the observed object. Intuitively, if we can combine the complementary information carried by multiple sensors, we can improve sensing performance. Towards this end, we propose DeepFusion, a unified multi-sensor deep learning framework that learns informative representations of heterogeneous sensory data. DeepFusion combines different sensors’ information, weighted by the quality of their data, and incorporates cross-sensor correlations, and thus can benefit a wide spectrum of IoT applications. To evaluate the proposed DeepFusion model, we set up two real-world human activity recognition testbeds using commercial wearable and wireless sensing devices. Experiment results show that DeepFusion outperforms state-of-the-art human activity recognition methods.
UR  - http://www.scopus.com/inward/record.url?scp=85069797383&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=85069797383&partnerID=8YFLogxK
U2  - 10.1145/3323679.3326513
DO  - 10.1145/3323679.3326513
M3  - Conference contribution
AN  - SCOPUS:85069797383
T3  - Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)
SP  - 151
EP  - 160
BT  - Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing
PB  - Association for Computing Machinery
ER  - 

Xue H, Jiang W, Miao C, Yuan Y, Ma F, Ma X et al. DeepFusion: A deep learning framework for the fusion of heterogeneous sensory data. In Mobihoc 2019 - Proceedings of the 2019 20th ACM International Symposium on Mobile Ad Hoc Networking and Computing. Association for Computing Machinery. 2019. p. 151-160. (Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)). https://doi.org/10.1145/3323679.3326513