Neural network and Bayesian network fusion models to fuse electronic nose and surface acoustic wave sensor data for apple defect detection

Changying Li, Paul Heinz Heinemann, Richard Sherry

Research output: Contribution to journal › Article

66 Citations (Scopus)

Abstract

The Cyranose 320 electronic nose (Enose) and zNose™ are two instruments used to detect volatile profiles. In this research, feature level and decision level multisensor data fusion models, combined with covariance matrix adaptation evolutionary strategy (CMAES), were developed to fuse the Enose and zNose data to improve detection and classification performance for damaged apples compared with using the individual instruments alone. Principal component analysis (PCA) was used for feature extraction and probabilistic neural networks (PNN) were developed as the classifier. Three feature-based fusion schemes were compared. Dynamic selective fusion achieved an average 1.8% and a best 0% classification error rate in a total of 30 independent runs. The static selective fusion approach resulted in a 6.1% classification error rate, which was not as good as using individual sensors (4.2% for the Enose and 2.6% for the zNose) if only selected features were applied. Simply adding the Enose and zNose features without selection (non-selective fusion) worsened the classification performance with a 32.5% classification error rate. This indicated that the feature selection using the CMAES is an indispensable process in multisensor data fusion, especially if multiple sources of sensors contain much irrelevant or redundant information. At the decision level, Bayesian network fusion achieved better performance than two individual sensors, with 11% error rate versus 13% error rate for the Enose and 20% error rate for the zNose. It is shown that both the feature level fusion with the CMAES optimization algorithms and decision level fusion using a Bayesian network as a classifier improved system classification performance. This methodology can also be applied to other sensor fusion applications.
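The abstract's feature-level fusion pipeline (PCA feature extraction per instrument, concatenation of the selected features, then a probabilistic neural network classifier) can be sketched as below. This is a minimal illustration with synthetic data, not the authors' implementation: the channel counts, the class shift, the train/test split, and the PNN kernel width are all hypothetical, and the CMA-ES feature-selection step is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two instruments' readings:
# 32 Enose sensor channels and 6 zNose peak features per apple sample.
X_enose = rng.normal(size=(60, 32))
X_znose = rng.normal(size=(60, 6))
y = np.array([0] * 30 + [1] * 30)   # 0 = good apple, 1 = damaged apple
# Shift a few channels for the damaged class so the classes are separable.
X_enose[y == 1, :4] += 2.0
X_znose[y == 1, :2] += 2.0

def pca(X, k):
    """Project X onto its first k principal components (via SVD).
    For simplicity this fits on all samples; a real pipeline would
    fit on the training set only."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def pnn_predict(X_train, y_train, X_test, sigma=1.0):
    """Probabilistic neural network: a Parzen-window (Gaussian kernel)
    density estimate per class; predict the class with higher density."""
    scores = []
    for c in (0, 1):
        Xc = X_train[y_train == c]
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1))
    return np.argmax(np.stack(scores, axis=1), axis=1)

# Feature-level fusion: concatenate PCA features from both instruments,
# then classify the fused feature vector with a single PNN.
X_fused = np.hstack([pca(X_enose, 3), pca(X_znose, 2)])
train = np.arange(60) % 2 == 0      # simple alternating train/test split
pred = pnn_predict(X_fused[train], y[train], X_fused[~train])
error_rate = (pred != y[~train]).mean()
print(f"fused classification error rate: {error_rate:.1%}")
```

In the paper, the concatenation step is replaced by CMA-ES-driven selection of which PCA features to fuse, which the abstract reports as essential when the sources contain redundant or irrelevant information.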

Original language: English (US)
Pages (from-to): 301-310
Number of pages: 10
Journal: Sensors and Actuators, B: Chemical
Volume: 125
Issue number: 1
DOI: 10.1016/j.snb.2007.02.027
State: Published - Jul 16 2007

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Instrumentation
  • Condensed Matter Physics
  • Surfaces, Coatings and Films
  • Metals and Alloys
  • Electrical and Electronic Engineering
  • Materials Chemistry

Cite this

@article{ea8df50188ea4db3adedac1396fa87d9,
title = "Neural network and Bayesian network fusion models to fuse electronic nose and surface acoustic wave sensor data for apple defect detection",
abstract = "The Cyranose 320 electronic nose (Enose) and zNose™ are two instruments used to detect volatile profiles. In this research, feature level and decision level multisensor data fusion models, combined with covariance matrix adaptation evolutionary strategy (CMAES), were developed to fuse the Enose and zNose data to improve detection and classification performance for damaged apples compared with using the individual instruments alone. Principal component analysis (PCA) was used for feature extraction and probabilistic neural networks (PNN) were developed as the classifier. Three feature-based fusion schemes were compared. Dynamic selective fusion achieved an average 1.8{\%} and a best 0{\%} classification error rate in a total of 30 independent runs. The static selective fusion approach resulted in a 6.1{\%} classification error rate, which was not as good as using individual sensors (4.2{\%} for the Enose and 2.6{\%} for the zNose) if only selected features were applied. Simply adding the Enose and zNose features without selection (non-selective fusion) worsened the classification performance with a 32.5{\%} classification error rate. This indicated that the feature selection using the CMAES is an indispensable process in multisensor data fusion, especially if multiple sources of sensors contain much irrelevant or redundant information. At the decision level, Bayesian network fusion achieved better performance than two individual sensors, with 11{\%} error rate versus 13{\%} error rate for the Enose and 20{\%} error rate for the zNose. It is shown that both the feature level fusion with the CMAES optimization algorithms and decision level fusion using a Bayesian network as a classifier improved system classification performance. This methodology can also be applied to other sensor fusion applications.",
author = "Changying Li and Heinemann, {Paul Heinz} and Richard Sherry",
year = "2007",
month = "7",
day = "16",
doi = "10.1016/j.snb.2007.02.027",
language = "English (US)",
volume = "125",
pages = "301--310",
journal = "Sensors and Actuators, B: Chemical",
issn = "0925-4005",
publisher = "Elsevier",
number = "1",

}

Neural network and Bayesian network fusion models to fuse electronic nose and surface acoustic wave sensor data for apple defect detection. / Li, Changying; Heinemann, Paul Heinz; Sherry, Richard.

In: Sensors and Actuators, B: Chemical, Vol. 125, No. 1, 16.07.2007, p. 301-310.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Neural network and Bayesian network fusion models to fuse electronic nose and surface acoustic wave sensor data for apple defect detection

AU - Li, Changying

AU - Heinemann, Paul Heinz

AU - Sherry, Richard

PY - 2007/7/16

Y1 - 2007/7/16

N2 - The Cyranose 320 electronic nose (Enose) and zNose™ are two instruments used to detect volatile profiles. In this research, feature level and decision level multisensor data fusion models, combined with covariance matrix adaptation evolutionary strategy (CMAES), were developed to fuse the Enose and zNose data to improve detection and classification performance for damaged apples compared with using the individual instruments alone. Principal component analysis (PCA) was used for feature extraction and probabilistic neural networks (PNN) were developed as the classifier. Three feature-based fusion schemes were compared. Dynamic selective fusion achieved an average 1.8% and a best 0% classification error rate in a total of 30 independent runs. The static selective fusion approach resulted in a 6.1% classification error rate, which was not as good as using individual sensors (4.2% for the Enose and 2.6% for the zNose) if only selected features were applied. Simply adding the Enose and zNose features without selection (non-selective fusion) worsened the classification performance with a 32.5% classification error rate. This indicated that the feature selection using the CMAES is an indispensable process in multisensor data fusion, especially if multiple sources of sensors contain much irrelevant or redundant information. At the decision level, Bayesian network fusion achieved better performance than two individual sensors, with 11% error rate versus 13% error rate for the Enose and 20% error rate for the zNose. It is shown that both the feature level fusion with the CMAES optimization algorithms and decision level fusion using a Bayesian network as a classifier improved system classification performance. This methodology can also be applied to other sensor fusion applications.

AB - The Cyranose 320 electronic nose (Enose) and zNose™ are two instruments used to detect volatile profiles. In this research, feature level and decision level multisensor data fusion models, combined with covariance matrix adaptation evolutionary strategy (CMAES), were developed to fuse the Enose and zNose data to improve detection and classification performance for damaged apples compared with using the individual instruments alone. Principal component analysis (PCA) was used for feature extraction and probabilistic neural networks (PNN) were developed as the classifier. Three feature-based fusion schemes were compared. Dynamic selective fusion achieved an average 1.8% and a best 0% classification error rate in a total of 30 independent runs. The static selective fusion approach resulted in a 6.1% classification error rate, which was not as good as using individual sensors (4.2% for the Enose and 2.6% for the zNose) if only selected features were applied. Simply adding the Enose and zNose features without selection (non-selective fusion) worsened the classification performance with a 32.5% classification error rate. This indicated that the feature selection using the CMAES is an indispensable process in multisensor data fusion, especially if multiple sources of sensors contain much irrelevant or redundant information. At the decision level, Bayesian network fusion achieved better performance than two individual sensors, with 11% error rate versus 13% error rate for the Enose and 20% error rate for the zNose. It is shown that both the feature level fusion with the CMAES optimization algorithms and decision level fusion using a Bayesian network as a classifier improved system classification performance. This methodology can also be applied to other sensor fusion applications.

UR - http://www.scopus.com/inward/record.url?scp=34347378913&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34347378913&partnerID=8YFLogxK

U2 - 10.1016/j.snb.2007.02.027

DO - 10.1016/j.snb.2007.02.027

M3 - Article

AN - SCOPUS:34347378913

VL - 125

SP - 301

EP - 310

JO - Sensors and Actuators, B: Chemical

JF - Sensors and Actuators, B: Chemical

SN - 0925-4005

IS - 1

ER -