Prediction of human odour assessments based on hedonic tone method using instrument measurements and multi-sensor data fusion integrated neural networks

Fangle Chang, Paul H. Heinemann

Research output: Contribution to journal › Article › peer-review

Abstract

A Cyranose 320 (eNose) and a fast gas chromatography (GC) analyser (zNose™) were used to measure the headspace odour of solid samples from dairy operations. The measurements from both sensors were used to train a Levenberg–Marquardt back-propagation neural network (LMBNN) to match human assessments. A trained human panel assessed the odours using the hedonic tone method and provided the model targets. A multi-sensor data fusion approach was developed to integrate the eNose and zNose readings for higher predictive accuracy than either sensor alone. Principal Component Analysis, Forward Selection, and the Gamma Test were applied to reduce the model input dimensions. Both measurement fusion and information fusion approaches were applied. The information fusion prediction models were shown to be more accurate than all other models, including the single-instrument models. The information fusion model based on eNose with Gamma Test data reduction + zNose showed the best results of all cases in validation mean square error (0.34 odour units), R value (0.92), probability of the prediction falling within 10% of the target (96%), and probability of the prediction falling within 5% of the target (63%).
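The measurement-fusion idea in the abstract can be sketched as follows: concatenate the eNose and zNose feature vectors, reduce the dimensionality (here with PCA, one of the three reduction methods the paper lists), and regress against the panel's hedonic-tone scores with a small neural network. This is a minimal illustration, not the authors' implementation: the data are synthetic, the 32-channel eNose width reflects the Cyranose 320's sensor array while the zNose feature count is assumed, and scikit-learn does not provide Levenberg–Marquardt training, so L-BFGS stands in for the LMBNN solver.

```python
# Hypothetical sketch of measurement-level sensor fusion for odour prediction.
# Synthetic data only; solver and feature counts are stand-in assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples = 60
enose = rng.normal(size=(n_samples, 32))      # Cyranose 320: 32 sensor channels
znose = rng.normal(size=(n_samples, 10))      # zNose: assumed 10 peak features
hedonic = rng.uniform(-4.0, 4.0, size=n_samples)  # synthetic panel scores

# Measurement fusion: concatenate raw sensor features before modelling.
fused = np.hstack([enose, znose])

model = make_pipeline(
    PCA(n_components=8),                      # reduce input dimensionality
    MLPRegressor(hidden_layer_sizes=(10,),    # small single-hidden-layer net
                 solver="lbfgs",              # stand-in for Levenberg-Marquardt
                 max_iter=2000, random_state=0),
)
model.fit(fused, hedonic)
pred = model.predict(fused)
mse = float(np.mean((pred - hedonic) ** 2))
```

An information-fusion variant would instead train a separate network per instrument and combine their outputs; the abstract reports that approach as the most accurate.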

Original language: English (US)
Pages (from-to): 272-283
Number of pages: 12
Journal: Biosystems Engineering
Volume: 200
DOIs
State: Published - Dec 2020

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Food Science
  • Animal Science and Zoology
  • Agronomy and Crop Science
  • Soil Science

