Hierarchical maximum entropy modeling for regression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Maximum entropy/iterative scaling (ME/IS) models are well developed for classification on categorical (discrete-field) feature spaces. In this paper, we propose a hierarchical maximum entropy regression (HMEreg) model that builds a posterior model for a continuous target, encoding constraints drawn from hierarchical tree structures over both the input features and the target output variable. In ME models, the tradeoff between model bias and variance lies in the constraints encoded into the model: complex constraints give the model more representation capacity but may over-fit, whereas simple constraints produce less over-fitting but may incur much greater model bias. We developed a greedy order-growing constraint search method that sequentially builds constraints of flexible order based on the likelihood gain on a validation set. Experiments showed that the HMEreg model performed comparably to or better than other regression models, including generalized linear regression, the multi-layer perceptron, support vector regression, and the regression tree.
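For context, an ME model represents the posterior in log-linear form, p(y | x) ∝ exp(Σ_i λ_i f_i(x, y)), where each constraint contributes a feature function f_i whose empirical expectation the fitted model must match. The sketch below is a minimal, hypothetical illustration of the kind of greedy, validation-driven constraint selection the abstract describes: the continuous target is simply quantile-binned, and a softmax log-linear model is grown by adding single-feature and pairwise-product constraints only while they improve validation log-likelihood. It is not the authors' HMEreg implementation; the function names, the binning scheme, and the candidate-constraint set are assumptions made purely for illustration.

# Illustrative sketch only (not the paper's HMEreg): greedy constraint
# selection for a log-linear (maximum entropy) posterior over a
# quantile-binned target; each candidate constraint is kept only if it
# increases the validation-set log-likelihood.
import numpy as np
from itertools import combinations

def fit_loglinear(F, y, n_bins, lr=0.1, iters=300):
    """Fit softmax weights W (d x n_bins) for feature matrix F by gradient
    ascent on the average conditional log-likelihood."""
    n, d = F.shape
    W = np.zeros((d, n_bins))
    Y = np.eye(n_bins)[y]                                   # one-hot bin labels
    for _ in range(iters):
        logits = F @ W
        P = np.exp(logits - logits.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        W += lr * F.T @ (Y - P) / n                         # gradient of mean log-likelihood
    return W

def avg_loglik(F, y, W):
    """Average conditional log-likelihood of binned targets y under weights W."""
    logits = F @ W
    m = logits.max(axis=1)
    logZ = m + np.log(np.exp(logits - m[:, None]).sum(axis=1))
    return np.mean(logits[np.arange(len(y)), y] - logZ)

def design(X, constraints):
    """Bias column plus one product feature per constraint (tuple of column indices)."""
    cols = [np.ones(len(X))] + [np.prod(X[:, list(c)], axis=1) for c in constraints]
    return np.column_stack(cols)

def greedy_constraint_search(X_tr, y_tr, X_va, y_va, n_bins=8, max_constraints=10):
    """Order-growing greedy search: candidates are single features (order 1)
    and feature pairs (order 2); each round adds the candidate with the
    largest validation log-likelihood gain, stopping when nothing helps."""
    d = X_tr.shape[1]
    candidates = [(i,) for i in range(d)] + list(combinations(range(d), 2))
    chosen = []
    W = fit_loglinear(design(X_tr, chosen), y_tr, n_bins)
    best_va = avg_loglik(design(X_va, chosen), y_va, W)
    for _ in range(max_constraints):
        best_c, best_gain = None, 0.0
        for c in candidates:
            if c in chosen:
                continue
            W = fit_loglinear(design(X_tr, chosen + [c]), y_tr, n_bins)
            gain = avg_loglik(design(X_va, chosen + [c]), y_va, W) - best_va
            if gain > best_gain:
                best_c, best_gain = c, gain
        if best_c is None:                                   # no remaining constraint improves validation fit
            break
        chosen.append(best_c)
        best_va += best_gain
    return chosen, best_va

# Hypothetical usage on synthetic data; the continuous target is quantile-binned
# as a crude stand-in for a hierarchical partition of the output space.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y_cont = X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=400)
edges = np.quantile(y_cont, np.linspace(0, 1, 9)[1:-1])
y = np.digitize(y_cont, edges)                               # integer bins 0..7
constraints, va_ll = greedy_constraint_search(X[:300], y[:300], X[300:], y[300:])
print("selected constraints:", constraints, "validation log-likelihood:", round(va_ll, 3))

In the actual HMEreg model, the constraints are derived from hierarchical tree structures over both the feature space and the continuous target, rather than the flat quantile binning used in this sketch.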

Original language: English (US)
Title of host publication: Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009
DOIs: 10.1109/MLSP.2009.5306225
State: Published - Dec 1 2009
Event: Machine Learning for Signal Processing XIX - 2009 IEEE Signal Processing Society Workshop, MLSP 2009 - Grenoble, France
Duration: Sep 2 2009 - Sep 4 2009

Publication series

Name: Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009

Other

Other: Machine Learning for Signal Processing XIX - 2009 IEEE Signal Processing Society Workshop, MLSP 2009
Country: France
City: Grenoble
Period: 9/2/09 - 9/4/09

Fingerprint

Entropy
Regression
Multilayer neural networks
Linear regression
Scaling
Trend
Experiment

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Signal Processing
  • Education

Cite this

Zhang, Y., Miller, D. J., & Kesidis, G. (2009). Hierarchical maximum entropy modeling for regression. In Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009 [5306225] (Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009). https://doi.org/10.1109/MLSP.2009.5306225
@inproceedings{a139af06083743ef81c3c922a2ea5ae4,
title = "Hierarchical maximum entropy modeling for regression",
abstract = "Maximum entropy/iterative scaling (ME/IS) models have been well developed for classification on categorical (discrete-field) feature spaces. In this paper, we propose a hierarchical maximum entropy regression (HMEreg) model in building a posterior model for continuous target, which encodes constraints in the hierarchical tree structures from both input features and target output variable. In ME models, the tradeoff between model bias and variance is found in the constraints encoded into the model - complex constraints give the model more representation capacity but may over-fit, whereas simple constraints may produce less over-fitting but may have much more model bias. We developed a greedy order-growing constraint search method to sequentially build constraints with flexible order based on likelihood gain on a validation set. Experiments showed the HMEreg model performed comparably to or better than other regression models, including generalized linear regression, multi-layer perceptron, support vector regression, and regression tree.",
author = "Zhang, Yanxin and Miller, {David Jonathan} and Kesidis, George",
year = "2009",
month = "12",
day = "1",
doi = "10.1109/MLSP.2009.5306225",
language = "English (US)",
isbn = "9781424449484",
series = "Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009",
booktitle = "Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009",
}

TY - GEN

T1 - Hierarchical maximum entropy modeling for regression

AU - Zhang, Yanxin

AU - Miller, David Jonathan

AU - Kesidis, George

PY - 2009/12/1

Y1 - 2009/12/1

N2 - Maximum entropy/iterative scaling (ME/IS) models have been well developed for classification on categorical (discrete-field) feature spaces. In this paper, we propose a hierarchical maximum entropy regression (HMEreg) model in building a posterior model for continuous target, which encodes constraints in the hierarchical tree structures from both input features and target output variable. In ME models, the tradeoff between model bias and variance is found in the constraints encoded into the model - complex constraints give the model more representation capacity but may over-fit, whereas simple constraints may produce less over-fitting but may have much more model bias. We developed a greedy order-growing constraint search method to sequentially build constraints with flexible order based on likelihood gain on a validation set. Experiments showed the HMEreg model performed comparably to or better than other regression models, including generalized linear regression, multi-layer perceptron, support vector regression, and regression tree.

AB - Maximum entropy/iterative scaling (ME/IS) models have been well developed for classification on categorical (discrete-field) feature spaces. In this paper, we propose a hierarchical maximum entropy regression (HMEreg) model in building a posterior model for continuous target, which encodes constraints in the hierarchical tree structures from both input features and target output variable. In ME models, the tradeoff between model bias and variance is found in the constraints encoded into the model - complex constraints give the model more representation capacity but may over-fit, whereas simple constraints may produce less over-fitting but may have much more model bias. We developed a greedy order-growing constraint search method to sequentially build constraints with flexible order based on likelihood gain on a validation set. Experiments showed the HMEreg model performed comparably to or better than other regression models, including generalized linear regression, multi-layer perceptron, support vector regression, and regression tree.

UR - http://www.scopus.com/inward/record.url?scp=77950932380&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77950932380&partnerID=8YFLogxK

U2 - 10.1109/MLSP.2009.5306225

DO - 10.1109/MLSP.2009.5306225

M3 - Conference contribution

AN - SCOPUS:77950932380

SN - 9781424449484

T3 - Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009

BT - Machine Learning for Signal Processing XIX - Proceedings of the 2009 IEEE Signal Processing Society Workshop, MLSP 2009

ER -
