Technical efficiency-based selection of learning cases to improve forecasting accuracy of neural networks under monotonicity assumption

Parag C. Pendharkar, James A. Rodger

Research output: Contribution to journal › Article › peer-review

67 Scopus citations

Abstract

In this paper, we show that when an artificial neural network (ANN) model is used to learn monotonic forecasting functions, it can be useful to screen the training data so that the retained examples approximately satisfy the monotonicity property. We show how a technical efficiency-based ranking, obtained with the data envelopment analysis (DEA) model and a predetermined threshold efficiency, can be used to identify a subset of training examples that approximately satisfies the monotonicity property. Using a health care forecasting problem, the monotonicity assumption, and a predetermined threshold efficiency level, we apply DEA to split the training data into two mutually exclusive subsets, "efficient" and "inefficient". We then compare the performance of ANNs trained on each subset. Our results indicate that an ANN trained on the "efficient" training data subset achieves higher predictive performance than an ANN trained on the "inefficient" training data subset.
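The screening step described in the abstract can be sketched in code. The following is a minimal illustration only, assuming an input-oriented CCR DEA model solved with `scipy.optimize.linprog` and a hypothetical threshold of 0.9; the abstract does not specify which DEA formulation or threshold level the authors actually used.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented CCR DEA efficiency score for each case (DMU).

    X: (n, m) array of inputs, Y: (n, s) array of outputs, strictly positive.
    Returns an array of scores in (0, 1]; a score of 1 means the case is
    technically efficient relative to the other cases.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.zeros(1 + n)
        c[0] = 1.0
        # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Output constraints: sum_j lambda_j * y_rj >= y_ro,
        # written as -sum_j lambda_j * y_rj <= -y_ro for linprog.
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (1 + n))
        scores[o] = res.x[0]
    return scores

def split_training_data(X, Y, threshold=0.9):
    """Split cases into 'efficient' and 'inefficient' subsets by DEA score.

    The threshold value is a placeholder for the paper's predetermined
    threshold efficiency level.
    """
    scores = dea_efficiency(X, Y)
    efficient_mask = scores >= threshold
    return efficient_mask, scores
```

Under this sketch, one ANN would be trained on the rows where `efficient_mask` is true and another on the remaining rows, and their out-of-sample forecasting accuracy compared, mirroring the experimental design the abstract describes.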

Original language: English (US)
Pages (from-to): 117-136
Number of pages: 20
Journal: Decision Support Systems
Volume: 36
Issue number: 1
State: Published - Sep 2003

All Science Journal Classification (ASJC) codes

  • Management Information Systems
  • Information Systems
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Information Systems and Management
