Parameterized cross-validation for nonlinear regression models

Imhoi Koo, Namgil Lee, Rhee Man Kil

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

This paper presents a new method of cross-validation (CV) for nonlinear regression problems. In conventional CV methods, a validation set, that is, a part of the training data, is used to check the performance of learning. As a result, the trained regression models cannot utilize the whole training data and achieve less performance than expected for the given training data. In this context, we consider constructing a performance prediction model using the validation set to determine the optimal structure for the whole training data. We analyze risk bounds using VC dimension theory and suggest a parameterized form of risk estimates for the performance prediction model. As a result, we can estimate the optimal structure for the whole training data using the suggested CV method, referred to as the parameterized CV (p-CV) method. Through simulations of function approximation, we show the effectiveness of our approach.
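As a point of reference for the abstract, the sketch below illustrates the conventional CV procedure that p-CV is contrasted with: a k-fold split used to select the structure of a nonlinear regression model (here, the degree of a polynomial fit). The target function, data, and helper names are illustrative assumptions only; the paper's parameterized risk estimates are not reproduced here.

```python
import numpy as np

def kfold_cv_error(x, y, degree, k=5, seed=0):
    """Mean held-out MSE of a degree-`degree` polynomial fit over k folds.

    Hypothetical helper: conventional CV fits on k-1 folds and validates
    on the remaining fold, so each fit never sees the whole training set.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))          # shuffle before splitting into folds
    folds = np.array_split(idx, k)
    errors = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)    # all indices outside the held-out fold
        coeffs = np.polyfit(x[train], y[train], degree)  # fit on the training part
        pred = np.polyval(coeffs, x[fold])               # predict on the held-out part
        errors.append(np.mean((pred - y[fold]) ** 2))
    return float(np.mean(errors))

# Illustrative data: noisy samples of a nonlinear target function.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 60)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(60)

# Choose the model structure (degree) with the lowest CV risk estimate.
cv_errors = {d: kfold_cv_error(x, y, d) for d in range(1, 8)}
best_degree = min(cv_errors, key=cv_errors.get)
```

Note that each candidate model is trained on only k-1 of the k folds; the abstract's criticism is precisely that the structure selected this way is tuned to these reduced training sets rather than to the whole training data.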

Original language: English (US)
Pages (from-to): 3089-3095
Number of pages: 7
Journal: Neurocomputing
Volume: 71
Issue number: 16-18
DOIs: 10.1016/j.neucom.2008.04.043
State: Published - Oct 1 2008

Fingerprint

  • Nonlinear Dynamics
  • Learning

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

Cite this

Koo, Imhoi; Lee, Namgil; Kil, Rhee Man. Parameterized cross-validation for nonlinear regression models. In: Neurocomputing. 2008; Vol. 71, No. 16-18, pp. 3089-3095.
@article{7c4def986b3e49dc9e41e9c06dc1659f,
title = "Parameterized cross-validation for nonlinear regression models",
abstract = "This paper presents a new method of cross-validation (CV) for nonlinear regression problems. In conventional CV methods, a validation set, that is, a part of the training data, is used to check the performance of learning. As a result, the trained regression models cannot utilize the whole training data and achieve less performance than expected for the given training data. In this context, we consider constructing a performance prediction model using the validation set to determine the optimal structure for the whole training data. We analyze risk bounds using VC dimension theory and suggest a parameterized form of risk estimates for the performance prediction model. As a result, we can estimate the optimal structure for the whole training data using the suggested CV method, referred to as the parameterized CV (p-CV) method. Through simulations of function approximation, we show the effectiveness of our approach.",
author = "Imhoi Koo and Namgil Lee and Kil, {Rhee Man}",
year = "2008",
month = "10",
day = "1",
doi = "10.1016/j.neucom.2008.04.043",
language = "English (US)",
volume = "71",
pages = "3089--3095",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",
number = "16-18",

}


TY - JOUR

T1 - Parameterized cross-validation for nonlinear regression models

AU - Koo, Imhoi

AU - Lee, Namgil

AU - Kil, Rhee Man

PY - 2008/10/1

Y1 - 2008/10/1

AB - This paper presents a new method of cross-validation (CV) for nonlinear regression problems. In conventional CV methods, a validation set, that is, a part of the training data, is used to check the performance of learning. As a result, the trained regression models cannot utilize the whole training data and achieve less performance than expected for the given training data. In this context, we consider constructing a performance prediction model using the validation set to determine the optimal structure for the whole training data. We analyze risk bounds using VC dimension theory and suggest a parameterized form of risk estimates for the performance prediction model. As a result, we can estimate the optimal structure for the whole training data using the suggested CV method, referred to as the parameterized CV (p-CV) method. Through simulations of function approximation, we show the effectiveness of our approach.

UR - http://www.scopus.com/inward/record.url?scp=56549096545&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=56549096545&partnerID=8YFLogxK

U2 - 10.1016/j.neucom.2008.04.043

DO - 10.1016/j.neucom.2008.04.043

M3 - Article

AN - SCOPUS:56549096545

VL - 71

SP - 3089

EP - 3095

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

IS - 16-18

ER -