True Risk Bounds for the Regression of Real-Valued Functions

Rhee Man Kil, Imhoi Koo

Research output: Contribution to conference › Paper

1 Citation (Scopus)

Abstract

This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. We therefore derive a form of true risk bounds that may provide a useful guideline for the optimization of learning models. Through simulations of function approximation, we show that the prediction of the true risk bounds based on the suggested functional form fits the empirical data well.
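For context, the standard notions of true risk and empirical risk that the abstract contrasts can be sketched as follows (a minimal sketch under the usual squared-loss assumption; the specific functional form of the bounds derived in the paper is not reproduced here):

\[
R(f) \;=\; \mathbb{E}_{(x,y)\sim P}\big[(f(x)-y)^{2}\big]
\qquad \text{(true risk over the whole distribution } P\text{)}
\]
\[
\hat{R}_{n}(f) \;=\; \frac{1}{n}\sum_{i=1}^{n}\big(f(x_{i})-y_{i}\big)^{2}
\qquad \text{(empirical risk on the } n \text{ training samples)}
\]

A true risk bound typically controls \(R(f)\) in terms of \(\hat{R}_{n}(f)\) plus a confidence term that shrinks as \(n\) grows; the paper proposes a particular functional form for such bounds and validates it against simulations of function approximation.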

Original language: English (US)
Pages: 507-512
Number of pages: 6
State: Published - Sep 24 2003
Event: International Joint Conference on Neural Networks 2003 - Portland, OR, United States
Duration: Jul 20 2003 - Jul 24 2003

Conference

Conference: International Joint Conference on Neural Networks 2003
Country: United States
City: Portland, OR
Period: 7/20/03 - 7/24/03

Fingerprint

Learning systems

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence

Cite this

Kil, R. M., & Koo, I. (2003). True Risk Bounds for the Regression of Real-Valued Functions. 507-512. Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States.
Kil, Rhee Man ; Koo, Imhoi. / True Risk Bounds for the Regression of Real-Valued Functions. Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States. 6 p.
@conference{6d4bcd9575e344a4bea4d76695c30a12,
title = "True Risk Bounds for the Regression of Real-Valued Functions",
abstract = "This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. We therefore derive a form of true risk bounds that may provide a useful guideline for the optimization of learning models. Through simulations of function approximation, we show that the prediction of the true risk bounds based on the suggested functional form fits the empirical data well.",
author = "Kil, {Rhee Man} and Imhoi Koo",
year = "2003",
month = "9",
day = "24",
language = "English (US)",
pages = "507--512",
note = "International Joint Conference on Neural Networks 2003 ; Conference date: 20-07-2003 Through 24-07-2003",

}

Kil, RM & Koo, I 2003, 'True Risk Bounds for the Regression of Real-Valued Functions', Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States, 7/20/03 - 7/24/03, pp. 507-512.

True Risk Bounds for the Regression of Real-Valued Functions. / Kil, Rhee Man; Koo, Imhoi.

2003. 507-512. Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States.

Research output: Contribution to conference › Paper

TY - CONF

T1 - True Risk Bounds for the Regression of Real-Valued Functions

AU - Kil, Rhee Man

AU - Koo, Imhoi

PY - 2003/9/24

Y1 - 2003/9/24

N2 - This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. We therefore derive a form of true risk bounds that may provide a useful guideline for the optimization of learning models. Through simulations of function approximation, we show that the prediction of the true risk bounds based on the suggested functional form fits the empirical data well.

AB - This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. We therefore derive a form of true risk bounds that may provide a useful guideline for the optimization of learning models. Through simulations of function approximation, we show that the prediction of the true risk bounds based on the suggested functional form fits the empirical data well.

UR - http://www.scopus.com/inward/record.url?scp=0141573320&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0141573320&partnerID=8YFLogxK

M3 - Paper

SP - 507

EP - 512

ER -

Kil RM, Koo I. True Risk Bounds for the Regression of Real-Valued Functions. 2003. Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States.