### Abstract

This paper presents a new form of true risk bounds for the regression of real-valued functions. The goal of machine learning is to minimize the true risk (or generalization error) over the whole distribution of the sample space, not just over a finite set of training samples. However, the true risk cannot be estimated accurately from a finite number of samples. Accordingly, we derive a functional form of true risk bounds that can provide a useful guideline for the optimization of learning models. Through simulations of function approximation, we show that the prediction of true risk bounds based on the suggested functional form fits the empirical data well.
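The gap the abstract describes — empirical risk on a finite training set underestimating the true risk over the whole input distribution — can be illustrated with a small sketch. This is not the paper's method; the target function, noise level, polynomial degree, and sample sizes below are all illustrative assumptions, and the "true" risk is approximated by Monte Carlo on a large held-out sample.

```python
# Illustrative sketch (assumptions, not the paper's setup): compare the
# empirical risk of a fitted regression model on its training set with a
# Monte Carlo estimate of its true risk over the input distribution.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Assumed underlying function to be regressed.
    return np.sin(2 * np.pi * x)

def mse(coeffs, x, y):
    # Mean squared error of a polynomial model on data (x, y).
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Small noisy training sample.
n_train, noise = 20, 0.1
x_tr = rng.uniform(0.0, 1.0, n_train)
y_tr = target(x_tr) + noise * rng.standard_normal(n_train)

# Fit a degree-9 polynomial -- flexible enough to overfit 20 points.
model = np.polyfit(x_tr, y_tr, deg=9)

# Empirical risk: MSE on the training samples themselves.
emp_risk = mse(model, x_tr, y_tr)

# Monte Carlo estimate of the true risk: MSE on a large fresh sample
# drawn from the same distribution.
n_test = 100_000
x_te = rng.uniform(0.0, 1.0, n_test)
y_te = target(x_te) + noise * rng.standard_normal(n_test)
true_risk = mse(model, x_te, y_te)

print(f"empirical risk (training MSE): {emp_risk:.4f}")
print(f"true risk (Monte Carlo):       {true_risk:.4f}")
```

Because the flexible model partly fits the noise, the training MSE is systematically optimistic relative to the Monte Carlo estimate of the true risk, which is the motivation for bounding the true risk rather than reading it off the training error.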

| Original language | English (US) |
| --- | --- |
| Pages | 507-512 |
| Number of pages | 6 |
| State | Published - Sep 24 2003 |
| Event | International Joint Conference on Neural Networks 2003 - Portland, OR, United States; Duration: Jul 20 2003 → Jul 24 2003 |

### Conference

| Conference | International Joint Conference on Neural Networks 2003 |
| --- | --- |
| Country | United States |
| City | Portland, OR |
| Period | 7/20/03 → 7/24/03 |

### All Science Journal Classification (ASJC) codes

- Software
- Artificial Intelligence


## Cite this

Kil, R. M., & Koo, I. (2003). *True Risk Bounds for the Regression of Real-Valued Functions*. 507-512. Paper presented at International Joint Conference on Neural Networks 2003, Portland, OR, United States.