### Abstract

As the training set becomes arbitrarily large, appropriately configured and trained single hidden layer feedforward networks converge in probability to the smooth function they were trained to estimate. A bound on the probabilistic rate of convergence of these network estimates is given. The convergence rate is calculated as a function of the sample size n. If the function being estimated has square integrable mth order partial derivatives, then the L_2-norm estimation error approaches O_p(n^{-1/2}) for large m. Two steps are required to determine these bounds. First, a bound on the rate of convergence of approximations to an unknown smooth function by members of a special class of single hidden layer feedforward networks is determined. The class of networks considered can embed Fourier series; using this fact together with results on the approximation properties of Fourier series yields a bound on the L_2-norm approximation error. This bound is less than O(q^{-1/2}) for approximating a smooth function by networks with q hidden units. Second, a modification of existing results for bounding estimation error provides a general theorem for calculating estimation-error convergence rates. Combining this theorem with the bound on approximation rates yields the final convergence rates.
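As an illustration of the network class the abstract discusses (a minimal sketch, not the paper's construction): a single hidden layer feedforward network with q hidden units computes f(x) = Σ_{j=1}^{q} β_j ψ(a_j·x + b_j). Taking the activation ψ = cos makes each hidden unit a single Fourier term, which is the "can embed Fourier series" property the approximation bound relies on. All names below (`shlfn`, the random weights) are illustrative.

```python
import numpy as np

def shlfn(x, a, b, beta, activation=np.tanh):
    """Single hidden layer feedforward network:
        f(x) = sum_j beta_j * activation(a_j . x + b_j)
    x: (d,) input; a: (q, d) input weights; b: (q,) biases;
    beta: (q,) output weights; q = number of hidden units."""
    hidden = activation(a @ x + b)   # (q,) hidden-unit activations
    return float(beta @ hidden)      # scalar network output

# Example: q = 3 hidden units on a 2-dimensional input,
# with arbitrary (randomly drawn) weights for illustration.
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 2))
b = rng.normal(size=3)
beta = rng.normal(size=3)
y = shlfn(np.array([0.5, -1.0]), a, b, beta)
```

With `activation=np.cos`, a unit with weights a_j and phase b_j computes exactly the Fourier basis function cos(a_j·x + b_j), so any truncated Fourier series is a network of this form.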

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 147-158 |
| Number of pages | 12 |
| Journal | Neural Networks |
| Volume | 7 |
| Issue number | 1 |
| ISSN | 0893-6080 |
| DOIs | https://doi.org/10.1016/0893-6080(94)90063-9 |
| State | Published - Jan 1 1994 |


### All Science Journal Classification (ASJC) codes

- Cognitive Neuroscience
- Artificial Intelligence

### Cite this

McCaffrey, D. F., & Gallant, A. R. (1994). Convergence rates for single hidden layer feedforward networks. *Neural Networks*, *7*(1), 147-158. https://doi.org/10.1016/0893-6080(94)90063-9

Research output: Contribution to journal › Article
