Approximation rates for neural networks with general activation functions

Jonathan W. Siegel, Jinchao Xu

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

We prove some new results concerning the approximation rate of neural networks with general activation functions. Our first result concerns the rate of approximation of a two-layer neural network with a polynomially decaying, non-sigmoidal activation function. We extend the dimension-independent approximation rates previously obtained to this new class of activation functions. Our second result gives a weaker, but still dimension-independent, approximation rate for a larger class of activation functions, removing the polynomial decay assumption. This result applies to any bounded, integrable activation function. Finally, we show that a stratified sampling approach can be used to improve the approximation rate for polynomially decaying activation functions under mild additional assumptions.
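The stratified sampling idea mentioned in the abstract can be illustrated on a toy problem. The sketch below (an illustrative assumption, not the paper's construction) approximates a target function given by an integral over neuron weights, f(x) = ∫₀¹ tanh(wx) dw, by averaging n sampled neurons. Drawing weights by plain Monte Carlo gives a mean-squared error of order 1/n, while drawing one weight per sub-interval (stratification) typically gives a much faster rate, which is the kind of improvement the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: f(x) = integral over w in [0, 1] of tanh(w * x) dw.
# tanh is just a convenient bounded activation here; the paper's setting
# is more general.
def f(x, n_grid=100_000):
    # High-resolution midpoint rule, used as the "ground truth" value.
    w = (np.arange(n_grid) + 0.5) / n_grid
    return np.tanh(w * x).mean()

def mc_net(x, n, rng):
    # Plain Monte Carlo: n neuron weights drawn uniformly on [0, 1].
    w = rng.uniform(0.0, 1.0, size=n)
    return np.tanh(w * x).mean()

def stratified_net(x, n, rng):
    # Stratified sampling: one weight drawn uniformly from each
    # sub-interval [i/n, (i+1)/n), so the weights cover [0, 1] evenly.
    w = (np.arange(n) + rng.uniform(0.0, 1.0, size=n)) / n
    return np.tanh(w * x).mean()

x = 2.0
true_value = f(x)
n_neurons, trials = 64, 200

# Mean-squared error of each estimator over repeated trials.
err_mc = np.mean([(mc_net(x, n_neurons, rng) - true_value) ** 2
                  for _ in range(trials)])
err_st = np.mean([(stratified_net(x, n_neurons, rng) - true_value) ** 2
                  for _ in range(trials)])
print(err_mc, err_st)  # stratification typically gives a much smaller error
```

With 64 neurons, the stratified estimator's squared error is usually orders of magnitude below the plain Monte Carlo one, mirroring the improved approximation rate described above.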

Original language: English (US)
Pages (from-to): 313-321
Number of pages: 9
Journal: Neural Networks
Volume: 128
DOIs
State: Published - Aug 2020

All Science Journal Classification (ASJC) codes

  • Cognitive Neuroscience
  • Artificial Intelligence
