Abstract

Despite its notoriously slow learning time, backpropagation (BP) is one of the most widely used neural network training algorithms. Two major reasons for this slow convergence are the step-size problem and the flat spot problem. A simple modification, the expected source values (ESV) rule, has been proposed for speeding up the BP algorithm. We have extended the ESV rule by coupling it with a flat-spot removal strategy as well as incorporating a momentum term to combat the step-size problem. The resulting rule has shown dramatically improved learning time over standard BP, measured in training epochs. Two versions of the ESV modification have been developed in the literature, on-demand and up-front, but simulation results are given mostly for the on-demand case. Our results indicate that the up-front version works somewhat better than the on-demand version in terms of learning speed. We have also analyzed the interactions between the three modifications as they are used in various combinations.
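As context for the modifications the abstract describes, here is a minimal sketch of a BP weight update with a momentum term (against the step-size problem) and a constant offset added to the sigmoid derivative (a common flat-spot removal strategy). The function names, learning rate, and the 0.1 offset are illustrative assumptions; the ESV computation itself is not shown here.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(y, flat_spot=0.1):
    # Derivative expressed via the unit's output y = sigmoid(x).
    # Adding a small constant keeps the gradient nonzero when y
    # saturates near 0 or 1 -- the "flat spot" problem.
    return y * (1.0 - y) + flat_spot

def update_weight(w, grad, prev_delta, lr=0.5, momentum=0.9):
    # Standard BP step plus a momentum term: a fraction of the
    # previous weight change is carried forward, smoothing the
    # step-size problem on shallow or oscillating error surfaces.
    delta = -lr * grad + momentum * prev_delta
    return w + delta, delta
```

For example, `update_weight(0.2, 0.1, 0.0)` takes a plain gradient step of -0.05, while a saturated unit with output 1.0 still receives a gradient factor of 0.1 from `sigmoid_prime` instead of zero.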

Original language: English (US)
Pages (from-to): 29-34
Number of pages: 6
Journal: Microcomputer Applications
Volume: 17
Issue number: 2
State: Published - Dec 1 1998

All Science Journal Classification (ASJC) codes

  • Computer Science (all)

