Training recurrent neural networks with temporal input encodings

C. W. Omlin, C. L. Giles, B. G. Horne, L. R. Leerink, T. Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

We investigate the learning of deterministic finite-state automata (DFAs) with recurrent networks that have a single input neuron, where each input symbol is represented as a temporal pattern and strings as sequences of temporal patterns. We empirically demonstrate that obvious temporal encodings can make learning very difficult or even impossible. Based on preliminary results, we formulate some hypotheses about 'good' temporal encodings, i.e., encodings which do not significantly increase training time compared to training networks with multiple input neurons.
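To make the two input representations concrete, the sketch below contrasts a temporal encoding for a single-input-neuron network with the usual one-hot encoding over multiple input neurons. The specific temporal patterns used here are hypothetical illustrations; the paper does not fix particular patterns, and part of its point is that the choice matters.

```python
def encode_string_temporal(s, patterns):
    """Encode a symbol string as one flat sequence of scalar inputs
    for a single-input-neuron recurrent network: each symbol is
    replaced by its temporal pattern (a short list of input values),
    so a string of n symbols spans n * pattern_length time steps."""
    seq = []
    for sym in s:
        seq.extend(patterns[sym])
    return seq

def encode_string_onehot(s, alphabet):
    """Baseline encoding: one-hot vectors over multiple input neurons,
    one vector per symbol, one symbol per time step."""
    return [[1.0 if a == sym else 0.0 for a in alphabet] for sym in s]

# Hypothetical two-step temporal patterns for a binary alphabet
# (illustrative only; not taken from the paper):
patterns = {'0': [1.0, 0.0], '1': [0.0, 1.0]}

temporal = encode_string_temporal('011', patterns)  # 6 scalar time steps
onehot = encode_string_onehot('011', ['0', '1'])    # 3 vector time steps
```

Note that the temporal encoding stretches every string over more time steps, which is one plausible reason such encodings can be harder to learn than the multi-neuron baseline.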

Original language: English (US)
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Publisher: IEEE
Pages: 1267-1272
Number of pages: 6
Volume: 2
State: Published - 1994
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: Jun 27, 1994 - Jun 29, 1994

Other

Other: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7)
City: Orlando, FL, USA
Period: 6/27/94 - 6/29/94

All Science Journal Classification (ASJC) codes

  • Software
