Training recurrent neural networks with temporal input encodings

C. W. Omlin, C. L. Giles, B. G. Horne, L. R. Leerink, T. Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

We investigate the learning of deterministic finite-state automata (DFAs) using recurrent networks with a single input neuron, where each input symbol is represented as a temporal pattern and strings as sequences of temporal patterns. We empirically demonstrate that obvious temporal encodings can make learning very difficult or even impossible. Based on preliminary results, we formulate some hypotheses about 'good' temporal encodings, i.e., encodings that do not significantly increase training time compared to training networks with multiple input neurons.
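The two input schemes the abstract contrasts can be illustrated with a small sketch. The specific temporal patterns below are illustrative assumptions; the paper does not reproduce its exact encodings here.

```python
# Sketch of the two DFA input encodings contrasted in the abstract.
# The 2-step temporal patterns are assumed for illustration only.

# Multi-input (spatial) encoding: one input neuron per alphabet symbol,
# one time step per symbol (one-hot vector across the input neurons).
def spatial_encoding(string, alphabet="01"):
    """Each symbol becomes a one-hot vector over len(alphabet) neurons."""
    return [[1.0 if sym == a else 0.0 for a in alphabet] for sym in string]

# Temporal encoding: a single input neuron, each symbol expanded into a
# short temporal pattern, so a string becomes one long scalar sequence.
TEMPORAL_PATTERNS = {"0": [1.0, 0.0], "1": [0.0, 1.0]}  # assumed patterns

def temporal_encoding(string):
    """Each symbol becomes a sub-sequence of scalar inputs over time."""
    seq = []
    for sym in string:
        seq.extend(TEMPORAL_PATTERNS[sym])
    return seq

print(spatial_encoding("01"))   # 2 neurons, 2 time steps
print(temporal_encoding("01"))  # 1 neuron, 4 time steps
```

Note that the temporal encoding trades input width for sequence length, which is one reason training time can differ sharply between the two schemes.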

Original language: English (US)
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Publisher: IEEE
Pages: 1267-1272
Number of pages: 6
Volume: 2
State: Published - 1994
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: Jun 27 1994 - Jun 29 1994


Fingerprint

  • Recurrent neural networks
  • Neurons
  • Finite automata

All Science Journal Classification (ASJC) codes

  • Software

Cite this

Omlin, C. W., Giles, C. L., Horne, B. G., Leerink, L. R., & Lin, T. (1994). Training recurrent neural networks with temporal input encodings. In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 2, pp. 1267-1272). IEEE.