Constructive learning of recurrent neural networks

D. Chen, C. L. Giles, G. Z. Sun, H. H. Chen, Y. C. Lee, M. W. Goudreau

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata [Cleeremans 89, Giles 92a, Minsky 67, Pollack 91, Siegelmann 92]. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, while versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We present a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.
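
The abstract describes the constructive method only at a high level. As a rough sketch of the general idea, not the authors' algorithm, the fragment below grows a fully recurrent network one state unit at a time on a toy regular language. Every name here (init_net, grow, train, the parity task, the thresholds) is invented for illustration, and the random-search train merely stands in for the gradient training a real implementation would use.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_state, n_in):
    # Fully recurrent layer: every state unit sees every state unit,
    # every input unit, and a bias term.
    return 0.1 * rng.standard_normal((n_state, n_state + n_in + 1))

def step(W, h, x):
    # One recurrent state update.
    return np.tanh(W @ np.concatenate([h, x, [1.0]]))

def run(W, xs):
    # Run a whole string; read the first state unit as the accept signal.
    h = np.zeros(W.shape[0])
    for x in xs:
        h = step(W, h, x)
    return h[0]

def error(W, data):
    return float(np.mean([(run(W, xs) - y) ** 2 for xs, y in data]))

def train(W, data, iters=2000, scale=0.05):
    # Placeholder optimizer (random hill climbing); a real implementation
    # would use a gradient method for recurrent networks instead.
    best = error(W, data)
    for _ in range(iters):
        cand = W + scale * rng.standard_normal(W.shape)
        e = error(cand, data)
        if e < best:
            W, best = cand, e
    return W, best

def grow(W, n_in):
    # Add one state unit while staying fully recurrent: embed the old
    # weights in a larger matrix, leaving small random values in the
    # new row and the new state column.
    n = W.shape[0]
    W2 = 0.1 * rng.standard_normal((n + 1, n + 1 + n_in + 1))
    W2[:n, :n] = W[:, :n]        # old state-to-state block
    W2[:n, n + 1:] = W[:, n:]    # old input and bias columns, shifted
    return W2

# Toy regular language: parity of a bit string (an illustrative grammar;
# the paper's own benchmarks may differ).
data = []
for _ in range(40):
    bits = rng.integers(0, 2, size=int(rng.integers(1, 8)))
    xs = [np.array([float(b)]) for b in bits]
    data.append((xs, float(bits.sum() % 2)))

W = init_net(1, 1)
for n_units in range(1, 6):
    W, err = train(W, data)
    print(f"{n_units} state unit(s): squared error {err:.4f}")
    if err < 0.01:          # arbitrary stopping threshold
        break
    W = grow(W, 1)          # error plateaued: add a neuron, keep training
```

The point of grow is the design choice the abstract emphasizes: the new unit connects to and from every existing state unit, so the network stays fully recurrent rather than cascaded as in Recurrent Cascade Correlation.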

Original language: English (US)
Title of host publication: 1993 IEEE International Conference on Neural Networks, ICNN 1993
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1196-1201
Number of pages: 6
ISBN (Electronic): 0780309995
DOIs: https://doi.org/10.1109/ICNN.1993.298727
State: Published - Jan 1 1993
Event: IEEE International Conference on Neural Networks, ICNN 1993 - San Francisco, United States
Duration: Mar 28, 1993 – Apr 1, 1993

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
Volume: 1993-January
ISSN (Print): 1098-7576

Other

Other: IEEE International Conference on Neural Networks, ICNN 1993
Country: United States
City: San Francisco
Period: 3/28/93 – 4/1/93

Fingerprint

  • Correlation methods
  • Recurrent neural networks
  • Finite automata
  • Neurons
  • Neural networks

All Science Journal Classification (ASJC) codes

  • Software

Cite this

Chen, D., Giles, C. L., Sun, G. Z., Chen, H. H., Lee, Y. C., & Goudreau, M. W. (1993). Constructive learning of recurrent neural networks. In 1993 IEEE International Conference on Neural Networks, ICNN 1993 (pp. 1196-1201). [298727] (IEEE International Conference on Neural Networks - Conference Proceedings; Vol. 1993-January). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICNN.1993.298727
Chen, D. ; Giles, C. L. ; Sun, G. Z. ; Chen, H. H. ; Lee, Y. C. ; Goudreau, M. W. / Constructive learning of recurrent neural networks. 1993 IEEE International Conference on Neural Networks, ICNN 1993. Institute of Electrical and Electronics Engineers Inc., 1993. pp. 1196-1201 (IEEE International Conference on Neural Networks - Conference Proceedings).
@inproceedings{a4499e5540114554bdbafb6bf078cbb3,
title = "Constructive learning of recurrent neural networks",
abstract = "Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata [Cleeremans 89, Giles 92a, Minsky 67, Pollack 91, Siegelmann 92]. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, while versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We present a preliminary approach to overcoming these limitations by devising a {"}simple{"} constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.",
author = "D. Chen and Giles, {C. L.} and Sun, {G. Z.} and Chen, {H. H.} and Lee, {Y. C.} and Goudreau, {M. W.}",
year = "1993",
month = "1",
day = "1",
doi = "10.1109/ICNN.1993.298727",
language = "English (US)",
series = "IEEE International Conference on Neural Networks - Conference Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "1196--1201",
booktitle = "1993 IEEE International Conference on Neural Networks, ICNN 1993",
address = "United States",

}

Chen, D, Giles, CL, Sun, GZ, Chen, HH, Lee, YC & Goudreau, MW 1993, Constructive learning of recurrent neural networks. in 1993 IEEE International Conference on Neural Networks, ICNN 1993., 298727, IEEE International Conference on Neural Networks - Conference Proceedings, vol. 1993-January, Institute of Electrical and Electronics Engineers Inc., pp. 1196-1201, IEEE International Conference on Neural Networks, ICNN 1993, San Francisco, United States, 3/28/93. https://doi.org/10.1109/ICNN.1993.298727

Constructive learning of recurrent neural networks. / Chen, D.; Giles, C. L.; Sun, G. Z.; Chen, H. H.; Lee, Y. C.; Goudreau, M. W.

1993 IEEE International Conference on Neural Networks, ICNN 1993. Institute of Electrical and Electronics Engineers Inc., 1993. p. 1196-1201 298727 (IEEE International Conference on Neural Networks - Conference Proceedings; Vol. 1993-January).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Constructive learning of recurrent neural networks

AU - Chen, D.

AU - Giles, C. L.

AU - Sun, G. Z.

AU - Chen, H. H.

AU - Lee, Y. C.

AU - Goudreau, M. W.

PY - 1993/1/1

Y1 - 1993/1/1

N2 - Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata [Cleeremans 89, Giles 92a, Minsky 67, Pollack 91, Siegelmann 92]. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, while versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We present a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.

AB - Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata [Cleeremans 89, Giles 92a, Minsky 67, Pollack 91, Siegelmann 92]. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, while versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We present a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.

UR - http://www.scopus.com/inward/record.url?scp=84943266293&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84943266293&partnerID=8YFLogxK

U2 - 10.1109/ICNN.1993.298727

DO - 10.1109/ICNN.1993.298727

M3 - Conference contribution

AN - SCOPUS:84943266293

T3 - IEEE International Conference on Neural Networks - Conference Proceedings

SP - 1196

EP - 1201

BT - 1993 IEEE International Conference on Neural Networks, ICNN 1993

PB - Institute of Electrical and Electronics Engineers Inc.

ER -

Chen D, Giles CL, Sun GZ, Chen HH, Lee YC, Goudreau MW. Constructive learning of recurrent neural networks. In 1993 IEEE International Conference on Neural Networks, ICNN 1993. Institute of Electrical and Electronics Engineers Inc. 1993. p. 1196-1201. 298727. (IEEE International Conference on Neural Networks - Conference Proceedings). https://doi.org/10.1109/ICNN.1993.298727