Constructive learning of recurrent neural network

D. Chen, C. L. Giles, G. Z. Sun, H. H. Chen, Y. C. Lee, M. W. Goudreau

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, although versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We give a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.
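The abstract's central idea, growing a fully recurrent network neuron by neuron as training stalls, can be illustrated with a short sketch. The code below is a minimal illustration under assumptions of ours, not the authors' algorithm: the parity task, the hill-climbing placeholder trainer, and all names (FullyRecurrentNet, add_neuron, and so on) are hypothetical stand-ins, since this record does not specify the paper's actual training procedure.

```python
# Sketch: grow a *fully* recurrent network when learning stalls, preserving
# previously learned weights. Task and trainer are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FullyRecurrentNet:
    def __init__(self, n_in, n_state):
        self.n_in, self.n_state = n_in, n_state
        # Each state neuron sees every state neuron and every input.
        self.W = rng.normal(scale=0.5, size=(n_state, n_state + n_in))

    def run(self, seq):
        s = np.zeros(self.n_state)
        for x in seq:
            s = sigmoid(self.W @ np.concatenate([s, x]))
        return s[0]  # unit 0 doubles as the accept/reject output

    def add_neuron(self):
        # Grow the state by one neuron, fully connected in both directions,
        # copying all existing weights so earlier learning is preserved.
        n = self.n_state
        W_new = rng.normal(scale=0.1, size=(n + 1, n + 1 + self.n_in))
        W_new[:n, :n] = self.W[:, :n]      # old state -> old state block
        W_new[:n, n + 1:] = self.W[:, n:]  # old input -> old state block
        self.W, self.n_state = W_new, n + 1

def loss(net, seqs, labels):
    return np.mean([(net.run(s) - y) ** 2 for s, y in zip(seqs, labels)])

def hill_climb(net, seqs, labels, iters=300, step=0.3):
    # Placeholder trainer (random hill climbing); the paper would use a
    # gradient method, which this record does not detail.
    best = loss(net, seqs, labels)
    for _ in range(iters):
        old = net.W
        net.W = old + rng.normal(scale=step, size=old.shape)
        new = loss(net, seqs, labels)
        if new < best:
            best = new
        else:
            net.W = old  # revert the unhelpful perturbation
    return best

# Toy regular language: strings over {0, 1}, label = parity of the 1s.
seqs = [rng.integers(0, 2, size=rng.integers(2, 8)) for _ in range(40)]
seqs = [s.reshape(-1, 1).astype(float) for s in seqs]
labels = [float(int(s.sum()) % 2) for s in seqs]

net = FullyRecurrentNet(n_in=1, n_state=1)
prev = float("inf")
for _ in range(30):                            # cap the outer loop
    cur = hill_climb(net, seqs, labels)
    if cur < 0.02:                             # learned well enough
        break
    if prev - cur < 1e-3 and net.n_state < 6:  # stalled: grow the net
        net.add_neuron()
    prev = cur
print(f"final size: {net.n_state} state neurons, loss {cur:.4f}")
```

The step that reflects the abstract's claim is add_neuron, which embeds the old weight matrix inside a larger one so that every neuron stays connected to every other neuron, in contrast to cascade-style architectures that freeze earlier units.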

Original language: English (US)
Title of host publication: 1993 IEEE International Conference on Neural Networks
Publisher: Publ by IEEE
Pages: 1192-1201
Number of pages: 10
ISBN (Print): 0780312007
State: Published - Jan 1 1993
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, California, USA
Duration: Mar 28 1993 - Apr 1 1993

Publication series

Name: 1993 IEEE International Conference on Neural Networks

Other

Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, California, USA
Period: 3/28/93 - 4/1/93

Fingerprint

Correlation methods
Recurrent neural networks
Finite automata
Neurons
Automata
Neural networks

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Chen, D., Giles, C. L., Sun, G. Z., Chen, H. H., Lee, Y. C., & Goudreau, M. W. (1993). Constructive learning of recurrent neural network. In 1993 IEEE International Conference on Neural Networks (pp. 1192-1201). (1993 IEEE International Conference on Neural Networks). Publ by IEEE.
Chen, D. ; Giles, C. L. ; Sun, G. Z. ; Chen, H. H. ; Lee, Y. C. ; Goudreau, M. W. / Constructive learning of recurrent neural network. 1993 IEEE International Conference on Neural Networks. Publ by IEEE, 1993. pp. 1192-1201 (1993 IEEE International Conference on Neural Networks).
@inproceedings{a9a375eeb9934d7f854bc49bfeffd86f,
title = "Constructive learning of recurrent neural network",
abstract = "Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, although versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We give a preliminary approach to overcoming these limitations by devising a ``simple'' constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.",
author = "D. Chen and Giles, {C. L.} and Sun, {G. Z.} and Chen, {H. H.} and Lee, {Y. C.} and Goudreau, {M. W.}",
year = "1993",
month = "1",
day = "1",
language = "English (US)",
isbn = "0780312007",
series = "1993 IEEE International Conference on Neural Networks",
publisher = "Publ by IEEE",
pages = "1192--1201",
booktitle = "1993 IEEE International Conference on Neural Networks",

}

Chen, D, Giles, CL, Sun, GZ, Chen, HH, Lee, YC & Goudreau, MW 1993, Constructive learning of recurrent neural network. in 1993 IEEE International Conference on Neural Networks. 1993 IEEE International Conference on Neural Networks, Publ by IEEE, pp. 1192-1201, 1993 IEEE International Conference on Neural Networks, San Francisco, California, USA, 3/28/93.

Constructive learning of recurrent neural network. / Chen, D.; Giles, C. L.; Sun, G. Z.; Chen, H. H.; Lee, Y. C.; Goudreau, M. W.

1993 IEEE International Conference on Neural Networks. Publ by IEEE, 1993. p. 1192-1201 (1993 IEEE International Conference on Neural Networks).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Constructive learning of recurrent neural network

AU - Chen, D.

AU - Giles, C. L.

AU - Sun, G. Z.

AU - Chen, H. H.

AU - Lee, Y. C.

AU - Goudreau, M. W.

PY - 1993/1/1

Y1 - 1993/1/1

N2 - Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, although versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We give a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.

AB - Recurrent neural networks are a natural model for learning and predicting temporal signals. In addition, simple recurrent networks have been shown to be both theoretically and experimentally capable of learning finite state automata. However, it is difficult to determine the minimal neural network structure for a particular automaton. A large recurrent network, although versatile in theory, proves in practice to be very difficult to train. Constructive or destructive recurrent methods might offer a solution to this problem. We prove that one current method, Recurrent Cascade Correlation, has fundamental limitations in representation and thus in its learning capabilities. We give a preliminary approach to overcoming these limitations by devising a "simple" constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. Through simulations we show that such a method can learn many types of regular grammars that the Recurrent Cascade Correlation method is unable to learn.

UR - http://www.scopus.com/inward/record.url?scp=0027187853&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027187853&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0027187853

SN - 0780312007

T3 - 1993 IEEE International Conference on Neural Networks

SP - 1192

EP - 1201

BT - 1993 IEEE International Conference on Neural Networks

PB - Publ by IEEE

ER -

Chen D, Giles CL, Sun GZ, Chen HH, Lee YC, Goudreau MW. Constructive learning of recurrent neural network. In 1993 IEEE International Conference on Neural Networks. Publ by IEEE. 1993. p. 1192-1201. (1993 IEEE International Conference on Neural Networks).