Second-order recurrent neural networks for grammatical inference

C. L. Giles, D. Chen, C. B. Miller, H. H. Chen, G. Z. Sun, Y. C. Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

27 Citations (Scopus)

Abstract

It is shown that a recurrent, second-order neural network using a real-time, feed-forward training algorithm readily learns to infer regular grammars from positive and negative string training samples. Numerous simulations that show the effect of initial conditions, training set size and order, and neuron architecture are presented. All simulations were performed with random initial weight strengths and usually converged after approximately a hundred epochs of training. The authors discuss a quantization algorithm for dynamically extracting finite-state automata during and after training. For a well-trained neural net, the extracted automata constitute an equivalence class of state machines that are reducible to the minimal machine of the inferred grammar. It is then shown through simulations that many of the neural net state machines are dynamically stable and correctly classify long unseen strings.
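The abstract describes a second-order recurrent network whose next state is a quadratic function of the current state and the current input symbol, plus a quantization step that maps the analog state trajectory onto the discrete states of an extracted automaton. The sketch below illustrates only those two ideas; it is not the authors' code. The network size, readout threshold, initial state, and quantization level are assumptions, the weights are random and untrained, and the paper's real-time training algorithm is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATE = 4   # number of state neurons (assumed; the paper varies this)
N_INPUT = 2   # alphabet size, e.g. {0, 1} for binary regular languages

# Random initial weight strengths, as in the reported simulations (untrained here).
W = rng.normal(scale=0.5, size=(N_STATE, N_STATE, N_INPUT))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(state, symbol):
    """One second-order update: S_i(t+1) = g(sum_jk W[i,j,k] * S_j(t) * I_k(t))."""
    one_hot = np.eye(N_INPUT)[symbol]
    return sigmoid(np.einsum('ijk,j,k->i', W, state, one_hot))

def run(string, init=0.1, threshold=0.5):
    """Process a symbol string; one designated neuron is read out as accept/reject."""
    state = np.full(N_STATE, init)
    trajectory = [state]
    for sym in string:
        state = step(state, sym)
        trajectory.append(state)
    return state[0] > threshold, trajectory

def extract_states(trajectory, q=2):
    """Quantize each analog state vector into q bins per neuron; the distinct
    quantized vectors become candidate states of an extracted finite automaton."""
    return {tuple(np.minimum((np.asarray(s) * q).astype(int), q - 1))
            for s in trajectory}

accepted, trajectory = run([0, 1, 1, 0])
print(accepted, extract_states(trajectory))
```

With trained weights, running many strings and recording which quantized state follows which (state, symbol) pair yields a transition table; the paper reports that such extracted machines reduce to the minimal automaton of the inferred grammar.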

Original language: English (US)
Title of host publication: Proceedings. IJCNN - International Joint Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Pages: 273-281
Number of pages: 9
ISBN (Print): 0780301641
State: Published - Jan 1 1992
Event: International Joint Conference on Neural Networks - IJCNN-91-Seattle - Seattle, WA, USA
Duration: Jul 8 1991 - Jul 12 1991

Publication series

Name: Proceedings. IJCNN - International Joint Conference on Neural Networks

Other

Other: International Joint Conference on Neural Networks - IJCNN-91-Seattle
City: Seattle, WA, USA
Period: 7/8/91 - 7/12/91

Fingerprint

Recurrent neural networks
Neural networks
Equivalence classes
Finite automata
Neurons

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Giles, C. L., Chen, D., Miller, C. B., Chen, H. H., Sun, G. Z., & Lee, Y. C. (1992). Second-order recurrent neural networks for grammatical inference. In Anon (Ed.), Proceedings. IJCNN - International Joint Conference on Neural Networks (pp. 273-281). (Proceedings. IJCNN - International Joint Conference on Neural Networks). Publ by IEEE.