Stable Encoding of Large Finite-State Automata in Recurrent Neural Networks with Sigmoid Discriminants

Christian W. Omlin, C. Lee Giles

Research output: Contribution to journal › Article

30 Citations (Scopus)

Abstract

We propose an algorithm for encoding deterministic finite-state automata (DFAs) in second-order recurrent neural networks with a sigmoidal discriminant function, and we prove that the languages accepted by the constructed network and the DFA are identical. The desired finite-state network dynamics are achieved by programming a small subset of all weights. A worst-case analysis reveals a relationship between the weight strength and the maximum allowed network size that guarantees finite-state behavior of the constructed network. We illustrate the method by encoding random DFAs with 10, 100, and 1000 states. While the theory predicts that the weight strength scales with the DFA size, we find empirically that the weight strength is almost constant for all the random DFAs. These results can be explained by noting that the generated DFAs represent average cases. We empirically demonstrate the existence of extreme DFAs for which the weight strength scales with DFA size.
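
The abstract summarizes the construction without stating it explicitly. As a concrete illustration, the following minimal Python sketch encodes a DFA into a second-order recurrent network of the kind described, under these assumptions: the standard second-order dynamics S_i(t+1) = sigmoid(b_i + sum over j,k of W[i][j][k] * S_j(t) * I_k(t)) with one-hot inputs, transition weights programmed to +H or -H, biases of -H/2, and a dedicated response neuron that signals acceptance. The specific weight values, the choice H = 12, and all names in the code are illustrative assumptions drawn from the second-order RNN literature, not the authors' published algorithm or proofs.

import math
import random

def encode_dfa(n_states, alphabet_size, delta, accepting, H=12.0):
    """Program a second-order weight tensor W[i][j][k] and biases b for a DFA.

    delta[j][k] = i means DFA state q_j moves to q_i on input symbol a_k.
    Neuron 0 is a response neuron signalling acceptance; neurons 1..n_states
    represent the DFA states (the current state carries a high signal near 1).
    Only a small subset of the weights is programmed; all others stay at zero.
    """
    N = n_states + 1
    W = [[[0.0] * alphabet_size for _ in range(N)] for _ in range(N)]
    b = [-H / 2.0] * N                       # bias every neuron toward the low signal
    for j in range(n_states):
        for k in range(alphabet_size):
            i = delta[j][k]
            W[i + 1][j + 1][k] = +H          # drive the target state neuron high
            if i != j:
                W[j + 1][j + 1][k] = -H      # switch the source state neuron off
            # the response neuron mirrors acceptance of the target state
            W[0][j + 1][k] = +H if accepting[i] else -H
    return W, b

def run_network(W, b, start_state, symbols, n_states):
    """Feed a symbol sequence through the network; return the final response signal."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    N = n_states + 1
    S = [0.0] * N
    S[start_state + 1] = 1.0                 # one-hot encoding of the start state
    for k in symbols:                        # k is a symbol index (one-hot input)
        S = [sigmoid(b[i] + sum(W[i][j][k] * S[j] for j in range(N)))
             for i in range(N)]
    return S[0]                              # > 0.5 is read as "accept"

if __name__ == "__main__":
    random.seed(0)
    n, m = 10, 2                             # random 10-state DFA over a binary alphabet
    delta = [[random.randrange(n) for _ in range(m)] for _ in range(n)]
    accepting = [random.random() < 0.5 for _ in range(n)]
    W, b = encode_dfa(n, m, delta, accepting)
    for _ in range(200):                     # compare network and DFA on random strings
        s = [random.randrange(m) for _ in range(random.randrange(1, 30))]
        q = 0
        for k in s:
            q = delta[q][k]
        assert (run_network(W, b, 0, s, n) > 0.5) == accepting[q]
    print("network and DFA agree on all sampled strings")

Run as a script, the sketch builds a random 10-state DFA and checks that the network and the DFA classify the same sampled strings identically. The single parameter H plays the role of the weight strength whose scaling with DFA size the paper analyzes; a modest constant value suffices for a random DFA of this size, consistent with the abstract's empirical observation.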

Original language: English (US)
Pages (from-to): 675-696
Number of pages: 22
Journal: Neural Computation
Volume: 8
Issue number: 4
DOI: 10.1162/neco.1996.8.4.675
State: Published - May 15, 1996

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

Cite this

@article{e704b804ea7248d19b64f8d2a5aee3ee,
  title     = "Stable Encoding of Large Finite-State Automata in Recurrent Neural Networks with Sigmoid Discriminants",
  author    = "Omlin, {Christian W.} and Giles, {C. Lee}",
  journal   = "Neural Computation",
  issn      = "0899-7667",
  publisher = "MIT Press Journals",
  volume    = "8",
  number    = "4",
  pages     = "675--696",
  year      = "1996",
  month     = "5",
  day       = "15",
  doi       = "10.1162/neco.1996.8.4.675",
  language  = "English (US)",
}
