### Abstract

Recurrent neural networks that are trained to behave like deterministic finite-state automata (DFAs) can show deteriorating performance when tested on long strings. This deteriorating performance can be attributed to the instability of the internal representation of the learned DFA states. The use of a sigmoidal discriminant function together with the recurrent structure contributes to this instability. We prove that a simple algorithm can construct second-order recurrent neural networks with a sparse interconnection topology and sigmoidal discriminant function such that the internal DFA state representations are stable, that is, the constructed network correctly classifies strings of arbitrary length. The algorithm is based on encoding strengths of weights directly into the neural network. We derive a relationship between the weight strength and the number of DFA states for robust string classification. For a DFA with n states and m input alphabet symbols, the constructive algorithm generates a "programmed" neural network with O(n) neurons and O(mn) weights. We compare our algorithm to other methods proposed in the literature.
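The kind of construction the abstract describes can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the weight strength `H = 8.0`, the bias choice `-H/2`, the function names, and the two-state parity DFA in the usage example are all assumptions made for this illustration. One state neuron is allocated per DFA state, and a large second-order weight `W[i][j][k] = +H` is programmed wherever the DFA maps state `j` on symbol `k` to state `i`.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def program_network(delta, n_states, n_symbols, H=8.0):
    """Encode DFA transitions into second-order weights W[i][j][k]:
    W[i][j][k] = +H exactly when delta maps state j on symbol k to
    state i. Every state neuron gets bias -H/2, so a neuron only
    switches on (activation near 1) when it is driven by the
    currently active state neuron; all others stay near 0."""
    W = [[[H if delta.get((j, k)) == i else 0.0
           for k in range(n_symbols)]
          for j in range(n_states)]
         for i in range(n_states)]
    b = [-H / 2.0] * n_states
    return W, b

def classify(W, b, start, accepting, symbols, n_states):
    """Run the programmed network on a string of symbol indices and
    report acceptance (some accepting-state neuron ends up high)."""
    # one-hot initial state vector: only the start-state neuron is on
    S = [1.0 if i == start else 0.0 for i in range(n_states)]
    for k in symbols:
        # second-order update: S_i(t+1) = sigmoid(b_i + sum_j W[i][j][k] S_j(t))
        S = [sigmoid(b[i] + sum(W[i][j][k] * S[j]
                                for j in range(n_states)))
             for i in range(n_states)]
    return any(S[i] > 0.5 for i in accepting)
```

For example, a two-state parity DFA over {0, 1} (accept strings with an even number of 1s) stays correctly classified even on long inputs, because the high and low activations settle to stable fixed points of the sigmoid rather than drifting:

```python
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
W, b = program_network(delta, n_states=2, n_symbols=2)
classify(W, b, start=0, accepting={0}, symbols=[1] * 100, n_states=2)  # even 1s: accept
```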

| Original language | English (US) |
|---|---|
| Pages (from-to) | 937-972 |
| Number of pages | 36 |
| Journal | Journal of the ACM |
| Volume | 43 |
| Issue number | 6 |
| DOIs | https://doi.org/10.1145/235809.235811 |
| ISSN | 0004-5411 |
| State | Published - Nov 1996 |


### All Science Journal Classification (ASJC) codes

- Software
- Control and Systems Engineering
- Information Systems
- Hardware and Architecture
- Artificial Intelligence

### Cite this

Omlin, C. W., & Giles, C. L. (1996). Constructing deterministic finite-state automata in recurrent neural networks. *Journal of the ACM*, *43*(6), 937-972. https://doi.org/10.1145/235809.235811

Research output: Contribution to journal › Article
