### Abstract

One of the issues in any learning model is how it scales with problem size. The problem of learning finite state machines (FSMs) from examples with recurrent neural networks has been extensively explored. However, these results are somewhat disappointing: the machines that can be learned are too small to be competitive with existing grammatical inference algorithms. We show that a type of recurrent neural network (Narendra & Parthasarathy, 1990, IEEE Trans. Neural Networks, 1, 4-27) which has feedback but no hidden state neurons can learn a special type of FSM called a finite memory machine (FMM) under certain constraints. These machines have a large number of states (simulations use 256- and 512-state FMMs) but have minimal order, relatively small depth, and little logic when the FMM is implemented as a sequential machine.
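The defining property of a finite memory machine is that its next output is a fixed function of only a bounded window of recent inputs and recent outputs — which is why a network with output feedback but no hidden state can, in principle, represent one. The following is a minimal illustrative sketch of that property (the `run_fmm` helper, the `parity_logic` example, and the choice of order are hypothetical, not from the paper):

```python
from collections import deque

def run_fmm(inputs, logic, order):
    """Simulate a finite memory machine: the next output depends only on
    the last `order` inputs and the last `order` outputs. `logic` maps
    (input_window, output_window) -> next output bit."""
    x_hist = deque([0] * order, maxlen=order)  # sliding input window
    y_hist = deque([0] * order, maxlen=order)  # sliding output window
    outputs = []
    for x in inputs:
        x_hist.append(x)
        y = logic(tuple(x_hist), tuple(y_hist))
        y_hist.append(y)
        outputs.append(y)
    return outputs

# Toy example: an order-2 machine whose output is the parity of its window.
def parity_logic(xs, ys):
    return (sum(xs) + sum(ys)) % 2

print(run_fmm([1, 0, 1, 1], parity_logic, order=2))  # -> [1, 0, 0, 0]
```

Because the windows have fixed length, the machine's state space is finite (at most 2^(2·order) distinct window configurations for binary signals), yet a feedforward map over the two windows fully determines its behavior — the structure the paper's feedback network exploits.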

Original language | English (US)
---|---
Pages (from-to) | 1359-1365
Number of pages | 7
Journal | Neural Networks
Volume | 8
Issue number | 9
DOIs | https://doi.org/10.1016/0893-6080(95)00041-0
State | Published - 1995


### All Science Journal Classification (ASJC) codes

- Cognitive Neuroscience
- Artificial Intelligence

### Cite this

Lee Giles, C., Horne, B. G., & Lin, T. (1995). Learning a class of large finite state machines with a recurrent neural network. *Neural Networks*, *8*(9), 1359-1365. https://doi.org/10.1016/0893-6080(95)00041-0

Research output: Contribution to journal › Article
