Representation and induction of finite state machines using time-delay neural networks

Daniel S. Clouse, C. Lee Giles, Bill G. Horne, Garrison W. Cottrell

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

This work investigates the representational and inductive capabilities of time-delay neural networks (TDNNs) in general, and of two subclasses of TDNN: those with delays only on the inputs (IDNN), and those which also include delays on hidden units (HDNN). Both architectures are capable of representing the same class of languages, the definite memory machine (DMM) languages, but the delays on the hidden units in the HDNN help it outperform the IDNN on problems composed of repeated features over short time windows.
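As a minimal sketch of the language class the abstract names: a definite memory machine accepts or rejects a string based only on its last k input symbols, which mirrors an IDNN whose only memory is a tapped delay line of length k on the input. The function name, window encoding, and example language below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: a definite memory machine (DMM) decides membership
# using only the last k symbols of the input (its "definite memory").
# An IDNN with k input delay taps sees exactly this window at each step.

def dmm_accepts(string, k, accept_windows):
    """Accept iff the final k symbols form a window in accept_windows.

    Strings shorter than k are left-padded with '0' (an assumed
    convention for this illustration, not from the paper).
    """
    window = string[-k:].rjust(k, "0")
    return window in accept_windows

# Hypothetical example language: binary strings whose last two symbols
# are "01" or "11", i.e. strings ending in "1", with memory depth k = 2.
accepts = {"01", "11"}
print(dmm_accepts("10011", 2, accepts))  # True  (last window is "11")
print(dmm_accepts("1010", 2, accepts))   # False (last window is "10")
```

Because membership depends only on a fixed-length suffix, any history older than k symbols is irrelevant; this is why a purely feedforward network with input delays suffices to represent every DMM language.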

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 9 - Proceedings of the 1996 Conference, NIPS 1996
Publisher: Neural Information Processing Systems Foundation
Pages: 403-409
Number of pages: 7
ISBN (Print): 0262100657, 9780262100656
State: Published - Jan 1 1997
Event: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996 - Denver, CO, United States
Duration: Dec 2 1996 - Dec 5 1996

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258


All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

