Remembering the past: The role of embedded memory in recurrent neural network architectures

C. Lee Giles, Tsungnan Lin, Bill G. Horne

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

There has been much interest in learning long-term temporal dependencies with neural networks. Adequately learning such long-term information is useful in many problems in signal processing, control, and prediction. One class of recurrent neural networks (RNNs), NARX neural networks, was shown to perform much better than other recurrent neural networks when learning simple long-term dependency problems. The intuitive explanation is that the output memories of a NARX network manifest themselves as jump-ahead connections in the time-unfolded network. Here we show that similar improvements in learning long-term dependencies can be achieved in other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. Experiments with locally recurrent networks and NARX (output feedback) networks show that all of these classes of network architectures achieve a significant improvement in learning long-term dependencies as the order of embedded memory is increased, all other things held constant. These results are important for a user comfortable with a specific recurrent neural network architecture, because simply increasing the embedded memory order of that architecture will make it more robust to the problem of long-term dependency learning.
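The embedded output memory the abstract refers to can be sketched as a NARX-style recurrence y(t) = f(x(t), ..., x(t-p+1), y(t-1), ..., y(t-d)), where d is the order of the output memory. The following is a minimal illustrative sketch, not the paper's implementation; all function names, the weight initialization, and the tanh network are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_narx(n_in_taps, n_out_taps, n_hidden):
    """Randomly initialize a single-hidden-layer NARX-style network.

    n_out_taps is the order of the embedded output memory: the number of
    delayed outputs y(t-1), ..., y(t-n_out_taps) fed back as extra inputs.
    (Illustrative sketch; not the architecture details from the paper.)
    """
    n_in = n_in_taps + n_out_taps
    return {
        "W1": rng.normal(0.0, 0.5, (n_hidden, n_in)),  # input-to-hidden weights
        "b1": np.zeros(n_hidden),                      # hidden biases
        "w2": rng.normal(0.0, 0.5, n_hidden),          # hidden-to-output weights
        "b2": 0.0,                                     # output bias
    }

def narx_run(net, x, n_in_taps, n_out_taps):
    """Run the recurrence y(t) = f(x(t..t-p+1), y(t-1..t-d)) over a sequence."""
    y = np.zeros(len(x))
    for t in range(len(x)):
        # Tapped delay line on the input (taps before t=0 are zero-padded).
        x_taps = [x[t - i] if t - i >= 0 else 0.0 for i in range(n_in_taps)]
        # Embedded output memory: the d most recent outputs, fed back.
        y_taps = [y[t - i] if t - i >= 0 else 0.0
                  for i in range(1, n_out_taps + 1)]
        z = np.concatenate([x_taps, y_taps])
        h = np.tanh(net["W1"] @ z + net["b1"])
        y[t] = np.tanh(net["w2"] @ h + net["b2"])
    return y

# Example: order-5 output memory on a short sinusoidal input.
net = make_narx(n_in_taps=2, n_out_taps=5, n_hidden=8)
y = narx_run(net, np.sin(np.linspace(0.0, 6.0, 50)), n_in_taps=2, n_out_taps=5)
```

In the time-unfolded network, each of the d feedback taps is a connection that skips over d-1 intermediate steps, which is the "jump-ahead" structure the abstract credits with shortening the paths over which gradients must propagate.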

Original language: English (US)
Title of host publication: Neural Networks for Signal Processing - Proceedings of the IEEE Workshop
Publisher: IEEE
Pages: 34-43
Number of pages: 10
State: Published - 1997
Event: Proceedings of the 1997 7th IEEE Workshop on Neural Networks for Signal Processing, NNSP'97 - Amelia Island, FL, USA
Duration: Sep 24 1997 - Sep 26 1997


All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Software
  • Electrical and Electronic Engineering


Cite this

Giles, C. L., Lin, T., & Horne, B. G. (1997). Remembering the past: The role of embedded memory in recurrent neural network architectures. In Neural Networks for Signal Processing - Proceedings of the IEEE Workshop (pp. 34-43). IEEE.