Neural Probabilistic Forecasting of Symbolic Sequences with Long Short-Term Memory

Michael Hauser, Yiwei Fu, Shashi Phoha, Asok Ray

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This paper makes use of long short-term memory (LSTM) neural networks for forecasting probability distributions of time series in terms of discrete symbols that are quantized from real-valued data. The developed framework formulates the forecasting problem into a probabilistic paradigm as h_Θ : X × Y → [0, 1] such that ∑_{y∈Y} h_Θ(x, y) = 1, where X is the finite-dimensional state space, Y is the symbol alphabet, and Θ is the set of model parameters. The proposed method is different from standard formulations (e.g., autoregressive moving average (ARMA)) of time series modeling. The main advantage of formulating the problem in the symbolic setting is that density predictions are obtained without any significantly restrictive assumptions (e.g., second-order statistics). The efficacy of the proposed method has been demonstrated by forecasting probability distributions on chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, resulting in a comprehensive evaluation of their relative performances.
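
As a concrete illustration of the formulation in the abstract, below is a minimal sketch (not the authors' implementation) of an LSTM that maps a window of quantized symbols to a categorical distribution over the symbol alphabet, i.e., h_Θ : X × Y → [0, 1] with ∑_{y∈Y} h_Θ(x, y) = 1. The class name, alphabet size, window length, and layer sizes are illustrative assumptions, not values from the paper.

# A minimal sketch, assuming PyTorch; hyperparameters are illustrative, not from the paper.
import torch
import torch.nn as nn

class SymbolicLSTMForecaster(nn.Module):
    def __init__(self, alphabet_size: int, embed_dim: int = 16, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(alphabet_size, embed_dim)  # symbol index -> vector
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, alphabet_size)      # logits over the alphabet Y

    def forward(self, symbols: torch.Tensor) -> torch.Tensor:
        # symbols: (batch, window) integer symbol indices drawn from the alphabet
        h, _ = self.lstm(self.embed(symbols))
        logits = self.head(h[:, -1, :])                       # use the last time step
        return torch.softmax(logits, dim=-1)                  # each row sums to 1

if __name__ == "__main__":
    torch.manual_seed(0)
    alphabet_size, window = 8, 20                             # assumed toy sizes
    model = SymbolicLSTMForecaster(alphabet_size)
    x = torch.randint(0, alphabet_size, (4, window))          # toy symbol windows
    probs = model(x)
    print(probs.shape, probs.sum(dim=-1))                     # (4, 8), each row ≈ 1.0

Training such a model by minimizing the negative log-likelihood (cross-entropy) of the observed next symbol yields the predicted probability distribution directly, without the second-order-statistics assumptions the abstract mentions.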

Original language: English (US)
Article number: 084502
Journal: Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME
Volume: 140
Issue number: 8
DOIs: 10.1115/1.4039281
State: Published - Aug 1 2018

Fingerprint

forecasting
alphabets
time series
probability distributions
autoregressive moving average
statistics
neural networks
formulations
long short-term memory
evaluation
predictions

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Information Systems
  • Instrumentation
  • Mechanical Engineering
  • Computer Science Applications

Cite this

@article{95283e5f33b547ae88724fe1af1d4e60,
title = "Neural Probabilistic Forecasting of Symbolic Sequences with Long Short-Term Memory",
abstract = "This paper makes use of long short-term memory (LSTM) neural networks for forecasting probability distributions of time series in terms of discrete symbols that are quantized from real-valued data. The developed framework formulates the forecasting problem into a probabilistic paradigm as h_Θ : X × Y → [0, 1] such that ∑_{y∈Y} h_Θ(x, y) = 1, where X is the finite-dimensional state space, Y is the symbol alphabet, and Θ is the set of model parameters. The proposed method is different from standard formulations (e.g., autoregressive moving average (ARMA)) of time series modeling. The main advantage of formulating the problem in the symbolic setting is that density predictions are obtained without any significantly restrictive assumptions (e.g., second-order statistics). The efficacy of the proposed method has been demonstrated by forecasting probability distributions on chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, resulting in a comprehensive evaluation of their relative performances.",
author = "Michael Hauser and Yiwei Fu and Shashi Phoha and Asok Ray",
year = "2018",
month = "8",
day = "1",
doi = "10.1115/1.4039281",
language = "English (US)",
volume = "140",
journal = "Journal of Dynamic Systems, Measurement and Control, Transactions of the ASME",
issn = "0022-0434",
publisher = "American Society of Mechanical Engineers (ASME)",
number = "8",

}
