Deep Learning with Long Short-Term Memory for Time Series Prediction

Yuxiu Hua, Zhifeng Zhao, Rongpeng Li, Xianfu Chen, Zhiming Liu, Honggang Zhang

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning the long-range dependencies embedded in time series is often an obstacle for most algorithms, whereas long short-term memory (LSTM) solutions, as a specific kind of deep learning scheme, promise to overcome the problem effectively. In this article, we first give a brief introduction to the structure and forward propagation mechanism of LSTM. Then, aiming to reduce the considerable computing cost of LSTM, we put forward the random connectivity LSTM (RCLSTM) model, which introduces stochastic connectivity to conventional LSTM neurons. RCLSTM therefore exhibits a certain level of sparsity, which lowers its computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement: using a realistic dataset, we show that RCLSTM achieves prediction performance comparable to that of LSTM while requiring considerably less computing time. We therefore argue that RCLSTM is better suited than LSTM to latency-stringent or power-constrained application scenarios.
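To make the sparsity idea concrete, the following is a minimal NumPy sketch of random connectivity applied to an LSTM cell: each connection is kept only with a fixed probability, so most weights are permanently zeroed by a binary mask. This is an illustration of the technique described in the abstract, not the authors' implementation; the names (RCLSTMCell, connectivity) and the stacked four-gate weight layout are assumptions of this sketch, and a dense mask like this only models the sparsity, since actual compute savings require a sparse implementation.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class RCLSTMCell:
        """LSTM cell with randomly, sparsely wired connections (illustrative sketch)."""

        def __init__(self, input_size, hidden_size, connectivity=0.1, seed=0):
            rng = np.random.default_rng(seed)
            n = input_size + hidden_size
            # Stacked weights for the input, forget, candidate, and output gates.
            self.W = 0.1 * rng.standard_normal((4 * hidden_size, n))
            self.b = np.zeros(4 * hidden_size)
            # Fixed binary mask: each connection survives with probability
            # `connectivity`; the rest are zero forever (stochastic connectivity).
            self.mask = (rng.random(self.W.shape) < connectivity).astype(self.W.dtype)

        def step(self, x, h, c):
            # Standard LSTM forward propagation, but with masked (sparse) weights.
            z = (self.W * self.mask) @ np.concatenate([x, h]) + self.b
            i, f, g, o = np.split(z, 4)
            i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
            c_next = f * c + i * np.tanh(g)   # cell state update
            h_next = o * np.tanh(c_next)      # hidden state (the cell's output)
            return h_next, c_next

    # One prediction step on toy data: an 8-dimensional input into a 16-unit cell.
    cell = RCLSTMCell(input_size=8, hidden_size=16, connectivity=0.1)
    h = c = np.zeros(16)
    h, c = cell.step(np.ones(8), h, c)

With connectivity=0.1, only about 10 percent of the weights are nonzero, which is the source of the reduced computational complexity the abstract refers to.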

Original language: English
Article number: 8663965
Pages (from-to): 114-119
Number of pages: 6
Journal: IEEE Communications Magazine
Volume: 57
Issue number: 6
DOIs: 10.1109/MCOM.2019.1800155
Publication status: Published - 1 Jun 2019
MoE publication type: A1 Journal article-refereed

Keywords

  • logic gates
  • computer architecture
  • biological neural networks
  • computational modeling
  • deep learning

Cite this

Hua, Yuxiu; Zhao, Zhifeng; Li, Rongpeng; Chen, Xianfu; Liu, Zhiming; Zhang, Honggang. / Deep Learning with Long Short-Term Memory for Time Series Prediction. In: IEEE Communications Magazine. 2019; Vol. 57, No. 6. pp. 114-119.
@article{ae3a2482126549c28984aab82f331bf0,
title = "Deep Learning with Long Short-Term Memory for Time Series Prediction",
abstract = "Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning long-range dependencies that are embedded in time series is often an obstacle for most algorithms, whereas LSTM solutions, as a specific kind of scheme in deep learning, promise to effectively overcome the problem. In this article, we first give a brief introduction to the structure and forward propagation mechanism of LSTM. Then, aiming at reducing the considerable computing cost of LSTM, we put forward a RCLSTM model by introducing stochastic connectivity to conventional LSTM neurons. Therefore, RCLSTM exhibits a certain level of sparsity and leads to a decrease in computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement as we leverage a realistic dataset to show that for RCLSTM, the prediction performance comparable to LSTM is available, whereas considerably less computing time is required. We strongly argue that RCLSTM is more competent than LSTM in latency-stringent or power-constrained application scenarios.",
keywords = "logic gates, computer architecture, biological neural networks, computational modeling, deep learning",
author = "Yuxiu Hua and Zhifeng Zhao and Rongpeng Li and Xianfu Chen and Zhiming Liu and Honggang Zhang",
year = "2019",
month = "6",
day = "1",
doi = "10.1109/MCOM.2019.1800155",
language = "English",
volume = "57",
pages = "114--119",
journal = "IEEE Communications Magazine",
issn = "0163-6804",
publisher = "Institute of Electrical and Electronic Engineers IEEE",
number = "6",

}

Deep Learning with Long Short-Term Memory for Time Series Prediction. / Hua, Yuxiu; Zhao, Zhifeng; Li, Rongpeng; Chen, Xianfu; Liu, Zhiming; Zhang, Honggang.

In: IEEE Communications Magazine, Vol. 57, No. 6, 8663965, 01.06.2019, p. 114-119.

Research output: Contribution to journal › Article › Scientific › peer-review

TY - JOUR

T1 - Deep Learning with Long Short-Term Memory for Time Series Prediction

AU - Hua, Yuxiu

AU - Zhao, Zhifeng

AU - Li, Rongpeng

AU - Chen, Xianfu

AU - Liu, Zhiming

AU - Zhang, Honggang

PY - 2019/6/1

Y1 - 2019/6/1

N2 - Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning the long-range dependencies embedded in time series is often an obstacle for most algorithms, whereas long short-term memory (LSTM) solutions, as a specific kind of deep learning scheme, promise to overcome the problem effectively. In this article, we first give a brief introduction to the structure and forward propagation mechanism of LSTM. Then, aiming to reduce the considerable computing cost of LSTM, we put forward the random connectivity LSTM (RCLSTM) model, which introduces stochastic connectivity to conventional LSTM neurons. RCLSTM therefore exhibits a certain level of sparsity, which lowers its computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement: using a realistic dataset, we show that RCLSTM achieves prediction performance comparable to that of LSTM while requiring considerably less computing time. We therefore argue that RCLSTM is better suited than LSTM to latency-stringent or power-constrained application scenarios.

AB - Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning the long-range dependencies embedded in time series is often an obstacle for most algorithms, whereas long short-term memory (LSTM) solutions, as a specific kind of deep learning scheme, promise to overcome the problem effectively. In this article, we first give a brief introduction to the structure and forward propagation mechanism of LSTM. Then, aiming to reduce the considerable computing cost of LSTM, we put forward the random connectivity LSTM (RCLSTM) model, which introduces stochastic connectivity to conventional LSTM neurons. RCLSTM therefore exhibits a certain level of sparsity, which lowers its computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement: using a realistic dataset, we show that RCLSTM achieves prediction performance comparable to that of LSTM while requiring considerably less computing time. We therefore argue that RCLSTM is better suited than LSTM to latency-stringent or power-constrained application scenarios.

KW - logic gates

KW - computer architecture

KW - biological neural networks

KW - computational modeling

KW - deep learning

UR - http://www.scopus.com/inward/record.url?scp=85062957095&partnerID=8YFLogxK

U2 - 10.1109/MCOM.2019.1800155

DO - 10.1109/MCOM.2019.1800155

M3 - Article

VL - 57

SP - 114

EP - 119

JO - IEEE Communications Magazine

JF - IEEE Communications Magazine

SN - 0163-6804

IS - 6

M1 - 8663965

ER -