Abstract
Time series prediction can be viewed as a process that extracts useful information from historical records and then estimates future values. Learning the long-range dependencies embedded in time series is an obstacle for most algorithms, whereas LSTM solutions, a specific class of deep learning models, promise to overcome the problem effectively. In this article, we first give a brief introduction to the structure and forward-propagation mechanism of LSTM. Then, aiming to reduce the considerable computing cost of LSTM, we put forward an RCLSTM model by introducing stochastic connectivity to conventional LSTM neurons. RCLSTM therefore exhibits a certain level of sparsity, which lowers its computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility can directly benefit from this improvement: using a realistic dataset, we show that RCLSTM achieves prediction performance comparable to that of LSTM while requiring considerably less computing time. We therefore argue that RCLSTM is better suited than LSTM to latency-stringent or power-constrained application scenarios.
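To illustrate the idea of random connectivity described in the abstract, the following minimal sketch applies a fixed random binary mask to the gate weights of an otherwise standard LSTM cell, so only a fraction of the connections participate in the forward pass. The function names (`make_masked_weights`, `rclstm_step`), the `density` parameter, and the initialization scheme are illustrative assumptions, not the authors' reference implementation.

```python
# Sketch of an RCLSTM-style cell: a standard LSTM forward step whose weight
# matrix has been multiplied element-wise by a random binary mask, so only a
# `density` fraction of the connections is ever used. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_masked_weights(input_size, hidden_size, density=0.5):
    """Weights for the 4 gates (i, f, g, o) with random sparse connectivity."""
    shape = (4 * hidden_size, input_size + hidden_size)
    W = rng.normal(0.0, 0.1, size=shape)
    mask = (rng.random(shape) < density).astype(W.dtype)  # keep ~density of links
    b = np.zeros(4 * hidden_size)
    return W * mask, b  # masked once; zeroed connections never participate

def rclstm_step(x_t, h_prev, c_prev, W, b):
    """One forward step; identical to a standard LSTM except W is pre-masked."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x_t, h_prev]) + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    g = np.tanh(z[2*H:3*H])      # candidate cell state
    o = sigmoid(z[3*H:4*H])      # output gate
    c_t = f * c_prev + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Toy usage: roll the cell over a short scalar sequence.
input_size, hidden_size = 1, 8
W, b = make_masked_weights(input_size, hidden_size, density=0.3)
h, c = np.zeros(hidden_size), np.zeros(hidden_size)
for value in [0.1, 0.2, 0.3, 0.4]:
    h, c = rclstm_step(np.array([value]), h, c, W, b)
print(h)  # hidden state after processing the sequence
```

Because the mask is fixed, the zeroed weights contribute no multiply-accumulate work when stored in a sparse format, which is the source of the computational savings the abstract refers to.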
Original language | English |
---|---|
Article number | 8663965 |
Pages (from-to) | 114-119 |
Number of pages | 6 |
Journal | IEEE Communications Magazine |
Volume | 57 |
Issue number | 6 |
DOIs | |
Publication status | Published - 1 Jun 2019 |
MoE publication type | A1 Journal article-refereed |
Keywords
- logic gates
- computer architecture
- biological neural networks
- computational modeling
- deep learning
Projects
MISSION: Mission-Critical Internet of Things Applications over Fog Networks
Chen, X., Forsell, M., Chen, T. & Räty, T.
1/01/19 → 31/12/21
Project: Academy of Finland project