A Long-Short Term Memory Network for Chaotic Time Series Prediction

Margarita Terziyska, Zhelyazko Terziyski, Yancho Todorov

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

This paper demonstrates the development of a Long-Short Term Memory (LSTM) network and its application to predicting Chaotic Time Series (CTS). LSTM is a Deep Learning (DL) architecture and a type of Recurrent Neural Network (RNN). Its ability to memorise past inputs makes it well suited to modelling data sequences such as CTS. To test the performance of the developed LSTM, several simulation studies were carried out in a Matlab environment. The developed DL architecture was used to predict well-known benchmark CTSs, namely the Mackey-Glass (MG), Rössler, and Lorenz series. The obtained results were compared with an Adaptive Neuro-Fuzzy Inference System (ANFIS) and a Distributed Adaptive Neuro-Fuzzy Architecture (DANFA), previously developed by the authors. The comparison is made on the basis of Root Mean Squared Error (RMSE). It was found that the proposed LSTM structure is able to generalize the generated series and achieves higher accuracy than the other two models.
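The Mackey-Glass benchmark and the RMSE metric named in the abstract are standard. As an illustration only (the authors worked in Matlab; this is not their code), a minimal Python sketch that generates the MG series by Euler discretization of dx/dt = βx(t−τ)/(1 + x(t−τ)^n) − γx(t) with the classic chaotic parameters (β=0.2, γ=0.1, n=10, τ=17), together with the RMSE used for model comparison:

```python
import numpy as np

def mackey_glass(n_steps, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Euler integration of the Mackey-Glass delay differential equation.

    dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)
    The first tau/dt samples serve as the constant initial history x0.
    """
    history = int(tau / dt)
    x = np.full(n_steps + history, float(x0))
    for t in range(history, n_steps + history - 1):
        x_tau = x[t - history]                      # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau**n) - gamma * x[t])
    return x[history:]                              # drop the warm-up history

def rmse(y_true, y_pred):
    """Root Mean Squared Error, the comparison criterion used in the paper."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

A one-step-ahead predictor would then be trained on windows of this series, and its RMSE compared against the ANFIS and DANFA baselines in the same way.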
Original language: English
Title of host publication: 2021 Big Data, Knowledge and Control Systems Engineering (BdKCSE)
Publisher: IEEE - Institute of Electrical and Electronics Engineers
ISBN (Electronic): 978-1-6654-1042-7
ISBN (Print): 978-1-6654-1043-4
DOIs
Publication status: Published - 28 Oct 2021
MoE publication type: A4 Article in a conference publication
