The use of NARX neural networks to predict chaotic time series
Related papers
Prediction of chaotic time series with NARX recurrent dynamic neural networks
Proceedings of the 9th WSEAS International …, 2008
The problem of chaotic time series prediction is now studied in various disciplines, including engineering, medical and econometric applications. Chaotic time series are the output of a deterministic system with a positive Lyapunov exponent. Time series prediction is a suitable application for a neural network predictor. The NN approach to time series prediction is non-parametric, in the sense that it is not necessary to know any information regarding the process that generates the signal. It is shown that a recurrent NN (RNN) with a sufficiently large number of neurons is a realization of the nonlinear ARMA (NARMA) process. In this paper we present the nonlinear autoregressive network with exogenous inputs (NARX): its architecture, the training method, the input data to the network, and the simulation results.
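For readers unfamiliar with the NARX regressor, the sketch below illustrates the general idea behind such a predictor: the regression vector is built from tapped delays of both the past outputs and the exogenous input, and a one-step-ahead map is fitted over it. This is a minimal NumPy illustration, not the network or training method of the paper; the toy driven logistic-map signal and the ridge least-squares stand-in for the neural mapping are assumptions made purely for the example.

```python
import numpy as np

def narx_regressors(u, y, nu=3, ny=3):
    """Build NARX regressor vectors [y(t-1..t-ny), u(t-1..t-nu)] -> target y(t)."""
    start = max(nu, ny)
    X, T = [], []
    for t in range(start, len(y)):
        X.append(np.concatenate([y[t - ny:t][::-1], u[t - nu:t][::-1]]))
        T.append(y[t])
    return np.array(X), np.array(T)

# Toy exogenous input and a chaotic-looking output (logistic map weakly driven by u)
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 0.1, 500)
y = np.empty(500); y[0] = 0.4
for t in range(1, 500):
    y[t] = 3.9 * y[t - 1] * (1.0 - y[t - 1]) + 0.05 * u[t - 1]

X, T = narx_regressors(u, y)
# Linear stand-in for the neural mapping: ridge least squares
w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ T)
pred = X @ w
print("one-step MSE:", np.mean((pred - T) ** 2))
```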
A self-organizing NARX network and its application to prediction of chaotic time series
2001
This paper introduces the concept of the dynamic embedding manifold (DEM), which allows the Kohonen self-organizing map (SOM) to learn dynamic, nonlinear input-output mappings. The combination of the DEM concept with the SOM results in a new modelling technique that we call the Vector-Quantized Temporal Associative Memory (VQTAM). We use VQTAM to propose an unsupervised neural algorithm called the Self-Organizing NARX (SONARX) network. The SONARX network is evaluated on the problem of modeling and prediction of three chaotic time series and compared with MLP, RBF and autoregressive (AR) models. It is shown that SONARX exhibits performance similar to that of the MLP and RBF, while producing much better results than the AR model. The influence of the number of neurons, the memory order, the number of training epochs and the size of the training set on the final prediction error is also evaluated.
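As a rough illustration of the VQTAM idea described above (each prototype stores a joint input-output vector, the best-matching unit is found using the input part only, and its output part serves as the prediction), here is a compact NumPy sketch. The 1-D map topology, the learning-rate and neighborhood schedules, and the function names are assumptions for illustration, not the paper's SONARX algorithm.

```python
import numpy as np

def vqtam_fit(X_in, y_out, n_units=30, epochs=50, seed=0):
    """Train a 1-D SOM on joint (input, output) vectors (VQTAM idea, simplified)."""
    rng = np.random.default_rng(seed)
    idx0 = rng.choice(len(X_in), n_units)
    W_in = X_in[idx0].astype(float)    # input parts of the prototypes
    W_out = y_out[idx0].astype(float)  # output parts of the prototypes
    for ep in range(epochs):
        lr = 0.5 * (0.01 / 0.5) ** (ep / max(epochs - 1, 1))   # decaying learning rate
        sigma = max(n_units / 2 * (1 - ep / epochs), 0.5)       # shrinking neighborhood
        for x, y in zip(X_in, y_out):
            bmu = np.argmin(np.sum((W_in - x) ** 2, axis=1))    # match on input part only
            h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma ** 2))
            W_in += lr * h[:, None] * (x - W_in)
            W_out += lr * h * (y - W_out)
    return W_in, W_out

def vqtam_predict(W_in, W_out, x):
    """Prediction = output part of the best-matching unit for the input part."""
    bmu = np.argmin(np.sum((W_in - x) ** 2, axis=1))
    return W_out[bmu]
```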
Comparison of Feedforward and Recurrent Neural Network in Forecasting Chaotic Dynamical System
AJIT-e Online Academic Journal of Information Technology, 2019
Artificial neural networks are widely accepted as a very successful tool for global function approximation. For this reason, they are considered a good approach to forecasting chaotic time series in many studies. For a given time series, the Lyapunov exponent is a good parameter for characterizing whether the series is chaotic or not. In this study, we use three different neural network architectures to test the capabilities of neural networks in forecasting time series generated from different dynamical systems. In addition to forecasting the time series, the Lyapunov exponents of the studied systems are predicted using a feedforward neural network with a single hidden layer.
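As a side note on characterizing chaos: when the generating map is known, the largest Lyapunov exponent can be estimated as the orbit average of log|f'(x)|. The sketch below does this for the logistic map, which is an assumed example and not necessarily one of the systems studied in the paper.

```python
import numpy as np

# Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
# estimated as the orbit average of log|f'(x)| = log|r*(1 - 2x)|.
r, x, n_transient, n_iter = 3.9, 0.2, 1000, 100_000
for _ in range(n_transient):          # discard the transient
    x = r * x * (1 - x)
acc = 0.0
for _ in range(n_iter):
    x = r * x * (1 - x)
    acc += np.log(abs(r * (1 - 2 * x)))
print("lambda ~", acc / n_iter)       # positive (~0.5 for r = 3.9), indicating chaos
```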
Chaotic Time Series Prediction with Neural Networks - Comparison of Several Architectures
This paper presents an experimental comparison of selected neural architectures for the chaotic time series prediction problem. Several feed-forward architectures (Multilayer Perceptrons) are compared with partially recurrent nets (Elman, extended Elman, and Jordan) in terms of convergence rate, prediction accuracy, training time requirements and stability of results.
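To make the architectural difference concrete, the following sketch shows the forward pass of an Elman (simple recurrent) cell, where the previous hidden state is fed back as a context input; a Jordan network would instead feed back the previous output. Weights are random and no training is shown; the dimensions and names are assumptions for illustration only.

```python
import numpy as np

def elman_forward(x_seq, W_xh, W_hh, W_hy, b_h, b_y):
    """Elman network: the previous hidden state acts as a 'context' input."""
    h = np.zeros(W_hh.shape[0])
    outputs = []
    for x in x_seq:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)   # hidden layer + context feedback
        outputs.append(W_hy @ h + b_y)           # linear readout
    return np.array(outputs)

rng = np.random.default_rng(1)
n_in, n_hid = 1, 8
params = (rng.normal(0, 0.3, (n_hid, n_in)),   # W_xh: input -> hidden
          rng.normal(0, 0.3, (n_hid, n_hid)),  # W_hh: context -> hidden
          rng.normal(0, 0.3, (1, n_hid)),      # W_hy: hidden -> output
          np.zeros(n_hid), np.zeros(1))
y_hat = elman_forward(rng.normal(size=(20, n_in)), *params)
```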
A New Look at Nonlinear Time Series Prediction with NARX Recurrent Neural Network
2006 Ninth Brazilian Symposium on Neural Networks (SBRN'06), 2006
The NARX network is a recurrent neural architecture commonly used for input-output modeling of nonlinear systems. The input of the NARX network is formed by two tapped-delay lines, one sliding over the input signal and the other over the output signal. Currently, when applied to chaotic time series prediction, the NARX architecture is designed as a plain Focused Time Delay Neural Network (FTDNN), thus limiting its predictive abilities. In this paper, we propose a strategy that allows the original architecture of the NARX network to fully explore its computational power to improve prediction performance. We use the well-known chaotic laser time series to evaluate the proposed approach in multi-step-ahead prediction tasks. The results show that the proposed approach consistently outperforms standard neural network based predictors, such as the FTDNN and Elman architectures.
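The multi-step-ahead setting mentioned here is usually handled by iterated (recursive) prediction: the model's one-step output is pushed back into the output tapped-delay line. The sketch below shows that feedback loop under assumed delay orders and with a generic `predict_fn` standing in for the trained NARX mapping; it is not the authors' implementation. `predict_fn` could, for instance, be the least-squares map fitted in the earlier NARX sketch.

```python
import numpy as np

def multistep_forecast(predict_fn, y_hist, u_hist, u_future, ny=3, nu=3, horizon=10):
    """Iterated multi-step-ahead forecast: at each step the model's own
    prediction is pushed into the output tapped-delay line.
    Requires len(u_future) >= horizon (exogenous input assumed known ahead)."""
    y_buf = list(y_hist[-ny:])              # last ny known outputs
    u_buf = list(u_hist[-nu:])              # last nu known inputs
    preds = []
    for k in range(horizon):
        reg = np.concatenate([np.array(y_buf)[::-1], np.array(u_buf)[::-1]])
        y_next = predict_fn(reg)            # one-step model (the trained NARX map)
        preds.append(y_next)
        y_buf = y_buf[1:] + [y_next]        # feed the prediction back
        u_buf = u_buf[1:] + [u_future[k]]   # slide the exogenous delay line
    return np.array(preds)
```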
A neural network scheme for long-term forecasting of chaotic time series
Neural Processing …, 2011
The accuracy of a model for forecasting a time series diminishes as the prediction horizon increases, in particular when the prediction is carried out recursively. Such decay is faster when the model is built using data generated by highly dynamic or chaotic systems. This paper presents a topology and training scheme for a novel artificial neural network, named the "Hybrid-connected Complex Neural Network" (HCNN), which is able to capture the dynamics embedded in chaotic time series and to predict long horizons of such series. HCNN is composed of small recurrent neural networks, inserted in a structure made of feed-forward and recurrent connections and trained in several stages using the back-propagation through time (BPTT) algorithm. In experiments using a Mackey-Glass time series and an electrocardiogram (ECG) as training signals, HCNN was able to output stable chaotic signals, oscillating for periods as long as four times the length of the training signals. The largest local Lyapunov exponent (LE) of the predicted signals was positive (evidence of chaos) and similar to the LE calculated over the training signals. The magnitudes of the peaks in the ECG signal were not accurately predicted, but the predicted signal was similar to the ECG in the rest of its structure.
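Since the Mackey-Glass series is used as a training signal here (and in several of the papers below), a simple way to generate it is by Euler integration of the Mackey-Glass delay differential equation, as sketched below. The parameter values (beta = 0.2, gamma = 0.1, n = 10, tau = 17) are the commonly used ones and are an assumption; the paper may use different settings (e.g. a different delay).

```python
import numpy as np

def mackey_glass(n_samples=2000, tau=17, beta=0.2, gamma=0.1, n=10,
                 dt=0.1, subsample=10, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t)."""
    delay = int(round(tau / dt))
    total = n_samples * subsample + delay
    x = np.empty(total)
    x[:delay + 1] = x0                     # constant initial history
    for t in range(delay, total - 1):
        x_tau = x[t - delay]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[delay::subsample][:n_samples]  # drop the history, subsample

series = mackey_glass()
```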
Chaotic time series prediction by artificial neural networks
Journal of Computational Methods in Sciences and Engineering, 2016
In this paper, we use four types of artificial neural network (ANN) to predict the behavior of chaotic time series. Each neural network used in this paper acts as a global model to predict the future behavior of the time series. The prediction process is based on the embedding theorem and the time delay determined by this theorem. The ANNs are applied to a time series generated by the Mackey-Glass equation, which exhibits chaotic behavior. Finally, all the neural networks are used to solve this problem and their results are compared and analyzed.
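The embedding-theorem-based setup mentioned here amounts to building delay vectors of the series and learning a map from them to the next sample. A minimal sketch of that construction is given below; the embedding dimension and delay values are placeholders, not the ones determined in the paper.

```python
import numpy as np

def delay_embed(x, dim=4, tau=6):
    """Build delay vectors [x(t), x(t-tau), ..., x(t-(dim-1)*tau)] and the
    one-step-ahead target x(t+1), as used for embedding-based prediction."""
    x = np.asarray(x)
    start = (dim - 1) * tau
    idx = np.arange(start, len(x) - 1)
    X = np.stack([x[idx - k * tau] for k in range(dim)], axis=1)
    y = x[idx + 1]
    return X, y
```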
Forecasting Chaotic time series by a Neural Network
2008
This paper examines how efficient neural networks are, relative to linear and polynomial approximations, at forecasting a time series generated by the chaotic Mackey-Glass differential delay equation. The forecasting horizon is one step ahead. A series of regressions with polynomial approximators and a simple neural network with two neurons is carried out, and the multiple correlation coefficients are compared. The neural network, though a very simple one, is superior to the polynomial expansions and delivers virtually perfect forecasting. Finally, the neural network is much more precise, relative to the other methods, across a wide set of realizations.
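For the polynomial side of such a comparison, a quadratic expansion of the delay vector can be fitted by least squares and scored with the multiple correlation coefficient, as sketched below. The two-neuron network itself is not reproduced; the feature construction and degree are assumptions for illustration.

```python
import numpy as np

def poly_features(X, degree=2):
    """Polynomial expansion of delay vectors (constant, linear terms, and,
    for degree >= 2, squares and pairwise products) as a baseline approximator."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    if degree >= 2:
        for i in range(X.shape[1]):
            for j in range(i, X.shape[1]):
                cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

def multiple_correlation(y, y_hat):
    """Multiple correlation coefficient R = corr(y, y_hat)."""
    return np.corrcoef(y, y_hat)[0, 1]

# X, y would come from a delay embedding of the Mackey-Glass series, e.g.:
# P = poly_features(X); w = np.linalg.lstsq(P, y, rcond=None)[0]
# print(multiple_correlation(y, P @ w))
```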
Prediction of Chaotic Time Series with Neural Networks and the Issue of Dynamic Modeling
International Journal of Bifurcation and Chaos, 1992
This paper shows that the dynamics of nonlinear systems that produce complex time series can be captured in a model system. The model system is an artificial neural network, trained with backpropagation in a multi-step prediction framework. Results from the Mackey-Glass system (D=30) are presented to corroborate our claim. Our final intent is to study the applicability of the method to the electroencephalogram, but first several important questions must be answered to guarantee appropriate modeling.
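As a generic reference point for "an artificial neural network trained with backpropagation", the sketch below implements a one-hidden-layer MLP with tanh units trained by plain batch backpropagation on one-step-ahead targets; multi-step use would iterate it as in the earlier recursive-forecast sketch. The layer sizes, learning rate and epoch count are assumptions, not the paper's settings.

```python
import numpy as np

def train_mlp(X, y, n_hidden=10, lr=0.05, epochs=500, seed=0):
    """One-hidden-layer MLP (tanh hidden units, linear output) trained with
    plain batch backpropagation on squared one-step-ahead error."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W1 = rng.normal(0, 0.5, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, n_hidden);      b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                # hidden activations
        y_hat = H @ W2 + b2                     # linear output
        err = y_hat - y                         # error signal
        gW2 = H.T @ err / len(X); gb2 = err.mean()
        dH = np.outer(err, W2) * (1 - H ** 2)   # backprop through tanh
        gW1 = X.T @ dH / len(X);  gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def mlp_predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2
```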
Prediction of Duffing Chaotic time series using focused time lagged recurrent neural network model
TENCON 2008 - 2008 IEEE Region 10 Conference, 2008
In this paper, multi-step-ahead prediction of the Duffing chaotic time series and of the real monthly sunspot time series is carried out. These two time series are popular because of their highly chaotic behavior. The paper compares the performance of two neural network configurations, namely a Multilayer Perceptron (MLP) and the proposed FTLRNN with gamma memory, for the Duffing time series for 1, 5, 10, 20, 50 and 100-step-ahead prediction and for the monthly sunspot time series for 1, 6, 12, 18 and 24-month-ahead prediction. The standard back-propagation algorithm with a momentum term has been used for both models. It is seen that the proposed dynamic fully recurrent model clearly outperforms the MLP NN on various performance metrics, such as mean square error (MSE), normalized mean square error (NMSE) and correlation coefficient (r), on the testing as well as the training data set, for multi-step prediction (K = 1, 5, 10, 20, 50, 100) of the Duffing time series and for 1, 6, 12, 18 and 24-month-ahead prediction of the sunspot time series. In addition, the output of the proposed neural network model closely follows the desired output for all step-ahead predictions. It is observed that the suggested recurrent models have a remarkable capability for time series prediction. A further contribution of this paper is the systematic study of various parameters, such as the number of processing elements, the step size, the momentum value, the transfer functions in the hidden and output layers (tanh, sigmoid, linear-tanh and linear-sigmoid), and different error norms (L1, L2, Lp).
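Two ingredients mentioned in this abstract can be sketched compactly: the gamma memory, a cascade of leaky integrators whose taps replace a plain tapped-delay line, and the reported performance metrics (MSE, NMSE, correlation coefficient r). The memory order and mu value below are assumptions for illustration, not the tuned values of the paper.

```python
import numpy as np

def gamma_memory(x, order=3, mu=0.5):
    """Gamma memory taps: g0(t) = x(t); g_k(t) = (1-mu)*g_k(t-1) + mu*g_{k-1}(t-1)."""
    T = len(x)
    G = np.zeros((T, order + 1))
    G[:, 0] = x
    for t in range(1, T):
        for k in range(1, order + 1):
            G[t, k] = (1 - mu) * G[t - 1, k] + mu * G[t - 1, k - 1]
    return G   # columns are the memory taps fed to the network

def mse(y, y_hat):   return np.mean((y - y_hat) ** 2)
def nmse(y, y_hat):  return mse(y, y_hat) / np.var(y)          # normalized MSE
def corr(y, y_hat):  return np.corrcoef(y, y_hat)[0, 1]         # correlation coefficient r
```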