transformerForecasting: Transformer Deep Learning Model for Time Series Forecasting
Time series forecasting is challenging because the data are often non-stationary, nonlinear, and chaotic. Traditional deep learning models such as the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU) process data sequentially and are therefore inefficient on long sequences. To overcome these limitations, the package implements a transformer-based deep learning architecture that uses an attention mechanism for parallel processing, improving both prediction accuracy and efficiency. It provides user-friendly code for fitting this architecture to time series data. References: Nayak et al. (2024) <doi:10.1007/s40808-023-01944-7> and Nayak et al. (2024) <doi:10.1016/j.simpa.2024.100716>.
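A minimal usage sketch is shown below. The exported function name (`TRANSFORMER`), its argument names, and the fields of the returned object are assumptions inferred from the package description, not verified against the package manual; consult the reference documentation before use. The example fits the transformer model to a univariate monthly series built from the built-in `AirPassengers` data.

```r
# Sketch only: TRANSFORMER() and all argument/field names below are
# assumed for illustration; check the package manual for the real API.
library(transformerForecasting)

# Build a small data frame with a date column and a numeric target series.
df <- data.frame(
  Date  = seq(as.Date("1949-01-01"), by = "month", length.out = 120),
  Price = as.numeric(AirPassengers[1:120])
)

# Fit the attention-based transformer and generate forecasts
# (hyperparameter names here are illustrative, not confirmed).
result <- TRANSFORMER(
  df             = df,
  study_variable = "Price",
  sequence_size  = 10,
  epochs         = 100
)

result$PREDICTIONS  # forecast values (assumed field name)
result$RMSE         # accuracy metric  (assumed field name)
```

Note that the package imports keras, tensorflow, and reticulate, so a working Python/TensorFlow backend must be configured before the model can be trained.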
Version: 0.1.0
Depends: R (≥ 4.0.0)
Imports: ggplot2, keras, tensorflow, magrittr, reticulate (≥ 1.20)
Suggests: dplyr, knitr, lubridate, readr, rmarkdown, utils
Published: 2025-03-07
DOI: 10.32614/CRAN.package.transformerForecasting
Author: G H Harish Nayak [aut, cre], Md Wasi Alam [ths], B Samuel Naik [ctb], G Avinash [ctb], Kabilan S [ctb], Varshini B S [ctb], Mrinmoy Ray [ths], Rajeev Ranjan Kumar [ths]
Maintainer: G H Harish Nayak
License: GPL-3
NeedsCompilation: no
CRAN checks: transformerForecasting results
Linking:
Please use the canonical form https://CRAN.R-project.org/package=transformerForecasting to link to this page.