SEAformer: frequency domain decomposition transformer with signal enhanced for long-term wind power forecasting
References
Wang X, Guo P, Huang X (2011) A review of wind power forecasting models. Energy Procedia 12:770–778
Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, … Polosukhin I (2017) Attention is all you need. Adv Neural Inf Process Syst 30
Kitaev N, Kaiser L, Levskaya A (2020) Reformer: the efficient transformer. In: 8th International conference on learning representations, ICLR 2020, Addis Ababa, Ethiopia, April 26-30, 2020
Wu H, Xu J, Wang J, Long M (2021) Autoformer: decomposition transformers with auto-correlation for long-term series forecasting. Adv Neural Inf Process Syst 34:22419–22430
Zhou H, Zhang S, Peng J, Zhang S, Li J, Xiong H, Zhang W (2021) Informer: beyond efficient transformer for long sequence time-series forecasting. In: Proceedings of the AAAI conference on artificial intelligence, vol 35. pp 11106–11115
Zhou T, Ma Z, Wen Q, Wang X, Sun L, Jin R (2022) FEDformer: frequency enhanced decomposed transformer for long-term series forecasting. In: International conference on machine learning, PMLR, pp 27268–27286
Kuznetsov V, Mohri M (2015) Learning theory and algorithms for forecasting non-stationary time series. In: Proceedings of the 28th international conference on neural information processing systems, vol 1, Montreal, Canada. MIT Press, Cambridge, MA, pp 541–549
Dragomiretskiy K, Zosso D (2013) Variational mode decomposition. IEEE Trans Signal Process 62(3):531–544
Al-Yahyai S, Charabi Y, Gastli A (2010) Review of the use of numerical weather prediction (NWP) models for wind energy assessment. Renew Sustain Energy Rev 14(9):3192–3198
Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780
Box GE, Jenkins GM, MacGregor JF (1974) Some recent advances in forecasting and control. J Roy Stat Soc: Ser C (Appl Stat) 23(2):158–179
Rangapuram SS, Seeger MW, Gasthaus J, Stella L, Wang Y, Januschowski T (2018) Deep state space models for time series forecasting. Adv Neural Inf Process Syst 31
De Giorgi MG, Campilongo S, Ficarella A, Congedo PM (2014) Comparison between wind power prediction models based on wavelet decomposition with least-squares support vector machine (LS-SVM) and artificial neural network (ANN). Energies 7(8):5251–5272
Assimakopoulos V, Nikolopoulos K (2000) The theta model: a decomposition approach to forecasting. Int J Forecast 16(4):521–530
Holt CC (2004) Forecasting seasonals and trends by exponentially weighted moving averages. Int J Forecast 20(1):5–10
Cleveland RB, Cleveland WS, McRae JE, Terpenning I (1990) STL: a seasonal-trend decomposition. J Off Stat 6(1):3–73
Cleveland WP, Tiao GC (1976) Decomposition of seasonal time series: a model for the Census X-11 program. J Am Stat Assoc 71(355):581–587
Woo G, Liu C, Sahoo D, Kumar A, Hoi SCH (2022) Cost: contrastive learning of disentangled seasonal-trend representations for time series forecasting. In: The tenth international conference on learning representations, ICLR 2022, Virtual Event, April 25-29, 2022
McKenzie E, Gardner ES Jr (2010) Damped trend exponential smoothing: a modelling viewpoint. Int J Forecast 26(4):661–665
Wen Q, Gao J, Song X, Sun L, Xu H, Zhu S (2019) RobustSTL: a robust seasonal-trend decomposition algorithm for long time series. In: Proceedings of the AAAI conference on artificial intelligence, vol 33. pp 5409–5416
Huang S, Yan C, Qu Y (2023) Deep learning model-transformer based wind power forecasting approach. Frontiers in Energy Research
Fu X, Gao F, Wu J, Wei X, Duan F (2019) Spatiotemporal attention networks for wind power forecasting. In: 2019 International conference on data mining workshops (ICDMW), pp 149–154
Nascimento EGS, de Melo TAC, Moreira DM (2023) A transformer-based deep neural network with wavelet transform for forecasting wind speed and wind energy. Energy 278:127678
Mo S, Wang H, Li B, Xue Z, Fan S, Liu X (2024) Powerformer: a temporal-based transformer model for wind power forecasting. Energy Rep 11:736–744
Wang S, Shi J, Yang W, Yin Q (2024) High and low frequency wind power prediction based on transformer and BiGRU-attention. Energy 288:129753
Deng B, Wu Y, Liu S, Xu Z (2022) Wind speed forecasting for wind power production based on frequency-enhanced transformer. In: 2022 4th International conference on machine learning, big data and business intelligence (MLBDBI), pp 151–155
Huang X, Jiang A (2022) Wind power generation forecast based on multi-step informer network. Energies 15(18):6642
Li S, Jin X, Xuan Y, Zhou X, Chen W, Wang Y-X, Yan X (2019) Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting. Adv Neural Inf Process Syst 32
Beltagy I, Peters ME, Cohan A (2020) Longformer: the long-document transformer. arXiv preprint arXiv:2004.05150
Choromanski KM, Likhosherstov V, Dohan D, Song X, Gane A, Sarlós T, Hawkins P, Davis JQ, Mohiuddin A, Kaiser L, Belanger DB, Colwell LJ, Weller A (2021) Rethinking attention with performers. In: 9th International conference on learning representations, ICLR 2021, Virtual Event, Austria, May 3-7, 2021
Wang S, Li BZ, Khabsa M, Fang H, Ma H (2020) Linformer: self-attention with linear complexity. arXiv preprint arXiv:2006.04768
Gulati A, Qin J, Chiu C, Parmar N, Zhang Y, Yu J, Han W, Wang S, Zhang Z, Wu Y, Pang R (2020) Conformer: convolution-augmented transformer for speech recognition. In: Interspeech 2020, 21st Annual conference of the international speech communication association, virtual event, Shanghai, China, 25-29 October 2020, pp 5036–5040
Box GE, Jenkins GM, Reinsel GC, Ljung GM (2015) Time series analysis: forecasting and control. John Wiley & Sons, Hoboken
Wen R, Torkkola K, Narayanaswamy B, Madeka D (2017) A multi-horizon quantile recurrent forecaster. arXiv preprint
Yu R, Zheng S, Anandkumar A, Yue Y (2017) Long-term forecasting using higher order tensor rnns. arXiv preprint arXiv:1711.00073
Cho K, van Merriënboer B, Gulcehre C, Bahdanau D, Bougares F, Schwenk H, Bengio Y (2014) Learning phrase representations using RNN encoder–decoder for statistical machine translation. In: Proceedings of the 2014 conference on empirical methods in natural language processing (EMNLP), pp 1724–1734. Association for Computational Linguistics, Doha, Qatar
Salinas D, Flunkert V, Gasthaus J, Januschowski T (2020) DeepAR: probabilistic forecasting with autoregressive recurrent networks. Int J Forecast 36(3):1181–1191
Qin Y, Song D, Chen H, Cheng W, Jiang G, Cottrell GW (2017) A dual-stage attention-based recurrent neural network for time series prediction. In: Proceedings of the twenty-sixth international joint conference on artificial intelligence, IJCAI 2017, Melbourne, Australia, August 19-25, pp 2627–2633
Lai G, Chang W-C, Yang Y, Liu H (2018) Modeling long-and short-term temporal patterns with deep neural networks. In: The 41st International ACM SIGIR conference on research & development in information retrieval, pp 95–104
Shih S-Y, Sun F-K, Lee H-Y (2019) Temporal pattern attention for multivariate time series forecasting. Mach Learn 108:1421–1441
Bai S, Kolter JZ, Koltun V (2018) An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271
Devlin J, Chang M-W, Lee K, Toutanova K (2019) BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 conference of the north american chapter of the association for computational linguistics: human language technologies, vol 1 (Long and Short Papers), Minneapolis, Minnesota, pp 4171–4186
Liu Z, Lin Y, Cao Y, Hu H, Wei Y, Zhang Z, Lin S, Guo B (2021) Swin transformer: hierarchical vision transformer using shifted windows. In: Proceedings of the IEEE/CVF international conference on computer vision, pp 10012–10022
Rao Y, Zhao W, Zhu Z, Lu J, Zhou J (2021) Global filter networks for image classification. Adv Neural Inf Process Syst 34:980–993
Zhu Z, Soricut R (2021) H-transformer-1D: Fast one-dimensional hierarchical attention for sequences. In: Proceedings of the 59th annual meeting of the association for computational linguistics and the 11th international joint conference on natural language processing (Volume 1: Long Papers), Online, pp 3801–3815
Zefan C, Xiaodong Y (2017) Cuckoo search algorithm with deep search. In: 2017 3rd IEEE international conference on computer and communications (ICCC), pp 2241–2246. https://doi.org/10.1109/CompComm.2017.8322934
Zhou J, Lu X, Xiao Y, Su J, Lyu J, Ma Y, Dou D (2022) SDWPF: a dataset for spatial dynamic wind power forecasting challenge at KDD cup 2022. arXiv preprint arXiv:2208.04360
Paparrizos J, Gravano L (2015) k-shape: efficient and accurate clustering of time series. In: Proceedings of the 2015 ACM SIGMOD international conference on management of data
Kingma DP, Ba J (2015) Adam: A method for stochastic optimization. In: 3rd International conference on learning representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings
Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, Killeen T, Lin Z, Gimelshein N, Antiga L, et al (2019) Pytorch: an imperative style, high-performance deep learning library. Adv Neural Inf Process Syst 32
Li C, Tang G, Xue X, Saeed A, Hu X (2020) Short-term wind speed interval prediction based on ensemble GRU model. IEEE Trans Sustain Energy 11:1370–1380
Zhang GP (2003) Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing 50:159–175