Transformer neural networks model for electricity prosumption short-term forecasting
Mim, Jhuma (2024)
Master's thesis
School of Engineering Science, Computational Engineering
All rights reserved.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi-fe2024061754076
Abstract
This thesis studies and evaluates the performance of traditional and advanced models for short-term time series forecasting, with a focus on electricity consumption and production. It compares autoregressive models (ARIMA), non-deep-learning methods in autoregressive (AR) and non-AR setups, and deep learning models including long short-term memory (LSTM), recurrent neural networks (RNN), Autoformer, and Informer. For energy consumption, the results show that the Informer model achieves superior performance with an RMSE of 9.62, outperforming the Autoformer by 4.09 and ARIMA by 6.37; the Informer also performs better in terms of prediction accuracy and training time. For forecasting solar energy production, however, the Autoformer outperformed the other models, achieving an RMSE of 4.55, which is 2.08 lower than the ARIMA model and 22.35 lower than the Informer model. The results suggest that each model involves trade-offs arising from its architecture and the task at hand. The Informer's ProbSparse self-attention mechanism and self-attention distillation give it an edge in predicting energy consumption, while the Autoformer's ability to capture long-term dependencies and periodic fluctuations makes it better suited to forecasting energy production.
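The RMSE figures quoted above follow the standard root-mean-square error definition. As a minimal illustration (not code from the thesis; the function name and sample values are hypothetical):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between two equal-length sequences."""
    if len(y_true) != len(y_pred):
        raise ValueError("sequences must have equal length")
    # Average the squared errors, then take the square root.
    return math.sqrt(
        sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    )

# Example: compare two toy forecasts against the same observations.
observed = [10.0, 12.0, 11.0, 13.0]
forecast_a = [10.5, 11.5, 11.0, 13.5]
forecast_b = [8.0, 14.0, 9.0, 16.0]
print(rmse(observed, forecast_a))  # smaller value = better fit
print(rmse(observed, forecast_b))
```

A lower RMSE indicates predictions closer to the observed series, which is why a difference of 2.08 or 4.09 in RMSE between models is a direct measure of accuracy gain.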
