Neural network AR model with ETS inputs

  • Minjae Kim (Department of Applied Statistics, Chung-Ang University) ;
  • Byeongchan Seong (Department of Applied Statistics, Chung-Ang University)
  • Received : 2024.01.27
  • Accepted : 2024.02.01
  • Published : 2024.06.30

Abstract

This paper evaluates the performance of a neural network autoregressive model combined with an exponential smoothing model, called the NNARX+ETS model. The combined model uses the components of ETS as exogenous variables for NNARX to forecast time series data with artificial neural networks. The main idea is to go beyond NNAR, which uses only lags of the original series, by additionally feeding the refined components extracted by a traditional time series method into the neural network through NNARX. We use two real data sets for the performance evaluation and compare NNARX+ETS with NNAR and with traditional time series methods such as the ETS and ARIMA (autoregressive integrated moving average) models.

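As a rough illustration of the idea above, the sketch below fits an additive exponential smoothing model, extracts its smoothed level, trend, and seasonal components, and feeds them, together with lags of the series, into a small feed-forward network. This is a minimal sketch and not the authors' implementation: the toy monthly series, the lag order of 12, the one-period shift of the components, and the `nnarx_ets_features` helper are illustrative assumptions, and the component attribute names assume statsmodels 0.12 or later.

```python
# Minimal sketch of the NNARX+ETS idea: ETS components as exogenous
# inputs to an autoregressive neural network (not the authors' code).
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def nnarx_ets_features(y, p=12, m=12):
    """Return p lags of y plus lagged ETS components as regressors, and the target."""
    ets = ExponentialSmoothing(y, trend="add", seasonal="add",
                               seasonal_periods=m).fit()
    # Smoothed components; attribute names as in statsmodels >= 0.12.
    comp = pd.DataFrame(
        {"level": np.asarray(ets.level),
         "trend": np.asarray(ets.trend),
         "season": np.asarray(ets.season)},
        index=y.index,
    )
    X = pd.concat(
        [y.shift(i).rename(f"lag{i}") for i in range(1, p + 1)]
        + [comp.shift(1)],  # shift so y_t is not predicted from its own-period components
        axis=1,
    )
    data = pd.concat([y.rename("y"), X], axis=1).dropna()
    return data.drop(columns="y"), data["y"]


# Toy seasonal monthly series standing in for the real data sets used in the paper.
idx = pd.date_range("2000-01", periods=180, freq="MS")
rng = np.random.default_rng(0)
y = pd.Series(10 + 0.05 * np.arange(180)
              + 2 * np.sin(2 * np.pi * np.arange(180) / 12)
              + rng.normal(scale=0.3, size=180), index=idx)

X, target = nnarx_ets_features(y)
X_train, X_test = X.iloc[:-12], X.iloc[-12:]
y_train, y_test = target.iloc[:-12], target.iloc[-12:]

# Small feed-forward network on the lagged values and ETS components.
nn = make_pipeline(StandardScaler(),
                   MLPRegressor(hidden_layer_sizes=(10,),
                                max_iter=5000, random_state=0))
nn.fit(X_train, y_train)
pred = nn.predict(X_test)  # one-step-ahead predictions over the hold-out year
print("test MAE:", np.mean(np.abs(pred - y_test.to_numpy())))
```

The one-period shift of the components is a design choice made here only to avoid look-ahead in one-step prediction; it is not taken from the paper.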

Keywords

Acknowledgement

This paper was supported by the Chung-Ang University Research Scholarship Fund in 2022.
