A Kullback-Leibler divergence based comparison of approximate Bayesian estimations of ARMA models

  • Amin, Ayman A (Department of Statistics, Mathematics, and Insurance, Faculty of Commerce, Menoufia University)
  • Received : 2022.02.04
  • Accepted : 2022.04.06
  • Published : 2022.07.31

Abstract

Autoregressive moving average (ARMA) models involve nonlinearity in the model coefficients because of unobserved lagged errors, which complicates the likelihood function and makes the posterior density analytically intractable. To overcome this problem of posterior analysis, several approximation methods have been proposed in the literature. In this paper we first review the main analytic approximations proposed to make the posterior density of ARMA models analytically tractable, namely the Newbold, Zellner-Reynolds, and Broemeling-Shaarawy approximations. We then use the Kullback-Leibler divergence to study the relationship between these three analytic approximations and to measure the distance between the approximate posteriors they yield for ARMA models. In addition, we evaluate the impact of this distance on the Bayesian estimates of the mean and precision of the model coefficients by generating a large number of Monte Carlo simulations from the approximate posteriors. The simulation study shows that the approximate posteriors of Newbold and Zellner-Reynolds are very close to each other, and their estimates have higher precision than those of the Broemeling-Shaarawy approximation. The same results are obtained in the application to real-world time series datasets.
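To make the comparison concrete, the following is a minimal illustrative sketch (not the paper's code) of the two computational ingredients described above: a closed-form Kullback-Leibler divergence between two approximate posteriors, assumed here to be multivariate normal for simplicity, and Monte Carlo draws from each approximation to compare the implied Bayesian estimates of the coefficient means and precisions. The posterior means and covariances below are hypothetical placeholders, not values taken from the paper.

```python
# Illustrative sketch: KL divergence between two approximate posteriors of the
# ARMA(1,1) coefficients (phi_1, theta_1), plus Monte Carlo comparison of the
# implied posterior means and precisions. Both approximations are assumed to be
# multivariate normal here; the numbers are hypothetical placeholders.
import numpy as np

def kl_mvn(mu_p, cov_p, mu_q, cov_q):
    """Closed-form KL( N(mu_p, cov_p) || N(mu_q, cov_q) )."""
    k = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    term_trace = np.trace(cov_q_inv @ cov_p)
    term_quad = diff @ cov_q_inv @ diff
    term_logdet = np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p))
    return 0.5 * (term_trace + term_quad - k + term_logdet)

rng = np.random.default_rng(2022)

# Two hypothetical approximate posteriors (e.g., from two analytic approximations)
mu_a, cov_a = np.array([0.60, 0.30]), np.array([[0.010, 0.002],
                                                [0.002, 0.012]])
mu_b, cov_b = np.array([0.58, 0.33]), np.array([[0.020, 0.004],
                                                [0.004, 0.025]])

print("KL(A || B) =", kl_mvn(mu_a, cov_a, mu_b, cov_b))

# Monte Carlo draws from each approximate posterior; compare the Bayesian
# estimates of the coefficient means and (marginal) precisions they imply.
for name, mu, cov in [("A", mu_a, cov_a), ("B", mu_b, cov_b)]:
    draws = rng.multivariate_normal(mu, cov, size=100_000)
    print(name, "posterior mean:", draws.mean(axis=0),
          "precision:", 1.0 / draws.var(axis=0))
```

In this simplified setting, a small KL divergence between two approximations corresponds to nearly identical Monte Carlo estimates of the coefficient means and precisions, which is the pattern the paper reports for the Newbold and Zellner-Reynolds approximations.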

Keywords

References

  1. Amin AA (2009). Bayesian inference for seasonal ARMA models: A Gibbs sampling approach (Master's thesis), Cairo University, Egypt.
  2. Amin AA (2017a). Gibbs sampling for double seasonal ARMA models. In Proceedings of the 29th Annual International Conference on Statistics and Modeling in Human and Social Sciences, Cairo, Egypt.
  3. Amin AA (2017b). Bayesian inference for double seasonal moving average models: A Gibbs sampling approach, Pakistan Journal of Statistics and Operation Research, 13, 483-499. https://doi.org/10.18187/pjsor.v13i3.1647
  4. Amin AA (2017c). Sensitivity to prior specification in Bayesian identification of autoregressive time series models, Pakistan Journal of Statistics and Operation Research, 14, 699-713. https://doi.org/10.18187/pjsor.v13i4.1498
  5. Amin AA (2018). Bayesian inference for double SARMA models, Communications in Statistics-Theory and Methods, 47, 5333-5345. https://doi.org/10.1080/03610926.2017.1390132
  6. Amin AA (2019a). Gibbs sampling for Bayesian prediction of SARMA processes, Pakistan Journal of Statistics and Operation Research, 15, 397-418. https://doi.org/10.18187/pjsor.v15i2.2174
  7. Amin AA (2019b). Kullback-Leibler divergence to evaluate posterior sensitivity to different priors for autoregressive time series models, Communications in Statistics-Simulation and Computation, 48, 1277-1291. https://doi.org/10.1080/03610918.2017.1410709
  8. Amin AA (2020). Bayesian analysis of double seasonal autoregressive models, Sankhya B, 82, 328-352. https://doi.org/10.1007/s13571-019-00192-z
  9. Amin AA (2022). Gibbs sampling for Bayesian estimation of triple seasonal autoregressive models, Communications in Statistics-Theory and Methods. Available from: https://doi.org/10.1080/03610926.2022.2043379
  10. Amin AA and Ismail MA (2015). Gibbs sampling for double seasonal autoregressive models, Communications for Statistical Applications and Methods, 22, 557-573. https://doi.org/10.5351/CSAM.2015.22.6.557
  11. Ali SS (1998). Approximations for posterior densities of moving average models: A comparative study (Master's thesis), Cairo University, Egypt.
  12. Broemeling LD and Shaarawy S (1984). Bayesian inferences and forecasts with moving average processes, Communications in Statistics, 13, 1871-1888. https://doi.org/10.1080/03610928408828800
  13. Broemeling LD and Shaarawy S (1988). Time series: A Bayesian analysis in the time domain. In Bayesian Analysis of Time Series and Dynamic Models (Ed. J. C. Spall, pp. 1-21), Marcel Dekker, New York.
  14. Box GEP, Jenkins GM, Reinsel GC, and Ljung GM (2015). Time Series Analysis: Forecasting and Control (5th ed), John Wiley & Sons, New York.
  15. Ismail MA (1994). Bayesian forecasting for nonlinear time series (Ph.D. thesis), University of Wales, UK.
  16. Ismail MA and Amin AA (2014). Gibbs sampling for SARMA models, Pakistan Journal of Statistics, 30, 153-168.
  17. Kullback S and Leibler RA (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86. https://doi.org/10.1214/aoms/1177729694
  18. McCulloch RE (1989). Local model influence, Journal of the American Statistical Association, 84, 473-478. https://doi.org/10.1080/01621459.1989.10478793
  19. Newbold P (1973). Bayesian estimation of Box-Jenkins transfer function noise models, Journal of the Royal Statistical Society: Series B, 35, 323-336.
  20. Shaarawy SM and Ali SS (2012). Bayesian model order selection of vector moving average processes, Communications in Statistics - Theory and Methods, 41, 684-698. https://doi.org/10.1080/03610926.2010.529531
  21. Soliman E (1999). On Bayesian time series analysis (Ph.D. thesis), Cairo University, Egypt.
  22. Zellner A and Reynolds R (1978). Bayesian analysis of ARMA models. In Proceedings of the Sixteenth Seminar on Bayesian Inference in Econometrics.