Improving a Test for Normality Based on Kullback-Leibler Discrimination Information


  • Choi, Byung-Jin (Department of Applied Information Statistics, Kyonggi University)
  • Published : 2007.03.31

Abstract

The test for normality introduced by Arizono and Ohta (1989) is based on Kullback-Leibler discrimination information. Its test statistic is derived from the discrimination information estimated using the sample entropy of Vasicek (1976) and the maximum likelihood estimator of the variance. Both of these estimators are biased, however, so it is reasonable to use unbiased estimators in order to estimate the discrimination information more accurately. In this paper, the Arizono-Ohta test for normality is improved. The derived test statistic is based on a bias-corrected entropy estimator and the uniformly minimum variance unbiased estimator of the variance. The properties of the improved KL test are investigated, and a Monte Carlo simulation is performed for power comparison.
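The construction the abstract describes can be sketched as follows. This is an illustrative implementation, not the paper's code: the window size `m` is a user choice, the paper's exact bias correction for the entropy estimator is not reproduced here, and only the swap from the MLE to the UMVU variance estimator is shown as an option.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek (1976) m-spacing estimator of the entropy H(f):
    H_mn = (1/n) * sum_i log( n * (X_(i+m) - X_(i-m)) / (2m) ),
    with order-statistic indices outside 1..n clamped to the sample extremes.
    Assumes a continuous sample (ties give zero spacings and log(0))."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(n)
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n * spacings / (2.0 * m)))

def kl_normality_statistic(x, m, unbiased_variance=False):
    """Estimated KL discrimination between the sample and the fitted normal:
    KL = (1/2) * log(2*pi*e*sigma^2) - H_mn; large values indicate departure
    from normality. unbiased_variance=False uses the variance MLE as in
    Arizono-Ohta (1989); True swaps in the UMVU variance estimator, one of
    the two modifications of the improved test (the entropy bias correction
    is omitted in this sketch)."""
    x = np.asarray(x, dtype=float)
    sigma2 = np.var(x, ddof=1 if unbiased_variance else 0)
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma2) - vasicek_entropy(x, m)
```

Critical values have no closed form and are obtained by simulating the statistic under the null, as in the paper's Monte Carlo power study; normality is rejected when the statistic exceeds the simulated quantile.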


References

  1. Kim, J. T. and Lee, W. D. (1998). Goodness-of-fit tests for the Weibull and extreme value distributions based on the Kullback-Leibler information function, The Korean Journal of Applied Statistics, 11, 351-362
  2. Arizono, I. and Ohta, H. (1989). A test for normality based on Kullback-Leibler information, The American Statistician, 43, 20-22 https://doi.org/10.2307/2685161
  3. Chandra, M., De Wet, T. and Singpurwalla, N. D. (1982). On the sample redundancy and a test for exponentiality, Communications in Statistics-Theory and Methods, 11, 429-438 https://doi.org/10.1080/03610928208828246
  4. D'Agostino, R. B. and Stephens, M. A. (1986). Goodness-of-fit Techniques, Marcel Dekker, New York
  5. Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967-974 https://doi.org/10.2307/2287597
  6. Ebrahimi, N., Habibullah, M. and Soofi, E. S. (1992). Testing exponentiality based on Kullback-Leibler information, Journal of the Royal Statistical Society, Ser. B, 54, 739-748
  7. Gokhale, D. V. (1983). On entropy-based goodness-of-fit tests, Computational Statistics & Data Analysis, 1, 157-165 https://doi.org/10.1016/0167-9473(83)90087-7
  8. Kim, J. T., Lee, W. D., Ko, J. H., Yoon, Y. H. and Kang, S. G. (1999). Goodness of fit test for normality based on Kullback-Leibler information, The Korean Communications in Statistics, 6, 909-917
  9. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86 https://doi.org/10.1214/aoms/1177729694
  10. Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York
  11. Shannon, C. E. (1948). A mathematical theory of communication, The Bell System Technical Journal, 27, 349-423, 623-656
  12. Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Ser. B, 38, 54-59
  13. Wieczorkowski, R. and Grzegorzewski, P. (1999). Entropy estimators-improvements and comparisons, Communications in Statistics-Simulation and Computation, 28, 541-567 https://doi.org/10.1080/03610919908813564