A CONSISTENT AND BIAS CORRECTED EXTENSION OF AKAIKE'S INFORMATION CRITERION (AIC): AICbc(k)

  • Kwon, Soon H. (School of Electrical and Electronic Eng., Yeungnam Univ.) ;
  • Ueno, M. (Dept. of Cognitive & Information Science, Chiba Univ.) ;
  • Sugeno, M. (Dept. of Computational Intelligence & Systems Science, Tokyo Institute of Technology)
  • Published : 1998.06.30

Abstract

This paper derives a consistent and bias-corrected extension of Akaike's Information Criterion (AIC), $AIC_{bc}$, based on Kullback-Leibler information. The criterion contains terms that penalize overparametrization more strongly than AIC does for both small and large samples, thereby overcoming the overfitting problem of asymptotically efficient model selection criteria in those regimes. The $AIC_{bc}$ also provides consistent model order selection. It is therefore widely applicable to data with small and/or large sample sizes, and to cases where the number of free parameters is a relatively large fraction of the sample size. Relationships with other model selection criteria, such as the $AIC_c$ of Hurvich and the CAICF of Bozdogan, are discussed. The empirical performance of $AIC_{bc}$ in model order selection for a linear regression model is studied via a Monte Carlo experiment.
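The abstract does not give the $AIC_{bc}$ formula itself, but the model-order-selection setup it addresses can be sketched with the standard criteria it compares against. The sketch below fits polynomial regression models of increasing order and scores them with AIC ($n \ln(\mathrm{RSS}/n) + 2k$) and Hurvich's small-sample correction $AIC_c$ ($AIC + 2k(k+1)/(n-k-1)$); these are well-known formulas, not the paper's $AIC_{bc}$, and the simulated data and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a cubic polynomial (true order 3) with a small
# sample size, the regime where stronger penalties matter most.
n = 30
x = rng.uniform(-1, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.5 * x**3 + rng.normal(0.0, 0.3, n)

def fit_rss(order):
    """Least-squares polynomial fit; returns the residual sum of squares."""
    X = np.vander(x, order + 1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ coef) ** 2))

def aic(order):
    # k counts the polynomial coefficients plus the noise variance.
    k = order + 2
    return n * np.log(fit_rss(order) / n) + 2 * k

def aicc(order):
    # Hurvich & Tsai's AICc: AIC plus a correction that grows with k/n.
    # (Illustrative stand-in; the paper's AICbc is not reproduced here.)
    k = order + 2
    return aic(order) + 2 * k * (k + 1) / (n - k - 1)

orders = range(1, 10)
best_aic = min(orders, key=aic)
best_aicc = min(orders, key=aicc)
print("AIC picks order", best_aic, "| AICc picks order", best_aicc)
```

Because the AICc correction term $2k(k+1)/(n-k-1)$ grows with the parameter count, the corrected criterion is less prone to choosing an overfitted (too-high) order at small $n$, which is the kind of behavior the paper's Monte Carlo experiment evaluates.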