• Title/Summary/Keyword: Minimum Hellinger distance estimator

Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society
    • /
    • v.17 no.1
    • /
    • pp.213-220
    • /
    • 2006
  • Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In the process of defining a distance, a kernel density estimator has been widely used as the density estimator. In this article, however, we show that combining a kernel density estimator with an empirical density can result in a smaller bias of the minimum Hellinger distance estimator of a location parameter than using a kernel density estimator alone.

  • PDF
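The abstract above starts from the standard kernel-density-based MHDE construction. As a hypothetical illustration (not the authors' code; the contaminated sample and the fixed unit scale are assumptions for the example), the plain MHDE of a normal location parameter can be sketched as:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)
# Contaminated sample: 90% from N(0, 1), 10% outliers from N(10, 1)
data = np.concatenate([rng.normal(0.0, 1.0, 90), rng.normal(10.0, 1.0, 10)])

# Nonparametric density estimate of the data (the usual MHDE ingredient)
kde = gaussian_kde(data)
grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 2000)
g = kde(grid)
dx = grid[1] - grid[0]

def hellinger_sq(theta):
    """Squared Hellinger distance between the KDE and the N(theta, 1) model."""
    f = norm.pdf(grid, loc=theta, scale=1.0)
    return dx * np.sum((np.sqrt(f) - np.sqrt(g)) ** 2)

mhde = minimize_scalar(hellinger_sq, bounds=(-5.0, 5.0), method="bounded").x
mle = data.mean()  # the MLE is dragged toward the outliers; the MHDE is not
```

With this setup the sample mean is pulled toward the outliers, while the minimum Hellinger distance estimate stays near the true location 0, which is the robustness property the paper builds on.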

Minimum Hellinger Distance Based Goodness-of-fit Tests in Normal Models: Empirical Approach

  • Dong Bin Jeong
    • Communications for Statistical Applications and Methods
    • /
    • v.6 no.3
    • /
    • pp.967-976
    • /
    • 1999
  • In this paper we study Hellinger distance based goodness-of-fit tests that are analogs of likelihood ratio tests. The minimum Hellinger distance estimator (MHDE) in normal models provides an excellent robust alternative to the usual maximum likelihood estimator. Our simulation results show that the goodness-of-fit test based on the Hellinger deviance test (Simpson 1989) is robust when data contain outliers. The proposed Hellinger deviance test (Simpson 1989) is a more direct method for obtaining robust inferences than an automated outlier screening method applied before likelihood ratio test data analysis.

  • PDF

The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.6
    • /
    • pp.989-995
    • /
    • 2009
  • Basu et al. (1998) proposed a minimum divergence estimating method that is free from the painful kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ which controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions both the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) we show that in finite samples the MDPDE generally performs better than the MSDE, although there are some cases where the MSDE performs better when estimating a location parameter or a proportion of mixed distributions.
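For concreteness, the density power divergence objective of Basu et al. (1998) can be sketched for a normal location model with known unit scale. This is a minimal illustration, not code from the paper; the contaminated-sample setup and the choice $\alpha = 0.5$ are assumptions for the example. Note that no kernel density estimator is needed, which is the point of the abstract:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)
# Contaminated sample: 90% from N(0, 1), 10% outliers from N(10, 1)
data = np.concatenate([rng.normal(0.0, 1.0, 90), rng.normal(10.0, 1.0, 10)])

ALPHA = 0.5  # alpha -> 0 approaches the MLE; larger alpha trades efficiency for robustness

def dpd_objective(theta):
    """Empirical density power divergence for the N(theta, 1) model."""
    # Closed form of the integral of f_theta^(1 + alpha); constant in theta here
    integral = (2.0 * np.pi) ** (-ALPHA / 2.0) / np.sqrt(1.0 + ALPHA)
    empirical = np.mean(norm.pdf(data, loc=theta, scale=1.0) ** ALPHA)
    return integral - (1.0 + 1.0 / ALPHA) * empirical

mdpde = minimize_scalar(dpd_objective, bounds=(-5.0, 5.0), method="bounded").x
```

Because the outliers contribute almost nothing to the $\alpha$-downweighted empirical term, the minimizer stays near the true location 0 rather than the contaminated sample mean.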

Minimum Hellinger Distance Estimation and Minimum Density Power Divergence Estimation in Estimating Mixture Proportions

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society
    • /
    • v.16 no.4
    • /
    • pp.1159-1165
    • /
    • 2005
  • Basu et al. (1998) proposed a new density-based estimator, called the minimum density power divergence estimator (MDPDE), which avoids the use of nonparametric density estimation and associated complications such as bandwidth selection. Woodward et al. (1995) examined the minimum Hellinger distance estimator (MHDE), proposed by Beran (1977), in the case of estimating the mixture proportion in a mixture of two normals. In this article, we introduce the MDPDE for a mixture proportion, and show that both the MDPDE and the MHDE have the same asymptotic distribution at the model. A simulation study identifies some cases where the MHDE is consistently better than the MDPDE in terms of bias.

  • PDF
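As a hedged sketch of the setting in this abstract (the specific two-normal mixture, sample size, and true proportion are assumptions for illustration, not taken from the paper), the mixture proportion can be estimated by minimizing the Hellinger distance between a kernel density estimate and the mixture model:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(2)
p_true, n = 0.3, 300
# Mixture of two normals: p * N(0, 1) + (1 - p) * N(5, 1)
from_first = rng.random(n) < p_true
data = np.where(from_first, rng.normal(0.0, 1.0, n), rng.normal(5.0, 1.0, n))

kde = gaussian_kde(data)
grid = np.linspace(-4.0, 9.0, 2000)
g = kde(grid)
dx = grid[1] - grid[0]

def hellinger_sq(p):
    """Squared Hellinger distance between the KDE and the mixture model."""
    f = p * norm.pdf(grid, 0.0, 1.0) + (1.0 - p) * norm.pdf(grid, 5.0, 1.0)
    return dx * np.sum((np.sqrt(f) - np.sqrt(g)) ** 2)

p_hat = minimize_scalar(hellinger_sq, bounds=(0.0, 1.0), method="bounded").x
```

With well-separated components the estimated proportion lands close to the true value; the MDPDE alternative discussed in the abstract would replace the Hellinger objective with the density power divergence and drop the kernel density estimate.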

Minimum Disparity Estimation for Normal Models: Small Sample Efficiency

  • Cho M. J.;Hong C. S.;Jeong D. B.
    • Communications for Statistical Applications and Methods
    • /
    • v.12 no.1
    • /
    • pp.149-167
    • /
    • 2005
  • The minimum disparity estimators introduced by Lindsay and Basu (1994) are studied empirically. An extensive simulation in this paper provides small-sample location estimates and supplies empirical evidence of the estimators' performance for the univariate contaminated normal model. Empirical results show that the minimum generalized negative exponential disparity estimator (MGNEDE) attains high efficiency for small sample sizes and dominates the maximum likelihood estimator (MLE) and the minimum blended weight Hellinger distance estimator (MBWHDE) with respect to efficiency at the contaminated model.

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society
    • /
    • v.24 no.2
    • /
    • pp.349-360
    • /
    • 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.

  • PDF

Negative Exponential Disparity Based Deviance and Goodness-of-fit Tests for Continuous Models: Distributions, Efficiency and Robustness

  • Jeong, Dong-Bin;Sahadeb Sarkar
    • Journal of the Korean Statistical Society
    • /
    • v.30 no.1
    • /
    • pp.41-61
    • /
    • 2001
  • The minimum negative exponential disparity estimator (MNEDE), introduced by Lindsay (1994), is an excellent competitor to the minimum Hellinger distance estimator (Beran 1977) as a robust and yet efficient alternative to the maximum likelihood estimator in parametric models. In this paper we define the negative exponential deviance test (NEDT) as an analog of the likelihood ratio test (LRT), and show that the NEDT is asymptotically equivalent to the LRT at the model and under a sequence of contiguous alternatives. We establish that the asymptotic strong breakdown point for a class of minimum disparity estimators, containing the MNEDE, is at least 1/2 in continuous models. This result leads us to anticipate robustness of the NEDT under data contamination, and we demonstrate it empirically. In fact, in the simulation settings considered here the empirical level of the NEDT shows more stability than that of the Hellinger deviance test (Simpson 1989). The NEDT is illustrated through an example data set. We also define a goodness-of-fit statistic to assess the adequacy of a specified parametric model, and establish its asymptotic normality under the null hypothesis.

  • PDF

Penalizing the Negative Exponential Disparity in Discrete Models

  • Sahadeb Sarkar;Song, Kijoung-Song;Jeong, Dong-Bin
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.2
    • /
    • pp.517-529
    • /
    • 1998
  • When the sample size is small, the robust minimum Hellinger distance (HD) estimator can have substantially poor relative efficiency at the true model. Similarly, approximating the exact null distributions of the ordinary Hellinger distance tests with the limiting chi-square distributions can be quite inappropriate in small samples. To overcome these problems Harris and Basu (1994) and Basu et al. (1996) recommended using a modified HD called the penalized Hellinger distance (PHD). Lindsay (1994) and Basu et al. (1997) showed that another density-based distance, namely the negative exponential disparity (NED), is a major competitor to the Hellinger distance in producing an asymptotically fully efficient and robust estimator. In this paper we investigate the small-sample performance of the estimates and tests based on the NED and the penalized NED (PNED). Our results indicate that, in the settings considered here, the NED, unlike the HD, produces estimators that perform very well in small samples, and penalizing the NED does not help. However, in testing of hypotheses, the deviance test based on the PNED appears to achieve the best small-sample level compared to tests based on the NED, HD and PHD.

  • PDF

Robust Discriminant Analysis using Minimum Disparity Estimators

  • Cho, M.J.;Hong, C.S.;Jeong, D.B.
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2004.11a
    • /
    • pp.135-140
    • /
    • 2004
  • The minimum disparity estimators introduced by Lindsay and Basu (1994) are useful tools for real data analysis. In this paper, a simulation study confirms that the minimum generalized negative exponential disparity estimator (MGNEDE) is efficient and robust under the contaminated normal model compared with the maximum likelihood estimator (MLE) and the minimum blended weight Hellinger distance estimator (MBWHDE). Moreover, by comparing the classification rates of discriminant analyses based on the parameters estimated by the three estimators, we show that the MGNEDE and the MBWHDE can be used as alternatives to the MLE under the contaminated normal model.

  • PDF