• Title/Summary/Keyword: Density estimator

ON HELLINGER CONSISTENT DENSITY ESTIMATION

  • Nicoleris, Theodoros; Walker, Stephen G.
    • Journal of the Korean Statistical Society / v.32 no.3 / pp.261-270 / 2003
  • This paper introduces a new density estimator which is Hellinger consistent under a simple condition. A number of issues are discussed, such as extension to Kullback-Leibler consistency, robustness, the Bayes version of the estimator and the maximum likelihood case. An illustration is presented.

The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.989-995 / 2009
  • Basu et al. (1998) proposed the minimum divergence estimating method, which avoids the use of a troublesome kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ which controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance estimators, which ranges from the minimum Hellinger distance estimator to the minimum $L_2$ distance estimator, and we show that under certain conditions both the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) in finite samples the MDPDE generally performs better than the MSDE, although there are some cases where the MSDE performs better, such as when estimating a location parameter or a proportion of mixed distributions.
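
The density power divergence idea described above can be sketched numerically. The following is a minimal illustration, not code from any of these papers: for a $N(\theta, 1)$ model, the $\theta$-dependent part of the empirical divergence of Basu et al. (1998) is minimized by a simple grid search, and with $\alpha = 0.5$ the location estimate resists gross outliers that pull the sample mean away. The function names and the grid-search optimizer are our own choices.

```python
import math
import random

def mdpde_objective(theta, xs, alpha):
    """Empirical density power divergence objective for a N(theta, 1) model
    (Basu et al. 1998 form, up to a term that does not involve theta)."""
    phi = lambda x: math.exp(-0.5 * (x - theta) ** 2) / math.sqrt(2 * math.pi)
    # integral of f_theta^{1+alpha} for N(theta, 1): constant in theta
    int_term = (2 * math.pi) ** (-alpha / 2) / math.sqrt(1 + alpha)
    data_term = sum(phi(x) ** alpha for x in xs) / len(xs)
    return int_term - (1 + 1 / alpha) * data_term

def mdpde_location(xs, alpha, grid=None):
    """Grid-search MDPDE of the location parameter."""
    if grid is None:
        lo, hi = min(xs), max(xs)
        grid = [lo + i * (hi - lo) / 400 for i in range(401)]
    return min(grid, key=lambda t: mdpde_objective(t, xs, alpha))

random.seed(0)
# 95% N(0, 1) data contaminated with 5% gross outliers at 10
xs = [random.gauss(0, 1) for _ in range(190)] + [10.0] * 10
print(round(mdpde_location(xs, alpha=0.5), 2))  # robust: stays near 0
print(round(sum(xs) / len(xs), 2))              # sample mean is pulled toward 10
```

Larger $\alpha$ downweights outlying observations more aggressively, at some cost in efficiency at the model, which is exactly the trade-off the abstract describes.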

Minimum Density Power Divergence Estimator for Diffusion Parameter in Discretely Observed Diffusion Processes

  • Song, Jun-Mo; Lee, Sang-Yeol; Na, Ok-Young; Kim, Hyo-Jung
    • Communications for Statistical Applications and Methods / v.14 no.2 / pp.267-280 / 2007
  • In this paper, we consider robust estimation for diffusion processes when the sample is observed discretely. As a robust estimator, we consider the minimum density power divergence estimator (MDPDE) proposed by Basu et al. (1998). It is shown that the MDPDE for diffusion processes is weakly consistent. A simulation study demonstrates the robustness of the MDPDE.

The Nonparametric Deconvolution Problem with Gaussian Error Distribution

  • Cho, Wan-Hyun; Park, Jeong-Soo
    • Journal of the Korean Statistical Society / v.25 no.2 / pp.265-276 / 1996
  • Nonparametric deconvolution problems are studied in order to recover an unknown density when the data are contaminated with Gaussian error. We propose an estimator which is a linear combination of kernel-type estimates of the derivatives of the observed density function. We show that this estimator is consistent, and we also examine its small-sample properties by simulation.
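
The form of such a derivative-based estimator can be motivated by a standard Fourier argument (our own sketch, not the construction in the paper). If the observed density is the convolution $g = f * \phi_\sigma$ of the target $f$ with a $N(0,\sigma^2)$ error density, then inverting the Gaussian characteristic function term by term gives

$$ f(x) \;=\; \sum_{k=0}^{\infty} \frac{(-\sigma^2/2)^k}{k!}\, g^{(2k)}(x), $$

so truncating the series and replacing each derivative $g^{(2k)}$ by a kernel estimate of it yields exactly a linear combination of kernel-type estimates of derivatives of the observed density.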

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.349-360 / 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.

A Note on Deconvolution Estimators when Measurement Errors are Normal

  • Lee, Sung-Ho
    • Communications for Statistical Applications and Methods / v.19 no.4 / pp.517-526 / 2012
  • In this paper a support vector method is proposed for use when the sample observations are contaminated by a normally distributed measurement error. The performance of deconvolution density estimators based on the support vector method is explored and compared with kernel density estimators by means of a simulation study. An interesting result is that, for the estimation of a kurtotic density, the support vector deconvolution estimator with a Gaussian kernel performs better than the classical deconvolution kernel estimator.

On Bias Reduction in Kernel Density Estimation

  • Kim, Choongrak; Park, Byeong-Uk; Kim, Woochul
    • Proceedings of the Korean Statistical Society Conference / 2000.11a / pp.65-73 / 2000
  • The kernel estimator is very popular in nonparametric density estimation. In this paper we propose an estimator which reduces the bias to the fourth power of the bandwidth, while the variance of the estimator increases by at most a moderate constant factor. The estimator is fully nonparametric, in the sense that it is a convex combination of three kernel estimators, and has good numerical properties.
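
The paper's own convex three-kernel construction is not reproduced here, but the basic bias-cancellation idea can be illustrated with the classical generalized-jackknife combination of two bandwidths, $(4\hat f_h - \hat f_{2h})/3$, which likewise removes the $h^2$ bias term and leaves a bias of order $h^4$. This is a hedged sketch with our own illustrative function names.

```python
import math
import random

def kde(xs, h):
    """Gaussian kernel density estimator with bandwidth h."""
    n = len(xs)
    def f(x):
        return sum(math.exp(-0.5 * ((x - xi) / h) ** 2)
                   for xi in xs) / (n * h * math.sqrt(2 * math.pi))
    return f

def jackknife_kde(xs, h):
    """Generalized-jackknife combination (4*f_h - f_{2h})/3: the h^2 bias
    terms of the two estimates cancel, leaving O(h^4) bias."""
    f_h, f_2h = kde(xs, h), kde(xs, 2 * h)
    return lambda x: (4 * f_h(x) - f_2h(x)) / 3

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(500)]
f_tilde = jackknife_kde(xs, h=0.4)
# the combination still integrates to (approximately) one
grid = [-6 + 0.01 * i for i in range(1201)]
mass = sum(f_tilde(x) for x in grid) * 0.01
print(round(mass, 3))
```

Note that with the weights $(4/3, -1/3)$ this combination is not convex, so unlike the paper's estimator it can dip slightly below zero; making the combination convex while keeping the $h^4$ bias is precisely the refinement the abstract claims.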

M-Estimation Functions Induced From Minimum L$_2$ Distance Estimation

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.27 no.4 / pp.507-514 / 1998
  • Minimum distance estimation based on the L$_2$ distance between a model density and a density estimator is studied from the M-estimation point of view. We show how a model density and a density estimator are incorporated in order to create an M-estimating function. This method enables us to create M-estimating functions reflecting the natures of both an assumed model density and a given set of data. Some new types of M-estimating functions for estimating location and scale parameters are introduced.
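
The construction can be sketched in one line (our own hedged derivation in standard notation, not the paper's). Minimizing the L$_2$ distance $\int (f_\theta - \hat f_n)^2\,dx$, where $\hat f_n$ is a kernel estimator with symmetric kernel $K_h$, gives the estimating equation

$$ 0 \;=\; \int \partial_\theta f_\theta(x)\, f_\theta(x)\, dx \;-\; \frac{1}{n} \sum_{i=1}^{n} \bigl(K_h * \partial_\theta f_\theta\bigr)(X_i), $$

which has the M-estimation form $\frac{1}{n}\sum_i \psi_\theta(X_i) = c(\theta)$ with $\psi_\theta = K_h * \partial_\theta f_\theta$: a score-like function smoothed by the kernel, so the resulting M-estimating function reflects both the assumed model density and the data-driven smoothing.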

The Region of Positivity and Unimodality in the Truncated Series of a Nonparametric Kernel Density Estimator

  • Gupta, A.K.; Im, B.K.K.
    • Journal of the Korean Statistical Society / v.10 / pp.140-144 / 1981
  • This paper approximates a kernel density estimate by a truncated series expansion involving Hermite polynomials, since this can ease the computing burden involved in kernel-based density estimation. However, the truncated series may give a multimodal estimate even when we are estimating a unimodal density. In this paper we show a way to ensure that the truncated series is positive and unimodal, so that the approximation to a kernel density estimator is meaningful.
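
The flavor of such an expansion can be sketched as follows (our own illustration, not the construction in the paper). For a unit-bandwidth Gaussian kernel, the generating function $\phi(x-t) = \phi(x)\sum_k \mathrm{He}_k(x)\, t^k/k!$ turns the kernel estimate into a Hermite series whose coefficients are sample moments; truncating the series gives a cheap approximation that no longer requires a pass over the data at every evaluation point.

```python
import math
import random

def hermite_vals(x, K):
    """Probabilists' Hermite polynomials He_0..He_K at x via the recurrence
    He_{k+1}(x) = x*He_k(x) - k*He_{k-1}(x)."""
    vals = [1.0, x]
    for k in range(1, K):
        vals.append(x * vals[k] - k * vals[k - 1])
    return vals[:K + 1]

def truncated_kde(xs, x, K):
    """Truncated Hermite-series approximation of the unit-bandwidth Gaussian
    KDE, using phi(x - t) = phi(x) * sum_k He_k(x) * t^k / k!."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
    n = len(xs)
    moments = [sum(t ** k for t in xs) / n for k in range(K + 1)]
    he = hermite_vals(x, K)
    return phi * sum(moments[k] * he[k] / math.factorial(k)
                     for k in range(K + 1))

def kde(xs, x):
    """Direct unit-bandwidth Gaussian KDE for comparison."""
    return sum(math.exp(-0.5 * (x - t) ** 2)
               for t in xs) / (len(xs) * math.sqrt(2 * math.pi))

random.seed(2)
xs = [random.gauss(0, 0.5) for _ in range(200)]
x0 = 0.5
# the truncated series tracks the direct estimate closely for small |X_i|
print(round(kde(xs, x0), 3), round(truncated_kde(xs, x0, K=12), 3))
```

As the abstract warns, nothing in the raw truncation forces positivity or unimodality; the paper's contribution is a condition under which those properties are guaranteed.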

Minimum Hellinger Distance Estimation and Minimum Density Power Divergence Estimation in Estimating Mixture Proportions

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1159-1165 / 2005
  • Basu et al. (1998) proposed a new density-based estimator, called the minimum density power divergence estimator (MDPDE), which avoids the use of nonparametric density estimation and associated complications such as bandwidth selection. Woodward et al. (1995) examined the minimum Hellinger distance estimator (MHDE), proposed by Beran (1977), for estimating the mixture proportion in a mixture of two normals. In this article, we introduce the MDPDE for a mixture proportion and show that both the MDPDE and the MHDE have the same asymptotic distribution at the model. A simulation study identifies some cases where the MHDE is consistently better than the MDPDE in terms of bias.
