• Title/Summary/Keyword: 표본 엔트로피 (sample entropy)

A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution (역가우스분포에 대한 변형된 엔트로피 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.2
    • /
    • pp.383-391
    • /
    • 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as an entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as an entropy estimator for the inverse Gaussian distribution. The empirically determined critical values of the test statistic are provided in tabular form. Monte Carlo simulations are performed to compare the proposed test with the previous entropy-based test in terms of power.

Entropy-Constrained Sample-Adaptive Product Quantizer Design for the High Bit-Rate Quantization (고 전송률 양자화를 위한 엔트로피 제한 표본 적응 프로덕트 양자기 설계)

  • Kim, Dong-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.49 no.1
    • /
    • pp.11-18
    • /
    • 2012
  • In this paper, an entropy-constrained vector quantizer for high bit rates is proposed. The sample-adaptive product quantizer (SAPQ), which is based on product codebooks, is employed, and a design algorithm for the entropy-constrained sample-adaptive product quantizer (ECSAPQ) is proposed. The proposed ECSAPQ outperforms the entropy-constrained vector quantizer by 0.5 dB. It is also shown that the ECSAPQ distortion curve, which is based on the scalar quantizer, lies below the high-rate theoretical curve of the entropy-constrained scalar quantizer, where the theoretical curve has a 1.53 dB gap from Shannon's lower bound.
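The 1.53 dB figure comes from high-rate quantization theory: an entropy-coded uniform scalar quantizer attains distortion near Δ²/12 at a rate about 0.254 bit (equivalently 1.53 dB in distortion) above Shannon's lower bound. A small sketch, assuming a plain mid-tread quantizer on a Gaussian source (not the paper's SAPQ design), illustrates both quantities:

```python
import math
import random
from collections import Counter

def uniform_quantize(samples, step):
    """Mid-tread uniform scalar quantizer: index = round(x / step)."""
    return [round(x / step) for x in samples]

def empirical_entropy_bits(indices):
    """Entropy (bits) of the empirical distribution of the output indices,
    i.e. the rate an ideal entropy coder would approach."""
    n = len(indices)
    return -sum(c / n * math.log2(c / n) for c in Counter(indices).values())

random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(100_000)]
step = 0.25
idx = uniform_quantize(x, step)
# High-rate theory: MSE ~ step^2 / 12, and the entropy-coded uniform
# quantizer's rate sits about 0.254 bit above the Shannon bound.
mse = sum((i * step - xi) ** 2 for i, xi in zip(idx, x)) / len(x)
rate = empirical_entropy_bits(idx)
```

At this step size the empirical MSE tracks Δ²/12 closely, and the empirical rate is near h(X) − log₂Δ ≈ 4.05 bits for the unit-variance Gaussian.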

Entropy-Coded Lattice Vector Quantization Based on the Sample-Adaptive Product Quantizer and its Performance for the Memoryless Gaussian Source (표본 적응 프로덕트 양자기에 기초한 격자 벡터 양자화의 엔트로피 부호화와 무기억성 가우시언 분포에 대한 성능 분석)

  • Kim, Dong Sik
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.49 no.9
    • /
    • pp.67-75
    • /
    • 2012
  • Optimal quantizers for entropy-constrained quantization at high bit rates have a lattice structure. The quantization process is simple due to the regular structure, and various quantization algorithms have been proposed depending on the lattice. Such a lattice vector quantizer (VQ) can be implemented using the sample-adaptive product quantizer (SAPQ), and its output can also be easily entropy encoded. In this paper, an entropy encoding scheme for the lattice VQ is proposed based on SAPQ, and the performance of the proposed SAPQ-based lattice VQ with the entropy coder is compared asymptotically as the rate increases. It is shown by experiment that the gain for the memoryless Gaussian source also approaches the theoretical gain for the uniform density case.

Prior distributions using the entropy principles (엔트로피 이론을 이용한 사전 확률 분포함수의 추정)

  • Lee, Jung-Jin;Shin, Wan-Seon
    • The Korean Journal of Applied Statistics
    • /
    • v.3 no.2
    • /
    • pp.91-105
    • /
    • 1990
  • Several practical prior distributions are derived using the maximum entropy principle. In addition, an interactive method for estimating a prior distribution based on the minimum cross-entropy principle is proposed for the case where multiple pieces of prior information are available. The consistency of the prior distributions obtained by the entropy principles is discussed.
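As a hedged illustration of the maximum entropy principle the paper applies (this is not the authors' interactive method), the maximum-entropy prior on a finite support under a mean constraint has exponential-family form p(v) ∝ exp(λv), with λ found by a one-dimensional search:

```python
import math

def maxent_prior(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution on a finite support {values} subject
    to a fixed mean. The solution has exponential-family form
    p(v) ∝ exp(lam * v); lam is found by bisection, which works because
    the mean is monotone increasing in lam."""
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]
```

With support {1, …, 6} and mean 3.5 this recovers the uniform prior (λ = 0); raising the target mean tilts the prior toward larger values.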

Goodness-of-fit test for normal distribution based on parametric and nonparametric entropy estimators (모수적 엔트로피 추정량과 비모수적 엔트로피 추정량에 기초한 정규분포에 대한 적합도 검정)

  • Choi, Byungjin
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.4
    • /
    • pp.847-856
    • /
    • 2013
  • In this paper, we deal with testing goodness-of-fit for the normal distribution based on parametric and nonparametric entropy estimators. The minimum variance unbiased estimator of the entropy of the normal distribution is derived as a parametric entropy estimator to be used in constructing a test statistic. For a nonparametric entropy estimator of the data-generating distribution under the alternative hypothesis, sample entropy and its modifications are used. The critical values of the proposed tests are estimated by Monte Carlo simulations and presented in tabular form. The performance of the proposed tests under some selected alternatives is investigated by means of simulations. The results show that the proposed tests have better power than the previous entropy-based test of Vasicek (1976). In applications, the new tests are expected to serve as a competitive tool for testing normality.

A Comparison on the Empirical Power of Some Normality Tests

  • Kim, Dae-Hak;Eom, Jun-Hyeok;Jeong, Heong-Chul
    • Journal of the Korean Data and Information Science Society
    • /
    • v.17 no.1
    • /
    • pp.31-39
    • /
    • 2006
  • In many cases, desired information is obtained through appropriate statistical analysis of collected data sets, and much statistical theory relies on the assumption that the data are normally distributed. In this paper, we compare the empirical power of several normality tests, including one based on the sample entropy quantity. Monte Carlo simulation is conducted to calculate the empirical power of the considered normality tests, varying the sample size over various distributions.
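The Monte Carlo procedure described above can be sketched generically. The following illustration uses a Jarque-Bera-type moment statistic as a stand-in test (the paper compares several tests, including an entropy-based one); the critical value is simulated under the null and the rejection rate is then estimated under a chosen alternative.

```python
import math
import random

def jb_statistic(x):
    """Jarque-Bera-type statistic from sample skewness and excess kurtosis."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0
    return n / 6.0 * (skew ** 2 + kurt ** 2 / 4.0)

def empirical_power(alt_sampler, n=50, alpha=0.05, reps=2000, seed=7):
    """Monte Carlo power: the critical value is simulated under N(0, 1),
    then the rejection rate is estimated under the alternative."""
    rng = random.Random(seed)
    null_stats = sorted(
        jb_statistic([rng.gauss(0, 1) for _ in range(n)]) for _ in range(reps)
    )
    crit = null_stats[int((1 - alpha) * reps)]  # simulated critical value
    rejections = sum(
        jb_statistic([alt_sampler(rng) for _ in range(n)]) > crit
        for _ in range(reps)
    )
    return rejections / reps
```

Against a strongly skewed alternative such as Exp(1) the empirical power is high, while under the null the rejection rate stays near the nominal level α.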

Comparison of Two Parametric Estimators for the Entropy of the Lognormal Distribution (로그정규분포의 엔트로피에 대한 두 모수적 추정량의 비교)

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.18 no.5
    • /
    • pp.625-636
    • /
    • 2011
  • This paper proposes two parametric estimators, the minimum variance unbiased estimator and the maximum likelihood estimator, for the entropy of the lognormal distribution, and compares their properties. The variances of both estimators are derived. The influence of the bias of the maximum likelihood estimator on estimation is analytically revealed. The distributions of the proposed estimators, obtained by the delta approximation method, are also presented. Performance comparisons are made between the two estimators. The following observations are made from the results. The MSE efficiency of the minimum variance unbiased estimator is consistently high and increases rapidly as the sample size and variance, n and ${\sigma}^2$, become simultaneously small. In conclusion, the minimum variance unbiased estimator outperforms the maximum likelihood estimator.

Double Clustering of Gene Expression Data Based on the Information Bottleneck Method (정보병목기법에 기반한 유전자 발현 데이터의 이중 클러스터링)

  • 김병희;황규백;장정호;장병탁
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2003.04c
    • /
    • pp.362-364
    • /
    • 2003
  • In functional genomics, clustering is one of the main tools for analyzing high-dimensional microarray data. This paper proposes a hierarchical agglomerative clustering method for gene expression data via double clustering based on the information bottleneck method. Given the joint probability distribution of two random variables, the information bottleneck method compresses one variable while preserving as much of the mutual information between the two variables as possible; compressing the two variables in turn constitutes double clustering. In experiments on the NC160 dataset (gene expression data from cancer cells), genes were first clustered according to their expression patterns, the samples were then clustered using the gene clusters, and the performance was analyzed from several angles. Examining performance in terms of mutual information, the numbers of gene and sample clusters, and an entropy measure, we were able to verify the assumption that samples can be distinguished by their tissue of origin, and to establish a threshold criterion for determining an appropriate number of clusters.
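The quantity the information bottleneck preserves is the mutual information between the two variables. A minimal helper computing I(X;Y) from a joint probability table is sketched below; this is the building block only, not the double-clustering algorithm itself.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table,
    joint[i][j] = P(X=i, Y=j). Zero-probability cells contribute nothing."""
    px = [sum(row) for row in joint]           # marginal P(X)
    py = [sum(col) for col in zip(*joint)]     # marginal P(Y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi
```

For independent variables I(X;Y) = 0, and for two perfectly correlated binary variables it equals 1 bit; agglomerative bottleneck methods merge clusters so as to lose as little of this quantity as possible.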

Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution (역가우스분포에 대한 쿨백-라이블러 정보 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.6
    • /
    • pp.1271-1284
    • /
    • 2011
  • The entropy-based test of fit for the inverse Gaussian distribution presented by Mudholkar and Tian (2002) can only be applied to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In application, however, a researcher may want a test of fit either for an inverse Gaussian distribution with one parameter known or for an inverse Gaussian distribution with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information, as an extension of the entropy-based test. A window size must be chosen to implement the proposed tests. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes and the corresponding critical values of the test statistics are estimated. The results of power analysis for various alternatives show that the Kullback-Leibler information-based goodness-of-fit tests have good power.

Improving a Test for Normality Based on Kullback-Leibler Discrimination Information (쿨백-라이블러 판별정보에 기반을 둔 정규성 검정의 개선)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.20 no.1
    • /
    • pp.79-89
    • /
    • 2007
  • A test for normality introduced by Arizono and Ohta (1989) is based on Kullback-Leibler discrimination information. The test statistic is derived from the discrimination information estimated using the sample entropy of Vasicek (1976) and the maximum likelihood estimator of the variance. However, these estimators are biased, so it is reasonable to use unbiased estimators to estimate the discrimination information accurately. In this paper, the Arizono-Ohta test for normality is improved. The derived test statistic is based on the bias-corrected entropy estimator and the uniformly minimum variance unbiased estimator of the variance. The properties of the improved KL test are investigated, and Monte Carlo simulation is performed for power comparison.