• Title/Summary/Keyword: Information Entropy

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.; Borzadaran, G.R. Mohtashami; Yari, G.H.
    • Communications for Statistical Applications and Methods / v.16 no.5 / pp.813-820 / 2009
  • In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). We then derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
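
As background, here is a minimal sketch of the two classical quantities being generalized, the Shannon entropy and the Kullback-Leibler (relative) entropy, for discrete distributions; the distributions p and q below are illustrative placeholders, not data from the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 log 0 is taken as 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Illustrative distributions (placeholders)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(shannon_entropy(p))   # ~1.0297 nats
print(kl_divergence(p, q))  # ~0.0253 nats
```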

A Comparison on the Differential Entropy

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.16 no.3 / pp.705-712 / 2005
  • Entropy is a basic concept of information theory. It is well defined for random variables with a known probability density function (pdf); for data with an unknown pdf, entropy must be estimated, usually via approximations. In this paper, we consider a kernel-based approximation and compare it to the cumulant approximation method for several distributions. Monte Carlo simulations are conducted for various sample sizes.
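
To illustrate the two families of approximation being compared, here is a hedged sketch of a kernel-based (KDE resubstitution) estimator alongside one common cumulant (Edgeworth-type) approximation of differential entropy; the specific estimators and distributions in the paper may differ.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_entropy(x):
    """Kernel (resubstitution) estimate of differential entropy:
    H ≈ -(1/n) * sum_i log f_hat(x_i), with f_hat a Gaussian KDE."""
    kde = gaussian_kde(x)
    return -np.mean(np.log(kde(x)))

def cumulant_entropy(x):
    """Cumulant (Edgeworth-type) approximation: H ≈ H_gauss - J, with
    negentropy J ≈ kappa3^2/12 + kappa4^2/48 on standardized data."""
    z = (x - x.mean()) / x.std()
    k3 = np.mean(z**3)                 # third cumulant (skewness)
    k4 = np.mean(z**4) - 3.0           # fourth cumulant (excess kurtosis)
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * x.var())
    return h_gauss - (k3**2 / 12.0 + k4**2 / 48.0)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)

# True differential entropy of N(0,1): 0.5*log(2*pi*e) ≈ 1.4189 nats
true_h = 0.5 * np.log(2 * np.pi * np.e)
print(f"kernel: {kde_entropy(x):.4f}  cumulant: {cumulant_entropy(x):.4f}  true: {true_h:.4f}")
```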


Calculation of the Entropy and Free Energy of a Polymer during Polymer Collapse for EDISON App Development and Education (EDISON 앱 개발 및 교육을 위한 Polymer Collapse 중 Polymer의 Entropy 및 Free Energy 계산)

  • Park, Yun-Jae; Jang, Rak-U
    • Proceeding of EDISON Challenge / 2017.03a / pp.75-81 / 2017
  • The polymer collapse transition has been studied extensively, but the entropy and free energy of individual microstates have not been calculated. A recent paper on local nonequilibrium thermodynamics identified and characterized a physical quantity that determines the probability distribution over individual microstates in nonequilibrium conditions, and showed that the quantity called "information" carried by particular states is correlated with the internal energy and the entropy. That work also defined entropy via the information-theoretic Shannon entropy and computed physical quantities such as the free energy. Building on this, we use the Shannon entropy of information theory and the free energy defined from it to calculate the entropy and free energy during polymer collapse.
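
As a rough illustration of the quantities described above, the sketch below computes a Shannon (Gibbs) entropy, mean internal energy, and free energy F = U - TS over a toy set of microstates; the energies and temperature are placeholders, not values from the polymer simulation.

```python
import numpy as np

# Illustrative microstate energies (placeholder values, units of k_B*T)
E = np.array([0.0, 1.0, 2.0, 3.0])
T = 1.0                                # temperature (k_B = 1)

# Boltzmann distribution over microstates
p = np.exp(-E / T)
p /= p.sum()

S = -np.sum(p * np.log(p))             # Shannon (Gibbs) entropy, k_B = 1
U = np.sum(p * E)                      # mean internal energy
F = U - T * S                          # free energy F = U - T*S
print(f"S = {S:.4f}, U = {U:.4f}, F = {F:.4f}")
```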


Speech Emotion Recognition Based on GMM Using FFT and MFB Spectral Entropy (FFT와 MFB Spectral Entropy를 이용한 GMM 기반의 감정인식)

  • Lee, Woo-Seok; Roh, Yong-Wan; Hong, Hwang-Seok
    • Proceedings of the KIEE Conference / 2008.04a / pp.99-100 / 2008
  • This paper proposes a Gaussian Mixture Model (GMM)-based speech emotion recognition method using four feature parameters: 1) Fast Fourier Transform (FFT) spectral entropy, 2) delta FFT spectral entropy, 3) Mel-frequency Filter Bank (MFB) spectral entropy, and 4) delta MFB spectral entropy. We use a speech database covering four emotions (anger, sadness, happiness, and neutrality) and perform recognition experiments for each pre-defined emotion and gender. The experimental results show that the proposed recognition using FFT spectral entropy and MFB spectral entropy outperforms existing GMM-based emotion recognition using energy, Zero Crossing Rate (ZCR), Linear Prediction Coefficient (LPC), and pitch parameters. We attained a maximum recognition rate of 75.1% when using MFB spectral entropy and delta MFB spectral entropy.
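
The following sketch shows one plausible form of the FFT spectral entropy feature named above: the frame's power spectrum is normalized into a probability mass function whose Shannon entropy is the feature. The signal, frame length, and sampling rate are illustrative assumptions; the MFB variant would apply a mel filter bank to the spectrum first.

```python
import numpy as np

def spectral_entropy(frame, eps=1e-12):
    """FFT spectral entropy of one frame: normalize the power
    spectrum into a probability mass function, then take its
    Shannon entropy (bits)."""
    spec = np.abs(np.fft.rfft(frame)) ** 2
    pmf = spec / (spec.sum() + eps)
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log2(pmf))

# Illustrative frame: 25 ms of a 440 Hz tone at 16 kHz (placeholder signal)
fs = 16000
t = np.arange(int(0.025 * fs)) / fs
frame = np.sin(2 * np.pi * 440 * t)
print(f"spectral entropy: {spectral_entropy(frame):.3f} bits")

# Delta features are typically frame-to-frame differences of this value.
```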


A Study on the Entropy of Binary First Order Markov Information Source (이진 일차 Markov 정보원의 엔트로피에 관한 연구)

  • 송익호; 안수길
    • Journal of the Korean Institute of Telematics and Electronics / v.20 no.2 / pp.16-22 / 1983
  • In this paper, we obtain the PFME (probability for maximum entropy) and the entropy of a binary first-order Markov information source when its conditional probabilities are given. We also examine how changes in a conditional probability affect the entropy when the steady-state probability is held constant.
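
For a binary first-order Markov source with conditional probabilities P(1|0) = a and P(0|1) = b, the entropy rate follows from the stationary distribution; a minimal sketch, with illustrative values for a and b rather than the paper's:

```python
import numpy as np

def markov_entropy_rate(a, b):
    """Entropy rate (bits/symbol) of a binary first-order Markov source
    with P(1|0) = a and P(0|1) = b."""
    def h(p):  # binary entropy function
        return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)
    # Stationary probabilities: pi_1 = a / (a + b), pi_0 = 1 - pi_1
    pi1 = a / (a + b)
    return (1 - pi1) * h(a) + pi1 * h(b)

print(markov_entropy_rate(0.3, 0.1))  # example conditional probabilities
```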


A hybrid evaluation of information entropy meta-heuristic model and unascertained measurement theory for tennis motion tracking

  • Zhong, Yongfeng; Liang, Xiaojun
    • Advances in nano research / v.12 no.3 / pp.263-279 / 2022
  • In this research, physical education training quality is investigated using an entropy model, a strong tool for computing the variance associated with a random quantity. Entropy and unascertained-estimation principles are used to extract the maximum information entropy based on the index system. These steps are applied to improve the assessment of players' achievement in tennis motion tracking from a dynamic viewpoint (Lv et al. 2020). Six female tennis players performed serves on the right side (50 cm from the T point); the initial flat serve from the T point was the movement under consideration, and entropy was used to weight all indicators. The resulting multi-index measurement vector is stabilized, and a confidence level is then used to determine the range for establishing the structural plane. The results show that the unascertained measurement technique based on information entropy assesses athlete performance more accurately than traditional approaches, enabling coaches and athletes to improve their movements effectively.
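
The entropy-based weighting of indices mentioned above is commonly realized via the entropy weight method; the sketch below assumes that standard formulation, with a placeholder serve-evaluation matrix rather than the paper's data.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: columns of X are evaluation indices,
    rows are samples; returns one weight per index."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                      # column-wise normalization
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)    # entropy of each index
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

# Illustrative serve-evaluation matrix (placeholder scores, 4 trials x 3 indices)
X = [[8.2, 7.5, 6.9],
     [7.9, 8.1, 7.2],
     [8.5, 7.8, 6.5],
     [8.0, 7.6, 7.0]]
print(entropy_weights(X))
```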

BIVARIATE DYNAMIC CUMULATIVE RESIDUAL TSALLIS ENTROPY

  • Sati, Madan Mohan; Singh, Harinder
    • Journal of applied mathematics & informatics / v.35 no.1_2 / pp.45-58 / 2017
  • Recently, Sati and Gupta (2015) proposed two measures of uncertainty based on non-extensive entropy, called the dynamic cumulative residual Tsallis entropy (DCRTE) and the empirical cumulative Tsallis entropy. In the present paper, we extend the definition of DCRTE into the bivariate setup and study its properties in the context of reliability theory. We also define a new class of life distributions based on bivariate DCRTE.
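
DCRTE itself is a survival-function-based functional defined in the paper; as background only, here is a sketch of the underlying non-extensive Tsallis entropy for a discrete distribution, which recovers the Shannon entropy as q approaches 1 (the distribution and q values are illustrative).

```python
import numpy as np

def tsallis_entropy(p, q):
    """Non-extensive Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1);
    recovers the Shannon entropy (in nats) as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]        # illustrative distribution
for q in (0.5, 0.999, 2.0):
    print(q, tsallis_entropy(p, q))
```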

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon; Wakuya, Hiroshi; Park, Sun-Gyu; Noh, Hwang-Woo; Yoo, Jae-Soo; Min, Byung-Won; Oh, Yong-Sun
    • International Journal of Contents / v.11 no.2 / pp.57-62 / 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. When the random variable has only two alphabets, the relative entropy becomes the cross-entropy error function, which can accelerate training convergence of multi-layer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function further improves learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and compare it with the relative entropy through three-dimensional plots.
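
For reference, a sketch of the baseline two-alphabet cross-entropy error function that the nCE measure extends (the exact n-th order form is defined in the paper; the targets and outputs below are placeholders):

```python
import numpy as np

def cross_entropy_error(y, t, eps=1e-12):
    """Binary cross-entropy error between network outputs y in (0,1)
    and targets t in {0,1}:  E = -sum[t*log(y) + (1-t)*log(1-y)]."""
    y = np.clip(y, eps, 1.0 - eps)     # guard against log(0)
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

t = np.array([1, 0, 1, 1])             # illustrative targets
y = np.array([0.9, 0.2, 0.7, 0.6])     # illustrative network outputs
print(cross_entropy_error(y, t))
```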

ONLINE TEST BASED ON MUTUAL INFORMATION FOR TRUE RANDOM NUMBER GENERATORS

  • Kim, Young-Sik; Yeom, Yongjin; Choi, Hee Bong
    • Journal of the Korean Mathematical Society / v.50 no.4 / pp.879-897 / 2013
  • Shannon entropy is one of the most widely used randomness measures, especially for cryptographic applications. However, conventional entropy tests are less sensitive to inter-bit dependency in random samples. In this paper, we propose new online randomness test schemes for true random number generators (TRNGs) based on the mutual information between consecutive k-bit output blocks, to test for inter-bit dependency in random samples. By estimating the block entropies of distinct lengths at the same time, it is possible to measure the mutual information, which is closely related to the amount of statistical dependency between two consecutive data blocks. In addition, we propose a new entropy estimation method that accumulates intermediate frequency counts and can estimate entropy with fewer samples than Maurer-Coron-type entropy tests. Numerical simulations show that the proposed scheme can serve as a reliable online entropy estimator for TRNGs used in cryptographic modules.
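
A simple plug-in version of the block-based mutual information statistic described above might look as follows; this is a sketch, not the paper's accumulation-based estimator, and the block length k and sample size are illustrative.

```python
import numpy as np
from collections import Counter

def block_mutual_information(bits, k):
    """Plug-in estimate of the mutual information (bits) between
    consecutive, non-overlapping k-bit blocks of a 0/1 sequence."""
    def entropy(counts):
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return -np.sum(p * np.log2(p))

    # Pair up consecutive k-bit blocks: (B_0,B_1), (B_2,B_3), ...
    blocks = [tuple(bits[i:i+k]) for i in range(0, len(bits) - k + 1, k)]
    pairs = list(zip(blocks[::2], blocks[1::2]))
    h_x = entropy(Counter(b for b, _ in pairs))
    h_y = entropy(Counter(b for _, b in pairs))
    h_xy = entropy(Counter(pairs))
    return h_x + h_y - h_xy            # I(X;Y) = H(X) + H(Y) - H(X,Y)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100000).tolist()
print(block_mutual_information(bits, k=4))  # near 0 for an ideal TRNG
```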

An Improved Entropy Based Sensing by Exploring Phase Information

  • Lee, Haowei; Gu, Junrong; Sohn, Sung-Hwan; Jang, Sung-Jeen; Kim, Jae-Moung
    • The Journal of Korean Institute of Communications and Information Sciences / v.35 no.9A / pp.896-905 / 2010
  • In this paper, we present a new sensing method based on phase entropy. Entropy is a measure that quantifies the information content of a signal. In PSK modulation, the information is encoded in the phase of the transmitted signal, so focusing on phase collects more information during sensing and suggests superior performance. Phase-entropy sensing is not limited to PSK signals; we generalize it to PAM signals as well, where detecting the phase is likewise advantageous. Simulation results confirm the excellent performance of this sensing algorithm.
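
A minimal sketch of a phase-entropy statistic consistent with the idea above: the histogram of instantaneous phase is turned into a probability mass function and its Shannon entropy computed, so a structured (e.g. QPSK) signal yields lower phase entropy than noise alone. The signal model, SNR, and bin count are assumptions, not the paper's settings.

```python
import numpy as np

def phase_entropy(x, n_bins=16):
    """Shannon entropy (bits) of the histogram of instantaneous phase
    of a complex baseband signal; lower entropy suggests structure
    (e.g. a PSK signal) rather than pure noise."""
    phases = np.angle(x)                                   # in [-pi, pi]
    hist, _ = np.histogram(phases, bins=n_bins, range=(-np.pi, np.pi))
    pmf = hist / hist.sum()
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log2(pmf))

rng = np.random.default_rng(1)
n = 4000
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, n)))

print(f"noise only   : {phase_entropy(noise):.3f} bits")          # near log2(16) = 4
print(f"QPSK + noise : {phase_entropy(qpsk + 0.5*noise):.3f} bits")  # noticeably lower
```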