• Title/Summary/Keyword: Conditional Entropy

A Study on the Entropy of Binary First Order Markov Information Source (이진 일차 Markov 정보원의 엔트로피에 관한 연구)

  • 송익호;안수길
    • Journal of the Korean Institute of Telematics and Electronics, v.20 no.2, pp.16-22, 1983
  • In this paper, we obtain the PFME (probability for maximum entropy) and the entropy of a binary first-order Markov information source when a conditional probability is given. We also examine, when the steady-state probability is held constant, the influence of changes in the conditional probability on the entropy.

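The entropy rate this abstract refers to follows directly from the chain's steady-state probabilities. A minimal sketch (not the paper's derivation; the function names are mine), taking the conditional probabilities a = P(1|0) and b = P(0|1):

```python
import math

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_entropy_rate(a, b):
    """Entropy rate (bits/symbol) of a binary first-order Markov source
    with conditional probabilities a = P(1|0) and b = P(0|1)."""
    pi0 = b / (a + b)   # steady-state probability of state 0
    pi1 = a / (a + b)   # steady-state probability of state 1
    return pi0 * binary_entropy(a) + pi1 * binary_entropy(b)

print(markov_entropy_rate(0.5, 0.5))  # → 1.0: reduces to a memoryless fair coin
```

Holding the steady-state distribution fixed (e.g. a = b, so both states are equally likely) and varying the conditional probability traces out how memory lowers the entropy below 1 bit/symbol.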

Registration and Visualization of Medical Image Using Conditional Entropy and 3D Volume Rendering (조건부 엔트로피와 3차원 볼륨 렌더링기법을 이용한 의료영상의 정합과 가시화)

  • Kim, Sun-Worl;Cho, Wan-Hyun
    • Communications for Statistical Applications and Methods, v.16 no.2, pp.277-286, 2009
  • Image registration is a process of establishing the spatial correspondence between images of the same scene acquired at different viewpoints, at different times, or by different sensors. In this paper, we introduce a robust brain registration technique for correcting the difference between the coordinate systems of temporal MR and CT images obtained from the same patient. The two images are registered by minimizing a measure based on a modified conditional entropy (MCE) computed from the joint histogram of the intensities of the two given images, and the registered result is then visualized by 3D volume rendering.
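
The measure described here builds on the conditional entropy of one image's intensities given the other's, computed from a joint histogram. A sketch of the plain (unmodified) baseline that the paper's MCE refines, assuming NumPy and two equally sized images:

```python
import numpy as np

def conditional_entropy(img_a, img_b, bins=32):
    """H(B|A) in bits, estimated from the joint intensity histogram of two
    equally sized images (the plain measure that an MCE would modify)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                 # joint probability P(a, b)
    p_a = p_ab.sum(axis=1, keepdims=True)      # marginal P(a)
    nz = p_ab > 0
    # H(B|A) = -sum_{a,b} P(a,b) log2 P(b|a)
    return -np.sum(p_ab[nz] * np.log2((p_ab / np.where(p_a > 0, p_a, 1.0))[nz]))

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))
print(conditional_entropy(a, a))  # ~0 bits: a registered copy is fully predictable
print(conditional_entropy(a, b))  # ~log2(32) bits: unrelated images
```

A registration loop would transform one image and search for the pose that minimizes this value, exactly the role the measure plays in the abstract.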

Rationale of the Maximum Entropy Probability Density

  • Park, B. S.
    • Journal of the Korean Statistical Society, v.13 no.2, pp.87-106, 1984
  • If $\{X_t\}$ is a sequence of independent identically distributed normal random variables, then the conditional probability density of $X_1, X_2, \cdots, X_n$ given the first p+1 sample autocovariances converges to the maximum entropy probability density satisfying the corresponding covariance constraints as the length of the sample sequence tends to infinity. This establishes that the maximum entropy probability density and the associated Gaussian autoregressive process arise naturally as the answers to conditional limit problems.


Uncertainty Improvement of Incomplete Decision System using Bayesian Conditional Information Entropy (베이지언 정보엔트로피에 의한 불완전 의사결정 시스템의 불확실성 향상)

  • Choi, Gyoo-Seok;Park, In-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.14 no.6, pp.47-54, 2014
  • Under the indiscernibility relation of rough set theory, the unavoidable superposition and inconsistency of data make attribute reduction very important in an information system. Rough set theory, however, has difficulty distinguishing attribute reduction in consistent information systems from that in inconsistent ones. In this paper, we propose a new uncertainty measure and attribute-reduction algorithm that use the Bayesian posterior probability to analyze the correlation between condition and decision attributes. We compare the proposed method with conditional information entropy in addressing the uncertainty of inconsistent information systems. As a result, our method handles uncertainty more accurately than conditional information entropy by exploiting the mutual information between the condition and decision attributes of the information system.
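
For reference, the classical conditional information entropy that the paper benchmarks against can be computed directly from a decision table. A minimal sketch (my own naming; it does not include the Bayesian posterior weighting the paper proposes):

```python
import math
from collections import Counter

def conditional_entropy(cond_rows, decisions):
    """H(D|C) in bits for a decision table: cond_rows holds one tuple of
    condition-attribute values per object, decisions the matching labels."""
    n = len(cond_rows)
    joint = Counter(zip(cond_rows, decisions))
    cond = Counter(cond_rows)
    h = 0.0
    for (c, _d), cnt in joint.items():
        h -= (cnt / n) * math.log2(cnt / cond[c])  # -P(c,d) log2 P(d|c)
    return h

# Consistent table: decision fully determined by conditions -> 0 bits
print(conditional_entropy([('a', 1), ('a', 1), ('b', 0)], ['yes', 'yes', 'no']))
# Inconsistent table: identical conditions, conflicting decisions -> 1 bit
print(conditional_entropy([('a', 1), ('a', 1)], ['yes', 'no']))
```

An attribute-reduction loop would drop condition attributes while H(D|C) stays unchanged; the inconsistency the paper targets shows up precisely when H(D|C) > 0.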

Medical Image Registration by Combining Gradient Vector Flow and Conditional Entropy Measure (기울기 벡터장과 조건부 엔트로피 결합에 의한 의료영상 정합)

  • Lee, Myung-Eun;Kim, Soo-Hyung;Kim, Sun-Worl;Lim, Jun-Sik
    • The KIPS Transactions:PartB, v.17B no.4, pp.303-308, 2010
  • In this paper, we propose a medical image registration technique combining the gradient vector flow and a modified conditional entropy. The registration is conducted using a measure based on the entropy of conditional probabilities. To achieve the registration, we first define a modified conditional entropy (MCE) computed from the joint histogram of the area intensities of two given images. In order to incorporate spatial information into a traditional registration measure, we use the gradient vector flow field. The MCE is then computed from the gradient vector flow intensity (GVFI), which combines the gradient information and the intensity values of the original images. To evaluate the performance of the proposed registration method, we conduct experiments with our method as well as an existing method based on the mutual information (MI) criterion. We evaluate the precision of the MI- and MCE-based measures by comparing the registrations obtained from MR images and transformed CT images. The experimental results show that the proposed method is faster and more accurate than the other optimization methods.

Entropy-based Spectrum Sensing for Cognitive Radio Networks in the Presence of an Unauthorized Signal

  • So, Jaewoo
    • KSII Transactions on Internet and Information Systems (TIIS), v.9 no.1, pp.20-33, 2015
  • Spectrum sensing is a key component of cognitive radio. The prediction of the primary user status at a low signal-to-noise ratio is an important factor in spectrum sensing. However, because of noise uncertainty, secondary users have difficulty distinguishing between the primary signal and an unauthorized signal when an unauthorized user exists in a cognitive radio network. To resolve the sensitivity to the noise uncertainty problem, we propose an entropy-based spectrum sensing scheme to detect the primary signal accurately in the presence of an unauthorized signal. The proposed scheme uses the conditional entropy between the primary signal and the unauthorized signal. The ability to detect the primary signal is thus robust against noise uncertainty, which leads to superior sensing performance at a low signal-to-noise ratio. Simulation results show that the proposed spectrum sensing scheme outperforms conventional entropy-based spectrum sensing schemes in terms of the primary user detection probability.
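
The general idea behind entropy-based sensing can be sketched with the classical frequency-domain entropy detector: noise alone spreads the FFT-magnitude histogram and gives high entropy, while a primary signal concentrates it. A toy illustration, not the paper's conditional-entropy scheme (the tone model and all names here are assumptions):

```python
import numpy as np

def spectral_entropy(samples, bins=16):
    """Entropy (bits) of the histogram of FFT magnitudes: high for white
    noise, low when one strong spectral component dominates."""
    mag = np.abs(np.fft.fft(samples))
    hist, _ = np.histogram(mag, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n = 4096
rng = np.random.default_rng(1)
noise = rng.normal(size=n) + 1j * rng.normal(size=n)
tone = 3.0 * np.exp(2j * np.pi * 410 * np.arange(n) / n)  # primary on one FFT bin

h0 = spectral_entropy(noise)         # noise only: broad histogram, high entropy
h1 = spectral_entropy(noise + tone)  # primary present: entropy collapses
print(h0 > h1)  # → True: entropy drops when the primary signal is on
```

A detector declares the primary user present when the entropy falls below a threshold; because the histogram shape, not the absolute noise power, drives the statistic, the test is insensitive to noise uncertainty, which is the property the paper builds on.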

Identification of the associations between genes and quantitative traits using entropy-based kernel density estimation

  • Yee, Jaeyong;Park, Taesung;Park, Mira
    • Genomics &amp; Informatics, v.20 no.2, pp.17.1-17.11, 2022
  • Genetic associations have been quantified using a number of statistical measures. Entropy-based mutual information may be one of the more direct ways of estimating the association, in the sense that it does not depend on the parametrization. For this purpose, both the entropy and conditional entropy of the phenotype distribution should be obtained. Quantitative traits, however, do not usually allow an exact evaluation of entropy. The estimation of entropy needs a probability density function, which can be approximated by kernel density estimation. We have investigated the proper sequence of procedures for combining the kernel density estimation and entropy estimation with a probability density function in order to calculate mutual information. Genotypes and their interactions were constructed to set the conditions for conditional entropy. Extensive simulation data created using three types of generating functions were analyzed using two different kernels as well as two types of multifactor dimensionality reduction and another probability density approximation method called m-spacing. The statistical power in terms of correct detection rates was compared. Using kernels was found to be most useful when the trait distributions were more complex than simple normal or gamma distributions. A full-scale genomic dataset was explored to identify associations using the 2-h oral glucose tolerance test results and γ-glutamyl transpeptidase levels as phenotypes. Clearly distinguishable single-nucleotide polymorphisms (SNPs) and interacting SNP pairs associated with these phenotypes were found and listed with empirical p-values.
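
The pipeline sketched in this abstract, kernel density estimation feeding an entropy estimate, then mutual information as H(trait) minus the genotype-conditional entropies, can be illustrated as follows (a one-dimensional resubstitution sketch with Silverman's bandwidth; not the authors' implementation):

```python
import numpy as np

def kde_entropy(x):
    """Differential entropy (nats) via resubstitution on a Gaussian kernel
    density estimate: H ≈ -mean(log f_hat(x_i)), Silverman bandwidth."""
    x = np.asarray(x, dtype=float)
    n = x.size
    bw = 1.06 * x.std() * n ** (-0.2)
    diffs = (x[:, None] - x[None, :]) / bw
    dens = np.exp(-0.5 * diffs ** 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))
    return -np.mean(np.log(dens))

def mutual_information(trait, genotypes):
    """I(trait; G) = H(trait) - sum_g P(G = g) H(trait | G = g)."""
    mi = kde_entropy(trait)
    for g in np.unique(genotypes):
        sub = trait[genotypes == g]
        mi -= (sub.size / trait.size) * kde_entropy(sub)
    return mi

rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=600)        # genotype coded 0/1/2
trait = rng.normal(loc=geno, scale=0.5)    # quantitative trait shifted by genotype
print(mutual_information(trait, geno))     # clearly positive (nats)
```

In an association scan, this mutual information would be computed per SNP (or per interacting SNP pair) and ranked, with significance assessed by permutation as in the paper's empirical p-values.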

A conditional entropy coding scheme for tree structured vector quantization (나무구조 벡터양자화를 위한 조건부 엔트로피 부호화기법)

  • 송준석;이승준;이충웅
    • The Journal of Korean Institute of Communications and Information Sciences, v.22 no.2, pp.344-352, 1997
  • This paper proposes an efficient lossless coding scheme for a tree structured vector quantization (TSVQ) system that efficiently exploits inter-block correlation. The TSVQ index of the current block is adaptively arithmetic-encoded depending on the indices of the previous blocks. This paper also presents a reduction method that effectively resolves the memory problem which usually arises in many conditional entropy coding schemes. Simulation results show that the proposed scheme provides remarkable bitrate reduction by effectively exploiting not only linear but also non-linear inter-block correlation.


Development of an Item Selection Method for Test-Construction by using a Relationship Structure among Abilities

  • Kim, Sung-Ho;Jeong, Mi-Sook;Kim, Jung-Ran
    • Communications for Statistical Applications and Methods, v.8 no.1, pp.193-207, 2001
  • When designing a test set, we need to consider constraints on items that are deemed important by item developers or test specialists. The constraints bear essentially on the components of the test domain, i.e., the abilities relevant to a given test set, so if the test domain can be represented in a more refined form, test construction can be carried out more efficiently. We assume that the relationships among task abilities are representable by a causal model and that item response theory (IRT) is not fully available for them. In such a case we cannot apply traditional item selection methods that are based on IRT. In this paper, we use entropy as an uncertainty measure for making inferences about task abilities, and we develop an optimal item selection algorithm that, as items are selected from an item pool, most reduces the entropy of the task abilities.

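The greedy criterion described here, selecting the item that most reduces the entropy of the ability variables, can be sketched for a single discrete ability (a toy Bayesian model with assumed item response probabilities, not the paper's causal-model algorithm):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def expected_posterior_entropy(prior, p_correct):
    """Expected entropy of the ability posterior after observing one item.
    prior[k] = P(theta = k); p_correct[k] = P(correct | theta = k)."""
    h = 0.0
    for correct in (True, False):
        like = [p if correct else 1 - p for p in p_correct]
        p_out = sum(pr * l for pr, l in zip(prior, like))
        if p_out == 0:
            continue
        post = [pr * l / p_out for pr, l in zip(prior, like)]
        h += p_out * entropy(post)
    return h

def select_item(prior, items):
    """Greedy step: choose the item whose response most reduces entropy."""
    return min(items, key=lambda name: expected_posterior_entropy(prior, items[name]))

prior = [0.5, 0.5]                  # two ability levels, uniform prior
items = {'easy':  [0.95, 0.99],     # nearly everyone passes: uninformative
         'probe': [0.30, 0.90]}     # discriminates the two ability levels
print(select_item(prior, items))    # → probe
```

Repeating this step, updating the prior with each observed response, yields a test whose items are chosen to resolve the ability variables as quickly as possible.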