• Title/Summary/Keyword: divergence

Search Results: 1,156

A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE

  • JAIN, K.C.; CHHABRA, PRAPHULL
    • Journal of applied mathematics & informatics / v.34 no.3_4 / pp.295-308 / 2016
  • Depending upon the nature of the problem, different divergence measures are suitable, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, which is exponential in nature, is introduced and characterized. Bounds of this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: Binomial and Poisson. A fuzzy information measure and a useful information measure corresponding to the new exponential divergence measure are also introduced.
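
The abstract does not give the new measure's closed form, so as a hedged illustration of its verification setup only, the sketch below compares the two discrete distributions it names (Binomial and Poisson with matched mean) using the standard Kullback-Leibler divergence in place of the paper's exponential measure:

```python
# Hedged sketch: the paper's exponential divergence formula is not stated in the
# abstract; the standard KL divergence is used here purely to illustrate the
# Binomial-vs-Poisson numerical verification the abstract describes.
from math import comb, exp, factorial, log

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

def kl_divergence(ps, qs):
    # D(P || Q) = sum_k p_k * log(p_k / q_k), skipping zero-probability terms
    return sum(p * log(p / q) for p, q in zip(ps, qs) if p > 0)

n, p = 20, 0.1
support = range(n + 1)
P = [binomial_pmf(k, n, p) for k in support]
Q = [poisson_pmf(k, n * p) for k in support]   # Poisson with matched mean n*p
print(kl_divergence(P, Q))  # small positive value: the distributions are close
```

For small p and matched means the two distributions nearly coincide, which is why such pairs are convenient for checking divergence bounds numerically.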

A Study on The Formation of Interior Space in Louis I. Kahn's Architecture (루이스 칸 건축의 내부공간 형성에 관한 연구)

  • Yoon, Dong-Sik
    • Korean Institute of Interior Design Journal / v.17 no.5 / pp.23-30 / 2008
  • This thesis analyzes the visual-perceptual effects produced by 'axial composition and divergence' and interprets Kahn's architecture in those terms. Axial composition of the form, the location of the entrance, and the divergence of internal movement were examined across 53 works by extracting the parti, the basic element of spatial composition. A 3D modeling simulation was performed for 10 selected works in order to analyze the visual-perceptual effect of the divergence of internal movement. The observer's actions and visual perception respond to 'axial composition and divergence' in the following steps: 1. divergence at the entrance/a panorama of expanding planes; 2. divergence of internal movement/the perception of visual rotation and central spatial form. The 'perceptive form' created by 'divergence' is the result of a diverse and flexible series of processes that must be experienced in person in order to reach the space as a room with a definite domain and center.

Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning

  • Sugiyama, Masashi; Liu, Song; du Plessis, Marthinus Christoffel; Yamanaka, Masao; Yamada, Makoto; Suzuki, Taiji; Kanamori, Takafumi
    • Journal of Computing Science and Engineering / v.7 no.2 / pp.99-111 / 2013
  • Approximating a divergence between two probability distributions from their samples is a fundamental challenge in statistics, information theory, and machine learning. A divergence approximator can be used for various purposes, such as two-sample homogeneity testing, change-point detection, and class-balance estimation. Furthermore, an approximator of a divergence between the joint distribution and the product of marginals can be used for independence testing, which has a wide range of applications, including feature selection and extraction, clustering, object matching, independent component analysis, and causal direction estimation. In this paper, we review recent advances in divergence approximation. Our emphasis is that directly approximating the divergence without estimating probability distributions is more sensible than a naive two-step approach of first estimating probability distributions and then approximating the divergence. Furthermore, despite the overwhelming popularity of the Kullback-Leibler divergence as a divergence measure, we argue that alternatives such as the Pearson divergence, the relative Pearson divergence, and the $L^2$-distance are more useful in practice because of their computationally efficient approximability, high numerical stability, and superior robustness against outliers.
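
The alternatives named above have simple closed forms on discrete distributions. The sketch below compares KL with the Pearson and relative Pearson divergences (using the convention that includes a factor of 1/2, as in Sugiyama et al.'s papers); the sample-based direct estimators the paper actually reviews, such as least-squares density-ratio fitting, are not reproduced here:

```python
# Hedged sketch: closed-form discrete versions of the divergences named in the
# abstract, not the paper's sample-based direct approximators.
from math import log

def kl(p, q):
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def pearson(p, q):
    # PE(P || Q) = (1/2) * sum_i q_i * (p_i/q_i - 1)^2
    return 0.5 * sum(qi * (pi / qi - 1) ** 2 for pi, qi in zip(p, q))

def relative_pearson(p, q, alpha=0.1):
    # Pearson divergence of P from the mixture alpha*P + (1-alpha)*Q;
    # mixing keeps the density ratio bounded by 1/alpha, aiding robustness.
    m = [alpha * pi + (1 - alpha) * qi for pi, qi in zip(p, q)]
    return pearson(p, m)

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.3, 0.5]
print(kl(P, Q), pearson(P, Q), relative_pearson(P, Q))
```

The bounded density ratio in the relative Pearson divergence is one concrete reason for the numerical-stability advantage the abstract claims.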

A Kullback-Leibler Divergence-based Spectrum Sensing for Cognitive Radio Systems (무선인지시스템을 위한 Kullback-Leibler Divergence 기반의 스펙트럼 센싱 기법)

  • Thuc, Kieu-Xuan; Koo, In-Soo
    • Journal of Internet Computing and Services / v.13 no.1 / pp.1-6 / 2012
  • In this paper, an information divergence, the Kullback-Leibler divergence, which measures the average logarithmic difference between two probability density functions, is utilized to derive a novel method for spectrum sensing in cognitive radio systems. The proposed method tests whether the observed samples are drawn from the noise distribution by using the Kullback-Leibler divergence. Numerical results show that, under the same conditions, the proposed Kullback-Leibler divergence-based spectrum sensing consistently and significantly outperforms energy-detection-based spectrum sensing, especially in the low-SNR regime and under fading.
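
The general idea can be sketched as follows (a hedged illustration of the principle, not the paper's exact estimator or threshold design): bin the observed samples into an empirical distribution, compute its KL divergence from the assumed Gaussian noise distribution, and declare a signal present when the divergence is large:

```python
# Hedged sketch: KL-based detection principle only; the paper's specific test
# statistic, threshold, and noise model may differ.
import math
import random

def kl_to_noise(samples, noise_std=1.0, bins=20, lo=-5.0, hi=5.0):
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        k = min(bins - 1, max(0, int((x - lo) / width)))
        counts[k] += 1
    n = len(samples)
    div = 0.0
    for k, c in enumerate(counts):
        if c == 0:
            continue
        p = c / n                      # empirical bin probability
        center = lo + (k + 0.5) * width
        # Gaussian noise probability mass of the bin (midpoint approximation)
        q = width * math.exp(-center**2 / (2 * noise_std**2)) \
            / (noise_std * math.sqrt(2 * math.pi))
        div += p * math.log(p / q)
    return div

random.seed(0)
noise_only = [random.gauss(0, 1) for _ in range(5000)]
# signal + noise: higher-variance observations under the "occupied" hypothesis
with_signal = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(5000)]
print(kl_to_noise(noise_only), kl_to_noise(with_signal))
```

Under the noise-only hypothesis the divergence stays near zero, while the extra signal energy inflates it, which is what makes a threshold test possible.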

Void Formation Induced by the Divergence of the Diffusive Ionic Fluxes in Metal Oxides Under Chemical Potential Gradients

  • Maruyama, Toshio; Ueda, Mitsutoshi
    • Journal of the Korean Ceramic Society / v.47 no.1 / pp.8-18 / 2010
  • When metal oxides are exposed to chemical potential gradients, ions undergo diffusive mass transport. During this transport process, the divergence of the ionic fluxes governs the formation/annihilation of oxide. Therefore, the divergence of the ionic flux may play an important role in void formation in oxides. Kinetic equations were derived to describe the chemical potential distribution, the ionic fluxes, and their divergence in oxides. The divergence was found to be a measure of void formation. Defect chemistry in scales is directly related to the sign of the divergence and gives an indication of void formation behavior. The quantitative estimation of void formation was successfully applied to a growing magnetite scale during high-temperature oxidation of iron at 823 K.
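
The key quantity is the flux divergence: in one dimension, local oxide is removed where dJ/dx > 0 (voids can nucleate) and accumulates where dJ/dx < 0. The sketch below evaluates this numerically for a purely hypothetical flux profile; it is not the paper's kinetic model:

```python
# Hedged 1-D illustration: the flux profile below is invented for demonstration;
# the paper derives J(x) from chemical potential distributions and defect chemistry.
def flux(x):
    # hypothetical cation flux across the scale thickness, x in [0, 1]
    return 1.0 + 0.5 * x * x

def divergence(f, x, h=1e-5):
    # central finite difference for dJ/dx in one dimension
    return (f(x + h) - f(x - h)) / (2 * h)

for i in range(11):
    x = i / 10
    d = divergence(flux, x)
    tag = "void-forming (material removed)" if d > 1e-6 else "accumulating/neutral"
    print(f"x = {x:.1f}  dJ/dx = {d:+.4f}  {tag}")
```

In higher dimensions the same role is played by the full divergence operator applied to the vector flux field.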

UNDERSTANDING NON-NEGATIVE MATRIX FACTORIZATION IN THE FRAMEWORK OF BREGMAN DIVERGENCE

  • KIM, KYUNGSUP
    • Journal of the Korean Society for Industrial and Applied Mathematics / v.25 no.3 / pp.107-116 / 2021
  • We introduce optimization algorithms using the Bregman divergence for solving non-negative matrix factorization (NMF) problems. The Bregman divergence is known to generalize several divergences, such as the Frobenius norm and the KL divergence. Some algorithms are applicable not only to NMF with the Frobenius norm but also to NMF with a more general Bregman divergence. Matrix factorization is a popular non-convex optimization problem, for which alternating minimization schemes are mostly used. We develop a Bregman proximal gradient method applicable to NMF formulated with any Bregman divergence. In deriving the NMF algorithm for a Bregman divergence, we use majorization/minimization (MM) with a proper auxiliary function. We present algorithmic aspects of NMF for the Bregman divergence by using MM with an auxiliary function.
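
For one concrete Bregman divergence instance, the KL case, the classic multiplicative updates (Lee-Seung style, themselves derived by the MM idea the abstract mentions) give a minimal working NMF; note this is a standard baseline, not the paper's Bregman proximal gradient method:

```python
# Hedged sketch: classic multiplicative-update NMF under the KL divergence,
# one instance of the Bregman family; NOT the paper's proximal gradient scheme.
import random

def kl_nmf(V, r, iters=200, eps=1e-9, seed=0):
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + eps for _ in range(r)] for _ in range(m)]
    H = [[rng.random() + eps for _ in range(n)] for _ in range(r)]
    for _ in range(iters):
        WH = [[sum(W[i][k] * H[k][j] for k in range(r)) for j in range(n)]
              for i in range(m)]
        # H update: H_kj <- H_kj * (sum_i W_ik V_ij / WH_ij) / (sum_i W_ik)
        for k in range(r):
            for j in range(n):
                num = sum(W[i][k] * V[i][j] / (WH[i][j] + eps) for i in range(m))
                den = sum(W[i][k] for i in range(m)) + eps
                H[k][j] *= num / den
        WH = [[sum(W[i][k] * H[k][j] for k in range(r)) for j in range(n)]
              for i in range(m)]
        # W update: W_ik <- W_ik * (sum_j H_kj V_ij / WH_ij) / (sum_j H_kj)
        for i in range(m):
            for k in range(r):
                num = sum(H[k][j] * V[i][j] / (WH[i][j] + eps) for j in range(n))
                den = sum(H[k][j] for j in range(n)) + eps
                W[i][k] *= num / den
    return W, H

V = [[1, 2, 3], [2, 4, 6], [3, 6, 9]]   # rank-1 matrix, so r = 1 fits exactly
W, H = kl_nmf(V, r=1)
```

Swapping the KL divergence for another Bregman divergence changes these update rules, which is exactly the generality the reviewed framework provides.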

INEQUALITIES FOR QUANTUM f-DIVERGENCE OF CONVEX FUNCTIONS AND MATRICES

  • Dragomir, Silvestru Sever
    • Korean Journal of Mathematics / v.26 no.3 / pp.349-371 / 2018
  • Some inequalities for the quantum f-divergence of matrices are obtained. It is shown that for normalised convex functions the quantum f-divergence is nonnegative. Some upper bounds for the quantum f-divergence in terms of the variational and $\chi^2$-distances are provided. Applications to some classes of divergence measures, such as the Umegaki and Tsallis relative entropies, are also given.
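
For orientation, in the commutative (classical) special case the nonnegativity result for a convex $f$ normalised by $f(1)=0$ is exactly Jensen's inequality (a standard identity, not taken from the paper):

```latex
D_f(P \,\|\, Q) \;=\; \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right)
\;\ge\; f\!\left(\sum_i q_i \cdot \frac{p_i}{q_i}\right) \;=\; f(1) \;=\; 0 .
```

The quantum case replaces probability vectors by density matrices, where noncommutativity makes the corresponding inequalities substantially harder.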

SHADOWING, EXPANSIVENESS AND STABILITY OF DIVERGENCE-FREE VECTOR FIELDS

  • Ferreira, Celia
    • Bulletin of the Korean Mathematical Society / v.51 no.1 / pp.67-76 / 2014
  • Let X be a divergence-free vector field defined on a closed, connected Riemannian manifold. In this paper, we show the equivalence of the following conditions:
    • X is a divergence-free vector field satisfying the shadowing property.
    • X is a divergence-free vector field satisfying the Lipschitz shadowing property.
    • X is an expansive divergence-free vector field.
    • X has no singularities and is Anosov.
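
For reference, the two standing hypotheses can be stated as follows (standard definitions sketched from the general literature, not the paper's exact formulation; for flows, shadowing is usually defined up to time reparametrisation):

```latex
% divergence-free: the flow \phi_t generated by X preserves the volume measure \mu
\operatorname{div} X = 0
  \quad\Longleftrightarrow\quad
  (\phi_t)_{*}\,\mu = \mu \ \text{ for all } t \in \mathbb{R};
% shadowing: fine pseudo-orbits are traced by true orbits
\forall \varepsilon > 0 \ \exists \delta > 0 :\ \text{every } \delta\text{-pseudo-orbit of } \phi_t
  \ \text{is } \varepsilon\text{-traced by a true orbit, up to reparametrisation of time.}
```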

SOME NEW MEASURES OF FUZZY DIRECTED DIVERGENCE AND THEIR GENERALIZATION

  • PARKASH OM; SHARMA P. K.
    • The Pure and Applied Mathematics / v.12 no.4 s.30 / pp.307-315 / 2005
  • Many measures of fuzzy directed divergence exist corresponding to existing probabilistic measures. Some new measures of fuzzy divergence are proposed which correspond to well-known existing probabilistic measures. The essential properties of the proposed measures are developed, and the proposed family contains many existing measures of fuzzy directed divergence.
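
One well-known example of the probabilistic-to-fuzzy correspondence described above is the Bhandari-Pal fuzzy divergence, the fuzzy analogue of the KL divergence over membership values (shown as a hedged illustration of the pattern; the paper's new measures are not given in the abstract):

```python
# Hedged illustration: the Bhandari-Pal fuzzy divergence, a classical fuzzy
# analogue of KL divergence; NOT one of the paper's newly proposed measures.
from math import log

def fuzzy_divergence(mu_a, mu_b, eps=1e-12):
    # D(A, B) = sum [ mu_A ln(mu_A/mu_B) + (1-mu_A) ln((1-mu_A)/(1-mu_B)) ]
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        a = min(max(a, eps), 1 - eps)   # clamp memberships away from 0 and 1
        b = min(max(b, eps), 1 - eps)
        total += a * log(a / b) + (1 - a) * log((1 - a) / (1 - b))
    return total

A = [0.2, 0.7, 0.9]   # membership values of fuzzy set A
B = [0.3, 0.6, 0.8]   # membership values of fuzzy set B
print(fuzzy_divergence(A, B), fuzzy_divergence(A, A))  # the latter is 0
```

Each probabilistic divergence yields such a fuzzy counterpart by treating each element's membership pair (mu, 1-mu) as a two-point distribution.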


EINSTEIN-TYPE MANIFOLDS WITH COMPLETE DIVERGENCE OF WEYL AND RIEMANN TENSOR

  • Hwang, Seungsu; Yun, Gabjin
    • Bulletin of the Korean Mathematical Society / v.59 no.5 / pp.1167-1176 / 2022
  • In this paper, we study Einstein-type manifolds generalizing static spaces and V-static spaces. We prove that if an Einstein-type manifold has non-positive complete divergence of its Weyl tensor and non-negative complete divergence of its Bach tensor, then M has harmonic Weyl curvature. Similar results on Einstein-type manifolds with complete divergence of the Riemann tensor are also proved.