GENERALIZED 'USEFUL' INFORMATION GENERATING FUNCTIONS

  • Hooda, D.S. (K.U. Kurukshetra)
  • Sharma, D.K. (Department of Mathematics, Jaypee Institute of Engineering and Technology)
  • Published: 2009.05.31

Abstract

In the present paper, a new generalized 'useful' information generating function and two new relative 'useful' information generating functions are defined, together with their particular and limiting cases. It is interesting to note that differentiating these information generating functions at t=0 or t=1 yields several known and new generalized measures of 'useful' information and 'useful' relative information. The information generating functions thus facilitate the computation of various measures, as illustrated by evaluating them for the uniform, geometric and exponential probability distributions.
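As a hedged numerical illustration of the idea behind such generating functions, the sketch below uses Golomb's basic information generating function T(t) = Σᵢ pᵢᵗ (reference 4), whose derivative at t = 1 equals the negative Shannon entropy. The paper's 'useful' generating functions generalize this by attaching utilities to the probabilities; their exact weighted form is not reproduced here, and the distributions and helper names below are illustrative assumptions, not the paper's notation.

```python
import math

def generating_function(p, t):
    """Golomb's information generating function T(t) = sum_i p_i**t."""
    return sum(pi ** t for pi in p)

def shannon_entropy(p):
    """Shannon entropy in nats: H(P) = -sum_i p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two of the distributions the abstract evaluates; the geometric one is
# truncated so its masses sum to 1 (an illustrative choice, not the paper's).
uniform = [0.25, 0.25, 0.25, 0.25]
geometric = [0.5, 0.25, 0.125, 0.125]

for name, p in [("uniform", uniform), ("geometric", geometric)]:
    h = 1e-6
    # Central-difference estimate of T'(1); analytically T'(1) = -H(P).
    dT = (generating_function(p, 1 + h) - generating_function(p, 1 - h)) / (2 * h)
    print(f"{name:9s}  T(1) = {generating_function(p, 1):.6f}   "
          f"-T'(1) = {-dT:.6f}   H(P) = {shannon_entropy(p):.6f}")
```

Running the sketch, -T'(1) agrees with H(P) to numerical precision for both distributions; this is the t = 1 property that the abstract's utility-weighted generating functions extend.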

References

  1. J. Aczél and Z. Daróczy, Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind, Publ. Math. Debrecen, 10(1963), 171-190.
  2. M. Belis and S. Guiasu, A quantitative-qualitative measure of information in cybernetic systems, IEEE Trans. Inform. Theory, IT-14(1968), 593-594.
  3. U.S. Bhaker and D.S. Hooda, Mean value characterization of useful information measures, Tamkang Journal of Mathematics, 24(1993), 383-39.
  4. S.W. Golomb, The information generating function of a probability distribution, IEEE Trans. Inform. Theory, IT-12(1966), 75-77.
  5. S. Guiasu and C. Reischer, The relative information generating function, Information Sciences, 35(1985), 235-41.
  6. G.H. Hardy, J.E. Littlewood and G. Polya, Inequalities, Cambridge University Press, London, (1934).
  7. D.S. Hooda and U.S. Bhaker, On a weighted entropy generating function, Research Bulletin of Panjab University, 45(1995), 181-189.
  8. D.S. Hooda and U.S. Bhaker, On relative information generating function with utilities, Ganita, 56(2005), 45-54.
  9. D.S. Hooda and U. Singh, On useful information generating function, Statistica, 46(1986), 528-535.
  10. S. Kullback and R.A. Leibler, On Information and Sufficiency, Ann. Math. Statistics, 22(1951), 79-86. https://doi.org/10.1214/aoms/1177729694
  11. G. Longo, A noiseless coding theorem for sources having utilities, SIAM J. Appl. Math., 30(1976), 739-748. https://doi.org/10.1137/0130067
  12. K.C. Mathur and B.R.K. Kashyap, A generalized information generating function, Journal of Mathematical Sciences, 10(1975), 13-20.
  13. A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symp. on Math., Stat. and Probability, University of California Press (1961), 547-61.
  14. C.E. Shannon, A Mathematical Theory of Communication, Bell System Tech. J., (1948), 379-423.
  15. B.D. Sharma, On account of type $\beta$ and other measures, Metrika, 19(1975), 1-10.
  16. H. Theil, Economics and Information Theory, North-Holland, Amsterdam, (1967).