Abstract
A more general version of the information inequality based on diffusivity, a natural measure of dispersion for median-unbiased estimators developed by Sung et al. (1990), is presented. This non-Bayesian L$_{1}$ information inequality is free from regularity conditions and can be regarded as an analogue, for median-unbiased estimation, of the Chapman-Robbins inequality for mean-unbiased estimation. The approach given here, however, covers a more general situation than that of the Chapman-Robbins inequality. We also develop a Bayesian version of the L$_{1}$ information inequality for median-unbiased estimation. This latter inequality is directly comparable to the Bayesian Cramér-Rao bound given by the van Trees inequality.
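For context, the classical Chapman-Robbins inequality for mean-unbiased estimation, to which the non-Bayesian L$_{1}$ bound above is compared, may be recalled in its standard form; the density $f(x;\theta)$ and estimand $g(\theta)$ here are generic placeholders and are not the notation of this paper. If $T$ is a mean-unbiased estimator of $g(\theta)$, then
\[
  \operatorname{Var}_{\theta}(T)
  \;\ge\;
  \sup_{h \neq 0}
  \frac{\bigl(g(\theta + h) - g(\theta)\bigr)^{2}}
       {\mathbb{E}_{\theta}\!\left[\left(\frac{f(X;\theta + h)}{f(X;\theta)} - 1\right)^{2}\right]},
\]
where the supremum runs over all $h \neq 0$ such that $\theta + h$ remains in the parameter space and $f(\cdot;\theta+h)$ is absolutely continuous with respect to $f(\cdot;\theta)$. Unlike the Cramér-Rao bound, this bound requires no differentiability of the likelihood in $\theta$, which is the sense in which the L$_{1}$ inequality of the paper is likewise "free from regularity conditions."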