Abstract
The paper derives formulas for the mean-squared error distortion and the resulting signal-to-noise ratio (SNR) of a fixed-rate scalar quantizer that is designed optimally, in the minimum mean-squared error sense, for a Gaussian density with standard deviation $\sigma_q$ but is mismatched to a Laplacian density with standard deviation $\sigma_p$. The SNR formulas, based on the key parameter $\rho \equiv \frac{\sigma_p}{\sigma_q}$ and Bennett's integral, are found to be accurate for a wide range of $\rho \geq 0.25$. An upper bound to the SNR is also derived; it becomes tighter with increasing rate $R$ and indicates that the SNR behaves asymptotically as $\frac{20\sqrt{3\ln 2}}{\rho\,\ln 10}\,\sqrt{R}$ dB.
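As a quick illustration of the asymptotic behaviour quoted above, the following sketch evaluates the closed-form asymptote $\frac{20\sqrt{3\ln 2}}{\rho\,\ln 10}\,\sqrt{R}$ dB for a few rates; the choice $\rho = 1$ and the function name are assumptions made here for concreteness, not part of the paper.

```python
import math

def snr_asymptote_db(rate_bits: float, rho: float = 1.0) -> float:
    """Asymptotic SNR (dB) from the abstract's formula:
    SNR ~ (20 * sqrt(3 ln 2) / (rho * ln 10)) * sqrt(R).
    Here rho = sigma_p / sigma_q; rho = 1 is an illustrative assumption."""
    return 20.0 * math.sqrt(3.0 * math.log(2.0)) / (rho * math.log(10.0)) * math.sqrt(rate_bits)

# Illustrative values only; the actual SNR approaches this asymptote as R grows.
for R in (2, 4, 8, 16):
    print(f"R = {R:2d} bits -> asymptotic SNR ~ {snr_asymptote_db(R):.2f} dB")
```

The $\sqrt{R}$ dependence is the point of contrast: a matched quantizer gains roughly 6 dB per additional bit, whereas under this mismatch the SNR grows only with the square root of the rate.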