We investigate the behaviour of the entropy of convolutions of independent random variables on compact groups. We provide an explicit exponential bound on the rate of convergence of the entropy to its maximum. Equivalently, this establishes convergence of the density to the uniform density in the sense of Kullback-Leibler divergence. We prove that this mode of convergence lies strictly between uniform convergence of densities (as investigated by Shlosman and Major) and weak convergence (the sense of the classical Itô-Kawada theorem). In fact, it lies between convergence in L^{1+ε} and convergence in L^1.
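As a toy illustration of the phenomenon described above (not the paper's method), one can watch the entropy of repeated convolutions climb to its maximum on the finite cyclic group Z/n, the simplest compact group. The distribution `p` below is an arbitrary choice with full support; the KL divergence to the uniform distribution, which equals log n minus the entropy, shrinks with each convolution.

```python
import numpy as np

n = 8  # working on the cyclic group Z/8, a toy compact group
# an arbitrary full-support distribution on Z/8 (illustrative choice)
p = np.array([0.5, 0.3, 0.1, 0.05, 0.03, 0.01, 0.005, 0.005])

def convolve(p, q, n):
    """Group convolution on Z/n: (p*q)[k] = sum_j p[j] q[(k-j) mod n]."""
    return np.array([sum(p[j] * q[(k - j) % n] for j in range(n))
                     for k in range(n)])

def kl_to_uniform(p, n):
    """D(p || uniform) = log n - H(p); zero iff p is uniform."""
    mask = p > 0
    return np.log(n) + np.sum(p[mask] * np.log(p[mask]))

# Track the divergence of p, p*p, p*p*p, ... to uniformity.
q = p.copy()
for k in range(1, 7):
    print(f"{k} convolution factor(s): D = {kl_to_uniform(q, n):.6f}")
    q = convolve(q, p, n)
```

Since `p` has full support, every nontrivial Fourier coefficient of `p` has modulus strictly below 1, and the printed divergences decay geometrically, consistent with an exponential rate of convergence of entropy to its maximum.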