LAPLACE RANDOM VECTORS, GAUSSIAN NOISE, AND THE GENERALIZED INCOMPLETE GAMMA FUNCTION

Ivan W. Selesnick
Polytechnic University, Brooklyn, NY

This work was supported by ONR under grant N00014-03-1-0217. Author email: [email protected]

ABSTRACT

Wavelet-domain statistical modeling of images has focused on modeling the peaked, heavy-tailed behavior of the marginal distribution and on modeling the dependencies between coefficients that are adjacent (in location and/or scale). In this paper we describe the extension of the Laplace marginal model to the multivariate case so that groups of wavelet coefficients can be modeled together using Laplace marginal models. We derive the nonlinear MAP and MMSE shrinkage functions for a Laplace vector in Gaussian noise and provide computationally efficient approximations to them. The development depends on the generalized incomplete Gamma function.

Index Terms— Image restoration, Estimation, MAP estimation, Exponential distributions, Wavelet transforms

1. INTRODUCTION

Statistical modeling of images in the wavelet domain calls for peaked, heavy-tailed symmetric densities, and the problem of fitting suitable models (especially from noisy data) has several proposed solutions in the literature. The simplest model is the Laplace model, from which the soft-threshold rule [8] can be derived [10, 11, 17, 22]. A zero-mean Laplace random variable S with variance σ² has the density

    p_S(s) = \frac{1}{\sqrt{2}\,\sigma} \exp\!\left( -\frac{\sqrt{2}}{\sigma} |s| \right).

Although the distribution of wavelet coefficients can be modeled more accurately using other distributions (generally with two or more parameters, for example [1, 4, 7, 9, 12, 15]), in this paper we focus on the Laplace model and pursue its multivariate version. If adjacent wavelet coefficients (in location and/or scale) are modeled together, then substantial improvement can be gained in applications. Even though adjacent wavelet coefficients are approximately uncorrelated (assuming the transform is orthonormal), they are strongly dependent [21]. Several methods have been developed to model groups of wavelet coefficients, for example [5, 7, 16, 19, 20]. The most effective wavelet-domain denoising algorithms are based on multivariate probability models that depend on a quadratic form (their level sets are ellipsoidal). In this paper we develop the MAP and MMSE nonlinear shrinkage rules that follow from the ellipsoidal multivariate Laplace density. Although we developed in [20] a multivariate density that is similar to the Laplace density, the marginals of the density in [20] are not Laplace. The multivariate density whose marginals are Laplace is described by Kotz et al. [13]. It is useful to employ that true multivariate Laplace density because then Laplace models (or Laplace mixture models [18]) for the marginals can be directly extrapolated to multivariate models for groups of coefficients.
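To make the scalar model concrete, the following minimal sketch (assuming NumPy) evaluates the Laplace density above and applies the soft-threshold rule it leads to. The threshold value √2·σ_n²/σ_s used here is the form obtained from the MAP derivation for this prior under additive Gaussian noise of variance σ_n²; it is stated as background rather than quoted from this paper.

import numpy as np

def laplace_pdf(s, sigma):
    # Zero-mean Laplace density with variance sigma**2 (the density above).
    return np.exp(-np.sqrt(2.0) * np.abs(s) / sigma) / (np.sqrt(2.0) * sigma)

def soft_threshold(y, t):
    # Soft-threshold rule: move y toward zero by t, clipping at zero.
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

# MAP estimate of a Laplace signal (variance sigma_s**2) observed in Gaussian
# noise (variance sigma_n**2); threshold sqrt(2)*sigma_n**2/sigma_s assumed.
sigma_s, sigma_n = 1.0, 0.5
y = np.linspace(-4.0, 4.0, 9)
print(laplace_pdf(y, sigma_s))
print(soft_threshold(y, np.sqrt(2.0) * sigma_n**2 / sigma_s))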
[email protected] to multivariate models for groups of coefficients. For the scalar case, the posterior distribution and the MMSE estimator of a Laplace signal in additive independent Gaussian noise was derived in [10]. This paper extends those results to the multivariate case so that groups of coefficients can be modeled together. In this case, a d-component vector S is observed in Gaussian noise, Y = S + N,
    Y = S + N, \qquad Y, S, N \in \mathbb{R}^d.    (1)
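As a scalar reference point for the multivariate development that follows, the sketch below computes the MMSE shrinkage E[S | Y = y] for a Laplace prior in Gaussian noise by direct numerical quadrature over the posterior. It is a brute-force illustration rather than the closed-form expression of [10]; the grid width and spacing are arbitrary choices.

import numpy as np

def mmse_laplace_scalar(y, sigma_s, sigma_n, half_width=30.0, n_grid=20001):
    # Quadrature for E[S | Y = y] with a Laplace prior (variance sigma_s**2)
    # and Gaussian noise (variance sigma_n**2). Normalization constants
    # cancel in the ratio, so unnormalized densities are used.
    s = np.linspace(-half_width, half_width, n_grid)
    prior = np.exp(-np.sqrt(2.0) * np.abs(s) / sigma_s)
    y = np.atleast_1d(np.asarray(y, dtype=float))
    est = np.empty_like(y)
    for i, yi in enumerate(y):
        likelihood = np.exp(-0.5 * ((yi - s) / sigma_n) ** 2)
        posterior = prior * likelihood
        est[i] = np.trapz(s * posterior, s) / np.trapz(posterior, s)
    return est

# Example: shrinkage of a few observed values.
print(mmse_laplace_scalar([0.5, 1.0, 3.0], sigma_s=1.0, sigma_n=0.5))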
It turns out that to write the posterior density p_Y(y) and the MMSE estimator \hat{s}(y) we need the generalized incomplete Gamma function, a special function introduced in 1994 by Chaudhry and Zubair [2, 3].

2. LAPLACE RANDOM VECTORS

As described in [13], a spherically symmetric d-component random vector with Laplace marginals (each with variance σ²) can be generated by

    S = \sqrt{Z} \cdot X    (2)

where Z is an exponential (scalar) random variable and X is a d-component Gaussian random vector,

    p_Z(z) = \begin{cases} \exp(-z), & z \ge 0 \\ 0, & z < 0 \end{cases}    (3)
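A minimal sketch of the construction in (2)-(3), assuming X has covariance σ²I (an assumption made here so that E[Z] = 1 yields marginal variance σ²), draws samples and checks the Laplace behavior of the marginals: variance near σ² and excess kurtosis near 3.

import numpy as np

rng = np.random.default_rng(0)
d, sigma, n = 4, 1.0, 200000

# Eq. (2)-(3): Z is a unit-rate exponential scalar, X is a zero-mean Gaussian
# vector, here with covariance sigma**2 * I (assumed), and S = sqrt(Z) * X.
Z = rng.exponential(scale=1.0, size=(n, 1))
X = rng.normal(loc=0.0, scale=sigma, size=(n, d))
S = np.sqrt(Z) * X

# Each marginal should behave like a Laplace variable with variance sigma**2;
# the excess kurtosis of a Laplace distribution is 3.
print(S.var(axis=0))
centered = S - S.mean(axis=0)
print((centered ** 4).mean(axis=0) / S.var(axis=0) ** 2 - 3.0)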