A Kullback–Leibler Divergence Variant of the Bayesian Cramér–Rao Bound

Document Type

Article

Publication Date

June 1, 2023

Abstract

This paper proposes a Bayesian Cramér–Rao-type lower bound on the minimum mean square error. The key idea is to minimize the latter subject to the constraint that the joint distribution of the input-output statistics lies in a Kullback–Leibler divergence ball centered at a Gaussian reference distribution. The bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter that can be obtained by finding the unique root of a simple function. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bound in practice.
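The abstract states that the scalar parameter fixing the minimizing covariance is the unique root of a simple function, but it does not give that function. The following is a minimal numerical sketch of such a root-finding step, under the assumption (not taken from the paper) that the minimizing Gaussian scales the reference covariance by a factor c, in which case the Kullback–Leibler divergence between two n-dimensional same-mean Gaussians with covariances c·Σ and Σ equals (n/2)(c − 1 − log c), and the ball constraint pins c down. The function name kl_ball_scale and all numerical values are illustrative.

    # Hypothetical sketch of the root-finding step mentioned in the abstract.
    # Assumption (not from the paper): the minimizing Gaussian has covariance
    # c * Sigma_ref, so the KL-ball radius constraint reduces to
    # (n/2) * (c - 1 - log c) = eps for a scalar c in (0, 1].
    import numpy as np
    from scipy.optimize import brentq

    def kl_ball_scale(eps: float, n: int) -> float:
        """Return the scalar c in (0, 1] placing the same-mean Gaussian
        with covariance c * Sigma_ref on the KL ball of radius eps."""
        if eps == 0.0:
            return 1.0  # radius zero: only the reference itself qualifies
        # g is strictly decreasing on (0, 1], g(1) = -eps < 0, and
        # g(c) -> +inf as c -> 0+, so the root is unique.
        g = lambda c: 0.5 * n * (c - 1.0 - np.log(c)) - eps
        return brentq(g, 1e-12, 1.0)

    # Example: a 4-dimensional problem with KL radius 0.1.
    c = kl_ball_scale(eps=0.1, n=4)
    print(f"covariance scale c = {c:.4f}")  # c < 1 shrinks the reference covariance

This only illustrates the general pattern of solving a scalar KL-radius equation numerically; the paper's actual function and covariance parameterization should be taken from the full text at the DOI below.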

Identifier

85147198241 (Scopus)

Publication Title

Signal Processing

External Full Text Location

https://doi.org/10.1016/j.sigpro.2023.108933

ISSN

0165-1684

Volume

207
