The Cramer-Rao Lower Bound (CRLB) establishes a lower limit for the variance of any unbiased estimator of a parameter (denoted $\theta$ here, to maintain consistent notation with the literature).
Let $\mathbf{x}$ be a multidimensional random variable and $\theta$ an unknown deterministic parameter. Let $p(\mathbf{x}; \theta)$ be the probability density of $\mathbf{x}$ given $\theta$. We assume that such a probability density exists and is twice differentiable with respect to $\theta$.
$$\operatorname{var}(\hat{\theta}) \;\ge\; \frac{1}{-\,\mathbb{E}\!\left[\dfrac{\partial^{2} \ln p(\mathbf{x};\theta)}{\partial \theta^{2}}\right]} \tag{3.1}$$
Since the true value of $\theta$ is not known, the Cramer-Rao theorem only allows us to check whether a given estimator is optimal or not, i.e., whether its variance attains the bound.
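As a concrete sketch of the bound, consider the textbook case of estimating the mean $\theta$ of a Gaussian with known variance $\sigma^2$ from $n$ i.i.d. samples: the Fisher information is $n/\sigma^2$, so the CRLB equals $\sigma^2/n$, and the sample mean is unbiased and attains it. The Monte Carlo check below (all numeric values chosen for illustration) compares the empirical variance of the sample-mean estimator to the bound:

```python
import numpy as np

# For x_1..x_n i.i.d. N(theta, sigma^2), the Fisher information is
# I(theta) = n / sigma^2, so the CRLB is sigma^2 / n.
# The sample mean is an unbiased estimator that attains this bound.
rng = np.random.default_rng(0)
n, sigma, theta = 50, 2.0, 1.0   # illustrative values
crlb = sigma**2 / n

# Monte Carlo estimate of the variance of the sample-mean estimator.
trials = 20_000
estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)
var_hat = estimates.var()

print(f"CRLB      = {crlb:.4f}")
print(f"var(mean) = {var_hat:.4f}")  # should be close to the bound
```

The empirical variance matches the bound up to Monte Carlo error, illustrating that the sample mean is an efficient estimator in this model.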