Examples of Error Propagation

In the field of computer vision, the theory of error propagation is crucial, as basic operations for feature extraction are often affected by noise. For instance, measuring color intensity or determining the position of a particular feature in an image can be influenced by noise, making it essential to understand how this noise impacts subsequent calculations.

The measurement error due to additive noise is formalized as $x = \hat{x} + \varepsilon$, where $x$ is the observed value, $\hat{x}$ is the true value, and $\varepsilon$ is the additive noise, for instance, white Gaussian noise with variance $\sigma^{2}_{x}$.

In the case of vision, it is interesting to estimate how the error generated by the imprecise observation of a point in the image propagates through the system. In this scenario, the observed variables are the image coordinates $(x,y)$, both affected by localization error with variances $\sigma^{2}_{x}$ and $\sigma^{2}_{y}$ respectively, which are, at least to a first approximation, uncorrelated with each other.

Using the result from equation (2.29), a generic function $z(x,y)$ of two random variables can be approximated to first order through a Taylor series expansion as
\begin{displaymath}
z(x,y) \approx z(\hat{x},\hat{y}) + \frac{\partial z}{\partial x} (x - \hat{x}) + \frac{\partial z}{\partial y} (y - \hat{y})
\end{displaymath} (2.30)

From this we can estimate the uncertainty on the value of $z$, by applying the variance propagation discussed in the previous section, yielding

\begin{displaymath}
\sigma^{2}_{z} = \left( \frac{\partial z}{\partial x} \right)^{2} \sigma^{2}_{x} + \left( \frac{\partial z}{\partial y} \right)^{2} \sigma^{2}_{y}
\end{displaymath} (2.31)

having chosen to evaluate the derivatives at $(x,y)$.
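
As a quick sanity check, (2.31) can be compared against a Monte Carlo simulation. The sketch below (in Python with numpy; the choice $z = xy$ and all numerical values are illustrative assumptions, not taken from the text) draws uncorrelated Gaussian noise on $x$ and $y$ and compares the empirical variance of $z$ with the first-order prediction.

\begin{verbatim}
import numpy as np

# Monte Carlo check of (2.31). The function z = x*y and all numerical
# values below are illustrative assumptions.
rng = np.random.default_rng(0)

x_true, y_true = 10.0, 4.0        # assumed true coordinates
sigma_x, sigma_y = 0.3, 0.2       # assumed localization noise (std. dev.)

n = 200000
x = x_true + rng.normal(0.0, sigma_x, n)   # x = x_hat + eps_x
y = y_true + rng.normal(0.0, sigma_y, n)   # y = y_hat + eps_y (uncorrelated)

z = x * y

# First-order prediction: (dz/dx)^2 sigma_x^2 + (dz/dy)^2 sigma_y^2
var_pred = y_true**2 * sigma_x**2 + x_true**2 * sigma_y**2
print("predicted sigma_z^2:", var_pred)
print("empirical sigma_z^2:", z.var())
\end{verbatim}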

With this formulation, several examples can be presented:

Example 1
The propagation of the error of $z = x + y$ is
\begin{displaymath}
\sigma^{2}_{z} = \sigma^2_{x} + \sigma^2_{y}
\end{displaymath} (2.32)

a result already seen previously.
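Indeed, here $\partial z / \partial x = \partial z / \partial y = 1$, so (2.31) reduces immediately to the sum of the two variances.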

Example 2
The propagation of the error of $z = x y$ is
\begin{displaymath}
\sigma^{2}_{z} = y^2 \sigma^2_{x} + x^2 \sigma^2_{y}
\end{displaymath} (2.33)
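This follows directly from (2.31), since $\partial z / \partial x = y$ and $\partial z / \partial y = x$.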

Example 3
The propagation of the error of $z = \frac{1}{x \pm y}$ is
\begin{displaymath}
\sigma^{2}_{z} = \frac{ \sigma^{2}_{x} + \sigma^{2}_{y} } { (x \pm y)^4 }
\end{displaymath} (2.34)
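The $\pm$ sign disappears in (2.34) because $\partial z / \partial x = -1/(x \pm y)^{2}$ and $\partial z / \partial y = \mp 1/(x \pm y)^{2}$: squaring removes the sign, so both terms share the same factor $1/(x \pm y)^{4}$.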

Example 4
The propagation of the error of $z = \frac{x}{y}$ is
\begin{displaymath}
\sigma^{2}_{z} = \frac{1}{y^{2}} \sigma^{2}_{x} + \frac{x^{2}}{y^{4}} \sigma^{2}_{y}
\end{displaymath} (2.35)
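Dividing (2.35) by $z^{2} = x^{2}/y^{2}$ puts it in relative terms, $\sigma^{2}_{z}/z^{2} = \sigma^{2}_{x}/x^{2} + \sigma^{2}_{y}/y^{2}$: for both the product and the quotient, the relative variances of the inputs simply add.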

Example 5
The propagation of the error of $z = \sqrt{x^2 + y^2}$ is
\begin{displaymath}
\sigma^{2}_{z} = \frac{x^2 \sigma^{2}_{x} + y^2 \sigma^{2}_{y}}{x^2 + y^{2}}
\end{displaymath} (2.36)
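Equation (2.36) follows from $\partial z / \partial x = x / \sqrt{x^{2}+y^{2}}$ and $\partial z / \partial y = y / \sqrt{x^{2}+y^{2}}$, whose squares introduce the common denominator $x^{2}+y^{2}$.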

It is interesting to note from these equations how the absolute values taken by the variables ($x$ and $y$ in the examples) directly influence the estimate of the error on the final variable $z$: for some transformations the variance of the result decreases as the magnitude of the inputs increases, while for others the opposite happens. For this reason, depending on the transformation, and consequently on the model one aims to estimate, certain points in the image may be more important to observe than others.
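
As a small numerical illustration of this point (the values below are assumptions chosen only to show the trend), the predicted variance of the product $z = xy$ from (2.33) grows with the magnitude of the inputs, while that of the quotient $z = x/y$ from (2.35) shrinks as $|y|$ grows:

\begin{verbatim}
# Illustrative only: assumed unit noise variance on x and y, two magnitudes.
sigma_x2 = sigma_y2 = 1.0   # sigma_x^2 and sigma_y^2

for x, y in [(2.0, 2.0), (20.0, 20.0)]:
    var_prod = y**2 * sigma_x2 + x**2 * sigma_y2              # z = x*y, (2.33)
    var_ratio = sigma_x2 / y**2 + x**2 * sigma_y2 / y**4      # z = x/y, (2.35)
    print(x, y, var_prod, var_ratio)
\end{verbatim}

With these assumed values the variance of the product grows from 8 to 800, while that of the quotient drops from 0.5 to 0.005.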
