Elements of Probability

This section collects some probability relations that will be useful in the subsequent section.

Let's define the probability density function (PDF) as

\begin{displaymath}
p_X (x) = P(X = x)
\end{displaymath} (2.75)

to facilitate the transition from the discrete case to the continuous case.
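
With this notation, in the discrete case $p_X$ coincides with the probability mass function, while in the continuous case probabilities are recovered by integration; as an illustrative remark (not part of the original derivation),

\begin{displaymath}
P(a \le X \le b) = \int_a^b p_X(x)\, dx .
\end{displaymath}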

Bayes' theorem (or Bayes' formula) is a relationship obtained by combining the theorem of compound probability (the multiplication rule) with the law of total probability.

Starting from the multiplication rule $P(A,B)=P(A\vert B)P(B)$ (theorem of compound probability), the conditional probability is obtained as:

\begin{displaymath}
P(A\vert B) = \frac{P(A,B)}{P(B)}
\end{displaymath} (2.76)

and conversely
\begin{displaymath}
P(B\vert A) = \frac{P(B,A)}{P(A)}
\end{displaymath} (2.77)

Since $P(A,B)=P(B,A)$, combining the two expressions yields Bayes' theorem:
\begin{displaymath}
P(A\vert B) = \frac{P(B\vert A)P(A)}{P(B)}
\end{displaymath} (2.78)
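
As a purely illustrative numerical check (the values are chosen arbitrarily and do not appear in the original text), taking $P(A)=0.01$, $P(B\vert A)=0.9$ and $P(B)=0.05$, equation (2.78) gives
\begin{displaymath}
P(A\vert B) = \frac{0.9 \cdot 0.01}{0.05} = 0.18 .
\end{displaymath}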

The same reasoning can be applied in the case of three variables:

\begin{displaymath}
P(A,B,C) = P(A\vert B,C)P(B,C)=P(B\vert A,C)P(A,C)=P(B,A,C)
\end{displaymath} (2.79)
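
Dividing both sides of the second equality by $P(B,C)$ and factoring the joint probabilities through $C$, i.e. $P(A,C)=P(A\vert C)P(C)$ and $P(B,C)=P(B\vert C)P(C)$ (an intermediate step spelled out here for completeness), gives
\begin{displaymath}
P(A\vert B,C) = \frac{P(B\vert A,C)P(A,C)}{P(B,C)} = \frac{P(B\vert A,C)P(A\vert C)P(C)}{P(B\vert C)P(C)}
\end{displaymath}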

leading to Bayes' theorem
\begin{displaymath}
P(A\vert B,C) = \frac{P(B\vert A,C)P(A\vert C)}{P(B\vert C)}
\end{displaymath} (2.80)

which makes the conditioning on a third variable $C$ explicit.

Another important formula that will be used in the next section is the law of total probability, where the events $A_i$ form a partition of the sample space:

\begin{displaymath}
P(B)=\sum_i P(A_i, B) = \sum_i P(A_i) P(B\vert A_i)
\end{displaymath} (2.81)

or in the continuous case
\begin{displaymath}
p_X(x)=\int p_{X,Y}(x,y)\, dy=\int p_{X\vert Y}(x\vert y)\, p_Y(y)\, dy
\end{displaymath} (2.82)

which is the marginal density of $X$.
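
As a hypothetical numerical example of (2.81) (the values are chosen only for illustration), consider a partition $A_1, A_2$ with $P(A_1)=0.3$, $P(A_2)=0.7$, $P(B\vert A_1)=0.5$ and $P(B\vert A_2)=0.2$; then
\begin{displaymath}
P(B) = 0.3 \cdot 0.5 + 0.7 \cdot 0.2 = 0.29 .
\end{displaymath}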
