Total Least Squares
We now extend the linear problem $\mathbf{A}\mathbf{x} = \mathbf{b}$ to the more general case where the coefficient matrix $\mathbf{A}$ is also perturbed (Errors-In-Variables model, EIV [VHV91]). This type of least squares regression problem is referred to as Total Least Squares (TLS).
The solution of the perturbed system
\begin{displaymath}
\left( \mathbf{A} + \mathbf{E} \right) \mathbf{x} = \mathbf{b} + \mathbf{r}
\end{displaymath}
(3.20)
corresponds to finding the solution $\mathbf{x}$ that minimizes the Frobenius norm $\left\Vert \left[ \mathbf{E}\vert\mathbf{r} \right] \right\Vert_{F}$, subject to the constraint (3.20). In classical TLS, all columns of the data matrix contain noise. If some columns are error-free, then the solution is referred to as mixed TLS-LS.
The system (3.20) can be rewritten as
\begin{displaymath}
\left( \left[ \mathbf{A}\vert\mathbf{b} \right] + \left[ \mathbf{E}\vert\mathbf{r} \right] \right)
\begin{bmatrix}
\mathbf{x} \\
-1
\end{bmatrix} = \mathbf{0}
\end{displaymath}
(3.21)
Utilizing the SVD decomposition and the Eckart-Young-Mirsky theorem (the matrix formed by the first $k$ terms of the SVD decomposition is the rank-$k$ matrix that best approximates the original matrix under the Frobenius norm), it is possible to find the solution to the problem (3.20): the constraint (3.21) requires the perturbed matrix to be rank deficient, so the minimal perturbation is the one that removes its smallest singular value.
Let us denote
\begin{displaymath}
\mathbf{C} := \left[ \mathbf{A}\vert\mathbf{b} \right] = \mathbf{U} \boldsymbol\Sigma \mathbf{V}^{\top}
\end{displaymath}
(3.22)
the Singular Value Decomposition of the matrix $\mathbf{C} \in \mathbb{R}^{m \times (n+1)}$, where $\boldsymbol\Sigma = \operatorname{diag}(\sigma_1, \ldots, \sigma_{n+1})$ with $\sigma_1 \ge \ldots \ge \sigma_{n+1} \ge 0$.
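With this notation the previous argument can be made explicit (a standard derivation, sketched here for completeness rather than reproduced from the original text): the smallest Frobenius-norm perturbation that makes the augmented matrix rank deficient removes the last term of the SVD,
\begin{displaymath}
\left[ \mathbf{E}\vert\mathbf{r} \right] = -\sigma_{n+1} \mathbf{u}_{n+1} \mathbf{v}_{n+1}^{\top} ,
\qquad
\left( \mathbf{C} + \left[ \mathbf{E}\vert\mathbf{r} \right] \right) \mathbf{v}_{n+1} = \mathbf{0} ,
\end{displaymath}
so the vector $\begin{bmatrix} \mathbf{x}^{\top} & -1 \end{bmatrix}^{\top}$ must be proportional to the right singular vector $\mathbf{v}_{n+1}$ associated with the smallest singular value, which leads directly to the expression below.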
The Total Least Squares solution, if it exists, can be expressed as
\begin{displaymath}
\hat{\mathbf{x}} = - \frac{\mathbf{v}_{12}}{v_{22}}
\end{displaymath}
(3.23)
having partitioned
\begin{displaymath}
\mathbf{V} = \begin{bmatrix}
\mathbf{V}_{11} & \mathbf{v}_{12} \\
\mathbf{v}_{21}^{\top} & v_{22}
\end{bmatrix}
\qquad
\mathbf{V}_{11} \in \mathbb{R}^{n \times n}, \;
\mathbf{v}_{12} \in \mathbb{R}^{n}, \;
v_{22} \in \mathbb{R}
\end{displaymath}
(3.24)
and it is possible to obtain the best estimate of the data matrix $\left[ \mathbf{A}\vert\mathbf{b} \right]$ as
\begin{displaymath}
\left[ \hat{\mathbf{A}} \vert \hat{\mathbf{b}} \right] = \mathbf{U} \operatorname{diag}(\sigma_1, \ldots, \sigma_n, 0) \mathbf{V}^{\top} .
\end{displaymath}
(3.25)
The solution exists only when $v_{22} \neq 0$.
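The procedure (3.22)-(3.25) can be summarized in a short numerical sketch. The following Python snippet is not part of the original text: the function name and variables are illustrative, and it assumes an overdetermined system with $m \ge n+1$; it computes the TLS estimate through \texttt{numpy.linalg.svd}.
\begin{verbatim}
import numpy as np

def tls(A, b):
    """Total Least Squares sketch following (3.22)-(3.25).

    A : (m, n) data matrix, b : (m,) observation vector, with m >= n+1.
    Returns the TLS estimate x_hat and the rank-n approximation of [A | b].
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]

    # C = [A | b], eq. (3.22); singular values come back in descending order.
    C = np.hstack([A, b])
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    V = Vt.T

    # Partition of V, eq. (3.24): last column split into v12 (first n rows)
    # and the scalar v22.
    v12 = V[:n, n]
    v22 = V[n, n]
    if np.isclose(v22, 0.0):
        raise ValueError("TLS solution does not exist (v22 is zero)")

    # TLS solution, eq. (3.23).
    x_hat = -v12 / v22

    # Best rank-n approximation of C (Eckart-Young-Mirsky): zero the
    # smallest singular value, eq. (3.25).
    s_hat = s.copy()
    s_hat[-1] = 0.0
    C_hat = U @ np.diag(s_hat) @ Vt

    return x_hat, C_hat
\end{verbatim}
When the augmented system is exactly consistent, $\sigma_{n+1} = 0$ and the sketch returns the exact solution; with noisy data it returns the estimate minimizing the Frobenius norm of the joint perturbation, as stated above.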