Singular Value Decomposition
The SVD is a more expensive decomposition than either LU or QR, but it can also do more. Unlike LU and QR, it works even on singular matrices. It reveals ranks and condition numbers. And it provides orthonormal bases for a matrix's kernel and range, and for its input and output spaces as a whole.
In this tutorial, we will be using the PrintMatrix method that we defined earlier to display results.
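If you don't have that method handy, here is a minimal sketch of what it could look like; the exact signature and formatting are assumptions made for this sketch (the real definition appears in the earlier tutorial), and the AnyRectangularMatrix base class is used so the helper accepts both matrices and vectors.
using System;
using Meta.Numerics.Matrices;

// A minimal stand-in for the PrintMatrix helper: writes a label, then the
// entries of the matrix, one row per line in a fixed-width format.
public static void PrintMatrix (string name, AnyRectangularMatrix M) {
    Console.WriteLine(name);
    for (int r = 0; r < M.RowCount; r++) {
        for (int c = 0; c < M.ColumnCount; c++) {
            Console.Write("{0,10:g4} ", M[r, c]);
        }
        Console.WriteLine();
    }
}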
Here's some code to perform an SVD:
using Meta.Numerics.Matrices;
SquareMatrix A = new SquareMatrix(new double[,] {
{ 1, -2, 3 },
{ 2, -5, 12 },
{ 0, 2, -10 }
});
SingularValueDecomposition svdA = A.SingularValueDecomposition();
The SVD determines two separate orthogonal matrices which, acting together from the left and right, diagonalize the original matrix. Here's some code that shows you how to get them, verifies that they are orthogonal, and confirms that they do their job:
SquareMatrix V = svdA.LeftTransformMatrix;
SquareMatrix W = svdA.RightTransformMatrix;
PrintMatrix("V^T V", V.MultiplyTransposeBySelf());
PrintMatrix("W^T W", W.MultiplyTransposeBySelf());
PrintMatrix("V^T A W", V.Transpose * A * W);
The last product is a diagonal matrix, and its diagonal elements are the singular values. Don't worry: the SingularValueDecomposition object gives you a better way to get them, without doing this matrix multiplication yourself.
Getting the rank and condition number of the decomposed matrix is easy:
using System;
Console.WriteLine($"A has rank = {svdA.Rank}, condition number = {svdA.ConditionNumber}.");
Since our example A is invertible, it has full rank. The condition number measures how much small relative changes in the input vectors on which the matrix acts can be amplified in the corresponding output vectors.
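In terms of the decomposition itself, the condition number (in the 2-norm) is the ratio of the largest singular value to the smallest. Here is a quick check of that relationship; it uses the Contributors collection, which is described in more detail below, and makes no assumption about the order in which the singular values are stored:
// The condition number equals the largest singular value divided by the smallest.
double sMax = 0.0;
double sMin = Double.MaxValue;
for (int i = 0; i < svdA.Contributors.Count; i++) {
    double s = svdA.Contributors[i].SingularValue;
    if (s > sMax) sMax = s;
    if (s < sMin) sMin = s;
}
Console.WriteLine($"sMax / sMin = {sMax / sMin} vs. ConditionNumber = {svdA.ConditionNumber}");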
A singular matrix is decomposed in exactly the same way as a non-singular one. Here is some code that decomposes a manifestly singular matrix and reports its rank:
SquareMatrix S = new SquareMatrix(new double[,] {
{ 11, 9, 0 },
{ 9, 11, 0 },
{ 0, 0, 0 }
});
SingularValueDecomposition svdS = S.SingularValueDecomposition();
Console.WriteLine($"S has rank = {svdS.Rank}, condition number = {svdS.ConditionNumber}.");
Since it is singular, it has an infinite condition number.
The singular values, and their corresponding left and right singular vectors, are accessible via the Contributors property. Here is some code that illustrates how to get at them:
// Accumulate the rank-1 contributions; their sum should reproduce the original matrix S.
RectangularMatrix R = new RectangularMatrix(svdS.RowCount, svdS.ColumnCount);
foreach (SingularValueContributor contributor in svdS.Contributors) {
Console.WriteLine($"s * L * R^T with s = {contributor.SingularValue}");
PrintMatrix("L^T", contributor.LeftSingularVector.Transpose);
PrintMatrix("R^T", contributor.RightSingularVector.Transpose);
R += contributor.SingularValue * contributor.LeftSingularVector * contributor.RightSingularVector.Transpose;
}
PrintMatrix("R", R);
Besides illustrating how to get at the SVD contributors, what is this code doing? One illuminating way to think of the SVD is that it tells you how to construct your matrix from rank-1 outer products of orthonormal basis vectors of the source and target spaces, each with its own weight. The weights are the singular values. That means we can reconstruct the original matrix by adding up these contributions, and that's exactly what our code did.
Here's some more code showing how to access the singular value contributors by index, and verifying that the left and right bases are orthonormal:
SingularValueContributorCollection contributors = svdS.Contributors;
for (int i = 0; i < contributors.Count; i++) {
for (int j = 0; j <= i; j++) {
Console.WriteLine($"l_{i}^T l_{j} = " +
$"{contributors[i].LeftSingularVector.Transpose * contributors[j].LeftSingularVector}");
Console.WriteLine($"r_{i}^T r_{j} = " +
$"{contributors[i].RightSingularVector.Transpose * contributors[j].RightSingularVector}");
}
}
You can also use the SVD to solve a linear system Ax = b. Here is some code that does it for our non-singular example matrix:
ColumnVector b = new ColumnVector(0.0, 1.0, 2.0);
ColumnVector x = svdA.Solve(b);
PrintMatrix("Ax", A * x);
Amazingly, the SVD can even, in a least-squares sense, do it for a singular matrix:
ColumnVector t = new ColumnVector(1.0, -1.0, 0.0);
ColumnVector y = svdS.Solve(t);
PrintMatrix("Sy", S * y);
What the SVD gives you, in this case, is the input vector y that makes Sy as close as it can get to t. If, as in our example, t is in the range of S, this is exactly what you want. Even if t isn't fully in the range of S, the closest achievable result may be a useful thing to know.
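To see the second case in action, here is a quick sketch that feeds Solve a target with a component outside the range of S. The particular target vector is just an illustrative choice: since the third row and column of S are zero, no input can produce a non-zero third output component, so the solution can only reproduce the reachable part of the target.
// A target with a component outside the range of S: the third row of S is zero,
// so no input vector can produce a non-zero third output component.
ColumnVector u = new ColumnVector(1.0, 0.0, 1.0);
ColumnVector z = svdS.Solve(u);
// Sz reproduces only the reachable part of u, namely (1, 0, 0).
PrintMatrix("Sz", S * z);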