Singular Value Decomposition
Singular value decomposition generalizes the spectral theorem to arbitrary $m$-by-$n$ matrices. Treating an $m$-by-$n$ matrix $A$ as representing a linear transformation $T \colon \mathbb{R}^n \to \mathbb{R}^m$, singular value decomposition states that there are bases for $\mathbb{R}^n$ and $\mathbb{R}^m$ in which the matrix of $T$ takes a diagonal form.
If $A$ is any $m$-by-$n$ matrix with entries in $\mathbb{R}$, there exists an $m$-by-$m$ orthogonal matrix $U$, an $n$-by-$n$ orthogonal matrix $V$, and an $m$-by-$n$ diagonal matrix $\Sigma$ such that
$$A = U \Sigma V^T.$$
Furthermore, the numbers on the diagonal of $\Sigma$ are non-negative.
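The statement of the theorem can be checked numerically. The following sketch uses numpy's built-in `linalg.svd` routine on a small hypothetical matrix (the matrix entries are chosen only for illustration) and verifies each claim: $U$ and $V$ are orthogonal, $\Sigma$ is diagonal with non-negative entries, and $A = U \Sigma V^T$.

```python
import numpy as np

# A hypothetical 2-by-3 matrix, chosen only to illustrate the theorem.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# With full_matrices=True, numpy returns U (m-by-m), the singular
# values, and V^T (n-by-n).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m-by-n diagonal matrix Sigma from the singular values.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U and V are orthogonal, the diagonal entries of Sigma are
# non-negative, and A = U Sigma V^T up to floating-point error.
assert np.allclose(U @ U.T, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(3))
assert np.all(s >= 0)
assert np.allclose(U @ Sigma @ Vt, A)
```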
Recall that an $m$-by-$n$ matrix is called diagonal when any entry whose row index does not equal its column index is zero. For example, the $2$-by-$3$ matrix
$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \end{pmatrix}$$
is diagonal.
Remark About Special Case: In the case when $m = n$ and $A$ is symmetric, note that singular value decomposition is strictly weaker than the spectral theorem, since the orthogonal matrices $U$ and $V$ whose existence is ensured by singular value decomposition are not necessarily related (whereas in the spectral theorem, one orthogonal matrix is inverse to the other). However, in singular value decomposition, the entries of the diagonal matrix are ensured to be non-negative, whereas the spectral theorem makes no such guarantee.
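Both halves of this trade-off can be seen on a small symmetric matrix with a negative eigenvalue (the matrix below is a hypothetical example, not one from the text): the spectral decomposition produces a negative diagonal entry, while the SVD produces non-negative singular values at the cost of two unrelated orthogonal factors.

```python
import numpy as np

# A symmetric matrix with eigenvalues 1 and -1.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Spectral theorem: Q^T A Q is diagonal, but an entry is negative.
eigenvalues, Q = np.linalg.eigh(A)
assert np.any(eigenvalues < 0)

# SVD: the singular values are non-negative, but U is no longer
# the inverse of V^T (if it were, A would equal the identity).
U, s, Vt = np.linalg.svd(A)
assert np.all(s >= 0)
assert not np.allclose(U, Vt.T)
```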
Proof Using Spectral Theorem
Remark About Notation: In this proof, we will use block matrix notation. In block matrix notation, submatrices of a matrix are represented as entries of another matrix, in order to ease notation and computation. For example, consider the $5$-by-$5$ matrix
$$\begin{pmatrix} 3 & 0 & 0 & 0 & 0 \\ 0 & 7 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}.$$
Letting $D$ denote the diagonal matrix $\begin{pmatrix} 3 & 0 \\ 0 & 7 \end{pmatrix}$, we can write the $5$-by-$5$ matrix above in the block form
$$\begin{pmatrix} D & 0 \\ 0 & 0 \end{pmatrix},$$
where the zeroes represent the zero matrices in dimensions $2$-by-$3$, $3$-by-$2$, and $3$-by-$3$.
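Block matrix notation corresponds directly to assembling a matrix from submatrices. As a sketch, using a hypothetical $2$-by-$2$ diagonal block $D$ inside a $5$-by-$5$ matrix, numpy's `np.block` performs exactly this assembly:

```python
import numpy as np

# A hypothetical 2-by-2 diagonal block.
D = np.diag([3.0, 7.0])

# Assemble the 5-by-5 matrix from four blocks: D and three zero
# matrices of dimensions 2-by-3, 3-by-2, and 3-by-3.
M = np.block([
    [D,                np.zeros((2, 3))],
    [np.zeros((3, 2)), np.zeros((3, 3))],
])

assert M.shape == (5, 5)
assert np.allclose(M[:2, :2], D)   # the top-left block is D
```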
Let $A$ be the $m$-by-$n$ matrix that we wish to singular value decompose, and let $B$ denote the $n$-by-$n$ matrix $A^T A$.
Step 1 - Eigenvalues Of $A^T A$ Are Non-Negative Reals: If $\lambda$ is an eigenvalue of $A^T A$, with eigenvector $v$, we compute
$$0 \le \|Av\|^2 = (Av)^T (Av) = v^T A^T A v = v^T (\lambda v) = \lambda \|v\|^2.$$
This equation is only possible if $\lambda \ge 0$.
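This computation is easy to confirm numerically: for any real matrix (the one below is a hypothetical example), every eigenvalue of $A^T A$ is non-negative up to floating-point rounding.

```python
import numpy as np

# A hypothetical matrix; any real matrix works here.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [4.0, -1.0]])

B = A.T @ A                          # B = A^T A is symmetric
eigenvalues = np.linalg.eigvalsh(B)  # eigvalsh: symmetric eigenvalues

# ||Av||^2 = lambda ||v||^2 forces every eigenvalue to be >= 0
# (the small tolerance absorbs floating-point rounding).
assert np.all(eigenvalues >= -1e-12)
```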
Step 2 - Using Spectral Theorem: By the spectral theorem, since $A^T A$ is symmetric, there is an orthogonal matrix $V$ such that $V^T A^T A V$ is diagonal. Write the block form
$$V^T A^T A V = \begin{pmatrix} D & 0 \\ 0 & 0 \end{pmatrix},$$
where $D$ is a diagonal matrix that has only positive entries. We can ensure this ordering of the eigenvalues in the diagonal matrix (positive entries first, zeroes last) by multiplying $V$ with a suitable permutation matrix.
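In numpy, `linalg.eigh` plays the role of the spectral theorem for the symmetric matrix $A^T A$, and sorting the eigenvalues in descending order plays the role of multiplying by a permutation matrix (the matrix $A$ below is a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
B = A.T @ A

# eigh diagonalizes the symmetric matrix B with an orthogonal V,
# so that V^T B V is diagonal.
w, V = np.linalg.eigh(B)

# Reorder so positive eigenvalues come first, zeroes last; this is
# the permutation-matrix step from the proof.
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

assert np.allclose(V.T @ V, np.eye(3))       # V is orthogonal
assert np.allclose(V.T @ B @ V, np.diag(w))  # diagonal block form
```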
Step 3 - Breaking $AV$ Into Blocks: Partition $AV$ into the block form $AV = \begin{pmatrix} F & G \end{pmatrix}$, so that
$$\begin{pmatrix} F^T F & F^T G \\ G^T F & G^T G \end{pmatrix} = (AV)^T (AV) = V^T A^T A V = \begin{pmatrix} D & 0 \\ 0 & 0 \end{pmatrix}.$$
In particular, $F^T F = D$ and $G^T G = 0$. This latter equation implies $G = 0$, since the diagonal entries of $G^T G$ are the squared norms of the columns of $G$. Furthermore, note that $V V^T = I$, where $I$ is the identity matrix of some suitable dimension; this is true from a coordinate-wise computation using the fact that $V$ is orthogonal.
Step 4 - Fudging Blocks To Get What We Want: Since the entries of $D$ are positive, there is a diagonal matrix $E$ whose entries are also positive and whose square is $D$; furthermore, since the entries are positive, $E$ is invertible. Let $U_1 = F E^{-1}$. We compute
$$U_1 \begin{pmatrix} E & 0 \end{pmatrix} V^T = F E^{-1} \begin{pmatrix} E & 0 \end{pmatrix} V^T = \begin{pmatrix} F & 0 \end{pmatrix} V^T = (AV) V^T = A,$$
where the final equality follows from the fact $V V^T = I$. This is almost what we want, but $U_1$ and $\begin{pmatrix} E & 0 \end{pmatrix}$ are not necessarily orthogonal and diagonal of the required sizes, because they aren't even necessarily square!
Step 5 - Fudging $U_1$ and $\begin{pmatrix} E & 0 \end{pmatrix}$ To Actually Get What We Want: Note
$$U_1^T U_1 = (F E^{-1})^T (F E^{-1}) = E^{-1} F^T F E^{-1} = E^{-1} D E^{-1} = I.$$
This implies the columns of $U_1$ are orthonormal, and hence this collection of columns can be extended to an orthonormal basis of $\mathbb{R}^m$. Consequently, there is a matrix $U_2$ such that the $m$-by-$m$ block matrix $U = \begin{pmatrix} U_1 & U_2 \end{pmatrix}$ is orthogonal. Now, we fudge $\begin{pmatrix} E & 0 \end{pmatrix}$ by adding the suitable number of zero rows; call the $m$-by-$n$ diagonal matrix obtained by this process $\Sigma$. Then, a computation will prove
$$U \Sigma V^T = U_1 \begin{pmatrix} E & 0 \end{pmatrix} V^T = A,$$
so we are now actually done.
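The construction in Steps 2 through 5 can be carried out numerically as a sketch (not a production algorithm): diagonalize $A^T A$, form $F$, $E$, and $U_1 = F E^{-1}$, extend the orthonormal columns of $U_1$ to an orthogonal $U$ (here via a QR factorization of a random completion, one of many valid choices for $U_2$), and pad $\begin{pmatrix} E & 0 \end{pmatrix}$ with zero rows to get $\Sigma$. The function name and the tolerance parameter are illustrative assumptions.

```python
import numpy as np

def svd_via_spectral(A, tol=1e-10):
    """Sketch of the proof's construction; not a production algorithm."""
    m, n = A.shape
    # Step 2: diagonalize A^T A with an orthogonal V, eigenvalues descending.
    w, V = np.linalg.eigh(A.T @ A)
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]
    r = int(np.sum(w > tol))          # number of positive eigenvalues
    # Steps 3-4: F is the first r columns of AV; U1 = F E^{-1}.
    F = (A @ V)[:, :r]
    E = np.sqrt(w[:r])                # diagonal of E, with E^2 = D
    U1 = F / E                        # divide column j of F by E[j]
    # Step 5: extend the orthonormal columns of U1 to an orthogonal U.
    rng = np.random.default_rng(0)
    Q, R = np.linalg.qr(np.hstack([U1, rng.standard_normal((m, m - r))]))
    U = Q * np.sign(np.diag(R))       # undo QR's sign flips on U1's columns
    # Pad (E 0) with zero rows to get the m-by-n diagonal matrix Sigma.
    Sigma = np.zeros((m, n))
    Sigma[:r, :r] = np.diag(E)
    return U, Sigma, V

# A hypothetical example matrix.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
U, Sigma, V = svd_via_spectral(A)
assert np.allclose(U @ U.T, np.eye(A.shape[0]))   # U is orthogonal
assert np.allclose(U @ Sigma @ V.T, A, atol=1e-6) # A = U Sigma V^T
```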
Singular Values of a Matrix
If $A = U \Sigma V^T$ is a singular value decomposition of $A$, the orthogonal matrices $U$ and $V$ are not unique. However, the diagonal entries of $\Sigma$ are unique, at least up to a permutation. These entries are called the singular values of $A$.
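The proof above shows why the singular values are determined by $A$: they are the square roots of the eigenvalues of $A^T A$. This can be checked numerically on a hypothetical example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0],
              [0.0, 1.0]])

# Singular values from the library routine (returned in descending order)...
s = np.linalg.svd(A, compute_uv=False)

# ...agree with the square roots of the eigenvalues of A^T A,
# sorted in the same descending order.
w = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s, np.sqrt(w))
```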