Matrix Factorizations
Matrix factorization is a central concept in numerical linear algebra, a core area of numerical analysis, with applications throughout science, engineering, and mathematics. The idea is to express a given matrix as a product of two or more matrices with specific properties, which can make solving linear systems, computing inverses, and determining matrix properties far more efficient. In this lesson, we explore several types of matrix factorization, their purposes, and simple examples of their applications.
Basic terminology and concepts
Before diving into specific matrix factorizations, it is important to understand basic matrix operations and properties. Matrices are two-dimensional arrays of numbers that can be added, subtracted, and multiplied according to certain rules. A fundamental operation is matrix multiplication, which is non-commutative: multiplying a matrix A by B does not, in general, give the same result as multiplying B by A.
Matrix factorization involves decomposing a matrix into a product of simpler matrices. It is especially helpful for solving equations of the form
Ax = b
where A is a matrix and x and b are vectors.
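As a quick illustration, here is a minimal NumPy sketch of solving Ax = b; the 2x2 system below is invented purely for illustration:

```python
import numpy as np

# An illustrative 2x2 system Ax = b (values chosen arbitrarily)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# np.linalg.solve factorizes A internally (LU with pivoting) and solves for x
x = np.linalg.solve(A, b)
print(x)  # approximately [0.8, 1.4]
```

Under the hood, library routines like this rely on exactly the factorizations discussed in the rest of this lesson.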
LU decomposition
LU decomposition is a method in which a matrix A is decomposed into two matrices, L and U, where L is a lower triangular matrix and U is an upper triangular matrix. This factorization applies to square matrices (matrices with the same number of rows and columns), although in general row interchanges (pivoting) may be needed for the factorization to exist and to keep the computation numerically stable.
The goal of LU decomposition is to simplify the process of solving linear systems and calculating determinants and inverses. A matrix is decomposed as follows:
A = LU
Here's a simple example with a 3x3 matrix:
A = | 2 3 1 |
| 4 7 -1 |
| -2 4 5 |
L = | 1 0 0 |
| 2 1 0 |
| -1 7 1 |
U = | 2 3 1 |
| 0 1 -3 |
| 0 0 27 |
Through forward substitution with L and backward substitution with U, solving Ax = b becomes computationally efficient.
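The elimination and substitution steps can be sketched in NumPy as follows. This is a minimal version without pivoting, so it assumes nonzero pivots; the right-hand side b is chosen arbitrarily for illustration:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # store the multiplier in L
            U[i, k:] -= L[i, k] * U[k, k:]   # eliminate below the pivot
    return L, U

def solve_lu(L, U, b):
    """Solve LUx = b by forward then backward substitution."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):                        # forward: Ly = b
        y[i] = b[i] - L[i, :i] @ y[:i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):            # backward: Ux = y
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[2.0, 3.0, 1.0],
              [4.0, 7.0, -1.0],
              [-2.0, 4.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])
L, U = lu_no_pivot(A)
x = solve_lu(L, U, b)
```

Each triangular solve costs only O(n^2) operations, so once L and U are computed they can be reused cheaply for many right-hand sides.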
Cholesky decomposition
The Cholesky decomposition is a special case of LU decomposition that applies to Hermitian, positive-definite matrices. It expresses the matrix A as:
A = LL^T
where L is a lower triangular matrix with real, positive diagonal entries and L^T is the transpose of L (for complex matrices, the conjugate transpose L* is used).
Consider a symmetric matrix example:
A = | 4 12 -16 |
| 12 37 -43 |
| -16 -43 98 |
L = | 2 0 0 |
| 6 1 0 |
| -8 5 3 |
L^T = | 2 6 -8 |
| 0 1 5 |
| 0 0 3 |
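This example can be checked directly with NumPy's built-in routine, which returns the lower triangular factor:

```python
import numpy as np

A = np.array([[4.0, 12.0, -16.0],
              [12.0, 37.0, -43.0],
              [-16.0, -43.0, 98.0]])

# np.linalg.cholesky returns the lower triangular L with A = L @ L.T
L = np.linalg.cholesky(A)
print(L)
```

The routine raises `numpy.linalg.LinAlgError` if A is not positive definite, which makes it a convenient positive-definiteness test as well.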
Applications in system solutions
The Cholesky decomposition is useful for solving the system Ax = b when A is symmetric and positive definite. It is advantageous because of its numerical stability and efficiency: it requires roughly half the arithmetic of a general LU decomposition.
QR decomposition
The QR decomposition factors a matrix A into the product of two matrices Q and R, where Q is an orthogonal matrix (its columns are orthonormal) and R is an upper triangular matrix:
A = QR
It is particularly useful in solving linear systems and eigenvalue computations, and in computing the least-squares solution of Ax = b.
Consider this example:
A = | 1 2 |
| 3 4 |
| 5 6 |
Q = | 1/√35 13/√210 |
| 3/√35 4/√210 |
| 5/√35 -5/√210 |
R = | √35 44/√35 |
| 0 12/√210 |
Here Q is 3x2 with orthonormal columns; this is the reduced QR factorization.
Singular value decomposition (SVD)
One of the most powerful decomposition techniques, singular value decomposition represents the matrix A as:
A = UΣV*
where U and V are unitary matrices and Σ (sigma) is a diagonal matrix whose non-negative entries are the singular values.
SVD is particularly important because it can be applied to any m×n matrix, not just square matrices. It is used in signal processing, statistics, and computer vision.
Example:
A = | 1 0 1 |
| 0 1 0 |
| 1 0 1 |
U = | 1/√2 0 1/√2 |
| 0 1 0 |
| 1/√2 0 -1/√2 |
Σ = | 2 0 0 |
| 0 1 0 |
| 0 0 0 |
V * = | 1/√2 0 1/√2 |
| 0 1 0 |
| -1/√2 0 1/√2 |
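The example can be verified numerically; NumPy returns U, the singular values in descending order, and V* directly:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# np.linalg.svd returns U, the singular values s (descending), and V* as Vt
U, s, Vt = np.linalg.svd(A)

# For this symmetric positive semi-definite A, the singular values are 2, 1, 0
```

The zero singular value signals that A is rank-deficient (rank 2), which the SVD exposes directly.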
Applications of SVD
SVD is used in solving systems of equations, signal processing, and statistics. Its application in Principal Component Analysis (PCA) helps reduce the dimensionality of the dataset, which is important for machine learning applications.
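The PCA use case can be sketched as follows; the data here is synthetic, generated just so that one direction clearly dominates the variance:

```python
import numpy as np

# Synthetic data (invented for illustration): 100 points in 2-D,
# stretched along the first axis so one direction dominates the variance
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2)) * np.array([3.0, 0.3])

# PCA via SVD: center the data; the rows of Vt are the principal axes
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the first principal component: 2-D -> 1-D
X_reduced = Xc @ Vt[0]
```

The squared singular values are proportional to the variance captured by each component, so keeping only the leading components discards as little information as possible.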
Practical considerations
Matrix factorizations are not just theoretical constructs; they are implemented in computational libraries and software such as LAPACK, MATLAB, and NumPy in Python. These libraries perform the efficient numerical calculations required for large-scale problems.
When applying decompositions, numerical stability and computational cost are important considerations. Depending on the properties of the matrix (e.g., sparse, symmetric, etc.), certain decompositions are more suitable.
Conclusion
Matrix factorizations such as LU, Cholesky, QR, and SVD are integral tools in numerical linear algebra. They streamline operations on matrices, such as solving linear systems and analyzing matrix properties, making them invaluable for practical applications. Understanding these factorizations expands your ability to solve complex computational problems efficiently and accurately.