Cholesky Decomposition Calculator

Created by Rijk de Wet
Last updated: Dec 14, 2021

Welcome to the Cholesky decomposition calculator. In this accompanying text to the tool, we'll learn all there is to know about the Cholesky factorization, which decomposes a suitable matrix into the product of a lower triangular matrix and its transpose. We'll specifically cover how to calculate the Cholesky decomposition and walk through an example of the Cholesky decomposition of a 3×3 matrix.

What is a matrix decomposition?

Before we can learn about the Cholesky decomposition, we must define what a matrix decomposition is. In the world of linear algebra, a matrix decomposition (or a matrix factorization) is the factorization of a matrix into a product of matrices. So, just as you can factorize $16$ into groups of products like $4\times4$ or $2\times8$, you can factorize a matrix like $A$ below… but how?

$A = \begin{bmatrix} 2 & 3 \\ 3 & 5 \\ \end{bmatrix}$

Factorizing a matrix is much harder than factorizing a number. Lucky for us, mathematicians have discovered many different methods of performing matrix decompositions. The most famous of these methods are the LU decomposition, the QR decomposition, and the Cholesky decomposition. Which method you'd want to use depends on the problem you're trying to solve.
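As a quick illustration (a sketch using NumPy, which is our own addition and not part of the calculator), here are two of these decompositions applied to the matrix $A$ above. Each one factors $A$ into matrices with special structure, and multiplying the factors back together recovers $A$:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 5.0]])

# QR decomposition: A = Q @ R, with Q orthogonal and R upper triangular
Q, R = np.linalg.qr(A)

# Cholesky decomposition: A = L @ L.T, with L lower triangular
L = np.linalg.cholesky(A)

# Both factorizations reproduce A when multiplied back together
print(np.allclose(Q @ R, A))
print(np.allclose(L @ L.T, A))
```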

What is the Cholesky decomposition?

Knowing what matrix decomposition is, we can go on to define the Cholesky decomposition. The Cholesky decomposition (or the Cholesky factorization) is the factorization of a matrix $A$ into the product of a lower triangular matrix $L$ and its transpose. We can rewrite this decomposition in mathematical notation as:

$A = L\cdot L^T$

To be Cholesky-decomposed, matrix $A$ needs to adhere to some criteria:

• $A$ must be symmetric, i.e. $A^T = A$.

• By extension, this means $A$ must be square.

• $A$ must be positive definite (meaning its eigenvalues must all be positive).

If $A$ doesn't tick all the items on our list, no suitable $L$ can exist.
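The checklist above is easy to automate. Here's a minimal sketch in Python (the helper name `has_cholesky` is our own invention) that tests all three criteria — square, symmetric, and positive definite — using NumPy's eigenvalue routine for symmetric matrices:

```python
import numpy as np

def has_cholesky(A, tol=1e-10):
    """Return True if A is symmetric and positive definite."""
    A = np.asarray(A, dtype=float)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False                       # must be square
    if not np.allclose(A, A.T, atol=tol):
        return False                       # must be symmetric
    # A symmetric matrix has real eigenvalues; all must be positive
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

print(has_cholesky([[2, 3], [3, 5]]))   # symmetric with positive eigenvalues
print(has_cholesky([[1, 2], [3, 4]]))   # not symmetric
```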

How to calculate the Cholesky decomposition?

The goal of any matrix decomposition method is to find the factorization's terms, and so we want to find the lower triangular matrix $L$. The Cholesky decomposition has no single mathematical formula, but it is easily obtained by hand for a small matrix. For larger matrices, there's an algorithmic process to follow.

Let's begin by looking at the simple $2\times2$ matrix case symbolically. We first define:

$A = \begin{bmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \\ \end{bmatrix}$

…and:

$L = \begin{bmatrix} b_{1,1} & 0 \\ b_{2,1} & b_{2,2} \\ \end{bmatrix}$

Note that the elements of $L$ above its diagonal are zero, as $L$ is a lower triangular matrix.

We know from the definition of the Cholesky factorization that $A = L\cdot L^T$, so let's take a look at the right-hand side of this equation.

$\begin{split} &\ L\cdot L^T \\ =&\ \begin{bmatrix} b_{1,1} & 0 \\ b_{2,1} & b_{2,2} \\ \end{bmatrix} \cdot \begin{bmatrix} b_{1,1} & b_{2,1} \\ 0 & b_{2,2} \\ \end{bmatrix} \\ =&\ \begin{bmatrix} (b_{1,1})^2 & b_{1,1}\cdot b_{2,1} \\ b_{1,1}\cdot b_{2,1} & (b_{2,1})^2 + (b_{2,2})^2 \\ \end{bmatrix} \end{split}$

Knowing the above and that $A = L\cdot L^T$, we can just equate corresponding elements of $A$ and $L\cdot L^T$ and solve the equations:

\begin{alignat*}{2} & a_{1,1} = (b_{1,1})^2 \\ \therefore\ & b_{1,1} = \sqrt{a_{1,1}} \\ & a_{2,1} = b_{1,1}\cdot b_{2,1} \\ \therefore\ & b_{2,1} = a_{2,1}\ /\ b_{1,1} \\ & a_{2,2} = (b_{2,1})^2 + (b_{2,2})^2 \\ \therefore\ & b_{2,2} = \sqrt{a_{2,2} - (b_{2,1})^2} \end{alignat*}

Notice that we need earlier elements of $L$ to solve for the later elements: $b_{2,2}$ needs $b_{2,1}$, which needs $b_{1,1}$.
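The three equations above translate directly into code. Here's a small sketch (the function name is our own) that computes the 2×2 case in exactly that order — $b_{1,1}$ first, then $b_{2,1}$, then $b_{2,2}$:

```python
from math import sqrt

def cholesky_2x2(a11, a21, a22):
    """Solve the 2x2 case with the three equations, in order."""
    b11 = sqrt(a11)               # b11 = sqrt(a11)
    b21 = a21 / b11               # b21 = a21 / b11
    b22 = sqrt(a22 - b21 ** 2)    # b22 = sqrt(a22 - b21^2)
    return [[b11, 0.0], [b21, b22]]

# The matrix A from the top of the article: [[2, 3], [3, 5]]
L = cholesky_2x2(2, 3, 5)
print(L)
```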

We can use the equations above to solve for $L$ given the $2\times2$ matrix $A$ that we defined at the top:

$\begin{split} A &= \begin{bmatrix} 2 & 3 \\ 3 & 5 \end{bmatrix} \\ \therefore L &= \begin{bmatrix} \sqrt{2} & 0 \\ \frac{3}{\sqrt{2}} & \sqrt{\frac{1}{2}}\end{bmatrix} \end{split}$

The principles of the above example apply to the Cholesky factorization of any sized matrix. For larger matrices, we can generalize the process with the following two equations.

For elements on $L$'s diagonal:

$b_{j,j} = \sqrt{a_{j,j} - \sum_{k=1}^{j-1} (b_{j,k})^2}$

For elements off $L$'s diagonal:

$b_{i,j} = \frac{1}{b_{j,j}}\left(a_{i,j} - \sum_{k=1}^{j-1} b_{i,k}\cdot b_{j,k}\right)\quad (i > j)$
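These two general equations are easy to turn into a pair of nested loops. Below is a sketch in plain Python (our own implementation, not the calculator's internals): moving left-to-right, top-to-bottom guarantees that every element of $L$ we need has already been computed.

```python
from math import sqrt

def cholesky(A):
    """Cholesky decomposition of a symmetric positive definite matrix.

    Returns the lower triangular L (as nested lists) with A = L * L^T.
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            # Sum of products of already-computed elements of L
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = sqrt(A[i][i] - s)        # diagonal equation
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # off-diagonal equation
    return L
```

Running `cholesky([[2, 3], [3, 5]])` reproduces the $2\times2$ result above.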

How to use the Cholesky decomposition calculator?

With the help of our calculator, you can easily calculate $L$ if you know what $A$ is. Remember, $A$ must be symmetric and positive definite. If it's not, the calculator can't give you $L$ as no $L$ that complies with $A = L\cdot L^T$ exists.

1. Select $A$'s shape. Our Cholesky decomposition solver supports $2\times2$, $3\times3$, and $4\times4$ matrices.

2. Give the calculator your matrix $A$. The matrix's elements are separated by row — see the graphical representation at the top of the calculator if you're unsure.

3. Find the result $L$ below. Our Cholesky decomposition solver will calculate $L$ and display it below your matrix. If $A$ is not symmetric and positive definite, the calculator will notify you accordingly.

Example Cholesky decomposition of a 3×3 matrix

Now that we know how to compute the Cholesky factorization for a matrix of any size, let's do an example. We'll use the general-case algorithmic equations we've just discussed to solve for $L$ given the following $3\times3$ matrix $A$:

$A = \begin{bmatrix} 25 & 15 & 5 \\ 15 & 13 & 11 \\ 5 & 11 & 21 \\ \end{bmatrix}$

We can jump straight into the solving process, starting at the top left and moving left-to-right, top-to-bottom.

\begin{alignat*}{2} b_{1,1} &= \sqrt{a_{1,1}} &= 5 \\ b_{2,1} &= a_{2,1}\ /\ b_{1,1} &= 3 \\ b_{2,2} &= \sqrt{a_{2,2} - (b_{2,1})^2} &= 2 \\ b_{3,1} &= a_{3,1}\ /\ b_{1,1} &= 1 \\ b_{3,2} &= (a_{3,2} - b_{3,1}\cdot b_{2,1})\ /\ b_{2,2} &= 4 \\ b_{3,3} &= \sqrt{a_{3,3} - (b_{3,1})^2 - (b_{3,2})^2} &= 2 \\ \end{alignat*}

And so, after all our effort, we've fully obtained $L$,

$L = \begin{bmatrix} 5 & 0 & 0 \\ 3 & 2 & 0 \\ 1 & 4 & 2 \\ \end{bmatrix}$

Finally, to test our answer, we can see if $L\cdot L^T$ really is equal to $A$:

$\begin{split} &\ L\cdot L^T \\ =&\ \begin{bmatrix} 5 & 0 & 0 \\ 3 & 2 & 0 \\ 1 & 4 & 2 \end{bmatrix} \cdot \begin{bmatrix} 5 & 3 & 1 \\ 0 & 2 & 4 \\ 0 & 0 & 2 \end{bmatrix} \\ =&\ \begin{bmatrix} 5^2 & 5\times3 & 5\times1 \\ 3\times5 & 3^2 + 2^2 & 3\times1 + 2\times4 \\ 1\times5 & 1\times3 + 4\times2 & 1^2 + 4^2 + 2^2 \\ \end{bmatrix} \\ =&\ \begin{bmatrix} 25 & 15 & 5 \\ 15 & 13 & 11 \\ 5 & 11 & 21 \end{bmatrix} \\ =&\ A \end{split}$

Perfect! We've just performed a complete decomposition of $A$. You can also try out this example in our Cholesky decomposition calculator.
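If you'd like to double-check this example with software, NumPy ships a built-in Cholesky routine (a sketch of our own, not part of the calculator):

```python
import numpy as np

A = np.array([[25., 15., 5.],
              [15., 13., 11.],
              [5., 11., 21.]])

# NumPy's built-in Cholesky factorization returns the lower triangular L
L = np.linalg.cholesky(A)
print(L)

# Verify that L @ L.T reproduces A
print(np.allclose(L @ L.T, A))
```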

What are the applications of the Cholesky decomposition?

Matrix decomposition methods exploit the structure of the factorization's terms to make solving systems of equations easier. Remember how we said that $L$ is lower triangular? Lower triangular matrices are especially easy to work with, and therefore the Cholesky decomposition is frequently the method of choice for solving systems of equations.
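To see why triangular structure helps, here's a sketch (the function name is our own) of solving $A\cdot x = b$ with the Cholesky factors: first solve $L\cdot y = b$ by forward substitution, then $L^T\cdot x = y$ by back substitution. Each triangular solve needs only one new unknown per row, so no general-purpose elimination is required:

```python
import numpy as np

def solve_via_cholesky(A, b):
    """Solve A x = b by factoring A = L L^T, then two triangular solves."""
    L = np.linalg.cholesky(np.asarray(A, dtype=float))
    n = L.shape[0]
    # Forward substitution: L y = b (row i uses only y[0..i-1])
    y = np.zeros(n)
    for i in range(n):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    # Back substitution: L^T x = y (row i uses only x[i+1..n-1])
    x = np.zeros(n)
    for i in reversed(range(n)):
        x[i] = (y[i] - L[i + 1:, i] @ x[i + 1:]) / L[i, i]
    return x
```

For example, `solve_via_cholesky([[25, 15, 5], [15, 13, 11], [5, 11, 21]], [1, 2, 3])` returns the same solution as a general solver, but reuses the factorization's triangular structure.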

FAQ

How do you determine whether a matrix has a Cholesky decomposition?

For the matrix $A$ to have a Cholesky decomposition, it must be symmetric, and it must be positive definite (meaning it must have only positive eigenvalues). If $A$ does not adhere to these requirements, it cannot have a Cholesky decomposition, meaning no matrix $L$ that satisfies $L\cdot L^T = A$ can exist.

What does the Cholesky decomposition do?

Like any matrix decomposition method, the Cholesky decomposition takes a matrix $A$ and factorizes it. It produces a lower triangular matrix $L$, which when multiplied with its transpose $L^T$ produces the original matrix $A$. This is valuable in many matrix operations, as the structure of a lower triangular matrix can be exploited to make the operations compute much faster.

What is a symmetric matrix?

A symmetric matrix is one that is equal to its transpose. In mathematical notation, symmetric matrices satisfy the condition $A = A^T$. This has the implication that the matrix must also be square. Graphically, a symmetric matrix is symmetric along its main diagonal.

What is a positive definite matrix?

A positive definite matrix is a matrix with only positive eigenvalues. The formal mathematical definition is that a matrix $A$ is positive definite if $z^T\cdot A\cdot z > 0$ for every nonzero column vector $z$. Equivalently, a symmetric matrix $A$ is positive definite if and only if it has a Cholesky decomposition.

What is the Cholesky decomposition of the identity matrix?

The identity matrix's Cholesky decomposition is also the identity matrix. This is because the identity matrix's transpose is itself ($I^T = I$) and so $I\cdot I^T = I\cdot I = I$. In general, the Cholesky decomposition $L$ of a diagonal matrix $D$ is also diagonal, and its diagonal entries are the square roots of $D$'s. When the diagonal entries are all $1$ like they are in $I$, we get the identity matrix back.
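We can confirm the diagonal case directly with NumPy (a quick sketch of our own):

```python
import numpy as np

# A diagonal matrix with entries 4, 9, 16
D = np.diag([4.0, 9.0, 16.0])

# Its Cholesky factor is diagonal, with entries 2, 3, 4 —
# the square roots of D's diagonal entries
L = np.linalg.cholesky(D)
print(L)

# The identity matrix's Cholesky factor is the identity matrix itself
print(np.allclose(np.linalg.cholesky(np.eye(3)), np.eye(3)))
```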
