Prasant Chidella edited this page Aug 17, 2014 · 1 revision

QR decomposition is the factorization of a matrix A into the product A = QR of an orthogonal matrix Q and an upper triangular matrix R.
Every square matrix A has such a decomposition.
A rectangular matrix A of size m x n also has a QR decomposition provided m ≥ n. Such a decomposition produces an m x m orthogonal matrix Q and an m x n upper triangular matrix R.

There are many ways of computing the QR factorization of a matrix, among them:

  • Gram-Schmidt process
  • Householder transformations
  • Givens rotations
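As a concrete illustration of the first method, here is a minimal pure-Python sketch of QR via the classical Gram-Schmidt process. The function name `gram_schmidt_qr` and the use of plain nested lists (instead of a linear algebra library) are choices made for this example.

```python
def gram_schmidt_qr(A):
    """Factor an m x n matrix A (m >= n, full column rank) as A = QR.

    Returns the "thin" factors: Q is m x n with orthonormal columns,
    R is n x n upper triangular.
    """
    m, n = len(A), len(A[0])
    cols = [[A[i][j] for i in range(m)] for j in range(n)]  # columns of A
    Q_cols = []
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = cols[j][:]
        for k, q in enumerate(Q_cols):
            # r_kj = <q_k, a_j>; subtract the projection of a_j onto q_k
            R[k][j] = sum(q[i] * cols[j][i] for i in range(m))
            v = [v[i] - R[k][j] * q[i] for i in range(m)]
        R[j][j] = sum(x * x for x in v) ** 0.5
        Q_cols.append([x / R[j][j] for x in v])
    # Re-pack Q as an m x n row-major matrix
    Q = [[Q_cols[j][i] for j in range(n)] for i in range(m)]
    return Q, R
```

Classical Gram-Schmidt is the easiest variant to write down, though in floating point the modified Gram-Schmidt ordering is usually preferred for numerical stability.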

This decomposition has several applications:

Solving a system of linear equations

If A can be decomposed as A = QR and we want to solve Ax = b, where b is a vector of size m x 1, then:

      Ax = b
     QRx = b            (A = QR)
   Q*QRx = Q*b          (multiplying both sides by Q*, the transpose of Q)
      Rx = Q*b          (Q is orthogonal, so Q*Q = I)

Because R is a triangular matrix, this final equation is very easy to solve using backward substitution.
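The steps above can be sketched in pure Python; `solve_qr` is a hypothetical helper that assumes the factors Q and R are already known (Q orthogonal, R upper triangular with nonzero diagonal).

```python
def solve_qr(Q, R, b):
    """Solve Ax = b given a factorization A = QR of a square matrix A."""
    n = len(R)
    # Rx = Q*b: apply the transpose of Q to b
    y = [sum(Q[i][j] * b[i] for i in range(n)) for j in range(n)]
    # Back substitution on the triangular system Rx = y
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / R[i][i]
    return x
```

For example, with Q = [[0.6, -0.8], [0.8, 0.6]] (a rotation, hence orthogonal) and R = [[5, 3], [0, 2]], we have A = QR = [[3, 0.2], [4, 3.6]], and `solve_qr(Q, R, b)` recovers x with Ax = b.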

QR vs. LU

Both QR decomposition and LU decomposition (and its variants) can be used for solving Ax = b.

Both of these methods take O(n^3) time to run, but the constant factor in front of n^3 is larger for QR (Householder QR costs roughly twice as many floating-point operations as LU), so QR is slower in practice.
However, because QR uses orthogonal matrices, it is more numerically stable and is less sensitive to ill-conditioned linear systems.

Solving overdetermined systems

When we are trying to solve Ax = b where A is of size m x n, m > n, the system generally has no exact solution, so we want to minimize the residual norm

    ||Ax - b||

Since orthogonal matrices are isometric (multiplying a vector by an orthogonal matrix does not change its norm), we can multiply inside the norm by Q^T, the transpose of Q:

    ||Ax - b|| = ||Q^T (Ax - b)||
               = ||Q^T (QRx - b)||
               = ||Rx - Q^T b||
We can represent this in block form as follows (by separating the first n rows from the rest): write

    R = [ R1 ]        Q^T b = [ c ]
        [ 0  ]                [ d ]

where R1 is the n x n upper triangular top block of R, c holds the first n entries of Q^T b, and d holds the remaining m - n entries. Then

    ||Rx - Q^T b||^2 = ||R1 x - c||^2 + ||d||^2
The second term in this expression is irrelevant because it is independent of x, so we must minimize the first one. If R1 (the top n x n block of R) is nonsingular (it is whenever A has full column rank), the minimum of the first term is zero, and the second term is the residual of the least squares system. We can solve the equation R1 x = c, where c is the vector of the first n entries of Q^T b, using back substitution, since R1 is triangular.
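A minimal pure-Python sketch of this least squares procedure, assuming A has full column rank: it computes the thin factorization A = Q1 R1 by classical Gram-Schmidt and then back-substitutes on R1 x = c. The name `lstsq_qr` is made up for this example.

```python
def lstsq_qr(A, b):
    """Minimize ||Ax - b|| for an m x n matrix A with m >= n, full column rank."""
    m, n = len(A), len(A[0])
    cols = [[A[i][j] for i in range(m)] for j in range(n)]
    Q_cols, R = [], [[0.0] * n for _ in range(n)]
    for j in range(n):                      # thin QR by classical Gram-Schmidt
        v = cols[j][:]
        for k, q in enumerate(Q_cols):
            R[k][j] = sum(q[i] * cols[j][i] for i in range(m))
            v = [v[i] - R[k][j] * q[i] for i in range(m)]
        R[j][j] = sum(x * x for x in v) ** 0.5
        Q_cols.append([x / R[j][j] for x in v])
    # c = Q1^T b, i.e. the first n entries of Q^T b
    c = [sum(q[i] * b[i] for i in range(m)) for q in Q_cols]
    # Back substitution on R1 x = c
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (c[i] - s) / R[i][i]
    return x
```

Note that d, the part of Q^T b orthogonal to the columns of A, never needs to be formed: only the first n rows of the transformed system enter the solve.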

Solving underdetermined systems

Now we have a matrix A which has more columns than rows (A is m x n with m < n), so Ax = b has infinitely many solutions and we look for the one of minimum norm.
So, we factorize A^T as A^T = QR, and therefore A = R^T Q^T.
The equation Ax = b can then be written as:

    R^T Q^T x = b

Writing y = Q^T x and splitting y into its first m entries y1 and its remaining n - m entries y2, the block form R^T = [ R1^T  0 ] gives

    R1^T y1 = b

As we can see, the second block y2 does not affect the left hand side. To minimize the norm of x (which equals the norm of y, because Q is orthogonal), we set y2 = 0, leading to the solution:

    x = Q [ y1 ]
          [ 0  ]

where y1 is found from R1^T y1 = b by forward substitution, since R1^T is lower triangular.
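A minimal sketch of this minimum-norm procedure, assuming A has full row rank: factor A^T = Q1 R1 by classical Gram-Schmidt, forward-substitute on R1^T y1 = b, and set x = Q1 y1 (equivalently, take y2 = 0). The name `min_norm_qr` is hypothetical.

```python
def min_norm_qr(A, b):
    """Minimum-norm solution of Ax = b for m x n A with m < n, full row rank."""
    m, n = len(A), len(A[0])
    At = [[A[i][j] for i in range(m)] for j in range(n)]      # A^T, n x m
    cols = [[At[i][j] for i in range(n)] for j in range(m)]   # columns of A^T
    Q_cols, R = [], [[0.0] * m for _ in range(m)]
    for j in range(m):                      # thin QR of A^T by Gram-Schmidt
        v = cols[j][:]
        for k, q in enumerate(Q_cols):
            R[k][j] = sum(q[i] * cols[j][i] for i in range(n))
            v = [v[i] - R[k][j] * q[i] for i in range(n)]
        R[j][j] = sum(x * x for x in v) ** 0.5
        Q_cols.append([x / R[j][j] for x in v])
    # Forward substitution on R1^T y1 = b (R1^T is lower triangular)
    y = [0.0] * m
    for i in range(m):
        s = sum(R[j][i] * y[j] for j in range(i))
        y[i] = (b[i] - s) / R[i][i]
    # x = Q1 y1 (the y2 = 0 block contributes nothing)
    return [sum(Q_cols[j][i] * y[j] for j in range(m)) for i in range(n)]
```

Any other solution differs from this one by a vector in the null space of A, which only increases the norm; that is why setting y2 = 0 yields the minimum-norm solution.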