The QR decomposition
The QR decomposition, also known as the QR factorization, is another method of solving
linear systems of equations using matrices, very much like the LU decomposition. The
equation to solve is of the form Ax=B, where the matrix A is factored as A=QR, the
product of an orthogonal matrix, Q, and an upper triangular matrix, R. The QR algorithm is
commonly used to solve the linear least-squares problem.
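To illustrate that last point, the following is a minimal sketch of solving a small over-determined
system in the least-squares sense with QR; the matrix, vector, and economy-size mode used here are
illustrative choices and are not part of the worked example that follows:
In [ ]:
    """
    Least squares with QR - an illustrative sketch
    """
    import numpy as np
    import scipy.linalg as linalg

    # A made-up over-determined system: 4 equations, 2 unknowns.
    A = np.array([
        [1., 1.],
        [1., 2.],
        [1., 3.],
        [1., 4.]])
    b = np.array([1., 2., 2., 4.])

    # Economy-size QR: Q has orthonormal columns, R is upper triangular.
    Q, R = linalg.qr(A, mode='economic')

    # The least-squares solution satisfies Rx = Q'.b
    x = linalg.solve_triangular(R, np.dot(Q.T, b))
    print(x)  # matches np.linalg.lstsq(A, b, rcond=None)[0]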
An orthogonal matrix exhibits the following properties:
 •   It is a square matrix.
 •   Multiplying an orthogonal matrix by its transpose returns the identity matrix: Q^T Q = QQ^T = I.
 •   The inverse of an orthogonal matrix equals its transpose: Q^-1 = Q^T.
An identity matrix is also a square matrix, with 1s on its main diagonal and 0s
elsewhere.
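These properties are easy to check numerically. The following sketch uses a 2x2 rotation
matrix, a standard example of an orthogonal matrix; the angle chosen here is arbitrary:
In [ ]:
    """
    Checking the properties of an orthogonal matrix
    """
    import numpy as np

    theta = 0.3  # arbitrary rotation angle
    Q = np.array([
        [np.cos(theta), -np.sin(theta)],
        [np.sin(theta), np.cos(theta)]])  # rotation matrices are orthogonal

    print(np.allclose(np.dot(Q.T, Q), np.eye(2)))  # Q^T Q = I -> True
    print(np.allclose(np.linalg.inv(Q), Q.T))      # Q^-1 = Q^T -> True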
The problem of Ax=B can now be restated as follows:
QRx = B
Rx = Q^-1 B = Q^T B
Using the same variables as in the LU decomposition example, we will use the qr() method
of scipy.linalg to compute our values of Q and R, and let the y variable represent our value
of Q^T B, with the following code:
In [ ]:
    """
    QR decomposition with scipy
    """
    import numpy as np
    import scipy.linalg as linalg

    A = np.array([
        [2., 1., 1.],
        [1., 3., 2.],
        [1., 0., 0.]])
    B = np.array([4., 5., 6.])

    Q, R = linalg.qr(A)  # QR decomposition
    y = np.dot(Q.T, B)  # Let y=Q'.B
    x = linalg.solve(R, y)  # Solve Rx=y
Note that Q.T is simply the transpose of Q, which is also the same as the inverse of Q:
In [ ]:
   print(x)
Out[ ]:
   [ 6. 15. -23.]
We get the same answers as those in the LU decomposition example.
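As an aside, because R is upper triangular, the last step can also be carried out with
scipy.linalg.solve_triangular, which performs plain back substitution. The quick check below,
which continues with the variables from the example above, is a sketch of that alternative
rather than part of the original example:
In [ ]:
    """
    Optional check: back substitution on the upper triangular R
    """
    x_tri = linalg.solve_triangular(R, y)  # solve Rx=y by back substitution
    print(np.allclose(x_tri, x))           # True: same solution as linalg.solve
    print(np.allclose(np.dot(A, x), B))    # True: x satisfies Ax=B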