QRDecomposition[m] yields the QR decomposition for a numerical matrix m. The result is a list {q,r}, where q is a unitary matrix and r is an upper-triangular matrix.
Examples

Basic Examples (3)

The decomposition of a 2×2 matrix into a unitary (orthogonal) matrix and an upper triangular matrix:
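A minimal sketch with an illustrative 2×2 matrix (the matrix used in the original example is not preserved in this extract):

    m = {{1., 2.}, {3., 4.}};        (* illustrative 2x2 matrix *)
    {q, r} = QRDecomposition[m]
    (* q has orthonormal rows (orthogonal for real input) and r is upper triangular;
       the input is recovered as ConjugateTranspose[q].r *)
    Norm[ConjugateTranspose[q].r - m]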
Compute the QR decomposition for a 3×2 matrix with exact values:
Compute the QR decomposition for a 2×3 matrix with approximate numerical values:
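Sketches of the two rectangular cases above; the matrices are illustrative stand-ins for the ones in the original examples:

    QRDecomposition[{{1, 2}, {3, 4}, {5, 6}}]        (* exact 3x2 matrix: exact q and r *)
    QRDecomposition[{{1., 2., 3.}, {4., 5., 6.}}]    (* approximate 2x3 matrix: machine-precision q and r *)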
Scope (11)

Basic Uses (7)

Find the QR decomposition of a machine-precision matrix:
QR decomposition for a complex matrix:
Use QRDecomposition for an exact matrix:
QR decomposition for an arbitrary-precision matrix:
Use QRDecomposition with a symbolic matrix:
The QR decomposition for a large numerical matrix is computed efficiently:
QR decomposition of a non-square matrix:
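Illustrative sketches of the basic uses listed above; the matrices below are stand-ins, not the ones from the original examples:

    QRDecomposition[RandomReal[{-1, 1}, {4, 3}]]               (* machine-precision matrix *)
    QRDecomposition[RandomComplex[{-1 - I, 1 + I}, {3, 3}]]    (* complex matrix *)
    QRDecomposition[{{1, 2}, {3, 4}}]                          (* exact matrix *)
    QRDecomposition[N[{{1, 2}, {3, 4}}, 30]]                   (* arbitrary-precision matrix *)
    QRDecomposition[{{x, y}, {0, z}}]                          (* symbolic matrix *)
    AbsoluteTiming[QRDecomposition[RandomReal[1, {500, 500}]];]   (* large numerical matrix *)
    QRDecomposition[{{1., 2.}, {3., 4.}, {5., 6.}}]            (* non-square matrix *)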
Special Matrices (4)

Find the QR decomposition for a sparse matrix:
QR decompositions of structured matrices:
Use QRDecomposition with a QuantityArray structured matrix that has consistent units:
The q matrix is dimensionless; the r matrix gets the units:
QR decomposition of an IdentityMatrix consists of two identity matrices:
QR decomposition of HilbertMatrix:
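Sketches of the special-matrix cases above, with illustrative inputs (the structured-array behavior follows the description above, with q dimensionless and r carrying the units):

    (* illustrative sparse tridiagonal matrix *)
    s = SparseArray[{Band[{1, 1}] -> 2., Band[{1, 2}] -> -1., Band[{2, 1}] -> -1.}, {5, 5}];
    QRDecomposition[s]

    (* QuantityArray with consistent units *)
    QRDecomposition[QuantityArray[{{1., 2.}, {3., 4.}}, "Meters"]]

    (* identity matrix: both factors are identity matrices *)
    QRDecomposition[IdentityMatrix[3]]

    (* exact Hilbert matrix *)
    QRDecomposition[HilbertMatrix[3]]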
Options (4)

Pivoting (1)

Compute the QR decomposition using machine arithmetic with pivoting:
The elements along the diagonal of r are in order of decreasing magnitude:
The matrix p is a permutation matrix:
QRDecomposition satisfies m.p==ConjugateTranspose[q].r:
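A sketch of the pivoted form with an illustrative matrix; with Pivoting->True a third result element, the permutation matrix p, is returned:

    m = N[{{1, 2, 3}, {4, 5, 6}, {7, 8, 10}}];      (* illustrative matrix *)
    {q, r, p} = QRDecomposition[m, Pivoting -> True];
    Diagonal[r]                                      (* magnitudes in decreasing order *)
    p                                                (* a permutation matrix *)
    Norm[m.p - ConjugateTranspose[q].r]              (* ~ 0: m.p == ConjugateTranspose[q].r *)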
Applications (8)

Geometry of QRDecomposition (4)

Find an orthonormal basis for the column space of the following matrix, and then use that basis to find a QR factorization of it:
Define the columns of the matrix and the elements of the corresponding Gram–Schmidt basis:
Let q be the matrix whose rows are the Gram–Schmidt basis vectors:
Let r be the matrix whose entries are the components of the columns along the basis vectors:
This is the same result as given by QRDecomposition:
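A minimal sketch of the construction just described, using an illustrative real matrix (not the one from the original example):

    a = {{1., 2.}, {3., 4.}, {5., 6.}};            (* illustrative matrix *)
    cols = Transpose[a];                            (* the columns of a *)
    gs = Orthogonalize[cols];                       (* Gram-Schmidt basis for the column space *)
    q = gs;                                         (* rows of q are the basis vectors *)
    r = Table[gs[[i]].cols[[j]], {i, 2}, {j, 2}];   (* component of column j along basis vector i *)
    Norm[ConjugateTranspose[q].r - a]               (* ~ 0: a valid QR factorization *)
    QRDecomposition[a]                              (* compare; for machine input the rows may differ by sign *)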
Compare QR decompositions found using Orthogonalize and QRDecomposition for the following matrix:
Let q be the result of applying Orthogonalize to the columns of the matrix:
This is the same result as given by QRDecomposition:
Compare QR decompositions found using Orthogonalize and QRDecomposition for the following matrix:
Let q be the result of applying Orthogonalize to the complex-conjugated columns of the matrix:
Up to sign, this is the same result as given by QRDecomposition:
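A sketch of both comparisons with illustrative matrices; as noted above, the factors agree up to the sign (phase) of each row:

    (* real case *)
    a = {{2, 1}, {2, 7}};
    Orthogonalize[Transpose[a]]                  (* orthonormalized columns of a *)
    First[QRDecomposition[a]]                    (* the q factor *)

    (* complex case: orthogonalize the complex-conjugated columns *)
    b = {{1 + I, 2}, {0, 3 - I}};
    Orthogonalize[Conjugate[Transpose[b]]]
    First[QRDecomposition[b]]                    (* same rows, up to a sign or phase on each *)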
For some applications, it is useful to compute a so-called full QR decomposition, in which the q matrix is square (and thus unitary) and the r matrix has the same dimensions as the input matrix. Compute the full QR decomposition for the following matrix:
There are only two linearly independent columns, so q and r each have only two rows:
Use NullSpace to find vectors outside the span of the rows of q, then orthogonalize the complete set:
Simply pad the r matrix with zeros to make it the same shape as the input matrix:
Verify that this is also a valid QR decomposition:
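A sketch of the full-QR construction described above, using an illustrative 4×2 matrix with two independent columns:

    a = {{1., 2.}, {3., 4.}, {5., 6.}, {7., 8.}};   (* illustrative matrix *)
    {q, r} = QRDecomposition[a];                     (* thin decomposition: q is 2x4, r is 2x2 *)

    (* extend the rows of q to an orthonormal basis using NullSpace *)
    extra = NullSpace[q];
    qFull = Orthogonalize[Join[q, extra]];

    (* pad r with zero rows so it has the same shape as a *)
    rFull = PadRight[r, Dimensions[a]];

    (* verify that this is still a valid QR decomposition *)
    Norm[ConjugateTranspose[qFull].rFull - a]        (* ~ 0 *)
    UnitaryMatrixQ[qFull]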
Least Squares and Curve Fitting (4)

Use the QR decomposition to find the x that minimizes Norm[m.x-b] for the following matrix m and vector b:
Since m==ConjugateTranspose[q].r, ConjugateTranspose[m].m==ConjugateTranspose[r].r, and the normal equations can be recast as r.x==q.b:
As r is invertible (because the columns of m are linearly independent), the solution is x=Inverse[r].q.b:
Confirm the result using LeastSquares:
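A sketch of these steps with an illustrative matrix and right-hand side (the originals are not preserved in this extract):

    m = {{1., 1.}, {1., 2.}, {1., 3.}, {1., 4.}};   (* illustrative matrix with independent columns *)
    b = {1., 2., 2., 3.};                            (* illustrative right-hand side *)
    {q, r} = QRDecomposition[m];
    x = LinearSolve[r, q.b]                          (* solves r.x == q.b, i.e. Inverse[r].q.b *)
    LeastSquares[m, b]                               (* same result *)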
Use the QR decomposition to solve m.x==b for the following matrix m and vector b:
Compute the QR decomposition of m, which gives an invertible r, as m has linearly independent rows:
Let x=Inverse[r].q.b, as if solving the least-squares problem:
As the columns of m span the whole space, x must be a solution of the equation m.x==b:
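A sketch of this example, assuming a square matrix with linearly independent rows (the original matrix is not preserved), so that r is invertible and the least-squares solution solves the system exactly:

    m = {{2., 1., 0.}, {1., 3., 1.}, {0., 1., 4.}};  (* illustrative square matrix with independent rows *)
    b = {1., 2., 3.};
    {q, r} = QRDecomposition[m];     (* r is invertible here *)
    x = LinearSolve[r, q.b];         (* as if solving the least-squares problem *)
    Norm[m.x - b]                    (* ~ 0: x solves m.x == b exactly *)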
QRDecomposition can be used to find a best-fit curve to data. Consider the following data:
Extract the x and y coordinates from the data:
Let m have a column of ones and a column of the x values, so that minimizing the least-squares norm fits the data to a line a x+b:
As the columns of m are linearly independent, the coefficients for a linear least-squares fit are Inverse[r].q.y:
Verify the coefficients using Fit:
Plot the best-fit curve along with the data:
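A sketch of the line fit with illustrative data (the original data set is not preserved):

    data = {{1., 1.2}, {2., 1.9}, {3., 3.1}, {4., 3.9}, {5., 5.2}};   (* illustrative data *)
    {xs, ys} = Transpose[data];
    m = Transpose[{ConstantArray[1., Length[xs]], xs}];   (* columns of ones and of x values *)
    {q, r} = QRDecomposition[m];
    {b0, a0} = LinearSolve[r, q.ys]        (* intercept and slope of the best-fit line *)
    Fit[data, {1, x}, x]                   (* same line *)
    Show[ListPlot[data], Plot[b0 + a0 x, {x, 0, 6}]]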
Find the best-fit parabola to the following data:
Extract the x and y coordinates from the data:
Let m have columns of ones, of the x values and of the squared x values, so that minimizing the least-squares norm fits the data to a parabola a x^2+b x+c:
As the columns of m are linearly independent, the coefficients for a least-squares fit are Inverse[r].q.y:
Verify the coefficients using Fit:
Plot the best-fit curve along with the data:
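A sketch of the parabola fit, again with illustrative data:

    data2 = {{-2., 4.1}, {-1., 1.2}, {0., 0.2}, {1., 0.9}, {2., 4.2}};   (* illustrative data *)
    {xs2, ys2} = Transpose[data2];
    m2 = Transpose[{ConstantArray[1., Length[xs2]], xs2, xs2^2}];  (* columns 1, x and x^2 *)
    {q2, r2} = QRDecomposition[m2];
    coeffs = LinearSolve[r2, q2.ys2]       (* coefficients of the fitted parabola *)
    Fit[data2, {1, x, x^2}, x]             (* same parabola *)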
Properties & Relations (10)

The rows of q are orthonormal:
m is equal to ConjugateTranspose[q].r:
If m is an n×k matrix, the q matrix will have n columns and the r matrix k columns:
QRDecomposition computes the "thin" decomposition, where q and r have MatrixRank[m] rows:
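A sketch illustrating these shape and orthonormality properties with an illustrative random matrix:

    m = RandomReal[1, {5, 3}];               (* illustrative 5x3 matrix of full column rank *)
    {q, r} = QRDecomposition[m];
    Dimensions /@ {q, r}                      (* {{3, 5}, {3, 3}}: both have MatrixRank[m] rows *)
    Chop[q.ConjugateTranspose[q]]             (* identity matrix: the rows of q are orthonormal *)
    Norm[ConjugateTranspose[q].r - m]         (* ~ 0: m equals ConjugateTranspose[q].r *)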
If m is real-valued and invertible, the q matrix of its QR decomposition is orthogonal:
If m is invertible, the q matrix of its QR decomposition is unitary:
If a is an n×m matrix and MatrixRank[a]==n, the q matrix of its QR decomposition is unitary:
If a is an n×m matrix and MatrixRank[a]==m, the r matrix of its QR decomposition is invertible:
Moreover, PseudoInverse[a]==Inverse[r].q:
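Sketches of these rank-related properties, with illustrative matrices:

    a = {{1., 2.}, {3., 4.}, {5., 7.}};        (* full column rank: r is invertible *)
    {q, r} = QRDecomposition[a];
    Norm[Inverse[r].q - PseudoInverse[a]]       (* ~ 0: PseudoInverse[a] == Inverse[r].q *)

    b = {{1., 2., 3.}, {0., 1., 4.}};           (* full row rank: q comes out square *)
    UnitaryMatrixQ[First[QRDecomposition[b]]]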
Orthogonalize can be used to compute a QR decomposition:
For an approximate matrix, it is typically different from the one found by QRDecomposition:
LeastSquares and QRDecomposition can both be used to solve the least-squares problem:
The Cholesky decomposition of ConjugateTranspose[m].m coincides with the r matrix of m's QR decomposition up to phase:
Compute CholeskyDecomposition[ConjugateTranspose[m].m]:
Find the QR decomposition of m:
The Cholesky factor is the same as r except for the choice of phase for each row:
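A sketch of the comparison with an illustrative real matrix; as stated above, the factors agree up to the sign (phase) of each row:

    m = {{2., 1.}, {1., 3.}, {0., 1.}};                     (* illustrative matrix *)
    u = CholeskyDecomposition[ConjugateTranspose[m].m];      (* upper triangular Cholesky factor *)
    {q, r} = QRDecomposition[m];
    {u, r}                                                   (* same up to the sign of each row *)
    Norm[ConjugateTranspose[u].u - ConjugateTranspose[r].r]  (* ~ 0 *)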
Cite this as:

Text: Wolfram Research (1991), QRDecomposition, Wolfram Language function, https://reference.wolfram.com/language/ref/QRDecomposition.html (updated 2024).
CMS: Wolfram Language. 1991. "QRDecomposition." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2024. https://reference.wolfram.com/language/ref/QRDecomposition.html.
APA: Wolfram Language. (1991). QRDecomposition. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/QRDecomposition.html
BibTeX: @misc{reference.wolfram_2025_qrdecomposition, author="Wolfram Research", title="{QRDecomposition}", year="2024", howpublished="\url{https://reference.wolfram.com/language/ref/QRDecomposition.html}", note={Accessed: 12-July-2025}}
BibLaTeX: @online{reference.wolfram_2025_qrdecomposition, organization={Wolfram Research}, title={QRDecomposition}, year={2024}, url={https://reference.wolfram.com/language/ref/QRDecomposition.html}, note={Accessed: 12-July-2025}}