Orthogonalize

Orthogonalize[{v1,v2,…}]

gives an orthonormal basis found by orthogonalizing the vectors vi.

Orthogonalize[{e1,e2,…},f]

gives an orthonormal basis found by orthogonalizing the elements ei with respect to the inner product function f.

Details and Options

  • Orthogonalize[{v1,v2,…}] uses the ordinary scalar product as an inner product.
  • The output from Orthogonalize always contains the same number of vectors as the input. If some of the input vectors are not linearly independent, the output will contain zero vectors (see the sketch after this list).
  • All nonzero vectors in the output are normalized to unit length.
  • The inner product function f is applied to pairs of linear combinations of the ei.
  • The ei can be any expressions for which f always yields real results. »
  • Orthogonalize[{v1,v2,…},Dot] effectively assumes that all elements of the vi are real. »
  • Orthogonalize by default generates a Gram-Schmidt basis.
  • Other bases can be obtained by giving alternative settings for the Method option. Possible settings include: "GramSchmidt", "ModifiedGramSchmidt", "Reorthogonalization", and "Householder".
  • Orthogonalize[list,Tolerance->t] sets to zero elements whose relative norm falls below t.
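
As a concrete illustration of the zero-vector padding described above, here is a minimal sketch with made-up vectors; the third input is the sum of the first two, so the third output vector is zero:

    vecs = {{1, 0, 1}, {1, 1, 0}, {2, 1, 1}};   (* third vector = first + second *)
    Orthogonalize[vecs]
    (* the result contains three vectors; the last is {0, 0, 0} *)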

Examples

Basic Examples  (3)

Find an orthonormal basis for the span of two 3D vectors:
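For instance, a minimal sketch using two hypothetical vectors:

    Orthogonalize[{{3, 1, 0}, {1, 2, 0}}]
    (* two orthonormal vectors spanning the same plane *)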

Construct an orthonormal basis from three 3D vectors:

Confirm the result is orthonormal:
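A sketch of the construction and the orthonormality check, using three made-up vectors; the result is orthonormal when the matrix of mutual dot products is the identity:

    basis = Orthogonalize[{{1, 2, 0}, {2, 0, 1}, {0, 1, 2}}];
    Simplify[basis . Transpose[basis]] == IdentityMatrix[3]
    (* True *)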

Orthogonalize vectors containing symbolic entries:

Scope  (13)

Basic Uses  (6)

Orthogonalize a set of machine-precision vectors:

Orthogonalize complex vectors:

Orthogonalize exact vectors:

Orthogonalize arbitrary-precision vectors:

Orthogonalize symbolic vectors:

Simplify the result assuming a and b are real-valued:
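A sketch with a hypothetical pair of symbolic vectors; the raw result contains Conjugate and Abs, which Simplify removes under the assumption that a and b are real:

    res = Orthogonalize[{{a, b}, {b, a}}];
    Simplify[res, Element[{a, b}, Reals]]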

Large numerical matrices are handled efficiently:

Special Matrices  (4)

Orthogonalize the rows of a sparse matrix:

Orthogonalize the rows of structured matrices:

Orthogonalizing a diagonal matrix produces another diagonal matrix:

Orthogonalize HilbertMatrix:

General Inner Products  (3)

Find a symbolic basis, assuming all variables are real:

Orthogonalize vectors that are not lists using an explicit inner product:

Specify the inner product using a pure function:
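For example, a sketch in which the pure function implements the ordinary real scalar product (the vectors are made up):

    Orthogonalize[{{2, 1}, {1, 2}}, #1 . #2 &]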

Options  (3)

Tolerance  (1)

Below the tolerance, two vectors are not recognized as linearly independent:
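A sketch with two made-up vectors differing only by a tiny perturbation; with a tolerance larger than the perturbation, the second vector is treated as dependent and replaced by the zero vector:

    Orthogonalize[{{1., 2., 0.}, {1., 2. + 1.*^-10, 0.}}, Tolerance -> 1.*^-5]
    (* the second output vector is {0., 0., 0.} *)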

Method  (2)

m forms a set of vectors that are nearly linearly dependent:

Deviation from orthonormality for the default method:

Deviation for all of the methods:

For a large numerical matrix, the Householder method is usually fastest:
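A rough timing sketch; the matrix size and the resulting timings are illustrative only:

    m = RandomReal[{-1, 1}, {1000, 1000}];
    First[AbsoluteTiming[Orthogonalize[m, Method -> "Householder"];]]
    First[AbsoluteTiming[Orthogonalize[m, Method -> "GramSchmidt"];]]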

Applications  (12)

Geometry  (3)

Project a vector onto the plane spanned by two given vectors:

Construct an orthonormal basis that spans the same space as the two vectors:

The projection onto the plane is the sum of the projections onto the basis vectors:

Find the component perpendicular to the plane:

Confirm the result by projecting onto the normal to the plane:

Visualize the plane, the vector and its parallel and perpendicular components:
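The following sketch carries out these steps with hypothetical vectors v, u1 and u2, omitting the visualization:

    v = {1, 2, 3}; u1 = {1, 1, 0}; u2 = {0, 1, 1};
    {e1, e2} = Orthogonalize[{u1, u2}];
    vpar = (e1 . v) e1 + (e2 . v) e2;       (* projection onto the plane *)
    vperp = v - vpar;                        (* perpendicular component *)
    {vpar, vperp, vperp . u1, vperp . u2}    (* the last two entries are 0 *)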

The Frenet-Serret system encodes every space curve's properties in a vector basis and scalar functions. Consider the following curve:

Construct an orthonormal basis from the first three derivatives using Orthogonalize:

Ensure that the basis is right-handed:

Compute the curvature and torsion, which quantify how the curve bends and twists:

Verify the answers using FrenetSerretSystem:

Visualize the curve and the associated moving basis, also called a frame:
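A sketch of these steps for a hypothetical helix; Dot is used as the inner product since the curve is real, and the frame from FrenetSerretSystem is shown for comparison (signs may differ if the orthogonalized basis is not right-handed):

    r[t_] := {Cos[t], Sin[t], t/2};
    Simplify[Orthogonalize[{r'[t], r''[t], r'''[t]}, Dot]]
    Simplify[FrenetSerretSystem[r[t], t][[2]]]   (* {tangent, normal, binormal} *)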

Find the orthogonal projection of a vector onto the space spanned by three given vectors:

First, construct an orthonormal basis for the space:

The component in the space is given by Sum[(Conjugate[e_i] . v) e_i, {i, 1, 3}]:

The difference is perpendicular to any vector in the span of the original three vectors:

Least Squares and Curve Fitting  (3)

If a linear system has no solution, the best approximate solution is the least-squares solution. That is the solution of the modified system in which the right-hand side is replaced by its orthogonal projection onto the column space of the coefficient matrix. Consider the following matrix and right-hand side:

The linear system is inconsistent:

Find an orthonormal basis for the space spanned by the columns of the matrix:

Compute the orthogonal projection of the right-hand side onto the space spanned by these basis vectors:

Visualize the right-hand side and its projections:

Solve the system with the projected right-hand side:

Confirm the result using LeastSquares:
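A compact sketch of the whole procedure for a hypothetical 3×2 system a.x == b:

    a = {{1, 1}, {1, 2}, {1, 3}}; b = {1, 2, 2};
    es = Orthogonalize[Transpose[a]];    (* orthonormal basis for the column space *)
    bs = Total[(# . b) # & /@ es];       (* orthogonal projection of b onto that space *)
    LinearSolve[a, bs]                   (* least-squares solution *)
    LeastSquares[a, b]                   (* agrees with the built-in function *)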

Orthogonalize can be used to find a best-fit curve to data. Consider the following data:

Extract the x and y coordinates from the data:

Let m be the matrix whose columns are a vector of ones and the vector of x values, so that minimizing Norm[m.{a,b} - y] amounts to fitting the data to a line a + b x:

Get the coefficients a and b for a linear least-squares fit:

Verify the coefficients using Fit:

Plot the best-fit curve along with the data:
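A sketch of these steps with made-up data, using exact numbers so the comparison is exact and omitting the plot:

    data = {{0, 11/10}, {1, 19/10}, {2, 16/5}, {3, 39/10}};
    {xs, ys} = Transpose[data];
    m = Transpose[{ConstantArray[1, Length[xs]], xs}];   (* columns of ones and x values *)
    es = Orthogonalize[Transpose[m]];
    proj = Total[(# . ys) # & /@ es];                    (* projection of y onto the column space *)
    LinearSolve[m, proj]                                  (* {a, b} for the line a + b x *)
    Fit[N[data], {1, x}, x]                               (* the same line, numerically *)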

Find the best-fit parabola to the following data:

Extract the x and y coordinates from the data:

Let m be the matrix whose columns are a vector of ones, the vector of x values and its componentwise square, so that minimizing Norm[m.{a,b,c} - y] amounts to fitting the data to a parabola a + b x + c x^2:

Construct orthonormal vectors that have the same column space as :

Get the coefficients a, b and c for a least-squares fit:

Verify the coefficients using Fit:

Plot the best-fit curve along with the data:

Matrix Decompositions  (2)

Find an orthonormal basis for the column space of the following matrix a, and then use that basis to find a QR factorization of a:

Apply the Gram-Schmidt process to the columns of a, then define Q as the matrix whose columns are the resulting orthonormal vectors:

Let R = Transpose[Q].a:

Confirm that a equals Q.R:

Compare with the result given by QRDecomposition; the R matrices are the same:

The Q matrices differ by a transposition because QRDecomposition gives the row-orthonormal result:
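A sketch of this construction for a hypothetical 3×2 matrix a:

    a = {{1, 2}, {0, 1}, {1, 0}};
    q = Transpose[Orthogonalize[Transpose[a]]];   (* orthonormalized columns of a *)
    r = Transpose[q] . a;                          (* upper-triangular factor *)
    Simplify[q . r] == a                           (* True *)
    QRDecomposition[a]                             (* compare with Transpose[q] and r *)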

For a Hermitian matrix (more generally, any normal matrix), eigenvectors corresponding to distinct eigenvalues are orthogonal, and it is conventional to define the projection matrices p_k = Transpose[{e_k}].Conjugate[{e_k}], where e_k is a normalized eigenvector. Show that the action of the projection matrices on a general vector is the same as projecting the vector onto the corresponding eigenvector, for the following matrix:

Verify the matrix is Hermitian:

Find the eigenvalues and eigenvectors:

Eigenvectors coming from different eigenspaces are orthogonal to each other:

Eigenvectors with the same eigenvalue need not be orthogonal to each other, and here they are not:

Use Orthogonalize to create an orthonormal basis out of the eigenvectors:

Compute the projection matrices:

Confirm that multiplying a general vector by p_k equals the projection of the vector onto e_k:

Since the e_k form an orthonormal basis, the sum of the p_k must be the identity matrix:

Moreover, the sum of the projection matrices weighted by their eigenvalues is the original matrix:
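A sketch with a hypothetical 3×3 Hermitian matrix; es is the orthonormalized eigenbasis, ps the corresponding projection matrices, and the two final lines check completeness and the spectral decomposition:

    h = {{2, I, 0}, {-I, 2, 0}, {0, 0, 3}};
    {vals, vecs} = Eigensystem[h];
    es = Orthogonalize[vecs];
    ps = Transpose[{#}] . Conjugate[{#}] & /@ es;   (* outer products e_k e_k^* *)
    Simplify[Total[ps]] == IdentityMatrix[3]        (* True *)
    Simplify[Total[vals ps]] == h                   (* True *)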

General Inner Products and Function Spaces  (4)

A positive-definite, real symmetric matrix, or metric, defines an inner product:

Being positive-definite means that the associated quadratic form is positive for all nonzero vectors:

Note that Dot itself is the inner product associated with the identity matrix:

Orthogonalize the standard basis of ℝ^n to find an orthonormal basis:

Confirm that this basis is orthonormal with respect to the inner product:
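A sketch with a hypothetical 2×2 metric g; the pure function ip implements the associated inner product, and Outer verifies orthonormality:

    g = {{2, 1}, {1, 2}};                  (* positive definite and symmetric *)
    ip[u_, v_] := Conjugate[u] . g . v;
    basis = Orthogonalize[IdentityMatrix[2], ip]
    Simplify[Outer[ip, basis, basis, 1]] == IdentityMatrix[2]   (* True *)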

Fourier series are projections onto an orthonormal basis of an inner product space of square-integrable functions. Define the standard inner product on square-integrable functions:

Orthogonalize a family of basis functions with respect to this inner product:

The projection coefficient equals the symmetric Fourier coefficient corresponding to FourierParameters->{0,1}:

Confirm using FourierCoefficient:

The Fourier series is the projection of the function onto the space spanned by the orthonormalized basis functions:

Confirm the result using FourierSeries:
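A sketch using a few real trigonometric functions on the interval from -Pi to Pi; the interval, the basis functions and the sample function x^2 are illustrative choices, and the conjugation is dropped since everything is real:

    ip[f_, g_] := Integrate[f g, {x, -Pi, Pi}];
    basis = Simplify[Orthogonalize[{1, Cos[x], Sin[x], Cos[2 x]}, ip]]
    Simplify[Total[ip[#, x^2] # & /@ basis]]   (* a truncated Fourier series of x^2 *)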

LegendreP defines a family of orthogonal polynomials with respect to the inner product ⟨f,g⟩ = Integrate[f g, {x, -1, 1}]. Orthogonalize the monomials x^k for k from zero through four to compute scalar multiples of the first five Legendre polynomials:

Compare to the conventional Legendre polynomials:

For each k, the kth result and LegendreP[k, x] differ by a constant factor:
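A sketch of this computation; dividing each result by the corresponding LegendreP shows that the ratios are constants:

    ip[f_, g_] := Integrate[f g, {x, -1, 1}];
    polys = Simplify[Orthogonalize[x^Range[0, 4], ip]]
    Table[Simplify[polys[[k + 1]]/LegendreP[k, x]], {k, 0, 4}]   (* constant ratios *)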

HermiteH defines a family of orthogonal polynomials with respect to the inner product ⟨f,g⟩ = Integrate[Conjugate[f] g Exp[-x^2], {x, -Infinity, Infinity}]. Apply the unnormalized Gram-Schmidt process to the monomials x^k for k from zero through four to compute scalar multiples of the first five Hermite polynomials:

Compare to the conventional Hermite polynomials:

For each k, the kth result and HermiteH[k, x] differ by a constant multiple:

Properties & Relations  (9)

For linearly independent vectors, the result is an orthonormal set:

This extends to any inner product:

For n linearly independent n-vectors, the result is a unitary matrix:

If the vectors are real-valued, the matrix is additionally orthogonal:

If input vectors are not linearly independent, the result is padded with zero vectors to a matching length:

If u is the result of Orthogonalize[vecs], then u.ConjugateTranspose[u] is a diagonal matrix:

It only has ones and zeroes on the diagonal:

Zeroes on the diagonal correspond to zero vectors in the result:
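A sketch with made-up machine-precision vectors in which the second input is a multiple of the first; Chop removes roundoff:

    u = Orthogonalize[{{1., 2., 3.}, {2., 4., 6.}, {0., 1., 1.}}];
    Chop[u . ConjugateTranspose[u]]   (* diagonal matrix with diagonal {1, 0, 1} *)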

In n dimensions, there can be at most n elements in the orthonormal basis:

Most sets of n random n-dimensional vectors are spanned by exactly n basis vectors:

With the default method, the first element of the basis is always a multiple of the first vector:

Orthogonalize can be implemented by repeated application of Projection and Normalize:
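A minimal sketch of such an implementation, assuming linearly independent inputs (the built-in function additionally handles dependent vectors, other inner products and options):

    gs[vecs_] := Module[{out = {}},
      Do[AppendTo[out, Normalize[v - Total[Projection[v, #] & /@ out]]], {v, vecs}];
      out];
    vecs = {{1, 2, 0}, {1, 0, 1}, {0, 1, 1}};
    Simplify[gs[vecs] == Orthogonalize[vecs]]   (* True *)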

Orthogonalize[m] is related to QRDecomposition[Transpose[m]]:

They are the same up to sign:

Text

Wolfram Research (2007), Orthogonalize, Wolfram Language function, https://reference.wolfram.com/language/ref/Orthogonalize.html.

CMS

Wolfram Language. 2007. "Orthogonalize." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/Orthogonalize.html.

APA

Wolfram Language. (2007). Orthogonalize. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/Orthogonalize.html

BibTeX

@misc{reference.wolfram_2023_orthogonalize, author="Wolfram Research", title="{Orthogonalize}", year="2007", howpublished="\url{https://reference.wolfram.com/language/ref/Orthogonalize.html}", note="[Accessed: 18-March-2024]"}

BibLaTeX

@online{reference.wolfram_2023_orthogonalize, organization={Wolfram Research}, title={Orthogonalize}, year={2007}, url={https://reference.wolfram.com/language/ref/Orthogonalize.html}, note={[Accessed: 18-March-2024]}}