Projection

Projection[u,v]

finds the projection of the vector u onto the vector v.

Projection[u,v,f]

finds projections with respect to the inner product function f.

Details

  • For ordinary real vectors u and v, the projection is taken to be (u.v/v.v) v.
  • For ordinary complex vectors u and v, the projection is taken to be (v*.u/v*.v) v, where v* is Conjugate[v]. »
  • In Projection[u,v,f], u and v can be any expressions or lists of expressions for which the inner product function f applied to pairs yields real results. »
  • Projection[u,v,Dot] effectively assumes that all elements of u and v are real. »

Examples


Basic Examples  (3)

Project the vector (5, 6, 7) onto the x axis:
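A minimal sketch of the call (the x axis corresponds to the vector {1, 0, 0}; the expected output is shown as a comment):

    Projection[{5, 6, 7}, {1, 0, 0}]
    (* {5, 0, 0} *)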

Project onto another vector:
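For instance, with an illustrative second vector:

    Projection[{5, 6, 7}, {1, 1, 1}]
    (* {6, 6, 6} *)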

Project a symbolic vector onto a numeric one:
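A sketch with assumed symbolic entries a, b, c:

    Projection[{a, b, c}, {1, 1, 0}]
    (* {(a + b)/2, (a + b)/2, 0} *)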

Scope  (9)

Basic Uses  (6)

Find the projection of a machine-precision vector onto another:
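For example, with illustrative machine-precision input:

    Projection[{1., 2., 3.}, {4., 5., 6.}]
    (* {1.66234, 2.07792, 2.49351} *)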

Projection of a complex vector onto another:
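A sketch with illustrative complex vectors; the second argument enters through Conjugate[v]:

    Projection[{1 + I, 2 - I}, {I, 1}]
    (* {1 + (3 I)/2, 3/2 - I} *)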

Projection of an exact vector onto another:

Projection of an arbitrary-precision vector onto another:

The projection of large numerical vectors is computed efficiently:

Project symbolic vectors:

General Inner Products  (3)

Give an inner product of Dot to assume all expressions are real-valued:
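For instance, with symbolic entries that would otherwise be treated as complex:

    Projection[{x, y}, {a, b}, Dot]
    (* {(a (a x + b y))/(a^2 + b^2), (b (a x + b y))/(a^2 + b^2)} *)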

Project vectors that are not lists using an explicit inner product:
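A sketch using polynomials in x with an illustrative integral inner product ip:

    ip[u_, v_] := Integrate[u v, {x, 0, 1}];
    Projection[1 + x, x, ip]
    (* (5 x)/2 *)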

Specify the inner product using a pure function:
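For example, a pure function equivalent to Dot:

    Projection[{1, 2, 3}, {1, 1, 1}, #1.#2 &]
    (* {2, 2, 2} *)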

Applications  (18)

Geometry  (4)

Project the vector u on the line spanned by the vector v:
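A sketch with illustrative u and v:

    u = {1, 2}; v = {3, 1};
    Projection[u, v]
    (* {3/2, 1/2} *)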

Visualize u and its projection onto the line spanned by v:

Project the vector u onto the plane spanned by the vectors v and w:

First, replace w with a vector in the plane that is perpendicular to v:

The projection in the plane is the sum of the projections onto v and the perpendicular vector:

Find the component perpendicular to the plane:

Confirm the result by projecting u onto the normal to the plane:

Visualize the plane, the vector u, and its parallel and perpendicular components:

Use Projection to reflect the vector u with respect to the line normal to the vector v:

Since the projection of u onto v is perpendicular to the line, subtracting it twice will reflect u across the line:
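A sketch with illustrative u and v:

    u = {2, 1}; v = {1, 3};
    r = u - 2 Projection[u, v]
    (* {1, -2} *)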

Compare with the result of ReflectionTransform:
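Continuing with the same illustrative vectors:

    r == ReflectionTransform[v][u]
    (* True *)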

Visualize u and its reflection as a twice-repeated translation by -Projection[u, v]:

The Frenet-Serret system encodes every space curve's properties in a vector basis and scalar functions. Consider the following curve (a helix):

Construct an orthonormal basis from the first three derivatives by subtracting parallel projections:

Ensure that the basis is right-handed:

Compute the curvature, κ, and torsion, τ, which quantify how the curve bends:

Verify the answers using FrenetSerretSystem:

Visualize the curve and the associated moving basis, also called a frame:

Bases and Matrix Decompositions  (3)

Apply the Gram-Schmidt process to construct an orthonormal basis from the following vectors:

The first vector in the orthonormal basis, e_1, is merely the first input vector, normalized:

For subsequent vectors, components parallel to earlier basis vectors are subtracted prior to normalization:
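A sketch of both steps on illustrative vectors:

    v1 = {1, 1, 0}; v2 = {1, 0, 1};
    e1 = Normalize[v1];
    e2 = Normalize[v2 - Projection[v2, v1]];
    {e1.e1, e1.e2, e2.e2}
    (* {1, 0, 1} *)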

Confirm the answers using Orthogonalize:

Find an orthonormal basis for the column space of the following matrix a, and then use that basis to find a QR factorization of a:

Define v_i to be the i-th column of a:

Define u_i as the i-th element of the corresponding Gram-Schmidt basis:

Define Q as the matrix whose columns are the u_i:

Let R = Transpose[Q].a:

Confirm that a == Q.R:
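A consolidated sketch of the preceding steps on an illustrative matrix (lowercase q and r stand in for Q and R):

    a = {{1, 1}, {1, 0}, {0, 1}};
    q = Transpose[Orthogonalize[Transpose[a]]];  (* columns form the Gram-Schmidt basis *)
    r = Transpose[q].a;
    a == q.r
    (* True *)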

Compare with the result given by QRDecomposition; the R matrices are the same:

The Q matrices differ by a transposition because QRDecomposition gives the row-orthonormal result:

For a Hermitian matrix (more generally, any normal matrix), the eigenvectors are orthogonal, and it is conventional to define the projection matrices p_k = Transpose[{e_k}].Conjugate[{e_k}], where e_k is a normalized eigenvector. Show that the action of the projection matrices on a general vector is the same as projecting the vector onto the eigenspace for the following matrix h:

Verify that h is Hermitian:

Find the eigenvalues and eigenvectors:

Compute the normalized eigenvectors:

Compute the projection matrices:

Confirm that multiplying a general vector by p_k equals the projection of the vector onto e_k:

Since the e_k form an orthonormal basis, the sum of the p_k must be the identity matrix:

Moreover, the sum of the λ_k p_k is the original matrix h:

Least Squares and Curve Fitting  (3)

If the linear system m.x == b has no solution, the best approximate solution is the least-squares solution. That is the solution to m.x == b_proj, where b_proj is the orthogonal projection of b onto the column space of m. Consider the following m and b:

The linear system is inconsistent:

Find orthogonal vectors that span the column space of m. First, let v_1 be the first column of m:

Let v_2 be a vector in the column space that is perpendicular to v_1:

Compute the orthogonal projection b_proj of b onto the space spanned by the v_i:
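A consolidated sketch of these steps for an illustrative inconsistent system:

    m = {{1, 1}, {1, 0}, {0, 1}}; b = {1, 1, 1};
    v1 = m[[All, 1]];
    v2 = m[[All, 2]] - Projection[m[[All, 2]], v1];
    bproj = Projection[b, v1] + Projection[b, v2]
    (* {4/3, 2/3, 2/3} *)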

Visualize b, its projections onto the v_i, and b_proj:

Solve m.x == b_proj:

Confirm the result using LeastSquares:
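Using the illustrative m, b and bproj from the sketch above:

    LinearSolve[m, bproj] == LeastSquares[m, b]
    (* True *)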

Projection can be used to find a best-fit curve to data. Consider the following data:

Extract the x and y coordinates from the data:

Let m have the columns 1 and x, so that minimizing Norm[m.{a, b} - y] will be fitting to a line a + b x:

The following two orthogonal vectors clearly span the same space as the columns of m:

Get the coefficients a and b for a linear least-squares fit:

Verify the coefficients using Fit:

Plot the best-fit curve along with the data:

Find the best-fit parabola to the following data:

Extract the x and y coordinates from the data:

Let m have the columns 1, x and x^2, so that minimizing Norm[m.{a, b, c} - y] will be fitting to a + b x + c x^2:

Construct orthonormal vectors that span the same space as the columns of m:

Get the coefficients a, b and c for a least-squares fit:

Verify the coefficients using Fit:

Plot the best-fit curve along with the data:

General Inner Products and Function Spaces  (5)

A positive-definite, real symmetric matrix or metric g defines an inner product by <u, v> = u.g.v:

Being positive-definite means that the associated quadratic form v.g.v is positive for v != 0:

Note that Dot itself is the inner product associated with the identity matrix:

Apply the Gram-Schmidt process to the standard basis to obtain an orthonormal basis:

Confirm that this basis is orthonormal with respect to the inner product defined by g:
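A sketch of the last two steps for an illustrative metric g:

    g = {{2, 1}, {1, 2}};
    ip[u_, v_] := u.g.v;
    basis = Orthogonalize[IdentityMatrix[2], ip];
    Outer[ip, basis, basis, 1]
    (* {{1, 0}, {0, 1}} *)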

Fourier series are projections onto a particular basis in the inner product space of square-integrable functions. Define the standard inner product on square-integrable functions:

Let e_n denote the exponential E^(I n x) for different integer values of n:

The e_n are orthogonal to each other, though not orthonormal:

The Fourier series of a function f is the projection of f onto the space spanned by the e_n:
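A sketch assuming the interval (-Pi, Pi), an illustrative function f = x, and truncation to |n| <= 2; ComplexExpand makes the symbolic conjugates evaluate:

    ip[u_, v_] := Integrate[ComplexExpand[Conjugate[u]] v, {x, -Pi, Pi}];
    e[n_] := Exp[I n x];
    f = x;
    Total[Projection[f, e[#], ip] & /@ Range[-2, 2]] // ExpToTrig // Simplify
    (* 2 Sin[x] - Sin[2 x] *)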

Confirm the result using FourierSeries:

Moreover, <e_n, f>/<e_n, e_n> equals the Fourier coefficient corresponding to FourierParameters -> {-1, 1}:

Confirm using FourierCoefficient:

Unnormalized Gram-Schmidt algorithm:

Do Gram-Schmidt on a random set of 3 vectors:

Verify orthogonality; as the vectors are not normalized, the result is a general diagonal matrix:

Use a positive-definite, real symmetric matrix to define a complex inner product:

Do Gram-Schmidt on a random set of three complex vectors:

Verify orthogonality:

LegendreP defines a family of orthogonal polynomials with respect to the inner product <f, g> = ∫_{-1}^{1} Conjugate[f] g dx. Apply the unnormalized Gram-Schmidt process to the monomials x^k for k from zero through four to compute scalar multiples of the first five Legendre polynomials:
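A sketch of the loop (Conjugate is dropped since all inputs are real; ip and gs are assumed names):

    ip[u_, v_] := Integrate[u v, {x, -1, 1}];
    gs = {};
    Do[AppendTo[gs, Expand[x^k - Total[Projection[x^k, #, ip] & /@ gs]]], {k, 0, 4}];
    gs
    (* {1, x, -(1/3) + x^2, -((3 x)/5) + x^3, 3/35 - (6 x^2)/7 + x^4} *)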

Compare to the conventional Legendre polynomials:

For each k, the k-th Gram-Schmidt polynomial and LegendreP[k, x] differ by a constant multiple, which can be shown to equal 2^k Pochhammer[1/2, k]/k!:

Compute an orthonormal set:

Compare with an explicit expression for the orthonormalized polynomials:

HermiteH defines a family of orthogonal polynomials with respect to the inner product <f, g> = ∫_{-∞}^{∞} Conjugate[f] g Exp[-x^2] dx. Apply the unnormalized Gram-Schmidt process to the monomials x^k for k from zero through four to compute scalar multiples of the first five Hermite polynomials:

Compared to the conventional Hermite polynomials, the k-th Gram-Schmidt polynomial is smaller by a factor of 2^k:

The orthonormal polynomials differ by a multiple of π^(1/4) in the denominator:

Compare with an explicit expression for the orthonormalized polynomials:

Quantum Mechanics  (3)

In quantum mechanics, states are represented by complex unit vectors and physical quantities by Hermitian linear operators. The eigenvalues represent possible observations, and the squared norms of the projections onto the eigenvectors give the probabilities of those observations. For the spin operator and state given, find the possible observations and their probabilities:

Compute the eigensystem; the eigenvalues are the possible observations:

The relative probabilities are given by the squared norms of the projections onto the corresponding eigenvectors:

In quantum mechanics, the energy operator is called the Hamiltonian H, and a state ψ with energy E evolves according to the Schrödinger equation, ψ(t) = Exp[-I E t/ℏ] ψ(0). Given the Hamiltonian for a spin-1 particle in a constant magnetic field in the z direction, find the state at time t of a particle that starts in the given initial state:

Compute the eigensystem to find the energy levels:

The state at time t is the sum of each eigenstate evolving according to the Schrödinger equation:

For the harmonic oscillator Hamiltonian, the n-th eigenvector is a function that is a constant multiple of HermiteH[n, x] Exp[-x^2/2], and the inner product on vectors is <f, g> = ∫_{-∞}^{∞} Conjugate[f] g dx. For a particle in the state ψ = Exp[-x^4/2]/Sqrt[2 Gamma[5/4]], find the probability that it is in one of the first four eigenstates. First, define an inner product:

Confirm that ψ is a unit vector in this inner product:

Project ψ onto the first four states; for n = 1 and n = 3, the projection and hence the probability is zero:
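A numerical sketch, with NIntegrate standing in for the exact integrals (nip, psi, h and prob are assumed names; values are approximate):

    nip[u_, v_] := NIntegrate[u v, {x, -Infinity, Infinity}];
    psi = Exp[-x^4/2]/Sqrt[2 Gamma[5/4]];
    h[n_] := HermiteH[n, x] Exp[-x^2/2];
    prob[n_] := With[{p = Projection[psi, h[n], nip]}, nip[p, p]];
    prob /@ {0, 1, 2, 3}
    (* ≈ {0.9, 0., 0.09, 0.} *)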

The probability is given by the squared norm of the projection. For n = 0, it is just under 90%:

For n = 2, it is just under 9%:

Properties & Relations  (8)

The projection of u onto v is in the direction of v:

The projection of v onto itself is v:

For ordinary vectors u and v, the projection is taken to be (Conjugate[v].u/Conjugate[v].v) v:

If u and v have real entries, Norm[Projection[u, v]] == Norm[u] Cos[θ], where θ is the angle between u and v:

For vectors u and v, u-Projection[u,v] is orthogonal to v:

Orthogonalize can be implemented by repeated application of Projection and Normalize:
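A sketch of that implementation, checked against Orthogonalize on illustrative input (gramSchmidt is an assumed helper name):

    gramSchmidt[vecs_] := Fold[
      Function[{basis, v},
        Append[basis, Normalize[v - Total[Projection[v, #] & /@ basis]]]],
      {}, vecs];
    gramSchmidt[{{1, 1, 0}, {1, 0, 1}}] == Orthogonalize[{{1, 1, 0}, {1, 0, 1}}]
    (* True *)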

For ordinary vectors u and v, the projection can be computed as (Transpose[{v}].Conjugate[{v}].u)/(Conjugate[v].v):

The projection of u onto v is equivalent to multiplication by an outer product matrix:
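A sketch with illustrative complex vectors:

    u = {2 + I, 3}; v = {1, I};
    p = Outer[Times, v, Conjugate[v]]/(Conjugate[v].v);
    p.u == Projection[u, v]
    (* True *)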

Text

Wolfram Research (2007), Projection, Wolfram Language function, https://reference.wolfram.com/language/ref/Projection.html (updated 2014).

CMS

Wolfram Language. 2007. "Projection." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2014. https://reference.wolfram.com/language/ref/Projection.html.

APA

Wolfram Language. (2007). Projection. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/Projection.html

BibTeX

@misc{reference.wolfram_2023_projection, author="Wolfram Research", title="{Projection}", year="2014", howpublished="\url{https://reference.wolfram.com/language/ref/Projection.html}", note={Accessed: 29-March-2024}}

BibLaTeX

@online{reference.wolfram_2023_projection, organization={Wolfram Research}, title={Projection}, year={2014}, url={https://reference.wolfram.com/language/ref/Projection.html}, note={Accessed: 29-March-2024}}