# Projection

Projection[u,v]

finds the projection of the vector u onto the vector v.

Projection[u,v,f]

finds projections with respect to the inner product function f.

# Details

- For ordinary real vectors u and v, the projection is taken to be (u.v/v.v) v.
- For ordinary complex vectors u and v, the projection is taken to be (u.v*/v.v*) v, where v* is Conjugate[v].
- In Projection[u,v,f], u and v can be any expressions or lists of expressions for which the inner product function f applied to pairs of them yields real results.
- Projection[u,v,Dot] effectively assumes that all elements of u and v are real.
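As an illustrative sketch of the formulas above (in Python/NumPy rather than the Wolfram Language), the real and complex cases are both covered by the conjugated form, since conjugation is the identity on real vectors:

```python
import numpy as np

def projection(u, v):
    """(u . v*) / (v . v*) * v, with v* the conjugate of v;
    reduces to (u.v / v.v) v for real vectors."""
    u, v = np.asarray(u), np.asarray(v)
    # np.vdot(a, b) computes conj(a) . b, so np.vdot(v, u) is u . conj(v)
    return (np.vdot(v, u) / np.vdot(v, v)) * v
```

For example, `projection([5, 6, 7], [1, 1, 1])` gives `[6, 6, 6]`, and the projection of any vector onto itself returns that vector.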

# Examples

## Basic Examples (3)

## Scope (9)

### Basic Uses (6)

### General Inner Products (3)

Give an inner product of Dot to assume all expressions are real-valued:

Project vectors that are not lists using an explicit inner product:
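As a sketch of a general inner product (in Python, with a hypothetical weight matrix W standing in for an explicit inner product function), the projection with respect to a real symmetric f follows the same quotient formula:

```python
import numpy as np

def projection_f(u, v, f):
    """Projection of u onto v with respect to a real symmetric inner product f."""
    return (f(u, v) / f(v, v)) * v

# Hypothetical weighted inner product f(a, b) = a . W . b, with W positive definite:
W = np.diag([1.0, 2.0, 3.0])
f = lambda a, b: a @ W @ b

u = np.array([1.0, 1.0, 1.0])
v = np.array([1.0, 0.0, 1.0])
p = projection_f(u, v, f)
# The residual u - p is f-orthogonal to v.
```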

## Applications (17)

### Geometry (3)

Project the vector on the line spanned by the vector :

Visualize and its projection onto the line spanned by :

Project the vector on the plane spanned by the vectors and :

First, replace with a vector in the plane perpendicular to :

The projection in the plane is the sum of the projections onto and :

Find the component perpendicular to the plane:

Confirm the result by projecting onto the normal to the plane:
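The plane-projection steps above can be sketched numerically (Python/NumPy, with hypothetical vectors in place of the symbols shown): orthogonalize the second spanning vector against the first, sum the two projections, and cross-check the perpendicular component against the projection onto the normal:

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

# Hypothetical vectors: project w onto the plane spanned by a and b0.
w  = np.array([1.0, 2.0, 3.0])
a  = np.array([1.0, 0.0, 0.0])
b0 = np.array([1.0, 1.0, 0.0])

b = b0 - proj(b0, a)                 # vector in the plane, perpendicular to a
in_plane = proj(w, a) + proj(w, b)   # projection onto the plane
perp = w - in_plane                  # component perpendicular to the plane

n = np.cross(a, b0)                  # normal to the plane; proj(w, n) should equal perp
```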

Visualize the plane, the vector and its parallel and perpendicular components:

The Frenet–Serret system encodes every space curve's properties in a vector basis and scalar functions. Consider the following curve:

Construct an orthonormal basis from the first three derivatives by subtracting parallel projections:

Ensure that the basis is right-handed:

Compute the curvature, , and torsion, , which quantify how the curve bends:

Verify the answers using FrenetSerretSystem:

Visualize the curve and the associated moving basis, also called a frame:

### Bases and Matrix Decompositions (3)

Apply the Gram–Schmidt process to construct an orthonormal basis from the following vectors:

The first vector in the orthonormal basis, , is merely the normalized multiple :

For subsequent vectors, components parallel to earlier basis vectors are subtracted prior to normalization:

Confirm the answers using Orthogonalize:
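The Gram–Schmidt process described here can be sketched in Python/NumPy (hypothetical input vectors; Orthogonalize itself is the Wolfram Language analogue): subtract the projections onto earlier basis vectors, then normalize.

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

def gram_schmidt(vectors):
    """Subtract projections onto earlier basis vectors, then normalize."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        w = v - sum((proj(v, e) for e in basis), np.zeros_like(v))
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

# Hypothetical input vectors:
B = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
# B @ B.T is the identity, confirming orthonormality.
```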

Find an orthonormal basis for the column space of the following matrix , and then use that basis to find a QR factorization of :

Define as the element of the corresponding Gram–Schmidt basis:

Define as the matrix whose columns are :

Compare with the result given by QRDecomposition; the matrices are the same:

The matrices differ by a transposition because QRDecomposition gives the row-orthonormal result:
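The QR construction can be sketched in Python/NumPy (hypothetical matrix; note NumPy's qr, unlike QRDecomposition, returns the column-orthonormal convention directly, up to column signs):

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

# Hypothetical matrix A; Gram-Schmidt on its columns gives Q, then R = Q^T A.
A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])

basis = []
for v in A.T:                       # iterate over the columns of A
    w = v - sum((proj(v, q) for q in basis), np.zeros_like(v))
    basis.append(w / np.linalg.norm(w))

Q = np.column_stack(basis)          # column-orthonormal
R = Q.T @ A                         # upper triangular, since each column of A
                                    # lies in the span of the earlier q's
```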

For a Hermitian matrix (more generally, any normal matrix), the eigenvectors are orthogonal, and it is conventional to define the projection matrices , where is a normalized eigenvector. Show that the action of the projection matrices on a general vector is the same as projecting the vector onto the eigenspace for the following matrix :

Find the eigenvalues and eigenvectors:

Compute the normalized eigenvectors:

Compute the projection matrices:

Confirm that multiplying a general vector by equals the projection of the vector onto :

Since the form an orthonormal basis, the sum of the must be the identity matrix:
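The spectral-projection steps above can be sketched in Python/NumPy (hypothetical Hermitian matrix): each projection matrix is the outer product of a normalized eigenvector with its conjugate, the matrices act like projections onto the eigenvectors, and they sum to the identity.

```python
import numpy as np

# Hypothetical Hermitian matrix:
H = np.array([[2.0, 1.0], [1.0, 2.0]])

vals, vecs = np.linalg.eigh(H)                  # orthonormal eigenvectors in columns
P = [np.outer(v, v.conj()) for v in vecs.T]     # projection matrices P_i = v_i v_i^H

def proj(u, v):
    return (np.vdot(v, u) / np.vdot(v, v)) * v

x = np.array([3.0, 5.0])                        # a general vector
# P[i] @ x equals the projection of x onto the i-th eigenvector,
# and sum(P) is the identity matrix.
```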

### Least Squares and Curve Fitting (3)

If the linear system has no solution, the best approximate solution is the least-squares solution. That is the solution to , where is the orthogonal projection of onto the column space of . Consider the following and :

The linear system is inconsistent:

Find orthogonal vectors that span . First, let be the first column of :

Let be a vector in the column space that is perpendicular to :

Compute the orthogonal projection of onto the space spanned by the :

Visualize , its projections onto the and :

Confirm the result using LeastSquares:
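The least-squares procedure above can be sketched in Python/NumPy (hypothetical A and b; lstsq plays the role of LeastSquares): project b onto an orthogonal spanning set of the column space, then solve the now-consistent system.

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

# Hypothetical inconsistent system A x = b:
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Orthogonal vectors spanning Col(A):
u1 = A[:, 0]
u2 = A[:, 1] - proj(A[:, 1], u1)

# Orthogonal projection of b onto Col(A):
b_hat = proj(b, u1) + proj(b, u2)

# Solve the (now consistent) system A x = b_hat:
x = np.linalg.lstsq(A, b_hat, rcond=None)[0]
# x agrees with the least-squares solution of the original system A x = b.
```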

Projection can be used to find a best-fit curve to data. Consider the following data:

Extract the and coordinates from the data:

Let have the columns and , so that minimizing will be fitting to a line :

The following two orthogonal vectors clearly span the same space as the columns of :

Get the coefficients and for a linear least‐squares fit:

Verify the coefficients using Fit:

Plot the best-fit curve along with the data:
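The line-fitting recipe above can be sketched in Python/NumPy (hypothetical data points; polyfit plays the role of Fit): build the design matrix with columns 1 and x, orthogonalize, project y, and read off the coefficients.

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

# Hypothetical data points (x_i, y_i):
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 8.0])

# Design matrix with columns 1 and x, so A @ [b, a] fits y ~ a x + b:
A = np.column_stack([np.ones_like(x), x])

# Orthogonal vectors spanning the column space:
u1 = A[:, 0]
u2 = A[:, 1] - proj(A[:, 1], u1)

# Project y onto the column space, then solve for [intercept, slope]:
y_hat = proj(y, u1) + proj(y, u2)
coef = np.linalg.lstsq(A, y_hat, rcond=None)[0]
```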

Find the best-fit parabola to the following data:

Extract the and coordinates from the data:

Let have the columns , and , so that minimizing will be fitting to :

Construct orthonormal vectors that have the same column space as :

Get the coefficients , and for a least‐squares fit:

Verify the coefficients using Fit:

### General Inner Products and Function Spaces (5)

A positive-definite, real symmetric matrix or metric defines an inner product by :

Being positive-definite means that the associated quadratic form is positive for :

Note that Dot itself is the inner product associated with the identity matrix:

Apply the Gram–Schmidt process to the standard basis to obtain an orthonormal basis:

Confirm that this basis is orthonormal with respect to the inner product :
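As a sketch in Python/NumPy (hypothetical metric matrix M), running Gram–Schmidt with the metric's inner product on the standard basis produces vectors orthonormal with respect to that inner product:

```python
import numpy as np

# Hypothetical positive-definite, real symmetric metric; f(a, b) = a . M . b
M = np.array([[2.0, 1.0], [1.0, 2.0]])
f = lambda a, b: a @ M @ b

def gs(vectors, f):
    """Gram-Schmidt with respect to the inner product f."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        w = v - sum(((f(e, v) / f(e, e)) * e for e in basis), np.zeros_like(v))
        basis.append(w / np.sqrt(f(w, w)))
    return basis

# Orthonormalize the standard basis with respect to f:
e1, e2 = gs([[1.0, 0.0], [0.0, 1.0]], f)
```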

Fourier series are projections onto a particular basis in the inner product spaces . Define the standard inner product on square-integrable functions:

Let denote for different integer values of :

The are orthogonal to each other, though not orthonormal:

The Fourier series of a function is the projection of onto the space spanned by the :

Confirm the result using FourierSeries:

Moreover, equals the Fourier coefficient corresponding to FourierParameters->{-1,1}:

Confirm using FourierCoefficient:
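The projection-coefficient view can be sketched symbolically in Python/SymPy (expanding the hypothetical function f(x) = x): the coefficient of each basis exponential is the inner product with f divided by the squared norm, matching the {-1, 1} Fourier convention.

```python
import sympy as sp

x = sp.symbols('x', real=True)

# Standard inner product on square-integrable functions on (-pi, pi):
ip = lambda f, g: sp.integrate(f * sp.conjugate(g), (x, -sp.pi, sp.pi))

# e_n(x) = exp(I n x); mutually orthogonal, each with squared norm 2*pi:
e = lambda n: sp.exp(sp.I * n * x)

f = x   # hypothetical function to expand
# Projection coefficient onto e_n, i.e. the Fourier coefficient:
c1  = sp.simplify(ip(f, e(1)) / ip(e(1), e(1)))     # expected: -I
cm1 = sp.simplify(ip(f, e(-1)) / ip(e(-1), e(-1)))  # expected:  I
```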

Unnormalized Gram–Schmidt algorithm:

Do Gram–Schmidt on a random set of 3 vectors:

Verify orthogonality; as the vectors are not normalized, the result is a general diagonal matrix:

Use a positive-definite, real symmetric matrix to define a complex inner product:

Do Gram–Schmidt on a random set of three complex vectors:

LegendreP defines a family of orthogonal polynomials with respect to the inner product . Apply the unnormalized Gram–Schmidt process to the polynomials for from zero through four to compute scalar multiples of the first five Legendre polynomials:

Compare to the conventional Legendre polynomials:

For each , and differ by a constant multiple:
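The Legendre construction can be sketched in Python/SymPy: unnormalized Gram–Schmidt on the monomials with the integral inner product over (-1, 1) yields constant multiples of the Legendre polynomials.

```python
import sympy as sp

x = sp.symbols('x')
ip = lambda f, g: sp.integrate(f * g, (x, -1, 1))

# Unnormalized Gram-Schmidt on the monomials 1, x, x^2, x^3:
basis = []
for k in range(4):
    p = x**k - sum(ip(x**k, q) / ip(q, q) * q for q in basis)
    basis.append(sp.expand(p))
# basis[2] is x^2 - 1/3, a multiple of LegendreP[2];
# basis[3] is x^3 - 3x/5, which is (2/5) * LegendreP[3].
```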

HermiteH defines a family of orthogonal polynomials with respect to the inner product . Apply the unnormalized Gram–Schmidt process to the polynomials for from zero through four to compute scalar multiples of the first four Hermite polynomials:

Compared to the conventional Hermite polynomials, is smaller by a factor of :

The orthonormal polynomials have some multiple of in the denominator:

### Quantum Mechanics (3)

In quantum mechanics, states are represented by complex unit vectors and physical quantities by Hermitian linear operators. The eigenvalues represent possible observations and the squared norm of projections onto the eigenvectors the probabilities of those observations. For the spin operator and state given, find the possible observations and their probabilities:

Computing the eigensystem, the possible observations are :

The relative probabilities are for and for :
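The measurement calculation can be sketched in Python/NumPy (hypothetical spin-1/2 observable and state, standing in for the operator and state in this example): the eigenvalues are the possible observations, and the squared norms of the projections onto the normalized eigenvectors are the probabilities.

```python
import numpy as np

# Hypothetical observable (Pauli x matrix) and unit state vector:
Sx  = np.array([[0.0, 1.0], [1.0, 0.0]])
psi = np.array([1.0, 0.0])

vals, vecs = np.linalg.eigh(Sx)    # possible observations: the eigenvalues
# Probabilities: squared norms of the projections onto the eigenvectors:
probs = np.abs(vecs.conj().T @ psi) ** 2
# The probabilities sum to 1 because psi is a unit vector.
```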

In quantum mechanics, the energy operator is called the Hamiltonian , and a state with energy evolves according to the Schrödinger equation . Given the Hamiltonian for a spin-1 particle in a constant magnetic field in the direction, find the state at time of a particle that is initially in the state representing :

Computing the eigensystem, the energy levels are and :

The state at time is the sum of each eigenstate evolving according to the Schrödinger equation:

For the Hamiltonian , the eigenvector is a function that is a constant multiple of , and the inner product on vectors is . For a particle in the state , find the probability that it is in one of the first four eigenstates. First, define an inner product:

Confirm that is a unit vector in this inner product:

Project onto the first four states; for and , the projection and hence probability is zero:

The probability is given by the squared norm of the projection. For , it is just under 90%:

## Properties & Relations (8)

The projection of u onto v is in the direction of v:

The projection of v onto itself is v:

For ordinary vectors and , the projection is taken to be :

If and have real entries, , where is the angle between and :

For vectors u and v, u-Projection[u,v] is orthogonal to v:

Orthogonalize can be implemented by repeated application of Projection and Normalize:

For ordinary vectors and , the projection can be computed as :

The projection of u onto v is equivalent to multiplication by an outer product matrix:
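This outer-product identity can be sketched in Python/NumPy (hypothetical vectors): the matrix (v v^T)/(v . v) applied to u reproduces the projection, is idempotent, and leaves a residual orthogonal to v.

```python
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([3.0, 1.0, 4.0])
v = np.array([1.0, 2.0, 2.0])

# Outer-product projection matrix onto v:
P = np.outer(v, v) / np.dot(v, v)
# P @ u equals proj(u, v); P @ P equals P (projections are idempotent).
```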

#### Text

Wolfram Research (2007), Projection, Wolfram Language function, https://reference.wolfram.com/language/ref/Projection.html (updated 2014).
