Projection of a vector on a line

Definition

Consider the line in \mathbf{R}^n passing through x_0 \in \mathbf{R}^n and with direction u \in \mathbf{R}^n:

 \left\{ x_0 + tu ~:~ t \in \mathbf{R} \right\}.

The projection of a given point x on the line is the vector z located on the line that is closest to x (in Euclidean norm). This corresponds to a simple optimization problem:

 \min_t \: \|x - x_0 - tu\|_2.

(This particular problem is part of a general class of optimization problems known as least-squares.)
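As a sanity check, here is a minimal numerical sketch (assuming NumPy, with made-up values for x, x_0 and u) that solves this one-dimensional least-squares problem directly:

```python
import numpy as np

# Hypothetical data: a point x and a line through x0 with direction u (not necessarily unit norm).
x  = np.array([3.0, 1.0, 2.0])
x0 = np.array([1.0, 0.0, 0.0])
u  = np.array([1.0, 2.0, 2.0])

# min_t ||x - x0 - t u||_2 is a least-squares problem in the scalar t:
# stack u as a one-column matrix and solve for t.
t_ls, *_ = np.linalg.lstsq(u.reshape(-1, 1), x - x0, rcond=None)
t_star = t_ls[0]
z_star = x0 + t_star * u          # projected point on the line
print(t_star, z_star)
```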

[Figure: projection of a point on a line.]

Projection of the vector x = (1.6, 1.28) on a line passing through the origin (x_0 = 0) and with (normalized) direction u = (0.8944, 0.4472). At optimality the "residual" vector x - z is orthogonal to the line, hence z = t^\ast u, with t^\ast = x^T u = 2.0035. Any other point on the line is farther away from the point x.
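The figure's numbers can be reproduced in a few lines; the values below are taken from the caption, and u is only approximately unit-norm because of rounding:

```python
import numpy as np

# Values from the caption: x0 = 0, normalized direction u, point x.
x = np.array([1.6, 1.28])
u = np.array([0.8944, 0.4472])   # roughly (2, 1)/sqrt(5), so ||u||_2 is ~1 up to rounding

t = u @ x                        # t = u^T x, since x0 = 0
z = t * u                        # projection of x on the line
print(round(t, 4))               # 2.0035, as reported in the caption
print((x - z) @ u)               # near zero: the residual is (almost) orthogonal to u
```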

Closed-form expression

Assuming that u is normalized, the optimal solution to the above problem is

 t^\ast = u^T(x - x_0),

and the expression for the projected vector is

 z^\ast = x_0 + t^\ast u = x_0 + \left( u^T(x - x_0) \right) u.
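For the normalized case, a small sketch (hypothetical values, assuming NumPy) that also checks the orthogonality property mentioned in the figure caption:

```python
import numpy as np

def project_normalized(x, x0, u):
    """Projection of x onto the line {x0 + t*u}, assuming ||u||_2 = 1."""
    t = u @ (x - x0)                # t* = u^T (x - x0)
    return x0 + t * u

# Illustrative values (not from the text); u has unit norm.
x  = np.array([3.0, 1.0])
x0 = np.array([1.0, 0.0])
u  = np.array([0.6, 0.8])

z = project_normalized(x, x0, u)
print(z)
print((x - z) @ u)                  # ~0: the residual is orthogonal to the direction
```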

In the case when u is not normalized, the expression is

 z^\ast = x_0 + \frac{u^T(x - x_0)}{u^T u} \, u.
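For a direction that is not normalized, the same formula applies after dividing by u^T u; the sketch below (again with illustrative values) also checks that rescaling u does not change the projection:

```python
import numpy as np

def project_on_line(x, x0, u):
    """Projection of x onto the line {x0 + t*u}; u need not be normalized."""
    t = (u @ (x - x0)) / (u @ u)
    return x0 + t * u

# The result is independent of how u is scaled.
x  = np.array([3.0, 1.0])
x0 = np.array([1.0, 0.0])
print(project_on_line(x, x0, np.array([0.6, 0.8])))
print(project_on_line(x, x0, np.array([3.0, 4.0])))   # same line, same projection
```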

Proof: Let us first assume that u is normalized. We express the square of the objective function as

 \|x - x_0 - tu\|_2^2 = \|x - x_0\|_2^2 - 2t \, u^T(x - x_0) + t^2 \|u\|_2^2 = t^2 - 2\alpha t + \beta^2 = (t - \alpha)^2 + \mbox{constant},

where \alpha := u^T(x - x_0) and \beta := \|x - x_0\|_2, and where we used \|u\|_2 = 1. The minimum is attained when the first term (t - \alpha)^2 is zero, which yields

 t^\ast = u^T(x - x_0),

as claimed. When u is not normalized, applying this result to the unit-norm direction u / \|u\|_2 gives the second expression. \clubsuit
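The completed-square step can be checked symbolically on a small instance; the sketch below (assuming SymPy, with an arbitrary unit-norm direction) verifies that the squared objective equals (t - \alpha)^2 plus a constant, and that the minimizer is t = \alpha:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Arbitrary small instance with a unit-norm direction u = (3/5, 4/5).
x  = sp.Matrix([3, 1])
x0 = sp.Matrix([1, 0])
u  = sp.Matrix([sp.Rational(3, 5), sp.Rational(4, 5)])

obj   = (x - x0 - t*u).dot(x - x0 - t*u)     # squared objective ||x - x0 - t u||_2^2
alpha = u.dot(x - x0)                        # alpha = u^T (x - x0)
beta2 = (x - x0).dot(x - x0)                 # beta^2 = ||x - x0||_2^2

# Completed square: obj == (t - alpha)^2 + (beta^2 - alpha^2)
assert sp.simplify(obj - ((t - alpha)**2 + beta2 - alpha**2)) == 0

# The minimizer of the quadratic is t = alpha = u^T (x - x0).
assert sp.solve(sp.diff(sp.expand(obj), t), t) == [alpha]
```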