Matrices as collections of vectors
Matrices can be viewed simply as a collection of vectors of the same size, that is, as a collection of points in a high-dimensional space.
Matrices as collections of columns
Matrices can be described in column-wise fashion: given $n$ vectors $a_1, \ldots, a_n$ in $\mathbb{R}^m$, we can define the $m \times n$ matrix $A$ with the $a_i$'s as columns:
$$A = [a_1, \ldots, a_n].$$
Geometrically, $A$ represents $n$ points in an $m$-dimensional space.
Transpose
The notation $A_{ij}$ denotes the element of $A$ sitting in row $i$ and column $j$. The transpose of an $m \times n$ matrix $A$, denoted by $A^T$, is the $n \times m$ matrix with $(i,j)$ element $A_{ji}$, $1 \le i \le n$, $1 \le j \le m$.
Matrices as collections of rows
Similarly, we can describe a matrix in row-wise fashion: given $m$ vectors $a_1, \ldots, a_m$ in $\mathbb{R}^n$, we can define the $m \times n$ matrix $A$ with the transposed vectors $a_i^T$ as rows:
$$A = \begin{pmatrix} a_1^T \\ \vdots \\ a_m^T \end{pmatrix}.$$
Geometrically, $A$ represents $m$ points in an $n$-dimensional space.
The notation $\mathbb{R}^{m \times n}$ denotes the set of $m \times n$ matrices with real entries.
Examples:
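As a concrete illustration (a minimal NumPy sketch, not part of the original notes; the specific values are made up), we can build a matrix from column vectors and inspect its shape and transpose:

```python
import numpy as np

# A 3x2 matrix: two column vectors a1, a2 in R^3 stacked side by side.
a1 = np.array([1.0, 2.0, 3.0])
a2 = np.array([4.0, 5.0, 6.0])
A = np.column_stack([a1, a2])

print(A.shape)    # (3, 2): 2 points in 3-dimensional space (columns)
print(A.T.shape)  # (2, 3): the transpose swaps rows and columns
print(A[0, 1])    # element in row 0, column 1 -> 4.0
```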
Matrix-vector product
We define the matrix-vector product between an $m \times n$ matrix $A$ and an $n$-vector $x$, and denote by $Ax$, the $m$-vector with $i$-th component
$$(Ax)_i = \sum_{j=1}^n A_{ij} x_j, \quad i = 1, \ldots, m.$$
If the columns of $A$ are given by the vectors $a_j$, $j = 1, \ldots, n$, so that $A = [a_1, \ldots, a_n]$, then $Ax$ can be interpreted as a linear combination of these columns, with weights given by the vector $x$:
$$Ax = \sum_{j=1}^n x_j a_j.$$
Alternatively, if the rows of $A$ are the row vectors $a_i^T$, $i = 1, \ldots, m$:
$$A = \begin{pmatrix} a_1^T \\ \vdots \\ a_m^T \end{pmatrix},$$
then $Ax$ is the vector with elements $a_i^T x$, $i = 1, \ldots, m$:
$$Ax = \begin{pmatrix} a_1^T x \\ \vdots \\ a_m^T x \end{pmatrix}.$$
Examples:
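The two interpretations above can be checked numerically. The following NumPy sketch (not from the original notes; values are made up) computes $Ax$ three ways and verifies they agree:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])    # 3x2 matrix
x = np.array([2.0, -1.0])     # 2-vector

# Direct definition: (Ax)_i = sum_j A_ij x_j
y = A @ x

# Column interpretation: Ax = x_1 a_1 + x_2 a_2 (linear combination of columns)
y_cols = x[0] * A[:, 0] + x[1] * A[:, 1]

# Row interpretation: (Ax)_i = a_i^T x (dot product with each row)
y_rows = np.array([A[i, :] @ x for i in range(A.shape[0])])

assert np.allclose(y, y_cols) and np.allclose(y, y_rows)
print(y)  # [0. 2. 4.]
```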
Matrix-matrix product
Definition
We can extend the matrix-vector product to the matrix-matrix product, as follows. If $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, the notation $AB$ denotes the $m \times p$ matrix with $(i,j)$ element given by
$$(AB)_{ij} = \sum_{k=1}^n A_{ik} B_{kj}.$$
It can be shown that transposing a product changes the order, so that $(AB)^T = B^T A^T$.
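The order reversal under transposition is easy to verify numerically; a quick NumPy check (not part of the original notes, random data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Transposing a product reverses the order: (AB)^T = B^T A^T
assert np.allclose((A @ B).T, B.T @ A.T)
```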
Column-wise interpretation
If the columns of $B$ are given by the vectors $b_j$, $j = 1, \ldots, p$, so that $B = [b_1, \ldots, b_p]$, then $AB$ can be written as
$$AB = [Ab_1, \ldots, Ab_p].$$
In other words, $AB$ results from transforming each column $b_j$ of $B$ into $Ab_j$.
Row-wise interpretation
The matrix-matrix product can also be interpreted as an operation on the rows of $A$. Indeed, if $A$ is given by its rows $a_i^T$, $i = 1, \ldots, m$, then $AB$ is the matrix obtained by transforming each one of these rows via $B$, into $a_i^T B$, $i = 1, \ldots, m$:
$$AB = \begin{pmatrix} a_1^T B \\ \vdots \\ a_m^T B \end{pmatrix}.$$
(Note that the $a_i^T B$'s are indeed row vectors, according to our matrix-vector rules.)
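Both interpretations of the product can be checked side by side. A short NumPy sketch (not from the original notes; values are made up):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0]])
B = np.array([[3.0, 4.0],
              [5.0, 6.0]])

# Column-wise: AB = [A b_1, ..., A b_p]
AB_cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row-wise: the i-th row of AB is a_i^T B
AB_rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

assert np.allclose(AB_cols, A @ B) and np.allclose(AB_rows, A @ B)
```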
Matrix-matrix products by blocks
Matrix algebra generalizes to blocks, provided block sizes are consistent. To illustrate this, consider the matrix-vector product between an $m \times n$ matrix $A$ and an $n$-vector $x$, where $A$, $x$ are partitioned in blocks, as follows:
$$A = [A_1 \ A_2], \quad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix},$$
where $A_1$ is $m \times n_1$, $A_2$ is $m \times n_2$, $x_1 \in \mathbb{R}^{n_1}$, $x_2 \in \mathbb{R}^{n_2}$, $n = n_1 + n_2$. Then
$$Ax = A_1 x_1 + A_2 x_2.$$
Likewise, if an $n \times p$ matrix $B$ is partitioned into two blocks $B_i$, each of size $n_i \times p$, $i = 1, 2$, with $n_1 + n_2 = n$, then
$$AB = [A_1 \ A_2] \begin{pmatrix} B_1 \\ B_2 \end{pmatrix} = A_1 B_1 + A_2 B_2.$$
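The block rules above can be verified numerically. The sketch below (NumPy, random data; not part of the original notes) partitions $A$ into two column blocks and checks both identities:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n1, n2, p = 3, 2, 2, 4
A1 = rng.standard_normal((m, n1))
A2 = rng.standard_normal((m, n2))
A = np.hstack([A1, A2])          # A = [A1 A2], of size m x (n1 + n2)

x1 = rng.standard_normal(n1)
x2 = rng.standard_normal(n2)
x = np.concatenate([x1, x2])

# Block rule for the matrix-vector product: Ax = A1 x1 + A2 x2
assert np.allclose(A @ x, A1 @ x1 + A2 @ x2)

B1 = rng.standard_normal((n1, p))
B2 = rng.standard_normal((n2, p))
B = np.vstack([B1, B2])          # B stacked row-wise in two blocks

# Block rule for the matrix-matrix product: AB = A1 B1 + A2 B2
assert np.allclose(A @ B, A1 @ B1 + A2 @ B2)
```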
Example: Gram matrix.
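The original worked example is not recoverable here, but the Gram matrix of a matrix $A = [a_1, \ldots, a_n]$ is commonly defined as $A^T A$, whose $(i,j)$ entry is the scalar product $a_i^T a_j$ of the columns. A minimal NumPy illustration (values made up):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])   # columns a1, a2 in R^3

G = A.T @ A                  # Gram matrix: G_ij = a_i^T a_j
print(G)
# [[2. 1.]
#  [1. 5.]]
```

Note that the Gram matrix is always symmetric, since $a_i^T a_j = a_j^T a_i$.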
Trace, scalar product
Trace
The trace of an $n \times n$ square matrix $A$, denoted by $\mathbf{tr}\, A$, is the sum of its diagonal elements: $\mathbf{tr}\, A = \sum_{i=1}^n A_{ii}$.
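For instance (a NumPy one-liner, not part of the original notes):

```python
import numpy as np

A = np.array([[1.0, 9.0],
              [4.0, 2.0]])
print(np.trace(A))  # 3.0, the sum of the diagonal elements 1 and 2
```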
Scalar product
We can define the scalar product between two $m \times n$ matrices $A$, $B$ via
$$\langle A, B \rangle := \mathbf{tr}(A^T B) = \sum_{i=1}^m \sum_{j=1}^n A_{ij} B_{ij}.$$
We can interpret the above scalar product as the (vector) scalar product between two long vectors of length $mn$ each, obtained by stacking all the columns of $A$, $B$ on top of each other.
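The equivalence between the trace formula and the stacked-vector interpretation can be checked directly (NumPy sketch, random data; not from the original notes):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((3, 2))

inner = np.trace(A.T @ B)                # <A, B> = tr(A^T B)

# Stack the columns of each matrix into one long vector, then take
# the ordinary vector scalar product (order="F" stacks column-wise).
stacked = A.flatten(order="F") @ B.flatten(order="F")

assert np.isclose(inner, stacked)
```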
Special matrices
Important classes of matrices include the following.
Identity matrix
The $n \times n$ identity matrix (often denoted $I_n$, or simply $I$, if context allows) has ones on its diagonal and zeros elsewhere. It is diagonal, symmetric, and orthogonal, and satisfies $AI_n = A$ for every matrix $A$ with $n$ columns.
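These properties are easy to confirm numerically (NumPy sketch, not part of the original notes):

```python
import numpy as np

I3 = np.eye(3)                       # the 3x3 identity matrix
A = np.arange(6.0).reshape(2, 3)     # any matrix with 3 columns

assert np.allclose(A @ I3, A)        # right-multiplying by I_n leaves A unchanged
assert np.allclose(I3, I3.T)         # symmetric
assert np.allclose(I3 @ I3.T, np.eye(3))  # orthogonal: I I^T = I
```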
Square matrices
Square matrices are matrices that have the same number of rows as columns.
Diagonal matrices
Diagonal matrices are square matrices $A$ with $A_{ij} = 0$ when $i \ne j$.
Symmetric matrices
Symmetric matrices are square matrices $A$ that satisfy $A_{ij} = A_{ji}$ for every pair $(i, j)$. An entire topic is devoted to symmetric matrices.
Orthogonal matrices
Orthogonal matrices are square matrices $U = [u_1, \ldots, u_n]$ such that the columns $u_1, \ldots, u_n$ form an orthonormal basis. If $U$ is an orthogonal matrix, then
$$u_i^T u_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{otherwise.} \end{cases}$$
Thus, $U^T U = I_n$. Similarly, $U U^T = I_n$.
Orthogonal matrices correspond to bases that are a rotation of the standard basis. Their effect on a vector is to rotate it, leaving its length (Euclidean norm) invariant: for every vector $x$,
$$\|Ux\|_2 = \|x\|_2.$$
Example: an orthogonal matrix.
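The specific example from the original is not recoverable here, but a standard instance is the $2 \times 2$ rotation matrix. The NumPy sketch below checks the orthogonality identities and the norm invariance:

```python
import numpy as np

theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 2x2 rotation matrix

assert np.allclose(U.T @ U, np.eye(2))   # U^T U = I
assert np.allclose(U @ U.T, np.eye(2))   # U U^T = I

x = np.array([3.0, 4.0])
# Rotation preserves the Euclidean norm: ||Ux||_2 = ||x||_2
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```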