Mean and Covariance Matrix of a Random Variable

If X is a vector random variable, the mean of X is \hat{X} := \mathbf{E}(X), where \mathbf{E} denotes the expectation operator with respect to the distribution that X follows. The covariance matrix of X is defined as
 \Sigma := \mathbf{E}\left[ (X - \mathbf{E}(X))(X - \mathbf{E}(X))^T \right].
Note that the covariance matrix is always a symmetric positive semi-definite matrix: for any vector z, we have z^T \Sigma z = \mathbf{E}\left[ \left( z^T (X - \mathbf{E}(X)) \right)^2 \right] \ge 0.
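This can be checked numerically: the covariance matrix estimated from any sample has nonnegative eigenvalues. A small NumPy sketch (the sample size and dimension below are arbitrary choices):

```python
import numpy as np

# Numerical sanity check (not a proof): the empirical covariance matrix of
# any sample is positive semi-definite, i.e. all its eigenvalues are >= 0.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 3))     # 1000 samples of a 3-dimensional X
Xc = X - X.mean(axis=0)                # center the data: X - E(X)
Sigma = (Xc.T @ Xc) / X.shape[0]       # covariance matrix estimate
eigvals = np.linalg.eigvalsh(Sigma)    # eigenvalues of the symmetric Sigma
print(eigvals.min() >= -1e-12)         # True: PSD up to floating-point roundoff
```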

Assume that the distribution is discrete, with p_i the probability that the random variable X takes a certain value x_i \in \mathbf{R}^n, i = 1, \ldots, N (so that p_i \ge 0 and \sum_{i=1}^N p_i = 1). Then the expected value of X is
 \hat{X} := \mathbf{E}(X) = \sum_{i=1}^N p_i x_i,
and the covariance matrix is
 \Sigma = \sum_{i=1}^N p_i (x_i - \hat{X})(x_i - \hat{X})^T.
In the special case where all outcomes are equally likely, p_i = 1/N, these reduce to \hat{X} = \frac{1}{N} \sum_{i=1}^N x_i and \Sigma = \frac{1}{N} \sum_{i=1}^N (x_i - \hat{X})(x_i - \hat{X})^T.
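Both sums can be evaluated directly in NumPy. A minimal sketch, where the support points x_i and probabilities p_i below are illustrative values, not taken from the text:

```python
import numpy as np

# Discrete distribution: N support points x_i in R^n with probabilities p_i.
x = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [3.0, 1.0]])           # rows are the x_i (here N = 3, n = 2)
p = np.array([0.2, 0.5, 0.3])        # probabilities p_i, summing to 1

x_hat = p @ x                        # mean: sum_i p_i x_i
d = x - x_hat                        # deviations x_i - x_hat (one per row)
Sigma = (d * p[:, None]).T @ d       # covariance: sum_i p_i d_i d_i^T
```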

Example: if X takes the two values x_1, x_2, with
 x_1 = \left(\begin{array}{c} 1 \\ -2 \end{array}\right), \;\; x_2 = \left(\begin{array}{c} -1 \\ 3 \end{array}\right),
with the probability of x_1 (resp. x_2) being p_1 = 0.3 (resp. p_2 = 0.7), then the mean of X is
 \hat{X} = \mathbf{E}(X) = p_1 x_1 + p_2 x_2 = 0.3 \left(\begin{array}{c} 1 \\ -2 \end{array}\right) + 0.7 \left(\begin{array}{c} -1 \\ 3 \end{array}\right) = \left(\begin{array}{c} -0.4 \\ 1.5 \end{array}\right),
and its covariance matrix is
 \Sigma = p_1 (x_1 - \hat{X})(x_1 - \hat{X})^T + p_2 (x_2 - \hat{X})(x_2 - \hat{X})^T = \left(\begin{array}{cc} 0.84 & -2.1 \\ -2.1 & 5.25 \end{array}\right).
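The worked example above can be reproduced in a few lines of NumPy:

```python
import numpy as np

# Two-point example: x_1, x_2 taken with probabilities 0.3 and 0.7.
x = np.array([[1.0, -2.0],
              [-1.0, 3.0]])          # rows are x_1 and x_2
p = np.array([0.3, 0.7])

x_hat = p @ x                        # mean: p_1 x_1 + p_2 x_2 -> [-0.4, 1.5]
d = x - x_hat                        # deviations x_i - x_hat
Sigma = (d * p[:, None]).T @ d       # -> [[0.84, -2.1], [-2.1, 5.25]]
print(x_hat)
print(Sigma)
```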