
In this short post, we show why the covariance matrix $\Sigma \in \mathbb{R}^{n\times n}$ of a multivariate Gaussian $\vx\in\mathbb{R}^n$ is always symmetric positive definite. That is,

1. for every non-zero vector $\vy \in \mathbb{R}^n$, $\vy^T\Sigma \vy>0$
2. $\Sigma_{ij} = \Sigma_{ji}$.

Condition $(2)$ follows from the definition of the covariance matrix, $\Sigma_{ij} = E[(x_i - \bar{x}_i)(x_j - \bar{x}_j)]$, and the commutativity of multiplication. Condition $(1)$ can be proved in two steps:
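As a quick numerical sanity check of condition $(2)$, we can verify that an empirical covariance matrix is symmetric. This is a minimal sketch using NumPy with made-up sample data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 1000 samples of a random vector in R^3
X = rng.normal(size=(1000, 3))
# Sigma[i, j] estimates E[(x_i - x̄_i)(x_j - x̄_j)]
Sigma = np.cov(X, rowvar=False)

# Condition (2): Sigma_ij == Sigma_ji
assert np.allclose(Sigma, Sigma.T)
```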

First, for every non-zero vector $\vy \in \mathbb{R}^n$, $\vy^T\Sigma \vy \geq 0$. Indeed, $\vy^T\Sigma \vy=\sum_{ij} y_i y_j \Sigma_{ij}=\sum_{ij}y_i y_j E[(x_i - \bar{x}_i)(x_j - \bar{x}_j)]=E\big[\sum_{ij}y_i y_j (x_i - \bar{x}_i)(x_j - \bar{x}_j)\big]\;,$ and with $z_i =x_i - \bar{x}_i$, this becomes $\vy^T\Sigma \vy = E\big[\big(\textstyle\sum_i y_i z_i\big)^2\big] = E\big[(\vy^T\vz)^2\big] \geq 0$ since $(\vy^T\vz)^2\geq 0$.
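The first step can also be checked numerically: for an empirical covariance matrix and random test vectors $\vy$, the quadratic form $\vy^T\Sigma\vy$ should never be negative. A small sketch with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: 500 samples of a random vector in R^4
X = rng.normal(size=(500, 4))
Sigma = np.cov(X, rowvar=False)

# Step 1: y^T Sigma y = E[(y^T z)^2] >= 0 for any y
for _ in range(1000):
    y = rng.normal(size=4)
    assert y @ Sigma @ y >= 0
```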

Second, for $\Sigma$ to be the covariance matrix of a (non-degenerate) multivariate Gaussian, it must be invertible, since the density involves $\Sigma^{-1}$. Hence its rank is $n$, i.e., it has $n$ non-zero eigenvalues $\{\lambda_i\}_{1\leq i \leq n}$, which are also positive $($from step 1$)$. Moreover, since $\Sigma$ is symmetric, its eigenvectors $\{\mathbf{v}_i\}_{1\leq i \leq n}$ form an orthonormal basis of $\mathbb{R}^n$, so every non-zero vector $\vy \in \mathbb{R}^n$ can be written as a linear combination $\vy = \sum_i w_i \mathbf{v}_i$ with not all $w_i$ zero.

Therefore, $\vy^T \Sigma \vy= \sum_{ij} w_i w_j \lambda_j \mathbf{v}^T_i \mathbf{v}_j= \sum_{i} \lambda_i w^2_i > 0\;,$ where the second equality uses orthonormality: $\mathbf{v}^T_i \mathbf{v}_j = 0$ for $i \neq j$ and $\mathbf{v}^T_i \mathbf{v}_i = 1$.
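The second step can be illustrated numerically: diagonalize a full-rank covariance-like matrix, check that its eigenvalues are strictly positive, and verify that the quadratic form equals $\sum_i \lambda_i w_i^2$ in the eigenbasis. A sketch with a synthetic matrix (the $10^{-3}$ ridge is just an assumption to guarantee full rank):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4))
Sigma = A @ A.T + 1e-3 * np.eye(4)   # symmetric, full-rank, covariance-like

# Symmetric matrix => real eigenvalues, orthonormal eigenvectors (columns of V)
lam, V = np.linalg.eigh(Sigma)
assert np.all(lam > 0)               # invertible + PSD => strictly positive

# Expand y in the eigenbasis: y = V w, then y^T Sigma y = sum_i lam_i w_i^2 > 0
y = rng.normal(size=4)
w = V.T @ y                          # coordinates w_i of y in the eigenbasis
assert np.isclose(y @ Sigma @ y, np.sum(lam * w**2))
```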