In this post, we show how Linear Algebra can be used to solve one of the common optimization problems that arise in a variety of domains. In particular, we are interested in the quadratic optimization problem of the form $ \newcommand{\vx}{\mathbf{x}} \newcommand{\vq}{\mathbf{q}} \newcommand{\lm}{\lambda} \newcommand{\norm}[1]{||#1||_2} $

\begin{align} \max_{\vx}\hspace{1em} & \vx^TA\vx \newline \text{s.t.}\hspace{1em} & \norm{\vx}=1 \label{eq:opt-problem} \end{align}

where $A\in\mathbb{R}^{n\times n}$ is a real symmetric matrix and $\vx\in\mathbb{R}^n$. An optimal solution of this problem is $\vq_1\in\mathbb{R}^n$, the eigenvector corresponding to $\lambda_1\in \mathbb{R}$, the largest eigenvalue of $A$.
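Before working through the proof, here is a quick numerical sketch of the claim in NumPy (the matrix $A$ below is a made-up example). `np.linalg.eigh` is specialized for symmetric matrices and returns eigenvalues in ascending order, so the claimed optimizer $\vq_1$ is the last eigenvector column:

```python
import numpy as np

# A hypothetical real symmetric matrix, just for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns eigenvalues in ascending order for symmetric input,
# so the largest eigenvalue and its eigenvector sit at index -1.
eigvals, eigvecs = np.linalg.eigh(A)
lambda_1 = eigvals[-1]
q_1 = eigvecs[:, -1]  # unit-norm eigenvector for lambda_1

print(lambda_1, q_1 @ A @ q_1)  # the two values coincide
```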

Now that we have defined the problem, let’s solve it using the following facts from Linear Algebra.

  1. If $Q\in\mathbb{R}^{n\times n}$ is an orthogonal matrix, then $Q^{T}Q=QQ^T=I_n$, where $I_n$ is the $n\times n$ identity matrix.
  2. Columns (and rows) of an orthogonal matrix have unit norm.
  3. Any real symmetric matrix $A$ can be decomposed as $A=Q\Lambda Q^T$, where $Q\in\mathbb{R}^{n\times n}$ is an orthogonal matrix whose columns are the eigenvectors of $A$, and $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_n)$ is a diagonal matrix whose diagonal entries are the eigenvalues of $A$, ordered so that $\lambda_1\geq \lambda_2 \geq \cdots\geq \lambda_n$, with $\lambda_i$ the eigenvalue corresponding to the $i^{th}$ column of $Q$.
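These three facts are easy to sanity-check numerically. The sketch below (on a randomly generated symmetric matrix, not anything specific to this post) uses `np.linalg.eigh` and reorders its output to match the descending-eigenvalue convention above:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2  # symmetrize a random matrix (illustrative data)

eigvals, Q = np.linalg.eigh(A)        # A = Q diag(eigvals) Q^T
order = np.argsort(eigvals)[::-1]     # sort so lambda_1 >= ... >= lambda_n
eigvals, Q = eigvals[order], Q[:, order]

# (1.) Q is orthogonal: Q^T Q = Q Q^T = I_n.
assert np.allclose(Q.T @ Q, np.eye(4)) and np.allclose(Q @ Q.T, np.eye(4))
# (2.) Each column of Q has unit norm.
assert np.allclose(np.linalg.norm(Q, axis=0), 1.0)
# (3.) The eigendecomposition reconstructs A exactly (up to float error).
assert np.allclose(Q @ np.diag(eigvals) @ Q.T, A)
```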

From (3.), one can write $\vx^TA\vx = \vx^T Q\Lambda Q^T \vx$. Letting $\mathbf{y} = Q^T\vx$, this becomes $\mathbf{y}^T\Lambda\mathbf{y} = \sum_{i=1}^n \lambda_i y_i^2 \leq \lambda_1 \sum_{i=1}^n y_i^2 = \lambda_1\norm{\mathbf{y}}^2$. By (1.), $\norm{\mathbf{y}}^2 = \vx^TQQ^T\vx = \norm{\vx}^2$, and the unit-norm constraint on $\vx$ then gives $\vx^TA\vx \leq \lambda_1 \norm{\vx}^2 = \lambda_1$.
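The inequality can also be checked empirically: for a random symmetric matrix (again an arbitrary example), every random unit-norm vector should yield a value of $\vx^TA\vx$ no greater than $\lambda_1$:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                       # random symmetric matrix (example)
lambda_1 = np.linalg.eigh(A)[0][-1]     # largest eigenvalue

# Draw random vectors and normalize each row to unit norm.
X = rng.standard_normal((10_000, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)

values = np.einsum('ij,jk,ik->i', X, A, X)  # x^T A x for every row x
assert values.max() <= lambda_1 + 1e-9      # bound holds for all samples
```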

So far we have only shown that $\lambda_1$ is an upper bound: $\vx^TA\vx \leq \lambda_1$ for every unit-norm $\vx$. But the bound is attained: since $A\vq_1 = \lambda_1\vq_1$ and $\norm{\vq_1}=1$, we have $\vq_1^TA\vq_1 = \lambda_1\vq_1^T\vq_1 = \lambda_1$. Therefore $\max_{\vx} \vx^T A \vx = \lambda_1$ and $\vq_1 \in \arg\max_{\vx} \vx^T A \vx$.
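This closing argument can be replayed numerically as well, once more on an arbitrary random symmetric matrix: $\vq_1$ is an eigenvector, it is feasible, and it attains the upper bound.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((6, 6))
A = (B + B.T) / 2                        # random symmetric matrix (example)
eigvals, eigvecs = np.linalg.eigh(A)     # eigenvalues in ascending order
lambda_1, q_1 = eigvals[-1], eigvecs[:, -1]

# q_1 is an eigenvector for the largest eigenvalue: A q_1 = lambda_1 q_1 ...
assert np.allclose(A @ q_1, lambda_1 * q_1)
# ... it satisfies the constraint ||q_1|| = 1 ...
assert np.isclose(np.linalg.norm(q_1), 1.0)
# ... and it attains the upper bound: q_1^T A q_1 = lambda_1.
assert np.isclose(q_1 @ A @ q_1, lambda_1)
```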