
I saw a post https://mathoverflow.net/questions/98367/a-sum-of-eigenvalues that said

It is well-known that $\sum^{r}_{i = 1} \lambda_{i}(X)$ is convex.

and I saw an explanation in Boyd and Vandenberghe - "Convex Optimization", question 3.26(a) that said

$$\sum^{k}_{i = 1} \lambda_{i}(X) = \sup\{\operatorname{tr}(V^{T}XV) \mid V^{T}V = I\}.$$

The variational characterization shows that $f$ is the pointwise supremum of a family of linear functions $\operatorname{tr}(V^{T}XV)$

is an explanation for why it is convex.

But I don't really understand why the sup equals the sum of the first $k$ eigenvalues, or how that makes it convex.

1 Answer

The supremum of any family of convex functions is convex. (proof)

Every linear function is convex.

Hence the supremum of a family of linear functions is convex.
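As a numerical sanity check (not a proof), one can test the convexity inequality for $f(X) = \sum_{i=1}^{k} \lambda_i(X)$ directly on random symmetric matrices; here is a minimal NumPy sketch (the function names and parameters are illustrative):

```python
import numpy as np

# Sanity check (not a proof): convexity of f(X) = sum of the k largest
# eigenvalues of a symmetric X requires
#   f(t*A + (1-t)*B) <= t*f(A) + (1-t)*f(B).
rng = np.random.default_rng(0)
n, k = 6, 3

def f(X, k):
    # np.linalg.eigvalsh returns the eigenvalues of a symmetric matrix
    # in ascending order, so the last k entries are the k largest.
    return np.linalg.eigvalsh(X)[-k:].sum()

def random_symmetric(n):
    M = rng.standard_normal((n, n))
    return (M + M.T) / 2

violations = 0
for _ in range(1000):
    A, B = random_symmetric(n), random_symmetric(n)
    t = rng.uniform()
    lhs = f(t * A + (1 - t) * B, k)
    rhs = t * f(A, k) + (1 - t) * f(B, k)
    if lhs > rhs + 1e-9:
        violations += 1
```

Running this, `violations` stays at zero, consistent with convexity.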

Edit: To prove the supremum identity, first note that for $V \in \mathbb{R}^{n \times k}$ with $V^{T}V = I$,
\begin{align}
\operatorname{tr}(V^{T}XV) &= \operatorname{tr}(XVV^{T})\\
&\le \lambda(X)^{T} \lambda(VV^{T}),
\end{align}
where $\lambda(\cdot)$ denotes the vector of eigenvalues and the inequality is the trace inequality for symmetric matrices (a consequence of von Neumann's trace inequality). Now, since $(VV^{T})^{2} = VV^{T}$, the only possible eigenvalues of $VV^{T}$ are $0$ and $1$. Also note that
$$\operatorname{tr}(VV^{T}) = \operatorname{tr}(V^{T}V) = \operatorname{tr}(I_{k}) = k,$$
so $VV^{T}$ has exactly $k$ eigenvalues equal to $1$ and the rest are zero. Hence $\lambda(X)^{T}\lambda(VV^{T})$ is a sum of $k$ eigenvalues of $X$, and any such sum is at most the sum of the $k$ largest. Combining this with the inequality above gives
$$\operatorname{tr}(V^{T}XV) \le \sum_{i=1}^{k} \lambda_i(X).$$

Taking $$V = [v_1, \ldots, v_k],$$ where $v_j$ is a unit eigenvector corresponding to $\lambda_j(X)$ (the $j$-th largest eigenvalue), attains this bound, so the supremum equals $\sum_{i=1}^{k} \lambda_i(X)$.
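Both halves of the argument can be checked numerically: random matrices with orthonormal columns (generated here via QR factorisation, an illustrative choice) never exceed the bound, and the eigenvector choice attains it. A minimal NumPy sketch:

```python
import numpy as np

# Check the supremum identity: for any V with V^T V = I,
# tr(V^T X V) <= sum of the k largest eigenvalues of X,
# with equality when V's columns are the corresponding eigenvectors.
rng = np.random.default_rng(1)
n, k = 6, 3
M = rng.standard_normal((n, n))
X = (M + M.T) / 2                         # random symmetric matrix

top_k = np.linalg.eigvalsh(X)[-k:].sum()  # sum of the k largest eigenvalues

# Random V with orthonormal columns via QR factorisation
max_trace = -np.inf
for _ in range(1000):
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))
    max_trace = max(max_trace, np.trace(V.T @ X @ V))

# Eigenvector choice attains the bound
w, U = np.linalg.eigh(X)                  # ascending eigenvalues; columns of U are eigenvectors
V_star = U[:, -k:]                        # eigenvectors of the k largest eigenvalues
attained = np.trace(V_star.T @ X @ V_star)
```

After running, `max_trace` never exceeds `top_k`, while `attained` matches it to machine precision.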

Ben Martin
  • Yes, I understand that the sup of any family of convex functions is convex; I just don't understand why the sup is the sum of the first $k$ eigenvalues – Tigertron May 18 '22 at 03:39
  • Does this also mean that $f(x) = \sum^{k}_{i = 1} \lambda_{i}(g(x))$ is also convex? – Tigertron May 18 '22 at 14:35
  • No, take $g(x) = -x^2$ and $n=1$. The sup identity still holds but the family of functions need not be convex. – Ben Martin May 18 '22 at 21:22
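The counterexample in the last comment can be made concrete: with $g(x) = -x^2$ and $n = k = 1$, the "matrix" $g(x)$ is $1 \times 1$, so $f(x) = \lambda_1(g(x)) = -x^2$, which is concave. A minimal sketch checking that the convexity inequality fails at the midpoint:

```python
import numpy as np

# Counterexample: g(x) = -x**2, n = k = 1, so f(x) = largest eigenvalue
# of the 1x1 matrix [[-x**2]], i.e. f(x) = -x**2, which is concave.
def f(x):
    return np.linalg.eigvalsh(np.array([[-x**2]]))[-1]

x, y, t = -1.0, 1.0, 0.5
lhs = f(t * x + (1 - t) * y)      # f(0)
rhs = t * f(x) + (1 - t) * f(y)   # average of f(-1) and f(1)
convexity_holds = lhs <= rhs      # False: lhs = 0 > -1 = rhs
```

The inequality fails because each $\operatorname{tr}(V^{T} g(x) V)$ is linear in the matrix argument but not in $x$, so the family being supremised over is no longer a family of convex functions of $x$.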