
Let $\alpha$ and $\beta$ be vectors in $\mathbb{R}^n$, and let $x\in\mathbb{R}^n$ satisfy $\beta^t x>0$. The Hessian matrix is $\{H(x)\}_{ij}=\frac{2}{(\beta^t x)^3}\left((\alpha^t x)^2(\beta_i\beta_j)+(\beta^t x)^2(\alpha_i\alpha_j)-(\alpha^t x)(\beta^t x)(\beta_i\alpha_j+\alpha_i\beta_j)\right)$, where $\alpha^t$ and $\beta^t$ denote the transposes of $\alpha$ and $\beta$.

Prove that the Hessian matrix $H(x)$ is positive semidefinite.
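As a quick sanity check on the claim, here is a small numerical sketch (Python with NumPy; the variable names are illustrative, not from the question): it assembles $H(x)$ entrywise from the formula above for random $\alpha$, $\beta$ and an $x$ with $\beta^t x > 0$, and confirms that its eigenvalues are nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Illustrative alpha, beta; x is flipped if needed so that beta^T x > 0.
alpha = rng.standard_normal(n)
beta = rng.standard_normal(n)
x = rng.standard_normal(n)
if beta @ x < 0:
    x = -x

ax, bx = alpha @ x, beta @ x  # alpha^T x and beta^T x

# Hessian assembled entrywise from the formula in the question:
# H_ij = 2/(beta^T x)^3 * ( (alpha^T x)^2 b_i b_j + (beta^T x)^2 a_i a_j
#                           - (alpha^T x)(beta^T x)(b_i a_j + a_i b_j) )
H = (2.0 / bx**3) * (
    ax**2 * np.outer(beta, beta)
    + bx**2 * np.outer(alpha, alpha)
    - ax * bx * (np.outer(beta, alpha) + np.outer(alpha, beta))
)

# H is symmetric; all eigenvalues should be nonnegative up to round-off.
eigvals = np.linalg.eigvalsh(H)
print(eigvals)
assert eigvals.min() > -1e-10
```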

1 Answer


To prove that $H(x)$ is positive semidefinite, it suffices to show that $s^t H(x) s \geq 0$ for every real vector $s$. Notice that $s^t H(x) s = \sum_{ij} s_i \{ H(x) \}_{ij} s_j$, so substituting the given formula for $H(x)$ yields
\begin{align}
s^t H(x) s &= \frac{2}{(\beta^t x)^3} \Bigl((\alpha^t x)^2 \Bigl(\sum_i \beta_i s_i\Bigr) \Bigl(\sum_j \beta_j s_j\Bigr) + (\beta^t x)^2 \Bigl(\sum_i \alpha_i s_i\Bigr) \Bigl(\sum_j \alpha_j s_j\Bigr) \\
&\qquad - (\alpha^t x) (\beta^t x) \Bigl(\Bigl(\sum_i \beta_i s_i\Bigr) \Bigl(\sum_j \alpha_j s_j\Bigr) + \Bigl(\sum_i \alpha_i s_i\Bigr) \Bigl(\sum_j \beta_j s_j\Bigr)\Bigr)\Bigr) \\
&= \frac{2}{(\beta^t x)^3} \bigl((\alpha^t x)^2 (\beta^t s)^2 + (\beta^t x)^2 (\alpha^t s)^2 - 2 (\alpha^t x) (\beta^t x) (\beta^t s) (\alpha^t s)\bigr) \\
&= \frac{2}{(\beta^t x)^3} \bigl((\alpha^t x) (\beta^t s) - (\beta^t x) (\alpha^t s)\bigr)^2 \\
&\geq 0,
\end{align}
where the last inequality holds because $\beta^t x > 0$ makes the factor $\frac{2}{(\beta^t x)^3}$ positive and the remaining factor is a perfect square.

Therefore the matrix $H(x)$ is positive semidefinite.
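The key step in the computation above is the identity $s^t H(x) s = \frac{2}{(\beta^t x)^3}\bigl((\alpha^t x)(\beta^t s) - (\beta^t x)(\alpha^t s)\bigr)^2$. A self-contained numerical check of this identity (again Python/NumPy with illustrative names) is sketched below.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Illustrative alpha, beta; x is flipped if needed so that beta^T x > 0.
alpha, beta, x = rng.standard_normal((3, n))
if beta @ x < 0:
    x = -x
ax, bx = alpha @ x, beta @ x

# H(x) from the formula in the question.
H = (2.0 / bx**3) * (
    ax**2 * np.outer(beta, beta)
    + bx**2 * np.outer(alpha, alpha)
    - ax * bx * (np.outer(beta, alpha) + np.outer(alpha, beta))
)

# s^T H s should equal 2/(beta^T x)^3 * ((alpha^T x)(beta^T s) - (beta^T x)(alpha^T s))^2,
# which is manifestly nonnegative.
for _ in range(5):
    s = rng.standard_normal(n)
    lhs = s @ H @ s
    rhs = (2.0 / bx**3) * (ax * (beta @ s) - bx * (alpha @ s)) ** 2
    assert np.isclose(lhs, rhs) and lhs > -1e-10
```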

Miles Zhou