
I came across this problem and I'm not sure if the proof I have found is OK. I may be overcomplicating things. Any suggestions or confirmation of correctness would be appreciated.

The Problem:

Let $A$ be an $n \times n$ Hermitian matrix such that $\det A_k > 0$ for all $k = 1, 2, \ldots, n-1$ and $\det A \geq 0$, where $A_k$ denotes the leading principal $k \times k$ submatrix of $A$. Prove that $A$ is positive semidefinite.

My attempt:

Consider the matrix $A_{n-1}$. By Sylvester's Criterion, $A_{n-1}$ is positive definite $(A_{n-1}>0)$, and therefore it can be diagonalized by congruence, using paired elementary row and column operations that add a multiple of one row (or column) to another (and hence preserve the determinant), into a diagonal matrix $A_{n-1}'$ with positive diagonal entries.
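For concreteness, here is one such congruence step on a $2\times2$ example of my own (not part of the problem): starting from $\begin{pmatrix} 1 & 2\\2 & 5\end{pmatrix}$, subtract twice row 1 from row 2, then twice column 1 from column 2:

$$\begin{pmatrix} 1 & 2\\2 & 5\end{pmatrix}\longrightarrow\begin{pmatrix} 1 & 2\\0 & 1\end{pmatrix}\longrightarrow\begin{pmatrix} 1 & 0\\0 & 1\end{pmatrix},$$

and the determinant equals $1$ at every stage, since each operation only adds a multiple of one row (or column) to another.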

We now consider the matrix $A'$ obtained by applying these same operations to the full matrix $A$; this replaces the leading $(n-1)\times(n-1)$ block by $A_{n-1}'$ (the entries of the last row and column, marked $*$, may change as well):

$$A'=\begin{pmatrix} A_{n-1}' & *\\* & a_{n,\,n}\end{pmatrix}$$

$A'$ has the same determinant as $A$, so $\det A'\geq0$. Since the diagonal entries of $A_{n-1}'$ are positive (in particular nonzero), we can use them to clear the last row and column with further determinant-preserving row and column operations, obtaining:

$$D=\begin{pmatrix} A_{n-1}' & 0\\0 & a_{n,\,n}'\end{pmatrix},\:\det D\geq0$$
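Explicitly, write $d_1,\dots,d_{n-1}>0$ for the diagonal entries of $A_{n-1}'$ and $b_i$ for the $(i,n)$ entry of $A'$ (so its $(n,i)$ entry is $\overline{b_i}$, as $A'$ is Hermitian). Subtracting $(\overline{b_i}/d_i)$ times row $i$ from row $n$ and $(b_i/d_i)$ times column $i$ from column $n$, for $i=1,\dots,n-1$, zeroes out the last row and column and leaves

$$a_{n,\,n}'=a_{n,\,n}-\sum_{i=1}^{n-1}\frac{|b_i|^2}{d_i}.$$

Each of these operations adds a multiple of one row (or column) to another, so the determinant is unchanged.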

We compute the determinant of $D$:

$$\det D=\begin{vmatrix} A_{n-1}' & 0\\0 & a_{n,\,n}'\end{vmatrix}= a_{n,\,n}'\cdot\det A_{n-1}' \geq 0$$

Since the operations preserve the determinant, $\det A_{n-1}'=\det A_{n-1}>0$, and hence $a_{n,\,n}' \geq 0$. Therefore, we have diagonalized the matrix $A$ by congruence into a diagonal matrix whose diagonal entries all satisfy $a_{i,\,i}' \geq 0$ for $i=1,2,\ldots,n$.

By Sylvester's Law of Inertia, the numbers of positive, zero, and negative diagonal entries depend only on $A$ and not on the particular congruence used to diagonalize it. Since $A$ is Hermitian, the spectral theorem gives $A=U\Lambda U^*$ with $U$ unitary and $\Lambda$ diagonal with the eigenvalues of $A$ on its diagonal; because $U^*=U^{-1}$, this is also a congruence, so $\Lambda$ is another diagonal form of $A$. Therefore, by Sylvester's Law of Inertia, the eigenvalues $\lambda_i$ of $A$ also satisfy $\lambda_i\geq0$ for all $i=1,2,\ldots,n$, which means that $A$ is positive semidefinite ($A\geq0$).
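As a small sanity check (an example of my own), take $A=\begin{pmatrix} 1 & 1\\1 & 1\end{pmatrix}$: here $\det A_1=1>0$ and $\det A=0$. Subtracting row 1 from row 2 and column 1 from column 2 gives the diagonal form $\operatorname{diag}(1,0)$, whose inertia matches the eigenvalues $2$ and $0$, so $A$ is positive semidefinite but not positive definite.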

  • Sounds good to me. – A.G. Dec 27 '20 at 13:10
  • Are you sure $A$ and $D$ are congruent matrices? I think you went too fast on this argument. – Velobos Dec 28 '20 at 09:51
  • @Velobos Yes, because in all cases we diagonalize by congruence, as stated above. This means we do the same operations to the rows of $A$ as to the columns of $A$. If we now take the identity matrix and perform the same elementary row operations, but only on the rows, we will obtain a matrix $P$ that will perform those row operations when multiplied on the left of $A$ (that is, when we do $PA$). If we then take the adjoint $P^*$ of $P$, we obtain a matrix that will perform the same operations but on the columns when multiplied on the right of $A$ (that is, when we do $AP^*$). – Clerni Dec 28 '20 at 14:04
  • Then, if we multiply on the right by $P^*$ and on the left by $P$, we will have performed both the row and column operations that we performed on $A$ when we diagonalized by congruence. Hence, $D=PAP^*$, and $A$ and $D$ are congruent; a $2\times2$ version is written out below these comments. – Clerni Dec 28 '20 at 14:04
  • This is one of the methods used to diagonalize quadratic forms and it always gives congruent matrices. – Clerni Dec 28 '20 at 14:07
  • As an aside, it's worth stressing that all the leading principal minors must be positive, rather than merely nonnegative. For instance, the answer here gives a matrix with positive entries and nonnegative leading principal minors which nevertheless is indefinite. – Semiclassical Jan 01 '21 at 20:03
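To spell out the congruence from the comments on the smallest case (with generic entries $a,b,d$ and an arbitrary multiplier $c$): the single operation "add $c$ times row 1 to row 2" corresponds to $P=\begin{pmatrix} 1 & 0\\c & 1\end{pmatrix}$, and

$$PAP^*=\begin{pmatrix} 1 & 0\\c & 1\end{pmatrix}\begin{pmatrix} a & b\\\overline{b} & d\end{pmatrix}\begin{pmatrix} 1 & \overline{c}\\0 & 1\end{pmatrix}=\begin{pmatrix} a & b+a\overline{c}\\\overline{b}+ca & d+cb+\overline{cb}+|c|^2a\end{pmatrix},$$

which performs the row operation and the corresponding conjugated column operation at once; the result is Hermitian and congruent to $A$.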

1 Answer


Just so this question does not go unanswered, I have confirmed that the above approach is correct. Any other suggestions are welcome.
