Let $A$ be an $n\times n$ matrix such that for every $n\times n$ matrix $B$ we have $\det(A+B)=\det(A)+\det(B)$. Does this imply that $A=0$, or at least that $\det(A)=0$?
This was a question on a PhD exam in Iran! If we only consider $B=\begin{bmatrix}1&0\\0&1\end{bmatrix}$, it is enough that $\operatorname{trace}(A)=0$. But what about every matrix $B$? Then this is at least a necessary condition, but maybe not sufficient. – Somaye Mar 13 '13 at 12:16
4 Answers
No. Consider the 1-by-1 case, where $\det(A+B)\equiv A+B\equiv\det(A)+\det(B)$.
However, the statement is true for $n>1$. By multiplying by elementary matrices on both sides (the hypothesis is preserved, since $\det(PAQ+B)=\det P\,\det(A+P^{-1}BQ^{-1})\,\det Q=\det(PAQ)+\det(B)$ for invertible $P,Q$), we may assume that $A=I_k\oplus0_{(n-k)\times(n-k)}$, where $k$ is the rank of $A$. If $k=n$, i.e. $A=I$, consider $B=0\oplus(-I_{n-1})$. Then $\det(A+B)=\det(A)+\det(B)$ implies that $0=1$, which is a contradiction. Thus $k<n$, and in particular $\det(A)=0$. Now let $B=0_{k\times k}\oplus I_{n-k}$. Then $\det(A+B)=\det(A)+\det(B)$ implies that $1=\det(B)$; since $\det(B)=0$ whenever $k\geq1$, this forces $k=0$, i.e. $A=0$.
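As a sanity check, here is a quick sympy verification of the two constructions above (a sketch only; the size $n=3$ and the rank $k=2$ are illustrative choices):

```python
# Sanity check of the two constructions above with sympy (illustrative size n = 3).
from sympy import eye, diag

n = 3

# Case k = n: A = I, B = 0 (+) -I_{n-1}.
A = eye(n)
B = diag(0, *([-1] * (n - 1)))
# det(A + B) = 0 while det(A) + det(B) = 1 + 0, so the identity would force 0 = 1.
assert (A + B).det() == 0 and A.det() + B.det() == 1

# Case 0 < k < n: A = I_k (+) 0, B = 0_{k x k} (+) I_{n-k}.
k = 2
A = diag(*([1] * k), *([0] * (n - k)))
B = diag(*([0] * k), *([1] * (n - k)))
# det(A + B) = det(I) = 1 while det(A) = det(B) = 0, so the identity would force 1 = 0.
assert (A + B).det() == 1 and A.det() == 0 and B.det() == 0
```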
When $n\geq 2$ and $A,B$ are $n\times n$ matrices over a principal ideal domain $R$ (a fortiori any field), this condition implies $A=0$. The argument of @user1551 works the same.
By the Smith normal form decomposition, there exist two invertible matrices $P,Q$ and a diagonal matrix $D$ such that $$ PAQ=D. $$ Then $$ \det (D+PBQ)=\det P \det (A+B)\det Q=\det P(\det A+\det B)\det Q =\det D+\det(PBQ), $$ so $$ \det (D+C)=\det D+\det C \qquad\forall C\in M_n(R). $$ Assume $D$ is nonzero. Without loss of generality, we can assume that $D=\mbox{diag}(d_1,\ldots,d_k,0,\ldots,0)$ where $k\geq 1$ and all the $d_j\neq 0$.
If $k=n$, we get a contradiction with $C=\mbox{diag}(-d_1,0,\ldots,0)$: $0=\det D+0$ while $\det D\neq 0$.
If $k<n$, we get a contradiction with $C=\mbox{diag}(0,\ldots,0,1,\ldots,1)$, where there are $k$ zeros: $d_1\cdots d_k=0+0$.
So $$ D=0\qquad \Rightarrow \quad A=P^{-1}DQ^{-1}=0. $$
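As a sanity check over the PID $\mathbb{Z}$, here is a quick sympy verification of the two diagonal counterexamples above (a sketch; $n=3$ and the values of the $d_j$ are illustrative choices):

```python
# Sanity check of the two diagonal counterexamples above, with integer matrices.
from sympy import diag

n = 3
d = [2, 3, 5]                        # arbitrary nonzero d_j

# k = n: C = diag(-d_1, 0, ..., 0) gives det(D + C) = 0 while det(D) != 0.
D = diag(*d)
C = diag(-d[0], *([0] * (n - 1)))
assert (D + C).det() == 0 and D.det() != 0 and C.det() == 0

# k = 2 < n: C = diag(0, ..., 0, 1, ..., 1) with k zeros gives
# det(D + C) = d_1 * ... * d_k, while det(D) = det(C) = 0.
k = 2
D = diag(*d[:k], *([0] * (n - k)))
C = diag(*([0] * k), *([1] * (n - k)))
assert (D + C).det() == d[0] * d[1] and D.det() == 0 and C.det() == 0
```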
Note: if $R$ is not a principal ideal domain, I have no idea.
This happens of course when $n = 1$, as noted by @user1551.
If $n \ge 2$, consider the matrix $C$ which is zero everywhere except in its last row, which equals the first row of $A$ minus the last row of $A$. Since $A+C$ has equal first and last rows, and $\det(C) = 0$ because $C$ has at most one nonzero row (and $n > 1$), we get $$ 0 = \det(A + C) = \det(A) + \det(C) = \det(A). $$
Then $$ \det(-A + \lambda I) = (-1)^{n}\det(A - \lambda I) = (-1)^{n}\bigl(\det(A) + \det(-\lambda I)\bigr) = (-1)^{n} \det(A) + \lambda^{n} = \lambda^{n}, $$ so all eigenvalues of $A$ are zero, and $A$ is nilpotent. Suppose $A \ne 0$, and consider without loss of generality the case when $A$ consists of a single Jordan block $$ A = \begin{bmatrix} 0 & 1 & & \\ & 0 & \ddots & \\ & & \ddots & 1 \\ & & & 0 \end{bmatrix}, $$ where I have omitted the zero entries. Consider $$ B = \begin{bmatrix} 0 & & & \\ & 0 & & \\ & & \ddots & \\ 1 & & & 0 \end{bmatrix}, $$ which has a single $1$ in the lower-left corner. Then $\det(A) = \det(B) = 0$, since both are strictly triangular, but $A+B$ is a cyclic permutation matrix, so $\det(A+B) = (-1)^{n-1} \neq 0$.
It follows that $A = 0$.
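A quick sympy check of the Jordan-block counterexample above (a sketch; the size $n=4$ is an illustrative choice):

```python
# A is the nilpotent Jordan block, B has a single 1 in the lower-left corner.
from sympy import zeros

n = 4
A = zeros(n, n)
for i in range(n - 1):
    A[i, i + 1] = 1                  # ones on the superdiagonal
B = zeros(n, n)
B[n - 1, 0] = 1                      # single 1 in the lower-left corner

# det(A) = det(B) = 0, but A + B is a cyclic permutation matrix,
# so det(A + B) = (-1)^(n-1) != 0 = det(A) + det(B).
assert A.det() == 0 and B.det() == 0
assert (A + B).det() == (-1) ** (n - 1)
```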
We are going to prove that $A=0$.
Let $A= (a_{ij})$. First, $\det(A)=0$: taking $B=A$ gives $2^n\det(A)=\det(2A)=\det(A)+\det(A)=2\det(A)$, so $\det(A)=0$ (we may assume $n\geq 2$, the $1\times 1$ case having been dealt with above).
Claim 1 There exists an $i$ so that $a_{ii}=0$.
Define $B=(b_{ij})$ where $b_{ij}=a_{ij}$ if $j>i$ and $b_{ij}=0$ otherwise. (Thus $B$ is zero on and below the diagonal, i.e. strictly upper triangular, so $\det(B)=0$.)
Then $\det(A-B)=\det(A)+\det(-B)=\det(A)+(-1)^n \det(B)=0+0=0$.
Since $A-B$ is lower triangular, $\det(A-B)$ is the product of its diagonal entries, and thus $a_{11}\cdots a_{nn}=0$; hence some $a_{ii}=0$.
Claim 2 $a_{11}=\cdots=a_{nn}=0$.
Proof.
Let $C=(c_{ij})$ where $c_{ij}=a_{ij}$ if $j>i$ and $c_{ij}=0$ if $j<i$.
On the diagonal, define $c_{ii}=0$ if $a_{ii}\neq 0$, and $c_{ii}=1$ if $a_{ii}=0$.
Then, by construction, $A-C$ is an invertible lower triangular matrix (its diagonal entries are either $a_{ii}\neq 0$ or $-1$), and $C$ is an upper triangular matrix.
Then
$$0 \neq \det(A-C)=\det(A)+(-1)^n\det(C)=\pm\, c_{11}c_{22}\cdots c_{nn}.$$
Since this product is nonzero, every $c_{ii}\neq 0$, i.e. $c_{ii}=1$, which by construction means $a_{ii}=0$ for all $i$.
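To make the construction concrete, here is a small sympy sketch of the matrix $C$ from Claim 2 for a sample $A$ (the entries of $A$ are arbitrary illustrative values):

```python
# Build C from A as in Claim 2 and check the structural facts used in the proof.
from sympy import Matrix, zeros

A = Matrix([[0, 7, -2],
            [4, 5,  1],
            [9, 3,  0]])             # sample entries, with a_11 = a_33 = 0
n = A.rows

C = zeros(n, n)
for i in range(n):
    for j in range(i + 1, n):
        C[i, j] = A[i, j]            # strict upper part of A
    C[i, i] = 1 if A[i, i] == 0 else 0

L = A - C
assert L.is_lower and L.det() != 0   # A - C is invertible lower triangular
assert C.is_upper                    # det(C) = c_11 * ... * c_nn
```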
Claim 3 $A=0$.
Proof: Let $C$ be any invertible square matrix. Now, for any square matrix $B$ we have
$$\det(CA+B)=\det(C) [\det (A+C^{-1}B)]=\det(C) [\det (A)+\det(C^{-1}B)]= \det(C) \det (A)+ \det(C) \det(C^{-1}B)=\det(CA)+\det(B)$$
This proves that $CA$ has the same property for every invertible $C$, and thus, by Claim 2 applied to $CA$, all the diagonal entries of $CA$ are $0$. (The same can be proven for $AC$.)
To complete the proof, all that remains is the following lemma, which is easy (trivial if you know permutation or elementary matrices):
Lemma Let $A$ be a matrix so that for all invertible matrices $C$, the diagonal entries of $CA$ are all $0$. Then $A=0$.
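For what it's worth, here is a small sympy sketch of the elementary-matrix route to the lemma (the size $n=3$ and the indices $i,j$ are illustrative): with $C=I+E_{ji}$ ($i\neq j$), which is invertible, the $(j,j)$ entry of $CA$ equals $a_{jj}+a_{ij}$; together with the case $C=I$ (which gives $a_{jj}=0$), this forces every entry of $A$ to vanish.

```python
# The (j, j) entry of (I + E_{ji}) * A is a_jj + a_ij, checked symbolically.
from sympy import MatrixSymbol, Matrix, eye, zeros

n = 3
A = Matrix(MatrixSymbol('a', n, n))  # generic symbolic n x n matrix

i, j = 0, 2                          # any pair of indices with i != j
E = zeros(n, n)
E[j, i] = 1                          # elementary matrix E_{ji}
C = eye(n) + E                       # invertible (det = 1)

assert (C * A)[j, j] == A[j, j] + A[i, j]
```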
Second proof Let $J$ be the Jordan canonical form of $A$. Then, exactly as in Claim 3, $J$ has the same property.
Let $\lambda_1,\ldots,\lambda_n$ be the eigenvalues of $A$ and let $D$ be any diagonal matrix with entries $x_1,\ldots,x_n$.
Then $\det(J+D)=\det(J)+\det(D)$, together with the fact that $J+D$ is upper triangular, yields
$$(\lambda_1+x_1)\cdots(\lambda_n+x_n)=\lambda_1\cdots\lambda_n+x_1\cdots x_n$$
as an identity in ${\mathbb C}[x_1,\ldots,x_n]$, since it holds for all values of the $x_i$. It is easy to conclude from here that $\lambda_1=\cdots=\lambda_n=0$ (compare, for instance, the coefficient of $x_2\cdots x_n$, which gives $\lambda_1=0$, and similarly for the other $\lambda_i$), especially since we already know that $\det(A)=0$.
This shows that either $J=0$, or $J$ has some nonzero entries, all of which appear on the superdiagonal (the diagonal just above the main diagonal). But in the latter case, by adding a matrix $B$ with at most $n-1$ nonzero entries (fill in the missing superdiagonal entries and the lower-left corner to obtain a cyclic permutation matrix), we can make $J+B$ invertible, while $\det(J)=\det(B)=0$ since each of $J$ and $B$ has a zero row; this contradicts $\det(J+B)=\det(J)+\det(B)$.
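A quick symbolic check of the polynomial identity step above, for $n=3$ (a sketch with sympy; the case $n=3$ is illustrative only):

```python
# The identity (l1+x1)(l2+x2)(l3+x3) = l1*l2*l3 + x1*x2*x3 in C[x1,x2,x3]
# forces l1 = l2 = l3 = 0: compare coefficients of x2*x3, x1*x3, x1*x2.
from sympy import symbols, expand

l1, l2, l3 = symbols('lambda1 lambda2 lambda3')
x1, x2, x3 = symbols('x1 x2 x3')

lhs = expand((l1 + x1) * (l2 + x2) * (l3 + x3))
rhs = l1 * l2 * l3 + x1 * x2 * x3
diff = expand(lhs - rhs)

# Each of these coefficients must vanish for the identity to hold identically.
assert diff.coeff(x2 * x3) == l1
assert diff.coeff(x1 * x3) == l2
assert diff.coeff(x1 * x2) == l3
```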