Assume that a square matrix $B$ (real or complex, it does not matter) has an eigenvector $x$ with eigenvalue zero, namely $Bx = 0$ with $x \neq 0$. Therefore $B^{-1}$ does not exist (the kernel of $B$ has dimension greater than or equal to $1$). Define $A = \alpha I + B$, where $I$ is the identity and $\alpha$ is a number to be chosen such that $A^{-1}$ exists.
Is it true, in general, that $A^{-1}x = \alpha^{-1}x$? I believe the statement is true: since $Bx = 0$, we have $Ax = \alpha x + Bx = \alpha x$, and therefore $$x = A^{-1}(Ax) = A^{-1}(\alpha x) = \alpha \left( A^{-1}x \right) \, ,$$ so dividing by $\alpha$ (which is nonzero, because $\alpha = 0$ would give $A = B$, which is not invertible) yields $A^{-1}x = \alpha^{-1}x$.
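As a quick sanity check, here is a minimal numerical sketch (the matrix $B$, the vector $x$, and the value of $\alpha$ are assumed example data, not taken from any particular source): it builds a singular $B$ with $Bx = 0$, forms $A = \alpha I + B$, and compares $A^{-1}x$ with $\alpha^{-1}x$.

```python
import numpy as np

# Assumed example: a singular B whose first two columns cancel on x, so B @ x = 0.
x = np.array([1.0, 1.0, 0.0])
B = np.array([[1.0, -1.0, 2.0],
              [2.0, -2.0, 0.0],
              [3.0, -3.0, 1.0]])
assert np.allclose(B @ x, 0.0)

alpha = 0.7                      # chosen so that -alpha is not an eigenvalue of B
A = alpha * np.eye(3) + B

lhs = np.linalg.solve(A, x)      # A^{-1} x, computed without forming the inverse
rhs = x / alpha                  # alpha^{-1} x
print(np.allclose(lhs, rhs))     # expected: True
```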
Now, just out of curiosity: is it possible to find a more "explicit/direct" proof based on some known structural properties of $A^{-1}$?
Note: it is always possible to find a whole interval of values of $\alpha$ such that $A^{-1}$ exists; see this question about a slightly different case (here $\alpha$ is added only on the diagonal of $B$, not to every entry).
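To illustrate the note, here is a small sketch (again with an assumed example matrix $B$, reused from the snippet above): $A = \alpha I + B$ is singular exactly when $-\alpha$ is one of the finitely many eigenvalues of $B$, so any interval of $\alpha$ avoiding those values gives an invertible $A$.

```python
import numpy as np

# Same assumed singular B as in the previous snippet.
B = np.array([[1.0, -1.0, 2.0],
              [2.0, -2.0, 0.0],
              [3.0, -3.0, 1.0]])

# A = alpha*I + B fails to be invertible only at these finitely many values.
print("excluded values of alpha:", -np.linalg.eigvals(B))

# Sampling alpha over an interval that avoids the excluded values.
for alpha in np.linspace(0.1, 1.0, 5):
    A = alpha * np.eye(3) + B
    print(f"alpha = {alpha:.3f}, det(A) = {np.linalg.det(A):.4f}")
```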