How can the eigenvector corresponding to a zero eigenvalue be found? I was trying with the following simple matrix in Matlab:
$$A=\left[\begin{array}{ccc}1 & -2 & 3 \\ 2 & -3 & 4 \\ 3 & -4 & 5 \end{array}\right] \; .$$
In Matlab's computation, the matrix seemed nearly singular, with one of the eigenvalues very close to zero (about 3e-15). That suggests the usual shifted inverse power method for finding the unit eigenvector corresponding to an eigenvalue won't work. But Matlab returns an eigenvector corresponding to 0. How? Basically, I would like to write a program that computes this eigenvector for any singular matrix. What algorithm should I use?
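(An aside on the premise: shifted inverse iteration is usually still viable here. Even though $A-\sigma I$ is badly conditioned for a shift $\sigma$ near the eigenvalue, the solver's error lies mostly along the wanted eigenvector, so the iteration still converges to it. Below is a minimal pure-Python sketch; the function names, the shift $10^{-8}$, the starting vector, and the step count are illustrative choices, not what Matlab actually does.)

```python
def solve(M, b):
    """Solve M x = b by Gaussian elimination with partial pivoting (floats)."""
    n = len(M)
    M = [row[:] + [bi] for row, bi in zip(M, b)]  # augmented matrix, copied
    for c in range(n):
        p = max(range(c, n), key=lambda i: abs(M[i][c]))  # partial pivoting
        M[c], M[p] = M[p], M[c]
        for i in range(c + 1, n):
            t = M[i][c] / M[c][c]
            for j in range(c, n + 1):
                M[i][j] -= t * M[c][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def inverse_iteration(A, shift=1e-8, steps=5):
    """Approximate an eigenvector for the eigenvalue of A nearest `shift`."""
    n = len(A)
    B = [[A[i][j] - (shift if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    v = [1.0] + [0.0] * (n - 1)  # arbitrary nonzero starting vector
    for _ in range(steps):
        v = solve(B, v)                      # amplify the nearby eigenvector
        norm = max(abs(t) for t in v)
        v = [t / norm for t in v]            # rescale to max-norm 1
    return v

A = [[1.0, -2.0, 3.0], [2.0, -3.0, 4.0], [3.0, -4.0, 5.0]]
print(inverse_iteration(A))
```

For this matrix the iterate settles to a multiple of $(1,2,1)$, a nullspace vector of $A$.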
Edit: (1) Edited to clarify that the 'nearly singular' observation referred to the Matlab calculation. (2) Edited to specify the actual question.
12 Answers
This matrix is singular: its determinant is zero, so it has an eigenvector for eigenvalue $0$. Nothing mysterious there; you might want to check the calculation that made you think it was only nearly singular.
As for how to find eigenvectors with eigenvalue $0$: They are just the solutions of the homogeneous system of linear equations corresponding to this matrix, $Ax=0$, so you can use e.g. Gaussian elimination.
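To make that suggestion concrete, here is a minimal sketch of Gauss-Jordan elimination that extracts a nullspace vector. It uses exact rational arithmetic via the standard-library `fractions` module, which sidesteps the floating-point issue from the question; the function name and interface are illustrative.

```python
from fractions import Fraction

def null_vector(A):
    """Return a nonzero x with A x = 0, or None if A is nonsingular.

    Gauss-Jordan elimination in exact rational arithmetic, so an exactly
    singular integer matrix cannot be mistaken for a nearly singular one.
    """
    n = len(A)
    M = [[Fraction(v) for v in row] for row in A]
    pivot_cols = []
    r = 0  # next pivot row
    for c in range(n):
        p = next((i for i in range(r, n) if M[i][c] != 0), None)
        if p is None:
            continue  # free column: no pivot here
        M[r], M[p] = M[p], M[r]
        M[r] = [v / M[r][c] for v in M[r]]           # scale pivot row to 1
        for i in range(n):
            if i != r and M[i][c] != 0:              # clear the rest of the column
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivot_cols.append(c)
        r += 1
    free = [c for c in range(n) if c not in pivot_cols]
    if not free:
        return None  # full rank: only the trivial solution
    x = [Fraction(0)] * n
    x[free[0]] = Fraction(1)                         # set one free variable to 1
    for row_i, c in enumerate(pivot_cols):
        x[c] = -M[row_i][free[0]]                    # read off the pivot variables
    return x

A = [[1, -2, 3], [2, -3, 4], [3, -4, 5]]
print(null_vector(A))  # -> [Fraction(1, 1), Fraction(2, 1), Fraction(1, 1)]
```

For the matrix in the question this returns $(1,2,1)$, an eigenvector for eigenvalue $0$.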
A matrix $A$ has eigenvalue $\lambda$ if and only if there exists a nonzero vector $\mathbf{x}$ such that $A\mathbf{x}=\lambda\mathbf{x}$. This is equivalent to the existence of a nonzero vector $\mathbf{x}$ such that $(A-\lambda I)\mathbf{x}=\mathbf{0}$. This is equivalent to the matrix $A-\lambda I$ having nontrivial nullspace, which in turn is equivalent to $A-\lambda I$ being singular (determinant equal to $0$).
In particular, $\lambda=0$ is an eigenvalue if and only if $\det(A)=0$. If the matrix is "nearly singular" but not actually singular, then $\lambda=0$ is not an eigenvalue.
As it happens, $$\begin{align*} \det(A) &= \left|\begin{array}{rr} -3 & \hphantom{-}4\\ -4 & 5 \end{array}\right| + 2\left|\begin{array}{cc} 2 & 4\\ 3 & 5 \end{array}\right| + 3\left|\begin{array}{rr} \hphantom{-}2 & -3\\ 3 & -4 \end{array}\right|\\ &= \Bigl(-15 + 16\Bigr) + 2\Bigl( 10 - 12\Bigr) + 3\Bigl(-8+9\Bigr)\\ &= 1 - 4 + 3 = 0, \end{align*}$$ so the matrix is not "nearly singular", it is just plain singular.
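The same cofactor expansion can be checked in exact integer arithmetic, which avoids the $3\times 10^{-15}$ floating-point residue that made the matrix look only nearly singular. A small Python sketch (the function name is illustrative), expanding along the first row:

```python
def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[1, -2, 3], [2, -3, 4], [3, -4, 5]]
print(det3(A))  # -> 0, exactly
```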
The eigenvectors corresponding to $\lambda$ are found by solving the system $(A-\lambda I)\mathbf{x}=\mathbf{0}$. So, the eigenvectors corresponding to $\lambda=0$ are found by solving the system $(A-0I)\mathbf{x}=A\mathbf{x}=\mathbf{0}$. That is: solve $$\begin{array}{rcrcrcl} x & - & 2y & + & 3z & = & 0 \\ 2x & - & 3y & + & 4z & = & 0 \\ 3x & - & 4y & + & 5z & = & 0. \end{array}$$ The solutions (other than the trivial solution) are the eigenvectors. A basis for the solution space (the nullspace of $A$) is a basis for the eigenspace $E_{\lambda}$.
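Carrying the elimination out on this particular system:
$$R_2 - 2R_1:\quad y - 2z = 0, \qquad R_3 - 3R_1:\quad 2y - 4z = 0,$$
so both reduced rows give $y=2z$, and the first equation then gives $x = 2y - 3z = z$. Taking $z=1$ yields the eigenvector $(1,2,1)$; indeed
$$A\begin{pmatrix}1\\2\\1\end{pmatrix} = \begin{pmatrix}1-4+3\\ 2-6+4\\ 3-8+5\end{pmatrix} = \mathbf{0},$$
and $E_{0}$ is one-dimensional, spanned by $(1,2,1)$.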
Added. If you know a square matrix is singular, then finding eigenvectors corresponding to $0$ is equivalent to solving the corresponding system of linear equations. There are plenty of algorithms for doing that: Gaussian elimination, for instance (Wikipedia even has pseudocode for implementing it). If you want numerical stability, you can also use Grassmann's algorithm.