Here is simple C code using Jacobi's eigenvalue algorithm to find eigenvectors and eigenvalues of real symmetric matrices. A prototype:
void eigen(int n, R m[n][n], R l[n], R vc[n][n]);
R may be double or float. m is the matrix. The caller provides the space for the answers, which go into two arrays: the eigenvalues in l and the corresponding eigenvectors in vc.

The main routine at the end is merely a sample invocation and test case.
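For orientation only, here is a minimal sketch of such a call, assuming R is defined as double and using a made-up 2×2 test matrix; it is not the actual main routine from the code, and it must be linked against the eigen implementation.

#include <stdio.h>

typedef double R;                        /* assuming R is double here */
void eigen(int n, R m[n][n], R l[n], R vc[n][n]);

int main(void) {
    enum { N = 2 };
    R m[N][N] = { { 2, 1 },              /* small symmetric test matrix */
                  { 1, 2 } };
    R l[N];                              /* eigenvalues returned here */
    R vc[N][N];                          /* eigenvectors returned here */
    eigen(N, m, l, vc);
    for (int i = 0; i < N; i++)
        printf("lambda = %g\n", l[i]);   /* expect 1 and 3, in some order */
    return 0;
}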


An Algol 68 transcription
This is a set of OCaml matrix routines including eigenvalues.
For eigenvalues of general complex matrices see ideas here, together with the notion of dual subspaces as computed here. Find an eigenvalue λ via a polynomial root; the rows of M−λI span a space whose dual we take, and that dual is spanned by the eigenvectors corresponding to λ. I think.

In more detail: Given a complex n×n matrix M, evaluate the determinant of M−λI at n+1 distinct values of λ. Find the unique degree-n polynomial in λ taking those values; it is the characteristic polynomial. Find the roots of that polynomial. These are the eigenvalues. Collect them into equivalence classes by some ε threshold. For each eigenvalue λ the rows of M−λI span a subspace of ℂⁿ. The rank deficiency of M−λI, that is n minus its rank, is the degeneracy of the eigenvalue. The dual of that row space, the null space of M−λI, is the subspace of all eigenvectors for the given eigenvalue.
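A rough sketch of the first two steps, sampling det(M−λI) and interpolating, assuming C99 complex arithmetic; the helper names cdet and char_poly_newton are mine, not from the code above, and the caller supplies the n+1 distinct sample points.

#include <complex.h>
#include <stdlib.h>

/* Determinant of an n x n complex matrix by Gaussian elimination with
   partial pivoting.  The matrix a is overwritten. */
static double complex cdet(int n, double complex a[n][n]) {
    double complex det = 1;
    for (int k = 0; k < n; k++) {
        int p = k;                               /* pick the pivot row */
        for (int i = k + 1; i < n; i++)
            if (cabs(a[i][k]) > cabs(a[p][k])) p = i;
        if (cabs(a[p][k]) == 0) return 0;        /* singular */
        if (p != k) {                            /* swap rows, flip sign */
            for (int j = 0; j < n; j++) {
                double complex t = a[k][j]; a[k][j] = a[p][j]; a[p][j] = t;
            }
            det = -det;
        }
        det *= a[k][k];
        for (int i = k + 1; i < n; i++) {        /* eliminate below pivot */
            double complex f = a[i][k] / a[k][k];
            for (int j = k; j < n; j++) a[i][j] -= f * a[k][j];
        }
    }
    return det;
}

/* Sample det(M - x*I) at the n+1 distinct points x[0..n], then turn the
   samples into Newton divided differences in place: on return c[k] holds
   f[x0,...,xk], the coefficients of the Newton form of the characteristic
   polynomial. */
void char_poly_newton(int n, double complex m[n][n],
                      double complex x[n + 1], double complex c[n + 1]) {
    double complex (*w)[n] = malloc(sizeof(double complex[n][n]));
    for (int s = 0; s <= n; s++) {
        for (int i = 0; i < n; i++)              /* w = M - x[s] I */
            for (int j = 0; j < n; j++)
                w[i][j] = m[i][j] - (i == j ? x[s] : 0);
        c[s] = cdet(n, w);
    }
    free(w);
    for (int k = 1; k <= n; k++)                 /* divided differences */
        for (int s = n; s >= k; s--)
            c[s] = (c[s] - c[s - 1]) / (x[s] - x[s - k]);
}

The last coefficient c[n] is the n-th divided difference and should come out near (−1)ⁿ, the leading coefficient of det(M−λI), which makes a handy sanity check; root finding can then proceed on the Newton form.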

There is perhaps an iteration opportunity to improve accuracy. Once we have a guess for the roots, we can evaluate the determinant again at those points to compute a new polynomial.
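One concrete form such an iteration could take, offered purely as a sketch: treat p(λ) = det(M−λI) directly and polish each root guess with a few secant steps, re-evaluating the determinant rather than the interpolated polynomial. This reuses the illustrative cdet helper and the includes from the sketch above.

/* det(M - lam*I), built on the cdet sketch above */
static double complex det_shift(int n, double complex m[n][n],
                                double complex lam) {
    double complex (*w)[n] = malloc(sizeof(double complex[n][n]));
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            w[i][j] = m[i][j] - (i == j ? lam : 0);
    double complex d = cdet(n, w);
    free(w);
    return d;
}

/* Polish one root guess of p(lam) = det(M - lam I) with a few secant steps. */
double complex polish_root(int n, double complex m[n][n],
                           double complex lam, int steps) {
    double complex a = lam, b = lam + 1e-6;      /* two nearby start points */
    double complex fa = det_shift(n, m, a), fb = det_shift(n, m, b);
    for (int k = 0; k < steps && cabs(fb - fa) > 0; k++) {
        double complex c = b - fb * (b - a) / (fb - fa);   /* secant step */
        a = b; fa = fb;
        b = c; fb = det_shift(n, m, b);
    }
    return b;
}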


I recall here from memory some theory of projections in complex vector spaces. A projection E is a linear transformation from a complex vector space to itself where E² = E. An eigenvalue λ of E must thus satisfy λ² = λ and thus λ = 0 or 1. A diagonalizable transformation is a projection iff its spectrum ⊆ {0, 1}. Any normal transformation from a complex vector space to itself is a linear combination of projections. The coefficients are the eigenvalues, and each of the projections, E, is derived from a unit eigenvector, v, taken from an orthonormal eigenbasis, thus: Eᵢⱼ = vᵢ v̄ⱼ (the bar denoting complex conjugation).
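A small numeric check of that last formula, assuming v is normalized: build E with entries vᵢ v̄ⱼ and confirm that E·E agrees with E to rounding error. The particular vector here is arbitrary.

#include <complex.h>
#include <math.h>
#include <stdio.h>

int main(void) {
    enum { N = 3 };
    double complex v[N] = { 1, I, 1 + I };       /* arbitrary vector */
    double s = 0;
    for (int i = 0; i < N; i++) s += creal(v[i] * conj(v[i]));
    for (int i = 0; i < N; i++) v[i] /= sqrt(s); /* normalize v */

    double complex e[N][N];
    for (int i = 0; i < N; i++)                  /* E[i][j] = v[i] conj(v[j]) */
        for (int j = 0; j < N; j++)
            e[i][j] = v[i] * conj(v[j]);

    double worst = 0;                            /* compare E*E with E */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            double complex s2 = 0;
            for (int k = 0; k < N; k++) s2 += e[i][k] * e[k][j];
            double d = cabs(s2 - e[i][j]);
            if (d > worst) worst = d;
        }
    printf("max |(E*E - E)[i][j]| = %g\n", worst);   /* roughly machine epsilon */
    return 0;
}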

See Jordan Normal Form about eigen pathology.

Eigenvalues and vectors for lop-sided matrices.

Arbitrary complex square matrices?