
 4.5 Eigenvectors and Eigenvalues

 If we multiply a square matrix by a vector, we get another vector that is transformed from its original position. Eigenvectors arise from the nature of that transformation. Imagine a transformation matrix that, when multiplied on the left, reflects vectors in the line y = x. Any vector that lies on the line y = x is its own reflection. That vector (and every multiple of it, because the length of the vector does not matter) is an eigenvector of the transformation matrix. Eigenvectors can only be found for square matrices, and not every square matrix has real eigenvectors. An n x n matrix that does have them has at most n linearly independent eigenvectors. Another property of eigenvectors is that even if we scale the vector by some amount before we multiply it, we still get the same multiple of it as a result. This is because scaling a vector only changes its length, not its direction.
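As a small sketch of the reflection example above (using numpy, which is an assumption of this illustration, not something the original text prescribes): the matrix [[0, 1], [1, 0]] reflects vectors in the line y = x, and a vector on that line is mapped onto itself, regardless of how it is scaled first.

```python
import numpy as np

# Reflection in the line y = x: this matrix swaps the x and y coordinates.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# A vector lying on y = x is its own reflection: A @ v equals v,
# so v is an eigenvector with eigenvalue 1.
v = np.array([1.0, 1.0])
print(A @ v)

# Scaling v before multiplying changes nothing but its length:
# A @ (3 * v) is just 3 times A @ v, the same multiple as before.
print(A @ (3.0 * v))

# numpy can compute all eigenpairs at once; the eigenvalues of this
# reflection are 1 (vectors on the line) and -1 (vectors across it).
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))
```

The eigenvalue -1 belongs to the direction perpendicular to y = x, which the reflection flips onto its own negative.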

 

Lastly, the eigenvectors of a symmetric matrix (such as the covariance matrices used in this work) are perpendicular, i.e. at right angles to each other, no matter how many dimensions you have. By the way, another word for perpendicular, in math talk, is orthogonal. This is important because it means that we can express the data in terms of these perpendicular eigenvectors instead of in terms of the x and y axes. Every eigenvector has a value associated with it, called its eigenvalue. The principal eigenvectors are those with the highest eigenvalues associated with them.
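The orthogonality and the notion of a principal eigenvector can be sketched as follows (a minimal illustration assuming numpy; the matrix values are made up, standing in for a small covariance matrix):

```python
import numpy as np

# A small symmetric matrix standing in for a covariance matrix.
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is numpy's routine for symmetric matrices; it returns the
# eigenvalues in ascending order and the eigenvectors as columns.
vals, vecs = np.linalg.eigh(C)

# The eigenvectors are orthogonal: their dot product is zero.
print(vecs[:, 0] @ vecs[:, 1])

# The principal eigenvector is the one paired with the largest eigenvalue.
principal = vecs[:, np.argmax(vals)]
print(vals, principal)
```

Sorting the eigenvectors by their eigenvalues in this way is exactly how the most significant directions in the data are chosen later, when the data is re-expressed along the eigenvectors instead of the original axes.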