Latest content was relocated to https://bintanvictor.wordpress.com. This old blog will be shut down soon.

Friday, January 18, 2013

eigenvector, linear transform - learning notes

To keep things simple and concrete, let's limit ourselves to square matrices up to 3D.

I'm no expert on linear transforms (LinT). I feel a LinT is about mapping a column vector (actually ANY column vector in a 2D space) to another vector in another 2D space. Now, there are many (UNLIMITED, actually) such 2D mapping functions. Each _specific_ mapping function can be characterized by a _specific_ matrix multiplying on the LHS. MxV again!
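The "one matrix = one specific mapping function" idea can be sketched in numpy. The matrix below is made up purely for illustration; any 2x2 matrix would define its own mapping.

```python
import numpy as np

# A hypothetical 2x2 matrix -- one _specific_ mapping function.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

v = np.array([1.0, 1.0])   # any column vector from the 2D space

w = A @ v                  # MxV: the matrix maps v to another vector
print(w)                   # -> [3. 3.]
```

Feed a different v into the same A and you get a different output, but the mapping rule itself (the matrix) never changes.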

Now, the eigenvector is about characterizing such a matrix. Suppose we are analysing one such matrix. The matrix accepts ANY column vector (from the space) and transforms it. Again there are an UNLIMITED number of input vectors, but someone noticed that one input vector (among a few) is special to this particular matrix. It goes into the transform and comes out perfectly scaled. Suppose this special input vector is (2,1,3); it comes out as (20,10,30). The scaling factor (10 in this case) is the eigenvalue corresponding to that eigenvector.

Let's stop for a moment. This is rare. Most input vectors don't come out perfectly scaled - they get linearly transformed but not perfectly scaled. This particular input vector (and any scaled version of it) is special to this matrix. It helps to characterize the matrix.

It turns out that a 3D square matrix has up to 3 linearly independent such special vectors -- the eigenvectors of the matrix.

The set of all eigenvectors of a matrix, each paired with its corresponding eigenvalue, is called the eigensystem of that linear transform.
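In numpy, `np.linalg.eig` returns this whole eigensystem in one call. The 3x3 matrix below is made up (symmetric, so all 3 eigenvalues are real):

```python
import numpy as np

# A hypothetical symmetric 3x3 matrix.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# eigenvectors[:, i] is the column eigenvector paired with eigenvalues[i]
for i in range(3):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, lam * v)   # A v = lambda v for each pair
```

Each (eigenvalue, eigenvector) pair satisfies the same "perfectly scaled" property from the earlier example.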

Instead of "eigen", the terms "characteristic vector" and "characteristic value" are also used for these concepts.

(The vector (2,1,3) above should be written as a column vector, actually.)