The computation of eigenvalues and eigenvectors can serve many purposes; however, when it comes to differential equations eigenvalues and eigenvectors are most often used to find straight-line solutions of linear systems.
To find eigenvalues, we use the defining equation:
`A vec(v) = lambda vec(v)`
where `A = ((a,b), (c,d))` and `vec(v) = ((x),(y))`
`((a,b), (c,d))((x),(y)) = lambda ((x),(y))`, which can be written in components as
`ax + by = lambda x`
`cx + dy = lambda y`
We want to solve for non-zero solutions; moving all terms to the left-hand side, the system becomes
`(a- lambda)x + by=0`
`cx + (d-lambda)y =0`
A homogeneous system like this always has the zero solution; it has a non-zero solution only when the determinant of its coefficient matrix is zero (we can prove that when the determinant of a matrix A is non-zero, the only equilibrium point of the linear system is the origin). So to solve the system above we take the determinant and set it equal to zero.
`det ((a-lambda,b), (c, d-lambda))= 0`
This is the computation we use every time we find eigenvalues and eigenvectors; it can also be written as `det(A - lambda I) = 0`, where I is the identity matrix `I = ((1, 0), (0, 1))`. Computing `det(A - lambda I) = 0` leads to the characteristic polynomial, whose roots are the eigenvalues of the matrix A.
`det(A - lambda I) = det ((a-lambda, b), (c, d-lambda)) = (a-lambda)(d-lambda)-bc=0`, which expands to the quadratic polynomial
`lambda^(2) - (a+d)lambda +(ad-bc)=0.`
This is referred to as the characteristic polynomial. For a 2 × 2 matrix it is quadratic, so it always has two roots. These roots can be real or complex, and they do not have to be distinct. If the roots are complex we say that the matrix has complex eigenvalues; otherwise, we say that the matrix has real eigenvalues.
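The trace-and-determinant form of the characteristic polynomial is easy to sketch in code. Below is a minimal Python example (NumPy and the helper name `eigenvalues_2x2` are our assumptions, not from the text) that builds the quadratic from `a + d` and `ad - bc` and solves it with the quadratic formula, using a sample matrix:

```python
import numpy as np

def eigenvalues_2x2(A):
    """Roots of lambda^2 - trace(A)*lambda + det(A), via the quadratic formula."""
    trace = A[0, 0] + A[1, 1]                     # a + d
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # ad - bc
    disc = trace**2 - 4 * det
    # A negative discriminant means the matrix has complex eigenvalues.
    root = np.sqrt(complex(disc))
    return (trace + root) / 2, (trace - root) / 2

lam1, lam2 = eigenvalues_2x2(np.array([[2.0, 2.0], [1.0, 3.0]]))
```

For a matrix with a negative discriminant the same helper returns a complex conjugate pair instead of two real roots.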
Here are examples of how to solve for both kinds of eigenvalues. Let's begin with an example where we compute real eigenvalues.
Suppose we have the matrix:
`A = ((5,4),(3,2))`
`det(A - lambda I)= det ((5-lambda, 4), (3, 2-lambda))=(5-lambda)(2-lambda)-4*3=0`
`(5-lambda)(2-lambda)-12 = lambda^2 - 7lambda - 2 = 0`
The roots are:
`lambda = frac(7 pm sqrt(49+8))(2) = frac(7 pm sqrt(57))(2)`
`lambda_(1) = frac(7 + sqrt(57))(2) approx 7.27, lambda_(2) = frac(7 - sqrt(57))(2) approx -0.27`
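Hand computations like this are easy to check numerically. A short sketch (assuming NumPy is available): whatever `np.linalg.eigvals` returns, each eigenvalue must be a root of the characteristic polynomial `lambda^2 - (a+d)lambda + (ad-bc)`:

```python
import numpy as np

A = np.array([[5.0, 4.0], [3.0, 2.0]])
eigs = np.linalg.eigvals(A)      # numerical eigenvalues of A

# Each eigenvalue must satisfy lambda^2 - trace(A)*lambda + det(A) = 0.
tr = np.trace(A)                 # a + d = 7
det = np.linalg.det(A)           # ad - bc = 10 - 12 = -2
residuals = eigs**2 - tr * eigs + det
```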
Now we will compute complex eigenvalues:
Before we start, we should review what a complex number is.
"Complex numbers are numbers of the form x + iy, where x and y are real numbers and i is the 'imaginary number' `sqrt(-1)` " (Blanchard, Devaney, Hall, 291).
Consider the system where A = `((-2, -3), (3, -2))`
`det(A-lambda I) = det ((-2-lambda, -3),(3, -2-lambda)) = (-2-lambda)(-2-lambda)-(-3*3)=lambda^2+4 lambda +13 =0.`
The roots are:
`lambda = frac(-4 pm sqrt(-36))(2)`
We see that `sqrt(-36)` is equal to `6i`, so the eigenvalues become:
`lambda = frac(-4 pm 6i)(2) = -2 pm 3i`
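The same numerical check works when the eigenvalues are complex; for a real matrix like this one, `np.linalg.eigvals` returns a complex conjugate pair (again assuming NumPy is available):

```python
import numpy as np

A = np.array([[-2.0, -3.0], [3.0, -2.0]])
eigs = np.linalg.eigvals(A)   # complex conjugate pair -2 + 3i and -2 - 3i

# Sort by imaginary part so the ordering is deterministic.
lam_minus, lam_plus = sorted(eigs, key=lambda z: z.imag)
```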
Given a matrix `A = ((a,b), (c,d))` and a known eigenvalue `lambda`, we use the same equation from above, `A vec(v) = lambda vec(v)`, to solve for an eigenvector `vec(v) = ((x), (y))`. We notice that `A vec(v) = lambda vec(v)` turns into a system of linear equations:
`ax + by = lambda x`
`cx + dy = lambda y`
Because we have already solved for lambda, "we know that there is at least an entire line of eigenvectors (x, y) that satisfy this system of equations. This infinite number of eigenvectors means that the equations are redundant. That is, either the two equations are equivalent, or one of the equations is always satisfied" (Blanchard, Devaney, Hall, 266).
We will give an example to demonstrate what is meant by the statement above:
Suppose the matrix A = `((2, 2),(1,3))`
`det(A-lambda I) = (2-lambda)(3-lambda)-(2*1)=0`
`lambda^2 - 5 lambda + 4 = (lambda - 4)(lambda - 1) = 0`
`lambda_(1) = 4, lambda_(2) = 1`
Let's use `lambda_(2) = 1` in the equation:
`A((x),(y))= ((2, 2),(1,3)) ((x),(y)) = 1((x),(y))`
Rewritten in terms of components, the equation becomes
`2x + 2y = x`
`1x + 3y = y`
or
`x+2y=0`
`x+2y=0`
Any pair with `y = frac(-1)(2) x` satisfies both equations, so an eigenvector for `lambda_(2) = 1` is `vec(v) = ((1), (frac(-1)(2)))`
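An eigenvector found by hand can be verified directly from the defining equation `A vec(v) = lambda vec(v)`; a minimal sketch (NumPy assumed):

```python
import numpy as np

A = np.array([[2.0, 2.0], [1.0, 3.0]])
lam = 1.0                      # the eigenvalue lambda_2 from above
v = np.array([1.0, -0.5])      # candidate eigenvector (1, -1/2)

# v is an eigenvector for lam exactly when this residual is the zero vector.
residual = A @ v - lam * v
```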
Now let's view an example where there are complex eigenvalues and a complex eigenvector:
Let's begin where we left off in the example from before where A = `((-2, -3), (3, -2))`
We found that the eigenvalues were `lambda_(1) = -2 + 3i, lambda_(2) = -2 - 3i`
Let's take `lambda_(1)` and plug it into the equation,
`A((x),(y))= ((-2, -3),(3, -2)) ((x),(y)) = (-2+3i)((x),(y))`
As a system of equations we have
`-2x - 3y = (-2 + 3i)x`
`3x - 2y = (-2 + 3i)y `
which can be rewritten as
`(-3i)x - 3y = 0`
`3x + (-3i)y = 0 .`
Just as in the example above, the equations are redundant. From the second equation, `3x = (3i)y`, so `x = iy`, i.e. `y = (-i)x`, and we may take `vec(v) = ((1), (-i))`
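A numerical routine such as `np.linalg.eig` returns unit-length eigenvectors, so they may differ from a hand-derived eigenvector by a (possibly complex) scalar factor; since any non-zero scalar multiple of an eigenvector is again an eigenvector, the property to check is always `A vec(v) = lambda vec(v)`. A sketch (NumPy assumed):

```python
import numpy as np

A = np.array([[-2.0, -3.0], [3.0, -2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Column k of `eigenvectors` pairs with eigenvalues[k]; each pair must
# satisfy A v = lambda v no matter how the vector happens to be scaled.
residuals = [A @ eigenvectors[:, k] - eigenvalues[k] * eigenvectors[:, k]
             for k in range(2)]
```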
https://youtu.be/bOreOaAjDno
http://tutorial.math.lamar.edu/Classes/DE/LA_Eigen.aspx
https://www.khanacademy.org/math/linear-algebra/alternate-bases/eigen-everything/v/linear-algebra-introduction-to-eigenvalues-and-eigenvectors