Eigenvalues and Eigenvectors
Matrices are characterized by their eigenvalues and eigenvectors.
Introduction
The square matrix A is a mapping of vectors. There are certain input vectors that map to a scalar multiple of themselves, i.e. A: a -> λa.
Another way to think of this is that there are certain input vectors which maintain their direction through a transformation. For vectors in these directions, the transformation only involves some stretching factor λ.
These certain input vectors are called eigenvectors. Each eigenvector has a corresponding scaling factor λ, called an eigenvalue.
Consider the matrix representing rotation about the y-axis:
┌                     ┐
│  cos(θ)   0  sin(θ) │
│    0      1    0    │
│ -sin(θ)   0  cos(θ) │
└                     ┘
Given this as A, the vector [0 1 0] (and any scalar multiple of it) maps to itself. The members of this infinite set of vectors are all eigenvectors of A. (And because there is no scaling, their corresponding eigenvalues λ are all 1.)
Eigenvectors and eigenvalues often involve complex numbers, even for real-valued matrices.
julia> using LinearAlgebra
julia> A = [0 0 1; 0 1 0; -1 0 0]
3×3 Matrix{Int64}:
0 0 1
0 1 0
-1 0 0
julia> eigvals(A)
3-element Vector{ComplexF64}:
0.0 - 1.0im
0.0 + 1.0im
1.0 + 0.0im
julia> eigvecs(A)
3×3 Matrix{ComplexF64}:
0.707107-0.0im 0.707107+0.0im 0.0+0.0im
0.0-0.0im 0.0+0.0im 1.0+0.0im
0.0-0.707107im 0.0+0.707107im 0.0+0.0im

Note that in the above, eigenvectors are returned as an eigenvector matrix. This is usually notated as S.
Description
Eigenvalues and eigenvectors are the pairs of λ and x (with x ≠ 0) that satisfy Ax = λx, or equivalently |A - λI| = 0.
Given a matrix of size n x n, either there are n linearly independent eigenvectors (each paired with an eigenvalue), or the matrix is defective.
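Both defining relations can be checked numerically. The sketch below, using Julia's LinearAlgebra standard library with an arbitrary sample matrix, verifies Ax = λx for each eigenpair and |A - λI| = 0 for each eigenvalue:

```julia
using LinearAlgebra

# Arbitrary sample matrix (symmetric, so its eigenpairs are real).
A = [2.0 1.0; 1.0 2.0]
vals, vecs = eigen(A)          # vals[i] pairs with the column vecs[:, i]

for i in 1:length(vals)
    λ = vals[i]
    x = vecs[:, i]
    @assert A * x ≈ λ * x      # the defining relation Ax = λx
end

for λ in vals
    # |A - λI| = 0, up to floating-point error
    @assert isapprox(det(A - λ * I), 0; atol = 1e-10)
end
```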
Properties
Only square matrices have eigenvectors.
Adding nI to A does not change its eigenvectors and adds n to the eigenvalues.
The trace is the sum of the eigenvalues. The determinant is the product of the eigenvalues.
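These properties are easy to sanity-check in Julia. The sample matrix and shift amount below are arbitrary:

```julia
using LinearAlgebra

A = [4.0 1.0; 2.0 3.0]   # arbitrary sample matrix
λ = eigvals(A)

@assert tr(A) ≈ sum(λ)    # trace = sum of eigenvalues
@assert det(A) ≈ prod(λ)  # determinant = product of eigenvalues

# Adding nI shifts every eigenvalue by n (and keeps the eigenvectors).
n = 5.0
@assert sort(eigvals(A + n * I)) ≈ sort(λ .+ n)
```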
A diagonal matrix is a trivial case because...
- the standard basis vectors are its eigenvectors
- the numbers in the diagonal are its eigenvalues
This also means that any diagonalizable matrix of size n x n has n linearly independent eigenvectors, and cannot be defective.
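The trivial diagonal case can be confirmed directly (the diagonal entries below are arbitrary):

```julia
using LinearAlgebra

D = Diagonal([3.0, 5.0, 7.0])

# The eigenvalues are the diagonal entries...
@assert eigvals(D) == [3.0, 5.0, 7.0]

# ...and each standard basis vector is an eigenvector.
e1 = [1.0, 0.0, 0.0]
@assert D * e1 == 3.0 * e1
```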
Calculation
Conventional Method
Because eigenvalues are characterized by |A - λI| = 0, they can be solved for by:
- subtracting λ from each value on the diagonal
- formulating the determinant of this difference
- setting the determinant equal to 0
- solving for λ
In a simple 2 x 2 matrix, this looks like:
| A - λI | = 0

│ ┌    ┐   ┌    ┐ │
│ │ a b│ - │ λ 0│ │ = 0
│ │ c d│   │ 0 λ│ │
│ └    ┘   └    ┘ │

│ ┌        ┐ │
│ │ a-λ  b │ │ = 0
│ │ c   d-λ│ │
│ └        ┘ │

(a-λ)(d-λ) - bc = 0
This leads to the characteristic polynomial of A; solving for the roots, as through either factorization or the quadratic formula, gives the eigenvalues.
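For the 2 x 2 case, solving the characteristic polynomial with the quadratic formula can be sketched as below. `eigvals2x2` is a hypothetical helper, not a library function:

```julia
# Eigenvalues of [a b; c d] from the characteristic polynomial
# (a-λ)(d-λ) - bc = λ² - (a+d)λ + (ad-bc) = 0, via the quadratic formula.
function eigvals2x2(a, b, c, d)
    s = a + d                        # trace: sum of the eigenvalues
    p = a * d - b * c                # determinant: product of the eigenvalues
    disc = sqrt(Complex(s^2 - 4p))   # Complex() handles negative discriminants
    ((s - disc) / 2, (s + disc) / 2)
end

eigvals2x2(4.0, 1.0, 2.0, 3.0)   # roots of λ² - 7λ + 10: λ = 2 and λ = 5
```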
Because eigenvectors are characterized by...
Ax = λx
Ax - λx = 0
(A - λI)x = 0
...eigenvectors can be solved for given eigenvalues using substitution.
In a simple 2 x 2 matrix, this looks like:
( A - λI ) x = 0

/ ┌    ┐   ┌    ┐ \ ┌  ┐   ┌  ┐
│ │ a b│ - │ λ 0│ │ │ u│ = │ 0│
\ │ c d│   │ 0 λ│ / │ v│   │ 0│
  └    ┘   └    ┘   └  ┘   └  ┘

┌        ┐ ┌  ┐   ┌  ┐
│ a-λ  b │ │ u│ = │ 0│
│ c   d-λ│ │ v│   │ 0│
└        ┘ └  ┘   └  ┘

(a-λ)u + (b)v = 0
(c)u + (d-λ)v = 0
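Equivalently, an eigenvector for a known λ is any nonzero vector in the nullspace of A - λI. A sketch using LinearAlgebra's `nullspace`, with an arbitrary sample matrix:

```julia
using LinearAlgebra

A = [4.0 1.0; 2.0 3.0]   # arbitrary sample matrix
λ = 5.0                  # one of its eigenvalues (trace 7, det 10 → λ = 2, 5)

# Columns of nullspace(A - λI) span the solutions of (A - λI)x = 0.
x = nullspace(A - λ * I)[:, 1]

@assert A * x ≈ λ * x    # confirms x is an eigenvector for λ
```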
Shortcut
This method is only applicable to 2 x 2 matrices.
- Because the trace is equal to the sum of the eigenvalues, 1/2 of the trace is the mean of the eigenvalues.
- Because the characteristic polynomial must be quadratic, the eigenvalues must be evenly spaced about this mean. Given the above mean as m and an unknown distance as d, the eigenvalues must be (m-d) and (m+d).
- Because the determinant is the product of the eigenvalues, |A| = (m-d)(m+d) = m² - d². This can be solved for d.
Altogether, the eigenvalues are m ± d, i.e. λ = (trace/2) ± √((trace/2)² - |A|).
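The shortcut can be sketched as below. `shortcut_eigvals` is a hypothetical helper, not a library function:

```julia
using LinearAlgebra

# 2×2 shortcut: m = trace/2 is the mean of the eigenvalues,
# and the eigenvalues sit at m ± d with d = sqrt(m² - det(A)).
function shortcut_eigvals(A)
    m = tr(A) / 2
    d = sqrt(Complex(m^2 - det(A)))   # Complex() allows complex eigenvalues
    (m - d, m + d)
end

shortcut_eigvals([4.0 1.0; 2.0 3.0])   # eigenvalues 2 and 5
```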
