Eigenvalues and Eigenvectors
Matrices are characterized by their eigenvalues and eigenvectors.
Introduction
Ax is a linear transformation: A maps the input vector x to the output vector b.
For some linear transformations (i.e. some, but not all, matrices A), there are certain input vectors x where the output vector b is just a scaled version of the input vector x. In other words, A maps x to a scaled version of x. That scaling factor is notated λ.
For example, rotation around the y axis in 3 dimensions by θ degrees is calculated with:
┌                    ┐
│  cos(θ)  0  sin(θ) │
│    0     1    0    │
│ -sin(θ)  0  cos(θ) │
└                    ┘
The vector [0 1 0], and importantly any scalar multiple of that vector, is unchanged by this A. (And because the transformation involves no scaling of that vector, λ = 1.)
The certain input vectors are the eigenvectors of A. The scaling factors are the eigenvalues of A.
julia> using LinearAlgebra
julia> A = [0 0 1; 0 1 0; -1 0 0]
3×3 Matrix{Int64}:
0 0 1
0 1 0
-1 0 0
julia> eigvals(A)
3-element Vector{ComplexF64}:
0.0 - 1.0im
0.0 + 1.0im
1.0 + 0.0im
julia> eigvecs(A)
3×3 Matrix{ComplexF64}:
0.707107-0.0im 0.707107+0.0im 0.0+0.0im
0.0-0.0im 0.0+0.0im 1.0+0.0im
 0.0-0.707107im  0.0+0.707107im  0.0+0.0im

Note that the other two eigenvalue/eigenvector pairs are complex. Note also that the eigenvectors are returned as the columns of the eigenvector matrix, usually notated as S.
Definition
Eigenvalues and eigenvectors are the pairs of λ and x that satisfy Ax = λx and |A - λI| = 0.
Unless A is defective, there should be n unique pairs of eigenvectors and eigenvalues. If there is a repeated eigenvalue, there may not be n independent eigenvectors.
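A defective matrix can be seen concretely in a small sketch (plain Python; the 2 x 2 shear matrix and the `apply` helper are illustrative choices, not from the text above):

```python
# The shear matrix [[1, 1], [0, 1]] has the repeated eigenvalue 1,
# but (A - 1*I) = [[0, 1], [0, 0]] forces v = 0 in any eigenvector,
# so every eigenvector is a multiple of (1, 0): only one independent
# eigenvector exists, and the matrix is defective.

def apply(A, x):
    """Multiply a 2x2 matrix (list of rows) by a length-2 vector."""
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

A = [[1, 1],
     [0, 1]]

# (1, 0) is an eigenvector with eigenvalue 1 ...
print(apply(A, [1, 0]))   # [1, 0]
# ... but (0, 1) is not mapped to a multiple of itself:
print(apply(A, [0, 1]))   # [1, 1]
```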
Properties
Adding nI to A does not change its eigenvectors and adds n to the eigenvalues.
The sum of the eigenvalues is the trace (sum of diagonal). The product of the eigenvalues is the determinant.
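Both properties can be checked against the 3 x 3 Julia example above, whose eigenvalues were -i, i, and 1 (a plain-Python sketch; the trace and determinant are worked out by hand in the comments):

```python
# Eigenvalues of A = [0 0 1; 0 1 0; -1 0 0] from the example above.
eigenvalues = [complex(0, -1), complex(0, 1), complex(1, 0)]

trace = 0 + 1 + 0   # sum of A's diagonal entries
det = 1             # cofactor expansion along the first row: 1 * (0*0 - 1*(-1)) = 1

total = sum(eigenvalues)
prod = 1
for l in eigenvalues:
    prod *= l

print(total)   # (1+0j), matching the trace
print(prod)    # (1+0j), matching the determinant
```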
Solution
Special Cases
A diagonal matrix is a special and trivial case for finding eigenvalues and eigenvectors: the eigenvalues are the diagonal entries themselves, and the eigenvectors are the standard basis vectors.
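This can be seen directly with a small sketch (plain Python; the diagonal matrix and the `matvec` helper are illustrative choices):

```python
# Each standard basis vector is scaled by the matching diagonal entry.
matvec = lambda M, x: [M[0][0]*x[0] + M[0][1]*x[1],
                       M[1][0]*x[0] + M[1][1]*x[1]]

D = [[3, 0],
     [0, 5]]

print(matvec(D, [1, 0]))   # [3, 0] = 3 * e1, so λ = 3
print(matvec(D, [0, 1]))   # [0, 5] = 5 * e2, so λ = 5
```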
Finding Eigenvalues
Because eigenvalues are characterized by |A - λI| = 0, they can be solved for by:
- subtracting λ from each value on the diagonal
- formulating the determinant of this difference
- setting that expression equal to 0
- solving for λ
In a simple 2 x 2 matrix, this looks like:
| A - λI | = 0

│ ┌      ┐   ┌      ┐ │
│ │ a  b │ - │ λ  0 │ │ = 0
│ │ c  d │   │ 0  λ │ │
│ └      ┘   └      ┘ │

│ ┌          ┐ │
│ │ a-λ   b  │ │ = 0
│ │  c   d-λ │ │
│ └          ┘ │

(a-λ)(d-λ) - bc = 0
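The final line is a quadratic in λ, so a 2 x 2 eigenvalue solver is only a few lines of plain Python (a sketch; `eigvals_2x2` is a hypothetical helper name, and `cmath` handles complex roots):

```python
import cmath

def eigvals_2x2(a, b, c, d):
    """Solve (a - l)(d - l) - b*c = 0, i.e.
    l^2 - (a + d)*l + (a*d - b*c) = 0, with the quadratic formula."""
    tr, det = a + d, a*d - b*c
    disc = cmath.sqrt(tr*tr - 4*det)
    return (tr + disc) / 2, (tr - disc) / 2

# The rotation example above restricted to the x-z plane, [[0, 1], [-1, 0]],
# has the complex eigenvalues i and -i:
print(eigvals_2x2(0, 1, -1, 0))
```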
Finding Eigenvectors
Because eigenvectors are characterized by:
- Ax = λx
- which can be rewritten as Ax - λx = 0
- which can be factored as (A - λI)x = 0
Eigenvectors can be solved for given eigenvalues. For each given λᵢ, substitute it into that final equation and solve for the corresponding xᵢ.
In a simple 2 x 2 matrix, this looks like:
( A - λI ) x = 0

/ ┌      ┐   ┌      ┐ \ ┌   ┐   ┌   ┐
│ │ a  b │ - │ λ  0 │ │ │ u │ = │ 0 │
│ │ c  d │   │ 0  λ │ │ │ v │   │ 0 │
\ └      ┘   └      ┘ / └   ┘   └   ┘

┌          ┐ ┌   ┐   ┌   ┐
│ a-λ   b  │ │ u │ = │ 0 │
│  c   d-λ │ │ v │   │ 0 │
└          ┘ └   ┘   └   ┘

(a-λ)u + (b)v = 0
(c)u + (d-λ)v = 0
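Given an eigenvalue, the first of those two equations already determines the eigenvector up to scale: (u, v) = (b, λ - a) satisfies (a-λ)u + bv = 0, and the second equation follows automatically because λ is an eigenvalue. A plain-Python sketch (`eigvec_2x2` is a hypothetical helper name; it falls back to the second equation when the first choice gives the zero vector):

```python
def eigvec_2x2(a, b, c, d, l):
    """One eigenvector of [[a, b], [c, d]] for the eigenvalue l.

    (u, v) = (b, l - a) satisfies (a - l)*u + b*v = 0; the second
    equation then holds because (a - l)(d - l) = b*c. Fall back to
    (l - d, c) when the first choice is the zero vector.
    """
    if b != 0 or l != a:
        return (b, l - a)
    return (l - d, c)

# Example matrix [[2, 1], [1, 2]], which has eigenvalues 3 and 1:
print(eigvec_2x2(2, 1, 1, 2, 3))   # (1, 1)
print(eigvec_2x2(2, 1, 1, 2, 1))   # (1, -1)
```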
