Eigenvalues and Eigenvectors
Matrices are characterized by their eigenvalues and eigenvectors.
Introduction
The square matrix A is a linear transformation mapping an input vector x to an output vector b. There is at least one vector that maps to a scalar multiple of itself, i.e. multiplying by A works out to multiplying by some scalar λ.
Each of these vectors is an eigenvector. Each eigenvector has a corresponding scaling factor λ; these scaling factors are the eigenvalues.
Consider the matrix representing rotation about the y-axis:
┌                    ┐
│  cos(θ)  0  sin(θ) │
│    0     1    0    │
│ -sin(θ)  0  cos(θ) │
└                    ┘
Given this as A, the vector [0 1 0] (and any scalar multiple of it) maps to itself. The members of this infinite set of vectors are all eigenvectors of A. (And because there is no scaling, their corresponding eigenvalues λ are all 1.)
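As a quick check, with an arbitrarily chosen angle of 30°, multiplying this rotation matrix by [0 1 0] returns the vector unchanged:

```julia
θ = deg2rad(30)              # any angle works
A = [ cos(θ)  0  sin(θ);
      0       1  0;
     -sin(θ)  0  cos(θ)]

x = [0, 1, 0]
A * x                        # [0.0, 1.0, 0.0]: x maps to itself, so λ = 1
```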
Eigenvectors and eigenvalues often include complex numbers.
julia> using LinearAlgebra
julia> A = [0 0 1; 0 1 0; -1 0 0]
3×3 Matrix{Int64}:
0 0 1
0 1 0
-1 0 0
julia> eigvals(A)
3-element Vector{ComplexF64}:
0.0 - 1.0im
0.0 + 1.0im
1.0 + 0.0im
julia> eigvecs(A)
3×3 Matrix{ComplexF64}:
0.707107-0.0im 0.707107+0.0im 0.0+0.0im
0.0-0.0im 0.0+0.0im 1.0+0.0im
0.0-0.707107im 0.0+0.707107im 0.0+0.0im
Note that in the above, eigenvectors are returned as an eigenvector matrix. This is usually notated as S.
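Because this A has three distinct eigenvalues, it can be rebuilt from S and its eigenvalues; a sketch, reusing the matrix above:

```julia
using LinearAlgebra

A = [0 0 1; 0 1 0; -1 0 0]
λ = eigvals(A)
S = eigvecs(A)

# With Λ = Diagonal(λ), the factorization A = S Λ S⁻¹ holds,
# so multiplying back recovers A (up to floating-point error):
S * Diagonal(λ) * inv(S) ≈ A
```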
Description
Eigenvalues and eigenvectors are the pairs of λ and x (with x ≠ 0) that satisfy Ax = λx; equivalently, the eigenvalues are the solutions of |A - λI| = 0.
Given a matrix of size n x n, either there are n linearly independent eigenvectors, or the matrix is defective.
Properties
Only square matrices have eigenvectors.
Adding cI to A does not change its eigenvectors and adds c to each eigenvalue.
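A numerical check of this shift property, using an arbitrary example matrix and c = 5:

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]       # arbitrary example matrix
c = 5
λ, S = eigen(A)              # eigenvalues and eigenvector matrix of A

eigvals(A + c*I) ≈ λ .+ c    # true: each eigenvalue is shifted by c
# each column of S is still an eigenvector of A + cI:
(A + c*I) * S ≈ S * Diagonal(λ .+ c)
```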
The trace is the sum of the eigenvalues. The determinant is the product of the eigenvalues.
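Both identities are easy to confirm (again with an arbitrary example matrix):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]   # arbitrary example matrix
λ = eigvals(A)

tr(A) ≈ sum(λ)    # true: trace equals the sum of the eigenvalues
det(A) ≈ prod(λ)  # true: determinant equals their product
```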
A diagonal matrix is a trivial case because...
- the standard basis vectors are its eigenvectors
- the numbers on the diagonal are its eigenvalues
This also means that any diagonalizable matrix of size n x n must have n linearly independent eigenvectors, and cannot be defective.
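For example, the eigendecomposition of a diagonal matrix is immediate:

```julia
using LinearAlgebra

D = Diagonal([2.0, 3.0, 4.0])
eigvals(D)   # the diagonal entries
eigvecs(D)   # the identity matrix, i.e. the standard basis vectors
```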
Calculation
Conventional Method
Because eigenvalues are characterized by |A - λI| = 0, they can be solved for by:
- subtracting λ from each value on the diagonal
- formulating the determinant of this difference
- setting the formulation equal to 0
- solving for λ
In a simple 2 x 2 matrix, this looks like:
| A - λI | = 0

│ ┌      ┐   ┌      ┐ │
│ │ a  b │ - │ λ  0 │ │ = 0
│ │ c  d │   │ 0  λ │ │
│ └      ┘   └      ┘ │

│ ┌          ┐ │
│ │ a-λ   b  │ │ = 0
│ │  c   d-λ │ │
│ └          ┘ │

(a-λ)(d-λ) - bc = 0
This leads to the characteristic polynomial of A; solving for the roots, as through either factorization or the quadratic formula, gives the eigenvalues.
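In the 2 x 2 case, the characteristic polynomial works out to λ² - tr(A)λ + |A| = 0, so the quadratic formula gives the eigenvalues directly. A sketch with an arbitrary example matrix:

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]   # arbitrary example matrix

# characteristic polynomial: λ² - tr(A)λ + det(A) = 0
disc = sqrt(tr(A)^2 - 4 * det(A))
roots = [(tr(A) - disc) / 2, (tr(A) + disc) / 2]

roots ≈ eigvals(A)   # true
```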
Because eigenvectors are characterized by...
Ax = λx
Ax - λx = 0
(A - λI)x = 0
...eigenvectors can be solved for given eigenvalues using substitution.
In a simple 2 x 2 matrix, this looks like:
( A - λI ) x = 0

/ ┌      ┐   ┌      ┐ \ ┌   ┐   ┌   ┐
│ │ a  b │ - │ λ  0 │ │ │ u │ = │ 0 │
│ │ c  d │   │ 0  λ │ │ │ v │   │ 0 │
\ └      ┘   └      ┘ / └   ┘   └   ┘

┌          ┐ ┌   ┐   ┌   ┐
│ a-λ   b  │ │ u │ = │ 0 │
│  c   d-λ │ │ v │   │ 0 │
└          ┘ └   ┘   └   ┘

(a-λ)u + (b)v = 0
(c)u + (d-λ)v = 0
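Equivalently, the eigenvectors for a given λ span the null space of (A - λI), which LinearAlgebra's nullspace computes directly. A sketch, using a hypothetical example matrix whose eigenvalues happen to be 2 and 5:

```julia
using LinearAlgebra

A = [4.0 1.0; 2.0 3.0]   # example matrix with eigenvalues 2 and 5
λ = 2.0

# the eigenvectors for λ are the nonzero solutions of (A - λI)x = 0:
x = nullspace(A - λ*I)[:, 1]
A * x ≈ λ * x   # true: x is an eigenvector
```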
Shortcut
This method is only applicable to 2 x 2 matrices.
- Because the trace is equal to the sum of the eigenvalues, it follows that 1/2 of the trace is also the mean of the eigenvalues.
- Because the characteristic polynomial must be quadratic, the eigenvalues must be evenly spaced from the center (i.e., the mean). Given the above mean as m and an unknown distance as d, the eigenvalues must be (m-d) and (m+d).
- By definition of the determinant, |A| = (m-d)(m+d) = m² - d². This can be solved for d.
Altogether, λ = m ± d = m ± √(m² - |A|).
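The shortcut is easy to sketch in Julia (same hypothetical example matrix with eigenvalues 2 and 5):

```julia
using LinearAlgebra

A = [4.0 1.0; 2.0 3.0]    # example matrix with eigenvalues 2 and 5

m = tr(A) / 2             # mean of the eigenvalues: half the trace
d = sqrt(m^2 - det(A))    # distance of each eigenvalue from the mean
(m - d, m + d)            # (2.0, 5.0)
```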