Eigenvalues And Eigenvectors


1.1 Eigenvalues and Eigenvectors

1.1.1 Eigenvalues and Eigenvectors

  • Definition: Let A be an n×n matrix. A scalar $\lambda$ is said to be an eigenvalue of A if there exists a nonzero vector $x\ne 0$ such that \(Ax=\lambda x\). Such an $x$ is said to be an eigenvector of A belonging to $\lambda$.
    • The pair ($\lambda$,x) is sometimes called an eigen-pair for A.
    • Eigenvalues and eigenvectors are only defined for square matrices
    • Eigenvectors must be nonzero
  • Property
    • A is singular ⇔ $\lambda$=0 is an eigenvalue of A.
    • If A is nonsingular, then $\lambda^{-1}$ is an eigenvalue of $A^{-1}$, and x is an eigenvector of $A^{-1}$ belonging to $\lambda^{-1}$, i.e. $A^{-1} x=\lambda^{-1} x$.
    • For $m\geq 1$, $\lambda ^m$ is an eigenvalue of $A^m$, and x is an eigenvector of $A^m$ belonging to $\lambda^m$, i.e. $A^m x=\lambda^m x.$
    • If $A^2=A$, what can $\lambda$ be? Hint: $A^2 x=Ax$.
    • If $A^{30}=O$, what can $\lambda$ be? Hint: $A^{30} x=Ox$.
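The power property above can be spot-checked in plain Python (no libraries); the matrix and eigenpair below are illustrative choices of mine, not from the notes:

```python
# Minimal sketch: verify A x = lambda x and A^2 x = lambda^2 x
# for an illustrative 2x2 example with eigenpair (3, (1,1)).

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec2(A, x):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]

A = [[2, 1], [1, 2]]          # eigenvalues 3 and 1
x = [1, 1]                    # eigenvector belonging to lambda = 3

assert matvec2(A, x) == [3, 3]            # A x = 3 x
assert matvec2(matmul2(A, A), x) == [9, 9]  # A^2 x = 3^2 x
```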

Finding eigenvalues and eigenvectors: \(Ax=\lambda x\Leftrightarrow Ax-\lambda x=0\Leftrightarrow (A-\lambda I)x=0\Leftrightarrow x\in N(A-\lambda I)\)

  • $N(A-\lambda I)$: the eigenspace corresponding to the eigenvalue $\lambda$. This system has a nontrivial solution $x \ne 0$ if and only if we pick $\lambda$ so that the square matrix $A-\lambda I$ is singular, that is: \(\det(A-\lambda I)=0.\) First use $\det(A-\lambda I)=0$ to find all eigenvalues of A, and then use $N(A-\lambda I)$ to find all eigenvectors of A.

  • Property: Let A be an n×n matrix and $\lambda$ be a scalar. The following statements are equivalent:
    • $\lambda$ is an eigenvalue of A.
    • $(A−\lambda I)x=0$ has nontrivial solutions.
    • $N(A-\lambda I)\ne\{0\}$.
    • $A-\lambda I$ is singular.
    • $\det(A-\lambda I)=0$.
  • $p(\lambda)=\det(A-\lambda I)$ is a degree-n polynomial in the variable $\lambda$, called the characteristic polynomial.
  • $\det(A-\lambda I)=0$: the characteristic equation for A.
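The two-step recipe (solve $\det(A-\lambda I)=0$, then find eigenvectors from $N(A-\lambda I)$) can be sketched for a 2×2 case; the matrix is an illustrative choice, and real eigenvalues are assumed:

```python
import math

# Minimal sketch: for a 2x2 matrix, det(A - lambda I) = 0 is the
# quadratic lambda^2 - (trace) lambda + det = 0; solve it, then read
# an eigenvector off the nullspace N(A - lambda I).

def eigenvalues_2x2(A):
    """Roots of the characteristic polynomial (real case assumed)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

A = [[4, 1], [2, 3]]
lam1, lam2 = eigenvalues_2x2(A)
assert (lam1, lam2) == (5.0, 2.0)

# Eigenvector for lam1 = 5: (A - 5I)x = 0 gives [[-1,1],[2,-2]]x = 0,
# so x = (1, 1) spans the eigenspace N(A - 5I). Check A x = 5 x:
x = [1, 1]
assert [4*x[0] + 1*x[1], 2*x[0] + 3*x[1]] == [5*x[0], 5*x[1]]
```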

1.2. Diagonalization

1.2.1. Theorem 4.3.1

If $\lambda_1, \lambda_2, … \lambda_k$ are distinct eigenvalues of an n×n matrix A with corresponding eigenvectors $x_1,x_2,…,x_k$, then the vectors $x_1,x_2,…,x_k$ are linearly independent.

1.2.2. Diagonalization

An n×n matrix A is said to be diagonalizable if there exists a nonsingular matrix X and a diagonal matrix D such that \(X^{−1} AX=D\) We say that X diagonalizes A.

1.2.3. Theorem 4.3.2

An n×n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors.

  • If A is diagonalizable, then the column vectors of the diagonalizing matrix X are eigenvectors of A and the diagonal elements of D are the corresponding eigenvalues of A.
  • The diagonalizing matrix X is not unique. Reordering the columns of a given diagonalizing matrix X or multiplying them by nonzero scalars will produce a new diagonalizing matrix.
  • If A is n×n and A has n distinct eigenvalues, then A is diagonalizable. If the eigenvalues are not distinct, then A may or may not be diagonalizable, depending on whether A has n linearly independent eigenvectors.
  • If A is diagonalizable, then A can be factored as a product \(A=XDX^{−1}\) \(A^k=XD^k X^{−1}\)
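As a sketch of the factorization $A^k=XD^kX^{-1}$, the example below uses an illustrative 2×2 matrix whose eigenpairs were worked out by hand, with exact `Fraction` arithmetic to avoid rounding:

```python
from fractions import Fraction as F

# Minimal sketch: compute A^3 via A^k = X D^k X^{-1} for the
# illustrative A = [[4,1],[2,3]], whose eigenpairs (found by hand)
# are lambda=5 with x=(1,1) and lambda=2 with x=(1,-2).

def matmul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[F(4), F(1)], [F(2), F(3)]]
X = [[F(1), F(1)], [F(1), F(-2)]]          # eigenvectors as columns
detX = X[0][0]*X[1][1] - X[0][1]*X[1][0]
Xinv = [[X[1][1]/detX, -X[0][1]/detX], [-X[1][0]/detX, X[0][0]/detX]]
D3 = [[F(5)**3, F(0)], [F(0), F(2)**3]]    # D^3 = diag(5^3, 2^3)

A3_diag = matmul2(matmul2(X, D3), Xinv)    # X D^3 X^{-1}
A3_direct = matmul2(matmul2(A, A), A)      # plain A * A * A
assert A3_diag == A3_direct == [[86, 39], [78, 47]]
```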

Shortcut: For a 2×2 matrix

trace(A) = a+d = $\lambda _1+\lambda _2$, det(A) = $ad-bc$ = $\lambda _1\lambda _2$

1.2.4. Symmetric Matrix

Let A be a symmetric n×n matrix: $A^T=A.$

  • Property:
    • $(x)^T (Ay)=(Ax)^T (y).$
    • If x is an eigenvector of A and y⊥x, then Ay⊥x. ^[$(Ay)^Tx=(y)^T (Ax)= (y)^T (\lambda x)=\lambda (y)^T (x)=0$]
    • If $x_1,x_2$ are eigenvectors of A with $\lambda _1\ne \lambda _2$, then $x_1⊥x_2$. ^[Proof: Compare $(x_1 )^T (Ax_2 )$ and $(Ax_1 )^T (x_2 )$: $(x_1 )^T (Ax_2 )=(Ax_1 )^T (x_2 )$ => $(x_1 )^T (\lambda _2x_2 )=(\lambda _1x_1 )^T (x_2 )$ => $(\lambda _2-\lambda _1)(x_1 )^T (x_2 )=0$, but $\lambda _1\ne \lambda _2$. Therefore $(x_1 )^T (x_2 )=0$ => $x_1⊥x_2$.]
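A quick plain-Python check of this orthogonality property, on an illustrative symmetric 2×2 matrix of my own choosing:

```python
# Minimal sketch: for the symmetric A = [[2,1],[1,2]], the eigenpairs
# (3, (1,1)) and (1, (1,-1)) have distinct eigenvalues, so their
# eigenvectors should be orthogonal.

A = [[2, 1], [1, 2]]
x1, x2 = [1, 1], [1, -1]

# confirm they really are eigenvectors: A x1 = 3 x1 and A x2 = 1 x2
assert [A[0][0]*x1[0] + A[0][1]*x1[1],
        A[1][0]*x1[0] + A[1][1]*x1[1]] == [3, 3]
assert [A[0][0]*x2[0] + A[0][1]*x2[1],
        A[1][0]*x2[0] + A[1][1]*x2[1]] == [1, -1]

# distinct eigenvalues => orthogonal eigenvectors
assert x1[0]*x2[0] + x1[1]*x2[1] == 0
```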

1.2.5. Orthogonally Diagonalizable

A square matrix A is ==orthogonally== diagonalizable if $Q^T AQ=D$ for some diagonal matrix D and some orthogonal matrix Q.

  • n×n matrix A is orthogonally diagonalizable if and only if A has n orthonormal eigenvectors (columns of Q)
  • From the previous slide, if ==a symmetric matrix A has n distinct eigenvalues==, then A has ==n mutually orthogonal eigenvectors==.
  • Surprisingly, a symmetric n×n matrix A ALWAYS has $n$ orthonormal eigenvectors, and ALL eigenvalues are real!

1.2.6. Spectral Theorem for Real Symmetric Matrices 6.3.3

If A is a real symmetric matrix, then there is an orthogonal matrix Q that diagonalizes A; that is, $Q^T AQ=D$, where $D$ is diagonal, with real entries.
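A numerical sketch of the theorem for an illustrative symmetric 2×2 matrix; Q is built from unit eigenvectors found by hand, and $Q^TAQ$ should come out (numerically) diagonal:

```python
import math

# Minimal sketch: orthogonally diagonalize A = [[2,1],[1,2]] using
# Q whose columns are the unit eigenvectors (1,1)/sqrt(2) and
# (1,-1)/sqrt(2); Q^T A Q should be diag(3, 1) up to rounding.

def matmul2(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [1.0, 2.0]]
s = 1 / math.sqrt(2)
Q = [[s, s], [s, -s]]                      # orthonormal eigenvector columns
Qt = [[Q[0][0], Q[1][0]], [Q[0][1], Q[1][1]]]

D = matmul2(matmul2(Qt, A), Q)             # Q^T A Q

assert abs(D[0][0] - 3) < 1e-9 and abs(D[1][1] - 1) < 1e-9
assert abs(D[0][1]) < 1e-9 and abs(D[1][0]) < 1e-9
```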

1.3. The Singular Value Decomposition

1.3.1. Square Symmetric Matrix

Properties

  • Eigenvalues: real (and nonnegative for the matrices $A^TA$ and $AA^T$ below)
  • Eigenvectors: can be chosen orthonormal

Suppose A is an m×n matrix. Then:

  1. $AA^T$ and $A^TA$ are symmetric.
  2. $AA^T$ and $A^TA$ are orthogonally diagonalizable: $AA^T$ has m orthonormal eigenvectors, and $A^TA$ has n orthonormal eigenvectors.

  3. All eigenvalues of $AA^T$ and $A^TA$ are nonnegative.^[$\|Ax\|^2 = (Ax)^T Ax = x^T A^T A x = \lambda \|x\|^2 \geq 0$]
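The footnote's argument, $x^T(A^TA)x = \|Ax\|^2 \geq 0$, can be spot-checked numerically; the matrix and test vectors below are arbitrary illustrative choices:

```python
# Minimal sketch: for any x, x^T (A^T A) x equals ||Ax||^2, which is
# a sum of squares and hence never negative. This is why A^T A
# cannot have a negative eigenvalue.

A = [[1, 2], [3, 4]]
for x in ([1, 0], [0, 1], [1, -1], [2, 3], [-5, 7]):
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    quad = Ax[0]**2 + Ax[1]**2           # x^T A^T A x = ||Ax||^2
    assert quad >= 0
```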

1.3.2. The Singular Value Decomposition

Assume that A is an m×n matrix with m ≥ n. An SVD of A factors it into a product $U\Sigma V^T$, where

  • U is an m×m orthogonal matrix $\rightarrow U^TU=I_m$
  • V is an n×n orthogonal matrix $\rightarrow V^TV=I_n$
  • $\Sigma$ is an m×n matrix whose off-diagonal entries are all 0's and whose diagonal elements satisfy \(\sigma_1\geq \sigma_2\geq \dots \geq \sigma_n\geq 0\)

\(A = U\Sigma V^T\)

Properties (with r = rank(A)):

  • The first r columns of V form an orthonormal basis for $R(A^T)$
  • The remaining columns of V form an orthonormal basis for $N(A)$
  • The first r columns of U form an orthonormal basis for $R(A)$
  • The remaining columns of U form an orthonormal basis for $N(A^T)$
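As a sketch, the singular values can be obtained as square roots of the eigenvalues of $A^TA$; the 2×2 matrix below is an illustrative choice:

```python
import math

# Minimal sketch: the singular values of A are the square roots of
# the eigenvalues of A^T A. For the illustrative A = [[3,0],[4,5]],
# A^T A = [[25,20],[20,25]] has eigenvalues 45 and 5, so
# sigma_1 = sqrt(45) and sigma_2 = sqrt(5).

A = [[3, 0], [4, 5]]
AtA = [[A[0][0]**2 + A[1][0]**2, A[0][0]*A[0][1] + A[1][0]*A[1][1]],
       [A[0][0]*A[0][1] + A[1][0]*A[1][1], A[0][1]**2 + A[1][1]**2]]
assert AtA == [[25, 20], [20, 25]]

# eigenvalues of the symmetric 2x2 AtA from trace and determinant
tr = AtA[0][0] + AtA[1][1]
det = AtA[0][0]*AtA[1][1] - AtA[0][1]*AtA[1][0]
disc = math.sqrt(tr*tr - 4*det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2   # 45 and 5

sigma1, sigma2 = math.sqrt(lam1), math.sqrt(lam2)
assert abs(sigma1 - math.sqrt(45)) < 1e-9
assert abs(sigma2 - math.sqrt(5)) < 1e-9
```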

1.3.3. Theorem 4.5.1

If A is an m × n matrix, then A has a singular value decomposition.

1.4. Quadratic Forms

2×2: \(⟨x,Ax⟩=x^T Ax = ax^2+2bxy+cy^2\)

3×3: \(⟨x,Ax⟩=x^T Ax = ax^2+by^2+cz^2+2dxy+2exz+2fyz\)

  • In $\mathbb{R}^n$, for a symmetric n×n matrix A, define the function $q:\mathbb{R}^n\to\mathbb{R}$ by \(q(x)=x^T Ax=\sum _{i=1}^n \sum _{j=1}^n a_{ij} x_i x_j\)
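The double-sum formula translates directly into code; a minimal sketch with an illustrative symmetric matrix:

```python
# Minimal sketch: evaluate q(x) = x^T A x via the double sum
# sum_i sum_j a_ij x_i x_j for a small symmetric A.

def quad_form(A, x):
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

A = [[1, 2], [2, 3]]            # symmetric, so q(x,y) = x^2 + 4xy + 3y^2
assert quad_form(A, [1, 0]) == 1
assert quad_form(A, [0, 1]) == 3
assert quad_form(A, [1, 1]) == 8    # 1 + 4 + 3
```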

A real n×n symmetric matrix A is said to be

  • Positive definite if $x^T Ax>0$ for all nonzero x in $\mathbb{R}^n$
  • Negative definite if $x^T Ax<0$ for all nonzero x in $\mathbb{R}^n$
  • Positive semidefinite if $x^T Ax\geq 0$ for all x in $\mathbb{R}^n$
  • Negative semidefinite if $x^T Ax\leq 0$ for all x in $\mathbb{R}^n$
  • Indefinite if $x^T Ax$ takes both positive and negative values

1.4.1. Principal Axes Theorem 4.6.1

If A is a real symmetric n×n matrix, then there is a change of variables $c=Q^T x$, where Q is an orthogonal matrix, such that \(x^T Ax=c^T Dc\) where D is a real diagonal matrix.

1.4.2. Theorem 4.6.2

Let A be a real symmetric n×n matrix. Then A is

  • Positive definite iff all its eigenvalues are positive
  • Negative definite iff all its eigenvalues are negative
  • Positive semidefinite iff all its eigenvalues are nonnegative
  • Negative semidefinite iff all its eigenvalues are nonpositive
  • Indefinite iff it has both positive and negative eigenvalues
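The eigenvalue test above, sketched for symmetric 2×2 matrices (eigenvalues computed from trace and determinant; all matrices are illustrative choices):

```python
import math

# Minimal sketch: classify a symmetric 2x2 matrix by the signs of
# its eigenvalues (Theorem 4.6.2).

def classify(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    disc = math.sqrt(tr*tr - 4*det)        # real for symmetric A
    lams = [(tr + disc) / 2, (tr - disc) / 2]
    if all(l > 0 for l in lams): return "positive definite"
    if all(l < 0 for l in lams): return "negative definite"
    if all(l >= 0 for l in lams): return "positive semidefinite"
    if all(l <= 0 for l in lams): return "negative semidefinite"
    return "indefinite"

assert classify([[2, 1], [1, 2]]) == "positive definite"      # 3, 1
assert classify([[1, 2], [2, 1]]) == "indefinite"             # 3, -1
assert classify([[1, 1], [1, 1]]) == "positive semidefinite"  # 2, 0
```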

1.4.3. Theorem 4.6.3

Let A be a real symmetric n×n matrix. The following are equivalent:

  • A is positive definite.
  • The leading principal submatrices $A_1,A_2,…,A_n$ all have positive determinant.

Leading principal submatrices: the upper-left 1×1, 2×2, 3×3, … blocks. Similarly, A is negative definite iff the leading principal minors alternate in sign: $\det(A_1)<0$, $\det(A_2)>0$, $\det(A_3)<0$, …. Caution: all leading principal minors being ≥ 0 does not by itself guarantee positive semidefiniteness; the semidefinite test requires all principal minors (not just the leading ones) to be ≥ 0.
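The criterion of Theorem 4.6.3, sketched for an illustrative symmetric 3×3 matrix by computing the three leading principal minors with hand-rolled determinants:

```python
# Minimal sketch: check positive definiteness of a symmetric 3x3
# matrix via the leading principal minors det(A_1), det(A_2), det(A_3),
# which must all be positive.

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
            - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
            + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

A = [[2, -1, 0],
     [-1, 2, -1],
     [0, -1, 2]]

minors = [A[0][0],                              # det(A_1)
          A[0][0]*A[1][1] - A[0][1]*A[1][0],    # det(A_2)
          det3(A)]                              # det(A_3)
assert minors == [2, 3, 4]    # all positive => A is positive definite
```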
