Eigenvalues and Eigenvectors

1.1 Eigenvalues and Eigenvectors

1.1.1 Eigenvalues and Eigenvectors

  • Definition:
    Let A be an n×n matrix.
    A scalar λ is said to be an eigenvalue of A if there exists a nonzero vector x such that Ax = λx.
    Such an x is said to be an eigenvector of A belonging to λ.
    • The pair (λ, x) is sometimes called an eigenpair for A.
    • Eigenvalues and eigenvectors are only defined for square matrices
    • Eigenvectors must be nonzero
  • Properties
    • A is singular ⇔ λ = 0 is an eigenvalue of A.
    • If A is nonsingular and Ax = λx (x ≠ 0), then λ ≠ 0, λ^(−1) is an eigenvalue of A^(−1), and x is an eigenvector of A^(−1) belonging to λ^(−1), i.e. A^(−1)x = λ^(−1)x.
    • For any integer k ≥ 1, λ^k is an eigenvalue of A^k, and x is an eigenvector of A^k belonging to λ^k, i.e. A^k x = λ^k x.
    • If …, what can λ be? Hint: …
    • If …, what can λ be? Hint: …

Finding eigenvalues and eigenvectors

  • N(A − λI) = {x : (A − λI)x = 0}: the eigenspace corresponding to the eigenvalue λ.
    The equation (A − λI)x = 0 has a nontrivial solution x ≠ 0 if and only if we pick λ so that the square matrix (A − λI) is singular. That is: first use det(A − λI) = 0 to find all eigenvalues of A, and then use N(A − λI) to find all eigenvectors of A (a code sketch follows this section).

  • Property:
    Let A be an n×n matrix and λ be a scalar. The following statements are equivalent:

    • λ is an eigenvalue of A.
    • (A − λI)x = 0 has nontrivial solutions.
    • A − λI is singular.
    • det(A − λI) = 0
  • det(A − λI) is a polynomial of degree n in the variable λ, called the characteristic polynomial.

  • det(A − λI) = 0: the characteristic equation for A.
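
A minimal sketch of this two-step recipe on a made-up 2×2 example, using sympy for the symbolic steps (solve det(A − λI) = 0, then compute the null space); the matrix and names are illustrative only:

```python
import sympy as sp

# Step 1: solve det(A - lambda*I) = 0 for the eigenvalues.
# Step 2: compute N(A - lambda*I) for the eigenvectors.
lam = sp.symbols('lambda')
A = sp.Matrix([[3, 1],
               [0, 2]])

char_poly = (A - lam * sp.eye(2)).det()          # (3 - lambda)*(2 - lambda)
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(eigenvalues)                               # [2, 3]

for ev in eigenvalues:
    # Basis of the eigenspace N(A - ev*I)
    print(ev, (A - ev * sp.eye(2)).nullspace())
```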

1.2. Diagonalization

1.2.1. Theorem 4.3.1

If λ_1, …, λ_k are distinct eigenvalues of an n×n matrix A with corresponding eigenvectors x_1, …, x_k, then the vectors x_1, …, x_k are linearly independent.

1.2.2. Diagonalization

An n×n matrix A is said to be diagonalizable if there exists a nonsingular matrix X and a diagonal matrix D such that X^(−1)AX = D.
We say that X diagonalizes A.

1.2.3. Theorem 4.3.2

An n×n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors.

  • If A is diagonalizable, then the column vectors of the diagonalizing matrix X are eigenvectors of A and the diagonal elements of D are the corresponding eigenvalues of A.
  • The diagonalizing matrix X is not unique. Reordering the columns of a given diagonalizing matrix X or multiplying them by nonzero scalars will produce a new diagonalizing matrix.
  • If A is n×n and A has n distinct eigenvalues, then A is diagonalizable. If the eigenvalues are not distinct, then A may or may not be diagonalizable, depending on whether A has n linearly independent eigenvectors.
  • If A is diagonalizable, then A can be factored as a product A = XDX^(−1) (see the sketch below).
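
As a quick numerical check of the factorization above (a sketch, not part of the notes): np.linalg.eig returns the eigenvalues and an eigenvector matrix X, and for this made-up symmetric example X is invertible, so A = XDX^(−1) can be verified directly.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary example with 2 distinct eigenvalues

w, X = np.linalg.eig(A)      # w: eigenvalues, columns of X: eigenvectors
D = np.diag(w)

print(np.allclose(np.linalg.inv(X) @ A @ X, D))  # True: X^(-1) A X = D
print(np.allclose(X @ D @ np.linalg.inv(X), A))  # True: A = X D X^(-1)
```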

Shortcut: For a 2×2 matrix A with rows (a, b) and (c, d):

trace = a + d = λ_1 + λ_2
det(A) = ad − bc = λ_1 · λ_2
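
A small sketch of this shortcut (illustrative matrix; assumes real eigenvalues, i.e. trace^2 − 4·det ≥ 0): the eigenvalues are the two roots of λ^2 − (trace)λ + det(A) = 0.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

tr = A[0, 0] + A[1, 1]                    # a + d
det = A[0, 0]*A[1, 1] - A[0, 1]*A[1, 0]   # ad - bc

disc = np.sqrt(tr**2 - 4*det)             # assumes real eigenvalues
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
print(lam1, lam2)                         # 5.0 2.0
print(np.sort(np.linalg.eigvals(A)))      # [2. 5.] -- matches
```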

1.2.4. Symmetric Matrix

Let A be a symmetric n×n matrix:

  • Property:
    • If x is an eigenvector of A and y⊥x, then Ay⊥x. [1]
    • If x_1 and x_2 are eigenvectors of A belonging to distinct eigenvalues λ_1 ≠ λ_2, then x_1 ⊥ x_2. [2]

1.2.5. Orthogonally Diagonalizable

A square matrix A is orthogonally diagonalizable if A = QDQ^T for some diagonal matrix D and some orthogonal matrix Q.

  • An n×n matrix A is orthogonally diagonalizable if and only if A has n orthonormal eigenvectors (the columns of Q).
  • From the property above, if a symmetric matrix A has n distinct eigenvalues, then A has n mutually orthogonal eigenvectors.
  • Surprisingly, a symmetric n×n matrix A ALWAYS has n orthonormal eigenvectors, and ALL of its eigenvalues are real!

1.2.6. Spectral Theorem for Real Symmetric Matrices 6.3.3

If A is a real symmetric matrix, then there is an orthogonal matrix Q that diagonalizes A; that is, Q^T AQ = D, where D is diagonal with real entries.
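
A minimal numerical illustration of the theorem (sketch only; the 2×2 matrix is arbitrary): np.linalg.eigh is numpy's eigensolver for symmetric matrices and returns real eigenvalues and orthonormal eigenvectors.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, Q = np.linalg.eigh(A)   # w: real eigenvalues, Q: orthonormal eigenvectors

print(w)                                      # [1. 3.] -- all real
print(np.allclose(Q.T @ Q, np.eye(2)))        # True: Q is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(w)))   # True: Q^T A Q = D
```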

1.3. The Singular Value Decomposition

1.3.1. Square Symmetric Matrix

Properties

  • Eigenvalues: real, and nonnegative for matrices of the form A^T A or A A^T (see below)
  • Eigenvectors: orthogonal (an orthonormal set of eigenvectors can always be chosen)

Suppose A is an m×n matrix. Then:

  1. A^T A and A A^T are symmetric, and symmetric matrices are orthogonally diagonalizable.
  2. A^T A and A A^T are therefore orthogonally diagonalizable:
     • A A^T (m×m) has m orthonormal eigenvectors
     • A^T A (n×n) has n orthonormal eigenvectors
  3. All eigenvalues of A^T A and A A^T are non-negative. [3]

1.3.2. The Singular Value Decomposition

Assume that A is an m×n matrix with m ≥ n. An SVD of A factors A into a product A = UΣV^T, where

  • U is an m × m orthogonal matrix
  • V is an n × n orthogonal matrix
  • Σ is an m × n matrix whose off-diagonal entries are all 0 and whose diagonal elements satisfy σ_1 ≥ σ_2 ≥ … ≥ σ_n ≥ 0 (the singular values of A)

Properties (with r = rank(A)):

  • The first r columns of V form an orthonormal basis for R(A^T), the row space of A
  • The remaining columns of V form an orthonormal basis for N(A), the null space of A
  • The first r columns of U form an orthonormal basis for R(A), the column space of A
  • The remaining columns of U form an orthonormal basis for N(A^T)

1.3.3. Theorem 4.5.1

If A is an m × n matrix, then A has a singular value decomposition.
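
A short numpy sketch on a made-up 3×2 matrix (m ≥ n), verifying the factorization and the orthogonality of U and V:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)   # full SVD: U is 3x3, Vt is 2x2, s = (sigma_1, sigma_2)

Sigma = np.zeros_like(A)      # rebuild the 3x2 Sigma with s on the diagonal
Sigma[:2, :2] = np.diag(s)

print(np.allclose(U @ Sigma @ Vt, A))    # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))   # True: U orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(2))) # True: V orthogonal
```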

1.4. Quadratic Forms

2×2: f(x_1, x_2) = a_11 x_1^2 + 2a_12 x_1 x_2 + a_22 x_2^2
3×3: f(x_1, x_2, x_3) = a_11 x_1^2 + a_22 x_2^2 + a_33 x_3^2 + 2a_12 x_1 x_2 + 2a_13 x_1 x_3 + 2a_23 x_2 x_3

  • In ℝ^n, for a symmetric n×n matrix A, the function f(x) = x^T A x is called a quadratic form.

A real n×n symmetric matrix A is said to be

  • Positive definite if x^T A x > 0 for all nonzero x in ℝ^n
  • Negative definite if x^T A x < 0 for all nonzero x in ℝ^n
  • Positive semidefinite if x^T A x ≥ 0 for all x in ℝ^n
  • Negative semidefinite if x^T A x ≤ 0 for all x in ℝ^n
  • Indefinite if x^T A x takes both positive and negative values

1.4.1. Principal Axes Theorem 4.6.1

If A is a real symmetric n×n matrix, then there is a change of variables x = Qy, where Q is an orthogonal matrix, such that x^T A x = y^T D y,
where D is a real diagonal matrix.

1.4.2. Theorem 4.6.2

Let A be a real symmetric n×n matrix. Then A is positive definite if and only if all its eigenvalues are positive.
A is

  • Positive definite iff all its eigenvalues are positive
  • Negative definite iff all its eigenvalues are negative
  • Positive semidefinite iff all its eigenvalues are nonnegative
  • Negative semidefinite iff all its eigenvalues are nonpositive
  • Indefinite iff it has both positive and negative eigenvalues (a numerical sketch follows this list)
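
A hedged sketch of how this characterization can be checked numerically (the helper name classify and the tolerance are illustrative choices, not part of the notes):

```python
import numpy as np

def classify(A, tol=1e-12):
    """Classify a real symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(A)        # real eigenvalues of a symmetric matrix
    if np.all(w > tol):
        return "positive definite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -3.0]])))  # indefinite
```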

1.4.3. Theorem 4.6.3

Let A be a real symmetric n×n matrix. The following are equivalent:

  • A is positive definite.
  • The leading principal submatrices all have positive determinant.

Leading principal submatrices: the upper-left 1×1, 2×2, 3×3, … submatrices.
If all the leading principal determinants are nonzero but not all positive, then A is either negative definite (if the signs alternate −, +, −, …) or indefinite (any other sign pattern).
Note: leading principal determinants that are all ≥ 0 do not by themselves imply positive semidefiniteness; for that, ALL principal minors (not just the leading ones) must be ≥ 0.
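
A small sketch of checking the leading principal determinants numerically (the helper leading_principal_minors is a hypothetical name; the tridiagonal matrix is just an example):

```python
import numpy as np

def leading_principal_minors(A):
    # Determinants of the upper-left 1x1, 2x2, ..., nxn submatrices.
    n = A.shape[0]
    return [np.linalg.det(A[:k, :k]) for k in range(1, n + 1)]

A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

minors = leading_principal_minors(A)
print(minors)                      # approximately [2.0, 3.0, 4.0] -- all positive
print(all(m > 0 for m in minors))  # True -> A is positive definite
```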


  1. Proof: (Ay)^T x = y^T A^T x = y^T Ax = λ y^T x = 0 (using A^T = A, Ax = λx, and y^T x = 0), so Ay ⊥ x. ↩︎
  2. Proof:
    Compare x_1^T A x_2 and x_2^T A x_1: since A^T = A they are equal, so λ_2 x_1^T x_2 = λ_1 x_2^T x_1 = λ_1 x_1^T x_2, but λ_1 ≠ λ_2. Therefore x_1^T x_2 = 0, i.e. x_1 ⊥ x_2. ↩︎

  3. If A^T A x = λx with x ≠ 0, then λ x^T x = x^T A^T A x = ‖Ax‖^2 ≥ 0 and x^T x > 0, so λ ≥ 0. ↩︎