Vector Space

1.1. Vector and Vector Space Axioms

1.1.1. $\mathbb{R}^n$ [1]

  • Definition:
    • The set of all column n-vectors is called
      n-dimensional real space and is denoted $\mathbb{R}^n$.[2]
    • $x_i$ is the $i$-th component of the vector $\mathbf{x} = (x_1, x_2, \ldots, x_n)^T$.

Addition and Scalar Multiplication

  • Let $\mathbf{x} = (x_1, \ldots, x_n)^T$ and $\mathbf{y} = (y_1, \ldots, y_n)^T$ be two elements of $\mathbb{R}^n$ and let $c$ be a scalar. Addition and scalar multiplication are defined componentwise:
    • Addition: $\mathbf{x} + \mathbf{y} = (x_1 + y_1, \ldots, x_n + y_n)^T$
    • Scalar multiplication: $c\mathbf{x} = (cx_1, \ldots, cx_n)^T$
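These componentwise definitions can be sketched in plain Python, with vectors as lists (`vec_add` and `scalar_mul` are illustrative names, not from the text):

```python
# Componentwise operations on vectors in R^n, following the definitions above.

def vec_add(x, y):
    """Addition: (x + y)_i = x_i + y_i."""
    assert len(x) == len(y), "both vectors must lie in the same R^n"
    return [xi + yi for xi, yi in zip(x, y)]

def scalar_mul(c, x):
    """Scalar multiplication: (c x)_i = c * x_i."""
    return [c * xi for xi in x]

x = [1, 2, 3]
y = [4, 5, 6]
print(vec_add(x, y))     # [5, 7, 9]
print(scalar_mul(2, x))  # [2, 4, 6]
```

Since the result of each operation is again a list of n real numbers, this also illustrates the two closure properties proved next.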

Proving $\mathbb{R}^n$ Is a Vector Space

  • $\mathbf{x} + \mathbf{y} \in \mathbb{R}^n$ ($\mathbb{R}^n$ is closed under addition)
  • $c\mathbf{x} \in \mathbb{R}^n$ ($\mathbb{R}^n$ is closed under scalar multiplication)

Zero vector

  • Definition:
    The vector $(0, 0, \ldots, 0)^T$, having n zero components, is called the zero vector of $\mathbb{R}^n$ and is denoted $\mathbf{0}$.
  • Property: for any vector v, 0 + v = v, v + 0 = v.

Negative Vector

  • Definition:
    The vector $(-1)\mathbf{u}$ is written $-\mathbf{u}$ and is called the negative of $\mathbf{u}$. It is a vector having the same magnitude as $\mathbf{u}$, but lying in the opposite direction to $\mathbf{u}$.
  • Property: for any vector $\mathbf{u}$, $\mathbf{u} + (-\mathbf{u}) = \mathbf{0} = (-\mathbf{u}) + \mathbf{u}$

1.1.2. Vector Space Axioms

  • Definition:
    Let V be a set on which the operations of addition and scalar multiplication are defined.[3] The set V together with the operations of addition and scalar multiplication is said to form a vector space
    if the following axioms are satisfied:
    • A1. $x + y = y + x$ for any x and y in V.
    • A2. $(x + y) + z = x + (y + z)$ for any x, y, and z in V.
    • A3. There exists an element 0 in V such that x + 0 = x for each x ∈ V.
    • A4. For each x ∈ V, there exists an element −x in V such that x + (−x) = 0.
    • A5. $\alpha(x + y) = \alpha x + \alpha y$ for each scalar α and any x and y in V.
    • A6. $(\alpha + \beta)x = \alpha x + \beta x$ for any scalars α and β and any x ∈ V.
    • A7. $(\alpha\beta)x = \alpha(\beta x)$ for any scalars α and β and any x ∈ V.
    • A8. $1x = x$ for all x ∈ V.

$\mathbb{R}^{m \times n}$ is a Vector Space [4]

  • Proof:
    • Addition: $(A + B)_{ij} = a_{ij} + b_{ij}$ [the sum is still an m×n matrix]
    • Scalar multiplication: $(\alpha A)_{ij} = \alpha a_{ij}$ [it is still an m×n matrix]

$C[a, b]$ is a Vector Space

  • Definition:
    $C[a, b]$ is the set of all real-valued continuous functions defined on the closed interval $[a, b]$. Vectors in this case are continuous real-valued functions.
  • Proof:
    • Addition: $(f + g)(x) = f(x) + g(x)$
    • Scalar multiplication: $(\alpha f)(x) = \alpha f(x)$

$P_n$ is a Vector Space

Definition:
A polynomial of degree k (k ≥ 0) is a function of the form
$p(x) = a_k x^k + a_{k-1} x^{k-1} + \cdots + a_1 x + a_0$,
where $a_0, a_1, \ldots, a_k \in \mathbb{R}$ and $a_k \neq 0$.
The zero function is also considered to be a polynomial. $P_n$ denotes the set of all polynomials of degree less than n, together with the zero polynomial.

1.1.3. Theorem

  • If V is a vector space and x ∈V , then
    1. 0x = 0
    2. x + v = 0 implies that v = -x
    3. (-1)x = -x.
  • Proof
    1. $x = (1 + 0)x = 1x + 0x = x + 0x$; adding $-x$ to both sides gives $0x = 0$
    2. If $x + v = 0$, then $-x = -x + 0 = -x + (x + v) = (-x + x) + v = v$
    3. $0 = 0x = (1 + (-1))x = 1x + (-1)x = x + (-1)x$, therefore $(-1)x = -x$

1.2. Subspace & Null space

1.2.1. Subspaces of Vector Spaces

  • Definition:
    Let V be a vector space (for example, $\mathbb{R}^n$). A nonempty subset S of V is a subspace if it is closed under addition and closed under scalar multiplication.

To show that a subset S of a vector space forms a subspace, we must show that

  • S is nonempty (it contains the zero vector: $\mathbf{0} \in S$)
  • $x, y \in S \Rightarrow x + y \in S$ (S is closed under addition)
  • $x \in S$ and α a scalar $\Rightarrow \alpha x \in S$ (S is closed under scalar multiplication)

Trivial subspace

  • Definition:
    If V is a vector space, then V is a subspace of itself, and S = {0} is a subspace of V.
    All the other subspaces are referred to as proper subspaces.
  • We refer to {0} as the zero subspace.

1.2.2. Null space of a Matrix

  • Definition:
    For an m×n matrix A, let $N(A)$ denote the set of all solutions to the homogeneous system Ax = 0, that is, $N(A) = \{x \in \mathbb{R}^n \mid Ax = \mathbf{0}\}$
    Null space is a subspace
    • Addition: if $x, y \in N(A)$, then $Ax = \mathbf{0}$, $Ay = \mathbf{0} \Rightarrow A(x + y) = Ax + Ay = \mathbf{0}$, so $x + y \in N(A)$
    • Scalar multiplication: if $x \in N(A)$, $c \in \mathbb{R}$, then $Ax = \mathbf{0} \Rightarrow A(cx) = cAx = \mathbf{0} \Rightarrow cx \in N(A)$
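A quick numerical sanity check of this closure argument, in plain Python with an example matrix chosen for illustration:

```python
# N(A) is closed under addition and scalar multiplication.
# For this A, Ax = 0 exactly when x1 = x3 and x2 = 2*x3.

def mat_vec(A, x):
    """Matrix-vector product with plain lists."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 0, -1],
     [0, 1, -2]]

x = [1, 2, 1]   # a solution of Ax = 0
y = [3, 6, 3]   # another solution of Ay = 0
assert mat_vec(A, x) == [0, 0]
assert mat_vec(A, y) == [0, 0]

s = [xi + yi for xi, yi in zip(x, y)]   # x + y
c = [5 * xi for xi in x]                # 5x
assert mat_vec(A, s) == [0, 0]          # closed under addition
assert mat_vec(A, c) == [0, 0]          # closed under scalar multiplication
print("x + y and 5x stay in N(A)")
```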

1.2.3. The Span of a Set of Vectors

  • Definition:
    • For vectors $v_1, v_2, \ldots, v_n$ from a vector space V and scalars $c_1, c_2, \ldots, c_n$, the sum $c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$ is called a linear combination of $v_1, v_2, \ldots, v_n$.
    • The set of all possible linear combinations of $v_1, v_2, \ldots, v_n$ is called the span of $v_1, v_2, \ldots, v_n$, denoted by $\mathrm{Span}(v_1, \ldots, v_n)$:
      $\mathrm{Span}(v_1, \ldots, v_n) = \{c_1 v_1 + \cdots + c_n v_n \mid c_1, \ldots, c_n \in \mathbb{R}\}$

1.2.4. Theorem

If $v_1, \ldots, v_n$ are vectors from a vector space V, then $\mathrm{Span}(v_1, \ldots, v_n)$ is a subspace of V.

1.2.5. Spanning Set

  • Definition:
    The set of vectors $\{v_1, \ldots, v_n\}$ is a spanning set for a vector space V if and only if every vector in V can be written as a linear combination of $v_1, \ldots, v_n$; that is to say, for any $x \in V$ there exist scalars $c_1, \ldots, c_n$ such that $x = c_1 v_1 + \cdots + c_n v_n$

How to check if a set spans

  • Step 1: Put the column vectors into a matrix, $A = [v_1, \ldots, v_n]$
  • Step 2: Augment the matrix by an arbitrary vector $b$, e.g. $b = (a, b, c)^T$ in $\mathbb{R}^3$
  • Step 3: Row reduce the augmented matrix $[A \mid b]$ to check whether the system is consistent for all a, b, c
  • Step 4: If $Ax = b$ is consistent for all a, b, c, then the columns of A, $v_1, \ldots, v_n$, form a spanning set of $\mathbb{R}^3$
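The steps above amount to a rank computation: $[A \mid b]$ is consistent for every b exactly when rank(A) equals the number of rows m. A minimal sketch in plain Python with exact rational arithmetic (the matrix is an illustrative example, not from the text):

```python
# Spanning check: the columns of A span R^m iff rank(A) = m.
from fractions import Fraction

def rank(A):
    """Rank via Gaussian elimination with exact Fraction arithmetic."""
    M = [[Fraction(v) for v in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0                                   # next pivot row
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue                        # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]     # swap pivot row into place
        for i in range(rows):
            if i != r and M[i][c] != 0:     # eliminate column c elsewhere
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Do the columns (1,0,1)^T, (0,1,1)^T, (1,1,0)^T span R^3?
A = [[1, 0, 1],
     [0, 1, 1],
     [1, 1, 0]]
print(rank(A) == 3)   # True: the three columns span R^3
```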

1.3. Linear Independence

1.3.1. Linear Dependence

  • Definition:
    The vectors $v_1, \ldots, v_n$ in a vector space V are said to be linearly dependent (线性相关) if and only if there exist scalars $c_1, \ldots, c_n$, not all zero, such that $c_1 v_1 + \cdots + c_n v_n = \mathbf{0}$
  • Let $A = [v_1, \ldots, v_n]$; then $v_1, \ldots, v_n$ are linearly dependent if and only if the null space of A, $N(A)$, is NOT the zero subspace {0}.

1.3.2. Theorem 1.3.0

Nonzero vectors are linearly dependent if and only if at least one vector is a linear combination of the others.

1.3.3. Linear Independence

  • Definition:
    The vectors $v_1, \ldots, v_n$ in a vector space V are said to be linearly independent (线性无关) if $c_1 v_1 + \cdots + c_n v_n = \mathbf{0}$ implies that all the scalars $c_1, \ldots, c_n$ must equal 0.
  • Let $A = [v_1, \ldots, v_n]$; then $v_1, \ldots, v_n$ are linearly independent if and only if the null space of A is the zero subspace: $N(A) = \{\mathbf{0}\}$.
  • Equivalently, none of the vectors can be written as a linear combination of the others.

1.3.4. Theorem 1.3.1 [5]

Let $x_1, \ldots, x_n$ be n vectors in $\mathbb{R}^n$ and let $X = [x_1, \ldots, x_n]$.

  • The vectors will be linearly dependent if and only if X is singular, i.e. $\det(X) = 0$.
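A small illustration of the theorem, using a 3×3 example where the third column is the sum of the first two (so the columns are dependent and the determinant must vanish):

```python
# Theorem 1.3.1: n vectors in R^n are linearly dependent iff
# X = [x1, ..., xn] is singular, i.e. det(X) = 0.

def det3(X):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = X
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Columns: x1 = (1,0,2)^T, x2 = (0,1,3)^T, x3 = x1 + x2 = (1,1,5)^T.
X = [[1, 0, 1],
     [0, 1, 1],
     [2, 3, 5]]
print(det3(X))   # 0 -> X is singular -> the columns are linearly dependent
```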

1.3.5. Theorem 1.3.2

Let $v_1, \ldots, v_n$ be vectors in a vector space V.

  • A vector $v \in \mathrm{Span}(v_1, \ldots, v_n)$ can be written uniquely as a linear combination of $v_1, \ldots, v_n$ if and only if $v_1, \ldots, v_n$ are linearly independent.

1.4. Basis and Dimension

  • Definition:
    The vectors $v_1, \ldots, v_n$ form a basis for a vector space V if and only if
    • $v_1, \ldots, v_n$ are linearly independent.
    • $v_1, \ldots, v_n$ span V.
  • If $v_1, \ldots, v_n$ form a basis for V, then for any $b \in V$, there exist unique scalars $c_1, \ldots, c_n$ such that $b = c_1 v_1 + \cdots + c_n v_n$

1.4.1. Standard Basis of $\mathbb{R}^n$

  • Definition:
    The set $\{e_1, e_2, \ldots, e_n\}$ of n vectors, where $e_i$ has a 1 in the i-th component and 0 elsewhere, is the standard basis for $\mathbb{R}^n$.
    • Standard basis for $\mathbb{R}^2$: $\{(1, 0)^T, (0, 1)^T\}$.
    • Standard basis for $\mathbb{R}^3$: $\{(1, 0, 0)^T, (0, 1, 0)^T, (0, 0, 1)^T\}$.

1.4.2. Dimension

  • Definition:
    Let V be a vector space. If V has a basis consisting of n vectors, we say that V has dimension n and write dim V=n
  • The subspace {0} of V is said to have dimension 0.
  • V is finite dimensional if there is a finite set of vectors that spans V; otherwise V is infinite dimensional.
    • $\dim \mathbb{R}^n = n$
    • $\dim \mathbb{R}^{m \times n} = mn$
    • $\dim P_n = n$ (a basis is $\{1, x, \ldots, x^{n-1}\}$)

1.4.3. Theorem 1.4.1

  • If S = $\{v_1, \ldots, v_n\}$ is a spanning set for a vector space V, then any collection of m vectors in V with m > n is linearly dependent.

Important Corollary (1.4.2)

  • If $\{v_1, \ldots, v_n\}$ and $\{u_1, \ldots, u_m\}$ are both bases for a vector space V, then m = n.

1.4.4. Theorem 1.4.3

If V is a vector space with dim V = n > 0, then

  • Any set of n linearly independent vectors also spans V;
  • Any n vectors that span V are also linearly independent.

1.4.5. Theorem 1.4.4

If V is a vector space with dim V = n > 0, then

  • No set with k < n vectors can span V;
  • Any set of k < n linearly independent vectors can be extended to form a basis for V;
  • Any spanning set of V with m > n vectors can be cut down to form a basis for V.

1.5. Change of Basis

1.5.1. Coordinate Vector

  • Definition:
    Let V be a vector space and let E = $\{v_1, \ldots, v_n\}$ be an ordered basis for V. If v is any element of V, then v can be written uniquely in the form $v = c_1 v_1 + \cdots + c_n v_n$,
    where $c_1, \ldots, c_n$ are scalars.
    Thus, we can associate with each vector v a unique column vector $c = (c_1, \ldots, c_n)^T$. The vector c is called the coordinate vector of v with respect to (w.r.t.) the ordered basis E, denoted $[v]_E$.

1.5.2. Changing Coordinates

Given a vector $x \in \mathbb{R}^n$ (in standard coordinates), find its coordinates w.r.t. an ordered basis $\{u_1, \ldots, u_n\}$, where $U = [u_1, \ldots, u_n]$.

  • Writing $x = c_1 u_1 + \cdots + c_n u_n$, we have x = Uc.
  • Matrix U is nonsingular, therefore $c = U^{-1}x$.
  • Thus, $U^{-1}$ is the transition matrix from $\{e_1, \ldots, e_n\}$ to $\{u_1, \ldots, u_n\}$.

Given a coordinate vector c w.r.t. $\{u_1, \ldots, u_n\}$, find the corresponding vector's coordinates w.r.t. the standard basis $\{e_1, \ldots, e_n\}$.

  • If c is the coordinate vector of x with respect to the basis $\{u_1, \ldots, u_n\}$, then
    • x = Uc
  • And $c = U^{-1}x$.
  • Here $U = [u_1, \ldots, u_n]$ is called the transition matrix from $\{u_1, \ldots, u_n\}$ to the standard basis $\{e_1, \ldots, e_n\}$, and $U^{-1}$ is the transition matrix from $\{e_1, \ldots, e_n\}$ to $\{u_1, \ldots, u_n\}$.

Changing coordinate vectors from one basis $\{u_1, \ldots, u_n\}$ of $\mathbb{R}^n$ to another basis $\{w_1, \ldots, w_n\}$:

  • Suppose for a given vector x, its coordinates w.r.t. $\{u_1, \ldots, u_n\}$ are known: $x = c_1 u_1 + \cdots + c_n u_n$. We want to write x as $x = d_1 w_1 + \cdots + d_n w_n$.
  • We have $Uc = x = Wd$; let $U = [u_1, \ldots, u_n]$ and $W = [w_1, \ldots, w_n]$.
  • Then $d = W^{-1}Uc$, so $S = W^{-1}U$ is the transition matrix from $\{u_1, \ldots, u_n\}$ to $\{w_1, \ldots, w_n\}$.
  • Property
    • A transition matrix is nonsingular
    • If S is the transition matrix from $\{u_1, \ldots, u_n\}$ to $\{w_1, \ldots, w_n\}$, then $S^{-1}$ is the transition matrix from $\{w_1, \ldots, w_n\}$ to $\{u_1, \ldots, u_n\}$.
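The change-of-basis recipe can be sketched in plain Python for $\mathbb{R}^2$ (the bases U and W and the vector x are illustrative examples; `inv2` implements the 2×2 adjugate-inverse formula):

```python
# Transition matrix S = W^{-1} U: takes coordinates w.r.t. {u1, u2}
# to coordinates w.r.t. {w1, w2}.
from fractions import Fraction

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    assert det != 0, "a transition matrix is nonsingular"
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

def mat_mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

U = [[1, 0], [1, 1]]   # columns: u1 = (1,1)^T, u2 = (0,1)^T
W = [[1, 1], [0, 1]]   # columns: w1 = (1,0)^T, w2 = (1,1)^T

S = mat_mul(inv2(W), U)             # transition matrix from {u} to {w}

c = [Fraction(2), Fraction(1)]      # x = 2*u1 + 1*u2 = (2,3)^T
d = [sum(S[i][j] * c[j] for j in range(2)) for i in range(2)]
print(d)                            # x = -1*w1 + 3*w2
```

As a check, $-1 \cdot (1,0)^T + 3 \cdot (1,1)^T = (2,3)^T$, which is indeed $2u_1 + u_2$.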

1.6. Row space & column space

1.6.1. Row space & column space

For an m×n matrix A, the rows are 1×n row vectors, and the column vectors are m-vectors from $\mathbb{R}^m$.

  • Definition:
    • For an m×n matrix A, the subspace of $\mathbb{R}^{1 \times n}$ spanned by the row vectors of A is called the row space of A.
    • The subspace of $\mathbb{R}^m$ spanned by the column vectors of A is called the column space of A.

1.6.2. Theorem 1.6.1

  • Two row-equivalent matrices have the same row space.

1.6.3. Theorem 1.6.2

  • Two row-equivalent matrices have the same null space.
  • If U is the rref of A and $Ux = \mathbf{0}$, then also $Ax = \mathbf{0}$: the columns of A satisfy the same dependency relations as the columns of U.
  • All the column vectors of A corresponding to the leading variables in U form a basis of the column space of A.

1.6.4. Rank

  • Definition:
    The rank of a matrix A is the dimension of the row space of A, denoted rank(A).
  • Property: The rank of a matrix equals the number of leading variables in its reduced row echelon form

1.6.5. Dimension Theorem 1.6.3

  • dim(Row Space A) = dim(Column Space A) = rank(A).

1.6.6. Consistency theorem for Linear System 1.6.4

  • The linear system Ax = 𝐛 is consistent if and only if 𝐛 is in the column space of A.
    • An n×n square matrix A is nonsingular ⇔ the column vectors of A form a basis for $\mathbb{R}^n$ ⇔ Ax = b always has a unique solution ⇔ rank(A) = n ⇔ det(A) ≠ 0

1.6.7. Nullity

  • Definition:
    The nullity of a matrix A is the dimension of the null space of A, that is nullity(A) = dim(𝑁(A)).
  • Properties:
    • The number of leading variables: rank(A)
    • The number of free variables: nullity(A) = dim(N(A))
    • The total number of variables (columns of A): n

1.6.8. Theorem 1.6.5

  • rank(A) + dim(N(A)) = n, where n is the number of columns of A
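A sketch of the theorem on an illustrative matrix already in reduced row echelon form, counting leading and free variables in plain Python:

```python
# Rank-nullity: rank = number of leading 1's, nullity = number of
# free columns, and rank + nullity = n (the number of columns).

U = [[1, 0, 2, 0],
     [0, 1, -1, 0],
     [0, 0, 0, 1]]     # rref with leading 1's in columns 0, 1, 3

def leading_cols(U):
    """Column index of the first nonzero entry in each nonzero row."""
    lead = []
    for row in U:
        for j, v in enumerate(row):
            if v != 0:
                lead.append(j)
                break
    return lead

n = len(U[0])
rank = len(leading_cols(U))
nullity = n - rank                          # the free variable: column 2
print(rank, nullity, rank + nullity == n)   # 3 1 True
```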

1.6.9. Property

Let A be an m×n matrix. The following are equivalent:

  • The columns of A are linearly independent
  • $N(A) = \{\mathbf{0}\}$
  • nullity(A) = 0
  • Ax = b has at most 1 solution for any b

Let A be an m×n matrix. The following are equivalent:

  • The columns of A span $\mathbb{R}^m$
  • The column space of A is $\mathbb{R}^m$
  • rank(A) = m
  • Ax = b has at least 1 solution for any b


  1. Euclidean Vector Spaces↩︎
  2. Perhaps the most elementary vector spaces are the Euclidean vector spaces $\mathbb{R}^n$.↩︎
  3. By this we mean that, with each pair of elements x and y in V, we can associate a unique element x+y that is also in V, and with each element x in V and each scalar α, we can associate a unique element αx in V.↩︎
  4. Here a “vector” is an m×n matrix↩︎
  5. Square matrices only↩︎