6.4 BASIS AND DIMENSION (Review)
DEF 1 Vectors v1, v2, · · · , vk in a vector space V are said to form
a basis for V if
(a) v1, · · · , vk span V and
(b) v1, · · · , vk are linearly independent.
Natural Basis or Standard Basis of Rn, Pn:
• {e1, e2, · · · , en} is the natural basis for Rn.
• {tn, tn−1, · · · , t, 1} is the natural basis for Pn.
EX 1 (Ex. 2) Show that S = {v1, v2, v3, v4} is a basis for R4,
where
v1 = (1, 0, 1, 0), v2 = (0, 1, −1, 2), v3 = (0, 2, 2, 1), v4 = (1, 0, 0, 1).
Proof. (a) Show that S spans R4.
For any vector v = (a, b, c, d) ∈ R4, find the scalars k1, k2, k3, k4
such that
k1v1 + k2v2 + k3v3 + k4v4 = v.
For simplicity, express the vectors v1, · · · , v4 and v as column
vectors.
Define the matrix A and the column vector k,

A = (v1 v2 v3 v4),    k = (k1, k2, k3, k4)^T.

Since

Ak = (v1 v2 v3 v4) (k1, k2, k3, k4)^T = k1v1 + k2v2 + k3v3 + k4v4,

the equation is equivalent to

Ak = v,

where v on the right-hand side is expressed as a column vector.
Since the matrix

    A = (v1 v2 v3 v4) =
        [ 1   0   0   1 ]
        [ 0   1   2   0 ]
        [ 1  -1   2   0 ]
        [ 0   2   1   1 ]
is row equivalent to an identity matrix, I4, the solution exists
and is equal to
k1 = 4a + 5b − 3c − 4d,
k2 = 2a + 3b − 2c − 2d,
k3 = −a − b + c + d,
k4 = −3a − 5b + 3c + 4d.
That is, any vector v in R4 is a linear combination of v1, v2, v3, v4,
and thus span S = R4.
(b) Show that S is linearly independent.
Again, if the zero vector is expressed by
c1v1 + c2v2 + c3v3 + c4v4 = 0,
then, according to the previous results, this is equivalent to solving

Ac = 0,    where c = (c1, c2, c3, c4)^T.
Since A is row equivalent to I4, only the trivial solution is found,
c1 = c2 = c3 = c4 = 0.
Thus, S is linearly independent.
Therefore, S is a basis for R4.
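
The computation above can be checked symbolically. The following is a small verification sketch (not part of the original example), assuming SymPy is available, with the matrix A of Ex. 2 hard-coded:

    # Verification sketch for Ex. 2 (assumes SymPy).
    from sympy import Matrix, eye, symbols, linsolve

    a, b, c, d = symbols('a b c d')
    k1, k2, k3, k4 = symbols('k1 k2 k3 k4')

    # Columns of A are v1, v2, v3, v4.
    A = Matrix([[1,  0, 0, 1],
                [0,  1, 2, 0],
                [1, -1, 2, 0],
                [0,  2, 1, 1]])

    print(A.rref()[0] == eye(4))      # True: A is row equivalent to I4
    sol = linsolve((A, Matrix([a, b, c, d])), [k1, k2, k3, k4])
    print(sol)                        # should reproduce k1 = 4a+5b-3c-4d, etc.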
REMARKS:
Determine whether S = {v1, · · · , vk } is a basis of V = Rn.
• Step 0. Write the vectors in S as column vectors and form
the coefficient matrix A = (v1, · · · , vk ).
• Step 1. S spans V . For any v ∈ V , check the existence of a
solution k for Ak = v.
• Step 2. S is linearly independent. Check if the homogeneous
linear system Ak = 0 has only the trivial solution.
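
For V = Rn, a basis must contain exactly n vectors, and a square matrix of rank n is row equivalent to In, so Steps 1 and 2 collapse into a single rank test. A minimal sketch, assuming SymPy; the helper name is made up for illustration:

    # is_basis_of_Rn is a hypothetical helper, not from the text.
    from sympy import Matrix

    def is_basis_of_Rn(vectors):
        """True iff the given tuples form a basis of R^n (n = length of each tuple)."""
        A = Matrix(vectors).T                        # Step 0: vectors become the columns of A
        n = A.rows
        return len(vectors) == n and A.rank() == n   # Steps 1 and 2 in one check

    print(is_basis_of_Rn([(1, 0, 1, 0), (0, 1, -1, 2), (0, 2, 2, 1), (1, 0, 0, 1)]))  # True (Ex. 2)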
A vector space V is called finite-dimensional if it has a basis that
is a finite subset of V .
If there is no such finite subset of V , then V is called infinite-dimensional.
For example, R2, R4, P2, P4 are finite-dimensional.
But
P = {all polynomials},
C(−∞, ∞) = {all continuous functions, f : R → R }
are infinite-dimensional.
V : A vector space and S = {v1, v2, · · · , vn} ⊂ V .
THM 1 (Thm. 6.5)
S is a basis of V if and only if every vector in V can be written
in a unique way as a linear combination of the vectors in S.
THM 2 (Thm. 6.6)
Let W = span S. Then some subset T of S is a basis for W .
T=?
Find the subset T when V = Rm (p. 306)
Let V = Rm and S = {v1, · · · , vn}, where each vi ∈ Rm and n ≥ m,
and let W = span S. Find a basis T of W which is a subset of S.
SOL.
Step 1: If S is linearly independent, S is a basis of W .
Step 2: If S is not linearly independent, find and remove the set
of vectors which are linear combinations of other vectors. The
remaining set T is shown to span W .
Step 3: Show that T is linearly independent. T is thus a basis
of W .
Is S linearly independent?
Solve the homogeneous linear system of m equations in n unknowns ci’s,
c1v1 + c2v2 + · · · + cnvn = 0.
Expressing each vi as a column vector, form the coefficient matrix
Am×n = (v1 v2 · · · vn) ,
and the linear system can be written as

Ac = 0,    where c = (c1, · · · , cn)^T.
If the trivial solution is the only solution, then S is linearly independent and S itself is a basis of W .
Now suppose that nontrivial solutions exist, so that S is linearly
dependent (Step 1 fails and Step 2 continues). In particular, this
is guaranteed whenever n > m, since the number of unknowns
then exceeds the number of equations.
Assume the reduced row echelon form of A has r (r ≤ m) nonzero
rows. Without loss of generality, assume that the r leading 1s
in the nonzero rows occur in the first r columns.
Let the reduced row echelon form of A be

    B = [ 1  0  ...  0  b1,r+1  ...  b1,n ]
        [ 0  1  ...  0  b2,r+1  ...  b2,n ]
        [ ...............................  ]
        [ 0  0  ...  1  br,r+1  ...  br,n ]
        [ 0  0  ...  0    0     ...    0  ]
        [ ...............................  ]
        [ 0  0  ...  0    0     ...    0  ] ,
then c1, · · · , cr can be solved in terms of cr+1, · · · , cn.
Thus

c1 = −b1,r+1 cr+1 − · · · − b1,n cn
c2 = −b2,r+1 cr+1 − · · · − b2,n cn
...
cr = −br,r+1 cr+1 − · · · − br,n cn,

where cr+1, · · · , cn are arbitrary real values.
Q: Which vectors are linear combinations of others?
A: The vectors vr+1, · · · , vn corresponding to the columns without
leading 1s.
Show that vr+1 is a linear combination of v1, · · · , vr .
Consider the solution cr+1 = 1, cr+2 = 0, · · · , cn = 0, with
c1 = −b1,r+1, c2 = −b2,r+1, · · · , cr = −br,r+1,
of the linear system. Substituting, we have
c1v1 + c2v2 + · · · + cnvn = 0
⇔
−b1,r+1 v1 − b2,r+1 v2 − · · · − br,r+1 vr + vr+1 = 0,
⇔
b1,r+1 v1 + b2,r+1 v2 + · · · + br,r+1 vr = vr+1.
vr+1 is a linear combination of v1, · · · , vr . Thus
span {v1, · · · , vr , vr+2, · · · , vn} = span S = W,
and vr+1 can be deleted from S.
Show that vr+2 is a linear combination of v1, · · · , vr .
Consider the solution cr+1 = 0, cr+2 = 1, cr+3 = 0, · · · , cn = 0, with
c1 = −b1,r+2, c2 = −b2,r+2, · · · , cr = −br,r+2,
of the linear system. Substituting, we have
c1v1 + c2v2 + · · · + cnvn = 0
⇔
−b1,r+2 v1 − b2,r+2 v2 − · · · − br,r+2 vr + vr+2 = 0,
⇔
b1,r+2 v1 + b2,r+2 v2 + · · · + br,r+2 vr = vr+2,
vr+2 is a linear combination of v1, · · · , vr . Thus
span {v1, · · · , vr , vr+3, · · · , vn} = span S = W,
and vr+2 can be deleted from S as well.
Continuing the process, we find that vr+1, · · · , vn are linear
combinations of {v1, · · · , vr } and
span {v1, · · · , vr } = span S = W,
that is, T = {v1, · · · , vr } is a spanning set of W . (End of Step 2)
Show that T = {v1, · · · , vr } is linearly independent.
Let AD be the matrix formed by v1, · · · , vr ,
AD = (v1 v2 · · · vr ) .
Now consider solving the following homogeneous linear system,
c1v1 + · · · + cr vr = 0 ⇔ AD c = 0.
Then according to previous results, AD is row equivalent to the
matrix BD which consists of the first r columns of B,
    BD = [ 1  0  ...  0 ]
         [ 0  1  ...  0 ]
         [ ............. ]
         [ 0  0  ...  1 ]
         [ 0  0  ...  0 ]
         [ ............. ]
         [ 0  0  ...  0 ] .
Then the homogeneous systems
AD c = 0 and BD c = 0
are equivalent and have only the trivial solution,
c1 = · · · = cr = 0.
Thus, T = {v1, · · · , vr } is linearly independent.
We have found a subset T of S which is a basis of span S.
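
Computationally, the whole procedure reduces to reading off the pivot (leading-1) columns of the reduced row echelon form of A. A sketch of this recipe, assuming SymPy; the helper name subset_basis is hypothetical:

    # subset_basis is a hypothetical helper illustrating the procedure of p. 306.
    from sympy import Matrix

    def subset_basis(vectors):
        """Return the sublist of `vectors` sitting in the leading-1 columns."""
        A = Matrix(vectors).T        # columns of A are v1, ..., vn
        _, pivots = A.rref()         # pivots = 0-based indices of the leading-1 columns
        return [vectors[j] for j in pivots]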
EX 2 (Ex. 5) Let S = {v1, · · · , v5} ⊂ R4, where
v1 = (1, 2, −2, 1), v2 = (−3, 0, −4, 3), v3 = (2, 1, 1, −1)
v4 = (−3, 3, −9, 6), v5 = (9, 3, 7, −6).
Find a subset of S that is a basis for W = span S.
SOL. Form the coefficient matrix A,
    A = (v1 v2 v3 v4 v5) =
        [  1  -3   2  -3   9 ]
        [  2   0   1   3   3 ]
        [ -2  -4   1  -9   7 ]
        [  1   3  -1   6  -6 ] .
Since the reduced row echelon form of A,
    B = [ 1  0   1/2   3/2   3/2 ]
        [ 0  1  -1/2   3/2  -5/2 ]
        [ 0  0    0     0     0  ]
        [ 0  0    0     0     0  ] ,
has leading 1s in columns 1 and 2.
So {v1, v2} is a basis for W = span S.
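
A quick check of Ex. 5 (a verification sketch, assuming SymPy; the printed column indices are 0-based):

    from sympy import Matrix

    A = Matrix([[ 1, -3,  2, -3,  9],
                [ 2,  0,  1,  3,  3],
                [-2, -4,  1, -9,  7],
                [ 1,  3, -1,  6, -6]])
    B, pivots = A.rref()
    print(pivots)     # (0, 1): leading 1s in columns 1 and 2, so T = {v1, v2}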
REMARK:
The basis for W depends on the order of the vectors in the
original set S.
(Ex. 5 Cont.) Let S = {v4, v3, v2, v1, v5}, then the matrix A and
its reduced row echelon form B are
    A = (v4 v3 v2 v1 v5),   B = [ 1  0  1/3  1/3  -1/3 ]
                                [ 0  1  -1    1     4  ]
                                [ 0  0   0    0     0  ]
                                [ 0  0   0    0     0  ] .
Thus {v4, v3} is also a basis for W = span S.
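
The same check with the columns reordered as (v4 v3 v2 v1 v5), as a self-contained sketch assuming SymPy:

    from sympy import Matrix

    A2 = Matrix([[-3,  2, -3,  1,  9],
                 [ 3,  1,  0,  2,  3],
                 [-9,  1, -4, -2,  7],
                 [ 6, -1,  3,  1, -6]])
    print(A2.rref()[1])     # (0, 1): the pivot columns now hold v4 and v3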
REMARK:
A nonzero real vector space always has infinitely many bases.
Corollary. (T.9) If {v1, v2, · · · , vn} is a basis for a vector space V ,
then {cv1, v2, · · · , vn} is also a basis if c ≠ 0.
THM 3 (Thm.6.7) If S = {v1, · · · , vn} is a basis for a vector
space V and T = {w1, · · · , wr } ⊂ V is a linearly independent set,
then r ≤ n.
PROOF.
Let T1 = {w1, v1, · · · , vn}. Then
span S = span T1 = V.
Since w1 is a linear combination of the vectors in S, T1 is linearly dependent.
By Thm. 6.4, some vj is a linear combination of the preceding
vectors in T1. Deleting vj from T1, we have
S1 = {w1, v1, · · · , vj−1, vj+1, · · · , vn}.
Since any vector in V can be expressed by a linear combination
of S1, S1 spans V .
Next, let
T2 = {w2, w1, v1, · · · , vj−1, vj+1, · · · , vn}.
Then, since w2 is a linear combination of the vectors in S1, T2 is linearly dependent.
Some vector in T2 is a linear combination of the preceding vectors.
Since w1, w2 from T are linearly independent, this vector cannot
be w2 and must be some vi, i ≠ j.
Deleting the vi from T2, we have S2 and S2 spans V .
Repeat this process.
Suppose r > n. Continuing as above, one obtains
Sn = {wn, wn−1, · · · , w1},
and Sn spans V . The set
Tn+1 = {wn+1, wn, wn−1, · · · , w1}
would then be linearly dependent, which contradicts the assumed
linear independence of T .
Thus r ≤ n.
REMARKS: Let S = {v1, · · · , vn} and T = {w1, · · · , wr } be subsets
of a vector space V .
(1) If S is a basis and r > n, then T is a linearly dependent set.
(2) If T is linearly independent and r > n, then S cannot be a basis
of V .
COROLLARY 6.1 If S = {v1, v2, · · · , vn} and T = {w1, w2, · · · , wm}
are bases for a vector space, then n = m.
PROOF.
Since S is a basis and T is linearly independent, Thm. 6.7 implies
that m ≤ n.
Similarly, since T is a basis and S is linearly independent,
n ≤ m.
Thus, m = n.
DIMENSION
DEF 2 The dimension of a nonzero vector space V is the number of vectors in a basis for V .
For the zero vector space {0}: it is spanned by the set {0}, which
is linearly dependent and hence is not a basis. The zero vector
space is therefore defined to have dimension 0.
EX 3 (Ex. 6, 7)
The dimension of R2 is 2; the dimension of R3 is 3; in general,
the dimension of Rn is n. (Hint: The standard basis of Rn?)
The dimension of P2 is 3; the dimension of P3 is 4; in general,
the dimension of Pn is n + 1. (Hint: The standard basis of Pn?)
EX 4 (Ex. 8) In Ex. 5, S = {v1, · · · , v5} ⊂ R4, where
v1 = (1, 2, −2, 1), v2 = (−3, 0, −4, 3), v3 = (2, 1, 1, −1)
v4 = (−3, 3, −9, 6), v5 = (9, 3, 7, −6),
and W = span S.
There we found that {v1, v2} is a basis of the subspace W = span S.
Thus W has dimension 2.
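
Equivalently, dim W is the rank of the matrix whose columns are v1, · · · , v5 (a one-line check, assuming SymPy):

    from sympy import Matrix

    A = Matrix([(1, 2, -2, 1), (-3, 0, -4, 3), (2, 1, 1, -1),
                (-3, 3, -9, 6), (9, 3, 7, -6)]).T    # columns are v1, ..., v5
    print(A.rank())      # 2, so dim W = 2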
REMARKS:
(T.2) If V is a finite-dimensional vector space, then every nonzero
subspace W of V has a finite basis and
dim W ≤ dim V.
Hint: The basis of W is linearly independent in V .
(T.3, T.4) If a vector space V has dimension n, then
• any set of n + 1 vectors in V is linearly dependent.
(Hint: Simply by Thm. 6.7)
• no set of n − 1 vectors can span V .
(Hint: If there exists such a set, dim V < n. Contradiction.)
THM 4 (Thm 6.8)(T.5) If S is a linearly independent set in a
finite-dimensional vector space V , then there is a basis T for V ,
which contains S.
PROOF. (Using an approach similar to that of Thm. 6.7)
Assume that dim V = n and a basis of V is T0 = {v1, · · · , vn}.
Let S = {w1, · · · , wr }; then by Thm. 6.7, r ≤ n.
Consider the set
T1 = {w1, v1, · · · , vn}.
T1 spans V and is linearly dependent, so there exists some vi
which is a linear combination of the preceding vectors. Deleting
vi, the resulting set S1 spans V and has n vectors, so it is a basis of V .
Further consider the set
T2 = {w2, w1, v1, · · · , vi−1, vi+1, · · · , vn},
again, T2 spans V but is linearly dependent. Since w2, w1 are
linearly independent, there exists one vj which is a linear combination of the preceding vectors. Deleting vj, the resulting set S2
is a basis of V .
Repeat the process until every vector of S has been inserted; the
resulting set is then a basis of V that contains S.
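
The proof's recipe can be sketched as a procedure: append the natural basis e1, · · · , en after the vectors of S and keep the pivot columns; since S is linearly independent and comes first, all of S survives. SymPy is assumed and the helper name extend_to_basis is made up for illustration:

    # extend_to_basis is a hypothetical helper following Thm. 6.8.
    from sympy import Matrix, eye

    def extend_to_basis(vectors):
        """vectors: linearly independent tuples in R^n; returns a basis of R^n containing them."""
        n = len(vectors[0])
        A = Matrix.hstack(Matrix(vectors).T, eye(n))   # columns: the vectors of S, then e1, ..., en
        cols = list(vectors) + [tuple(eye(n).col(j)) for j in range(n)]
        _, pivots = A.rref()
        return [cols[j] for j in pivots]

    print(extend_to_basis([(1, 0, 1, 0), (-1, 1, -1, 0)]))   # recovers the basis found in Ex. 9 below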
EX 5 (Ex. 9) Find a basis for R4 that contains the vectors
v1 = (1, 0, 1, 0), and v2 = (−1, 1, −1, 0).
SOL. Consider the natural basis of R4,
e1 = (1, 0, 0, 0), e2 = (0, 1, 0, 0), e3 = (0, 0, 1, 0), e4 = (0, 0, 0, 1).
Let S = {v1, v2, e1, e2, e3, e4}.
Then S spans V but is linearly dependent.
Now use the strategy on p. 306 to find a subset of S which is a
basis for R4 = span S.
Form the coefficient matrix
    A = (v1 v2 e1 e2 e3 e4) =
        [ 1  -1  1  0  0  0 ]
        [ 0   1  0  1  0  0 ]
        [ 1  -1  0  0  1  0 ]
        [ 0   0  0  0  0  1 ] .
Then it can be found that the reduced row echelon form of A,
    [ 1  0  0  1   1  0 ]
    [ 0  1  0  1   0  0 ]
    [ 0  0  1  0  -1  0 ]
    [ 0  0  0  0   0  1 ] ,
has leading 1s in columns 1,2,3,6.
Thus, {v1, v2, e1, e4} is a basis for R4 containing v1 and v2.
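
A verification sketch of Ex. 9 (assuming SymPy; the printed column indices are 0-based):

    from sympy import Matrix

    A = Matrix([[1, -1, 1, 0, 0, 0],
                [0,  1, 0, 1, 0, 0],
                [1, -1, 0, 0, 1, 0],
                [0,  0, 0, 0, 0, 1]])
    print(A.rref()[1])     # (0, 1, 2, 5): columns 1, 2, 3, 6, i.e. {v1, v2, e1, e4}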
THM 5 (Thm. 6.9)(T.6) Let V be an n-dimensional vector
space, and let S = {v1, v2, · · · , vn} be a set of n vectors in V .
(a) If S is linearly independent, then it is a basis for V .
(Hint: By Thm.6.8)
(b) If S spans V , then it is a basis for V .
(Hint: If S were linearly dependent, then dim V < n. Contradiction!)
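
For n vectors in an n-dimensional space the two conditions stand or fall together, so a single test suffices; e.g. for Ex. 2 one nonzero determinant settles both (a verification sketch, assuming SymPy):

    from sympy import Matrix

    A = Matrix([[1,  0, 0, 1],
                [0,  1, 2, 0],
                [1, -1, 2, 0],
                [0,  2, 1, 1]])    # columns are v1, v2, v3, v4 of Ex. 2
    print(A.det())                 # 1 (nonzero), so S is a basis for R4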
EX 6 (Ex. 10)
In Ex. 5, S ⊂ R4 and W = span S, thus dim W ≤ 4. Since S
contains 5 vectors, S can’t be linearly independent and thus is
not a basis for W .
In Ex. 2, S ⊂ R4 and S contains 4 vectors. Thus S is possibly
a basis for R4. By Thm. 6.9, it suffices to check only one of the
two conditions: that S is linearly independent, or that S spans R4.
PART I:
EXERCISE 2(a), 4(a), 6, 7, 9, 14, 18, 21, 30, 32, 35
T1, T8, T9, T10, T12, T13
PART II:
EXERCISE 11, 13, 15, 28
T1, T2, T3, T4, T6, T7