Lecture 4
January 17, 2012

1 Linear Combination and Spanning Set
We have seen the notions of orthogonality and orthogonal complements. We have identified four basic subspaces with a matrix A ∈ Rm×n, namely
1. NA and RAT - subspaces of Rn, and
2. NAT and RA - subspaces of Rm.
Basically we would like to analyse these subspaces. Now, for any positive integer k, any subspace of Rk (other than the zero subspace) has an infinite number of vectors. At the outset, therefore, it appears that in order to analyse a subspace we have to look at an infinite number of vectors, which is impractical. Therefore we try to do a “sampling” of the subspace. While working out the strategy for sampling we must try to keep the sampling size as small as possible. We shall look at sampling from the matrix point of view. Suppose we have a subspace W of Rk and u and v are vectors in W. Suppose we sample what A does to u and v. Then we know Au and Av. From this knowledge how much information can we extract? If x ∈ W is of the form

x = αu + βv, where α, β ∈ R    (1.1)

then we get

Ax = A(αu + βv) = αAu + βAv    (1.2)

Since we know x in the form (1.1) we know α and β, and we know Au and Av by our sampling. Hence from (1.2) we can get Ax. Thus sampling at u and v enables us to get information about all vectors x of the form in (1.1). Such vectors are called linear combinations of u and v. In general, we give the following definition:
Definition 1.1 Let u1, u2, · · ·, ur be a set of vectors in Rk. Then any vector of the form

x = α1u1 + α2u2 + · · · + αrur, where αj ∈ R for 1 ≤ j ≤ r,

is called a “linear combination” of the vectors u1, u2, · · ·, ur.
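As a small numerical sketch of this sampling idea (not from the lecture; the matrix A, the vectors u and v, and the coefficients below are hypothetical choices), we can check in NumPy that Ax computed directly agrees with the value reconstructed from the two samples, as in (1.2):

import numpy as np

# A hypothetical matrix A and two vectors u, v in a subspace W.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

# "Sampling": record what A does to u and to v.
Au, Av = A @ u, A @ v

# Any linear combination x = alpha*u + beta*v ...
alpha, beta = 3.0, -2.0
x = alpha * u + beta * v

# ... has Ax recoverable from the two samples alone, as in (1.2).
assert np.allclose(A @ x, alpha * Au + beta * Av)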
From our analysis above, it follows that, if we sample what A does to the vectors u1, u2, · · ·, ur, then we can extract information about what A does to all the linear combinations of the vectors u1, u2, · · ·, ur. We denote by L[u1, u2, · · ·, ur] the set of all linear combinations of the vectors u1, u2, · · ·, ur. We have


L[u1, u2, · · ·, ur] = { x ∈ Rk : x = α1u1 + α2u2 + · · · + αrur, where αj ∈ R, 1 ≤ j ≤ r }    (1.3)
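In matrix terms, if the vectors u1, u2, · · ·, ur are placed as the columns of a matrix U, then a linear combination α1u1 + · · · + αrur is exactly the matrix-vector product Uα. A minimal sketch (the vectors and coefficients are made-up illustrations):

import numpy as np

# Columns of U are the vectors u1, ..., ur (hypothetical values).
U = np.column_stack([np.array([1.0, 2.0, 0.0]),
                     np.array([0.0, 1.0, 1.0])])

# The linear combination x = 2*u1 + (-1)*u2 is just U @ alpha.
alpha = np.array([2.0, -1.0])
x = U @ alpha
print(x)  # [ 2.  3. -1.]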
Example 1.1 Consider R3, and

u = (1, 0, 0)ᵀ,  v = (0, 1, 0)ᵀ
Then clearly

L[u, v] = { x ∈ R3 : x = (α, β, 0)ᵀ, α, β ∈ R }

This is actually the XY plane. Whereas the vector x = (1, −4, 0)ᵀ is in L[u, v], the vector w = (1, 2, 3)ᵀ is not in L[u, v].
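A numerical way to test membership in L[u, v] for this example is to solve for the coefficients by least squares and check whether the fit is exact; this is only a sketch (the helper in_span and its tolerance are hypothetical choices, while np.linalg.lstsq is standard NumPy):

import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
U = np.column_stack([u, v])

def in_span(x, U, tol=1e-10):
    # Find the best coefficients alpha with U @ alpha ≈ x,
    # then check whether the fit is exact (up to round-off).
    alpha, *_ = np.linalg.lstsq(U, x, rcond=None)
    return np.linalg.norm(U @ alpha - x) < tol

print(in_span(np.array([1.0, -4.0, 0.0]), U))  # True: in L[u, v]
print(in_span(np.array([1.0, 2.0, 3.0]), U))   # False: not in L[u, v]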
Example 1.2 Consider the following vectors in R4:

u = (1, 1, 0, 0)ᵀ,  v = (0, 0, 1, 1)ᵀ
Then clearly

L[u, v] = { x ∈ R4 : x = (α, α, β, β)ᵀ, α, β ∈ R }
This is the set of all vectors in R4 which have the first two components equal
and the last two components equal.
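A quick numerical check of this description (the random coefficients are arbitrary):

import numpy as np

u = np.array([1.0, 1.0, 0.0, 0.0])
v = np.array([0.0, 0.0, 1.0, 1.0])

rng = np.random.default_rng(0)
alpha, beta = rng.standard_normal(2)
x = alpha * u + beta * v

# First two components equal, last two components equal.
assert x[0] == x[1] and x[2] == x[3]
print(x)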
We now observe an important property of the set of all linear combinations of a given set of vectors. Consider a set of vectors

S = {u1, u2, · · ·, ur}

in Rk. Let L[S] denote the set of all linear combinations of these vectors. We observe that,
1. Clearly the uj are all in L[S], and hence L[S] is a nonempty subset of Rk.
2. We shall now see that L[S] is closed under addition. We have,
x, y ∈ L[S] =⇒ x =
r
X
αj uj , and y =
j=1
=⇒ x + y =
=⇒ x + y =
r
X
βj uj where αj , βj ∈ R
j=1
r
X
(αj + βj )uj
j=1
r
X
γj uj where γj = (αj + βj ) ∈ R
j=1
=⇒ x + y ∈ L[S]
3
Thus L[S] is closed under addition.
3. Next we observe that L[S] is closed under scalar multiplication. We have

x ∈ L[S], α ∈ R =⇒ x = α1u1 + α2u2 + · · · + αrur, where αj ∈ R
=⇒ αx = (αα1)u1 + · · · + (ααr)ur
=⇒ αx = γ1u1 + · · · + γrur, where γj = ααj ∈ R
=⇒ αx ∈ L[S]

Thus L[S] is closed under scalar multiplication.
From the above three properties it follows that L[S] is a subspace of Rk.
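Both closure arguments can be checked numerically in coefficient form: if the uj are the columns of a matrix U, then x + y and αx correspond to adding and scaling the coefficient vectors. A small sketch (the vectors and scalars are hypothetical):

import numpy as np

# Columns of U span L[S] (hypothetical vectors in R3).
U = np.column_stack([np.array([1.0, 0.0, 2.0]),
                     np.array([0.0, 1.0, 1.0])])

a = np.array([1.0, 3.0])    # x = U @ a is in L[S]
b = np.array([-2.0, 5.0])   # y = U @ b is in L[S]
alpha = 4.0

x, y = U @ a, U @ b
# x + y is again a combination, with coefficients a + b ...
assert np.allclose(x + y, U @ (a + b))
# ... and alpha*x is a combination with coefficients alpha*a.
assert np.allclose(alpha * x, U @ (alpha * a))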
We now observe another important property of this subspace. Consider the set S of vectors u1, u2, · · ·, ur. Clearly, this finite set will not, in general, itself form a subspace of Rk. So suppose we wish to put it inside a subspace. Let W be any subspace that contains the S vectors. Since W is a subspace it is closed under addition and scalar multiplication. Therefore, since W contains the S vectors, it must contain all the linear combinations of the S vectors, and hence

L[S] ⊆ W

Thus every subspace that contains S must contain L[S]. But L[S] is itself a subspace that contains S. We can, therefore, conclude that

L[S] is the smallest subspace containing S.
Now if W is any subspace and S is a subset of W such that L[S] = W, then every vector in W is a linear combination of the S vectors. Hence sampling what A does to the S vectors will enable us to extract information about all the vectors in W, and so S can be taken as a sampling set for W. We call such sets “spanning sets” for W. We have,

Definition 1.2 A subset S of a subspace W of Rk is said to be a “spanning set” for W if L[S] = W.
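Numerically, one can test whether a finite set spans a subspace by comparing ranks; for instance, S spans all of Rk exactly when the matrix with the S vectors as columns has rank k. A sketch for W = R3 (the helper spans_Rk and the example vectors are made up, while np.linalg.matrix_rank is standard NumPy):

import numpy as np

def spans_Rk(vectors, k):
    # S spans Rk iff the column matrix has full rank k.
    U = np.column_stack(vectors)
    return np.linalg.matrix_rank(U) == k

e1, e2, e3 = np.eye(3)
print(spans_Rk([e1, e2, e3], 3))       # True
print(spans_Rk([e1, e2, e1 + e2], 3))  # False: all lie in the XY plane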
2 Linear Independence
Consider a spanning set

S = {u1, u2, · · ·, ur}

for a subspace W of Rk. Then clearly the zero vector θk can be expressed as a linear combination of these vectors as

θk = 0u1 + 0u2 + · · · + 0ur

The linear combination in which all the coefficients are zero is called the “trivial linear combination”. Thus the zero vector is the trivial linear combination of any set S of vectors in W. Suppose the set S is such that the zero vector can be expressed only as the trivial linear combination of the S vectors; then the set is called “linearly independent”. A linearly independent spanning set for W is called a “basis” for W. Thus a basis is a special type of spanning set.
For a set S = {u1, u2, · · ·, ur} to qualify as a spanning set for W we must have:
1. u1, u2, · · ·, ur are all in W,
2. every vector in W can be expressed as a linear combination of the S vectors, that is, L[S] = W.
For a set S to qualify as a basis for W we must have linear independence, in addition to the above properties; that is, we must have:
1. u1, u2, · · ·, ur are all in W,
2. every vector in W can be expressed as a linear combination of the S vectors, that is, L[S] = W, and
3. S is linearly independent.
We can show the following properties of a basis:
1. Any two bases for a subspace W of Rk must have the same number of vectors. This number is called the “dimension” of the subspace.
2. For a d-dimensional subspace W, any d linearly independent vectors in W will form a basis (a numerical check of independence is sketched after this list).
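As a sketch of that check (the helper linearly_independent and the example vectors are hypothetical), one can test linear independence by verifying that the column matrix of the vectors has rank equal to the number of vectors:

import numpy as np

def linearly_independent(vectors):
    # Independent iff rank equals the number of vectors,
    # i.e. only the trivial combination gives the zero vector.
    U = np.column_stack(vectors)
    return np.linalg.matrix_rank(U) == U.shape[1]

u = np.array([1.0, 1.0, 0.0])
v = np.array([0.0, 1.0, 1.0])
print(linearly_independent([u, v]))         # True
print(linearly_independent([u, v, u + v]))  # False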
We shall now look at an important property of an orthonormal set. Let

S = {ϕ1, ϕ2, · · ·, ϕr}

be an orthonormal set in Rk. Then

α1ϕ1 + α2ϕ2 + · · · + αrϕr = θk
=⇒ (ϕj, α1ϕ1 + α2ϕ2 + · · · + αrϕr) = (ϕj, θk) = 0 for all j
=⇒ αj = 0 for all j

(since (ϕj, ϕi) = 0 if j ≠ i and (ϕj, ϕj) = 1)

Thus we have that S is linearly independent. Hence we get the important conclusion,
Every orthonormal set in Rk is linearly independent.

If an orthonormal set (which is automatically linearly independent) in Rk is a basis for a subspace W, then it is called an “orthonormal basis”.
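This argument can be illustrated numerically: the Q factor of a QR decomposition has orthonormal columns, and taking inner products of a combination with each column recovers the coefficients, so only the trivial combination can give the zero vector. A sketch (the starting random matrix is arbitrary):

import numpy as np

# Get an orthonormal set as the columns of Q (QR of an arbitrary matrix).
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 3)))

# Orthonormality: (phi_j, phi_i) = 0 for j != i and 1 for j = i.
assert np.allclose(Q.T @ Q, np.eye(3))

# If Q @ alpha were the zero vector, taking inner products with each
# column would give alpha = Q.T @ (Q @ alpha) = 0: only the trivial
# combination works. Here we just verify the coefficient recovery.
alpha = np.array([0.3, -1.2, 0.7])
x = Q @ alpha
assert np.allclose(Q.T @ x, alpha)  # coefficients are recovered exactly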