Vector Subspaces
Introduction
Recall that a vector space consists of a set V together with two operations, addition and scalar multiplication. Moreover, the addition operation satisfies the abelian-group axioms, while scalar multiplication
satisfies the scalar axioms. Now suppose that W ⊆ V is a subset of V. Then addition and scalar multiplication are also defined on the elements of W. Does this make W a vector space? Not necessarily.
The following are ways in which W can fall short of being a vector space. In what follows, we
assume that V = R2 .
Operations not Closed Suppose W consists of the single element (1, 1). Then (1, 1) + (1, 1) =
(2, 2) ∉ W, and so W is not closed under addition. Similarly, 2(1, 1) = (2, 2) ∉ W, and W is
also not closed under scalar multiplication.
No Zero Vector If (0, 0) ∉ W, then addition no longer has the required identity element.
Missing Inverses Suppose W consists of all vectors (x, y) in Quadrant I, i.e. x ≥ 0 and y ≥ 0.
Then (1, 1) ∈ W, but its additive inverse (−1, −1) is not in W.
The above examples suggest that many, if not most, subsets of a vector space will not themselves
be vector spaces. However, whenever a subset W is closed under addition and scalar multiplication
and still satisfies the abelian and scalar axioms, we call W a subspace of V. Moreover, the following
theorem tells us that closure under addition and scalar multiplication is not only necessary for W
to be a subspace, but is also sufficient.
Theorem 1. Let V be a vector space with addition and scalar-multiplication operations + and ·, and
let W ⊆ V be a nonempty subset of V. Then W is a subspace of V iff W is closed under + and ·. In
other words, W is a subspace provided
1. For all u, v ∈ W, u + v ∈ W.
2. If v ∈ W and r is a scalar, then r · v ∈ W.
Proof of Theorem 1. Since addition is associative and commutative in V, it must also be associative
and commutative in W. Moreover, if u, v, and w are all vectors in W, then by 1. so are u + (v + w),
(u + v) + w, u + v, and v + u. Furthermore, since W is nonempty, there is some v ∈ W, and 0v = 0 ∈ W
by 2. So W contains the identity element. Also by 2., for any v ∈ W, (−1)v = −v ∈ W. So every vector
in W also has its additive inverse in W. Therefore, addition over W satisfies the abelian axioms.
As for the scalar axioms, since W is a subset of V, the only way an axiom equation would not hold
in W is if the left or right side evaluated to a vector in W, but the other side evaluated to something
not in W. We argue that this can never happen for two of the axioms, and leave the other two as
exercises.
Consider the Unit Scalar axiom. If v ∈ W, then v ∈ V, and so 1v = v, which lies in W.
Now consider the Distributive over V axiom. If u, v ∈ W, then u + v ∈ W by 1., and r(u + v) ∈ W
by 2. Moreover, by 2., ru, rv ∈ W, and so ru + rv ∈ W by 1. Therefore, the left and right sides of the
axiom equation both evaluate to a vector in W.
Subspace Examples
Example 1. Let W be the subset of R2 consisting of all scalar multiples of the vector (1, 2). We
show that W is a subspace of R2 . The first step is to show that W is closed under addition. Let
u, v ∈ W be arbitrary. Since u and v are both scalar multiples of (1, 2), we may write u = r(1, 2)
and v = s(1, 2), where r and s are scalars. Then
u + v = r(1, 2) + s(1, 2) = (r + s)(1, 2)
is also a scalar multiple of (1, 2), and hence is in W. What scalar axiom was used in the above chain
of equalities?
The second step is to show that W is closed under scalar multiplication. Let v ∈ W be arbitrary.
Since v is a scalar multiple of (1, 2), we may write v = r(1, 2) for some scalar r. Then, for any
scalar s,
sv = s(r(1, 2)) = (sr)(1, 2)
is also a scalar multiple of (1, 2), and hence is in W. What scalar axiom was used in the above chain
of equalities?
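The closure computations of Example 1 can also be spot-checked numerically. Below is a minimal sketch
(assuming Python with numpy; the helper name in_W is made up for illustration) that samples random
elements of W and confirms that sums and scalar multiples remain scalar multiples of (1, 2).

import numpy as np

def in_W(w, tol=1e-9):
    # Membership test for W = all scalar multiples of (1, 2):
    # w lies in W exactly when w equals r(1, 2) with r = w[0].
    return np.allclose(w, w[0] * np.array([1.0, 2.0]), atol=tol)

rng = np.random.default_rng(0)
for _ in range(1000):
    r, s = rng.normal(size=2)
    u = r * np.array([1.0, 2.0])   # u = r(1, 2), an element of W
    v = s * np.array([1.0, 2.0])   # v = s(1, 2), an element of W
    assert in_W(u + v)             # closure under addition
    assert in_W(rng.normal() * u)  # closure under scalar multiplication

print("All sampled sums and scalar multiples stayed in W.")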
Example 2. Recall that M2,2 denotes the vector space of 2 × 2 matrices that uses matrix addition
and matrix scalar multiplication. Let W be the subset of M2,2 that consists of matrices whose
off-diagonal elements equal zero. Show that W is a subspace of M2,2 .
Example 3. Recall that F(R, R) denotes the vector space of functions having domain and co-domain
equal to R. Let W be the subset of F(R, R) that consists of functions f (x) for which f (1) = 0.
Show that W is a subspace of F(R, R). Would W still be a subspace if we required f (1) = 2?
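As a quick numerical illustration of the closure conditions in Example 3 (a sketch only, in Python;
the particular functions are chosen arbitrarily), sums and scalar multiples of functions vanishing
at 1 also vanish at 1.

# Two sample functions with f(1) = 0 and g(1) = 0.
f = lambda x: x**2 - 1
g = lambda x: 3 * (x - 1)

# Their sum and a scalar multiple, formed pointwise as in F(R, R).
h = lambda x: f(x) + g(x)       # (f + g)(x)
k = lambda x: -2.5 * f(x)       # (-2.5 f)(x)

assert h(1) == 0 and k(1) == 0  # both still satisfy the defining condition
print("f + g and -2.5 f both vanish at x = 1.")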
Example 4. Recall that a homogeneous system of m equations over n variables x1 , . . . , xn can be
represented as AX = 0, where A is the coefficient matrix of the system and X = [x1 , . . . , xn ]^T is
the column vector of the variables.
A solution vector to the equation is any n × 1 vector v for which Av = 0. Show that the set of
solution vectors is a subspace of Rn . Note that this subspace is called the solution space to the
system of equations. Would the set of solution vectors still form a subspace for the general case
AX = B, where B is an m × 1 matrix whose entries are not necessarily 0?
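The following sketch (Python with numpy; the coefficient matrix A is an arbitrary choice for
illustration) shows the closure properties behind Example 4 in action: if Av = 0 and Aw = 0, then
A(v + w) = Av + Aw = 0 and A(rv) = r(Av) = 0.

import numpy as np

# A hypothetical 2 x 3 homogeneous system AX = 0.
A = np.array([[1.0, 2.0, -1.0],
              [2.0, 4.0, -2.0]])

# Two solution vectors, found by inspection and verified below.
v = np.array([1.0, 0.0, 1.0])
w = np.array([-2.0, 1.0, 0.0])
assert np.allclose(A @ v, 0) and np.allclose(A @ w, 0)

# Sums and scalar multiples of solutions are again solutions.
assert np.allclose(A @ (v + w), 0)
assert np.allclose(A @ (3.7 * v), 0)
print("v + w and 3.7 v are also solutions of AX = 0.")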
Linear Combinations
Given vectors v1 , . . . , vn ∈ V, a linear combination of v1 , . . . , vn is any sum of the form
c1 v1 + · · · + cn vn ,
where c1 , . . . , cn are scalars. Note that a linear combination of vectors is itself a vector. In particular,
u is a linear combination of v1 , . . . , vn iff there exist scalars c1 , . . . , cn for which u = c1 v1 + · · · + cn vn .
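When the vi are column vectors in Rn, forming a linear combination amounts to a matrix-vector
product: if the columns of a matrix V are v1 , . . . , vn , then c1 v1 + · · · + cn vn = V c. A small numpy
sketch of this viewpoint (the vectors and scalars below are arbitrary illustrative choices):

import numpy as np

# Three vectors in R^3 and three scalars, chosen arbitrarily.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
v3 = np.array([3.0, 1.0, 1.0])
c  = np.array([2.0, -1.0, 0.5])

# Stacking v1, v2, v3 as the columns of V, the linear combination
# c1*v1 + c2*v2 + c3*v3 equals the matrix-vector product V @ c.
V = np.column_stack([v1, v2, v3])
assert np.allclose(V @ c, 2.0 * v1 - 1.0 * v2 + 0.5 * v3)
print(V @ c)   # [ 3.5 -0.5  5.5]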
Example 5. Given R2 vectors v1 = (−3, 4) and v2 = (2, 5),
1v1 + 3v2 = 1(−3, 4) + 3(2, 5) = (−3, 4) + (6, 15) = (3, 19).
Therefore, (3, 19) is a linear combination of v1 and v2 .
Example 6. Show that every vector in R3 is a linear combination of vectors e1 = (1, 0, 0), e2 =
(0, 1, 0), and e3 = (0, 0, 1).
Example 7. Is u = (9, 2, 7) a linear combination of v1 = (1, 2, −1) and v2 = (6, 4, 2)? If yes, provide
c1 and c2 . If not, explain why. Same question, but now u = (4, −1, 8).
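Example 7 amounts to asking whether a certain 3 × 2 linear system has a solution. One way to check
this numerically is sketched below (Python with numpy; np.linalg.lstsq returns the least-squares
coefficients, and a zero residual means u really is a linear combination of v1 and v2).

import numpy as np

v1 = np.array([1.0, 2.0, -1.0])
v2 = np.array([6.0, 4.0, 2.0])
V = np.column_stack([v1, v2])      # columns are v1 and v2

for u in (np.array([9.0, 2.0, 7.0]), np.array([4.0, -1.0, 8.0])):
    c, *_ = np.linalg.lstsq(V, u, rcond=None)
    if np.allclose(V @ c, u):
        print(u, "is a linear combination; c1, c2 =", c)
    else:
        print(u, "is not a linear combination of v1 and v2.")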
Linear Spans
Let S be a subset of vector space V. The linear span of S, denoted span(S), is the set of all possible
linear combinations of the form c1 v1 + · · · + cn vn , where v1 , . . . , vn ∈ S, and c1 , . . . , cn are scalars.
Note that a linear combination in span(S) need not use all vectors in S. In fact, S could be an
infinite set.
Theorem 2. Let S be a nonempty subset of vector space V. Then span(S) is a subspace of V.
Moreover, any other subspace W of V that contains S must also contain span(S). In other words,
span(S) is the smallest subspace that contains S.
Proof of Theorem 2. To prove that span(S) is a subspace, it must be shown that span(S) is
closed under both addition and scalar multiplication. Let u, v ∈ span(S) be arbitrary. Then there
are vectors v1 , . . . , vn ∈ S for which u = c1 v1 + · · · + cn vn and v = d1 v1 + · · · + dn vn , where c1 , . . . , cn
and d1 , . . . , dn are scalars (why may we assume that u and v are linear combinations of the exact
same vectors v1 , . . . , vn from S?). Then
u + v = (c1 v1 + · · · + cn vn ) + (d1 v1 + · · · + dn vn ) = (c1 v1 + d1 v1 ) + · · · + (cn vn + dn vn ) =
(c1 + d1 )v1 + · · · + (cn + dn )vn
is a linear combination of vectors from S, and hence u + v ∈ span(S). Similarly, if r is a scalar, then
ru = r(c1 v1 + · · · + cn vn ) = r(c1 v1 ) + · · · + r(cn vn ) = (rc1 )v1 + · · · + (rcn )vn
is a linear combination of vectors from S, and hence ru ∈ span(S).
Finally, suppose W is a subspace of V and contains S. Let u = c1 v1 + · · · + cn vn be an arbitrary
element in span(S), where v1 , . . . , vn ∈ S and c1 , . . . , cn are scalars. Then, since S ⊆ W, it follows
that v1 , . . . , vn ∈ W. Moreover, since W is a subspace, it is closed under both addition and scalar
multiplication. In other words, any linear combination of v1 , . . . , vn must also be in W. In particular,
u ∈ W. And since u ∈ span(S) was arbitrary, it follows that span(S) is a subset of W, and hence
span(S) is the smallest subspace that contains S.
Example 8. Let S = {(1, 1, 2), (1, 0, 1), (2, 1, 3)} be a subset of R3 . Is it true that span(S) = R3 ?
Perhaps the most natural way of making a subspace of a vector space V is to take a subset S ⊆ V and form its span, span(S).
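For a finite set S ⊆ Rn, one concrete way to decide whether span(S) = Rn is to form the matrix whose
columns are the vectors of S and check whether its rank equals n. A sketch for Example 8 (Python
with numpy):

import numpy as np

# Columns are the vectors of S from Example 8.
S = np.column_stack([[1, 1, 2], [1, 0, 1], [2, 1, 3]])

# span(S) = R^3 exactly when the column space is all of R^3,
# i.e. when the rank equals 3.
print(np.linalg.matrix_rank(S) == 3)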
Exercises
1. Let W be a subset of vector space V that is closed under both vector-space addition and scalar
multiplication. Prove that if v ∈ W, and r and s are scalars then both r(sv) and (rs)v are in
W. Conclude that scalar multiplication over W is associative.
2. Let W be a subset of vector space V that is closed under both vector-space addition and scalar
multiplication. Prove that if v ∈ W, and r and s are scalars then both (r + s)v and rv + sv
are in W. Conclude that scalar multiplication over W is distributive over scalars.
3. Let W be the subset of R3 consisting of vectors of the form (a, 0, 0), where a is a real number.
Is W a subspace of R3 ? Prove or disprove. Same question, but now W consists of vectors of
the form (a, 1, 1).
4. Let W be the subset of R3 consisting of vectors of the form (a, b, c), where b = a + c. Is W a
subspace of R3 ? Prove or disprove.
5. Let W be the subset of M2,2 consisting of matrices
[ a  b ]
[ c  d ],
where a, b, c, and d are all integers. Is W a subspace? Prove or disprove.
6. Let W be the subset of M2,2 consisting of matrices
[ a  b ]
[ c  d ],
where a + b + c + d = 0. Is W a subspace? Prove or disprove.
7. Recall that Pm represents the vector space of polynomials having degree at most m. Explain
why Pm is a subspace of Pn for every n > m.
8. Show that the set W of polynomials of the form a1 x + a2 x2 is a subspace of P2 .
9. Let W be the subset of F(R, R) that consists of all functions f (x) for which f (x) ≤ 0, for all
x ∈ R. Is W a subspace of F(R, R)? Prove or disprove.
10. Let W be the subset of F(R, R) that consists of all constant functions (i.e. f (x) = k, for some
constant k). Is W a subspace of F(R, R)? Prove or disprove.
11. Which of the following are linear combinations of u = (1, −1, 3) and v = (2, 4, 0)? a) (3, 3, 3),
b) (4, 2, 6), c) (1, 5, 6), d) (0, 0, 0)
12. Express (2, 2, 4) as a linear combination of u = (2, 1, 4), v = (1, −1, 3), and w = (3, 2, 5).
13. Repeat the previous exercise, but now express (2, 0, 6) as a linear combination of u = (2, 1, 4),
v = (1, −1, 3), and w = (3, 2, 5).
14. Express polynomial 3x2 +2x+2 as a linear combination of p1 (x) = 4x2 +x+2, p2 (x) = 3x2 −x+1
and p3 (x) = 5x2 + 2x + 3.
15. Write matrix
[  6  −1 ]
[ −8  −8 ]
as a linear combination of
A = [  1   2 ]
    [ −1   3 ],
B = [ 0  1 ]
    [ 2  4 ],
and
C = [ 4  −2 ]
    [ 0  −2 ].
16. Do the vectors v1 = (1, 3, 3), v2 = (1, 3, 4), v3 = (1, 4, 3), v4 = (6, 2, 1) span R3 ? Show work.
17. Do the vectors v1 = (3, 1, 4), v2 = (2, −3, 5), v3 = (5, −2, 9), v4 = (1, 4, −1) span R3 ? Show
work.
18. Do the polynomials p1 (x) = −x2 + 2x + 1, p2 (x) = x2 + 3, p3 (x) = −x2 + 4x + 5, p4 (x) =
−2x2 + 2x − 2 span P2? Show work.
19. Which of the following functions are in the span of cos^2 x and sin^2 x? a) cos 2x, b) x2 + 3, c) 1,
d) sin x
Exercise Solutions
1. If v ∈ W and r and s are scalars, then, since W is closed under scalar multiplication, sv, r(sv),
and (rs)v are all vectors of W.
2. If v ∈ W and r and s are scalars, then, since W is closed under scalar multiplication, sv, rv,
and (s + r)v are all vectors of W. And since W is closed under addition, rv + sv is also a vector
of W.
3. Given u, v ∈ W, then u = (a, 0, 0) and v = (b, 0, 0), where a and b are real numbers. Then
u + v = (a + b, 0, 0) is also in W, and so W is closed under addition. Also, given scalar r,
ru = r(a, 0, 0) = (ra, 0, 0) is in W, and so W is closed under scalar multiplication. On the other
hand, if the elements of W were of the form (a, 1, 1), then 2(a, 1, 1) = (2a, 2, 2) is not in W, and
so W would not be a subspace under this definition.
4. Yes. Given u, v ∈ W, then u = (a1 , b1 , c1 ) and v = (a2 , b2 , c2 ), where b1 = a1 + c1 and b2 = a2 + c2 .
Then u + v = (a1 + a2 , b1 + b2 , c1 + c2 ), and, by adding both sides of the two previous equations,
b1 + b2 = (a1 + c1 ) + (a2 + c2 ) = (a1 + a2 ) + (c1 + c2 ).
Hence, u + v ∈ W. Similarly, if v = (a, b, c) and r is a scalar, then rv = (ra, rb, rc), and, since
b = a + c, it follows that
rb = r(a + c) = ra + rc,
and rv ∈ W. Therefore, W is a subspace.
5. No. Consider matrix
A = [ 1  1 ]
    [ 1  1 ].
Then A ∈ W, but
0.5A = [ 0.5  0.5 ]
       [ 0.5  0.5 ]
is not in W. Hence, W is not closed under scalar multiplication, and is therefore not a subspace.
6. Yes. The proof is very similar to that given in the solution to Exercise 4.
7. Since m < n, every polynomial p in Pm has degree at most m, and hence at most n. Thus p
is also in Pn . Moreover, from the Vector-Space lecture, we already know that Pm is a vector
space (under addition and scalar multiplication of polynomials). Therefore, Pm is a subspace
of Pn .
8. Let p = ax + bx2 and q = cx + dx2 . Then p, q ∈ W. Moreover, p + q = (a + c)x + (b + d)x2 is
also in W. Hence, W is closed under addition. Moreover, for scalar r, rp = (ra)x + (rb)x2 is
also in W. Hence, W is closed under scalar multiplication. Therefore W is a subspace of P2 .
9. No. f (x) = −x2 is in W, but (−1 · f )(x) = −1(−x2 ) = x2 is not. So W is not closed under scalar
multiplication.
10. Yes. If f (x) = k1 and g(x) = k2 are constant functions, then so is (f + g)(x) = k1 + k2 . Also,
for scalar r and f (x) = k, (rf )(x) = rk is also a constant function. Therefore W is a subspace.
11. To see if vector (a, b, c) is a linear combination of u = (1, −1, 3) and v = (2, 4, 0), we must find
c1 and c2 so that the following system of equations has a solution.
c1 + 2c2 = a
−c1 + 4c2 = b
3c1 = c
The third equation gives c1 = c/3, while adding the first two equations gives c2 = (a + b)/6.
For (a, b, c) = (3, 3, 3), this implies c1 = c2 = 1, while (a, b, c) = (4, 2, 6) implies c1 = 2
and c2 = 1. Check that these coefficients are correct for both a) and b). On the other
hand, (a, b, c) = (1, 5, 6) also implies c1 = 2 and c2 = 1, but 2u + v ≠ (1, 5, 6). Finally,
0u + 0v = (0, 0, 0).
12. We must find constants c1 , c2 , c3 such that c1 u + c2 v + c3 w = (2, 2, 4). In other words, we must
solve the system of equations
2c1 + c2 + 3c3 = 2
c1 − c2 + 2c3 = 2
4c1 + 3c2 + 5c3 = 4
Solving the system gives c1 = 3, c2 = c3 = −1.
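The same coefficients can be checked mechanically by viewing the system as a matrix equation whose
columns are u, v, and w (a sketch, assuming numpy):

import numpy as np

# Columns are u, v, w; the right-hand side is the target vector (2, 2, 4).
M = np.column_stack([[2, 1, 4], [1, -1, 3], [3, 2, 5]])
print(np.linalg.solve(M, np.array([2, 2, 4])))   # approximately [ 3. -1. -1.]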
13. c1 = 4, c2 = 0, c3 = −2.
14. 3x2 + 2x + 2 = (1/2)p1 − (1/2)p2 + (1/2)p3 .
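These fractions can also be found by identifying each polynomial with its coefficient vector in the
basis x2 , x, 1 and solving the resulting 3 × 3 system, as in the following numpy sketch:

import numpy as np

# Coefficient vectors (x^2, x, constant) of p1, p2, p3 as columns.
P = np.column_stack([[4, 1, 2], [3, -1, 1], [5, 2, 3]])
target = np.array([3, 2, 2])          # 3x^2 + 2x + 2

print(np.linalg.solve(P, target))     # approximately [ 0.5 -0.5  0.5]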
15. Matrix
[  6  −1 ]
[ −8  −8 ] = 2A − 3B + C.
16. Yes.
17. No.
18. No.
19. a and c only, since cos 2x = cos^2 x − sin^2 x and 1 = cos^2 x + sin^2 x.
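The answers to Exercises 16-18 can be verified with the same rank criterion used for Example 8: the
given vectors span R3 (or P2 , after identifying each polynomial with its coefficient vector) exactly
when the matrix having those vectors as columns has rank 3. A numpy sketch:

import numpy as np

ex16 = np.column_stack([[1, 3, 3], [1, 3, 4], [1, 4, 3], [6, 2, 1]])
ex17 = np.column_stack([[3, 1, 4], [2, -3, 5], [5, -2, 9], [1, 4, -1]])
# Exercise 18: coefficient vectors (x^2, x, constant) of p1, ..., p4.
ex18 = np.column_stack([[-1, 2, 1], [1, 0, 3], [-1, 4, 5], [-2, 2, -2]])

for name, M in [("16", ex16), ("17", ex17), ("18", ex18)]:
    print("Exercise", name, "spans" if np.linalg.matrix_rank(M) == 3 else "does not span")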