
MATH 550 HOMEWORK SOLUTIONS 1
p. 19 (8)
Suppose Uα is a subspace of V , for all α ∈ A, and define U = ∩α∈A Uα . For every
α ∈ A, 0 ∈ Uα , since Uα is a subspace, so 0 ∈ U .
Assume that u, v ∈ U , and a ∈ F. Then, for any α ∈ A, we know that u, v ∈
Uα , so u + v ∈ Uα , and au ∈ Uα , since Uα is closed under addition and scalar
multiplication, respectively. Therefore, u + v ∈ U , and au ∈ U , showing that U is
closed under addition and scalar multiplication. By the subspace criterion (p. 13),
U is a subspace of V .
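(Not part of the assigned solution: a small numerical sketch in Python/NumPy of the closure argument above. The matrices A1, A2 and the vectors u, v are hypothetical choices; each subspace of R^3 is represented as the null space of a matrix, and membership in the intersection is checked for 0, u, v, u + v, and au.)

# Numerical sketch (hypothetical example): two subspaces of R^3, each the
# null space of a matrix; sums and scalar multiples of vectors lying in both
# subspaces remain in both, as in the closure argument above.
import numpy as np

A1 = np.array([[1.0, 1.0, 1.0]])    # U1 = {x : x1 + x2 + x3 = 0}
A2 = np.array([[1.0, 0.0, -1.0]])   # U2 = {x : x1 - x3 = 0}

def in_subspace(A, x):
    """x lies in the null space of A (up to floating-point tolerance)."""
    return np.allclose(A @ x, 0.0)

u = np.array([1.0, -2.0, 1.0])      # a vector in U1 ∩ U2
v = np.array([2.0, -4.0, 2.0])      # another vector in U1 ∩ U2
a = 3.5                             # an arbitrary scalar
for x in (np.zeros(3), u, v, u + v, a * u):
    assert in_subspace(A1, x) and in_subspace(A2, x)
print("0, u, v, u + v, and a*u all lie in U1 ∩ U2")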
p. 35 (3)
Suppose (v1 , . . . , vn ) is linearly independent in V , w ∈ V , and (v1 + w, . . . , vn + w)
is linearly dependent. We need to show that w ∈ span(v1 , . . . , vn ).
Because (v1 +w, . . . , vn +w) is linearly dependent, there exist scalars a1 , . . . , an ∈
F, not all zero, such that
a1 (v1 + w) + · · · + an (vn + w) = 0, which implies
(1)          a1 v1 + · · · + an vn = −(a1 + · · · + an )w.
The right-hand side of Equation (1) cannot be zero: if it were, the left-hand side
would give a linear dependence relation among v1 , . . . , vn with the scalars a1 , . . . , an
not all zero, contradicting the linear independence of (v1 , . . . , vn ). Therefore, the
scalar a = −(a1 + · · · + an ) is not zero, and
a⁻¹ a1 v1 + · · · + a⁻¹ an vn = w,
showing that w ∈ span(v1 , . . . , vn ).
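(Not part of the assigned solution: a small Python/NumPy check of the computation above. The vectors v1, v2, w and the scalars a1, a2 are hypothetical choices; the dependence of (v1 + w, v2 + w) is confirmed by a rank computation, and w is recovered from the formula following Equation (1).)

# Numerical sketch (hypothetical example): v1, v2 are independent in R^2 and
# w is chosen so that (v1 + w, v2 + w) is linearly dependent.
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
w  = np.array([-0.5, -0.5])

# (v1 + w, v2 + w) is dependent: the matrix with these columns has rank 1.
assert np.linalg.matrix_rank(np.column_stack([v1 + w, v2 + w])) == 1

# One dependence relation uses a1 = a2 = 1, as in Equation (1).
a1, a2 = 1.0, 1.0
assert np.allclose(a1 * (v1 + w) + a2 * (v2 + w), 0.0)

# a = -(a1 + a2) is nonzero, and the formula from the proof recovers w.
a = -(a1 + a2)
assert np.allclose((a1 / a) * v1 + (a2 / a) * v2, w)
print("w =", w, "lies in span(v1, v2)")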
p. 35 (7)
(⇒) To prove the rightward implication, we assume that V is infinite dimensional
and must show the existence of vectors v1 , v2 , . . . in V , such that (v1 , . . . , vn ) is
linearly independent for all n ∈ N. We proceed by induction.
Because V is infinite dimensional, V ≠ {0}. Therefore, let v1 ∈ V \ {0}, and
note that (v1 ) is linearly independent. Now, suppose that (v1 , . . . , vn ) is linearly
independent in V . Since V is infinite dimensional, (v1 , . . . , vn ) does not span V ,
and we let vn+1 ∈ V \ span(v1 , . . . , vn ). The list (v1 , . . . , vn+1 ) is then linearly
independent by the Linear Dependence Lemma (p. 25). The existence of the
desired sequence of vectors follows by induction.
(⇐) Suppose that v1 , v2 , . . . are vectors satisfying the conditions given in the problem. We need to show that V is infinite dimensional.
Assume, for the purpose of a proof by contradiction, that V is finite dimensional.
Then there exists a list of vectors (w1 , . . . , wn ) spanning V . However, (v1 , . . . , vn+1 )
is a linearly independent list of vectors with greater length, which is impossible by
Theorem 2.6.
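(Not part of the assigned solution: a small Python/NumPy sketch of the inductive step in the rightward implication, carried out in a finite-dimensional stand-in, since infinite dimensionality cannot be modeled numerically. The vectors v1, v2, v3 are hypothetical choices; independence and span membership are tested by rank computations.)

# Numerical sketch (hypothetical example): extending a linearly independent
# list by a vector outside its span keeps it linearly independent, as in the
# Linear Dependence Lemma step of the induction above.
import numpy as np

def is_independent(vectors):
    """A list is independent iff the matrix with those columns has full column rank."""
    return np.linalg.matrix_rank(np.column_stack(vectors)) == len(vectors)

def in_span(vectors, x):
    """x ∈ span(vectors) iff appending x as a column does not raise the rank."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(np.column_stack([M, x])) == np.linalg.matrix_rank(M)

v1 = np.array([1.0, 0.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 0.0])
assert is_independent([v1, v2])

v3 = np.array([0.0, 1.0, 1.0, 0.0])   # mirrors "let v_{n+1} ∈ V \ span(v1, ..., vn)"
assert not in_span([v1, v2], v3)
assert is_independent([v1, v2, v3])
print("the extended list (v1, v2, v3) is still linearly independent")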
p. 59 (1)
Suppose V is a one-dimensional vector space over F, and let T ∈ L(V ). We need
to show there exists some a ∈ F, such that T v = av, for all v ∈ V .
Let (v1 ) be a basis of V (the basis has length one since dim V = 1). Since
T v1 ∈ V and (v1 ) is a basis of V , there exists some a ∈ F, such that T v1 = av1 .
Now, suppose v ∈ V , and note that v = bv1 , for some b ∈ F. Then T v = T (bv1 ) =
bT v1 = b(av1 ) = a(bv1 ) = av, showing that T v = av, for all v ∈ V .
(Notice that there exists one a ∈ F corresponding to the mapping T that works
for all v ∈ V . Therefore, we start by defining what a is: it’s the unique scalar with
the property T v1 = av1 . After we define what a is, we show that it works for all
v ∈ V .)
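(Not part of the assigned solution: a small Python/NumPy illustration. The vector v1 and the matrix T are hypothetical choices; V is taken to be the 1-dimensional subspace of R^3 spanned by v1, T is a linear map on R^3 carrying V into itself, and the scalar a is recovered from T v1 = av1 exactly as in the proof.)

# Numerical sketch (hypothetical example): on a 1-dimensional space, a linear
# map acts as multiplication by a single scalar a.
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])       # basis of the 1-dimensional space V = span(v1)
T = np.diag([3.0, 3.0, 5.0])         # T v1 = 3 v1, so T maps V into V

Tv1 = T @ v1
a = Tv1[0] / v1[0]                   # the unique scalar with T v1 = a v1
assert np.allclose(Tv1, a * v1)

for b in (-2.0, 0.5, 7.0):           # every v in V has the form v = b v1
    v = b * v1
    assert np.allclose(T @ v, a * v)
print("T v = a v for every v in V, with a =", a)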
p. 59 (5)
Suppose that T ∈ L(V, W ) is injective, and (v1 , . . . , vn ) is linearly independent in
V . We need to show that (T v1 , . . . , T vn ) is linearly independent in W , that is,
for all a1 , . . . , an ∈ F, if a1 T v1 + · · · + an T vn = 0, then a1 = · · · = an = 0.
Assume that a1 , . . . , an ∈ F are scalars such that a1 T v1 + · · · + an T vn = 0. By linearity of T ,
T (a1 v1 + · · · + an vn ) = 0, which implies
a1 v1 + · · · + an vn = 0,
since T is injective and consequently null T = {0}. Because (v1 , . . . , vn ) is linearly
independent, the equation above implies that a1 = · · · = an = 0, as desired.
(In this problem, we are trying to prove that
a1 T v1 + · · · + an T vn = 0 implies a1 = · · · = an = 0,
so we start by assuming the left-hand side and deduce the right-hand side. We do
not start off by assuming that a1 v1 + · · · + an vn = 0, because we are not trying to
prove that a1 v1 + · · · + an vn = 0 implies something. You might say that we know
a1 v1 + · · · + an vn = 0, because (v1 , . . . , vn ) is linearly independent, but that is not
correct. We only know that if a1 v1 + · · · + an vn = 0, then a1 = · · · = an = 0.
Linear independence of a list of vectors does not assert the existence of scalars
satisfying certain conditions. It says that if scalars satisfy one condition (the linear
combination is zero), then they satisfy another (they are all zero).
We can’t even write a1 v1 + · · · + an vn = 0 without first defining what the aj ’s
are. But the only way to declare what the aj ’s are that is consistent with what
we are proving is to let the aj ’s be scalars satisfying a1 T v1 + · · · + an T vn = 0.)
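(Not part of the assigned solution: a small Python/NumPy check. The matrix T and the vectors v1, v2 are hypothetical choices; T is injective because its columns are independent, so null T = {0}, and rank computations confirm that the independent list (v1, v2) is carried to the independent list (T v1, T v2).)

# Numerical sketch (hypothetical example): an injective linear map
# T : R^2 -> R^3 carries an independent list to an independent list.
import numpy as np

T = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])           # rank 2 = number of columns, so T is injective

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, -1.0])
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2          # (v1, v2) independent
assert np.linalg.matrix_rank(np.column_stack([T @ v1, T @ v2])) == 2  # images independent
print("(T v1, T v2) is linearly independent")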