Lecture 3. Vectors in M-Dimensional Spaces
Recall that three-dimensional space, often denoted by $\mathbb{R}^3$, is the set of all vectors $[x, y, z]$, i.e.,
$$\mathbb{R}^3 = \{ [x, y, z] : x, y, \text{ and } z \text{ are any real numbers} \}.$$
Furthermore, the sum, difference, and dot product of two vectors, as well as scalar multiplication, are defined, respectively, as
$$[x_1, y_1, z_1] + [x_2, y_2, z_2] = [x_1 + x_2, \; y_1 + y_2, \; z_1 + z_2],$$
$$[x_1, y_1, z_1] - [x_2, y_2, z_2] = [x_1 - x_2, \; y_1 - y_2, \; z_1 - z_2],$$
$$[x_1, y_1, z_1] \cdot [x_2, y_2, z_2] = x_1 x_2 + y_1 y_2 + z_1 z_2,$$
$$\lambda [x, y, z] = [\lambda x, \lambda y, \lambda z]$$
($\lambda$ is a real number). All of these can be extended to define $m$-dimensional space.
• M-Dimensional Space

The straightforward definition of an $m$-dimensional vector is an ordered group of $m$ real numbers $u_1, u_2, \ldots, u_m$, which may be denoted by $[u_1, u_2, \ldots, u_m]$. For convenience, here we use the column form. Thus, we define an $m$-dimensional vector as an $m \times 1$ matrix, or a column,
$$\vec{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix}_{m \times 1}.$$
The $m$-dimensional space, denoted by $\mathbb{R}^m$, is defined as the set of all $m$-dimensional vectors:
$$\mathbb{R}^m = \left\{ \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix} : u_1, u_2, \ldots, u_m \text{ are any real numbers} \right\}.$$
In $\mathbb{R}^m$, addition, subtraction, scalar multiplication, and dot products are defined in a manner similar to 3-D vectors, as follows. Let
$$\vec{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix}, \quad \vec{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{bmatrix},$$
and let $\lambda$ be a real number.
Then
$$\vec{u} + \vec{v} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix} + \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{bmatrix} = \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \\ \vdots \\ u_m + v_m \end{bmatrix},$$
$$\vec{u} - \vec{v} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix} - \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{bmatrix} = \begin{bmatrix} u_1 - v_1 \\ u_2 - v_2 \\ \vdots \\ u_m - v_m \end{bmatrix},$$
$$\lambda \vec{u} = \begin{bmatrix} \lambda u_1 \\ \lambda u_2 \\ \vdots \\ \lambda u_m \end{bmatrix},$$
$$\vec{u} \cdot \vec{v} = u_1 v_1 + u_2 v_2 + \ldots + u_m v_m.$$
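These componentwise operations translate directly into NumPy array arithmetic. The following sketch is illustrative only (the sample vectors and dimension are not from the notes):

```python
import numpy as np

# Two illustrative vectors in R^4 and a scalar (values are arbitrary)
u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([4.0, 3.0, 2.0, 1.0])
lam = 2.0

s = u + v        # componentwise sum:        [u1+v1, ..., um+vm]
d = u - v        # componentwise difference: [u1-v1, ..., um-vm]
m = lam * u      # scalar multiple:          [lam*u1, ..., lam*um]
dot = u @ v      # dot product: u1*v1 + ... + um*vm

print(s)    # [5. 5. 5. 5.]
print(dot)  # 20.0
```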
The norm, or length, of a vector $\vec{u}$ is defined as
$$|\vec{u}| = \sqrt{(u_1)^2 + (u_2)^2 + \ldots + (u_m)^2}.$$
A vector $\vec{u}$ is called a unit vector if its length is one, i.e., $|\vec{u}| = 1$. For any nonzero vector $\vec{u}$, the unit vector
$$\frac{\vec{u}}{|\vec{u}|} = \left( \frac{1}{|\vec{u}|} \right) \vec{u}$$
is called the unit vector in the same direction as $\vec{u}$, or simply the direction of $\vec{u}$. All properties that hold for 3-D vectors extend to $m$-D vectors without modification.
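The norm formula and the normalization step can be checked numerically; a minimal NumPy sketch (the sample vector is illustrative):

```python
import numpy as np

u = np.array([3.0, 4.0, 0.0])        # illustrative vector in R^3

norm = np.sqrt(np.sum(u**2))         # |u| = sqrt(u1^2 + ... + um^2)
direction = u / norm                 # unit vector in the direction of u

print(norm)                          # 5.0
print(np.linalg.norm(direction))     # 1.0
```

Note that `np.linalg.norm` computes the same Euclidean length as the explicit square-root-of-squares formula.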
• Properties of Linear Operations

Let $\vec{u}$, $\vec{v}$, and $\vec{w}$ be three vectors, and let $\lambda$ and $\delta$ be two real numbers. Then

(1) $\vec{u} + \vec{v} = \vec{v} + \vec{u}$
(2) $\vec{u} + (\vec{v} + \vec{w}) = (\vec{u} + \vec{v}) + \vec{w}$
(3) $\vec{u} + \vec{0} = \vec{u}$
(4) $\vec{u} + (-\vec{u}) = \vec{0}$
(5) $\lambda (\vec{u} + \vec{v}) = \lambda \vec{u} + \lambda \vec{v}$
(6) $(\lambda + \delta) \vec{u} = \lambda \vec{u} + \delta \vec{u}$
(7) $(\lambda \delta) \vec{u} = \lambda (\delta \vec{u})$
(8) $1 \cdot \vec{u} = \vec{u}$
• Some Properties of the Dot Product

(1) $\vec{u} \cdot \vec{u} = |\vec{u}|^2$
(2) $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$
(3) $\vec{u} \cdot (\vec{v} + \vec{w}) = \vec{u} \cdot \vec{v} + \vec{u} \cdot \vec{w}$
(4) $(\lambda \vec{u}) \cdot \vec{v} = \lambda (\vec{u} \cdot \vec{v}) = \vec{u} \cdot (\lambda \vec{v})$
(5) $\vec{0} \cdot \vec{u} = \vec{u} \cdot \vec{0} = 0$
• Linear Combinations

Given $p$ vectors $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ in $\mathbb{R}^m$ and given $p$ constants $\lambda_1, \lambda_2, \ldots, \lambda_p$, the vector $\vec{y}$ defined by
$$\vec{y} = \sum_{i=1}^{p} \lambda_i \vec{u}_i = \lambda_1 \vec{u}_1 + \lambda_2 \vec{u}_2 + \ldots + \lambda_p \vec{u}_p$$
is called a linear combination of $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ with weights $\lambda_1, \lambda_2, \ldots, \lambda_p$.
Example 3.1 Determine whether $\vec{y}$ is a linear combination of $\vec{u}_1$ and $\vec{u}_2$, where
$$\vec{u}_1 = \begin{bmatrix} 1 \\ -2 \\ -5 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 2 \\ 5 \\ 6 \end{bmatrix}, \quad \vec{y} = \begin{bmatrix} 7 \\ 4 \\ -3 \end{bmatrix}.$$
Solution: By definition, we need to see whether $\vec{y}$ has the form $\vec{y} = \lambda_1 \vec{u}_1 + \lambda_2 \vec{u}_2$ for some constants $\lambda_1, \lambda_2$. This leads to the equation for $\lambda_1, \lambda_2$:
$$\lambda_1 \begin{bmatrix} 1 \\ -2 \\ -5 \end{bmatrix} + \lambda_2 \begin{bmatrix} 2 \\ 5 \\ 6 \end{bmatrix} = \begin{bmatrix} 7 \\ 4 \\ -3 \end{bmatrix},$$
or equivalently
$$\begin{bmatrix} \lambda_1 + 2\lambda_2 \\ -2\lambda_1 + 5\lambda_2 \\ -5\lambda_1 + 6\lambda_2 \end{bmatrix} = \begin{bmatrix} 7 \\ 4 \\ -3 \end{bmatrix}.$$
This is the same as solving the following system of linear equations:
$$\begin{aligned} \lambda_1 + 2\lambda_2 &= 7 \\ -2\lambda_1 + 5\lambda_2 &= 4 \\ -5\lambda_1 + 6\lambda_2 &= -3 \end{aligned}$$
whose augmented matrix is
$$\left[\begin{array}{cc|c} 1 & 2 & 7 \\ -2 & 5 & 4 \\ -5 & 6 & -3 \end{array}\right].$$
Its reduced echelon form is
$$\left[\begin{array}{cc|c} 1 & 0 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{array}\right].$$
Therefore, the system is consistent, and consequently $\vec{y}$ is a linear combination of $\vec{u}_1$ and $\vec{u}_2$. The weights are the solution $(\lambda_1, \lambda_2) = (3, 2)$, i.e.,
$$\vec{y} = 3\vec{u}_1 + 2\vec{u}_2.$$
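The conclusion of Example 3.1 can be double-checked numerically, for instance with NumPy's least-squares solver (a sketch; if the residual is zero, $\vec{y}$ is an exact linear combination):

```python
import numpy as np

u1 = np.array([1.0, -2.0, -5.0])
u2 = np.array([2.0, 5.0, 6.0])
y  = np.array([7.0, 4.0, -3.0])

# Solve [u1 u2] @ [l1, l2] = y; lstsq returns the best-fit weights
A = np.column_stack([u1, u2])
weights, *_ = np.linalg.lstsq(A, y, rcond=None)

print(weights)                      # approximately [3. 2.]
print(np.allclose(A @ weights, y))  # True: y = 3*u1 + 2*u2 exactly
```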
• The Set of All Linear Combinations

Given $p$ vectors $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ in $\mathbb{R}^m$, the set of all possible linear combinations of $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ is called the subspace spanned by $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$, and is denoted by
$$\mathrm{Span}\{\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p\}.$$
The above example (of determining whether the vector $\vec{y}$ is a linear combination of $\vec{u}_1$ and $\vec{u}_2$) may be rephrased as follows: determine whether
$$\vec{y} \in \mathrm{Span}\{\vec{u}_1, \vec{u}_2\}.$$
• Vector Equations: Vector Forms of Systems of Linear Equations

Let us go back to linear systems and their augmented matrices:
$$\begin{cases} a_{11} x_1 + a_{12} x_2 + \ldots + a_{1n} x_n = b_1 \\ a_{21} x_1 + a_{22} x_2 + \ldots + a_{2n} x_n = b_2 \\ \qquad \ldots \\ a_{m1} x_1 + a_{m2} x_2 + \ldots + a_{mn} x_n = b_m \end{cases}, \qquad M = \left[\begin{array}{cccc|c} a_{11} & a_{12} & \ldots & a_{1n} & b_1 \\ a_{21} & a_{22} & \ldots & a_{2n} & b_2 \\ \ldots & \ldots & \ldots & \ldots & \ldots \\ a_{m1} & a_{m2} & \ldots & a_{mn} & b_m \end{array}\right].$$
Define $n + 1$ column vectors in $\mathbb{R}^m$:
$$\vec{a}_1 = \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}, \; \ldots, \; \vec{a}_i = \begin{bmatrix} a_{1i} \\ a_{2i} \\ \vdots \\ a_{mi} \end{bmatrix}, \; \ldots, \; \vec{a}_n = \begin{bmatrix} a_{1n} \\ a_{2n} \\ \vdots \\ a_{mn} \end{bmatrix}, \qquad \vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}.$$
We call them the columns (or column vectors) of $M$. Then the coefficient matrix can be written as
$$A = [\vec{a}_1 \; \vec{a}_2 \; \ldots \; \vec{a}_n],$$
and the augmented matrix is
$$M = \left[ \vec{a}_1 \; \vec{a}_2 \; \ldots \; \vec{a}_n \; \vec{b} \right] = \left[ A \; \vec{b} \right].$$
Accordingly, since
$$x_1 \vec{a}_1 = x_1 \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix} = \begin{bmatrix} a_{11} x_1 \\ a_{21} x_1 \\ \vdots \\ a_{m1} x_1 \end{bmatrix}, \; \ldots, \; x_i \vec{a}_i = x_i \begin{bmatrix} a_{1i} \\ a_{2i} \\ \vdots \\ a_{mi} \end{bmatrix} = \begin{bmatrix} a_{1i} x_i \\ a_{2i} x_i \\ \vdots \\ a_{mi} x_i \end{bmatrix},$$
the linear system may be expressed as the following vector equation:
$$\sum_{i=1}^{n} x_i \vec{a}_i = x_1 \vec{a}_1 + x_2 \vec{a}_2 + \ldots + x_n \vec{a}_n = \vec{b}.$$
Now the system of linear equations, the augmented matrix, and the vector equation are all equivalent and all mean the same thing. More precisely, the following statements are equivalent:

1. The system of linear equations
$$\begin{cases} a_{11} x_1 + a_{12} x_2 + \ldots + a_{1n} x_n = b_1 \\ a_{21} x_1 + a_{22} x_2 + \ldots + a_{2n} x_n = b_2 \\ \qquad \ldots \\ a_{m1} x_1 + a_{m2} x_2 + \ldots + a_{mn} x_n = b_m \end{cases}$$
or, in its vector form,
$$\sum_{i=1}^{n} x_i \vec{a}_i = x_1 \vec{a}_1 + x_2 \vec{a}_2 + \ldots + x_n \vec{a}_n = \vec{b},$$
admits at least one solution.

2. The linear system
$$\sum_{i=1}^{n} x_i \vec{a}_i = x_1 \vec{a}_1 + x_2 \vec{a}_2 + \ldots + x_n \vec{a}_n = \vec{b}$$
is consistent.

3. Let $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n, \vec{b}$ be the columns of the augmented matrix. Then $\vec{b}$ is a linear combination of $\{\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n\}$.

4. $\vec{b}$ belongs to the subspace spanned by $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n$, i.e.,
$$\vec{b} \in \mathrm{Span}\{\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n\}.$$
• Linear Dependence and Linear Independence

A set of $p$ vectors $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ in $\mathbb{R}^m$ is called linearly independent if the vector equation
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 + \ldots + x_p \vec{u}_p = \vec{0}$$
has only the trivial solution $x_i = 0$, $i = 1, 2, \ldots, p$. We also say that $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly independent. Otherwise, if the above system admits at least one nontrivial solution (a not-all-zero solution), the set is called linearly dependent, and we say that $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly dependent. In the latter case, any nontrivial solution $(x_1, \ldots, x_p)$ is called a linear relation.
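In practice, linear independence of a finite set can be tested by stacking the vectors as the columns of a matrix and comparing its rank with the number of columns (this anticipates the rank criterion of Theorem 3.2 below). A NumPy sketch with illustrative vectors:

```python
import numpy as np

def is_independent(*vectors):
    """True iff the homogeneous system A x = 0 has only the trivial solution."""
    A = np.column_stack(vectors)
    # Independent exactly when the rank equals the number of columns
    return np.linalg.matrix_rank(A) == A.shape[1]

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

print(is_independent(e1, e2))      # True
print(is_independent(e1, 2 * e1))  # False: 2*e1 is a multiple of e1
```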
Example 3.2 Show that any single nonzero vector $\vec{u}$ is always linearly independent, and that two vectors $\vec{u}_1, \vec{u}_2$ are dependent iff one is a scalar multiple of the other, say $\vec{u}_1 = \lambda \vec{u}_2$.

Proof: By definition,
$$x\vec{u} = \vec{0} \iff \begin{bmatrix} x u_1 \\ x u_2 \\ \vdots \\ x u_m \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix} \iff x = 0 \; \text{(unless all } u_i = 0\text{)}.$$
This shows that unless $\vec{u} = \vec{0}$, the equation $x\vec{u} = \vec{0}$ has only the trivial solution $x = 0$; i.e., $\vec{u}$ is linearly independent. To verify the second assertion, we first assume that $\vec{u}_1, \vec{u}_2$ are dependent. Then there is a nontrivial solution $(x_1, x_2) \neq (0, 0)$ of
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 = \vec{0}.$$
Since $(x_1, x_2) \neq (0, 0)$, at least one of them is nonzero, say $x_1 \neq 0$. Thus we find
$$\vec{u}_1 + \frac{x_2}{x_1} \vec{u}_2 = \vec{0}, \quad \text{or} \quad \vec{u}_1 = \lambda \vec{u}_2, \; \text{where } \lambda = -\frac{x_2}{x_1}.$$
On the other hand, if $\vec{u}_1 = \lambda \vec{u}_2$, then obviously
$$\vec{u}_1 - \lambda \vec{u}_2 = \vec{0},$$
and thus $(x_1, x_2) = (1, -\lambda)$ is a nontrivial solution of
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 = \vec{0}.$$
This example shows that for two vectors, linear dependence is equivalent to saying that one vector is a scalar multiple of the other. This is no longer the case when there are more than two vectors. We will discuss this issue later.
The notion of linear dependence (or independence) is closely related to the following linear system:
$$\sum_{i=1}^{p} x_i \vec{a}_i = x_1 \vec{a}_1 + x_2 \vec{a}_2 + \ldots + x_p \vec{a}_p = \vec{0}.$$
We call such a system a homogeneous system. In fact, the above homogeneous system has at least one nontrivial solution iff its column vectors $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_p$ are linearly dependent.
Example 3.3 (1) Determine whether the set of the following three vectors is linearly dependent:
$$\vec{u}_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix}.$$
(2) Find a linear relation.
Solution. (1) We need to determine whether
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 + x_3 \vec{u}_3 = \vec{0}$$
has a nontrivial solution. To this end, we need to perform row operations on the augmented matrix $[A \; \vec{0}]$, where
$$A = \begin{bmatrix} 1 & 2 & 4 \\ 2 & 1 & 5 \\ 3 & 0 & 6 \end{bmatrix}.$$
Since the very last column of $[A \; \vec{0}]$ is the zero vector, for simplicity we work only on $A$, since the last column remains zero under any row operation. Thus
$$A = \begin{bmatrix} 1 & 2 & 4 \\ 2 & 1 & 5 \\ 3 & 0 & 6 \end{bmatrix} \xrightarrow[R_3 - 3R_1 \to R_3]{R_2 - 2R_1 \to R_2} \begin{bmatrix} 1 & 2 & 4 \\ 0 & -3 & -3 \\ 0 & -6 & -6 \end{bmatrix} \xrightarrow{R_3 - 2R_2 \to R_3} \begin{bmatrix} 1 & 2 & 4 \\ 0 & -3 & -3 \\ 0 & 0 & 0 \end{bmatrix}. \tag{1}$$
Apparently, $x_3$ is a free variable, since the third column is the only non-pivot column. Setting $x_3 = 1$, we can solve for $x_1$ and $x_2$ accordingly. Thus the linear system has a nontrivial solution (in fact, there are infinitely many nontrivial solutions, obtained by choosing different $x_3$), and the set $\{\vec{u}_1, \vec{u}_2, \vec{u}_3\}$ is linearly dependent.
(2) Finding a linear relation for $\vec{u}_1, \vec{u}_2, \vec{u}_3$, i.e.,
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 + x_3 \vec{u}_3 = \vec{0},$$
means finding a nontrivial solution. From (1), we find that the above system reduces to
$$\begin{aligned} x_1 + 2x_2 + 4x_3 &= 0 \\ -3x_2 - 3x_3 &= 0. \end{aligned}$$
Since we only need one nonzero solution, we take $x_3 = 1$. Then $x_2 = -1$, $x_1 = -(2x_2 + 4x_3) = -2$, and a linear relation is $(-2, -1, 1)$, so that
$$-2\vec{u}_1 - \vec{u}_2 + \vec{u}_3 = \vec{0}.$$
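The relation just found can be verified numerically; a quick NumPy check:

```python
import numpy as np

u1 = np.array([1.0, 2.0, 3.0])
u2 = np.array([2.0, 1.0, 0.0])
u3 = np.array([4.0, 5.0, 6.0])

# The linear relation (-2, -1, 1): -2*u1 - u2 + u3 should be the zero vector
relation = -2 * u1 - u2 + u3
print(relation)  # [0. 0. 0.]
```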
From the above relation we see, for instance, that
$$\vec{u}_2 = -2\vec{u}_1 + \vec{u}_3;$$
i.e., when $\{\vec{u}_1, \vec{u}_2, \vec{u}_3\}$ is linearly dependent, $\vec{u}_2$ is a linear combination of $\{\vec{u}_1, \vec{u}_3\}$. In general, we have
Theorem 3.1 (Characterization of linear dependence) Vectors $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly dependent iff at least one of them is a linear combination of the rest.

Proof. Suppose that one is a linear combination of the rest. Without loss of generality, say $\vec{u}_1$ is a linear combination of $\vec{u}_2, \ldots, \vec{u}_p$; i.e., there exist $\lambda_2, \ldots, \lambda_p$ such that
$$\vec{u}_1 = \lambda_2 \vec{u}_2 + \ldots + \lambda_p \vec{u}_p.$$
It follows that
$$-\vec{u}_1 + \lambda_2 \vec{u}_2 + \ldots + \lambda_p \vec{u}_p = \vec{0}.$$
This implies that $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly dependent. On the other hand, suppose that $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly dependent. Then we have
$$\lambda_1 \vec{u}_1 + \lambda_2 \vec{u}_2 + \ldots + \lambda_p \vec{u}_p = \vec{0},$$
where at least one of the $p$ constants $\{\lambda_1, \lambda_2, \ldots, \lambda_p\}$ is not zero. Without loss of generality, say $\lambda_1 \neq 0$. Then we can solve for $\vec{u}_1$ as
$$\vec{u}_1 = \left(-\frac{\lambda_2}{\lambda_1}\right) \vec{u}_2 + \ldots + \left(-\frac{\lambda_p}{\lambda_1}\right) \vec{u}_p.$$
Hence $\vec{u}_1$ is a linear combination of the rest, $\vec{u}_2, \ldots, \vec{u}_p$.
Note that when $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_p$ are linearly dependent, it does NOT mean that ANY one member is a linear combination of the rest. For instance,
$$\begin{bmatrix} 1 \\ 2 \end{bmatrix}, \quad \begin{bmatrix} -1 \\ -2 \end{bmatrix}, \quad \begin{bmatrix} 1 \\ 0 \end{bmatrix} \quad \text{are linearly dependent,}$$
but
$$\begin{bmatrix} 1 \\ 0 \end{bmatrix} \quad \text{is not a linear combination of} \quad \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \; \begin{bmatrix} -1 \\ -2 \end{bmatrix}.$$
Consider $n$ vectors
$$\vec{a}_1 = \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}, \; \ldots, \; \vec{a}_i = \begin{bmatrix} a_{1i} \\ a_{2i} \\ \vdots \\ a_{mi} \end{bmatrix}, \; \ldots, \; \vec{a}_n = \begin{bmatrix} a_{1n} \\ a_{2n} \\ \vdots \\ a_{mn} \end{bmatrix}$$
and the $m \times n$ matrix
$$A = [\vec{a}_1 \; \vec{a}_2 \; \ldots \; \vec{a}_n]_{m \times n}.$$
By a series of row operations, $A$ is row equivalent to an echelon form.

Definition 3.1 We call the number of pivots of an $m \times n$ matrix $A$ the RANK of $A$, and denote it by $r(A)$.
For instance, for the matrix in Example 3.3,
$$A = \begin{bmatrix} 1 & 2 & 4 \\ 2 & 1 & 5 \\ 3 & 0 & 6 \end{bmatrix} \to \begin{bmatrix} 1 & 2 & 4 \\ 0 & -3 & -3 \\ 0 & 0 & 0 \end{bmatrix}.$$
Since there are two pivots, $r(A) = 2$.
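For numerical matrices, NumPy's `matrix_rank` (which counts nonzero singular values) agrees with the pivot count for well-conditioned matrices like this one; a sketch checking $r(A) = 2$:

```python
import numpy as np

A = np.array([[1.0, 2.0, 4.0],
              [2.0, 1.0, 5.0],
              [3.0, 0.0, 6.0]])

# Rank via singular values; matches the pivot count of the echelon form
print(np.linalg.matrix_rank(A))  # 2
```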
Obviously, the rank cannot exceed the number of rows or columns, i.e., $r(A) \leq m$ and $r(A) \leq n$. In particular, if $m < n$, then $r(A) \leq m < n$. This observation leads to the following theorem.
Theorem 3.2 A set of vectors $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n$ in $\mathbb{R}^m$ is linearly dependent iff $n > r(A)$, where $A$ is the $m \times n$ column matrix $[\vec{a}_1 \; \vec{a}_2 \; \ldots \; \vec{a}_n]$. In particular, $n$ vectors in $\mathbb{R}^m$ are always linearly dependent if $n > m$.

Proof: By definition, whenever $n > r(A)$, there is at least one non-pivot column. This implies that the homogeneous system
$$x_1 \vec{a}_1 + x_2 \vec{a}_2 + \ldots + x_n \vec{a}_n = \vec{0}$$
has at least one free variable, and consequently there are infinitely many solutions $(x_1, x_2, \ldots, x_n)$. When $n > m$, we see that
$$r(A) \leq m < n.$$
Therefore, the set of $n$ vectors is linearly dependent.
Example 3.4 Determine whether the following set is linearly dependent. If it is linearly dependent, find a set of linearly independent vectors $\vec{v}_1, \vec{v}_2, \ldots$ such that $\mathrm{Span}\{\vec{v}_1, \vec{v}_2, \ldots\} = \mathrm{Span}\{\vec{u}_1, \vec{u}_2, \ldots\}$:
$$\vec{u}_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix}, \quad \vec{u}_4 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.$$
Solution. The answer is yes: the set is linearly dependent, because the number of vectors is greater than the dimension.

We next describe all vectors $\vec{b} \in \mathrm{Span}\{\vec{u}_1, \vec{u}_2, \ldots\}$ using a parametric vector representation. As we did in Example 3.3, we need to describe the $\vec{b}$ for which
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 + x_3 \vec{u}_3 + x_4 \vec{u}_4 = \vec{b}$$
has a solution. To this end, we perform row operations on $[\vec{u}_1 \; \vec{u}_2 \; \vec{u}_3 \; \vec{u}_4 \; \vec{b}]$ to arrive at an echelon form:
$$\left[\begin{array}{cccc|c} 1 & 4 & 2 & 1 & b_1 \\ 2 & 5 & 1 & 1 & b_2 \\ 3 & 6 & 0 & 1 & b_3 \end{array}\right] \xrightarrow[R_3 - 3R_1 \to R_3]{R_2 - 2R_1 \to R_2} \left[\begin{array}{cccc|c} 1 & 4 & 2 & 1 & b_1 \\ 0 & -3 & -3 & -1 & b_2 - 2b_1 \\ 0 & -6 & -6 & -2 & b_3 - 3b_1 \end{array}\right]$$
$$\xrightarrow{R_3 - 2R_2 \to R_3} \left[\begin{array}{cccc|c} 1 & 4 & 2 & 1 & b_1 \\ 0 & -3 & -3 & -1 & b_2 - 2b_1 \\ 0 & 0 & 0 & 0 & b_1 - 2b_2 + b_3 \end{array}\right].$$
We know that the system is consistent iff the last entry in the bottom row of the augmented matrix is zero, i.e.,
$$b_1 - 2b_2 + b_3 = 0, \quad \text{or} \quad b_1 = 2b_2 - b_3.$$
Therefore, the system is consistent iff
$$\vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = \begin{bmatrix} 2b_2 - b_3 \\ b_2 \\ b_3 \end{bmatrix} = b_2 \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix} + b_3 \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix},$$
where $b_2$ and $b_3$ are free variables. Obviously the set
$$\vec{v}_1 = \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix}, \quad \vec{v}_2 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}$$
is linearly independent, because the only solution of $b_2 \vec{v}_1 + b_3 \vec{v}_2 = \vec{0}$ is $b_2 = b_3 = 0$. We also see that any linear combination $\vec{b}$ of $\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\}$ is a linear combination of $\{\vec{v}_1, \vec{v}_2\}$. Thus
$$\mathrm{Span}\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\} \subseteq \mathrm{Span}\{\vec{v}_1, \vec{v}_2\}.$$
On the other hand, for any $\vec{b} \in \mathrm{Span}\{\vec{v}_1, \vec{v}_2\}$, the system
$$x_1 \vec{u}_1 + x_2 \vec{u}_2 + x_3 \vec{u}_3 + x_4 \vec{u}_4 = \vec{b}$$
is consistent, i.e., $\vec{b} \in \mathrm{Span}\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\}$. Thus
$$\mathrm{Span}\{\vec{v}_1, \vec{v}_2\} \subseteq \mathrm{Span}\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\},$$
and we conclude
$$\mathrm{Span}\{\vec{v}_1, \vec{v}_2\} = \mathrm{Span}\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\}.$$
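The span equality in Example 3.4 can be sanity-checked with ranks: both column sets have rank 2, and adjoining $\vec{v}_1, \vec{v}_2$ to $\vec{u}_1, \ldots, \vec{u}_4$ does not increase the rank, so neither set spans anything outside the other. A NumPy sketch:

```python
import numpy as np

U = np.array([[1.0, 4.0, 2.0, 1.0],
              [2.0, 5.0, 1.0, 1.0],
              [3.0, 6.0, 0.0, 1.0]])   # columns are u1, u2, u3, u4
V = np.array([[2.0, -1.0],
              [1.0,  0.0],
              [0.0,  1.0]])            # columns are v1, v2

# Equal spans: each rank is 2, and stacking all six columns stays rank 2
print(np.linalg.matrix_rank(U))                        # 2
print(np.linalg.matrix_rank(V))                        # 2
print(np.linalg.matrix_rank(np.column_stack([U, V])))  # 2
```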
• Homework 3

1. Determine whether $\vec{y}$ is a linear combination of $\vec{u}_1, \vec{u}_2, \vec{u}_3$. If yes, find the weights.

(a)
$$\vec{u}_1 = \begin{bmatrix} 1 \\ -2 \\ 0 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 5 \\ -1 \\ 6 \end{bmatrix}, \quad \vec{y} = \begin{bmatrix} 2 \\ -1 \\ 6 \end{bmatrix}.$$
(b)
$$\vec{u}_1 = \begin{bmatrix} 1 \\ -2 \\ 2 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 0 \\ 5 \\ 5 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 2 \\ 0 \\ 8 \end{bmatrix}, \quad \vec{y} = \begin{bmatrix} -5 \\ 11 \\ -7 \end{bmatrix}.$$
2. Find the rank of each matrix.

(a)
$$\begin{bmatrix} 1 & 2 & 2 & 1 & 1 \\ -3 & 5 & 1 & 1 & 0 \\ 2 & 1 & 0 & 1 & 2 \end{bmatrix}$$
(b)
$$\begin{bmatrix} 1 & 2 & 0 & 1 & -1 \\ 2 & 5 & 1 & 1 & 0 \\ -2 & -4 & 0 & -2 & 2 \end{bmatrix}$$
3. (I) Determine whether $\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4$ are linearly dependent. If yes, find a linear relation. Find also a set of linearly independent vectors that span $\mathrm{Span}\{\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4\}$.
(II) If $\vec{u}_1, \vec{u}_2, \vec{u}_3, \vec{u}_4$ are the columns of the coefficient matrix $A$ of a homogeneous linear system $A\vec{x} = \vec{0}$, find the general solution.

(a)
$$\vec{u}_1 = \begin{bmatrix} 1 \\ -2 \\ 0 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} 1 \\ 5 \\ -2 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix}, \quad \vec{u}_4 = \begin{bmatrix} -5 \\ 11 \\ 2 \end{bmatrix}.$$
(b)
$$\vec{u}_1 = \begin{bmatrix} 1 \\ -2 \\ 0 \\ -1 \end{bmatrix}, \quad \vec{u}_2 = \begin{bmatrix} -1 \\ 3 \\ -2 \\ 1 \end{bmatrix}, \quad \vec{u}_3 = \begin{bmatrix} 2 \\ 0 \\ 1 \\ -2 \end{bmatrix}, \quad \vec{u}_4 = \begin{bmatrix} -2 \\ -1 \\ 2 \\ 1 \end{bmatrix}.$$
4. For each of the following statements, determine whether it is true or false. If your answer is true, state your rationale. If false, provide a counterexample (an example contradicting the statement).

(a) Another notation for the vector $[1, 2, -1]$ is $\begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix}$.

(b) The solution set of the linear system whose augmented matrix is $[\vec{a}_1 \; \vec{a}_2 \; \vec{a}_3 \; \vec{a}_4 \; \vec{b}]$ is the same as the solution set of $x_1 \vec{a}_1 + x_2 \vec{a}_2 + x_3 \vec{a}_3 + x_4 \vec{a}_4 = \vec{b}$.

(c) The columns of a matrix $A$ are linearly independent if $A\vec{x} = \vec{0}$ has only the trivial solution.

(d) If $\vec{a}_1, \vec{a}_2, \vec{a}_3, \vec{a}_4$ are linearly dependent, then $\vec{a}_4$ is a linear combination of $\vec{a}_1, \vec{a}_2, \vec{a}_3$.

(e) The columns of a $6 \times 7$ matrix are linearly dependent.

(f) If $\vec{a}_1, \vec{a}_2, \vec{a}_3, \vec{a}_4$ are linearly independent, then the rank of the matrix $[\vec{a}_1 \; \vec{a}_2 \; \vec{a}_3 \; \vec{a}_4]$ is 4.