Inner Product Spaces:
Definition: Let V be a vector space. An inner product on V is a function, denoted <u, v>, that assigns a real number to each pair of vectors u, v ∈ V. An inner product space is a vector space equipped with an inner product. This function has the following properties:
Properties of Inner Products:
1. <u₁ + u₂, v> = <u₁, v> + <u₂, v>
2. <ku, v> = k<u, v>
3. <u, v> = <v, u>
4. <u, u> > 0 if u ≠ 0 and <u, u> = 0 if u = 0
5. Properties 1 and 2 together ⇒ <au₁ + bu₂, v> = a<u₁, v> + b<u₂, v>
Definition: Suppose u = (u₁, u₂, …, uₙ) and v = (v₁, v₂, …, vₙ). Then the standard inner product on ℝⁿ is defined as <u, v> = u₁v₁ + u₂v₂ + … + uₙvₙ.
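A quick numerical check of this definition (a minimal numpy sketch; the example vectors are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

# Standard inner product on R^n: <u, v> = u1*v1 + u2*v2 + ... + un*vn
print(np.dot(u, v))  # 1*4 + 2*(-1) + 3*2 = 8.0
```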
Theorem: If V is the vector space of all n × n matrices, then <A, B> = Tr(BᵀA) defines an inner product on V.
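A sketch verifying that Tr(BᵀA) agrees with the entrywise (Frobenius) sum of products; the example matrices are arbitrary:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

# Inner product on n x n matrices: <A, B> = Tr(B^T A)
inner = np.trace(B.T @ A)
# Equivalent to summing the entrywise products of A and B
print(inner, np.sum(A * B))  # both give 5.0
```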
The Norm: Let V be an inner product space and u ∈ V. The norm of u is defined as ||u|| = √<u, u>, or equivalently ||u||² = <u, u>.
Definition: The distance between vectors u, v in V is defined as d(u, v) = ||u - v|| = √<u - v, u - v>.
Theorem (Cauchy-Schwarz Inequality): |<u, v>| ≤ ||u|| ||v||
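The norm, distance, and Cauchy-Schwarz inequality can all be checked numerically; a minimal sketch with arbitrary example vectors:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

norm_u = np.sqrt(np.dot(u, u))        # ||u|| = sqrt(<u, u>) = 5.0
dist = np.sqrt(np.dot(u - v, u - v))  # d(u, v) = ||u - v|| = sqrt(8)

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
lhs = abs(np.dot(u, v))               # 11.0
rhs = norm_u * np.sqrt(np.dot(v, v))  # 5 * sqrt(5) ~ 11.18
print(norm_u, dist, lhs <= rhs)       # 5.0, 2.828..., True
```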
Orthogonal Vectors: Let S = {v₁, v₂, …, vₙ} be a set of vectors in an inner product space V. Then S is an orthogonal set if <vᵢ, vⱼ> = 0 for all i ≠ j.
Orthonormal Set: If ||vᵢ|| = 1 for all i and <vᵢ, vⱼ> = 0 for all i ≠ j, the set is called an orthonormal set.
Theorem: Nonzero orthogonal vectors are linearly independent.
Theorem: Let V be an inner product space and S = {v₁, v₂, …, vₙ} an orthogonal basis for V. Then any vector u ∈ V can be expressed as:
u = (<u, v₁> / ||v₁||²)v₁ + (<u, v₂> / ||v₂||²)v₂ + … + (<u, vₙ> / ||vₙ||²)vₙ
In particular, if the vᵢ's are orthonormal, u = <u, v₁>v₁ + <u, v₂>v₂ + … + <u, vₙ>vₙ.
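A small sketch of this expansion, using the (arbitrary) orthogonal basis {(1, 1), (1, -1)} of ℝ²:

```python
import numpy as np

v1 = np.array([1.0, 1.0])    # orthogonal basis vectors (not unit length)
v2 = np.array([1.0, -1.0])
u = np.array([3.0, 5.0])

# Coefficients <u, vi> / ||vi||^2
c1 = np.dot(u, v1) / np.dot(v1, v1)  # 8/2 = 4
c2 = np.dot(u, v2) / np.dot(v2, v2)  # -2/2 = -1
print(c1 * v1 + c2 * v2)             # recovers [3. 5.]
```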
Orthogonal Complement: Let V be an inner product space and W a subspace of V. The orthogonal complement of W is denoted by W⊥ and is the set of all vectors in V orthogonal to every vector in W.
Properties of the Orthogonal Complement: Let V be an inner product space and W a subspace of V. Let B = {w₁, …, wₙ} be a basis for W and B′ = {u₁, …, uₘ} a basis for the orthogonal complement. Then:
1. W ∩ W⊥ = {0}
2. The set {w₁, …, wₙ, u₁, …, uₘ} is a basis for V, so dim(V) = dim(W) + dim(W⊥)
3. Any vector in V can be expressed as the sum of a vector in W and a vector in W⊥.
Vector Projection: Let W be a subspace of an inner product space V. Suppose B = {u₁, u₂, …, uₙ} is an orthogonal basis for W, and let u be any vector in V. Then the projection of u onto W is
proj_W u = (<u, u₁> / ||u₁||²)u₁ + (<u, u₂> / ||u₂||²)u₂ + … + (<u, uₙ> / ||uₙ||²)uₙ
and the component of u orthogonal to W is u - proj_W u.
Theorem: Any finite-dimensional inner product space has an orthogonal basis.
To find the orthogonal basis (the Gram-Schmidt process), set vₙ = uₙ - [(<uₙ, v₁> / ||v₁||²)v₁ + (<uₙ, v₂> / ||v₂||²)v₂ + … + (<uₙ, vₙ₋₁> / ||vₙ₋₁||²)vₙ₋₁]
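A minimal implementation sketch of the Gram-Schmidt process above (the function name and test vectors are illustrative choices, not from the notes):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors (no normalization)."""
    basis = []
    for u in vectors:
        v = np.array(u, dtype=float)
        for w in basis:
            # Subtract the component of v along each vector already found
            v -= (np.dot(v, w) / np.dot(w, w)) * w
        basis.append(v)
    return basis

vs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0])])
print(np.dot(vs[0], vs[1]))  # ~0: the output vectors are orthogonal
```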
Reminders:
1. If v is a vector in an inner product space and {w₁, w₂, …, wₙ} is an orthogonal basis for a subspace W of V, then
proj_W v = (<v, w₁> / ||w₁||²)w₁ + (<v, w₂> / ||w₂||²)w₂ + … + (<v, wₙ> / ||wₙ||²)wₙ
2. Let A be an m × n matrix. Then N(A) and R(A) (the row space) are orthogonal complements, and N(Aᵀ) and col(A) are also orthogonal complements.
3. Any vector b in an inner product space V can be written as b = w₁ + w₂, where w₁ ∈ W and w₂ ∈ W⊥.
Approximating solutions for x in an inconsistent system: Replace b by b̂ = proj_col(A) b; this replacement provides the best approximate solution to Ax = b, and the system Ax = b̂ is consistent.
1. Ax = b is inconsistent.
2. Ax = b̂ is consistent.
3. b̂ is a vector in col(A).
4. b - Ax̂ = b - b̂
5. b - b̂ ∈ N(Aᵀ), since N(Aᵀ) is the orthogonal complement of col(A)
6. b - Ax̂ ∈ N(Aᵀ)
7. Aᵀ(b - Ax̂) = 0
8. AᵀAx̂ = Aᵀb ⇒ This equation is called the normal equation associated with Ax = b; the solution to the normal equation is the least-squares solution to Ax = b.
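A sketch of solving the normal equation for an arbitrary inconsistent 3×2 example system, cross-checked against numpy's built-in least-squares solver:

```python
import numpy as np

# Overdetermined, inconsistent system Ax = b
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Normal equation: A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                  # [1/3, 1/3]
print(np.linalg.lstsq(A, b, rcond=None)[0])   # same least-squares answer
```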
cos(θ) = <u, v> / (||u|| ||v||), where θ is the angle between u and v
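A quick check of the angle formula (the example vectors are chosen to give a 45° angle):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))  # 45.0
```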
Theorem: Eigenvectors of a symmetric matrix that correspond to different eigenvalues are orthogonal.
Theorem: Any symmetric n × n matrix A has n orthogonal (orthonormal) eigenvectors. If an eigenvalue has two or more independent eigenvectors, apply the Gram-Schmidt algorithm to them: vₙ = uₙ - [(<uₙ, v₁> / ||v₁||²)v₁ + (<uₙ, v₂> / ||v₂||²)v₂ + … + (<uₙ, vₙ₋₁> / ||vₙ₋₁||²)vₙ₋₁]
Theorem: If A is an n × n symmetric matrix, then A = PDP⁻¹ if we select the columns of P to be orthonormal eigenvectors; P then has the property P⁻¹ = Pᵀ.
Definition: Any matrix A with the property A⁻¹ = Aᵀ (equivalently, AAᵀ = I) is called an orthogonal matrix.
Definition: Any diagonalizable matrix whose diagonalizing matrix is an orthogonal matrix is said to be orthogonally diagonalizable.
Theorem: Let A be a symmetric matrix and v₁, v₂, …, vₙ orthonormal eigenvectors associated with eigenvalues λ₁, λ₂, …, λₙ. Then A = λ₁v₁v₁ᵀ + λ₂v₂v₂ᵀ + … + λₙvₙvₙᵀ, where all vᵢ are column vectors.
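A sketch of this spectral decomposition for an arbitrary 2×2 symmetric matrix, using numpy's eigh, which returns orthonormal eigenvectors for symmetric input:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues and orthonormal eigenvectors (as columns of P)
eigvals, P = np.linalg.eigh(A)

# Rebuild A as the sum of lambda_i * v_i v_i^T
A_rebuilt = sum(lam * np.outer(P[:, i], P[:, i])
                for i, lam in enumerate(eigvals))
print(np.allclose(A, A_rebuilt))  # True
```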
Definition: A quadratic form is a polynomial of the form Q(X) = XᵀAX, where X = (x₁, x₂, …, xₙ)ᵀ is a column vector and A = [aᵢⱼ] is a symmetric matrix. Then Q(x₁, x₂, …, xₙ) = a₁₁x₁² + a₂₂x₂² + … + aₙₙxₙ² + 2Σ aᵢⱼxᵢxⱼ, summing over i < j.
Theorem: Any quadratic form can be written as a sum of squares. Q(X) = XᵀAX with A = PDP⁻¹, where P is orthogonal, so P⁻¹ = Pᵀ.
Let Y = PᵀX; then Q(X) = Q′(Y) = YᵀDY = λ₁y₁² + λ₂y₂² + … + λₙyₙ²
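A numerical sketch of this change of variables for an arbitrary symmetric A, confirming that Q(X) equals the weighted sum of squares in Y:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 3.0])

eigvals, P = np.linalg.eigh(A)   # orthogonal P, so P^{-1} = P^T
y = P.T @ x                      # change of variables Y = P^T X

Q_x = x @ A @ x                  # original quadratic form X^T A X
Q_y = sum(eigvals * y**2)        # lambda_1 y1^2 + ... + lambda_n yn^2
print(Q_x, Q_y)                  # both 26.0
```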
Complex Properties:
1. Modulus: Z = a + bi ⇒ |Z| = (a² + b²)^(1/2)
2. Conjugate: Z = a + bi ⇒ Z̄ = a - bi
3. ZZ̄ = (a + bi)(a - bi) = a² + b², which is always a real number
4. Dividing by a complex number: multiply the fraction by the complex conjugate of the denominator.
5. Reciprocal of Z: 1/Z = (1/Z)(Z̄/Z̄) = Z̄/|Z|²
6. i⁰ = 1, i¹ = i, i² = -1, i³ = -i
7. Two complex numbers (a + bi) and (c + di) are equal iff a = c and b = d
8. (a + bi) + (c + di) = (a + c) + (b + d)i
9. (a + bi)(c + di) = (ac - bd) + (ad + bc)i
10. Polar form: Z = r(cos θ + i sin θ)
Theorem: A complex number Z is a real number iff Z = Z̄.
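Python's built-in complex type follows these rules, so they can be checked directly (the value 3 + 4i is an arbitrary example):

```python
z = 3 + 4j
print(abs(z))             # modulus: sqrt(3^2 + 4^2) = 5.0
print(z.conjugate())      # conjugate: (3-4j)
print(z * z.conjugate())  # (25+0j): Z * conj(Z) is always real
print(1 / z)              # reciprocal: conj(z)/|z|^2 = (0.12-0.16j)
```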
Complex Inner Product Space:
Let U = (u₁, u₂, …, uₙ) and V = (v₁, v₂, …, vₙ) be vectors in ℂⁿ. Then
<U, V> = u₁v̄₁ + u₂v̄₂ + … + uₙv̄ₙ, or <U, V> = ū₁v₁ + ū₂v₂ + … + ūₙvₙ, depending on whether the second or the first factor is conjugated.
If <U, U> = 0 then U = 0, and if U ≠ 0 then <U, U> > 0.
||U||² = |u₁|² + |u₂|² + … + |uₙ|²
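A sketch of this inner product in numpy; note that np.vdot conjugates its first argument, matching the second convention above (the vectors are arbitrary examples):

```python
import numpy as np

U = np.array([1 + 2j, 3 - 1j])
V = np.array([2 - 1j, 1j])

# np.vdot(U, V) = sum(conj(u_i) * v_i)
print(np.vdot(U, V))

# ||U||^2 = |u1|^2 + |u2|^2, always real and nonnegative
print(np.vdot(U, U).real)  # 5 + 10 = 15.0
```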
Key Definitions:
1. Symmetric matrix ⇔ Aᵀ = A
2. Skew-symmetric matrix ⇔ Aᵀ = -A
3. Orthogonal matrix ⇔ A⁻¹ = Aᵀ
4. Hermitian matrix ⇔ A* = A
5. Skew-Hermitian matrix ⇔ A* = -A
6. Unitary matrix ⇔ A* = A⁻¹
7. Normal matrix ⇔ AA* = A*A