18.700 FALL 2011, FINAL EXAM OUTLINE / REVIEW PACKET
TRAVIS SCHEDLER
(1) Do the final review problems (one of which will appear on the final!)
(2) Take the Fall 2010 final as practice and compare against the solutions.
(3) Go to the final review session, to be scheduled (probably Sunday or Monday, Dec 18 or 19).
(4) Know all the midterm 1 and 2 problems and solutions.
(5) Know all the homework problems and all the solutions. Be able to look at any problem on the homework or in the book, or any similar problem, and know immediately how to do it.
(6) Outline of subjects for the final (not all will be included):
• Vector spaces, subspaces, intersections, direct sums
• Linear transformations, nullspace, range
• Injectivity, surjectivity, and isomorphisms
• The rank-nullity theorem
• Gaussian elimination; computing null space, row space, column space, and left nullspace
of matrices [for column space and left null space, you can just use the transpose of the
matrix and find its row space and null space]; solving systems of equations
• Polynomials applied to matrices (if f(x) = a_n x^n + · · · + a_0, then f(A) = a_n A^n + · · · + a_1 A + a_0 I; see the sketch after this list)
• Eigenspaces and eigenvalues
• Upper-triangular and block upper-triangular matrices M(T) for T ∈ L(V), where F = R or C and V is finite-dimensional
• Eigenbases and diagonal matrices
• Inner product spaces; norms
• Gram-Schmidt orthogonalization and block upper-triangular matrices in orthonormal
bases
• Orthogonal complements and projections
• Adjoint operators and matrices in orthonormal bases
• Self-adjoint operators, isometries, and normal operators
• The spectral theorems
• Positive operators and matrices
• Polar decomposition
• Singular value decomposition
• Characteristic polynomial
• Trace and determinant (in terms of sum and product of eigenvalues, or as formulas)
• Sign of permutations
• Generalized eigenspaces
• The decomposition theorem
• The Cayley-Hamilton theorem
• Minimal polynomials
• Jordan canonical form
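To make the polynomial-of-a-matrix notation concrete, here is a minimal Python (SymPy) sketch; the matrix and polynomial are illustrative choices, not from the course (f here happens to be the characteristic polynomial of A, so f(A) = 0 by Cayley-Hamilton):

    import sympy as sp

    # Evaluate f(x) = x^2 - 5x + 6 at the matrix A;
    # the constant term a_0 becomes a_0 * I.
    A = sp.Matrix([[2, 1],
                   [0, 3]])
    f_of_A = A**2 - 5*A + 6*sp.eye(2)
    print(f_of_A)  # the zero matrix, since f is the characteristic polynomial of A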
(7) Definitions (these overlap substantially with the list above):
• Vector spaces, fields, vector subspaces; intersections, direct sums
• Linear transformations, nullspace, range
• Injectivity, surjectivity, and isomorphisms
• Null space, row space, column space, and left null space of matrices
• Row echelon and reduced row echelon form matrices
• Eigenspaces and eigenvalues
• Trace, determinant, and characteristic polynomial of 2 × 2 matrices
• A block upper-triangular (or block diagonal) matrix
• Inner product spaces
• Norm associated to an inner product space
• Abstract normed spaces
• The parallelogram identity
• Orthogonal complement
• Projection and orthogonal projection operators
• Adjoint operator
• Self-adjoint operators, isometries, and normal operators
• Orthonormal lists, orthonormal bases, and orthonormal eigenbases
• Singular values and a singular value decomposition
• Characteristic polynomial of a complex operator or matrix in terms of upper-triangular form
• Trace and determinant in terms of eigenvalues
• Sign of a permutation
• Trace and determinant in terms of matrix entries (the sum formula)
• Nilpotent operators and matrices
• Generalized eigenspaces
• Characteristic polynomial of a complex operator or matrix in terms of generalized
eigenspaces
• Characteristic polynomial for arbitrary fields using determinant
• Uniqueness property of determinant
• Minimal polynomial
• A matrix in Jordan canonical form
(8) Computations involving eigenvalues and (generalized) eigenspaces (a SymPy sketch follows this list):
• Compute the eigenvalues of a 2 × 2 matrix using the characteristic polynomial (and trace and determinant)
• Compute the eigenvalues of a larger square matrix A using Gaussian elimination on
A − xI.
• Compute the determinant of a square matrix using Gaussian elimination, keeping track of the row operations that change the determinant (so you can invert the effect later: if you multiply a row by 5, then once you compute the determinant you have to divide it by 5, etc.).
• Compute the integer eigenvalues of a matrix by guessing candidates from its trace and determinant (you have to verify that the candidates are actual eigenvalues; once only two eigenvalues remain, the trace and determinant determine them uniquely).
• Compute the determinant of xI −A as above to compute the characteristic polynomial.
• Given an eigenvalue, compute the eigenspace using Gaussian elimination.
• Similarly, compute the generalized eigenspace (the null space of (A − λI)^(dim V): you can square (A − λI) in the 2 × 2 case, or square it twice in the 3 × 3 or 4 × 4 case...)
• For a general linear transformation, find a basis in which its matrix is (block) upper
triangular and compute this matrix.
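For checking such computations by machine, here is a minimal SymPy sketch (the matrix is an illustrative choice, not from the course):

    import sympy as sp

    x = sp.symbols('x')
    A = sp.Matrix([[2, 1, 0],
                   [0, 2, 0],
                   [0, 0, 3]])
    n = A.shape[0]

    # Characteristic polynomial via det(xI - A), as in the item above.
    print(sp.factor((x*sp.eye(n) - A).det()))    # (x - 3)*(x - 2)**2

    # Eigenspace for lambda = 2: null space of A - 2I, by Gaussian elimination.
    print((A - 2*sp.eye(n)).nullspace())         # spanned by (1, 0, 0)

    # Generalized eigenspace for lambda = 2: null space of (A - 2I)^(dim V).
    print(((A - 2*sp.eye(n))**n).nullspace())    # spanned by (1,0,0) and (0,1,0)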
(9) Computations involving orthogonalization and orthogonal projections (a sketch follows this list):
• Compute the orthogonal complement of a vector or a vector subspace
• Compute the orthogonal projection of a vector to a subspace
• Implement the Gram-Schmidt procedure to obtain an orthonormal basis from an arbitrary one by an upper-triangular change of basis.
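Here is a minimal NumPy sketch of the Gram-Schmidt procedure and of orthogonal projection, useful for checking hand computations (the vectors are illustrative choices):

    import numpy as np

    def gram_schmidt(vectors):
        # Orthonormalize a linearly independent list:
        # subtract the projection onto each earlier vector, then normalize.
        basis = []
        for v in vectors:
            w = np.asarray(v, dtype=float)
            for e in basis:
                w = w - np.dot(w, e) * e
            basis.append(w / np.linalg.norm(w))
        return basis

    e1, e2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])

    # Orthogonal projection of v onto span(e1, e2): the sum of <v, e_i> e_i.
    v = np.array([1.0, 2.0, 3.0])
    print(np.dot(v, e1) * e1 + np.dot(v, e2) * e2)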
(10) Computation of polar and singular value decompositions (a numerical check follows this list):
• Given a linear transformation, compute its polar decomposition.
• Use polar decomposition to compute the singular value decomposition of an operator.
• By considering a matrix as a linear transformation, compute its singular value decomposition.
• Use the preceding to write a matrix in the form U_1 D U_2^{-1}, where D is diagonal with nonnegative entries and U_1 and U_2 are unitary matrices.
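As a numerical self-check, here is a NumPy sketch (the matrix is an illustrative choice; NumPy's svd returns U_1, the diagonal entries of D, and U_2^{-1} = U_2^* directly):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 3.0]])

    # Singular value decomposition: A = U1 @ D @ U2^*, singular values >= 0.
    U1, s, U2h = np.linalg.svd(A)
    D = np.diag(s)
    print(np.allclose(A, U1 @ D @ U2h))   # True: A = U_1 D U_2^{-1}

    # Polar decomposition A = W P built from the SVD:
    # W = U1 U2^* is an isometry; P = U2 D U2^* = sqrt(A^* A) is positive.
    W = U1 @ U2h
    P = U2h.conj().T @ D @ U2h
    print(np.allclose(A, W @ P))          # True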
(11) Computation of generalized eigenspaces, characteristic polynomials, minimal polynomials, and Jordan canonical form (a SymPy check follows this list):
• Compute the characteristic polynomial of a linear transformation or matrix
• Given a linear transformation, compute its eigenvalues using the characteristic polynomial or the preceding techniques.
• Given the eigenvalues, compute the generalized eigenspaces (using Gaussian elimination)
• Using this, compute the characteristic polynomial via the formula from class
• Compute the Jordan canonical form of a nilpotent matrix. Generalize to arbitrary
matrices.
• Compute the minimal polynomial of a matrix.
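Here is a minimal SymPy check of these computations (the matrix is an illustrative choice; recall that the minimal polynomial is the product over eigenvalues λ of (x − λ)^(size of the largest Jordan block for λ)):

    import sympy as sp

    x = sp.symbols('x')
    A = sp.Matrix([[2, 1, 0],
                   [0, 2, 0],
                   [0, 0, 2]])

    print(A.charpoly(x).as_expr())    # (x - 2)**3, expanded
    P, J = A.jordan_form()            # A = P J P^{-1}, J in Jordan canonical form
    print(J)                          # a 2x2 block and a 1x1 block for lambda = 2

    # Largest block has size 2, so the minimal polynomial is (x - 2)**2:
    print((A - 2*sp.eye(3))**2)       # the zero matrix, while A - 2I is not zero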
(12) Major results (know the statements, not word for word but mathematically; know how to prove them or demonstrate implications only where this is mentioned below). Especially important ones are in bold.
• How to prove properties of vector spaces: Propositions 1.2–1.6
• Equivalent statements about direct sum: Propositions 1.8 and 1.9
• Theorems about bases: Theorem 2.6, and how it implies Theorem 2.14
• Theorems about direct sums: Theorem 2.12, and how it implies Proposition 2.13;
Theorem 2.12, complements of subspaces
• Properties of direct sum: Theorem 2.18; more generally, if V = V_1 + · · · + V_n is finite-dimensional, then V = V_1 ⊕ · · · ⊕ V_n if and only if dim V = dim V_1 + · · · + dim V_n.
• Rank-nullity theorem (Theorem 3.4), and also the theorem: even if V and W are infinite-dimensional, if T ∈ L(V, W) and V = U ⊕ null T, then the restriction T|_U : U → range(T) is an isomorphism onto range(T). Corollaries 3.5 and 3.6.
• Theorem 3.18: all finite-dimensional vector spaces of the same dimension are isomorphic
• Theorem 3.21: equivalent statements for isomorphisms (for the finite-dimensional
case).
• Also, in the finite-dimensional case, ST is invertible iff S and T are invertible. ST = I iff TS = I; then we say S = T^{-1}.
• Theorem 4.7: fundamental theorem of algebra
• Theorem 5.6, and its generalization, the decomposition theorem below! Corollary 5.9.
• Existence of eigenvalues in complex case: Theorem 5.10
• Block upper-triangular matrices for linear transformations: Theorem 5.13
and slides for lecture 13
• Cauchy-Schwarz inequality: Theorem 6.6; triangle inequality: Theorem 6.9
• Gram-Schmidt orthogonalization: 6.20
• Orthogonal complements: Theorem 6.29
• Existence of adjoints: Theorem 6.45 and following discussion; see also Lecture 17
• The spectral theorem: Theorems 7.9, 7.13, and 7.25; see also lectures
• Polar decomposition: Theorem 7.41
• Singular Value Decomposition: Theorem 7.46
• The decomposition theorem: Theorem 8.23; see strengthened version in Lecture 21!
• The Cayley-Hamilton theorem: Theorem 8.20; warm-up from Lecture 21; PS 11 #2
• The number of times λ appears on the diagonal in an upper-triangular matrix is the dimension of V(λ): Lecture 21
• Jordan canonical form: Theorem 8.47 (also Lemma 8.40); Lecture 22
• Uniqueness of Jordan canonical form: Lecture 22
• Minimal polynomials: Theorems 8.34 and 8.36; PS 10 #8; warm-up to Lecture 23
• Square Roots: Theorem 8.32; PS 10 #9
• Theorems on trace: PS 10 #5; Proposition 10.9, Corollary 10.10, Theorem 10.11, Corollary 10.12. Take a look at Corollary 10.13 too!
• Characterization of determinant: Theorem from Lecture 23; in particular det(AB) = det(A) det(B), cf. Theorem 10.31.
• Determinant and volume: Theorem 10.38; Lecture 24
• Characteristic polynomial and determinant: Lecture 23, slides 11–12; cf. 10.17. Use this to define determinant over general F.
• Determinant and characteristic polynomial of block upper-triangular matrices: PS 11 #5. Also minimal polynomials of block diagonal matrices: see the same exercise.