Lecture 07: EigenFaces

EigenFaces
Overview
 Consider a set of n database images (faces)
   - Each stored as an m-dimensional vector (where m = width * height)
 Do Principal Component Analysis (PCA)
   - Reduces the dimension from m to p, where p << m
   - p is the number of important features
   - Each feature is an m-dimensional vector
       - An "EigenFace"
       - Represents a common feature
   - A form of compression
 Take a novel image, convert it to p-space, and find the closest match
   - Using distance in p-space
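The matching step above can be sketched as follows. This is a minimal illustration, not a full EigenFaces implementation: it assumes the mean, the top-p eigenfaces, and the projected database coordinates have already been computed, and the function name `nearest_face` is made up for this sketch.

```python
import numpy as np

def nearest_face(novel, mu, eigenfaces, db_coords):
    """Project a novel image into p-space and return the index of the
    closest database face.

    novel      : (m,) flattened novel image
    mu         : (m,) mean of the database images
    eigenfaces : (p, m) top-p eigenvectors, one per row
    db_coords  : (n, p) database images already projected into p-space
    """
    coords = eigenfaces @ (novel - mu)            # p-space coordinates of the novel image
    dists = np.linalg.norm(db_coords - coords, axis=1)  # distance in p-space
    return int(np.argmin(dists))                  # index of the closest match
```

Note that the comparison happens entirely in p-space, so each database face only needs p numbers stored, not m.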
Simple PCA example
 Take a simple table
   - 3 columns: height, weight, and GPA
   - Height and weight are related; GPA is (hopefully) not
   - [Show it on the board, including the visualization]
 Calculate the average value: μ
 Calculate the covariance matrix, C
   - Detour: variance in general, then the covariance matrix
 Calculate the EigenSystem of C
   - Another day (?) a derivation of this math
   - For now: "lines of greatest variance"
   - A set of perpendicular "basis vectors" (the EigenVectors)
       - A bigger associated EigenValue = more variance in the original data set
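The three steps above (mean, covariance matrix, eigensystem) can be sketched in NumPy. The height/weight/GPA values here are made up for illustration:

```python
import numpy as np

# Hypothetical height/weight/GPA table (one row per person)
X = np.array([
    [60.0, 110.0, 3.2],
    [65.0, 140.0, 2.8],
    [70.0, 155.0, 3.9],
    [72.0, 180.0, 3.0],
])

mu = X.mean(axis=0)                  # the average value, μ
C = np.cov(X, rowvar=False)          # 3x3 covariance matrix, C

# Eigensystem of C: C is symmetric, so use eigh
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]    # biggest eigenvalue first = most variance
eigvals = eigvals[order]
eigvecs = eigvecs[:, order]          # eigenvectors as columns
```

Because height and weight are correlated, the first eigenvector should point roughly along the height/weight direction, while GPA contributes mostly to its own axis.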
Simple PCA example, continued
 Results of PCA:
   [Figure: 2-D scatter plot of the data with the mean μ marked and the two perpendicular eigenvectors EV₁ and EV₂ drawn through it; both axes run from 0 to 80.]
Simple PCA example, continued
 Conversion from Real Space => Eigen Space
   - A projection
   - Real Space axes: h = (1, 0, 0)ᵀ, w = (0, 1, 0)ᵀ, g = (0, 0, 1)ᵀ
   - A point in real space: p = p₁h + p₂w + p₃g
   - We need to figure out an alternate way of storing p (call it p′)
       - Subtract μ from p to get p_rel
       - Project p_rel onto each of the Eigen Space axes to get a component of p′
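The two sub-steps above (subtract the mean, then project onto each Eigen Space axis) can be sketched as a short function. The name `to_eigen_space` is made up here, and the sketch assumes the eigenvectors are stored as orthonormal columns:

```python
import numpy as np

def to_eigen_space(p, mu, eigvecs):
    """Convert a real-space point p to eigen-space coordinates p'.

    p       : (m,) point in real space
    mu      : (m,) mean of the data
    eigvecs : (m, k) eigenvectors as orthonormal columns
    """
    p_rel = p - mu            # subtract μ from p to get p_rel
    return eigvecs.T @ p_rel  # dot product with each axis = one component of p'
```

Each row of the result is the length of p_rel's shadow along one eigenvector, which is exactly the projection described above.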
Simple PCA example, continued
 Conversion from Eigen Space => Real Space
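The reverse conversion is the same projection run backwards: scale each eigenvector by its p′ component, sum, and add μ back. A minimal sketch, with the hypothetical name `to_real_space`; when fewer than m eigenvectors are kept the result is only an approximation of the original point, which is the compression idea on the next slide:

```python
import numpy as np

def to_real_space(p_prime, mu, eigvecs):
    """Convert eigen-space coordinates p' back to real space.

    p_prime : (k,) coordinates in eigen space
    mu      : (m,) mean of the data
    eigvecs : (m, k) eigenvectors as orthonormal columns

    With k < m eigenvectors kept, the reconstruction is lossy.
    """
    return mu + eigvecs @ p_prime  # weighted sum of eigenvectors, shifted by μ
```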
PCA as a compression technique
EigenFaces