SEMI-SUPERVISED NMF WITH HARD CONSTRAINTS
Zhao Long
Paper: Constrained Nonnegative Matrix Factorization for Image Representation, IEEE TPAMI, 2012
• UNSUPERVISED NMF
• NMF factorizes the data matrix X (features × points) as X ≈ UV^T; each row of V gives a point's low-dimensional representation, or equivalently its (soft) clustering result (see the sketch below).
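A minimal sketch of plain NMF via Lee–Seung multiplicative updates, just to make the role of V concrete (the convention assumed here is X of size features × points with X ≈ UV^T; variable and function names are mine):

```python
# Plain NMF, X ≈ U V^T, via Lee-Seung multiplicative updates (illustrative sketch).
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """X: (m features x n points) nonnegative matrix; returns U (m x k), V (n x k)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    for _ in range(n_iter):
        U *= (X @ V) / (U @ (V.T @ V) + eps)    # update basis vectors
        V *= (X.T @ U) / (V @ (U.T @ U) + eps)  # update point representations
    return U, V

# Each point's cluster can be read off as the largest entry in its row of V.
X = np.abs(np.random.default_rng(1).random((50, 30)))
U, V = nmf(X, k=3)
labels = V.argmax(axis=1)
```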
• SEMI-SUPERVISED Clustering
• Consider a data set of n points x1, x2, …, xn, where label information is available for the first l points x1, …, xl and the remaining (n − l) points xl+1, …, xn are unlabeled.
• Suppose there are c classes and each labeled point x1, …, xl carries one class label. First build an l × c indicator matrix C with cij = 1 if xi is labeled with the jth class and cij = 0 otherwise. With the indicator matrix C, the label constraint matrix A is defined as the n × (c + n − l) block matrix A = [ C 0 ; 0 I_{n−l} ]: the indicator C occupies the rows of the labeled points and the identity I_{n−l} the rows of the unlabeled ones.
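A small sketch, with assumed variable names, of building the indicator matrix C and the label constraint matrix A described above:

```python
# Build C (l x c indicator) and A = [[C, 0], [0, I_{n-l}]] of size n x (c + n - l).
import numpy as np

def label_constraint_matrix(labels, n, c):
    """labels: class indices (0..c-1) of the first l labeled points; n: total number of points."""
    l = len(labels)
    C = np.zeros((l, c))
    C[np.arange(l), labels] = 1.0   # c_ij = 1 iff x_i carries label j
    A = np.zeros((n, c + n - l))
    A[:l, :c] = C                   # labeled points share the c label columns
    A[l:, c:] = np.eye(n - l)       # each unlabeled point keeps its own free column
    return C, A

C, A = label_constraint_matrix(labels=[0, 1, 1, 2], n=10, c=3)  # hypothetical toy input
```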
• To incorporate the label information into NMF, CNMF imposes the hard constraint V = AZ, so the factorization becomes X ≈ UV^T = U Z^T A^T with U, Z ≥ 0 (see the update sketch below).
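A minimal sketch of CNMF-style updates under the hard constraint V = AZ, i.e., minimizing ||X − U Z^T A^T||_F^2 with Lee–Seung style multiplicative updates (the paper derives updates of this multiplicative form; the code below, names and stopping rule included, is my own illustration, not the reference implementation):

```python
# CNMF sketch: X (m x n) data, A (n x p) label constraint matrix, V = A Z.
import numpy as np

def cnmf(X, A, k, n_iter=200, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    m, _ = X.shape
    p = A.shape[1]
    U = rng.random((m, k))
    Z = rng.random((p, k))
    for _ in range(n_iter):
        AZ = A @ Z
        U *= (X @ AZ) / (U @ (AZ.T @ AZ) + eps)                   # basis update
        Z *= (A.T @ (X.T @ U)) / (A.T @ A @ Z @ (U.T @ U) + eps)  # auxiliary factor update
    V = A @ Z  # rows of V; labeled points with the same class get identical rows
    return U, V
```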
• Another way to use the label information: label propagation techniques.
• Y is an n × c matrix with Yij = 1 if xi is labeled as yi = j and Yij = 0 otherwise; that is, Y is just the left (label) block of A.
• Replace A with Y, i.e., A = Y, so the constraint becomes V = YZ (the same CNMF machinery, with Y in place of A).
• Popular label propagation algorithms:
• Gaussian Fields and Harmonic Functions (GFHF): Y → Y1
• Learning with Local and Global Consistency (LLGC): Y → Y2
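LLGC is one of the two propagation algorithms named above; below is a hedged sketch of its closed-form propagation, which could produce the soft label matrix Y2 plugged into the constraint V = Y2 Z (GFHF would play the same role via its harmonic solution; the function name and the final nonnegativity clipping are my assumptions):

```python
# LLGC-style propagation: Y2 = (I - alpha * S)^{-1} Y with S = D^{-1/2} W D^{-1/2}.
import numpy as np

def llgc_propagate(W, Y, alpha=0.9):
    """W: (n x n) nonnegative affinity matrix; Y: (n x c) initial hard label matrix."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt                # symmetrically normalized affinity
    n = W.shape[0]
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)  # closed-form LLGC solution (up to scale)
    return np.maximum(F, 0.0)                      # keep entries nonnegative for the NMF constraint
```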
• Experiments
• Compared methods:
• CNMF
• CNMF + GFHF
• CNMF + LLGC
• ACCMU
• NMFMU
• Semi-supervised Graph regularized Nonnegative Matrix Factorization on Manifold (GNMF)
• Results (c = 1:20; the Yale data set has 15 classes)
• ACC
[Plot: clustering accuracy (ACC) vs. k for CNMF, CNMF+GFHF, CNMF+LLGC, ACCMU, NMFMU, GNMF]
• NMI
[Plot: normalized mutual information (NMI) vs. k for CNMF, CNMF+GFHF, CNMF+LLGC, ACCMU, NMFMU, GNMF]
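For completeness, a small sketch of how the two reported metrics, ACC and NMI, are commonly computed for clustering evaluation (helper names are mine; ACC uses the Hungarian matching from scipy, NMI comes from scikit-learn):

```python
# Clustering accuracy (ACC) via best cluster-to-class matching, and NMI.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score

def clustering_acc(y_true, y_pred):
    """Fraction of points correctly assigned under the best one-to-one cluster/class map."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    k = int(max(y_true.max(), y_pred.max())) + 1
    cost = np.zeros((k, k), dtype=int)
    for t, p in zip(y_true, y_pred):
        cost[t, p] += 1
    rows, cols = linear_sum_assignment(-cost)  # maximize matched counts
    return cost[rows, cols].sum() / len(y_true)

def clustering_nmi(y_true, y_pred):
    return normalized_mutual_info_score(y_true, y_pred)
```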