Mining Preferences from
Superior and Inferior
Examples
KDD’08
Outline
Introduction
Problem Definition
Greedy Method
Term-Based Algorithm
Condition-Based Algorithm
Experimental Result
Conclusion
Introduction
Consider a multidimensional space where the user's preferences on some categorical attributes are unknown.
Example

ID   Price   Age   Developer
a    1600    2     X
b    2400    1     Y
c    3000    5     Z
Cont.
Superior Example (S)
For each superior example o, according to the customer's preference there does not exist another realty o' which is as good as o in every aspect and better than o in at least one aspect.
Inferior Example (Q)
For each inferior example o, according to the customer's preference there exists at least one realty o' which is as good as o in every aspect and better than o in at least one aspect.
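These two definitions are the usual skyline dominance test. A minimal sketch, assuming numeric attributes where smaller values are preferred (the price/age columns of the introduction's example; the function names are mine):

```python
def dominates(o_prime, o):
    """o' dominates o: at least as good in every aspect (<=)
    and strictly better in at least one (<)."""
    return (all(x <= y for x, y in zip(o_prime, o))
            and any(x < y for x, y in zip(o_prime, o)))

def is_superior(o, objects):
    """A superior example is one that no other object dominates."""
    return not any(dominates(p, o) for p in objects if p is not o)

# (price, age) for realties a, b, c, assuming lower is better on both.
realties = [(1600, 2), (2400, 1), (3000, 5)]
print([is_superior(r, realties) for r in realties])  # [True, True, False]
```

Under these assumptions c is inferior because a is cheaper and newer, while a and b are incomparable.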
Problem Definition
D = D_D ∪ D_U such that D_D ∩ D_U = ∅
D_D: the determined attributes.
D_U: the undetermined attributes.
SPS (Satisfying Preference Set)
R = {≼_{d'+1}, …, ≼_d} is called an SPS if each ≼_{d'+i} (1 ≤ i ≤ d − d') is a preference on attribute D_{d'+i} such that, under the combined preferences (≼_1, …, ≼_{d'}, ≼_{d'+1}, …, ≼_d), every superior example is in the skyline and no inferior example is.
d: the dimensionality of D
d': the number of determined attributes
Example
A: determined attribute, with the given preference on {a1, a2}
B, C: undetermined attributes
S = {o1, o3}, Q = {o2}

Object   A    B    C
o1       a1   b1   c1
o2       a1   b1   c2
o3       a2   b1   c2
o4       a2   b2   c2
o5       a2   b3   c2

(Figure: preference graphs — the given preference between a1 and a2 on A, and the mined preference c1 ≼_C c2 on C.)
Cont.
Problem 1 (SPS Existence)
Given a set of superior examples S and a set of inferior examples Q, determine whether there exists at least one SPS R with respect to S and Q.
Problem 2 (Minimal SPS)
For a set of superior examples S and a set of inferior examples Q, find an SPS R = {≼_{d'+1}, …, ≼_d} with respect to S and Q such that |E(≼_{d'+1}, …, ≼_d)| is minimized. R is called a minimal SPS.
Cont.
Theorem (Multidimensional Preference)
In space D = D_1 × D_2 × … × D_d, let ≼_i (1 ≤ i ≤ d) be a preference on attribute D_i and ≼ = (≼_1, …, ≼_d). Then

|E(≼)| = ∏_{i=1}^{d} ( |E(≼_i)| + |D_i| ) − ∏_{i=1}^{d} |D_i|

where |D_i| is the number of distinct values in attribute D_i.
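The theorem's right-hand side is easy to evaluate numerically; a small helper (the function name is mine) that will be reused in the complexity-increment example later:

```python
from math import prod

def total_preference_size(edge_counts, domain_sizes):
    """|E(<=)| = prod_i(|E(<=_i)| + |D_i|) - prod_i |D_i|,
    given per-attribute explicit-preference counts |E(<=_i)|."""
    return (prod(e + d for e, d in zip(edge_counts, domain_sizes))
            - prod(domain_sizes))

# With |D3| = 5, |D4| = 3 (the values used in the later example),
# one explicit edge on D3 and none on D4:
print(total_preference_size([1, 0], [5, 3]))  # (1+5)*(0+3) - 5*3 = 3
```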
Term-Based Algorithm
Object-id   D1   D2   D3   D4   Label
o1          1    5    a3   b3
o2          1    6    a2   b1   Inferior
o3          1    6    a2   b3
o4          2    2    a1   b1
o5          2    5    a2   b2   Inferior
o6          3    1    a4   b3
o7          3    4    a2   b2   Inferior
o8          6    1    a5   b1   Inferior
o9          6    1    a5   b3
o10         6    2    a1   b1
Cont.

Inferior q   P(q)   Condition C_q(p)
o2           o1     C_o2(o1) = (a3 ≼3 a2) ∧ (b3 ≼4 b1)
             o3     C_o2(o3) = (b3 ≼4 b1)
o5           o1     C_o5(o1) = (a3 ≼3 a2) ∧ (b3 ≼4 b2)
             o4     C_o5(o4) = (a1 ≼3 a2) ∧ (b1 ≼4 b2)
o7           o4     C_o7(o4) = (a1 ≼3 a2) ∧ (b1 ≼4 b2)
             o6     C_o7(o6) = (a4 ≼3 a2) ∧ (b3 ≼4 b2)
o8           o6     C_o8(o6) = (a4 ≼3 a5) ∧ (b3 ≼4 b1)
             o9     C_o8(o9) = (b3 ≼4 b1)
Cont.
|E(≼)| = ∏_{i=1}^{d} ( |E(≼_i)| + |D_i| ) − ∏_{i=1}^{d} |D_i|

Complexity increment CI
R = {≼3, ≼4} with ≼3 = {a1 ≼3 a2}, ≼4 = ∅; |D3| = 5, |D4| = 3
|E(R)| = (1 + 5) × (0 + 3) − 5 × 3 = 3
If a3 ≼3 a1 is selected on D3, transitivity also adds a3 ≼3 a2, so |E(≼3)| = 3 and
|E(R)| = (3 + 5) × (0 + 3) − 5 × 3 = 9
CI(a3 ≼3 a1) = 9 − 3 = 6
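The CI computation above follows directly from the theorem's formula; a sketch (the helper name is mine):

```python
from math import prod

def e_size(edge_counts, domain_sizes):
    # |E(R)| via the multidimensional preference theorem
    return (prod(e + d for e, d in zip(edge_counts, domain_sizes))
            - prod(domain_sizes))

domains = [5, 3]                  # |D3| = 5, |D4| = 3
before = e_size([1, 0], domains)  # R = {a1 <=3 a2}: |E(R)| = 3
# Selecting a3 <=3 a1 also entails a3 <=3 a2 by transitivity,
# so |E(<=3)| grows from 1 to 3:
after = e_size([3, 0], domains)   # |E(R)| = 9
print(after - before)             # CI(a3 <=3 a1) = 6
```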
Cont.
Inferior Example Coverage
Cov(t) is the number of inferior examples newly satisfied if t is selected.
Cov(b3 ≼4 b1) = max{0.5, 1} + max{0.5, 1} = 2

Inferior q   P(q)   Condition C_q(p)
o2           o1     C_o2(o1) = (a3 ≼3 a2) ∧ (b3 ≼4 b1)
             o3     C_o2(o3) = (b3 ≼4 b1)
o5           o1     C_o5(o1) = (a3 ≼3 a2) ∧ (b3 ≼4 b2)
             o4     C_o5(o4) = (a1 ≼3 a2) ∧ (b1 ≼4 b2)
o7           o4     C_o7(o4) = (a1 ≼3 a2) ∧ (b1 ≼4 b2)
             o6     C_o7(o6) = (a4 ≼3 a2) ∧ (b3 ≼4 b2)
o8           o6     C_o8(o6) = (a4 ≼3 a5) ∧ (b3 ≼4 b1)
             o9     C_o8(o9) = (b3 ≼4 b1)
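One reading of this fractional coverage that reproduces Cov(b3 ≼4 b1) = 2 (data structures and names are mine, not the paper's): for each still-uncovered inferior example, take the best fraction of satisfied terms over those of its conditions that contain the candidate term.

```python
# Conditions C_q(p) from the table, as sets of terms per inferior example.
CONDITIONS = {
    "o2": [{"a3<=3a2", "b3<=4b1"}, {"b3<=4b1"}],
    "o5": [{"a3<=3a2", "b3<=4b2"}, {"a1<=3a2", "b1<=4b2"}],
    "o7": [{"a1<=3a2", "b1<=4b2"}, {"a4<=3a2", "b3<=4b2"}],
    "o8": [{"a4<=3a5", "b3<=4b1"}, {"b3<=4b1"}],
}

def cov(term, selected, uncovered):
    """Sum, over uncovered inferior examples, of the best satisfied-term
    fraction among that example's conditions containing `term`."""
    total = 0.0
    for q in uncovered:
        fractions = [len(c & (selected | {term})) / len(c)
                     for c in CONDITIONS[q] if term in c]
        total += max(fractions, default=0.0)
    return total

# First iteration: nothing selected yet, all four inferiors uncovered.
print(cov("b3<=4b1", set(), set(CONDITIONS)))  # max{0.5,1} + max{0.5,1} = 2.0
```

Here o2 and o8 each contribute max{0.5, 1} = 1 (their single-term conditions become fully satisfied), while o5 and o7 have no condition containing the term.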
Cont.
score(t) = Cov(t) / CI(t)

Term \ Iteration    1      2       3
a1 ≼3 a2            1/3    1/4*    \
a3 ≼3 a2            1/3    1/8     1/8
a4 ≼3 a2            1/6    1/8     1/8
a4 ≼3 a5            1/6    \       \
b1 ≼4 b2            1/5    1/10    2/12*
b3 ≼4 b1            2/5*   \       \
b3 ≼4 b2            1/5    1/5     1/6

(* marks the term selected in that iteration)
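Putting score(t) = Cov(t)/CI(t) together gives the term-based greedy loop. The sketch below uses my own representations and helper names and breaks ties by term order; on the running example it selects b3 ≼4 b1, then a1 ≼3 a2, then b1 ≼4 b2, matching the starred entries in the table.

```python
from math import prod

DOMAINS = {3: 5, 4: 3}  # |D3|, |D4|
# A term is an (attribute, better_value, worse_value) edge.
CONDITIONS = {
    "o2": [{(3, "a3", "a2"), (4, "b3", "b1")}, {(4, "b3", "b1")}],
    "o5": [{(3, "a3", "a2"), (4, "b3", "b2")}, {(3, "a1", "a2"), (4, "b1", "b2")}],
    "o7": [{(3, "a1", "a2"), (4, "b1", "b2")}, {(3, "a4", "a2"), (4, "b3", "b2")}],
    "o8": [{(3, "a4", "a5"), (4, "b3", "b1")}, {(4, "b3", "b1")}],
}
TERMS = sorted({t for conds in CONDITIONS.values() for c in conds for t in c})

def closure(edges):
    """Transitive closure of the selected edges within each attribute."""
    edges = set(edges)
    grew = True
    while grew:
        grew = False
        for (d1, a, b) in list(edges):
            for (d2, c, e) in list(edges):
                if d1 == d2 and b == c and (d1, a, e) not in edges:
                    edges.add((d1, a, e))
                    grew = True
    return edges

def e_size(edges):
    """|E(R)| via the multidimensional preference theorem."""
    cl = closure(edges)
    return (prod(sum(1 for t in cl if t[0] == d) + n
                 for d, n in DOMAINS.items())
            - prod(DOMAINS.values()))

def frac(cond, selected):
    return len(cond & selected) / len(cond)

def greedy():
    selected, picked = set(), []
    uncovered = set(CONDITIONS)
    while uncovered:
        def score(t):
            cov = sum(max((frac(c, selected | {t})
                           for c in CONDITIONS[q] if t in c), default=0.0)
                      for q in uncovered)
            ci = e_size(selected | {t}) - e_size(selected)
            return cov / ci if ci else 0.0
        best = max((t for t in TERMS if t not in selected), key=score)
        selected.add(best)
        picked.append(best)
        uncovered = {q for q in uncovered
                     if all(frac(c, selected) < 1 for c in CONDITIONS[q])}
    return picked

print(greedy())
```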
Condition-Based Algorithm
Condition \ Iteration   1       2         3
C_o2(o1)                3/9     \         \
C_o2(o3)                2/5*    \         \
C_o5(o1)                2/9     1.5/10*   \
C_o5(o4)                2/9     2/16      \
C_o7(o4)                2/9     2/16      1/12
C_o7(o6)                1.5/9   1.5/10    1/5*
C_o8(o6)                2.5/9   \         \
C_o8(o9)                2/5     \         \

(* marks the condition selected in that iteration)
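The condition-based variant scores whole conditions instead of single terms: selecting a condition selects all of its terms at once, and CI is again the growth of |E(R)|. A sketch of the first starred entry (helper names are mine):

```python
from math import prod

DOMAINS = [5, 3]  # |D3|, |D4|

def e_size(edge_counts):
    # |E(R)| via the multidimensional preference theorem
    return (prod(e + d for e, d in zip(edge_counts, DOMAINS))
            - prod(DOMAINS))

# Iteration 1, condition C_o2(o3) = (b3 <=4 b1): selecting it adds one
# edge on D4, so CI = (0+5)*(1+3) - 5*3 = 5.
ci = e_size([0, 1]) - e_size([0, 0])
# It fully satisfies o2 (via C_o2(o3)) and o8 (via C_o8(o9)): Cov = 2.
cov = 2
print(f"{cov}/{ci}")  # 2/5, the condition selected first
```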
Experimental Results
Conclusion
Mining preferences from superior and inferior examples
Greedy methods guided by the score function score(t) = Cov(t) / CI(t)