Connotation (Stony Brook CS)

Connotation:
A Dash of Sentiment Beneath the Surface Meaning

Con·notation
“com-” (“together or with”) + “notare” (“to mark”)
• Commonly understood cultural or emotional association that some word carries, in addition to its explicit or literal meaning (denotation).
• Generally described as positive or negative.
Good News? Bad News?
“Decrease in deforestation drives the trend,
but emissions from energy and agriculture grow.”
• deforestation: the removal of trees
• emission: the production and discharge of something, esp. gas or radiation
Learning the General Connotation
“Decrease in deforestation drives the trend,
but emissions from energy and agriculture grow.”
Both “deforestation” and “emission” are negatively connotative in general.
ConŸnotation
“com-” (“together or with”)
| “notare” (“to mark”)
• Commonly understood cultural or emotional
association that some word carries, in addition
to its explicit or literal meaning (denotation).
• Generally described as positive or negative.
8
Sentiment vs. Connotation
• Sentiment words: joy (positive), sick (negative).
• Many words are neutral in sentiment yet still connotative:
  - positive connotation: scientist, music, surfing, rose
  - negative connotation: deforestation, flu, emission, bedbug
• Candidate words to classify: surfing, scientist, grid, rose, header, blister, emission, salt, bedbug, …
[Figure: Venn diagram with connotation as a superset of sentiment]
Learning the General Connotation
• Data
• Linguistic insights
• Graph representation
• Inference algorithms
• Evaluations
Data
• Web-driven data: Google Web 1T (Brants and Franz, 2006)
  - N-grams (1 ≤ n ≤ 5) with frequency of occurrence
  - Example: “prevent financial malware 4130”
• Dictionary-driven data: WordNet (Miller, 1995)
  - Synsets: synonyms, antonyms
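The Web 1T resource above is essentially a giant table from n-grams (1 ≤ n ≤ 5) to frequencies. A minimal sketch of building such a table over a toy corpus (the sentences here are made up; the real counts are precomputed by Google):

```python
from collections import Counter

def ngram_counts(tokens, max_n=5):
    """Count all n-grams with 1 <= n <= max_n, Web 1T style."""
    counts = Counter()
    for n in range(1, max_n + 1):
        for i in range(len(tokens) - n + 1):
            counts[" ".join(tokens[i:i + n])] += 1
    return counts

# Toy token stream standing in for the Web 1T data (hypothetical text).
corpus = "we prevent financial malware and we prevent fraud".split()
counts = ngram_counts(corpus)
print(counts["prevent"])                  # unigram frequency
print(counts["prevent financial malware"])  # trigram frequency
```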
Learning the General Connotation
• Data
• Linguistic insights
• Graph representation
• Inference algorithms
• Evaluations
Diverse Linguistic Insights
• Semantic prosody
• Semantic parallelism of coordination
• Distributional similarity
• Semantic relations
Semantic Prosody (Sinclair 1991, 2004; Louw 1993)
Selectional preference on connotation:
• [enjoy]Pred [music]Arg → “enjoy the art”, “enjoy the wine”, “enjoy life”, …
• [prevent]Pred [bedbugs]Arg → “avoid bedbugs”, “avoid danger”, …
Candidate arguments pick up polarity from the predicates that select them:
• accident ← “avoid accident”, “cause accident”, …
• art ← “enjoy the art”, “appreciate the art”, …
Connotative Predicate (Feng et al. 2011)
A predicate that has selectional preference on the connotative polarity of some of its semantic arguments.

Connotative predicate | Sentiment of predicate | Preference on arguments | Example
suffer                | negative               | negative                | “suffering from cough”
cure                  | positive               | negative                | “cure backache”
cause                 | neutral                | negative                | “caused CO2 emissions”
20 Positive Connotative Predicates (Feng et al. 2011)
accomplish, achieve, advance, advocate, admire, applaud, appreciate, compliment, congratulate, develop, desire, enhance, enjoy, improve, praise, promote, respect, save, support, win

20 Negative Connotative Predicates (Feng et al. 2011)
alleviate, accuse, avert, avoid, cause, complain, condemn, criticize, detect, eliminate, eradicate, mitigate, overcome, prevent, prohibit, protest, refrain, suffer, tolerate, withstand
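The seed predicates can bootstrap polarity guesses for their arguments. A minimal sketch of this selectional-preference voting (seed lists abbreviated; the predicate-argument pairs are hypothetical, and exceptions like “cure”, a positive-sentiment predicate preferring negative arguments, are ignored here):

```python
from collections import Counter

# Abbreviated versions of the 20+20 seed lists from the slides.
POS_SEEDS = {"enjoy", "achieve", "praise", "save", "win"}
NEG_SEEDS = {"prevent", "suffer", "avoid", "cause", "tolerate"}

def vote_polarity(pred_arg_pairs):
    """Tally seed-predicate votes per argument: positive seeds vote +,
    negative seeds vote -, following the selectional-preference idea."""
    votes = Counter()
    for pred, arg in pred_arg_pairs:
        if pred in POS_SEEDS:
            votes[arg] += 1
        elif pred in NEG_SEEDS:
            votes[arg] -= 1
    return {a: ("+" if v > 0 else "-" if v < 0 else "0") for a, v in votes.items()}

# Hypothetical predicate-argument pairs extracted from n-grams.
pairs = [("enjoy", "music"), ("prevent", "bedbugs"),
         ("avoid", "bedbugs"), ("achieve", "success")]
print(vote_polarity(pairs))
```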
Diverse Linguistic Insights
• Semantic prosody [Corpus: Google N-grams]
  - “enjoy * ”; “prevent * ”
• Semantic parallelism of coordination [Corpus: Google N-grams]
  - Pattern “ * and * ”, e.g., “music and wine”
• Distributional similarity [Corpus: Google N-grams]
  - sim(“findings”, “potentials”) > sim(“findings”, “modifications”)
• Semantic relations [Corpus: WordNet]
  - Synonyms
  - Antonyms
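The coordination counts from the “ * and * ” pattern can be turned into PMI scores between arguments. A hedged sketch with made-up counts (marginals are estimated from the pair counts themselves, which is a simplification):

```python
import math
from collections import Counter

def pmi(pair_counts, total_pairs):
    """PMI(a, b) = log P(a, b) / (P(a) P(b)) over coordination pairs."""
    left, right = Counter(), Counter()
    for (a, b), c in pair_counts.items():
        left[a] += c
        right[b] += c
    scores = {}
    for (a, b), c in pair_counts.items():
        p_ab = c / total_pairs
        scores[(a, b)] = math.log(p_ab / ((left[a] / total_pairs) * (right[b] / total_pairs)))
    return scores

# Hypothetical counts for the " * and * " pattern.
pair_counts = {("music", "wine"): 8, ("music", "tax"): 1, ("flu", "tax"): 1}
scores = pmi(pair_counts, total_pairs=10)
# Frequently coordinated words get a higher score.
assert scores[("music", "wine")] > scores[("music", "tax")]
```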
Learning the General Connotation
• Data
• Linguistic insights
• Graph representation
• Inference algorithms
• Evaluations
Graph Representation
• Graph G = (V, E)
  - V = {Pred} ∪ {Arg}
  - E1: Pred–Arg edges, e.g., enjoy, thank, avoid, prevent connected to writing, profit, help, risk, …
  - E2: Arg–Arg edges, e.g., writing–reading, profit–investment, help–aid, …
• Arg–Arg edges are weighted by PMI over coordination counts:
  “a1 and w1” → PMI(a1, w1)
  “a1 and w2” → PMI(a1, w2)
  …
  “a1 and wn” → PMI(a1, wn)
Graph Representation
[Figure: example graph with predicate nodes (prevent, suffer, enjoy, thank) and argument nodes (gain, tax, bonus, loss, writing, investment, profit, cold, flu, …); edge types: prosody (Pred–Arg), coordination, synonyms, antonyms]
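A minimal sketch of assembling G = (V, E) as a weighted adjacency map (the edges and weights below are illustrative, not from the actual corpus):

```python
from collections import defaultdict

def build_graph(pred_arg_edges, arg_arg_edges):
    """G = (V, E) with V = {Pred} ∪ {Arg}; edges as weighted adjacency."""
    adj = defaultdict(dict)
    for pred, arg in pred_arg_edges:           # E1: Pred-Arg (prosody)
        adj[pred][arg] = adj[arg][pred] = 1.0  # unit weight for prosody edges
    for (a1, a2), w in arg_arg_edges.items():  # E2: Arg-Arg, PMI weights
        adj[a1][a2] = adj[a2][a1] = w
    return adj

G = build_graph(
    pred_arg_edges=[("enjoy", "writing"), ("prevent", "risk")],
    arg_arg_edges={("writing", "reading"): 0.7, ("profit", "investment"): 0.9},
)
print(sorted(G["writing"]))  # neighbors via both edge types
```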
Learning the General Connotation
• Data
• Linguistic insights
• Graph representation
• Inference algorithms
• Evaluations
Inference Algorithm
Linguistic insights:
• Semantic prosody
• Semantic parallelism of coordination
• Distributional similarity
• Semantic relations
Algorithms:
• HITS/PageRank
• Label propagation
• Integer linear programming
• Belief propagation
ILP: Problem Formulation
• For each unlabeled word i, solve for xi, yi, zi ∈ {0, 1}:
  xi → positive, yi → negative, zi → neutral;
  xi + yi + zi = 1.
• Initialization (hard constraints)
  - Positive seed predicates (e.g., “achieve”) → xi = 1
  - Negative seed predicates (e.g., “prevent”) → yi = 1
ILP: Objective Function
Maximize the agreement between the polarity assignments of connected words.
[Figure: objective terms illustrated on the example graph: prevent, suffer, enjoy, thank; tax, loss, writing, bonus, profit, cold, flu, …]
ILP: Soft Constraints
• Predicate – Argument
• Argument – Argument

ILP: Hard Constraints
• Semantic relations
  - Antonym pairs will not have the same positive or negative polarity.
  - Synonym pairs will not have the opposite polarity.
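To make the formulation concrete, here is a brute-force toy stand-in for the ILP on a four-word instance (a real system would hand the same variables and constraints to an ILP solver; the words and edge weights are invented). Each word takes exactly one label from {+, -, 0}, mirroring xi + yi + zi = 1, seeds act as hard constraints, and the objective rewards edges whose endpoints agree:

```python
from itertools import product

def solve_toy(words, seeds, edges):
    """Exhaustive-search stand-in for the ILP on a tiny instance."""
    best_score, best = -1.0, None
    for labels in product("+-0", repeat=len(words)):
        assign = dict(zip(words, labels))
        if any(assign[w] != lab for w, lab in seeds.items()):
            continue  # hard constraint: seed predicates keep their polarity
        # objective: total weight of edges whose endpoints agree
        score = sum(w for (a, b), w in edges.items() if assign[a] == assign[b])
        if score > best_score:
            best_score, best = score, assign
    return best

words = ["achieve", "success", "prevent", "flu"]
seeds = {"achieve": "+", "prevent": "-"}
edges = {("achieve", "success"): 1.0, ("prevent", "flu"): 1.0}
sol = solve_toy(words, seeds, edges)
print(sol)
```

The arguments inherit the seeds' polarities because agreeing with a labeled neighbor is the only way to earn objective score.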
Graph
Types of nodes
• Lemmas (115K), e.g., prevent, suffer, enjoy, achieve, ache, wound, pain, accident, loss, profit, gain, investment, …
• Synsets (63K), e.g., -lose, -decrease, -injury, -wound, -win, -profits, -gain, -acquire, -put on, …
Types of edges
1. Predicate–argument (179K): semantic prosody
2. Argument–argument (144K): coordination
3. Argument–synset (126K): memberships
4. Synset–synset (34K): antonyms
[Figure: lemma-and-sense graph (LEMMA and SENSE layers) illustrating the four edge types]
Problem Formulation
• G_{Lemma+Sense} = (V, E)
• Nodes (random variables)
  - V = {v1, v2, …, vn}
  - Unobserved variables: ϒ = {Y1, Y2, …, Yn}
• Typed edges
  - E = {e(vi, vj, tk)} where vi, vj ∈ V, tk ∈ T
  - T = {pred-arg, arg-arg, arg-syn, syn-syn}
• Neighborhood function
  - Nv = {u | e(u, v) ∈ E}
• Labels
  - L = {+, −}; yi denotes the label of Yi.
Pairwise Markov Random Fields
Objective Function

P(y | x) = (1 / Z(x)) ∏_{Yi ∈ ϒ} ψi(yi) ∏_{(i,j)} ψij^t(yi, yj)

• y: an assignment to all the unobserved variables
• x: the variables with known labels
• yi refers to Yi’s label
• Prior mapping ψi: L → ℝ≥0
• Compatibility mapping ψij^t: L × L → ℝ≥0
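For intuition, the unnormalized product (Z omitted) can be evaluated directly on a two-node toy graph; the priors and compatibilities below are illustrative values, not learned ones:

```python
def unnormalized_p(assign, priors, compat, edges):
    """Product of node priors psi_i and edge compatibilities psi_ij (no Z)."""
    p = 1.0
    for node, label in assign.items():
        p *= priors[node][label]
    for a, b in edges:
        p *= compat[(assign[a], assign[b])]
    return p

priors = {"enjoy": {"+": 0.9, "-": 0.1}, "music": {"+": 0.5, "-": 0.5}}
compat = {("+", "+"): 0.8, ("-", "-"): 0.8, ("+", "-"): 0.2, ("-", "+"): 0.2}
edges = [("enjoy", "music")]
agree = unnormalized_p({"enjoy": "+", "music": "+"}, priors, compat, edges)
disagree = unnormalized_p({"enjoy": "+", "music": "-"}, priors, compat, edges)
# Assignments where neighbors share a polarity score higher.
assert agree > disagree
```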
Loopy Belief Propagation
• Message passing
  m_{i→j}(yj) = α ∑_{yi ∈ L} ψij^t(yi, yj) ψi(yi) ∏_{Yk ∈ Ni ∩ ϒ \ Yj} m_{k→i}(yi), ∀ yj ∈ L
• Belief
  b_i(yi) = β ψi(yi) ∏_{Yj ∈ Ni ∩ ϒ} m_{j→i}(yi), ∀ yi ∈ L

Algorithm:
• Initialize “messages” between all node pairs connected by an edge.
• Initialize priors for all nodes.
• Iterate message passing until all messages stop changing.
• Compute beliefs.
• Assign the label: Li ← argmax_{yi} bi(yi)
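The two update equations can be sketched end to end. This is a hedged implementation over L = {+, −} with synchronous updates and a fixed iteration budget; the example graph, priors, and compatibilities are illustrative:

```python
import math

LABELS = ("+", "-")

def loopy_bp(nodes, edges, priors, compat, iters=20):
    """m_{i->j}(y_j) = a * sum_{y_i} psi_ij(y_i, y_j) psi_i(y_i)
    * prod_{k in N(i)\\{j}} m_{k->i}(y_i); beliefs multiply priors by
    incoming messages. a and the belief normalizer keep sums at 1."""
    nbrs = {v: set() for v in nodes}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    msg = {(i, j): {y: 0.5 for y in LABELS} for i in nodes for j in nbrs[i]}
    for _ in range(iters):
        new = {}
        for i, j in msg:
            m = {yj: sum(compat[(yi, yj)] * priors[i][yi]
                         * math.prod(msg[(k, i)][yi] for k in nbrs[i] - {j})
                         for yi in LABELS)
                 for yj in LABELS}
            z = sum(m.values())
            new[(i, j)] = {y: v / z for y, v in m.items()}
        msg = new  # synchronous update of all messages
    beliefs = {}
    for i in nodes:
        b = {y: priors[i][y] * math.prod(msg[(j, i)][y] for j in nbrs[i])
             for y in LABELS}
        z = sum(b.values())
        beliefs[i] = {y: v / z for y, v in b.items()}
    return beliefs

priors = {"enjoy": {"+": 0.9, "-": 0.1}, "music": {"+": 0.5, "-": 0.5},
          "prevent": {"+": 0.1, "-": 0.9}, "flu": {"+": 0.5, "-": 0.5}}
compat = {("+", "+"): 0.8, ("-", "-"): 0.8, ("+", "-"): 0.2, ("-", "+"): 0.2}
edges = [("enjoy", "music"), ("prevent", "flu")]
b = loopy_bp(list(priors), edges, priors, compat)
# The neutral-prior arguments inherit their neighbors' polarity.
assert b["music"]["+"] > b["music"]["-"]
assert b["flu"]["-"] > b["flu"]["+"]
```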