Value of Information

Lecture 19
November 2, 2005
12-706 / 19-702 / 73-359
Admin Issues
Landfill Gas Projects
Great job. Range 75-98%, median/mean 92.
HW 5 Due next Wednesday
This week’s office hours Thurs 4pm, Fri 1:30pm.
Next week back to normal (Pauli Mon AM).
Schedule Set for Rest of Semester
Agenda
Value of Information
Facility Feasibility Case Study
Comments on Doing Sensitivity Analysis
Value of Information
We have been doing decision analysis with best
guesses of probabilities
Have been building trees with chance and decision
nodes, finding expected values
It is relevant and interesting to determine how
important information might be in our decision
problems.
Could be in the form of paying an expert, a fortune
teller, etc. Goal is to reduce/eliminate uncertainty in
the decision problem.
Willingness to Pay = EVPI
We’re interested in knowing our WTP for
(perfect) information about our decision.
The book derives this with Bayesian probabilities,
but think of it this way:
We consider the advice of “an expert who is always
right”.
If they say it will happen, it will.
If they say it will not happen, it will not.
They are never wrong.
Bottom line - receiving their advice means we
have eliminated the uncertainty about the
event.
Discussion
The difference between the two trees (decision
scenarios) is the EVPI:
$1000 - $580 = $420.
That is the amount up to which you would be willing
to pay for advice on how to invest.
If you pay less than the $420, you would expect to
come out ahead, net of the cost of the information.
If you pay $425 for the info, you would expect to
lose $5 overall!
Finding EVPI is really simple to do with the @RISK /
PrecisionTree plug-in (not so for TreePlan!)
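The rollback behind EVPI can be sketched in a few lines of Python. The payoff table below is hypothetical (the lecture's actual $1000/$580 tree is not reproduced here); the point is the structure: without information you pick one action up front, with perfect information you learn the state first and pick the best action for it.

```python
# Hypothetical investment problem; states, priors, and payoffs are
# illustrative assumptions, not the lecture's actual tree.
priors = {"up": 0.5, "down": 0.5}            # P(state)
payoffs = {                                  # payoff(action, state)
    "invest":  {"up": 1500, "down": -500},
    "savings": {"up": 500,  "down": 500},
}

def expected_value(action):
    return sum(priors[s] * payoffs[action][s] for s in priors)

# Without information: commit to the action with the best expected value.
ev_without = max(expected_value(a) for a in payoffs)

# With perfect information: learn the state, then choose the best action
# for that state; take the expectation over states.
ev_with = sum(priors[s] * max(payoffs[a][s] for a in payoffs)
              for s in priors)

evpi = ev_with - ev_without
print(evpi)  # 500.0
```

Here the clairvoyant is worth $500: pay less than that for perfect advice and you expect to come out ahead, exactly as in the $420 example.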
Similar: EVII
Imperfect, rather than perfect, information
(because it is rarely perfect)
Example: our expert acknowledges she is not
always right, we use conditional probability
(rather than assumption of 100% correct all the
time) to solve trees.
Ideally, they are “almost always right” and “almost
never wrong”
e.g., P(Up Predicted | Up) is less than, but close to, 1;
P(Up Predicted | Down) is greater than, but close to, 0.
Assessing the Expert
Expert side of EVII tree
This is more complicated than EVPI because we do not know whether
the expert is right or not. We have to decide whether to believe her.
Use Bayes’ Theorem
“Flip” the probabilities.
We know P(“Up”|Up) but instead need P(Up |
“Up”).
P(Up | "Up") = P("Up"|Up) × P(Up) / [P("Up"|Up) × P(Up) + … + P("Up"|Down) × P(Down)]
= (0.8 × 0.5) / (0.8 × 0.5 + 0.15 × 0.3 + 0.2 × 0.2)
= 0.4 / 0.485 = 0.8247
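The "flip" is mechanical enough to check in code. The numbers come from the slide; note that the 0.15 × 0.3 term in the denominator implies a third market state besides Up and Down, labeled "mid" below as an assumption.

```python
# Bayes' theorem: flip P(prediction | state) into P(state | prediction).
# Priors and hit rates from the slide; the "mid" state label is assumed.
priors = {"up": 0.5, "mid": 0.3, "down": 0.2}       # P(state)
p_up_pred = {"up": 0.8, "mid": 0.15, "down": 0.2}   # P("Up" predicted | state)

# Total probability the expert predicts "Up" (law of total probability).
p_says_up = sum(p_up_pred[s] * priors[s] for s in priors)

# Posterior P(Up | "Up")
posterior_up = p_up_pred["up"] * priors["up"] / p_says_up
print(round(posterior_up, 4))  # 0.8247
```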
EVII Tree Excerpt
Rolling Back to the Top
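The full rollback combines the two ideas above: for each possible prediction, Bayes-update the state probabilities, choose the best action given that prediction, then weight by how likely each prediction is. A sketch, with all payoffs and expert hit rates hypothetical (the lecture's actual EVII tree is not reproduced here):

```python
# Hypothetical EVII rollback; every number below is an illustrative
# assumption, not the lecture's actual tree.
priors = {"up": 0.5, "mid": 0.3, "down": 0.2}       # P(state)
likelihood = {                                      # P(prediction | state)
    "Up":   {"up": 0.8, "mid": 0.15, "down": 0.2},
    "Down": {"up": 0.2, "mid": 0.85, "down": 0.8},
}
payoffs = {                                         # payoff(action, state)
    "invest":  {"up": 1500, "mid": 200, "down": -500},
    "savings": {"up": 500,  "mid": 500, "down": 500},
}

# Without information: commit to the best action up front.
ev_no_info = max(
    sum(priors[s] * payoffs[a][s] for s in priors) for a in payoffs
)

# Consulting the expert: for each prediction, Bayes-update the state
# probabilities, pick the best action, and weight by P(prediction).
ev_with_expert = 0.0
for pred, lik in likelihood.items():
    p_pred = sum(lik[s] * priors[s] for s in priors)    # P(prediction)
    best = max(
        sum(payoffs[a][s] * lik[s] * priors[s] for s in priors) / p_pred
        for a in payoffs
    )
    ev_with_expert += p_pred * best

evii = ev_with_expert - ev_no_info
print(round(evii, 2))  # 136.5
```

Because the expert is imperfect, EVII is always at most the EVPI of the same problem; as the hit rates approach 1 and 0, the two converge.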
Final Thoughts on Plugins
You can combine (in @RISK) the decision
trees and the sensitivity plugins.
You can probably do this in TreePlan; I haven't
tried it.
Do “Sensitivity of Expected Values” by
varying the probabilities (see end Chap 5)
Also - can do EVPI/EVII with @RISK.
Don’t need to do everything by hand!
Transition
Speaking of Information..
How valuable would it be to have had
better knowledge of the costs/revenues of
a facility project after it's been around for
a while?
Would it change our decision?