24. Bayesian Skyline

Bayesian approaches:
Posterior probability ∝ Prior probability × Likelihood
Specify a model with parameters and priors (distributions of plausible values for the parameters), then integrate (evaluate) this product across a range of parameter values.
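The "integrate this product across a range of parameter values" step can be sketched numerically on a grid. This is a minimal coin-toss illustration (the data, the flat prior, and the grid size are hypothetical, not from the slides):

```python
import math

def binomial_likelihood(p, k, n):
    """Likelihood of k successes in n trials at parameter value p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def grid_posterior(k, n, grid_size=1000):
    """Evaluate prior × likelihood on a grid of p values and normalize."""
    ps = [(i + 0.5) / grid_size for i in range(grid_size)]
    prior = [1.0] * grid_size                                  # flat prior on p
    unnorm = [pr * binomial_likelihood(p, k, n)
              for p, pr in zip(ps, prior)]                     # prior × likelihood
    z = sum(unnorm) / grid_size                                # marginal probability (normalizer)
    post = [u / z for u in unnorm]                             # posterior density on the grid
    return ps, post, z

# Hypothetical data: 7 successes in 10 trials
ps, post, z = grid_posterior(k=7, n=10)
mean = sum(p * d for p, d in zip(ps, post)) / len(ps)          # posterior mean of p
```

With a flat prior this reproduces the known Beta(8, 4) posterior, so the grid approximation can be checked against its exact mean of 8/12.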
Posterior probability:
P(Hypothesis | Evidence) = [Conditional probability × Prior probability] / Marginal probability
Holder & Lewis (2003); Beaumont & Rannala (2004)
- Conditional probability, or likelihood: P(Evidence | Hypothesis)
- Prior probability: P(Hypothesis)
- Posterior probability: P(Hypothesis | Evidence)
- Marginal probability: P(Evidence), averaged across all conditions
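The four terms above can be tied together in a small discrete example with two competing hypotheses (the prior and likelihood values are hypothetical, chosen only to illustrate the arithmetic):

```python
# Two hypotheses with equal priors; the evidence is 4× more probable under H1.
priors = {"H1": 0.5, "H2": 0.5}           # prior probability P(H)
likelihoods = {"H1": 0.8, "H2": 0.2}      # conditional probability P(Evidence | H)

# Marginal probability: the likelihood averaged across all hypotheses
marginal = sum(priors[h] * likelihoods[h] for h in priors)

# Posterior = (likelihood × prior) / marginal, for each hypothesis
posteriors = {h: priors[h] * likelihoods[h] / marginal for h in priors}
```

Here the marginal is 0.5, so the posterior for H1 rises from 0.5 to 0.8, and the posteriors still sum to 1.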
Comparing different models: Bayes factors
K = P(Data | Model 1) / P(Data | Model 2)
where each marginal likelihood P(Data | Model i) is obtained by integrating over θi, the set (vector) of parameters for model i.
According to I. J. Good, a change in a weight of evidence of 1 deciban, or 1/3 of a bit (i.e., a change in an odds ratio from evens to about 5:4), is about as finely as humans can reasonably perceive their degree of belief in a hypothesis in everyday use.
http://en.wikipedia.org/wiki/Bayes_factor
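The relationship between a Bayes factor and Good's deciban scale is just a logarithm. A minimal sketch, with hypothetical marginal likelihoods standing in for the integrals over each model's parameters:

```python
import math

# Hypothetical marginal likelihoods (each already integrated over that
# model's parameter vector) -- not values from the slides.
marginal_m1 = 0.012      # P(Data | Model 1)
marginal_m2 = 0.004      # P(Data | Model 2)

# Bayes factor: ratio of marginal likelihoods
K = marginal_m1 / marginal_m2

# Good's weight of evidence, in decibans (10 * log10 of the Bayes factor)
decibans = 10 * math.log10(K)
```

A Bayes factor of 3 corresponds to roughly 4.8 decibans, well above the ~1 deciban that Good regarded as the smallest perceptible shift in belief.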
Likelihood ratio test
D ~ χ², df = (df_null − df_alt)
Models must be 'nested'
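For nested models the test statistic is D = 2(lnL_alt − lnL_null). A sketch with hypothetical maximized log-likelihoods and a one-parameter difference (for df = 1, the χ² tail probability equals erfc(√(D/2)), so no external library is needed):

```python
import math

# Hypothetical maximized log-likelihoods for a null model and an
# alternative model with one extra free parameter (df = 1).
lnL_null = -1234.6
lnL_alt = -1231.2

# Likelihood ratio test statistic
D = 2 * (lnL_alt - lnL_null)

# Chi-square upper-tail probability for df = 1
p_value = math.erfc(math.sqrt(D / 2))
```

Here D = 6.8 exceeds 3.84 (the 5% critical value for χ² with df = 1), so the extra parameter significantly improves the fit.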
Akaike information criterion (AIC)
AIC = 2k − 2 ln(L)
k = the number of free parameters to be estimated
p(x|k) = the probability of the observed data given the number of parameters
L = the maximized value of the likelihood function for the estimated model
Choose the model with the minimum AIC (2k is a penalty for over-fitting)
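The penalty term in action, using hypothetical log-likelihoods for a simple and a more complex model (not values from the slides):

```python
def aic(k, lnL):
    """AIC = 2k - 2 ln(L); smaller is better."""
    return 2 * k - 2 * lnL

# Hypothetical fits: the complex model fits slightly better (higher lnL)
# but spends four extra parameters doing so.
models = {
    "simple": aic(k=2, lnL=-105.0),    # 2*2 + 210 = 214.0
    "complex": aic(k=6, lnL=-103.5),   # 2*6 + 207 = 219.0
}
best = min(models, key=models.get)
```

Despite its lower maximized likelihood, the simple model wins because the 2k penalty outweighs the complex model's small gain in fit.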
Bayesian information criterion (BIC)
BIC = k ln(n) − 2 ln(L)
x = the observed data
n = the number of data points in x (the sample size)
k = the number of free parameters to be estimated
p(x|k) = the probability of the observed data given the number of parameters
L = the maximized value of the likelihood function for the estimated model
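Unlike AIC's fixed penalty of 2 per parameter, BIC's penalty grows with the sample size n, so it favors simpler models more strongly in large datasets. A sketch with the same hypothetical log-likelihoods as above:

```python
import math

def bic(k, n, lnL):
    """BIC = k ln(n) - 2 ln(L); smaller is better."""
    return k * math.log(n) - 2 * lnL

n = 500   # hypothetical sample size
models = {
    "simple": bic(k=2, n=n, lnL=-105.0),
    "complex": bic(k=6, n=n, lnL=-103.5),
}
best = min(models, key=models.get)
```

With n = 500, ln(n) ≈ 6.2, so each extra parameter costs about 6.2 BIC units versus AIC's 2, widening the gap in favor of the simpler model.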
[Figure: genealogy colored by sampling region — 1a = modern clade; blue = North America south of Beringia; green = Asia and west Beringia; red = east Beringia; the Beringian clade that moved south is indicated]
Genealogies and population size
From Hedrick (2009)