
Galaxy Structure through Bayesian Hierarchical Modelling
Josh Argyle
Supervisors: Jairo Méndez-Abreu & Vivienne Wild
DEX, Edinburgh, 9th January 2017.
[email protected]
2D Photometric Decompositions
PGC 35772
Erwin 2014
Problems with conventional methods:
1) Trapping in local minima.
2) Unrealistic solutions.
3) Which model?
4) Representation of errors.
*see Lange et al. 2016 for more details.
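To make concrete what is being fitted, here is a hedged one-dimensional sketch of the standard bulge + disc model behind such decompositions: a Sérsic bulge plus an exponential disc. The parameter names mirror the ones that appear later in the talk (n, Re, Ie, h, I0), but this is an illustrative textbook form, not the exact 2D model applied to PGC 35772.

# A hedged sketch (not the talk's code): Sersic bulge + exponential disc,
# the standard ingredients of a bulge + disc photometric decomposition.
import numpy as np
from scipy.special import gammaincinv

def sersic(r, Ie, Re, n):
    # Sersic profile: I(r) = Ie * exp(-b_n * ((r/Re)**(1/n) - 1)),
    # with b_n defined so that Re encloses half the total light.
    b_n = gammaincinv(2.0 * n, 0.5)
    return Ie * np.exp(-b_n * ((r / Re) ** (1.0 / n) - 1.0))

def exponential_disc(r, I0, h):
    # Exponential disc: I(r) = I0 * exp(-r/h), with scale length h.
    return I0 * np.exp(-r / h)

def bulge_disc(r, Ie, Re, n, I0, h):
    # Total surface brightness: Sersic bulge plus exponential disc.
    return sersic(r, Ie, Re, n) + exponential_disc(r, I0, h)

# Illustrative parameter values only.
r = np.linspace(0.1, 50.0, 200)          # radius in pixels
model = bulge_disc(r, Ie=100.0, Re=3.0, n=2.2, I0=50.0, h=5.0)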
Bayesian Inference
Bayes' Rule: Joint posterior distribution ∝ Likelihood function × Prior distribution
Prior: uniform between a and b.
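Written out symbolically, as a hedged reconstruction of the slide (θ denotes the model parameters and D the image data; these symbols are assumptions, not taken from the slide):

p(\theta \mid D) \propto p(D \mid \theta)\, p(\theta),
\qquad
p(\theta_i) =
\begin{cases}
\dfrac{1}{b - a}, & a \le \theta_i \le b,\\
0, & \text{otherwise},
\end{cases}

i.e. the joint posterior is the likelihood times the prior, up to normalisation, here illustrated with a uniform prior between a and b.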
Previous Problems:
1) Trapping in local minima.
2) Unrealistic solutions.
3) Which model?
4) Representation of errors.
Markov Chain Monte Carlo Solutions (a minimal sampler is sketched after this list):
1) Exploration of parameter space.
2) Priors.
3) Bayesian model selection.
4) Posterior probabilities.
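As referenced above, a minimal random-walk Metropolis sketch for a single parameter, showing how an MCMC chain explores the parameter space and returns posterior probabilities rather than a single best fit. This is a toy example with assumed numbers, not the decomposition code used in the talk.

# Minimal Metropolis-Hastings sketch: posterior of a single Sersic index n
# given a Gaussian likelihood and a uniform prior. Illustrative only.
import numpy as np

rng = np.random.default_rng(42)

# Toy "data": one noisy measurement of n.
n_true, sigma = 2.2, 0.3
n_obs = n_true + sigma * rng.normal()

def log_prior(n, a=0.5, b=8.0):
    # Uniform prior on the Sersic index between a and b.
    return 0.0 if a <= n <= b else -np.inf

def log_likelihood(n):
    # Gaussian likelihood of the observed value given n.
    return -0.5 * ((n_obs - n) / sigma) ** 2

def log_posterior(n):
    lp = log_prior(n)
    return lp + log_likelihood(n) if np.isfinite(lp) else -np.inf

# Random-walk Metropolis: propose, then accept with probability
# min(1, posterior ratio). The chain wanders the full prior range,
# which is how it avoids getting trapped in a single local minimum.
n_current, logp_current = 4.0, log_posterior(4.0)
chain = []
for _ in range(5000):
    n_prop = n_current + 0.2 * rng.normal()
    logp_prop = log_posterior(n_prop)
    if np.log(rng.uniform()) < logp_prop - logp_current:
        n_current, logp_current = n_prop, logp_prop
    chain.append(n_current)

chain = np.array(chain[1000:])            # discard burn-in
print(f"posterior median n = {np.median(chain):.2f} "
      f"(16-84% interval {np.percentile(chain, 16):.2f}-{np.percentile(chain, 84):.2f})")

With more parameters the same accept/reject loop yields the samples from which joint and marginalised posteriors, like those shown next, are built.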
[Figure: joint and marginalised posterior distributions for the Sérsic index n, effective radius Re [/pixels], effective intensity log(Ie) [/counts], scale length h [/pixels] and central intensity log(I0) [/counts], with the true values and posterior medians marked.]
Bayesian Inference
[Figure: sampled parameter values plotted against MCMC iteration (10 to 1000 iterations).]
Hierarchical Bayesian Inference
Population
Piece-wise constant representation*:
- Heaviside step function
- Probability of finding an object in the bin labelled d
*A mathematical description of a d-dimensional histogram.
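As a hedged sketch of what such a piece-wise constant representation usually looks like (the bin edges a_{d,k}, b_{d,k} and weights φ_d below are assumed notation, not necessarily the talk's), the population distribution of K structural parameters θ can be written as

p(\theta \mid \phi) = \sum_{d=1}^{D} \phi_d \prod_{k=1}^{K}
\frac{\Theta(\theta_k - a_{d,k}) - \Theta(\theta_k - b_{d,k})}{b_{d,k} - a_{d,k}},
\qquad \sum_{d=1}^{D} \phi_d = 1,

where Θ is the Heaviside step function, so each term is constant inside bin d and zero outside, and φ_d is the probability of finding an object in the bin labelled d.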
Hierarchical Bayesian Inference
[Figure: joint and marginalised population posteriors over log(Re) [/pc], log(h) [/pc], Sérsic index n, μ0 [i-mag arcsec⁻²] and <μe> [i-mag arcsec⁻²], shaded from lower to higher probability, with the Gadotti '09 fit to elliptical galaxies shown for comparison.]