
A review study to identify adaptive algorithms for increasing the efficiency of Comparative Judgement
San Verhavert, Vincent Donche, Sven De Maeyer and Liesje Coertjens
Background.
People are better at comparing things than at making an absolute judgement about them. Building on this insight and on Thurstone's Law of Comparative Judgement (1927a, b), the method of Comparative Judgement (CJ) has already proven to be a reliable assessment method (e.g. see Jones & Inglis, 2015; Jones, Swan, & Pollitt, 2015).
However, CJ suffers from efficiency problems. Although this issue has been known in the field for quite a long time (i.e. since Bramley, Bell, & Pollitt, 1998), no systematic search for algorithms that might increase the efficiency of CJ has been conducted. This systematic review is an attempt to fill this gap.
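For orientation, the comparison model behind CJ can be written in one line. Assuming each object i has a latent quality value v_i (the notation here is illustrative, not taken from the poster), the probability that object A is judged better than object B is, in the Bradley-Terry-Luce form commonly used in CJ applications,

P(A beats B) = exp(v_A - v_B) / (1 + exp(v_A - v_B)),

while Thurstone's original Case V formulation uses the normal ogive Φ(v_A - v_B) in place of the logistic function.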
Selection Algorithms.
The identified selection algorithms were classified along the following dimensions: whether or not they maximize (or minimize) a function; whether or not they are information measure based and, if so, which measure they use (Fisher information, Kullback-Leibler information, or another information criterion); whether or not they contain a random component; whether they are weighted, balanced, both, or neither; the statistical approach they rely on (independent of statistical approach, frequentist, or Bayesian); whether or not selection is stratified or blocked; whether the selection constraints are static or flexible; and whether or not they follow a staircase procedure. The algorithms identified are:
• Fixed Step Size Staircase
• Fixed Step Size Staircase with random component
• Up-Down Staircase with Decreasing Step Size
• Stochastic Approximation (SA)
• Accelerated Stochastic Approximation (ASA)
• Modified Binary Search (MOBS)
• History based selection
• Categorization Sort
• Adaptive Generalized Pólya Urn (GPU) design
• Parameter Estimation by Sequential Testing (PEST)
• ZEST (Bayesian staircase)
• Elo-type algorithm with fixed β_i
• Elo-type algorithm with changing β_i
• Elo-type algorithm with changing β_i and uncertainty tuning
• Simulated Annealing algorithm
• ML estimation
• ML-estimation "bias corrected"
• Minimum Expected Posterior Variance
• Proportional method
• Progressive method
• Point Fisher information
• Fisher information over an interval
• Fisher information weighted by likelihood
• Maximum Posterior Weighted Fisher Information
• Maximum Information per time unit
• Efficiency Balanced Information Criterion
• Efficiency Balanced Information Criterion over an interval
• Maximum Posterior-weighted Information (MPI) with theta/b interval blocking
• Maximum Expected Information (MEI) with theta/b interval blocking
• Point Kullback-Leibler information
• Kullback-Leibler information over an interval (A)(B)
• Kullback-Leibler information weighted by likelihood
• Posterior weighted Kullback-Leibler information
• Bayesian Adaptive Testing
• Adaptive Bayesian methodology or Bayesian mutual information selection
• Adaptive Bayesian methodology, expected cost balanced
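To make two of the families in this overview concrete, here is a minimal illustrative sketch in Python. It is not taken from any of the reviewed papers; it assumes a Bradley-Terry-Luce model with current quality estimates theta, and all names (select_next_pair, elo_update, theta, compared, k) are made up for illustration. The first function selects the not-yet-judged pair whose comparison carries the most Fisher information, p * (1 - p) with p the modelled win probability, which under this model favours pairs with similar estimates; the second is a generic Elo-type update with a fixed step size, loosely corresponding to the "Elo-type algorithm with fixed β_i" entry above.

import itertools
import math


def select_next_pair(theta, compared):
    """Return the not-yet-judged pair (i, j) with maximal Fisher
    information p * (1 - p), where p is the Bradley-Terry-Luce
    probability that i beats j given the current estimates in theta."""
    best_pair, best_info = None, -1.0
    for i, j in itertools.combinations(sorted(theta), 2):
        if frozenset((i, j)) in compared:
            continue  # skip pairs that have already been judged
        p = 1.0 / (1.0 + math.exp(-(theta[i] - theta[j])))
        info = p * (1.0 - p)  # largest when the estimates are close
        if info > best_info:
            best_pair, best_info = (i, j), info
    return best_pair


def elo_update(theta, winner, loser, k=0.2):
    """Generic Elo-type update with a fixed step size k: shift both
    estimates in proportion to how surprising the observed outcome was."""
    p_win = 1.0 / (1.0 + math.exp(-(theta[winner] - theta[loser])))
    theta[winner] += k * (1.0 - p_win)
    theta[loser] -= k * (1.0 - p_win)


# Hypothetical usage with made-up estimates; suppose the first object wins.
theta = {"A": 0.0, "B": 1.2, "C": 0.1, "D": -0.8}
pair = select_next_pair(theta, compared={frozenset(("A", "C"))})
elo_update(theta, winner=pair[0], loser=pair[1])

In an actual CJ setting the step size, the handling of already-judged pairs, and the stopping rule would all be dictated by the specific algorithm being implemented.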
Analyses.
From the articles we extracted the algorithm descriptions and attempted to construct a taxonomy based on the algorithms' adaptiveness (see the Selection Algorithms overview above).
Method.
Our method is an adaptation of that described in Petticrew and Roberts (2006).
Research Question.
What adaptive selection algorithms can potentially increase efficiency in the context of CJ?
www.D-PAC.be
www.eduBROn.be
www.uantwerpen.be