
Doing Open Science
Lorne Campbell
With inspiration and ideas from colleagues
Etienne LeBel & Timothy Loving
I am inclined to simply show this talk by Victoria Stodden:
https://www.youtube.com/watch?v=sZIxzTsvWhw
Why Is Science Open?
(title of first slide from Stodden’s talk)
• The only way to fully evaluate scientific conclusions is to fully understand the methods that underlie those conclusions
• Sharing study details allows for research to be reproducible
• Scientific community places more value in findings that are consistently reproduced with the same procedures
Donoho (via Stodden): The published paper is only an advertisement of the scholarship; it is not the scholarship itself
When and What to Share?
• When
– Typically at the end of the research process
• in a manuscript submitted for peer review
• What
– Details related to the particular results being presented
• Subset of conditions, measures, procedures, analytic plan (including analyses actually run)
– As thorough as space limitations allow (example on next slide)
• Other details “available upon request”
M. Vess, 2012: “Warm Thoughts” paper in Psychological Science
Method
I recruited 56 individuals (32 females and 24 males) residing in the United States (mean age = 33.50 years, SD = 11.09 years) through Amazon’s Mechanical Turk (see Buhrmester, Kwang, & Gosling, 2011). Each participant received $0.35 as compensation for taking part in Study 1.
Participants first completed a brief measure of adult attachment avoidance and attachment anxiety (Wei, Russell, Mallinckrodt, & Vogel, 2007). They were then randomly assigned to one of two conditions: In one condition, participants were asked to reflect on a past romantic breakup, whereas in the other condition, they were asked to reflect on an ordinary event. All participants then rated the desirability of warm-temperature refreshments (“hot tea/coffee,” “warm pie,” and “soup”) and neutral-temperature refreshments (“crackers,” “candy bar,” “potato chips,” and “pretzels”) on 11-point scales ranging from not at all desirable to extremely desirable. This measure was modeled after the one developed by Zhong and Leonardelli (2008). For each participant, ratings were averaged to create composite scores for warm-temperature desirability (α = .65) and neutral-temperature desirability (α = .66).
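An aside on the last step of that excerpt: a composite score is simply the mean of a participant’s item ratings, and α is Cronbach’s alpha for those items. Below is a minimal sketch in Python (with made-up ratings; this is not the authors’ code or data) of how both might be computed.

import numpy as np

def cronbach_alpha(items):
    # items: 2-D array, rows = participants, columns = item ratings
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings of the three warm-temperature items (1-11 scale)
warm = np.array([[8, 7, 9],
                 [5, 6, 4],
                 [10, 9, 8],
                 [3, 2, 5]])

composites = warm.mean(axis=1)   # one warm-desirability composite per participant
alpha = cronbach_alpha(warm)     # internal consistency of the three items
print(composites, round(alpha, 2))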
Retrospective Reporting
Biased Recall
• A known problem with retrospective research methods
Recalling Details of the Research Process:
– What, specifically, was predicted before data was collected and/or analyses run? Why?
– Many decisions made during the research process
– Many initial hypotheses may be more abstract than specific
– Data exclusion rules generated on the spot
– What analyses were run, with what variables?
Another Feynman Quote
• Another thing I must point out is that you cannot prove a vague theory wrong….[For example] ‘A’ hates his mother. The reason is, of course, because she did not caress him or love him enough when he was a child. But if you investigate you find out that as a matter of fact she did love him very much…. Well then, it was because she was over-indulgent when he was a child! By having a vague theory it is possible to get either result. The cure for this one is the following. If it were possible to state exactly, ahead of time, how much love is not enough, and how much love is over-indulgent, then there would be a perfectly legitimate theory against which you could make tests. It is usually said when this is pointed out, ‘When you are dealing with psychological matters things can’t be defined so precisely’. Yes, but then you cannot claim to know anything about it. (Feynman, 1965, pp. 158-159)
Theory Building
• Requires testing “risky predictions” (Meehl, 1967, 1978)
• Risky prediction: hypothesis that stands a high chance of being wrong (Feynman, 1974; Popper, 1959)
Set of Possible Outcomes From a Study
[Figure: the range of outcomes consistent with the hypothesis, shown against the vagueness of the hypothesis from high to low]
But…
• “…the ‘derivations’ from most of the theories in social psychology are usually not unequivocal, or strictly logical, for they skip steps, they depend on unexpressed assumptions, and they rest on the criterion of intuitive reasonableness or plausibility rather than on formal logical criteria of consistency.” (p. 7) – Deutsch & Krauss (1965)
• Theory testing in psychology is hard: our theories do not dictate specific hypotheses
• Requires more truly confirmatory research (Schaller, 2015; Simpson, 2013)
Not all Research is Confirmatory
• We are not always testing specific hypotheses
• May have ideas of what to expect
• Or, patterns in the data may lead to the development of novel findings/hypotheses
• One process of discovery
• Agreed, this type of discovery is important
– Simply share how the results were obtained
– Can state at the beginning that specific hypotheses are not being tested
Doing Open Science
• We receive a lot of training on research methods and statistical procedures (but likely not enough—another talk!)
• But not much (if any) on how to do open science
• Technology today allows for open science practices
Why I Transitioned to Open Science Practices
• My support for the value of replication, and other activities occurring in the field in 2011-12, resulted in me conducting a direct replication with Etienne LeBel of research published in Psychological Science by Vess (2012)
• Created an OSF account, partly for this project
• Our results were published in Psychological Science in 2013
• My OSF account at that time had 2 projects—the 2 replication attempts of Vess (2012)
– Note: I had conducted other research during this time period. Hmm.
Decisions
(1) I used open science practices with the replication studies
(2) I was using closed research practices for my own original research
• So I asked myself: do I only value openness for other people’s research?
– I answered no. From that point on I was all in for open science
• But how?
What to do?
• A lot of uncertainties, unanswered questions, and not a lot of guidance
• John Lennon: “Well I tell them there’s no problem, only solutions”
• Wrote paper with Timothy Loving and Etienne LeBel (2014) attempting to find some solutions
Example
What Else?
• Blind Analysis?
• Methods Videos?
“You Had an Option, Sir”
We all have the option to adopt open, or closed, research practices; it is our choice. When deciding which option to choose, ask yourself whether it is the best choice for advancing scientific discovery. And we have an obligation to share which choice we have made.