Graham Gibbs - Oxford Brookes University

Improving Student Learning: Theory and Practice - 10 Years On

Ten Years of Improving Student Learning
Graham Gibbs
Centre for Higher Education Practice,
Open University, UK
Background

1980s: lack of use of theory and evidence to inform course design, teaching and assessment methods
Background

two-year study applying concepts from phenomenographic research to action research studies carried out by teachers
Background

advertised for terrible courses where students
took a surface approach: over 100 teachers
applied!
Background

What did we already know?

students vary in their approach to learning
students vary in their conception of learning
conception constrains approach
outcomes vary qualitatively
approach is related to outcome
individuals vary in their approach between contexts
crucial context characteristics identifiable
some teaching methods embody many of these characteristics (e.g. PBL)
almost no evidence of successful intervention
Background

What tools did we have?

inventories measuring students’ approach
phenomenographic category systems
SOLO taxonomy
descriptions of how the research was conducted
Background

8 case studies of principled change
diagnosed course problems using literature and evidence from students
no focus on teachers at all
redesigned courses, guided by literature
Background

evidence of impact on:






students’ approach
students’ self-reports of studying
student performance
quality of outcomes: SOLO taxonomy
other evaluation evidence (e.g. employers)
other courses
Background

Book: “Improving the Quality of Student Learning”

Two (small) national dissemination events
…. next

advertised event and invited those doing similar types of action research to present what they were doing
booked space for 60 in London: 180 booked, so moved to Warwick
9 Symposia
over 2,000 participants
374 published papers
Rationale

Bring theorists and those who develop research tools together with those who use these concepts and tools: developers and teachers
Focus on improving student learning, not just on studying it
Build a ‘community of practice’ that goes about improving student learning in a particular way
Build research capacity
... if you want to present research
…or want to describe practice
…there are other places...
ISL Symposia
1. ISL: theory and practice
2. ISL through assessment and evaluation
3. Using research to ISL
4. ISL through course design
5. Improving students as learners
6. Improving student learning outcomes
7. ISL through the disciplines
8. ISL strategically
9. ISL using learning technology
Development of theory
Biggs (1994)
[Model diagram: characteristics of the student and the teaching context shape students’ approaches to learning, which shape students’ learning outcomes]
Development of theory
Prosser et al (2000) - as perceived by the student
[Model diagram: characteristics of the student and the course and department learning context shape students’ perceptions of context, which shape students’ approaches to learning and, in turn, students’ learning outcomes]
Development of theory
Prosser et al (2000) - as perceived by the teacher
[Model diagram, parallel to the student model: characteristics of the teacher; course and department learning context; teachers’ perceptions of context; teachers’ perceptions of leadership context; teachers’ approaches to teaching]
Development of evidence

Students’ approach looks different in different contexts
Possible to change students’ approach directly and indirectly
Teachers’ approaches differ
Teachers’ approach linked to students’ approach
Teachers’ perceptions of context linked to teachers’ approach and to students’ approach
Possible to change teachers’ approach directly
Perceptions of leadership context linked to teachers’ approach
Possible to improve student performance
Personal highlights
Linder and Marshall (1996, 1997)

used theory about metacognition and culturally relevant learning to develop a repertoire of teaching methods for physics
studied impact: ASI, interviews, standardised tests
changes in students’ conceptions of learning
changes in conceptions of science
improved student performance
theory development: ‘epistemological shift’
Personal highlights
Allan (1995, 1996)
 strong challenge (Martin) to the ‘learning outcomes’ movement
 showed that introduction of learning outcome-driven module design:
 increased sophistication of students’ conceptions of learning (8% at Saljo level 4 to 55% at level 4)
 increased constructive alignment: congruence between students’ and teachers’ goals
 the next year, studied features of module design that influence perceptions
Development of research community

Continuity
 participation
 papers: Prosser et al, also Linder & Marshall,
Norton, McDowell, Meyer, de la Harpe & Radloff

Colleague involvement
 SHU, Anglia Polytechnic University: Wisker

Institutional involvement
 Lund

ISL mailbase discussions (Jackson)
Development of research capacity






51 papers concerned with development of
methodology and research tools
Research methods workshops: up to 140
participants
Workshops turned into ‘tutorial’ papers (Lindsay)
‘Work in progress’
Informal ‘supervision’
Increased use of research and action research as
a change process (Jackson, 1995)
Analysis of 10 years of papers

nine Symposia
374 published papers
workshops, seminars, work in progress…
Coded:
 by perspective (internal/external)
 by topic (what students learn, developing teaching methods ...)
 by focus (theory, methodology, practice …)
Research Perspective

One third (33%) internal perspective, two thirds external perspective

1993: 58% internal perspective

2001: 11% internal perspective

Internal perspective is not the same as student focussed
Topics of papers
Development of practice: 61%
 Developing teaching methods: 20%
 Developing students: 13%
 Developing assessment methods: 13%
 Developing courses/modules: 8%
 Developing programmes: 7%
Topics of papers
What and how of student learning: 21%
 How students learn: 10%
 How students differ/develop: 6%
 What students learn: 5%
Topics of papers
Teachers: 11%
 How teachers differ/develop: 6%
 Developing teachers/academics: 5%
Topics of papers
Systems/contexts: 7%
 Developing departmental/institutional contexts: 4%
 QA systems: 3%
Evidence of impact
evidence of a positive impact on student learning
process and/or outcome
vs
evidence of no impact
evidence of negative impact
or no evidence

student self-reported evaluation comments not
accepted as evidence
Evidence of impact

13% of papers reported evidence of improved student learning process and/or outcome
c. 5 articles/year
much of the evidence concerns process, not outcome
useful impact often inferred from changed process
10% of studies taking an external perspective reported evidence of impact
many ‘no difference’ findings
some negative impact findings (Jones & Hassall)
Evidence of impact

Developing students: 20%
 Ramsden vs Norton
Developing courses: 15%
What and how of student learning: 14%
 86% descriptive of variation at one point in time
Evidence of impact

Developing dept/institutional contexts: 5%
Developing teachers: 4%
Developing QA systems: 0%
 Lack of use of theory in L&TSs
 Lack of evaluation built into L&TSs
 Lack of evaluation of impact of training of teachers
Evidence of impact
Variations in extent of evidence by year

Best, 1995 (Using research to improve student learning): 18%
Worst, 2001 (Improving student learning through using learning technology): 0%
Purposes

Development of practice: 58%
Discussion of issues: 17%
Development of theory/concepts: 14%
Development of research tools/methodology: 10%

Very uneven pattern across topic areas:
 How students learn: half about developing theory and concepts and a quarter about developing methodology and research tools
Discussion


In 2s and 3s …
what conclusions do you draw from this analysis of ISL papers?
Conclusions

Very little evidence of impact
Lots of description of variation
Masses of (atheoretical) accounts of practice, especially about C&IT
Original strong internal perspective is being lost
Evidence of impact and internal perspective both strongest when the focus is on research rather than on practice
Available tools/measures (e.g. of reflection) not used
Little focus on programmes or on context: departments, institutions and QA systems (despite Entwistle, 1995)
Comments


The value of using conceptual frameworks about teachers’ approach, perceptions of context, and approaches to leadership of teaching relies on long chains of logical connection back to the original research on approach and outcome.
The proportion of variance in outcome explained by, e.g., leadership is likely to be small.
Comments


Do improved CEQ scores mean student learning/performance/employability is better?
Are the context variables (underlying the CEQ) from the 1980s still dominant today?
Comments


Tools for measuring perceptions of context include variables (class size, student variation) that it is difficult to do anything about… we need to focus on what can be changed.
The claim that staff development is unlikely to be an effective lever is contradicted by evidence.
Comments


research and practice from the USA are extremely unlikely to be taken up and used (e.g. Angelo)
theoretical and methodological perspectives from mainland Europe are often ‘one-offs’
Comments

There is evidence of significant impact on student performance with very modest interventions (Price and Rust)
There is conflicting evidence of impact for almost identical interventions
We need better theory about teaching methods: what is it about forms of implementation that makes a difference to student learning?
It’s not all about approach: competence matters
 Trigwell’s definition of excellent teaching
 attempts to improve students as learners
Need for better theories of teaching

Why do methods sometimes work and sometimes
not?
 ‘learning to learn’
 ‘open learning’

Sophisticated diagnosis of students ‘at risk’
followed by unsophisticated intervention
 Meyer et al
 Open University
 US ‘first year experience’ being theorised in terms of
social and academic integration and self-efficacy
Towards a theory of the way
assessment supports learning





assessment is central both to student learning and to course design (Snyder, Miller and Parlett)
many accounts of assessment practice
lack of a conceptual framework for evaluating existing assessment practice
CEQ ‘Appropriate Assessment’ scale is ‘broad brush’ (Norton et al)
the CEQ item that correlates best with student performance is about ‘feedback’ - but there is no ISL study of feedback
“Conditions under which assessment
supports learning”

Influences of assessment on the volume, focus
and quality of studying: 4 conditions

Influences of feedback on learning: 7 conditions
Influences of assessment on the
volume, focus and quality of studying
Condition 1

Sufficient assessed tasks are provided for
students to capture sufficient study time
Influences of assessment on the
volume, focus and quality of studying
Condition 2

These tasks are engaged with by students,
orienting them to allocate appropriate amounts of
time and effort to the most important aspects of
the course.
Influences of assessment on the
volume, focus and quality of studying
Condition 3

Tackling the assessed task engages students in
productive learning activity of an appropriate kind
Influences of assessment on the
volume, focus and quality of studying
Condition 4

Assignments, exam questions and criteria
convey clear goals and high expectations
Influences of feedback on learning


greater effect sizes than for any other feature of schooling
many examples of modest interventions with
dramatic learning gains
Influences of feedback on learning
Condition 5

Sufficient feedback is provided, both often
enough and in enough detail
Influences of feedback on learning
Condition 6

The feedback focuses on learning and on actions
under the students’ control, rather than on the
students themselves and on their characteristics
Influences of feedback on learning
Condition 7

The feedback is timely in that it is received by
students while it still matters to them and in time
for them to pay attention to further learning or to
receive further assistance
Influences of feedback on learning
Condition 8

Feedback is appropriate to the purpose of the
assignment and to its criteria for success
Influences of feedback on learning
Condition 9

Feedback is appropriate, in relation to students’
understanding of what they are supposed to be
doing




Students’ conceptions of the task
Students’ conceptions of learning
Students’ conception of knowledge
Students’ conception of the discourse of the
discipline
Influences of feedback on learning
Condition 10

Feedback is received and attended to
Influences of feedback on learning
Condition 11

Feedback is acted upon by the student
Student Case









Oriented to passing and qualifying as a teacher
Copes with workload by concentrating on assignments
Looks first at the assignment questions
Assignments the “driving force for learning”
Remembers for revision what was in assignments
Studies half as many hours in non-assignment weeks
Feedback comes too late: the course has moved on
Skips units once enough marks accumulated
‘Fakes good’
Features of research...






collaborative with teams of teachers in varied
contexts
mix of internal and external perspectives
mix of qualitative and quantitative methods
multi-stage interventions, evaluation tool
construction, insights and theory building
Goal 1: improve student learning by changing
assessment
Goal 2: develop an approach to improving
assessment so that it improves student learning
Aspirations for ISL





continues to flourish …
emphasises evidence of impact - both from those
developing theory and those developing practice
develops theory with clearer implications for
practice
develops practices with clearer rationales
retains its ‘look and feel’
…see you in 2012!