And in conclusion - Edge Hill University

Reflections …
Main slides from Keynote to Edge Hill
SOLSTICE Conference, June 2013.
Prof Peter Hartley
[email protected]
Assumptions and expectatians



Note the ‘power’ to shape our thinking.
Beware the traps!
How can we consider alternatives?
PS did you spot the deliberate mistake on this slide?
From chalkface to screenface*
But first …

A wee trip down memory lane ….

* See Alison Ruth’s article
in Research in Learning Technology, 20, 2012
The joy of hand-cranked handouts from the Banda
http://1.bp.blogspot.com/_u3ERflVWg38/S8Zm2c-Zm8I/AAAAAAAAB0s/IUdo6ketmxA/s320/Ditto+Machine.jpg
What ‘portable’ PCs used to look like …
http://oldcomputers.net/pics/osborne1.jpg
And admire the specification
For this and other images, see http://oldcomputers.net
And for those of us on a limited budget
http://en.wikipedia.org/wiki/BBC_Micro
All human knowledge is here?
For a vision of the future, see http://www.youtube.com/watch?v=AJVtfRTH4mk
And let’s give students razor blades to edit audio …
For other examples, see http://www.schimmel.talktalk.net/tape/
Coming shortly …
And now we’ve got to
Digital Literacy and beyond?


Definitions?
What is next?
Wearable computing, e.g. Google Glass
Digital ‘language’ and gesture control
Big data and data-driven?
And a note on digital literacy:
Change in UK HE?
It was 40 years ago today…
Then and Now compared …
Then → Now (and potential):
Students were ‘top 3%’ → 40%/50% targets; WP
Binary divide → League tables for all Univs
CNAA validation for Polys only → QAA: Audit, NQF, Prog Specs
Professional teaching support? → HEA and UKPSF
Research/scholarship in LT? → Growing evidence/outlets
Teaching roles in Faculties? → NTFS, Univ Fellowships
No ‘e’ → Email, MS Office, VLE, Web 2
National student voice? → NSS - National Student Survey
Degree structures course-based → Modules, CATS, Semesters
Degree classification system → PDP, Burgess report & HEAR
Meet your new students?
This prospective student can:
Switch on devices.
Select folders which contain specific applications.
Use the interface effectively (e.g. swiping and pointing on touchscreen; voice control) to operate and control applications.
Switch between applications using different methods.
How old is he/she?

Here he is in action …
Your future student?
Alexander is just 2 years old.
He operates the iPad fluently and is disappointed that the desktop in the study does not have a touchscreen.
He is used to (and prefers) software which demands interaction and user response.
How digitally literate will he be when he arrives at University?
And so if I were to start again …
1. Reinvent my teaching.
2. Rediscover ‘the course’.
3. Focus on assessment strategy.
4. Revisit regulations.
5. Rebuild the furniture.
6. And have a go at some myths.
Reinventing teaching:
Resources, tools and role
Resources are no longer limited!
We have new flexible tools which our students can use with us.
New models of tutor behaviour: instead of ‘sage on the stage’, what about the:
‘Guide on the side’
‘Meddler in the middle’
See the work of Erica McWilliam, e.g. http://www.appa.asn.au/conferences/2011/mcwilliam-presentation.pdf
Unlimited resources?
Old teaching:
Library texts
Film and video/off-air
And now?
Library texts
YouTube and BOB (in the UK)
Web searches (note C-Link later)
Wikipedia
iTunesU
Collections, e.g. TED
Specific University websites
Project outputs and staff websites
Resource banks: JORUM, Merlot etc.
And a fourth role?
A personal example:
Zimbardo’s prison expt
When I was teaching a course on Interpersonal Communication, this lecture was one of my ‘best performances’.
This lecture is now completely redundant – I have been (and should be) replaced by ‘better’ online sources, as you can see on the next slide.
Zimbardo’s prison expt:
materials now available
Old teaching:
Few Library texts
Film too costly; limited off-air
And with OER?
Library texts: books and journal articles – still limited
YouTube: original experiment with footage of participants, both now and then; commentaries; replications and simulations
Google videos: clips and documentaries; SlideShare: Yr 12 Psych example.
BOB – allows download and edits
Web searches (note C-Link later today): 75,000 results; you can quickly find both the Prison website and Zimbardo’s website, and the challenging BBC Prison Study
Wikipedia: dedicated page (where first year students will go first!)
iTunesU: e.g. OU Critical Social Psychology course – inc transcripts
Web Collections, e.g. TED has Zimbardo profile with links plus 2008 talk inc photos from Abu Ghraib (how people become monsters) plus links plus blog
Specific University websites: MIT OpenCourseWare; OU OpenLearn
And so …
Why should I lecture on Zimbardo when all students can see the man himself in action on TED (as nearly 2.5 million people have done already)?
How can I use the resources (e.g. original experimental footage on YouTube) to help students become critically engaged?
New flexibilities … one possibility
An old way:
Lecture → leads to reading → which leads into seminar or large group discussion.
A new possibility:
Key question circulated online with weblinks → points at resources → which (individually or collectively) take you into online posting or discussion → which then leads into a class session (may be a mix of lecture and seminar/workshop activity) → which generates the next questions …
Compare this outline with more recent discussion of the flipped classroom.
Which e-tools are essential for most or all teaching staff?
I assume we all have:
MS Office (or equivalent) and email
VLE & plug-ins (e.g. Turnitin)
What else do we need?
Take 30 seconds to answer this question for yourself.
Which tools are essential?
– my personal list ‘this week’
E-portfolio (e.g. PebblePad)
Concept mapping (Cmap)
Screen capture (Camtasia)
Podcasting (e.g. feedback)
Twitter
Social networking (Facebook)
Search (Google / C-Link)
RLO tools (e.g. GloMaker)
OER (e.g. TED, YouTube)
Livescribe pen or equiv.
Survey tools (SurveyMonkey)
Mobile devices (e.g. iPod, iPad)
Camera (e.g. smartphone, iPod)
iTunes (and the U)
Videoconference (Collaborate)
Photo editing (Photoshop)
Interactive multimedia
Blogs & Wikis (e.g. Wikipedia)
Speech recognition (Dragon)
CAA (e.g. QM Perception)
How do you respond to my list?
Are these simply the ramblings of an elderly techy/geek?
What range of applications can we realistically expect most staff to become familiar with?
Which applications are really important (and in which disciplines)?
Flexible materials/tools:
3 personal examples
The materials:
Making Groupwork Work
Interviewer (Careers and Viva)
C-Link
The rationale in each case:
Clarify the educational ‘problem’
Find/develop the appropriate technology
Implement as cost-effectively as possible
Example 1
Making Groupwork Work:
Supporting student groupwork through multimedia and web …
Freely available at this website
University of Bradford
University of Leeds
Sample screenshot
Making Groupwork Work
Rationale → Comment:
Clarify the educational ‘problem’ → Students do not work effectively in groups.
Find/develop the appropriate technology → Need examples of how issues can be identified/resolved.
Implement as cost-effectively as possible → Materials developed with small grant from CETL.
Making Groupwork Work from
the LearnHigher resources
And the full story
Details at http://www.palgrave.com/products/title.aspx?pid=371507
Example 2: Interviewer
2nd edition on DVD still available from Gower.
New online version available shortly at Edge Hill.
Provides:
opportunity to respond to real interview questions, and review your performance, as often as you like
‘non-threat’ arena to improve skills
additional feedback and guidance
flexibility: as a stand-alone resource or as part of a course on career planning; can support staff contact and guidance
Does not provide:
the ‘right answer’
Example of screen shot from the current software:
As soon as the interviewer finishes the question, your webcam switches on and you can respond and review your response.
When you review, you can look at hints and tips and consider our suggestions on what the interviewer is looking for.
Preparing students for their
research viva: a new approach
Prof Peter Hartley, Centre for Educational Development
University of Bradford, [email protected]
Prof Gina Wisker, Head of Centre for Learning and Teaching
University of Brighton, [email protected]
Why bother?
• Postgrad students perceive the Viva process as a ‘black box’ – impact on anxiety and nerves.
• Students do not know how to prepare.
• Limited supervisor time and resources.
• Students may not anticipate the broader ‘helicopter’ questions.
What does Viva offer?
• General overarching questions.
• Flexible and unlimited practice.
• Self and/or peer assessment.
• Onscreen feedback.
• A process for preparation.
• Potential use with supervisors.
What do users think?
• ‘saved my life’.
• ‘would not have known where to start without it’.
• ‘gave me a process to plan my preparation’.
• ‘boosted my confidence’.
Interviewer and Viva
Rationale → Comment:
Clarify the educational ‘problem’ → Students do not perform to their best in interviews/vivas.
Find/develop the appropriate technology → Need system which supports interaction and reflection.
Implement as cost-effectively as possible → Online solution will be minimum cost to HEI / no cost to student.
Example 3
Info Search into Cmap: C-Link
A new search approach to identify links and paths between concepts.
Can export into concept maps (Cmap).
Currently set up for Wikipedia.
To explore and use C-Link, go to www.conceptlinkage.org/
To go straight into the tool: www.conceptlinkage.org/clink/
(A simplified sketch of the ‘paths between concepts’ idea follows below.)
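For anyone curious about the underlying ‘links and paths between concepts’ idea, here is a deliberately tiny sketch over an invented, hand-made concept graph. It illustrates the general notion of finding a chain of linked concepts; it is not C-Link’s actual algorithm, its data, or its Wikipedia back end.

```python
from collections import deque

# Toy illustration only: a tiny hand-made concept graph standing in for
# Wikipedia's link structure. This is NOT C-Link's algorithm or its data.
CONCEPT_LINKS = {
    "Assessment": ["Feedback", "Examination"],
    "Feedback": ["Learning", "Assessment"],
    "Examination": ["Grading", "Assessment"],
    "Learning": ["Memory", "Feedback"],
    "Grading": ["Degree classification"],
    "Memory": [],
    "Degree classification": [],
}

def find_path(start, goal):
    """Breadth-first search for a chain of linked concepts from start to goal."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbour in CONCEPT_LINKS.get(path[-1], []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None  # no chain of links found

print(find_path("Assessment", "Memory"))
# ['Assessment', 'Feedback', 'Learning', 'Memory']
```

A real tool would of course work over live Wikipedia link data and export the result as a concept map rather than a simple list.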
Example map generated by C-Link
This map was chosen as it is very simple but does show how related terms can have very different origins and histories.
Most searches deliver more complex maps.
The maps can be exported so that students can do further work on them.
C-Link
Rationale → Comment:
Clarify the educational ‘problem’ → Students do not search for information critically or effectively.
Find/develop the appropriate technology → System builds a map of relationships to stimulate enquiry.
Implement as cost-effectively as possible → System developed in JISC project – now freely available.
And so …
Ever-increasing range of useful and accessible materials and tools.
Can offer learning experiences which are not practicable or achievable by traditional means.
We can all get involved in this!
And consider adaptive systems
PBL with consequences – you get immediate feedback on the consequences of your decisions.
e.g. The G4.5 project at St George’s
Their Ethics simulation – iEthics
2. What is your course?
How do we ‘see’ and define our courses?
Let us eat cake …
The workshop exercise using visual analogies:
What would your ‘ideal course’ cake look like?
You might like to try this as a short exercise with a course team to stimulate discussion about our preconceptions and assumptions in curriculum design.
And mine …
Emphasises the journey and the goal.
Notion of travel up through levels, but this is not a series of tidy steps.
Some people do get stuck!
3. Assessment Strategy
http://www.pass.brad.ac.uk
http://www.testa.ac.uk

TESTA project
NTFS group project with 4 partners, starting from audit of current practice on nine programmes:
surveyed students using focus groups and AEQ – Assessment Experience Questionnaire – Graham Gibbs et al
also using tool to identify programme-level ‘assessment environments’ (Gibbs)
Consistent practice?
‘Characterising programme-level assessment environments that support learning’ by Graham Gibbs and Harriet Dunbar-Goddet. Published in Assessment & Evaluation in Higher Education, Volume 34, Issue 4, August 2009, pages 481-489.
Data from TESTA
Your ideal
assessment environment?
The need for strategy
An example finding from Gibbs:
‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’
And what did make a difference?
Formative-only assessment;
More oral feedback;
Students ‘came to understand standards through many cycles of practice and feedback’.
Overall TESTA findings:
“consistent relationships between characteristics of assessment and student learning responses, including a strong relationship between quantity and quality of feedback and a clear sense of goals and standards, and between both these scales and students’ overall satisfaction.”
Tansy Jessop, Yassein El Hakim & Graham Gibbs (2013): The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different programme assessment patterns, Assessment & Evaluation in Higher Education.
The PASS project
What do we mean by PFA? #1
“The first and most critical point is that the assessment is specifically designed to address major programme outcomes rather than very specific or isolated components of the course. It follows then that such assessment is integrative in nature, trying to bring together understanding and skills in ways which represent key programme aims. As a result, the assessment is likely to be more authentic and meaningful to students, staff and external stakeholders.”
From the PASS Position Paper: http://www.pass.brad.ac.uk/position-paper.pdf
What do we mean by PFA? #2
[Diagram: assessment plotted on two axes – the extent to which assessment covers all the specified programme outcomes (low to high) against the weighting of the assessment in the final qualification (low to high). Typical module assessment sits towards the low end of both axes; varieties of PFA sit towards the high end.]
What do we mean by PFA? #3
[Diagram: varieties of PFA placed on the same two axes (extent to which assessment covers all the specified programme outcomes vs weighting of the assessment in the final qualification) – personal evidence against programme outcomes; integrative semester/term assessment; integrative level/year assessment; and final heavily weighted integrative assessment.]
An example:
Peninsula Medical School
Case study already available.
Includes:
four assessment modules that run through the 5 year undergraduate medical programme and are not linked directly to specific areas of teaching.
focus on high-quality learning (Mattick and Knight, 2007).
Examples from Brunel
Biomedical Sciences:
Study and assessment blocks in all years.
Cut assessment load by 2/3rds; generated more time for class contact.
Synoptic exam in all three years.
Mathematics:
Conventional modules in final year only.
Improved understanding and ‘carry-over’ of ‘the basics’ into year 2.
And finally … the assessment/identity interface
Students as ‘conscientious consumers’ (Higgins et al, 2002).
But:
personal identity as ‘mediator’.
e.g. apprentice (‘feedback is useful tool’) cf. victim (‘feedback is another burden’).
So we need to change the mindsets of some students.
4. Revisit the regulations
Example:
New regulations at Brunel
2009 Senate Regulations give almost total freedom in the design of Levels.
Allows conventional modules (modular blocks) = study and assessment credit coterminous.
Allows separate assessment blocks and study blocks.
Study blocks = purely formative, no summative assessment. Study blocks can be any volume of study credits.
Assessment blocks can summatively assess learning from more than one study block. Assessment blocks can be 5 - 40 credits.
Each UG level = 120 study credits + 120 assessment credits.
Study credits = expected student study time.
Assessment credits carry no study time but reflect complexity and importance.
Encourages Level-based design of study and assessment, as opposed to a module-based approach (a small illustrative sketch of these rules follows).
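To make the credit rules above concrete, here is a minimal, hypothetical checker for a level design. Everything in it (the block names, the example figures, the very idea of coding the check) is an illustrative assumption rather than anything Brunel actually runs; it simply encodes the 120 + 120 split and the 5-40 credit range for assessment blocks.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: names and figures are invented for illustration.

@dataclass
class StudyBlock:                 # purely formative; any volume of study credits
    name: str
    study_credits: int

@dataclass
class AssessmentBlock:            # summative; may assess several study blocks
    name: str
    assessment_credits: int       # no study time attached; reflects weight
    assesses: list = field(default_factory=list)

def check_level(study_blocks, assessment_blocks):
    """Check one UG level against the rules sketched above."""
    problems = []
    if sum(b.study_credits for b in study_blocks) != 120:
        problems.append("study credits must total 120")
    if sum(b.assessment_credits for b in assessment_blocks) != 120:
        problems.append("assessment credits must total 120")
    for a in assessment_blocks:
        if not 5 <= a.assessment_credits <= 40:
            problems.append(f"{a.name}: assessment blocks must be 5-40 credits")
    return problems or ["level design OK"]

# A conventional 'modular' level: one assessment block mirrors each study block.
studies = [StudyBlock(f"Study {i}", 30) for i in range(1, 5)]
modular = [AssessmentBlock(f"Assess {i}", 30, [f"Study {i}"]) for i in range(1, 5)]
print(check_level(studies, modular))

# A level-based design: fewer, larger assessments spanning several study blocks.
level_based = [
    AssessmentBlock("Coursework portfolio", 40, ["Study 1", "Study 2"]),
    AssessmentBlock("Synoptic exam", 40, ["Study 1", "Study 2", "Study 3", "Study 4"]),
    AssessmentBlock("Project", 40, ["Study 3", "Study 4"]),
]
print(check_level(studies, level_based))
```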
Flexible regulations
allow new structures
5. Rebuild the rooms: an example which worked
IT4SEA project at Bradford:
100-seater facility with thin client technology.
QMP as University standard for summative assessment.
Procedures agreed with Exam Office.
Design of room (available as cluster outside assessment times).
Teaching potential.
The main CAA room at Bradford
And the growth …
And recent changes …
Growth in ‘hybrid exams’:
mix of automatic marking (QMP) and open-ended response items (e.g. short answer questions).
short answers collated into a spreadsheet and marked by a human (a rough sketch of this step appears below).
Example of impact:
‘reduced my marking load for this module from 5 days to one day, whilst still enabling assessment of higher order cognitive skills.’
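As a rough illustration of the collation step only, the sketch below assumes each student’s short answers have already been exported to a per-student CSV with invented column names (question_id, question_text, response); it is not the real QuestionMark Perception export format or workflow, just the general ‘gather everything into one marking sheet’ idea.

```python
import csv
from pathlib import Path

# Hypothetical sketch: file layout and column names are assumptions,
# not the real QMP export format.
EXPORT_DIR = Path("qmp_exports")                  # one CSV per student
OUTPUT_FILE = "short_answers_for_marking.csv"     # single sheet for the marker
FIELDNAMES = ["student_id", "question_id", "question_text",
              "response", "mark", "comment"]

def collate_short_answers(export_dir, output_file):
    """Gather every student's short-answer responses into one marking sheet."""
    rows = []
    for student_file in sorted(export_dir.glob("*.csv")):
        student_id = student_file.stem             # assume filename = student id
        with student_file.open(newline="", encoding="utf-8") as f:
            for record in csv.DictReader(f):
                rows.append({
                    "student_id": student_id,
                    "question_id": record["question_id"],
                    "question_text": record["question_text"],
                    "response": record["response"],
                    "mark": "",                     # left blank for the marker
                    "comment": "",
                })
    with open(output_file, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    collate_short_answers(EXPORT_DIR, OUTPUT_FILE)
```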
And finally:
some myths to work on
Marks are numbers?
Courses can be built like Lego?
“You only need to tell them once”
Students understand assessment criteria?
Learning styles?
Communication is 93% nonverbal?
We can multi-task?
Anyone care to join me in some collaborative research?
And so if I were to start again …
1. Reinvent my teaching.
2. Rediscover ‘the course’.
3. Focus on assessment strategy.
4. Revisit regulations.
5. Rebuild the furniture.
6. And have a go at some myths.
And I would work
at 3 levels …
In other words, challenging:
ME
The course team
The institution
Thank you

Peter Hartley:
[email protected]