
Cognitive testing in the
application process
Philip Duggan
Head of Secondary PGCE
Programmes & Graduate Diplomas
The plan
• Cognitive testing used in three ways:
– To inform the application process
– To assist trainees to market themselves
– As a longitudinal research study
Why do we want it?
• Cognitive testing to inform:
– Commitment, motivation
– Confidence, enthusiasm, energy
– Team worker, effective manager, communicator
– Resilience, robustness, balance
The application process
• Prepare applicants for:
– Diagnostic written tests (M level)
– Group discussions
– Pupil activities
– Pupil interview panels
– Cognitive testing (to inform selection)
– Skills tests (to be passed before entry on to the course)
– Answering the question/background research
How do we start?
• Use of an established consultancy that is already heavily involved in our own recruitment processes
• Commitment to support the on-going
development of the tests
Pilot: cognitive testing
• 90 PGCE trainees (2012-13)
– Earlier this year, a number of trainees undertook the
cognitive tests and were provided with feedback on
their behavioural styles and preferences, with a view
to supporting their personal development and helping
them secure employment at the end of their course.
– Longitudinal development: trainees are invited to participate in further research and are followed up at the 1-, 2- and 5-year points.
Testing details
• The trainees receive two reports:
– Work Strengths Behavioural Report (ITT assessment)
– Work Strengths Candidate Report (Marketing Report)
ITT Assessment
Work Strengths (sample report pages)
How are they used?
• Used in a very similar way across PG and UG
selection processes.
– All interviewers look at the reports, but they are used to enhance judgements only where decisions must be made between very similar applicants, and then only post interview.
– As yet no one uses them to inform questioning during the interview, nor as a source of supplementary questions.
How are they used?
• As far as supporting selection is concerned, there is a strong sense that the reports should be used very carefully and interpreted through professional judgement.
– Subject routes with large numbers of applicants to select from may use the reports to check for specific strengths, but only after the interviews have taken place.
School Direct
• In recruitment to School Direct, use of the reports is variable.
– Schools do not consider it useful to rely on a tool they do not use in their own recruitment processes and with which they have no experience.
– They do not ask applicants to take the test until they are brought to the review panel for final selection. When they do refer to the tests, they are most interested in communication, and in strong candidates they look for leadership potential – but again, this is post interview.
Barriers
• Schools are finding that large numbers of applicants do not turn up for interview, so they are conscious of the money wasted on earlier testing.
• They are also concerned about applicants making dual applications – SD and standard PGCE – who may be tested twice, again at cost to the TA.
Concerns
• The primary team note that the same judgements are applied to 17/18-year-olds as to PGs, which may affect the results.
– Use in induction for UGs may not be useful, as they are developing these skills naturally as they mature.
– There is further concern about cost implications, in that a significant number of applicants accept interview dates, complete tests and then fail to attend.
– It may be worth investigating whether there is any correlation between low turnout at interviews and the amount of pre-interview testing (skills tests as well as non-cognitive testing).
Follow up
• There is a fairly positive attitude towards using the reports as part of the trainee’s self-auditing procedure during induction, where new trainees engage with developing their personal qualities as well as their subject and pedagogical knowledge.
Any Questions?