A Proforma for Doing Educational Supervision

Gail Crowley (TPD Rotherham), Mei-Ling Denney (TPD Edinburgh) and Ramesh Mehay (TPD Bradford)
In order to make informed, evidence-based judgements about a trainee, you will need a reasonable amount of evidence within their e-portfolio. The more assessments you look at, the better the picture you get of the trainee, and the more accurate your holistic judgement will be at the end. Because a trainee's demonstrated competence is expected to increase as they progress through the year, the later assessments should show more evidence of competence than the earlier ones.
The same applies to the Learning Log and PDP. If there are few entries, it's worth looking at the majority of them; but if you have an enthusiastic trainee with a large number of entries, you may need to sample judiciously.
This is what the e-portfolio’s left navigation menu looks like and a summary of what to do:
Trainee’s Name
Summary
Learning Log
PDP*
Evidence
Posts
Educator’s Notes
Curriculum Coverage*
Skills Log*
Competence Areas
Reviews
Create Review
Continue/Edit Review
Here’s what you need to do (in 10 easy steps):
1. Learning Log – click here first, read the entries and map them to competency areas.
2. PDP – review it: is it alive and kicking?
3. Review the frequent WPBA assessments (CBDs, COTs, mini-CEXs).
4. Then review the infrequent WPBA assessments (MSF, PSQ, CSR).
5. Educator’s Notes – what do the Educator(s) have to say? What themes emerge?
6. Curriculum Coverage – what areas of the curriculum are low and need more work?
7. Skills Log – are the DOPS being achieved in a timely way?
8. Competence Areas – which competence areas lack evidence and need further work?
9. Review the trainee’s self-rating of the 12 competency areas.
10. Finally, click ‘Create Review’ to fill in the online ESR** and rate the trainee on the 12 competency areas yourself (objectively).
*Although these three areas can also be found in the Online ESR (in the ‘Create Review’ section), we think it’s easier just to
follow the e-portfolio’s navigation system in order.
** The online ESR has 5 sections for you to comment on: 1) Curriculum Coverage 2) Skills Log 3) PDP
4) Competence Areas – Trainee 5) Competence Areas – Ed Sup.
Let’s go through each of these in turn.
 The Learning Log (including NOE)
The Learning Log: Read as many entries as is necessary for you to get an
accurate idea of how the trainee is using the log and performing. If the
Clinical Supervisor hasn’t read many, then you may need to go through them
systematically. Entries that have not been marked as ‘shared’ by the trainee
will not be visible to you: so, if the log looks a bit sparse, check this out with
the trainee.
What to do:
 Are the entries being made in a timely way? Log entries should be made throughout
the placement, not all at once at the eleventh hour. So look at the dates! The date
visible in the list is not necessarily the date when the entry was made (unlike the
PDP) – you have to open each entry individually and scroll to the bottom to see the
actual date of entry.
 Are there enough entries? As a rough guide, there should be around 1-2 entries per
week in a hospital post and around 3 per week in a GP post.
 Is there a variety of learning activities being recorded – a wide range of clinical
encounters, tutorials, reading, significant events, OOH entries etc.?
For each entry you read:
 Review the quality of the entry – what’s the depth of reflection/analysis like? Use
the ISCE criteria (see below). Entries can often be descriptive rather than reflective.
 Check the validity of the curriculum statement headings the trainee has linked it to.
 Add links to the 12 professional competencies you feel are appropriate.
 Make a comment if you feel it would help – e.g. how the entry could have been
improved (after discussing it with the trainee).
A good log entry will:
a) Show a good depth of reflection (more on reflection below)
b) Clearly highlight the learning outcome
c) Demonstrate coverage of one or more curriculum areas
d) Provide evidence for one or more professional competencies.
Time Out - let’s talk a bit about reflection
To be able to judge whether a log entry shows good reflection, you need to understand what
reflection is. It is a skill we all exercise to varying depths and, being a skill, it can be
strengthened through practice.
Why do we go on and on about reflection?
Reflecting on or during an experience in the light of known theoretical concepts or previous
learning should lead to new insights into different aspects of that situation. Effective
learning, therefore, won’t happen unless you reflect: the outcome of reflection is learning
(Mezirow2, 1981).
The proper definition
 Kemmis3 (1985): the process of reflection is more than a process that focuses 'on the
head'. It is a positive, active process that reviews, analyses and evaluates experiences,
draws on theoretical concepts or previous learning, and so provides an action plan for
future experiences.
 Johns4 (1995) adds that reflection is a personal process that enables the practitioner to
assess, understand and learn through their experiences. This results in some change in
the individual's perspective of a situation, or creates new learning for the individual.
Judging the depth of reflection in a log entry – the ISCE criteria
We’ve decided to call them the ISCE criteria, but they were originally described by
Richardson and Maltby5 in 1995.
 Information: How well does the trainee describe what happened or was
observed? Is it in enough detail?
 Self-Awareness: Is the trainee open and honest about performance (usually
through writing about their own feelings and/or those of others)?
 Critical Thinking: Does the log entry show evidence of analysing the bigger and
smaller pictures, problem solving and describing their own thought processes?
 Evaluation: Does the trainee pull together the above three things (synthesis)
before going on to describe what needs to be learned, why and how?
In terms of these 4 things (ISCE), here are some descriptors of what is acceptable:
Have you ever read a trainee’s log entry and thought it was a bit dire (and that was putting it
politely)? Did you struggle to pinpoint why it was so dire? Then use the ISCE criteria to
help you figure this out! It’s often because the trainee has written endless descriptive notes
without further analysis or evaluation. The descriptors under the ‘excellent’ column in the
table above tell you what makes a really good and deeply reflective log entry.
Why not print this little table and keep it with you when reading log entries? How about
sharing it with the trainee and getting them to assess their own level of reflection on their own
log entries? They’re more likely to learn this way. Another way you can get them to learn is
to pick a poor log entry and get them to apply Kolb’s6 experiential learning cycle to it.
Kolb’s learning cycle (1984) is covered in greater depth in Chapter XXX Educational Theory
Worth Knowing but we’ve outlined it briefly for you here.
Kolb’s Learning Cycle:
 CONCRETE EXPERIENCE is about something that has happened to you or that
you have done.
 REFLECTION is concerned with reviewing the event or experience and exploring
what you did and how you and others felt about it.
 ABSTRACT CONCEPTUALISATION is all about developing an understanding of
what happened by seeking more information or bringing in theoretical concepts
or previous learning to form new ideas about ways of doing things in the future.
 ACTIVE EXPERIMENTATION is about trying out these newly formed ideas.
The more you can coach your trainee to do reflection properly, the more you move them from
ignorance to understanding. That should help them move more swiftly, requiring less input
from you and making your work as Educational (or Clinical) Supervisor easier. Bliss!
My trainee doesn’t like writing it all down. He says he does all this reflective stuff in his
head anyway and all this e-portfolio stuff is unnecessary!
It’s not uncommon to hear this, but remember: reflection is an
active process rather than passive thinking. The problem with
thinking ‘in your head’ is that people often rush through the
stages of the reflective process. Writing it down encourages them
to slow their pace, leading to a better description, better critical
analysis, better self-awareness and better evaluation (or learning) –
Richardson & Maltby5 1995, Zubizarreta7 1999 and Tryssenaar8
1995. Is that enough evidence for you?
Back to log entries – validating them
When a trainee adds a log entry, it needs to be ‘mapped’ to two things:
1) The GP curriculum statement headings. Examples:
 The General Practice Consultation
 ENT problems
 Sexual Health
The GP trainee does this mapping; the supervisor needs to validate it.
2) The 12 professional competencies. Examples:
 Practising holistically
 Data gathering & interpretation
 Making a diagnosis/decision
Only the supervisor can do this mapping.
The trainee can link more than one curriculum statement heading to a log entry (just as you
can with professional competencies). Although it is the trainee who maps to curriculum
statement headings, you need to validate the links to make sure they are not over-linking
(adding links unnecessarily) or under-linking.
Validation is not the same as competence: when you validate a log entry against say
‘Care of Older Adults’, you are simply saying ‘Yes, this log entry is something about
Care of Older Adults’. You are NOT making a judgement call as to whether the trainee
is competent in this area or not.
However, when you are mapping to professional competencies, only link where:
(a) The written log entry is clearly about that competence and
(b) The trainee has made a clear reflection and analysis in terms of that competence.
Why is all this mapping and validation important?
All the mapped entries will later appear on a web page of collated evidence that you will be
reviewing later – anything you do now to make this better will pay dividends in the end.
Naturally Occurring Evidence (NOE): Some Deaneries have specific requirements for
what they want to see as part of ‘Naturally Occurring Evidence’ (usually Significant Events,
Audit or Reflection on QoF, Case Study or Presentation and Reflections on the Post).
What to do: Check to see if these have been done and their quality.
 The PDP
The PDP should show a number of entries that clearly relate to the
personal learning needs of that particular trainee. Mandatory
requirements should not be entered on the PDP, and most entries
should fulfil SMART (Specific, Measurable, Achievable, Realistic,
Time bound) criteria. Alarm bells ring when there is nothing
planned… no thought about what to get out of the job. And when
nothing is signed off.
What to do
 Check entries are appropriate for the PDP – is the trainee doing any forward planning?
 Most entries should fulfil SMART criteria.
 Are they actioning and fulfilling some of them?
What’s a PDP and how should you assess it? A personal viewpoint.
A useful analogy would be to view the PDP like a shopping list. We might go shopping with a list of
items we know we need to buy, but we may find we need to change the items when in the shop
because of lack of availability etc. And we may also spot things we didn't know we needed until we
saw them on the shelves. Therefore, I think the PDP is great for planning learning (and particularly
useful at the beginning of each post and at review points), but in my view day to day small items of
learning can be addressed perfectly adequately through the learning log. The PDP is a tool to aid
reflection and learning, and is not an end in itself.
Comparing sophisticated PDPs to more rudimentary plans is like comparing a Rolls Royce to a Mini.
The Rolls is considerably more expensive and sophisticated, but may be no more effective in getting
from A to B, and in certain circumstances (e.g. climbing a snowy hill) the Mini may be the better
vehicle. The difference between an elaborate and basic PDP may be more about preferred learning
style than effectiveness of learning strategies. In my view, the key question is whether the trainee has
shown understanding of the process, and has planned and completed learning journeys.
In terms of assessment, we need an approach that is more qualitative than quantitative when looking
at PDPs, and I think the PDP also has to be seen as just one element of the bigger portfolio, rather
than something to be judged in isolation. I am looking for evidence in the e-portfolio that the trainee is
able to identify and address learning needs, and demonstrate application of learning to practice. If the
PDP is scantily populated but the learning log demonstrates reflective practice and completion of
learning cycles then I think this is acceptable. A PDP where nothing is signed off does of course raise
concerns.
I think one also needs to be careful about labouring the importance of "SMART" objectives (although I
know it borders on heresy to say so!). Great idea in theory and it is useful to ask a trainee ‘what
exactly do you need to know when you say you want to learn more about diabetes?’ but tidying up
objectives to make them SMARTer doesn't necessarily add value to the learning in my experience. I
think the discussions about "SMARTness" are important, and it is of course helpful to clarify the nature
of the need and how it will be addressed and completion demonstrated. However, specificity in
particular can sometimes be difficult to define in advance of learning, and sometimes useful outcomes
may be achieved that bear little resemblance to initial objectives. I don't think this necessarily matters,
although it is great if trainees include some reflection on this.
Going back to the qualitative vs. quantitative argument, as experienced educators we are in a position
to make credible judgements about PDPs, even if our judgements may differ about the same PDP. I
suspect that criteria that give best inter-rater reliability may be the least useful, and maybe we
shouldn't get too hung up about this.
Nick Field (APD Sheffield)
Reviewing the frequent WPBA assessments (CBDs, COTs, Mini-CEXs)
Explore your confidence in the clinical supervisor
As you look at each set of assessments, try to get a feel for the accuracy of the grades given
under the various competency or other headings. If the trainee is in a hospital post, it is more
difficult to know how confident you can be about the ratings of the Clinical Supervisor
(compared with Clinical Supervisors in General Practice, who are often Trainers, have had
formal training and thus have a similar understanding to you). The assessments have to be
done by doctors in grades higher than the trainee, preferably by the Clinical Supervisor or a
designated senior doctor who has an understanding of the assessments – but even then, you
probably won’t know the exact seniority of the doctor. And remember, many of them will
not have received the level of training Trainers get, and others may be new to all of this.
Warning:
Alarm bells should ring if you see:
 The same grade being given under every heading
 An excessive number of ‘Excellent’ grades.
 Grades awarded under headings that are unlikely to have been
tested in a hospital post (e.g. ‘Primary Care admin and IMT’).
CBD, COT and mini-CEX mapping
After sampling enough CBDs, COTs and mini-CEXs to be confident in the ratings, it’s then
important to step back and get an overall view. To do this, get the trainee to map out each
competency set for each assessment so that you can see if any patterns emerge.
For example, you may notice out of the 8 CBDs done so far that there’s a competency that
has never been assessed, or has always been graded as needing further development – thus
helping you to advise the trainee. There are competency mapping sheets on the Bradford
VTS website (link below) that we urge you to encourage the trainee to fill in (it will make
your life easier if you get the trainee to do the work for their own e-portfolio).
The mapping sheets also help you get an overview of the context, complexity and level of
difficulty of the cases seen. This may have a bearing on the ratings given by the assessor,
and you will need to take this into consideration. If they are of low challenge, making it
difficult to judge your trainee's competency, it is worth noting this in your comments in the
ESR.
Reviewing the infrequent WPBA assessments (MSF, PSQ, CSR).
Multi Source Feedback (MSF)
In a general practice post, trainees need five clinicians and five non-clinicians to complete an
MSF report, but in a hospital post they need only five clinicians, whatever their ST year.
When you click on ‘MSF’, you will be offered a bar chart icon and a spyglass icon. Click on
the bar chart icon to see the collated evidence. This will be split under the two headings of:
1. Overall professional behaviour and
2. Overall clinical performance.
You will be able to see how many times the trainee has been rated on each of the seven
grades, from very poor to outstanding, and also the average grade mark benchmarked against
the national average. The free text comments in the MSF are especially useful, giving insights
that would not be gained from the other individual assessments.
What to do:
 Pick out themes (positive and negative) from the free text entries under ‘Highlights in
performance’ for both professional behaviour and clinical performance.
 Formulate suggestions that might help the trainee to do better (by reviewing the
suggestions offered and developing some of your own).
 Incorporate these into the final ES report.
 The results of the MSF are not automatically released to the trainee (they want you
to see them first). To release them, select the release option from the drop-down
menu at the bottom of the page and then hit the button to confirm.
Do not release the MSF report to the trainee if there are potentially damaging
comments within it; these should be discussed with the trainee face-to-face,
preferably after some more information about them has been gleaned in order to put
the comments in context.
Patient Satisfaction Questionnaire (PSQ)
A PSQ is only ever done when the trainee is in a primary care placement.
What to do:
Click on the spyglass icon:
 First, look at the lowest and highest scores, which will give you the range (or
spread) of responses. A narrow range means most responses were similar; a wide
range means opinions differed widely.
 Then look at the median scores for each of the 11 sections. We suggest using the
median (rather than the mean) as the most useful measure of average because it is the
one least likely to be skewed by unusual scores.
 In simple terms, compare the trainee’s median to the peer median.
 If the same: the trainee performed similarly to the average GP trainee population.
 If individual > peer: the trainee performed better than average.
 If individual < peer: the trainee performed worse than average (a concern!).
 As with the MSF
- Make some comments in the ESR (positive and negative themes, suggestions
– having discussed these with the trainee first).
- ‘Release’ the PSQ results to the trainee if you are happy about doing this.
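The point about the median being least likely to be skewed can be seen in a small sketch (the scores below are made up for illustration; they are not real PSQ data):

```python
import statistics

# Hypothetical responses for one PSQ section on a 7-point scale:
# most patients rate the trainee 6, but one unusual outlier rates them 1.
scores = [6, 6, 6, 6, 6, 1]

mean = statistics.mean(scores)      # dragged down by the single outlier
median = statistics.median(scores)  # unaffected by the outlier

print(f"mean = {mean:.2f}, median = {median}")  # mean = 5.17, median = 6.0
```

One unusual respondent pulls the mean almost a whole grade below what the typical patient actually said, while the median still reports the typical response – which is why comparing the trainee's median with the peer median is the fairer test.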
Clinical Supervisor’s Report (CSR)
There should be a CSR for each post that the trainee has been in (GP and hospital). In
posts which are split between secondary and primary care (Integrated, Innovative, Modular or
whatever else they are called) there needs to be a CSR from both the hospital clinical supervisor
and the GP Trainer. These are likely to give a very helpful overview of the trainee's performance
during that post. Comments are split under the four headings of RDM-p:
1. Relationship
2. Diagnostics (= decision-making skills)
3. Management (= management of oneself rather than clinical management)
4. Professionalism
The college has produced some guidance on how to write a CSR. It might be worth sharing
this with the Clinical Supervisor before they do the report.
What to do:
 Review each of the R, D, M and P categories and pick out positive and negative
themes.
Consider looking back over previous CSRs to see if there were additional themes
(are these now resolved?) and/or whether the current themes were picked up back
then and little progress has been made since.
 Discuss with the trainee – especially what’s going on for them and possible
suggestions.
 Summarise all of this for the ESR.
 The Educator’s Notes
How you use the Educator’s Notes is not set in stone. Although only Educators (Trainers,
Clinical and Educational Supervisors) can write in this section, it is viewable by all
stakeholders (trainees, ARCP panels etc.). ARCP panels find the notes particularly helpful.
Tips:
Use ‘Educator’s Notes’ to add comments relating to:
 Performance concerns discussed with the trainee
 Performance of the trainee that is additional to or at odds with the evidence
from the learning log.
 Personal or professional circumstances of the trainee that put the number
and quality of assessments into context e.g. family problems, sickness, or
problems with the training placement itself.
 Recording additional ES meetings (i.e. those on top of the final ‘official’
one for the post).
 Reminding the trainee to follow up something.
 For tutorial planning – reminding you and the trainee about what you are
planning to cover next week and what both of you need to do before then.
What to do:
As an Educational Supervisor:
 Read through the Educator’s Notes to get an overall picture of how training is going.
 Pick out any obvious positive and negative themes (and discuss with the trainee).
 Add to it if you feel what you have to say will help the trainee or the educators
involved.
 Curriculum Coverage
What to do:
 Decide whether the pattern is broadly appropriate
for the level of the trainee and the posts they have
done so far. For example, if they’re in Paeds, there
should be a number of entries in the curriculum
heading ‘8 Care of Children and Young People’
 You will be able to see the number of linked log entries, and if you click on each
curriculum heading, you will be able to see the evidence that contributes to it.
Sample some of these to determine whether the entries have been appropriately
tagged. If some of them appear irrelevant or the link very tenuous it is worth
discounting these.
 Are there any obvious lacunae in the curriculum, taking account of the posts they
have already covered?
 Even at ST3 stage there may be low numbers for certain headings like Learning
Disabilities and Genetics. But remember: the GP curriculum was designed for a
lifetime career in General Practice; it is too much to cover adequately in a GP
training programme. Dip into what is there: the few log entries present might be
comprehensive and deeply reflective enough to be considered acceptable. So, whilst
numbers may be low for some areas, what you’re assessing is whether ‘learning
cycles’ are happening.
 The DOPS
Under the Skills Log you will be able to look at the DOPS that have been completed – both
mandatory and optional ones. Remember: these only need to be completed by the final
review in ST3, but may be done at any time throughout the three years of training. Only the
DOPS that have actually been observed and independently assessed will appear in the blue
central column. Click on the entries within this blue column to see the actual ratings given by
the assessor. In the right-hand column, click the spyglass icon to reveal the trainee’s
comments. Here they may state that they have done a particular DOPS several times, either
within their specialty training envelope or prior to starting it, but these comments are
subjective and cannot be counted towards the trainee's observed DOPS.
What to do:
 Just check that they’re broadly on track.
 Make sure they are not getting other trainees to assess them (be clear about this)!
 Competence Areas
This section provides the available evidence for the 12 competence headings. The two
columns to the right will give you the number of linked entries from their learning log, and
the number from their assessment forms respectively.
What to do:
 First note which areas are sparse with evidence and need working on.
 In a similar way to the ‘Curriculum Coverage’, click on each competence heading and
sample from the learning log entries or WPBA forms that have contributed to the
evidence.
 Make a note of some specific pieces of evidence to quote when you come to doing
your rating of the trainee (point 10 below).
 The Trainee’s Self Rating (competence areas)
This section is quite interesting because it provides information on the trainee’s reflective
skills and level of insight, which may warrant further discussion if there is a large variance
from your own ratings.
The spyglass icon on the right of each competency enables you to compare current
comments and ratings with past ones.
Tips:
This will make your job of writing the ESR easier. Before the ES meeting, remind
the trainee to rate themselves. Ask them specifically to:
a) Pretend to be someone who doesn’t know them and judge themselves
according to the evidence (like COTs, CBDs and log entries) present
within their e-portfolio rather than their own personal opinion.
b) Signpost this evidence in the ‘Evidence’ text box (and be specific).
c) Write in the ‘Actions’ text box anything they feel they need to do in the
future to strengthen the evidence for that competence.
What to do:
 Look at each competency and see if the ‘Evidence’ box comments marry with the
rating they have awarded themselves. If not, discuss (especially if they write
subjective comments rather than objectively by quoting the evidence).
 Read the ‘Actions’ box comments – reasonable suggestions?
 Compare their rating scores with yours – in fact, it might be better to do your (the
Educational Supervisor’s) rating scale first and to then compare it with theirs. By
doing yours first, you will not be influenced by what they’ve already written and that
helps retain greater objectivity. Discuss any variance.
 The Educational Supervisor’s Ratings (competence areas)
This is where you have to assess the trainee’s e-portfolio and rate them under the 12
professional competence headings. If you do not feel that the evidence is there, or that the
evidence displayed is not sufficiently robust for you to make a judgement, give the rating of
‘NFD – Below Expectations’ (e.g. for an ST3 in post 2) or ‘NFD – Meets Expectations’
(e.g. for an ST1 in post 1). The expected grade by the end of ST3 is ‘Competent for
Licensing’. The ‘Excellent’ grade is reserved for exceptional trainees and should not be
awarded willy-nilly.
What to do:
 In the ‘Evidence’ box, specify the evidence that justifies your rating.
- For example: ‘The last 5 COTs and 4 CBDs show competence in practising
holistically’.
- Sometimes it can be helpful to signpost even more specific evidence: ‘The last CBD
case was about xxx and the trainee did yyy which indicates practising holistically’.
 In the ‘Actions’ box, specify what the trainee needs to do (in terms of evidence) to
better their score.
 If you haven’t already done so, compare your ratings with the trainee’s self-rating.
Discuss any variance and summarise it in the ESR.
Filling out the online ESR form
As you read through the evidence, it’s a good idea to make notes at the same time. There’s a
lot to read, and it is difficult to keep it all in your head by the time you come to doing the
ESR (especially the bit where you have to rate the trainee according to the evidence). Here
are two suggestions:
1. Write things down on a piece of paper OR
2. Load a second e-portfolio browser window and enter your comments directly into the
ESR.
Click the save button at any point to save your comments, log out and return to it later.
DO NOT click the submit button (which locks and finalises things).
To access the online ESR form, you have to click ‘Create Review’. Once you’ve done that
and filled in the first set of boxes, you’ll see a horizontal navigation menu with the following
sections:
Setup | Curriculum Coverage | Skills Log | Review of PDP | Competence Areas – Trainee |
Competence Areas – Ed Sup | Finish Review
Simply go through each section and fill in the remaining boxes with the trainee (to minimise
the risk of shock)! Let’s look at some of the other boxes we’ve not mentioned so far –
namely those in the ‘Finish Review’ section.
The ‘Finish Review’ section
Quality of Evidence Presented
Make a comment on the quality of
 Log entries – level of reflection, types of learning
activities
 CBDs – complexity and range of cases
 COTs – complexity of cases and their context (e.g.
palliative care, children, older adults)
Recommendation
It is worth postponing this bit if you want the trainee to add more stuff first – for instance, if
they are missing a mandatory assessment (in which case hit the save button). Only click the
submit button after they’ve done it and you’re ready to ‘finalise’.
The recommendation of the Educational Supervisor can be one of four choices:
1. Satisfactory – All of the evidence is complete, and there are no concerns. You
believe the trainee is competent in all 12 professional domains and is ready for
independent practice as a GP.
2. Unsatisfactory – You are likely to make this recommendation either because the
portfolio is severely lacking, or there is evidence to suggest that the trainee’s progress
is unsatisfactory, or there is some other cause for concern. Let the Training
Programme Directors know early on.
3. Panel Opinion Requested – The portfolio may be incomplete in some way, or you
may be in doubt as to whether you can pass it as satisfactory for some other reason.
Remember, by choosing this option you are NOT failing the trainee – you’re merely
asking for a second opinion (explain it to trainees that way). Again, let the Training
Programme Directors know early on.
4. Out Of Post – The trainee has taken a break from specialty training, for example
Maternity Leave or ‘Out of Programme Experience’.
The ‘Comments/Concerns’ box below the ‘Recommendation’ field
Comment on things not covered elsewhere:
 The Learning Log – things not covered elsewhere
 CBDs/COTs/mini-CEXs – all present? Any comments (positive or negative)?
 MSF/PSQ – brief summary of positive/negative themes
 Naturally occurring evidence
 CSR – positive/negative themes
 ‘Educators’ Notes’ – themes
The ‘Agreed Learning Plan’
Give feedback on areas for development and try to be specific:
 What is lacking in the trainee’s e-portfolio
 What they should be doing to improve their e-portfolio (content & process)
 How they can maximise their learning opportunities
 How they can address any other concerns or deficiencies.
If the trainee is at their final review in ST3:
 Check that they have passed both the AKT and CSA – it is worth commenting if they
have passed very well or failed very badly.
 Make sure that they hold a valid CPR and an AED certificate.
 Look at their OOH competency handbook or individual OOH sheets and check that
they have met the required number of hours (at the time of writing: 36 hours per each
6 months in General Practice but check current national and local guidance).
Which button do I press?
The save button keeps the ESR in an editable format. The submit button starts the
‘finalisation’ process – it sends the report to the trainee, who will then be able to see it
and accept it. Once accepted, it is permanently locked and non-editable (‘finalised’). ARCP
panel members can only view finalised (complete and submitted) reports.
How Many Assessments and When? Remind me.
Please remember that minimum number means exactly that – the minimum. You should be
encouraging trainees to achieve much more. For those doing 4 month rotations, a similar
chart to the one below is available on our website.
Trainees doing 6-month rotations:
ST1 post 1: CBD x3, mini-CEX x3, MSF x1
ST1 post 2: CBD x3, mini-CEX x3, MSF x1
(If either ST1 post is in GP: replace the mini-CEXs with COTs, and do at least one PSQ.)
ST2 post 1: CBD x3, mini-CEX x3
ST2 post 2: CBD x3, mini-CEX x3
(If either ST2 post is in GP: replace the mini-CEXs with COTs.)
ST3 post 1: CBD x6, COT x6, MSF x1
ST3 post 2: CBD x6, COT x6, MSF x1
(Plus one PSQ at some stage during ST3.)
Quality Assuring the ESR
Finally, it is worth remembering that your report will be subject to scrutiny by the Deanery
and RCGP: at least 10% of the ESRs that have been deemed satisfactory will be reviewed,
against a set of assessment criteria, for quality assurance purposes. The most common
feedback comments from the panels for the less-than-acceptable ESRs are listed below:
 Evidence quoted but does not justify the competency rating (‘excellent progress’ does
not equate to a competency rating of Excellent).
 No links to evidence; no competencies linked to log entries.
 Competency ratings show minimal comments with no linkage to evidence; other
comments in the ESR are also minimalistic.
 No links to evidence to justify the rating.
 No comment under ‘Making a diagnosis’, and more than 2 months before the end of
the review.
 Not evidence based; poor linkage to competencies; no specific development points
raised.
 Linkage to competences poor; no comment on the quality of the trainee’s log.
 No specific linkage to evidence.