Best value policing: making preparations

Police Research Series
Paper 116
Best value policing:
making preparations
Adrian Leigh
Gary Mundy
Rachel Tuffin
Editor: Barry Webb
Home Office
Policing and Reducing Crime Unit
Research, Development and Statistics Directorate
Clive House, Petty France
London, SW1H 9HD
© Crown Copyright 1999
First Published 1999
Policing and Reducing Crime Unit: Police Research Series
The Policing and Reducing Crime (PRC) Unit was formed in 1998 as a result of
the merger of the Police Research Group (PRG) and the Research and Statistics
Directorate. The PRC Unit is now one part of the Research, Development and
Statistics Directorate of the Home Office. The PRC Unit carries out and
commissions research in the social and management sciences on policing and crime
reduction, broadening out the role that PRG played.
The PRC Unit has now combined PRG’s two main series into the Police Research
Series, continuing PRG’s earlier work. This will present research material on crime
prevention and detection as well as police management and organisation issues.
ISBN 1-84082-341-0
Copies of this publication can be made available in formats accessible to
the visually impaired on request.
Foreword
The statutory requirement for police authorities to ensure their forces are delivering
best value services represents a potential step change in the way policing is
delivered. From April 2000, police authorities and forces must review each policing
service at least every five years to examine whether it should be provided in the
first place, consult widely on its delivery, compare its performance with the best
and consider other competitive ways of providing it.
This new framework for service delivery and performance management, to ensure
continuous improvements in quality, effectiveness and efficiency, will affect every
member of each police authority and police force. It is vital that senior police force
staff, in particular, are aware of what is required of them so that they can work with
police authority members to drive forward changes likely to have a significant
managerial and cultural impact in the service.
This research provides the first authoritative overview of how the police service has
been preparing for best value to date. It finds that there is a patchwork of ongoing
preparations across English and Welsh policing and highlights particular issues
where early lessons have been learnt.
To date, there has been relatively little published information about how best to
prepare for best value and none specifically related to policing. This research aims
to help fill that gap. The study should prove an invaluable companion to
authorities and forces as they prepare for best value. Future Home Office briefing
information, to be issued over the coming months, will take these findings and
further research into account.
Gloria Laycock
Policing and Reducing Crime Unit
Research, Development and Statistics Directorate
Home Office
August 1999
Acknowledgements
The authors would like to thank all the police authorities and forces in England,
Wales and Scotland who contributed to this work. They would also like to thank
the Association of Chief Police Officers (ACPO), the Audit Commission, Her
Majesty’s Inspectorate of Constabulary (HMIC), the Association of Police
Authorities (APA) and Warwick Business School for their support and comments
during the research.
The authors would also like to thank Paul Quinton and Angela Strathern in PRC
for their contribution to the interviews and data analysis.
The Authors
Adrian Leigh, Gary Mundy and Rachel Tuffin are researchers in the Home Office
Policing and Reducing Crime Unit.
PRC Unit would like to thank Professor Tim Newburn at Goldsmiths College, for
acting as independent assessor for this report.
Executive summary
Background
Best value is a statutory duty, from 1 April 2000, on local authorities, fire and
police authorities that aims both to engender closer links between their work and
the priorities of local communities, and encourage continuous improvements in
local service delivery. The legislation requires authorities to review all their services
over five years, in each case ‘challenging’ whether the service is needed in the first
place, ‘consulting’ widely on its provision, ‘comparing’ it with other providers’
services, and examining ‘competitive’ alternatives (the ‘4Cs’). Authorities must
publish the review findings and their planned improvement actions, with measures
and targets, in an annual Performance Plan.
The legislation and guidance deliberately do not prescribe what mechanisms and
techniques are needed to deliver best value. Instead, along with other public
service providers, police authorities and forces must decide how best to establish
their strengths and weaknesses; how to benchmark; what consultation methods to
use; how to produce and carry forward Action Plans, and so on.
In early 1999, the Home Office Working Group on Police Performance and Best
Value1 agreed to develop briefing information on best value policing. Recognising
the need for this to draw on practical experiences in the three formal best value
policing pilots and preparatory work elsewhere in the service, the Working Group
asked the Policing and Reducing Crime (PRC) Unit to explore:
● what preparations police authorities and forces in England and Wales were making;
● what models and techniques were being considered by police authorities and forces;
● the relationships developing between police authorities and forces; and
● what early lessons were emerging from the experiences of the authorities and forces most advanced in their preparations.
To address these aims, the research team carried out work in three phases:
● a telephone survey of all 43 forces in England and Wales during March and April 1999;
● the subsequent construction and analysis of a database of best value developments; and
● semi-structured face-to-face interviews with key personnel in police authorities and forces in nine areas during June and July 1999. Visited force areas were: Greater Manchester; Cleveland; South Wales; Dyfed-Powys; West Yorkshire; Derbyshire; Lincolnshire; Hertfordshire; and Northamptonshire.
1. The Working Group includes representatives from ACPO, the APA, HMIC, HM Treasury, the Audit Commission and the Home Office.
One immediate finding was that all authorities and forces were in the midst of
preparations, but were at differing stages. It was clear, however, that developments
were moving at a very fast pace and readers need to bear in mind, therefore, that
the research reflects a snapshot of how authorities and forces were approaching best
value in the first half of 1999. Furthermore, the findings do not suggest definitive
answers – instead the key issues authorities and forces said they faced whilst
preparing for best value are drawn out, along with information on how they had
tackled these issues.
Key findings
All police authorities and forces in England and Wales were making progress in
their preparations for best value, having concluded that they could not wait for
guidance or findings from the pilots. It was also clear that the picture was rapidly
changing: in April authorities and forces were in the early stages of considering
widely differing approaches to best value, but by July their preparedness and
approaches seemed to be broadly converging, as they started to pilot reviews and
encountered similar issues and difficulties.
Key areas where these patterns were emerging included:
● authorities and forces were generally aiming to build on existing structures, systems and cycles, rather than planning radical organisational changes in the first stages of best value;
● some were choosing to develop their systems before piloting reviews, whilst others were doing the reverse: the end results, however, were broadly similar in terms of programmes and approaches to review;
● perceptions of preparedness seemed to be directly related to forces’ experiences in best value, with those forces most advanced in their preparations more aware of how much work it involved;
● invariably, forces were combining various tools and models in a ‘toolkit’, so that each review was conducted according to the service’s circumstances;
● all forces were using the Business Excellence Model, though to different extents – usually as a self-assessment tool. Many were process mapping to identify activities;
● forces were taking a hybrid approach to service definition to avoid the disadvantages of just one approach – reviewing services by function or by taking a process-based approach (although the picture was complicated by how they combined these approaches with different review targets and levels);
● reviews were usually being prioritised according to a combination of services’ past performance, their budget size and the potential for savings to be made;
● increasingly, forces were forming central teams to oversee best value’s day-to-day management;
● where resources were limited, however, forces were also tending to let services self-assess, with support and guidance from the centre;
● authorities and forces were concluding that review programmes needed to be flexible to take account of changing circumstances;
● early pilot reviews were taking longer than expected, but forces had concluded that time-scales would shorten as they became more used to them;
● there was increasing evidence of forces and local authorities combining their consultation and even starting to discuss joint reviews; and
● there were signs, too, that police authorities and forces were developing a closer working relationship – in some areas police authority members were participating in regular seminars and in others, taking part in best value steering groups and in reviews.
Alongside these developments, some common concerns and difficulties were also
emerging, along with some illuminating responses to them:
● Forces were starting to experience the cultural implications of best value. Some
staff – particularly in support services – were feeling threatened by the
‘challenge’ and ‘compete’ elements of reviews, whilst service heads were
occasionally reluctant to help reviews. Many forces were therefore developing
force-wide communication strategies to ‘market’ best value thereby encouraging
staff and staff association participation. Some forces were also mixing ‘marketing
material’ with their public consultation to explain policing services and manage
the potential for raised public expectations.
● Authorities and forces were finding that their other planning cycles did not
match with their best value programme. Some, however, were addressing this by
varying the length of their review programme to match cycles, such as that for
Crime and Disorder.
● Some police authorities were finding that best value was remaining the
preserve of a minority of members – they were therefore finding ways to
communicate developments across the authority. To counter the danger that
best value might lead to friction between authority and force, some had agreed
their respective roles and others had joint steering groups.
● Some forces had found they could learn little from the local authorities in their
area – so they were examining developments further afield. Others had tried to
develop joint reviews, but been frustrated by differences in timetables and
agendas – they had identified the need for earlier planning in the future.
● All of the tools for best value had their potential disadvantages (in terms of
complexity, relevance or comparability), but forces were accepting that these
models alone were no more than aids, and that some of their faults could be
overcome by combining or adapting them.
● In a number of reviews, forces had found that the data they needed were
unavailable. Some were reacting to this by requiring all services to self-assess
regularly, so that the information would automatically be available once they
were due for review.
● Forces were finding it difficult to benchmark because it was often hard to
identify leading service providers and data were seldom comparable.
Accurate benchmarking would also be assisted by activity costing, but this
was still in development and comparisons were made harder by differing
techniques, definitions and conventions. In particular, however, it was
proving difficult to compare police services with the non-police sector.
Despite this, there were many examples of effective benchmarking between
forces, and involving non-police organisations, that had led to savings.
Forces consistently pressed for the establishment of a national database or
website to help them communicate developments, learn lessons from each
other and benchmark more easily.
● It was sometimes proving difficult to consult effectively on particular services
(because of the nature of the service or reluctance on the part of the user or
stakeholder). Nevertheless, forces were developing some very innovative and
interactive techniques that could have wider lessons for improving consultation
more generally.
● Forces were finding it hard to envisage applying the ‘compete’ element of the
‘4Cs’ to services other than support, because of the nature of policing. However,
there was a growing appreciation that ‘contracting-out’ was not the only option
and evidence that forces were examining alternative arrangements – such as
‘comparative advantage’ agreements and consortia.
Future issues
There were also some issues identified as key to the future success of best value
policing. In particular, police authorities and forces will need to consider how
they will:
● manage and resource the rising tide of work that will result from best value as it rolls forward;
● manage and monitor Action Plans;
● monitor their best value approach as a whole and what they will do if it seems to be failing;
● meet the requirements of the audit and inspection regime, once it becomes clear; and
● monitor and learn from other approaches – formal pilots, other forces, local and fire authorities.
Conclusions
These findings represent only the more commonly described features of current
preparations for best value: it would be impossible to describe in one report all the
topics raised and preparations underway. Best value will not come into effect until
April 2000 and it will, therefore, be some time before it is clear which approaches,
in what circumstances, are most likely to be successful. Throughout the research
and even as this report was published, more issues and potential lessons were
emerging. Future research is planned to follow up developments but, for now, this
report can only raise issues and, sometimes, potential solutions: it cannot supply all
the questions, let alone the answers.
Ultimately, police authorities and forces must bear in mind that they will be
judged not on their mechanisms and review approaches, but on whether they are
self-reviewing rigorously and improving on the basis of the findings of those
reviews. In short, authorities and forces need to deliver best value policing – the
ways by which they do so can differ, but the result must be the same: continuous
improvements in service.
Annex: Issues raised by authorities and forces
Throughout the study, authorities and forces identified issues they had already
addressed or would need to address in their preparations: the list below summarises
these as questions. This list draws on experiences to date; it is not a definitive
checklist. In many cases, however, authorities and forces explained how they had
addressed or would address these questions and these examples are described in the
report – not as ‘good practice’, but as possible responses.
Understanding best value and building on existing practices
Are key players aware of the main elements of best value and its timetable, and do they have an informed view of how well prepared they are? (Sections 1, 2)
Before April 2000, will the focus be on checking support systems or on piloting a review model? (Section 2)
How will existing planning cycles be co-ordinated with the best value cycle? (Sections 2, 6)
Should the best value review cycle mirror the three-year Crime and Disorder cycle? (Section 2)
What changes might need to be made to the existing service review system? (Section 2)
To what extent will they be able to draw on the results of existing consultation mechanisms? (Sections 2, 6)
Does further work need to be carried out to enable accurate activity costing? (Sections 2, 6)
Should every review be required to find 2% savings for the Efficiency Plan? (Section 2)
Should efficiency savings be reallocated within a reviewed service or spread across the force? (Section 2)

Managing the preparations
What management structure is required and who should be represented at each level? (Section 2)
Should the force have a best value team and who should carry out and lead service reviews? (Section 2)
In what ways can a supportive managerial culture be promoted? (Section 3)
What type of monitoring system is needed to assess and influence implementation of best value? (Sections 2, 3, 6)
In what ways can visible and credible commitment from the top be demonstrated? (Section 3)
To what extent should staff from associations and interest groups be involved in decision-making? (Section 3)
What techniques can be used to keep staff informed and/or market best value? (Section 3)
Would it be useful to include awareness of best value in appraisal and promotion processes? (Section 3)
Is there a need to allay the fears of support staff and, if so, how can this be done? (Sections 3, 6)
Should the centre have full access to detailed self-assessment findings, or just summaries? (Section 3)
During service reviews, how will staff within the service be kept informed of developments? (Sections 3, 5, 6)

Working together: police authorities and forces
How regularly will the police authority and the force meet to discuss best value? (Section 3)
How will roles and responsibilities be shared, for example should authority members/officers attend or chair the steering group for best value or take a practical role in reviews? (Section 3)
If the authority has a best value committee how will internal communication be ensured? (Section 3)
How will authority members and officers be trained/inducted in best value? (Section 3)
Should the authority review itself? (Section 3)
To what extent are authorities making use of the knowledge and experience of all their members? (Section 3)
How will the authority manage the potential tension between needing both to work closely with and to challenge the force? (Section 3)
In what ways will the authority cope with the growing volume of work? (Sections 3, 5)

Working with others
Is there a need to develop closer links with local authorities, or national networks? (Section 3)
Is there potential for money-saving arrangements with other agencies and forces? (Sections 3, 6)

Using tools and techniques
What are the advantages and disadvantages of the models and tools used, and how will their limitations be addressed? (Section 4)
Who will apply the tools – members of the reviewed service, a central team or ‘outsiders’? (Sections 3, 4, 6)
What might be needed in terms of communication, training, toolkits and guides? (Sections 3, 6)
How can forces ensure they do not get ‘bogged down’ and lose sight of the point of reviews? (Section 4)
Can the same tools be used for each review or should they be combined according to service type? (Section 4)
How will tools, or aspects of them, be combined? (Section 4)
Will ‘off the shelf’ tools be applied as they stand or will they need to be adapted? (Section 4)
How will forces avoid becoming reliant on one particular tool (e.g. the Business Excellence Model)? (Section 4)
To what extent will the tools be explained to police staff? (Sections 3, 4)

Corporate reviews; defining services and prioritising reviews; action plans
How frequently will corporate reviews be carried out? (Section 5)
From where will the data for the key elements of the review be drawn? (Sections 4, 5, 6)
Are consultation findings sufficiently relevant and up-to-date? (Sections 2, 5, 6)
Is there a need to develop a new policing strategy, and how will future policing issues be identified and incorporated? (Sections 2, 5)
How will the corporate review be linked to prioritisation of service reviews? (Section 5)
Which review approaches and levels should be adopted, and how could these be combined according to the target for review? (Section 5)
In what ways can significant overlaps between reviews, and the number of reviews overall, be kept to a minimum? (Section 5)
Who should decide priorities for reviews and on what factors, or models? (Section 5)
What resources, including time, will be needed and what contingency plans exist if they over-run? (Sections 3, 5, 6)
Can joint reviews be negotiated with other forces and local service providers? (Sections 3, 5, 6)
What resource implications will the accumulation of Action Plans have over time? (Sections 3, 5, 6)
What information should be included in the Performance Plan, how long will it take to produce and how will it be distributed? (Section 5)
How will Action Plan findings, objectives and measures be combined and summarised in the plan? (Sections 5, 6)

Reviewing services with the 4Cs
Is a standard template needed, or should the strategy be tailored for each review? (Sections 4, 5, 6)
What will the minimum standard for each review be and how can this be quality assured? (Section 6)
Are reliable data available or is there a need for all services to collect data well before reviews? (Sections 4, 6)
What cultural implications might there be, particularly from challenging and competing? (Sections 3, 6)
How should services be challenged and who should carry out the challenge? (Sections 3, 6)
Should forces compare/benchmark whole services, or elements of them? (Section 6)
Should forces benchmark against: neighbouring forces or those in the same region; service leaders; forces in the same family or all forces; organisations belonging to clubs, or on databases, if so which; and non-police organisations on an ad hoc service-by-service basis? (Section 6)
How will data incomparability and confidentiality of police information be addressed? (Section 6)
How can ‘consultation overload’ be avoided, can forces ‘piggyback’ on others’ consultation? (Sections 3, 6)
In what ways can raised public expectations be managed and the force approach marketed? (Section 6)
Are all users, stakeholders and providers adequately consulted; is there a need for innovation? (Section 6)
What options and combinations of competitive approaches are most appropriate? (Sections 3, 6)
Is there scope for procurement and resource consortia with other forces and agencies? (Sections 3, 6)
How can staff fears be addressed, and unequal application of competition be avoided? (Sections 3, 6)
Who should devise Action Plans, objectives and measures and in what format should they be? (Sections 3, 6)
How will ownership of Action Plans be ensured and who will monitor and manage their implementation? (Section 6)
Contents

Foreword  (iii)
Acknowledgements  (iv)
Executive summary  (v)
List of Tables  (xiv)
List of Figures  (xiv)
List of Boxes  (xv)

1. Introducing best value  1
   Best value and policing  2
   The research  3
   The structure of the report  5

2. Best value policing – what’s new?  7
   The main elements of best value policing  7
   Perceptions and pilots  7
   Best value policing – what’s new?  8

3. Preparing for and managing best value policing  14
   Best value: organisation and resources  14
   Management, culture and communication  16
   Police authority and force relations  18
   Working with other authorities  20

4. Tools, models and standards for best value policing  22
   Tools for best value policing  22
   The Business Excellence Model (BEM)  23
   Process mapping and Process Expert (PE)  26
   The Balanced Scorecard (BSC)  28
   Investors in People (IIP) and Charter Mark  30

5. Defining services and running review programmes  32
   Corporate review  32
   Defining services  35
   Review prioritisation  39
   Constructing review programmes  42
   Producing Performance Plans  44

6. Service reviews and the ‘4Cs’  45
   Review phases and the ‘4Cs’  45
   Challenging  47
   Comparing  48
   Consulting  52
   Competing  54
   Review outcomes: objectives and Action Plans  56

7. Overview and the future  58

References  62

Appendix 1: The Business Excellence Model  64
Appendix 2: Links between the Business Excellence Model and other initiatives  66
Appendix 3: Forces currently piloting national Process Improvement Projects  68
Appendix 4: Case Study 1: Best Value in Greater Manchester Police  70
Appendix 5: Case Study 2: Best Value in Derbyshire Police  75

List of tables
1. Best value policing pilots in England and Wales  9
2. Police planning and reporting cycles  11
3. Responsibility for service reviews  16
4. Force perceptions of possible roles for police authority members or officers  19
5. The most frequently cited tools and standards for best value policing  22
6. Force perceptions of proposed methods for producing corporate reviews  33
7. Approaches to service review  36
8. The advantages and disadvantages of different approaches to service reviews  37
9. Factors influencing review prioritisation  40
10. Likely review targets  41

List of figures
1. The timing of best value’s key elements (1999 onwards)  8
2. Force management structures and roles for best value  14
3. The Business Excellence Model  64
4. Links between BEM and other initiatives  66
5. GMP’s Review Programme (1999-2005)  73

List of boxes
1. Best value defined  1
2. The ‘Four Cs’  7
3. Cleveland Police  10
4. BEM and South Wales Police  24
5. FAST and Lincolnshire Police  28
6. The Balanced Scorecard in Dumfries and Galloway Constabulary  30
7. What is a corporate review?  32
8. Corporate review in West Yorkshire  34
9. Force Command and corporate review  35
10. Guidance on review prioritisation  39
11. Review prioritisation in Leicestershire Constabulary  42
12. A possible service review  46
13. Challenging questions  48
14. Competition and policing  54
15. Extracts from a GMP Performance Improvement Plan (PIP)  72
16. The seven-stage review process in Derbyshire Constabulary  77
INTRODUCTION
1. Introducing best value
Following its election in May 1997, the new government signalled its intention to
modernise local government by bringing councils and their communities closer
together and encouraging continuous improvements in local service delivery. A key
element of this Modernisation Programme – best value – was outlined in a
consultative paper published by the Department for Environment, Transport and
the Regions in March 1998 (1998a). Four months later, the DETR published its
White Paper, Modern Local Government: In Touch with the People (1998b)2,
describing in more detail the aims of best value and its principal features (Box 1).
2. In Wales, a consultative paper was published in April 1998 and the White Paper, Local Voices: Modernising Local Government in Wales, was published in July 1998.
3. CCT is a legal requirement for certain public services to be opened to market-testing, tendering and possible contracting-out if external providers appear likely to be cheaper than in-house teams.
Box 1: Best value defined
“Best value will be a duty to deliver services to clear standards – covering both cost and quality
– by the most effective, economic and efficient means available…Local authorities will set those
standards – covering both cost and quality – for all the services for which they are responsible…
Under best value local people will be clear about the standards of services which they can
expect to receive, and better able to hold their councils to account for their record in meeting
them.”
DETR (1998b)
To begin with, authorities often interpreted best value primarily as a replacement
for Compulsory Competitive Tendering (CCT), due to be repealed on 2 January
2000.3 It soon became clear, however, with the publication of the White Paper and,
more recently, the passing of the Local Government Act 1999, which gives best
value its legal basis, that the new regime will have a far greater impact than did
CCT. To summarise its key features, best value:
● is a statutory duty, from 1 April 2000, on authorities to develop a five-year programme of service reviews, and to summarise the findings and resulting Action Plans in a published annual Performance Plan;
● applies not only to local authorities but also fire and police authorities, and covers all their services – not just those above CCT thresholds;
● defines the key elements for each review to ensure that each service is needed in the first place, has been widely consulted on, compared with other providers’ services, and subject to competitive consideration;
● requires that reviews go beyond CCT’s role to ensure economy (which tended to favour lowest cost providers), by promoting efficiency (i.e. maximising outputs to inputs), requiring the effective use of resources (i.e. properly allocated resources with outcomes meeting objectives) and quality (i.e. high standards that meet user requirements);
● builds on the movement towards multi-agency partnerships, by encouraging authorities to identify and tackle cross-cutting themes (‘wicked issues’) in partnership with other authorities, businesses, education, health, social services, probation, etc;
● is part of a performance management framework, which includes national performance indicators and standards against which best value authorities will be measured, and requires authorities to set targets to at least match the performance of the top quartile;
● subjects service providers’ reviews and programmes to regular in-depth audit and inspection to ensure compliance and validate outcomes; and
● enables the Secretary of State to intervene wherever there is clear evidence that an authority is failing to meet its statutory obligations.
For the public services, then, some of which have been quite sheltered from the
rigours of competition and benchmarking, best value will be a significant challenge
– moving beyond the regime where they have been exhorted to improve
performance to one where they are required to.
Best value and policing
The inclusion of police authorities as ‘best value authorities’ means that the police
service will be subject to the same regime as local authorities. Because the best
value duty falls on police authorities, they will be held responsible if their force fails
to meet the requirements. The tripartite structure of policing, however, means that
chief officers and their staff will play a key role in delivering best value on a day-to-day basis. Police authorities and forces need, therefore, to work closely together to
develop an integrated approach for the delivery of best value policing.
For the police, best value is a logical progression from the following recent
developments, which have underlined the need for forces to demonstrate their
efficient and effective use of resources:
● HMIC Thematic Review ‘What Price Policing’ (HMIC, 1998), which highlighted the need for forces to improve progress in costing services and their ability to recognise, deliver and demonstrate value for money;
● outcome of the 1998 Comprehensive Spending Review, highlighting the need for savings in areas such as sickness, ill-health retirements, asset management and procurement;
● development in 1998 of overarching aims and objectives for policing (including Guiding Principles on efficiency), which will lead to a new national suite of performance indicators (PIs) underpinning best value; and
● requirement for police authorities to find 2% year on year efficiency savings from April 1999 and to include planned efficiency improvements in their Policing Plans.
These last two developments specifically trailed the introduction of best value.
Efficiency Plans dovetail with best value’s Performance Plans, whilst the new PIs
include the delivery of efficiency savings and encourage the application of best
value through the new suite of nationally comparable indicators of performance
and quality.
Best value, then, is not a new concept for the police, who already have experience
of service reviews and performance management regimes. Instead, it builds on
existing practices and recent developments in policing – fixing them within a more
rigorous national framework. The potential value of existing structures is
considered in Section 2.
Although best value’s key elements have been known for some time, the
mechanisms and techniques underpinning them are deliberately not prescribed
by the
legislation or related guidance. Instead, along with other public service providers,
police authorities and forces have to decide how best to establish their strengths
and weaknesses; how to identify their services and conduct service reviews; how to
benchmark; what consultation methods to employ; how to produce and carry
forward Action Plans, and so on. Authorities and forces also need to establish their
individual and shared roles in taking forward best value.
The research
In early 1999, the Home Office Working Group on Police Performance and Best Value4 agreed to develop best value policing guidance, but recognised the need for
this to draw on practical experiences. Two English police forces were already
piloting best value (along with forty local authorities) for independent evaluation
by Warwick Business School on behalf of DETR, and one Welsh force was having
its pilot evaluated by Cardiff Business School, on behalf of the Welsh Office5. The
Working Group, however, had heard anecdotal evidence of other pilots and wanted
to be able to draw on all these experiences before preparing guidance. They
therefore asked the Policing and Reducing Crime (PRC) Unit to research:
4 The Working Group includes representatives from ACPO, the Association of Police Authorities, HMIC, the Treasury, the Audit Commission and the Home Office.
5 These forces are: Cleveland Police, Greater Manchester Police and South Wales Police.
● what preparations police authorities and forces in England and Wales were making;
● what models and techniques were being considered by police authorities and forces;
● the relationships developing between police authorities and forces; and
● what early lessons were emerging from the experiences of the authorities and forces most advanced in their preparations.
Methodology
The research comprised three phases:
● A telephone survey of all 43 forces in England and Wales during March and April 1999.
● The subsequent construction and analysis of a database of best value developments.
● Semi-structured face-to-face interviews with key personnel in police authorities and forces in nine areas during June and July 1999. Visited force areas were: Greater Manchester; Cleveland; South Wales; Dyfed-Powys; West Yorkshire; Derbyshire; Lincolnshire; Hertfordshire; and Northamptonshire.
The sample forces for the visits were chosen because they appeared to demonstrate
a range of approaches and experiences. We also maintained contact with Warwick
Business School, to draw on lessons emerging from the local authority pilots.
Finally, although the research and its findings were confined to England and Wales,
we also visited Scotland to discover the differences in their approach and identify
potential lessons. The best value regime for Welsh forces is broadly similar to that
for English forces (the only differences relating to terminology), although the need
for the Secretary of State to consult the Welsh Assembly on key decisions may
have an impact in the future. In Scotland, however, there has been no best value
legislation – instead, local authorities have been taking forward best value since the
Scottish Office requested them to develop plans in 1997. The Scottish police were
asked, in 1998, to ‘catch up’ with local authorities by producing their first Performance Plans in March 1999, outlining their understanding of best value, how they would identify services for review, and which services would be reviewed first. In
practice, given the extent of preparations south of the border, we found that
Scottish forces were not necessarily ‘ahead’, but were facing issues very similar to
those for the English and Welsh police.
Some caveats
Best value, and how the police are preparing, is a fast-moving area of interest.
Readers need to bear in mind that this report reflects only a snapshot of how
authorities and forces were approaching best value in the first half of 1999: future
research will be required to update this picture and identify new developments and
lessons.
Given the pragmatic approach to the mechanics of best value, our findings are
descriptive and do not suggest definitive answers. Although we identify possible
issues for consideration and approaches the police service might adopt, we can only
rely on what respondents told us: others might disagree with their comments and,
therefore, our findings. There is, however, no uniform solution or approach to best
value: what works for one police authority or force may not for another.
In particular, we have aimed to make our report as practical as possible, by relating
to actual examples and reporting case studies. We are not suggesting, however, that
these examples are necessarily ‘good practice’ in best value, nor is this report a guide on how to deliver best value policing. Although this report raises many
questions for authorities and forces, and identifies some of the ways in which they
have been tackled, it does not provide the solutions. The Working Group will,
however, issue briefings from mid-1999 that will address many of the issues raised
during the research.
Finally, best value will not come into effect until April 2000. It is, therefore, impossible at this stage to determine which approaches in which circumstances are
most likely to be successful. It may be years before such lessons emerge. We can,
however, report early lessons identified by forces preparing for best value and, in
some cases, piloting its main elements.
The structure of the report
Section 2 of this report considers the main elements of best value and how they
might relate to existing practices. Section 3 highlights the key managerial issues
facing authorities and forces as they prepare for best value, whilst Section 4 takes a
detailed look at the main tools and models being adopted by forces across England,
Wales and Scotland and identifies their advantages and disadvantages. Section 5
considers how the police have been defining services and drafting their programmes
for review, whilst Section 6 examines the approach to service reviews. Section 7
comprises a summary of the key findings and conclusions from the research and
highlights the issues that are likely to be crucial to the police between now and
April 2000.
The Appendices also include full case studies of preparations in two force areas for
best value and some of the key lessons that they had learned. As with all the
examples in this report, these case studies are included not as demonstrations of
‘good practice’, but illustrations of how parts of the police service have been
approaching the requirements.
2. Best value policing – what’s new?
The main elements of best value policing
To begin with, police authorities and forces need to be familiar with the main
elements of best value policing and its timetable. The key to best value is an
ongoing review process, which aims to encourage a culture of continuous
improvement, and, in practice, means police authorities must:
● undertake a corporate review, setting out a strategy for local policing and examining performance, to help identify services for review and develop a five-year review programme;
● subject all policing services to a rolling series of Fundamental Performance Reviews6, ensuring each is examined according to the ‘4Cs’ (see Box 2); and
● produce annual Performance Plans setting out force achievements and reporting plans, targets, priorities for, and outcomes of, service reviews.
Box 2: The ‘Four Cs’
“In practice, the reviews will:
● challenge why and how a service is being provided;
● invite comparison with others’ performance across a range of relevant indicators, taking into account the views of both service users and potential suppliers;
● consult with local taxpayers, service users and the wider business community in the setting of new performance targets; and
● embrace fair competition as a means of securing efficient and effective services.”
DETR (1998b)
The product of each review will be an Action Plan, describing the findings,
explaining how the service will drive up its performance and setting measures and
targets for the future. This ongoing cycle of review and improvement will follow a
tight timetable (see Figure 1).
Perceptions and pilots
Respondents to our telephone survey often perceived that their force would meet
best value’s requirements without too much difficulty. Every interviewee thought
they already had at least some elements in place and nearly half believed they only
needed to pull together existing ‘fragmented’ elements. One interviewee
commented: “Best value is a rebranding of what forces have done before…it’s nothing
new”. Experience of piloting best value, however, seemed to bring a realisation of
the extent of the work required. Interviewees from forces piloting reviews tended to
consider they had more preparational work to do than did those from forces that
6 Service reviews are officially termed Fundamental Performance Reviews (FPRs) but, for simplicity, are referred to throughout this report as either “service reviews” or “reviews”.
Figure 1: The timing of best value’s key elements (1999 onwards)

July 1999 – March 2000: Undertake corporate review; define services for review; devise programme of service reviews; possibly pilot reviews/test tools and models.

Dec 1999 – March 2000: Police authority and force draft first best value Performance Plan.

By 31 March 2000: Publish first Performance Plan (within Annual Policing Plan) which:
● sets out the programme and timetable of reviews;
● includes assessment of past performance and comparisons with other forces;
● summarises any pilot reviews and Action Plans; and
● sets measurable targets for years ahead.
(Plan must be externally audited by 30 June.)

April 2000 – March 2001: Take forward Action Plans; review services identified as targets for first year, applying the ‘4Cs’; draft second Performance Plan.

By 31 March 2001: Publish second Performance Plan which:
● includes assessment of previous year’s performance and comparisons with other forces;
● re-examines the review programme;
● examines the outcomes of past Action Plans and considers the need for further action;
● includes Action Plans from the latest service reviews; and
● sets measurable targets for years ahead.

All of these stages take place within the ongoing audit and inspection regime.
Sections 4 to 6 consider the key stages of this timetable in greater detail.
had not started. The implication – and a message frequently repeated by forces
advanced in preparations – was that forces should not underestimate what was
needed. With nearly half of all forces planning to pilot best value before 2000
(Table 1), it is possible that their perceptions may have since changed.
Best value policing – what’s new?
Linked to forces’ perceptions that they were partly prepared for best value was the
recurring view that it represented a framework into which they could ‘slot’ existing
practices – in particular, their current:
● planning cycles;
● review systems;
● consultation techniques; and
● mechanisms and plans for costing policing.
Table 1: Best value policing pilots in England and Wales

Extent of forces’ piloting (May 1999)                                          Forces
No pilot and no intention to pilot                                               22
Currently piloting best value review of a service or services                    12
Aiming to pilot a service review in next 6-10 months                              4
Trialling specific tools for best value (e.g. Business Excellence Model)          3
Undertaking a review to pilot a specific element of the ‘4Cs’ (e.g. compete)      2
These issues are considered in more detail below. Whilst some respondents
suggested that their practices needed little adjustment, others felt that these
‘building blocks’ required significant development work before best value could be
effectively implemented. One force, an official pilot, had concluded that the key to
preparing for best value was to develop the necessary structures and improve
existing practices, before piloting reviews and developing a programme (Box 3).
Box 3: Cleveland Police
The pilot included a range of projects to help roll out best value across the force. Key
areas addressed were:
● activity based costing, with particular focus on the use of mobile and personal data terminals for operational officers;
● mapping and management of processes;
● benchmarking;
● partnership working and problem solving, with shared responsibility and accountability;
● devolved Resource Management;
● development of an IT solution to unite existing data systems and provide performance management information for any level and identified grouping within the organisation;
● performance management;
● strategic and financial planning; and
● internal and external consultation and feedback.
Some projects focused on the delivery of operational policing at District level, whilst others
served to develop organisational support structures for operational policing. All had since
developed in scope and become interlinked. Drawing on these experiences, Cleveland had
recently developed its five-year programme of reviews and its review model.
Review systems
Forces had been reviewing their services for many years, although most admitted
that this had been informal and ad-hoc, with reviews differing in scope and depth
and often taking place only when a service appeared to have weaknesses, or had
been the subject of an HMIC or auditor’s report. These reviews had tended to
concentrate on support services, where it seemed there was greater scope to
examine activities and find savings. Most force representatives also considered that,
although they had experience of comparing and consulting for reviews, they were
weaker in terms of challenging why services were provided and considering
competitive alternatives.
Interviewees, therefore, agreed that best value would impact on review systems,
citing ways in which they thought best value would bring order to their force’s
regime, including:
● requiring them to link reviews to force objectives;
● forcing them to review all their services;
● prompting them to construct long-term programmes of reviews;
● providing a common structure for reviews;
● moving them away from concentrating on ‘cost-cutting’ and toward service improvement; and
● requiring them to publish their findings along with action plans and outcomes.
For some force representatives, however, this meant more change than for others: a
few interviewees, for instance, explained that their force had developed their own
force-wide review programmes a few years earlier and planned simply to adapt these
for best value.
Planning and reporting cycles
Best value introduces a new cycle to policing, to be placed alongside many others
(Table 2). The outcomes of these cycles will help authorities and forces manage
best value and provide information relevant to service reviews. Some interviewees,
however, felt that the picture was becoming too complex because cycles differed in
terms of:
● frequency and duration: most cycles were yearly, but the difference between three-yearly Crime and Disorder Strategies and five-yearly review programmes raised concerns;
● length of planning cycle: some cycles required significant preparation beforehand (e.g. budgets), whereas others only started a few months before publication;
● responsibility: the mixture of responsibilities between forces, police authorities, local authorities and other agencies required extensive communication networks; and
● level: many plans were also broken down to divisional level or to local authority level (with boundaries not always coterminous).
Table 2: Police planning and reporting cycles

Plan                          Frequency                                                  Responsibility
Force/Corporate Strategy      Either every one, three or five years (may be reviewed yearly)   Police Authority and Force
Annual Reports                Yearly                                                     Police Authority and Force
Local/Annual Policing Plan    Yearly                                                     Police Authority
Budget                        Yearly                                                     Police Authority and Force
Efficiency Plan               Yearly                                                     Police Authority
Youth Justice Plan            Yearly                                                     Local authorities with others (inc. police)
Crime and Disorder Audit      Every three years                                          Local authorities with others (inc. police)
Crime and Disorder Strategy   Every three years                                          Local authorities with others (inc. police)
Best Value Programme          Every five years (may be reviewed more frequently)         Police Authority
Best Value Performance Plan   Yearly                                                     Police Authority
Police authorities and forces will need to demonstrate in their corporate reviews
how they will be co-ordinating their cycles so that they are related and mutually
supportive. No guidance on this had been issued by the time of this report,
however, leaving interviewees unclear how cycles related to one another and
concerned they might clash. One force, for instance, had made heavy use of the
outcomes of its crime and disorder consultation when reviewing a division, but
acknowledged that these data would lose their value with age. Some forces were
already reacting: Greater Manchester Police had decided to adopt three-year cycles
for divisional reviews, so that they could align them with the timetable for their
next Crime and Disorder Strategies.
Authorities and forces also needed to be considering the link between Efficiency
Plans and Performance Plans. Some had decided that they would require every
review to deliver savings of 2%, but others considered this endangered the principle
that best value should improve quality – arguing that some review findings might
recommend additional spending. One force representative described particular
problems encountered when their central best value team had decided that savings
from a review should contribute to the force-wide efficiency target and be reinvested elsewhere: the reviewed service had strongly opposed this, arguing that
they were entitled to keep the savings.
Consultation
Most force representatives in the telephone survey seemed confident that the
necessary public consultation mechanisms were largely in place, given the existence
of Police and Community Consultative Groups (PCCGs), surveys for national
performance indicators and consultation for the Crime and Disorder Audits. Many,
however, emphasised that their force would be drawing on additional existing
mechanisms, such as:
● focus groups;
● local business contacts;
● large scale surveys (often shared with local authorities); and
● meetings with voluntary sector organisations.
The extent to which these and other mechanisms might help forces meet best
value’s requirements is examined in Section 6.
Costed policing
The Police and Magistrates’ Courts Act 1994 (incorporated in the Police Act
1996) placed a requirement on the police to produce costed policing plans. In a
thematic inspection in 1998, HMIC reported slow progress in the costing of police
services and highlighted the increasing pressure to develop mechanisms to enable
forces to demonstrate the efficient and effective use of resources. The requirement
for efficiency savings and introduction of best value have added to this pressure –
driving forces to introduce activity-based costing (ABC) so that savings can be
found and services more accurately compared.
Nearly half the forces were piloting ABC. Most popular was Humberside’s model,
by which forces measure operational staff activities for two weeks a year, then link
them to one of three policing strategies (public reassurance, crime management and
traffic management), with the costs of delivering the strategies subsequently
calculated using a pay ready reckoner. Forces were then usually apportioning other
costs pro-rata and breaking down total costs by function and incident. A few forces,
such as Cleveland, were adopting more sophisticated systems, enabling them to
track activities electronically by giving officers hand-held devices and linking their
time keeping directly to payrolls.
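The Humberside-style calculation described above can be sketched as follows. This is an illustrative sketch only: all figures, the pay rate and the sampled hours are hypothetical examples, not actual force data.

```python
# Illustrative sketch of the Humberside-style activity-based costing model:
# sampled activity hours are extrapolated to a full year, costed via a pay
# ready reckoner, and non-pay costs apportioned pro-rata. All numbers are
# hypothetical.

# Hours recorded per strategy during the two-week activity sampling (assumed).
sampled_hours = {
    "public reassurance": 5200.0,
    "crime management": 7800.0,
    "traffic management": 2000.0,
}

HOURLY_PAY_RATE = 18.50            # 'pay ready reckoner' rate (assumed)
OTHER_ANNUAL_COSTS = 1_200_000.0   # premises, vehicles, etc., to apportion (assumed)
WEEKS_PER_YEAR = 52
SAMPLE_WEEKS = 2

total_hours = sum(sampled_hours.values())
scale = WEEKS_PER_YEAR / SAMPLE_WEEKS  # extrapolate the two-week sample to a year

strategy_costs = {}
for strategy, hours in sampled_hours.items():
    annual_hours = hours * scale
    pay_cost = annual_hours * HOURLY_PAY_RATE
    # Apportion other costs pro-rata to each strategy's share of recorded activity.
    overhead = OTHER_ANNUAL_COSTS * (hours / total_hours)
    strategy_costs[strategy] = pay_cost + overhead

for strategy, cost in sorted(strategy_costs.items()):
    print(f"{strategy}: £{cost:,.0f}")
```

The same breakdown could then be repeated by function and incident type, as the forces described above were doing.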
These efforts were starting to support best value pilot reviews – many of which
included attempts to cost activities. Despite the ongoing work, however, it was
clear that it would be some time before forces could accurately cost all services.
The differences between service definitions, costing conventions and local
circumstances prompted some interviewees to suggest that it would never be
possible to compare costed services between forces. This issue is returned to in
Section 6.
3. Preparing for and managing best value policing
The previous section described how authorities and forces might build on existing
practices for best value. As they make their preparations, however, they will also
need to consider what resource, organisational, managerial and cultural
preparations are needed and how they might work with others to deliver best value.
This section examines these issues.
Best value: organisation and resources
Managing best value
Two-thirds of forces had either a defined ‘best value team’ or group of people
closely associated with implementing best value but not called a ‘team’. The
majority of ‘teams’ were dedicated to best value, although some were dealing with
other issues too. Most commonly, staff were drawn from headquarters departments
such as corporate development, performance review and strategic planning. In a
handful of cases, however, there was just one ‘Best Value Officer’ with few resources
available to them: other forces were critical of this limited approach, regarding it as
unsustainable.
Force representatives described either a dual- or triple-level management structure
to oversee the implementation of best value (Figure 2). Three levels were typical,
although a small number of forces had only the first and third layer, or combined
the first and second layers into one ‘best value team’. All but two forces apparently
had regular ACPO-rank officer involvement in their best value preparations.
Figure 2: Force management structures and roles for best value

First layer: Best Value Team
Usually 4-8 people based at HQ and headed by Insp/Ch Insp or Supt. Three forces had a single ‘Best Value Officer’. Usually responsible for the practical work in delivering best value – undertaking reviews or overseeing and supporting reviews undertaken by reviewed services. Also invariably responsible for developing the force’s approach to best value and monitoring implementation.

Second layer: Best Value Board/Steering Group
Usually chaired by DCC or ACC. Invariably includes Heads of Finance, Support Services, Legal Advisors, Quality Development, Best Value Team, one or more divisional commanders, staff representatives, etc. Nine forces had Police Authority representatives on the board. Group usually oversees the force’s best value approach, discussing reports, examining review outcomes, agreeing actions and reporting to Chief Officers.

Third layer: Chief Officers
The regular meetings of ACPO-rank officers, also now considering recommendations of the Best Value Team and/or Steering Group. In many cases, the main link with the police authority.
It was too early for us to judge the merits or otherwise of these management
structures, although the inclusion on steering groups of staff representatives from
reviewed services was considered key by many. One force had found that this was
particularly important for support staff, who could not be deployed elsewhere in the
same way as police officers if posts were threatened by the outcome of a service
review. A number of forces also highlighted the importance of a participative
approach throughout all the stages of best value, to manage employee fears.
Police Federation, Superintendents’ Association and UNISON representatives were
frequently involved in steering groups but forces did not seem to be involving other
staff associations (such as women’s groups or representatives of staff from ethnic
minorities). One force voiced concerns that inputs from too many staff
representatives could slow down decision-making.
Force representatives with experience of piloting reviews also stressed the
importance of clear senior officer, and particularly chief officer, commitment and
willingness to take responsibility for difficult or unpopular decisions. In one force, it
seemed that a review had effectively run aground when senior officers, under
pressure from officers in the reviewed service, had not been prepared to carry
forward the unpopular recommendations.
Who undertakes reviews?7
Respondents – particularly from larger forces that had piloted reviews – considered
that central staff alone would not be able to carry out a whole programme of
reviews. Furthermore, staff in one force had reacted badly to a review team coming
in from elsewhere – perceiving them as ‘the enemy’ and failing to co-operate with
requests for information. On the other hand, some respondents were also
concerned about the potential for the ‘challenge’ element of reviews to be limited
if all the reviewers had a vested interest in the service continuing unchanged.
Reviews generally combined some form of assessment exercise – where comparative
and consultation-based data on each service was gathered and examined – with
other elements, such as workshops to ‘challenge’, and a formal examination of
competitive alternatives. Most forces were tending to conclude that roles in and
responsibility for these elements should be split and shared (Table 3). In many
cases, then, members of the reviewed service were conducting self-assessments, with
advice and support from the ‘best value team’, but other review elements involved
people from outside the service – either from the central team or from another
service.
7 See Appendices 4 and 5 for case studies of reviews.
Table 3: Responsibility for service reviews

Who will undertake reviews                                    Forces
Staff in the reviewed service with central support              19
Central (HQ-based) team alone (after receiving training)         9
Staff in the reviewed service alone                              6
How much will it cost?
Numerous interviewees were concerned that best value meant costly bureaucracy,
explaining that they hoped to avoid drawing away resources from elsewhere (by
using existing teams and expertise wherever possible). There was particular concern
that best value should not affect operational deployment of officers – although the
experience of those forces advanced in preparations was that this was not a
significant issue provided there was strong support from a central team.
8 Costs were mainly accounted for by secondees’ opportunity costs, time spent on reviews and senior management costs.
The potential costs of best value should, anyway, be weighed against its potential
benefits. Although it was too soon to estimate the financial costs of best value, one
of the pilot forces had calculated that in one year it had allocated 0.1% of its
annual budget to best value8. To date, this force had recorded efficiency gains of at
least 0.2% of its annual budget and, although it expected opportunity costs to
increase as best value was rolled out, it was aiming to achieve consistent efficiency
gains of double the costs. Another force had found that one review had produced
monetary and efficiency savings of £330,000, for an estimated cost of approximately
£30,000.
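The arithmetic behind these reported figures can be illustrated as follows. The annual budget used is a hypothetical round number for illustration only, since the report does not state the force’s actual budget.

```python
# A simple illustration of the cost-benefit arithmetic reported above:
# 0.1% of budget allocated to best value, 0.2% recorded in efficiency gains,
# and one review costing c. £30,000 for savings of £330,000.
annual_budget = 100_000_000.0             # hypothetical force budget for illustration
best_value_cost = annual_budget * 0.001   # 0.1% allocated to best value
efficiency_gain = annual_budget * 0.002   # 0.2% recorded efficiency gains

print(efficiency_gain / best_value_cost)  # gains running at double the costs

review_savings, review_cost = 330_000.0, 30_000.0
print(review_savings / review_cost)       # roughly an eleven-to-one return
```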
Apart from the internal costs of implementing best value, the majority of forces
also expected to use consultants: only eleven had ruled out this option – often
because they had experienced problems with them in the past. Due to their high
costs, however, consultants’ roles were usually limited to developing in-house expertise in specific issues rather than being central to forces’ approaches – although one force was considering engaging a consultant after each review to provide an independent ‘challenge’, which could be undertaken relatively quickly and at low cost. Most
forces, however, were finding that district auditors were key to their understanding
of best value and were increasingly involving them in their preparations – either as
a source of guidance and advice, or more actively assisting with pilot reviews.
Management, culture and communication
Attitudes to change were seen as key to the success of forces’ best value approaches
and the force’s history of change management was a major influence. Interviewees
from forces that had experienced significant changes in the recent past, although
acknowledging that some staff were weary of upheaval, thought that colleagues had
become more accepting of change and might even welcome best value. One
commented: “The main hindrance will be a cultural one, people will be thinking it’s yet
another change. It needs effective communication and education. Staff are fairly
forward thinking, we just need to make sure they all know what’s happening.
Working by example will help more than anything.”
Many forces had recognised a need to communicate best value to staff, but most
were reluctant to spend a great deal of time or resources on formal training –
preferring, instead, to ‘market’ it. Three forces piloting reviews planned to use the
force intranet to keep staff informed of developments: they felt that staff needed to
understand best value’s practical implications for them and were more likely to be
won over by ‘early wins’. Staff seemed more receptive to messages about best value
if they came from their peers. Cleveland Police, for instance, had used an
operational officer from a division to conduct briefings on best value who
commented: “a lot of thought should be put into the first 30 seconds…it’s best not to
mention the term best value at first or they’ll switch off…I focused on the new legal
obligations instead”. Officers had shown more interest in hearing about how
consultation might work and in what ways the force would be measured.
Longer-term resistance to best value was thought likely if staff were not kept
informed or felt distanced from the reviewers and decision-makers. In one case, for
instance, where staff had been brought in from outside to conduct a review, an
atmosphere of friction and distrust had developed which had led to a breakdown in
communication. Some forces had decided that, to build up trust amongst staff, the
centre would be provided with only summary findings from self-assessment
exercises (such as BEM), rather than having access to detailed material that might
embarrass particular staff members and thus potentially undermine the
comprehensiveness of the exercise.
Communication from the bottom-up was also seen to be key, to encourage
contributions in terms of problem solving. One interviewee noted that staff had
often “introduced good ideas in the course of their work but not thought to draw
attention to the fact”. To embed best value, forces were also considering
adding it to probationer training and making it a topic for staff assessment and
appraisal for promotion.
Finally, management quality was a key factor in best value implementation. One
force representative who had compared pilot reviews concluded that: “managers
need to develop the right mindset: an understanding of the performance culture, an ability
to come up with solutions and the willingness to take risks.” Senior officers
needed to support managers to:
● ‘think outside the box’ – outside the bounds of existing structures and management practices;
● challenge their role and encourage staff to challenge their activities;
● take risks and promote a blame-free culture;
● encourage staff to identify problems and suggest solutions; and
● support change where it was needed.
As one interviewee suggested, ‘traditional’ managers were more likely to think, “If I
get rid of this department, I’ll lose my own post!”, than were managers used
to working in a performance culture. They had noted that senior officers,
previously used to managing large operational teams but now heading support
departments, had a tendency to build up their departments and were more likely to
resist any change that might reduce their resources. Forces needed, therefore, to
consider how to encourage effective management of best value and to take careful
account of the culture of each review target and the personalities involved.
Police authority and force relations
Best value will radically alter the role of police authorities – making them legally
responsible for continuous improvement in forces’ performance and providing them
with the opportunity to acquire a more detailed picture of policing activities.
Despite this potential sea change, however, our research suggested that, until
recently, many authorities had only briefly and infrequently discussed best value.
Internal issues for police authorities
Most authorities had made an existing audit committee or equivalent responsible
for best value, although a small number had established a specific sub-committee
and a few had a lead member or a named ‘Best Value Officer’. Although the
establishment of a separate sub-committee suggested they had recognised best
value’s importance, some authority representatives felt it might become the
preserve of a minority of members or officers. It was clear that authorities needed to
consider how best to communicate developments across their membership: one
authority was addressing this by inviting all members to its Best Value Sub-Committee.
Some authority and force representatives believed that elected members – given
their knowledge of developments in local authorities – tended to have a better
grasp of best value than independent members. Elected members’ potential reserve
of experience may be valuable to forces as they develop their approaches but,
increasingly, police authorities were also discovering they could draw on the public
and private sector experience of their independent members to good effect.
The relationship with the force
At the time of the telephone survey many authorities and forces had not decided
how they would work together to deliver best value and there did not seem to have
been much discussion of how this relationship might develop. Many forces,
however, foresaw a practical role for authority members or officers (Table 4).
Table 4: Force perceptions of possible roles for police authority members or officers

Role                                                                    Forces
Involved in prioritising services for review                                30
Members to take part in service reviews                                     13
At least one member/officer to attend force Best Value Steering Group        9
Consulted at each stage                                                      5
Monitoring role                                                              4
This highlighted, however, that authorities would need to consider whether they
could cope with the growth in work – such as assessing review materials, making
decisions and monitoring progress. Most police authority representatives felt that
lack of resources and members’ time meant that best value might not receive the
attention it should.
Although, by April 1999, only four forces had not formally discussed best value
with their police authority, the frequency and extent of contact between other
authorities and forces differed quite markedly. Often, forces were simply ‘keeping
their authority informed’ of progress – either by communicating developments at
the main meetings or by having an authority representative on the force Steering
Group. Police authority members that had attended informal seminars run by
forces, where issues could be discussed in a less formal environment than at
committee, had found these useful and were taking an increasingly active and
‘questioning’ role.
Our discussions with authorities and forces suggested that the extent of the role of
the police authority in best value depended, in particular, on:
● the historical relationship between senior management and the police authority;
● the personalities, skills, interests and enthusiasm of members and officers;
● how the turnover of members was handled – especially in terms of best value
  induction;
● the resources available in terms of time and administrative support; and
● the way in which their involvement in local level consultation had developed.
Some force representatives were apparently still concerned that police
authority members might ‘dictate too much’ and ‘want to influence
operational policing decisions’ – particularly if they could see a political gain.
The implication was that by addressing, at an early stage, their respective roles
and relationships, police authorities and forces might avoid potential future
friction. West Yorkshire Police Authority and its force, for instance, had
agreed to a joint Best Value Steering Group, chaired by the Deputy Chair of
the Police Authority. To further this integrated approach, the police authority
concluded that, as part of a publicly funded service, they themselves should be
subject to the same best value review process as, for example, the Force
Command Team. In Derbyshire Constabulary, the ACPO rank officers had
already taken part in training with members of the police authority – also
agreeing their respective roles and planning a best value review of the
authority for 2004/5.
Northamptonshire Police Authority and their force were holding regular seminars
as a means of communicating openly, outside the public arena, on ‘large-scale
issues’. Police authority members had already taken part in an earlier force review,
and advocated a partnership approach, which one member described as being like a
‘critical friend’ to the force. Potential difficulties for police authorities were
highlighted, however, in terms of the conflict between the monitoring and
executive roles which best value legislation was seen to require.
The roles of the authority and force in the consultation process were also unclear.
It was proving difficult to disentangle areas where the force would consult from
those that were seen to be the preserve of the police authority. One force
interviewee identified a broader need to develop a system for working with the
police authority on research and consultation, thus “spreading the load”. The issue of
consultation is returned to in Section 6.
Working with other authorities
Nearly half the forces had not discussed best value formally with local authorities
in their area by April 1999 – although only one suggested it had no plans to do so.
Six forces had found regular meetings helped avoid duplication (particularly in
consultation) and enabled them to discuss approaches, exchange good practice and
draw out early lessons.
One force was planning to go further, by running joint reviews whilst another had
already benefited from local authority staff input to their implementation plans.
Another force had developed a sustainable community project with the local
authority, where they shared the same performance indicator: the number of
lettings in the area (i.e. not directly a crime-related indicator). Other forces,
however, had become frustrated when trying to arrange joint reviews, because local
authorities in their area were working to different timetables and agendas.
A few forces felt that they were ahead of their local authorities in terms of
preparations for best value and were better served by contact with local authorities
elsewhere who were more advanced, such as the London Borough of Newham. Two
force representatives recommended access to local authority networks of
information from which they had gained useful material, for example on innovative
partnership working with the private sector.
Most forces had yet to have much contact with other authorities, such as those for
fire and health, to discuss best value: although they frequently mentioned that they
would be prepared to consider sharing resources such as command and control
centres. One force, however, had been unable to advance discussions because, they
believed, the potential for sharing services could have budget implications: “We
would like to look at sharing with other emergency services but they feel very threatened by
all of this and are point blank refusing to discuss it”.
4. Tools, models and standards for best value policing
Having identified some emerging management lessons, this report turns to the
practical details of the new regime, and how tools may help forces deliver best value
policing. Appendices 4 and 5 also include descriptions of some tools in action.
Tools for best value policing
The introduction of best value comes at the end of a decade that has seen a rapid
adoption of management tools and aids in both the public and private sector. Best
value authorities have been turning to these either to give structure to their whole
approach or to help them apply key elements.
Police authorities and forces were no exception to this trend: all but eight forces
were planning to use various models – mixing and matching them as necessary.
Most interviewees from forces planning to combine models added that their force
was developing or aiming to produce a toolkit and guidance to help officers decide
which model to use and when. Greater Manchester Police, for instance, had
adapted and were using the concept of a Best Value Toolkit originally developed by
Newham. Many of the local authorities in the Greater Manchester area were also
using the idea of a toolkit.
Table 5 lists the models and standards most frequently cited by forces as likely to
aid their approach to best value – either by helping to drive the whole review
structure, or as one element within their reviews. The list excludes project
management tools – of which there is a large range – cited by many forces as a
means of mapping and monitoring the progress of reviews.
Table 5: The most frequently cited tools and standards for best value policing

Tools/standards                         Forces
Business Excellence Model (BEM)             43
Process mapping                             26
IIP accreditation                           12
Balanced Scorecard                           9
Charter Mark                                 7
Some respondents, however, felt there was a danger that early concentration on
tools might lead authorities and forces to conclude that these, rather than the
practical, cultural, organisational and managerial implications of best value,
deserved most attention. The models described in this section are, then, potential
aids to reviews and not guarantors or substitutes for the delivery of best value
policing. This section examines each tool in turn, considering its application as
well as its advantages and disadvantages.
The Business Excellence Model (BEM)
BEM, developed between 1988 and 1992 by the European Foundation for Quality
Management (EFQM), contends that controlled processes, supported by strategic
leadership, resources and skilled personnel, will deliver continually improving
performance and enhance staff and customer satisfaction whilst having a beneficial
impact on society more widely. For readers unfamiliar with BEM, the model is
described in Appendix 1.
The public sector soon recognised BEM’s potential and, by the mid-1990s it was
being adopted by local authorities, central government departments and national
organisations, such as Royal Mail. The model’s emphasis on continuous
improvement, enhanced customer satisfaction and performance benchmarking
made it an obvious candidate for aiding best value delivery. The main attractions of
BEM for forces, however, as they developed their review process included:
● it is the most well-established ‘off the shelf’ tool – making it easier for the police
  to compare performance with and learn from other public sector bodies and the
  private sector;
● all forces are using BEM in some way – increasing the opportunities for findings
  and lessons to be shared in a common format;
● using BEM as a template for reviews effectively means that forces are likely
  automatically to cover three of the four Cs: compare, consult and compete;
● BEM assessments lead to action plans – in line with best value requirements;
● if forces choose to score or grade their services as a result of BEM review, they
  can reuse the model as a consistent way of measuring changes in performance;
● forces can adapt it to their circumstances by changing headings, weightings and
  issues; and
● BEM coincides with many other disparate initiatives – such as Investors in
  People, Charter Mark, QS 9000, BS EN ISO 9000 and ISO 14000. These links
  (described in Appendix 2) mean that forces can potentially combine earlier
  initiatives with BEM, or need not adopt them.
Even before best value many forces were adopting BEM and, in 1998, ACPO
formally recommended its police version as a means of managing and monitoring
performance. It was, therefore, not surprising that BEM was easily the most
frequently cited tool – representing at least an element of the likely approach to
best value in every force. The frequency with which the model was cited masked,
however, differences in the way it was being applied. Variations included:
● employing consultants to apply the model, tasking a central review team or
  training members of each reviewed service to self-assess;
● using the original model, the revised model, the public sector version, the police
  version or the force’s own adaptation;
● with full scoring, partial scoring, grading or no scores;
● as a way of reviewing the whole force to prioritise reviews (i.e. for corporate
  review) and/or as an integral part of the reviews themselves;
● using differing mixtures or levels of evidence to score, such as the reviewers’
  perceptions, practitioner workshops or full evidence gathering; and
● as the standard review template or one of many tools to be applied to all/some
  services.
We could not establish a detailed picture of exactly how each force would be
employing BEM (see Appendix 4 for one example). Typically, however, forces
intended to adapt the original model or use the police version and most regarded it
primarily as a self-assessment technique. Forces were expecting to have to tailor the
issues and questions for each heading and reviewed service, but very few intended
to employ the full ‘weights and scores’ model. A small number planned a ‘partial
scoring’ approach, by reviewing their findings and grading each service (e.g. A, B,
C, etc.). Two forces planned to require all divisions and departments to complete
regular BEMs – so that much of the necessary data would be already available to
inform the review and to aid ongoing management and planning (see Box 4).
Box 4: BEM and South Wales Police
South Wales Police, one of the pilots, was asking all divisions and departments to
undertake annual BEM assessments. The force, in common with many others, considered
that it would take a few years for BEM to bed down. The assessments, therefore, would be
undertaken by trained staff in each service and based, to begin with, solely on their
perceptions of the service’s performance. This would be followed a year later by a more
rigorous practitioners’ workshop-based approach and it was proposed that these
assessments would take place every two years. The best value reviews would therefore be
able to draw immediately on up-to-date information.
Despite the frequency with which BEM was mentioned, it was clear that it was not
universally popular. Indeed, most forces identified weaknesses and a small number
suggested they had only adopted it because others had. One interviewee, from a
force that had decided to make limited use of BEM, explained that BEM’s results
were only useful if they were acceptable: “We tried it five years ago and the Chief
Officers so disagreed with its findings that they got rid of it”.
BEM’s key disadvantages included:
● it is formulaic, complex, bureaucratic and costly;
● forces often have to provide extensive training or use costly consultants;
● everyone is using different versions – reducing the scope for comparability;
● BEM increases the time for best value to bed down, whilst provoking resistance
  from staff who tend to regard it, and thus best value, as another jargon-rich fad;
● BEM’s self-assessment approach is too reliant on perceptions, meaning that
  scores cannot accurately be benchmarked between forces or compared with
  other organisations;
● BEM highlights weaknesses, but not potential solutions – this has to be done
  separately; and
● the basic model has no built-in ‘challenge’ aspect: it implicitly assumes that the
  police should be providing each service.
Despite its relevance to best value, therefore, the police need to be alert to BEM’s
potential limitations. Forces we visited stressed that the police cannot base their
whole approach to best value solely on one model, such as BEM – they need to
adapt models to their circumstances, adding ways to ensure that each service is
‘challenged’ and all realistic options for improvement considered.
Whether forces adopt the fully scored and weighted BEM depends on the depth to
which they want to take assessments and how comparable over time and between
services they want reviews to be. An immediate problem encountered by one of the
forces applying the full model was that the necessary data – particularly historical –
were often lacking (although the force considered this finding a valuable outcome
– enabling them to identify areas for improvement in data capture). Section 6
examines further the issue of data availability.
Some of the forces we spoke to indicated that they were considering sharing the
self-assessment function across the force – so that members of one service might
apply the BEM to another service. The potential advantage of this approach was
that it could both reduce the danger that services assess themselves too favourably,
and increase opportunities for lessons to be shared between services. One of the
pilots, however, had decided against it on the basis that staff would come to regard
reviews as akin to an audit – encouraging them to talk up their performance and
possibly conceal information. Instead, they had decided that a Lead Assessor from
the centre would head review teams elsewhere, acting as a control on rigour and
quality. They also intended to ask external consultants to examine their reviews.
We were aware throughout our research of the emphasis being placed by forces on
BEM. To sum up, however, as one interviewee from a force well experienced in
BEM stressed: “The BEM is not a panacea – it doesn’t guarantee best value, but it does
help to identify what is and isn’t working. You don’t just do BEM – you have to use the
results. BEM is a tool – not the end”.
Process mapping and Process Expert (PE)
Process mapping, which enables organisations to identify and map out for each
function every single interlinked activity, is not new to policing. Many forces
described their use of it to break policing down into costed activities, or help them
to understand the minutiae of a particular service. The potential value of process
mapping to best value, however, is that it:
● enables links between services to be identified because it breaks activities down
  by processes rather than functions (such as departments), therefore teasing out
  every link in the chain of policing so that reviews can more easily cut across
  departments and divisions;
● requires forces to define a service and then identify every related activity – so
  that each can, in turn, be challenged and compared with other services or forces;
● makes costing, and thus the ‘compete’ element of best value, more
  straightforward by enabling every input to be identified and assessed; and
● allows every member of staff providing or receiving services across the force to
  be identified – so enabling best value consultation to be more focused.
Process mapping was popular even with ‘front-line’ officers. In one force, the
mapping of ‘accident procedures’ prompted one officer to comment that the
resulting policy to stop recording ‘damage-only accidents’ had been “the only good
thing that ever came out of headquarters”.
Although there is a range of mapping products on the market, many forces planned
to use Process Expert (PE) Professional Software. PE was developed through an
ACPO-backed project to build a national database of process models, intended to
create a common language and consistency in service delivery and to enable more
accurate activity costing. PE uses the
Police Process Classification Framework (agreed by ACPO in 1998), to break
policing down into over 300 processes and sub-processes. PE software enables forces
to record every related staff activity for each of these processes so that complex
process maps are gradually created which enable all policing activities to be costed
and compared between forces.
In May 1999, 19 forces had started to pilot PE for six defined processes (Appendix
3). By mid-1999, all but three forces aimed to purchase the software and many were
applying it independently to inform their reviews. The possible additional
advantages of PE to best value policing included:
● as best value progresses and forces map all their processes, it will enable the
  creation of a national database of process blueprints to aid the dissemination of
  good practice;
● it ensures consistency of approach across forces in terms of modelling and
  costing, so that inter-force comparisons are more accurate; and
● it is based on software that can be easily shared and updated.
Our survey also suggested that the software’s promotion had at least partly driven
the decision of some forces to define services by processes rather than functions
(see Section 5). The future spread of PE might, therefore, have a significant impact
on how forces implement best value. If all forces vigorously take up PE it is possible
that reviews will become increasingly process driven – with geographic- and
function-based approaches becoming rarer. Some force contacts even explained
that their force had decided to delay devising their programme of service reviews
until they had used PE to help them define their services.
One alternative to PE was the Functional Analysis System Technique (FAST)
Model, which combines process mapping with a structure for challenging each
identified activity (Box 5).
Whilst the apparent value of process mapping to forces wishing to break down
services into activities was clear, it was too early for us to judge PE’s impact. Many
forces regarded PE highly – suggesting it had great potential provided all forces
signed up and used it rigorously. Despite this, few regarded PE as an important
benchmarking tool. This may partly be related to the scale of work required to build
the data warehouse – PE’s promoters suggest that complete benchmarking data will
not be available until all forces have reviewed all services (i.e. April 2005). One
force contact, however, considered it unlikely that PE would ever be able to create
a full database of all processes, because many forces would review their services in
other ways.
Some contacts were less enthusiastic about process mapping and, as with BEM,
interviewees were often concerned that forces should not mistake process mapping
as a direct route to best value. One commented: “process mapping is not the same as a
best value review: it doesn’t involve consulting people…it’s just a desk-based exercise”.
Box 5: FAST and Lincolnshire Police
Lincolnshire and seven other forces had jointly benchmarked and challenged procedures
for firearms and explosives licensing, using FAST – a tool combining process mapping
with a challenge element.
Under FAST, a focus group of practitioners breaks a function down into its component
parts, mapping out all the interconnected processes for the whole service on a single sheet
of paper. They then brainstorm each process, by applying a verb and noun to defend it (for
instance: “the service does ‘x’ to save money”). A ‘wild card’ member is present at the
meeting constantly to question why things are done in the way they are. Once
performance information is added (e.g. cost, frequency, time taken) the group can identify
which parts of the service require more detailed process mapping to identify precisely the
activities where attention is needed.
The model had some drawbacks: it did not necessarily produce results that could be
compared with non-participating forces, and was potentially unpredictable because of its
dependence on the personalities and knowledge of the participants. Nevertheless,
Lincolnshire considered FAST a potentially effective means of both benchmarking and
challenging services (to some extent the challenge was supplied by other forces asking
why one force took an approach they had deemed unnecessary). One outcome of the pilot
was that the force now combined visits to all holders whose licences were due to expire
within twelve months, rather than visiting each in turn as their licence expired: resulting
in projected savings of over £80,000.
The Balanced Scorecard (BSC)
Robert Kaplan and David Norton first proposed the BSC, in 1992, because of the
apparent failure of organisations to link measurement of past performance to future
strategies. They proposed a framework of measures providing a comprehensive
picture of an organisation’s overall performance against its long-term vision and
strategy. With the BSC, once an organisation has established its vision, aims and
objectives, it identifies the required actions or initiatives and establishes for each
action key inter-related ratios under the following four headings to help keep track
of its progress:
● customer perspective – measures of customer and stakeholder expectations,
  perceptions and levels of satisfaction;
● internal business process perspective – measures of current performance against
  targets for the organisation’s identified key processes;
● continuous improvement perspective – measures of the organisation’s ability to
  communicate and learn from improvements; and
● financial perspective – measures such as cost ratios.
Based on the premise that ‘what gets measured gets done’, these measures are
subsequently used to drive up performance.
Apart from its emphasis on the drive for continuous performance improvement,
BSC’s relevance to best value derives from its emphasis on the organisation’s vision
(as in a corporate review) and setting of measures which ensure that each service is
subject to the ‘4Cs’. The model’s potential advantages as a tool for best value
policing include:
● BSC can be applied to the entire organisation (as a corporate review) or to
  individual services;
● it can provide a structure for service reviews by breaking down police activities
  so that they can be challenged, compared and consulted on;
● BSC’s heavy emphasis on performance measurement coincides with the
  development in the last decade of an increasingly comprehensive police
  performance management framework;
● it provides order for disparate police performance measures – placing them in a
  structure that ties them explicitly to long-term aims and objectives; and
● it provides a wider picture of outcomes than traditional performance
  measurement (for instance, a drugs initiative resulting in only a few arrests
  might be regarded as a failure, but BSC would include other overlooked
  outcomes – such as more officers with experience of dealing with the issue and
  increased public awareness).
Nine forces, therefore, regarded BSC as a potential tool to help them break down
service objectives and set defensible measures. Usually, however, they were
integrating the model with the BEM: by establishing future measures as part of
BEM’s ‘key performance results’. The most advanced demonstration we
encountered of the application of BSC to policing was in Scotland, where the
Accounts Commission (1998) was promoting its use for best value (Box 6).
Box 6: The Balanced Scorecard in Dumfries and Galloway Constabulary
Dumfries and Galloway piloted BSC to define services and set performance measures. For
example, they identified a force aim to “make the streets safer” before breaking this down
into three key goals, to reduce:
● alcohol-related incidents;
● street disorder; and
● the use of knives and weapons.
They then broke these goals down into actions, using a ‘what-how’ analysis. This involved
taking each goal (the ‘what’) and asking ‘how’ to deliver it. As each ‘how’ is identified it
becomes a ‘what’ and the question is again asked ‘how’ to achieve it – so that an action is
identified. For example, to reduce alcohol related incidents (the ‘what’), the force might
decide to encourage public houses not to serve obviously drunk clients (the ‘how’) and to
achieve this end the action might be to visit particular premises and keep following up.
Taking the reduction of alcohol-related incidents as an example, the force identified a
range of actions and categorised them according to the BSC’s four headings, before setting
performance measures. For example, one action was to change the customer’s perspective
by targeting underage drinking and the measure was “number of cautions/offences reported”.
Another action to ensure continuous improvement was to have internal briefings on the
issue with the measure being “the change in officers’ attitudes and awareness”.
Whilst none of the interviewees from the forces considering the model thought
that BSC had any drawbacks, a few from forces not using it felt it should not be key
to any force’s approach because it was too biased toward performance indicators
and historic data.
Investors in People (IIP) and Charter Mark
IIP is a national standard for employers to help them plan, manage and evaluate
staff development to ensure that employees can deliver services effectively. Charter
Mark is the government’s award scheme for encouraging public service excellence
against nine criteria, including high performance standards, customer satisfaction,
complaints handling and value for money.
IIP and Charter Mark are standards to which service
providers aspire, rather than tools for best value. They were usually regarded by
forces as evidence, therefore, that they were already supplying best value services,
rather than as a mechanism forces should now adopt to meet the requirements.
Forces were, however, drawing on information gathered during the accreditation
processes to help inform their self-assessments. For instance, if they were using the
BEM, forces were able to use IIP’s consultation process to inform their evaluation of
‘people results’. On the other hand, some forces had concluded that their
application of BEM was sufficient to ensure that they need not adopt IIP
(Appendix 2 explains the links between BEM and other initiatives).
Charter Mark potentially has a more significant relationship to best value than IIP,
because of its concentration on high standards in service provision and customer
satisfaction across the board. Despite this, few interviewees highlighted it as critical
to their force’s approach to best value. Interviewees from forces with Charter
Marked services invariably regarded it as both vital and highly valuable and often
suggested that the best value legislation should have exempted these services. A
couple of interviewees, in contrast, were highly critical of the standard – suggesting
that it signified little for an enormous amount of work and was unnecessary: “We
dislike the Charter Mark system: it involves too much effort for little gain in relation to the
requirements of best value.”
DEFINING SERVICES AND RUNNING REVIEW PROGRAMMES
5. Defining services and running review programmes
The previous three Sections considered some of the issues the police need to
address as they prepare for best value – management, culture, communications,
external relations and potential tools. This Section and the next move on to
examine best value’s practical implications and some early lessons in terms of the
key stages of the framework:
● corporate review – developing a strategy for local policing, based on a review of
performance;
● defining services – deciding how to divide up policing services for review;
● review prioritisation – deciding which services to review when;
● constructing and managing review programmes – agreeing roles and a
timetable; and
● producing Performance Plans – deciding how best to present review findings,
objectives, measures and ongoing results of monitoring from past reviews.
Corporate review
Box 7: What is a corporate review?
“Effective corporate review involves establishing a clear picture of an authority’s strengths and
weaknesses, agreeing a local vision for improvement…and setting practical goals...”.
‘Better by Far’, Audit Commission
DETR (1999) suggest that the first item on an authority’s action list should be to “carry out an
assessment of the strengths and weaknesses of the authority”. The guidance continues: “as a
first step, authorities will need to establish, where they have not already done so, their strategic
objectives and corporate priorities”.
Section 2 described how police authorities and forces have, for some time, been
publishing mission statements, aims, objectives, measures and targets. Just over half
the force representatives in the telephone survey considered these provided
sufficiently comprehensive information from which to derive a corporate review –
although a handful planned first to update their data. Only thirteen forces appeared
to have interpreted best value as requiring a new and separate force-wide ‘health
check’ at the start or in the early stages of best value (Table 6).
Table 6: Proposed methods for producing corporate reviews

Method (number of forces):
Drawing on existing plans (Strategic Plans, Local Policing Plans, Business Plans, etc.) – 22
Undertaking new force-wide review – 9
Updating existing plans (Strategic Plans, Local Policing Plans, Business Plans, etc.) – 5
Building up picture from rapid review of all services in first year – 4
Building up force-wide picture over 5-year best value programme – 3
We were not in a position to judge whether existing planning structures would be
sufficient to enable the production of a corporate review. Key elements of a
corporate review, however, should include:
● an overarching vision and a practical set of objectives for achieving it;
● the results of community consultation (including users, potential users,
businesses and other agencies) on current performance and future priorities;
● an overall assessment of performance9 (including, where possible, comparisons
with non-police organisations) and a clear picture of force strengths and
weaknesses;
● a consideration of future policing issues and what preparations are required;
● the co-ordination of existing planning processes (see Section 2); and
● an explicit link between the corporate review and the selection of services for
review.
Forces, therefore, need to consider whether they have the sort of up-to-date
information required: if not, they may need to ‘plug the gaps’ so that their first
Performance Plan and review programme is demonstrably based on full knowledge
of where they are headed and where they currently stand.
The legislation and guidance do not prescribe how frequent corporate review
should be, but some forces apparently regarded it as a one-off exercise, replacing
previous long-term force strategies. Review programmes, however, have to be
sufficiently flexible to take account both of the findings from reviews and forces’
changing circumstances (including national developments, such as the MacPherson
Report). Some forces, therefore, had decided to keep their programmes up-to-date
by revisiting them regularly or undertaking frequent corporate reviews (Box 8).
9 For the moment, forces will have to base their comparisons of performance on the
existing disparate set of national performance indicators. From April 2000, however,
they will be able to use the new national suite of performance indicators developed to
underpin the Overarching Aims and Objectives for Policing (published August 1998).
Box 8: Corporate review in West Yorkshire
West Yorkshire had adapted its Annual Strategic Review process to become an annual
corporate review every autumn. Since 1995, this approach had helped the force review
Corporate Strategy – including resource allocation and the prioritisation of organisational
change projects (such as the Force Process Improvement Programme of reviews). The
original approach involved a structured process undertaken over two days by members of
the Force Command Team, facilitated by the Management Support Unit. Key stages
included:
● What sort of organisation do we want to be? (i.e. purpose and values).
● What environmental issues need to be considered? (employing environmental
scanning).
● How well have we done? (i.e. an assessment against objectives and targets,
activity analysis findings and reviews of major organisational change projects).
Building on these data, the Force Command Team would agree the force’s Strategic
Objectives (with linked targets) and how best to allocate resources.
For its 1998 review, to meet best value, the force had built on this structured process by
adding an annual policing excellence assessment (i.e. the BEM), adopting a new review
approach to cover the whole force over five years, and improving the consultation of hard
to reach groups. To help ‘plug possible gaps’ in their consultation, the police authority was
also considering whether to fund a citizens’ panel focused on policing.
One force had decided to include a review of its own Force Command within the
corporate review, which would enable a longer-term strategy to parallel their review
programme (Box 9). Others had concluded that there was no discernible difference
between corporate review and review prioritisation: for them, the process of
selecting services for review equated to a corporate assessment. Although the two
processes of corporate review and review prioritisation are explicitly linked,
however, the latter derives from and is informed by the former. Forces will need to
be able to demonstrate that their review prioritisation is a product of a corporate
picture of how their force should be performing and how it really is.
Defining services
Neither the best value legislation nor any related guidance prescribes any particular
approach to dividing up services for review. The Audit Commission (1998),
however, identified four ways by which authorities might ‘cut up’ their services,
adding that they might take a hybrid pick-and-mix approach, combining these
options according to the characteristics of their services:
● service-based – by existing service units;
● area-based – geographic sub-areas within overall boundaries;
Box 9: Force Command and corporate review
Greater Manchester Police had decided to review their Force Command at the outset of
best value. For them, this approach had important advantages: helping Commanding
Officers to develop an understanding of best value, to demonstrate leadership and show
the rest of the force that no one was to be immune.
The Force Command Review also enables Commanding Officers to re-examine force
performance, set a vision and develop a strategic plan to help inform the review
programme. This development of a new Strategic Plan effectively required the Force
Command to undertake a corporate review of their force – gathering a wide range of
performance data and consultation findings. By combining the review of its Command
Team with a corporate review, GMP also hoped to break out of the silo mentality –
whereby officers were tempted to think in terms of their areas of command rather than
across the force. The resulting corporate review would initially inform the best value
programme, but would also be a living document which could be revisited over time as the
force regularly re-examined its Strategic Plan in the light of service reviews.
● customer focus – how the service appears from the customer’s perspective; and
● cross-cutting issues/‘wicked issues’ – issues cutting across localities and
demographic groups, best tackled by a multi-agency approach.
Our discussions with forces suggested, however, that the potential pattern of service
reviews was likely to be more complex than this. We identified three inter-related
elements to the definition of services for review:
● the review approach – whether to review by function, process, theme, etc.;
● the review target – the topic of a review (i.e. a division, a department, a theme,
etc.); and
● the level of review – whether at the level of division, sub-division, department,
team, etc.
Review approaches
By asking interviewees to suggest potential review examples, we discerned the
following approaches (Table 7).
Until best value is fully underway, it will not be possible to reach clear conclusions
about which review approach might be most appropriate to forces’ circumstances.
Forces, however, identified potential advantages and disadvantages for all the
approaches. These are outlined below in Table 8 which also draws on the Audit
Commission’s work (1998). As forces decide which approaches to adopt and when,
they also need to consider how best to address the potential disadvantages.
Table 7: Approaches to service review

Function/Service-based (28 forces): This involves isolating and reviewing the
providers of a service (e.g. a support department or a custody suite).

Process (28 forces): Identifying a series of related policing activities, often across a
range of service providers, from start to finish (e.g. crime management, broken
down into crime recording, reporting, investigation, etc.). Confusingly, some forces
referred to this approach as ‘cross-cutting’ or ‘thematic’ because it ranged across a
number of internal functions and issues.

Geographic/Area-based (16 forces): Defining services by geographic layout (e.g.
reviewing all HQ departments or individual divisions). Whilst local authorities
might have many sites, however, most forces only have divisions and departments
(often concentrated at one main site). There tended, therefore, to be strong
similarities between the functional and geographic approaches.

Cross-cutting (14 forces): Identifying wide-ranging policing issues/themes best
tackled in partnership with non-police service providers (e.g. drugs, community
safety, etc.).

Customer-focused (3 forces): Defining services according to how users perceive
them (i.e. points of public contact with the police and their perceptions of what
happens behind the scenes).
The most frequently cited review combinations we encountered comprised
functional reviews for non-operational parts of the force (e.g. headquarters
departments) and either process reviews for operational policing (e.g. crime
investigation or patrol) or geographical reviews (e.g. division by division). Those
forces already advanced in their planning had also usually allowed for the
development of cross-cutting reviews with other agencies, although none had
progressed very far.
Review targets
Forces’ potential review targets included ‘training’, ‘custody’ and ‘patrol’. These
targets frequently related to the review approach: an ‘IT’ or ‘personnel’ review, for
instance, usually implied a functional approach (i.e. focusing on the IT and
personnel departments).
It became clear, however, that forces could, if they wanted, apply different review
approaches to the same target. For example, as Table 8 demonstrates, a review of the
IT department could potentially be addressed simply by examining the activities of
IT staff, or by widening this to a process review, encompassing all IT usage across
the whole force (whether by specialist or operational officers). Likewise a focus on
‘custody’ might mean a functional review of all custody suites (i.e. costs, location,
activities, staffing, etc.) or a wider process review of every police activity relating to
detainee handling – from point of arrest through entry into a custody suite and
eventual release. Both these functional or process reviews might also have
geographic elements by examining, separately or in parallel, headquarters policy
and divisional differences in activities and performance.

Table 8: The advantages and disadvantages of different approaches to service reviews

Function (e.g. the IT dept.)
Potential advantages:
● Straightforward, relating to existing structures
● Data already exist in this format
● Easier to identify review teams and assign objectives
● Easier to explain to staff
Potential disadvantages:
● Encourages a ‘silo mentality’ – where staff tend to think only in terms of their
own service
● Difficult to dissociate services when they rely on one another
● Harder to identify cross-functional lessons

Process (e.g. file preparation and record systems – which might include the IT dept.)
Potential advantages:
● Opportunity for forces to identify how departments and divisions work together
● Functions can be linked in a way that makes sense and overcomes ‘silo mentality’
● PE and other process mapping approaches can help identify activities
● More likely to relate to customer perceptions
● Using processes may make benchmarking easier (forces can compare ‘bits’ of
services)
Potential disadvantages:
● Forces need to process map (which can be complex and time consuming)
● Harder to identify review teams and assign objectives
● Benchmarking harder unless forces have defined processes in exactly the same way
● Difficult to know ‘where to stop’ when defining a service (i.e. service boundaries
are unclear)

Geographic (e.g. by division, or department according to their site – which might
include the IT dept.)
Potential advantages:
● Relates to existing structures, so reviews may be easier to conduct
● Encourages local view of service provision more attuned to public perceptions
and priorities
● Easier to relate reviews to existing local partnerships (especially given Crime and
Disorder Strategies)
● Data usually exist down to local level
● May encourage competition between services – possibly driving up performance
Potential disadvantages:
● Danger of duplication of similar reviews
● Less likely to learn cross-force lessons
● May encourage competition between services – possibly hindering the sharing of
lessons
● May overlook opportunities for economies of scale from shared resources
● Creates confusion if different areas then take different actions to improve
● More complex if boundaries are not coterminous with local authorities
● Different divisions often have the same external partners – reviewing them
separately might confuse these partners

Cross-cutting (e.g. agency data sharing – inc. force IT dept.)
Potential advantages:
● Enables forces to develop or build on existing partnerships
● More likely to focus on issues of public concern
● More attuned to public perceptions and priorities
Potential disadvantages:
● Difficult to co-ordinate reviews with lots of other agencies – which may be
working to different timetables
● Complex and lengthy
● May be harder to assign and co-ordinate follow-up actions

Customer-focused (e.g. complaints handling – might include IT records)
Potential advantages:
● Relates directly to issues of local concern
● May make it easier to work with other agencies
Potential disadvantages:
● Hard to establish an accurate picture of customer perceptions
● Public perceptions may be inaccurate
● Will not cover all force services – such as those where the public have no contact
● Difficult to develop performance measures and targets
Levels of review
Forces were likely to be focusing their reviews at different levels. Functional
reviews of departments might, for instance, cover the whole department or one part
of it, whilst geographic reviews might focus on divisions or sub-divisions.
One force was considering devolving the best value requirements to departments –
so that each one would have to devise its own five-year programme for internal
reviews. They thought this might have the advantage of increasing local ownership
whilst allowing for the diverse responsibilities of some of the departments. It would
also enable them to prioritise reviews more accurately by focusing on particular
aspects of a department or division particularly in need of review. There were,
however, also potential disadvantages: so many reviews at different times could
make it hard to learn lessons across the force and make best value’s management
complex.
GMP had piloted its reviews in four sub-divisions and intended to take forward this
approach (Appendix 4). However, the force had also concluded that the outcomes
would have to be aggregated and taken forward at divisional level to disseminate
lessons, avoid different actions being taken by neighbouring teams of officers and
aid the management of best value.
Service reviews: hybrid approaches
Although, in our survey, only three forces had not yet considered how to define
their services for review, the majority had only recently started to address the issue.
Despite this, three-quarters had decided that they would adopt some form of
pick-and-mix approach. One interviewee acknowledged that although their hybrid
mix of reviews “might entail some review overlaps, it will hopefully counter our fear of
missing something out…”. The key danger with this was identified by one pilot as a
‘burgeoning number of reviews’ that might mean “forces gradually drowning in best
value”. They concluded that it was critical for forces to keep the number of reviews
to a minimum.
We found, therefore, that it was overly simplistic to categorise force approaches to
best value as process- or geographic-based. Although a force might be
concentrating on a set of process reviews, it might also undertake functional
reviews. Likewise, it might be undertaking a functional review of a service whilst its
neighbours undertook a process review of the same service. The force might also
combine process and functional elements in the same review. This potential
patchwork of approaches, targets and levels will have important consequences for
forces comparing progress and outcomes, as well for the HMIC and Audit
Commission as they develop their audit and inspection regime.
Review prioritisation
Having defined their services, police authorities and forces need to agree review
priorities. The White Paper suggested that authorities should concentrate first on
services where the public is poorly served (Box 10).
Box 10: Guidance on review prioritisation
“Where the performance of a service is demonstrably poor by any standards – and the framework of
national indicators will highlight these – then authorities will be expected to review that service
quickly and effectively. Where performance is poor but there are a number of areas needing
attention, there may need to be some scope for phasing a review. There will also be a case for
addressing some of the stronger areas of performance early, so that the lessons of success can be
spread quickly. But it would be unacceptable for any authority to put off reviewing significant areas
of weakness without good cause…”
DETR (1998b)
Although our survey suggested that poor performance would be the key factor
influencing the prioritisation of police service reviews, it was clear that there would
be others (Table 9), with two-thirds of forces intending to weigh up a range of
factors before defining their programmes. The size of services’ budgets and the
potential for financial savings (not necessarily an indication of weakness) were
frequently cited factors. Surprisingly few forces, however, were considering tying
their review timetables into those of other forces or agencies.
Forces were reluctant to place too much emphasis on poor performance as a factor
influencing review priorities because of its potentially adverse impact on morale.
One interviewee commented: “In terms of divisions, in particular, we will look to do
the reviews roughly at the same time to avoid the stigma that one division is being done
sooner because it is weakest.” Some forces also considered that weakness,
defined solely on the basis of performance indicators, would not necessarily identify
services that delivered poor quality to the public. Another interviewee explained:
“Defining weakness is hard – do we rely on PIs, finance information or public opinion?
Also, a service may not be weak in all aspects.” A few forces also emphasised the
Table 9: Factors influencing review prioritisation

Main factors influencing prioritisation (number of forces):
Weak performance – 21
Size of service’s budget – 14
After public consultation/recognised as most important to public – 11
Size of potential financial savings – 8
Likelihood of ‘quick win’ – 7
When service had last been reviewed – 6
Extent to which service is ‘public facing’ – 5
External pressures (e.g. from HMIC, Home Office, Audit Commission) – 5
Existing corporate/strategic/local objectives – 2
Identified as a priority by other reviews – 2
To tie in with other agencies’ reviews (e.g. local authorities) – 2
To reflect what other forces are doing – 2
need to take account of trends, rather than relying on the most recent data that
might conceal a dynamic picture of rapidly improving or worsening performance.
Forces also seldom seemed to link corporate review and service prioritisation
explicitly: indeed, many had already identified which services they were likely to
review from April 2000 (Table 10).
Although divisions were often mentioned as targets, forces already piloting reviews
usually preferred to start with support services. Particular favourites for early review
were the departments responsible for IT, training, personnel, finance and fleet
management. Less frequently mentioned were reviews of operational policing –
such as patrol, CID or particular districts. A few forces were also reviewing
specialist units, such as child protection and domestic violence units (Appendix 5
describes one of these reviews).
It was apparent that, despite those factors cited by forces as key to their review
prioritisation (Table 9), there were other considerations influencing their choice for
early reviews. Support services were favoured by many because they were regarded
as areas where reviews were most likely to identify efficiency savings, to contribute
to the force’s 2% target. Support services were also seen as easier to define for
review purposes and more likely to be familiar with reviews. Some of the forces
concentrating on operational policing, however, had done so because this offered
more scope for a rapid impact on ‘front-line’ policing, in line with force and
national objectives, and because performance data were more readily available.
Table 10: Likely review targets

Target (number of forces):
Individual divisions/BCUs – 8
Personnel/sickness policy/administration/catering – 6
IT services – 6
Finance/payroll/procurement – 6
Custody – 6
Frontline patrol/response – 6
Crime management (recording, reporting, investigation) – 6
Call handling/crime desk/help desk – 6
Fleet management/transport – 5
Training – 4
Estate management – 4
Public relations/publicity/printing services – 4
Community safety/multi-agency work/Problem Oriented Policing – 4
In most cases, where the decision had been made, responsibility for the initial
prioritisation would rest with whatever central team their force had. The team’s
recommendations, however, would be put to the force’s Best Value Steering Group
or equivalent and to the police authority.
Models for prioritisation
Although the Audit Commission (1998) highlighted how some local authorities
were adopting models to aid prioritisation, only nine forces intended to do the
same – usually by adapting the London Borough of Newham’s model10. Some,
however, had developed their own version (Box 11).
The advantage of adopting a model for prioritisation was that it simplified
decision-making – enabling all the choices to be made rapidly and according to the same
criteria. It also ensured that these decisions were potentially more objective and,
therefore, more defensible. The value of the approach, however, depended on the
criteria chosen, the weight applied to the scores and how, or by whom, the scores
were decided.
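A prioritisation model of this kind can be sketched as a simple weighted-scoring routine. The factor names, weights and scores below are illustrative assumptions only, not any force's actual criteria:

```python
# Minimal sketch of a Newham-style weighted prioritisation model.
# Factor names, weights and scores are hypothetical examples.

def prioritise(services, weights):
    """Rank services for review: each factor is scored 1-5 (5 = high
    priority), multiplied by its weight and summed; highest total first."""
    ranked = []
    for name, scores in services.items():
        total = sum(weights[factor] * score for factor, score in scores.items())
        ranked.append((name, total))
    # The services with the highest total scores are reviewed earliest.
    return sorted(ranked, key=lambda item: item[1], reverse=True)

# Illustrative weights and scores.
weights = {"public_importance": 2, "performance": 2, "cost": 1}
services = {
    "IT department": {"public_importance": 2, "performance": 3, "cost": 5},
    "Custody":       {"public_importance": 4, "performance": 2, "cost": 3},
    "Patrol":        {"public_importance": 5, "performance": 3, "cost": 4},
}
print(prioritise(services, weights))
```

As the report notes, the output is only as defensible as the criteria, the weights and the way the scores are decided: changing the weight on any one factor can reorder the whole programme.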
Constructing review programmes
Once authorities and forces have agreed how they plan to define policing
services, they should draw on their corporate review findings to devise a
10 This model prioritises by scoring each defined service from one to five points
(where five means ‘high priority’) according to factors such as public satisfaction,
importance to the public, user satisfaction, performance and cost.
Box 11: Review prioritisation in Leicestershire Constabulary
Leicestershire Constabulary had developed a weighted model using the following factors:
● importance of the service – assessment of extent to which service matters to public and
stakeholders – it is likely that core services will score more highly than ancillary services;
● public satisfaction – whether this is improving/falling and perceived performance
getting better/worse;
● comparative performance – PIs and their trends;
● cost/benefit – the cost of the service, likely cost of review and potential for savings;
● external influences – pressure to review particular services, areas or practices from
Audit Commission, HMIC, the government or elsewhere;
● benchmarking – depending on the availability of data from forces and elsewhere (in
some cases, reviews may be delayed until information is available);
● corporate objectives – set at force level, relating to areas for organisational
development;
● other organisations – whether review might be co-ordinated with others; and
● timing – taking account of factors such as when the last review took place, whether it
complements other reviews and whether legislation is anticipated.
Each potential target will be assessed against these nine factors and given a score between
one and five to reflect its level of priority: five indicates a high priority for review and one
the lowest. The services with the highest total scores are likely to be reviewed earlier in
the programme.
programme that will ensure all services are reviewed between 1 April 2000 and
31 March 2005. This does not mean, however, that programmes can or need to
list every review that will occur over the next five years – programmes need to be
flexible, so that new reviews can be added or planned reviews amended or
rescheduled in the light of:
● findings from service reviews – which might, for instance, highlight new areas
for review or be extended to include an area originally intended for separate
review;
● changes to force objectives/priorities following consultation; and
● external factors – such as new legislation or national developments (e.g. the
MacPherson report and Human Rights legislation), or local factors such as the
development of a new town.
Although some forces had already listed a large number of reviews for particular
years, most were yet to define their programmes. Many force contacts mentioned
that they had difficulty knowing what should be reviewed in two years’ time – let
alone five. The forces we visited that had developed review programmes had
decided that the best approach was to have a pragmatic outline structure of reviews
with an element of regular review. Given the need to link service prioritisation to
corporate review, forces might consider having regular force-wide reviews (Box 6,
page 30).
Forces advanced in devising their review programmes had concluded, in hindsight,
that others should bear in mind the following:
● Resources need to be planned alongside self-assessments – both in terms of
central support and whether the service can spare the staff. One force, having
reviewed a support department, commented: “…we almost had to put everything on
hold while we did the review. It was not too difficult to stop the department for a while,
but it’s going to be very hard for the divisions: they are going to have to keep everything
flowing whilst they review themselves”.
● Reviews tended to take longer than planned, although they were likely to
become faster as the force became used to them. Review programmes will,
however, have to be realistic and sufficiently flexible to cope with unexpected
delays.
● It would be worthwhile to discuss the programme with neighbouring police
authorities, forces and local authorities in the area, so that reviews might be
linked and lessons shared. One force we visited had found it very difficult to
co-ordinate its review of child protection units with social services, but had
concluded that earlier contact and planning might have made this easier.
● The programme should take account of different review approaches and tools.
Process reviews, with cross-force communications, may take longer than other
approaches. Likewise, self-assessment using the full BEM is resource-intensive
and requires careful planning.
● Although best value requires all services to be reviewed over five years, this does
not mean that a force need only review a service once in that period. Some of
the forces we visited had concluded that five years was too long between reviews
and had devised a cycle that required more frequent reviews, or intended to
review certain key services more often.
● They need to plan how to assess and manage not only a rapidly growing number
of reviews, but also their findings, action plans and progress monitoring for each,
as the programme rolls forward. Interviewees – even in forces advanced in their
preparations – suggested that they had only recently appreciated the extent to
which best value meant an accumulation of work. One interviewee commented:
“we’ve really just been concentrating on what we needed to do over the next year – but
best value is not just a flow of reviews, it’s a wave that will get higher”.
Producing Performance Plans
By 31 March every year, police authorities and forces will need to have agreed their
Performance Plans. No detailed guidance had been issued, by the date of this
report, on the content and format of these Plans. However, DETR (1998b, 1999)
and the Local Government Act 1999, suggest that an authority’s Plan should:
● summarise the force’s objectives and its assessment of how well it had been
meeting these;
● compare its performance with the performance of others;
● report the findings of reviews in the previous year and proposed actions to
improve performance (including measures and targets); and
● set out the timetable for its proposed service reviews.
Few authorities or forces we spoke to had started to consider how to develop their
first Performance Plan – although many were concerned that they had received no
advice. Experience suggested that review reports and Action Plans tended to be
bulky documents unsuited to wide-scale publication. The guidance (DETR, 1999),
however, had also made it clear that Plans should employ “plain language, a clear
layout and relevant information”.
It was clear, therefore, that published Plans would
have to summarise quite extensively the results of reviews – concentrating
primarily on the proposed objectives, actions, measures and targets. One force,
Greater Manchester Police, had already taken this approach by setting out, in
short glossy reports on its reviewed sub-divisions, what residents could expect
to see receiving attention. The force intended to combine these Performance
Improvement Plans into a contribution to its force-wide Performance Plan.
6. Service reviews and the ‘4Cs’
Review phases and the ‘4Cs’
The success of the service’s approach to best value ultimately depends on the
effectiveness of its reviews. Authorities and forces will need to demonstrate that
every service has been subjected to the ‘4Cs’ (Box 2, page 7), and should be clear,
by April 2000, how they will do this.
Although this Section examines the ‘4Cs’ as distinct review components they are
unlikely, in practice, to form consecutive tasks within each review. Instead, many
forces were tending to divide their pilot reviews into phases that mixed and
combined elements of the ‘4Cs’ in a way that served to highlight how, in fact, the
requirements were inter-related. Some interviewees, therefore, found it difficult to
disentangle the ‘4Cs’, concluding that:
● consultation findings enhanced performance comparisons;
● comparisons helped them consider competitive alternatives; and
● information on competitiveness enabled them to challenge service provision,
and so on.
It was also apparent that there was no optimum way to approach the ‘4Cs’ – some
forces preferred to challenge at the start of a review, whilst others did so at the end
(on the basis that they needed all the information first). Nevertheless, authorities
and forces needed both to plan how they would cover the ‘4Cs’ and demonstrate
retrospectively that they had done so. A typical review might start with the review
team mapping out their timetable and then initially considering why the service
existed (i.e. challenge) and what other options might exist (i.e. partial
consideration of ‘compete’). From there, the review might mix and match the
elements as it progressed (Box 12). In practice, forces did not need to adopt a rigid
framework: many emphasised that they would plan each review’s structure at the
start – ensuring that, at minimum, they could demonstrate they had met all the
requirements. Some forces, therefore, planned to produce a template that could be
added to and altered for each review.
Box 12: A possible service review
Each phase is listed with the ‘4Cs’ it covers shown in brackets:
● Initial planning meeting/workshop, also discussing whether/why the service
should be provided by the police (Challenge)
● Gathering and discussing available data to compare with past data (PIs,
staffing levels, activities, costs) (Challenge, Compare, Compete)
● Gathering and discussing comparative performance data available for similar
service providers (possibly using benchmarking clubs or questionnaires)
(Compare)
● Questionnaire/focus groups/interviews for a sample of users, staff and
stakeholders (Consult)
● Self-assessment (e.g. BEM) to gather more detailed information (consultation
may be part of this stage, as might process mapping/workshop on activities)
(Compare, Consult, Challenge)
● Examination and consideration of all other options for the service (using cost
and benchmarking data) (Challenge, Compete)
● Report listing outcomes and options for consideration by Steering Group,
Chief Officers and Police Authority; agreed Action Plan with objectives and
measures (Challenge, Compare, Consult, Compete)
Data availability and reliability
Most forces had decided that the gathering of all available data was a key initial
stage in each review. Many were finding, however, that this was proving much
easier for operational services than for support services, which were unused to
having their performance examined and monitored so closely. To some extent, this
was influencing review prioritisation, but it was also prompting some to put in
place new data collection systems that would mean the data would be available
once they came to review the service. This was one reason for South Wales’
approach (Box 4, page 24).
One force that had reviewed a support service had found particular problems with
the data they collected – not only in terms of availability but also reliability. They
emphasised that forces should be careful not to draw conclusions from one data
source, but should triangulate (i.e. compare other related data) wherever possible.
Challenging
All forces recognised that ‘challenging’ involved at least some assessment of why
they were providing a service, but there were differences in opinion on the ‘extent’
of this challenge. Whilst some forces considered that ‘challenge’ meant questioning
in every case whether a service should be provided, others defined it simply as
questioning how the service was provided. Finally, a few forces felt that the
challenge element was to be only partially applied to policing – principally the
support services.
This confusion partly stemmed from the differing nature of services for review and
also illustrated another interlinkage between the 4Cs: this time, between challenge
and compete. The legislation recognises that core policing services (such as CID)
are exempt from being subject to competition. Some forces had interpreted this to
mean that core policing functions need not be challenged in the same way. Whilst
some policing services might not be opened to competition, however, this does not
mean that they are exempt from challenging questions, such as:
● Why are we providing this service?
● Should we supply this service in this way?
● Are there elements of this service we could supply differently, or possibly not
at all?
● To what extent does this service impact on our other services?
Forces that had completed reviews had already concluded that important
operational policing services could be challenged and thus improved. One force, for
instance, had reviewed its domestic violence unit, concluding that civilian staff
could more efficiently provide it, because police powers were seldom required.
Ways to challenge
Many forces were unsure about exactly how they would challenge services,
although most were considering adding standard ‘challenging’ questions to a review
template. Some forces, having discovered that reviewed services found it hard to
define their purpose, were starting their challenge element by establishing why
each service considered it existed before challenging each statement in a workshop
(Box 13). Others were waiting until process mapping, such as PE or FAST, had
identified all staff activities, before challenging these consecutively (sometimes
with a ‘wildcard’ present, who could give an outsider’s perspective). Appendices 4
and 5 detail the approaches taken by two forces.
Box 13: Challenging questions
One force had developed the following framework to help challenge the purpose of a
division or department. To begin with, review teams and key service members
brainstormed or listed the main areas for which the division or department was
responsible. Taking witness interviews as an example, they might construct a series of
purpose statements (to summarise why a service existed):
ACTION VERB: To interview
OBJECT: Child witnesses
RESULT OR STANDARD: In accordance with legislation and the
force Memorandum of Good Practice
They then challenged each statement in a workshop setting, asking why each was
necessary and what possible alternatives existed.
Who should challenge?
The ‘challenge’ and ‘compete’ elements seemed the most culturally difficult to
handle. Whilst questions such as ‘why does this service exist?’ might productively
involve all staff or a sample, many forces seemed to be concluding that the more
difficult ones such as ‘what activities could you cease?’ or ‘who else might provide
your service?’ needed more careful handling.
One force was considering using staff from the reviewed service to undertake the bulk
of the review work, but asking an independent team to conduct the ‘challenge’ and
‘compete’ elements with key service members only – both to ensure rigour and
maintain morale. Some forces also expressed a need for external involvement in
review processes to allow challenges to services to be genuine and credible. One
commented: “We don’t want units to self-review. It’s essential we have an independent
team or they’ll just spend too much time justifying themselves”.
Comparing
Forces were already familiar with the concept of comparing service performance on
a national level, using internal performance measures to compare operational
performance and the suite of national performance indicators to compare
externally. Some were also comparing at force family level, and had in place
arrangements to share data on a quarterly or monthly basis. Most added that they
also examined HMIC and Audit Commission data and reports.
Best value, however, adds a new dimension to performance comparisons by
requiring service-level comparisons. Services should not simply be compared
with other forces, but also, where possible, across the public and the private sector.
Furthermore, guidance (DETR, 1998b) suggests that, initially, reviews must
produce quality, cost and efficiency targets over five years that, as a minimum, are
consistent with the performance of the top 25% of all authorities at the time the
targets are set. To set such targets, the police need to be aware of performance
elsewhere and whether it is calculated on a comparable basis, so that they can
benchmark.
The reliability and validity of comparative data was, in fact, the forces’ main
concern about the ‘4Cs’. Although it was more likely to be a problem as forces
extended their comparisons beyond the police service, it was clear that there were
data problems between and even within forces. One interviewee commented:
“Even with response times to 999 calls we have problems comparing with other forces
because they measure this in very different ways. For example, when does the response
time begin? Is it when the phone rings? Is it when it is first answered? Is it when the call is
completed?” This sort of comparability issue has been well-known for some time,
but best value extends it into areas not usually compared – such as custody and
victim support services. Although quality of service may be judged by user surveys,
quality definitions differ and such exercises are often conducted in different ways.
Effective benchmarking should also take into account the costs of delivering a
service. In the current absence, however, of nationally harmonised service
definitions and costing practices (see Section 2) forces were usually restricting
costed benchmarks to in-force comparisons of operational policing activities.
The comparability of data within and between forces is an issue that will
require national attention as best value progresses. Although some national
level indicators already exist to aid reviews, accurate comparisons at service
level require individual functions and processes to be broken down into
activities. The national process mapping exercise (Section 4) may contribute,
by enabling similarly defined processes and activities to be compared. In the
meantime, in the absence of a national database for police benchmarking or a
contact list for forces undertaking similar reviews (both requested by many
interviewees), forces were making their own formal or informal benchmarking
arrangements.
Benchmarking between forces
Despite the difficulties, many forces were starting to co-operate with others to
benchmark performance for particular services. Four variants of this were
immediately identifiable:
● local comparisons – with neighbouring forces/those in the ACPO region;
● service leaders – with forces acknowledged to lead in the service being
reviewed;
● most similar-force – with the services provided by other forces in the old HMIC
families or the new families of most similar forces; and
● all forces – by sending detailed questionnaires on each service to every force to
gather comparative data.
All of these approaches have their advantages and disadvantages. Local
comparisons were probably the easiest to organise but, as one interviewee
commented: “if the force next door provides a rubbish service, what’s the point in
benchmarking against it?” Some forces felt that, without national standards, they
could not accurately identify service leaders, adding that they might, in any case,
benefit from special circumstances not enjoyed by other forces. Service comparisons
with most similar forces might ensure that key differences between forces were
ironed out – but could not guarantee that comparison at service level was fair.
Finally, sending out questionnaires to all forces might elicit detailed information
(provided definitions matched), but if every force did this there would be a tidal
wave of questionnaires across the forces: already some reported being fed up with
receiving questionnaires and many were ignoring them.
It seemed likely, therefore, that forces would benefit from combining and mixing
these approaches – both to establish as wide a picture as possible and avoid relying
on one particular technique with its potential disadvantages.
Some forces, however, were also concerned that their performance was already high
and that they experienced diminishing returns from benchmarking the higher they
were in the performance ‘league’. One interviewee commented: “Our performance in
most areas tends to be better than others, so there appears little scope for improvement.
It might be more appropriate for us to compare with forces in other countries.”
Although some forces might have good records in particular services, however, it
seemed unrealistic to suggest that performance across all services could not be
improved.
It was common practice for forces to compare divisions within their own force and,
therefore, well known that such comparisons had to take into account the differing
characteristics of each (in terms of size, local population, socio-economic factors,
etc.). Some forces had developed their own families of divisions to try to overcome
these differences, but this did not solve the problem when they came to compare
divisional performance with other forces. One interviewee commented: “What we
really need to be able to do is to benchmark at BCU level between forces – there is a
major gap here which needs to be filled. We would like to compare Tunbridge Wells
with, say, Exeter11 – if only we could be sure that they were comparable”.
Another interviewee, however, felt that, given the new Crime and Disorder
partnerships, benchmarking should more accurately take place at this level – thus
enabling data sharing to encompass local authorities.
Future research is planned to develop a national set of local-level families for
police performance comparisons. In the meantime, however, authorities and forces
may have to consider alternatives – such as constructing informal families within
their regions.
Benchmarking beyond policing
Most interviewees commenting on external benchmarking associated it solely with
non-operational policing. Many pointed out that operational policing was a unique
service, with no obvious comparators in the private sector. Some added that there
was even less certainty that data were similar and valid when comparing with
non-police organisations than there was between forces.
Nevertheless, a large number of forces had already started to compare services
externally. We identified three current approaches:
● joining a wide-ranging benchmarking club (such as the Best Practice Club, the
Civil Service Benchmarking Club or the European Network of Benchmarking);
● paying for access to a large benchmarking database (e.g. Andersen Consulting);
and
● benchmarking on an informal and ad-hoc basis with individual public/private
service providers to compare specific services (e.g. fire and ambulance, BT car
fleets, etc.)
As with the approaches to benchmarking between forces, interviewees identified
advantages and disadvantages to all these approaches. Benchmarking clubs and
databases potentially enabled forces rapidly to identify and contact similar service
suppliers, but interviewees found their coverage was often largely irrelevant to
policing and some were concerned that forces might end up being bothered by
other organisations wanting to benchmark against them. Benchmarking on an
informal basis was popular, but required a relationship to be built up, and some
national providers were reportedly becoming weary of receiving successive informal
requests for help from force after force.
One other key problem interviewees identified with external benchmarking was
the need to protect the confidentiality of police information. This either restricted
the extent and depth of comparisons or required lengthy weeding-out of certain
information before benchmarking could take place.

11. The names of the towns have been changed.
Despite these difficulties, we encountered numerous examples of forces comparing
service performance with private organisations. A couple of forces had compared
their help desks with large city firms or Virgin Direct, whilst forces reviewing fleet
management had approached BT and ParcelForce. One force had compared its
Exhibition Unit with those in the NHS and fire service, and its intranet with BT’s
and Shell’s. There was evidence that lessons were being learnt from these exercises:
one force had found that its fleet saved money by converting to liquid petroleum
gas, whilst another discovered that a service was ‘top-heavy’ in comparison to the
private sector. One force had discovered how to remove faults from its intranet
after comparing it with a private company’s similar system. Another force had also
discovered that, whilst it might be difficult to find adequate private sector
comparators for whole services, it was potentially possible to break the policing
service down into its component activities and then find comparators for these.
Consulting
Consultation is key to best value, both at force-level, to inform the corporate
review, and at service-level, to inform individual service reviews. Section 2
described how forces were often confident that existing consultation mechanisms
meant they largely met these requirements. Forces undertaking reviews of
operational policing, for instance, were able to draw on existing public consultation
information (from past public satisfaction surveys and Crime and Disorder Audits).
Very often, however, interviewees explained that their force also planned to extend
existing mechanisms by establishing new focus groups, or by ‘piggybacking’ on local
authorities’ large surveys or citizens panels. Some were concerned, however, about
the value of large-scale surveys, when there was a need to target particular areas or
sections of the community. Indeed, with all these new consultation practices, some
forces were concerned that they and the public would start to suffer ‘consultation
overload’. Some felt that the solution was for forces to agree and co-ordinate their
consultation with the consultative role of their police authority, and to discuss
combining different forms of consultation, such as focus groups, with those run by
their local authorities (which would also save money). Collaborative consultation
between neighbouring forces was also raised as a way of collecting information
more efficiently.
Others were concerned that increased consultation might raise public expectations
to unrealistic levels. A number of forces were intending to tackle this second
danger by combining marketing exercises with their public consultation, to manage
public expectations. For example, these forces would be sending out leaflets with
their surveys, explaining policing services and what the public could expect.
Despite the often considerable efforts being made to consult the public at a higher
level, however, forces also had to tailor consultation exercises for each review to
ensure that the following groups, where relevant, were included:
● public users (e.g. victims of domestic violence who had contact with domestic
violence units);
● non-public users (e.g. businesses in a particular division);
● force users (e.g. users of the force legal services department);
● stakeholders (e.g. other agencies, such as health authorities or social services);
and
● providers (i.e. the staff delivering the service).
In the main, staff using services and those working in the reviewed service were
keen to be consulted. Some forces emphasised that the inclusion of staff, and staff
associations, in service reviews was important not only for completeness of the
review, but also to maintain morale by keeping employees informed and promoting
a participative atmosphere.
Forces, however, were finding other groups more difficult to consult: not everyone
had an opinion or wanted to share it with the police. One interviewee had
attended numerous business events and frequented the local Chambers of
Commerce, but had found it very difficult to generate any enthusiasm amongst
businesses for sharing their views with the police. Furthermore, given the nature of
some police work, not all users could be consulted: one force, for example, which
had reviewed its child protection units pointed out that it would be unreasonable
to consult child victims on their views of the policing service they had received.
Past research has demonstrated the problems experienced by forces when trying to
establish PCCGs (e.g. Morgan, 1992; Elliott and Nicholls, 1996). Low turnout at
‘open space’ events is an obvious problem. One interviewee felt, however, that this
was no excuse: “Lack of attendance at meetings in draughty halls on a wet Wednesday
evening, is not an indication of a lack of interest in local policing issues, it is an indication
of a lack of imagination on behalf of the force. We need to be much more innovative in
terms of how we reach our community”.
Numerous force representatives described innovative approaches. They were, for
instance, starting focus groups for hard to reach groups, introducing loops for the
hard of hearing, recruiting interpreters to extend consultation to non-English
speakers and installing terminals in public centres for people to find out about local
policing and record their opinions. Some forces were also using their intranet both
to disseminate best value developments throughout the force and invite comments
on particular services under review. Suffolk were planning to bring both
consultation and their use of the BEM to life by introducing an interactive IT
version (used by Eastern Power and Barclays Bank). The force’s respondent
explained: “members of focus groups are given handsets (like on ‘Who Wants to be a
Millionaire’) and they score the force against questions based on the BEM structure.”
These findings were encouraging, but only further research will identify which
innovations were most effective in differing circumstances and why.
Competing
Many forces were finding the ‘compete’ element the most difficult of the ‘4Cs’ to
come to terms with. The White Paper (DETR, 1998b) had defined competition as
a test of whether a service might be delivered in a more efficient and effective way
(Box 14), adding that core policing services would be exempt from this
requirement. Despite this, forces remained unsure about what ‘competition’ meant
in the context of policing and concerned that it should not extend to operational
activities and those that involved access to confidential information.
Box 14: Competition and policing
“The key strategic choice for authorities is whether to provide services directly themselves or to
secure them through other means. The key test is which of the options is more likely to secure
best value for local people. Services should not be delivered directly if other more efficient
and effective means are available…Retaining work in-house without subjecting it to real
competitive pressure can rarely be justified…The Government recognises that it would not be
appropriate to expose to competition certain core statutory activities carried out by the police.”
DETR (1998b)
Experience of competition in the service has, to date, been more limited than in
most public services, both because of the nature of policing and the relatively late stage
at which CCT was extended to forces. Although many forces, particularly the
larger ones, could cite examples of contracting-out, these were invariably ‘one off’
cases, relating to peripheral support services such as cleaning, catering and finance.
Competition in policing
For the police, therefore, best value’s requirement that reviews subject all but their
core functions to competitive testing is likely to have a significant impact. In
practice, it means that each review should examine whether another supplier could
be procured to supply the service or whether some other means, such as a
partnership approach, could reduce costs without lowering standards or raise
standards for the same cost. This need to compare costs, however, highlighted
difficulties faced by forces trying to apply economic evaluation to policing
(Stockdale et al., 1999). One force explained that it was difficult to account for
all factors – such as the potential danger that contracting-out vehicle maintenance
to a cheaper external provider might mean poorer service and more accidents.
To counter the view that competition necessarily leads to contracting-out a whole
service to external bidders, the White Paper (DETR, 1998b) suggested possible
competitive alternatives to existing service provision, including:
● commissioning an independent benchmarking report so that in-house services
could be restructured to match the performance of the best providers;
● providing a service in-house, but buying in top-up support from the private
sector;
● forming a joint venture or partnership following a competition for an external
partner;
● tendering part of a service with an in-house team bidding against external
bidders, before deciding whether to provide the bulk of a service internally or
externally; and
● disposing of, or selling off competitively, a service and its assets to another
provider.
Authorities and forces advanced in their preparations acknowledged that best value
was more flexible than CCT, with the potential for combining models of service
delivery – including contracting-out parts of services, or agreeing joint
arrangements with other agencies for certain related activities. Competition was
also about ensuring value for money, so that savings could be re-invested (“more
bang for your bucks” as one interviewee put it). Whilst it was too soon to assess
whether forces were employing competitive alternatives, reviews were identifying
ways of improving value for money – for example, by leasing rather than buying
equipment, or by using support staff to undertake some duties so that officers could
be released for operational policing.
Authorities and forces were also starting to examine options for consortia to
save procurement and management costs (for instance, sharing resources such as
a helicopter). A number of forces already had arrangements based on
‘comparative advantage’ (i.e. specialising in areas they were best at and ‘buying
in’ other services from forces specialising in their best areas). For instance, forces
with a small stretch of motorway had agreements with neighbouring forces with
more extensive motorway coverage. Although fears of ‘regionalisation’ were
expressed, it was likely these arrangements would be expanded to other services
and possibly other emergency service providers (such as health and fire
authorities). A popular example was the frequently mentioned potential for
shared control rooms.
Managing the ‘compete’ element
Forces were usually planning a final discussion of review results, at which the team
and senior managers would use the data they had gathered and analysed both to
challenge the service and to consider competitive alternatives. This Section earlier
described how some forces directly equated ‘compete’ to ‘challenge’. For them,
examining whether there was a more economic way of providing a service meant
challenging their direct provision of it. This potentially overlooked, however, the
fundamental basis of ‘challenge’ – to establish whether the service should be
provided at all, internally or externally.
One of the official pilots had found it important to address the fear of competition
– particularly amongst support service staff. Their message had been that
competition did not mean providing a service in the cheapest way, but improving
quality in relation to cost and, where possible, releasing resources for use elsewhere.
Some interviewees, however, suggested that the police should remain alert to the
danger that competition would fall unevenly across each force with adverse
implications for morale. One commented: “There’s a real danger that competition will
fall on small areas of the service constantly. For example, a lot will fall on HQ
departments: areas with a lot of support staff are likely to be under constant scrutiny.”
Review outcomes: objectives and Action Plans
Once a review has been completed and its findings gathered in a report, the force
and police authority need to consider the options before agreeing an Action Plan.
In most cases, forces considered that the best value team should draw up the
alternatives for discussion – rather than leaving it to the reviewed service or the
steering group. One force had decided to present service managers with three
options from which they had to choose one – this approach had the advantage of
limiting discussion to the realistic whilst ensuring that the service felt it had
‘ownership’ of its future.
Forces that had conducted reviews and agreed their Action Plan’s objectives were
invariably breaking these down into small projects, each with its own objectives
and an identified owner. Two forces at this stage highlighted that these objectives
needed to be SMART (Specific, Measurable, Achievable, Relevant and
Timebound). Project or activity owners were then agreeing what actions they
would take and resources they would need with the service manager – who retained
overall ownership of the Action Plan.
It was too soon to assess how Action Plans were being taken forward, but some
forces were already identifying the need to have in place:
● continued central support, advice and monitoring of progress;
● monitoring mechanisms within the service (one force was considering requiring
each service manager to run COMPSTAT12-like meetings with project owners);
● project management tools (such as PRINCE 2 and Microsoft Project);
● the ability to produce audit trails;
● a blame-free culture should a project fail despite careful implementation; and
● the flexibility to amend the Action Plan and projects in the light of results and
changing circumstances.
12. COMPSTAT, introduced in New York by the then Police Commissioner William
Bratton, entails managers or divisional commanders being subjected before their
peers to detailed and sometimes hostile questioning on their performance.
7. Overview and the future
All police authorities and forces in England and Wales were making progress in
their preparations for best value, having concluded that they could not afford to
wait for guidance or findings from the pilots. It was also clear that the picture was
rapidly changing: in April forces were in the early stages of considering widely
differing approaches to best value, but by July their preparedness and approaches
seemed to be broadly converging, as they started to pilot reviews and encountered
similar issues and difficulties. Key areas where patterns were emerging included:
● authorities and forces were generally aiming to build on existing structures, systems and cycles, rather than planning radical organisational changes in the first stages of best value;
● some were choosing to develop their systems before piloting reviews, whilst others were doing the reverse – the end results, however, were broadly the same in terms of programmes and approaches to review;
● perceptions of preparedness seemed to be directly related to forces’ experience of best value, with those forces most advanced in their preparations more aware of how much work it involved;
● invariably, forces were combining various tools and models in a ‘toolkit’, so that each review was conducted according to the service’s circumstances;
● all forces were using the Business Excellence Model, though to different extents – usually as a self-assessment tool. Many were process mapping to identify activities;
● forces were taking a hybrid approach to service definition to avoid the disadvantages of just one approach – reviewing services by function or by taking a process-based approach (although the picture was complicated by how they combined these approaches with different review targets and levels);
● reviews were usually being prioritised according to a combination of services’ past performance, their budget size and the potential for savings to be made;
● increasingly, forces were forming central teams to oversee best value’s day-to-day management;
● where resources were limited, however, forces were also tending to let services self-assess, with support and guidance from the centre;
● authorities and forces were concluding that review programmes needed to be flexible to take account of changing circumstances;
● early pilot reviews were taking longer than expected, but forces had concluded that time-scales would shorten as they became more used to them;
● there was increasing evidence of forces and local authorities combining their consultation and even starting to discuss joint reviews; and
● there were signs, too, that police authorities and forces were developing a closer working relationship – in some areas authority members were participating in regular seminars, and in others, taking part in best value steering groups and in reviews.
Alongside these developments, some common concerns and difficulties were also
emerging, along with some illuminating responses to them:
● Forces were starting to experience the cultural implications of best value. Some staff – particularly in support services – were feeling threatened by the ‘challenge’ and ‘compete’ elements of reviews, whilst service heads were occasionally reluctant to help reviews. Many forces were therefore developing force-wide communication strategies to ‘market’ best value, thereby encouraging staff and staff association participation. Some forces were also mixing ‘marketing material’ with their public consultation to explain policing services and manage the potential for raised public expectations.
● Authorities and forces were finding that their other planning cycles did not match their best value programme. Some, however, were addressing this by varying the length of their review programme to match cycles, such as that for Crime and Disorder.
● Some police authorities were finding that best value was remaining the preserve of a minority of members – they were therefore finding ways to communicate developments across the authority. To counter the danger that best value might lead to friction between authority and force, some had agreed their respective roles and others had formed joint steering groups.
● Some forces had found they could learn little from the local authorities in their area – so they were examining developments further afield. Others had tried to develop joint reviews, but been frustrated by differences in timetables and agendas – they had identified the need for earlier planning in future.
● All of the tools for best value had their potential disadvantages (in terms of complexity, relevance or comparability), but forces were accepting that these models were no more than aids, and that some of the faults could be overcome by combining or adapting tools.
● In a number of reviews, forces had found that the data they needed were unavailable. Some were reacting to this by requiring all services to self-assess regularly, so that the information would automatically be available once they were due for review.
● Forces were finding it difficult to benchmark because it was often hard to identify leading service providers and data were seldom comparable. Accurate benchmarking would also be assisted by activity costing, but this was still in development, and comparisons were made harder by differing techniques, definitions and conventions. In particular, it was proving difficult to compare police services with the non-police sector. Despite this, there were many examples of effective benchmarking between forces, and involving non-police organisations, that had led to savings. Forces pressed for the establishment of a national database or website to help them communicate developments, learn lessons from each other and benchmark more easily.
● It was sometimes proving difficult to consult effectively on particular services (because of the nature of the service or reluctance on the part of the user or stakeholder). Nevertheless, forces were developing some very innovative and interactive techniques that could have wider lessons for improving consultation more generally.
● Forces envisaged difficulties applying the ‘compete’ element of the ‘4Cs’ to services other than support, because of the nature of policing and the lack of clarity regarding which areas of policing would be exempt from the legislation. However, there was a growing appreciation that ‘contracting-out’ was not the only option, and there was evidence that forces were examining alternative arrangements – such as ‘comparative advantage’ agreements and consortia.
Future issues
There were also some issues identified as key to the future success of best value
policing. In particular, police authorities and forces will need to consider how
they will:
● manage and resource the rising tide of work that will result from best value as it rolls forward;
● manage and monitor Action Plans;
● monitor their best value approach as a whole, and how they will take action if it seems to be failing;
● meet the requirements of the audit and inspection regime, once it becomes clear; and
● monitor and learn from developments elsewhere – the formal pilots, other forces, local authorities and fire authorities.
Conclusion
These findings represent only the more commonly described features of current
police preparations for best value: it would be impossible to describe, in one report,
all the topics raised and preparations underway. Best value will not come into effect
until April 2000 and it will, therefore, be some time before it is clear which
approaches in what circumstances are most likely to be successful. Throughout the
research and even as this report was published, more issues and potential lessons
were emerging. Future research is planned to follow up developments but, for now,
this report can only raise issues and, sometimes, potential solutions: it cannot
supply all the questions, let alone the answers.
Ultimately, police authorities and forces must bear in mind that they will be judged
not on their mechanisms and review approaches, but on whether they are self-reviewing rigorously and improving on the basis of the findings of those reviews. In
short, forces need to deliver best value policing – the ways by which they do so can
differ, but the result must be the same: continuous improvements in service.
References
Accounts Commission (1998), The Measures of Success: Developing a Balanced Scorecard to Measure Performance, Edinburgh: Accounts Commission for Scotland.

Audit Commission (1998), Better by far: preparing for Best Value, Management Paper, London: Audit Commission.

Bovaird, T (1998b), Achieving Best Value through competition, benchmarking and performance networks, Paper No. 6, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

British Quality Foundation (1998), Local Government Interpretation of the Business Excellence Model, London: BQF.

DETR (1998a), Modernising local government: improving local services through Best Value, Green Paper, March 1998, London: DETR.

DETR (1998b), Modern Local Government: in touch with the people, White Paper, July 1998, Cm 4013, London: The Stationery Office.

DETR (1999), Preparing for Best Value, Guidance note, London: DETR.

Elliott, R and Nicholls, J (1996), It’s Good To Talk: Lessons in public consultation and feedback, Police Research Series Paper 22, London: Home Office.

Geddes, M (1998), Achieving Best Value through partnership, Paper No. 7, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

Hartley, J (1998), Organization-wide approaches to Best Value, Paper No. 10, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

HMIC (1998), What Price Policing: A Study of Efficiency and Value for Money in the Police Service, London: HMIC.

House of Commons (1998), Local Government Bill and Explanatory Notes, London: The Stationery Office.

Kaplan, R S and Cooper, R (1998), Cost and Effect: Using integrated cost systems to drive profitability and performance, Boston: Harvard Business School Press.

Kaplan, R S and Norton, D P (1996), The Balanced Scorecard, Boston: Harvard Business School Press.

Lewis, M (1998), Achieving Best Value through Quality Management, Paper No. 9, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

Local Government Management Board (1999), Best Value: An Introductory Guide, London: LGMB.

Martin, S (1998), Achieving Best Value through Public Engagement, Paper No. 8, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

Martin, S J and Hartley, J (with BMG Marketing, LGA, LGMB) (1998), Best Value: Current Developments and Future Challenges, London: LGA.

Morgan, R (1992), ‘Talking about Policing’, in D. Downes (ed.), Unravelling Criminal Justice, London: Policy Studies Institute.

Olve, N, Roy, J and Wetter, M (1999), Performance Drivers: a practical guide to using the Balanced Scorecard, Chichester: Wiley.

Porter, L and Tanner, S (1996), Assessing Business Excellence, London: Butterworth-Heinemann.

Sanderson, I (1998), Achieving Best Value through fundamental performance review, Paper No. 7, Warwick/DETR Best Value Series, Warwick: Local Government Centre.

Sanderson, I, Bovaird, T, Davis, P, Martin, S and Foreman, A (1998), Made to Measure: Evaluation in Practice in Local Government, London: LGMB.

Stockdale, J E, Whitehead, C M and Gresham, P J (1999), Applying Economic Evaluation to Policing Activity, Police Research Series Paper 103, London: Home Office.

Zairi, M (1996), Benchmarking for Best Practice: Continuous learning through sustainable innovation, London: Butterworth-Heinemann.
Appendix 1: The Business Excellence Model (BEM)
BEM, developed between 1988 and 1992 by the European Foundation for Quality
Management (EFQM), contends that controlled processes, supported by strategic
leadership, resources and skilled personnel, will deliver continually improving
performance and enhance staff and customer satisfaction whilst having a beneficial
impact on society more widely. The EFQM was founded in 1988, by 14 leading
European companies, to develop guidance on enhancing the competitiveness of
European companies in the world market. In the United Kingdom, the model is
promoted by both the Department of Trade and Industry and the British Quality
Foundation (which, by 1999, had over 1800 members).
The model, last revised in April 1999, divides business excellence into nine key
criteria, each representing either an ‘enabler’ or a ‘result’. Enablers are concerned
with how an organisation is run, whilst the results are what the organisation has
achieved and is perceived to have achieved by its stakeholders (i.e. customers,
employees, the community and other related organisations).
Figure 3: The Business Excellence Model
[Diagram: the EFQM Excellence Model. The five ‘enabler’ criteria – Leadership; People; Policy & Strategy; Partnerships & Resources; Processes – lead to the four ‘results’ criteria – People Results; Customer Results; Society Results; Key Performance Results – with innovation and learning feeding back from the results to the enablers. © 1999 EFQM. The Model is a registered trademark of the EFQM.]
Assessors award points by judging evidence (such as performance data and
interviews) for the criteria against a series of 32 related sub-criteria. For instance,
under ‘people results’, assessors will ask whether and how the organisation measures
employee satisfaction and what staff perceptions are of management style, their
working environment and morale. For each criterion, the number of points
awarded depends on the extent to which there is evidence that the:
● Results required have been determined;
● Approach has been planned and developed;
● chosen approach has been Deployed; and
● approach and its deployment have been Assessed and Reviewed.
Rather than this RADAR approach, many forces were using an earlier version (on
which the police specific version was based). For this version, the number of points
for the ‘enabling’ criteria depends on the evidence of the approach taken and
extent to which it is deployed. For the ‘results’ criteria, the number of points
depends on the comparative performance trend and extent to which this impacts
across the organisation. Each criterion’s points, determined by this version, or the
RADAR model, are then weighted by its relative importance within the model, so
that an overall score is produced. The total number of points theoretically
achievable is 1000 (split evenly between enablers and results), although most
organisations fall well short of this.
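The weighted-scoring arithmetic described above can be sketched as follows. This is an illustrative sketch only: the criterion weights shown are assumptions chosen simply to give the even 500/500 enabler–results split, not the EFQM’s published weightings, and the assessment figures are hypothetical.

```python
# Illustrative sketch of BEM/EFQM-style weighted scoring. Assessors score
# each criterion 0-100%, and the scores are weighted by the criterion's
# relative importance so that the theoretical maximum is 1,000 points,
# split evenly between enablers and results. The weights below are
# illustrative assumptions, not the published EFQM figures.

ENABLER_WEIGHTS = {                 # enablers: 500 points in total
    "Leadership": 100,
    "Policy & Strategy": 100,
    "People": 100,
    "Partnerships & Resources": 100,
    "Processes": 100,
}
RESULT_WEIGHTS = {                  # results: 500 points in total
    "People Results": 125,
    "Customer Results": 125,
    "Society Results": 125,
    "Key Performance Results": 125,
}

def overall_score(percent_scores: dict[str, float]) -> float:
    """Combine 0-100% criterion scores into a single /1000 total."""
    weights = {**ENABLER_WEIGHTS, **RESULT_WEIGHTS}
    return sum(weights[c] * percent_scores[c] / 100 for c in weights)

# A hypothetical self-assessment - most organisations fall well short of 1000:
assessment = {c: 45.0 for c in ENABLER_WEIGHTS}
assessment.update({c: 35.0 for c in RESULT_WEIGHTS})
print(overall_score(assessment))  # 400.0
```

The same function covers both the RADAR version and the earlier version mentioned above, since only the way the percentage score per criterion is arrived at differs, not the weighting step.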
Appendix 2: Links between the Business Excellence Model and other initiatives
Figure 4 describes the extent to which five key initiatives undertaken by many
police forces in the last decade address the nine elements of the Business
Excellence Model:
no tick = not addressed
✔ = indirectly addressed
✔✔ = secondary impact
✔✔✔ = critical impact
Figure 4: Links between BEM and other initiatives

                               BS EN      QS 9000   ISO 14000   Investors   Charter
                               ISO 9000                         in People   Mark
BEM Enablers
  Leadership                   ✔          ✔         ✔           ✔✔          ✔
  People                       ✔          ✔         ✔           ✔✔✔         ✔✔
  Policy and strategy          ✔✔         ✔✔        ✔✔          ✔✔          ✔✔
  Partnership and Resources    ✔✔         ✔✔        ✔✔          ✔           ✔✔
  Processes                    ✔✔✔        ✔✔✔       ✔✔          ✔           ✔
BEM Results
  People results               ✔          ✔                     ✔✔✔         ✔
  Customer results             ✔✔         ✔✔✔       ✔           ✔           ✔✔✔
  Society results              ✔                    ✔✔✔         ✔✔          ✔
  Key performance results      ✔✔         ✔✔✔                               ✔
Source: adapted from ‘Excellence Links’, published by Business Link, BQF and DTI.
BS EN ISO 9000: the ISO framework, strongly identified with Total Quality Management (TQM), concentrates heavily on process optimisation from beginning to end, to ensure customer satisfaction. It therefore primarily links with the BEM’s ‘processes’ criterion. Relying, as they do, on system audits and reviews, the ISO standards were criticised by some force contacts for potentially becoming bureaucratic paper exercises.

QS 9000: similar to ISO 9000, this standard promotes on-time delivery, lower costs and, in particular, improved quality. The initiative is therefore mainly associated with the ‘processes’ and ‘customer results’ criteria of the BEM.

ISO 14000: this series of standards relates an organisation’s activities to its impact on the environment – with a consequent beneficial impact on customers’ and stakeholders’ perceptions as well as staff morale. The primary link is to the BEM’s ‘society results’ criterion.
Appendix 3: Forces currently piloting national Process Improvement Projects
The following forces are currently piloting the use of Process Expert Software to
map out all the activities relating to key sets of processes:
Crime management
Norfolk
Custody procedures
South Wales
Avon & Somerset
Wiltshire
Greater Manchester
Northumbria
Call handling
Lancashire
Cheshire
West Midlands
Gloucestershire
Thames Valley
West Mercia
Case file preparation
Lancashire
Sussex
City of London
South Yorkshire
Merseyside
Develop and train staff
Leicestershire
Dorset
Derbyshire
Scene management
Northumbria
Kent
For each of these processes or a related sub-process, the respective forces are
identifying every interlinking activity from start to finish and then comparing their
results in the form of multi-level maps of decision-making trees, inputs and related
costs.
Appendix 4: Case study one: Greater Manchester Police
Between April 1998 and April 1999, GMP piloted its approach to best value in
four sub-divisions belonging to one of its in-house families (each sub-division
having been assigned to a family on the basis of population, policing incidents and
land usage). The pilots covered a challenging inner-city area, with a high local
index of deprivation. Each sub-division underwent the same three-stage self-review
process detailed below. Although these reviews were geographical (i.e.
concentrating on the services provided by the sub-divisions) they also sought to
identify lessons which could be read-across to other divisions and departments, as
well as issues that required further, separate review.
Stage 1: Producing the Baseline
Between April and May 1998, selected staff in each sub-division were trained in
the application of the GMP Service Assessment Model (GMPSAM, the force’s
version of BEM). Each team comprised a lead assessor and approximately eight
evidence gatherers, drawn from support staff and officers from Constable to
Inspector rank. During June and July these teams collected evidence that included
interviews, questionnaires and focus group findings and applied GMPSAM, using
full scores and weights (on the grounds that anything less would provide
insufficiently detailed information). The lead assessor from each team then
prepared a comprehensive Baseline Assessment for each sub-division, reporting
their GMPSAM findings and also:
● a description of the area (i.e. local population, households, employment and facilities) and policing services (i.e. establishment, buildings, IT, uniform coverage, CID, etc);
● an examination of the results of external consultation (past and current) and the findings of a staff survey;
● an analysis of existing consultation and market research methods and recent findings;
● a basic costing of policing broken down by functional area and per incident (using details from the financing system, activity data for uniform officers and apportioned costs); and
● a performance review, based on national and local PIs and compared with the results from other forces in the family of forces, the provincial forces’ average and the national average (i.e. basic comparative benchmarking).
These additional elements were included in the review because the force felt that
GMPSAM failed to cover the service’s background adequately – particularly in
terms of providing the sort of information to enable them to ‘challenge’ the service.
This stage, therefore, encompassed information collection and self-assessment. As
in most other best value pilots, the ‘4Cs’ were not separated out in the process, but
data relevant to the ‘compare’, ‘consult’ and ‘compete’ elements were clearly
identifiable.
Stage 2: Option appraisal and development of Performance Improvement Plans
In stage two, the review team and service management team met to discuss the
review results and consider the service’s activities and objectives against police
authority objectives and priorities; force priorities; Crime and Disorder Strategies;
Home Office priorities and new legislation. During this stage, the force also
completed the ‘4Cs’ by challenging the sub-divisions’ activities and services, asking:
● Is the service necessary?
● Is the service important?
● How well is the service currently delivered?
● Can/should someone else deliver the service?
● Should we deliver the service in partnership?
● Can the force stop doing the task/service?
● What priority should the task/service be given?
Using the outcomes from these discussions and the data collected during the
review, the assessors then drew up a list of options to tackle the areas for
improvement for each sub-division. Their aim was to identify the priorities for
improvement so that the sub-divisions could focus their efforts – as one of the
review team members commented: “everything is important but we have to identify
those that are crucial”. The assessors and management teams in each sub-division
then drafted a two-year Performance Improvement Plan (PIP) listing the proposed
improvement objectives and targets. To heighten participation and inclusiveness,
these draft plans were circulated for comments to officers and staff in the respective
sub-divisions and shared at a later date with partners. All of the Plans had
subsequently been amended before being published and made available to the local
population and all members of the sub-division.
Box 15: Extracts from a GMP Performance Improvement Plan (PIP)
The following extracts are from the Greenheys sub-division PIP, which comprises 3 Aims
and 13 Objectives:
Aim: Attack crime that hurts the community most
Objective 1: Reduce burglary dwelling
Target: 15% reduction in burglary dwelling
Target: Establish amount of burglary repeat victim premises and set target to reduce
Target: Increase level of victim satisfaction to 95%
Aim: Strive for total trust from our community
Objective 7: Encourage the reporting of crimes against vulnerable groups
Target: Establish the level of under-reporting of crimes and incidents and set a target in
year 2 to reduce the level
Aim: Focus resources to improve service
Objective 11: Increase operational availability of all officers
Target: Establish average number of days lost (e.g. through sickness) per officer and set
target to reduce abstractions for year 2
Target: 90% of immediate response incidents answered within target time
Stage 3: Implementation and monitoring of the Performance Improvement Plans
Each objective in the PIPs has been assigned an appropriate owner, from Sergeant
to Superintendent rank, or a relevant member of support staff who will be
responsible for taking it forward with their own ‘objective team’. For instance, one
of Greenheys’ Inspectors has agreed a strategy to address objective 1 (Box 15) that
includes the following actions:
a) continue to identify and monitor hotspots (three located to date);
b) use better intelligence to identify offenders;
c) target offenders by covert operations;
d) target harden the vulnerable premises;
e) introduce graded police response to burglary dwelling premises;
f) forward victim satisfaction surveys to all burglary victims; and
g) improve officer awareness of and training in crime prevention.
These are further broken down into individual output measures and targets. For
instance, one of the measures for actions a) to c) is ‘number of crime detection
initiatives targeted at hot spots’ with the target being ‘at least one initiative per
month between April 1999 and April 2001’. Progress against the objectives is
monitored at meetings held to discuss performance which take place at least
every two months.
Actions taken in relation to objective 7, including improved reporting procedures
for racial harassment, better awareness training for officers and the establishment of
a new special unit had convinced the community that the police were taking the
issue seriously. Reports of racial harassment had increased substantially in 1999
when compared with the previous year. Other initiatives being pursued included
the introduction of a dedicated community safety unit to focus on high-volume
crime, the promotion of CCTV along a main arterial route, and introduction of a
drive and ride service to reduce thefts from vehicles. The force had found that this
sort of ‘quick win’ was the most effective way to demonstrate best value to staff and
secure their support for it.
Outcomes and the lessons learnt
A key aim of the pilots was to aid the development of a model that could be rolled
out across the force. GMP has subsequently developed a hybrid review programme
(Figure 5) – combining two types of review based on that already piloted (initial
and secondary/cross-cutting), with three targets for review (divisions, departments
and specific services or parts of services).
Figure 5: GMP’s Review Programme (1999–2005)
[Table: initial reviews of all divisions and Force Command (April 1999 – March 2001, repeated on a three-year cycle); secondary/cross-cutting reviews of departmental units (12 in 2001–02, 13 in 2002–03 and 14 in 2003–04) and of specific services, such as transport (2004–05).]
Initial reviews will be completed on a three-year cycle because the force had
concluded that five years between reviews was too long and three years fitted better
with the Crime and Disorder Audit cycle. The secondary reviews will be similar in
methodology, but will draw on issues arising from the initial reviews: for example
the pilots had identified personnel issues, and these would soon be examined as
part of a personnel review.
The force has also decided that, although self-assessments will continue to be
undertaken at sub-divisional level, the outcomes will be pulled together in a
divisional Performance Improvement Plan, so that the divisional approach to best
value is less fragmented. Although retaining sub-divisional reviews may still be
complicated by the need to consider some functions (such as traffic) at a divisional
level, the force envisaged little friction between sub-divisions and divisions,
because the former will have to take account of divisional objectives.
Other key lessons included:
● The force had identified the need for each review to have an initial planning
stage – so that its scope and objectives were clear from the start and it remained
within time-scale and resources. The subsequent data gathering stage was also a
vital element of the approach: forces had to understand the background to a
service before they reviewed it.
● GMP had concluded that officers from a mix of ranks should undertake the
reviews – so that leadership and tasks could be allocated according to ability, but
also to ensure there was local understanding and ownership of the plan. Option
appraisal and action ownership, however, were better maintained at Inspector
and Superintendent level to ensure leadership and avoid complex managerial
structures. The main disadvantage of using operational officers to do the reviews
was the danger of abstractions, but GMP had found that this could be overcome
if the centre was responsible for data gathering (one of the most labour intensive
aspects of best value).
● The force had realised that it needed a permanent central support team
(currently about 20 people work on best value in the Corporate Development
department – although they have a range of other responsibilities). This team’s
role was to animate the process, give support, help with data gathering and deal
with cross-cutting issues that had multiple-leadership implications. Despite this,
the centre had to be careful not to appear to be taking the lead: it was vital that
innovation was devolved. One danger inherent in the devolution of innovation,
however, was the potential for conflict with the need for corporacy. The force
was keen to ensure that good practice and improvements were highlighted and
rolled out elsewhere in the force.
Appendix 5: Case study two: Derbyshire Constabulary
In February 1999, Derbyshire Constabulary established a new Best Value Section,
within its Corporate Development Unit, by combining its Audit and Evaluation
Section (responsible for thematic inspections) and Business Process Section
(responsible for managing business change). The new Section comprises two Chief
Inspectors, two Inspectors, two Research Officers and a Research Assistant.
To begin with, the Best Value Section took on responsibility for three key areas:
● the force’s five-year rolling review programme;
● the development and application of a seven-stage review process; and
● ten pilot reviews commencing before 1 April 2000 – including ones for crime reporting and recording, forensic science, administration, helicopter use and crime intelligence.
The force review programme
Between 1998 and 1999, members of the police authority, the force’s Chief Officers
and the Head of Corporate Development developed a review programme after
assessing:
● available benchmark data (HMIC, Audit Commission and local indicators). The force had already amassed comparative data on 150 indicators, which enabled them to identify which services were, in relation to those of other forces, in the lowest quartile (categorised ‘red’), in the two middle quartiles (categorised ‘amber’) or in the top quartile (categorised ‘green’);
● the priorities of divisional commanders and department heads, including their views on where most financial gains might be made;
● customer feedback from PCCGs, the customer comments system and an annual survey conducted by a market research company; and
● the force’s previous reviews (in particular, to avoid reviewing any service examined within the last two years, unless other factors necessitated this).
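The red/amber/green quartile categorisation described in the first point above can be sketched as follows. The indicator values are hypothetical, and the assumption that a higher value is better is mine – a real implementation would need a per-indicator direction.

```python
# Sketch of a Derbyshire-style quartile categorisation: for one indicator,
# rank a force's value against all forces' values and label it 'red'
# (lowest quartile), 'amber' (two middle quartiles) or 'green' (top
# quartile). Assumes higher values are better; ties take the lower rank.

def categorise(force_value: float, all_values: list[float]) -> str:
    ranked = sorted(all_values)
    n = len(ranked)
    position = ranked.index(force_value)  # 0 = worst-performing force
    if position < n / 4:
        return "red"       # lowest quartile
    if position >= 3 * n / 4:
        return "green"     # top quartile
    return "amber"         # two middle quartiles

# Hypothetical detection rates (%) for eight forces:
rates = [18.0, 22.5, 25.0, 27.5, 30.0, 31.5, 33.0, 36.0]
print(categorise(18.0, rates))  # red
print(categorise(27.5, rates))  # amber
print(categorise(36.0, rates))  # green
```

Repeating this for each of the 150 indicators would yield the red/amber/green profile used to prioritise services for review.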
The resulting timetable envisages 67 service reviews over the five years, with the
force’s approach equally split between a functional approach (e.g. dog section, task
force, Special Branch, Special Constabulary, legal services, police surgeons) and a
process approach (e.g. accident investigation, case preparation, incident
management, timeliness and quality of files). The force had also identified a
reasonably large number of cross-cutting reviews, likely to involve other agencies
(e.g. drugs, diversionary schemes for offenders, community safety).
Whilst the Best Value Section will be responsible on a day-to-day basis, the
programme will be overseen by a Best Value Board, comprising the Chief Constable
(chair), Vice Chair of the Police Authority, Head of Corporate Development, Head
of Finance and the Best Value Manager (a senior representative from the reviewed
service). The police authority’s Audit Committee will receive copies of all best
value reports before they are presented to the full authority. The programme will be
flexible – changing to meet local and national priorities and reviewed annually as a
matter of course.
The seven-stage review process
The force had developed a seven-stage process common to each review, with
related forms to enable progress to be recorded and tracked (Box 16). Rather than
having a central manager for reviews, the police authority and force agreed that the
Board should appoint individual Best Value Plan Managers from each reviewed
service. This approach was chosen to aid the consultation process, ensure that the
manager had a good working knowledge of the area under review and could carry
forward any resulting changes. The Best Value Section will support the Board and
Best Value Plan Manager, by:
● helping to define the terms of reference;
● developing a plan of work and allocating tasks;
● advising on and monitoring project management;
● gathering qualitative and quantitative data by process mapping, activity analysis, interviews with service providers, and applying the BEM;
● undertaking surveys for internal and external consultation;
● supplying and supervising the completion of forms for the seven stages;
● producing and presenting briefing papers for the Board; and
● producing a guidance manual for reviews.
The Section’s practical and consultancy role had highlighted its need for
omnicompetence.
Box 16: The seven-stage review process in Derbyshire Constabulary
Stage one: Agreeing the terms of reference. The Best Value Plan Manager and key
stakeholders consult Corporate Development, the Finance department and
staff associations, and develop terms of reference for approval by the Board,
before considering the timetable, staff involvement, ‘action managers’ for
each stage and how to monitor progress (using PRINCE 2 and Gantt
charts).
Stage two: Where are we now? The Best Value Section conducts an initial review of
quantitative data (including activity analysis, process mapping/FAST and a
possible Balanced Scorecard Review) and another part of Corporate
Development consults stakeholders and conducts interviews. The reviewed
service undertakes a BEM self-assessment, whilst the Finance department
gathers costing information.
Stage three: How does our performance compare with others? Structured visits, to
learn lessons, may be made to forces, other agencies or private sector
organisations that appear to have a better performance record in the
reviewed service.
Stage four: Can we improve? The Best Value Section, having reviewed quality
standards and consulted other specialist departments, develops options for
improvement for the Best Value Plan Manager and stakeholders to consider.
Stage five: The way forward. The Best Value Section produces a Business Benefits
Plan for the options, setting out the skills/training needs, the timetable,
targets for improvement and consultation findings. The Board discusses the
options, making recommendations to Force Management and Policy Groups.
The choice is then presented to the police authority.
Stage six: Implementation and change management. The Best Value Manager
implements the ‘Business Benefits Plan’ and Corporate Development
monitors its introduction. The Benefits Plan anticipates definite, expected,
logical and intangible benefits (both financial and non-financial) from the
proposed changes.
Stage seven: Have we realised the benefits? The Board sets a date for the Best Value
Plan Manager to review the outcomes (usually within 12-18 months) against
the Benefits Plan, by examining quantitative and costing data. The reviewed
service also repeats its BEM assessment.
The pilot review of child protection units and domestic violence units
A seven-member team carried out these pilot reviews over seven months between
late 1998 and early 1999. The review was already planned before the preparations
for best value but, as an experiment, the force applied its new review approach
(except for the use of BEM). Given their related characteristics, the reviews were
conducted simultaneously and a joint Performance Improvement Assessment and
Business Benefits Plan developed. Although the review addressed the ‘4Cs’ as
combined, rather than separate, elements of best value (see Section 6), the detailed
review findings were drawn together under the individual headings of the ‘4Cs’:
Compare
The review team examined:
● internal performance indicators from existing units; and
● benchmark data from similar units in their family of forces.
The force found benchmarking with other forces frustrating and time-consuming:
in many cases – particularly for domestic violence units – reliable or comparable
data were unavailable. Derbyshire also found that benchmarking on the average
number of child protection referrals was misleading because of definitional
differences. Contacts with staff in similar units elsewhere revealed little, so
Derbyshire instead turned to the equivalents of their Corporate Development
Department and collected information on the roles, responsibilities, structure and
staffing of units. Officers visited Cambridgeshire to examine the potential for
adopting Family Units, by combining both child protection and domestic violence
roles.
Challenge
Key tasks undertaken by the review team included:
● questioning staff on their unit’s Terms of Reference, line management, training, the need for protocols and staff need for welfare and support;
● assessing the additional workload likely to result from extending roles;
● examining video training and consulting officers on the extent to which they used these skills;
● using interviews, referral data and activity analysis to consider structure and staffing levels;
● developing process maps by interviewing staff to create exhaustive lists of activities (including their duration and frequency) to examine scope for reducing the number of tasks;
● undertaking activity analysis across all divisions (i.e. measuring, over four weeks, time spent travelling, on administration, in contact with victims, witnesses and agencies, etc.) to identify ways of freeing-up time; and
● assessing the impact of the Osman v United Kingdom ruling (which removed the absolute bar on the police being sued for negligence).
Findings included:
● child protection officers gave better support to victims than divisional officers;
● video skills were used regularly by child protection officers but only by a selection of trained divisional officers;
● the child protection unit’s role could be extended without increasing staffing levels;
● divisional detective inspectors lacked the time to manage the units effectively;
● the number of child protection units could be reduced and their structures changed to guard against duplication and improve inter-unit support;
● protocols for information sharing with other agencies were required; and
● coding of domestic incidents should be improved.
In terms of costing, the force had been undertaking activity analysis before process
mapping, but concluded that it was better to get the processes and definitions right
before measuring activities against them. Now that they were clearer on unit
processes they had been able to develop an activity card so that workers in the
units could record their activities more accurately during the annual force-wide
activity analysis and for future reviews.
Consult
The review team:
● conducted face-to-face interviews with almost all of the units’ staff;
● sent 578 questionnaires to divisional operational constables and sergeants (response rate 31%), asking about their awareness of the issues and confidence in dealing with them, their contact with the units and their assessment of the service provided;
● for child protection units, surveyed social services staff and conducted a separate survey of other external partners (mainly head/deputy head teachers, child protection co-ordinators, police surgeons and senior partners in doctors’ general practices);
● for domestic violence units, sent 306 questionnaires to a variety of agencies, solicitors, hostels, the probation service, social services, borough councils and victim support (28% response rate); and
● for domestic violence units, conducted a telephone survey of a sample of 48 victims but, for child protection units, concluded that a survey of child abuse victims would be better conducted during a joint review of Area Child Protection Procedures, to be led by social services.
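As a purely illustrative aside, the raw response counts implied by the survey figures above can be recovered from the sample sizes and quoted response rates. The report gives only percentages, so the counts in this sketch are rounded estimates rather than figures stated in the review:

```python
# Approximate numbers of returned questionnaires, derived from the sample
# sizes and response rates quoted in the review. The source reports only
# percentages, so these counts are rounded estimates, not review figures.

def responses(sent: int, rate: float) -> int:
    """Estimate the number of returned questionnaires."""
    return round(sent * rate)

divisional = responses(578, 0.31)         # constables and sergeants survey
domestic_violence = responses(306, 0.28)  # domestic violence agencies survey

print(divisional, domestic_violence)  # → 179 86
```

On these figures, roughly 179 divisional officers and 86 domestic violence agency contacts returned questionnaires.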
Consultation suggested that respondents were largely satisfied with the services
provided by the units, but identified a need for the police to market and explain
the units’ roles to some agencies.
Compete
The review team:
● examined the results of a pilot scheme where text processing staff produced video interview summaries, rather than unit staff;
● considered the value of developing a new force-wide networked computer system to share intelligence;
● considered whether staff should still be paid mileage costs, or should have fleet/loan vehicles; and
● examined the scope for civilianising many or all of the posts in the domestic violence units.
Outcomes and lessons learnt
The resulting Performance Improvement Assessment proposed options for change
in key areas, detailing potential ‘financial’ and ‘non-financial’ business benefits and
their degree of predictability:
● definite (may be predicted with confidence);
● expected (may be predicted on the basis of historic trends);
● logical (can be anticipated and monitored, but not predictable); and
● intangible (may be predictable, but difficult to substantiate).
Recommendations and their potential benefits included:
Definite:
● carry out risk assessment at referral stage – non-financial benefit: reduced risk of litigation
● civilianise some domestic violence unit posts – financial benefit: initial cash saving of £30k, decreasing over eight years to £4k p.a.
Expected:
● central welfare and support system – non-financial benefit: improved staff morale
Logical:
● central referral point for agencies – non-financial benefit: improved service and fewer referrals
Intangible:
● computer generation of standard letters – non-financial benefit: more timely intervention and increased identification of repeat victimisation
Other key lessons included:
● Best Value Plan Managers needed to be fully aware of their roles and responsibilities and have substantial ownership of the process and findings.
● Face-to-face interviews had been too time-consuming as a means of data collection and required a great deal of analysis. In future, groups of service providers would be asked to identify issues using the BEM, and interviews with selected staff would drill down into these issues.
● Staff in the reviewed service needed to be involved in the analysis underlying the discussions on structures and staffing levels, so that they could be satisfied that the findings were accurate and would not prompt the reviewers to ‘retrace their steps’. The staff also needed to be aware that data would be triangulated to ensure accuracy: a few managers had been embarrassed by findings indicating that they had given some inaccurate information, particularly on workloads.
● Many stakeholders, unhappy with findings suggesting that their working practices and place of work needed to change, fought the results by challenging them. The force commented: “it is rare that a best value review will result in status quo – there is a need to overcome emotive responses by constantly reminding stakeholders of their role and the need for decisions based on the evidence”. It was vital that the Plan Manager and Head of Best Value encourage stakeholders to take ownership of the decision-making process, but emphasise that ‘no change’ was unacceptable. To guide the process, Plan Managers had to make sure that meetings with stakeholders were structured and therefore not “reduced to a mêlée of subjective views and personal insults”. Review leaders also needed to support the review team as they faced potential criticism from stakeholders.