Metrics in the real world

Insight Consulting Ltd.
http://www.insight.ie
One of the areas where I find organisations often have difficulty is metrics. There is no doubt that
measurement and analysis are fundamental to process improvement, but there are many pitfalls that can
render a metrics programme ineffective, including some significant people issues. What follows are
some dos and don'ts based on real-world experiences.
Don’t:
• Play politics with metrics. I have seen a number of larger organisations where measures were
being collected by senior management with an implied agenda of comparing the performance of
individual projects/sites. This introduces apprehension and fear about how the information will be
used against people. Inevitably, the data will be fudged, i.e. the integrity of the data collected
becomes so poor as to be meaningless.
• Apply metrics to the individual. Examples would include using the results of peer reviews or
testing in an individual's performance review. Authors will then not want their work products
peer reviewed. If they are, authors will argue to have major faults reclassified as minor,
resulting in conflict and the eventual demise of effective peer reviews. In dynamic testing,
developers will argue that bugs found by testers are 'features', and finding faults will no longer
be perceived as a positive contribution, undermining a basic premise of effective testing.
• Use metrics too prescriptively. I once observed an organisation use the count of major faults found
per page as a target that had to be reached before an inspection could be deemed complete. As a
result, people turned minors into majors to meet the target! Another example involved the
introduction of a threshold for the rate of tests passing: once the threshold was met, the
subsequent stage of testing could commence, because the fault detection rate from the current
stage was deemed low enough. Under time-to-market pressure, the test manager pressurised the
testers to reach the threshold. They duly obliged and picked simple tests they knew would pass.
The introduction of the metric changed the goal of testing from finding faults to NOT finding
faults! (A minimal sketch of such a gate follows this list.)
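
To illustrate the mechanism being gamed, here is a minimal sketch in Python of such an exit gate
(the threshold value and names are hypothetical, not taken from the organisation described). The
gate sees only the pass rate, not how demanding the chosen tests were, which is exactly the
loophole the testers exploited:

```python
# A naive test-stage exit gate: the next stage may start once the pass
# rate reaches a threshold. The gate cannot tell whether testers ran
# demanding tests or cherry-picked easy ones -- which is exactly how
# the metric was gamed in the anecdote above.

PASS_RATE_THRESHOLD = 0.95  # hypothetical value

def may_start_next_stage(tests_passed: int, tests_run: int) -> bool:
    """Return True once the observed pass rate meets the threshold."""
    if tests_run == 0:
        return False
    return tests_passed / tests_run >= PASS_RATE_THRESHOLD

# 19 trivial tests that all pass clear the gate just as readily as
# 190 passes out of 200 demanding tests:
print(may_start_next_stage(19, 19))    # True
print(may_start_next_stage(190, 200))  # True
```

The point is not that such gates are always wrong, but that a rate on its own carries no
information about test selection; pairing it with a measure of test coverage or severity would
close the loophole.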
Do:
• Obtain consensus on the key goal-based measures to put in place [1]. The starting point has to be
prioritised business objectives and the information needed to make decisions. A useful approach
to deriving metrics from goals is the Goal-Question-Metric (GQM) approach from Basili [2]. The
GQM approach uses questions as a stepping-stone between goals and metrics, and states that you:
  • Define your principal goals for your activity (in this case SPI)
  • Construct a comprehensive set of questions that, when answered, help assess where you are
relative to each goal
  • Define and gather the data required to answer these questions
  (A short sketch of this goal-question-data structure follows this list.)
• Start simple: consider focusing on basic project management measures initially, evolving to
wider organisational measures later.
• Educate staff on why the data is being collected and how it will be used.
• Make it easy to collect the data.
• Analyse the data early, and give those collecting the data frequent feedback on what the
analysis shows and what actions have resulted.
• Use them! I once reviewed an organisation's inspection process and discovered that they spent a
significant amount of time at the end of each meeting analysing every fault found into pre-defined
categories. When I asked how the information was being used and what actions had resulted
from the analysis, my questions were met with blank stares. A goal-based approach would have
avoided this scenario.
• Remember that measures do not have to be 100% perfect to be useful. Good indicators of whether
you are achieving your goals may be all you need.
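
To make the three GQM steps concrete, here is a minimal sketch in Python (the goal, questions, and
data items are hypothetical examples, not taken from Basili's paper) showing how a single SPI goal
can be traced through questions to the data needed to answer them:

```python
# A minimal GQM sketch: one SPI goal, the questions that probe it,
# and the data items (metrics) needed to answer each question.
# All names below are hypothetical examples, not prescribed by GQM.

gqm = {
    "goal": "Reduce the number of faults escaping to system test",
    "questions": {
        "What fault density do peer reviews find?": [
            "major faults found per review",
            "pages (or KLOC) reviewed",
        ],
        "What proportion of faults are found before system test?": [
            "faults found in reviews and unit test",
            "faults found in system test and later",
        ],
    },
}

def report(tree):
    """Print the goal, each question, and the data needed to answer it."""
    print("Goal:", tree["goal"])
    for question, data_items in tree["questions"].items():
        print("  Q:", question)
        for item in data_items:
            print("     data:", item)

report(gqm)
```

The value of keeping this structure explicit is traceability: any data item that cannot be traced
back to a question, and through it to a goal, is a candidate for not being collected at all, which
is precisely what the fault-categorisation anecdote above illustrates.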
Are your metrics working in the real world?
[1] As the CMMI puts it: 'specify the objectives of measurement and analysis such that they are
aligned with identified information needs and objectives'. See the new Measurement and Analysis
process area in the CMMI for some useful practices.
[2] V. Basili et al., 'A Methodology for Collecting Valid Software Engineering Data', IEEE
Transactions on Software Engineering, vol. SE-10, no. 6 (Nov. 1984): 728–738.

© 2002 Insight Consulting Ltd.