
History of Thought in Quality Assurance
Dr. Tommy Gardner
Chief Technology Officer
March 19, 2012
Outline
• Goals and Definitions
• Egypt
• Greece
• Rome
• Middle Ages
• Industrial Age
• Wartime Production and Post War
• Modern Quality Thought
  1. Shewhart
  2. Taguchi
  3. Deming
  4. Womack
  5. Solzhenitsyn
  6. Rickover
Goal
1. To offer some insight into the hypothesis that the Quality function has been with Mankind throughout the ages.
2. To draw what we can from quality experts of the past.
3. To learn from the past and challenge all on what the next breakthroughs in Quality will be.
First a Failure of Quality
[Dilbert cartoon]
ASQ: Quality Assurance and Quality Control
The terms “quality assurance” and “quality control” are often used
interchangeably to refer to ways of ensuring the quality of a service or
product. The terms, however, have different meanings.
Assurance: The act of giving confidence,
the state of being certain or the act of
making certain.
Quality Assurance: The planned and
systematic activities implemented in a quality
system so that quality requirements for a
product or service will be fulfilled.
Control: An evaluation to indicate needed
corrective responses; the act of guiding a
process in which variability is attributable to
a constant system of chance causes.
Quality Control: The observation
techniques and activities used to fulfill
requirements for quality.
Ancient Egypt
The Pyramids were built around 2,500 BC.
The organization, planning, scheduling, design, and calculations involved would be a major challenge even today.
Modern Egypt
Greece
Alexander the Great conquered much of the known world.
He used extensive logistics training and planning.
He needed repeatable processes and a dependable supply chain.
A saying widely attributed to Aristotle: "We are what we repeatedly do; excellence, then, is not an act, but a habit."
Rome
Rome was built on the ideals of the Greeks with financing from Egypt.
Middle Ages up to the Industrial Revolution
During the Middle Ages, guilds adopted responsibility for
quality control of their members, setting and maintaining
certain standards for guild membership.
Royal governments purchasing material were interested in
quality control as customers. For this reason, King John of
England appointed William Wrotham to report about the
construction and repair of ships. Centuries later, Samuel
Pepys, Secretary to the British Admiralty, appointed multiple
such overseers.
The Industrial Revolution
Prior to the extensive division of labor and mechanization resulting from
the Industrial Revolution, it was possible for workers to control the quality of their
own products. The Industrial Revolution led to a system in which large groups of
people performing a specialized type of work were grouped together under the
supervision of a foreman who was appointed to control the quality of work
manufactured.
Mass Production
Around the time of the First World War, manufacturing processes became more complex, with larger numbers of workers under supervision. This period saw the widespread introduction of mass production and piece work, which created problems: workmen could now earn more money by producing extra products, which in turn occasionally led to poor-quality workmanship being passed on to the assembly lines.
To counter bad workmanship, full-time inspectors were introduced to identify, quarantine, and ideally correct product quality failures. Quality control by inspection in the 1920s and 1930s led to the growth of quality inspection functions, separately organized from production and large enough to be headed by superintendents.
Wartime Production
The systematic approach to quality started in industrial manufacturing during the 1930s, mostly in the USA, when some attention was given to the cost of scrap and rework. The scale of mass production required during the Second World War made it necessary to introduce an improved form of quality control known as Statistical Quality Control, or SQC.
SQC starts from the recognition that not every production piece can be fully inspected to sort output into acceptable and non-acceptable batches. By extending the inspection phase and making inspection organizations more efficient, it provides inspectors with control tools such as sampling plans and control charts, even where 100 per cent inspection is not practicable. Standard statistical techniques allow the producer to sample and test a certain proportion of the products for quality to achieve the desired level of confidence in the quality of the entire batch or production run.
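To make the sampling idea concrete, here is a minimal Python sketch (an addition, not from the original slides) of a single-sampling acceptance plan: inspect n items from a batch and accept the batch if at most c are defective. The plan parameters (n = 50, c = 2) and the defect rates are hypothetical, and a binomial model is assumed for large lots.

```python
from math import comb

def prob_accept(n: int, c: int, p: float) -> float:
    """Probability a lot is accepted under an (n, c) sampling plan:
    inspect n random items, accept if at most c are defective.
    Assumes the lot is large enough that a binomial model applies."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Operating characteristic: chance of accepting lots of varying quality
for p in (0.01, 0.02, 0.05, 0.10):
    print(f"defect rate {p:.0%}: P(accept) = {prob_accept(50, 2, p):.3f}")
```

Plotting P(accept) against the defect rate gives the plan's operating characteristic curve, which is how a producer chooses n and c to reach the desired level of confidence in the whole batch.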
Postwar
In the period following World War II, many
countries' manufacturing capabilities that
had been destroyed during the war were
rebuilt. General Douglas
MacArthur oversaw the re-building
of Japan. During this time, General
MacArthur involved two key individuals in
the development of modern quality
concepts: W. Edwards Deming and Joseph
Juran. Both individuals promoted the
collaborative concepts of quality to
Japanese business and technical groups,
and these groups utilized these concepts in
the redevelopment of the Japanese
economy.
Although there were many individuals
trying to lead United States industries
towards a more comprehensive approach
to quality, the U.S. continued to apply the
Quality Control (QC) concepts of inspection
and sampling to remove defective product
from production lines, essentially ignoring
advances in QA for decades.
[Photos: W. Edwards Deming and Joseph Juran]
Breakout - Modern Quality Thought
1. Shewhart
2. Taguchi
3. Deming
4. Womack
5. Solzhenitsyn
6. Rickover
Shewhart
Father of statistical quality control
“Shewhart simulated theoretical models by
marking numbers on three different sets of
metal-rimmed tags. Then he used an
ordinary kitchen bowl – the Shewhart bowl
– to hold each set of chips as different
sized samples were drawn from his three
different populations. There was a bowl,
and it played a vital role in the development
of ideas and formulation of methods
culminating in the Shewhart control charts.”
– Ellis R. Ott, Tribute to Walter A.
Shewhart, 1967
The industrial age was easing into its
second century when a young engineer
named Walter A. Shewhart came along and
altered the course of industrial history.
Shewhart, ASQ’s first Honorary member,
successfully brought together the
disciplines of statistics, engineering, and
economics and became known as the
father of modern quality control.
The lasting and tangible evidence of that
union for which he is most widely known is
the control chart, a simple but highly
effective tool that represented an initial step
toward what Shewhart called “the
formulation of a scientific basis for securing
economic control.”
Shewhart was concerned that statistical
theory serve the needs of industry. He
exhibited the restlessness of one looking
for a better way. A man of science who
patiently developed and tested his ideas
and the ideas of others, he was an astute
observer of developments in the world of
science and technology. While the literature
of the day discussed the stochastic nature
of both biological and technical systems,
and spoke of the possibility of applying
statistical methodology to these systems,
Shewhart actually showed how it was to be
done; in that respect, the field of quality
control can claim a genuine pioneer in
Shewhart.
ASQ Shewhart’s Legacy
His monumental work, Economic Control of
Quality of Manufactured Product, published
in 1931, is regarded as a complete and
thorough exposition of the basic principles
of quality control.
Shewhart wrote Statistical Method from the
Viewpoint of Quality Control in 1939 and
gained recognition in the statistical
community. In addition, he published
numerous articles in professional journals,
and many of his writings were held
internally at Bell Laboratories. One of these
was the historic memorandum of May 16,
1924, in which he proposed the control
chart to his superiors.
An element in Shewhart’s success was his
searching out other bright and
knowledgeable individuals for their ideas,
methodically cultivating these sources and
drawing from them information and advice
in a way that endeared him to all. In a
series of tributes to Shewhart published
in Industrial Quality Control in August 1967,
the most striking comment from the
contributors—many of whom were
themselves important figures in the
development of the quality control field—
was their respect for Shewhart’s
gentlemanly approach and sincere interest
in the work and concerns of others.
Shewhart’s influence on ASQ runs deep.
Shortly before his death, he remarked to
members that they “extended the field
beyond my early visions and saw areas of
service that pleased and amazed me. I
hope that you continue.”
Shewhart’s Control Chart
When Dr. Shewhart joined the Western
Electric Company Inspection Engineering
Department at the Hawthorne Works in
1918, industrial quality was limited to
inspecting finished products and removing
defective items. That all changed on May
16, 1924. Dr. Shewhart's boss, George D.
Edwards, recalled: "Dr. Shewhart prepared
a little memorandum only about a page in
length. About a third of that page was given
over to a simple diagram which we would
all recognize today as a schematic control
chart. That diagram, and the short text
which preceded and followed it, set forth all
of the essential principles and
considerations which are involved in what
we know today as process quality
control."[1]
Shewhart's work pointed out the importance of reducing variation in a manufacturing process, and the understanding that continual process adjustment in reaction to non-conformance actually increased variation and degraded quality.
Shewhart framed the problem in terms of assignable-cause and chance-cause variation and introduced the control chart as a tool for distinguishing between the two. Shewhart stressed that bringing a production process into a state of statistical control, where there is only chance-cause variation, and keeping it in control, is necessary to predict future output and to manage a process economically. Dr. Shewhart created the basis for the control chart and the concept of a state of statistical control through carefully designed experiments.
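As a minimal illustration of that distinction (an addition, not from the original slides), the sketch below computes 3-sigma limits for an X-bar chart and flags points outside them. The subgroup means and the sigma estimate are hypothetical; in practice the sigma of the subgroup means is derived from within-subgroup ranges rather than supplied directly.

```python
import statistics

def xbar_limits(subgroup_means, sigma_est):
    """Center line and 3-sigma control limits for an X-bar chart.
    sigma_est is an estimate of the standard deviation of the
    subgroup means (assumed here; normally estimated from
    within-subgroup ranges via tabulated chart constants)."""
    center = statistics.mean(subgroup_means)
    return center - 3 * sigma_est, center, center + 3 * sigma_est

# Illustrative data: hourly subgroup means of a machined diameter (mm)
means = [10.02, 9.98, 10.01, 10.00, 9.97, 10.15, 10.03]
lcl, cl, ucl = xbar_limits(means, sigma_est=0.03)
for i, m in enumerate(means, 1):
    flag = "assignable cause?" if not (lcl <= m <= ucl) else "in control"
    print(f"subgroup {i}: {m:.2f}  ({flag})")
```

Points inside the limits are treated as chance-cause variation and left alone; only the flagged point (10.15 here) would trigger a search for an assignable cause, which is exactly the over-adjustment trap the chart is designed to avoid.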
Taguchi
Genichi Taguchi provided a whole new way to evaluate the quality of a product. Traditionally, product quality was seen as a correlation between loss and market size for the product, and actual quality was thought of as adherence to product specifications. Loss due to quality was usually thought of only as the additional manufacturing costs (materials, re-tooling, etc.) borne by the producer up to the time of shipment or sale. It was believed that after sale of the product, the consumer was the one to bear costs due to quality loss, either in repairs or the purchase of a new product. In most cases it has actually been shown that, in the end, the manufacturer bears the costs of quality loss, through things like negative feedback from customers.
Taguchi changed the perspective of quality by correlating quality with cost and loss in dollars, not only at the manufacturing level but also to the customer and society in general.
You will most likely encounter Taguchi methods in a manufacturing context. They are statistical methods developed by Genichi Taguchi to improve the quality of products. Whereas statisticians before him focused on improving the mean outcome of a process, Taguchi recognized that in an industrial process it is vital to produce a product on target, and that variation around the mean causes poor manufactured quality. For example, a batch of car windshields whose average matches the target is useless if the individual windshields each vary significantly from the target specifications.
Taguchi Quality Loss Function
Taguchi's key argument was that the cost
of poor quality goes beyond direct costs to
the manufacturer such as reworking or
waste costs. Traditionally manufacturers
have considered only the costs of quality
up to the point of shipping out the product.
Taguchi aims to quantify costs over the
lifetime of the product. Long term costs to
the manufacturer would include brand
reputation and loss of customer satisfaction
leading to declining market share. Other
costs to the consumer would include costs
from low durability, difficulty interfacing with
other parts, or the need to build in safety
margins.
Think for a moment about how the costs of quality would vary with the product's deviation on either side of the target. If you were to plot the costs versus the diameter of a nut, for example, you would have a quadratic function, with a minimum of zero at the target diameter. We expect, therefore, that the loss (L) will be a quadratic function of the deviation of the quality characteristic from the target (m). The squared-error loss function has been in use since the 1930s, but Taguchi modified the function to represent total losses. Next we will walk through the derivation of the Taguchi Loss Function.
QLF Description
The nominal-the-best loss function is
L = k(y − m)²
Where:
L = Loss in Dollars
y = Quality Characteristic (diameter, concentration, etc.)
m = Target Value for y
k = Constant (defined below)
The Taguchi quality loss function is a way to assess economic loss from a deviation in quality without having to develop the unique function for each quality characteristic. As a function of the traditionally used process capability index, it also puts this unitless value into monetary units.
A graphical representation of the Nominal Characteristic is shown at left. As the output value (y) deviates from the target value (m), increasing the mean squared deviation, the loss (L) increases. There is no loss when the output value is equal to the target value (y = m).
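As an illustration (an addition, not from the original material), here is a minimal sketch of the loss function with a hypothetical target, tolerance, and repair cost. The convention k = A / Δ², where A is the loss incurred at the tolerance limit m ± Δ, is a common way the constant is set.

```python
def taguchi_loss(y: float, m: float, k: float) -> float:
    """Nominal-the-best quality loss: L = k * (y - m)**2.
    k is commonly set as A / delta**2, where A is the cost of a
    part at the tolerance limit m +/- delta (assumed values below)."""
    return k * (y - m) ** 2

# Hypothetical example: target diameter 10.0 mm, $4 repair cost
# at the +/-0.2 mm tolerance limit, so k = 4 / 0.2**2 = 100 $/mm^2
k = 4 / 0.2**2
for y in (10.0, 10.1, 10.2, 10.3):
    print(f"y = {y:.1f} mm -> loss = ${taguchi_loss(y, 10.0, k):.2f}")
```

Note how loss grows continuously with deviation: a part at 10.1 mm already carries a $1 loss even though it is comfortably inside the specification, which is the break with the traditional pass/fail view.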
Deming
When asked by the U.S. government why he would not help them, Deming replied, "Because you do not want to do this."
Womack
James Womack's work is based on the Toyota (lean) model, which combines operational excellence with value-based strategies to produce steady growth through a wide range of economic conditions.
Solzhenitsyn
Aleksandr Isayevich Solzhenitsyn, in The Gulag Archipelago, quotes an old Russian proverb: "Dwell on the past and you'll lose an eye; forget the past and you'll lose both eyes."
Rickover
One of the great icons of the 20th Century was Admiral Hyman Rickover. He is known as the "father" of our nuclear navy, and his efforts have made America safer. Born in a ghetto in Warsaw in 1900, Rickover rose to the rank of Admiral and directed the development of our nuclear navy, which has a tremendous safety record. He recognized he was dealing with a highly risky, highly complex issue, and he developed rules for success.
How can these rules help you in your highly
complex, highly risky world of government
operations? How did his focus on quality
control penetrate the organization so
deeply so as to reach to the line employee
level in the nuclear navy? Let’s take a look
at each of these rules.
Rickover Rules
Rule 1. You must have a rising standard of
quality over time, and well beyond what is
required by any minimum standard.
Rule 2. People running complex systems
should be highly capable.
Rule 3. Supervisors have to face bad news
when it comes, and take problems to a
level high enough to fix those problems.
Rule 4. You must have a healthy respect
for the dangers and risks of your particular
job.
Rule 5. Training must be constant and
rigorous.
Rule 6. All the functions of repair, quality
control, and technical support must fit
together.
Rule 7. The organization and members
thereof must have the ability and
willingness to learn from mistakes of the
past.
http://www.pennprime.com/index.asp?Type=B_BASIC&SEC=%7B88B394EE-A0F54090-AA0F-90860D3542E1%7D
BACKUP MATERIAL
Rickover Rule 1
Rule 1. You must have a rising standard of quality over time, and well beyond what is
required by any minimum standard.
We have to get better and better at what we do. Our public deserves it. Our personnel
deserve it. We must be constantly looking for a better way to do things. Status Quo — we
have always done it this way — is no longer acceptable. On an organizational level, there are
better ways to get and keep good people. There are better ways to build your policy manual.
There are better ways to train your personnel. There are better ways to supervise. There are
better ways to discipline errant employees. On an operational level, we must improve our
performance in response times, quality and timeliness of written reports, training, candor in
performance evaluations, equipment and vehicle maintenance, physical conditioning, and
anything else that we can measure. Continuous improvement has got to be part of the way
we do business.
Rickover Rule 2
Rule 2. People running complex systems should be highly capable.
Successful government operations require people who know how to think. Fifty years ago,
you did not need to be all that sharp to be a government employee. Things have changed.
Technology, equipment, strategies and tactics involved in providing services to our
constituents have all changed. This is an extremely complex job, and if you hire people who can’t think things through, you are en route to disaster. If you allow the hiring of problem employees, they will not disappoint you — they will always be problem employees. In view of the consequences that can occur when things do not go right in your complex, high-risk job, this may end up being the cause of a future tragedy. Every nickel you spend in weeding out problem employees up front has the potential to save you a million dollars. And I can prove that statement if you want me to.
Rickover Rule 3
Rule 3. Supervisors have to face bad news when it comes, and take problems to a level high
enough to fix those problems.
When you take an honest look at tragedies in any aspect of government, from the lawsuits to
the injuries, deaths, embarrassments, internal investigations and even the rare criminal filing,
so many of them get down to supervisors not behaving like supervisors. The primary mission
of a supervisor is “systems implementation.” If you promote people who either can’t or won’t enforce policy, you are en route to tragedy. To be sure, the transition from line employee to supervisor is a difficult one, but the people you choose to be supervisors have to like their people so much that they will enforce the policy to protect each of them from loss. Not to beat this point to death, but you show me a tragedy in government operations — including some in the news today — and I will show you the fingerprints of a supervisor not behaving like a supervisor.
Rickover Rule 4
Rule 4. You must have a healthy respect for the dangers and risks of your particular job.
Many government jobs are high risk in nature, and the consequences for not doing things
right can be dramatic. Remember the basic rules of Risk Management. RPM — Recognize,
Prioritize, Mobilize. You must do a risk assessment on each job in every government
department and identify the tasks that have the highest probability of causing you grief. Then
you must prioritize these tasks in terms of potential frequency, severity, and available time to
think prior to acting. Finally, you must mobilize (act) to address the recognized risks
appropriately and prevent consequences.
Rickover Rule 5
Rule 5. Training must be constant and rigorous.
Every day must be a training day! We must focus the training on the tasks in every job
description that have the highest probability of causing us grief. These are the high risk, low
frequency, non-discretionary time events. We must assure that all personnel are adequately
trained to address the tasks that give them no time to think, and that they understand the
value of thinking things through when time allows.
Rickover Rule 6
Rule 6. All the functions of repair, quality control, and technical support must fit together.
Audits and inspections are an important part of your job as a leader in government. We
cannot assume that all is going well. We must have control measures in place to assure
things are being done right. This is not micro-management — it is called doing your job. If you
do not have the audits (formal and informal) in place, you will not know about problems until
they become consequences, and then you are in the domain of lawyers. That is too late for
action, as all you can do then is address the consequences. And if you take the time to study
the life of Admiral Rickover, you will quickly learn that he was widely despised in the Navy
because of his insistence on using the audit process as a tool to hold people accountable.
Rickover Rule 7
Rule 7. The organization and members thereof must have the ability and willingness to learn
from mistakes of the past.
Analysis of past data is the foundation for almost all of risk management. We (government
operations) keep on making the same mistakes over and over again. As I read the lawsuits,
injuries and deaths, organizational embarrassments, internal investigations and even the rare
criminal filing against our personnel, I know that we can learn so much by studying the
mistakes we have made in the past.
SPC
Statistical process control (SPC) is the
application of statistical methods to the
monitoring and control of a process to
ensure that it operates at its full potential to
produce conforming product. Under SPC, a
process behaves predictably to produce as
much conforming product as possible with
the least possible waste. While SPC has
been applied most frequently to controlling
manufacturing lines, it applies equally well
to any process with a measurable output.
Key tools in SPC are control charts, a focus on continuous improvement, and designed experiments.
Much of the power of SPC lies in the ability
to examine a process and the sources of
variation in that process using tools that
give weight to objective analysis over
subjective opinions and that allow the
strength of each source to be determined
numerically. Variations in the process that
may affect the quality of the end product or
service can be detected and corrected,
thus reducing waste as well as the
likelihood that problems will be passed on
to the customer. With its emphasis on early
detection and prevention of problems, SPC
has a distinct advantage over other quality
methods, such as inspection, that apply
resources to detecting and correcting
problems after they have occurred.
In addition to reducing waste, SPC can
lead to a reduction in the time required to
produce the product or service from end to
end.
Progression of Charting
When excessive variation is identified by
the control chart detection rules, or the
process capability is found lacking,
additional effort is exerted to determine
causes of that variance. The tools used
include Ishikawa diagrams, designed
experiments and Pareto charts. Designed
experiments are critical to this phase of
SPC, as they are the only means of
objectively quantifying the relative
importance of the many potential causes of
variation.
Once the causes of variation have been
quantified, effort is spent in eliminating
those causes that are both statistically and
practically significant (i.e. a cause that has
only a small but statistically significant
effect may not be considered cost-effective
to fix; however, a cause that is not
statistically significant can never be
considered practically significant).
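As a small illustration of the Pareto analysis mentioned above (an addition; the defect categories and counts are hypothetical), the computation behind a Pareto chart is just a ranked tally with cumulative percentages:

```python
# Hypothetical defect tallies from a final-inspection log
causes = {"scratches": 42, "misalignment": 18, "porosity": 9,
          "wrong label": 6, "other": 5}

total = sum(causes.values())
cumulative = 0.0
print(f"{'cause':<14} {'count':>5} {'cum %':>7}")
for cause, count in sorted(causes.items(), key=lambda kv: -kv[1]):
    cumulative += 100 * count / total
    print(f"{cause:<14} {count:>5} {cumulative:>6.1f}%")
```

Reading the cumulative column from the top shows which few causes account for most of the defects, and therefore where elimination effort pays off first.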
For digital SPC charts, so-called SPC rules usually come with rule-specific logic that determines a 'derived value' to be used as the basis for some (setting) correction. One example of such a derived value, for the common "N numbers in a row ranging up or down" rule, is: derived value = last value + average difference between the last N numbers (which, in effect, extends the row with the expected next value).
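Read literally, that derived value can be sketched as follows (a minimal illustration of one reading of the description above; the readings are hypothetical):

```python
def derived_value(series, n):
    """For the 'N points in a row trending up or down' rule:
    extend the run with its expected next value, i.e. the last
    value plus the average step over the last n points
    (n points give n-1 differences)."""
    window = series[-n:]
    steps = [b - a for a, b in zip(window, window[1:])]
    avg_step = sum(steps) / len(steps)
    return series[-1] + avg_step

readings = [4.0, 4.2, 4.5, 4.9, 5.4, 6.0]   # six points rising in a row
print(derived_value(readings, n=6))          # 6.4 = last value + avg step 0.4
```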
Most SPC charts work best for numeric data with Gaussian assumptions. Recently, a new control chart, the real-time contrasts chart,[6] was proposed to handle process data with complex characteristics, e.g. high-dimensional, mixed numerical and categorical, missing-valued, non-Gaussian, or non-linear relationships.
Quality Process Steps
Steps for a typical quality assurance
process
There are many forms of QA processes, of
varying scope and depth. The application
of a particular process is often customized
to the production process.
A typical process may include:
test of previous articles
plan to improve
design to include improvements and
requirements
manufacture with improvements
review new item and improvements
test of the new item
Failure testing
A valuable process to perform on a whole consumer product is failure testing or stress testing. In mechanical terms this is the operation of a product until it fails, often under stresses such as increasing vibration, temperature, and humidity. This exposes many unanticipated weaknesses in a product, and the data is used to drive engineering and manufacturing process improvements. Often quite simple changes can dramatically improve product service, such as changing to mold-resistant paint or adding lock-washer placement to the training for new assembly personnel.
TQM
Statistical control
Many organizations use statistical process control to bring the organization to Six Sigma levels of quality, in other words, so that the likelihood of an unexpected failure is confined to six standard deviations on the normal distribution. This probability is less than four one-millionths. Items controlled often include clerical tasks such as order-entry as well as conventional manufacturing tasks.
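For reference, these probabilities are easy to check (an added sketch, not from the original slides). The raw tail beyond six standard deviations of a centered normal is about 1e-9; the conventional Six Sigma figure of roughly 3.4 defects per million, consistent with "less than four one-millionths", assumes the customary 1.5-sigma long-term shift of the process mean.

```python
from math import erfc, sqrt

def upper_tail(z: float) -> float:
    """P(X > z) for a standard normal variable."""
    return 0.5 * erfc(z / sqrt(2))

# Tail beyond 6 sigma on a perfectly centered process...
print(f"{upper_tail(6.0):.2e}")        # ~9.87e-10
# ...and with the conventional 1.5-sigma long-term shift,
# which yields the familiar 3.4-defects-per-million figure
print(f"{upper_tail(6.0 - 1.5):.2e}")  # ~3.40e-06
```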
Total quality management
The quality of products is dependent upon that of the participating constituents, some of which are sustainable and effectively controlled while others are not. The process(es) which are managed with QA pertain to Total Quality Management.
Traditional statistical process controls in
manufacturing operations usually proceed
by randomly sampling and testing a fraction
of the output. Variances in critical
tolerances are continuously tracked and
where necessary corrected before bad
parts are produced.
If the specification does not reflect the true quality requirements, the product's quality cannot be guaranteed. For instance, the parameters for a pressure vessel should cover not only the material and dimensions but also operating, environmental, safety, reliability, and maintainability requirements.
Models and standards
ISO 17025 is an international standard that specifies the general requirements for the competence to carry out tests and/or calibrations. There are 15 management requirements and 10 technical requirements. These requirements outline what a laboratory must do to become accredited. Management system refers to the organization's structure for managing its processes or activities that transform inputs of resources into a product or service which meets the organization's objectives, such as satisfying the customer's quality requirements, complying with regulations, or meeting environmental objectives.
Other standards:
ISO 9000:2008, ISO 20000, CMMI, ITIL
During the 1980s, the concept of "company quality," with its focus on management and people, came to the fore. It was realized that, if all departments approached quality with an open mind, success was possible if management led the quality improvement process.
Before the system of Company Quality, the quality work being carried out was shop floor inspection, which did not reveal the major quality problems. This led to quality assurance and total quality control, which have come into being more recently.
CMMI
In 1988, the Software Engineering Institute introduced the notion that SPC can be
usefully applied to non-manufacturing processes, such as software engineering
processes, in the Capability Maturity Model (CMM). This idea exists today within
the Level 4 and Level 5 practices of the Capability Maturity Model Integration
(CMMI). This notion that SPC is a useful tool when applied to non-repetitive,
knowledge-intensive processes such as engineering processes has encountered
much skepticism, and remains controversial today.