Ian Hacking, The Emergence of Probability: Presentation Notes

25 mins: 3600 words
5 mins question time during prez
• Disclaimer before we get started: I have no formal training in philosophy and only a basic understanding of first-semester college calculus. I also have very little experience presenting, so don’t hesitate to ask questions and let me know if I need to clarify or if I’m explaining something in too much detail.
• Thesis
• What is epistemology?
• Foucault
  o No, I didn’t Photoshop sunglasses and a leather jacket onto him; this is a real photo. He’s just French and gay in the ’60s. So look him up.
• Nature of probability (epistemological/aleatory)
• Overview of evidence (revolutionary ways of using statistics, not used before even though the data existed)
• Why not before, even though they had the data?
• Medieval concept of probability; the best decision was inevitably a moral question
• Pascal’s Wager
• Closer examples of evidence
• Effects of probability; Voltaire…
• In conclusion: probability as a science made possible decision theory as a science
• You couldn’t have a normative economic theory before: the normative was inseparably tied in with, indeed confused with, the descriptive economic theory
Around 1650, Hacking argues, the "taming of chance" allowed scientific models of prediction, measurement, and quantitative interpretation to develop. It is conspicuous how statistics were applied in revolutionary ways after 1650: nations began to raise income by selling annuities, statistics of births and deaths were derived from data available for centuries prior, gambling became a mathematical subject, and the reliability of testimony in legal disputes began to be evaluated statistically. These changes began with a formally educated elite and slowly trickled down over many decades.
Most importantly, probability as a mathematical, quantitative tool gave anyone
who could understand the concept a much more scientific way of making
decisions; that is, it made possible decision theory as a science. In medieval times,
probability was a qualitative concept. Whenever someone said something was
"probable," what they meant was that an authority, such as a saint, king, or
parent, had vouched for it. Thus, to some extent, determining the best decision
was inevitably a *moral* question.
Freeing decisions involving prediction, accuracy, and risk assessment from
cultural values, religion, and hierarchy empowered people to make decisions
independently, and was essential for widespread and mathematically sound
investment (instead of making investments based on religious precepts or your
personal association with the investee). Opinions were no longer sound *only*
because a high-status person had them; they were sound because, in theory,
ideally, they could be objectively justified. "Taming chance" was also essential for the formation of modern states, bureaucracies, and businesses; e.g., Hacking holds
that "no British gov't before 1789 appears to have made the cost of annuity a
function of the age of the purchaser" due to lacking a "theory of the relation
between age at purchase and annual payments." Awkward.
Essential to Hacking's argument is what he describes as the duality of probability:
"On the one hand it is epistemological, having to do with support by evidence. On
the other hand it is statistical [aleatory], having to do with stable frequencies."
Although developments were made in each of these two elements of probability, they were not merged into one cohesive concept until around 1650.
Disclaimer before we get started: I have no formal training in philosophy and only
a basic understanding of, like, the first quarter of college calculus. Please don’t
hesitate to ask questions, and let me know if I need to clarify.
Ian Hacking’s thesis is based in the context of Western Europe. He argues that
modern scientific probability emerged gradually after a revolution in probabilistic
thinking around 1660. The emergence of probability was a necessary
prerequisite for a larger scientific revolution because probabilistic thinking was
needed in order to prefer facts to testimony and make reliable predictions.
So, first we’ll look at a philosophical framework for Hacking’s theory. Then we’ll
explore indications of a revolution in probability science around 1660. After going
through the premodern concepts of probability and the emergence of the modern
idea of probability, we’ll conclude with the consequences of the “taming of
chance” that Hacking traces back to the seventeenth century.
So, first of all, epistemology. What is it? Well, it’s a fancy word for the
philosophical study of knowledge, and has its root in the Greek word “epistēmē,”
which means “knowledge.” So it’s basically knowledge of knowledge, which is
meta because it is about itself; it’s self-referential. Like having a dream about a
dream. Epistemology is basically the Inception of philosophy. Questions that
epistemologists might ask themselves include, “What is knowledge? How do you
actually know that you know something? What is opinion? How is it different from
fact? What is impossible to know?” Now, this brings us to my favorite angry bald
French philosopher,
Michel Foucault. No, I did not photoshop sunglasses and a black leather jacket
onto him, this is actually a real photo. He just really is that effortlessly awesome.
You should look him up.
So in 1966, Foucault published a book called The Order of Things, which, true to its
ambitious title, created a huge revolution in the study of knowledge. I’m going to
commit a crime against all knowledge here by very reductively trying to
summarize some of his main ideas. In different eras and different places, the
construction of scientific knowledge is based on orderly subconscious structures
that are influenced by both culture and biology. This underlying system of
understanding the world defines knowledge and justified belief. It’s how you know
that you know something, and it’s how you know that it’s possible or impossible
to know something. This subconscious epistemological framework for knowledge,
this essential “order of things,” is what Foucault calls an “episteme”.
So, Ian Hacking’s thesis is based in the context of the Western European
episteme, the underlying system of understanding the world. He argues that
modern scientific probability emerged gradually after a revolution in probabilistic
thinking around 1660. This meant that the epistemological landscape actually
changed dramatically during this time. The emergence of probability was
symptomatic of a larger epistemological revolution, which had profound effects
on the applications of knowledge, such as statistics and technology, and the way
in which individual people use knowledge.
Around 1660, a lot of people independently hit on basic probability ideas.
Revolutionary changes in probabilistic thinking began with the formally educated
elite and gradually trickled down to common culture over many decades. For the
first time in Western European history, nations began to raise income by selling
annuities [a fixed sum of money paid each year for the rest of someone’s life],
governments statistically analyzed births and deaths, gambling became a
mathematical subject, and the reliability of testimony in legal disputes began to
be evaluated statistically.
In contrast, before 1660, data on births and deaths that had been collected for
centuries was not used for statistical inferences. A long history of analyzing law
and gambling did not involve formal statistics. Meanwhile, nations actually lost
money from selling annuities because they failed to relate cost to the purchaser’s
age. This is why Ian Hacking argues, “with only slight reservations, that there was no probability until about 1660.” Why?
In order to answer that question, we need to understand the difference between
modern and premodern probability concepts. According to Ian Hacking,
probability is a dual concept. On the one hand, it is epistemological, measuring
the likelihood of something based on support by evidence. On the other hand, it
is also statistical, measuring likelihood based on stable long-run quantitative
frequencies. “Probability” as a whole therefore measures credibility both
qualitatively and quantitatively. In practice, Hacking holds that the statistical and
epistemological approaches to knowledge are inseparable from one another.
Interpreting, applying, and identifying statistical patterns will always involve some
epistemological evaluation, and judging qualitative propositions will always
involve some statistical evaluation. This duality is essential to modern probability.
However, the premodern precursor to probability, which is not actually probability
as we know it, was not a dual concept. Before about 1650, scholars did not see a
relation between statistical laws of chance processes and epistemological
assessments of reasonable degrees of belief. In terms of language, the word
“probability” referred to a qualitative assessment.
A “probable doctor” was a medical man who could be trusted to perform well. As
late as 1776, the English historian Edward Gibbon wrote in a popular history of
the Roman Empire, “Such a fact is probable but undoubtedly false.” In the 1724
novel Roxana, Daniel Defoe wrote, “This was the first view I had of living
comfortably indeed, and it was a very probable way, I must confess, seeing we
had very good conveniences, six rooms on a floor, and three stories high.”
In the era of premodern probability, whenever someone said something was
“probable,” what they meant was that an authority, such as a saint, king, or
parent, had vouched for it. Thus, to some extent, determining the best decision
was always a moral question. Our modern concepts of meritocracy and objective
evaluation of evidence were therefore generally “not possible” before 1660. In the
epistemological framework of premodern probability, credibility was based on
tradition and authority. When we look back on the medieval era, the feudalist
system, the monarchy, and the dogmatic Christian church seem unscientific and
corrupt. We call it the Dark Ages, and many people think that these idiots who believed magical fountains could heal leprosy and that the Sun revolved around the Earth must have been blind. But we’re blind to the way in which they saw the
world. They had a whole different episteme. While mathematical developments in statistical probability were made before the mid-seventeenth century, statistics was simply not relevant to assessing the correct thing to do or the correct way of thinking.
The French philosopher and mathematician Blaise Pascal is often considered the founder of probability theory. Ian Hacking identifies him as one of the first people to break out of the premodern probabilistic episteme. Around 1662, Pascal devised an argument now known as “Pascal’s wager.” The idea is that humans all bet
with their lives, either that God exists or that he does not exist.
A rational person should live as though God exists and seek to believe in God.
Why? Well, if God does not actually exist, you will have only a finite loss. But if
God does exist, you will receive infinite gains while avoiding infinite losses. Now, we’re not going to go into the merits of the wager itself, which most scholars and theologians
today no longer consider very relevant, and which Hacking considers “dubious, if
not patently false”.
However, the reasoning behind it represents a tremendous breakthrough in
probabilistic thinking. It assumes an “isomorphism,” that is, a structural similarity,
between decision making when objective physical chances are known to exist,
such as tossing an evenly weighted coin, and decision making when no objective
physical chances are known, such as the question of the existence of God. This
is the key basis for the modern dual concept of probability.
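To make that isomorphism concrete, here is a minimal expected-value sketch of the wager in modern decision-theoretic notation. The symbols are my own illustrative stand-ins rather than Pascal’s or Hacking’s: p is your credence that God exists, c the finite cost of a devout life if He does not, and g the finite worldly gain of unbelief.

\[
\begin{aligned}
\mathbb{E}[\text{wager for God}] &= p\cdot\infty + (1-p)(-c) = \infty \quad \text{for any } p > 0,\\
\mathbb{E}[\text{wager against God}] &= p\cdot(-\infty) + (1-p)\,g = -\infty .
\end{aligned}
\]

As long as p is anything above zero, wagering for God dominates. The point here is not the theology but the structure: this is exactly the kind of calculation you would make for a bet on a fair coin, applied to a case where no physical chance process exists at all.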
It took governments a while to catch on. In the early 18th century, the French
government attached a lottery to the sale of government bonds. However, the
prizes to be disbursed substantially exceeded the money to be gained in ticket
receipts, meaning the government actually lost money through this venture. The
young mathematician Charles-Marie de La Condamine and his friends, including
Voltaire, formed a ticket-buying cartel that gamed the broken system for a profit.
Lottery players were to write a motto on their ticket, to be read aloud when a
ticket won the jackpot; Voltaire wrote cheeky slogans like “All men are equal!” on
his tickets. Eventually the government figured it out, but not before La Condamine and his friends had it made for life. What, you think Voltaire got rich writing
brilliant essays and making pretty sketches?
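The arithmetic behind the cartel is simple enough to sketch. The figures below are purely illustrative, not the historical amounts, which are not given here; the point is only that when the guaranteed prize pool exceeds the cost of all the tickets, buying the drawing outright locks in a profit.

```python
# Illustrative sketch of why a cartel could profit from the broken French lottery.
# All figures are invented for demonstration; the historical amounts differ.

ticket_price = 1.0          # cost of one lottery ticket
tickets_issued = 100_000    # total tickets available in one drawing
prize_pool = 500_000.0      # guaranteed prizes paid out each drawing

cost_to_buy_everything = ticket_price * tickets_issued
guaranteed_profit = prize_pool - cost_to_buy_everything

print(f"Cost to corner the drawing: {cost_to_buy_everything:,.0f}")
print(f"Prizes collected:           {prize_pool:,.0f}")
print(f"Guaranteed profit:          {guaranteed_profit:,.0f}")
# As long as prize_pool > cost_to_buy_everything, the cartel cannot lose,
# no matter which ticket happens to win.
```

In practice the cartel did not need literally every ticket, only enough of them often enough for the favorable expected value to win out, but the broken inequality above is the heart of it.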
The consequences of modern probability science for government were enormous,
though the details are beyond the scope of Hacking’s work. The focus of legal
assessment of credibility gradually shifted from testimony to factual evidence,
which was essential for the later development of impersonal trial procedure. The
emergence of an independent judicial branch was central to the rise of rule of law
in Western Europe. Thus, the “taming of chance” was essential for the formation of
modern states, bureaucracies, and businesses. Hacking holds that “no British
government before 1789 appears to have made the cost of annuity a function of
the age of the purchaser” due to lacking a “theory of the relation between age at
purchase and annual payments.” Awkward.
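To see why flat-rate annuities were a losing proposition, here is a minimal sketch of how age-dependent pricing works. The survival probabilities and interest rate are invented for illustration, not drawn from any historical life table.

```python
# Minimal sketch of pricing a life annuity as a function of the buyer's age.
# The fair price is the expected present value of the payments: each year's
# payment, discounted for interest and weighted by the probability the buyer
# is still alive to collect it. All numbers below are illustrative assumptions.

def annuity_price(annual_payment: float, survival_probs: list[float],
                  interest_rate: float) -> float:
    """Expected present value of a life annuity paying annual_payment per year.

    survival_probs[t] is the (assumed) probability the buyer survives to
    collect the payment due t+1 years after purchase.
    """
    price = 0.0
    for t, p_alive in enumerate(survival_probs, start=1):
        price += annual_payment * p_alive / (1 + interest_rate) ** t
    return price

# A young buyer is likely to collect for decades; an old buyer is not.
young_survival = [0.98 ** t for t in range(1, 51)]  # toy 50-year horizon
old_survival = [0.90 ** t for t in range(1, 51)]    # toy: much higher mortality

print("Fair price for a young buyer:", round(annuity_price(100, young_survival, 0.05), 2))
print("Fair price for an old buyer: ", round(annuity_price(100, old_survival, 0.05), 2))
# Charging both buyers the same flat price means selling to the young at a loss.
```

Selling at one flat price regardless of age is exactly the mistake the pre-1789 annuities made: every young purchaser was, in expectation, a guaranteed loss for the seller.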
But most importantly, probability as a quantitative, mathematical tool gave
anyone who could understand the concept a much more scientific way of making
decisions. It made possible decision theory as a science. Freeing decisions
involving prediction, accuracy, and risk assessment from cultural values, religion,
and hierarchy empowered people to make decisions independently, and was
essential for widespread and mathematically sound investment, instead of
making investments based on religious precepts or your personal association
with the investee. For example, Christian doctrine and the Quran ban usury; as Shakespeare warned, “neither a borrower nor a lender be.” For centuries, Jews were among the only moneylenders in Western Europe because Christians were forbidden from debasing themselves with such an unholy
practice. You can see how this would have gotten in the way of the rise of free
market capitalism and our modern system of confidence-based currency.
In conclusion, after the rise of modern probabilistic thinking, opinions were no
longer considered sound only because a high-status person had them; they were
sound because, at least in theory, they could be objectively justified and because they satisfied your own standards of credibility.
Thus, an epistemological revolution in probabilistic thinking made possible the
episteme of Western Europe in modern times and, by extension, the political and economic systems that Western Europe has influenced over the last three hundred years.