Cholera in 19th Century New York City: An insight into how the birth of empirical
science reformed medical ethics
by Roshan Ahmed
Three major Cholera epidemics struck New York City during the nineteenth
century, whose effects would have a lasting influence on the medical field, public health
institutions, and medical ethics in the United States. Charles E. Rosenberg describes the
medical and social thought during these outbreaks, noting changes in medical theory,
industrialization, and public opinion as central to overcoming the disease. Striking
the country in 1832, 1849 and again in 1866, these epidemics served to encourage the
development of a legitimate board of public health, discount religious and moralistic
arguments for disease spread, and elevate hygiene and living conditions for the poor.
While prior experience with contagious diseases such as Tuberculosis and Malaria
served to prime the public health institutions in the United States, there was no
precedent for dealing with a disease like Cholera. Swift and deadly, Cholera could take
victims within a day of the appearance of symptoms. Early reactions from the 1832
epidemic included government secrecy, public abandonment of cities for the country,
and even murders of those suffering from the disease. However, as religious authority
was replaced first by the moralistic rationality of the Jackson Age and later by scientific
reasoning, the United States was able to come to terms with the disease by enacting
public health initiatives that were based upon the beginnings of what would become
germ theory. The stark difference between the 1832 and 1866 epidemics demonstrates that
changes in the conception of disease, including its causes and repercussions,
necessitated ethical reform.
The 1832 epidemic started in Eastern Europe, making its way across the
continent and finally, through trading ships, to the United States. It is telling that the
United States, while making some preparations, was initially secure in the belief that it
was protected from the suffering and disease of the Old World. Religion and moral
uprightness, in the form of the virtues of cleanliness and meaningful labor, were contrasted
with the filth and idleness of those in European cities. Thus the conception of disease at
this period of time was based solely upon religion and morality. Even physicians of the
time, while discounting pure faith as a preventative means, regarded moral stature as
paramount to the health of an individual.
While quarantines were initially used to stop the spread of Cholera from the Old
World to New York City, very little was done immediately after the disease took hold.
The government’s board of health attempted to hide incidences of Cholera from the
public, worried that news of an epidemic would cause trade and business to halt and
result in great losses for the city. The Medical Society finally spread the news of a
Cholera outbreak in the interest of prevention; however, rather than being regarded as a
step towards truth, this was viewed as a misguided blunder doomed to induce panic.
The public regarded doctors as mere citizens with no usable knowledge of the disease
or how damaging news of the outbreak would be for businesses. Monetary concerns
overtook those of health during this time, with the government having the most incentive
for secrecy concerning incidence of Cholera. With no way to prevent disease, doctors
had no status with which to wield authority.
Although the protective veil of faith seemed to have no credence after the onset
of the outbreak, the public still clung to the belief that disease was linked to depravity
and vice, making prevention difficult. If the only theoretical basis of disease of the time
was religious and moralistic, prayer, fasting, cleanliness, and labor should be considered
as preventative measures and a focus of public health. However, for the poor, drunkards,
immigrants, and others who were already considered irreparably depraved, death by
Cholera was expected and considered either God’s punishment or moral justice. During
the 1832 outbreak of Cholera it is evident that the perceived moral cause of disease
resulted in a breakdown of public health, but was this also an ethical failure?
Government response during the first Cholera epidemic was severely lacking.
Initial complacency, however, can be excused given the era’s reasoning about how
disease spread. If the New York City government truly believed that America was
safe from Cholera due to the country’s faith and morality then prevention should not
have been a concern. However after the initial cases of Cholera, the government shirked
responsibility, remaining inert during a critical time. Rather than attempt to use all
resources to warn the public, clean the streets, and prevent spread of disease, the
government waited out the epidemic in silence. This “official silence” that Rosenberg
describes is an ethical failing of enormous proportions. The government could have
made the news of the outbreak public sooner in the interest of disease prevention.
Lack of medical authority complicated matters as well. Since the Medical Society
had no clear prevention or cure for Cholera, there could be no protocol for the public to
rely upon after news of the outbreak spread. In fact, the news of Cholera increased the
spread of the disease through a mass exodus from the city to the countryside.
Rosenberg effectively contrasts this initial Cholera epidemic to those that occurred in
1849 and 1866. By the 1849 epidemic, religion and faith had lost their hold on the
American public. During the Age of Jackson, Americans embraced a reason based on
the natural morality of man rather than a faith based upon spirituality and God. However
conceptions of disease were not affected by this shift in philosophy as disease was still
thought of as mainly a moralistic affliction exacerbated by violent emotion, idleness, and
uncleanliness.
Doctors were still considered unknowledgeable and in some cases even
charlatans as it became more and more clear that they did not know the cause or
prevention of Cholera. Quacks charged high prices for ineffectual medications and many
doctors refused to make calls to the dirtier streets inhabited by the poor. The public lost
faith in these doctors, calling for a unified medical opinion among them. Only
surgeons retained status, due to the efficacy of their methods. It was not until the
epidemic of 1866 that doctors were able to draw conclusions about the source of disease.
In terms of medical progress, the epidemics of 1832 and 1849 were very similar.
Disease was still thought to be an atmospheric malaise caused by local fermentation of
air. Many doctors denied the contagious nature of the disease, as they could not prove
the route by which it spread between people. By 1866, attitudes in medicine had
changed, shifting from knowledge based on moral rationality to knowledge based on
empirical evidence. This shift would prove crucial to the prevention of disease and the
reformation of medical ethics.
In 1866, New York City’s poor were victims of mass industrialization, living in
unsanitary tenement housing. Outbreaks of typhoid and dysentery were common.
Doctors pushed for public health reform in the interest of preventing another Cholera
outbreak, and finally the government yielded. It had by this time become more apparent
that the condition of the poor was a failing of society as a whole rather than a testament
to the morality of the individuals. The public health bill, which would later become a law
to create a Metropolitan Board of Health, would be integral in the prevention of the
spread of Cholera during this epidemic.
The ethical reform of the epidemic of 1866, as seen through the lens of public
health initiatives, was a function of increased knowledge of the nature of disease and
changes in social opinion. The medical community had started using “exact methods of
investigation” to examine the diseased and had learned that Cholera was spread through
the bodily fluids, vomit and diarrhea, of those affected. Preventative measures could
easily be taken to prevent spread by sterilizing all items that came into contact with
these fluids. While the rise of industrialization made an appropriate home for disease
outbreaks, it also lent the public tools by which to stop disease.
In 1854 John Snow, an anesthetist in London, published data proving that
Cholera was transmitted through contaminated water; by empirical means, he
correlated Cholera incidence with which of two different water providers supplied a
household. After this discovery, it was clear to most physicians that Cholera was indeed
transmissible. The Metropolitan Board was able to put together plans in the event of
Cholera outbreaks to effectively prevent spread of disease. Once a case was reported,
the address of the individual’s home was telegraphed to the Board. The Board
dispatched trained individuals to sanitize the house and belongings of the victim. This
change in public health initiative and the power given to the Metropolitan Board is an
effect of increased knowledge of disease transmission and a change in societal
leanings. The public, once satisfied with religious and moral explanations, had now been
shown proof that Cholera was transmissible through specific means and could be prevented.
Doctors no longer provided ineffectual remedies and cures at high prices, focusing
instead on prevention and preserving the community.
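
Snow's reasoning can be restated in modern epidemiological terms as a comparison of
attack rates between households served by the two water companies. The following is a
minimal sketch in Python, not anything Snow himself computed; the household and death
counts are the figures commonly cited from his 1855 report and should be treated as
illustrative.

    # Illustration of John Snow's comparison: cholera mortality among
    # households served by two London water companies (figures as commonly
    # cited from Snow's 1855 report; treat as illustrative).
    supply_data = {
        "Southwark & Vauxhall (contaminated intake)": (40046, 1263),
        "Lambeth (cleaner upstream intake)": (26107, 98),
    }

    for supplier, (houses, deaths) in supply_data.items():
        rate = deaths / houses * 10000  # deaths per 10,000 houses
        print(f"{supplier}: {rate:.0f} deaths per 10,000 houses")

    # The roughly eight-fold difference between otherwise similar
    # neighborhoods implicated the water supply itself, independent of
    # any "moral" explanation of disease.
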
The example of the New York Cholera epidemics shows that the very conception
of disease as well as the opinion of the people needed to change to allow ethical reform.
Without knowledge of causation in 1832, it was impossible to devise means to control
the chaotic public, convince the government to act, or formulate a plan of action. Only
through the rise of empirical scientific thought and fall of religion, could the medical field
gain the authority necessary to provide a protocol to minimize damage in an ethically
sound manner. Concepts of the causation of disease are critical to determining ethical
responses.
In 1832, government inertia was acceptable at first and even favored by the
people, who feared loss of profits more than disease. Doctors had no resources through
which to act and unethically provided treatments that they knew were not curative.
These issues still remain relevant today, although we have many governing bodies to
keep checks and balances. The FDA approves drugs that are not curative of disease but
rather improve the quality of life of patients, such as treatments for terminal diseases like
pancreatic cancer. Additionally, in the interest of profits, many pharmaceutical
companies might provide treatments without doing adequate long-term clinical trials to
see if drugs are harmful to patients. In 1866 we see the beginnings of some of the concepts
governing medical ethics in the present such as public sanitation for all areas in the city,
public health protocols in the incidence of an epidemic, and the underpinnings of the
American Medical Association. The Cholera outbreaks described by Rosenberg provide
a critical means by which to examine cultural attitudes, medical and social, during the
19th century. The emergence of epidemiology, empirical science and germ theory
introduced the cultural attitudes concerning medical ethics that we espouse today, with
blame taken away from those affected by disease and responsibility placed upon society
as a whole.
Plastic Fantastic: How the Biggest Fraud in Physics Shook the Scientific World by Eugenie Samuel Reich
by Meagan Belcher Dufrisne

In this excellent non-fiction work, Eugenie Samuel Reich takes an in-depth look at one of the most elaborate cases of scientific fraud of our lifetime. Jan Hendrik Schön was a prolific postdoc and thought-to-be wunderkind in the field of organic crystal physics. While working at Bell Labs, he fraudulently claimed to have observed the quantum Hall effect and superconductivity on the surface of organic crystals. From this not-so-humble starting point, he published several different papers in high-profile journals such as Nature and Science in numerous fields of physics. His publications included findings that organic crystals could be used to make lasers or have exciting roles in nanotechnology. This book not only elucidates many contributing factors to scientific fraud, including the supervisor-researcher relationship, the pressure to publish, and the importance of data management and record keeping, but it asks the controversial yet fascinating questions: Is science as unwaveringly self-regulatory as we all claim? And can a case of fraud ever enhance science? Reich seems to suggest that this case shows the inefficiency of scientific self-regulation and reaches to inspire fear of what happens when we forget Schön and his tale. In my mind, however, this is a perfect example of how truth comes to light under the patience and skepticism of science, and how seeing the wheels of science turning reminds us what we strive for.

How could this have happened? How could one man mislead this community of experts for close to five years? Reich emphasizes the hunger of the field for these great breakthroughs. Schön's findings were scientifically relevant and, if true, had huge implications for the advancement of electronics and computing capabilities. It was a hot field. People desperately wanted to see evidence of these concepts, so they may have clung to fraudulent data longer than they would have in other circumstances. Reich also emphasizes that Schön arrived at Bell Labs at a time of transition and that he was left with very little oversight from supervisors or project managers. He never showed original data and, it seems, was not pressed by his superiors even when they asked for it.

The level of fraud may also have been heightened by the pressure to publish. Schön's claims had gotten him published in some of the world's most prestigious journals: Nature and Science. Once he published in such high-level journals, he felt pressure to produce the same level of success. In this way, his early fraud fueled even more fraud later on. Schön's publishing status also lent credibility to his claims and earned him a lot of respect in the field. Critics of Schön were confronted with statements such as "How many papers have you published in Nature recently?" Even some of his peers at Bell Labs were scolded for their lack of results while citing Schön's latest success.

Reich also points to the structure of the scientific community as failing in this case of fraud. Reich quotes Peter Littlewood as saying, "For a long time, I didn't believe it could have been fraud because I didn't believe one person could make all that up. Then I realized, we all made it up." Littlewood is referring to Schön's habit of taking criticism as a suggestion of how he could make his fraud more convincing. His peers would suggest experiments that would clarify his results, and by forging even more data for these new experiments, his claims seemed substantiated and even more credible to his audience of experts. Reich makes the same point about the journal review process: "…a thorough review may do little more than reveal to authors what changes they need to make in order to turn a false claim into a more plausible scam." Skepticism seems to have fueled his fraud.

So does this mean the self-regulation of the scientific institution has failed? Though science is based on the skepticism necessary to discern the truth, it is also based on trust of the scientific community. There is a sort of honor system based on the assumption that your fellow scientists hold the same sincerity and respect for the scientific institution. In the case of Jan Hendrik Schön, this trust seems to have been a detriment, and Reich criticizes Schön's peers for not seeing through his ruse sooner. Reich even says, "…most scientists found it inappropriate to talk about fraud simply because a result seemed generally unconvincing." Though Reich seems to use this as a criticism, I believe it is correct to be cautious about accusing fraud. Because of the damage that can come to one's career from even the implication of fraud, it is important to bring such claims only when there is little to no doubt of their veracity. This trust was also never completely blind. Groups of researchers attempted to reproduce Schön's results. They did not simply move on to the next step of research without attempting to confirm the assumption. This repeated failure to reproduce his results was Schön's ultimate downfall. In this respect, the system of science worked exactly how it should. It is also important to note that, although any number of additional oversights or an increase in skepticism might have stopped this fraud before it became so monumental, some element of trust must remain in science. Collaboration and camaraderie are essential to good science.

After Schön's disgrace in the scientific community, research groups have gone on to show results beyond Schön's wildest imagination. "Scientists working with graphene were able to apply the field effect and measure the quantum Hall effect even at room temperature, a temperature far higher than Schön, constrained by the expectations of known science, had ever dared to claim." Did Schön's original fraudulent claims, which created such excitement in the field, actually fuel these legitimate discoveries? Reich quotes Brooks Hanson at the journal Science as saying, "Does it advance science? Even a paper that's wrong can encourage new science." And Reich adds, "The future course of science, not the journal review process, is the ultimate arbiter of truth or falsity of scientific claims." The incendiary idea that fraud can be good for science is not exactly what I would like to profess. However, the uncovering of fraud renews faith in science. It reaffirms the importance of peer review and of skepticism in general. Though those who commit the act are left ruined and disgraced, those left holding the pieces have a better idea of what is right. It reminds scientists that truth is what we search for, not awards and prestige. That is sublimely encouraging.

For his misconduct, Schön was eventually fired, stripped of his PhD, and barred from receiving any type of public funding in science for eight years. He will never be able to work in the field again. His tale is cautionary and outrageous. But I would argue, in spite of Reich's doubts, that this tale is also clear evidence of the scientific process working. The self-regulation of science, however "messy" or "haphazard," is clearly present.

Responsible Conduct of Research and Related Policy Issues: Term Paper
Submitted by: Sumayyah Ebrahim
History is fraught with numerous examples of ethical and human rights violations in the area
of research involving human subjects. These incidents have led to the development of
guidelines for clinical research1. Experimentation involving humans in World War II German
concentration camps led to the Nuremberg Code in 1947. The Nuremberg Code outlines ten
principles including the absolute need for voluntary consent and the requirement that research
risks should not exceed its benefits. In 1964, the Declaration of Helsinki was adopted to
tackle the limitations of the Nuremberg Code, and is generally regarded as the “cornerstone
document of human research ethics”1,2. The Declaration states that ethical principles must be
observed when research is combined with clinical care, just as when conducting research on
healthy volunteers. The Belmont Report was published in 1979, following the 40-year-long
Tuskegee syphilis study. The Tuskegee study enrolled approximately 400 underprivileged
African-American men who had syphilis. They were not informed that they had syphilis, and
were not treated for it, even after treatment with penicillin became the standard of care. The
Belmont Report is “often cited for 3 basic ethical principles: (1) respect for persons (subjects
must enter into research voluntarily and with adequate information), (2) beneficence
(research should maximize benefits and reduce risks), and (3) justice (subjects must be
selected fairly for participation in research, and the burdens and benefits of research should
be fairly distributed in society)”3. Institutional Review Boards (IRBs) were established during
this time in response to these and other earlier research abuses. IRBs must ensure the
“meaningful consent” of research subjects and are also responsible for overseeing human
subject research, ensuring that it is “scientific, ethical, and regulatory”1.
Informed consent is a legal procedure to make sure that a patient or research subject is aware
of the risks and costs associated with a treatment/study. The components of informed consent
include describing the nature of the treatment, potential alternative treatments, and possible
risks and benefits of the treatment. In order for informed consent to be considered valid, the
subject must be competent and consent should be given voluntarily2. The person involved
should have legal capacity to provide this consent, should not be coerced into participation in
the study, and should have sufficient knowledge and comprehension of elements of the
consent form and study objectives to allow him/her to make an informed decision2. I will
discuss the issue of informed consent as illustrated in the following case study of ethical
violations in Human Papillomavirus (HPV) vaccination trials in India2,4.
In India, obtaining valid consent for vaccine trials is the norm; however, questions were
raised recently in certain trials as to whether the consent process was really ‘informed’ or not.
In 2009, the “Ministries for Health and Family Welfare in association with the Indian Council
of Medical Research (ICMR) and PATH (a non-profit organization based in USA) in the
states of Andhra Pradesh and Gujarat launched what it described as a ‘demonstration project’
for vaccination against cervical cancer”4. The HPV vaccine was administered to 14,000 girls
between the ages of 10 and 14 years, and there were six reported deaths following vaccine
administration. The vaccine was administered through a ‘camp’ approach in hostels and
schools, where the wardens of these residential schools and hostels were asked to provide
consent or permission for vaccination - parents were not informed. This represents a violation
of standard ethical norms as a warden, “whether a legal guardian or not, was allowed to
provide consent for hundreds of children without consulting their parents, who are their
natural guardians”2,4. In some cases (non-residential schools), the consent form was handed
out to children who were then asked to get signatures from their parents. This was clearly in
breach of the defined protocols for obtaining informed consent, where it is necessary that the
researcher directly provide information required for informed consent to the parents (in cases
involving minors). Some girls in the study were given HPV immunization cards, which were
in English – an unfamiliar language to both study participants and their parents. Furthermore,
all involved parties (wardens, teachers, students and parents) were under the impression that
the initiative was part of the routine public immunization programme, and were unaware that
they were in fact, part of a research study. They were not informed that they had a choice
regarding participation in the study4.
After a compelling campaign by health networks, women’s advocacy groups and politicians,
the trial was subsequently suspended by the Indian government. A formal inquiry committee
was set up by the government and in April 2010, ICMR admitted that their “ethical guidelines
had been flouted in the course of this trial”4. While these findings are germane to preventing
ethical violations in future clinical trials in India and other developing countries, they provide
little peace of mind and redress for the parents and families of the girls who lost their lives in
the course of this trial.
Vaccines in the context of clinical trials raise a special concern. It behoves the scientific
community to ensure that new drugs and vaccines undergo proper scientific evaluation before
they are considered for approval. This case also highlights the requisite for a “national
vaccine policy that addresses public health interests, rather than those of the market, before
introducing any new vaccine in the national immunization programme, or even in the open
market”5. These issues are further compounded in vulnerable communities such as in this
case involving minors and their parents; some of whom may well be illiterate and
marginalized, with poor access to healthcare, information, and means for reporting adverse
effects4. In the context of clinical trials in these settings, it is especially important for the
scientific community to ensure that the best interests of these vulnerable communities are
protected.
In conclusion, clinical research involving human subjects raises complex and thorny ethical
and methodological challenges6. Ethical guidelines are imperative in clinical research as they
safeguard patients’ health and privacy1,7. Unethical research can undermine public trust,
leading to an increase in litigation, and decreased confidence in new research findings, as
well as wasting time and money1. Research and documentation centred on human rights have
led to the establishment of “rights-based interventions and the promotion of human rights in
the core strategies of international health organizations” and research ethics committees6. It is
incumbent upon all members of the scientific and medical communities involved in clinical
patient care and research to uphold these core values.
References
1. Chung KC, Kotsis SV. The ethics of clinical research. The Journal of Hand Surgery. 2011;36(2):308-315.
2. Rajput M, Sharma L. Informed consent in vaccination in India: Medicolegal aspects. Human Vaccines. 2011;7(7):723-727.
3. National Institutes of Health OHSR. The Belmont Report: Ethical principles and guidelines for the protection of human subjects of research. http://ohsr.od.nih.gov/guidelines/belmont.html.
4. Nigam A. Ethical violations of HPV vaccination trials in India: SAMA. http://samawomenshealth.wordpress.com/2010/05/17/trial-and-error-ethical-violations-of-hpv-vaccination-trials-in-india/. Accessed 26 March 2013.
5. Sarojini N, Srinivasan S, Madhavi Y, Srinivasan S, Shenoi A. The HPV vaccine: science, ethics and regulation. Economic & Political Weekly. 2010;45(48):27.
6. Amon JJ, Baral SD, Beyrer C, Kass N. Human rights research and ethics review: protecting individuals or protecting the state? PLoS Medicine. 2012;9(10):e1001325.
7. Kalantri S. Ethics in clinical research. Indian Journal of Anaesthesia. 2003;47(1):30-32.

The Case of Stephen Glass
by Dr. Daniel Freedberg
At first glance, the story of Stephen Glass is simple. A creative, precocious child is rewarded for
his ability to confabulate. He enters the world of adulthood with an unusually undiminished
imagination. He becomes a journalist and his previously academic confabulations become
"real." Yet he continues to make things up, telling bigger and bigger lies until he is inevitably
caught. However, behind this simple, sad story there lurk several challenging questions. Could
Glass' life have ended differently? What can we learn by studying Glass' character? What
mistakes were made by those close to him? And, as researchers engaged in human subjects
studies, what can we learn by scrutinizing this cautionary tale?
Stephen Glass came from an upper middle class family -- his father was a physician -- and he
was an early success. His Chicago suburb had a tradition in the dramatic arts (the creator of
the cult classic Risky Business came from the same town) and Glass loved theater. Although
socially insecure, he felt comfortable when playing a role. He did well in a very competitive
public high school and went to the University of Pennsylvania. There he was mediocre
academically but excelled as the executive editor of the college paper. After graduating he
became a professional journalist, first at Policy Review and later The New Republic. At The
New Republic Glass quickly worked his way up the ladder by writing pieces that were amazingly
full of colorful anecdotes and characters. By age 25 he was one of the top magazine journalists
writing for Rolling Stone, Harper's, and The New York Times. How did he find all these great
magazine stories? Easy: he made them up.
Of course, it all came quickly tumbling down. In hindsight, the lies are evident from Glass' first
piece, a story about taxi drivers. The anecdotes were just too good. His next piece, about a
Washington think tank, was proven to be fabricated. And from there the lies expanded.
Questions were raised about a story, entirely fabricated, that described a hacker who
threatened an electronics company. Glass was defensive, stalled, and resisted. When pushed,
he made his younger brother, a college student, pose as a phony company executive. But it
was too late: Glass was caught. And then it all unraveled. The stories were basically all lies
with proven falsehoods in 27 of 31 published articles.
The Glass story has the fascination of watching a train wreck. It would be easy to dismiss Glass
as pathological, a born liar destined for jail or ignominy. Although Glass earned a decent salary,
no one has suggested that he was motivated by money. Nor was Glass lazy or sloppy. On the
contrary, he was The New Republic's best fact-checker, working long and hard hours. But while
there is truth behind a depiction of Glass as pathological, such a portrait would be too
convenient as a sole explanation of his actions. Glass was human and had human
characteristics. He was sensitive, vulnerable, often endearing. He had many friends and was
liked by colleagues. What can we learn by probing deeper into Glass' personality?
Glass' most obvious character flaw was an excessive eagerness to please. Pushing this idea a
bit farther, we can hypothesize: the fundamental problem was that Glass didn't know himself.
Buzz Bissinger's 1998 Vanity Fair article on Glass makes this point clearly. According to
Bissinger, Glass was "almost brutally self-flagellating about his own work and abilities."
Bissinger points out that colleagues were unsure of Glass' sexual orientation. Glass is straight
but apparently didn't always send the right signals: he wrote an article for his college paper
describing an episode when a man tried to kiss him after dinner. This is not to imply that there's
anything wrong with having multiple sexual identities or being attracted to people of both sexes.
Rather, in Glass' instance, it is the symptom of a more important disease: Glass didn't have any
true identity at all.
But Glass couldn't do it alone. The lies told by Stephen Glass could not have persisted for so
long without the complicity -- active or passive -- of people around him. Glass' editor, Charles
Lane, is featured prominently in Bissinger's article. Lane is depicted as taking an almost
paternal, protective role towards the younger Glass. Suspicions quickly arise about the veracity
of Glass' articles but Lane brushes them aside. One wonders about Lane's motives. Were they
purely altruistic and paternal? Was it partly pride at having discovered this young hotshot
reporter? Or was Lane himself too insecure in his role -- he was fairly new to the magazine -- to
reach the obvious conclusion that Glass was a liar?
Finally, as biomedical researchers what can we learn from Glass? First, we can know
ourselves. Greed, ambition, and the desire to be "right" in our scientific hypotheses are all
normal, natural feelings. We are better off acknowledging that these forces exist than trying to
deny them. We can fight off these devils with honesty, understanding, and humor far better
than with abnegation. Second, we shouldn't go it alone. Charles Lane could have caught Glass
earlier if he had confided his doubts to someone else. If Lane's doubts had been wrong, no
great harm would have been done. And since Lane's doubts were right, he could have stopped
Glass -- maybe even have saved him.
Glass' kind of flagrant dishonesty is exceedingly rare in biomedical research. (The cases exist,
of course, but in a tiny minority.) Perhaps the most important lesson to take away from the story
of Stephen Glass is that the web of lies can become a slippery slope. Even for Glass, things
started slowly. He began by making up picaresque anecdotes and finished by co-opting his
younger brother in his web of deceits. As a researcher, one could conceivably be tempted by a
"little" lie of omission. For example, what if one cell line produces "aberrant" results? Can we
throw out the data? What if we have reason to believe that the cell line was contaminated? The
line of truth can begin to blur. Stephen Glass reminds us that the truth ultimately triumphs. To
be good scientists, we must first be faithful reporters.
Whatever happened to Glass? A follow-up article published in Vanity Fair in 2007 said that he
moved to Hollywood, wrote a self-serving book, and took a job as a paralegal. He had a
girlfriend and (somewhat pathetically) did storytelling comedy sketches in his spare time.
According to a July 2012 article in The Los Angeles Times, Glass applied to the California State
Bar Association to become a lawyer in 2009. He was rejected, and the case was appealed to
the California Supreme Court, where it remains on appeal.
The Pressure to Deliver or the Need to Succeed: An Analysis of Shattered Glass
by Vanessa Hill

We have all, at one time or another, found ourselves under a great deal of pressure in a stressful situation, be it the pressure to throw a strike in a tied Little League game, the pressure to do well on a final exam, or the pressure, in the case of Stephen Glass, to deliver engaging journalistic pieces by a demanding deadline. These situations can sometimes place us at an ethical crossroads, one where we must decide between playing by the rules and risking a loss, or cheating to ensure a victory. But how do we compare the stressed-out college student who glances at his peer's exam answers to the journalist who fabricates entire articles for an esteemed news magazine? Does ethics belong on a spectrum where one lie can be deemed "worse" than another? And could a lie's position on this spectrum be influenced by factors outside of the lie itself, such as which factors drove the liar to lie, whether the liar is remorseful, and what kind of impact the lie has? It is clear that cheating on an exam and fabricating news stories both involve breaking a code of ethics, but comparing the two situations may bring to light some fundamental differences that will help us to consider the way in which unethical actions should be categorized and punished.

Let us begin by analyzing the character of Stephen Glass, both from the perspective of the audience and from that of his unsuspecting colleagues. As Stephen's network of lies begins to unravel, his coworkers at first come strongly to his defense; after all, they say, he is just a young boy under a lot of pressure. The audience, too, is duped at the start of the movie, when they meet Stephen Glass, a meticulous young journalist who has a passion for his field and a respect for the responsibilities that come with it. When speaking to eager high school students, Stephen fervently describes a journalist's obligation to deliver double-checked facts in an engaging and thought-provoking way to readers. Early in the movie, Stephen is so ashamed by a mistaken claim regarding hotel minibars in one of his pieces that he offers to resign. But later, the audience learns that this hotel and every interesting detail surrounding it were just as fabricated as the persona Stephen created in front of his colleagues. The audience comes to realize that the minibar mistake is just the tip of the iceberg and that the meticulous man they met in the beginning showed concern over his rookie "mistake" only for the sake of creating his facade as an honest and reputable individual. His actions reveal a premeditated plan aimed at deterring others from ever suspecting that he would be capable of the brazen fabrications on which his esteemed reputation actually relies.

Based on his character, let us now consider what it is that drives Stephen Glass to fabricate his stories. His colleagues may, at first, blame his lies on stress caused by his busy schedule and the overwhelming pressure to succeed. But stress causes panic and impulsive decisions, and this is not what we see from Stephen Glass. Stephen's lies are premeditated, tracing back to his start at the company, when he first begins his manipulative attempt to create the persona of the honest, fact-worshiping journalist behind which he hides his deceit. When first confronted about the inaccuracies in "Hack Heaven," rather than admit his guilt immediately, as a nervous, remorseful, stress-driven person might do, Stephen tries to cover his tracks with new lies. He is so convinced that his persona will protect him that he creates a fake company website and phone line, and enlists the help of his brother in impersonating a fabricated character in his story. Based on his actions, Stephen Glass is a manipulator, and his lies are driven not by stress or the pressure to deliver, but by a greedy, inner need to succeed at any cost. Throughout the movie, he never once expresses any degree of remorse for his deceptions. He seems more disturbed by the threat of being caught and having a ruined career than by the fact that his actions have defamed the magazine for which he works, as well as perhaps the reputations of his colleagues.

Now the question that emerges is whether the reason for a lie, the degree of remorse, and the impact that the lie has should influence the punishment. How does Stephen differ from the stressed-out college kid peeking at his classmate's exam? While the cheating student is most likely driven by desperation, and may feel less elated later when he earns a grade that he knowingly does not deserve, Stephen, instead, seems to be driven by greed. Perhaps Stephen's first few articles caught the eye of a few renowned editors and earned Stephen a bit of fame. After that, he needed more fame, and was willing to obtain it at any expense.

An equally important thought to consider is which factors did not stop Stephen from carrying out his lies. An individual who truly appreciates journalism as a means of bringing important facts and thoughts to the general population, such as unearthing great injustices or encouraging public reform, may also have a burning desire for success. However, this type of individual would not fabricate stories or even small facts, because he would feel a responsibility to uphold the honest and reliable practice of his field, and that responsibility would outweigh any temptation to lie as a means of gaining personal success.

Based on Stephen's actions in the movie, it seems that he is a manipulative and greedy individual who does not respect the principles upon which his occupational field relies. He fabricates stories without showing remorse for the impact his actions will have on his company and colleagues. When comparing Stephen to a severely stressed, desperate college student who glances at his peer's exam, it seems that unethical actions are not created equal. The degree of wrongdoing can be based on the motives of the wrongdoer, the wrongdoer's degree of remorse, and the impact of the wrongdoing, and these factors should influence the appropriate punishment.

What films can teach scientists
Kipyegon Kitur
2nd year Pharmacology Graduate Student
Charles Ferguson's documentary film Inside Job reports on the deregulation of the financial system and the
financial crisis that occurred in the late 2000s. The film shows the impact of financial deregulation, particularly in
Iceland and the U.S.; how financial firms including investment banks, rating agencies and lenders engaged in
unethical and illegal activities; and how conflicts of interest corrupted the judgments of many economists,
bankers, lawmakers and financial regulators. Here, I review the consequences of financial deregulation and
unmanaged conflicts of interest. I also propose that implementing adequate regulations and limiting conflicts of
interest will reduce future financial crises.
Financial conflicts of interest were pervasive contributors to the financial meltdown. Several players in the
financial system during the period leading to the crisis experienced some level of conflict of interest. First, credit
rating agencies (e.g. Standard and Poor’s, Moody’s, and Fitch) were involved. These rating agencies maintained
“safe” rating grades for many firms including Fannie Mae, Freddie Mac, AIG, Lehman Brothers, and Bear Sterns,
even shortly before some of these firms failed. The film points out that investment banks paid rating agencies to
rate risky investments AAA or “safe.” Some investors, including retirement funds, bought these supposedly safe
investments and failed as a result. Thus, the credit rating agencies’ financial incentive to falsely rate risky firms
“safe,” together with their lack of liability and accountability, strongly contributed to the financial crisis.
When Congress questioned some of the employees of the rating agencies for rating risky investments “safe,”
they argued that they were simply making “opinions,” suggesting that investors should not have taken the rating
agencies’ words as facts. However, these agencies had and still have enormous influence in the financial system,
and diverse sectors, ranging from banks to regulators, use their ratings. This significant influence warrants their
tight regulation, to keep subjective and biased ratings from negatively influencing financial systems.
Second, academic economists played a role in deregulation by giving intellectual support to financial firms that
engaged in risky investments. The film shows that prominent economists and business school professors
including Martin Feldstein, Ruth Simmons, Glenn Hubbard, and Larry Summers made money from consulting for
financial institutions and giving intellectual imprimaturs to their risky financial activities. The Icelandic Chamber
of Commerce paid Frederic Mishkin to write an article that showed that Iceland’s economy was stable and strong.
However, about two years after the report, Iceland went into a catastrophic financial collapse, proving Mishkin
wrong. Several economists also wrote in support of the use of risky financial instruments such as derivatives,
without disclosing compensation from financial firms. These examples illustrate how financial conflicts of interest
corrupted the objective judgment of many economists and contributed to the financial crisis.
Third, former bankers taking government positions contributed to the collapse. The film demonstrates how little
government regulators did to protect the public from risky financial activities and how, in several instances, they
advocated for those activities and even quieted dissenters. Larry Summers’s quashing of the recommendation of
Brooksley Born, the then chair of the Commodity Futures Trading Commission (CFTC), to tightly regulate
derivatives best exemplifies this point. Former bankers who went to work for the government, including the then
Treasury Secretary, Robert Rubin; Federal Reserve Chair, Alan Greenspan; and SEC Chair, Arthur Levitt, condemned
the CFTC for trying to regulate derivatives. These former bankers actually recommended that derivatives should be
unregulated. As the film points out, these derivatives were one of the main contributors to the financial meltdown.
Hence, this example and several others demonstrate the damage to the economy that conflicts of interest between
former-bankers-turned-government-regulators and their former financial firms can cause. This point is not meant
to argue against hiring former bankers for government positions. Their experiences are useful. However, experience
should not trump impartiality, and regulations should ensure these officials make independent, impartial and
sound judgments.
Fourth, another injurious conflict of interest, one the film did not emphasize, is that between campaign
donors and lawmakers. The film notes that financial firms spent over $5 billion on lobbying and campaign
contributions, but it did not touch on the direct influence that this money has on the financial system.
However, science offers a precedent for how funding sources can influence research, and for this reason,
scientists are required to disclose their financial sources. The same principle should go for lawmakers. They
should make their funding sources known to the public and show ways in which they can avoid favoritism when
passing bills.
These conflicts of interest flared for two reasons: complexity in the financial system and deregulation.
Complex financial instruments allowed individuals and firms to engage in risky financial activities. The more
Collateralized Debt Obligations (CDOs) they bundled, the more profitable they were. The subprime mortgages
carried the highest interest rates, making them even more profitable. Lenders did not care if borrowers could repay.
They gave mortgages to even the riskiest homebuyers. Investment banks bundled these mortgages as CDOs,
which credit rating agencies rated AAA or “safe.” Goldman Sachs bet against these risky CDOs while falsely
telling their investors that they were high quality investments. This complexity and interdependence created an
opportunity for a calamitous domino effect and complicated smooth regulation of the system.
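
The domino-effect risk can be made concrete with a small simulation. The following Python sketch is a hedged,
hypothetical illustration (the loan counts and default probabilities are invented for demonstration, not taken from
the film): a senior CDO tranche built from risky mortgages looks nearly riskless if defaults are assumed
independent, but fails regularly once a shared housing-market downturn correlates them.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical pool: 100 mortgages, each defaulting ~10% of the time.
    # A "senior tranche" loses only if more than 25 of the 100 loans default.
    n_loans, attachment, n_sims = 100, 25, 100000

    # Case 1: independent defaults -- the senior tranche looks essentially safe.
    indep = rng.binomial(n_loans, 0.10, n_sims)
    print("P(loss), independent defaults:", np.mean(indep > attachment))

    # Case 2: a shared housing-market factor correlates defaults. In downturn
    # years every loan's default probability jumps, so extreme outcomes that
    # wipe out the "safe" tranche become common.
    downturn = rng.random(n_sims) < 0.15           # 15% chance of a bad year
    p = np.where(downturn, 0.45, 0.04)             # conditional default rates
    corr = rng.binomial(n_loans, p)
    print("P(loss), correlated defaults: ", np.mean(corr > attachment))
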
Conflicts of interest and unethical behaviors spread after deregulation. The removal of one regulation allowed the
consolidation of firms into gigantic ones, which were “too big to fail.” For example, in 1998, Citicorp and
Travelers merged to form Citigroup, the largest financial firm in the world. As the film notes, the merger was
illegal. It violated the Glass-Steagall Act, which was meant to prevent commercial banks from participating in
investment banking. However, Alan Greenspan did not disapprove of the merger. In fact, Larry Summers and
Robert Rubin urged the passage of the Gramm-Leach-Bliley Act, which overturned the Glass-Steagall Act.
There are several other instances where deregulation contributed to financial collapse. For example, (a) the SEC
relaxed rules that had prevented banks from overleveraging, which led to over-borrowing and the flourishing of
noncompetitive firms; and (b) regulators allowed firms to hand out big bonuses not necessarily tied to performance.
In fact, in some instances, CEOs were paid enormous salaries while their companies were underperforming or
failing. Regulations to ensure that bonuses are measured and meant only to reward achievements are essential to
maintaining a healthy financial system.
These arguments do not necessarily call for over-regulation. Over-regulation can lead to economic stagnation,
whereas adequate regulations, when implemented correctly, can lead to economic growth. The film shows that
regulations after the Great Depression were well measured and thus contributed to the growth of the U.S. economy
for almost half a century. For example, during this period, laws did not allow commercial banks to bet with their
customers’ deposits. Thus, regulations have to be right and implemented correctly so as not to impede growth, but, at
the same time, not to lead to unfairness and financial disasters.
In all, the film elegantly shows the detrimental impact of financial deregulation and unmanaged conflicts of
interest. I have argued that regulations, when designed and implemented carefully, can mitigate financial
catastrophes. I also believe the financial system can learn a thing from science – publicize and limit conflicts of
interest. These two – tight regulations and managed conflicts of interest – might save the world from future
financial crises.
How to Address Suspicions About Scientific Misconduct
by Kristin A. Politi
When published data seems “too good to be true,” it sometimes is. Dutch psychologist Diederik
Stapel fabricated data over at least seven years that affected at least 55 publications and the careers of many
young researchers and collaborators. Three young researchers from his own institution finally exposed him
by observing and collecting evidence over a number of months. However, they were not the first to raise
suspicion. During the investigation, two professors came forward claiming to have observed data from
Stapel they believed “too good to be true.” However, the people behind these early claims did not have the
evidence or the means to collect evidence to support their suspicions.
In the wake of widespread, high-impact instances of fraud, there are experts who claim to have
been skeptical from the beginning. Imagine if, in these cases, there were a way to investigate early on, at the
level of suspicion: careers, money, time, and livelihoods could be saved. Is it feasible to address suspicion
as a community? Or must we wait for hard evidence to emerge, while careers of those impacted are silently
destroyed?
What do we stand to gain if we decide to trust intuition as a means to expose fraud? Any expert in
a field can have skepticism that data seems “too good to be true.” If this expert were able to suggest an
investigation into their suspicion without consequence, this would reduce our reliance on whistleblowers
from inside the laboratories or departments of those accused. Stapel’s accusers were young investigators in
dependent positions who had much to lose from the exposure of his fraud; their situation is not unique. In
many other fraud cases, those that have the most to lose are often the only people capable of producing
hard evidence because of direct involvement or close relationships. Must we rely on the bravery of
individuals with an inherent conflict of interest? It is naïve to expect hard evidence to be dropped like a
present into the laps of those capable of formally addressing the fraud.
Before we address intuition, we need to believe as a community that it bears value. In Blink: The
Power of Thinking without Thinking, Malcolm Gladwell points out, “our world requires that decisions be
sourced and footnoted, and if we say how we feel, we must also be prepared to elaborate on why we feel
that way...We need to respect the fact that it is possible to know without knowing why we know.” Gladwell
claims that experts often know when a result seems ‘off,’ but would rather rely on tangible evidence than
an intuition when deciding whether to act. In this way, those most capable of suspecting fraud are
immobilized. As scientists, we are compelled to make logical conclusions based on hard evidence. But
according to Gladwell, intuition should not be discounted, but appropriately addressed. In Gladwell’s
opinion, experts gain intuition through experience in a field and more often than not make reliable
judgments.
Lack of concrete evidence is not the only or even the main factor that leads experts to doubt
their intuitions. There is an inherent trust built into relationships within the scientific community and
between colleagues, coauthors, and mentors and mentees. Furthermore, the status of the individual suspected
can often override judgments. In a mentor-mentee relationship between a principal investigator and a PhD
student, the mentor is in a position to abuse the trust of the PhD student because of higher status. A
well-established leader of a field may be more trusted by young investigators and those in lower positions.
Diederik Stapel was a faculty dean at Tilburg University in the Netherlands and considered a rising star in the
world of social psychology. He was friendly and well liked among students and faculty in his department
and even taught an ethics course. The final report of the Stapel investigation in November 2012, known as
the Levelt Report, claims, “The last thing that colleagues, staff, and students would suspect is that, of all
people, the department’s scientific star, and faculty dean, and the man who personally taught his
department’s scientific ethics course, would systematically betray that trust.”
What is remarkable is not that there are factors which cause us to distrust our “gut feelings” but
rather that these intuitions persist even when there are conflicting inclinations. Despite Stapel being a
trusted leader in his field, there were PhD students and young investigators interviewed during the
investigation who suspected his data were “too good to be true.” This alone gives merit to Gladwell’s
hypothesis that intuitions are powerful tools if used correctly—gut feelings exist, rational thought either
supports them or overrides them. The Levelt Report claims:
“Occasionally PhD students would comment informally that some data appeared to be too good
to be true, according to two interviewees. They said that Mr Stapel’s leadership position, his
recognized ability to form hypotheses and the fact that far from all data were beautiful, helped
explain the lack of serious doubt regarding data fraud. There was also an occasion on which a
PhD student expressed doubt about the quality of the data in conversation with a close colleague
of Mr Stapel’s (but not with the faculty confidential counsellor for PhD students). Because only
conjecture was involved, no action was taken. In view of the seriousness of the allegation, strong
evidence would be needed in order to take further action, which is where the matter rested.”
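
To make this concrete, consider what a non-accusatory screen for “too good to be true” data could
look like. The Python sketch below is only an illustration under assumed numbers, not a method used in
the Stapel investigation itself (though forensic statisticians have applied similar ideas): it asks how often
ordinary sampling noise would produce group standard deviations as mutually consistent as a set of
reported ones. The reported values here are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical reported standard deviations from k independent studies,
    # each with n subjects per group. Values this tightly clustered are rare.
    reported_sds = np.array([2.02, 1.98, 2.01, 1.99, 2.00])
    n = 15                          # assumed subjects per group
    k = len(reported_sds)
    observed_spread = reported_sds.std()

    # Monte Carlo: if the true SD equals the mean reported SD, how often do
    # k sample SDs cluster at least this tightly by chance?
    n_sims = 20000
    samples = rng.normal(0.0, reported_sds.mean(), size=(n_sims, k, n))
    simulated_spread = samples.std(axis=2, ddof=1).std(axis=1)
    p = (simulated_spread <= observed_spread).mean()
    print(f"chance of SDs this consistent across {k} studies: p ~ {p:.4f}")

    # A tiny p is not proof of fraud, only a flag that the data deserve a
    # respectful, evidence-gathering second look.

An improbably tight clustering would justify exactly that: a second look, not an accusation.
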
No scientist should be formally accused of fraud without significant evidence. However, there
should be a better system, a method of non-accusatory auditing, where mere ‘suspicions’ are respected and
addressed appropriately. This concept is not novel; companies are expected by shareholders to undergo
auditing of their financials by an independent organization. I do not propose such formality or
sophistication, but at the very least a way to better address concerns from individuals regardless of their
status or ability to produce significant evidence. We cannot pretend that having a strong scientific
community based on peer-review is sufficient to prevent or expose misconduct. In the wake of devastating
cases such as that of Diederik Stapel, we must consider the inherent weaknesses in our community and how
to overcome them.
The Meteoric Fall of Diederik Stapel
by Rachel Fleisher Wachter
When one first reads about the massive fraud conducted by Dutch social psychologist Diederik
Stapel, it is hard to believe. The fraud was committed over a decade and involved work he did at three
universities (University of Tilburg, University of Amsterdam and University of Groningen). When the
charges of fraud were brought against him, the committees established at each of the institutions to
investigate (Levelt, Drenth and Noort committees, respectively) spent tremendous time and energy going
through the 137 papers that he published and the 18 PhD theses that he oversaw. The conclusions of their
work led to the retraction of 55 of Mr. Stapel’s publications. The effects of his fraud have shaken the field
of social psychology to its core. In addition, Mr. Stapel was one of the most well known psychologists in
the Netherlands and around the world; his work was written about in many newspapers, including the
New York Times. His gross violation of research ethics has severely harmed the public’s view of science
and scientists. Scientists should study Diederik Stapel’s fraud so that these egregious violations of
research ethics are avoided in the future.
Perhaps what was most impressive about the entire story involving Mr. Stapel’s work was how the
three universities involved reacted to the allegations of fraud. Three students in the Department of
Social Psychology at Tilburg University spent months observing Mr. Stapel and gathering information
about his research practices. When the fraud was brought to light, Tilburg University immediately began
investigating and informed their colleagues at the University of Amsterdam and the University of
Groningen. The three committees investigating Mr. Stapel released a joint report entitled “Flawed
science: The fraudulent research practices of social psychologist Diederik Stapel”1. It is encompassing in
its scope; the report mentions hiring full-time statisticians to go through all the published and unpublished
data. Interestingly, the three committees elected to analyze the entire body of work produced by Mr.
Stapel instead of looking at a smaller pool of papers or choosing a randomized sample. Their reasoning
was the following: “The scientific literature must be cleansed of everything that is fraudulent, especially if
it involves the work of a leading academic…Nevertheless, the most important reason for seeking
completeness in cleansing the scientific record is that science itself has a particular claim to the finding of
truth.” They claim that this is the first example of an ethics committee going through all the work of the
suspect. This is an encouraging step in apprehending and eliminating fraud from science.

1 Levelt, Noort, and Drenth Committees. “Flawed Science: The fraudulent research practices of social
psychologist Diederik Stapel.” November 28, 2012.
While the committees’ report was able to assess the extent of fraud in Mr. Stapel’s work and
successfully remove the fraudulent data from the scientific literature, what was most surprising about the
committees’ findings were those regarding the field of social psychology itself. They found that many
researchers do not have a good grounding in statistics and are prone to leaving out data in their
publications that were contrary to the point they were hoping to get across. Researchers do not share their
datasets with others and are often not required to submit their datasets when publishing their results. The
journal editors often publish data that is “too good to be true” and do not press the authors on how they
obtained their results. In general, the committees found that at each of the universities and in the field as a
whole, there was a lack of rigorous scientific standards that allowed for Mr. Stapel’s fraud to go
undetected for many years. In a field that has as much public interest and direct impact on humanity as
social psychology, it is alarming that scientific standards have not been upheld.
In response to all of the claims against him, Mr. Stapel wrote a book (in Dutch) entitled
“Derailed.” According to translated quotes from a review of the book2, Mr. Stapel’s fraud was the result
of a “toxic interaction of person and environment.” As the report stated, Mr. Stapel claimed that there was
little oversight or criticism, and the temptation to cheat proved too strong to resist. The combination of his
“…need to score, ambition, laziness, nihilism, want of power, status anxiety, desire for solutions, unity,
pressure to publish, arrogance, emotional detachment, loneliness, disappointment, ADD, addiction to
answers…” led him to commit scientific fraud. The authors of the review note that the last chapter of his
book is a beautifully written account of Mr. Stapel waking up next to his wife. The reviewers also note
that many lines were lifted, without reference, from James Joyce and Raymond Carver. It is truly ironic
that Mr. Stapel plagiarized in a book that was supposed to serve as a ‘mea culpa’ for his years of
committing fraud. This leads one to suspect that, at best, his apology is not really sincere and, at worst,
Mr. Stapel has an addiction to cheating and needs to seek help.

2 Borsboom, D. and Wagenmakers, E.J. “Derailed: The Rise and Fall of Diederik Stapel.” Observer, 26,
1 (2013).
Many lessons can be learned from the story of Diederik Stapel. As Dr. Casadevall said in his
speech, it is crucial that scientists take a step back and make sure that the research environment is one that
fosters openness, discussion and high standards and one that does not tolerate “sloppy science.” It is
imperative to teach research ethics and to improve the way future scientists are trained in order to limit
these kinds of abuses of science. Reviewers must take the peer review process extremely seriously. While
all this is true, one cannot help but wonder whether science will ever be able to rid itself of fraudsters like
Mr. Stapel. Every discipline, from law enforcement to politics to medicine and education, will
unfortunately always contain people who are corrupted, and science is no exception. However, as was
written in the Stapel Report, the fundamental basis of science is the search for truth and the value of truth.
It is impossible to completely rid science of fraudulent and unethical work. However, scientists must be
vigilant not only to apprehend fraud once it is discovered but also to ensure that fraud never reaches
publication. They must make clear to the public that work of this kind is unacceptable and that they are
working to stop it. The Stapel affair can be seen as an opportunity to take strong action towards
rehabilitating science and the public’s view of science. We should not let the chance pass us by.