Sermon Text

Babes in Arms
An Evidence Based Argument for Raising the Enlistment Age to 25
By
Rev. Dr. Todd F. Eklof
May 25, 2014
In 2012, Suzanne Collins’ bestselling novel, The Hunger Games, was adapted into a
blockbuster film about children, ages 12 through 18, selected to fight to the death so the
rest of society can avoid ever going to war. That same year the online documentary, Kony
2012, about Joseph Kony, a real Ugandan warlord, allegedly responsible for kidnapping
more than 65 thousand children for use as sex slaves and soldiers, went viral. Whether they
are horribly real or just the stuff of imaginative science fiction, stories of child soldiers
seem the stuff of pure evil! Yet, in the United States the minimum age of enlistment isn’t far
from the ages of some of the unfortunate children purportedly conscripted by Kony, or of
the “Tributes” chosen by lottery in the fictional Hunger Games. As appalled as we are by
such stories, there is little controversy about allowing military recruiters into American high schools or about letting them routinely call upon children at home during their senior years.
Though the ages may differ from culture to culture and from nation to nation, children have
long been used as cannon fodder. The very word infantry literally means “child army,” and was originally used in reference to young, inexperienced foot soldiers.
The current age of military enlistment (17 with parental consent, 18 without it) has
remained approximately consistent throughout U.S. history. Although hundreds of
thousands of children are known to have participated in the American Revolution, Civil
War, World War I, and World War II, the legal minimum enlistment age has long hovered
somewhere between 17 and 21. In 1863 President Lincoln authorized the draft of all “able-bodied men between the ages of 20 and 45,”i as the Confederacy did of “all white men
between the ages of 18 and 35,”ii which it later extended to those between the ages of 17
and 50.iii During the Spanish-American War of 1898 (to 1901), Congress likewise declared
that all males ages 18 to 45 were subject to military service.iv The Selective Service Act of
1917 required those between the ages of 21 and 30 to register for service in World War I.v
As World War II approached, the minimum draft age was lowered to 18.vi The first peacetime draft in U.S. history was passed in 1940,vii requiring all males between 21 and 35 to register. The age was bumped down to 19 at the start of the Cold War in 1948,viii and,
further still to 18.5 during the Korean War.ix In 1955 the Reserve Forces Act again required
all men between 18 and 26 to register for possible service. In response to civil unrest
during Vietnam, the draft was allowed to expire in 1973, though the registration requirement was reinstated a few years later, in 1980. Today it remains the law of the land, requiring all males between the
ages of 18 and 26 to register for possible military conscription through the Selective
Service System.
Relatively consistent as the draft age has remained, however, it has been inconsistent with U.S. laws determining the age of adulthood. In 1787 the U.S. Constitution
reserved the right to vote for white male property owners age 21 or over. In 1807 the right
was extended to all white men at least 21 years of age. In 1870 the Constitution was
amended to include all men, regardless of race, as long as they were 21 or older. In 1920 it
was again amended to include women 21 and up. Throughout all these decades, boys as
young as 18, and sometimes much younger, had no right to vote in the nation they were
considered mature enough to serve militarily.
This incongruence became a key criticism against the Vietnam War. As the 1965 antiwar
song, Eve of Destruction, complained, “You’re old enough to kill, but not for votin’.” Rather
than agreeing that many of the young men being drafted were too young and immature for
military service, however, lawmakers instead chose to lower the voting age to 18. x Two
years later, in 1973, conscription was abolished altogether, in favor of an all-volunteer
military, which remains the case today, though, again, all males between 18 and 26 are required at least to register for service should reinstating the draft ever become necessary.
In every instance those at the minimum conscription age were defined as “adult males,”
though, equally consistently, they were not thought of as being mature enough to vote
until only relatively recently in U.S. history (1971), and then, only after much civil unrest
over compulsory enlistment, and not, to be sure, over the right of teenagers to vote.
Today most of the legal rights and responsibilities reserved for adulthood are aligned with this dated military tradition, with the exception of drinking or buying alcohol, for which the minimum age remains 21. At age 18, teenagers can legally choose to marry, consent to sex, get a driver’s license or a tattoo, go to work, vote, and smoke, and they will automatically be tried as adults if ever charged with a crime. This arbitrary tradition, however, which dates back at least to
Colonial times, is not rooted in what we now know about human development and
maturation thanks to much research in the areas of biology, neurology, and developmental
psychology. When such evidence is taken into account, it is no longer sound for our society
to base the age of adulthood solely upon this antiquated military tradition.
The first argument I would make is that the human age of maturation itself changes over
time and that it takes longer for us to mature today than ever before in our history. So, even if the
age of human maturation may have truly been 18 a hundred years ago, it doesn’t mean this
remains the case today. This is so for both biological and sociological reasons.
Biologically speaking, humans are among very few organisms scientists consider neotenous.
Neoteny refers to the tendency of some creatures to retain juvenile characteristics even
after reaching sexual maturity. Axolotl (or Mexican) salamanders, for instance, remain in a
larval stage their entire lives, retaining their gills and, at best, growing underdeveloped
limbs and digits, yet they are perfectly capable of reproducing at 18 to 24 months. Dogs,
likewise, with their smaller frames, flatter faces, and lifelong playfulness, are neotenous
kinds of wolves. Human beings, however, are by far the most neotenous creatures ever. As
early as the 1920s, Dutch anatomist Louis Bolk described the human species as “a primate fetus
that has become sexually mature.”xi And biologist Stephen Jay Gould once said a human
baby is “still an embryo.”xii At birth, for instance, human bones are not fully ossified and our skulls aren’t entirely closed. Our spines remain attached toward the base of our skulls, where the attachment begins in all primates but, in the other apes, migrates toward the back of the skull during fetal development. Our brachiating arms also remain relatively weak our entire lives, even though it is this very characteristic, the separated shoulders that normally give apes the strength and flexibility to swing beneath branches, that largely defines us as apes. Our teeth
also erupt only after we’re born; we remain mostly hairless; we lack opposable toes; and
we retain the same flat faces and oversized heads other primates are born with but
eventually outgrow.
So why are we so different from other apes even though our DNA is almost 99 percent the
same? According to science writers John Gribbin and Jeremy Cherfas, “Neoteny resolves the
problem. The one-and-a-bit difference could easily reside in the genes that control the rate
of development, making human beings a form of infant ape that has learned to reproduce
without reaching physical maturity.”xiii
The one advantage to being born premature is our ability to continue gestating outside the
womb, allowing our brains to continue growing for many years. The brains of chimps and
gorillas, for example, are 70 percent of their final size at birth, a milestone not reached by
humans until our second year. Humans, in fact, are born with brains only a quarter of their eventual size, and our brains continue to develop throughout life. This means our intelligence isn’t
limited to what information can be packed tightly into our genes before we’re born.
Squirrels are born with the endogenous knowledge to cache acorns, and birds to build nests, but human beings acquire information exogenously, that is, by learning it from others. The problem is that the more complicated our society becomes, especially nowadays, when exponential advances in technology make for larger networks of human beings and more complex socializing, the longer it takes us to learn everything we must in order to survive. In short, it’s taking us longer to mature. Good thing our neotenous nature allows
us the flexibility to do so, that is, to stay immature for increasingly longer periods of time.
This, of course, leads us to the sociological part of the argument. The more complex our
society becomes, the longer it takes for us to mature, that is, to acquire all the information
and develop all the skills we need to survive on our own. Little more than a hundred years
ago in our country, people often married and began raising families just as soon as they
sexually matured. As developmental psychologist Clifford Anderson points out, “Indeed, as
late as the mid-nineteenth century, a woman who was not married by age sixteen or so ran
a serious risk of becoming an ‘old maid.’”xiv Today we sometimes say our grandmother or grandfather “only” had a sixth-grade or an eighth-grade education. But at the time this would have been considered a great accomplishment, one that gave a person the knowledge of reading, writing, and arithmetic then necessary to survive. Yet today few of us would consider our 13-year-olds mature enough to leave home and begin a career, let alone to
marry and start families. In fact, the nature of our society today is making it increasingly
difficult even for many college graduates to lead fully independent lives.
In his book, The Stages of Life, Clifford Anderson writes, “For most of history, the average
person’s life cycle reflected a pattern of psychological growth in childhood followed by a
permanent commitment—that is, to marriage, childrearing, and work—in the early teenage
years.”xv Anderson, a former military psychologist and recipient of the Legion of Merit award,
goes on to suggest that as society becomes more complex, so does the human psyche, by
actually adding new stages of development. He points out that as early as 1904,
developmental psychologist G. Stanley Hall, the first president of the American
Psychological Association, described the “emergence into the general population of a new type of teenager, documenting for the first time a new stage in the human life cycle.”xvi Hall,
whose doctoral advisor was none other than William James, suggested that after the Civil
War children who worked alongside their parents in fields and factories began to
disappear, due largely to advances in industrial technology and productivity, allowing
millions of teenagers, once considered adults, to remain outside the labor force.
At this point a new stage of psychological development emerged: adolescence. We went from a situation in which humans matured directly from childhood into adulthood to one with a new intermediate stage. “Today,” Anderson points out, “Hall’s concept of
adolescence is unshakably enshrined in our view of human life… A stage of life that barely
existed a century ago is now universally accepted as an inherent part of the human
condition.”xvii Since then, Anderson goes on to suggest, yet another, far more recent layer
has been added to the developing human psyche—youth. MIT’s esteemed professor of
Human Development, Kenneth Keniston, first used the term “youth” in this regard.
Anderson suggests it erupted onto the scene in response to the social tumult of the 1960s.
As Keniston himself put it back then, “If neither ‘adolescence’ nor ‘early adulthood’ quite
describes the young men and women who so disturb American society today, what can we
call them? My answer is to propose that we are witnessing today the emergence on a mass
scale of a previously unrecognized stage of life, a stage that intervenes between
adolescence and adulthood.”xviii
So if thinkers like these are correct in their analysis, then during the last 150
years the human psyche itself has become more complex, evolved, if you will, as an
adaptive response to changing environmental circumstances. As a result, it now takes us
longer to psychologically mature. We no longer pass merely from childhood into adulthood,
but from childhood, through adolescence, through youth, then, finally, into adulthood.
Clifford Anderson actually suggests this entire process can take more than three decades,
and, because society continues to grow more complex, there may be new stages we don’t
yet even recognize. The point is, even if we accept that age 18 may have been an appropriate age for military service in the past, and I personally think it was, there is nothing suggesting it remains so today. In fact, according to the work of Anderson, Hall, and Keniston, the
evidence would lead us to conclude it isn’t.
This conclusion becomes even weightier in light of more recent findings in the field of
neuroscience. Dr. Jay Giedd, Chief of the Brain Imaging Unit at the National Institute of Mental Health, leads a research team that for more than 20 years has accumulated over 3,000 MRI scans of developing brains, making it the largest pediatric neuroimaging project
ever. Analysis of all this data has led to several findings, but there are a couple that are
most pertinent to our discussion here. First, the thinking part of the brain, the gray matter,
continues to thicken throughout childhood as it makes new connections. This period of
dendritic overproduction peaks just before puberty, when the brain begins pruning itself
by getting rid of the neural connections it doesn’t use as much. The remaining connections
are wrapped with myelin, a fatty material that insulates them. Both the myelin and the
pruning enable nerve impulses to send information more quickly and efficiently, much as a machete makes it easier to pass through the jungle. In the prefrontal cortex, the thinking part of the brain, this process isn’t complete until the mid-20s or later.xix
In short, the brain continues to develop by hardwiring itself until we are at least in our mid-20s. “So if a teen is doing music or sports or academics,” Giedd says, “those are the cells
and connections that will be hard-wired. If they're lying on the couch or playing video
games or MTV, those are the cells and connections that are going [to] survive.”xx So the gray matter volumes that peak in childhood and begin to decline in adolescence “level off during
adulthood,”xxi suggesting, in short, we become more set in our ways as adults. Until then, an
individual’s behavior becomes increasingly specialized as the brain gets better at the
activities one most engages in, and forgets everything else.
More importantly, the frontal lobes are also the part of the brain responsible for what
neuroscience refers to as our “executive functions”: the ability to plan ahead, learn from the
past, and control our impulses. Because, as Giedd’s research indicates, “loss of gray matter
progresses from the back to the front of the brain with the frontal lobes among the last to
show these structural changes,”xxii the executive functions “are among the last areas of the
brain to mature; they may not be fully developed until halfway through the third decade of
life.”xxiii
This part of the brain, again, the last to mature, is also the part of the brain that interprets
and regulates emotions. It is responsible for what neurologists call “emotional maturity,”
the ability to cope responsibly with primitive emotions like fear and anger. According to a recent study published by a group of Max Planck researchers, the
specific part of the brain responsible for empathy and compassion, the supramarginal
gyrus, is also located in the front of the brain, that is, in the last part of the brain to
mature.xxiv Giedd himself says, “The evidence suggests that this [emotional] integration
process continues to develop well into adulthood.”xxv In brief, when it comes to
determining the age of adulthood according to the findings of neuroscience, Giedd says,
“there is little empirical evidence to support age 18.”xxvi
The sociological evidence of this is also overwhelming. Despite being at the physically healthiest point in their lives, less prone to illness and disease, teenagers have extremely high mortality rates because they’ve not fully developed the capacity to avoid unnecessary risks. Young people, for instance, are four times more likely to be involved in
car accidents than adults, and three times more likely to die as a result. We also know that
most crimes are committed by people between the ages of 13 and 25, after which criminal
activity takes a steep drop.
I am not suggesting here that the age of maturation ought to be fixed definitively at 25. Individuals
are all different. I’ve known some incredibly mature teenagers and more than a few terribly
immature adults. As developmental psychologist Robert Kegan says, “many people who are
chronologically adult are psychologically adolescent...”xxvii What we can be sure of is that
the brains of human children continue to mature until they are at least 25.
So, to summarize, then:
The current age of maturity, 18, is based arbitrarily on long-held military traditions. “For
example,” as Giedd explains, “in 13th century England, when feudal concerns were
paramount, the age of majority was raised from 15 to 21 years, citing the strength needed
to bear the weight of protective armor and the greater skill required for fighting on
horseback. More recently, in the United States the legal drinking age has been raised to 21,
whereas the voting age has been reduced to 18 years so as to create parity with
conscription.”xxviii
We also know, according to biological and developmental science, that the age of
maturation is not set for all time, but changes with the increasing complexity of human
society. Humans, in particular, as neotenous animals, have the advantage of remaining
immature for longer periods of time. So the average age of maturity a hundred years ago or
more is not likely to be the same as today.
In addition, according to the latest neuroscience, we now understand that the brain doesn’t stop maturing until at least age 25, particularly the part of the brain that knows to avoid risks
and can fully empathize with others. As Giedd says, “Poor executive functioning leads to
difficulty with planning, attention, using feedback, and mental inflexibility, all of which
could undermine judgment and decision making.”xxix
Some of the challenging questions that emerge from such evidence are these: Do we, as a society, want to continue enlisting teenagers and young adults into the military now that the age of adulthood can no longer be determined arbitrarily? Now that we know adolescence continues, in some sense, into the mid-20s, do we, as a society, wish to continue
to enlist and deploy soldiers who have not yet developed the full capacity to empathize
with others or to avoid taking unnecessary risks that could easily get them killed? What,
ultimately, is the difference between a society that knowingly recruits teenage and young
adult soldiers in light of such evidence, and those that conscript child soldiers in Uganda or
get their kids to fight their battles for them in the world of science fiction? What are the
moral implications of such evidence? What are our moral obligations to those with
developing brains? How would shifting our thinking about adulthood impact our ability to
maintain an adequate number of soldiers? What is the implication for our national
security?
I do not have a response for all the implications of this new evidence, but I do believe, in light of such evidence, we can no longer justify 18 as an appropriate age for enlistment. Based upon the evidence, and for the very same ethical reason presumed in the past, that soldiers should be adults, I suggest we raise the age of enlistment to 25.
i Conscription Act (Enrollment Act), Washington City, District of Columbia, March 3, 1863.
ii Confederate Conscription Act, Montgomery, Alabama, April 16, 1862.
iii Original Act amended February 1864.
iv There was no need to take advantage of the conscription law, however, given that all military needs were
satisfied by volunteers.
v Selective Service Act (Selective Draft Act), May 18, 1917.
vi 1940.
vii Selective Training and Service Act, September 16, 1940.
viii The Selective Service Act was the second peacetime draft in U.S. history, instated in 1948 due to the expiration of the STSA.
ix Universal Military Training and Service Act, 1950.
x 26th Amendment of the U.S. Constitution, July 1, 1971.
xi Gribbin, John, and Jeremy Cherfas, The First Chimpanzee (2001), Barnes & Noble, 2003, p. 178.
xii Gould, Stephen Jay, Ever Since Darwin: Reflections in Natural History, from the chapter Human Babies as
Embryos, Penguin, 1977.
xiii Ibid., p. 177.
xiv Anderson, Clifford, The Stages of Life, The Atlantic Monthly Press, New York, NY, 1995, p. 121.
xv Ibid.
xvi Ibid., p. 121f. [From Hall, Adolescence: Its Psychology and Its Relations to Physiology, Anthropology, Sociology, Sex, Crime, Religion, and Education, 1904.]
xvii Ibid., p. 122.
xviii Ibid., p. 123.
xix Johnson, Sara B., Robert W. Blum, and Jay N. Giedd, “Adolescent Maturity and the Brain: The Promise and Pitfalls of Neuroscience Research in Adolescent Health Policy,” Journal of Adolescent Health, September 2009, 45(3): 216–221. doi:10.1016/j.jadohealth.2009.05.016.
xx http://www.pbs.org/wgbh/pages/frontline/shows/teenbrain/interviews/giedd.html
xxi http://www.dana.org/Cerebrum/2009/The_Teen_Brain__Primed_to_Learn,_Primed_to_Take_Risks
xxii Ibid.
xxiii Ibid.
xxiv Bergland, Christopher, “The Neuroscience of Empathy,” The Athlete’s Way, Psychology Today, October 10, 2013. http://www.psychologytoday.com/blog/the-athletes-way/201310/the-neuroscience-empathy
xxv Giedd, ibid.
xxvi Ibid.
xxvii Kegan, ibid., p. 211.
xxviii Giedd, ibid.
xxix Ibid.