Historical Developments in Computers to the 1950s*
by
Jack Minker
Department of Computer Science and
Institute for Advanced Computer Studies
University of Maryland
College Park, Maryland 20742
April 23, 1998
Abstract
The start of the computer revolution is considered to have taken place with the development
of the ENIAC electronic digital computer in 1946, slightly more than 50 years ago, at the Moore
School of Electrical Engineering at the University of Pennsylvania. I will trace the scientific
developments that led to the ENIAC and other computers developed at around the same time.
In antiquity, Euclid developed the first algorithm and Aristotle developed Aristotelian logic,
fundamental to computers. Contributions were made in the 1600s by Schickard, Pascal and Leibniz,
who developed calculators. In the 1800s the Jacquard Loom for weaving patterns was influential
in the development of the digital computers proposed and detailed by Charles Babbage. Mathematical
contributions were made by George Boole, and the vacuum tube, to be used in the early electronic
digital computers, was developed by John A. Fleming.
In the 1930s, Alan Turing developed the concept of an abstract machine, the Turing Machine,
which is able to simulate any digital computer. The first digital computers were started in the
1930s by John Atanasoff and Clifford Berry at Iowa State University, by Howard Aiken at Harvard
University, by George Stibitz in conjunction with Samuel B. Williams and Ernest G. Andrews at
the Bell Telephone Laboratories, and by Konrad Zuse in Germany. In England, as part of their
code-breaking efforts, an electronic digital computer was developed by Tommy Flowers, Sir Harry
Hinsley and M.H.A. Newman. In the 1940s the first large-scale electronic digital computer, the
ENIAC, was designed and developed by Presper Eckert and John Mauchly.
Work on the ENIAC led to the concept of the stored program computer, in which the computer
program and the data to be operated upon co-existed in main memory. Computers influenced by
the stored program concept developed in the 1950s are discussed. A discussion is also presented of
programming developments that are fundamental to the development of computers.
* This paper was presented as part of the Distinguished Scholar/Teacher Lecture Series for 1997/1998 on April 8,
1998 at the University of Maryland.
1 Introduction
It is generally agreed that the computer age started some 50 years ago. As with any other new
development, it was built upon earlier scientific achievements. In this lecture, I shall describe what I
believe to be some of the significant developments that preceded this period,
and those that led to the start of the field. I shall also address
the question of what is meant by a computer. Indeed, there may be many different answers. I
shall provide a time-line of significant developments up to approximately the mid 1950s. I shall
stop and reflect upon some of the developments as they relate to computer hardware and software
technology, industry, science and education. It is not possible to provide a detailed description of
all of the significant events in this short lecture; hence, I shall focus on those that I believe to be
the most significant. Others may have different ideas on the same subject.
The computer revolution of the past 50 years has permeated all aspects of our lives. Computing has
gone from an esoteric subject practiced by `nerds' to devices in our homes that our children and
grandchildren, as well as the `nerds,' use on a daily basis. We have seen many new companies arise,
and the Rockefellers of the oil industries of the early 1900s have been replaced by the Gateses of the
computer industry as our richest individuals. There is no doubt that this revolution in computing
is as significant as the industrial revolution that preceded it.
As noted by Juris Hartmanis, current Director of the Division of Computer Science at the
National Science Foundation,
Computer science is a new science among the other sciences and differs significantly from
these other sciences, for example physics, chemistry, and biology. In these other sciences,
the study to a large extent is of the world that exists. We are trying to understand this
world so as to explain and predict phenomena. Computer science is interested primarily
in what can exist and how to describe and analyze the possible in information processing.
It is a science that has to conceptualize and create the intellectual tools and theories to
help us imagine, analyze and build the feasibly possible.
The start of the computer revolution is considered to be February 14, 1946, the date the ENIAC
machine was unveiled at the Moore School of Electrical Engineering at the University of Pennsylvania. During
the past 52 years we have seen tremendous developments. In hardware, the size of machines has
been reduced drastically, so that today we are walking around with digital computers in our briefcases
and pockets. In software, we have seen the development from the primitive tools I shall discuss
today to sophisticated programs that are user friendly. In communications, we have seen the
Internet, which makes information more generally accessible and places the access to libraries in our
homes. In industry, we have gone from the development of computer hardware to the development
of major industries in computers, computer devices and software. In education, we
have gone from no curriculum in computer science to curricula for computing at the graduate,
undergraduate, high school and now grade school levels. Whereas the conventional sciences have
had curricula on which they could build, no such curriculum existed for computer science until
1968. I will not have the time to discuss the development of a curriculum for computer science.
I shall only mention now that the University of Maryland played the leading role in this field.
Indeed, Dr. William F. Atchison, who was Director of the Computer Science Center from 1967 to
1973, was in charge of the development of curriculum for the Association for Computing Machinery,
the largest computer association in the world. The report issued by Atchison's committee started
the formalization of a curriculum in computer science throughout the world.
There have been so many developments in computing that it is impossible to do justice to all
of them in one or even two lectures. A complete course is needed. What I shall discuss today
are some critical developments that led to the computer revolution. I shall focus on some of the
key ideas, starting from antiquity and continuing to approximately the 1950s. Those interested in further details
concerning material in this article should see the books [1, 2, 4, 7].
2 Developments in Antiquity
The history of computers can be traced back to antiquity, when the Sumerians in the period
4000-1200 BC started to keep records of commercial transactions on clay tablets. The Sumerians
apparently came from the mountains of Persia (Iran). They had a usable system of numbers and
may have been the first to have written numerals. Clay tablets were the earliest storage devices.
The numbers that we use today, 1, 2, 3, ... are often spoken of as Arabic, but were never used
by the Arabs. They came to us by means of a book on arithmetic which apparently was written
in India about 1200 years ago and was translated into Arabic shortly thereafter. This book was
transported by chance to Europe and translated into Latin. The numbers took their current shape
in Europe and hence it might be appropriate to say that they are European or Modern numerals.
In 3000 BC the Babylonians invented the Abacus (Figure 1), used from that date to today
to perform arithmetic calculations. Indeed, in 1992 I visited Russia and went to a city outside
Moscow where I had to purchase some medication. To my astonishment they were using, not a
hand calculator, but an Abacus to total the purchases. The abacus is also used
throughout China.
In approximately 330 BC, Aristotle (384-322 BC) developed the first form of logic and published
it in his famous work, the Organon. Aristotelian logic, whose classic example is "All humans are
mortal" and "Socrates is human," permits the valid deduction "Socrates is mortal." Aristotelian logic led
ultimately to classical mathematical logic, which plays a fundamental role in the design of digital
computers, programming languages and programs.
Somewhere between 400 and 300 BC, Euclid invented an algorithm to find the greatest common
divisor (gcd) of two positive integers. The gcd of X and Y is the largest integer that exactly divides
both X and Y. Thus, the greatest common divisor of the integers 9 and 12 is 3, since 3 is the largest
number that divides both 9 and 12. It is not unlikely, however, that the Babylonians were aware of
the concept of an algorithm. Algorithms are fundamental to all computer programs. Both a
procedure and an algorithm are finite sequences of instructions that can be carried out mechanically,
but an algorithm will always terminate, whereas a procedure may never terminate. Euclid's algorithm is
considered to be the first non-trivial algorithm ever devised. The word algorithm is derived from the
name of the Persian mathematician Mohammed al-Khowârizmî, who lived during the ninth century
and who is credited with providing the step-by-step rules for adding, subtracting, multiplying and
dividing ordinary decimal numbers. When written in Latin, his name became Algorismus, from
which algorithm is derived.
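In modern notation, Euclid's algorithm is short enough to state in full. The following sketch (written in Python purely as an illustration, an obvious anachronism) repeatedly replaces the pair (X, Y) by (Y, X mod Y); since the remainder strictly decreases at each step, the loop must terminate, which is precisely what makes this an algorithm rather than merely a procedure:

    def gcd(x, y):
        # Euclid's algorithm: the greatest common divisor of two
        # positive integers.
        while y != 0:
            x, y = y, x % y   # replace (x, y) by (y, remainder of x / y)
        return x

    print(gcd(9, 12))  # prints 3, matching the example above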
3 Early Computational Efforts
I shall discuss developments during three major periods that led to the first digital computers.
The periods are: the 1600s, when the first computational devices were developed; the 1800s, when
Charles Babbage made his contributions; and the period from about 1930-1945. I shall also discuss
related topics that occurred during the 1950s.
3.1 Computational Efforts in the 1600s
The development of mechanical devices for computing made little progress until the 1600s. John
Napier developed the concept of logarithms and what are called "Napier's Bones" in the period
1612-1614, to perform multiplication and division. This is an analog device, rather than a digital
device. By an analog device we mean one that uses continuously variable physical phenomena such
as mechanical motions, or currents of electricity (for example, a slide rule is an analog device). A
digital device makes use of digital or discrete elements to make computations. Napier also used the
concept of the printed decimal point that we all take for granted today.
The idea of using a train of gear wheels linked so that each time one wheel completes a rotation
the next wheel turns so as to record the `carry' of a single unit is a very ancient one and appears
in the writings of Hero of Alexandria. It was not until the 17th century that the idea arose of
using such a gear train to build an adding machine.
Wilhelm Schickard of Tübingen, Württemberg, a professor of astronomy, mathematics and
Hebrew, developed a "Calculating Clock" (Figure 2) in 1623. This is a 6-digit machine that can
add and subtract, and partially performs multiplication and division. It indicates overflow by ringing a
bell. Mounted on the machine was a set of Napier's Bones, a memory aid facilitating multiplications.
The machine was lost and rediscovered in 1956 and found to be workable.
Blaise Pascal, a famous French mathematician, also devised an adding machine, the "Pascaline"
(Figure 3), built in 1642-1644. It was long thought to be the first adding machine, but current thought
is that Schickard was the first to develop one. The machine consisted of a small box
with eight dials (resembling telephone dials), each geared to a drum that displayed the digits in a
register window. The cumulative total was displayed in a window above the keyboard. It was a
5-digit machine and used a different carrying mechanism from Schickard's. Pascal's fundamental innovation
was a ratchet linkage between the rotating drums, which transferred one drum to the next higher-position
drum only during carryover. The machine still exists and is in the Conservatoire des Arts
et Métiers in Paris.
Gottfried Wilhelm von Leibniz, the co-inventor with Newton of the calculus, whose notation we use
today, improved upon Pascal's calculating machine. Leibniz added a
movable carriage operating on wheels utilizing an active-and-inactive pin principle and a delayed
carrying mechanism. This permitted the machine, the "Stepped Reckoner," to multiply and divide
correctly without an operator having to provide an algorithm. The advantage over Pascal's machine
was that a multi-digit number could be set up beforehand and then added to the number already
in the accumulator by turning a handle. Multiplication could be performed by repeated turns of
the handle, and by shifting the position of the carriage relative to the accumulator. Not only did
Leibniz co-create the calculus, but in 1670 he renewed Aristotle's idea of creating a complete system of
logic. Leibniz also had the goal of mechanizing mathematics.
3.2 Computational Efforts in the 1800s
Following Leibniz, there was a period of 150 years in which there were no significant developments
related to computers. The start of productive results leading towards today's computers began in
the 1800s. The period may be characterized as having the significant technological and mathematical
innovations needed for computers. It also saw developments towards producing a computer.
At the turn of the century, in 1801, Joseph Marie Jacquard developed the "Jacquard Loom"
(Figure 4), in which a linked sequence of punched cards (Figure 5) controlled the weaving of patterns.
A number of individuals (Basile Bouchon, Falcon and Breton) contributed ideas which led to the
Jacquard Loom. The importance of the loom was Bouchon's idea of controlling a machine by
sequencing information held on some separate medium. Bouchon is credited with inventing, in
1725, the idea of using a perforated tape to control the weaving of ornamental patterns. The
Jacquard Loom influenced the developments of computers by Charles Babbage 30 years later.
Punched cards were later used to sort data for business applications in the late 1800s, notably in
the Hollerith punched-card tabulating system, and as input for digital computers in the 1940s. It
is not clear, however, that Hollerith was influenced by the Jacquard Loom.
In 1820 the first mass-produced calculator, the Thomas Arithmometer, was developed, based on
the ideas of Leibniz. The machine was successful.
It would be well to inquire at this stage as to what may be considered a general purpose
computer, and why the previous developments are not computers. A minimal definition of a digital
computer is a device that has the following characteristics:
1. It must do arithmetic digitally. Hence, the slide rule is eliminated.
2. It must do arithmetic rather than just assist the user's memory. This rules out Napier's
Bones.
3. It must do the entire computation with little or no assistance from the user. This rules out
the Pascaline.
4. It must work on user-supplied operands. That is, it must have the ability to calculate based
on a set of instructions. This rules out all of the previous developments.
5. It must have been described in sufficient detail, or a "prototype" must have been built.
The major contribution towards the development of computers came from Charles Babbage, a
colorful mathematician, in the period 1820-1840. In 1822 Babbage started the design of a machine
called the Difference Engine (Figure 6). In 1832 Babbage and Joseph Clement built a portion of the
Difference Engine. The Difference Engine was a special-purpose computer rather than one which
could handle an arbitrary problem. They never completed the machine, but an implementation
was completed by Georg and Edvard Scheutz in 1853. It was designed to solve a special class
of mathematical problems known as difference equations. During the period 1989-1991 a team at London's
Science Museum implemented the Difference Engine using drawings that had been made by Babbage
and technology available in Babbage's time. The copy of the machine operated correctly.
During the period 1834-1835, Babbage shifted his focus and conceived and planned a machine
called the Analytic Engine, an automatically sequenced calculating machine (Figure 7). His objective
was to build a machine that could mechanize any mathematical operation. The machine was
conceived entirely in mechanical terms, without the suggestion that electronics might be used at
a later date. The Analytic Engine was to be a decimal machine. Numbers were to be stored on
wheels, with 10 distinguishable positions, and transferred by a system of racks to a central mill, or
processor, where all arithmetic would be performed. Babbage had in mind a storage capacity for
1,000 numbers of 50 digits each. He devised how the machine would do addition and multiplication.
The importance of the mill lay in Babbage's perception that the mechanization of the logical control
of arithmetic was what was significant. The sequencing of the engine was to
be done automatically, but not on the stored-program principle, where instructions are incorporated
directly into the memory of the machine. I shall discuss the stored-program concept later. He did
perceive that its mechanism could serve to effect operations upon symbols of any kind whatever,
not only upon numbers. He conceived of the idea of coding the instructions on punched
cards, influenced in this regard by the Jacquard Loom. A particularly important idea that he had
was that it must be possible to move forwards or backwards among the stream of instruction cards,
skipping or repeating, according to criteria which were to be tested by the machine itself in the
course of its calculations. This idea is called conditional branching and is fundamental to all digital
computers. Without this ability, a program could not manage its own operation and jump to
other locations in its code. Many computer scientists believe that the definition of a
digital computer requires a conditional branching capability.
Babbage never arrived at the idea of instructions containing both an operation part and an
address part, nor at the formal concept of a program as we have it today. The computer was never
implemented during Babbage's lifetime, although a prototype section of the mill and printer was
developed in 1871, the year that Babbage died. Following his death, a British committee investigated
the feasibility of completing the engine and concluded that it was impossible.
Babbage fully understood the implications of the Analytic Engine. In "Passages from the Life of a Philosopher," written in 1864, Babbage stated,
As soon as an Analytic Engine exists, it will necessarily guide the future course of
science. Whenever any result is sought by its aid, the question will then arise - by
what course of calculation can these results be arrived at by the machine in the shortest
possible time.
A description of Babbage's Analytic Engine was written in French by Luigi Menabrea, who
had heard Babbage lecture on the subject. Babbage encouraged Ada King, Countess of Lovelace,
to translate this document (Figure 8). The Countess was the daughter of the poet Lord Byron.
She worked on the translation during the period 1842-1843. Babbage suggested that she add some
explanation to the text, which she did, and Countess Lovelace contributed addenda to the paper
of approximately 40 pages. These pages included the algorithms to solve various problems specified
by Babbage. It has been stated that Countess Lovelace was the first programmer. This is most
likely an exaggeration. In writing the extra pages, Countess Lovelace consulted very closely with
Babbage, who provided all of the programs that appear in her Sketch of the Analytic Engine.
Whether or not she was the first programmer may be debatable. What is not debatable is that
she made a significant contribution in translating the paper written by Menabrea into English and
augmenting it with 40 pages of programs [6]. More recently, a programming language, Ada, was
developed and named in her honor.
Although Babbage was undoubtedly a genius, his work was largely forgotten and did not play
a significant role in the development of the electronic digital computer. With two exceptions,
the developers of the modern computer were unaware of Babbage's work.
Technological Innovations. Three major technological inventions took place in the 1800s.
1. Samuel Morse developed the system of telegraphy in 1838 and sent a telegraph message from
Washington to Baltimore in 1844.
2. Alexander Graham Bell invented the telephone in 1876, and
3. Marconi transmitted a radio signal in 1895.
All three developments had profound influence upon computing. Indeed, most of this paper was
written while I was working at home, typed on my terminal and transmitted via modem through
the telephone to my personal computer at the University.
Mathematical Developments. There was also an important mathematical development by
George Boole that took place in 1854. Boole published his work, "An Investigation of the Laws
of Thought," which described a system for symbolic and logical reasoning, referred to as Boolean
Algebra. Boolean Algebra is the basis for computer design. All circuit designs use Boole's work.
Finally, Gottlob Frege, in his famous Begriffsschrift (1879), defined a system of logic which led to
first-order predicate logic and contained a complete form of propositional logic. His system of
logic contains a precise language for constructing sentences in logic. Frege's systems of logic were
investigated extensively in the first part of the 1900s.
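To make concrete how Boolean Algebra underlies circuit design, consider the one-bit "half adder," a standard textbook circuit (the example is mine, not Boole's): the sum bit of two binary digits is their exclusive-or, and the carry bit is their conjunction. Sketched in Python for illustration:

    def half_adder(a, b):
        # Add two one-bit numbers using only Boolean operations.
        total = a ^ b   # XOR gives the sum bit
        carry = a & b   # AND gives the carry bit
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))   # 1 1 -> (0, 1): one plus one is binary 10

Every arithmetic circuit in a modern processor is ultimately composed of Boolean expressions of this kind.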
Towards the end of the century, the United States Census Department realized that they needed
help to tabulate the results of the 1890 Census. In 1889 they selected Herman Hollerith's Electric
Tabulating System for this purpose (Figure 9). Hollerith's key idea was to record census information
as holes on paper tape or cards. One could then mechanically read the presence or absence of holes
and tabulate results. His company, the Tabulating Machine Company, formed in 1895, became the
Computing-Tabulating-Recording Company in 1911, and eventually the IBM Corporation in 1924.
The system developed by Hollerith was used successfully in the 1890 Census.
4 Development of the Modern Computer
We shall now turn to the development of the modern digital computer. A technological development
that preceded this was the invention of the diode vacuum tube for radio transmission in 1904 by
John A. Fleming. Vacuum tubes were the basis for several digital computers that were to be
developed later.
The start of the modern computer began in the 1930s in several different places. I shall discuss
two major topics: computer hardware and, briefly, software in terms of automatic programming
concepts. Both hardware and software were essential for the computer revolution to come about.
4.1 Computer Hardware Developments - the 1930s and 1940s
The 1930s and the 1940s were the start of the electronic computer efforts. The 1930s were distinguished
by important work in mathematical logic and in the development of an abstract, simple,
and profound concept of a computer, invented by Alan Turing. In 1935, Turing was concerned
with a problem from mathematical logic known as the Entscheidungsproblem. Could there exist, at
least in principle, any definite method or process by which all mathematical questions could be
decided? This problem was first posed by the brilliant mathematician David Hilbert in the early
1900s. To solve this problem, Turing formulated the concept of a machine working on a paper tape
on which symbols were printed. The machine, known as a Turing Machine, consists of an infinite
tape that can be moved to the left or to the right, and on which one can write and erase symbols.
Turing inquired into the most general type of process such a machine can perform.
He showed that this was equivalent to what could ever be achieved by a person working on a set
of logical instructions. He further argued that it was equivalent to what could be achieved by the
human brain, assuming a finite number of possible states of mind. Making this correspondence,
he was able to answer Hilbert's question in the negative: no such decision procedure exists. Independently,
the famous logician Alonzo Church, using different techniques, proved the same result.
The concept of a Turing Machine has become the foundation of the modern theory of computation
and computability. Turing introduced the idea of the Universal Turing Machine that embodies
all possible Turing Machines. It includes the concept of the "stored program," essential to the
modern computer: that symbols representing instructions are no different in kind from symbols
representing data. Indeed, it also embodied the idea of using computers to manipulate symbols,
rather than as number-crunching machines.
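The idea is simple enough to capture in a few lines. The following sketch (in Python; the simulator and the little transition table are my own illustration, not Turing's notation) runs a machine whose entire behavior is given by a table mapping (state, symbol) to (symbol to write, direction to move, next state). The sample machine computes the successor of a number written in unary:

    def run(tape, transitions, state="start", blank="_"):
        # A minimal Turing Machine simulator, for illustration only.
        cells = dict(enumerate(tape))   # sparse tape, unbounded in both directions
        head = 0
        while state != "halt":
            symbol = cells.get(head, blank)
            write, move, state = transitions[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Successor in unary: scan right over the 1s, write a 1 on the
    # first blank cell, and halt.
    successor = {
        ("start", "1"): ("1", "R", "start"),
        ("start", "_"): ("1", "R", "halt"),
    }
    print(run("111", successor))   # prints 1111: three becomes four

A Universal Turing Machine is, in effect, a run function whose transition table is itself read from the tape, which is the stored-program idea just described.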
One of the first computers in the 1930s was devised by Vannevar Bush: the Differential Analyzer,
which solved differential equations by analog methods (Figure 10), that is, by
continuously flowing signals rather than by binary digits. Hence, this does not satisfy our definition
of a digital computer. It was nevertheless a significant development, and the Differential Analyzer was used during
the war for calculations related to defense. In particular, it was used at Aberdeen Proving Grounds
during the 2nd World War to calculate the trajectory of ballistics under realistic conditions (e.g.
to account for wind factors). The machine was able to do integration and differentiation and was
the largest computational device in the 1930s.
The development of electronic digital computers took place independently in three different
countries.
1. In Germany, Konrad Zuse started the development of computing in 1934.
2. In England, as part of the code-breaking efforts in World War II associated with the Enigma
machine, the British developed the Colossus machine in the 1940s.
3. In the United States there were several developments:
Howard Aiken started the development of the Mark I computer in 1937;
John Atanasoff in 1937 built a prototype computer;
George Stibitz constructed a relay-driven arithmetic unit in 1937; and
Presper Eckert and John Mauchly developed the ENIAC machine in the 1940s.
I shall briefly discuss these efforts and several others. The effort by Zuse in 1934 and the work
by Aiken, Atanasoff and Stibitz were not influenced originally by the 2nd World War. Subsequent
developments by Aiken and Stibitz were, however, supported by the war efforts. The Colossus
and the ENIAC were directly related to the war effort and might not have been developed otherwise.
However, neither the Germans, the British nor the Americans fully realized the importance of
the electronic digital computer. The British realized its importance for the code-breaking effort,
but not as a computational vehicle in its own right. The Nazis did not realize the importance of
a programmable computer and took minimal advantage of Zuse's developments. An American
scientific panel came to the same conclusion with respect to the ENIAC. They did not even mention
the ENIAC to Dr. John von Neumann, who was concerned with computing for the atomic
bomb project in Los Alamos. We shall see how von Neumann, one of the most important
mathematicians of the 20th century, found out about the ENIAC machine. You will be amused to
learn of the foresight of one of the developers of the digital computer. In response to a query by
Edward Cannon of the National Bureau of Standards to Howard Aiken, developer of the Mark I,
to comment as to whether the U.S. Government should lend support to Eckert and Mauchly who,
in the late 1940s, were proposing to make and sell electronic computers to commercial customers,
Aiken stated,
... [Y]ou ... fellows ought to go back and change your program entirely. Stop this
foolishness with Eckert and Mauchly.
In Aiken's estimable opinion, a commercial market would never develop; in the United States
there was a need for perhaps five or six such machines, but no more.
Credit for the development of the electronic digital computer belongs to many individuals. As
in other fields, the controversy as to who deserves the credit is still under debate and will probably
never be resolved.
4.1.1 Development of the Computer in Germany
In 1934, Konrad Zuse, in Germany, realized the possibility of developing a digital computer [8]. He
decided that any memory unit should be based on a binary mechanism rather than on the rotating
ten-position wheels of most mechanical calculators. This simple but elegant decision to work in
binary notation rather than in decimal avoided the thousands of meshing gear wheels required
to implement Babbage's Analytic Engine. He did not come up with the concept of conditional
branching as had Babbage. Working in the living room of his parents' home in Berlin, he began to
construct a mechanical memory. By 1936 he had patented his ideas. By 1937 he had produced a mechanical
memory capable of storing 16 binary numbers, each of 24 bits. In 1938 he completed the machine,
now known as the Z-1 (Figure 11). This machine was entirely mechanical in construction. Mechanical
gates were constructed from a series of sliding plates connected by various rods. The arithmetic
unit was constructed from the mechanical gates. Control of the Z-1 was by means of instructions
punched on tape. Zuse used discarded 35mm movie film for his tape, which he punched with a
hand punch.
Before the Z-1 was completed, he started work on the design of a relay-based arithmetic unit
to overcome the signal routing problem he had with the Z-1. Helmut Schreyer, a graduate student,
constructed a model of the Z-2 using vacuum tubes, which was demonstrated in 1938. Zuse
abandoned this idea because of the war effort and the difficulty of obtaining parts for a machine
with 1,000 vacuum tubes. The Z-2 memory used the mechanical gates as in the Z-1, while all the
rest of the machine was based on relays. Before he completed the machine he was drafted into the
army. He was released from the army a year later to work in an aircraft firm as an engineer. He
completed the Z-2 in April 1940 and demonstrated the machine.
His next development was the Z-3 machine, for which he received funding from the German
Aeronautical Research Institute. Like the Z-1 and the Z-2 it was controlled by punched tape, and
the input and output were via the same four-decimal-place keyboard and lamp display developed
for the Z-2 by Schreyer. The entire machine was based on relay technology: 1,400 relays for memory,
600 for the arithmetic unit, and 600 as part of the control circuits. Started in 1939, it was completed
by December 5, 1941. The 64-word memory was too small to solve the linear differential equations
that had motivated the Institute to provide the funds, and the machine was outperformed
by those solving the equations manually. The Z-3 is important as it was the first fully functional
program-controlled electromechanical digital computer. He completed his last computer, the Z-4,
during this period. This computer survived World War II and helped to launch scientific computing
in Germany through the Siemens Corporation.
Although his developments were too late to be exploited during the war, Zuse calculators were
used in engineering the V2 rockets. In 1945, Zuse was installed in the Dora underground factory
where the V2 rockets were constructed. Following the war, Zuse went into hiding. Along with
Wernher von Braun, the famous rocket scientist, he was discovered and interrogated. The two,
taken to London in 1948, denied that they had been doing anything for the government. von Braun
was thought to be sufficiently important to be brought to the United States, but Zuse remained in
Germany.
4.1.2 Development of the Computer in England
In England, computers were developed to help to break encoded messages during World War II.
Before the start of the Second World War, the Germans had developed an encrypting machine
called the ENIGMA (Figure 16).
Polish scientists had obtained a copy of the machine, which used three rotors to encode messages.
The Poles were able to detect the specific wirings in the machine. They were helped in this regard
by the French secret service, who had obtained, through spying, a copy of the instructions for using
the machine in September and October 1932. The basic principle of the Enigma was that its rotors
and rings and plugboard would be set up, and then the message would be encrypted, the rotors
automatically stepping around as this was done. The Poles built a mechanical device by November
1938 to help decode messages. The devices were called Bombes, as they produced a loud ticking sound,
much like that coming from an actual bomb (Figure 17). The Poles had been regularly decoding
German messages for seven years before the war broke out. Having the Enigma and knowing the
wiring did not mean that it was possible to decode what was written as the output of the machine.
This was because there were numerous possible settings of the gears on the machine, each setting
leading to a different encoding of the same message. Of course, each day the Germans changed the
settings on the machine. To break the codes, the settings of the machine for the day had to be
deduced. When war broke out, the Poles sent the Enigma and the Bombes to England. To help
break the code, the British collected the finest minds in Britain and set this group up at Bletchley
Park outside of London (Figure 21). One of those recruited was Alan Turing, whose interest in code
breaking was known in England. The Germans also became more sophisticated with the ENIGMA
and added three more rotors, making a total of six instead of three. This made it more difficult to
decode the encrypted messages.
By the end of 1940 Alan Turing had made improvements to the Polish Bombes and developed what is
referred to as the "Turing Bombe". By the spring of 1943 M.H.A. Newman's ideas for the mechanization
of an improved decoding machine were realized. This was the first electronic adding machine
they had developed, called the "Robinsons". There were various versions of the Robinsons.
One, named the Heath Robinson after a famous cartoonist who specialized in elaborate machines
to perform simple tasks, was able to compare two paper tapes at rates of 2,000 characters per
second. These machines were programmed by members of the Women's Royal Naval Service (WRENS). I would
like to comment that women played the leading roles in programming during the early days of
computing, starting with the WRENS, then the women in the United States who programmed the
British computers sent here, and the women programmers who worked on the ENIAC computer
that I will describe shortly, not to mention Lady Lovelace's contributions. Indeed, the women who
wrote programs for the ENIAC were called "computers" in the early days of programming.
Mechanical problems with the Robinsons forced them to consider the development of an electronic
machine. In 1943 the group at Bletchley developed Colossus, a vacuum tube computer
(Figure 18). The machine became operable in 1944, and was used to decrypt messages in planning
for the D-Day invasion of Normandy. The machine consisted of 1,500 vacuum tubes. The Colossus,
when completed, filled an entire large room in a temporary hut at Bletchley. It operated in
parallel arithmetic mode at 5,000 pulses per second and had electronic counting circuits, electronic
storage registers changeable by an automatically controlled sequence of operations, and
typewriter output. Only one tape was read into the machine at a time and stored in a memory
consisting of gas-filled thyratron triodes. The necessary programming was done by plugboards.
The machine was primarily used for logical manipulations and not for computational purposes. It
had little ability to perform numerical computations. The designers of the machine were Tommy
Flowers, Sir Harry Hinsley and M.H.A. Newman (Figure 19). This machine was the one used to
break the German cryptologic codes. The machine remained secret until 1970 and the algorithms of
decryption are still a secret.
The work at Bletchley in decoding the German messages played a significant role in the winning
of World War II. It helped to locate and sink Nazi U-boats in the Atlantic Ocean and win the
Battle of the Atlantic. It played the major role in the sinking of the German battleship Bismarck.
In May 1941 the Germans completed the construction of the Bismarck and sent it out through the
North Sea to the Atlantic Ocean. The ship was detected by the British when it left port, and the
H.M.S. Hood and several other ships were sent in pursuit of the Bismarck. In a famous battle on
24 May 1941, the Bismarck sank the H.M.S. Hood and damaged other British vessels. This was a
shocking development to the British. The Bismarck made her escape on 25 May and contact with
her was lost. Early on 25 May, Admiral Lütjens, thinking that he was still being shadowed by
a British warship, sent a long encoded signal to his Naval Headquarters in Germany. It listed all his
difficulties, but mainly the loss of fuel from his earlier battle. He asked for instructions as to what
to do. This message was intercepted by the British and decoded at Bletchley Park. The British
located the Bismarck along the French coast and sent a fleet of aircraft that sank the Bismarck.
The person considered to have developed the techniques for the code-breaking operation is Alan
Turing (Figure 20). He is one of the outstanding mathematicians and computer scientists of the 20th
century. When the history of 20th-century mathematics and computer science is written, Turing
will be listed prominently, as will John von Neumann, whose contributions to digital computing
I shall describe shortly. Not only was Turing important for the winning of WW II, for which he was
awarded the Order of the British Empire (OBE), but he contributed significantly to mathematical
logic, and to the development of the abstract Turing Machine. Following the war, Turing published
a report on his design for the ACE (Automatic Computing Engine), featuring random extraction
of information. The ability to access data in memory by going to the data directly, rather than
sequentially, was an important concept that all digital computers now feature. It is a pity that the
world needlessly lost such a brilliant individual at the age of 41 (1912-1954). Turing, who was a
homosexual, was hounded to death by the British government and died tragically, a suicide. See [5]
for a biography of Turing and a description of his contributions to code breaking and the Enigma.
Because of Turing's profound contributions to the field of computer science, the Association for
Computing Machinery selects an individual each year to receive the coveted Turing Award. The
award, in prestige, is as significant as the Nobel prizes.
4.1.3 Development of the Computer in the United States
The development of digital computers started in the United States in 1937 at three different locations:
Iowa State University, Harvard University and the Bell Telephone Laboratories. I shall
discuss each of these efforts. I will then discuss the development of the first electronic digital
computer at the Moore School at the University of Pennsylvania, which started in 1943.
Atanasoff and the Atanasoff-Berry Computer (ABC). In 1937, John Atanasoff at Iowa
State University spent the winter devising principles for an electronic digital computer (Figure 12).
Atanasoff was aware of Vannevar Bush's Differential Analyzer. He realized that with analog methods
one could not obtain the speeds for solving linear equations that one could with digital computation.
In 1939, Atanasoff and his student Clifford Berry built a prototype electronic digital computer, the
Atanasoff-Berry Computer (ABC) (Figure 13). It was the first computer to use vacuum tubes.
The ABC featured:
1. binary arithmetic,
2. rotating drum memories,
3. capacitor memory elements,
4. continuous regeneration of capacitor elements from a separate arithmetic unit,
5. electronic (vacuum-tube) switching and logic switching adders,
6. base conversion,
7. punched-card input/output systems,
8. automatic sequential controls,
9. modular units, and
10. parallel operations.
The ABC used 300 vacuum tubes. Atanasoff's idea was to store the coefficients of the linear
equations in the capacitors, which periodically had to be regenerated because they would lose their
charge.
A successful model was completed by December 1939 and the computer itself by May 1942. The
machine they constructed worked, but had limited capabilities for storage and computation. The
ABC was modest technology, not fully implemented, and was never used to do constructive work.
While he was building the ABC, Atanasoff lectured at a conference attended by a young man
named John Mauchly, who was at Ursinus College and subsequently became a professor at the
Moore School of Electrical Engineering at the University of Pennsylvania. The two spoke, as
Mauchly had become interested in electronic digital computers to perform computations for
predicting the weather by machine. Atanasoff invited Mauchly to visit him to see his
machine. Mauchly accepted the invitation and visited Atanasoff for five days in 1941, during which
he examined the machine and a detailed description of the computer. This visit started a major
controversy as to who invented the electronic digital computer, which was to lead to a famous
court decision that I will discuss shortly. Atanasoff left Iowa State to work at the Office of Naval
Research in Washington, DC on the war effort. He did no further work on computers from that
time to his death at the age of 91 on June 15, 1995.
Howard Aiken and the Mark I Computer. At Harvard, Howard Aiken, a graduate student,
conceived of the idea for a digital calculating machine. He was interested in solving differential
equations. He was influenced to develop a computer by the work of Babbage, which he had learned
about. It is of interest to note that of all those who designed computers in the early period, Aiken
and Zuse were apparently the only ones aware of Babbage's work and influenced by it.
Aiken submitted a proposal to IBM to support the construction of a digital computer. IBM agreed
to fund the effort and assigned three engineers at IBM to implement the machine. They were Clair
D. Lake (who headed the effort), Frank E. Hamilton and Benjamin M. Durfee. As stated by Aiken,
... I set forth the requirements of the machine for scientific purposes, and in which
the other gentlemen set forth the properties of the various machines which they had
developed, which they had invented ...
The computer was intended to compute the elements of mathematical and navigation tables.
Aiken's Harvard Mark I, also known as the IBM Automatic Sequence Controlled Calculator (ASCC),
was dedicated on August 7, 1944. The ASCC was
1. driven by a paper tape containing the instructions in serial order,
2. contained instructions that consisted of three parts: where the data to be operated upon was
stored, where the result was to be stored, and which operation was to be performed,
3. contained 72 counters for storing numbers, each made up of 23 digits plus sign, and
4. contained 60 registers controlled by manually set switches in which constants could be stored.
It took six seconds to do a multiplication and about 12 seconds for a division. Like Babbage's
machine, the Mark I was based on the decimal system, but unlike Babbage, Aiken did not use the idea of
conditional branching. Thus programs had to be executed in the sequence in which the instructions
were written. It was also not a stored program computer. The machine was realized in terms of
electromagnetic relays.
The first "coders" for the machine were Ensigns Robert Campbell and Richard Bloch. They
were joined by a third "coder," Ensign Grace Murray Hopper, in June 1944 (Figure 22). Aiken was
not at all gracious at the dedication of the Mark I computer, as he gave no credit to IBM, the
organization that had funded him, had supplied the designers and had implemented the machine.
While Aiken had the idea to build a machine, he did not have a specific design to do so.
Subsequent machines developed at Harvard were sponsored by the U.S. Navy. A second version,
the Mark II, was completed at Harvard in 1947. In programming, one refers to errors in a program
as "bugs". The term "bug" comes from Grace Murray Hopper, one of the programmers on the Mark
computers, who found the first "bug" on the Mark II prototype (Figure 23).
We shall learn more about Hopper, who remained in the Navy and eventually reached the rank
of rear admiral.
The Bell Relay Machine. In addition to the Aiken proposal in 1937, several other important
events took place. Claude Shannon at the Bell Telephone Laboratories published principles for an
electric adder to the base two, and George Stibitz, also at Bell Labs, developed a binary circuit based
on George Boole's (1854) Boolean algebra. As noted earlier, Zuse also had the idea of using binary
representation for computing.
Stibitz was investigating telephone relays to perform arithmetic. He first constructed a relay-driven
arithmetic unit (later called the Model K (Figure 14), because it was built on his kitchen
table) and from that built a number of relay machines subsequently used during the 2nd World War
to perform computing. Stibitz's first full-scale relay machine, developed in conjunction with Samuel
B. Williams and Ernest G. Andrews, worked in the area of complex numbers, useful for solving
problems with underwater cables, and was called the Complex Number Calculator (or the Bell Labs Model
1 (Figure 15)). The machine was completed in 1939. In 1940 the machine was used remotely over
telephone lines. A terminal was installed in a hallway outside the meeting rooms for the annual American
Mathematical Society conference at Dartmouth College and connected to the Bell Relay Machine
in New York. This was the first linking of computers and communications, which led
eventually to networking, as evidenced by the Internet that we have today. The need for computation
during World War 2 led to the use of the machine to compute "firing tables" for field and naval
artillery.
Eckert and Mauchly and the Development of the ENIAC. In the early 1940s, work was being
performed at the Moore School of Electrical Engineering for Aberdeen Proving Grounds on
ballistic table calculations using Vannevar Bush's Differential Analyzer. The Differential Analyzer
was slow. The Army needed ballistic tables generated faster than they could be computed by the
Differential Analyzer. John W. Mauchly, who had become interested in digital computers, together
with a brilliant young engineer, Presper Eckert, wrote a proposal to Aberdeen Proving Grounds in
Maryland to develop a general purpose electronic digital computer that could perform the ballistics
calculations faster than the Differential Analyzer. Mauchly's real interest was to use the
computer to perform weather predictions.
Dr. Herman H. Goldstine, an officer at Aberdeen Proving Grounds, was assigned to the Moore
School to supervise the use of the Differential Analyzer and the training of personnel for it. While there,
Goldstine had extensive conversations with Mauchly and became convinced of the validity of developing
a digital computer. Influenced by Goldstine, the U.S. Army at Aberdeen Proving Grounds
approved the proposal in 1942. Eckert and Mauchly started construction of a machine they called
the ENIAC (Electronic Numerical Integrator and Computer) (Figure 24) on May 31, 1943. Assuming
the machine was successful, it would perform the computations needed to solve exterior
ballistics problems much faster than either the Bell Relay Machine or the Differential Analyzer.
These machines took on the order of 45 minutes to solve such a problem.
The machine as finally constructed consisted of:
1. 18,000 vacuum tubes of 16 different types, operating with a clock rate of 100,000 pulses per
second. Hence the clock was synchronous and issued a pulse every 10 microseconds,
2. 70,000 resistors,
3. 10,000 capacitors, and
4. 6,000 switches.
It was 100 feet long, 10 feet high, and 3 feet deep. In operation it consumed 140 kilowatts of power,
and could store only 20 words in its memory.
To achieve a reliable machine with that many vacuum tubes was a major engineering feat.
The machine had to operate with a probability of malfunction of 1 part in 10^14 in order for it to
run for 12 hours without error. Eckert was the engineering genius who set forth the specifications
for all of the parts that made the machine a reality. Mauchly had the original idea and a great
deal of knowledge as to how to implement the many functions within the machine. There were
other designers who deserve credit for the effort: Arthur Burks and Kite Sharpless, who were senior
designers, and John Davis and Robert Shaw, who contributed to parts of the machine such as the
accumulators and the function tables.
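As a rough check of that reliability figure (the arithmetic here is mine, not from the original sources): with 18,000 tubes clocked at 100,000 pulses per second, a 12-hour run involves about

    18,000 tubes × 100,000 pulses/second × 43,200 seconds ≈ 7.8 × 10^13 tube-pulses,

so the chance of a malfunction at any single tube-pulse must indeed be on the order of 1 in 10^14 for an error-free run to be likely.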
The computer was up and running in the spring of 1945 and unveiled officially at the University
of Pennsylvania on February 14, 1946. This date is considered to be the start of the modern era of
computers. It is important to note how programs were input to the machine. This was accomplished
by plugboards that were attached to the computer, much in the same way as on the Colossus machine
(Figure 25). Thus, the program was external to the machine, and was used to modify the data in
the machine. This was a cumbersome process, but a program was typically executed over many data
sets, so it was a viable solution at the time.
Goldstine, also a mathematician, was responsible for monitoring the ENIAC development. One
day, on a trip to Philadelphia, he met the famous mathematician John von Neumann at the Aberdeen
station (Figure 27). von Neumann was at the Institute for Advanced Study in Princeton and was
also working at Los Alamos on computations with respect to the atomic bomb. Goldstine decided
to engage him in conversation and, on the train, described the work at the Moore School on the
ENIAC. von Neumann, who was interested in computing because of his work concerning the atomic
bomb, had not heard of this effort, and Goldstine invited him to be a consultant to the project.
The significance of the ENIAC is that it was the first large-scale electronic digital computer.
This was a major achievement. However, Eckert and Mauchly realized that the machine could
be improved significantly (Figure 26). There were three major shortcomings of the machine: too
little storage (the ENIAC had only 20 words of memory); too many vacuum tubes, so that the
machine could not function over very long periods without a malfunction; and too lengthy a time
to reprogram the machine, as reprogramming effectively meant rewiring.
Eckert, Mauchly, von Neumann, the EDVAC and the Stored Program Concept.
In 1945, Eckert and Mauchly signed a contract to build the EDVAC (Electronic Discrete Variable
Automatic Computer), a more advanced machine that would resolve the limitations of the ENIAC.
The design of the machine came about through a number of developments at the Moore School.
The problem with storage led Eckert to propose that a mercury delay-line storage unit be used
instead of vacuum tube storage. Eckert had calculated that a mercury delay line five feet in length
would produce a one-millisecond delay; assuming numbers were represented by electronic pulses of
a microsecond duration, a delay line would be able to store 1,000 such pulses. By connecting the
output of the delay line to the input, the 1,000 bits of information would circulate indefinitely inside
the delay line, and would provide permanent read/write storage.
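Eckert's figures are easy to check (the arithmetic below is mine and assumes the textbook value of roughly 1,450 meters per second for the speed of sound in mercury):

    delay = (5 ft × 0.305 m/ft) / 1,450 m/s ≈ 1.05 milliseconds
    capacity = 1.05 ms / (1 microsecond per pulse) ≈ 1,000 pulses

That is, about a thousand one-microsecond pulses are in flight inside the mercury column at any instant.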
Eckert, Mauchly, Goldstine, von Neumann, and Arthur Burks (also at the Moore School) had
engaged in discussions on the machine. During these discussions, it was realized that the computer's
storage device would be used to hold both the instructions of a program and the data on which it
operated. Thus, the idea of the stored program computer was born at the Moore School. The delay
line resolved the problems of too many vacuum tubes and too little storage, and provided the flexibility
to write and store programs in the storage unit. The stored program concept is of extreme importance to
the development of computing. As I noted earlier, programs for the ENIAC were constructed by
plugging wires into the machine. The instructions on the wired machine manipulated the data stored
in the machine. The idea of the stored program was that the program should reside in the machine's
memory, just like data, and would be executed, not by the wiring of a program, but from the program
stored in the main memory of the machine. This would permit a degree of flexibility in that the
program could modify its own behavior and could re-execute sections of its code, as required in most
programs. Hence, it need not execute in a sequential fashion. This seemingly simple observation,
that programs and data should be treated similarly, is what led to the programming revolution.
Many computer scientists believe that the definition of a digital computer should require a
stored program capability.
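The idea can be made concrete with a toy example. The following sketch (in Python; the instruction set and program are entirely invented for illustration and have nothing to do with the EDVAC's actual order code) places instructions and data in one memory and includes a conditional jump, so the program counts a value down to zero and then prints a message:

    # A toy stored-program machine: one memory holds both the
    # program (addresses 0-5) and its data (addresses 7-9).
    memory = [
        ("LOAD", 7),    # 0: acc <- memory[7]
        ("ADD", 8),     # 1: acc <- acc + memory[8]
        ("STORE", 7),   # 2: memory[7] <- acc
        ("JNZ", 0),     # 3: if acc != 0, jump back to address 0
        ("PRINT", 9),   # 4: print memory[9]
        ("HALT", 0),    # 5: stop
        None,           # 6: unused
        3,              # 7: data: a counter
        -1,             # 8: data: the decrement
        "done",         # 9: data: a message
    ]

    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, addr = memory[pc]           # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":    acc = memory[addr]
        elif op == "ADD":   acc += memory[addr]
        elif op == "STORE": memory[addr] = acc
        elif op == "JNZ":   pc = addr if acc != 0 else pc
        elif op == "PRINT": print(memory[addr])
        elif op == "HALT":  break

Because the program sits in the same memory as its data, a STORE aimed at addresses 0-5 would rewrite the program itself, which is exactly the flexibility (and the danger) the stored-program concept introduced.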
Von Neumann wrote up the discussions on the stored program machine under the title "First
Draft of a Report on the EDVAC." The report, dated June 30, 1945, was authored only by von
Neumann. The report emphasized that a computer was basically a mechanism to carry out logical
operations. This idea is undoubtedly von Neumann's contribution, while the basic ideas on the
technology of the machine and on the need for a stored program were contributed by Eckert and
Mauchly. This report was distributed widely by the Aberdeen Proving Grounds and caused many
problems. First, machines are now erroneously referred to as von Neumann machines, although
the work was shared by Eckert and Mauchly. Second, it caused hard feelings within the group, as
Eckert and Mauchly had wanted to patent the ideas, which von Neumann wanted to make available
to anyone, as is the wont of academics. Goldstine, von Neumann and Arthur Burks, a philosopher
and one of the designers of the machine, were at odds with Eckert and Mauchly. There appears to be
evidence that Eckert and Mauchly had the idea of a stored program computer before von Neumann
became a consultant on the ENIAC. The question of who was the first to originate the stored
program concept may never be resolved. Certainly, Eckert, Mauchly, Goldstine, von Neumann and
Burks have legitimate claims. Zuse also claimed that he had the basic idea as well. It is also the
case that Turing had the same idea.
Eckert and Mauchly eventually set up a new company, the Electronic Control Company, in
March 1946. They obtained a contract from the Census Department to develop an EDVAC-type
computer. The machine was subsequently named UNIVAC.
Who Developed the First Electronic Digital Computer? The Eckert and Mauchly
Company was sold to the Sperry-Rand Corporation. The Honeywell Corporation claimed that
Sperry-Rand did not own certain patents that Honeywell wanted to use for its own computers. The
Honeywell Corporation claimed that the patents were derived from work by Atanasoff and that
Mauchly had used these ideas in his patents. There was a long, acrimonious trial. Burks and
Goldstine testified against Eckert and Mauchly. In 1973 Federal District Court Judge Earl R.
Larson ruled in the now famous Honeywell vs. Sperry-Rand suit that "Eckert and Mauchly did not
themselves first invent the automatic electronic digital computer, but instead derived that subject
matter from one Dr. John Vincent Atanasoff." It has always seemed strange to me that such
an issue was adjudicated by a judge who had no grasp of the technical aspects of computers.
There are some in the computer field who consider Atanasoff to be the inventor of the world's first
electronic digital computer, the Atanasoff-Berry Computer. There are others who disagree. This
controversy may never be decided. My view tends towards Eckert and Mauchly, for two major
reasons. First, the development of the ENIAC, a real, large machine, in contrast to a small prototype,
is a big leap. Many technical problems had to be overcome that Atanasoff did not have to face.
Second, Atanasoff never really appreciated the significance of his accomplishment and never worked
on computers after he left Iowa State University. However, there is also no doubt that Atanasoff
influenced Mauchly and that Mauchly learned a great deal from his reading and observation of the
ABC machine when he visited Iowa State University.
Developments following EDVAC The second half of the 1940s saw many developments in devices for computers, in the computers themselves, and in programming. I will mention some of the highlights and then discuss developments in programming.
A major influence on these developments was a summer school held at the Moore School, now referred to as the Moore School Lectures. These lectures were held over eight weeks, from 8 July to 31 August 1946. The list of lecturers was a Who's Who in computing at that time: it included Aiken, von Neumann, Eckert, Mauchly, Goldstine, and Burks, as well as others from the Moore School. Virtually every computer built in the United States and Britain used the basic philosophy of the EDVAC; many of the designers either read the von Neumann report or attended the Moore School summer school. The first operational stored-program digital computer, the Manchester Mark I, was implemented in Britain under the direction of Max Newman, who had been responsible for the Colossus machine, and F.W. Williams. It was a relatively small machine. The EDSAC (Electronic Delay Storage Automatic Calculator), a stored-program computer developed by Maurice Wilkes, was implemented in 1949. In 1951, Wilkes originated the concept of micro-programming, a technique providing an orderly approach to designing a computer system's control section.
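To make the idea concrete, here is a minimal Python sketch of a micro-programmed control section: each machine instruction is carried out by replaying a fixed sequence of primitive control steps read from a table. All opcode and micro-operation names below are invented for illustration and belong to no actual machine.

    # Hypothetical microcode table: opcode -> ordered micro-operations.
    MICROCODE = {
        "ADD":   ["fetch_operand", "alu_add", "write_accumulator"],
        "STORE": ["read_accumulator", "write_memory"],
        "JUMP":  ["load_program_counter"],
    }

    def execute(opcode):
        # The control section steps through the instruction's micro-sequence.
        for micro_op in MICROCODE[opcode]:
            print("  micro-step:", micro_op)

    execute("ADD")

The orderliness Wilkes was after is visible here: an instruction's behavior is a table entry that can be inspected and changed, rather than ad hoc wiring.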
Other important developments in the late 1940s concerned computer storage units. The slow mercury delay line was recognized as a bottleneck in speeding up computer operations. To alleviate this problem, the magnetic drum was introduced as a storage device in 1947-1948. Although the drum also rotated, so that one had to wait for the data to come around the track, it had a larger capacity and was faster than the mercury memory. Another development was the use of a cathode ray tube to store data, developed by F.W. Williams.
More important than these memory devices was the announcement on December 23, 1947 to the Bell Laboratories management by John Bardeen, Walter Brattain, and William Shockley that they had developed the first transistor (Figure 28). The transistor is a solid-state device and is stable, in contrast to electronic tubes. Its development permitted machines to do away with vacuum tubes and led to the miniaturization of computers we have today: in contrast to the size of the early digital computers, we now have hand-held computers with far more capability than the early machines. The three inventors received the Nobel Prize for this work, and the transistor has been called the single most important invention of the 20th century, the key that unlocked the Computer Age. Another important development was made by Claude Shannon, who developed a mathematical theory of communication that led to many new developments in the field of communication. This period also saw, as noted above, the first stored program computer, the Mark I, developed in Manchester, England.
Before discussing the developments in programming, I would like to mention one other important development, the Whirlwind computer, which is important for two reasons (Figure 29). It was the first real-time computer. Up to this time, programs were run on computers with no consideration as to when the computation had to be completed; in real-time applications, one must complete a job within a specified period of time. Jay Forrester led the development of the Whirlwind machine, which was completed in the third quarter of 1949 and had 5,000 vacuum tubes. He realized that although the machine did perform computations in real time, after a fashion,
access to data was a problem. Mercury delay lines and magnetic drums did not give direct random access to data; that is, it was not possible to specify a precise memory location and directly access the data in that location. On May 11, 1951, Forrester filed a patent application for the matrix core memory (Figure 30). This was the second important development on Whirlwind. Given the address of the memory location where an entry is stored, the data can be accessed immediately, without having to wait for the memory to recirculate. This was a key invention. Until semiconductor memory became commercially available, magnetic core memory was the primary direct random access memory for digital computers. All computers now use direct random access to primary memory.
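The difference between the two kinds of storage can be sketched as follows. This is a minimal Python sketch of the access patterns only; the stored words and the unit latencies are invented for illustration, not measurements of real hardware.

    # Recirculating storage: a drum (or delay line) presents words in a
    # fixed cycle, so the wait depends on where the rotation happens to be.
    DRUM = ["w0", "w1", "w2", "w3"]

    def drum_read(target, head):
        # Wait until the target word comes around to the read head.
        wait = (target - head) % len(DRUM)
        return DRUM[target], wait

    # Direct random access: core memory returns any addressed word in
    # constant time, independent of any mechanical or acoustic position.
    CORE = {0: "w0", 1: "w1", 2: "w2", 3: "w3"}

    def core_read(target):
        return CORE[target], 1

    print(drum_read(1, head=3))   # ('w1', 2): two positions of rotational delay
    print(core_read(1))           # ('w1', 1): immediate access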
4.2 Developments in Computer Programming
I have focused primarily on the developments that led to the first electronic digital computers. It is also important to discuss the development of programming for these computers. To make effective use of digital electronic computers, one must have the ability to write instructions for these machines. I will briefly sketch some of the early developments in computer programming.
In discussing the Colossus and the ENIAC, I noted that the program to be executed was set up on the computer's plug boards. This was before the stored program became a reality. When the program was stored together with the data, the instructions to the machine were encoded as a string of bits by the programmer. Thus, the instruction
"Add the number in memory location 25"
was encoded by hand as
111000000000110010
Instructions in binary are what every computer uses today. In the early days of computing, programmers first wrote their programs in a notation higher than binary, and then had to encode the program into binary by hand. The problem with hand encoding programs into binary is that many errors can arise in the transcription, errors are difficult to detect, and it is a deadly, boring task; such a notation is simply not user-friendly. Mauchly recognized this and in 1949 developed the first high-level programming language, called "Short Code." In 1951, Maurice Wilkes, the developer of the EDSAC, realized that one could not expect a programmer to write in binary and introduced the concept of symbolic inputs, so that the above instruction could be written as:
A 25 S,
to add the number in location 25 to the accumulator and store the result in location S. Goldstine and von Neumann had the same idea. However, they envisioned that human "coders" would do the translation, while Wilkes realized that a program could be written to translate the symbolic code into binary machine code. This led to the development of assembly language. Wilkes also realized the need for sub-routines: code that could be called upon by programmers and need not be programmed separately by each programmer. An example of a sub-routine is a routine that sorts lists of entries into alphabetic order. Together with David Wheeler and Stanley Gill, Wilkes developed a set of program sub-routines that programmers could incorporate into their code. A subroutine may be executed many times in the same program; however, it is written only once, and each time it is used, the parameters needed to execute it are sent to it. Thus, a subroutine saves space by being written only once in the program. Conditional branching, first described by Babbage, allows the program to jump to the subroutine directly.
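Here is a toy Python sketch of the translation step Wilkes envisioned, turning a symbolic instruction such as the one above into a binary machine word. The opcode bit pattern, the 7-bit address fields, and the symbol table are invented for illustration; they are not the actual EDSAC encoding.

    OPCODES = {"A": "11100"}   # hypothetical: "A" means add
    SYMBOLS = {"S": 50}        # symbolic names resolved to numeric addresses

    def assemble(line):
        # Translate one symbolic instruction into a binary word.
        op, *operands = line.replace(",", "").split()
        word = OPCODES[op]
        for operand in operands:
            # An operand is a symbolic name (S) or a literal address (25).
            address = SYMBOLS.get(operand)
            if address is None:
                address = int(operand)
            word += format(address, "07b")   # hypothetical 7-bit address field
        return word

    print(assemble("A 25 S,"))   # prints 1110000110010110010

The programmer writes the memorable symbolic line; the assembler, not a human "coder," produces the bit string.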
Francis Elizabeth (Betty) Snyder-Holberton, who was one of the group of women who programmed the ENIAC, and hence among the very first programmers in the world, developed the first generalized sort-merge generator, for the UNIVAC I machine. Her Sort Generator allowed programmers to specify parameters from which it produced tailored sorting programs for them. This revolutionary concept, at the time called "automatic programming," has been hailed by other programming leaders as a major influence on
their thinking. In addition, Snyder-Holberton developed the instruction code C-10 for the UNIVAC I, which changed forever the manner in which programmers implemented their code.
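In modern terms, a generator of this kind is a program that writes a program. The Python sketch below is a loose analogue of the idea, not Holberton's actual Sort Generator: the programmer supplies parameters, and the generator emits the text of a sort routine specialized to them.

    def generate_sort_program(key_field, descending):
        # Emit source text for a sort routine specialized to the parameters.
        return (
            "def sort_records(records):\n"
            "    return sorted(records, key=lambda r: r[%d], reverse=%s)\n"
            % (key_field, descending)
        )

    source = generate_sort_program(key_field=1, descending=False)
    print(source)    # the generated program text
    exec(source)     # load the generated routine, as if running its output
    print(sort_records([("b", 2), ("a", 1)]))   # [('a', 1), ('b', 2)]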
It was generally recognized that assembly language programming was not sufficiently user-friendly for writing programs. In the period 1951-1952, Grace Murray Hopper, who had worked on the Harvard Mark machines as a programmer, developed the first high-level language compiler, the A-0 compiler, while she worked at UNIVAC. A compiler takes a set of statements that represent a program, written in a language higher than assembly language (as illustrated below), and translates the program into machine code. Unfortunately, the compiler that Hopper developed was very inefficient, took excessive amounts of time to generate machine code, and was not used except for testing. Many individuals believed that the quest for a high-level compiler would never pay off. Nevertheless, Grace Hopper was the driving force in the efforts to develop automatic programming tools.
It was not until 1957, when John Backus and his colleagues developed the first FORTRAN (Formula Translator) compiler (Figure 31), that an effective compiler appeared. Indeed, FORTRAN had optimization features that permitted it to generate code that was more efficient and executed faster than most hand-coded programs. FORTRAN became the first successful programming language. A single FORTRAN statement translated into many computer instructions and permitted programmers to write in a convenient language. Thus, the single statement:
X = 17.21 * log(a) * cos(b + c),
produced all of the code needed to calculate the answer, where a, b, and c are constants, and to assign the result to the variable X. This represents a major advance over coding programs in binary.
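The following Python sketch suggests what the compiler does with such a statement: one line of source expands into a sequence of simple machine-level steps, represented here by temporaries. The instruction names in the comments are invented for illustration and are not FORTRAN's actual output.

    import math

    a, b, c = 2.0, 0.5, 0.25   # sample values for the constants

    # What the programmer writes, as a single statement:
    x = 17.21 * math.log(a) * math.cos(b + c)

    # Roughly what the compiler emits, one simple step at a time:
    t1 = math.log(a)      # LOG  a         -> t1
    t2 = b + c            # ADD  b, c      -> t2
    t3 = math.cos(t2)     # COS  t2        -> t3
    t4 = 17.21 * t1       # MUL  17.21, t1 -> t4
    x2 = t4 * t3          # MUL  t4, t3    -> x

    assert abs(x - x2) < 1e-12   # both routes compute the same answer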
Backus's work at IBM was not a high priority, and there were many skeptics at IBM who did not believe that this research project would ever be completed successfully, or that it would be able to generate code that executed at the same speed as hand-written code. In practice, Backus's FORTRAN code was 90% as good as programs written by hand, as measured by the storage it occupied and its speed of execution. FORTRAN is still in existence and is the most successful programming language ever developed.
While FORTRAN was the most successful scientific programming language, the U.S. government was interested in developing a standard language for business applications. The reason for this is that each time there was a change in computers, it was necessary to rewrite programs if they were not written in a standard language. This was expensive, and the government did not want it to happen. The government sponsored the Conference on Data Systems Languages (CODASYL) to develop standard languages that all manufacturers would support. The Common Business Oriented Language (COBOL) was spawned.
Grace Hopper, although not one of the designers of the COBOL language, played a prominent role in this effort. A statement in COBOL designed to compute the net pay for an employee could be written as:
SUBTRACT TAX FROM GROSS-PAY GIVING NET-PAY
The language gave the illusion that it could be understood by managers. COBOL is still in
existence today.
Zuse, in 1945, had also envisioned a programming language, which he called Plankalkul (Plan Calculus). This was the first algorithmic programming language. It was a sophisticated language and had many interesting features that predate the work on FORTRAN. The development of Plankalkul was not known to the rest of the world until many years later [3].
5 Summary
I have tried to give an overview and a time-line of significant events leading to the electronic digital computer. As we have seen, many individuals contributed to the start of the computer revolution: Babbage, who was the first person to consider making a general purpose digital computer, in the 1800s, and who developed the idea of conditional branching; Turing, who developed the concept of the Turing Machine, a universal computer that could simulate any digital computer, and who contributed ideas to the design of the Colossus computer; Flowers, Hinsley, and Newman, the designers of the Colossus computer; Atanasoff, who, with his student Berry, developed the first electronic digital computer; Stibitz, who with Williams and Andrews developed the Bell Relay Machine and showed that one can communicate remotely with digital computers; Aiken, who also developed a relay machine, the Mark I at Harvard; Konrad Zuse, for his electromechanical computers and the Plankalkul programming language; Eckert and Mauchly, for the development of the first large scale electronic digital computer, as well as for developing the stored program concept with von Neumann; and von Neumann, for his co-development of the stored program concept and his formal description of the organization of a computer, including the importance of logical as well as arithmetical operations in a computer (also recognized by Turing). As we have seen, many individuals contributed to the development of the electronic digital computer. One cannot say that any one particular individual was the primary person. All of these individuals deserve credit.
The computer has changed radically in terms of speed of operation, storage capacity, and size. It was not possible in this short talk to describe the hardware developments that have taken place since the 1950s, or the software developments that have revolutionized the use of computers. I have described the events leading to the start of the digital computer revolution. The story of how the computer has come to permeate science, technology, industry, and the world around us is only starting to be told, and we can look forward to books that describe how this came about. There are already some books on this subject: [1] is a start in this direction, and [2], currently being written, will tell another aspect of the story.
References
[1] M. Campbell-Kelly and W. Aspray. Computer: A History of the Information Machine. BasicBooks, New York, 1996.
[2] P. Ceruzzi. A History of Modern Computing: 1945-1985. National Air and Space Museum, Smithsonian Institution, Washington, D.C. In preparation, 1998.
[3] W. Giloi. Konrad Zuse's Plankalkul: the first high-level "non von Neumann" programming language. IEEE Annals of the History of Computing, 19(2):17-24, April-June 1997.
[4] H. Goldstine. The Computer: From Pascal to von Neumann. Princeton University Press, Princeton, New Jersey, 1972.
[5] A. Hodges. Alan Turing: The Enigma. Simon & Schuster, New York, 1983.
[6] K. Katz. Historical content in computer science texts: A concern. IEEE Annals of the History of Computing, 19(1):16-19, January-March 1997.
[7] B. Randell, editor. The Origins of Digital Computers. Springer-Verlag, New York, second
edition, 1975.
[8] R. Rojas. Konrad Zuse's legacy: The architecture of the Z1 and Z3. IEEE Annals of the History of Computing, 19(2):5-16, April-June 1997.
Figure 1: ABACUS - 3000 BC
Figure 2: SCHICKARD MACHINE - 1623 AD
Figure 3: PASCALINE CALCULATOR - 1642-1643
Figure 4: JACQUARD LOOM
Figure 5: JACQUARD PUNCH CARD
Figure 6: BABBAGE DIFFERENCE ENGINE
Figure 7: BABBAGE ANALYTICAL ENGINE - 1834-1835
Figure 8: COUNTESS ADA LOVELACE - 1842-1843
Figure 9: HOLLERITH MACHINE - 1889
Figure 10: VANNEVAR BUSH DIFFERENTIAL ANALYZER - 1930
Figure 11: KONRAD ZUSE AND Z-1 MACHINE - 1934
Figure 12: JOHN VINCENT ATANASOFF - 1937
Figure 13: ATANASOFF-BERRY COMPUTER - 1939
Figure 14: STIBITZ MODEL-K - 1937
Figure 15: BELL RELAY COMPUTER - 1937
Figure 16: ENIGMA CODING MACHINE - 1939-1941
Figure 17: BOMBE AND ENIGMA MACHINES - 1939-1941
Figure 18: COLOSSUS MACHINE - 1943
Figure 19: HINSLEY AND FLOWERS - 1943
Figure 20: ALAN TURING - 1937
Figure 21: BLETCHLEY PARK - 1940
Figure 22: ADMIRAL GRACE MURRAY HOPPER - 1951-1952
Figure 23: FIRST COMPUTER BUG - 1945
Figure 24: THE ENIAC MACHINE - 1943-1945
Figure 25: PROGRAMMING THE ENIAC MACHINE - 1945
Figure 26: ECKERT AND MAUCHLY - 1945
Figure 27: JOHN VON NEUMANN AND IAS COMPUTER - 1945
Figure 28: TRANSISTOR - 1947
Figure 29: WHIRLWIND REAL-TIME COMPUTER - 1949
Figure 30: CORE MATRIX MEMORY - 1951
Figure 31: FIRST FORTRAN MANUAL - 1957