
The History of Computing

The historical foundation of computing goes a long way toward explaining why computer systems today are designed as they are. The devices that assist humans in various forms of computation have their roots in the ancient past and have continued to evolve to the present day. Let's take a brief tour through the history of computing hardware. For the purposes of the tour, the timeline has been broken up into five major chapters, as outlined below.

Early History

The abacus, which appeared in the sixteenth century BC, was developed as an instrument to record numeric values on which a human can perform basic arithmetic.

17th Century

According to the Computer History Museum (2006), in the middle of the seventeenth century, Blaise Pascal, a French mathematician, built and sold gear-driven mechanical machines that performed whole-number addition and subtraction. Later in the seventeenth century, a German mathematician, Gottfried Wilhelm von Leibniz, built the first mechanical device designed to do all four whole-number operations: addition, subtraction, multiplication and division. Unfortunately, the state of mechanical gears and levers at that time was such that the Leibniz machine was not very reliable.

18th Century

Punched Card

In the late eighteenth century, Joseph Jacquard developed what became known as Jacquard's loom, used for weaving cloth. The loom used a series of cards with holes punched in them to specify the use of specific coloured thread and, therefore, dictate the design that was woven into the cloth. Although not a computing device, Jacquard's loom was the first to make use of an important form of input: the punched card.

19th Century

It wasn't until the nineteenth century that the next major step was taken, this time by a British mathematician. Charles Babbage designed what he called his analytical engine. His design was too complex to build with the technology of his day, so it was never implemented. His vision, however, included many of the important components of today's computers. His design was the first to include a memory so that values did not have to be re-entered. His design also included the input of both numbers and mechanical steps, making use of punched cards similar to those in Jacquard's loom (Dale & Lewis, 2012).

Ada Lovelace

Ada Augusta, Countess of Lovelace, was a very romantic figure in the history of computing. Ada, the daughter of Lord Byron (the English poet), was a skilled mathematician. She became interested in Babbage's work on the analytical engine and extended his ideas (as well as correcting some of his errors). Ada is credited with being the first programmer. The concept of the loop, a series of instructions that repeat, is attributed to her (Computer History Museum, 2006).

During the latter part of the nineteenth century and the beginning of the twentieth century, computing advances were made rapidly. William Burroughs produced and sold a mechanical adding machine. Dr Herman Hollerith developed the first electro-mechanical tabulator, which read information from a punched card. His device revolutionised the census taken every ten years in the United States. Hollerith later formed the company known today as IBM.

20th Century

Alan M. Turing

In 1936, a theoretical development took place that had nothing to do with hardware per se but profoundly influenced the field of computer science. Alan M. Turing, another British mathematician, invented an abstract mathematical model called a Turing Machine, laying the foundation for a major area of computing theory. The most prestigious award given in computer science (equivalent to the Fields Medal in mathematics or a Nobel Prize in other sciences) is the Turing Award, named for Alan Turing (Dale & Lewis, 2012).

By the outbreak of World War II, several computers were under design and construction. The Harvard Mark I and the ENIAC are two of the more famous machines of the era. John von Neumann, who had been a consultant on the ENIAC project, started work on another machine known as EDVAC, which was completed in 1950. In 1951, the first commercial computer, the UNIVAC I, was delivered to the Bureau of the Census and was the first computer used to predict the outcome of a presidential election.

The UNIVAC I

The early history of computing began with the abacus and ended with the delivery of the UNIVAC I. With the building of that machine, the dream of a device that could rapidly manipulate numbers was realised; the search was ended. Or was it? Some experts predicted at that time that a small number of computers would be able to handle the computational needs of mankind. What they didn't realise was that the ability to perform fast calculations on large amounts of data would radically change the very nature of fields such as mathematics, physics, engineering and economics. That is, computers made those experts' assessments of what needed to be calculated entirely invalid (Dale & Lewis, 2012).

After 1951, the story becomes one of the ever-expanding use of computers to solve problems in all areas. From that point, the search has focused not only on building faster, bigger devices, but also on developing tools that allow us to use these devices more productively. The history of computing hardware from this point on is categorised into several generations based on the technology each employed.

First Generation (1951 – 1959)

Commercial computers in the first generation were built using vacuum tubes to store information. A vacuum tube generated a great deal of heat and was not very reliable. The machines that used them required heavy-duty air conditioning and frequent maintenance. They also required very large, specially built rooms.

The primary memory device of the first generation was a magnetic drum that rotated under a read/write head. When the memory cell being accessed rotated under the read/write head, the data was read from or written to that location on the drum.

The input device was a card reader that read the holes punched in an IBM card (a descendant of the Hollerith card). The output device was either a punched card or a line printer. By the end of this generation, magnetic tape drives had been developed that were much faster than card readers. Magnetic tapes are sequential storage devices, meaning that the data on the tape must be accessed one item after another in a linear fashion. Storage devices external to the computer memory are called auxiliary storage devices; the magnetic tape was the first of these. Collectively, input/output devices and auxiliary storage devices became known as peripheral devices (Computer History Museum, 2006).

Second Generation (1959 – 1965)

IBM Vacuum Tubes

The advent of the transistor (for which John Bardeen, Walter H. Brattain and William B. Shockley won a Nobel Prize) ushered in the second generation of commercial computers. The transistor replaced the vacuum tube as the main component in the hardware. The transistor was smaller, more reliable, faster, more durable and cheaper (Computer History Museum, 2006).

The second generation also witnessed the advent of immediate-access memory. When accessing information from a drum, the CPU had to wait for the proper place to rotate under the read/write head. The second generation used memory made from magnetic cores, tiny doughnut-shaped devices each capable of storing one bit of information. The cores were strung together with wires to form cells, and cells were combined into a memory unit. Because the device had no moving parts and was accessed electronically, information was available instantly.

The magnetic disc, a new auxiliary storage device, was also developed during the second computer hardware generation. Much faster than the magnetic tape, the magnetic disc could access data by referring to its location on the disc. Unlike a tape, which cannot access a piece of data without reading everything on the tape that comes before it, a disc is organised so that each piece of data has its own location identifier, called an address. The read/write heads of a magnetic disc can be sent directly to the specific location on the disc where the desired information is stored.
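To make that contrast concrete, here is a minimal sketch, in Python, of sequential (tape-like) access versus direct (disc-like) access by address. The class names, records and access counts are invented purely for illustration; real tape and disc hardware is far more involved.

```python
# Illustrative sketch only: sequential (tape-like) vs direct (disc-like) access.
# The classes and records below are hypothetical, not a model of real hardware.

class Tape:
    """Sequential storage: reaching an item means reading past everything before it."""
    def __init__(self, items):
        self.items = list(items)

    def read(self, position):
        reads = 0
        for i, item in enumerate(self.items):
            reads += 1                      # each earlier item must be passed over
            if i == position:
                return item, reads


class Disc:
    """Direct-access storage: each item has an address and is fetched in one step."""
    def __init__(self, items):
        self.blocks = {address: item for address, item in enumerate(items)}

    def read(self, address):
        return self.blocks[address], 1      # one access, regardless of location


records = ["census-1950", "payroll-A", "payroll-B", "inventory"]
print(Tape(records).read(3))   # ('inventory', 4): four reads to reach the last item
print(Disc(records).read(3))   # ('inventory', 1): one read, straight to the address
```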

Third Generation (1965 – 1971)

An Integrated Circuit Board

In the second generation, transistors and other components for the computer were assembled by hand on printed circuit boards. The third generation was characterised by integrated circuits (ICs), solid pieces of silicon that contained the transistors, other components and their connections. Integrated circuits were much smaller, cheaper, faster and more reliable than printed circuit boards. Gordon Moore, one of the co-founders of Intel, noted that from the time of the invention of the IC, the number of circuits that could be placed on a single integrated circuit was doubling each year. This observation became known as Moore's Law (Dale & Lewis, 2012).

Transistors were also used for memory construction, where each transistor represented one bit of information. IC technology allowed memory boards to be built using transistors. Auxiliary storage devices were still needed, though, because transistor memory was volatile; that is, the information went away when the power was turned off.

The terminal, an input/output device with a keyboard and screen, was introduced during this generation. The keyboard gave the user direct access to the computer, and the screen provided an immediate response.

Fourth Generation (1971 – ?)

Large-scale integration characterises the fourth generation. From several thousand transistors on a silicon chip in the early 1970s, we had moved to a whole microcomputer on a chip by the middle of the decade. Main memory devices are still made almost exclusively out of chip technology. Over the previous 40 years, each generation of computer hardware had become more powerful in a smaller package at lower cost. Moore's law was modified to say that chip density was doubling every 18 months (Dale & Lewis, 2012).
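As a back-of-the-envelope illustration of what 18-month doubling implies (a rough sketch, not data about any particular chip; the starting figure is simply an assumed round number on the order of an early-1970s microprocessor), the compounding works out as follows:

```python
# Rough illustration of Moore's law as restated: density doubles every 18 months.
# The starting density is an assumed round figure, used only for the arithmetic.

def density_after(months, starting_density=2_300):
    """Estimated density after `months`, assuming a doubling every 18 months."""
    return starting_density * 2 ** (months / 18)

for years in (0, 3, 6, 9):
    print(f"after {years:>2} years: ~{density_after(years * 12):,.0f} transistors")
# Roughly 4x every 3 years: 2,300 -> 9,200 -> 36,800 -> 147,200
```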

By the late 1970s, the phrase personal computer (PC) had entered the vocabulary. Microcomputers had become so cheap that almost anyone could have one, and a generation of kids grew up playing Pac-Man (Domingo, 2012).

The fourth generation saw several new names entering the commercial market. Apple, Tandy/Radio Shack, Atari, Commodore and Sun joined the big companies of earlier generations: IBM, Remington Rand, NCR, DEC (Digital Equipment Corporation), Hewlett-Packard, Control Data and Burroughs. The best-known success story of the personal computer revolution is that of Apple. Steve Wozniak, an engineer, and Steve Jobs, a high school student, created a personal computer kit and marketed it out of a garage. This was the beginning of Apple Computer, a multibillion-dollar company (Domingo, 2012).

The IBM PC was introduced in 1981 and was soon followed by compatible machines manufactured by many other companies. For example, Dell and Compaq were successful in making PCs that were compatible with IBM PCs. Apple introduced its very popular Macintosh microcomputer line in 1984.

The ICL DRS Model 30 workstation - computer fashions of the 1980s

In the mid-1980s, larger, more powerful machines called workstations were created. Workstations were generally meant for business, not personal, use. The idea was for each employee to have his or her own workstation on the desktop. These workstations were then connected by cables, or networked, so that they could interact with one another. Workstations were made more powerful by the introduction of the RISC (reduced instruction set computer) architecture. Each computer was designed to understand a set of instructions, called its machine language. Conventional machines such as the IBM 370/168 had a set containing more than 200 instructions (Dale & Lewis, 2012). Instructions were fast and memory access was slow, so specialised instructions made sense. As memory access became increasingly faster, using a reduced set of instructions became attractive. Sun Microsystems introduced a workstation with a RISC chip in 1987. Its enduring popularity proved the feasibility of the RISC chip. These workstations were often called UNIX workstations because they used the UNIX operating system.

Because computers are still being made using circuit boards, we cannot mark the end of this generation. However, several things have occurred that have so dramatically affected how we use machines that they have certainly ushered in a new era. Moore's law was once again restated in the following form: "Computers will either double in power at the same price or halve in cost for the same power every 18 months" (Dale & Lewis, 2012).