COMPUTERS - A BRIEF HISTORY

The idea for a computer is not new. In fact, the first true calculating machine, the abacus, was in use before 400 BC - and is still used today. In 1617, a Scottish mathematician, John Napier, made a set of calculating rods from bone or ivory. They were called, jokingly, Napier's bones. In the 19th century, an English mathematician, Charles Babbage, drew up the plans for what he called a 'difference engine'. This 'engine' consisted of four parts:

  • an input device
  • a memory (Babbage called it a store)
  • a processor (Babbage called it a mill)
  • an output device.

Babbage never actually built his machine, but his ideas were used when modern computers were developed.

In 1880, the United States government held a census. When it took seven years to finalise the results, the Census Bureau decided to have a competition to see if anyone could invent a better, faster method. The winner was a man named Herman Hollerith, who invented the punched card. He formed the company that would eventually become the giant International Business Machines (IBM).

This system was improved by James Powers in 1910. Although he worked for the Census Bureau, he formed his own company, which became part of Remington Rand - a firm that later merged with the Sperry and Univac companies and eventually became Unisys. The punched card system was used for inputting data into early programmable computers. These arrived during World War II, when only the military could afford the huge cost.

Several machines were built at the same time, including the Mark I at Harvard University and the ENIAC and EDVAC at the University of Pennsylvania. The first machine to be 'mass-produced' and sold was the UNIVAC in 1951, designed by Dr J Presper Eckert and Dr John Mauchly. IBM started selling systems just two years later.

[Image: Charles Babbage invented the difference engine]

These first-generation computers were both huge and expensive. Their central processors were made from vacuum tubes and, by modern standards, they were slow. Because of the heat from the tubes, they needed huge air conditioners to keep them from burning up. If the air conditioning failed, the computer had to be shut down immediately. Neither the air conditioners nor the computers were very reliable, so only a few of these first-generation systems were sold.

These computers were not totally electronic, either - they used several electromechanical parts (mechanical parts that were operated electrically). One of these was a relay, a mechanical switch that is turned on and off by an electrical device called a solenoid. One day, a relay broke down. When the engineer took it apart, he found a dead insect jammed between the switch contacts, which had stopped them from working. This incident is often cited as the source of the word bug to mean a problem with a computer component, either hardware or software.

Major developments soon followed, and each resulted in a new generation of computers. John Bardeen, Walter Brattain and William Shockley invented the transistor while working at Bell Laboratories in 1947. Transistors replaced vacuum tubes and made the second generation of computers smaller, faster and more reliable. Many were sold to companies. In 1958, Jack Kilby of Texas Instruments developed the first integrated circuit, which combined a number of transistors into one unit.

The first microprocessor was designed by engineers at Intel Corporation in 1971. A microprocessor puts all of a computer's processing circuits onto a single chip. It was this development that finally made the personal computer possible.

[Image: Chips on a hard disk drive controller]

By 1977, several companies were selling personal computers, including Apple, Radio Shack and Commodore. In 1981, IBM released its Personal Computer (PC), which rapidly became a major success. As competition increased, prices came down and computers became more powerful. Soon, personal computers were more powerful than even the largest computers of just ten years earlier.