Few nouns represent modern life as well as "computer." For better or worse, computers touch nearly every aspect of our everyday lives. They have evolved from performing strict computation to completing once unthinkable tasks: supermarket scanners calculate our grocery bills while tracking store inventory; computerized telephone switching centers play traffic cop to millions of calls and keep lines of communication untangled; and automatic teller machines (ATMs) let us conduct banking transactions from virtually anywhere in the world. This technology began over five thousand years ago and continues to grow with no end in sight.
Around five thousand years ago in Asia Minor, a simple machine appeared bearing a system of sliding beads arranged on a rack, much like the racks found in a pool hall; it may be considered the first computer. Known as the abacus, it is still in use today. Merchants used the abacus to record their barter transactions, but its importance diminished as the use of paper and pencil spread, particularly throughout Europe. The next significant advance in computing did not come until the seventeenth century, with a man named Blaise Pascal.
Pascal was the eighteen-year-old son of a French tax collector when, in 1642, to ease his father's duties, he assembled a brass rectangular box called the Pascaline, which used eight movable dials to add sums up to eight figures long. Pascal's system was based on the number ten: as one dial passed nine, it returned to zero and the next dial turned to represent one in the tens column. The Pascaline's only drawback was its limitation to addition.
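The carry mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not a model of the actual gearing: eight dials each hold a digit from 0 to 9, and when a dial passes nine it returns to zero and turns the next dial by one.

```python
def pascaline_add(dials, amount):
    """Add `amount` to the number held on the dials (least significant dial first)."""
    carry = amount
    for i in range(len(dials)):
        total = dials[i] + carry
        dials[i] = total % 10   # the dial passes nine and returns to zero
        carry = total // 10     # ...turning the next dial along
    return dials

dials = [0] * 8                 # eight dials, all set to zero
pascaline_add(dials, 995)
pascaline_add(dials, 7)
# dials now hold 1002, least significant digit first: [2, 0, 0, 1, 0, 0, 0, 0]
```

Note how adding 7 to 995 ripples a carry across three dials, just as the physical machine propagated a turn from one gear to the next.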
A German mathematician and philosopher named Gottfried Wilhelm von Leibniz improved on the Pascaline in 1694 by inventing a machine that could multiply as well as add. After studying Pascal's notes and drawings, Leibniz kept the idea of dials and gears but refined the design, using a stepped-drum gear rather than Pascal's flat gears. Even so, the mechanical calculator did not come into widespread use until around 1820.
Charles Xavier Thomas de Colmar was another inventor whose efforts advanced the evolution of the simple computer. A Frenchman, Colmar invented the Arithmometer, a machine that offered a more practical approach to computing by performing all four basic arithmetic functions: addition, subtraction, multiplication, and division. This enhanced versatility kept the Arithmometer popular up until World War I. Together with his predecessors Pascal and Leibniz, Colmar helped define the age of mechanical computation.
A British mathematics professor named Charles Babbage laid the groundwork for our present-day computer. It is said that the automation of computing began with one simple exclamation: "I wish to God these calculations had been performed by steam!" Babbage believed there existed a strong affinity between machines and mathematics: machines were best at performing flawless tasks in repetition, while mathematics often required the unpretentious repetition of steps. By applying the capabilities of machines to the demands of mathematics, he hoped to refine the calculating machine into something far more evolved and elaborate. In 1822 Babbage began his first attempt, the Difference Engine, designed to compute mathematical tables by the method of finite differences. The Difference Engine was to be powered by steam, was the size of a car, and would print its results automatically. Babbage finally abandoned the project after ten years of hard work, turning his time instead to the invention of the first general-purpose computer, which he called the Analytical Engine.
The Analytical Engine was also never built, but Babbage's design outlined the basic elements of the modern computer. It called for over 50,000 components, including small perforated cards containing operating instructions. These cards could "store" 1,000 numbers of up to fifty decimal digits each. The engine also included a "mill," where calculations were carried out, and output devices to print the results. Babbage borrowed the idea of punched cards from Joseph-Marie Jacquard, whose Jacquard loom, developed in the early 1800s, used punched boards to control weaving patterns.
Computers were seen as a way to break large workloads into discrete tasks. The United States census of 1880 took seven years to tally, and fearing that later censuses would take even longer to count, the Census Bureau turned to technology. An American inventor named Herman Hollerith also applied the Jacquard loom concept to computing. Rather than use cards for operating instructions, as Babbage had, Hollerith used cards to store data, which he fed into a machine that compiled the results mechanically. Punched holes in the cards represented letters and numbers: a single hole depicted a number, while a combination of two holes portrayed a letter. This contrivance allowed the bureau to enumerate the census results in six weeks. Not only did Hollerith's machine remarkably decrease the time the census took, but the cards also served as stored records of the census and reduced computational errors. Hollerith carried his machine into the business world, founding the Tabulating Machine Company in 1896, which later became International Business Machines (IBM) in 1924. From this point in history, the evolution of the computer became an area of ever-increasing interest.
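The encoding scheme the paragraph describes, one hole for a digit, a pair of holes for a letter, can be sketched as follows. The mapping here is invented purely for illustration and is not the actual Hollerith card code.

```python
import string

def encode(symbol):
    """Return the punched-hole positions for one symbol (illustrative mapping only)."""
    if symbol.isdigit():
        return (int(symbol),)                   # a single hole: the row is the digit
    idx = string.ascii_uppercase.index(symbol)  # a letter needs two holes:
    return (idx // 10, 10 + idx % 10)           # a zone row plus a digit row

print(encode("7"))  # one hole
print(encode("C"))  # two holes
```

The point of the scheme is economy: ten rows suffice for any digit, and pairing a zone hole with a digit hole extends the same rows to cover the alphabet.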
The first major surge of interest came with the onset of World War II. Governments, eager to use computers to assist in warfare, increased the funding of computing projects and so spurred technical progress. By 1941, German engineer Konrad Zuse had developed a computer, the Z3, to help design airplanes and missiles. The British also pursued computer technology, completing a secret code-breaking computer called Colossus that was used to decode German messages. The machine's existence was not revealed until decades after the war.
In 1944, a Harvard engineer named Howard H. Aiken, working with IBM, completed a large electromechanical calculator. It was roughly half the length of a football field and contained some 500 miles of wiring. Called the Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, the machine took three to five seconds per calculation but could compute complex equations.
The Electronic Numerical Integrator and Computer (ENIAC), developed by John Presper Eckert and John W. Mauchly, was produced in a partnership between the United States government and the University of Pennsylvania. The ENIAC contained 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, and the massive machine consumed 160 kilowatts of electrical power. Unlike the Colossus and the Mark I, the ENIAC was a general-purpose computer, able to compute at speeds up to 1,000 times faster than the Mark I.
Since the ENIAC, computers have grown ever more complex, and what was once the size of a football field now fits on a fingernail. The evolution of the computer has taken giant leaps since the start of the twentieth century and continues apace. It took ancient scientists, mathematicians, and philosophers thousands of years to improve even slightly on the abacus. Today the world relies on computers for nearly everything, and without these people who dedicated their lives to the advancement of computing, the world would not be anything like it is today.