The Invention of Computers — Part I
Computers evolved out of special-purpose calculation devices. Although it may be hard to imagine, the abacus is one such device. Abaci are as old as civilization itself, yet they remain in use today; with practice, a skilled operator can perform even complex arithmetic remarkably quickly. The abacus is a testament to practical problem solving, and it demonstrates the utility of using devices to perform calculations. The abacus grew out of a particular need: performing arithmetic. As mathematics became more sophisticated, practical needs changed. Mathematicians once published books full of number tables pre-calculated for specific purposes. For example, in 1614, John Napier published a 147-page book that included ninety pages of number tables useful for calculating the positions of planets. These types of number tables became increasingly important, and mathematicians and inventors set about building various machines and devices to speed the process of creating them.
By the time of the Industrial Revolution, number tables were being used to design machine tools and to run factories. To produce these tables, inventors built calculation machines using a wide array of newly available, high-quality machine parts, such as ratchets and gears. These machines had immediate commercial value, but each was limited to performing particular types of calculations. Although scientists developed ingenious schemes to make calculation machines more flexible, no general theory of computation was known.
One English scientist, ahead of his time, did tackle the problem of general computation. In 1837, Charles Babbage described a general-purpose computing device, which he called the “Analytical Engine”. It is recognized as the first ever specification of a true computer – a machine capable of any type of calculation. Remarkably, the Analytical Engine was programmable even though it was designed around 19th-century machine components.
Babbage’s work was popularized by the mathematician and mystic Augusta Ada King, commonly known as Ada Lovelace. Babbage and Lovelace corresponded extensively, exchanging numerous letters about the Analytical Engine and other calculating machines. In these letters, Lovelace described both computers and computer software, and gave an algorithm for the Analytical Engine that computed a sequence of mathematically important numbers called Bernoulli numbers. Lovelace, however, recognized that general-purpose computing would have applications well beyond calculating mathematical tables. She put forward the idea that computers could use a mathematical model to compose elaborate musical compositions. Lovelace was correct, and today computers are capable of creating completely novel musical compositions in a variety of styles.
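Lovelace’s program was written for the Analytical Engine’s own operations, so the sketch below is only a rough modern illustration, not her actual method: it computes the same Bernoulli numbers in Python using the standard recurrence for the sum of binomial-weighted terms (the function name `bernoulli` is our own).

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (B_1 = -1/2 convention),
    using the recurrence  sum_{j=0}^{m} C(m+1, j) * B_j = 0."""
    B = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-s, m + 1))       # solve the recurrence for B_m
    return B

print(bernoulli(8)[8])  # -> -1/30
```

Exact rational arithmetic via `Fraction` avoids the rounding errors a floating-point version would accumulate, much as the Analytical Engine was to work with exact digits.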
Unfortunately, the Analytical Engine fell into obscurity soon after Babbage’s death in 1871. Both industry and academia overlooked Lovelace and Babbage’s shared conceptual leap, and the invention of the computer would have to wait almost a century. By the late 1930s, computers were a natural extension of the sophisticated calculating machines of the day, and they were independently invented in three different countries. Both Britain and the United States made foundational contributions to computing; however, the first true computer was built by a German engineer working in near-complete isolation. In 1941, in Berlin, Konrad Zuse completed the Z3, the third revision of his calculating machines. Zuse was primarily concerned with practical calculation problems, such as measuring the surface area of airplane wings, but he approached these problems in a principled way that separated the calculation logic from the machinery that performed it. His machines were built out of recycled telephone equipment, and they sported numerous innovations: the very apotheosis of practical problem solving. It took centuries of innovation, and a few false starts, but the computer had finally arrived, and it proved to be spectacularly useful.
The Invention of Computers — Part II
A lone German engineer built the first true computer during World War II; however, his work was stymied by isolation and under-appreciated by the German leadership. This was not the case across the English Channel. In the early stages of the war, Britain was engaged in a desperate struggle for survival. The British attempted to gain an operational advantage over their opponents by intercepting and decrypting enemy communications. Code breaking requires a combination of ingenuity and millions of statistical calculations. Although computers did not yet exist in Britain, engineers and mathematicians successfully defeated German encryption by designing and building enormous calculation machines. These machines were too specialized to be true computers, but their designers would become some of the world’s first computer scientists.
One such scientist was Alan Turing, often lauded as the father of computing and of artificial intelligence. Turing was a mathematician who designed techniques to break German ciphers. An accomplished marathon runner, Turing sometimes ran the 64 km from his workplace at Bletchley Park to head office in London in order to attend high-level war meetings. In 1945, Turing became an Officer of the Order of the British Empire for his scientific contributions to code breaking, which included designing code-breaking machines that deciphered tens of thousands of enemy communications. Turing’s most notable accomplishment, however, was proving that any mathematical algorithm can be expressed and computed by a machine that reads and writes to a tape of 1s and 0s. This type of hypothetical machine is called a “Turing machine”, and it is the mathematical foundation for modern computing. The term “Turing-complete” refers to a machine’s ability to execute any calculation, as opposed to machines limited to particular tasks. Turing’s genius is celebrated around the world each year with the Turing Award, the computing world’s equivalent of the Nobel Prize.
Even though Turing developed the theory of what a computer is, and designed some of the most sophisticated calculation machines ever built, the British never built a “Turing-complete” computer during the war – they were focused on code breaking. The Allies did build a true computer, however, and it was the harbinger of a new era of calculation. Between 1943 and 1946, US scientists at the University of Pennsylvania designed and built the first ever fully electronic programmable computer. It was called ENIAC, and it was about one thousand times faster than any previous calculating machine. Being a true computer, it was superbly flexible, and it performed numerous important calculations, even whilst under construction, including early calculations related to the design of the hydrogen bomb.
The early pioneers who worked on these computers understood their significance beyond military applications. Popular culture already referred to these machines as “Brains”, and ENIAC was heralded as a “Giant Brain”. Scientists and artists had long stoked the public imagination with dreams of synthetic intelligence, and now anything seemed possible. The remarkable usefulness of computers was already well established, and many scientists, including Turing, now turned their attention to solving the mystery of intelligence – perhaps the grandest mystery of all. No one foresaw the strange limitations that computers would have in this regard, but the birth of computers did lead to a deeper understanding of who we are, and ushered in a new age of scientific inquiry.