From stones to Shakespeare: a history of computing

The Bangkok Post Sunday January 10 1988

Two experiences convinced me that the computer had made its mark on the way we think. A few years ago I taught a bunch of six-year-olds how to tell the time. I drew a big chalk circle on the blackboard and started marking off twelve segments to represent a watch. No matter how many times I went over the half hours and the quarter pasts, the little hand and the big hand, the kids never got the point. They could all read numbers. ‘Watches aren’t round anyway,’ a little girl at the back of the class informed me, ‘they’re square.’ Time no longer went round and round: it was green, digital and Japanese.

The second experience was with some teenagers who were studying Julius Caesar. The big corporations in the United States kept sending sample software: Shakespeare by Computer, Basic Julius Caesar on floppy disc. I tried them out on my class. My students would take much more care with their keyboarding than they ever would with simple pen and paper: they had no choice, no grey areas, and no margin of error. When it came to the test they did neither better nor worse than those who had used pen and paper, nor did they particularly enjoy pushing buttons. The method was different but the education remained the same.

We can trace the genealogy of the computer right back to the Stone Age when Fred Flintstone needed to count dinosaurs. One of the first actions of Robinson Crusoe on his island was to mark the number of days on the bark of a tree. Numbers are hard to remember, particularly if you have to add them up, not to speak of multiplication and division, which are complex mental operations even for the best of us.

With the advent of trade, faster, more efficient ways of calculating were needed. Most people, right up to recent times, were completely innumerate. Even educated people, if they had the money, hired somebody else to count it. Nowadays, many have reverted to this situation. Try asking the girl at your supermarket checkout to add up the bill when there’s a blackout.

The Chinese, of course, were in there right from the start, with the abacus, a step up from Fred Flintstone’s pebbles. The abacus, a set of coloured beads on wires, has been in existence for about 5000 years. Take a trip to Chinatown and there are still a few diehards around in the backs of shops, clicking their beads.

In Europe there were countless thousands of clerks and accountants adding and subtracting (and fiddling, no doubt) columns and rows of figures for hours and days on end. They used slide rules and logarithm tables, the latter devised by the Scottish mathematician John Napier. It was a tedious business, fraught with mistakes and hard on the eyesight. The human brain and eye, unlike the computer, suffered from fatigue in the warehouses and ill-lit offices of pre-electricity days. When schoolchildren studied mathematics, log tables were a must for any examination. Now the dilemma is whether to allow super-sophisticated calculators into the exam halls.
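A brief aside, not in the original article: the whole point of Napier’s tables was that logarithms turn multiplication into addition, which is why the clerks put up with them despite the eyestrain. The figures below are illustrative only.

$$\log(a \times b) = \log a + \log b$$

So to multiply 2.5 by 3.2, a clerk would look up $\log 2.5 \approx 0.3979$ and $\log 3.2 \approx 0.5051$, add them to get $0.9030$, and look up the antilogarithm, which gives 8.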

Blaise Pascal, a Frenchman, is credited with the invention of the first adding machine, called, appropriately, the Pascaline. It was all cogs and wheels, grinding and grunting away like a constipated combine harvester. But it could add, and subtract too. Pascal never made a cent out of it, since the number boys in all those offices were afraid they’d lose their jobs to this wonder, and refused to have anything to do with it. The Pascaline had an input device, a calculating device and an output device, all on the decimal system. But the scientists were not yet ready to throw those log tables in the bin.

The decimal system of calculation and notation runs from 0 through to 9 and is written in Arabic numerals. This is simplicity itself when compared with Roman numerals. What’s CCCC multiplied by DCCC? The binary system, operating on only two symbols, 1 and 0, simplified everything, especially for machines that can’t think. The shift from decimal to binary in calculating machines marked the beginning of computers.
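As an illustrative aside (not part of the original article), a few lines of Python make the notation point concrete: in a place-value system, whether decimal or binary, a machine can grind through the arithmetic mechanically, while Roman numerals give it nothing to work with. The figures simply echo the CCCC-times-DCCC example above.

```python
# CCCC times DCCC, done the place-value way: trivial for a machine.
product = 400 * 800
print(product)          # 320000 in decimal notation

# The same quantities in binary, the two-symbol notation computers settled on.
print(bin(400))         # 0b110010000
print(bin(800))         # 0b1100100000
print(bin(product))     # 0b1001110001000000000
```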

The inventor and mathematician Charles Babbage, an Englishman, is considered the father of computers. Born in 1792, he designed another of those number crunchers, called the Analytical Engine. It had a memory of sorts and was a big step forward from the Pascaline. Only pieces of it were ever built: it would have made a lot of noise and been far too big to get in the door. Babbage certainly didn’t have a desktop imagination, but he was brilliant in his insights.

The next advance in the development of the computer came from the unlikely direction of weaving. Punched cards, used to make patterns with thread, could also store algebraic patterns. This marked the beginning of programming. But what all these cumbersome machines needed was a spark of electricity to set them in motion, and for that they had to wait until the end of the 19th century. Once electricity fired the imaginations of scientists there was no end to the machines. ‘The Millionaire,’ developed by a Swiss, was sold in its thousands to governments and to that new economic power, the United States of America. After that came the ‘comptometer’, widely in use until well into the 20th century.

The 1890 census in the United States was completed in six weeks, whereas previously the tabulating of such information had taken years. The improvement was brought about by punched cards: the machine could ‘read’ the holes in them and transform them at lightning speed into numbers. Hollerith’s Census Machine seemed like magic, and everybody knew in a matter of weeks that the population of the United States stood at 62,622,250. For the record, Hollerith started a company called the Tabulating Machine Company in Washington, D.C., and this, over the passage of time, became one of the leaders in the computer race, International Business Machines Corporation, or IBM.

After that it was only a matter of time. A German called Konrad Zuse put an electric calculator together from Meccano parts, using punched 35mm film to hold the programme. It was the Second World War, and he had to smuggle a model out of Germany and into neutral Switzerland.

The Second World War, like the Cold War and the Space Race after it, gave tremendous impetus to the scientific communities in the Allied countries and in Germany to develop electronic machines further. The driving need was to break codes. Competition entered the field of computer research for the first time, and has stayed there ever since. The secret efforts of British scientists brought forth Colossus, the world’s first electronic digital computer, in 1943. Many people believe this machine is what won the war for the Allies. On the German side was the Enigma cipher machine, and for a while cryptography, the science of code-making and code-breaking, was of prime importance.

The development of stored programmes brought the modern idea of a computer one step nearer realisation. The first computer to make use of a stored programme was the Mark 1 at Manchester University, 40 years ago in 1948.

It was the invention of the transistor at Bell Telephone Labs which gave the computer its great leap forward. The years immediately after the war can be seen as a watershed of creativity and technical advance in the field of computer science, but computers were still pricey commodities, at about a million pounds each.

In the Fifties and Sixties competition between the USSR and the US to get monkeys, rockets and men into space required smaller, faster and lighter information systems. Small was beautiful, but also easier to get off the ground and into the stratosphere.

The chip is a comparatively recent invention, yet silicon bits surround us. The machine I’m writing on has some, as has the watch I tell time by. A good number of our daily appliances depend on this common element. Sand is simply silicon dioxide. Movement of particles through treated silicon is now measured in nanoseconds – billionths of a second – and even in picoseconds – millionths of a millionth of a second. It doesn’t bear thinking about.
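To pin the units down (a clarifying note, not in the original):

$$1\ \text{nanosecond} = 10^{-9}\ \text{seconds}, \qquad 1\ \text{picosecond} = 10^{-12}\ \text{seconds}$$

so a picosecond is a thousandth of a nanosecond.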