Daybreak of the Digital Age

The world celebrates the man who imagined the computer

By W. Barksdale Maynard ’88
Published in the April 4, 2012, issue


Alan Turing *38 (Photo: © National Portrait Gallery, London)

Just before the Second World War, a Big Idea was detonated: the idea of the computer. The world has never been the same. 

Princetonians played a key role. Here graduate student Alan Turing *38 finalized his landmark paper, “On Computable Numbers,” the light-bulb moment in which humankind first discovered the concept of stored-program computers. And here one of his professors, John von Neumann, would build just such a device after the war, MANIAC at the Institute for Advanced Study, forerunner of virtually every computer on the planet today. 

Of course other people were involved, and other institutions — Penn had its pioneering wartime ENIAC machine — but Turing and von Neumann arguably were the two towering figures in launching computers into the world. As George Dyson claims in his new book on early computing at the Institute, Turing’s Cathedral, “the entire digital universe” can be traced to MANIAC, “the physical realization” of Turing’s dreams. 

June 2012 marks the centennial of Turing’s birth in London, and universities around the world — including Princeton and Cambridge, where Turing did the research that led to his landmark paper — are celebrating with conferences, talks, and even races. Mathematicians are calling 2012 “Alan Turing Year.” Since the year also is the 60th anniversary of the public unveiling of MANIAC, this seems a particularly good time to recall the part Princeton played in the birth of all things digital.

Turing was a 22-year-old math prodigy teaching at Cambridge when, in spring 1935, he decided to further his studies by coming to Princeton. He had just met the University’s von Neumann, a Hungarian-émigré mathematician then visiting England. By the time he encountered Turing, von Neumann was the world-famous author of 52 papers, though only 31 years old.

Turing’s specialty was the rarified world of mathematical logic, and he wanted to study near top expert Alonzo Church ’24 *27, another Princeton professor. The legendary Kurt Gödel was here, too, although nervous breakdowns made him frequently absent. All these geniuses had offices in Fine Hall (today’s Jones Hall), where the Institute for Advanced Study temporarily shared quarters with the University’s renowned math department.

Turing arrived in Princeton in October 1936, moving into 183 Graduate College and living alongside several of his fellow countrymen — enough for a “British Empire” versus “Revolting Colonies” softball game. A star runner, Turing enjoyed playing squash and field hockey and canoeing on Stony Brook. Still, he made few close friends. He was shy and awkward, with halting speech that actor Derek Jacobi imitated in the biopic Breaking the Code. Being homosexual, Turing felt like an outsider.

Soon the postman delivered proofs of Turing’s article for a London scientific journal. The young author made corrections, then mailed it back: “On Computable Numbers,” surely one of the epic papers in history. 

Princeton likes to take some credit — in 2008, a PAW-convened panel of professors named him Princeton’s second-most influential alum, after only James Madison 1771 — but Turing actually wrote the paper at Cambridge. It dealt with mathematical problems similar to those on which Church was working independently. In fact, Church published a paper that took a different approach several months before Turing’s came out — but Turing’s paper contained a great novelty. As he lay in a meadow, he had a brainstorm. He proposed solving math problems with a hypothetical machine that runs an endless strip of paper tape back and forth — writing and erasing the digits zero and one on the paper and thereby undertaking calculations in binary form. The machine was to be controlled by coded instructions punched on the tape.

Previous machines throughout history could perform only one assigned task; each was designed with some fixed and definite job in mind. By contrast, an operator could endlessly vary the functioning of a hypothetical “Turing machine” by punching in new coded instructions, instead of building an entirely new device. Here was a “universal computing machine” that would do anything it was programmed to do; remarkably, the machine proper remained untouched. Thus Turing’s genius lay in formulating the distinction we now describe as hardware (the machine) versus software (the tape with its binary digits).
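
To make the idea concrete, here is a minimal sketch of a Turing machine in modern Python. It is purely illustrative: the states, symbols, and the little bit-flipping “program” below are invented for this article, not drawn from Turing’s paper. But it captures the division he discovered: the interpreter loop (the hardware) never changes, while the rule table and tape (the software) can be swapped at will.

```python
# A minimal Turing machine sketch (illustrative; not Turing's notation).
# The interpreter loop is the "hardware"; the rule table and tape are
# the "software" that can be replaced without touching the loop.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=10_000):
    """rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 for left or +1 for right. The dict `tape` stands in
    for Turing's unbounded paper strip; unwritten cells read as blank '_'."""
    while state != "halt" and max_steps > 0:
        symbol = tape.get(head, "_")                # read the current cell
        write, move, state = rules[(state, symbol)] # look up the instruction
        tape[head] = write                          # write (or erase) a digit
        head += move                                # shuttle the tape one cell
        max_steps -= 1
    return tape

# Example "program": invert a binary string, halting at the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}
tape = {0: "1", 1: "0", 2: "1"}                     # input: 101
out = run_turing_machine(flip, tape)
print("".join(out[i] for i in range(3)))            # prints 010
```

Feeding the same loop a different rule table turns it into a different machine entirely — which is precisely the universality Turing described.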

He never actually built such a machine — for Turing, it remained an intellectual construct — nor explained exactly how the tape would have worked. Nonetheless, his idea was profoundly important. Here begins, in theoretical principles at least, the digital universe as distinct from the old analog one. “The entire Internet,” Dyson writes, “can be viewed as a finite if ever-expanding tape shared by a growing population of Turing machines.” The idea led, a decade later, to the construction of an actual stored-program computer — but not until the paper tape of theory was replaced by lightning-fast electrical impulses.

Church praised his student’s paper and popularized the label “Turing machine.” But Turing kept a certain distance from his professors, with Church recalling years later that he “had the reputation of being a loner and rather odd,” even by rarified Fine Hall standards. When Turing presented “On Computable Numbers” in a lecture to the Math Club in December 1936, attendance was sparse, much to his disappointment. “One should have a reputation [already] if one hopes to be listened to,” Turing wrote to his mother glumly. 

Little could this lonely and dejected Englishman have imagined that one day he would be considered among the most important graduates in the University’s history — so important has “On Computable Numbers” proven to us all.

Von Neumann later would make history by adapting Turing’s ideas to the construction of a physical computer, but the two men formed no particular friendship, despite the proximity of their offices. In personality they were virtually opposites. Von Neumann warmly embraced his adopted country, styling himself as “Johnny”; the reticent Turing never fit in and was shocked by coarse American manners, as when a laundry-van driver once draped his arm around him and started to chat. Von Neumann bought a new Cadillac every year, parking it splendidly in front of Palmer Lab; Turing had difficulty learning to drive a used Ford and nearly backed it into Lake Carnegie. The outgoing von Neumann always wore a business suit, initially to look older than his tender years; the morose Turing looked shambolic in a threadbare sports coat. 

Despite their differences, von Neumann recognized Turing’s brilliance and tried to entice him to stay as his assistant for $1,500 a year after his second year of study was complete. But Turing’s eyes were on war clouds. “I hope Hitler will not have invaded England before I come back,” he wrote to a friend. 

Already looking ahead to military code-breaking, Turing sought some practical experience with machines. A physicist friend loaned him his key to the graduate-student machine shop in Palmer Lab and taught him to use lathe, drill, and press. Here Turing built a small electric multiplier, its relays mounted on a breadboard — a foretaste of the complex machines he soon would use back home to crack Nazi codes.

In May 1938 he defended his Ph.D. dissertation, “Systems of Logic Based on Ordinals” — a paper unrelated to computers but still, according to one historian, “a profound work of first-rank importance” in advancing mathematical logic. Then he sailed home to England, which soon declared war on belligerent Germany. 

Turing worked at top-secret Bletchley Park, where a team built 10 huge electronic digital computers, called Colossus. Turing did not design these, but he recognized them as signposts pointing to the digital future — though they had no stored program. Each Colossus inhaled paper tape at a stunning 30 miles an hour, processing 63 million characters in total before the collapse of the Third Reich. 

Turing ought to have become a national hero for his ingenious code-breaking at Bletchley Park — historians say it helped to shorten the war by as much as two years — but the existence of Colossus remained a secret for decades.

War changed the future for von Neumann, too. Famed for his contributions to pure math, he now was transformed, paradoxically, into the most practical of applied scientists. Consulting for the U.S. Army Ordnance Department even before fighting began, he studied the complex behavior of blast waves produced by the detonation of high explosives. Eventually he was helping to build an atomic bomb.

Since no atom bomb had ever been attempted, scientists needed to model how one might work. This required innumerable calculations. At Los Alamos, roomfuls of clerks tapped on desk calculators and shuffled millions of IBM punch cards. After two weeks there, in spring 1944, von Neumann was dismayed by the slow progress. What was needed was computation at electronic speeds.

Such swiftness was promised by ENIAC (Electronic Numerical Integrator and Computer), a project to build an all-digital, all-electronic device to calculate shell trajectories for Army Ordnance at Penn. Von Neumann watched its progress with fascination but dreamed of something even more advanced: a true stored-program computer, a Turing machine.

 
5 Responses to Daybreak of the Digital Age

Jon Edwards '75 Says:

2012-03-30 13:29:16

Wonderful article ... I hope that all interested in Turing and his legacy consider attending Princeton's Turing Centennial May 10-12. See www.princeton.edu/turing

Josh Dranoff *56 *60 Says:

2012-04-03 16:25:17

As a grad student in the mid-1950s, I was fortunate to have access to the MANIAC computer as part of my thesis research. This excellent article brought back many fond memories.

Andrew Horchler '00 Says:

2012-04-09 09:32:06

Fascinating article. Looking around the Web, I'm not sure the computer at Princeton was ever referred to as MANIAC. Can anyone confirm this? I see that it is often referred to as the IAS (for Institute for Advanced Study) or IAS Machine. The machine that is commonly called MANIAC, or MANIAC I, was inspired by this first "von Neumann machine" and existed at Los Alamos Labs: http://en.wikipedia.org/wiki/IAS_machine In any case, I hope this helps those looking for more information on the machine at Princeton (now in the Smithsonian) and not necessarily on the later derivatives.

Jerry Porter '58 Says:

2012-04-17 12:08:53

During my junior and senior years at Princeton, 1956-58, I managed a group of Princeton students who were the night operators on the Institute machine. Night meant something like 5-11. The computer had 32 CRTs for memory, and we had an oscilloscope that could tune in on the 32 by 32 bit grid on any of the tubes. Our most important task was to make sure that no one bit "lit up," because if it did (for example, if the program was in a very tight loop), it could burn out that bit in all 32 tubes and that would be a disaster. I used that computer to do calculations on my senior thesis, and I suspect that it was one of the first Princeton senior theses to use a digital computer. My classmate, Ned Irons, tells me that he also used the machine for his thesis, but I was unaware of that at the time. I did have to get my thesis done in a timely way, since the machine was disassembled for shipment to the Smithsonian shortly thereafter. It was very exciting many years later to visit the Smithsonian and see the desk I used to sit at.

Howard Robbins '57 Says:

2012-04-23 11:05:05

I was one of a half-dozen or so juniors or seniors who actually wrote and ran little programs for this machine. There was, of course, no such thing as programming or computer science classes back then, nor high-level languages or even assemblers - the programs were in machine language coded in binary, on IBM punched cards! I can't remember who was in that group or who the poor grad student was who was our instructor. The experience taught me enough about it to avoid programming for the rest of a very gratifying career in engineering. Does anybody else remember that class 56 years ago?
Related stories
January 23, 2008
Special issue: Princeton’s most influential alumni