
Apr. 4, 2012

Vol. 112, No. 10

Features

Daybreak of the Digital Age

The world celebrates the man who imagined the computer

By W. Barksdale Maynard ’88
Published in the April 4, 2012, issue


John von Neumann with MANIAC in 1952. The shiny cylinders contained random-access memory, visible as flickering phosphor through the holes.
PHOTO: ALAN RICHARDS PHOTOGRAPHER

To get ENIAC to change tasks, its handlers had to reset it manually by flipping switches and unplugging thousands of tangled cables. It could take days to rearrange its hardware for a problem that then took just minutes to compute. Inspired by Turing — whose “On Computable Numbers” he constantly recommended to colleagues — von Neumann began to conceptualize the design of a computer controlled by coded instructions stored internally. 

To usher in the brave new world of Turing machines, von Neumann audaciously proposed that the Institute for Advanced Study build one itself, on its new campus beyond the Graduate College. He envisioned an “all-purpose, automatic, electronic computing machine” with stored programs: “I propose to store everything that has to be remembered by the machine, in these memory organs,” he wrote, including “the coded, logical instructions which define the problem and control the functioning of the machine.” This describes the modern computer exactly.

Von Neumann’s suggestion of building some kind of mechanical apparatus on the Institute grounds was greeted with dismay by many of the aloof intellectuals there, horrified by the thought of greasy mechanics with soldering guns. Nearby homeowners complained about potential noise and nuisance. But the Electronic Computer Project went ahead anyway, starting in November 1945 with ample funding from the military, plus additional contributions from the University and other sources. Young engineers were lured with a promise of free enrollment as Princeton Ph.D. students.

Called MANIAC, for Mathematical Analyzer, Numerical Integrator, and Computer, it was meant to improve in every way upon ENIAC. The Penn machine had 17,500 vacuum tubes, each prone to fizzling; the Institute’s, only 2,600. ENIAC was 100 feet long and weighed 30 tons; MANIAC was a single 6-foot-high, 8-foot-long unit weighing 1,000 pounds. Most crucially, MANIAC stored programs, something ENIAC’s creators had pondered but not attempted.

Assembly of the computer — from wartime surplus parts — began in the basement of Fuld Hall at the Institute; in early 1947, the project moved to a low, red-brick building nearby, paid for by the Atomic Energy Commission (the building now houses a day-care center). Not for six years would MANIAC be fully operational. The design choices von Neumann and his team made in the first few months reverberate to this day. 

For example, they chose to use Turing’s binary system (0s and 1s) instead of a decimal system, and collaborator John Tukey *39, a Princeton professor, coined the term “bit.” So vast was their influence that the internal arrangement of today’s computers is termed the von Neumann architecture. 

Von Neumann wanted MANIAC to jump-start a computer revolution, transforming science by solving old, impossible problems at electronic speeds. To maximize its impact upon the world, he eschewed any patent claims and published detailed reports about its progress. “Few technical documents,” Dyson writes, “have had as great an impact, in the long run.” 

Seventeen stored-program computers were soon built around the world following its specifications, including the identically named MANIAC at Los Alamos and the first commercially available IBM machine. 

Controlled mysteriously from inside instead of outside, MANIAC seemed to many observers uncannily like an electronic brain. The great breakthrough was the set of 40 cylinders that surrounded its base like a litter of piglets. In an ingenious technical achievement, these cathode-ray tubes (similar to those coming into use for television) provided the world’s first substantial random-access memory. 

One could lean over and literally watch the 1,024 bits of memory flickering on a phosphorescent screen on top of each tube, a display Dyson calls the genesis of the whole digital universe. Such tubes had been perfected at Manchester University, England, where Turing was a consultant.

“The fundamental conception is owing to Turing,” von Neumann said of MANIAC. A decade earlier, the young Brit had proposed a tape crawling by with numbers on it; now MANIAC flashed at incredible speed the electronic equivalent of zeros and ones in glowing phosphor. 

By our standards, MANIAC may seem a modest achievement: As Dyson notes, the computer’s entire storage (five kilobytes) is less than the memory required by a single icon on your laptop today. No one yet had invented a modern programming language; just to do the equivalent of hitting the backspace key, science writer Ed Regis says, meant precisely coding in something like 1110101. 
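The five-kilobyte figure follows directly from the memory hardware described above. A quick back-of-envelope sketch (tube and bit counts as stated in the article, not from primary sources):

```python
# Rough check of MANIAC's total random-access memory, using the
# figures in the article: 40 cathode-ray tubes, 1,024 bits apiece.
TUBES = 40            # Williams-tube memory cylinders around the base
BITS_PER_TUBE = 1024  # each tube held a 32 x 32 grid of charge spots

total_bits = TUBES * BITS_PER_TUBE
total_bytes = total_bits // 8  # 8 bits per byte

print(total_bits)   # 40960 bits
print(total_bytes)  # 5120 bytes, i.e. five kilobytes
```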

And the computer broke down frequently — all 40 memory tubes had to be working perfectly at once. “The sensitivity of the memory, that was a big problem,” recalls UCLA professor emeritus Gerald Estrin, who was hired by von Neumann in 1950 to design the input-output device, a paper-tape reader. “If there was a storm with lightning, you would feel it in loss of bits. We spent many nights on the floor trying to tune it up.” 

5 Responses to Daybreak of the Digital Age

Jon Edwards '75 Says:

2012-03-30 13:29:16

Wonderful article ... I hope that all interested in Turing and his legacy consider attending Princeton's Turing Centennial May 10-12. See www.princeton.edu/turing

Josh Dranoff *56 *60 Says:

2012-04-03 16:25:17

As a grad student in the mid-1950s, I was fortunate to have access to the MANIAC computer as part of my thesis research. This excellent article brought back many fond memories.

Andrew Horchler '00 Says:

2012-04-09 09:32:06

Fascinating article. Looking around the Web, I'm not sure the computer at Princeton was ever referred to as MANIAC. Can anyone confirm this? I see that it is often referred to as the IAS (for Institute for Advanced Study) or IAS Machine. The machine that is commonly called MANIAC, or MANIAC I, was inspired by this first "von Neuman machine" and existed at Los Alamos Labs: http://en.wikipedia.org/wiki/IAS_machine In any case, I hope this helps those looking for more information on the machine at Princeton (now in the Smithsonian) and not necessarily on the later derivatives.

Jerry Porter '58 Says:

2012-04-17 12:08:53

During my junior and senior years at Princeton, 1956-58, I managed a group of Princeton students who were the night operators on the Institute machine. Night meant something like 5-11. The computer had 32 CRTs for memory, and we had an oscilloscope that could tune in on the 32 by 32 bit grid on any of the tubes. Our most important task was to make sure that no one bit "lit up," because if it did (for example, if the program was in a very tight loop), it could burn out that bit in all 32 tubes and that would be a disaster. I used that computer to do calculations on my senior thesis, and I suspect that it was one of the first Princeton senior theses to use a digital computer. My classmate, Ned Irons, tells me that he also used the machine for his thesis, but I was unaware of that at the time. I did have to get my thesis done in a timely way, since the machine was disassembled for shipment to the Smithsonian shortly thereafter. It was very exciting many years later to visit the Smithsonian and see the desk I used to sit at.

Howard Robbins '57 Says:

2012-04-23 11:05:05

I was one of a half-dozen or so juniors or seniors who actually wrote and ran little programs for this machine. There was, of course, no such thing as programming or computer science classes back then, nor high-level languages or even assemblers - the programs were in machine language coded in binary, on IBM punched cards! I can't remember who was in that group or who the poor grad student was who was our instructor. The experience taught me enough about it to avoid programming for the rest of a very gratifying career in engineering. Does anybody else remember that class 56 years ago?