George Dyson's Turing's Cathedral: The Origins of the Digital Universe is the best book I've read on the origins of the computer. Turing's Cathedral does for the entire history of computing what Tracy Kidder's The Soul of a New Machine did for the computing of the 1980s, simultaneously synthesizing and reappraising what's known about the most world-changing invention of the last century and the people who invented it. Its main discovery: We owe the computer to that other great twentieth-century invention, the nuclear bomb. They would never have been invented without one another.
Dyson's book is not only learned, but brilliantly and surprisingly idiosyncratic and strange. The first part of the book centers on the invention of MANIAC (an early machine used at Los Alamos -- the Mathematical Analyzer, Numerical Integrator, and Computer). MANIAC's architecture drew on years of work by a huge number of scientists, but it was shaped most decisively, Dyson argues, by a team of geniuses led by John von Neumann at the Institute for Advanced Study in Princeton, NJ. Simply because it's fascinating, apparently, Dyson chooses to begin his account of this work in the 1600s -- not just with the founding of the town of Princeton (it sprang up, he explains, around the Greenland Tavern, which stood at the mid-point of the wagon road between Philadelphia and New York), but with the founding of Pennsylvania, New Jersey, and New York in the first place.
Von Neumann is, ultimately, the central figure of the book: Although it was Turing's theory of computability which underwrote the whole computing project, it was von Neumann's wide-ranging genius which made it possible to replace huge offices full of human "computers" crunching numbers with a machine. Von Neumann is widely regarded as one of the smartest men of the twentieth century; his modus operandi, Dyson writes, was to "approach a subject, identify the axioms that made it tick, and then, using those axioms, extend the subject beyond where it was when he showed up." After revolutionizing set theory in his dissertation and inventing game theory, among other achievements, von Neumann turned his sights on computing.
Dyson's history isn't just about ideas. His book shows just how much the invention of the computer depended on the material facts of history, especially the Second World War and the nuclear program. It was the war which pushed von Neumann, who was Hungarian (his real name was Neumann János Lajos), along with many other gifted mathematicians, physicists, and engineers, to come to the United States. The Institute for Advanced Study could host them only because it was bankrolled by a Newark family, the Bambergers, who had made their money in department stores. Even von Neumann's facility with the punch cards used by early computers is illuminated by an almost accidental historical fact: when von Neumann was a child, his father, a banker with industrial interests, had brought home parts from an industrial Jacquard loom and explained how it worked over dinner. The loom was controlled with punch cards.
The destruction unleashed by the war brought von Neumann and his team together in America, and the need for the ultimate destructive weapon was what convinced the military to fund computer research (a computer was needed to model the nuclear explosion). The scientists involved knew that they were working on both inventions at once. As one physicist explains to Dyson: "The military had the need and the money but they didn't have the genius. And Johnny von Neumann was the genius. As soon as he recognized that we needed a computer to do the calculations for the H-bomb, I think Johnny had all of this in his mind."
No history of the invention of the computer is yet truly complete: It was such a vast, international project that important people are always getting left out of the account. Any given history tends either to emphasize mathematicians at the expense of engineers or vice versa (see, for example, this informed critique of Dyson). Yet Dyson's digressive book comes closest, I think, to capturing the whole of the effort. It took world-scale events, he argues, to push this world-scale project to completion. The computer was invented at a historically unique time. That's why, Dyson argues, we're still using the same "von Neumann architecture" almost seven decades later.
Joshua Rothman is a graduate student and Teaching Fellow in the Harvard English department, and an Instructor in Public Policy at the Harvard Kennedy School of Government. He teaches novels and political writing.