Turing’s Cathedral: The Origins of the Digital Universe by George Dyson
The history of the universal electronic computer at the Institute for Advanced Study in Princeton, pioneered by the leading genius of his time, John von Neumann, and driven largely by the computational requirements of building a nuclear bomb, makes for a good book. George Dyson’s Turing’s Cathedral is not that book.
At his best, Dyson writes compelling, erudite, witty, and idiosyncratic prose with a gift for poetic analogies and elegant turns of phrase. The opening of chapter XVII, on the vast computational power we have today and in the future, is a good example:
Von Neumann made a deal with “the other party” in 1946. The scientists would get the computers, and the military would get their bombs. This seems to have turned out well enough so far, because, contrary to von Neumann’s expectations, it was the computers that exploded, not the bombs.
Alas, these morsels are thinly spread.
Worse, many of them are abject nonsense.
Dyson seems to have no internal error correction mechanisms that shield him from stretching his analogies far beyond what their flimsy fabric can endure. He combines this infatuation with his own literacy with digital, mathematical, and technological illiteracy: a cavalier attitude towards the details of the technology that he aims to describe and reason about.
Search engines and social networks are analog computers of unprecedented scale. Information is being encoded (and operated upon) as continuous (and noise-tolerant) variables such as frequencies (of connection or occurrence) and the topology of what connects where, with location being increasingly defined by a fault-tolerant template rather than by an unforgiving numerical address. Pulse-frequency coding for the Internet is one way to describe the working architecture of a search engine, and PageRank for neurons is one way to describe the working architecture of the brain. These computational structures use digital components, but the analog computing being performed by the system as a whole exceeds the complexity of the digital code on which it runs. [p. 280]
(The chapter ends with a bold one-liner, “Analog is back, and here to stay.”) It’s unfair to cherry-pick particularly egregious passages from a book about a complicated subject, but the book is full of stuff like that. With the first few instances I tried to adopt a charitable stance and wanted to understand what Dyson is trying to tell me, behind the noise of half-understood technical inaccuracies. But after a few chapters I just became annoyed.
Generously put, the technical passages of this book are inspiring, in the sense that I was inspired to actually find out how the ENIAC and other machines worked. Using other sources, such as Wikipedia, because Dyson’s book does very little to tell me. Dyson is clearly out of his depth here, and his confused and confusing descriptions read like the account of a blind man explaining the concept of colour. The result is dense, conceited, and just plain annoying. As Goodreads reviewer Jenny Brown puts it, “this book is fatally marred by Dyson’s failure to understand computer architecture.” Other reviewers of this book, both professional and amateur, seem to be appropriately humbled and impressed by the opaque technology, and attribute their confusion to their own cognitive inadequacy. Here’s an example from judy’s review: “I stand in awe of the geniuses who envisioned and constructed the digital universe–largely because I haven’t a prayer of understanding what they did. Although written in plain English, somehow my brain will simply not grasp the concepts.” Well, neither will Dyson’s.
Our message should be that computers are simple. Instead, we get yet another book that makes technology into magic, and its inventors into Byronic heroes.
Which leads us to the biographical sketches. The book gives us rich anecdotes about many of the leading figures associated with Princeton’s computer group in the 40s and 50s: Bigelow, Ulam, Gödel, all secondary to the book’s main character, the titanic John von Neumann. Many of these descriptions are entertaining and insightful; they are clearly the best part of the book. Dyson tells much of the story in the voice of others, by quoting at length from interviews and biographies. This works well. However, even these sketches remain disjointed, erratic, meandering, and quirky. Dyson has clearly had access to unique sources at Princeton’s Institute for Advanced Study, which make for the most interesting parts of the book. Examples include endlessly entertaining log messages from ENIAC programmers. On the other hand, I really don’t need to know that Barricelli’s $1,800 stipend was renewed in 1954. These random facts are many, obviously motivated by the author’s access to their sources, and never play a role in the greater narrative.
Even Alan Turing, after whom Dyson’s book is named, makes an appearance in chapter XIII. Otherwise the book has nothing to do with Turing, and little to do with universal computing, so the book’s title remains a mystery.
Von Neumann’s Cathedral would have been a fine title for this book, or better, Von Neumann’s Bomb Bay. This is a book about von Neumann, not Turing, and his monomaniacal dream of building a computer. Von Neumann’s motivation was mainly militaristic: to leverage computational power for the simulation of complex phenomena, such as the physics of nuclear bombs. As such, the early history of computing co-evolves between the computer project in Princeton and the Manhattan Project in Los Alamos. This is a story worth telling, and from that perspective, Dyson’s book is a book worth reading. Just remember to read the technological passages like the techno-babble in a Star Trek episode: they are window-dressing, not there in order to be understood.
The final prophetic chapters about the future of computation and mankind are worthless and incompetent.
In summary, a misleadingly titled, meandering, technologically illiterate, annoying, beautifully written, confused, and sometimes entertaining book about an important topic.