7 Questions for Walter Isaacson


Up until now, Walter Isaacson’s doorstop biographies have hinged on the seminal insights of Great Men: Ben Franklin’s kite-and-key experiment, Einstein’s mass-energy equivalence, Steve Jobs’s vision of the personal computer. His latest book, The Innovators: How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution (Simon & Schuster), focuses less on these “lightbulb moments of lone genius” than on the teams, partnerships and collective discoveries that led to the Internet age. Isaacson, who also serves as CEO of the Aspen Institute, speaks here about his lifelong geekdom, his predictions for a technologically enhanced future and what our smartphones owe to Lord Byron’s daughter.

How closely did you follow the developments of the digital age as they were happening?

My father and my uncles were all electrical engineers, and I liked to build ham radios, sort transistors and solder circuits in our workshop at home. When I was working at the New Orleans Times-Picayune, I was the first to use a computer to compose my stories. And when I went to Time magazine and was covering the political campaigns, I carried around a heavy modem so I could connect my computer to online services and send and receive information electronically. So I think I felt the excitement of both computers and the Internet with every major advance.


In writing this sweeping history, did you occasionally feel compelled to focus on just one character? 

I had the urge to write a bio of Ada Lovelace. She was Lord Byron’s daughter and loved poetry, but her mother made sure she was tutored mainly in mathematics, so she stood at that intersection of the humanities and technology. She comes up with the notion of what a computer can do, and she realizes that the goal is to create partnerships between humans and machines. But one of the key things about Lovelace is that she didn’t do things alone; she worked with a guy named Charles Babbage. So I decided it was better to write about the entire digital revolution and show how the sharing of ideas made people more creative. 


Do you think the Digital Revolution was inevitable?

Like any great innovation, it happened because the right seeds fell on fertile ground. A whole lot of creative advances had occurred and came together: the notion that all information could be conveyed in binary bits was one of them, but so, too, was the idea of the transistor. And you had the demands of wartime, breaking the German codes and calculating the trajectories of missiles, which caused a lot of government funding to come along in the 1940s. So those and dozens of other causes came together to create the computer revolution and the Internet.


Why do you think it was a primarily American phenomenon? 

The computer revolution happened simultaneously in America, Britain and Germany, but after the war you had a creative environment in the United States in general, and in Silicon Valley in particular, that caused most of the major advances to happen in the U.S. It’s partly because there was a partnership here between government, business and universities to do the basic research that helped lead to the Internet. Likewise, a venture-capital community has grown up in the U.S. and encouraged people to take risks. Also, innovation is directly connected to individual and group freedom, and economies that encourage the free expression of ideas and a free market tend to do better.


In the book you mention, offhand, that de Tocqueville misunderstood American individualism. What do you mean by that?

Americans tend to be very individualistic, but they also come together to form associations and collaborate easily. De Tocqueville thought those were contradictory strands. But there’s something magical about America, going back to the days of quilting bees and barn raisings: a society that could be both individualistic and collaborative. It’s part of the American character, and you see it in the open-source software movement, but also in people getting together to launch startups.


The end of your book discusses artificial intelligence—you don’t seem overly worried about the prospect of an A.I. “singularity.”

Lovelace said that machines would never be able to think on their own, that humans would always provide the imagination and creativity. Alan Turing called that Lady Lovelace’s Objection and proposed what’s known as the Imitation Game, a test of whether a machine’s answers could be distinguished from a human’s. He believed that machines would eventually pass it. The lesson of the past 60 years has been that the combination of human creativity and machine processing power has always grown faster than machine intelligence alone. So I think that the future belongs to those who can create partnerships between humans and machines.


You seem to have a pretty rosy vision of this technologically enhanced future. 

Lord Byron was a Luddite. Literally. He defended the Luddites who were smashing looms in the England of the Industrial Revolution, because he thought the machines were putting people out of work. People have always worried that technology would reduce the number of jobs, but history indicates that this has never been the case: technology leads to more creativity and more productivity, and that in turn creates new types of jobs.
