Everybody, and especially everybody at an engineering school like Stevens, should know who Claude Shannon was. Shannon, who lived from 1916 to 2001, was an electrical engineer and mathematician. In 1948, while at Bell Laboratories, he published “A Mathematical Theory of Communication,” which laid the foundation for the digital era. Robert Lucky, a former executive director of research at Bell Labs, has called Shannon’s paper the greatest “in the annals of technological thought.”
Shannon has never been as well-known as he should be. But he is now the subject of a terrific new biography, A Mind at Play: How Claude Shannon Invented the Information Age, by Jimmy Soni and Rob Goodman. Soni, a journalist, is giving a talk at Stevens on Wednesday, November 8, in Babbio Auditorium, 4-5 p.m. To whet your appetite, below are excerpts from my interview with Shannon in 1989 at his home near Boston.
Horgan: When you started working on information theory, did you have a specific goal in mind?
Shannon: My first thinking about it was: How do you best forward transmissions in a noisy channel… [like] a telegraph system or telephone system. But when I began thinking about that, you begin to generalize in your head all of the broader applications. So almost all of the time, I was thinking about them as well. I would often phrase things in terms of a very simplified channel. Yes or no’s or something like that. So I had all these feelings of generality very early.
Horgan: I read that [physicist] John von Neumann suggested you should use the word “entropy” as a measure of information because no one understands entropy and so you can win arguments about your theory.
Shannon: It sounds like the kind of remark I might have made as a joke… Crudely speaking, the amount of information is how much chaos there is in the system. But the mathematics comes out right, so to speak. The amount of information measured by entropy determines how much capacity to leave in the channel.
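[For readers curious about the mathematics Shannon is alluding to: in his 1948 paper, the entropy of a source emitting symbols with probabilities p_i is

```latex
H = -\sum_i p_i \log_2 p_i \quad \text{(bits per symbol)}
```

and his noisy-channel coding theorem shows that information can be transmitted with arbitrarily low error at any rate below the channel’s capacity, which is why entropy “determines how much capacity to leave in the channel.”]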
Horgan: Were you surprised when people tried to use information theory to analyze the nervous system?
Shannon: That’s not so strange if you make the case that the nervous system is a complex communication system, which processes information in complicated ways… Mostly what I wrote about was communicating from one point to another, but I also spent a lot of time transforming information from one form to another, combining information in complicated ways, which the brain does and the computers do now. So all of these things are kind of a generalization of information theory.
Horgan: John Pierce [an electrical engineer and friend of Shannon] once said that your work could be extended to include meaning.
Shannon: Meaning is a pretty hard thing to get a grip on… In mathematics and physics and science and so on, things do have a meaning, about how they are related to the outside world. But usually they deal with very measurable quantities, whereas most of our talk between humans is not so measurable. It’s a very broad thing which brings up all kinds of emotions in your head when you hear the words. So, I don’t think it is all that easy to encompass that in a mathematical form.
Horgan: Do you worry that machines will take over some of our functions?
Shannon: The machines may be able to solve a lot of problems we have wondered about and reduce our menial labor problem… If you are talking about the machines taking over, I’m not really worried about that. I think so long as we build them, they won’t take over.
Horgan: Did you ever feel any pressure on you, at Bell Labs, to work on something more practical?
Shannon: No. I’ve always pursued my interests without much regard for financial value or value to the world. I’ve been more interested in whether a problem is exciting than what it will do. … I’ve spent lots of time on totally useless things.
John Horgan directs the Center for Science Writings, which is part of the College of Arts & Letters. This column is adapted from one originally published on his ScientificAmerican.com blog, “Cross-check.”