Shannon bypassed others' attempts to work with specific kinds of information: text, numbers, images, sounds, and so on. He also decided not to work on any single way of transmitting information, whether along a wire, as sound waves through the air, as radio waves, or as microwaves.
Instead, Shannon decided to focus on a question so basic that no one had thought to study it:
What is information? What happens when information travels from sender to receiver?
Shannon’s answer was that information consumed energy and, upon delivery, reduced
uncertainty. In its simplest form (an atom or a quantum of energy), information answered a
simple yes/no question. That answer reduced (or eliminated) uncertainty. Flip a coin. Will it
be heads or tails? You don’t know. You are uncertain. When it lands, you get information:
yes or no. It was heads or it wasn’t. Uncertainty is gone. That’s information.
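To put the coin-flip idea in concrete terms, here is a minimal Python sketch (not from the original text; the function name bits_needed is purely illustrative) showing that one fair coin flip is worth exactly one yes/no answer, and that richer choices take more of them:

    import math

    def bits_needed(num_outcomes: int) -> float:
        # Yes/no answers required to single out one of several equally likely outcomes.
        return math.log2(num_outcomes)

    print(bits_needed(2))   # a coin flip: 1.0 bit settles heads versus tails
    print(bits_needed(52))  # one card drawn from a full deck: about 5.7 bits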
Shannon realized that he could convert all information into a long string of individual
simple yes/no bits of information and that electrical circuits were ideal for processing and
transmitting this kind of digital information. In this way, he converted information—in any
form—into a string of digital yeses and nos: ones and zeros.
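A small Python sketch of that conversion (illustrative only; it uses the later ASCII convention, in which each character becomes eight binary digits, rather than anything specific to Shannon's own work):

    # Turn a short message into the kind of yes/no (1/0) stream described above.
    message = "HI"
    bit_string = "".join(format(byte, "08b") for byte in message.encode("ascii"))
    print(bit_string)  # prints 0100100001001001: eight digits for H, eight for I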
Shannon was then able to apply the laws of physics to information streams. He showed that there is a limit to the amount of information that can be pushed through any communications channel, just as there is a limit to the amount of water that can be pushed through a hose no matter how great the pressure. He also derived a mathematical equation relating the range of frequencies available to carry information to the amount of information that can be carried. That range of frequencies is what we now call "bandwidth."
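The relationship the passage describes is today usually written as the Shannon-Hartley capacity formula, C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N is the signal-to-noise ratio. A minimal Python sketch, with example numbers chosen only for illustration:

    import math

    def channel_capacity(bandwidth_hz: float, signal_to_noise: float) -> float:
        # Shannon-Hartley limit: the most bits per second a noisy channel can carry.
        return bandwidth_hz * math.log2(1 + signal_to_noise)

    # Illustrative numbers: a 3,100 Hz voice telephone line with a
    # signal-to-noise ratio of about 1,000 (30 dB).
    print(channel_capacity(3100, 1000))  # roughly 31,000 bits per second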
Shannon’s discovery made information as physical and easy to work with as water
flowing through a pipe or air pumped through a turbine. In this way, Shannon discovered
what information is and opened the door to our modern digital age.
Fun Facts: There are 6,000 new computer viruses released every month.
More to Explore
Adler, Robert. Science Firsts. New York: John Wiley & Sons, 2003.
Horgan, John. “Claude Shannon: Unicyclist, Juggler, and Father of Information Theory.” Scientific American 262, no. 1 (1995): 22–22B.
Liversidge, Anthony. “Claude Shannon.” OMNI (August 1997): 61.
Riordan, Michael. Crystal Fire: The Birth of the Information Age. New York: W. W. Norton, 1997.
Shannon, Claude. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1999.
Sloane, N., and Aaron Wyner. Claude Elwood Shannon: Collected Papers. Piscataway, NJ: IEEE Press, 1997.