A Mind at Play: How Claude Shannon Invented the Information Age

Author(s): Jimmy Soni and Rob Goodman
Release Date: July 17, 2017
Publisher/Imprint: Simon & Schuster
Pages: 384
Reviewed by:

A Mind at Play is a well-researched, entertaining, warm, and fascinating portrait of a genius whose ideas bridged mathematics and engineering, providing a foundation for the science of digital circuits and computing.

The reason we are not more familiar with Shannon is simple: he did not capitalize on his fame. Authors Soni and Goodman write, “His was a life spent in the pursuit of curious, serious play; he was that rare scientific genius who was just as content rigging up a juggling robot or a flame-throwing trumpet as he was pioneering digital circuits.”

Information technologists, computer designers, and software programmers owe a debt of gratitude to Claude Shannon in the same way that all physicists owe one to Isaac Newton: both provided a foundation for others to build on. There has always been information, but before Shannon there was very little mathematics behind its measurement. After Shannon, information became bits, channel capacity, entropy, and redundancy.

Born in 1916, Shannon was withdrawn, shy, and considered odd. As an undergraduate he took a double degree in mathematics and engineering. In 1936 Shannon began his graduate degree in mathematics at MIT, where he worked for Vannevar Bush on the analog computer called the “differential analyzer.”

At that time the differential analyzer was the limit of what computers could do. “In the 1930s, there were only a handful of people in the world who were skilled in both ‘symbolic calculus,’ or rigorous mathematical logic, and the design of electric circuits.”

Until Shannon, the branch of mathematics behind digital logic, Boolean algebra, was little more than a curiosity.

Shannon provided the bridge from theory to practice with his 1937 master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” creating the science of digital circuit design and making him (relatively) famous at the young age of 21.

His first job after graduate school was at the Institute for Advanced Study (IAS) in Princeton. The authors introduce luminaries who knew Shannon at the IAS, including John von Neumann, Hermann Weyl, and Albert Einstein, and relate a number of Shannon-and-Einstein stories.

While employed at the IAS, Shannon spent a summer working at Bell Labs in its mathematical research group, which served as an internal consulting organization for Bell Labs engineers, physicists, and chemists. He then moved to Bell Labs full time.

The authors include excerpts from interviews with Shannon’s Bell Labs colleagues about Shannon himself, and they are not always flattering. For example: “He didn’t have much patience with people who weren’t as smart as he was,” and “[he was] a very odd man in so many ways.”

During WWII, Shannon worked at Bell Labs for the NDRC (National Defense Research Committee) on cryptography, in particular SIGSALY, also called “Project X,” a system for encrypting speech. The specifics of Shannon’s war work aren’t known; what is known is that he found it frustrating, as it took him away from his own research. Shannon met and talked with Alan Turing in the Bell Labs cafeteria when Turing visited Bell Labs. Though there is no record of their conversations, it is believed they talked about thinking machines and chess-playing computers.

After the war, Shannon’s next breakthrough was information theory. Key to understanding the mathematics of information is an understanding of “noise.” The tremendous effect of noise was first felt on the first undersea telegraph cable in 1858. Most of the messages sent on the cable (before it failed) were “communications about communications.” That is, they were messages asking: Did you get my message? (i.e., Can you hear me now?)

Scientists prior to Shannon had made some progress on information theory. Harry Nyquist determined the maximum rate of information flow and the digitization of information; Ralph Hartley brought Nyquist’s work to “a higher level of abstraction.” Both worked at Bell Labs, but it wasn’t until Shannon that “the final synthesis” was made. Shannon “defined the concept of information and effectively solved the problem of noise.”
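Shannon’s resolution of the noise problem is commonly summarized by the channel-capacity result from his 1948 work (the Shannon–Hartley theorem): over a channel of bandwidth $B$ hertz with signal-to-noise ratio $S/N$, reliable communication is possible up to a rate of

$$C = B \log_2\!\left(1 + \frac{S}{N}\right) \text{ bits per second.}$$

Below this rate, errors introduced by noise can be driven arbitrarily low with suitable coding; above it, they cannot.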

Information has a statistical nature, and the authors point out the relationship of that statistical nature to codes and data compression. The feature that makes code-cracking and data compression possible is information redundancy, and redundancy can be manipulated, exploited, or removed.
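That statistical nature can be made concrete with a small illustration (not from the book): the empirical entropy of a text measures its average information per symbol, and the gap between that entropy and the maximum possible for its alphabet is its redundancy. A minimal Python sketch, using a hypothetical example string:

```python
import math
from collections import Counter

def entropy_per_symbol(text: str) -> float:
    """Empirical (zeroth-order) entropy of text, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = "did you get my message"
h = entropy_per_symbol(msg)
max_h = math.log2(len(set(msg)))   # entropy if all used symbols were equally likely
redundancy = 1 - h / max_h          # the fraction a compressor could squeeze out
print(f"entropy:    {h:.2f} bits/symbol")
print(f"maximum:    {max_h:.2f} bits/symbol")
print(f"redundancy: {redundancy:.1%}")
```

Because English letter frequencies are far from uniform, the redundancy is positive, and it is exactly this slack that compression removes and code-breaking exploits.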

For his work on statistical coding, Shannon had help from mathematicians Robert Fano and David Huffman. Shannon’s seminal paper, “A Mathematical Theory of Communication,” turned him into an international phenomenon, though not right away; it took some time for the paper to be understood and applied.
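The statistical coding associated with Fano and Huffman assigns short codewords to frequent symbols and long ones to rare symbols. Huffman’s greedy construction repeatedly merges the two least frequent subtrees; a minimal Python sketch, offered as an illustration rather than anything from the book:

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a prefix-free code by merging the two least frequent nodes."""
    # heap entries: [frequency, tie-break index, {symbol: codeword}]
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        # prepend a bit to every codeword in each merged subtree
        code = {s: "0" + c for s, c in lo[2].items()}
        code.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, code])
        i += 1
    return heap[0][2]

code = huffman_code("mississippi")
# frequent letters ('i', 's') get codewords no longer than the rare 'm'
assert len(code["i"]) <= len(code["m"])
```

The greedy merge guarantees an optimal code for the given symbol frequencies, which is why Huffman coding still underlies everyday formats.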

As applications of information theory grew, so did the demands on Shannon’s time. With respect to the outside world, “. . . he closed himself off further, ignoring letters, colleagues, and projects . . .” Instead of assigned work, Shannon followed his curiosity.

Shannon’s reputation allowed him to remain unproductively employed at Bell Labs. He rode a unicycle down the hallways and pogo-sticked between buildings on the Bell Labs campus. The authors write, “Shannon was, by this point, a legend masquerading as an ordinary employee.”

Shannon transitioned from industry to academia in 1956, first as a visiting professor at MIT, then by accepting a full professorship a year later. Though Shannon left Bell Labs, he remained on its payroll for another 15 years.

While at MIT, Shannon was awarded a named chair, tenure, and appointments in two departments: mathematics and engineering. Shannon was at least consistent: in industry he had no corporate ambitions, and as a professor he had no academic ambitions. He felt no pressure to write papers and rarely advised doctoral students. He was considered better as an inspiration than as an instructor.

Shannon continued to tinker with unicycles and handmade robots. And as he didn’t have to spend a lot of time at MIT, he made his home his office and met with students there. “Looking back, Shannon summed it all up as happily pointless.”

Shannon left no memoir or autobiography, and “. . . while it’s a commonplace to say that Shannon’s best thinking was over by 1948, that criticism might lead [one] to overlook a rich body of work [that came after].” One of the places his curiosity took him was a wearable calculator that could determine the resting place of a ball on a roulette wheel (never followed through in practice). Another was the first academic paper on the mathematics of juggling.

The authors list the honorary awards Shannon received after his groundbreaking paper. And though Shannon did not receive a Nobel Prize, as there is no Nobel Prize for mathematics, he did win the very first Kyoto Prize in basic science.

Shannon developed Alzheimer’s disease in the early 1980s and died in 2001. His formula for entropy is engraved on his tombstone.
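That formula, from the 1948 paper, gives the entropy of a source emitting symbols with probabilities $p_1, \dots, p_n$:

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i$$

It is the quantity the earlier sketch estimated empirically: the average number of bits per symbol that any lossless code must spend.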

A Mind at Play contains photographs, endnotes, a bibliography, and an index.