The Information: A History, a Theory, a Flood

Author(s): James Gleick
Release Date: February 28, 2011
Publisher/Imprint: Pantheon
Pages: 544
Reviewed by:

James Gleick, John McPhee, Tracy Kidder, and Henry Petroski belong to the Pantheon of Great American Writers, the subbranch dedicated to Science, Engineering, and Invention. Gleick’s The Information provides the wonderful backstory to many of the individuals who have given their names to the mathematical laws, theorems, and algorithms found in undergraduate computer science and electrical engineering texts.

Information starts as a desire to transfer thought from one person to another, first as spoken language, then with the use of technology to bridge increasing physical and temporal distance between endpoints. Words can be transported over distance by drums, signal fires, or waving flags. Thoughts may be converted to representations of thoughts by drawing pictures or painting on cave walls, by writing words into books, or by use of mechanical, electrical, and electro-mechanical states or voltages. Once words have been converted to electrical symbols, they may be transported by telegraph, by telephone, or by data transmission between computers.

To design and improve advanced means of transport, mathematics and physics are needed. Words are converted to codes; codes are efficiently compressed, error corrected, and possibly enciphered. Hand calculations for processing symbols are made efficient first by mechanical and then by electrical forms of computation. Symbols, whether constructed out of words or out of codes and ciphers, are converted to bits, and bits are transmitted and transported over wires or airwaves.

The common thread, the touchstone of The Information, is Claude Shannon, the father of information theory. The history of information, however, starts before the history of information theory, before the telephone, before the telegraph, before the written word, back to oral culture, to African “talking” drums. In an oral culture, the idea of “looking things up” does not, cannot, exist apart from remembering, with all the limitations that remembering holds. In an oral culture, all of culture can be held only in living memory.

The demise of oral culture begins with writing, and the concept of writing took many paths. What started as the depiction of life in cave paintings 30,000 years ago became at some point stylized symbols as pictures (pictographs), which gave way to symbols as ideas (ideographs) and then to symbols as words (logographs came into use in China 4,500 to 8,000 years ago, and today Chinese writing has 50,000 symbols, of which about 6,000 are commonly used). Amazingly, writing tied to sound, the phonemic alphabet, was invented only once, about 3,500 years ago on the eastern edge of the Mediterranean Sea. In all the languages of the world, there is only one word for alphabet!

The transition to writing caused a great leap in culture: speech is too fleeting, human memory too poor, to allow for complex analysis. Writing thoughts down leads to the systematization of culture, the ability to document history, law, business, mathematics, and logic. It not only creates information as preserved knowledge but also permits new ways of thinking, enabling the creation not just of signs for things but of signs for signs. Writing may have originally started as a way of keeping track of things, for example counting sheep in a herd, but counting leads directly to math. The rules of math may then be applied to written language, permitting analysis as rules of logic, while logic leads to paradox, where names of things are never exactly the same as the things so named.

The first spelling book (and also proto-dictionary, as it defined hard-to-spell words but did not refer to itself as a dictionary) was written in England in 1604. Fast forward to 1989, and we get the second edition of the OED, consisting of 22,000 pages and weighing in at 138 lbs, while the third edition, begun in 2000, weighs hardly anything at all, being distributed online.

Civilized nations, post signal fire and tolling church bell but pre-telegraph, used large mechanical signals placed on tall towers that could be viewed from other towers by telescope. Access to this form of communication was owned by the state and limited to the purposes of state. The method was slow, limited by weather conditions, and prone to faults of all sorts, not just mechanical ones.

The telegraph, after improvements made by Samuel F. B. Morse and Alfred Vail, opened up low cost signaling to individuals and businesses and enabled the synchronization of clocks across great distances (previously, every town had its own local noon). The telegraph also caused the meanings of commonly used words to change. The word “relay,” for example, once referred to a fresh horse that replaced a tired one, and the words “send” and “message” once referred to the physical transfer of physical objects.

Significant personages of the telegraph era are Boole and De Morgan, both well known to computer scientists. Interest in coding and cryptography gets a boost from the telegraph, and coding and error handling in message transmission becomes as important to businesses as it is to generals.

Transmission is but one part of The Information, and computation is another. Charles Babbage, beginning in the 1820s, attempted to build a complex mechanical computer, the Difference Engine. After 20 years, the design took up 400 square feet of drawings, and if manufactured the machine would have been composed of 25,000 individual parts, filling 160 cubic feet and weighing 15 tons. As it was designed before mechanical construction was capable of matching Babbage’s needs, only a demonstration piece was ever constructed (not quite true, says Wikipedia: two have been built, if only recently; one is in the London Science Museum, the other is owned by former Microsoft CTO Nathan Myhrvold).

Gleick’s history of information moves at a brisk pace from Babbage’s engines to Vannevar Bush’s Differential Analyzer, a more modern if now obsolete analog computer, a “100-ton platform of rotating shafts and gears,” to Bertrand Russell and Alfred North Whitehead’s Principia Mathematica, to Kurt Gödel, John von Neumann, and Shannon’s work at Bell Labs, and from the invention of the telephone to the history of the switchboard and the invention of the telephone book.

The Information never strays far from Shannon, however, and Shannon’s peers at Bell Labs, Nyquist and Hartley, are introduced. Gleick drops an almost offhand comment that Alan Turing and Shannon shared a lunch table in the Bell Labs cafeteria in 1943, and together considered the possibility and implications of machines learning to think. In 1948, Shannon’s paper “A Mathematical Theory of Communication” is published from Bell Labs. This is the first time that communication in the presence of noise is understood as a stochastic process (that is, something having statistical properties). Information now has a measure: entropy, which quantifies the balance of order and disorder in a message source.
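As a concrete illustration (my gloss, not a formula spelled out in the review): Shannon’s entropy for a source emitting symbols with probabilities p_i is H = −Σ p_i log₂ p_i bits per symbol. A fair coin toss has H = 1 bit, while a coin that lands heads 99 percent of the time has H ≈ 0.08 bits; the more predictable the source, the less information each symbol carries, and the more its messages can be compressed.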

The inventors next up are Norbert Wiener, who created the influential but short-lived field of cybernetics, along with those who worked on cybernetics or computation theory, including Warren McCulloch, J.C.R. Licklider, and William Ross Ashby. The study of cybernetics leads to the invention of the cognitive sciences, the sciences of the mind.

Switching the narrative from computation to biology, Gleick points out that Schrödinger and Gamow, after studying atomic physics, later worked on early genetic theory. Genetic theory of course brings us to Watson and Crick, to DNA and biochemistry, and, coming full circle, biochemistry today looks a lot like information theory.

Switching the narrative back to the cognitive sciences, the basic unit of information that replicates from mind to mind is not the bit (the bit belongs to the units of computational data) but the meme, something that resonates with the psyche: an idea, a tune, a catchphrase, a rhyme, an image (any sense or emotion will do). Switching again to information theory, the fact that entropy is a measure of randomness raises the question: How random is random? Gleick next introduces Chaitin, Kolmogorov, and the relationship of randomness to number theory, which adds to information theory through its success in using randomness to drive advances in cryptography and data compression.
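To make Kolmogorov’s notion concrete (a sketch of my own, not an example from the book): a string counts as random if no program much shorter than the string itself can reproduce it. That quantity is uncomputable in general, but an ordinary compressor gives a rough upper bound, and the contrast between a patterned string and random bytes is stark:

    # Compressed size as a crude, computable stand-in for Kolmogorov complexity.
    import os
    import zlib

    structured = b"01" * 500_000        # describable by a tiny program: "print '01' half a million times"
    random_ish = os.urandom(1_000_000)  # no short description is expected to exist

    for label, data in [("structured", structured), ("random", random_ish)]:
        ratio = len(zlib.compress(data, 9)) / len(data)
        print(f"{label}: compresses to {ratio:.1%} of its original size")

The patterned string shrinks to a tiny fraction of its size while the random bytes barely compress at all, which is the intuition behind data compression and, turned around, behind treating incompressibility as a mark of randomness in cryptography.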

There is much more in The Information, but while there may be no limits to information, there are limits to the length of a book review. The last chapter provides a philosophical look backward as well as a look forward. Today, as in the past, there is a gap between information and knowledge: the gap between what is out there and what is useful to us, and the difficulty of extracting value (the information we care about) from all the information that is.

However, there is one warning for the reader expecting The Information to be all the information ever needed about information. For the wide-ranging and fascinating selection of material that Gleick puts into this book, he also has to leave a tremendous amount of material out. Gleick does not cover the Internet or its inventors, nor GPS, nor many recent advances in computing, data, and telecommunications. But that, as the saying goes, may be another story.