Digitized: The Science of Computers and How It Shapes Our World

Author(s): Peter J. Bentley
Release Date: March 1, 2012
Publisher/Imprint: Oxford University Press
Pages: 256
Reviewed by:

“. . . an entertaining and informative popular science study . . .”

Unless you’ve been living under a rock for the past 20 years, you’ve noticed that computers and computer networks are everywhere, essential to the operation of nearly everything in business, music, news, automobiles, and communication (mobile devices, telephone, radio, television, even web radio and web television).

The author of Digitized, Peter J. Bentley, provides the reader with an entertaining and informative popular science study of computer science.

The reader gets a tour of computer history complete with biographies of researchers and entrepreneurs and visits to labs where the next big thing is being invented today.

Digitized overlaps a bit with James Gleick’s The Information, previously reviewed at NYJB, but the reader shouldn’t mind; there are so many facets to computer science and technology that no single book can do the subject adequate justice.

A few of the luminaries of computing, the Internet, and the World Wide Web mentioned in Digitized include Claude Shannon, John von Neumann, John Mauchly, Gordon Moore, Warren McCulloch, Maurice Wilkes, John McCarthy, Marvin Minsky, Bob Kahn, Vint Cerf, Peter T. Kirstein, and Tim Berners-Lee.

Section One, er, excuse me, section 000. Sections are numbered in binary: the first section is 000, the second is 001, the third 010, and so on. Anyway, section 000 starts at the very beginning of the computer revolution, when the word “computer” was reserved for human beings who calculated, and the first person of interest was a mathematician by the name of Alan Turing.

In 1936 Turing studied Hilbert’s “Decision Problem.” The problem was: Did an algorithm exist that could automatically decide whether an arbitrary math statement was true or false?

Turing, an unconventional thinker with a predilection for working things out from first principles, came up with the idea of using a general-purpose theoretical computer to attack Hilbert’s decision problem, and concluded that there were some things no computer could work out.

He was not the first to reach that conclusion. Alonzo Church (who was said to speak in algorithms as easily as others speak in prose) had also figured this out by a different but equivalent method. Together their theory is known as the Church-Turing thesis, which has become part of the foundation of computer science, while Turing’s theoretical computing device is now known as the Turing machine.
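
For readers curious what Turing’s theoretical device actually looks like in practice, here is a minimal sketch of a Turing machine: a tape, a read/write head, and a table of rules. This toy machine, which simply inverts a string of bits, is my own illustration, not the book’s.

```python
# A minimal Turing machine sketch: a tape, a head, and a rule table.
# This toy machine inverts a string of bits and then halts.
# (Illustrative only; the rule table and input are made up for this example.)

def run_turing_machine(tape, rules, state="start"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"   # "_" = blank
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape)

# Rules: (state, symbol read) -> (symbol to write, head move, next state)
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("0110", rules))  # prints 1001_
```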

Peter J. Bentley describes the nature of computational complexity, that is, determining the limit of what can be solved practically, using as an example the different methods by which numbers can be sorted.
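
To make the sorting example concrete, here is a small sketch (mine, not the book’s) comparing a quadratic-time insertion sort against Python’s built-in n log n sort; on a few thousand numbers the gap is already dramatic, which is exactly the point about practical limits.

```python
# A rough illustration of computational complexity using two ways to sort.
# Insertion sort does on the order of n*n comparisons; Python's built-in
# sorted() is an n*log(n) algorithm, so the gap widens rapidly as n grows.
import random
import time

def insertion_sort(values):
    values = list(values)
    for i in range(1, len(values)):
        item = values[i]
        j = i - 1
        while j >= 0 and values[j] > item:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = item
    return values

data = [random.random() for _ in range(5000)]

start = time.perf_counter()
insertion_sort(data)
print("insertion sort:", time.perf_counter() - start, "seconds")

start = time.perf_counter()
sorted(data)
print("built-in sort: ", time.perf_counter() - start, "seconds")
```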

He also explains how computers “work” in their simplest form of machine register moves and mathematical operations, a level of detail not normally found in popular science texts, and he does it creditably well (not a simple feat). From there, topics are explored more broadly than deeply and include cybernetics, artificial intelligence, the human-computer interface, cryptography, databases, and the inventions of the Internet and the Web.
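
For a rough sense of what register moves and mathematical operations look like at that level, here is a toy sketch, loosely modeled on a simple accumulator-style machine rather than on any real processor or on the book’s own examples.

```python
# A toy "machine" with a few registers and a handful of instructions,
# loosely in the spirit of simple accumulator machines (not a real ISA).
# The program below loads two numbers, adds them, and stores the result.

registers = {"A": 0, "B": 0}
memory = {0: 7, 1: 5, 2: 0}   # two inputs and one output cell

program = [
    ("LOAD", "A", 0),    # A <- memory[0]
    ("LOAD", "B", 1),    # B <- memory[1]
    ("ADD",  "A", "B"),  # A <- A + B
    ("STORE", "A", 2),   # memory[2] <- A
]

for op, x, y in program:
    if op == "LOAD":
        registers[x] = memory[y]
    elif op == "ADD":
        registers[x] = registers[x] + registers[y]
    elif op == "STORE":
        memory[y] = registers[x]

print(memory[2])  # 12
```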

The first computers were actually not-quite computers. They were designed and built during WWII and used for ballistic calculations, for atom bomb physics, and for cryptanalysis. They were also very large and very power hungry, using logic switches made out of vacuum tubes.

The first practical stored-program computer (that is, one not programmed by manually changing wires on a plug-board) was the EDSAC, which was in use from 1949 through 1958. The much smaller and lower-power transistor was invented in 1947, but computers remained large and power hungry until integrated circuits (ICs, made of organized sets of transistors) became available in 1959.

The first advances in computer software came from what the author calls “foundational elements.” These include subroutines, modular programming, and assembly language.

The first commercial high-level programming language, designed for translating equations, was FORTRAN (FORmula TRANslation). FORTRAN was created by IBM in 1957, and variants are still in use today. 1968 saw the first-ever conference on software engineering and the recognition of the software development life cycle.

Gordon Moore (a co-founder of Intel) made a prediction in 1965 based on the regularly increasing density (and hence storage capacity and speed) of digital ICs. This prediction is now called Moore’s Law. Its consequence has been smaller, cheaper, more powerful computers every year.
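
To see what that kind of exponential growth implies, here is a back-of-the-envelope sketch; the starting point (roughly the Intel 4004 of 1971) and the two-year doubling period are my assumptions for illustration, not figures from the book.

```python
# Back-of-the-envelope Moore's Law projection.
# Assumptions (mine, for illustration): 2,300 transistors in 1971
# (roughly the Intel 4004) and a doubling every two years thereafter.
start_year, start_transistors = 1971, 2_300
doubling_period_years = 2

for year in range(start_year, 2012, 10):
    doublings = (year - start_year) / doubling_period_years
    count = start_transistors * 2 ** doublings
    print(year, f"~{count:,.0f} transistors")
```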

Today we seem to have reached the limit of Moore’s Law. Greater density means more power, more power means more heat, and heat is bad for computer chips. One path to continued improvement in processing performance is parallelism: putting more processor cores on each chip and running software in parallel. Future progress in computers, as Digitized notes, may come from neural networks, bio-“inspired” computing, and quantum computers.
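
Here is a minimal sketch of what running software in parallel means in practice, using Python’s standard process pool; the workload (summing squares) is just an arbitrary stand-in.

```python
# A minimal sketch of parallelism: the same work split across CPU cores.
# The workload here (summing squares) is just a stand-in for real computation.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    # Serial: one core does every job in turn.
    serial = [sum_of_squares(n) for n in jobs]

    # Parallel: the jobs are farmed out to a pool of worker processes,
    # one per available core, and run at the same time.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(sum_of_squares, jobs))

    print(serial == parallel)  # True: same answers, computed in parallel
```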

The chapter titled “Monkeys with World-Spanning Voices” is about the digital communication networks that led to the Internet. The Internet is used for all sorts of things, not just entertainment; in general, it is the transport of information.

Dr. Bentley takes a look “under the hood” of the Internet. We get to see how a protocol works, along with the processes that keep the Internet running, including the Domain Name System (DNS). The Web is also addressed, along with the technologies that make it work, including the hypertext markup language (HTML) and the hypertext transfer protocol (HTTP).
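
To give a flavor of what is under that hood, here is a bare-bones sketch of the two steps described: a DNS lookup that turns a name into an address, then a plain-text HTTP request sent over a socket (example.com is only a placeholder host).

```python
# A bare-bones look at DNS and HTTP from Python's standard library.
# example.com is a placeholder host used only for illustration.
import socket

host = "example.com"

# DNS: the name is resolved to a numeric IP address.
ip_address = socket.gethostbyname(host)
print(host, "->", ip_address)

# HTTP: a plain-text request is sent to port 80 and a response comes back.
request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
with socket.create_connection((host, 80)) as conn:
    conn.sendall(request.encode("ascii"))
    reply = conn.recv(1024).decode("ascii", errors="replace")

print(reply.splitlines()[0])  # e.g. "HTTP/1.1 200 OK"
```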

Digitized also provides the history of making computers friendlier to use, starting with the luminaries whose work predated and made possible Steve Jobs’ revolutionary Macintosh.

There is Douglas Engelbart, the inventor of the computer mouse and a founder of collaborative and computer-aided design, and there is Ivan Sutherland, a pioneer of interactive computer graphics. The first windowed graphical interface for computers was invented at Xerox PARC.

The next step in improving human-computer interfaces is virtual reality (VR), which the author claims is still just a niche technology, though it seems that some VR niches are going pretty strong, including gaming and 3D graphics, manufacturing, medicine, and military applications.

Digitized covers the history of artificial intelligence (AI), from Theseus, the maze-solving electronic mouse of 1952, to artificial game players built by way of game theory.

AI has had a troubled past, but its current success seems to come not from computers understanding humans so much as from computers understanding statistics, in particular the statistics of the words humans use.

Chatterbots and Google language translation do not work by having a computer figure out the meanings of words but by having it make statistical comparisons of words in context, using the Internet as a humongous database.
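
A toy version of that statistical idea, counting which words tend to follow which from a made-up scrap of text rather than from the whole Internet, might look like this.

```python
# A toy version of "statistics of words in context": count which word
# tends to follow which. Real systems do this over enormous text
# collections; the two sentences here are made up for illustration.
from collections import Counter, defaultdict

text = "the cat sat on the mat the cat ate the fish"
words = text.split()

following = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    following[current][nxt] += 1

# The most common word seen after "the", based purely on counts.
print(following["the"].most_common(1))  # [('cat', 2)]
```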

Rodney Brooks at MIT also uses a non-symbolic approach, programming small robots with instinctive behaviors that switch dynamically in response to the environment. The field of AI also includes the concept of self-improving or evolutionary programs, which started with John Holland, who was granted the first PhD in computer science from the University of Michigan in 1959.

Evolutionary programs model evolution, evolving and selecting better versions of themselves. There are also classifier programs capable of learning rules from interactions with the environment, an approach called bottom-up learning. Evolutionary and classifier AI programs have practical use in medicine, where they are used for modeling drugs and drug interactions.
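
Here is a minimal sketch of the evolutionary idea: a toy genetic algorithm that breeds bit strings toward the arbitrary goal of being all ones. The fitness function, population size, and mutation rate are illustrative choices of mine, not anything from the book.

```python
# A toy evolutionary program: a population of bit strings evolves toward
# the (arbitrary) goal of being all ones. Parameters are illustrative only.
import random

LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

def fitness(bits):
    return sum(bits)  # more ones = fitter

def mutate(bits):
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in bits]

def crossover(mom, dad):
    cut = random.randrange(1, LENGTH)
    return mom[:cut] + dad[cut:]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    # Keep the fitter half as parents, then breed and mutate replacements.
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)), "of", LENGTH)
```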

The last section of Digitized comprises an assortment of odds and ends, including the relationship of computers to the fine arts in the UK, computational biology, 3D modeling of proteins and amino acids, DNA sequencing, and the use of computers in drug discovery and X-ray computed tomography. There are interviews with artists and entrepreneurs who use computers to do new and important things.

Now for the minuses, of which there are a few worth noting. There is no mention of the downsides of computers: societal dislocation from automation and outsourcing, or the simple fact of computer risks, the things that go wrong when you let a computer think for you.

There’s no attention paid to the politics of the Internet, of computer surveillance, of Internet censorship, of who owns what on your computer, or of what you are allowed or not allowed to do with your computer.

A few luminaries of computer science deserving mention go unmentioned, such as Edsger Dijkstra, while Admiral Grace Hopper gets no more than a footnote. Software engineering is under-examined, yet it is the process that separates computer scientists’ algorithms from the next big thing. Lastly, there are the author’s Britishisms and gaffes about American culture that may puzzle the American reader.

All that aside, if you are a fan of computers or just want a glimpse into computers, computer science, and computer history (past and future), Digitized is for you.