Crashes, Crises, and Calamities: How We Can Use Science to Read the Early-Warning Signs

Author(s): Len Fisher
Release Date: March 29, 2011
Publisher/Imprint: Basic Books
Pages: 256
Reviewed by:
Len Fisher is a popular science author whose How to Dunk a Doughnut was named Best Popular Science Book of the Year by the American Institute of Physics. Crashes, Crises, and Calamities is in the same vein: a serious but fun tour of the science of prediction, and not just any sort of prediction but the prediction of disaster (what scientists call “critical transitions”).

Can animal behavior be used to predict earthquakes? Animals were recorded behaving oddly before the earthquake and tsunami that destroyed the Greek city of Helike (which may have been the inspiration for Atlantis) in 373 BCE. The catch is that if we could establish a genuine correlation between animal behavior and some environmental precursor, we could measure that precursor directly and bypass the animals altogether.

If not by animal behavior, then what about human paranormal precognition? Well, there is science and then there is pseudoscience. The Amazing Randi’s offer of one million dollars for a successful demonstration of the paranormal (under scientifically controlled conditions) has gone unclaimed for 46 years. Mr. Fisher provides examples of pseudoscientific thinking, which he calls fallacies:

Fallacy 1: Post Hoc Ergo Propter Hoc or the Post Hoc fallacy:
The mistaken belief that if A occurs before B, then A must be the cause of B. This fallacy was known to Aristotle (but may have been lost on modern philosophers; see Fate, Time, and Language: An Essay on Free Will by David Foster Wallace, previously reviewed in NYJB).

Fallacy 2: Cherry Picking:
Cherry picking is drawing conclusions from data selectively chosen because it supports your theory, while conveniently ignoring data that doesn’t. (Cherry picking was similarly addressed in Charles Seife’s Proofiness: The Dark Arts of Mathematical Deception, also previously reviewed in NYJB.)

Fallacy 3: That the future will be the same as the past:
This particular fallacy ignores factors such as phase transitions (at some point cooling water turns into ice) and scaling. In explaining this fallacy, Mr. Fisher tells the story of Galileo miscalculating the height of the ceiling of Hell from the values provided in Dante’s Inferno. (The miscalculation arose from not recognizing that the weight of a structure increases with the cube of its dimensions while its strength increases only with the square, a relationship called the square-cube law.)
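To sketch the arithmetic behind that law (my illustration, not a derivation from the book): scale every linear dimension L of a structure by the same factor, and weight W and strength S grow at different rates,

    W \propto L^3, \qquad S \propto L^2, \qquad \frac{W}{S} \propto \frac{L^3}{L^2} = L

so the load carried by each unit of cross-section grows in direct proportion to size. Double the scale of Dante’s ceiling and you double the stress on the material holding it up; at some scale it can no longer support its own weight.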

This calculation was made against the backdrop of medieval architecture, when massive stone cathedrals collapsed under their own weight and had to be built with flying buttresses to hold the walls up. Mr. Fisher, playing on this theme, takes a more serious tangent to provide fascinating explanations of stresses and strains, Hooke’s law of elasticity and Young’s modulus, computer modeling, and how rebar in concrete inhibits brittle fracture.
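For readers who want the formulas behind that tangent (my summary, not the book’s notation): Hooke’s law says the stretch x of an elastic body is proportional to the applied force F, and Young’s modulus E restates the same idea for a material rather than an object, relating stress (force per unit area, \sigma) to strain (fractional stretch, \varepsilon):

    F = kx, \qquad \sigma = E\,\varepsilon

The linear relation holds only up to a limit; brittle materials such as concrete fail in tension not long past it, which is why steel rebar is added to carry the tensile loads the concrete alone cannot.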

Mr. Fisher uses popular culture to hold the reader’s attention while illustrating various aspects of disaster: the Blues Brothers, for example, are used to illustrate Newton’s laws of motion. Another example hinges on the laws of cartoon physics (“any body suspended in space will remain suspended in space until made aware of the situation”), while another reminds him of the yearly cheese-rolling contest in Gloucestershire. In describing the physics of cascades of collapse (the “domino effect”), Mr. Fisher makes allusions to Zorba the Greek, Douglas Adams, and real dominoes being knocked over in an attempt at a world record.

Mr. Fisher also explores the concept of feedback, both positive and negative, and provides examples from the common household: the thermostat and the toilet-tank ballcock (an invention that goes back to around 270 BCE). The concept of feedback leads to the concept of equilibrium, both stable and unstable, illustrated by the example of tipping back too far while seated in a chair.
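A minimal sketch of the thermostat’s negative feedback, with illustrative numbers of my own choosing (this is not code from the book):

    # Toy thermostat: negative feedback nudges the temperature back toward a set point.
    def simulate_thermostat(setpoint=20.0, temp=15.0, steps=10):
        history = []
        for _ in range(steps):
            heating = temp < setpoint          # heater switches on below the set point
            temp += 1.0 if heating else -0.5   # heat gain when on, ambient loss when off
            history.append(round(temp, 1))
        return history

    print(simulate_thermostat())  # climbs to 20, then oscillates narrowly around it

Positive feedback flips the sign of the correction, pushing the system ever further from equilibrium; that is the chair-tipping case, where past the balance point each additional degree of lean increases the force pulling you over.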

The concept of equilibrium, or balance, leads to the concept of balance in nature. There are many kinds of balance, including driven and punctuated, and Mr. Fisher references The Ecology of Dragons by Robert M. May (which I then found through Google and read) to illustrate a point about the methods physicists may apply to problems of ecology. Next referenced is Malthus’s exponential growth curve, demonstrating the potential runaway positive feedback of population growth, and Pierre-François Verhulst’s negative feedback term applied to Malthus’s equation, thankfully taming it. This entertaining exercise leads the reader to a greater understanding of the regularity of boom-and-bust cycles in ecology.
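Those boom-and-bust cycles fall out of a strikingly simple model. Here is a minimal sketch (mine, not Mr. Fisher’s code) of the discrete logistic map, in which Malthusian growth is tamed by Verhulst’s crowding term:

    # Discrete logistic map: Malthusian growth (r * x) tamed by Verhulst's
    # crowding term (1 - x), with x the population as a fraction of capacity.
    def logistic_map(r, x=0.5, steps=20):
        series = []
        for _ in range(steps):
            x = r * x * (1 - x)
            series.append(round(x, 3))
        return series

    print(logistic_map(2.8))  # settles to a single steady population
    print(logistic_map(3.3))  # locks into a two-value cycle: regular boom and bust

Nudge the growth rate r a little higher still and the cycles double and redouble into chaos, a hint of how simple rules produce complicated outcomes.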

Nature has many balance points, and simple rules can produce complicated outcomes. Models of ecology may also be applied to economics; however, the example Mr. Fisher chose as having a positive effect in economics, microloans, has only recently been exposed as having a serious negative effect, with reports of usurious rates resulting in crushing debt burdens that have driven micro-borrowers in India to suicide.

Mr. Fisher next explores the fundamentals of scientific models and modeling. How models are made:

1. Get an idea.
2. Create a mathematical representation of that idea.
3. Interpret the results.
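As a toy application of those three steps (my example, not the book’s): the idea is that a population grows in proportion to its current size; the mathematical representation is

    \frac{dN}{dt} = rN \quad\Longrightarrow\quad N(t) = N_0 e^{rt}

and the interpretation is that, unchecked, the population explodes exponentially: precisely the runaway Malthusian feedback discussed above.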

Models may be applied to other social sciences, not just economics. Models have been made of human psychology in relationships and in binge drinking, though models of psychology tend to be less predictive and more descriptive, i.e., explanations for rather than predictions of. Models at their most basic are stories we tell each other, metaphors for reality used as triggers for insight; for example, the chemist August Kekulé’s dream of a snake eating its own tail led to the discovery of the ring structure of benzene. But all in all, the test of any model’s validity is its ability to predict the future. Ignoring that can easily lead to the intentional misinterpretation of models for fun and profit.

Given that mathematical models are run on computers, how can we trust the computer calculations and the underlying assumptions on which models are based? How can we trust the interpretation of the output? How can we know that the predictions—whatever they may be—are reasonable?

Mr. Fisher does exhibit skepticism toward models: for every positive “intelligent swarm” there is a corresponding negative “groupthink.” Not knowing what you have going in leads to a dependence on statistics, the tallying of correct versus incorrect predictions. The solution, Mr. Fisher claims, is applied skepticism, and skepticism applies not just to models but to all sorts of judgment. To this claim, this reviewer applies some skepticism in turn: astrologers and alchemists may both use math and models.

How does one recognize the difference between an astrologer and an astronomer? How does one separate the chemist from the alchemist, science from pseudoscience? When making good judgments depends on having good judgment, the problem is that “good” happens to be a judgment itself, and the outcomes from which we decide good or bad can only be known in the future; they are themselves predictions. The definition of good judgment collapses into good guessing.

And if that weren’t enough, one could argue that for every probability there remains the possibility of something occurring that has not been modeled and for which there are no statistics, something no one could ever guess. For every probability, there remains the possibility of the unlikely event. If the statistics show that an earthquake occurs with some regularity once per hundred years, statistics still cannot tell us whether there will be an earthquake tomorrow.
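To put a number on that last point (my back-of-the-envelope illustration, assuming quakes arrive as a Poisson process, an assumption the book does not make): at an average rate of one quake per hundred years, roughly 36,500 days, the chance of a quake on any given tomorrow is

    P(\text{quake tomorrow}) = 1 - e^{-1/36{,}500} \approx 0.003\%

a long-run average, not a forecast; under this memoryless assumption the probability is the same tomorrow whether the last quake struck 99 years ago or yesterday.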

Skepticism here can lead to the “too little, too late” syndrome: there may be true signals of future events with weak correlation that won’t be recognized as significant until after the fact. For the non-skeptic, there is the “leap before you look” syndrome: acting on false signals of future events with strong correlation. The difficulty again is one of judgment. How does one recognize the difference that makes a difference?

Effective disaster planning is an exercise in resilience. When you cannot predict where or when the next disaster will occur, you should not only have a plan to recover after the fact but also test that plan regularly. The loss of resilience (inattention to the maintenance of infrastructure, i.e., a regular run of small disasters left unattended) is a powerful indicator of the potential for a small disaster to cascade into a larger one.

Crashes, Crises, and Calamities may not have all the answers, but this book certainly gets you on a good path to asking the right questions.