Accidents involve much more than the reliability of parts. Safety emerges from the systemic interactions of devices, people and organizations. Nancy Leveson’s Engineering a Safer World (a free PDF is currently available from MIT Press) picks up many of the threads in Perrow’s classic Normal Accidents, plus much more, and weaves them into a formal theory of systems safety. It comes to life with many interesting examples and prescriptions for best practice.
So far, I’ve only had time to read this the way I read the New Yorker (cartoons first), but a few pictures give a sense of the richness of systems perspectives that are brought to bear on the problems of safety:
The contrast between the figure above and the one that follows in the book, showing links that were actually in place, is striking. (I won’t spoil the surprise – you’ll have to go look for yourself.)
I’ve been watching a variety of explanations of the financial crisis. As a wise friend noticed, the only thing in short supply is acceptance of responsibility. I’ve seen theories that place the seminal event as far back as the Carter administration. Does that make sense, causally?
In a formal sense, it might in some cases. I could have inhaled a carcinogen a decade ago that only leads to cancer a decade from now, without any outside triggers. But I think that sort of system is a rarity. As a practical matter, we have to look elsewhere.
Socioeconomic systems are at a constant slow boil, with many potential threats existing below the threshold of imminent danger at any given time. Occasionally, one grows exponentially and emerges as a real catastrophe. It seems like a surprise, because of the hockey stick behavior of growth (the French riddle of the lily pond again). However, most apparent low-level threats never emerge from the noise. They don’t have enough gain to grow fast, or they get shut down by some unsuspected negative feedback.
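The gain argument above can be made concrete with a toy sketch (my illustration, not anything from the post): model each potential threat as a first-order feedback loop whose state compounds by its net gain each step. A loop with positive net gain grows exponentially and, lily-pond style, does most of its growing at the very end; a loop damped by negative feedback never emerges from the noise. The function name `simulate` and the particular gains are arbitrary choices for illustration.

```python
def simulate(initial, net_gain, steps):
    """First-order feedback loop: x[t+1] = x[t] * (1 + net_gain).

    Positive net_gain -> exponential growth (hockey stick);
    negative net_gain -> the threat decays back into the noise.
    """
    x = initial
    trajectory = [x]
    for _ in range(steps):
        x *= 1 + net_gain
        trajectory.append(x)
    return trajectory

# Two sub-threshold threats, identical at the start:
growing = simulate(0.001, 0.10, 100)   # 10% net positive gain
damped = simulate(0.001, -0.05, 100)   # shut down by negative feedback

# The hockey stick: half of the final size arrives only a few
# doublings before the end, so the emergence looks like a surprise.
halfway = next(t for t, v in enumerate(growing) if v >= growing[-1] / 2)
print(f"Half of the final size is reached at step {halfway} of 100")
```

With 10% growth per step the doubling time is about seven steps, so the trajectory is at half its final size only around step 93 of 100, which is exactly the lily-pond riddle: the pond looks nearly empty almost until the day it is full.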