Systems thinking & asymmetric information

At the STIA conference I played Forio’s Everest simulation, a multiplayer teamwork/leadership game, widely used in business schools.

Our team put 2 on the summit and 2 in the hospital. In the game, the unlucky climbers were rescued by helicopter, though in reality they might have ended up in the morgue, as the current helicopter rescue record stands at 19,833 feet – far short of the high camps on Everest.

[Image credit: Pavel Novak, Wikimedia Commons, CC Attribution Share-Alike 2.5 Generic]

As the game progressed, I got itchy – where were the dynamics? Oscillations in the sherpa supply chain? Maybe a boom and bust of team performance? Certainly there were some dynamics, related to irreversible decisions to ascend and descend, but counterintuitive behavior over time was not really the focus of the game.

Instead, it was about pathologies of information sharing on teams. It turns out that several of our near-fatal incidents hinged on information held by a single team member. On probability alone, unique information is less likely to come up in team deliberations, and this is compounded by a bias that favors discussion of shared information, to the detriment of team performance when unique information is important. While I'd still be interested to ponder the implications of this in a dynamic setting, I found this insight valuable for its own sake.
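The probability point is easy to make concrete. A toy calculation (my own illustration, not part of the game): if each member who holds an item independently mentions it with some probability p, an item known to several members has many chances to surface, while a uniquely held item gets only one.

```python
# Toy model (illustrative, not from the Everest simulation): each member
# who holds an item independently raises it with probability p, so an item
# known to k members surfaces unless all k stay silent.
def p_surfaces(p, holders):
    return 1 - (1 - p) ** holders

p = 0.3  # assumed chance any one holder brings the item up
print(f"unique (1 holder):  {p_surfaces(p, 1):.2f}")
print(f"shared (4 holders): {p_surfaces(p, 4):.2f}")
```

With these assumed numbers, the shared item surfaces about 76% of the time versus 30% for the unique one – before any bias toward discussing common knowledge is even considered.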

Back in the old days there was an undercurrent of debate about whether systems thinking was a subset of system dynamics, or vice versa. While I’d like SD to be the one method to rule them all, I have to admit that there’s more to systems than dynamics. There are a lot of interesting things going on at the intersection of multiple stakeholder interests, information and mental models, even before things start evolving over time. We grapple with these issues in practically every SD engagement, but they’re not our core focus, so it’s always nice to have a little cross-fertilization.

All metaphors are wrong – some are useful

I’m hanging out at the Systems Thinking in Action conference, which has been terrific so far.

The use of metaphors came up today. A good metaphor can be a powerful tool in group decision making. It can wrap a story about structure and behavior into a little icon that’s easy to share and relate to other concepts.

But with that power comes a bit of danger, because, like models, metaphors have limits, and those limits aren't always explicit or shared. Even the humble bathtub can be misleading. We often use bathtubs as analogies for first-order exponential decay processes, but real bathtubs have a nonlinear outflow, so they don't decay exponentially at all. (Update: the outflow rate varies with the square root of the level, so in a straight-sided tub the square root of the level falls linearly – the level itself falls quadratically and reaches zero in finite time, whereas exponential decay never quite empties the tub.)
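The difference is easy to see in a quick simulation. Here's a sketch (my own, with illustrative constants) comparing a linear-outflow tub, which decays exponentially, with a square-root-outflow tub, which empties completely in finite time:

```python
# Compare two "bathtub" drain laws (illustrative constants, Euler integration):
#   linear outflow:      d(level)/dt = -k * level        -> exponential decay
#   square-root outflow: d(level)/dt = -k * sqrt(level)  -> empties in finite time
import math

def simulate(level, outflow, dt=0.01, t_max=5.0):
    """Euler-integrate d(level)/dt = -outflow(level); return final level."""
    steps = int(t_max / dt)
    for _ in range(steps):
        level = max(level - outflow(level) * dt, 0.0)  # clamp at empty
    return level

k = 0.5
linear = simulate(1.0, lambda h: k * h)
torricelli = simulate(1.0, lambda h: k * math.sqrt(h))

print(f"level at t=5: linear outflow = {linear:.4f}, sqrt outflow = {torricelli:.4f}")
```

With these constants the square-root tub is bone dry by t = 2√h₀/k = 4, while the exponential tub still holds water at t = 5 (and at any finite time) – exactly the behavior the metaphor hides.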

Apart from simple caution, I think the best solution to this problem when stakes are high is to formalize and simulate systems, because that process forces you to expose and challenge many assumptions that otherwise remain hidden.