Stress, Burnout & Biology

In my last post, stress took center stage as both a driver and an outcome of the cortisol-cytokine-serotonin system. But stress can also arise endogenously in another way: from the interplay of personal goals and work performance. Jack Homer’s burnout model is a system dynamics classic that everyone should explore:

Worker burnout: A dynamic model with implications for prevention and control

Jack B. Homer

This paper explores the dynamics of worker burnout, a process in which a hard‐working individual becomes increasingly exhausted, frustrated, and unproductive. The author’s own two‐year experience with repeated cycles of burnout is qualitatively reproduced by a small system dynamics model that portrays the underlying psychology of workaholism. Model tests demonstrate that the limit cycle seen in the base run can be stabilized through techniques that diminish work‐related stress or enhance relaxation. These stabilizing techniques also serve to raise overall productivity, since they support a higher level of energy and more working hours on the average. One important policy lever is the maximum workweek or work limit; an optimal work limit at which overall productivity is at its peak is shown to exist within a region of stability where burnout is avoided. The paper concludes with a strategy for preventing burnout, which emphasizes the individual’s responsibility for understanding the self‐inflicted nature of this problem and pursuing an effective course of stability.

You can find a copy of the model in the help system that comes with Vensim.
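The flavor of the mechanism is easy to caricature in a few lines. Here’s a minimal sketch, a toy of my own with invented parameters, not Homer’s formulation: energy drains in proportion to hours worked and recovers with rest, and hysteresis in the workweek decision produces the relaxation oscillation of repeated burnout cycles.

```python
import numpy as np

# Toy caricature of the burnout cycle -- invented parameters, NOT
# Homer's equations (his model ships with Vensim's help system).
# Energy drains with hours worked and recovers with rest. The
# workaholic keeps long hours until energy crashes, then limps along
# at minimal hours until recovered: a relaxation oscillation.
dt = 0.5                       # weeks
t = np.arange(0.0, 200.0, dt)  # ~4 years

energy = 1.0                   # dimensionless, 0..1
burned_out = False             # hysteresis state
workweeks, energies = [], []

for _ in t:
    if energy < 0.3:           # crash threshold
        burned_out = True
    elif energy > 0.7:         # recovery threshold
        burned_out = False
    workweek = 10.0 if burned_out else 60.0   # hours/week
    drain = 0.001 * workweek                  # fraction of energy/week
    recovery = 0.05 * (1.0 - energy)          # saturating rest recovery
    energy = min(max(energy + dt * (recovery - drain), 0.0), 1.0)
    workweeks.append(workweek)
    energies.append(energy)

# Effective output ~ hours * energy; stabilizing the cycle raises it.
print("mean hours/week: %.1f  mean effective hours: %.1f" %
      (np.mean(workweeks), np.mean(np.array(workweeks) * np.array(energies))))
```

Homer’s paper shows that an optimal work limit exists within the stable region; you can reproduce the flavor in this toy by replacing the 60-hour workweek with a cap of about 30 hours, which avoids the crash entirely and modestly beats the cycling average.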

Model Quality: the High Road

There are two ways to go about building a model.

  • Plan A proceeds slowly. You build small, simple, aggregate components, and test each thoroughly before moving on.
  • Plan B builds a rough model spanning the full scope you think the problem requires, then incrementally improves it.

Ideally, both approaches converge to the same point.

Plan B is attractive, for several reasons. It lets you explore a wide range of ideas. It gives a gratifying illusion of rapid progress. And, most importantly, it pleases stakeholders, who typically have a voracious appetite for detail and a limited appreciation of dynamics.

The trouble is that Plan B does not really exist. When you build a lot of structure quickly, the price is ignoring many potential interactions, consistency checks, and other relationships between components. You’re creating a large backlog of undiscovered rework, which the extensive SD literature on projects tells us is fatal. So you’re really on Plan C, which leads to disaster: a large, incomprehensible, low-quality model.
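To see why that backlog is fatal, the rework cycle itself is easy to sketch. The toy below (my own invented parameters, not from any particular project model) compares a careful build against a fast, lightly tested one:

```python
import numpy as np

# Minimal sketch of the classic SD rework cycle. Work done fast
# without testing piles up a stock of undiscovered rework that
# surfaces only later -- and recycled work breeds more defects.
def project(progress_rate, frac_correct, discovery_delay,
            scope=100.0, dt=0.25, horizon=100.0):
    remaining, done_ok, undiscovered = scope, 0.0, 0.0
    apparent, real = [], []
    for _ in np.arange(0.0, horizon, dt):
        work = min(progress_rate, remaining / dt)    # can't exceed what's left
        discovered = undiscovered / discovery_delay  # testing reveals defects
        remaining += dt * (discovered - work)
        done_ok += dt * frac_correct * work
        undiscovered += dt * ((1 - frac_correct) * work - discovered)
        apparent.append(done_ok + undiscovered)      # looks done, partly isn't
        real.append(done_ok)
    return np.array(apparent), np.array(real)

# Plan A: slower, tested as you go. "Plan B": fast, sloppy, testing deferred.
runs = {"Plan A": project(2.0, 0.95, 2.0), "Plan B": project(4.0, 0.60, 20.0)}
for name, (app, real) in runs.items():
    print(f"{name}: week 25 apparent {app[99]:.0f}%, real {real[99]:.0f}%; "
          f"week 75 real {real[299]:.0f}%")
```

The fast build looks far ahead at week 25, but a large share of its “progress” is undiscovered rework that takes months to surface and redo, so the careful build finishes real work sooner.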

In addition, you rarely have as much time as you think you do. When your work gets cut short, only Plan A gives you an end product that you can be proud of.

So, resist pressures to include every detail. Embrace elegant simplicity and rich feedback. Check your units regularly, test often, and “always be done” (as Jim Hines puts it). Your life will be easier, and you’ll solve more problems in the long run.
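On the units point: Vensim checks dimensional consistency natively, and it’s worth having the same safety net when prototyping outside it. Here’s a minimal sketch using Python’s pint library (the widget unit is invented for illustration):

```python
import pint

# Dimensional consistency is a free model test.
ureg = pint.UnitRegistry()
ureg.define("widget = [widget]")          # custom unit of stuff

stock = 1000 * ureg.widget                # a stock of material
adjustment_time = 4 * ureg.week           # first-order time constant
flow = stock / adjustment_time            # widgets/week -- dimensionally sound

print(flow)                               # 250.0 widget / week

try:
    nonsense = stock + flow               # stock + flow: a classic units bug
except pint.DimensionalityError as err:
    print("caught:", err)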


How to ensure that your survey data is useless for dynamic modeling

I’ve been working with pharma brand tracking data, used to calibrate part of an integrated model of prescriptions in a disease class. Understanding docs’ perceptions of drugs is pretty important, because those perceptions are the major driver of prescribing. Drug companies spend a lot of money collecting this data; vendors conduct quarterly interviews with doctors in a variety of specialties.

Unfortunately, most of the data is poorly targeted for dynamic modeling. It seems to be collected to track and guide ad messaging, but that introduces so much turbulence that no long-term conclusions can be drawn from the data, which in turn breeds reactive decision making. Here’s how to minimize the strategic information content of a survey:

  1. Ask a zillion questions. Be sure that interviewees have thorough decision fatigue by the time you get to anything important.
  2. Ask numerical questions that require recall of facts no one can remember (how many patients did you treat with X in the last 3 months?).
  3. Change the questions as often as possible, to ensure that you never revisit the same topic twice. (Consistency is so 2015.)
  4. Don’t document those changes.
  5. Avoid cardinal scales. Use vague nominal categories wherever possible. Don’t waste time documenting those categories.
  6. Keep the sample small, but report results in lots of segments.
  7. Confidence bounds? Bah! Never show weakness.
  8. Archive the data in PowerPoint.

On the other hand, please don’t! A few consistent, well-quantified questions are pure gold if you want to untangle causality that plays out over more than a quarter.
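To see what consistency buys you, here’s a sketch on synthetic data (a hypothetical tracker, with an invented step in actual efficacy): a perception question asked the same way every quarter lets you estimate the perception adjustment time, a parameter that matters whenever responses play out over years.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Synthetic example -- illustrative data, not a real tracker.
quarters = np.arange(20)
actual_efficacy = np.where(quarters < 8, 0.4, 0.7)  # new trial data at Q8

def perceived(tau):
    """First-order (exponential) smoothing of actual efficacy."""
    p, out = 0.4, []
    for a in actual_efficacy:
        p += (a - p) / tau        # one-quarter Euler step
        out.append(p)
    return np.array(out)

# "Observed" tracker: true adjustment time of 4 quarters, plus noise.
rng = np.random.default_rng(0)
survey = perceived(4.0) + rng.normal(0, 0.02, quarters.size)

fit = minimize_scalar(lambda tau: np.sum((perceived(tau) - survey) ** 2),
                      bounds=(1.0, 20.0), method="bounded")
print(f"estimated perception adjustment time: {fit.x:.1f} quarters")
```

With questions that change every wave, the same fit is impossible: you never observe the same quantity long enough to see the adjustment.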

Answer to A Bongard Problem

As a few people nearly guessed, the left side is “things a linear system can do” and the right side is “(additional) things a nonlinear system can do.” (A small simulation contrasting the two appears at the end of this post.)

On the left:

  • decaying oscillation
  • exponential decay
  • simple accumulation
  • equilibrium
  • exponential growth
  • 2nd order goal seeking with damped oscillation

On the right: behaviors that only a nonlinear system can produce.

For reference, here’s how the original problem was posed:

Bongard problems test visual pattern recognition, but there’s no reason to be strict about that. Here’s a slightly nontraditional Bongard problem:

The six on the left conform to a pattern or rule, and your task is to discover it. As an aid, the six boxes on the right do not conform to the same pattern. They might conform to a different pattern, or simply reflect the negation of the rule on the left. It’s possible that more than one rule discriminates between the sets, but the one that I have in mind is not strictly visual (that’s a hint).

The original problem was here.
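To make the left/right distinction concrete: no linear system can sustain a limit cycle; its trajectories can only grow, decay, or oscillate with growing or decaying amplitude. Here’s a minimal SciPy sketch (arbitrary parameters) contrasting a linear damped oscillator with the nonlinear van der Pol oscillator:

```python
import numpy as np
from scipy.integrate import solve_ivp

def linear(t, s):                     # damped oscillator, eigenvalues -0.1 +/- ~i
    x, v = s
    return [v, -x - 0.2 * v]

def van_der_pol(t, s, mu=1.0):        # nonlinear damping term mu*(1-x^2)*v
    x, v = s
    return [v, mu * (1.0 - x * x) * v - x]

t_span, t_eval = (0, 100), np.linspace(0, 100, 2001)
for name, f in [("linear", linear), ("van der Pol", van_der_pol)]:
    for x0 in (0.1, 3.0):             # start inside and outside the cycle
        sol = solve_ivp(f, t_span, [x0, 0.0], t_eval=t_eval)
        amp = np.abs(sol.y[0][-200:]).max()   # late-time amplitude
        print(f"{name:12s} x0={x0}: late amplitude ~ {amp:.2f}")
```

The linear system’s late-time amplitude goes to zero from any starting point, while the van der Pol trajectories converge to the same amplitude-2 cycle whether they start inside or outside it, which is exactly the behavior no linear system can exhibit.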