Ultradian Oscillations of Insulin and Glucose

Citation: Jeppe Sturis, Kenneth S. Polonsky, Erik Mosekilde, and Eve Van Cauter. Computer Model for Mechanisms Underlying Ultradian Oscillations of Insulin and Glucose. Am. J. Physiol. 260 (Endocrinol. Metab. 23): E801-E809, 1991.

Source: Replicated by Hank Taylor

Units balance: No in the original replication; Yes in the updated versions below.

Format: Vensim

Ultradian Oscillations of Insulin and Glucose (Vensim .vpm)

Update, 10/2017:

Refreshed, with units defined (mathematically the same as before): ultradia2.vpm ultradia2.mdl

Further refined, for initialization in equilibrium (insulin by analytic expression; glucose by parameter). Glucose infusion turned on by default. Graphs added.

ultradia-enhanced-3.mdl ultradia-enhanced-3.vpm

Sea Level Rise

Citations: Rahmstorf, 2007. “A semi-empirical approach to projecting future sea-level rise.” Science 315. Grinsted, Moore & Jevrejeva, 2009. “Reconstructing sea level from paleo and projected temperatures 200 to 2100 AD.” Climate Dynamics.

Source: Replicated by Tom Fiddaman based on an earlier replication of Rahmstorf provided by John Sterman

Units balance: Yes

Format: Vensim; requires Model Reader or an advanced version

Notes: See discussion at metasd.
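
For orientation, the cited papers use semi-empirical relations of the following forms (as given in the papers themselves; parameter names in the model files may differ):

    \frac{dH}{dt} = a\,(T - T_0)                                             % Rahmstorf (2007)

    \frac{dS}{dt} = \frac{S_{eq}(T) - S}{\tau}, \qquad S_{eq}(T) = a\,T + b  % Grinsted et al. (2009)

where H and S are sea level, T is global mean temperature, T_0 is a baseline temperature, a and b are fitted constants, and \tau is a response time. The “Rahmstorf slope” mentioned under the files below corresponds to the constant a.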

Files:

Grinsted_v3b – first model; the default calibration replicates Rahmstorf, and optimization is set up to adjust the constant terms to fit the Rahmstorf slope to the data

Grinsted_v3c – second model; updated data and calibration, as in Part III

Grinsted_v3c-k2 – third model; set up for Kalman filtering, as in Part V

A Behavioral Analysis of Learning Curve Strategy

Model Name: A Behavioral Analysis of Learning Curve Strategy

Citation: John D. Sterman and Rebecca Henderson (Sloan School of Management, MIT), and Eric D. Beinhocker and Lee I. Newman (McKinsey and Company). A Behavioral Analysis of Learning Curve Strategy.

Neoclassical models of strategic behavior have yielded many insights into competitive behavior, despite the fact that they often rely on a number of assumptions, including instantaneous market clearing and perfect foresight, that have been called into question by a broad range of research. Researchers generally argue that these assumptions are “good enough” to predict an industry’s probable equilibria, and that disequilibrium adjustments and bounded rationality have limited competitive implications. Here we focus on the case of strategy in the presence of increasing returns to highlight how relaxing these two assumptions can lead to outcomes quite different from those predicted by standard neoclassical models. Prior research suggests that in the presence of increasing returns, tight appropriability, and accommodating rivals, early entrants can in some circumstances achieve sustained competitive advantage by pursuing Get Big Fast (GBF) strategies: rapidly expanding capacity and cutting prices to gain market share advantage and exploit positive feedbacks faster than their rivals. Using a simulation of the duopoly case we show that when the industry moves slowly compared to capacity adjustment delays, boundedly rational firms find their way to the equilibria predicted by conventional models. However, when market dynamics are rapid relative to capacity adjustment, forecasting errors lead to excess capacity, overwhelming the advantage conferred by increasing returns. Our results highlight the risks of ignoring the role of disequilibrium dynamics and bounded rationality in shaping competitive outcomes, and demonstrate how both can be incorporated into strategic analysis to form a dynamic, behavioral game theory amenable to rigorous analysis.

The original paper is on Archive.org; it was eventually published in Management Science. You can get the MS version from John Sterman’s page here.

Source: Replicated by Tom Fiddaman

Units balance: Yes

Format: Vensim (the model uses subscripts, so it requires Pro, DSS, or Model Reader)

Behavioral Analysis of Learning Curve Strategy (Vensim .vmf)

New update:

BALCS4b.zip

The Energy Transition and the Economy

Model Name: The Energy Transition and the Economy: A System Dynamics Approach

Citation: John D. Sterman, 1981. The Energy Transition and the Economy: A System Dynamics Approach. PhD Dissertation, MIT Sloan School of Management.

Source: Replicated by Miguel Vukelic (a heroic effort)

Units balance: Yes

Format: Vensim (Contains data variables and thus requires an advanced version or the free Model Reader)

The Energy Transition and the Economy (Vensim .vpm)

Terrorism Dynamics

Contributed by Bruce Skarin

Introduction

This model is the product of my Major Qualifying Project (MQP) for my Bachelor’s degree in system dynamics at Worcester Polytechnic Institute. The project had two goals:

1) To develop a model that reasonably simulates the historical attacks by the al-Qaida terrorist network against the United States.

2) To evaluate the usefulness of the model for developing public understanding of the terrorism problem.

The full model and report are available on my website.

Reference Mode

The reference mode for this model was the escalation of attacks linked to al-Qaida against the U.S., as shown below. The data for this chart is available through this Google Document.
Image:Terrorism_Reference_Mode.jpg

Causal View of the Model

Below is the causal diagram of the primary feedback loops in the model.

Image:Terrorism_Causal_Loop.png

Online Story Model

An online story version that explains the primary model structure, along with complete iThink and Vensim models, is available on my MQP page.

Payments for Environmental Services

Model Name: payments, penalties, and environmental ethic

Citation: Dudley, R. 2007. Payments, penalties, payouts, and environmental ethics: a system dynamics examination. Sustainability: Science, Practice, & Policy 3(2):24-35. http://ejournal.nbii.org/archives/vol3iss2/0706-013.dudley.html

Source: Richard G. Dudley

Copyright: Richard G. Dudley (2007)

License: GNU GPL

Peer reviewed: Yes (presumably as part of the journal’s review at publication)

Units balance: Yes

Format: Vensim

Target audience: People interested in the concept of payments for environmental services as a means of improving land use and conservation of natural resources.

Questions answered: How might land users’ environmental ethic be influenced by, and in turn influence, payments for environmental services?

Software: Vensim

Payments for Environmental Services (Vensim .vmf)

The Obscure Art of Datamodeling in Vensim

There are lots of good reasons for building models without data. However, if you want to measure something (i.e., estimate model parameters), produce results that are closely calibrated to history, or drive your model with historical inputs, you need data. Most statistical modeling you’ll see involves static or dynamically simple models and well-behaved datasets: nice flat files with uniform time steps, units matching (or, alarmingly, ignored), and no missing points. Things are generally much messier with a system dynamics model, which typically has broad scope and (one would hope) lots of dynamics. The diversity of data needed to accompany a model presents several challenges (a short illustrative sketch follows the list):

  • disagreement among sources
  • missing data points
  • non-uniform time intervals
  • variable quality of measurements
  • diverse source formats (spreadsheets, text files, databases)
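
As a purely illustrative sketch of the wrangling these challenges imply, here is how two mismatched sources might be pulled onto a uniform time grid in Python with pandas; the file names, column names, units, and date range are hypothetical, not from the original post:

    # Illustrative only: align two hypothetical sources with different units and
    # time intervals onto one uniform annual grid, ready for the modeling workflow.
    import pandas as pd

    a = pd.read_csv("source_a.csv", index_col="year")        # annual data, in millions
    b = pd.read_excel("source_b.xlsx", index_col="year")     # sporadic data, in thousands

    b = b / 1000.0                                            # convert thousands -> millions

    grid = pd.RangeIndex(1950, 2021, name="year")             # the uniform time axis
    merged = pd.DataFrame(index=grid)
    merged["series_a"] = a["value"].reindex(grid)
    merged["series_b"] = b["value"].reindex(grid).interpolate(limit_area="inside")

    # Report remaining gaps rather than silently filling them
    print(merged.isna().sum())

    merged.to_csv("clean_inputs.csv")                         # cleaned series, one file

The point is not the particular tool but that every step, units conversion, regridding, interpolation, is scripted and repeatable rather than done by hand.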

The mathematics for handling the technical estimation problems was developed by Fred Schweppe and others at MIT decades ago. David Peterson’s thesis lays out the details for SD-type models, and most of the functionality described is built into Vensim. It’s also possible, of course, to go a simpler route; even hand calibration is often effective and reasonably quick when coupled with Synthesim.

Either way, you have to get your data corralled first. For a simple model, I’ll build the data right into the dynamic model. But for complicated models, I usually don’t want the main model bogged down with units conversions and links to a zillion files. In that case, I first build a separate datamodel, which does all the integration and passes cleaned-up series to the main model as a fast binary file (an ordinary Vensim .vdf). In creating the data infrastructure, I try to maximize three things:

  1. Replicability. Minimize the number of manual steps in the process by making the datamodel do everything. Connect the datamodel directly to primary sources, in formats as close as possible to the original. Automate multiple steps with command scripts. Never use hand calculations scribbled on a piece of paper, unless you’re scrupulous about lab notebooks, or note the details in the equations’ documentation fields.
  2. Transparency. Often this means “don’t do complex calculations in spreadsheets.” Spreadsheets are very good at some things, like serving as a data container that gives good visibility. However, spreadsheet calculations are error-prone and hard to audit. So, I try to do everything, from units conversions to interpolation, in Vensim.
  3. Quality. #1 and #2 already go a long way toward ensuring quality. However, it’s possible to go further. First, actually look at the data. Take time to build a panel of on-screen graphs so that problems are instantly visible. Use a statistics or visualization package to explore it. Lately, I’ve been going a step further, by writing Reality Checks to automatically test for discontinuities and other undesirable properties of spliced time series (a rough analogue is sketched after this list). This works well when the data is simply too voluminous to check manually.
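
The Vensim Reality Check language itself isn’t reproduced here; as a rough analogue of that kind of automated screen, a splice-discontinuity test might look like this in Python (the 25% threshold and the input file are invented for illustration):

    # Rough analogue of an automated discontinuity screen for spliced time series.
    # The threshold and the input file are illustrative, not from the original post.
    import pandas as pd

    data = pd.read_csv("clean_inputs.csv", index_col="year")
    MAX_STEP = 0.25                       # flag year-over-year changes larger than 25%

    for name, series in data.items():
        jumps = series.pct_change().abs()
        jumps = jumps[jumps > MAX_STEP]
        if not jumps.empty:
            print(f"{name}: suspicious jumps at years {list(jumps.index)}")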

This can be quite a bit of work up front, but the payoff is large: less model rework later, easy updates, and higher quality. It’s also easier to generate graphics or statistics that help others gain confidence in the model, though it’s sometimes important to help them recognize that goodness of fit is a weak test of quality.

It’s good to build the data infrastructure before you start modeling, because that way your drivers and quality control checks are in place as you build structure, so you avoid the pitfalls of an end-of-pipe inspection process. A frequent finding in our corporate work has been that cherished data is in fact rubbish, or means something quite different from what users have historically assumed. Ventana colleague Bill Arthur argues that modern IT practices are making the situation worse, not better, because firms aren’t retaining data as long (perhaps a misplaced side effect of a mania for freshness).

The full post, “The Obscure Art of Datamodeling in Vensim,” continues at the original blog.