This video explores a simple epidemic model for a community confronting coronavirus.
I built this to reflect my hometown, Bozeman, MT, and surrounding Gallatin County, with a population of 100,000 and no reported cases (yet). It shows the importance of an early, robust, multi-pronged approach to reducing infections. Because it’s simple, it can easily be adapted for other locations.
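The full model is in the Vensim files linked below, but the susceptible–infected–recovered (SIR) core that any model like this builds on is easy to sketch in Python. The parameter values here are illustrative assumptions of mine, not the model’s calibration:

```python
# Minimal SIR sketch for a community of 100,000. Parameters are
# illustrative guesses, not the Vensim model's calibrated values.
def simulate_sir(population=100_000, infected0=1, contact_rate=0.3,
                 recovery_time=10.0, days=365, dt=0.25):
    """Euler-integrate a basic SIR model; return the peak infected count."""
    s = population - infected0
    i = float(infected0)
    r = 0.0
    peak = i
    for _ in range(int(days / dt)):
        infections = contact_rate * i * s / population  # new cases per day
        recoveries = i / recovery_time                  # recoveries per day
        s -= infections * dt
        i += (infections - recoveries) * dt
        r += recoveries * dt
        peak = max(peak, i)
    return peak

# Early, robust intervention matters: halving the contact rate
# dramatically cuts the epidemic peak.
peak_base = simulate_sir(contact_rate=0.3)
peak_mitigated = simulate_sir(contact_rate=0.15)
```

Even this toy version shows the headline behavior: cutting contacts early doesn’t just delay the peak, it shrinks it.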
The model, in .mdl and .vpmx formats for any Vensim version:
Update 3/12: community corona 8-mdl+vpmx.zip
There’s another copy at https://vensim.com/coronavirus/ along with links to the software.
Whether or not we can prove that a system experiences trophic cascades and other nonlinear side-effects, we should manage as if it does, because we know that these dynamics are common.
There’s been a long-running debate over whether wolf reintroduction led to a trophic cascade in Yellowstone. There’s a nice summary here:
Yesterday, June initiated an in depth discussion on the benefit of wolves in Yellowstone, in the form of trophic cascade with the video: How Wolves Change the River:
This was predicted by some, and has been studied by William Ripple, Robert Beschta Trophic Cascades in Yellowstone: The first fifteen years after wolf reintroduction http://www.cof.orst.edu/leopold/papers/RippleBeschtaYellowstone_BioConserv.pdf
Shannon, Roger, and Mike voiced caution that the verdict was still out.
I would like to caution that many of the reported “positive” impacts wolves have had on the environment after coming back to Yellowstone remain unproven or are at least controversial. This is still a hotly debated topic in science but in the popular media the idea that wolves can create a Utopian environment all too often appears to be readily accepted. If anyone is interested, I think Dave Mech wrote a very interesting article about this (attached). As he puts it “the wolf is neither a saint nor a sinner except to those who want to make it so”.
I see 2 points of caution regarding reports of wolves having “positive” impacts in Yellowstone. One is that understanding cause and effect is always hard, nigh onto impossible, when faced with changes that occur in one place at one time. We know that conditions along rivers and streams have changed in Yellowstone but how much “cause” can be attributed to wolves is impossible to determine.
Perhaps even more important is that evaluations of whether changes are “positive” or “negative” are completely human value judgements and have no basis in science, in this case in the science of ecology.
-Ely Field Naturalists
Of course, in a forum discussion, this becomes:
Wolves changed rivers.
No they didn’t.
Yes they did.
(iterate ad nauseam)
… with “prove it” roughly understood to mean establishing that river = a + b*wolves, rejecting the null hypothesis that b=0 at some level of statistical significance.
I would submit that this is a poor framing of the problem. Given what we know about nonlinear dynamics in networks like an ecosystem, it’s almost inconceivable that there would not be trophic cascades. Moreover, it’s well known that simple correlation would not be able to detect such cascades in many cases anyway.
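One reason simple correlation fails: in a stock-flow system, the effect accumulates its cause. A variable and its own integral are uncorrelated over a full cycle, so a regression of one on the other sees nothing even when causation is total. A toy illustration (the variable names are stand-ins, not ecological data):

```python
import math

# Suppose "wolf pressure" oscillates and "river state" responds to the
# accumulated (integrated) pressure -- pure causation, zero noise.
n = 10_000
t = [2 * math.pi * k / n for k in range(n)]   # one full cycle
wolves = [math.sin(x) for x in t]             # the driver
river = [-math.cos(x) for x in t]             # its exact integral

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

r = pearson(wolves, river)   # essentially zero despite perfect causation
```

The correlation is numerically zero, so "reject the null that b=0" would conclude wolves do nothing, in a system where wolves drive everything.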
A “no effect” default in other situations seems equally naive. Is it really plausible that a disturbance to a project would not have any knock-on effects? That stressing a person’s endocrine system would not cause a path-dependent response? I don’t think so. Somehow we need ordinary conversations to employ more sophisticated notions about models and evidence in complex systems. I think at least two ideas are useful:
- The idea that macro behavior emerges from micro structure. The appropriate level of description of an ecosystem, or a project, is not a few time series for key populations, but an operational, physical description of how species reproduce and interact with one another, or how tasks get done.
- A Bayesian approach to model selection, in which our belief in a particular representation of a system is proportional to the degree to which it explains the evidence, relative to various alternative formulations, not just a naive null hypothesis.
In both cases, it’s important to recognize that the formal, numerical data is not the only data applicable to the system. It’s also crucial to respect conservation laws, units of measure, extreme conditions tests and other Reality Checks that essentially constitute free data points in parts of the parameter space that are otherwise unexplored.
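The Bayesian idea can be made concrete: each candidate model gets a posterior weight proportional to how well it explains the evidence, so a "no effect" model competes on equal footing instead of serving as a privileged default. A toy sketch, where the likelihood numbers are purely illustrative assumptions:

```python
def posterior_model_probs(priors, likelihoods):
    """Posterior probability of each model: prior * likelihood, normalized."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Three candidate models of the same data, with equal prior belief:
# M0 = "no effect" null, M1 = linear effect, M2 = nonlinear cascade.
priors = [1 / 3, 1 / 3, 1 / 3]
likelihoods = [0.02, 0.10, 0.40]   # p(data | model) -- made-up values
probs = posterior_model_probs(priors, likelihoods)
# The cascade model dominates, without ever "rejecting" a null.
```

The point is the framing: no single hypothesis gets to be the default that all others must overturn at some significance level.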
The way we think and talk about these systems guides the way we act. Whether or not we can prove in specific instances that Yellowstone had a trophic cascade, or the Chunnel project had unintended consequences, we need to manage these systems as if they could. Complexity needs to be the default assumption.
When things really warm up, to +9 degrees F (not at all implausible in the long run), 16 of the top 20 analogs are in CO and UT, …
Looking at a lot of these future climate analogs on Google Earth, their common denominator appears to be rattlesnakes. I’m sure they’re all nice places in their own way, but I’m worried about my trees. I’ll continue to hope that my back-of-the-envelope analysis is wrong, but in the meantime I’m going to hedge by managing the forest to prepare for change.
I think there’s a lot more to worry about than trees. Fire, wildlife, orchids, snowpack, water availability, …
Recently I decided to take another look, partly inspired by the Bureau of Reclamation’s publication of downscaled data. This solves some of the bias correction issues I had in 2008. I grabbed the model output (36 runs from CMIP5) and observations for the 1/8 degree gridpoint containing Bridger Bowl:
Then I used Vensim to do a little data processing, converting the daily time series (which are extremely noisy weather) into 10-year moving averages (i.e., climate).
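The original processing was done in Vensim, but the same weather-into-climate smoothing is a one-function job in Python (the synthetic seasonal series here is just for illustration):

```python
# Convert a noisy daily series into a 10-year (3650-day) trailing
# moving average -- turning weather into climate. The real processing
# was done in Vensim; this is an equivalent pure-Python sketch.
def moving_average(series, window):
    """Trailing moving average, defined once a full window is available."""
    out = []
    running = 0.0
    for i, x in enumerate(series):
        running += x
        if i >= window:
            running -= series[i - window]
        if i >= window - 1:
            out.append(running / window)
    return out

daily = [float(d % 365) for d in range(20 * 365)]  # fake seasonal "weather"
climate = moving_average(daily, 10 * 365)          # flat "climate" signal
```

Because the 10-year window spans the seasonal cycle exactly, the smoothed series is flat: all the within-year noise averages out, leaving only long-term change (none, in this fake data).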
In this case, I think it’s quite literally Normal a.k.a. Gaussian:
Here’s what I think is happening. On windless days with powder, the snow dribbles off the edge of the roof (just above the center of the hump). Flakes drift down in a random walk. The railing terminates the walk after about four feet, by which time the distribution of flake positions has already reached the Normal you’d expect from the Central Limit Theorem.
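The mechanism is easy to reproduce: drop many flakes at the roof edge, let each take a horizontal random walk as it falls, and stop the walk at the railing. The step counts and sizes below are guesses at the physical scale, not measurements:

```python
import random

random.seed(0)

# Each flake takes `steps` small left/right drifts before the railing
# terminates its fall. Its landing position is a sum of many independent
# steps, so the Central Limit Theorem makes the pile of endpoints Normal.
def flake_landing(steps=100, step_size=0.5):   # step_size in inches (a guess)
    return sum(random.choice((-step_size, step_size)) for _ in range(steps))

landings = [flake_landing() for _ in range(20_000)]
mean = sum(landings) / len(landings)
var = sum((x - mean) ** 2 for x in landings) / len(landings)
# Theory: mean 0, standard deviation = step_size * sqrt(steps) = 5 inches.
```

Histogram the landings and you get the hump on the railing: a Gaussian centered under the drip line, with width set by how far the flakes can wander before the walk ends.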
Enough of the geek stuff; I think I’ll go ski the field.
One of the smaller and cuter stakeholders in climate change is the pika, an alpine mini-rabbit that has nowhere to go as warming takes their habitable zone above mountaintops.
Support pika research on RocketHub.
Brandon Martin-Anderson made this cool map of the US, with a dot at the approximate residence of every person in the Census. It’s fun, but can be tricky to navigate to your locale, so here’s Bozeman:
Old joke: How do you make a small fortune breeding horses? Start with a large fortune ….
It appears that the same logic applies to coal mining here in the Northern Rockies.
With US coal use in slight decline, exports are the growth market. Metallurgical and steam coal currently export for about $140 and $80 per short ton, respectively. But the public will see almost none of that, because unmanaged quantity and “competitive” auctions that are uncompetitive (just like Montana trust land oil & gas), plus low royalty, rent and bonus rates, result in a tiny slice of revenue accruing to the people (via federal and state governments) who actually own the resource.
For the Powder River Basin, here’s how it pencils out in rough terms:
| Item ($/short ton) | Value |
|---|---|
| Royalty, rents & bonus | $2 |
| Social Cost of Carbon (@ $21/tonCO2, medium value) | -$55 |
| US domestic SCC (at 15% of global; average of 7% damage share and 23% GDP share) | -$8 |
| Net US public benefit | < -$6 |
In other words, the US public loses at least $3 for every $1 of coal revenue earned. The reality is probably worse, because the social cost of carbon estimate is extremely conservative, and other coal externalities are omitted. And of course the global harm is much greater than the US’ narrow interest.
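The arithmetic behind "at least $3 lost for every $1 earned," using the rounded per-ton figures in the table above:

```python
# Per short ton of Powder River Basin coal, rounded (from the table above):
royalty_rents_bonus = 2.0   # $ accruing to the public per ton
us_domestic_scc = -8.0      # US share of the social cost of carbon, per ton

net_us_public_benefit = royalty_rents_bonus + us_domestic_scc   # -$6/ton
loss_per_revenue_dollar = -net_us_public_benefit / royalty_rents_bonus  # 3x
```

And that uses only the narrow US share of climate damage; at the full $55/ton global SCC, the loss ratio is far worse.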
Even if you think of coal mining as a jobs program, at Wyoming productivity, the climate subsidy alone is almost half a million dollars per worker.
This makes it hard to get enthusiastic about the planned expansion of exports.
When times are tough, there are always calls to unravel environmental regulations and drill, baby, drill. I’m first in line to say that a lot of environmental regulation needs a paradigm shift, but this strikes me as a foolish hair-of-the-dog-that-bit-ya idea. Our current problems don’t come from regulation, and won’t be solved by deregulation.
On average, there’s no material deprivation in the US. We consume more petroleum per capita than any other large nation. Our problems are largely distributional – inequitable income distribution and, recently, high unemployment, which causes disproportionate harm to a few. Why solve a distributional problem by skewing environmental policy? This smacks of an attempt to grow out of our problems, which is surely doomed to the extent that growth relies on intensifying material throughput.
Consider the system:
The underlying mental model behind calls for deregulation sounds like the following: environmental regulations create compliance costs that drive up the total cost of resource extraction, depressing the production rate and depriving the people of needed $$$ and happiness. Certainly that causal path exists. But it’s not the only thing going on.
Those regulations were created for a reason. They reduce environmental impacts, and therefore reduce the unpaid social costs that occur as side effects of oil production and consumption, and therefore improve welfare. These effects are nontrivial, unless you’re a GOP presidential candidate. One could wish for more efficient regulations, but absent that, wishing for less regulation is tantamount to wishing for more environmental consequences and social costs, and hoping that more $$$ will offset that.
Even the basic open-loop rationale for deregulation makes little sense. Resource policy is already loose, so there’s no quantity constraint on production. With the exception of ANWR and some offshore areas, most interesting areas are already leased. Montana certainly doesn’t exercise any foresight in the management of its trust lands. Environmental regulations have hardly become more stringent in the last decade or so. Oil production in 1999 was higher than it is today, with oil prices well below $20/bbl, so compliance costs must be less than $20/bbl. So, with oil at $100/bbl, we’d expect an explosion of supply if regulatory costs were the only constraint. In fact, there’s barely an upward blip, so there must be something else at work…
The real problem is that there’s feedback in the system. For example, there’s balancing loop B1: as you extract more stuff, the remaining resource (oil in the ground) dwindles, and the physical costs of extraction – capital, labor, energy – go up. Technology can stave off that trend for some time, but prices and production trends make it clear that B1 is now dominant. This means that there’s a rather stark better-before-worse tradeoff: if we extract oil more quickly now, to hoist ourselves out of the financial crisis, we’ll have less later. But it seems likely that we’ll be even more desperate later – either to have that oil in an even pricier world market, or to keep it in the ground due to climate concerns. Consider what would have happened if we’d had no environmental constraints on oil production for the last three or four decades. Would the US now have more or less oil to rely on? Would we be happy that we pumped up all that black gold at under $20/bbl? Even the Hotelling rule is telling us that we should leave oil in the ground, as long as prices are rising faster than the interest rate (not hard, at current rates).
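The Hotelling comparison in that last sentence is just present value: a barrel left in the ground appreciates at the price growth rate, while a barrel sold today earns only the interest rate on its proceeds. A toy check with assumed rates (illustrative, not a forecast):

```python
# Hotelling-style comparison with made-up rates:
price_today = 100.0     # $/bbl
price_growth = 0.05     # assumed annual price appreciation
interest_rate = 0.02    # assumed return on proceeds if sold now
years = 10

sell_now = price_today * (1 + interest_rate) ** years    # bank the cash
sell_later = price_today * (1 + price_growth) ** years   # leave it in the ground
# With prices rising faster than the interest rate, waiting wins.
```

Whenever the price growth rate exceeds the interest rate, the barrel in the ground is the better asset, which is exactly the opposite of the drill-now logic.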
Another loop is just gaining traction: B2. As the stock of oil in the ground is depleted, marginal production occurs in increasingly desperate and devastating circumstances. Either you pursue smaller, more remote fields, meaning more drilling and infrastructure disturbance in sensitive areas, or you pursue unconventional resources, like tar sands and shale gas, with resource-intensive methods and unknown hazards. A regulatory rollback would accelerate production via the most destructive extraction methods, right at the time that the physics of extraction is already shifting the balance of private benefits ($$$) and social costs unfavorably. Loop B2 also operates inequitably, much like unemployment. Not everyone is harmed by oil and gas development; the impacts fall disproportionately on the neighbors of projects, who may not even benefit due to severance of surface and mineral rights. This weakens the argument for deregulation even further.
Rather than pretending we can turn the clock back to 1970, we should be thinking carefully about our exit strategy for scarce and climate-constrained resources. There must be lots of things we can do to solve the distributional problems of the current crisis without socializing the costs and privatizing the gains of fossil fuel exploitation more than we already do.
I’ve been browsing the ALEC model legislation on ALECexposed, some of which infiltrated the Montana legislature. It’s discouragingly predictable stuff, but not without a bit of amusement. Take the ALEC Energy Principles:
Mission: To define a comprehensive strategy for energy security, production, and distribution in the states consistent with the Jeffersonian principles of free markets and federalism.
Except when authoritarian government is needed to stuff big infrastructure projects down the throats of unwilling private property owners:
Reliable electricity supply depends upon significant improvement of the transmission grid. Interstate and intrastate transmission siting authority and procedures must be addressed to facilitate the construction of needed new infrastructure.
Like free markets, federalism apparently has its limits:
Such plan shall only be approved by the commission if the expense of implementing such a plan is borne by the federal government.