These bears showed up for dinner just as we sat down. Fortunately, they didn’t want any curry and were satisfied with some fruit.
Month: August 2012
The model that ate Europe is back, and it's bigger than ever
The FuturICT Knowledge Accelerator, a grand unified model of everything, is back in the news.
What if global scale computing facilities were available that could analyse most of the data available in the world? What insights could scientists gain about the way society functions? What new laws of nature would be revealed? Could society discover a more sustainable way of living? Developing planetary scale computing facilities that could deliver answers to such questions is the long term goal of FuturICT.
I’ve been rather critical of this effort before, but I think there’s also much to like.
- An infrastructure for curated public data would be extremely useful.
- There’s much to be gained through a multidisciplinary focus on simulation, which is increasingly essential and central to all fields.
- Providing a public portal into the system could have valuable educational benefits.
- Creating more modelers, and more sophisticated model users, helps build capacity for science-based self governance.
But I still think the value of the project is more about creating an infrastructure, within which interesting models can emerge, than it is in creating an oracle that decision makers and their constituents will consult for answers to life’s pressing problems.
- Even with Twitter and Google, usable data spans only a small portion of human existence.
- We’re not even close to having all the needed theory to go with the data. Consider that general equilibrium is the dominant modeling paradigm in economics, yet equilibrium is not a prevalent feature of reality.
- Combinatorial explosion can overwhelm any increase in computing power for the foreseeable future, so the very idea of simulating everything social and physical at once is laughable (see the rough illustration after this list).
- Even if the technical hurdles can be overcome, there’s no guarantee that insight would translate into action:
- People are apparently happy to hold beliefs that are refuted by the facts, as long as buffering stocks afford them the luxury of a persistent gap between reality and mental models.
- Decision makers are unlikely to cede control to models that they don’t understand or can’t manipulate to generate desired results.
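To see why combinatorial explosion is fatal to the “simulate everything” ambition, here’s a toy calculation (the numbers are illustrative assumptions, not anything from the FuturICT proposal): with n interacting binary state variables, exhaustively visiting the joint state space takes 2^n evaluations.

```python
# Toy illustration of combinatorial explosion: n interacting binary
# variables have 2**n joint states. Compare that with a generous hardware
# budget of one exaFLOP (1e18 ops/s) running flat out for a year.
# All figures here are illustrative assumptions.

OPS_PER_YEAR = 1e18 * 3600 * 24 * 365  # ~3e25 operations per exaFLOP-year

for n in (50, 100, 300):
    states = 2 ** n
    print(f"{n:3d} variables: {states:.1e} states "
          f"= {states / OPS_PER_YEAR:.1e} exaFLOP-years just to visit each once")
```

Even a full exaFLOP-year barely scratches 100 binary variables, and real social-physical systems have vastly more state than that.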
I don’t think you need to look any further than the climate debate and the history of Limits to Growth to conclude that models are a long way from catalyzing a sustainable world.
If I had a billion Euros to spend on modeling, I think less of it would go into a single platform and more would go into distributed efforts that are working incrementally. It’s easier to evolve a planetary computing platform than to design one.
With the increasing accessibility of computing and visualization, we could be on the verge of a model-induced renaissance. Or, we could be on the verge of an explosion of fun and pretty but vacuous, non-transparent and unvalidated model rubbish that lends itself more to propaganda than thinking. So, I’d be plowing a BIG chunk of that billion into infrastructure and incentives for model and data quality.
On the usefulness of big models
Steven Wright’s “life size map” joke is a lot older than I thought:
On Exactitude in Science
Jorge Luis Borges, Collected Fictions, translated by Andrew Hurley.
…In that Empire, the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City, and the map of the Empire, the entirety of a Province. In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it. The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast Map was Useless, and not without some Pitilessness was it, that they delivered it up to the Inclemencies of Sun and Winters. In the Deserts of the West, still today, there are Tattered Ruins of that Map, inhabited by Animals and Beggars; in all the Land there is no other Relic of the Disciplines of Geography.
—Suarez Miranda, Viajes de varones prudentes, Libro IV, Cap. XLV, Lerida, 1658
It’s no less relevant to big models, though.
h/t Benjamin Blonder
In search of SD conference excellence
I was pleasantly surprised by the quality of presentations I attended at the SD conference in St. Gallen. Many of the posters were also very good – the society seems to have been successful in overcoming the booby-prize stigma, making it a pleasure to graze on the often-excellent work in a compact format (if only the hors d’oeuvre line had had brevity to match its tastiness…).
In anticipation of an even better array of papers next year, here’s my quasi-annual reminder about resources for producing good work in SD:
- Doing quality simulation research - Rahmandad & Sterman on reporting guidelines
- The secret to successful system dynamics modeling, with links to earlier posts on writing good SD papers and effective model critique
I suppose I should add posts on good presentation technique and poster development (thoughts welcome).
Thanks to the organizers for a well-run enterprise in a pleasant venue.
Beggaring ourselves through coal mining
Old joke: How do you make a small fortune breeding horses? Start with a large fortune…
It appears that the same logic applies to coal mining here in the Northern Rockies.
With US coal use in slight decline, exports are the growth market. Metallurgical and steam coal currently export for about $140 and $80 per short ton, respectively. But the public will see almost none of that: unmanaged quantities, “competitive” auctions that are in fact uncompetitive (just like Montana trust land oil & gas), and low royalty, rent, and bonus rates leave only a tiny slice of revenue for the people (via federal and state governments) who actually own the resource.
For the Powder River Basin, here’s how it pencils out in rough terms:
| Item | $/ton |
| --- | --- |
| Minemouth price | $10 |
| Royalty, rents & bonus | $2 |
| Social cost of carbon (@ $21/ton CO2, medium value) | -$55 |
| US domestic SCC (15% of global: average of 7% damage share and 23% GDP share) | -$8 |
| Net US public benefit | < -$6 |
In other words, the US public loses at least $3 for every $1 of coal revenue earned. The reality is probably worse, because the social cost of carbon estimate is extremely conservative, and other coal externalities are omitted. And of course the global harm is much greater than the US’ narrow interest.
Even if you think of coal mining as a jobs program, at Wyoming productivity, the climate subsidy alone is almost half a million dollars per worker.
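Roughly, the back-of-the-envelope arithmetic looks like this. The per-ton dollar figures come from the table above; the Wyoming productivity figures (about 30 short tons per miner-hour and 2,000 hours per miner-year) are round approximations of my own, not precise statistics.

```python
# Back-of-the-envelope check of the coal numbers above. Per-ton dollar
# figures come from the table; the productivity figures are rough assumptions.

global_scc_per_ton = 55.0                    # $/short ton of coal, at $21/ton CO2
domestic_share = (0.07 + 0.23) / 2           # average of damage share and GDP share
domestic_scc_per_ton = domestic_share * global_scc_per_ton   # ~ $8/ton

public_revenue_per_ton = 2.0                 # royalty, rents & bonus, $/ton
net_public_benefit = public_revenue_per_ton - domestic_scc_per_ton   # ~ -$6/ton

loss_per_revenue_dollar = -net_public_benefit / public_revenue_per_ton  # ~ $3
print(f"Net US public benefit: ${net_public_benefit:.2f}/ton")
print(f"Loss per $1 of public coal revenue: about ${loss_per_revenue_dollar:.1f}")

# Jobs-program view: domestic climate subsidy per worker-year, assuming
# ~30 short tons per miner-hour and ~2,000 hours per miner-year.
tons_per_worker_year = 30 * 2000             # ~60,000 short tons per miner-year
subsidy_per_worker = domestic_scc_per_ton * tons_per_worker_year
print(f"Domestic climate subsidy per worker-year: ~${subsidy_per_worker:,.0f}")
```

That works out to roughly $3 of public loss per public dollar earned, and close to half a million dollars of climate subsidy per worker-year.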
This makes it hard to get enthusiastic about the planned expansion of exports.
Global lukewarming
Fred Krupp, President of EDF, has an op-ed on climate policy in the WSJ. I have to give him credit for breaking into a venue that is staunchly ignorant of the realities of climate change. An excerpt:
If both sides can now begin to agree on some basic propositions, maybe we can restart the discussion. Here are two:
The first will be uncomfortable for skeptics, but it is unfortunately true: Dramatic alterations to the climate are here and likely to get worse—with profound damage to the economy—unless sustained action is taken. As the Economist recently editorialized about the melting Arctic: “It is a stunning illustration of global warming, the cause of the melt. It also contains grave warnings of its dangers. The world would be mad to ignore them.”
The second proposition will be uncomfortable for supporters of climate action, but it is also true: Some proposed climate solutions, if not well designed or thoughtfully implemented, could damage the economy and stifle short-term growth. As much as environmentalists feel a justifiable urgency to solve this problem, we cannot ignore the economic impact of any proposed action, especially on those at the bottom of the pyramid. For any policy to succeed, it must work with the market, not against it.
If enough members of the two warring climate camps can acknowledge these basic truths, we can get on with the hard work of forging a bipartisan, multi-stakeholder plan of action to safeguard the natural systems on which our economic future depends.
I wonder, though, if the price of admission was too high. Krupp equates two risks: climate impacts, and policy side effects. But this is a form of false balance – these risks are not in the same league.
Policy side effects are certainly real – I’ve warned against inefficient policies multiple times (e.g., overuse of standards). But the effects of a policy are readily visible to well-defined constituencies, mostly short term, and diverse across jurisdictions with different implementations. This makes it easy to learn what’s working and to stop doing what’s not (and there’s never a shortage of advocates for the latter), without suffering large cumulative effects. Most of the inefficient approaches (like banning the bulb) are economically minuscule.
Climate risk, on the other hand, accrues largely to people in faraway places who aren’t even born yet. It’s subject to reinforcing feedbacks (like civil unrest) and big uncertainties, known and unknown, that lend it a heavy tail of bad outcomes, which are not economically marginal.
The net balance of these different problem characteristics is that there’s little chance of catastrophic harm from climate policy, but a substantial chance from failure to have a climate policy. There’s also almost no chance that we’ll implement a too-stringent climate policy, or that it would stick if we did.
The ultimate irony is that EDF’s preferred policy is cap & trade, which trades illusory environmental certainty for considerable economic inefficiency.
Does this kind of argument reach a wavering middle ground? Or does it fail to convince skeptics, while weakening the position of climate policy proponents by conceding strawdog growth arguments?