Why is national modeling hard?

If you’ve followed the work on the System Dynamics National Model, you know that it ended without being completed. Yet there is a vast amount of interesting structure in the model, and there have been many productive spinoffs from the work. How can this be?

I think there are several explanations. One is that the problem is intrinsically hard. Economies are big, and they operate at many scales. There are micro processes (firms investing in capacity and choosing technologies) but also evolutionary processes (firms that get it wrong die).

This means there’s no one to ask when you want to understand how things work. You can’t ask someone about their car fuel purchase habits and aggregate up to national energy intensity, because their understanding encompasses Ford vs. Chevy, not all the untried and future contingencies in the economic network beyond their limited sphere of influence.

You can’t ask the data. Data must always be interpreted through the lens of a model, and model structure is what we lack. If we had a lot more data, we might be able to infer more about the constraints on plausible structures, but economic data is pretty sparse compared to the number of constructs we need to understand.

In spite of this, dynamic general equilibrium models have managed to model whole economies. Why have they succeeded? I think there are two answers. First, they cheat. They reduce all behavior to an optimization algorithm. That’s guaranteed to yield an answer, but whether that answer has any relevance to the real world is debatable. Second, they give answers that people who fund economic models like: the world is just fine as it is, externalities don’t exist, and all policy interventions are costly.

All this is not to say that we’ll never have useful national models; indeed we already have many models (including the DGEs) that are useful for some purposes. But we still have a long way to go before we have solid macrobehavior from microfoundations to inform policy broadly.

 

Early economic dynamics: Samuelson's multiplier-accelerator

Paul Samuelson’s 1939 analysis of the multiplier-accelerator is a neat piece of work. Too bad it’s wrong.

Interestingly, this work dates from a time in which the very idea of a mathematical model was still questioned:

Contrary to the impression commonly held, mathematical methods properly employed, far from making economic theory more abstract, actually serve as a powerful liberating device enabling the entertainment and analysis of ever more realistic and complicated hypotheses.

Samuelson should be hailed as one of the early explorers of a very big jungle.

The basic statement of the model is very simple:

[Figure: the basic national income equations]
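In symbols, Samuelson’s setup is: constant government spending $g_t = 1$; consumption $C_t = \alpha Y_{t-1}$, where $\alpha$ is the marginal propensity to consume (the multiplier side); induced private investment $I_t = \beta\,(C_t - C_{t-1})$, where $\beta$ is the accelerator coefficient (Samuelson’s “relation”); and national income $Y_t = g_t + C_t + I_t$.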

In quasi-System Dynamics notation, that looks like:

[Figure: the multiplier-accelerator in quasi-System Dynamics notation]

A caveat:

The limitations inherent in so simplified a picture as that presented here should not be overlooked. In particular, it assumes that the marginal propensity to consume and the relation are constants; actually these will change with the level of income, so that this representation is strictly a marginal analysis to be applied to the study of small oscillations. Nevertheless it is more general than the usual analysis.

Samuelson hand-simulated the model (it’s fun – once – but he runs four scenarios):

[Figure: hand-simulated scenarios]

Samuelson then solves the discrete time system to identify four regions with different behavior: goal seeking (exponential decay to a steady state), damped oscillations, unstable (explosive) oscillations, and unstable exponential growth or decline. He nicely maps the parameter space:

[Figure: map of the parameter space]
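For the curious, here’s a minimal Python sketch of the same exercise (the parameter pairs are illustrative, one per region, and not necessarily Samuelson’s own four cases). It iterates the structural equations from rest, with unit government spending switched on in period 1 as in Samuelson’s tables, and classifies each pair from the roots of the characteristic equation $z^2 - \alpha(1+\beta)z + \alpha\beta = 0$:

```python
import numpy as np

def simulate(alpha, beta, periods=40):
    """Samuelson's multiplier-accelerator, structural form: unit government
    spending begins in period 1, and the economy starts at rest."""
    Y = np.zeros(periods + 1)  # national income by period; Y[0] = 0
    C_prev = 0.0               # consumption in the previous period
    for t in range(1, periods + 1):
        C = alpha * Y[t - 1]     # multiplier: consume a fraction of last period's income
        I = beta * (C - C_prev)  # accelerator: invest in proportion to the change in consumption
        Y[t] = 1.0 + C + I       # income = government spending + consumption + investment
        C_prev = C
    return Y

def classify(alpha, beta):
    """Behavior regime from the roots of z^2 - alpha(1+beta) z + alpha*beta = 0."""
    roots = np.roots([1.0, -alpha * (1 + beta), alpha * beta])
    oscillatory = np.iscomplex(roots).any()
    stable = np.max(np.abs(roots)) < 1.0
    if oscillatory:
        return "damped oscillation" if stable else "explosive oscillation"
    return "asymptotic approach to steady state" if stable else "explosive growth or decline"

# One illustrative (alpha, beta) pair per region of the parameter space
for alpha, beta in [(0.5, 0.0), (0.6, 1.0), (0.8, 2.0), (0.9, 4.0)]:
    print(f"alpha={alpha}, beta={beta}: {classify(alpha, beta)}")

print(simulate(0.6, 1.0)[:12].round(2))  # hand-simulation table for the damped-oscillation case
```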

[Figure: behavior by region of the parameter space]

So where’s the problem?

The first problem is not so much of Samuelson’s making as it is a limitation of the pre-computer era. The essential simplification of the model for analytic solution is:

[Figure: the simplified reduced-form equation]
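Written out from the structural equations above, with unit government spending, that reduced form is the second-order difference equation $Y_t = 1 + \alpha(1+\beta)\,Y_{t-1} - \alpha\beta\,Y_{t-2}$.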

This is fine, but it’s incredibly abstract. Presented with this equation out of context – as readers often are – it’s almost impossible to posit a sensible description of how the economy works that would enable one to critique the model. This kind of notation remains common in econometrics, to the detriment of understanding and progress.

At the first SD conference, Gil Low presented a critique and reconstruction of the MA model that addressed this problem, providing an operational description of the economy that remains consistent with the multiplier-accelerator framework.

[Figure: Low’s operational reconstruction]

The mere act of crafting a stock-flow description reveals problem #1: the basic multiplier-accelerator doesn’t conserve stuff.

[Figures: inventory and capital stock structure]

Non-conservation of stuff leads to problem #2. When you do implement inventories and capital stocks, the period of multiplier-accelerator oscillations moves to about 2 decades – far from the 3-7 year period of the business cycle that Samuelson originally sought to explain. This occurs in part because the capital stock, with a 15-year lifetime, introduces considerable momentum. You simply can’t discover this problem in the original multiplier-accelerator framework, because too many physical and behavioral time constants are buried in the assumptions associated with its 2 parameters.

Low goes on to introduce labor, finding that variations in capacity utilization do produce oscillations of the required time scale.

[Figure: short-term oscillations in Low’s extended model]

I think there’s a third problem with the approach as well: discrete time. Discrete time notation is convenient for matching a model to data sampled at regular intervals. But the economy is not even remotely close to operating in discrete annual steps. Moreover, a one-year step is dangerously close to the 3-year period of the business cycle phenomenon of interest, so there is a distinct possibility that some of the oscillatory tendency is an artifact of discrete time sampling. While improper oscillations can be detected analytically, with discrete time notation it’s not easy to apply the simple heuristic of halving the time step to test stability, because that either merely compresses the time axis or causes problems with implicit time constants, depending on how the model is implemented. Halving the time step and switching to RK4 integration illustrate these issues:

[Figure: behavior with a halved time step and RK4 integration]
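To see the first half of that problem concretely, here’s a minimal sketch: if “one period” in the reduced-form map just means “one simulation step,” halving the step reproduces exactly the same sequence of values and merely compresses the calendar-time axis, so the apparent oscillation period halves without saying anything about whether the oscillation is real.

```python
import numpy as np

def reduced_form(alpha, beta, dt, horizon=40.0):
    """Iterate Y[t] = 1 + alpha(1+beta) Y[t-1] - alpha*beta Y[t-2],
    treating 'one period' as one simulation step of length dt."""
    n = int(horizon / dt)
    Y = np.zeros(n + 1)
    for t in range(2, n + 1):
        Y[t] = 1.0 + alpha * (1 + beta) * Y[t - 1] - alpha * beta * Y[t - 2]
    return np.arange(n + 1) * dt, Y

# Same parameters, two step sizes: the step-by-step values are identical,
# so the peaks simply arrive at half the calendar time when dt is halved.
for dt in (1.0, 0.5):
    time, Y = reduced_form(alpha=0.6, beta=2.0, dt=dt)
    peaks = time[1:-1][(Y[1:-1] > Y[:-2]) & (Y[1:-1] > Y[2:])]
    print(f"dt={dt}: first peaks at t = {peaks[:3].round(1)}")
```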

It seems like a no-brainer that economic dynamic models should start with operational descriptions, continuous time, and engineering state-variable or stock-flow notation. Abstraction and discrete time should emerge as simplifications, as needed for analysis or calibration. The fact that this has not become standard operating procedure suggests that the invisible hand is sometimes rather slow as it gropes for understanding.

The model is in my library.

See Richardson’s Feedback Thought in Social Science and Systems Theory for more history.

Samuelson’s Multiplier Accelerator

This is a fairly direct implementation of the multiplier-accelerator model from Paul Samuelson’s classic 1939 paper,

“Interactions between the Multiplier Analysis and the Principle of Acceleration” PA Samuelson – The Review of Economics and Statistics, 1939 (paywalled on JSTOR, but if you register you can read a limited number of publications for free)

[Figure: the multiplier-accelerator in quasi-System Dynamics notation]

This is a nice example of very early economic dynamics analyses, and also demonstrates implementation of discrete time notation in Vensim.

The crisis was not predicted because crises aren't predictable?

There’s a terrific essay on economics by John Kay on the INET blog. Some juicy excerpts follow, but it’s really worth the trip to read the whole thing. They’ve invited some other economists to respond, which should be interesting.

The Map is Not the Territory: An Essay on the State of Economics

by JOHN KAY

The reputation of economics and economists, never high, has been a victim of the crash of 2008. The Queen was hardly alone in asking why no one had predicted it. An even more serious criticism is that the economic policy debate that followed seems only to replay the similar debate after 1929. The issue is budgetary austerity versus fiscal stimulus, and the positions of the protagonists are entirely predictable from their previous political allegiances.

The doyen of modern macroeconomics, Robert Lucas, responded to the Queen’s question in a guest article in The Economist in August 2009.[1] The crisis was not predicted, he explained, because economic theory predicts that such events cannot be predicted. Faced with such a response, a wise sovereign will seek counsel elsewhere.

[…] All science uses unrealistic simplifying assumptions. Physicists describe motion on frictionless planes, gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. To put such models to practical use, you must be willing to bring back the excluded factors. You will probably find that this modification will be important for some problems, and not others – air resistance makes a big difference to a falling feather but not to a falling cannonball.

But Lucas and those who follow him were plainly engaged in a very different exercise, as the philosopher Nancy Cartwright has explained.[4] The distinguishing characteristic of their approach is that the list of unrealistic simplifying assumptions is extremely long. Lucas was explicit about his objective[5] – ‘the construction of a mechanical artificial world populated by interacting robots that economics typically studies’. An economic theory, he explains, is something that ‘can be put on a computer and run’. Lucas has called structures like these ‘analogue economies’, because they are, in a sense, complete economic systems. They loosely resemble the world, but a world so pared down that everything about them is either known, or can be made up. Such models are akin to Tolkien’s Middle Earth, or a computer game like Grand Theft Auto.

[… interesting discussion of the fiscal crisis as a debate over Ricardian equivalence …]
But another approach would discard altogether the idea that the economic world can be described by a universally applicable model in which all key relationships are predetermined. Economic behaviour is influenced by technologies and cultures, which evolve in ways that are certainly not random but which cannot be described fully, or perhaps at all, by the kinds of variables and equations with which economists are familiar. Models, when employed, must therefore be context specific, in the manner suggested in a recent book by Roman Frydman and Michael Goldberg.[8]

[…]

But you would not nowadays be able to publish similar articles in a good economics journal. You would be told that your model was theoretically inadequate – it lacked rigour, failed to demonstrate consistency. You might be accused of the cardinal sin of being ‘ad hoc’. Rigour and consistency are the two most powerful words in economics today.

[…]

Consistency and rigour are features of a deductive approach, which draws conclusions from a group of axioms – and whose empirical relevance depends entirely on the universal validity of the axioms. The only descriptions that fully meet the requirements of consistency and rigour are complete artificial worlds, like those of Grand Theft Auto, which can ‘be put on a computer and run’.

For many people, deductive reasoning is the mark of science, while induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. ‘The first siren of beauty’, says Cochrane, ‘is logical consistency’. It seems impossible that anyone acquainted with great human achievements – whether in the arts, the humanities or the sciences – could really believe that the first siren of beauty is consistency. This is not how Shakespeare, Mozart or Picasso – or Newton or Darwin – approached their task.

[…] Economists who assert that the only valid prescriptions in economic policy are logical deductions from complete axiomatic systems take prescriptions from doctors who often know little more about these medicines than that they appear to treat the disease. Such physicians are unashamedly ad hoc; perhaps pragmatic is a better word. With exquisite irony, Lucas holds a chair named for John Dewey, the theorist of American pragmatism.

[…] The modern economist is the clinician with no patients, the engineer with no projects. And since these economists do not appear to engage with the issues that confront real businesses and actual households, the clients do not come. There are, nevertheless, many well paid jobs for economists outside academia. Not, any more, in industrial and commercial companies, which have mostly decided economists are of no use to them. Business economists work in financial institutions, which principally use them to entertain their clients at lunch or advertise their banks in fillers on CNBC. Economic consulting employs economists who write lobbying documents addressed to other economists in government or regulatory agencies.

[…]A review of economics education two decades ago concluded that students should be taught ‘to think like economists’. But ‘thinking like an economist’ has come to be interpreted as the application of deductive reasoning based on a particular set of axioms. Another Chicago Nobel Prize winner, Gary Becker, offered the following definition: ‘the combined assumptions of maximising behaviour, market equilibrium, and stable preferences, used relentlessly and consistently form the heart of the economic approach’.[13] Becker’s Nobel citation rewards him for ‘having extended the domain of microeconomic analysis to a wide range of economic behavior.’ But such extension is not an end in itself: its value can lie only in new insights into that behaviour.

‘The economic approach’ as described by Becker is not, in itself, absurd. What is absurd is the claim to exclusivity he makes for it: a priori deduction from a particular set of unrealistic simplifying assumptions is not just a tool but ‘the heart of the economic approach’. A demand for universality is added to the requirements of consistency and rigour. Believing that economics is like they suppose physics to be – not necessarily correctly – economists like Becker regard a valid scientific theory as a representation of the truth – a description of the world that is independent of time, place, context, or the observer. […]

The further demand for universality with the consistency assumption leads to the hypothesis of rational expectations and a range of arguments grouped under the rubric of ‘the Lucas critique’. If there were to be such a universal model of the economic world, economic agents would have to behave as if they had knowledge of it, or at least as much knowledge of it as was available, otherwise their optimising behaviour would be inconsistent with the predictions of the model. This is a reductio ad absurdum argument, which demonstrates the impossibility of any universal model – since the implications of the conclusion for everyday behaviour are preposterous, the assumption of model universality is false.

[…]Economic models are no more, or less, than potentially illuminating abstractions. Another philosopher, Alfred Korzybski, puts the issue more briefly: ‘the map is not the territory’.[15] Economics is not a technique in search of problems but a set of problems in need of solution. Such problems are varied and the solutions will inevitably be eclectic.

This is true for analysis of the financial market crisis of 2008. Lucas’s assertion that ‘no one could have predicted it’ contains an important, though partial, insight. There can be no objective basis for a prediction of the kind ‘Lehman Bros will go into liquidation on September 15’, because if there were, people would act on that expectation and, most likely, Lehman would go into liquidation straight away. The economic world, far more than the physical world, is influenced by our beliefs about it.

Such thinking leads, as Lucas explains, directly to the efficient market hypothesis – available knowledge is already incorporated in the price of securities. […]

In his Economist response, Lucas acknowledges that ‘exceptions and anomalies’ to the efficient market hypothesis have been discovered, ‘but for the purposes of macroeconomic analyses and forecasts they are too small to matter’. But how could anyone know, in advance not just of this crisis but also of any future crisis, that exceptions and anomalies to the efficient market hypothesis are ‘too small to matter’?

[…]The claim that most profit opportunities in business or in securities markets have been taken is justified. But it is the search for the profit opportunities that have not been taken that drives business forward, the belief that profit opportunities that have not been arbitraged away still exist that explains why there is so much trade in securities. Far from being ‘too small to matter’, these deviations from efficient market assumptions, not necessarily large, are the dynamic of the capitalist economy.

[…]

The preposterous claim that deviations from market efficiency were not only irrelevant to the recent crisis but could never be relevant is the product of an environment in which deduction has driven out induction and ideology has taken over from observation. The belief that models are not just useful tools but also are capable of yielding comprehensive and universal descriptions of the world has blinded its proponents to realities that have been staring them in the face. That blindness was an element in our present crisis, and conditions our still ineffectual responses. Economists – in government agencies as well as universities – were obsessively playing Grand Theft Auto while the world around them was falling apart.

The first response, from Paul Davidson, is already in.

A Dynamic Synthesis of Basic Macroeconomic Theory

Model Name: A Dynamic Synthesis of Basic Macroeconomic Theory

Citation: Forrester, N.B. (1982) A Dynamic Synthesis of Basic Macroeconomic Theory: Implications for Stabilization Policy Analysis. PhD Dissertation, MIT Sloan School of Management.

Source: Provided by Nathan Forrester

Units balance: Yes, with 3 exceptions, evidently from the original publication

Format: Vensim

Notes: I mention this model in this article

A Dynamic Synthesis of Basic Macroeconomic Theory (Vensim .vpm)

Update: a newer version with improved diagrams and a control panel, plus changes files for a series of experiments with responses to negative demand shocks:

Download NFDis+TF-3.vpm or NFDis+TF-3.zip

The model runs in Vensim PLE, but you’ll need an advanced version to use the .cin and .cmd files included.

Economists in the bathtub

Env-Econ is one of several econ sites to pick up on standupeconomist Yoram Bauman’s assessment, Grading Economics Textbooks on Climate Change.

Most point out the bad, but there’s also a lot of good. On Bauman’s curve, there are 4 As, 3 Bs, 5 Cs, 3 Ds, and one F. Still, the bad tends to be really bad. Bauman writes about one,

Overall, the book is not too bad if you ignore that it’s based on climate science that is almost 15 years out of date and that it has multiple errors that would make Wikipedia blush. The fact that this textbook has over 20 percent of the market shakes my faith in capitalism.

The interesting thing is that the worst textbooks go astray more on the science than on the economics. The worst cherry-pick outdated studies, distort the opinions of scientists, and toss in red herrings like “For Greenland, a warming climate is good economic news.”

I find the most egregious misrepresentation in Schiller’s The Economy Today (D+):

The earth’s climate is driven by solar radiation. The energy the sun absorbs must be balanced by outgoing radiation from the earth and the atmosphere. Scientists fear that a flow imbalance is developing. Of particular concern is a buildup of carbon dioxide (CO2) that might trap heat in the earth’s atmosphere, warming the planet. The natural release of CO2 dwarfs the emissions from human activities. But there’s a concern that the steady increase in man-made CO2 emissions—principally from burning fossil fuels like gasoline and coal—is tipping the balance….

First, there’s no “might” about the fact that CO2 traps heat (infrared radiation); the only question is how much, when feedback effects come into play.  But the bigger issue is Schiller’s implication about the cause of atmospheric CO2 buildup. Here’s a picture of Schiller’s words, with arrow width scaled roughly to actual fluxes:

[Figure: CO2 flows as Schiller describes them]

Apparently, nature is at fault for increasing atmospheric CO2. This is like worrying that the world will run out of air, because people are inhaling it all (Schiller may be inhaling something else). The reality is that the natural flux, while large, is a two way flow:

[Figure: gross two-way natural CO2 fluxes]

What goes into the ocean and biosphere generally comes out again. For the last hundred centuries, those flows were nearly equal (i.e. zero net flow). But now that humans are emitting a lot of carbon, the net flow is actually from the atmosphere into natural systems, like this:

[Figure: net CO2 flows, including human emissions and net natural uptake]

That’s quite a different situation. If an author can’t paint an accurate verbal picture of a simple stock-flow system like this, how can a text help students learn to manage resources, money or other stocks?
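To make the bathtub arithmetic concrete, here’s a minimal sketch with illustrative round numbers (orders of magnitude only, not a calibrated carbon-cycle model): gross natural fluxes of roughly 200 GtC/yr each way, human emissions of roughly 10 GtC/yr, and natural sinks taking up about half of the human flux.

```python
# Illustrative round numbers in GtC and GtC/yr, not a calibrated carbon-cycle model
atmosphere = 600.0        # atmospheric carbon stock, roughly preindustrial
gross_natural = 200.0     # gross natural flux each way; release and uptake nearly balance
human_emissions = 10.0    # fossil fuel and land-use emissions
net_natural_uptake = 0.5 * human_emissions  # sinks absorb roughly half of the human flux

for year in range(50):
    natural_release = gross_natural                      # ocean + biosphere -> atmosphere
    natural_uptake = gross_natural + net_natural_uptake  # atmosphere -> ocean + biosphere
    atmosphere += human_emissions + natural_release - natural_uptake  # dt = 1 year

print(f"Atmospheric stock after 50 years: {atmosphere:.0f} GtC. The stock grew only "
      "because of the human inflow, even though the gross natural fluxes dwarf it "
      "and the net natural flow is out of the atmosphere.")
```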

Heat Trap

Replicated by: Tom Fiddaman

Citation: Hatlebakk, Magnus, & Moxnes, Erling (1992). Misperceptions and Mismanagement of the Greenhouse Effect? The Simulation Model. Report # CMR-92-A30009, December. Christian Michelsen Research.

Units: no

Format: Vensim

This is a climate-economy model, of about the same scale and vintage as Nordhaus’ original DICE model. It’s more interesting in some respects, because it includes path-dependent reversible and irreversible emissions reductions. As I recall, the original also had some stochastic elements, not active here. This version has no units; hopefully I can get an improved version online at some point.

Heat trap (Vensim .vmf)

A Behavioral Analysis of Learning Curve Strategy

Model Name: A Behavioral Analysis of Learning Curve Strategy

Citation: A Behavioral Analysis of Learning Curve Strategy, John D. Sterman and Rebecca Henderson, Sloan School of Management, MIT and Eric D. Beinhocker and Lee I. Newman, McKinsey and Company.

Neoclassical models of strategic behavior have yielded many insights into competitive behavior, despite the fact that they often rely on a number of assumptions (including instantaneous market clearing and perfect foresight) that have been called into question by a broad range of research. Researchers generally argue that these assumptions are “good enough” to predict an industry’s probable equilibria, and that disequilibrium adjustments and bounded rationality have limited competitive implications. Here we focus on the case of strategy in the presence of increasing returns to highlight how relaxing these two assumptions can lead to outcomes quite different from those predicted by standard neoclassical models. Prior research suggests that in the presence of increasing returns, tight appropriability and accommodating rivals, in some circumstances early entrants can achieve sustained competitive advantage by pursuing Get Big Fast (GBF) strategies: rapidly expanding capacity and cutting prices to gain market share advantage and exploit positive feedbacks faster than their rivals. Using a simulation of the duopoly case we show that when the industry moves slowly compared to capacity adjustment delays, boundedly rational firms find their way to the equilibria predicted by conventional models. However, when market dynamics are rapid relative to capacity adjustment, forecasting errors lead to excess capacity, overwhelming the advantage conferred by increasing returns. Our results highlight the risks of ignoring the role of disequilibrium dynamics and bounded rationality in shaping competitive outcomes, and demonstrate how both can be incorporated into strategic analysis to form a dynamic, behavioral game theory amenable to rigorous analysis.

The original paper is on Archive.org; it was eventually published in Management Science. You can get the MS version from John Sterman’s page here.

Source: Replicated by Tom Fiddaman

Units balance: Yes

Format: Vensim (the model uses subscripts, so it requires Pro, DSS, or Model Reader)

Behavioral Analysis of Learning Curve Strategy (Vensim .vmf)

New update:

BALCS4b.zip