Your gut may be leading you astray

An interesting comment on rationality and conservatism:

I think Sarah Palin is indeed a Rorschach test for Conservatives, but it’s about much more than manners or players vs. kibitzers – it’s about what Conservatism MEANS.

The core idea behind Conservatism is that most of human learning is done not by rational theorizing, but by pattern recognition. Our brain processes huge amounts of data every second, and most information we get out of it is in the form of recognized patterns, not fully logical theories. It’s fair to say that 90% of our knowledge is in patterns, not in theories.

This pattern recognition is called common sense, and over generations, it’s called traditions, conventions etc. Religion is usually a carrier meme for these evolved patterns. It’s sort of an evolutionary process, like a genetic algorithm.

Liberals, Lefties and even many Libertarians want to use only the 10% of human knowledge that’s rational. And because our rational knowledge cannot yet fully explain either human nature in itself or everything that happens in society, they fill the holes with myths, like the idea that everybody is born good and only society makes people bad.

Conservatives are practical people who instinctively recognize the importance of evolved patterns in human learning: because our rational knowledge simply isn’t enough yet, these common sense patterns are our second best option. And to use these patterns effectively you don’t particularly have to be very smart, i.e. very rational. You have to be _wise_ and you have to have a good character: you have to set hubris and pride aside and be able to accept traditions you don’t fully understand.

Thus, for a Conservative, while smartness never hurts, being wise and having a good character is more important than being very smart. Looking a bit simple simply isn’t a problem; you still have that 90% of knowledge at hand.

Anti-Palin Conservatives don’t understand this. They think Conservatism is about having different theories than the Left; they don’t understand that the point is that theories and rational knowledge aren’t so important.

(via Rabett Run)

A possible example of the writer’s perspective at work is provided by survey research showing that Tea Partiers are skeptical of anthropogenic climate change (established by models) but receptive to natural variation (vaguely, patterns), and that they’re confident they’re well-informed about it in spite of evidence to the contrary. Another possible data point is Conservapedia’s resistance to relativity, which is essentially a model that contradicts our Newtonian common sense.

As an empirical observation, this definition of conservatism seems plausible at first. Humans are fabulous pattern recognizers, and there are some notable shortcomings to rational theorizing. However, as a normative statement – that conservatism is better because of the 90%/10% ratio – I think it’s seriously flawed.

The quality of the 90% is quite different from the quality of the 10%. Theories are the accumulation of a lot of patterns put into a formal framework that has been shared and tested, which at least makes it easy to identify the theories that fall short. Common sense, or wisdom or whatever you want to call it, is much more problematic. Everyone knows the world is flat, right?

Sadly, there’s abundant evidence that our evolved heuristics fall short in complex systems. Pattern matching in particular falls short even in simple bathtub systems. Inappropriate mental models and heuristics can lead to decisions that are exactly the opposite of good management, even when property rights are complete; noise only makes things worse.
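
To make the bathtub point concrete, here’s a minimal sketch with made-up numbers (nothing here comes from any particular study): the inflow to a stock peaks and starts falling, yet the stock keeps rising until inflow drops below outflow, the opposite of what naive pattern matching on the inflow suggests.

```python
# Illustrative bathtub (stock-flow) sketch; all numbers are invented.
# The inflow ramps up, peaks at t=8, then falls; the outflow is constant.
# Pattern matching on the inflow says "improving" after t=8, but the stock
# keeps accumulating until the inflow actually drops below the outflow.

dt = 1.0
stock = 100.0
outflow = 10.0

for t in range(20):
    inflow = max(0.0, 20.0 - 2.0 * abs(t - 8))   # peaks at t=8
    stock += (inflow - outflow) * dt             # the stock integrates the net flow
    direction = "rising" if inflow > outflow else "falling or flat"
    print(f"t={t:2d}  inflow={inflow:5.1f}  stock={stock:6.1f}  ({direction})")
```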

Real common sense would have the brains to abdicate when faced with situations, like relativity or climate change, where it’s clear that experience (low velocities, local weather) doesn’t provide any patterns relevant to the conditions under consideration.

After some reflection, I think there’s more than pattern recognition to conservatism. Liberals, anarchists, etc. are also pattern matchers. We all have our own stylized facts and conventional wisdom, all of which are subject to the same sorts of cognitive biases. So, pattern matching doesn’t automatically lead to conservatism. Many conservatives don’t believe in global warming because they don’t trust models, yet observed warming and successful predictions of models from the 70s (i.e. patterns) also don’t count. So, conservatives don’t automatically respond to patterns either.

In any case, running the world by pattern recognition alone is essentially driving by looking in the rearview mirror. If you want to do better, i.e. to make good decisions at turning points or novel conditions, you need a model.
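
As a toy illustration of the rearview-mirror problem (hypothetical numbers, with a logistic curve standing in for any saturating process), extrapolating the recent trend sails straight through the turning point and past the carrying capacity:

```python
import math

# Toy example with invented numbers: a saturating process (logistic growth
# toward an assumed carrying capacity K) vs. a "rearview mirror" forecast that
# simply extrapolates the slope observed before the turning point.
K, r, t_mid = 100.0, 0.5, 10.0
actual = lambda t: K / (1.0 + math.exp(-r * (t - t_mid)))

t0 = 8                                  # forecast made here, before the inflection
slope = actual(t0) - actual(t0 - 1)     # recent trend, the "pattern"
for t in (10, 14, 18):
    trend_forecast = actual(t0) + slope * (t - t0)
    print(f"t={t:2d}  actual={actual(t):6.1f}  trend extrapolation={trend_forecast:6.1f}")
# By t=18 the extrapolation exceeds the carrying capacity, while the actual
# trajectory has leveled off near K.
```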


The crisis was not predicted because crises aren't predictable?

There’s a terrific essay on economics by John Kay on the INET blog. Some juicy excerpts follow, but it’s really worth the trip to read the whole thing. They’ve invited some other economists to respond, which should be interesting.

The Map is Not the Territory: An Essay on the State of Economics

by JOHN KAY

The reputation of economics and economists, never high, has been a victim of the crash of 2008. The Queen was hardly alone in asking why no one had predicted it. An even more serious criticism is that the economic policy debate that followed seems only to replay the similar debate after 1929. The issue is budgetary austerity versus fiscal stimulus, and the positions of the protagonists are entirely predictable from their previous political allegiances.

The doyen of modern macroeconomics, Robert Lucas, responded to the Queen’s question in a guest article in The Economist in August 2009.[1] The crisis was not predicted, he explained, because economic theory predicts that such events cannot be predicted. Faced with such a response, a wise sovereign will seek counsel elsewhere.

[…]All science uses unrealistic simplifying assumptions. Physicists describe motion on frictionless planes, gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. To put such models to practical use, you must be willing to bring back the excluded factors. You will probably find that this modification will be important for some problems, and not others – air resistance makes a big difference to a falling feather but not to a falling cannonball.

But Lucas and those who follow him were plainly engaged in a very different exercise, as the philosopher Nancy Cartwright has explained.[4] The distinguishing characteristic of their approach is that the list of unrealistic simplifying assumptions is extremely long. Lucas was explicit about his objective[5] – ‘the construction of a mechanical artificial world populated by interacting robots that economics typically studies’. An economic theory, he explains, is something that ‘can be put on a computer and run’. Lucas has called structures like these ‘analogue economies’, because they are, in a sense, complete economic systems. They loosely resemble the world, but a world so pared down that everything about them is either known, or can be made up. Such models are akin to Tolkien’s Middle Earth, or a computer game like Grand Theft Auto.

[… interesting discussion of the fiscal crisis as a debate over Ricardian equivalence …]
But another approach would discard altogether the idea that the economic world can be described by a universally applicable model in which all key relationships are predetermined. Economic behaviour is influenced by technologies and cultures, which evolve in ways that are certainly not random but which cannot be described fully, or perhaps at all, by the kinds of variables and equations with which economists are familiar. Models, when employed, must therefore be context specific, in the manner suggested in a recent book by Roman Frydman and Michael Goldberg.[8]

[…]

But you would not nowadays be able to publish similar articles in a good economics journal. You would be told that your model was theoretically inadequate – it lacked rigour, failed to demonstrate consistency. You might be accused of the cardinal sin of being ‘ad hoc’. Rigour and consistency are the two most powerful words in economics today.

[…]

Consistency and rigour are features of a deductive approach, which draws conclusions from a group of axioms – and whose empirical relevance depends entirely on the universal validity of the axioms. The only descriptions that fully meet the requirements of consistency and rigour are complete artificial worlds, like those of Grand Theft Auto, which can ‘be put on a computer and run’.

For many people, deductive reasoning is the mark of science, while induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. ‘The first siren of beauty’, says Cochrane, ‘is logical consistency’. It seems impossible that anyone acquainted with great human achievements – whether in the arts, the humanities or the sciences – could really believe that the first siren of beauty is consistency. This is not how Shakespeare, Mozart or Picasso – or Newton or Darwin – approached their task.

[…] Economists who assert that the only valid prescriptions in economic policy are logical deductions from complete axiomatic systems take prescriptions from doctors who often know little more about these medicines than that they appear to treat the disease. Such physicians are unashamedly ad hoc; perhaps pragmatic is a better word. With exquisite irony, Lucas holds a chair named for John Dewey, the theorist of American pragmatism.

[…] The modern economist is the clinician with no patients, the engineer with no projects. And since these economists do not appear to engage with the issues that confront real businesses and actual households, the clients do not come. There are, nevertheless, many well paid jobs for economists outside academia. Not, any more, in industrial and commercial companies, which have mostly decided economists are of no use to them. Business economists work in financial institutions, which principally use them to entertain their clients at lunch or advertise their banks in fillers on CNBC. Economic consulting employs economists who write lobbying documents addressed to other economists in government or regulatory agencies.

[…]A review of economics education two decades ago concluded that students should be taught ‘to think like economists’. But ‘thinking like an economist’ has come to be interpreted as the application of deductive reasoning based on a particular set of axioms. Another Chicago Nobel Prize winner, Gary Becker, offered the following definition: ‘the combined assumptions of maximising behaviour, market equilibrium, and stable preferences, used relentlessly and consistently form the heart of the economic approach’.[13] Becker’s Nobel citation rewards him for ‘having extended the domain of microeconomic analysis to a wide range of economic behavior.’ But such extension is not an end in itself: its value can lie only in new insights into that behaviour.

‘The economic approach’ as described by Becker is not, in itself, absurd. What is absurd is the claim to exclusivity he makes for it: a priori deduction from a particular set of unrealistic simplifying assumptions is not just a tool but ‘the heart of the economic approach’. A demand for universality is added to the requirements of consistency and rigour. Believing that economics is like they suppose physics to be – not necessarily correctly – economists like Becker regard a valid scientific theory as a representation of the truth – a description of the world that is independent of time, place, context, or the observer. […]

The further demand for universality with the consistency assumption leads to the hypothesis of rational expectations and a range of arguments grouped under the rubric of ‘the Lucas critique’. If there were to be such a universal model of the economic world, economic agents would have to behave as if they had knowledge of it, or at least as much knowledge of it as was available, otherwise their optimising behaviour would be inconsistent with the predictions of the model. This is a reductio ad absurdum argument, which demonstrates the impossibility of any universal model – since the implications of the conclusion for everyday behaviour are preposterous, the assumption of model universality is false.

[…]Economic models are no more, or less, than potentially illuminating abstractions. Another philosopher, Alfred Korzybski, puts the issue more briefly: ‘the map is not the territory’.[15] Economics is not a technique in search of problems but a set of problems in need of solution. Such problems are varied and the solutions will inevitably be eclectic.

This is true for analysis of the financial market crisis of 2008. Lucas’s assertion that ‘no one could have predicted it’ contains an important, though partial, insight. There can be no objective basis for a prediction of the kind ‘Lehman Bros will go into liquidation on September 15’, because if there were, people would act on that expectation and, most likely, Lehman would go into liquidation straight away. The economic world, far more than the physical world, is influenced by our beliefs about it.

Such thinking leads, as Lucas explains, directly to the efficient market hypothesis – available knowledge is already incorporated in the price of securities. […]

In his Economist response, Lucas acknowledges that ‘exceptions and anomalies’ to the efficient market hypothesis have been discovered, ‘but for the purposes of macroeconomic analyses and forecasts they are too small to matter’. But how could anyone know, in advance not just of this crisis but also of any future crisis, that exceptions and anomalies to the efficient market hypothesis are ‘too small to matter’?

[…]The claim that most profit opportunities in business or in securities markets have been taken is justified. But it is the search for the profit opportunities that have not been taken that drives business forward, the belief that profit opportunities that have not been arbitraged away still exist that explains why there is so much trade in securities. Far from being ‘too small to matter’, these deviations from efficient market assumptions, not necessarily large, are the dynamic of the capitalist economy.

[…]

The preposterous claim that deviations from market efficiency were not only irrelevant to the recent crisis but could never be relevant is the product of an environment in which deduction has driven out induction and ideology has taken over from observation. The belief that models are not just useful tools but also are capable of yielding comprehensive and universal descriptions of the world has blinded its proponents to realities that have been staring them in the face. That blindness was an element in our present crisis, and conditions our still ineffectual responses. Economists – in government agencies as well as universities – were obsessively playing Grand Theft Auto while the world around them was falling apart.

The first response, from Paul Davidson, is already in.

Fat taxes & modeling

NPR covers a Danish move to tax saturated fat:

So when the tiny Scandinavian country announced it would be imposing a 16 Kroner (about $3 U.S.) tax on every kilogram of saturated fat as a way to discourage poor eating habits and raise revenue, we were left scratching our heads.

How’s that going to work?

Ole Linnet Juul, food director at Denmark’s Confederation of Industries, tells The Washington Post that the tax will increase the price of a burger by around $0.15 and raise the price of a small package of butter by around $0.40.

Our pals over at Planet Money took a stab last year at explaining the economics of our version of the fat tax — the soda tax. They conclude that price increases do drive down demand somewhat.

But couldn’t Danes just easily sneak over to neighboring Sweden for butter and oil and simply avoid paying the tax, throwing all revenue calculations off?

Meanwhile, some health studies indicate a soda tax doesn’t work to curb obesity anyways.

First, a few obvious problems: oil is typically not saturated and therefore presumably wouldn’t fall under the tax. And sneaking over the border for butter? Seriously? You’d better bring back a heckuva lot, because there’s the little matter of the Øresund Strait, which now has a handy bridge, and a 36 EUR toll to go with it.
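
A quick back-of-envelope check (my own assumptions: a 36 EUR toll at roughly 1.35 USD/EUR, and a small package of butter taken to be 250 g) suggests how much butter you’d need to haul back just to break even on the bridge:

```python
# Back-of-envelope only; the exchange rate and package size are my assumptions,
# and the $0.40-per-package figure is the Washington Post number quoted above.
toll_usd = 36 * 1.35                 # ~ $49 for the bridge toll
tax_saved_per_package = 0.40         # $ saved per small (250 g) package of butter
packages = toll_usd / tax_saved_per_package
print(f"break-even: ~{packages:.0f} packages, roughly {0.25 * packages:.0f} kg of butter")
```

That’s on the order of 30 kg of butter before the trip even pays for the toll, never mind the fuel.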

More interesting is the use of models in the linked studies. From the second (“doesn’t work”):

But new research from Northwestern University suggests that soda taxes don’t actually help obese people lose weight, largely because people with weight problems already tend to drink diet soda rather than the sugary kind. So taxing full-calorie sodas may not help many Americans make better dietary choices.

Patel ran computer simulations designed to track how soda prices would affect obesity rates. The findings demonstrated that a sugar tax would cause a negligible drop in obesity, about 1.4%, and that obese people would not lose much weight. “For people going from [body mass indexes] of over 30 to below that…most people are not having massive swings,” Patel said.

For the study, Patel’s team collected data on people with “all ranges of BMI” from the Centers for Disease Control and Prevention’s Behavioral Risk Factor Surveillance System, which has tracked health conditions in the U.S. for nearly three decades. They also collected a data set of soda prices and sales to estimate consumer practices, which they used to predict what people would purchase before and after the implementation of a soda tax. Based on the resulting change in total calories consumed per day over a set time period, the team modeled long-term changes in weight using existing nutrition literature.

Kelly Brownell, the director of the Rudd Center for Food Policy and Obesity at Yale University, has doubts about the accuracy of studies such as Patel’s. Simulations of the potential impact of public health actions such as a soda tax are based on a huge number of assumptions — about consumption, spending behavior, weight change — that are, in reality, difficult to make accurately, he explains.

“All of those changes are unknown,” he said. “So it’s not hard to allow those assumptions to create the results you want.”

Patel counters that assumptions are inevitable in research, and that previous studies that have produced results in favor of soda taxes have also made assumptions, typically about consumer preferences. “I’m trying to see if there are any critical assumptions here that really change the results, but so far I haven’t had anything like that,” he said. “It’s a somewhat valid criticism, but the paper is still being fleshed out, and there are a variety of robustness checks.”

But Patel acknowledges that his study could not predict whether a soda tax would help prevent people from consuming sweetened drinks in the first place and becoming fat later on — another point raised by Brownell. “The question of whether a soda tax could prevent people from becoming obese in the future…that’s still kind of an open question because there are some issues on how you model weight change that to my knowledge haven’t been addressed,” he said. “It’s possible that a soda tax could prevent people from becoming obese in the future, but for people already obese it’s not really going to do anything.”

As press coverage of models goes, this is actually pretty good, and Patel is nicely circumspect about the limitations of the work. The last paragraph hints at one thing that strikes me as extremely important though: the study model is essentially open loop, with price->choice->calories->body mass causality. The real world is closed loop, with important feedbacks between health and future choices of diet and exercise, and social interactions involved in choices. I suspect that the net result is that the long term effect of pricing, or any other measure, on health is substantially greater than the open loop analysis indicates, especially if you’re clever about exploiting the various positive loops that create obesity traps.
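
To sketch the difference (with invented parameters; this is not Patel’s model or any published obesity model), compare an open-loop run, where the tax just trims calorie intake, with a closed-loop run that adds one of the reinforcing loops mentioned above: extra weight suppressing activity and hence energy expenditure. The same calorie cut produces a larger long-run weight change once the loop is closed.

```python
# Open- vs. closed-loop response of weight to a small, tax-driven calorie cut.
# All parameters are invented for illustration.

def long_run_weight(calorie_cut, activity_feedback, days=5000, dt=1.0):
    weight = 90.0                                  # kg, arbitrary starting point
    for _ in range(int(days / dt)):
        intake = 2500.0 - calorie_cut              # kcal/day after the tax
        # Balancing loop: heavier bodies expend more energy (20 kcal/day per kg).
        # The reinforcing loop (weight up -> activity down -> expenditure down)
        # offsets part of that; activity_feedback = 0 is the open-loop case.
        expenditure = 2500.0 + 20.0 * (weight - 90.0) * (1.0 - activity_feedback)
        weight += (intake - expenditure) * dt / 7700.0   # ~7700 kcal per kg of tissue
    return weight

print("open loop:  ", round(long_run_weight(50.0, activity_feedback=0.0), 1), "kg")
print("closed loop:", round(long_run_weight(50.0, activity_feedback=0.5), 1), "kg")
```

With these made-up numbers the closed-loop weight loss is about twice the open-loop figure; the point is the direction of the bias, not the particular magnitude.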

Brownell’s complaint – that we know nothing, so we can just plug in assumptions to get whatever answer we want – irks me. It betrays an ignorance of models (especially nonlinear dynamic ones), which are typically more constrained than unstated mental models, not less.

There seems to be a flowering of health and obesity models in system dynamics lately, with some interesting posters and papers at the last few conferences. There’s hope for closing those loops yet.

Bananas, vomit and behavioral economics

I just ran across a nice series of videos and transcripts on behavioral decision making, heuristics and biases, psychology and economics, with Nobel Prize winner Daniel Kahneman, Dick Thaler and other masters:

You have to watch the first to work out the meaning of my strange title. I can’t embed, so head over to Edge to view, where other interesting links will pop up.

The envelope please…

The 2011 Ig Nobel in Mathematics is for modeling … it goes to predictors of the end of the world:

Dorothy Martin of the USA (who predicted the world would end in 1954), Pat Robertson of the USA (who predicted the world would end in 1982), Elizabeth Clare Prophet of the USA (who predicted the world would end in 1990), Lee Jang Rim of KOREA (who predicted the world would end in 1992), Credonia Mwerinde of UGANDA (who predicted the world would end in 1999), and Harold Camping of the USA (who predicted the world would end on September 6, 1994 and later predicted that the world will end on October 21, 2011), for teaching the world to be careful when making mathematical assumptions and calculations.

Notice that the authors of Limits to Growth aren’t here, not because they were snubbed, but because Limits didn’t actually predict the end of the world. Update: perhaps the Onion should be added to the list though.

The Medicine prize goes to a pair of behavior & decision making studies:

Mirjam Tuk (of THE NETHERLANDS and the UK), Debra Trampe (of THE NETHERLANDS) and Luk Warlop (of BELGIUM), and jointly to Matthew Lewis, Peter Snyder and Robert Feldman (of the USA), Robert Pietrzak, David Darby, and Paul Maruff (of AUSTRALIA) for demonstrating that people make better decisions about some kinds of things — but worse decisions about other kinds of things — when they have a strong urge to urinate. REFERENCE: “Inhibitory Spillover: Increased Urination Urgency Facilitates Impulse Control in Unrelated Domains,” Mirjam A. Tuk, Debra Trampe and Luk Warlop, Psychological Science, vol. 22, no. 5, May 2011, pp. 627-633.

REFERENCE: “The Effect of Acute Increase in Urge to Void on Cognitive Function in Healthy Adults,” Matthew S. Lewis, Peter J. Snyder, Robert H. Pietrzak, David Darby, Robert A. Feldman, Paul T. Maruff, Neurourology and Urodynamics, vol. 30, no. 1, January 2011, pp. 183-7.

ATTENDING THE CEREMONY: Mirjam Tuk, Luk Warlop, Peter Snyder, Robert Feldman, David Darby

Perhaps we need more (or is it less?) restrooms in the financial sector and Washington DC these days.

Are environmental regulations the real constraint on US energy output?

When times are tough, there are always calls to unravel environmental regulations and drill, baby, drill. I’m first in line to say that a lot of environmental regulation needs a paradigm shift, but this strikes me as a foolish hair-of-the-dog-that-bit-ya idea. Our current problems don’t come from regulation, and won’t be solved by deregulation.

On average, there’s no material deprivation in the US. We consume more petroleum per capita than any other large nation. Our problems are largely distributional – inequitable income distribution and, recently, high unemployment, which causes disproportionate harm to a few. Why solve a distributional problem by skewing environmental policy? This smacks of an attempt to grow out of our problems, which is surely doomed to the extent that growth relies on intensifying material throughput.

Consider the system:

The underlying mental model behind calls for deregulation sounds like the following: environmental regulations create compliance costs that drive up the total cost of resource extraction, depressing the production rate and depriving the people of needed $$$ and happiness. Certainly that causal path exists. But it’s not the only thing going on.

Those regulations were created for a reason. They reduce environmental impacts, and therefore reduce the unpaid social costs that occur as side effects of oil production and consumption, and therefore improve welfare. These effects are nontrivial, unless you’re a GOP presidential candidate. One could wish for more efficient regulations, but absent that, wishing for less regulation is tantamount to wishing for more environmental consequences and social costs, and hoping that more $$$ will offset that.

Even the basic open-loop rationale for deregulation makes little sense. Resource policy is already loose, so there’s no quantity constraint on production. With the exception of ANWR and some offshore areas, most interesting areas are already leased. Montana certainly doesn’t exercise any foresight in the management of its trust lands. Environmental regulations have hardly become more stringent in the last decade or so. Oil production in 1999 was higher than it is today, with oil prices well below $20/bbl, so compliance costs must be less than that. So, with oil at $100/bbl, we’d expect an explosion of supply if regulatory costs were the only constraint. In fact, there’s barely an upward blip, so there must be something else at work…

The real problem is that there’s feedback in the system. For example, there’s balancing loop B1: as you extract more stuff, the remaining resource (oil in the ground) dwindles, and the physical costs of extraction – capital, labor, energy – go up. Technology can stave off that trend for some time, but prices and production trends make it clear that B1 is now dominant. This means that there’s a rather stark better-before-worse tradeoff: if we extract oil more quickly now, to hoist ourselves out of the financial crisis, we’ll have less later. But it seems likely that we’ll be even more desperate later – either to have that oil in an even pricier world market, or to keep it in the ground due to climate concerns. Consider what would have happened if we’d had no environmental constraints on oil production for the last three or four decades. Would the US now have more or less oil to rely on? Would we be happy that we pumped up all that black gold at under $20/bbl? Even the Hotelling rule is telling us that we should leave oil in the ground, as long as prices are rising faster than the interest rate (not hard, at current rates).
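
The Hotelling point reduces to a one-line present-value comparison (all numbers here are illustrative assumptions): as long as the price grows faster than the discount rate, a barrel left in the ground is worth more today than a barrel sold now.

```python
# Hotelling-style back-of-envelope; the growth and discount rates are assumptions.
price_now = 100.0      # $/bbl, roughly the price mentioned above
price_growth = 0.05    # assumed annual growth in the oil price
discount_rate = 0.02   # assumed interest/discount rate ("not hard, at current rates")
years = 10

sell_now = price_now
sell_later_pv = price_now * (1 + price_growth) ** years / (1 + discount_rate) ** years
print(f"sell now: ${sell_now:.0f}/bbl    sell in {years} yr, discounted to today: ${sell_later_pv:.0f}/bbl")
```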

Another loop is just gaining traction: B2. As the stock of oil in the ground is depleted, marginal production occurs in increasingly desperate and devastating circumstances. Either you pursue smaller, more remote fields, meaning more drilling and infrastructure disturbance in sensitive areas, or you pursue unconventional resources, like tar sands and shale gas, with resource-intensive methods and unknown hazards. A regulatory rollback would accelerate production via the most destructive extraction methods, right at the time that the physics of extraction is already shifting the balance of private benefits ($$$) and social costs unfavorably. Loop B2 also operates inequitably, much like unemployment. Not everyone is harmed by oil and gas development; the impacts fall disproportionately on the neighbors of projects, who may not even benefit due to severance of surface and mineral rights. This weakens the argument for deregulation even further.

Rather than pretending we can turn the clock back to 1970, we should be thinking carefully about our exit strategy for scarce and climate-constrained resources. There must be lots of things we can do to solve the distributional problems of the current crisis without socializing the costs and privatizing the gains of fossil fuel exploitation more than we already do.

Greater petroleum independence for the US?

The NYT enthuses about the prospects for new oil production in the Americas:

New Fields May Propel Americas to Top of Oil Companies’ Lists

Still, the new oil exploits in the Americas suggest that technology may be trumping geology, especially in the region’s two largest economies, the United States and Brazil. The rock formations in Texas and North Dakota were thought to be largely fruitless propositions before contentious exploration methods involving horizontal drilling and hydraulic fracturing — the blasting of water, chemicals and sand through rock to free oil inside, known as fracking — gained momentum.

While the contamination of water supplies by fracking is a matter of fierce environmental debate, the technology is already reversing long-declining oil production in the United States, with overall output from locations where oil is contained in shale and other rocks projected to exceed two million barrels a day by 2020, according to some estimates. The United States already produces about half of its own oil needs, so the increase could help it further peel away dependence on foreign oil.

Setting aside the big developments in Brazil and Canada, what does technology trumping geology – “reversing long-declining oil production in the United States” – look like? Here’s the latest from EIA:

Somehow it’s not such a compelling story in pictures.