How I learned to stop worrying and love methane

RealClimate has a nice summary of recent atmospheric methane findings. Here’s the structure:

The bad news (red) has been that methane release from permafrost and clathrates on the continental shelf appears to be significant. At the same time, methane release from natural gas seems to be larger than previously thought, and (partly for the same reason – fracking) gas resources appear to be larger. Both put upward pressure on atmospheric methane.

However, there are some constraints as well. The methane budget must be consistent with observations of atmospheric concentrations and gradients (green). Therefore, if one source is thought to be bigger, it must be the case historically that other natural or anthropogenic sources are smaller (or perhaps uptake is faster) by an offsetting amount (blue).
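The constraint is ordinary mass balance: total sources minus total sinks must equal the observed rate of atmospheric accumulation. A minimal sketch, using made-up round numbers rather than actual flux estimates:

```python
# Illustrative methane budget closure. All figures are hypothetical round
# numbers, not actual estimates from the literature.
obs_growth = 10.0     # Tg CH4/yr: observed atmospheric accumulation
sinks = 550.0         # Tg CH4/yr: total removal (mostly OH oxidation)
total_sources = obs_growth + sinks   # budget must close at 560 Tg/yr

# Suppose natural gas leakage is revised upward by 20 Tg/yr. Holding the
# observed accumulation and the sinks fixed, other sources must be smaller
# by the same amount for the budget to remain consistent with observations.
revision = 20.0
offset_in_other_sources = -revision
print(total_sources, offset_in_other_sources)
```

The point is that an upward revision of one source cannot simply be added to the total; something else in the budget has to give.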

This bad-news-good-news story does not rule out positive feedbacks from temperature or atmospheric chemistry, but at least we’re not cooked yet.

Energy rich or poor?

The Energy Collective echoes amazement at unconventional oil and gas:

Yergin, vice chairman of IHS CERA:

“The United States is in the midst of the ‘unconventional revolution in oil and gas’ that, it becomes increasingly apparent, goes beyond energy itself.

“Owing to the scale and impact of shale gas and tight oil, it is appropriate to describe their development as the most important energy innovation so far of the 21st century. … It is striking to think back to the hearings of even just half a decade ago, during the turmoil of 2008, when it was widely assumed that a permanent era of energy shortage was at hand. How different things look today.”

Mary J. Hutzler, Institute for Energy Research:

“The United States has vast resources of oil, natural gas, and coal. In a few short years, a forty-year paradigm – that we were energy resource poor – has been disproven. Instead of being resource poor, we are incredibly energy rich.”

Abundance is often attributed to a technical miracle, brought about by government R&D into unconventional fossil fuels. The articulated mental model is something like the following:

But is this really a revolutionary transition from scarcity to abundance, was it a surprise, and should technology get all the credit? I don’t think so.

(Abundance/Scarcity) = 1.03?

Contrast the 1995 and 2012 USGS National Assessments of onshore resources:

Resources, on an energy basis (EJ). Cumulative production from EIA; note that gas production data begins in 1980, so gas cumulative production is understated.

In spite of increasing unconventional resources, there’s actually less oil than there was, mainly because a lot of the 1995 resource has since been produced. (Certainly there are also other differences, including method changes.) For gas, where one can make a stronger case for a miracle due to the large increase in unconventional resources, the top line is up a whopping 3%. Even if you go with EIA/INTEK‘s ~2x larger estimate for shale gas, resources are up only 35%.
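The arithmetic is simple enough to check; the totals below are hypothetical placeholders standing in for the assessed figures, chosen only to reproduce the stated percentages:

```python
# Hypothetical round numbers (EJ) standing in for the assessed gas totals;
# the actual values come from the 1995 and 2012 USGS assessments.
gas_1995 = 1000.0
gas_2012 = 1030.0           # top line up ~3%
shale_extra_intek = 320.0   # additional shale gas under EIA/INTEK's larger estimate

print(round(gas_2012 / gas_1995, 2))                        # abundance ratio ~1.03
print(round((gas_2012 + shale_extra_intek) / gas_1995, 2))  # ~1.35 with EIA/INTEK
```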

Call me conservative, but I think an abundance revolution that “disproves” scarcity would be a factor of 10 increase, not these piddly changes.

You could argue that the USGS hasn’t gotten the memo, and therefore has failed to appreciate new, vast unconventional resources. But given that they have reams of papers assessing unconventional fields, I think it more likely that they’re properly accounting for low recoverability, and not being bamboozled by large resources in place.

Reserves involve less guesswork, but more confounding dynamics. Still, they tell about the same story as resources. Oil reserves are more than 40% off their 1970 peak. Even gas reserves have only just regained the levels achieved 40 years ago.

Source: EIA

Surprise?

In 1991, USGS’ Thomas Ahlbrandt wrote:

Unconventional natural gas resources are also becoming increasingly viable. Coalbed methane, which accounts for about 25 percent of potential natural gas resources in the U.S., will displace nearly a trillion cubic feet (TCF) of gas from conventional resources in the near term and perhaps several TCF by the turn of the century. Similarly, production of gas from low permeability resources may displace some production of conventional gas as increasingly smaller conventional accumulations are developed. Coalbed methane and tight gas, both abundant in the Rocky Mountain and Appalachian regions, will likely experience significant production increases. Optimistic scenarios suggest that tight gas and coalbed methane resources may provide more domestic natural gas production than conventional resources by the year 2010. Horizontal drilling technology will most likely unlock the large currently uneconomic gas resources in tight reservoirs. Technologies like this will most certainly change the status of what are presently considered unconventional resources.

I’d call that a “no.”

Should we be surprised to see supply increasing in the current price environment? Again, I’d say no. The idea that oil and gas have supply curves is certainly much older than its appearance in the 1995 USGS assessment. Perhaps the ongoing increase in shale gas development, when prices have collapsed, is a bit surprising. But then you have to consider that (a) drilling costs have tanked alongside the economy, (b) there are lags between price, perception, capital allocation, and production, and (c) it’s expectations of price, not current prices, that drive investment.
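The lag structure in (b) and (c) can be sketched as a chain of first-order delays. Everything here is illustrative: the price path, the adjustment times, and the linear capacity target are assumptions for the sketch, not estimates:

```python
# Sketch of the price -> expectation -> investment -> capacity lag chain
# as a pair of first-order delays. All parameters are illustrative.

def simulate(years=10, dt=0.25):
    expected = 8.0       # producers' expected price, $/MMBtu (adaptive)
    capacity = 100.0     # productive capacity, arbitrary units
    tau_expect = 2.0     # years to adjust price expectations (assumed)
    tau_capacity = 3.0   # years for investment to become capacity (assumed)
    path = []
    t = 0.0
    while t < years:
        price = 3.0      # price collapses immediately at t = 0 (assumed)
        expected += (price - expected) / tau_expect * dt
        target = 100.0 * expected / 8.0   # capacity target scales with expectations
        capacity += (target - capacity) / tau_capacity * dt
        path.append((round(t, 2), round(expected, 2), round(capacity, 1)))
        t += dt
    return path

path = simulate()
# Years after the price collapse, capacity remains well above its long-run
# target, because expectations and investment each respond with a lag.
print(path[-1])
```

Even this toy version shows why current production can look out of step with current prices.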

Does tech get the credit?

Certainly tech gets some credit. For example, the Bakken oil boom owes much to horizontal drilling:

Source: EIA

But there’s more than tech going on. And much of the tech evolution is surely a function of industry activity funded out of revenue or accumulated through production experience, rather than pure government R&D.

If tech is the exclusive driver of increasing abundance, you’d expect costs and prices to be falling. Gas prices are indeed well off their recent peak, though one could wonder whether that’s a durable circumstance. Even so, gas is no cheaper than it was in the 90s, and more costly than in the pre-OPEC era. Oil isn’t cheap at all – it’s close to its historic highs.

So, if there’s anything here that one might call a tech fingerprint, it would have to be the decline in gas prices post-mid-2008. But that coincides better with the financial crisis than with the gas boom.

Cost data are less current, but if anything the cost picture is less sanguine. “Real gas equipment costs are 12 percent higher and operating costs are 37 percent higher than for the base year of 1976,” says EIA.

Bottom Line

First, let’s not kid ourselves. There’s less oil and gas under US soil than there has ever been.

Technology has at best done a little more than keep the wolf from the door, by lowering the cost of exploration and development by enough to offset the increases that would result from increasing physical scarcity.

It’s possible that the effects on shale and tight gas cost and availability have been dramatic, but there are plausible alternative hypotheses (financial crisis, moving up supply curves, and delays in production capital investment) for current prices.

Personally, I doubt that technology can keep up with physical scarcity and demand growth forever, so I don’t expect that gas prices will continue walking back to 1970 or 1960 levels. The picture for oil is even worse. But I hope that at some point, we’ll come to our senses and tax CO2 at a level high enough to reverse consumption growth. If that happens abruptly enough, it could drive down wellhead prices.

None of this sounds like the kind of tailfins and big-block V8 abundance that people seem to be hoping for.


The myth of optimal depletion

Fifteen years ago, when I was working on my dissertation, I read a lot of the economic literature on resource management. I was looking for a behavioral model of the management of depletable resources like oil and gas. I never did find one (and still haven’t, though I haven’t been looking as hard in the last few years).

Instead, the literature focused on optimal depletion models. Essentially these characterize the extraction of resources that would occur in an idealized market – a single, infinitely-lived resource manager, perfect information about the resource base and about the future (!), no externalities, no lock-in effects.

It’s always useful to know the optimal trajectory for a managed resource – it identifies the upper bound for improvement and suggests strategic or policy changes to achieve the ideal. But many authors have transplanted these optimal depletion models into real-world policy frameworks directly, without determining whether the idealized assumptions hold in reality.

The problem is that they don’t. There are some obvious failings – for example, I’m pretty certain a priori that no resource manager actually knows the future. Unreal assumptions are reflected in unreal model behavior – I’ve seen dozens of papers that discuss results matching the classic Hotelling framework – prices rising smoothly at the interest rate, with the extraction rate falling to match, as if it had something to do with what we observe.
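The classic Hotelling behavior is easy to reproduce in a stylized simulation; the interest rate, demand curve, and initial price below are arbitrary illustrative choices, not calibrated values:

```python
# Stylized Hotelling depletion: with perfect foresight and no extraction
# costs, the resource price rises at the interest rate, and extraction
# falls along the demand curve. A sketch, not a claim about real markets.
import math

r = 0.05          # interest rate (illustrative)
p0 = 10.0         # initial price (illustrative)
stock = 1000.0    # resource stock (illustrative)

def demand(p):
    return 100.0 / p   # isoelastic demand, elasticity -1 (assumed)

prices, quantities = [], []
remaining = stock
for t in range(60):
    p = p0 * math.exp(r * t)       # price grows smoothly at the interest rate
    q = min(demand(p), remaining)  # extraction follows demand downward
    remaining -= q
    prices.append(p)
    quantities.append(q)

print(round(prices[10] / prices[0], 3))   # e^(0.05*10), i.e. ~1.649
```

The simulated path is exactly the smooth rise the papers report, and exactly what oil and gas price histories do not look like.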

The fundamental failure is valuing the normative knowledge about small, analytically tractable problems above the insight that arises from experiments with a model that describes actual decision making – complete with cognitive limitations, agency problems, and other foibles.

In typical optimal depletion models, an agent controls a resource and extracts it to maximize discounted utility. Firms succeed in managing other assets reasonably well, so why not? Well, there’s a very fundamental problem: in most places, firms don’t control resources. They control reserves. Governments control resources. As a result, firms’ ownership of the long-term depletion challenge extends only as far as their asset exposure – a few decades at most. If there are principal-agent problems within firms, their effective horizon is even shorter – only as long as the tenure of a manager (worse things can happen, too).
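The horizon point can be made concrete with a discounting sketch; the 10% discount rate and the horizons below are illustrative assumptions:

```python
# For a perpetual constant flow discounted at rate r, the fraction of total
# present value lying beyond year T is (1+r)^-T. Illustrative numbers only.
r = 0.10   # assumed discount rate

def pv_fraction_beyond(horizon_years):
    return (1 + r) ** -horizon_years

print(round(pv_fraction_beyond(30), 3))   # a few decades: <6% of value remains
print(round(pv_fraction_beyond(5), 3))    # a 5-year manager tenure: ~62% ignored
```

A manager optimizing over a five-year tenure effectively discards most of the resource's long-run value from the decision.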

Governments are no better; politicians and despots both have incentives to deplete resources to raise money to pacify the populace. This encourages a “sell low” strategy – when oil prices are low, governments have to sell more to meet fixed obligations (the other end of the backward-bending supply curve). And, of course, a government that wisely shepherds its resources can always lose them to a neighbor that extracts its resources quickly and invests the proceeds in military hardware.
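The "sell low" mechanism is just a fixed revenue requirement divided by price; the numbers below are hypothetical:

```python
# A government with a fixed budget obligation must sell more when the price
# falls: quantity supplied rises as price drops, the backward-bending
# supply response. Illustrative numbers.
revenue_needed = 100.0   # fixed obligation, $bn/yr (hypothetical)

def barrels_sold(price):
    return revenue_needed / price

q_high = barrels_sold(100.0)  # quantity at a high price
q_low = barrels_sold(50.0)    # price halves, so sales must double
print(q_high, q_low)
```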

The US is unusual in that many mineral rights are privately held, but still the government’s management of its share is instructive. I’ll just skip over the circus at the MMS and go to Montana’s trust lands. The mission of the trust is to provide a permanent endowment for public schools. But the way the trust is run could hardly be less likely to maximize or even sustain school revenue.

Fundamentally, the whole process is unmanaged – the trust makes no attempt to control the rate at which parcels are leased for extraction. Instead, trust procedures put the leasing of tracts in the hands of developers – parcels are auctioned whenever a prospective bidder requests. Once anyone gets a whiff of information about the prospects of a tract, they must act to bid – if they’re early enough, they may get lucky and face little or no competition in the auction (easier than you’d think, because the trust doesn’t provide much notice of sales). Once buyers obtain a lease, they must drill within five years, or the lease expires. This land rush mentality leaves the trust with no control over price or the rate of extraction – they just take their paltry 16% cut (plus or minus), whenever developers choose to give it to them. When you read statements from the government resource managers, they’re unapologetically happy about it: they talk about the trust as if it were a jobs program, not an endowment.

This sort of structure is the norm, not the exception. It would be a strange world in which all of the competing biases in the process cancelled each other out, and yielded a globally optimal outcome in spite of local irrationality. The result, I think, is that policies in climate and energy models are biased, possibly in an unknown direction. On one hand, it seems likely that there’s a negative externality from extraction of public resources above the optimal rate, as in Montana. On the other hand, there might be harmful spillovers from climate or energy policies that increase the use of natural gas, if they exacerbate problems with a suboptimal extraction trajectory.

I’ve done a little sniffing around lately, and it seems that the state of the art in integrated assessment models isn’t too different from what it was in 1995 – most models still use exogenous depletion trajectories or some kind of optimization or equilibrium approach. The only real innovation I’ve seen is a stochastic model-within-a-model approach – essentially, agents know the structure of the system they’re in, but are uncertain about its state, so they make stochastically optimal decisions at each point in time. This is a step in the right direction, but still implies a very high cognitive load and degree of intended rationality that doesn’t square with real institutions. I’d be very interested to hear about anything new that moves toward a true behavioral model of resource management.