How I learned to stop worrying and love methane

RealClimate has a nice summary of recent atmospheric methane findings. Here’s the structure:

The bad news (red) has been that methane release from permafrost and clathrates on the continental shelf appears to be significant. At the same time, methane release from natural gas seems to be larger than previously thought, and (partly for the same reason – fracking) gas resources appear to be larger. Both put upward pressure on atmospheric methane.

However, there are some constraints as well. The methane budget must be consistent with observations of atmospheric concentrations and gradients (green). Therefore, if one source is thought to be bigger, it must be the case historically that other natural or anthropogenic sources are smaller (or perhaps uptake is faster) by an offsetting amount (blue).
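To make the arithmetic concrete, here’s a one-box sketch of that budget constraint. The numbers are round, illustrative values of my own, not figures from RealClimate or any inventory.

```python
# One-box sketch of the methane budget constraint described above.
# All numbers are round, illustrative values, not a real inventory.
lifetime = 9.0      # years, rough atmospheric lifetime of CH4
burden = 5000.0     # Tg of CH4 in the atmosphere, rough
growth = 15.0       # Tg/yr net increase implied by observed concentrations (illustrative)

# Mass balance: d(burden)/dt = total_sources - burden / lifetime
total_sources = growth + burden / lifetime   # pinned near ~570 Tg/yr by observations

# If one source (say, gas-system leakage) is revised upward by 20 Tg/yr,
# the observed total doesn't change, so other sources must be revised
# downward (or sinks upward) by an offsetting 20 Tg/yr.
revision = 20.0
required_offset = -revision
print(round(total_sources), required_offset)
```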

This bad-news-good-news story does not rule out positive feedbacks from temperature or atmospheric chemistry, but at least we’re not cooked yet.

Energy rich or poor?

The Energy Collective echoes amazement at unconventional oil and gas,

Daniel Yergin, vice chairman of IHS CERA:

“The United States is in the midst of the ‘unconventional revolution in oil and gas’ that, it becomes increasingly apparent, goes beyond energy itself.

“Owing to the scale and impact of shale gas and tight oil, it is appropriate to describe their development as the most important energy innovation so far of the 21st century. … It is striking to think back to the hearings of even just half a decade ago, during the turmoil of 2008, when it was widely assumed that a permanent era of energy shortage was at hand. How different things look today.”

Mary J. Hutzler, Institute for Energy Research:

“The United States has vast resources of oil, natural gas, and coal. In a few short years, a forty-year paradigm – that we were energy resource poor – has been disproven. Instead of being resource poor, we are incredibly energy rich.”

Abundance is often attributed to a technical miracle, brought about by government R&D into unconventional fossil fuels. The articulated mental model is something like this: government R&D yields a technological breakthrough, which yields abundance.

But is this really a revolutionary transition from scarcity to abundance, was it a surprise, and should technology get all the credit? I don’t think so.

(Abundance/Scarcity) = 1.03?

Contrast the 1995 and 2012 USGS National Assessments of onshore resources:

Resources, on an energy basis (EJ). Cumulative production from EIA; note that gas production data begins in 1980, so gas cumulative production is understated.

In spite of increasing unconventional resources, there’s actually less oil than there was, mainly because a lot of the 1995 resource has since been produced. (Certainly there are also other differences, including method changes.) For gas, where one can make a stronger case for a miracle due to the large increase in unconventional resources, the top line is up a whopping 3%. Even if you go with EIA/INTEK’s ~2x larger estimate for shale gas, resources are up only 35%.
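A trivial piece of bookkeeping shows why the comparison has to credit the production that occurred between the assessments. The figures below are hypothetical placeholders, not the USGS or EIA numbers.

```python
# Hypothetical bookkeeping, not the actual USGS/EIA figures.
assessed_1995 = 1000.0     # EJ, remaining technically recoverable resource (hypothetical)
produced_since = 250.0     # EJ, cumulative production between the assessments (hypothetical)
assessed_2012 = 780.0      # EJ (hypothetical)

# Comparing the headline numbers alone makes the resource look shrunken...
apparent_change = assessed_2012 / assessed_1995 - 1.0                       # -22%
# ...but crediting interim production shows only a modest expansion.
adjusted_change = (assessed_2012 + produced_since) / assessed_1995 - 1.0    # +3%
print(f"{apparent_change:+.0%} {adjusted_change:+.0%}")
```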

Call me conservative, but I think an abundance revolution that “disproves” scarcity would be a factor of 10 increase, not these piddly changes.

You could argue that the USGS hasn’t gotten the memo, and therefore has failed to appreciate new, vast unconventional resources. But given that they have reams of papers assessing unconventional fields, I think it more likely that they’re properly accounting for low recoverability, and not being bamboozled by large resources in place.

Reserves involve less guesswork, but more confounding dynamics. Even so, they tell about the same story as resources. Oil reserves are more than 40% off their 1970 peak. Even gas reserves have only just regained the levels achieved 40 years ago.

Source: EIA

Surprise?

In 1991, USGS’ Thomas Ahlbrandt wrote:

Unconventional natural gas resources are also becoming increasingly viable. Coalbed methane, which accounts for about 25 percent of potential natural gas resources in the U.S., will displace nearly a trillion cubic feet (TCF) of gas from conventional resources in the near term and perhaps several TCF by the turn of the century. Similarly, production of gas from low permeability resources may displace some production of conventional gas as increasingly smaller conventional accumulations are developed. Coalbed methane and tight gas, both abundant in the Rocky Mountain and Appalachian regions, will likely experience significant production increases. Optimistic scenarios suggest that tight gas and coalbed methane resources may provide more domestic natural gas production than conventional resources by the year 2010. Horizontal drilling technology will most likely unlock the large currently uneconomic gas resources in tight reservoirs. Technologies like this will most certainly change the status of what are presently considered unconventional resources.

I’d call that a “no.”

Should we be surprised to see supply increasing in the current price environment? Again, I’d say no. The idea that oil and gas have supply curves is certainly much older than its appearance in the 1995 USGS assessment. Perhaps the ongoing increase in shale gas development, when prices have collapsed, is a bit surprising. But then you have to consider that (a) drilling costs have tanked alongside the economy, (b) there are lags between price, perception, capital allocation, and production, and (c) it’s expectations of price, not current prices, that drive investment.
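A toy sketch of points (b) and (c): drilling responds to a lagged expectation of price, and output responds to drilling only after a construction delay, so production can keep climbing for a while after a price collapse. Every parameter and the price path below are invented.

```python
import numpy as np

# Toy illustration of lags between price, expectations, drilling, and output.
dt = 0.25                              # years per step
t = np.arange(0.0, 10.0, dt)
price = np.where(t < 4.0, 8.0, 3.0)    # price collapses at year 4 (hypothetical $/MMBtu)

expectation_time = 2.0                 # years to revise price expectations
construction_delay = 1.5               # years from drilling decision to first gas

expected = 8.0                         # expected price
pipeline = 0.6                         # wells in progress (arbitrary units)
output = 1.0                           # producing capacity (arbitrary units)
trajectory = []

for p in price:
    expected += dt * (p - expected) / expectation_time     # adaptive expectations
    starts = 0.1 * max(expected - 4.0, 0.0)                # drilling driven by expected price
    completions = pipeline / construction_delay
    pipeline += dt * (starts - completions)
    output += dt * (completions - 0.08 * output)           # existing wells decline ~8%/yr
    trajectory.append(output)

# Output keeps rising for a while after the price collapse, then rolls over
# as expectations and the drilling pipeline catch up with the lower price.
peak_year = t[int(np.argmax(trajectory))]
print(round(float(peak_year), 2), round(max(trajectory), 2))
```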

Does tech get the credit?

Certainly tech gets some credit. For example, the Bakken oil boom owes much to horizontal drilling:

Source: EIA

But there’s more than tech going on. And much of the tech evolution is surely a function of industry activity funded out of revenue or accumulated through production experience, rather than pure government R&D.

If tech is the exclusive driver of increasing abundance, you’d expect costs and prices to be falling. Gas prices are indeed well off their recent peak, though one could wonder whether that’s a durable circumstance. Even so, gas is no cheaper than it was in the 90s, and more costly than in the pre-OPEC era. Oil isn’t cheap at all – it’s close to its historic highs.

So, if there’s anything here that one might call a tech fingerprint, it would have to be the decline in gas prices post-mid-2008. But that coincides better with the financial crisis than with the gas boom.

Cost data are less current, but if anything the cost picture is less sanguine. “Real gas equipment costs are 12 percent higher and operating costs are 37 percent higher than for the base year of 1976,” says EIA.

Bottom Line

First, let’s not kid ourselves. There’s less oil and gas under US soil than there has ever been.

Technology has at best done a little more than keep the wolf from the door, lowering the cost of exploration and development just enough to offset the cost increases that would otherwise result from increasing physical scarcity.

It’s possible that the effects on shale and tight gas cost and availability have been dramatic, but there are plausible alternative hypotheses (financial crisis, moving up supply curves, and delays in production capital investment) for current prices.

Personally, I doubt that technology can keep up with physical scarcity and demand growth forever, so I don’t expect that gas prices will continue walking back to 1970 or 1960 levels. The picture for oil is even worse. But I hope that at some point, we’ll come to our senses and tax CO2 at a level high enough to reverse consumption growth. If that happens abruptly enough, it could drive down wellhead prices.

None of this sounds like the kind of tailfins and big-block V8 abundance that people seem to be hoping for.


Gas – a bridge to nowhere?

NPR has a nice piece on the US natural gas boom.

Henry Jacoby, an economist at the Center for Energy and Environmental Policy Research at MIT, says cheap energy will help pump up the economy.

“Overall, this is a great boon to the United States,” he says. “It’s not a bad thing to have this new and available domestic resource.” He says cheap energy can boost the economy, and he notes that natural gas is half as polluting as coal when it’s burned for electricity.

“But we have to keep our eye on the ball long-term,” Jacoby says. He’s concerned about how cheap gas will affect much cleaner sources of energy. Wind and solar power are more expensive than natural gas, and though those prices have been coming down, they’re chasing a moving target that has fallen fast: natural gas.

“It makes the prospects for large-scale expansion of those technologies more chancy,” Jacoby says.

From an environmental perspective, natural gas could help transition our economy from fossil fuels to clean energy. It’s often portrayed as a bridge fuel to help us through the transition, because it’s so much cleaner than coal and it’s abundant. But Jacoby says that bridge could be in trouble if cheap gas kills the incentive to develop renewable industry.

“You’d better be thinking about a landing of the bridge at the other end. If there’s no landing at the other end, it’s just a bridge to nowhere,” he says.

(For those who don’t know, Jake Jacoby is not a warm-fuzzy greenie; he’s a hard-line economist who leads a big general equilibrium modeling project, but also takes climate science seriously.)

For me, the key takeaways are:

  • Gas beats coal, and may have other useful roles to play. For example, gas backup might be a low-capital-cost complement to variable renewables, with minor emissions consequences (see the sketch after this list).
  • It’s better to have more resources than less.
  • Whether the opportunity of greater resources translates into a benefit depends on whether the price of gas accounts for full costs.
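As a rough illustration of the first bullet, a gas fleet that only fills the gaps left by variable renewables runs at a low capacity factor, which is why capital cost per kW matters more than fuel use, and why the associated emissions stay modest. All numbers below are invented.

```python
import numpy as np

# Toy residual-load dispatch: gas fills whatever demand renewables don't cover.
rng = np.random.default_rng(0)
hours = 8760
demand = 100.0 + 20.0 * rng.standard_normal(hours)   # MW, hypothetical load
renewables = 200.0 * rng.random(hours)                # MW, hypothetical variable output

backup = np.maximum(demand - renewables, 0.0)         # gas fills the residual load
capacity_factor = backup.mean() / backup.max()
energy_share = backup.sum() / demand.sum()

print(f"backup capacity factor ~{capacity_factor:.0%}, gas energy share ~{energy_share:.0%}")
# The gas fleet is sized for peaks but burns comparatively little fuel,
# which is why low capital cost per kW is the attribute that matters.
```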

The last item is a problem. In the US, the price of greenhouse emissions from gas (or anything else) is approximately zero. The effective prices of other environmental consequences – air quality, pollution from fracking, etc. – are also low. Depletion rents for gas are probably also too low, because the abundance of gas is overhyped, and public resources were suboptimally over-allocated decades ago. Low depletion rents encourage a painful boom/bust of gas supply.

It’s not only physical assets that are mispriced. Another part of the story is learning-by-doing, deliberate R&D, and economies of scale – positive feedbacks that grow the market for low-emissions technologies. Firms producing new tech like PV or wind turbines are only able to appropriate part of the profits of their innovations. The rest spills over to benefit society more generally. Too-cheap gas undercuts these reinforcing mechanisms, so gas substitutes aren’t available when scarcity inevitably returns, hence the “bridge to nowhere” dynamic.
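A minimal sketch of the learning-curve mechanism, using Wright’s law with an illustrative 20% learning rate (a placeholder, not an estimate for PV, wind, or anything else):

```python
import math

# Toy learning curve (Wright's law): unit cost falls a fixed fraction for
# every doubling of cumulative production. Parameters are illustrative.
def unit_cost(cumulative, initial_cost=4.0, learning_rate=0.20):
    exponent = math.log2(1.0 - learning_rate)
    return initial_cost * cumulative ** exponent

# Cheap gas slows deployment, cumulative production grows more slowly,
# costs stay higher, which slows deployment further: the reinforcing loop
# that would have delivered cheap substitutes runs in reverse.
for cumulative in (1, 10, 100, 1000):
    print(cumulative, round(unit_cost(cumulative), 2))
```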

Long-term renewable deployment in the U.S. is going to depend primarily on policy. Is there enough concern about environmental consequences to put in place incentives for renewable energy?

Trevor Houser, energy analyst, Rhodium Group

The key is, what kind of policy? Currently, we rely primarily on performance standards and subsidies. These aren’t getting the job done, for structural reasons. For example, subsidies are self-extinguishing, because they get too expensive to sustain when the target gets too big (think solar feed-in tariffs in Europe). They’re also politically vulnerable to apparently-cheap alternatives:

“If those prices hang around for another three or four years, then I think you’ll definitely see reduced political will for renewable energy deployment,” Houser says.
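The self-extinguishing-subsidy arithmetic is simple enough to write down; the numbers here are hypothetical:

```python
# Back-of-envelope version of the "self-extinguishing subsidy" point: the
# fiscal cost is the per-unit subsidy times deployment, so the more the
# subsidy succeeds, the more it costs. All numbers are hypothetical.
subsidy = 0.10          # $/kWh premium (hypothetical)
generation = 5e9        # kWh/yr of subsidized output initially (hypothetical)
growth = 0.40           # 40%/yr growth while the subsidy is working

for year in range(8):
    outlay = subsidy * generation
    print(year, f"${outlay / 1e9:.1f}B per year")
    generation *= 1.0 + growth
# By the time the favored technology is big enough to matter, the program
# is too expensive to sustain, cf. European feed-in tariffs.
```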

The basic problem is that the mindset of subsidizing or requiring “good” technologies makes them feel like luxuries for rich altruists, even though the apparently-cheap alternatives may be merely penny-wise and pound-foolish. The essential alternative is to price the bads, with the logic that people who want to use technologies that harm others ought to at least pay for the privilege. If we can’t manage to do that, I don’t think there’s much hope of getting gas or climate policy right.

The myth of optimal depletion

Fifteen years ago, when I was working on my dissertation, I read a lot of the economic literature on resource management. I was looking for a behavioral model of the management of depletable resources like oil and gas. I never did find one (and still haven’t, though I haven’t been looking as hard in the last few years).

Instead, the literature focused on optimal depletion models. Essentially these characterize the extraction of resources that would occur in an idealized market – a single, infinitely-lived resource manager, perfect information about the resource base and about the future (!), no externalities, no lock-in effects.

It’s always useful to know the optimal trajectory for a managed resource – it identifies the upper bound for improvement and suggests strategic or policy changes to achieve the ideal. But many authors have transplanted these optimal depletion models into real-world policy frameworks directly, without determining whether the idealized assumptions hold in reality.

The problem is that they don’t. There are some obvious failings – for example, I’m pretty certain a priori that no resource manager actually knows the future. Unreal assumptions are reflected in unreal model behavior: I’ve seen dozens of papers discussing results that match the classic Hotelling framework (prices rising smoothly at the interest rate, with the extraction rate falling to match), as if that had something to do with what we observe.
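For readers who haven’t met it, the textbook result being invoked is easy to reproduce. This is a minimal sketch with illustrative parameters, not a reconstruction of any of those papers:

```python
import numpy as np

# Minimal Hotelling sketch: a single owner with perfect foresight, zero
# extraction cost, and iso-elastic demand. The price must rise at the rate
# of interest, and the initial price adjusts so that cumulative extraction
# just uses up the resource. All parameters are illustrative.
r = 0.05                     # interest rate
years = np.arange(100)       # planning horizon (a stand-in for "forever")
elasticity = -0.5            # demand: q = a * p**elasticity
a = 10.0
resource = 500.0             # total recoverable resource (arbitrary units)

def cumulative_extraction(p0):
    price = p0 * np.exp(r * years)        # Hotelling: price grows at r
    quantity = a * price ** elasticity    # extraction falls as price rises
    return quantity.sum()

# Bisect for the initial price that just exhausts the resource.
lo, hi = 0.01, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if cumulative_extraction(mid) > resource else (lo, mid)

p0 = 0.5 * (lo + hi)
print(round(p0, 2))   # smooth exponential prices, smoothly falling extraction
```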

The fundamental failure is valuing the normative knowledge about small, analytically tractable problems above the insight that arises from experiments with a model that describes actual decision making – complete with cognitive limitations, agency problems, and other foibles.

In typical optimal depletion models, an agent controls a resource, and extracts it to maximize discounted utility. Firms succeed in managing other assets reasonably well, so why not? Well, there’s a very fundamental problem: in most places, firms don’t control resources. They control reserves. Governments control resources. As a result, firms’ ownership of the long term depletion challenge extends only as far as their asset exposure – a few decades at most. If there are principal-agent problems within firms, their effective horizon is even shorter – only as long as the tenure of a manager (worse things can happen, too).

Governments are no better; politicians and despots both have incentives to deplete resources to raise money to pacify the populace. This encourages a “sell low” strategy – when oil prices are low, governments have to sell more to meet fixed obligations (the other end of the backward-bending supply curve). And, of course, a government that wisely shepherds its resources can always lose them to a neighbor that extracts its resources quickly and invests the proceeds in military hardware.
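The “sell low” logic amounts to one line of algebra; the revenue target below is arbitrary:

```python
# If a government must raise a fixed revenue R from its resource, the
# quantity it offers is q = R / p, so supply rises as the price falls --
# one way to get a backward-bending supply curve. Numbers are illustrative.
revenue_target = 100.0
for price in (10.0, 5.0, 2.5):
    print(price, revenue_target / price)   # lower price, more barrels offered
```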

The US is unusual in that many mineral rights are privately held, but still the government’s management of its share is instructive. I’ll just skip over the circus at the MMS and go to Montana’s trust lands. The mission of the trust is to provide a permanent endowment for public schools. But the way the trust is run could hardly be less likely to maximize or even sustain school revenue.

Fundamentally, the whole process is unmanaged – the trust makes no attempt to control the rate at which parcels are leased for extraction. Instead, trust procedures put the leasing of tracts in the hands of developers – parcels are auctioned whenever a prospective bidder requests one. Once anyone gets a whiff of information about the prospects of a tract, they must act to bid – if they’re early enough, they may get lucky and face little or no competition in the auction (easier than you’d think, because the trust doesn’t provide much notice of sales). Once buyers obtain a lease, they must drill within five years, or the lease expires. This land-rush mentality leaves the trust with no control over price or the rate of extraction – they just take their paltry 16% cut (plus or minus), whenever developers choose to give it to them. When you read statements from the government resource managers, they’re unapologetically happy about it: they talk about the trust as if it were a jobs program, not an endowment.

This sort of structure is the norm, not the exception. It would be a strange world in which all of the competing biases in the process cancelled each other out, and yielded a globally optimal outcome in spite of local irrationality. The result, I think, is that evaluations of policies in climate and energy models are biased, possibly in an unknown direction. On one hand, it seems likely that there’s a negative externality from extraction of public resources above the optimal rate, as in Montana. On the other hand, there might be harmful spillovers from climate or energy policies that increase the use of natural gas, if they exacerbate problems with a suboptimal extraction trajectory.

I’ve done a little sniffing around lately, and it seems that the state of the art in integrated assessment models isn’t too different from what it was in 1995 – most models still use exogenous depletion trajectories or some kind of optimization or equilibrium approach. The only real innovation I’ve seen is a stochastic model-within-a-model approach – essentially, agents know the structure of the system they’re in, but are uncertain about its state, so they make stochastically optimal decisions at each point in time. This is a step in the right direction, but still implies a very high cognitive load and degree of intended rationality that doesn’t square with real institutions. I’d be very interested to hear about anything new that moves toward a true behavioral model of resource management.
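For contrast with the Hotelling sketch above, here’s a caricature of the kind of decision rule a behavioral model might substitute: no foresight, no optimization, capacity anchored on current margins, and the firm seeing only its booked reserves rather than the resource. This is purely my own illustration, not drawn from any published model.

```python
# Caricature of a behavioral decision rule, for contrast with the Hotelling
# sketch above. Everything here is invented for illustration.
reserves = 100.0            # what the firm controls
resource = 900.0            # what it doesn't (undiscovered / unbooked)
capacity, price, cost = 2.0, 5.0, 3.0
path = []

for year in range(40):
    target = capacity * (1.2 if price > cost else 0.8)   # crude anchoring-and-adjustment
    capacity += (target - capacity) / 3.0                # ~3-year capacity adjustment
    production = min(capacity, reserves / 10.0)          # hold a 10-year reserve/production ratio
    discoveries = min(production, resource / 30.0)       # replace reserves out of the resource
    reserves += discoveries - production
    resource -= discoveries
    cost = 3.0 * (900.0 / max(resource, 1.0)) ** 0.5     # costs rise as the resource depletes
    path.append(round(production, 1))

# Production expands on current margins until the visible reserve constraint
# binds; nothing in the rule looks ahead to the depletion of the resource.
print(path)
```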