Energy rich or poor?

The Energy Collective echoes amazement at unconventional oil and gas.

Daniel Yergin, vice chairman of IHS CERA:

“The United States is in the midst of the ‘unconventional revolution in oil and gas’ that, it becomes increasingly apparent, goes beyond energy itself.

“Owing to the scale and impact of shale gas and tight oil, it is appropriate to describe their development as the most important energy innovation so far of the 21st century. … It is striking to think back to the hearings of even just half a decade ago, during the turmoil of 2008, when it was widely assumed that a permanent era of energy shortage was at hand. How different things look today.”

Mary J. Hutzler, Institute for Energy Research:

“The United States has vast resources of oil, natural gas, and coal. In a few short years, a forty-year paradigm – that we were energy resource poor – has been disproven. Instead of being resource poor, we are incredibly energy rich.”

Abundance is often attributed to a technical miracle, brought about by government R&D into unconventional fossil fuels. The articulated mental model is something like this: government R&D yields new extraction technology, which unlocks vast unconventional resources, which deliver abundance.

But is this really a revolutionary transition from scarcity to abundance, was it a surprise, and should technology get all the credit? I don’t think so.

(Abundance/Scarcity) = 1.03?

Contrast the 1995 and 2012 USGS National Assessments of onshore resources:

Resources, on an energy basis (EJ). Cumulative production from EIA; note that gas production data begins in 1980, so gas cumulative production is understated.

In spite of increasing unconventional resources, there’s actually less oil than there was, mainly because a lot of the 1995 resource has since been produced. (Certainly there are also other differences, including method changes.) For gas, where one can make a stronger case for a miracle due to the large increase in unconventional resources, the top line is up a whopping 3%. Even if you go with EIA/INTEK’s ~2x larger estimate for shale gas, resources are up only 35%.

Call me conservative, but I think an abundance revolution that “disproves” scarcity would be a factor of 10 increase, not these piddly changes.

You could argue that the USGS hasn’t gotten the memo, and therefore has failed to appreciate new, vast unconventional resources. But given that they have reams of papers assessing unconventional fields, I think it more likely that they’re properly accounting for low recoverability, and not being bamboozled by large resources in place.

Reserves involve less guesswork, but more confounding dynamics. Still, they tell about the same story as resources. Oil reserves are more than 40% off their 1970 peak. Even gas reserves have only just regained the levels achieved 40 years ago.

[Figure: US oil and gas reserves. Source: EIA]

Surprise?

In 1991, USGS’ Thomas Ahlbrandt wrote:

Unconventional natural gas resources are also becoming increasingly viable. Coalbed methane, which accounts for about 25 percent of potential natural gas resources in the U.S., will displace nearly a trillion cubic feet (TCF) of gas from conventional resources in the near term and perhaps several TCF by the turn of the century. Similarly, production of gas from low permeability resources may displace some production of conventional gas as increasingly smaller conventional accumulations are developed. Coalbed methane and tight gas, both abundant in the Rocky Mountain and Appalachian regions, will likely experience significant production increases. Optimistic scenarios suggest that tight gas and coalbed methane resources may provide more domestic natural gas production than conventional resources by the year 2010. Horizontal drilling technology will most likely unlock the large currently uneconomic gas resources in tight reservoirs. Technologies like this will most certainly change the status of what are presently considered unconventional resources.

I’d call that a “no.”

Should we be surprised to see supply increasing in the current price environment? Again, I’d say no. The idea that oil and gas have supply curves is certainly much older than its appearance in the 1995 USGS assessment. Perhaps the ongoing increase in shale gas development, when prices have collapsed, is a bit surprising. But then you have to consider that (a) drilling costs have tanked alongside the economy, (b) there are lags between price, perception, capital allocation, and production, and (c) it’s expectations of price, not current prices, that drive investment.
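
To illustrate point (c) and the lags in (b), here’s a minimal sketch (my own toy, with invented parameters, not a calibrated model): expected price is a first-order smoothing of actual price, and capacity adjusts toward a target driven by expected price with a multi-year delay. Production is still near its boom-era level a couple of years after the price collapses.

```python
import numpy as np

# Toy illustration with invented parameters: first-order lags between price,
# price expectations, and production capacity.
dt = 0.25            # years per step
steps = int(10 / dt) # 10-year horizon
tau_expect = 2.0     # years for expectations to adjust
tau_capacity = 3.0   # years to build or retire capacity

time = np.arange(steps) * dt
price = np.where(time < 5, 8.0, 3.0)     # price collapses at year 5
expected = np.empty(steps)
capacity = np.empty(steps)
expected[0] = price[0]
capacity[0] = 6.0                        # start below the boom-era target

for t in range(1, steps):
    # Expectations chase actual price with a lag
    expected[t] = expected[t-1] + dt * (price[t-1] - expected[t-1]) / tau_expect
    # Desired capacity proportional to expected price (a toy supply curve)
    desired = 1.25 * expected[t]
    capacity[t] = capacity[t-1] + dt * (desired - capacity[t-1]) / tau_capacity

print("capacity when price collapses (yr 5):", round(capacity[int(5/dt)], 2))
print("capacity two years later (yr 7):     ", round(capacity[int(7/dt)], 2))
```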

Does tech get the credit?

Certainly tech gets some credit. For example, the Bakken oil boom owes much to horizontal drilling:

[Figure: Bakken oil production and horizontal drilling. Source: EIA]

But there’s more than tech going on. And much of the tech evolution is surely a function of industry activity funded out of revenue or accumulated through production experience, rather than pure government R&D.

If tech is the exclusive driver of increasing abundance, you’d expect costs and prices to be falling. Gas prices are indeed well off their recent peak, though one could wonder whether that’s a durable circumstance. Even so, gas is no cheaper than it was in the 90s, and more costly than in the pre-OPEC era. Oil isn’t cheap at all – it’s close to its historic highs.

So, if there’s anything here that one might call a tech fingerprint, it would have to be the decline in gas prices post-mid-2008. But that coincides better with the financial crisis than with the gas boom.

Cost data are less current, but if anything the cost picture is less sanguine. “Real gas equipment costs are 12 percent higher and operating costs are 37 percent higher than for the base year of 1976,” says EIA.

Bottom Line

First, let’s not kid ourselves. There’s less oil and gas under US soil than there has ever been.

Technology has at best done a little more than keep the wolf from the door, by lowering the cost of exploration and development by enough to offset the increases that would result from increasing physical scarcity.

It’s possible that technology’s effects on shale and tight gas cost and availability have been dramatic, but there are plausible alternative hypotheses (the financial crisis, movement up supply curves, and delays in production capital investment) for current prices.

Personally, I doubt that technology can keep up with physical scarcity and demand growth forever, so I don’t expect that gas prices will continue walking back to 1970 or 1960 levels. The picture for oil is even worse. But I hope that at some point, we’ll come to our senses and tax CO2 at a level high enough to reverse consumption growth. If that happens abruptly enough, it could drive down wellhead prices.

None of this sounds like the kind of tailfins and big-block V8 abundance that people seem to be hoping for.

Are environmental regulations the real constraint on US energy output?

When times are tough, there are always calls to unravel environmental regulations and drill, baby, drill. I’m first in line to say that a lot of environmental regulation needs a paradigm shift, but this strikes me as a foolish hair-of-the-dog-that-bit-ya idea. Our current problems don’t come from regulation, and won’t be solved by deregulation.

On average, there’s no material deprivation in the US. We consume more petroleum per capita than any other large nation. Our problems are largely distributional – inequitable income distribution and, recently, high unemployment, which causes disproportionate harm to a few. Why solve a distributional problem by skewing environmental policy? This smacks of an attempt to grow out of our problems, which is surely doomed to the extent that growth relies on intensifying material throughput.

Consider the system:

The underlying mental model behind calls for deregulation sounds like the following: environmental regulations create compliance costs that drive up the total cost of resource extraction, depressing the production rate and depriving the people of needed $$$ and happiness. Certainly that causal path exists. But it’s not the only thing going on.

Those regulations were created for a reason. They reduce environmental impacts, and therefore reduce the unpaid social costs that occur as side effects of oil production and consumption, and therefore improve welfare. These effects are nontrivial, unless you’re a GOP presidential candidate. One could wish for more efficient regulations, but absent that, wishing for less regulation is tantamount to wishing for more environmental consequences and social costs, and hoping that more $$$ will offset that.

Even the basic open-loop rationale for deregulation makes little sense. Resource policy is already loose, so there’s no quantity constraint on production. With the exception of ANWR and some offshore areas, most interesting areas are already leased. Montana certainly doesn’t exercise any foresight in the management of its trust lands. Environmental regulations have hardly become more stringent in the last decade or so. Oil production in 1999 was higher than it is today, with oil prices well below $20/bbl, so compliance costs must be well under $20/bbl. With oil at $100/bbl, then, we’d expect an explosion of supply if regulatory costs were the only constraint. In fact, there’s barely an upward blip, so there must be something else at work…

The real problem is that there’s feedback in the system. For example, there’s balancing loop B1: as you extract more stuff, the remaining resource (oil in the ground) dwindles, and the physical costs of extraction – capital, labor, energy – go up. Technology can stave off that trend for some time, but prices and production trends make it clear that B1 is now dominant. This means that there’s a rather stark better-before-worse tradeoff: if we extract oil more quickly now, to hoist ourselves out of the financial crisis, we’ll have less later. But it seems likely that we’ll be even more desperate later – either to have that oil in an even pricier world market, or to keep it in the ground due to climate concerns. Consider what would have happened if we’d had no environmental constraints on oil production for the last three or four decades. Would the US now have more or less oil to rely on? Would we be happy that we pumped up all that black gold at under $20/bbl? Even the Hotelling rule is telling us that we should leave oil in the ground, as long as prices are rising faster than the interest rate (not hard, at current rates).
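
As a minimal, uncalibrated rendering of loop B1 (invented numbers, just to show the shape of the feedback): extraction draws down the resource stock, unit cost rises as the remaining stock shrinks, and at a fixed price the profit margin – and with it the extraction rate – gets squeezed toward zero.

```python
# Toy stock-flow sketch of balancing loop B1 (illustrative parameters only):
# extraction draws down the resource, and unit cost rises as the stock falls.
initial_resource = 1000.0   # arbitrary units
resource = initial_resource
price = 10.0                # constant price, $/unit

for year in range(76):
    # Unit cost rises as the resource is depleted
    unit_cost = 2.0 * initial_resource / resource
    # Extraction responds to the profit margin
    extraction = max(price - unit_cost, 0.0) * 1.5
    if year % 25 == 0:
        print(f"year {year:3d}: extraction {extraction:5.1f}/yr, "
              f"unit cost {unit_cost:5.1f}, resource left {resource:6.0f}")
    resource -= extraction
```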

Another loop is just gaining traction: B2. As the stock of oil in the ground is depleted, marginal production occurs in increasingly desperate and devastating circumstances. Either you pursue smaller, more remote fields, meaning more drilling and infrastructure disturbance in sensitive areas, or you pursue unconventional resources, like tar sands and shale gas, with resource-intensive methods and unknown hazards. A regulatory rollback would accelerate production via the most destructive extraction methods, right at the time that the physics of extraction is already shifting the balance of private benefits ($$$) and social costs unfavorably. Loop B2 also operates inequitably, much like unemployment. Not everyone is harmed by oil and gas development; the impacts fall disproportionately on the neighbors of projects, who may not even benefit due to severance of surface and mineral rights. This weakens the argument for deregulation even further.

Rather than pretending we can turn the clock back to 1970, we should be thinking carefully about our exit strategy for scarce and climate-constrained resources. There must be lots of things we can do to solve the distributional problems of the current crisis without socializing the costs and privatizing the gains of fossil fuel exploitation more than we already do.

Greater petroleum independence for the US?

The NYT enthuses about the prospects for new oil production in the Americas:

New Fields May Propel Americas to Top of Oil Companies’ Lists

Still, the new oil exploits in the Americas suggest that technology may be trumping geology, especially in the region’s two largest economies, the United States and Brazil. The rock formations in Texas and North Dakota were thought to be largely fruitless propositions before contentious exploration methods involving horizontal drilling and hydraulic fracturing — the blasting of water, chemicals and sand through rock to free oil inside, known as fracking — gained momentum.

While the contamination of water supplies by fracking is a matter of fierce environmental debate, the technology is already reversing long-declining oil production in the United States, with overall output from locations where oil is contained in shale and other rocks projected to exceed two million barrels a day by 2020, according to some estimates. The United States already produces about half of its own oil needs, so the increase could help it further peel away dependence on foreign oil.

Setting aside the big developments in Brazil and Canada, what does technology trumping geology, “reversing long-declining oil production in the United States” look like? Here’s the latest from EIA:

Somehow it’s not such a compelling story in pictures.

Summer driving is an emergency?

A coordinated release of emergency oil stockpiles is underway. It’s almost as foolish as that timeless chain email, the Great American Gasout (now migrated to Facebook, it seems), and for the same stock-flow reasons.

Like the Gasout, strategic reserve operations don’t do anything about demand; they just shuffle supply around in time. Releasing oil does augment supply, which causes a short-term price break. But at some point you have to refill the reserve. All else equal, putting oil back into storage has to come at the expense of supplying it for consumption, which means that the price goes back up at some other time.
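
A back-of-the-envelope version of that stock-flow point, with invented release and refill rates: whatever the reserve adds to the market while it’s being drawn down must be withheld from the market again while it’s refilled, so cumulative supply over the whole episode nets to zero.

```python
# Illustrative only (rates invented): a reserve release followed by a refill.
release_rate = 2.0    # million barrels/day added to the market for 30 days
refill_rate = 0.5     # million barrels/day withheld from the market for 120 days

added = release_rate * 30       # barrels out of storage during the release
removed = refill_rate * 120     # barrels back into storage during the refill

print("added to market during release:  ", added, "million barrels")
print("withheld from market to refill:  ", removed, "million barrels")
print("net extra supply over the episode:", added - removed)
```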

The implicit mental model here is that governments are going to buy low and sell high, releasing oil at high prices when there’s a crisis, and storing it when peaceful market conditions return. I rather doubt that political entities are very good at such things, but more importantly, where are the prospects for cheap refills, given tight supplies, strategic behavior by OPEC, and (someday) global recovery? It’s not even clear that agencies were successful at keeping the release secret, so a few market players may have captured a hefty chunk of the benefits of the release.

Setting dynamics aside, the strategic reserve release is hardly big enough to matter – the 60 million barrels planned isn’t even a day of global production. It’s only 39 days of Libyan production. Even if you have extreme views on price elasticity, that’s not going to make a huge difference – unless the release is extended. But extending the release through the end of the year would consume almost a quarter of world strategic reserves, without any clear emergency at hand.
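
Rough arithmetic behind those magnitudes, using round 2011 numbers that are my assumptions rather than figures from the post (world liquids output near 88 million barrels per day, pre-war Libyan output around 1.55 million barrels per day):

```python
# Magnitude check with approximate 2011 figures (assumed, not from the post).
release = 60.0            # million barrels, planned coordinated release
world_output = 88.0       # million barrels per day, approximate
libya_output = 1.55       # million barrels per day, approximate pre-war level

print("days of world production: ", round(release / world_output, 2))
print("days of Libyan production:", round(release / libya_output, 1))
```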

We should be saving those reserves for a real rainy day, and increasing the end-use price through taxes, to internalize environmental and security costs and recapture OPEC rents.

The myth of optimal depletion

Fifteen years ago, when I was working on my dissertation, I read a lot of the economic literature on resource management. I was looking for a behavioral model of the management of depletable resources like oil and gas. I never did find one (and still haven’t, though I haven’t been looking as hard in the last few years).

Instead, the literature focused on optimal depletion models. Essentially these characterize the extraction of resources that would occur in an idealized market – a single, infinitely-lived resource manager, perfect information about the resource base and about the future (!), no externalities, no lock-in effects.

It’s always useful to know the optimal trajectory for a managed resource – it identifies the upper bound for improvement and suggests strategic or policy changes to achieve the ideal. But many authors have transplanted these optimal depletion models into real-world policy frameworks directly, without determining whether the idealized assumptions hold in reality.

The problem is that they don’t. There are some obvious failings – for example, I’m pretty certain a priori that no resource manager actually knows the future. Unreal assumptions are reflected in unreal model behavior – I’ve seen dozens of papers that discuss results matching the classic Hotelling framework (prices rising smoothly at the interest rate, with the extraction rate falling to match), as if that had something to do with what we observe.
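
For reference, the canonical Hotelling condition under those idealized assumptions (a standard textbook statement, not something derived in this post): with constant marginal extraction cost c and discount rate r, the scarcity rent must grow at the rate of interest,

$$\frac{d}{dt}\bigl[P(t) - c\bigr] = r\,\bigl[P(t) - c\bigr] \quad\Longrightarrow\quad P(t) = c + (P_0 - c)\,e^{rt}$$

so with negligible extraction cost the price itself rises smoothly at r, and with a stationary demand curve the extraction rate falls monotonically until the resource is exhausted.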

The fundamental failure is valuing the normative knowledge about small, analytically tractable problems above the insight that arises from experiments with a model that describes actual decision making – complete with cognitive limitations, agency problems, and other foibles.

In typical optimal depletion models, an agent controls a resource, and extracts it to maximize discounted utility. Firms succeed in managing other assets reasonably well, so why not? Well, there’s a very fundamental problem: in most places, firms don’t control resources. They control reserves. Governments control resources. As a result, firms’ ownership of the long term depletion challenge extends only as far as their asset exposure – a few decades at most. If there are principal-agent problems within firms, their effective horizon is even shorter – only as long as the tenure of a manager (worse things can happen, too).

Governments are no better; politicians and despots both have incentives to deplete resources to raise money to pacify the populace. This encourages a “sell low” strategy – when oil prices are low, governments have to sell more to meet fixed obligations (the other end of the backward-bending supply curve). And, of course, a government that wisely shepherds its resources can always lose them to a neighbor that extracts its resources quickly and invests the proceeds in military hardware.

The US is unusual in that many mineral rights are privately held, but still the government’s management of its share is instructive. I’ll just skip over the circus at the MMS and go to Montana’s trust lands. The mission of the trust is to provide a permanent endowment for public schools. But the way the trust is run could hardly be less likely to maximize or even sustain school revenue.

Fundamentally, the whole process is unmanaged – the trust makes no attempt to control the rate at which parcels are leased for extraction. Instead, trust procedures put the leasing of tracts in the hands of developers – parcels are auctioned whenever a prospective bidder requests. Once anyone gets a whiff of information about the prospects of a tract, the incentive is to bid immediately – early movers may get lucky and face little or no competition in the auction (easier than you’d think, because the trust doesn’t provide much notice of sales). Once buyers obtain a lease, they must drill within five years, or the lease expires. This land-rush mentality leaves the trust with no control over price or the rate of extraction – it just takes its paltry 16% cut (plus or minus), whenever developers choose to give it to them. When you read statements from the government resource managers, they’re unapologetically happy about it: they talk about the trust as if it were a jobs program, not an endowment.

This sort of structure is the norm, not the exception. It would be a strange world in which all of the competing biases in the process cancelled each other out, and yielded a globally optimal outcome in spite of local irrationality. The result, I think, is that the policy conclusions of climate and energy models are biased, possibly in an unknown direction. On one hand, it seems likely that there’s a negative externality from extraction of public resources above the optimal rate, as in Montana. On the other hand, there might be harmful spillovers from climate or energy policies that increase the use of natural gas, if they exacerbate problems with a suboptimal extraction trajectory.

I’ve done a little sniffing around lately, and it seems that the state of the art in integrated assessment models isn’t too different from what it was in 1995 – most models still use exogenous depletion trajectories or some kind of optimization or equilibrium approach. The only real innovation I’ve seen is a stochastic model-within-a-model approach – essentially, agents know the structure of the system they’re in, but are uncertain about its state, so they make stochastically optimal decisions at each point in time. This is a step in the right direction, but still implies a very high cognitive load and degree of intended rationality that doesn’t square with real institutions. I’d be very interested to hear about anything new that moves toward a true behavioral model of resource management.
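
For contrast, here is the flavor of the kind of behavioral decision rule I have in mind – a sketch of my own invention, not a model from the literature or from this post: no foresight, no knowledge of the resource base, just anchoring on current capacity and adjustment toward recently perceived profitability.

```python
import numpy as np

# An invented sketch of a boundedly rational extraction rule: the operator
# anchors on current capacity and adjusts it toward perceived profitability,
# with no foresight and no knowledge of the remaining resource.
rng = np.random.default_rng(0)
years = 50
price = 10 + np.cumsum(rng.normal(0.0, 0.5, years))  # random-walk price path

resource = 500.0
capacity = 5.0            # current extraction capacity
adjust_time = 5.0         # years to adjust capacity
perceived_margin = 1.0    # smoothed recent profit per unit

for t in range(years):
    unit_cost = 4.0 * 500.0 / max(resource, 1.0)            # cost rises with depletion
    margin = price[t] - unit_cost
    perceived_margin += (margin - perceived_margin) / 3.0    # adaptive perception
    target = capacity * (1.0 + 0.1 * perceived_margin)       # anchor and adjust
    capacity = max(capacity + (target - capacity) / adjust_time, 0.0)
    extraction = min(capacity, resource)
    resource -= extraction

print(f"after {years} years: capacity {capacity:.1f}, resource left {resource:.0f}")
```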

EIA projections – peak oil or snake oil?

Econbrowser has a nice post from Steven Kopits, documenting big changes in EIA oil forecasts. This graphic summarizes what’s happened:

[Figure: successive EIA oil supply forecasts, via Steven Kopits. Click through for the original article.]

As recently as 2007, the EIA saw a rosy future of oil supplies increasing with demand. It predicted oil consumption would rise by 15 mbpd to 2020, an ample amount to cover most eventualities. By 2030, the oil supply would reach nearly 118 mbpd, or 23 mbpd more than in 2006. But over time, this optimism has faded, with each succeeding year forecast lower than the year before. For 2030, the oil supply forecast has declined by 14 mbpd in only the last three years. This drop is as much as the combined output of Saudi Arabia and China.

In its forecast, the EIA, normally the cheerleader for production growth, has become amongst the most pessimistic forecasters around. For example, its forecasts to 2020 are 2-3 mbpd lower than that of traditionally dour Total, the French oil major. And they are below our own forecasts at Douglas-Westwood through 2020. As we are normally considered to be in the peak oil camp, the EIA’s forecast is nothing short of remarkable, and grim.

Is it right? In the last decade or so, the EIA’s forecast has inevitably proved too rosy by a margin. While SEC-approved prospectuses still routinely cite the EIA, those who deal with oil forecasts on a daily basis have come to discount the EIA as simply unreliable and inappropriate as a basis for investments or decision-making. But the EIA appears to have drawn a line in the sand with its new IEO and placed its fortunes firmly with the peak oil crowd. At least to 2020.

Since production is still rising, I think you’d have to call this “inflection point oil,” but as a commenter points out, it does imply peak conventional oil:

It’s also worth note that most of the liquids production increase from now to 2020 is projected to be unconventional in the IEO. Most of this is biofuels and oil sands. They REALLY ARE projecting flat oil production.

Since I’d looked at earlier AEO projections in the past, I wondered what early IEO projections looked like. Unfortunately I don’t have time to replicate the chart above and overlay the earlier projections, but here’s the 1995 projection:

Oil - IEO 1995

The 1995 projections put 2010 oil consumption at 87 to 95 million barrels per day. That’s a bit high, but not terribly inconsistent with reality and the new predictions (especially if the financial bubble hadn’t burst). Projected consumption growth is 1.5%/year.

And here’s 2002:

Oil - IEO 2002

In the 2002 projection, consumption is at 96 million barrels per day in 2010 and 119 million barrels per day in 2020 (waaay above reality and the 2007-2010 projections), a 2.2%/year growth rate.
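
Those growth rates are just the compound annual rates implied by the quoted endpoints; a quick check:

```python
# Compound annual growth rates implied by the endpoints quoted above.
def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

# 2002 IEO: 96 mb/d in 2010 to 119 mb/d in 2020
print(f"2002 IEO, 2010-2020: {cagr(96, 119, 10):.1%}")   # ~2.2%/year

# Where the 1995 projection's 1.5%/year would put 2020 demand, starting
# from its 2010 range of 87-95 mb/d, versus 119 mb/d in the 2002 edition:
print(f"1995 trend to 2020: {87 * 1.015**10:.0f}-{95 * 1.015**10:.0f} mb/d")
```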

I haven’t looked at all the interim versions, but somewhere along the way a lot of optimism crept in (and recently, crept out). In 2002 the IEO oil trajectory was generated by a model called WEPS, so I downloaded WEPS2002 to take a look. Unfortunately, it’s a typical open-loop spreadsheet horror show. My enthusiasm for a detailed audit is low, but it looks like oil demand is purely a function of GDP extrapolation and GDP-energy relationships, with no hint of supply-side dynamics (not even prices, unless they emerge from other models in a sneakernet portfolio approach). There’s no evidence of resources, not even synchronized drilling. No wonder users came to “discount the EIA as simply unreliable and inappropriate as a basis for investments or decision-making.”
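
The structure being criticized boils down to something like the following caricature (my rendering, not the actual WEPS code): extrapolate GDP, apply an assumed intensity trend, multiply, and call the result oil demand – with no resource stock, no supply curve, and no price feedback anywhere.

```python
# A caricature of an open-loop demand projection (not the actual WEPS logic):
# demand = extrapolated GDP x assumed intensity trend, and nothing feeds back.
base_gdp = 100.0            # index
gdp_growth = 0.030          # assumed growth, per year
base_intensity = 1.0        # oil use per unit GDP, index
intensity_decline = 0.008   # assumed autonomous decline, per year

for year in range(0, 21, 5):
    gdp = base_gdp * (1 + gdp_growth) ** year
    intensity = base_intensity * (1 - intensity_decline) ** year
    print(f"year {year:2d}: oil demand index {gdp * intensity:6.1f}")
```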

Newer projections come from a new version, WEPS+. Hopefully it’s more internally consistent than the 2002 spreadsheet, and it does capture stock/flow dynamics and even includes resources. EIA appears to be getting better. But it appears that there’s still a fundamental problem with the paradigm: too much detail. There just isn’t any point in producing projections for dozens of countries, sectors and commodities two decades out, when uncertainty about basic dynamics renders the detail meaningless. It would be far better to work with simple models, capable of exploring the implications of structural uncertainty, in particular relaxing assumptions of equilibrium and idealized behavior.

Update: Michael Levi at the CFR blog points out that much of the difference in recent forecasts can be attributed to changes in GDP projections. Perhaps so. But I think this reinforces my point about detail, uncertainty, and transparency. If the model structure is basically consumption = f(GDP, price, elasticity) and those inputs have high variance, what’s the point of all that detail? It seems to me that the detail merely obscures the fundamentals of what’s going on, which is why there’s no simple discussion of reasons for the change in forecast.

Rising at the Interest Rate?

With oil back at $70, I got curious how Hotelling is holding up. The observation that resource prices ought to rise at the interest rate is looking almost plausible now, if you squint, whereas it looked rather foolish for most of the 80s and 90s. Of course, the actual production trajectory has nothing to do with Hotelling’s simple model, which produces a monotonic decline. The basic problem with Hotelling, as I see it, is that there’s a difference between equilibrium and expectations subject to uncertainty. Moreover the extraction trajectory is largely controlled by the rate at which governments lease or otherwise exploit resources, and governments have more than the usual dose of bounded rationality. (I got interested in this because I’ve been investigating Montana’s management of mineral rights on its school trust lands. So far, the state’s exercise of its fiduciary responsibility looks suspiciously like a corporate welfare program. More on that another time.)

Oil vs Interest Rates

The figure compares the nominal oil price trajectory to paths compounding at actual risk-free rates (3-month T-bills and the federal funds rate), as well as at three constant rates for good measure. At those rates, one would have to conclude that a large risk premium must apply to oil production, or that there’s been an awful lot of uneconomic production over the years (for example, everything from about 1986 to 2006), or that current prices are just a blip and will continue to revert to some more moderate long-term level.
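
The constant-rate curves in such a comparison are just a base-year price compounded forward; for example (base price and rates chosen only for illustration):

```python
# Compounding a base-year price at constant rates, as in the figure's
# constant-rate comparison curves (numbers chosen for illustration only).
base_price = 3.0   # $/bbl, roughly a 1970 nominal crude price (assumption)
for rate in (0.03, 0.05, 0.08):
    price_2009 = base_price * (1 + rate) ** (2009 - 1970)
    print(f"{rate:.0%}/yr from 1970: about ${price_2009:.0f}/bbl in 2009")
```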

More Oil Price Forecasts

The history of long-term energy forecasting is a rather mixed bag. Supply and demand forecasts have generally been half decent, in terms of percent error, but that’s primarily because GDP growth is steady, energy intensity is price-inelastic, and there’s a lot of momentum in energy-consuming and energy-producing capital. Energy price forecasts, on the other hand, have generally been terrible. Consider the Delphi panel forecasts conducted by the CEC:

California Energy Commission Delphi Forecasts

In 1988, John Sterman showed that energy forecasts, even those using sophisticated models, were well represented by a simple adaptive rule.

Continue reading “More Oil Price Forecasts”
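
To give the flavor of such a rule (a sketch in the spirit of the trend-extrapolation formulation Sterman used, with invented time constants – not his exact model): the forecaster perceives the present condition and its recent growth rate with lags, then projects that perceived trend forward.

```python
# A sketch of adaptive trend extrapolation, in the spirit of the formulation
# Sterman used; time constants and the example series are invented.
def trend_forecast(history, perception_time=2.0, reference_time=5.0, horizon=10):
    """Extrapolate the perceived growth trend of `history` `horizon` years out."""
    perceived = history[0]   # perceived present condition
    reference = history[0]   # slower-moving historical reference condition
    trend = 0.0              # perceived fractional growth rate per year
    for x in history:
        perceived += (x - perceived) / perception_time
        reference += (perceived - reference) / reference_time
        indicated = (perceived - reference) / (reference * reference_time)
        trend += (indicated - trend) / perception_time
    return perceived * (1 + trend) ** horizon

# Example: a series that grew 8%/year for 15 years, then went flat for 5;
# the forecast still extrapolates much of the old growth.
history = [10 * 1.08 ** t for t in range(15)] + [10 * 1.08 ** 14] * 5
print(f"last observed: {history[-1]:.1f}")
print(f"10-year-out forecast: {trend_forecast(history):.1f}")
```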

SRES – We've got a bigger problem now

Recently Pielke, Wigley and Green discussed the implications of autonomous energy efficiency improvements (AEEI) in IPCC scenarios, provoking many replies. Some found the hubbub around the issue surprising, because the assumptions concerned were well known, at least to modelers. I was among the surprised, but sometimes the obvious needs to be restated loud and clear. I believe that there are several bigger elephants in the room that deserve such treatment. AEEI is important, as are other hotly debated SRES choices like PPP vs. MER, but at the end of the day, these are just parameter choices. In complex systems, parameter uncertainty generally plays second fiddle to structural uncertainty. Integrated assessment models (IAMs) as a group frequently employ similar methods, e.g., dynamic general equilibrium, and leave crucial structural assumptions untested. I find it strange that the hottest debates surround biogeophysical models, which are actually much better grounded in physical principles, when socio-economic modeling is so uncertain.

Continue reading “SRES – We've got a bigger problem now”

No Gas

Every year or two the “gas out” email arrives in my inbox. This year, it’s May 15th when “all internet users are to not go to a gas station in protest of high gas prices.” Wait – am I supposed to avoid gas stations, or avoid protesting at gas stations? I’m amazed at the durability of this internet chain letter, which now claims a ten-year history: “In April 1997, there was a “gas out” conducted nationwide in protest of gas prices. Gasoline prices dropped 30 cents a gallon overnight.” A Monty Python tune from The Meaning of Life jumps to mind:

So remember when you’re feeling very small and insecure
How amazingly unlikely is your birth
And pray that there’s intelligent life somewhere up in space
Because there’s bugger all down here on earth.

Continue reading “No Gas”