Strategic Excess?

I’ve been reading the Breakthrough Institute’s Waxman Markey analysis, which is a bit spotty* but raises many interesting issues. One comment seemed too crazy to be true: that the W-M strategic reserve is “refilled” with forestry offsets. Sure enough, it is true:

726 (g) (2) INTERNATIONAL OFFSET CREDITS FOR REDUCED DEFORESTATION- The Administrator shall use the proceeds from each strategic reserve auction to purchase international offset credits issued for reduced deforestation activities pursuant to section 743(e). The Administrator shall retire those international offset credits and establish a number of emission allowances equal to 80 percent of the number of international offset credits so retired. Emission allowances established under this paragraph shall be in addition to those established under section 721(a).

This provision makes the reserve nearly self-perpetuating: at constant prices, 80% of allowances released from the reserve are replaced. If the reserve accomplishes its own goal of reducing prices, more than 80% get replaced (if replacement exceeds 100%, the excess is vintaged and assigned to future years). This got me wondering: does anyone understand how the reserve really works? Its market rules seem arbitrary. Thus I set out to simulate them.
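The refill loop is simple enough to simulate. A minimal sketch, where only the 80% refill ratio comes from section 726(g)(2); the reserve size and annual release volume are invented for illustration:

```python
# Minimal sketch of the strategic reserve refill loop. The 80% refill
# ratio is from section 726(g)(2); the reserve size and release volume
# are invented for illustration.

def simulate_reserve(initial_reserve, annual_release, refill_ratio=0.8, years=10):
    """Track the reserve when each release is partially replaced by
    allowances created from retired deforestation offsets."""
    reserve = initial_reserve
    history = [reserve]
    for _ in range(years):
        released = min(annual_release, reserve)
        reserve -= released                 # allowances auctioned out
        reserve += refill_ratio * released  # offsets retired, 80% come back
        history.append(reserve)
    return history

path = simulate_reserve(initial_reserve=100.0, annual_release=10.0)
```

At constant prices the reserve shrinks by only 2 units per 10-unit release, which is what makes it nearly self-perpetuating.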

First, I took a look at some data. What would happen if the reserve strategy were applied to other commodities? Here’s oil:

Oil prices & moving average cap

Red is the actual US weekly crude price, while purple shows the strategic reserve price trigger level: a 3-year moving average + 60%. With this trajectory, the reserve would be shaving a few peaks, but wouldn’t do anything about the long-term runup in prices. Same goes for corn.
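The trigger rule is easy to reproduce. A sketch, using an invented weekly price series:

```python
# Reserve price trigger applied to a weekly price series:
# trailing moving average (3 years = 156 weeks) plus a 60% markup.
# The price series here is invented.

def trigger_level(prices, window=156, markup=0.6):
    """Return the trigger for each week: trailing average * (1 + markup)."""
    triggers = []
    for i in range(len(prices)):
        lo = max(0, i + 1 - window)
        window_prices = prices[lo:i + 1]
        avg = sum(window_prices) / len(window_prices)
        triggers.append(avg * (1 + markup))
    return triggers

flat = trigger_level([50.0] * 200)                 # settles at 50 * 1.6 = 80
spiky = trigger_level([50.0] * 156 + [100.0])      # a sudden spike beats the trigger
```

Because the average is trailing, a sudden spike clears the trigger while a sustained runup simply drags the trigger along with it, which is the behavior the oil chart shows.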

Waxman-Markey emissions coverage

In an effort to get a handle on Waxman Markey, I’ve been digging through the EPA’s analysis. Here’s a visualization of covered vs. uncovered emissions in 2016 (click through for the interactive version).


The orange bits above are uncovered emissions – mostly the usual suspects: methane from cow burps, landfills, and coal mines; N2O from agriculture; and other small process or fugitive emissions. This broad scope is one of W-M’s strong points.

Talking to the taxman about math

I ran across this gem in the text of Waxman Markey (HR 2454):

(e) Trade-vulnerable Industries-

(1) IN GENERAL- The Administrator shall allocate emission allowances to energy-intensive, trade-exposed entities, to be distributed in accordance with section 765, in the following amounts:

(A) For vintage years 2012 and 2013, up to 2.0 percent of the emission allowances established for each year under section 721(a).

(B) For vintage year 2014, up to 15 percent of the emission allowances established for that year under section 721(a).

(C) For vintage year 2015, up to the product of–

(i) the amount specified in paragraph (2); multiplied by

(ii) the quantity of emission allowances established for 2015 under section 721(a) divided by the quantity of emission allowances established for 2014 under section 721(a).

(D) For vintage year 2016, up to the product of–

(i) the amount specified in paragraph (3); multiplied by

(ii) the quantity of emission allowances established for 2015 under section 721(a) divided by the quantity of emission allowances established for 2014 under section 721(a).

(E) For vintage years 2017 through 2025, up to the product of–

(i) the amount specified in paragraph (4); multiplied by

(ii) the quantity of emission allowances established for that year under section 721(a) divided by the quantity of emission allowances established for 2016 under section 721(a).

(F) For vintage years 2026 through 2050, up to the product of the amount specified in paragraph (4)–

(i) multiplied by the quantity of emission allowances established for the applicable year during 2026 through 2050 under section 721(a) divided by the quantity of emission allowances established for 2016 under section 721(a); and

(ii) multiplied by a factor that shall equal 90 percent for 2026 and decline 10 percent for each year thereafter until reaching zero, except that, if the President modifies a percentage for a year under subparagraph (A) of section 767(c)(3), the highest percentage the President applies for any sector under that subparagraph for that year (not exceeding 100 percent) shall be used for that year instead of the factor otherwise specified in this clause.

What we have here is really a little dynamic model, which can be written down in 4 or 5 lines. The intent is apparently to stabilize the absolute magnitude of the allocation to trade-vulnerable industries. In order to do that, the allocation share has to rise over time, as the total number of allowances issued falls. After 2026, there’s a 10%-per-year phaseout, but that’s offset by the continued upward pressure on share from the decline in allowances, so the net phaseout rate is about 5%/year, I think. Oops: actually, I now think it’s the other way around … from 2017 to 2025, the formula decreases the share of allowances allocated at the same rate as the absolute allowance allocation declines. Thereafter, it’s that rate plus 10%. There is no obvious rationale for this strange method.
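Written as code, the subsection really is just a few lines. The base amounts in paragraphs (2)-(4) aren't quoted above, so they appear here as parameters A2, A3, A4; `cap` maps vintage year to total allowances under section 721(a):

```python
# Subsection (e) as a function. A2-A4 stand in for the unquoted amounts
# in paragraphs (2)-(4); cap[year] is total allowances under 721(a).

def trade_vulnerable_allocation(year, cap, A2, A3, A4):
    if year in (2012, 2013):
        return 0.02 * cap[year]
    if year == 2014:
        return 0.15 * cap[year]
    if year == 2015:
        return A2 * cap[2015] / cap[2014]
    if year == 2016:
        return A3 * cap[2015] / cap[2014]  # the bill reuses the 2015/2014 ratio
    if 2017 <= year <= 2025:
        return A4 * cap[year] / cap[2016]
    if 2026 <= year <= 2050:
        factor = max(0.0, 0.9 - 0.1 * (year - 2026))
        return A4 * cap[year] / cap[2016] * factor
    raise ValueError("no allocation defined for %d" % year)

# illustrative cap trajectory, declining 50/yr from an invented base
cap = {y: 5000.0 - 50.0 * (y - 2012) for y in range(2012, 2051)}
```

Note how the 2017-2025 clause scales an anchor amount by the ratio of the current year's cap to 2016's, while the post-2026 factor drives the result to zero by 2035.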

Seems to me that if legislators want to create formulas this complicated, they ought to simply write out the equations (with units) in the text of the bill. Otherwise, natural language hopelessly obscures the structure and no ordinary human can participate effectively in the process. But perhaps that’s part of the attraction?

The elusive MAC curve

Marginal Abatement Cost (MAC) curves are a handy way of describing the potential for and cost of reducing energy consumption or GHG emissions. McKinsey has recently made them famous, but they’ve been around, and been debated, for a long time.
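Mechanically, a bottom-up MAC curve is just a list of abatement options sorted by unit cost, with potentials accumulated along the x-axis. A toy example (options and numbers invented):

```python
# Toy bottom-up MAC curve: sort options by $/tCO2, accumulate potential.
# Options and numbers are invented for illustration.

options = [
    ("CCS retrofit",  60.0, 30.0),   # (name, $/tCO2, MtCO2/yr potential)
    ("anti-idling",  -30.0,  5.0),
    ("wind power",    25.0, 40.0),
    ("LED lighting", -10.0,  8.0),
]

def mac_curve(options):
    """Return (name, cumulative potential, unit cost), sorted by cost."""
    curve, cum = [], 0.0
    for name, cost, potential in sorted(options, key=lambda o: o[1]):
        cum += potential
        curve.append((name, cum, cost))
    return curve

curve = mac_curve(options)  # negative-cost options plot first
```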

McKinsey MAC 2.0

One version of the McKinsey MAC curve

Five criticisms are common:

1. Negative cost abatement options don’t really exist, or will be undertaken anyway without policy support. This criticism generally arises from the question begged by the Sweeney et al. MAC curve below: if the leftmost bar (diesel anti-idling) has a large negative cost (i.e. profit opportunity) and is price sensitive, why hasn’t anyone done it? Where are those $20 bills on the sidewalk? There is some wisdom to this, but you have to drink pretty deeply of the neoclassical economic kool aid to believe that there really are no misperceptions, institutional barriers, or non-climate externalities that could create negative cost opportunities.

Sweeney et al. California MAC curve

Sweeney, Weyant et al. Analysis of Measures to Meet the Requirements of California’s Assembly Bill 32

The neoclassical perspective is evident in AR4, which reports results primarily of top-down, equilibrium models. As a result, mitigation costs are (with one exception) positive:

AR4 WG3 TS fig. TS.9, implicit MAC curves

AR4 WG3 TS fig. TS-9

Note that these are top-down implicit MAC curves, derived by exercising aggregate models, rather than bottom-up curves constructed from detailed menus of technical options.

2. The curves employ static assumptions, that might not come true. For example, I’ve heard that the McKinsey curves assume $60/bbl oil. This criticism is true, but could be generalized to more or less any formal result that’s presented as a figure rather than an interactive model. I regard it as a caveat rather than a flaw.

3. The curves themselves are static, while reality evolves. I think the key issue here is that technology evolves endogenously, so that to some extent the shape of the curve in the future will depend on where we choose to operate on the curve today. There are also 2nd-order, market-mediated effects (related to #2 as well): a) exploiting the curve reduces energy demand, and thus prices, which changes the shape of the curve, and b) changes in GHG prices or other policies used to drive exploitation of the curve influence prices of capital and other factors, again changing the shape of the curve.

4. The notion of “supply” is misleading or incomplete. Options depicted on a MAC curve typically involve installing some kind of capital to reduce energy or GHG use. But that installation depends on capital turnover, and therefore is available only incrementally. The rate of exploitation is more difficult to pin down than the maximum potential under idealized conditions.

5. A lot of mitigation falls through the cracks. There are two prongs to this criticism: bottom-up, and top-down. Bottom-up models, because they employ a menu of known technologies, inevitably overlook some existing or potential options that might materialize in reality (with the incentive of GHG prices, for example). That error is, to some extent, offset by over-optimism about other technologies that won’t materialize. More importantly, a menu of supply and end use technology choices is an incomplete specification of the economy; there’s also a lot of potential for changes in lifestyle and substitution of activity among economic sectors. Today’s bottom-up MAC curve is essentially a snapshot of how to do what we do now, with fewer GHGs. If we’re serious about deep emissions cuts, the economy may not resemble what we’re doing now very much in 40 years. Top-down models capture the substitution potential among sectors, but still take lifestyle as a given and (mostly) start from a first-best equilibrium world, devoid of mitigation options arising from human, institutional, and market failures.

To get the greenhouse gas MAC curve right, you need a model that captures bottom-up and top-down aspects of the economy, with realistic dynamics and agent behavior, endogenous technology, and non-climate externalities all included. As I see it, mainstream integrated assessment models are headed down some of those paths (endogenous technology), but remain wedded to the equilibrium/optimization perspective. Others (including us at Ventana) are exploring other avenues, but it’s a tough row to hoe.

In the meantime, we’re stuck with a multitude of perspectives on mitigation costs. Here are a few from the WCI, compiled by Wei and Rose from partner jurisdictions’ Climate Action Team reports and other similar documents:

WCI partner MAC curves

Wei & Rose, Preliminary Cap & Trade Simulation of Florida Joining WCI

The methods used to develop the various partner options differ, so these curves reflect diverse beliefs rather than a consistent comparison. What’s striking to me is that the biggest opportunities (are perceived to) exist in California, which already has (roughly) the lowest GHG intensity and most stringent energy policies among the partners. Economics 101 would suggest that California might already have exploited the low-hanging fruit, and that greater opportunity would exist, say, here in Montana, where energy policy means low taxes and GHG intensity is extremely high.

For now, we have to live with the uncertainty. However, an adaptive strategy for discovering the true potential for mitigation seems easy: no matter who you believe, the cost of the initial increment of emissions reductions is either small (<<1% of GDP) or negative, so just put a price on GHGs and see what happens.

ABC to air Clout & Climate Change documentary

This just in from CNAS:

ABC News will air Earth 2100, the prime time documentary for which they filmed the war game, on June 2, 2009, at 9:00 p.m. (EST). You can view a promotional short report on the documentary from ABC News online, and hopefully you will all be able to view it on television or via Internet.

In conjunction with the airing of the documentary, CNAS has made the participant briefing book and materials from the game available online. We encourage other institutions to use and cite these materials to learn about the game and to stage their own scenario exercises. I also hope that they will be useful to you for your own future reference.

Finally, we are posting a short working paper of major findings from the game. While the game did not result in the kind of breakthrough agreements we all would have liked to see, this exercise achieved CNAS’s goals of exploring and highlighting the potential difficulties and opportunities of international cooperation on climate change. I know that everyone took away different observations from the game, however, and I hope that you will share your memories and your own key findings of the event with us, and allow us to post them online as a new section of the report.

Visit the Climate Change War Game webpage to view the CNAS report on major findings and background on developing the 2015 world, the participant briefing book, and materials generated from the game.

The only thing worse than cap & trade …

… is Marty Feldstein’s lame arguments against it.

  • He cites CBO household costs of policy that reflect outlays, rather than real deadweight or welfare losses after revenue recycling.
  • He wants the US to wait for global agreement before moving. News flash: there won’t be a global agreement without some US movement.
  • He argues that unilateral action is ineffective: true, but irrelevant if you aim to solve the problem. However, if that’s our moral philosophy, I think I should be exempted from all laws – on a global scale, no one will notice my murdering and pillaging, and it’ll be fun for me.

There is one nugget of wisdom in Feldstein’s piece: it’s a travesty to overcompensate carbon-intensive firms, and foolish to use allowance allocation to utilities to defeat the retail price signal. I haven’t read the details of the bill yet, so I don’t know how extensive those provisions really are, but it’s definitely something to watch.

Well, OK, lots of things are worse than cap & trade. More importantly, one thing (an upstream carbon tax) could be a lot better than Waxman Markey. But it’s sad when a Harvard economist sounds like an astroturf skeptic.

Hat tip to Economist’s View.

Cash for Clunkers Illusion

The proposed cash-for-clunkers program strikes me as yet another marginally effective policy that coulda been a contenda. In the aggregate, getting rid of clunkers doesn’t do much good, because fleet fuel economy has not improved in the last decade (at least current proposals don’t target age). Only transaction costs prevent wholesale shuffling of vehicles to yield advantageous trades that don’t improve total fleet efficiency. Clunkers that are cheap enough to scrap for a tax credit likely have low utilization; if they’re replaced by a new vehicle with high utilization, that doesn’t help. It might be a good stimulus for automakers, but you can’t get to a low-carbon future by subsidizing new carbon-consuming capital. The credits proposed in House and Senate versions appear to suffer from MPG illusion:

Clunker credits & differences

Clunker credit vs. fuel savings
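The arithmetic behind the illusion: fuel saved depends on the difference in gallons per mile (1/mpg), not the difference in mpg. A quick check, assuming 12,000 miles per year of driving:

```python
# MPG illusion: savings scale with the gallons-per-mile difference,
# not the mpg difference. Annual mileage of 12,000 is assumed.

def annual_fuel_saved(old_mpg, new_mpg, miles=12000.0):
    """Gallons saved per year from a fuel economy upgrade."""
    return miles / old_mpg - miles / new_mpg

saved_truck = annual_fuel_saved(14, 17)   # small mpg gain, big savings
saved_car = annual_fuel_saved(33, 50)     # big mpg gain, smaller savings
```

A 14→17 mpg trade saves about 151 gallons a year, while a 33→50 mpg trade saves about 124, so a credit schedule keyed to mpg differences rewards the wrong trades.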

How many climate and energy policies that don’t work do we really need?

Drinking too much CAFE?

The NHTSA and EPA have announced upgraded vehicle efficiency and emissions standards. The CAFE standard will go up to 35.5 mpg by 2016, and a 250 gCO2/mile emissions limit will be phased in by the same time. My bottom line: I strongly favor efficient, low-emissions vehicles, but I think command and control legislation is a lousy way to get them. The approach works, but there’s a lot of collateral damage and inefficiency, and opponents of climate and energy policy are given lots to complain about. I’m happy about the new standard, but I look forward to the day when it’s not needed, because other signals are working properly.

First, as background, here’s the new CAFE standard in perspective:

CAFE standard and performance & light truck share

Source: NHTSA. Update: I’ve corrected the data, which inadvertently showed light trucks rather than the total fleet. Notice two things: first, the total fleet corporate average fuel economy (CAFE) and the standard have been declining, due to the penetration of light trucks (including SUVs). Second, if the 2016 standard of 35.5 mpg is to be met, given car and truck standards of 39 and 30 mpg, the share of light trucks will have to fall below 40%, though extrapolation of the historic trend would carry it nearer to 70%. It’s not clear how the allocation of footprint, credit trading, and other features of CAFE will cause this to occur.
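A rough check on that share: CAFE fleet averages are harmonic (sales-weighted gallons per mile), so the truck share s consistent with 35.5 mpg overall solves s/30 + (1-s)/39 = 1/35.5.

```python
# Implied light-truck share under harmonic (gallons-per-mile) averaging:
# solve s/truck_mpg + (1 - s)/car_mpg = 1/fleet_mpg for s.

def required_truck_share(fleet_mpg=35.5, car_mpg=39.0, truck_mpg=30.0):
    return (1.0 / fleet_mpg - 1.0 / car_mpg) / (1.0 / truck_mpg - 1.0 / car_mpg)

share = required_truck_share()  # roughly a third of sales
```

That lands comfortably below 40%; a naive arithmetic average of the mpg figures would give a different (and wrong) answer.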

Like other portfolio standards, CAFE creates an internal tax and subsidy system within regulated entities. To meet its portfolio requirement, a manufacturer has to (internally) subsidize high-mpg vehicles and tax low-mpg vehicles. This hidden tax structure is problematic in several ways. There’s no guarantee that it yields an implicit price of carbon or energy that’s consistent across manufacturers, or consistent with fuel taxes and the price of emissions under a cap & trade system. Subsidizing the high-mpg vehicles is a bad idea: they’re more efficient, but they aren’t zero-emissions, and they still contribute to congestion and other side effects of driving – why would we want more of that? It’s even possible, if high-mpg drivers are price elastic (think kids) and low-mpg drivers are less so (think luxury SUV buyers), that the standard increases the total fleet and thus offsets some of its intended fuel savings.
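One way to see the internal tax-and-subsidy structure: under harmonic averaging, compliance requires the sales-weighted sum of (1/standard - 1/mpg) to be nonnegative, so each vehicle carries a compliance credit proportional to (1/standard - 1/mpg). Attaching a shadow price to that credit (the dollar figure below is invented, not an estimate) makes the implicit transfers visible:

```python
# Implicit per-vehicle transfer under a harmonic-mean standard.
# Positive = internal subsidy, negative = internal tax.
# The shadow price is an invented illustration, not an estimate.

def implicit_transfer(mpg, standard=35.5, shadow_price=50000.0):
    """Dollars per vehicle, at a shadow price in $ per (gal/mile) of credit."""
    return shadow_price * (1.0 / standard - 1.0 / mpg)

subsidy_40mpg = implicit_transfer(40.0)   # efficient car gets a subsidy
tax_20mpg = implicit_transfer(20.0)       # inefficient truck pays a tax
```

A vehicle exactly at the standard transfers nothing; everything above it is cross-subsidized by everything below it.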

The basic incentive problem with portfolio standards is compounded by the division of CAFE into domestic and imported, car and light truck stovepipes. Separate, non-fungible standards for cars and trucks create a bizarre allocation of property rights – in effect, light truck buyers are endowed with more property rights to consume or emit, irrespective of the fact that GHGs and other externalities do the same harm regardless of who’s responsible. Recently, a new footprint methodology effectively generalized the car/truck distinction to an allocation based on vehicle footprint. This makes about as much sense as subsidizing bullets for felons. It sounds like the stovepipe issue will be relaxed a bit with the new regulations, because credits will become tradable, but just wait until GM truck buyers figure out that they’re paying a tax that goes to subsidize Honda Fits. Still, there’s no clear reason why the ratio of car:truck standards should be 39:30, or why the car standard should go up 30% while the truck standard goes up 15%.

Applying the standard to vehicles at the point of purchase, rather than as they are used (through fuel taxes or VMT tolls) fails to recognize that most of the negative side effects of a vehicle arise from its use, not from its existence. With fuel, emissions, and congestion charges, people could be free to make their own tradeoffs among efficiency, vehicle utilization, and capabilities like cargo capacity. Standards basically ignore diversity in usage patterns, and shoehorn everyone into the same mold. Remember that, while a driver-only Chevy Suburban is ridiculous, a full one moves people almost as efficiently as a full Prius, and 3x more efficiently than a driver-only Prius.

Once efficient vehicles are on the road, the rebound effect crops up. CAFE lowers the cost of driving, so in the absence of a fuel or emissions price signal, people will drive, consume, and emit more. Over the past three decades, miles traveled per vehicle and the total fleet size have dramatically increased. As a result, fuel consumption per vehicle has been essentially constant, in spite of efficiency improvements, and total fuel consumption is up. The increase in driving is likely due mostly to cheap fuel, sprawl, and increasing population and wealth, but efficiency mandates have probably contributed as well.

VMT, fuel, registrations

Source: DOT FHWA
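The rebound mechanism in miniature: higher mpg lowers fuel cost per mile, and if driving responds with some elasticity, part of the efficiency gain is clawed back. The elasticity value below is illustrative, not an estimate:

```python
# Rebound effect sketch: driving responds to per-mile fuel cost with a
# constant elasticity. The -0.1 elasticity is illustrative only.

def fuel_use_with_rebound(base_miles, base_mpg, new_mpg, elasticity=-0.1):
    """Annual gallons after an mpg improvement, with a driving response."""
    cost_ratio = base_mpg / new_mpg          # per-mile fuel cost falls
    miles = base_miles * cost_ratio ** elasticity
    return miles / new_mpg

base_gallons = 12000.0 / 20.0                           # 600 gal/yr at 20 mpg
new_gallons = fuel_use_with_rebound(12000.0, 20.0, 25.0)
```

With these numbers, a 20→25 mpg improvement cuts fuel use about 18% instead of the naive 20%; larger elasticities mean larger takeback.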

In addition to incentive problems, there are lots of implementation issues in CAFE. Over the years, there’s been a lot of tinkering with the standard (like the footprint methodology) designed to restore flexibility you’d have automatically with a market-based mechanism or to achieve other policy goals. As a result, the rules have become rather opaque. CAFE measurements use EPA’s old fuel economy measurement methods, which were abandoned for window stickers a few years ago because they didn’t match reality. There are various loopholes, including one that permits vehicles to claim 4x mpg if they can consume alternate fuels, even if those fuels are not widely distributed (E85).

The critics of CAFE mostly don’t focus on the incentive and transparency problems above. Instead, they hammer on two ideas: that CAFE costs jobs, and forces us all to die in tiny boxes. Those make good sound bites, but neither argument is particularly strong. Seeking Alpha has a nice look at the economics. The safety issue is harder to wrap your arms around. Basically, the critics argue that, in a collision, weight is good. From the perspective of a single driver, that’s largely true, because the velocity change each vehicle experiences in a collision is inversely proportional to its mass. However, that’s an arms race, with no aggregate benefit: when everyone else drives a 4952 lb Dodge Ram 1500, you need a 6342 lb Ram 3500 to stay ahead. With safety as the only consideration, soon we’d all be driving locomotives and M1 tanks. The real social benefit of weight is that it’s correlated with size, which (all else equal) lowers the acceleration passengers face in a collision, but the size-weight correlation is intermediated by technology, which governs the strength of a passenger compartment and the aggressiveness of a vehicle chassis against other vehicles.
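The arms-race arithmetic follows directly from momentum conservation: in a two-vehicle collision, the lighter vehicle absorbs the larger velocity change. A sketch for a head-on, perfectly inelastic crash, using the Ram weights above:

```python
# Delta-v in a head-on, perfectly inelastic two-vehicle collision.
# Momentum conservation: each vehicle's velocity change is proportional
# to the *other* vehicle's share of the total mass.

def delta_v(m1, m2, closing_speed):
    """Return (dv1, dv2): velocity changes for vehicles of mass m1, m2."""
    dv1 = m2 / (m1 + m2) * closing_speed
    dv2 = m1 / (m1 + m2) * closing_speed
    return dv1, dv2

# 4952 lb Ram 1500 vs 6342 lb Ram 3500, 60 mph closing speed:
dv_light, dv_heavy = delta_v(4952.0, 6342.0, 60.0)
```

The lighter truck takes about 34 mph of delta-v to the heavier truck’s 26, hence the incentive to buy ever more mass, with no aggregate gain.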

In that respect, CAFE’s car/light-truck distinction and footprint methodology have probably been damaging, because they have encouraged the spread of heavy SUVs on ladder frames, as can be seen in the first figure. Those vehicles impose disproportionate risk on others:

Collision risk, decomposed to own and other effects

Source: Marc Ross UMich, Tom Wenzel LBNL, An Analysis of Traffic Deaths by Vehicle Type and Model, ACEE Report #T012, March 2002.

There are many ways to achieve safety without simply adding mass: good design, better materials, restraints, lower speeds, and less beer on Saturday night all help. If we had a vehicle energy and emissions policy that permitted broader tradeoffs, I’m sure we could arrive at a more efficient system with better aggregate safety than we have now.

In spite of its many problems, I’ll take CAFE – it’s better than nothing, and there’s certainly no technical obstacle to meeting the new standards (be prepared for lots of whining though). Alternatives will take a while to construct, so by wingwalker’s rule we should hang onto what we have for the moment. But rather than pushing the standards approach to its inevitable breakdown point, I think we should be pursuing other options: get a price on carbon, and any other externalities we care about (congestion tolls and pay-at-the-pump insurance are good examples). Then work on zoning, infrastructure, and other barriers to efficiency, mode shifting, and VMT reduction. With the fundamental price signals aligned with the goals, it should be easier to push things in the right direction.