Cash for Clunkers Illusion

The proposed cash-for-clunkers program strikes me as yet another marginally effective policy that coulda been a contenda. In the aggregate, getting rid of clunkers doesn’t do much good: fleet fuel economy has barely improved in the last decade, so a clunker isn’t necessarily much less efficient than the vehicle that replaces it (and current proposals don’t target age anyway). Only transaction costs prevent a wholesale shuffling of vehicles to capture the credits through advantageous trades that don’t improve total fleet efficiency. Clunkers that are cheap enough to scrap for a tax credit likely have low utilization; if they’re replaced by a new vehicle with high utilization, that doesn’t help. The program might be a good stimulus for automakers, but you can’t get to a low-carbon future by subsidizing new carbon-consuming capital. Worse, the credits proposed in the House and Senate versions appear to suffer from the MPG illusion:
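The illusion is worth quantifying: fuel use is linear in gallons per mile, not miles per gallon, so equal mpg increments save very different amounts of fuel. A quick sketch (the mileage figures are illustrative, not from the bills):

```python
# Gallons consumed per 10,000 miles, before vs. after a vehicle upgrade.
def gallons_saved(old_mpg, new_mpg, miles=10_000):
    return miles / old_mpg - miles / new_mpg

# A 14 -> 17 mpg clunker trade saves more fuel than a 33 -> 50 mpg
# hybrid trade, even though the mpg gain looks far smaller.
clunker_trade = gallons_saved(14, 17)   # ~126 gallons
hybrid_trade = gallons_saved(33, 50)    # ~103 gallons
```

This is why a credit scaled to the mpg difference misprices the actual fuel (and carbon) savings.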

Clunker credits & differences

Clunker credit vs. fuel savings

How many climate and energy policies that don’t work do we really need?

Drinking too much CAFE?

The NHTSA and EPA have announced upgraded vehicle efficiency and emissions standards. The CAFE standard will go up to 35.5 mpg by 2016, and a 250 gCO2/mile emissions limit will be phased in by the same time. My bottom line: I strongly favor efficient, low-emissions vehicles, but I think command and control legislation is a lousy way to get them. The approach works, but there’s a lot of collateral damage and inefficiency, and opponents of climate and energy policy are given lots to complain about. I’m happy about the new standard, but I look forward to the day when it’s not needed, because other signals are working properly.

First, as background, here’s the new CAFE standard in perspective:

CAFE standard and performance & light truck share

Source: NHTSA

Update: I’ve corrected the data, which inadvertently showed light trucks rather than the total fleet.

Notice two things. First, total fleet corporate average fuel economy (CAFE) performance and the standard have been declining, due to the growing penetration of light trucks (including SUVs). Second, if the 2016 standard of 35.5 mpg is to be met, given car and truck standards of 39 and 30 mpg, the share of light trucks will have to fall below 40%, whereas extrapolation of the historic trend would carry it nearer to 70%. It’s not clear how the footprint allocation, credit trading, and other features of CAFE will cause this to occur.
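Since CAFE averages harmonically (it’s fuel per mile that matters, not miles per gallon), the light truck share consistent with the combined standard can be backed out directly from the 39/30/35.5 mpg figures above:

```python
# Solve 1/combined = share/truck_mpg + (1 - share)/car_mpg for the truck share.
def truck_share(car_mpg, truck_mpg, combined_mpg):
    return (1 / combined_mpg - 1 / car_mpg) / (1 / truck_mpg - 1 / car_mpg)

share = truck_share(39, 30, 35.5)  # ~0.33
```

So meeting the combined standard implies a light truck share of roughly a third, well below both the historic trend and its extrapolation.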

Like other portfolio standards, CAFE creates an internal tax and subsidy system within regulated entities. To meet its portfolio requirement, a manufacturer has to (internally) subsidize high-mpg vehicles and tax low-mpg vehicles. This hidden tax structure is problematic in several ways. There’s no guarantee that it yields an implicit price of carbon or energy that’s consistent across manufacturers, or consistent with fuel taxes and the price of emissions under a cap & trade system. Subsidizing the high-mpg vehicles is a bad idea: they’re more efficient, but they aren’t zero-emissions, and they still contribute to congestion and other side effects of driving – why would we want more of that? It’s even possible, if high-mpg drivers are price elastic (think kids) and low-mpg drivers are less so (think luxury SUV buyers), that the standard increases the total fleet size and thus offsets some of its intended fuel savings.

The basic incentive problem with portfolio standards is compounded by the division of CAFE into domestic and imported, car and light truck stovepipes. Separate, non-fungible standards for cars and trucks create a bizarre allocation of property rights – in effect, light truck buyers are endowed with more property rights to consume or emit, irrespective of the fact that GHGs and other externalities do the same harm regardless of who’s responsible. Recently, a new footprint methodology effectively generalized the car/truck distinction to an allocation based on vehicle footprint. This makes about as much sense as subsidizing bullets for felons. It sounds like the stovepipe issue will be relaxed a bit with the new regulations, because credits will become tradable, but just wait until GM truck buyers figure out that they’re paying a tax that goes to subsidize Honda Fits. Still, there’s no clear reason why the ratio of car:truck standards should be 39:30, or why the car standard should go up 30% while the truck standard goes up 15%.

Applying the standard to vehicles at the point of purchase, rather than as they are used (through fuel taxes or VMT tolls) fails to recognize that most of the negative side effects of a vehicle arise from its use, not from its existence. With fuel, emissions, and congestion charges, people could be free to make their own tradeoffs among efficiency, vehicle utilization, and capabilities like cargo capacity. Standards basically ignore diversity in usage patterns, and shoehorn everyone into the same mold. Remember that, while a driver-only Chevy Suburban is ridiculous, a full one moves people almost as efficiently as a full Prius, and 3x more efficiently than a driver-only Prius.
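Utilization matters as much as the vehicle: passenger-miles per gallon is just mpg times occupancy. A sketch, using illustrative EPA-style mpg figures (not official ratings):

```python
# Passenger-miles per gallon: the efficiency of moving people, not vehicles.
def passenger_mpg(mpg, occupants):
    return mpg * occupants

suburban_full = passenger_mpg(17, 8)   # 136 passenger-mpg
prius_full = passenger_mpg(46, 5)      # 230 passenger-mpg
prius_solo = passenger_mpg(46, 1)      # 46 passenger-mpg
```

On these numbers the full Suburban roughly triples the driver-only Prius, which is exactly the kind of tradeoff a per-vehicle standard can’t see.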

Once efficient vehicles are on the road, the rebound effect crops up. CAFE lowers the cost of driving, so in the absence of a fuel or emissions price signal, people will drive, consume, and emit more. Over the past three decades, miles traveled per vehicle and the total fleet size have dramatically increased. As a result, fuel consumption per vehicle has been essentially constant, in spite of efficiency improvements, and total fuel consumption is up. The increase in driving is likely due mostly to cheap fuel, sprawl, and increasing population and wealth, but efficiency mandates have probably contributed as well.

VMT, fuel, registrations

Source: DOT FHWA

In addition to incentive problems, there are lots of implementation issues in CAFE. Over the years, there’s been a lot of tinkering with the standard (like the footprint methodology) designed to restore flexibility you’d have automatically with a market-based mechanism or to achieve other policy goals. As a result, the rules have become rather opaque. CAFE measurements use EPA’s old fuel economy measurement methods, which were abandoned for window stickers a few years ago because they didn’t match reality. There are various loopholes, including one that permits vehicles to claim 4x mpg if they can consume alternate fuels, even if those fuels are not widely distributed (E85).

The critics of CAFE mostly don’t focus on the incentive and transparency problems above. Instead, they hammer on two ideas: that CAFE costs jobs, and forces us all to die in tiny boxes. Those make good sound bites, but neither argument is particularly strong. Seeking Alpha has a nice look at the economics. The safety issue is harder to wrap your arms around. Basically, the critics argue that, in a collision, weight is good. From the perspective of a single driver, that’s largely true, because momentum conservation dictates that the velocity change each vehicle experiences in a collision is inversely proportional to its share of the combined mass. However, that’s an arms race, with no aggregate benefit: when everyone else drives a 4952 lb Dodge Ram 1500, you need a 6342 lb Ram 3500 to stay ahead. With safety as the only consideration, soon we’d all be driving locomotives and M1 tanks. The real social benefit of weight is that it’s correlated with size, which (all else equal) lowers the acceleration passengers face in a collision, but the size-weight correlation is mediated by technology, which governs the strength of a passenger compartment and the aggressiveness of a vehicle’s chassis against other vehicles.
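The arms race follows directly from momentum conservation. A sketch for a perfectly inelastic head-on collision, using the curb weights cited above (the closing speed is illustrative):

```python
# Perfectly inelastic collision: both vehicles end at a common velocity,
# so each one's delta-v is proportional to the *other* vehicle's mass.
def delta_v(m_self, m_other, closing_speed):
    return closing_speed * m_other / (m_self + m_other)

# Ram 1500 (4952 lb) vs Ram 3500 (6342 lb) at a 60 mph closing speed:
dv_1500 = delta_v(4952, 6342, 60)  # ~33.7 mph - the lighter truck takes more
dv_3500 = delta_v(6342, 4952, 60)  # ~26.3 mph
```

Adding mass always shifts delta-v onto the other party, which is exactly why there’s no aggregate benefit.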

In that respect, CAFE’s car-light truck distinction and footprint methodology have probably been damaging, because they have encouraged the spread of heavy SUVs on ladder frames, as can be seen in the first figure. Those vehicles impose disproportionate risk on others:

Collision risk, decomposed to own and other effects

Source: Marc Ross (UMich) and Tom Wenzel (LBNL), An Analysis of Traffic Deaths by Vehicle Type and Model, ACEEE Report #T012, March 2002.

There are many ways to achieve safety without simply adding mass: good design, better materials, restraints, lower speeds, and less beer on Saturday night all help. If we had a vehicle energy and emissions policy that permitted broader tradeoffs, I’m sure we could arrive at a more efficient system with better aggregate safety than we have now.

In spite of its many problems, I’ll take CAFE – it’s better than nothing, and there’s certainly no technical obstacle to meeting the new standards (be prepared for lots of whining though). Alternatives will take a while to construct, so by wingwalker’s rule we should hang onto what we have for the moment. But rather than pushing the standards approach to its inevitable breakdown point, I think we should be pursuing other options: get a price on carbon, and any other externalities we care about (congestion tolls and pay-at-the-pump insurance are good examples). Then work on zoning, infrastructure, and other barriers to efficiency, mode shifting, and VMT reduction. With the fundamental price signals aligned with the goals, it should be easier to push things in the right direction.

Battle of the Bulb

The NYT covers the resistance movement against incandescent light bulb bans. I think most of the resistance’s arguments are flimsy. Good-quality CFLs have better color reproduction and much longer lifetimes than incandescents. Start-up times are now pretty fast, flicker is not a problem, and cold weather operation is fine outdoors, even here in Montana. Bad-quality bulbs are more problematic, but you get what you pay for; if you pay for quality, you still come out ahead with CFLs.

Still, I sympathize with the resistance, because an outright ban makes little sense. CFLs don’t work in some applications, and don’t even save energy or money in fixtures that are rarely switched on. They also make lousy chicken incubators. Instead, we should ban inefficient lighting economically, by pricing GHGs, local air quality, light pollution, energy security, and whatever else motivates us to seek efficient lighting in the first place. Then incandescents can stick around for things that make sense, and disappear for things that don’t. The resistance won’t have to hoard bulbs, because they can run their little tungsten filaments as long as they feel like paying for the privilege. While we’re at it, we should price mercury, so the indoor and outdoor pollution effects of CFL disposal and coal combustion are properly traded off.

Command and control is so 20th century.

FutureGen killing a mistake?

Via ClimateArk,

US government slammed over coal project

Basic accounting error led government department to miscalculate ongoing project costs

The document, which examines the restructuring of the FutureGen project in January 2008, found that a basic accounting error led the department to miscalculate ongoing project costs. This led it to drastically alter the nature of the project, delaying its operation by three years.

FutureGen, which was meant to begin operation in 2012, combined integrated gasification combined cycle (IGCC) with carbon capture and sequestration (CCS).

The initiative was designed to be an experimental one for emerging clean coal research, but construction prices had been escalating as material and labour costs increased. The DoE decided to withdraw support for the industry alliance that was partially funding the programme in January last year.

“Contrary to best practices, DoE did not base its decision to restructure FutureGen on a comprehensive analysis of factors, such as the associated costs, benefits, and risks,” says the report.

“DoE made its decision based, in large part, on its conclusion that construction and material costs for the original programme would continue escalating substantially in the definite future and that lifecycle costs were likely to double.”

However, the DoE’s own Energy Information Administration has pointed out that significant cost escalation for building power plants does not continue in the long run.

The department also made a fundamental mistake in assessing ongoing project costs. It said that costs had doubled from original estimates, using that as the key justification for withdrawing funds from the alliance.

But when it compared its original 2004 estimate of the project’s cost with the alliance’s 2006 estimate to reach that conclusion, it did not take into account that the first estimate was in constant 2004 dollars, whereas the latter was in inflated dollars. Had it acknowledged this difference, the project cost would only have increased by 39 per cent ($370m), according to the GAO.

Another good reason to make sure your units balance. I find this explanation of the cancellation barely credible. There must be more to this than meets the eye.
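The GAO’s point is simple deflation: comparing a constant-2004-dollar estimate against an as-spent-dollar estimate overstates real cost growth. A sketch with hypothetical numbers chosen to reproduce the 39% result (the actual estimates aren’t given above):

```python
# Real cost growth requires both estimates in the same year's dollars.
def real_growth(nominal_new, base_old, deflator):
    """deflator converts the new estimate back to the base year's dollars."""
    return nominal_new / deflator / base_old - 1

# An apparent doubling in mixed units shrinks sharply once deflated.
# Hypothetical: $950M base, $1900M as-spent, ~44% cumulative escalation.
apparent = 1900 / 950 - 1             # 100% "increase" comparing mixed units
real = real_growth(1900, 950, 1.44)   # ~39% in constant dollars
```

Same arithmetic as dimensional analysis anywhere else: the units have to balance before you subtract.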

Biofuels, dost thou protest too much?

Future ethanol?

Following up on yesterday’s LCFS item, a group of biofuel researchers have written an open letter to the gubernator, protesting the inclusion of indirect land use emissions in biofuel assessments for the LCFS. The letter is followed by 12 pages of names and affiliations – mostly biologists, chemical engineers, and ag economists. They ask for a 24-month moratorium on regulation of indirect land use effects, during which all indirect or market-mediated effects of petroleum and alternative fuels would be studied.

I have mixed feelings about this. On one hand, I don’t think it’s always practical to burden a local regulation with features that attempt to control its nonlocal effects. Better to have a simple regulation that gets imitated widely, so that nonlocal effects come under control in their own jurisdictions. On the other hand, I don’t see how you can do regional GHG policy without some kind of accounting for at least the largest boundary effects. Otherwise leakage of emissions to unregulated jurisdictions just puts the regions who are trying to do the right thing at a competitive disadvantage.

Continue reading “Biofuels, dost thou protest too much?”

Ethanol Odd Couple & the California LCFS

I started sharing items from my feed reader, here. Top of the list is currently a pair of articles from Science Daily:

Corn-for-ethanol’s Carbon Footprint Critiqued

To avoid creating greenhouse gases, it makes more sense using today’s technology to leave land unfarmed in conservation reserves than to plow it up for corn to make biofuel, according to a comprehensive Duke University-led study.

“Converting set-asides to corn-ethanol production is an inefficient and expensive greenhouse gas mitigation policy that should not be encouraged until ethanol-production technologies improve,” the study’s authors reported in the March edition of the research journal Ecological Applications.

Corn Rises After Government Boosts Estimate for Ethanol Demand

Corn rose for a fourth straight session, the longest rally this year, after the U.S. government unexpectedly increased its estimate of the amount of grain that will be used to make ethanol.

House Speaker Nancy Pelosi, a California Democrat, and Senator Amy Klobuchar, a Minnesota Democrat, both said March 9 they support higher amounts of ethanol blended into gasoline. On March 6, Growth Energy, an ethanol-industry trade group, asked the Environmental Protection Agency to raise the U.S. ratio of ethanol in gasoline to 15 percent from 10 percent.

This left me wondering where California’s assessments of low carbon fuels now stand. Last March, I attended a collaborative workshop on life cycle analysis of low carbon fuels, part of a series (mostly facilitated by Ventana, but not this one) on GHG policy. The elephant in the room was indirect land use emissions from biofuels. At the time, some of the academics present argued that, while there’s a lot of uncertainty, zero is the one value that we know to be wrong. That left me wondering what plan B is for biofuels, if current variants turn out to have high land use emissions (rendering them worse than fossil alternatives) and advanced variants remain elusive.

It turns out to be an opportune moment to wonder about this again, because California ARB has just released its LCFS staff report and a bunch of related documents on fuel GHG intensities and land use emissions. The staff report burdens corn ethanol with an indirect land use emission factor of 30 gCO2eq/MJ, on top of direct emissions of 47 to 75 gCO2eq/MJ. That renders 4 of the 11 options tested worse than gasoline (CA RFG at 96 gCO2eq/MJ). Brazilian sugarcane ethanol goes from 27 gCO2eq/MJ direct to 73 gCO2eq/MJ total, due to a higher burden of 46 gCO2eq/MJ for land use (presumably due to tropical forest proximity).

These numbers are a lot bigger than zero, but also a lot smaller than Michael O’Hare’s 2008 back-of-the-envelope exercise. For example, for corn ethanol grown on converted CRP land, he put total emissions at 228 gCO2eq/MJ (more than twice as high as gasoline), of which 140 gCO2eq/MJ is land use. Maybe the new results (from the GTAP model) are a lot better, but I’m a little wary of the fact that the Staff Report sensitivity ranges on land use (32-57 gCO2eq/MJ for sugarcane, for example) are so narrow, when uncertainty was previously regarded as rather profound.

But hey, 7 of 11 corn ethanol variants are still better than gasoline, right? Not so fast. A low carbon fuel standard sets the constraint:

(1-x)*G = (1-s)*G + s*A

where x is the standard (emissions intensity cut vs. gasoline), s is the market share of the low-carbon alternative, G is the intensity of gasoline, and A is the intensity of the alternative. Rearranging,

s = x / (1-A/G)

In words, the market share of the alternative fuel needed is proportional to the size of the cut, x, and inversely proportional to the alternative’s improvement over gasoline, (1-A/G), which I’ll call i. As a result, the required share of an alternative fuel increases steeply as its performance approaches the limit required by the standard, as shown schematically below:

Intensity-share schematic

Clearly, if a fuel’s i is less than x, s=x/i would have to exceed 1, which is impossible, so you couldn’t meet the constraint with that fuel alone (though you could still use it, supplemented by something better).

Thus land use emissions are quite debilitating for conventional ethanol fuels’ role in the LCFS. For example, ignoring land use emissions, California dry process ethanol has intensity ~59, or i=0.39. To make a 10% cut, x=0.1, you’d need s=0.26 – 26% market share is hard, but doable. But add 30 gCO2eq/MJ for land use, and i falls to 0.07, which means you can’t meet the standard with that fuel alone. Even the best ethanol option, Brazilian sugarcane at i=0.24, would need 42% market share to meet the standard. This means that the alternative to gasoline in the LCFS would have to be either an advanced ethanol (cellulosic, not yet evaluated), electricity (i=0.6) or hydrogen. As it turns out, that’s exactly what the new Staff Report shows. In the new gasoline compliance scenarios in table ES-10, conventional ethanol contributes at most 5% of the 2020 intensity reduction.
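The arithmetic above is easy to reproduce. Using the staff report intensities cited in the text (gCO2eq/MJ) and the share formula s = x/(1 − A/G):

```python
G = 96.0  # CA RFG gasoline intensity, gCO2eq/MJ

def required_share(A, x=0.10, G=G):
    """Market share needed for a fuel of intensity A to meet an x cut alone."""
    i = 1 - A / G                     # fractional improvement over gasoline
    return x / i if i >= x else None  # None: can't comply with this fuel alone

ca_dry_direct = required_share(59)         # ~0.26 - hard, but doable
ca_dry_with_luc = required_share(59 + 30)  # None - land use kills compliance
sugarcane = required_share(73)             # ~0.42 - a heavy lift
```

The `None` cases are the fuels whose improvement i falls below the cut x, i.e. the ones that can’t meet the standard unsupplemented.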

Chapter VI of the Staff Report describes compliance scenarios in more detail. Of the four scenarios in the gasoline stovepipe, each blends 15 to 20% ethanol into gasoline. That ethanol is in turn about 10% conventional (Midwest corn or an improved CA variant with lower intensity) and up to 10% sugarcane. The other 80 to 90% of ethanol is either cellulosic or “advanced renewable” (from forest waste).

That makes the current scenarios a rather different beast from those explored in the original UC Davis LCFS technical study that provides the analytical foundation for the LCFS. I dusted off my copy of VISION-CA (the model used, and a topic for another post some day) and ran the 10% cut scenarios. Some look rather like the vision in the current staff report, with high penetration of low-intensity fuels. But the most technically diverse (and, I think, the most plausible) scenario is H10, with multiple fuels and vehicles. The H10 scenario’s ethanol is still 70% conventional Midwest corn in 2020. It also includes substantial “dieselization” of the fleet (which helps due to diesel’s higher tank-to-wheel efficiency). I suspect that H10-like scenarios are now unavailable, due to land use emissions (which greatly diminish the value of corn ethanol) and the choice of separate compliance pathways for gasoline and diesel.

The new beast isn’t necessarily worse than the old, but it strikes me as higher risk, because it relies on the substantial penetration of fuels that aren’t on the market today. If that’s going to happen by 2020, it’s going to be a busy decade.

Electric Car Wisdumb

The current McKinsey Quarterly features Andy Grove’s editorial, An electric plan for energy resilience. An excerpt:

We believe the United States should consider accelerating this movement by creating an industry of after-market retrofitters. What problems – technical and economic – would need to be solved in order to do that? With the help of a team of second-year graduate students in our Bass seminar at the Stanford Business School, we examined this question in the context of a proposed pilot program, whose aim would be to retrofit one million vehicles in three years. We felt that such a project would represent what in game theory is referred to as the ‘minimum winning game’: a significant step toward a long-term strategic objective (see sidebar, ‘Inside Andy’s real-world seminar’).

We estimate the price tag of such a pilot project to be around $10 billion, owing to the present high cost of batteries, which are around $10,000 each. One might expect such costs to drop as volume increases, but because this program is accelerated by design, we have to assume that batteries will remain expensive. Assuming an average gas price of $3 per gallon, the payback period to the owner of a retrofitted vehicle is at least ten years, not a strong economic incentive. But the benefits of this program – testing and validating a key approach to energy resilience – accrue to the well-being of the United States at large. As the general population is the predominant beneficiary, economic assistance flowing from everyone to vehicle owners, in the form of tax incentives, is justified.

There are different approaches to retrofitting vehicles. We favor GM’s Volt design, in which the car is directly driven by an electric motor. The vehicle’s existing gasoline engine is replaced by a smaller one, whose sole purpose is to generate electricity and recharge the battery. To simplify the retrofitting task, we would limit the scope of the program to six to ten Chevrolet, Ford, and Dodge models, selected on the basis of two criteria: low fuel efficiency and large numbers of vehicles on the road. Most of these vehicles would be SUVs, pick-ups, and vans.

There’s some wisdom in this proposal, particularly in the recognition that achieving an alt fuel vehicle transformation takes more than a few inventions; it requires changes in infrastructure, marketing, and a variety of other domains, each with bugs to be worked out:

Others wondered why we should bother retrofitting a million cars if that would deal only with a fraction of a percent of the existing cars. That’s one way to look at it. Another, which was the view our students took, is that it is important to strive to do enough conversions that we can encounter all the unknown unknowns, which in my experience characterize every new product or technology as it gets scaled into volume. Should it be 5 million? Should it only be 500,000? We picked a million as a number that is big enough to stress retrofitting capability, battery production capability, manufacturing issues and marketing issues. We described our aim as the ‘minimum winning game’ that would give us a platform from which we could scale further.

However, the retrofit idea strikes me as fundamentally flawed. Targeting low efficiency SUVs, pick-ups, and vans puts batteries exactly where they’d be least effective. If most such vehicles weren’t overweight, un-aerodynamic, saddled with lossy AWD, and bloated with power-hungry accessories, they’d already get decent fuel economy. Adding batteries to them is going to result in some combination of high cost, short range, and poor performance. That sounds like a sure way to poison the public perception of plug in electric vehicles.

RMI has been arguing for years that a coordinated set of chassis innovations could make powertrains with high cost-per-watt, like fuel cells, attractive. It’s no accident that the only really successful hybrid vehicle (the Prius, responsible for over half of 2007 and 2008 hybrid sales) was designed from scratch. It gets its breakthrough mileage/performance combination from much more than a battery and motor. Lightweight materials, aerodynamics, low rolling resistance tires, and other innovations are also key.

I think Grove and his students are falling for a common fantasy: that technology will step up and allow us to drive exactly as we now do, fossil-free. I personally doubt that will happen. Arnold will probably be one of only a few to ever drive a hydrogen Hummer. The rest of us will have to recognize that if alt fuel vehicles are to accomplish anything really meaningful from an energy standpoint, they’ll be different, as will our land use, commuting, and travel habits.

With that in mind, we should be focusing on creating the new stuff, not fixing the old. That might mean the Chevy Volt, but it might also mean rail or telecommuting. Rather than setting up programs to achieve narrow goals, I’d rather see broad, credible signals (e.g., prices at the pump reflecting environmental and security values) guide the evolution of the new from the bottom up.

Rising at the Interest Rate?

With oil back at $70, I got curious how Hotelling is holding up. The observation that resource prices ought to rise at the interest rate is looking almost plausible now, if you squint, whereas it looked rather foolish for most of the 80s and 90s. Of course, the actual production trajectory has nothing to do with Hotelling’s simple model, which produces a monotonic decline. The basic problem with Hotelling, as I see it, is that there’s a difference between equilibrium and expectations subject to uncertainty. Moreover, the extraction trajectory is largely controlled by the rate at which governments lease or otherwise exploit resources, and governments have more than the usual dose of bounded rationality. (I got interested in this because I’ve been investigating Montana’s management of mineral rights on its school trust lands. So far, the state’s exercise of its fiduciary responsibility looks suspiciously like a corporate welfare program. More on that another time.)

Oil vs Interest Rates

The figure compares the nominal oil price trajectory to actual risk-free rates (3-month T-bills and the federal funds rate), as well as three constant rates for good measure. At those rates, one would have to conclude that a large risk premium must apply to oil production, or that there’s been an awful lot of uneconomic production over the years (for example, everything from about 1986 to 2006), or that current prices are just a blip and will continue to revert to some more moderate long-term level.
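The Hotelling comparison is just compound growth: the rule implies P(t) = P0(1+r)^t. Starting from a roughly $3/bbl nominal price circa 1970 (an illustrative round number, not a data point from the figure):

```python
# Hotelling rule: an in-ground resource should appreciate at the interest rate.
def hotelling_price(p0, r, years):
    return p0 * (1 + r) ** years

# $3/bbl compounded over 39 years (1970-2009):
low = hotelling_price(3, 0.05, 39)   # ~$20 - well under $70
high = hotelling_price(3, 0.08, 39)  # ~$60 - closer, but only at a high rate
```

Only at rates well above historical risk-free levels does the compounded path approach recent prices, hence the risk-premium and uneconomic-production puzzles above.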

Inflection Point Oil

Peak oil gets all the attention, but the peak isn’t the problem. Unless your assumptions about the dynamics of oil production are rather artificial, there will be an inflection point before the peak. Symptoms of strain in the system appear as soon as the production rate grows more slowly than demand – GDP growth less intensity improvements – or simply more slowly than expected. That’s when price has to begin rising to clear the market, creating the signal to alternatives (efficiency, biofuels, unconventional oil …) that they are needed, so that’s when the pain hits. Whether the pain is brief and results in an orderly transition, or something rockier, is a matter for some debate. Either way, arguing about when the peak might come is the wrong question; we should be considering whether we’re past the inflection point, and thus in a period of rising stress, and what to do about it.
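The inflection-before-peak point is easy to see with a Hubbert-style logistic production curve: the growth of the production rate is steepest well before production itself peaks. A numerical sketch (the peak year and curve width are arbitrary):

```python
import math

# Hubbert-style production: the derivative of a logistic cumulative, a
# bell-shaped rate P(t) = sech^2((t - t_peak)/width).
def production(t, t_peak=50.0, width=10.0):
    return 1 / math.cosh((t - t_peak) / width) ** 2

# Scan for the production peak and for the point of steepest growth
# (the inflection, after which production growth starts slowing).
ts = [i * 0.01 for i in range(10_001)]  # t = 0 .. 100
rates = [production(t) for t in ts]
growth = [(b - a) / 0.01 for a, b in zip(rates, rates[1:])]

t_peak = ts[rates.index(max(rates))]          # ~50
t_inflection = ts[growth.index(max(growth))]  # ~43.4, well before the peak
```

The stress (and the price signal’s work) begins at t_inflection, years before the peak that gets all the headlines.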