Biofuels, dost thou protest too much?

Future ethanol?

Following up on yesterday’s LCFS item, a group of biofuel researchers have written an open letter to the gubernator, protesting the inclusion of indirect land use emissions in biofuel assessments for the LCFS. The letter is followed by 12 pages of names and affiliations – mostly biologists, chemical engineers, and ag economists. They ask for a 24-month moratorium on regulation of indirect land use effects, during which all indirect or market-mediated effects of petroleum and alternative fuels would be studied.

I have mixed feelings about this. On one hand, I don’t think it’s always practical to burden a local regulation with features that attempt to control its nonlocal effects. Better to have a simple regulation that gets imitated widely, so that nonlocal effects come under control in their own jurisdictions. On the other hand, I don’t see how you can do regional GHG policy without some kind of accounting for at least the largest boundary effects. Otherwise, leakage of emissions to unregulated jurisdictions just puts the regions that are trying to do the right thing at a competitive disadvantage.


Ethanol Odd Couple & the California LCFS

I started sharing items from my feed reader here. Top of the list is currently a pair of articles from Science Daily:

Corn-for-ethanol’s Carbon Footprint Critiqued

To avoid creating greenhouse gases, it makes more sense using today’s technology to leave land unfarmed in conservation reserves than to plow it up for corn to make biofuel, according to a comprehensive Duke University-led study.

“Converting set-asides to corn-ethanol production is an inefficient and expensive greenhouse gas mitigation policy that should not be encouraged until ethanol-production technologies improve,” the study’s authors reported in the March edition of the research journal Ecological Applications.

Corn Rises After Government Boosts Estimate for Ethanol Demand

Corn rose for a fourth straight session, the longest rally this year, after the U.S. government unexpectedly increased its estimate of the amount of grain that will be used to make ethanol.

House Speaker Nancy Pelosi, a California Democrat, and Senator Amy Klobuchar, a Minnesota Democrat, both said March 9 they support higher amounts of ethanol blended into gasoline. On March 6, Growth Energy, an ethanol-industry trade group, asked the Environmental Protection Agency to raise the U.S. ratio of ethanol in gasoline to 15 percent from 10 percent.

This left me wondering where California’s assessments of low carbon fuels now stand. Last March, I attended a collaborative workshop on life cycle analysis of low carbon fuels, part of a series (mostly facilitated by Ventana, but not this one) on GHG policy. The elephant in the room was indirect land use emissions from biofuels. At the time, some of the academics present argued that, while there’s a lot of uncertainty, zero is the one value that we know to be wrong. That left me wondering what plan B is for biofuels, if current variants turn out to have high land use emissions (rendering them worse than fossil alternatives) and advanced variants remain elusive.

It turns out to be an opportune moment to wonder about this again, because California ARB has just released its LCFS staff report and a bunch of related documents on fuel GHG intensities and land use emissions. The staff report burdens corn ethanol with an indirect land use emission factor of 30 gCO2eq/MJ, on top of direct emissions of 47 to 75 gCO2eq/MJ. That renders 4 of the 11 options tested worse than gasoline (CA RFG at 96 gCO2eq/MJ). Brazilian sugarcane ethanol goes from 27 gCO2eq/MJ direct to 73 gCO2eq/MJ total, thanks to a higher land use burden of 46 gCO2eq/MJ (presumably due to tropical forest proximity).

These numbers are a lot bigger than zero, but also a lot smaller than Michael O’Hare’s 2008 back-of-the-envelope exercise. For example, for corn ethanol grown on converted CRP land, he put total emissions at 228 gCO2eq/MJ (more than twice as high as gasoline), of which 140 gCO2eq/MJ is land use. Maybe the new results (from the GTAP model) are a lot better, but I’m a little wary of the fact that the Staff Report sensitivity ranges on land use (32-57 gCO2eq/MJ for sugarcane, for example) are so narrow, when uncertainty was previously regarded as rather profound.

But hey, 7 of 11 corn ethanol variants are still better than gasoline, right? Not so fast. A low carbon fuel standard sets the constraint:

(1-x)*G = (1-s)*G + s*A

where x is the standard (emissions intensity cut vs. gasoline), s is the market share of the low-carbon alternative, G is the intensity of gasoline, and A is the intensity of the alternative. Rearranging (the G terms cancel, leaving x*G = s*(G-A)),

s = x / (1-A/G)

In words, the required market share of the alternative fuel is proportional to the size of the cut, x, and inversely proportional to the alternative’s improvement over gasoline, (1-A/G), which I’ll call i. As a result, the required share of an alternative fuel rises steeply as its performance approaches the limit set by the standard, as shown schematically below:

Intensity-share schematic

Clearly, if a fuel’s i is less than x, s=x/i would have to exceed 1, which is impossible, so you couldn’t meet the constraint with that fuel alone (though you could still use it, supplemented by something better).

Thus land use emissions are quite debilitating for conventional ethanol fuels’ role in the LCFS. For example, ignoring land use emissions, California dry process ethanol has intensity ~=59, or i=0.39. To make a 10% cut, x=0.1, you’d need s=0.26 – 26% market share is hard, but doable. But add 30 gCO2eq/MJ for land use, and i=0.07, which means you can’t meet the standard with that fuel alone. Even the best ethanol option, Brazilian sugarcane at i=0.24, would need 42% market share to meet the standard. This means that the alternative to gasoline in the LCFS would have to be either an advanced ethanol (cellulosic, not yet evaluated), electricity (i=0.6), or hydrogen. As it turns out, that’s exactly what the new Staff Report shows. In the new gasoline compliance scenarios in table ES-10, conventional ethanol contributes at most 5% of the 2020 intensity reduction.
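To make that arithmetic easy to check, here’s a minimal sketch (mine, not ARB’s) that plugs the intensities quoted above into s = x/(1-A/G). The fuel labels are just shorthand for the cases discussed, and the electricity entry simply back-calculates A from i=0.6.

```python
# Required market share s = x / (1 - A/G) for a 10% LCFS cut, using the
# intensities quoted above (gCO2eq/MJ). Labels and rounding are mine.
G = 96.0   # gasoline baseline (CA RFG)
x = 0.10   # 10% intensity cut

fuels = {
    "CA dry process ethanol, direct only": 59.0,
    "CA dry process ethanol + 30 land use": 89.0,
    "Brazilian sugarcane ethanol, total": 73.0,
    "electricity (i = 0.6)": (1 - 0.6) * G,
}

for name, A in fuels.items():
    i = 1.0 - A / G                              # improvement over gasoline
    s = x / i if i > 0 else float("inf")         # required market share
    status = "feasible alone" if s <= 1 else "can't meet the standard alone"
    print(f"{name:38s} i={i:5.2f}  s={s:5.2f}  ({status})")
```

Running it reproduces the 26% and 42% shares mentioned above, with the land-use-burdened corn case needing an impossible 137% share.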

Chapter VI of the Staff Report describes compliance scenarios in more detail. Each of the four scenarios in the gasoline stovepipe blends 15 to 20% ethanol into gasoline. That ethanol is in turn about 10% conventional (Midwest corn or an improved CA variant with lower intensity) and up to 10% sugarcane. The other 80 to 90% of the ethanol is either cellulosic or “advanced renewable” (from forest waste).

That makes the current scenarios a rather different beast from those explored in the original UC Davis LCFS technical study that provides the analytical foundation for the LCFS. I dusted off my copy of VISION-CA (the model used, and a topic for another post some day) and ran the 10% cut scenarios. Some look rather like the vision in the current staff report, with high penetration of low-intensity fuels. But the most technically diverse (and, I think, the most plausible) scenario is H10, with multiple fuels and vehicles. The H10 scenario’s ethanol is still 70% conventional Midwest corn in 2020. It also includes substantial “dieselization” of the fleet (which helps due to diesel’s higher tank-to-wheel efficiency). I suspect that H10-like scenarios are now unavailable, due to land use emissions (which greatly diminish the value of corn ethanol) and the choice of separate compliance pathways for gasoline and diesel.

The new beast isn’t necessarily worse than the old, but it strikes me as higher risk, because it relies on the substantial penetration of fuels that aren’t on the market today. If that’s going to happen by 2020, it’s going to be a busy decade.

The Growth Bubble

I caught up with my email just after my last post, which questioned the role of the real economy in the current financial crisis. Waiting in my inbox was this piece by Thomas Friedman, currently the most-emailed article in the NYT:

Let’s today step out of the normal boundaries of analysis of our economic crisis and ask a radical question: What if the crisis of 2008 represents something much more fundamental than a deep recession? What if it’s telling us that the whole growth model we created over the last 50 years is simply unsustainable economically and ecologically and that 2008 was when we hit the wall – when Mother Nature and the market both said: ‘No more.’

Certainly there are some parallels between the housing bubble and environment/growth issues. You have your eternal growth enthusiasts with plausible-sounding theories, cheered on by people in industry who stand to profit.

There’s plenty of speculation about the problem ahead of time:
Google news timeline – housing bubble

People in authority doubt that there’s a problem, and envision a soft landing. In any case, nobody does anything about it.

Sound familiar so far?

However, I think it’s a bit of a leap to attribute our current mess to unsustainability in the real economy. For one thing, in hindsight, it’s clear that we weren’t overshooting natural carrying capacity in 1929, so it’s evidently possible to have a depression without an underlying resource problem. For another, we had ridiculously high commodity prices, but not many other direct impacts of environmental catastrophe (other than all the ones that have been slowly worsening for decades). My guess is that environmental overshoot has a much longer time constant than housing or tech stock markets, both on the way up and on the way down, so overshoot will evolve in more gradual and diverse ways at first. I think the best you can say is that detecting the role of unsustainable resource management is like the tropical storm attribution problem: there are good theoretical reasons to think that higher sea surface temperatures contribute to tropical storm intensity, but there’s little hope of pinning Katrina on global warming specifically.

Personally, I think it’s possible that EIA is right, and peak oil is a little further down the road. With a little luck, asset prices might stabilize, and we could get another run of growth, at least from the perspective of those who benefit most from globalization. If so, will we learn from this bubble, and take corrective action before the next? I hope so.

I think the most important lesson could be in how the housing bubble has ended, at least so far. It’s not a soft landing; positive feedbacks have taken over, as with a spark in a dry forest. That seems like a really good reason to step back and think not just about how to save big banks, but about how to turn our current situation into a storm of creative destruction that mitigates the bigger one coming.

What about the real economy?

I sort of follow a bunch of economics blogs. Naturally they’re all very much preoccupied with the financial crisis. There’s a lot of debate about Keynesian multipliers, whether the stimulus will work, liquidity traps, bursting bubbles, and the like. If you step back, it appears to be a conversation about how to use fiscal and monetary policy to halt a vicious cycle of declining expectations fueled by financial instruments no one really understands – essentially an attempt to keep the perceived economy from dragging down the real economy (as it is clearly now doing). The implicit assumption often seems to be that, if we could only untangle the current mess, the economy would return to its steady state growth path.

What I find interesting is that there’s little mention of what might have been wrong in the real economy to begin with, and its role in the current crisis. Clearly the last decade was a time of disequilibrium, not just in the price of risk, but in the real capital investments and consumption patterns that flowed from it. My working hypothesis is that we were living in a lala land of overconsumption, funded by deficits, sovereign wealth funds, resource drawdown, and failure to invest in our own future. In that case, the question for the real economy is, how much does consumption have to fall to bring things back into balance? My WAG is 15% – which implies a heck of a lot of reallocation of activity in the real economy. What does that look like? Could we see it through the fog of knock-on effects that we’re now experiencing? Is there something we could be doing, on top of fiscal and monetary policy, to ease the transition?

Killer Models?

I was just looking up Archimedean copulas, and stumbled across a bunch of articles blaming the Gaussian copula for the crash, like this interesting one at Wired.

Getting into trouble by ignoring covariance actually has a long and glorious history. Want to make your complex device look super reliable? Decompose it into a zillion parts, then assess their collective probability of failure without regard for collateral damage and other feedbacks that correlate the failure of one part with another. Just don’t check for leaks with a candle afterwards.
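To see how big the error can be, here’s a minimal sketch (my own illustrative numbers, not anything from the Wired piece): five redundant parts, each failing 5% of the time, with a Gaussian copula at a modest 0.4 pairwise correlation standing in for the common-cause coupling.

```python
# A minimal sketch of how ignoring correlation understates joint failure risk.
# All numbers are illustrative placeholders.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n_parts, p_fail, rho, n_sim = 5, 0.05, 0.4, 1_000_000

# Independence assumption: all parts fail together with probability p^n
p_indep = p_fail ** n_parts

# Gaussian copula: correlated normals -> uniform marginals -> failure indicators
cov = rho * np.ones((n_parts, n_parts)) + (1 - rho) * np.eye(n_parts)
z = rng.multivariate_normal(np.zeros(n_parts), cov, size=n_sim)
u = norm.cdf(z)                                  # uniform marginals on [0, 1]
p_copula = np.mean((u < p_fail).all(axis=1))     # fraction of runs where all fail

print(f"independent estimate:      {p_indep:.1e}")
print(f"copula (rho=0.4) estimate: {p_copula:.1e}")
```

With these made-up numbers, the correlated estimate comes out a few orders of magnitude above the independent one – the same flavor of error that made supposedly diversified mortgage pools look safe.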

Still, blaming copulas, or any other model, for the financial crisis strikes me as a lot like blaming a telephone pole for your car crash. Never mind that you were speeding, drunk, and talking on the phone. It’s not the models, but a general predisposition to ignore systemic risk that brought down the system.

SD on Long Waves, Boom & Bust

Two relevant conversations from the SD email list archive:

Where are we in the long wave?

Bill Harris asks, in 2003,

… should a reasonable person think we are now in the down side of a long wave? That the tough economic times we’ve seen for the past few years will need years to work through, as levels adjust? That simple, short-term economic fixes won’t work as they may have in the past? That the concerns we’ve heard about deflation should be seen in a longer context of an entire cycle, not as an isolated event to be overcome? Is there a commonly accepted date for the start of this decline?

Was Bill 5 years ahead of schedule?

Preventing the next boom and bust

Kim Warren asks, in 2001,

This is a puzzle – we take a large fraction of the very brightest and best educated people in the world, put them through 2 years of further intensive education in how business, finance and economics are supposed to work, set them to work in big consulting firms, VCs, and investment banks, pay them highly and supervise them with very experienced and equally bright managers. Yet still we manage to invent quite implausible business ideas, project unsustainable earnings and market performance, and divert huge sums of money and talented people from useful activity into a collective fantasy. Some important questions remain unanswered, like who they are, what they did, how they got away with it, and why the rest of us meekly went along with them? So the challenge to SDers in business is … where is the next bubble coming from, what will it look like, and how can we stop it?

Clearly this is one nut we haven’t cracked.

Can Montana Escape Recession Ravages?

The answer is evidently now “no”, but until recently the UofM’s Bureau of Business and Economic Research director Patrick Barkey thought so:

“As early as last summer we still thought Montana would escape this recession,” he said. “We knew the national economic climate was uncertain, but Montana had been doing pretty well in the previous two recessions. We now know this is a global recession, and it is a more severe recession, and it’s a recession that’s not going to leave Montana unscathed.”

Indeed, things aren’t as bad here as they are in a lot of other places – yet. Compare our housing prices to Florida’s:

MT vs FL house price indexes

On the other hand, our overall economic situation shows a bigger hit than some places with hard-hit housing markets. Here’s the Fed’s coincident index vs. California:

MT coincident index of economic activity

As one would expect, the construction and resource sectors are particularly hard hit by the double-whammy of housing bubble and commodity price collapse. In spite of home prices that seem to have held steady so far, new home construction has fallen dramatically:

MT housing

Interestingly, that hasn’t hit construction employment as hard as one would expect. Mining and resources employment has taken a similar hit, though you can hardly see it here because the industry is comparatively small (so why is its influence on MT politics comparatively large?).

MT construction & mining employment

So, where’s the bottom? For metro home prices nationwide, futures markets think it’s 10 to 20% below today, some time around the end of 2010. If the recession turns into a depression, that’s probably too rosy, and it’s hard to see how Montana could escape the contagion. But the impact will certainly vary regionally. The answer for Montana likely depends a lot on two factors: how bubbly was our housing market, and how recession-resistant is our mix of economic activity?

On the first point, here’s the Montana housing market (black diamonds), compared to the other 49 states and DC:

State home price index vs 2000

Prices above are normalized to 2000 levels, using the OFHEO index of conforming loan sales (which is not entirely representative – read on). At the end of 2003, Montana ranked 20th in appreciation from 2000. At the end of 2008, MT was 8th. Does the rise mean that we’re holding strong on fundamentals while others collapse? Or just that we’re a bunch of hicks, last to hear that the party’s over? Hard to say.

It’s perhaps a little easier to separate fundamentals from enthusiasm by looking at prices in absolute terms. Here, I’ve used the Census Bureau’s 2000 median home prices to translate the OFHEO index into $ terms:

State median home prices
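The translation itself is just a rescaling of the index so that its 2000 value matches the Census 2000 median price; a minimal sketch (with placeholder numbers, not the actual OFHEO or Census figures):

```python
# Convert an OFHEO-style index (2000 base) into dollar terms by anchoring it
# to the Census 2000 median home price. Values below are illustrative only.
def index_to_dollars(index_by_year, median_price_2000, base_year=2000):
    base = index_by_year[base_year]
    return {yr: median_price_2000 * v / base for yr, v in index_by_year.items()}

mt_index = {2000: 100.0, 2004: 130.0, 2008: 180.0}   # placeholder index values
print(index_to_dollars(mt_index, median_price_2000=100_000))
# -> {2000: 100000.0, 2004: 130000.0, 2008: 180000.0}
```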

Among its western region peers, a few other large states, and random states I like, Montana still looks like a relative bargain. The real question then is whether demographic trends (latte cowboys like me moving in) can buoy the market against an outgoing tide. I suspect that we’ll fare reasonably well in the long run, but suffer a significant undershoot in the near term.

The OFHEO indices above are a little puzzling, in that so many states seem to be just now, or not yet, peaking. For comparison, here are the 20 metro areas in the CSI index (lines), together with Gallatin County’s median prices (bars):

Gallatin County & CSI metro home prices

These more representative indices still show Montana holding up comparatively well, but with Gallatin County peaking in 2006. I suspect that the OFHEO index is a biased picture of the wider market, due to its exclusion of nonconforming loans, and that this is a truer picture.

Real Estate Roundup

Ira Artman takes a look at residential real estate price indices – S&P/Case-Shiller (CSI), OFHEO, and RPX. The RPX comes out on top, for (marginally) better correlation with foreclosures and, more importantly, a much shorter reporting lag than CSI. This is a cause for minor rejoicing, as we at Ventana helped create the RPX and are affiliated with Radar Logic. Perhaps more importantly, rumor has it that there’s more trading volume on RPX.

In spite of the lag it introduces, the CSI repeat sales regression is apparently sexy to economists. Calculated Risk has been using it to follow developments in prices and price/rent ratios. Econbrowser today looks at the market bottom, as predicted by CSI forward contracts on CME. You can find similar forward curves in Radar’s monthly analysis. As of today, both RPX and CSI futures put the bottom of the market in Nov/Dec 2010, another 15% below current prices. Interestingly, the RPX forward curve looks a little more pessimistic than CSI – an arbitrage opportunity, if you can find the liquidity.

Artman notes that somehow the Fed, in its flow of funds reporting, missed most of the housing decline until after the election.

MIT Updates Greenhouse Gamble

For some time, the MIT Joint Program has been using roulette wheels to communicate climate uncertainty. They’ve recently updated the wheels, based on new model projections:

Greenhouse Gamble wheels: new no-policy and policy versions, with the old no-policy and policy versions for comparison

The changes are rather dramatic, as you can see. The no-policy wheel looks like the old joke about playing Russian Roulette with an automatic. A tiny part of the difference is a baseline change, but most is not, as the report on the underlying modeling explains:

The new projections are considerably warmer than the 2003 projections, e.g., the median surface warming in 2091 to 2100 is 5.1°C compared to 2.4°C in the earlier study. Many changes contribute to the stronger warming; among the more important ones are taking into account the cooling in the second half of the 20th century due to volcanic eruptions for input parameter estimation and a more sophisticated method for projecting GDP growth which eliminated many low emission scenarios. However, if recently published data, suggesting stronger 20th century ocean warming, are used to determine the input climate parameters, the median projected warming at the end of the 21st century is only 4.1°C. Nevertheless all our simulations have a very small probability of warming less than 2.4°C, the lower bound of the IPCC AR4 projected likely range for the A1FI scenario, which has forcing very similar to our median projection.

I think the wheels are a cool idea, but I’d be curious to know how users respond to them. Do they cheat, and spin to get the outcome they hope for? Perhaps MIT should spice things up a bit, by programming an online version that gives users’ computers the BSOD if they roll a >7C world.

Hat tip to Travis Franck for pointing this out.