Spring has (Un)Sprung

Spring has arrived here in Montana, though there are at least two months of snow still to come. Spring critters have arrived, as if on cue. This weekend we saw our first robin, bluebird, sandhill crane, and woolly bear caterpillar. The caterpillar found a little too much warmth – he’s fast becoming a fossil in a pool at Mammoth Hot Springs:

Mammoth woolly bear

Typical Montana: in the time it took me to write this, then find & upload the photo, it’s snowed almost a foot and yet more birds (juncos) have arrived.

Having a Blast in Bozeman

I don’t often get to read about my adopted hometown in the national papers; it’s usually pretty obscure. When FAA analysts look for a small-time airport to poke fun at, we’re first on the list. However, today the NYT has covered the gas explosion that destroyed half a block of downtown, including some wonderful historic brick buildings. The blast was so powerful that we heard it from our house, 6 miles away with an intervening ridge. Sadly there’s no recovery for one person, but I hope the rest of downtown bounces back quickly.

FutureGen killing a mistake?

Via ClimateArk,

US government slammed over coal project

Basic accounting error led government department to miscalculate ongoing project costs

The document, which examines the restructuring of the FutureGen project in January 2008, found that a basic accounting error led the department to miscalculate ongoing project costs. This led it to drastically alter the nature of the project, delaying its operation by three years.

FutureGen, which was meant to begin operation in 2012, combined integrated gasification combined cycle (IGCC) with carbon capture and sequestration (CCS).

The initiative was designed to be an experimental one for emerging clean coal research, but construction prices had been escalating as material and labour costs increased. The DoE decided to withdraw support for the industry alliance that was partially funding the programme in January last year.

“Contrary to best practices, DoE did not base its decision to restructure FutureGen on a comprehensive analysis of factors, such as the associated costs, benefits, and risks,” says the report.

“DoE made its decision based, in large part, on its conclusion that construction and material costs for the original programme would continue escalating substantially in the definite future and that lifecycle costs were likely to double.”

However, the DoE’s own Energy Information Administration has pointed out that significant cost escalation for building power plants does not continue in the long run.

The department also made a fundamental mistake in assessing ongoing project costs. It said that costs had doubled from original estimates, using that as the key justification for withdrawing funds from the alliance.

But when it compared its original 2004 estimate of the project’s cost with the alliance’s 2006 estimate to reach that conclusion, it did not take into account that the first estimate was in constant 2004 dollars, whereas the latter was in inflated dollars. Had it acknowledged this difference, the project cost would only have increased by 39 per cent ($370m), according to the GAO.

Another good reason to make sure your units balance. I find this explanation of the cancellation barely credible. There must be more to this than meets the eye.
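
The units issue is easy to reproduce. Here’s a minimal sketch (Python, with hypothetical numbers rather than the GAO’s actual figures) of how comparing a constant-dollar baseline with a later nominal-dollar estimate overstates real cost growth:

```python
# Illustrative only: the dollar amounts and deflator below are hypothetical,
# not the FutureGen figures. The point is that mixing constant-2004 dollars
# with then-year (inflated) dollars exaggerates apparent cost growth.
base_2004 = 950.0        # baseline estimate, in constant 2004 dollars (millions)
nominal_2006 = 1500.0    # later estimate, in then-year (nominal) dollars
deflator = 1.12          # assumed cumulative inflation from 2004 to 2006 dollars

apparent_growth = nominal_2006 / base_2004 - 1              # mixed units: overstated
real_growth = (nominal_2006 / deflator) / base_2004 - 1     # apples to apples

print(f"apparent growth (mixed units):  {apparent_growth:.0%}")
print(f"real growth (constant 2004 $):  {real_growth:.0%}")
```

Deflate first, then compare; otherwise part of the “doubling” is just inflation.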

Friendly Climate Science & Policy Models

Beth Sawin just presented our C-ROADS work in Copenhagen. The model will soon be available online and in other forms, for decision support and educational purposes. It helps people to understand the basic dynamics of the carbon cycle and climate, and to add up diverse regional proposals for emissions reductions, to see what they imply for the globe. It’s a small model, yet there are those who love it. No model can do everything, so I thought I’d point out a few other tools that are available online, fairly easy to use, and serve similar purposes.

FAIR

From MNP, Netherlands. Like C-ROADS, runs interactively. The downloadable demo version is quite sophisticated, but emphasizes discovery of emissions trajectories that meet goals and constraints, rather than characterization of proposals on the table. The full research version, with sector/fuel detail and marginal abatement costs, is available on a case-by-case basis. Backed up by some excellent publications.

JCM

Ben Matthews’ Java Climate Model. Another interactive tool. Generates visually stunning output in realtime, which is remarkable given the scale and sophistication of the underlying model. Very rich; it helps to know what you’re after when you start to get into the deeper levels.

MAGICC

The tool used in AR4 to summarize the behavior of 19 GCMs, facilitating more rapid scenario experimentation and sensitivity analysis. Its companion SCENGEN does nice regional maps, which I haven’t really explored. MAGICC takes a few seconds to run, and while it has a GUI, detailed input and output is buried in text files, so I’m stretching the term “friendly” here.

I think these are the premier accessible tools out there, but I’m sure I’ve forgotten a few, so I’ll violate my normal editing rules and update this post as needed.

Biofuels, dost thou protest too much?

Future ethanol?

Following up on yesterday’s LCFS item, a group of biofuel researchers have written an open letter to the gubernator, protesting the inclusion of indirect land use emissions in biofuel assessments for the LCFS. The letter is followed by 12 pages of names and affiliations – mostly biologists, chemical engineers, and ag economists. They ask for a 24-month moratorium on regulation of indirect land use effects, during which all indirect or market-mediated effects of petroleum and alternative fuels would be studied.

I have mixed feelings about this. On one hand, I don’t think it’s always practical to burden a local regulation with features that attempt to control its nonlocal effects. Better to have a simple regulation that gets imitated widely, so that nonlocal effects come under control in their own jurisdictions. On the other hand, I don’t see how you can do regional GHG policy without some kind of accounting for at least the largest boundary effects. Otherwise leakage of emissions to unregulated jurisdictions just puts the regions who are trying to do the right thing at a competitive disadvantage.


Ethanol Odd Couple & the California LCFS

I started sharing items from my feed reader, here. Top of the list is currently a pair of articles from Science Daily:

Corn-for-ethanol’s Carbon Footprint Critiqued

To avoid creating greenhouse gases, it makes more sense using today’s technology to leave land unfarmed in conservation reserves than to plow it up for corn to make biofuel, according to a comprehensive Duke University-led study.

“Converting set-asides to corn-ethanol production is an inefficient and expensive greenhouse gas mitigation policy that should not be encouraged until ethanol-production technologies improve,” the study’s authors reported in the March edition of the research journal Ecological Applications.

Corn Rises After Government Boosts Estimate for Ethanol Demand

Corn rose for a fourth straight session, the longest rally this year, after the U.S. government unexpectedly increased its estimate of the amount of grain that will be used to make ethanol.

House Speaker Nancy Pelosi, a California Democrat, and Senator Amy Klobuchar, a Minnesota Democrat, both said March 9 they support higher amounts of ethanol blended into gasoline. On March 6, Growth Energy, an ethanol-industry trade group, asked the Environmental Protection Agency to raise the U.S. ratio of ethanol in gasoline to 15 percent from 10 percent.

This left me wondering where California’s assessments of low carbon fuels now stand. Last March, I attended a collaborative workshop on life cycle analysis of low carbon fuels, part of a series (mostly facilitated by Ventana, but not this one) on GHG policy. The elephant in the room was indirect land use emissions from biofuels. At the time, some of the academics present argued that, while there’s a lot of uncertainty, zero is the one value that we know to be wrong. That left me wondering what plan B is for biofuels, if current variants turn out to have high land use emissions (rendering them worse than fossil alternatives) and advanced variants remain elusive.

It turns out to be an opportune moment to wonder about this again, because California ARB has just released its LCFS staff report and a bunch of related documents on fuel GHG intensities and land use emissions. The staff report burdens corn ethanol with an indirect land use emission factor of 30 gCO2eq/MJ, on top of direct emissions of 47 to 75 gCO2eq/MJ. That renders 4 of the 11 options tested worse than gasoline (CA RFG at 96 gCO2eq/MJ). Brazilian sugarcane ethanol goes from 27 gCO2eq/MJ direct to 73 gCO2eq/MJ total, due to a higher burden of 46 gCO2eq/MJ for land use (presumably due to tropical forest proximity).

These numbers are a lot bigger than zero, but also a lot smaller than Michael O’Hare’s 2008 back-of-the-envelope exercise. For example, for corn ethanol grown on converted CRP land, he put total emissions at 228 gCO2eq/MJ (more than twice as high as gasoline), of which 140 gCO2eq/MJ is land use. Maybe the new results (from the GTAP model) are a lot better, but I’m a little wary of the fact that the Staff Report sensitivity ranges on land use (32-57 gCO2eq/MJ for sugarcane, for example) are so narrow, when the uncertainty was previously regarded as rather profound.

But hey, 7 of 11 corn ethanol variants are still better than gasoline, right? Not so fast. A low carbon fuel standard sets the constraint:

(1-x)*G = (1-s)*G + s*A

where x is the standard (emissions intensity cut vs. gasoline), s is the market share of the low-carbon alternative, G is the intensity of gasoline, and A is the intensity of the alternative. Rearranging,

s = x / (1-A/G)

In words, the market share of the alternative fuel needed is proportional to the size of the cut, x, and inversely proportional to the alternative’s improvement over gasoline, (1-A/G), which I’ll call i. As a result, the required share of an alternative fuel increases steeply as its performance approaches the limit set by the standard, as shown schematically below:

Intensity-share schematic

Clearly, if a fuel’s i is less than x, s=x/i would have to exceed 1, which is impossible, so you couldn’t meet the constraint with that fuel alone (though you could still use it, supplemented by something better).

Thus land use emissions are quite debilitating for conventional ethanol fuels’ role in the LCFS. For example, ignoring land use emissions, California dry process ethanol has intensity ~59 gCO2eq/MJ, or i=0.39. To make a 10% cut, x=0.1, you’d need s=0.26 – 26% market share is hard, but doable. But add 30 gCO2eq/MJ for land use, and i=0.07, which means you can’t meet the standard with that fuel alone. Even the best ethanol option, Brazilian sugarcane at i=0.24, would need 42% market share to meet the standard. This means that the alternative to gasoline in the LCFS would have to be either an advanced ethanol (cellulosic, not yet evaluated), electricity (i=0.6) or hydrogen. As it turns out, that’s exactly what the new Staff Report shows. In the new gasoline compliance scenarios in table ES-10, conventional ethanol contributes at most 5% of the 2020 intensity reduction.
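
As a sanity check on the share arithmetic, here’s a quick sketch (Python) using the intensities quoted above; treat the values as approximate and the fuel labels as shorthand:

```python
# Required market share s = x / (1 - A/G), using intensities (gCO2eq/MJ) quoted above.
G = 96.0   # CA RFG gasoline
x = 0.10   # 10% intensity cut

fuels = {
    "CA dry process ethanol, no land use":  59.0,
    "CA dry process ethanol, +30 land use": 89.0,
    "Brazilian sugarcane ethanol, total":   73.0,
    "Electricity (i = 0.6)":                G * (1 - 0.6),
}

for name, A in fuels.items():
    i = 1.0 - A / G                 # improvement over gasoline
    s = x / i                       # share needed to meet the standard
    note = "feasible" if s <= 1 else "can't meet the standard alone"
    print(f"{name}: i = {i:.2f}, required share s = {s:.2f} ({note})")
```

This reproduces the numbers above: roughly 26% share without land use emissions, an infeasible share (>100%) with them, and about 42% for sugarcane.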

Chapter VI of the Staff Report describes compliance scenarios in more detail. Each of the four scenarios in the gasoline stovepipe blends 15 to 20% ethanol into gasoline. That ethanol is in turn about 10% conventional (Midwest corn or an improved CA variant with lower intensity) and up to 10% sugarcane. The other 80 to 90% of the ethanol is either cellulosic or “advanced renewable” (from forest waste).

That makes the current scenarios a rather different beast from those explored in the original UC Davis LCFS technical study that provides the analytical foundation for the LCFS. I dusted off my copy of VISION-CA (the model used, and a topic for another post some day) and ran the 10% cut scenarios. Some look rather like the vision in the current staff report, with high penetration of low-intensity fuels. But the most technically diverse (and, I think, the most plausible) scenario is H10, with multiple fuels and vehicles. The H10 scenario’s ethanol is still 70% conventional Midwest corn in 2020. It also includes substantial “dieselization” of the fleet (which helps due to diesel’s higher tank-to-wheel efficiency). I suspect that H10-like scenarios are now unavailable, due to land use emissions (which greatly diminish the value of corn ethanol) and the choice of separate compliance pathways for gasoline and diesel.

The new beast isn’t necessarily worse than the old, but it strikes me as higher risk, because it relies on the substantial penetration of fuels that aren’t on the market today. If that’s going to happen by 2020, it’s going to be a busy decade.

The Growth Bubble

I caught up with my email just after my last post, which questioned the role of the real economy in the current financial crisis. I found this in my inbox, by Thomas Friedman, currently the most-emailed article in the NYT:

Let’s today step out of the normal boundaries of analysis of our economic crisis and ask a radical question: What if the crisis of 2008 represents something much more fundamental than a deep recession? What if it’s telling us that the whole growth model we created over the last 50 years is simply unsustainable economically and ecologically and that 2008 was when we hit the wall – when Mother Nature and the market both said: “No more.”

Certainly there are some parallels between the housing bubble and environment/growth issues. You have your eternal growth enthusiasts with plausible-sounding theories, cheered on by people in industry who stand to profit.

There’s plenty of speculation about the problem ahead of time:
Google news timeline – housing bubble

People in authority doubt that there’s a problem, and envision a soft landing. In any case, nobody does anything about it.

Sound familiar so far?

However, I think it’s a bit of a leap to attribute our current mess to unsustainability in the real economy. For one thing, in hindsight it’s clear that we weren’t overshooting natural carrying capacity in 1929, so it’s evidently possible to have a depression without an underlying resource problem. For another, we had ridiculously high commodity prices, but not many other direct impacts of environmental catastrophe (other than all the ones that have been slowly worsening for decades). My guess is that environmental overshoot has a much longer time constant than housing or tech stock markets, both on the way up and the way down, so overshoot will evolve in more gradual and diverse ways at first. At best, detecting the role of unsustainable resource management here is like the tropical storm attribution problem: there are good theoretical reasons to think that higher sea surface temperatures contribute to tropical storm intensity, but there’s little hope of pinning Katrina on global warming specifically.

Personally, I think it’s possible that EIA is right, and peak oil is a little further down the road. With a little luck, asset prices might stabilize, and we could get another run of growth, at least from the perspective of those who benefit most from globalization. If so, will we learn from this bubble, and take corrective action before the next? I hope so.

I think the most important lesson could be the way the housing bubble has ended, at least so far. It’s not a soft landing; positive feedbacks have taken over, as with a spark in a dry forest. That seems like a really good reason to step back and think not just about how to save big banks, but about how to turn our current situation into a storm of creative destruction that mitigates the bigger one coming.

What about the real economy?

I sort of follow a bunch of economics blogs. Naturally they’re all very much preoccupied with the financial crisis. There’s a lot of debate about Keynesian multipliers, whether the stimulus will work, liquidity traps, bursting bubbles, and the like. If you step back, it appears to be a conversation about how to use fiscal and monetary policy to halt a vicious cycle of declining expectations fueled by financial instruments no one really understands – essentially an attempt to keep the perceived economy from dragging down the real economy (as it is clearly now doing). The implicit assumption often seems to be that, if we could only untangle the current mess, the economy would return to its steady state growth path.

What I find interesting is that there’s little mention of what might have been wrong in the real economy to begin with, and its role in the current crisis. Clearly the last decade was a time of disequilibrium, not just in the price of risk, but in the real capital investments and consumption patterns that flowed from it. My working hypothesis is that we were living in a lala land of overconsumption, funded by deficits, sovereign wealth funds, resource drawdown, and failure to invest in our own future. In that case, the question for the real economy is, how much does consumption have to fall to bring things back into balance? My WAG is 15% – which implies a heck of a lot of reallocation of activity in the real economy. What does that look like? Could we see it through the fog of knock-on effects that we’re now experiencing? Is there something we could be doing, on top of fiscal and monetary policy, to ease the transition?

Killer Models?

I was just looking up Archimedean copulas, and stumbled across a bunch of articles blaming the Gaussian copula for the crash, like this interesting one at Wired.

Getting into trouble by ignoring covariance actually has a long and glorious history. Want to make your complex device look super reliable? Decompose it into a zillion parts, then assess their collective probability of failure without regard for collateral damage and other feedbacks that correlate the failure of one part with another. Just don’t check for leaks with a candle afterwards.
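
For a toy illustration of that trap (not the Wired article’s analysis), here’s a sketch that simulates two “redundant” components whose failures are tied together by a Gaussian copula; the failure probability and correlation are arbitrary choices for the example:

```python
# How assuming independence understates the chance that two "redundant"
# components fail together. The marginal failure probability p is the same
# either way; dependence comes from a Gaussian copula with correlation rho.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
p, rho, n = 0.01, 0.7, 1_000_000   # per-component failure prob, correlation, draws

cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated standard normals
u = norm.cdf(z)                                       # Gaussian copula: map to uniforms

fail = u < p                       # did each component fail on each draw?
both = fail.all(axis=1).mean()     # observed joint failure rate

print(f"independence assumption: {p**2:.1e}")
print(f"with correlation {rho}:  {both:.1e}")   # typically far above p**2
```

The marginals are identical in both calculations; only the correlation structure differs, and that’s where the “super reliable” estimate falls apart.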

Still, blaming copulas, or any other model, for the financial crisis strikes me as a lot like blaming a telephone pole for your car crash. Never mind that you were speeding, drunk, and talking on the phone. It’s not the models, but a general predisposition to ignore systemic risk that brought down the system.

SD on Long Waves, Boom & Bust

Two relevant conversations from the SD email list archive:

Where are we in the long wave?

Bill Harris asks, in 2003,

… should a reasonable person think we are now in the down side of a long wave? That the tough economic times we’ve seen for the past few years will need years to work through, as levels adjust? That simple, short-term economic fixes won’t work as they may have in the past? That the concerns we’ve heard about deflation should be seen in a longer context of an entire cycle, not as an isolated event to be overcome? Is there a commonly accepted date for the start of this decline?

Was Bill 5 years ahead of schedule?

Preventing the next boom and bust

Kim Warren asks, in 2001,

This is a puzzle – we take a large fraction of the very brightest and best educated people in the world, put them through 2 years of further intensive education in how business, finance and economics are supposed to work, set them to work in big consulting firms, VCs, and investment banks, pay them highly and supervise them with very experienced and equally bright managers. Yet still we manage to invent quite implausible business ideas, project unsustainable earnings and market performance, and divert huge sums of money and talented people from useful activity into a collective fantasy. Some important questions remain unanswered, like who they are, what they did, how they got away with it, and why the rest of us meekly went along with them? So the challenge to SDers in business is … where is the next bubble coming from, what will it look like, and how can we stop it?

Clearly this is one nut we haven’t cracked.