Energy unprincipled

I’ve been browsing the ALEC model legislation on ALECexposed, some of which infiltrated the Montana legislature. It’s discouragingly predictable stuff, but not without a bit of amusement. Take the ALEC Energy Principles:

Mission: To define a comprehensive strategy for energy security, production, and distribution in the states consistent with the Jeffersonian principles of free markets and federalism.

Except when authoritarian government is needed to stuff big infrastructure projects down the throats of unwilling private property owners:

Reliable electricity supply depends upon significant improvement of the transmission grid. Interstate and intrastate transmission siting authority and procedures must be addressed to facilitate the construction of needed new infrastructure.

Like free markets, federalism apparently has its limits:

Such plan shall only be approved by the commission if the expense of implementing such a plan is borne by the federal government.

The overconfidence of nuclear engineers

Rumors that the Fort Calhoun nuclear power station is subject to a media blackout appear to be overblown, given that the NRC is blogging the situation.

Apparently floodwaters at the plant were at 1006 feet ASL yesterday, a fair margin below the 1014-foot design standard for the plant. That margin might have been a lot less if the NRC hadn’t cited the plant for design violations last year; it estimated those flaws would lead to certain core damage at 1010 feet.

Still, engineers say things like this:

“We have much more safety measures in place than we actually need right now,” Jones continued. “Even if the water level did rise to 1014 feet above mean sea level, the plant is designed to handle that much water and beyond. We have additional steps we can take if we need them, but we don’t think we will. We feel we’re in good shape.” – suite101

The “and beyond” sounds like pure embellishment. The design flood elevation for the plant is 1014 feet. I’ve read some NRC documents on the plant, and there’s no other indication that higher design standards were used. Presumably there are safety margins in systems, but those are designed to offset unanticipated failures, e.g. from design deviations like those discovered by the NRC. Surely the risk of unanticipated problems would rise dramatically above the maximum anticipated flood level of 1014 feet.
Overconfidence is a major contributor to accidents in complex systems. How about a little humility?
Currently the Missouri River forecast is pretty flat, so hopefully we won’t test the limits of the plant design.

Wedge furor

Socolow is quoted in Nat Geo as claiming the stabilization wedges were a mistake,

“With some help from wedges, the world decided that dealing with global warming wasn’t impossible, so it must be easy,” Socolow says.  “There was a whole lot of simplification, that this is no big deal.”

Pielke quotes & gloats:

Socolow’s strong rebuke of the misuse of his work is a welcome contribution and, perhaps optimistically, marks a positive step forward in the climate debate.

Romm refutes,

I spoke to Socolow today at length, and he stands behind every word of that — including the carefully-worded title.  Indeed, if Socolow were king, he told me, he’d start deploying some 8 wedges immediately. A wedge is a strategy and/or technology that over a period of a few decades ultimately reduces projected global carbon emissions by one billion metric tons per year (see Princeton website here). Socolow told me we “need a rising CO2 price” that gets to a serious level in 10 years.  What is serious?   “$50 to $100 a ton of CO2.”

Revkin weighs in with a broader view, but the tone is a bit Pielkeish,

From the get-go, I worried about the gushy nature of the word “solving,” particularly given that there was then, and remains, no way to solve the climate problem by 2050.

David Roberts wonders what the heck Socolow is thinking.

Who’s right? I think it’s best to let Socolow speak for himself (posted by Revkin):

1. Look closely at what is in quotes, which generally comes from my slides, and what is not in quotes. What is not in quotes is just enough “off” in several places to result in my messages being misconstrued. I have given a similar talk about ten times, starting in December 2010, and this is the first time that I am aware of that anyone in the audience so misunderstood me. I see three places where what is being attributed to me is “off.”

a. “It was a mistake, he now says.” Steve Pacala’s and my wedges paper was not a mistake. It made a useful contribution to the conversation of the day. Recall that we wrote it at a time when the dominant message from the Bush Administration was that there were no available tools to deal adequately with climate change. I have repeated maybe a thousand times what I heard Spencer Abraham, Secretary of Energy, say to a large audience in Alexandria, Virginia, early in 2004. Paraphrasing, “it will take a discovery akin to the discovery of electricity” to deal with climate change. Our paper said we had the tools to get started, indeed the tools to “solve the climate problem for the next 50 years,” which our paper defined as achieving emissions 50 years from now no greater than today. I felt then and feel now that this is the right target for a world effort. I don’t disown any aspect of the wedges paper.

b. “The wedges paper made people relax.” I do not recognize this thought. My point is that the wedges paper made some people conclude, not surprisingly, that if we could achieve X, we could surely achieve more than X. Specifically, in language developed after our paper, the path we laid out (constant emissions for 50 years, emissions at stabilization levels after a second 50 years) was associated with “3 degrees,” and there was broad commitment to “2 degrees,” which was identified with an emissions rate of only half the current one in 50 years. In language that may be excessively colorful, I called this being “outflanked.” But no one that I know of became relaxed when they absorbed the wedges message.

c. “Well-intentioned groups misused the wedges theory.” I don’t recognize this thought. I myself contributed the Figure that accompanied Bill McKibben’s article in National Geographic that showed 12 wedges (seven wedges had grown to eight to keep emissions level, because of emissions growth post-2006, and the final four wedges drove emissions to half their current levels), to enlist the wedges image on behalf of a discussion of a two-degree future. I am not aware of anyone misusing the theory.

2. I did say “The job went from impossible to easy.” I said (on the same slide) that “psychologists are not surprised,” invoking cognitive dissonance. All of us are more comfortable with believing that any given job is impossible or easy than hard. I then go on to say that the job is hard. I think almost everyone knows that. Every wedge was and is a monumental undertaking. The political discourse tends not to go there.

3. I did say that there was and still is a widely held belief that the entire job of dealing with climate change over the next 50 years can be accomplished with energy efficiency and renewables. I don’t share this belief. The fossil fuel industries are formidable competitors. One of the points of Steve’s and my wedges paper was that we would need contributions from many of the available options. Our paper was a call for dialog among antagonists. We specifically identified CO2 capture and storage as a central element in climate strategy, in large part because it represents a way of aligning the interests of the fossil fuel industries with the objective of climate change.

It is distressing to see so much animus among people who have common goals. The message of Steve’s and my wedges paper was, above all, ecumenical.

My take? It’s rather pointless to argue the merits of 7 or 14 or 25 wedges. We don’t really know the answer in any detail. Do a little, learn, do some more. Socolow’s $50 to $100 a ton would be a good start.

A walk through the Ryan budget proposal

Since the budget deal was announced, I’ve been wondering what was in it. It’s hard to imagine that it really works like this:

“This is an agreement to invest in our country’s future while making the largest annual spending cut in our history,” Obama said.

However, it seems that there isn’t really much substance to the deal yet, so I thought I’d better look instead at one target: the Ryan budget roadmap. The CBO recently analyzed it, and put the $ conveniently in a spreadsheet.

Like most spreadsheets, this is very good at presenting the numbers, and lousy at revealing causality. The projections are basically open-loop, but they run to 2084. There’s actually some justification for open-loop budget projections, because many policies are open loop. The big health and social security programs, for example, are driven by demographics, cutoff ages and inflation adjustment formulae. The demographics and cutoff ages are predictable. It’s harder to fathom the possible divergence among inflation adjustments, broad inflation (which affects the health sector share), and future GDP growth. So, over long horizons, it’s a bit bonkers to look at the system without considering feedback, or at least uncertainty in the future trajectory of some key drivers.
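
To make the open-loop point concrete, here’s a toy contrast (a Python sketch with invented round numbers, not the CBO’s math) between a debt projection with no feedback and one where interest costs feed back into the deficit:

```python
# Toy debt-to-GDP projection: open loop vs. one interest feedback.
# All parameters are invented round numbers for illustration only.
years = 75
debt_open = debt_fb = 0.62     # initial debt, fraction of GDP
primary_deficit = 0.03         # exogenous primary deficit, fraction of GDP
gdp_growth = 0.02              # GDP growth rate
interest_rate = 0.04           # average interest rate on the debt

for _ in range(years):
    # Open loop: the deficit is a fixed exogenous ratio; debt just accumulates.
    debt_open = (debt_open + primary_deficit) / (1 + gdp_growth)
    # Closed loop: interest on existing debt adds to the deficit each year,
    # so debt growth feeds on itself whenever interest outpaces growth.
    debt_fb = (debt_fb * (1 + interest_rate) + primary_deficit) / (1 + gdp_growth)

print(f"open loop: {debt_open:.0%} of GDP; with interest feedback: {debt_fb:.0%}")
```

The open-loop path settles toward a finite ratio; the feedback path diverges once the interest rate exceeds growth, which is exactly the kind of behavior a static spreadsheet can quietly miss.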

There’s also a confounding annoyance in the presentation, with budgets and debt as percentages of GDP. Here’s revenue and “other” expenditures (everything but social security, health and interest):

RevenueOtherTransient

There’s a huge transient in each, due to the current financial mess. (Actually this behavior is to some extent deliberately Keynesian – the loss of revenue in a recession is amplified relative to the contraction of GDP, because people fall into lower tax brackets and profits are more volatile than gross activity. Increased borrowing automatically takes up the slack, maintaining more stable spending.) The transient makes it tough to sort out what’s real change, and what is merely the shifting sands of the GDP denominator. This graph also points out another irritation: there’s no history. Is this plausible, or unprecedented behavior?

The Ryan team actually points out some of the same problems with budgets and their analyses:

One reason the Federal Government’s major entitlement programs are difficult to control is that they are designed that way. A second is that current congressional budgeting provides no means of identifying the long-term effects of near-term program expansions. A third is that these programs are not subject to regular review, as annually appropriated discretionary programs are; and as a result, Congress rarely evaluates the costs and effectiveness of entitlements except when it is proposing to enlarge them. Nothing can substitute for sound and prudent policy choices. But an improved budget process, with enforceable limits on total spending, would surely be a step forward. This proposal calls for such a reform.

Unfortunately the proposed reforms don’t seem to change anything about the process for analyzing the budget or designing programs. We need transparent models with at least a little bit of feedback in them, and programs that are robust because they’re designed with that little bit of feedback in mind.

Setting aside these gripes, here’s what I can glean from the spreadsheet.

The Ryan proposal basically flatlines revenue at 19% of GDP, then squashes programs to fit. By contrast, the CBO Extended Baseline scenario expands programs per current rules and then raises revenue to match (very roughly – the Ryan proposal actually winds up with slightly more public debt 20 years from now).

Revenue

It’s not clear how the 19% revenue level arises; the CBO used a trajectory from Ryan’s staff, not its own analysis. Ryan’s proposal says:

  • Provides individual income tax payers a choice of how to pay their taxes – through existing law, or through a highly simplified code that fits on a postcard with just two rates and virtually no special tax deductions, credits, or exclusions (except the health care tax credit).
  • Simplifies tax rates to 10 percent on income up to $100,000 for joint filers, and $50,000 for single filers; and 25 percent on taxable income above these amounts. Also includes a generous standard deduction and personal exemption (totaling $39,000 for a family of four).
  • Eliminates the alternative minimum tax [AMT].
  • Promotes saving by eliminating taxes on interest, capital gains, and dividends; also eliminates the death tax.
  • Replaces the corporate income tax – currently the second highest in the industrialized world – with a border-adjustable business consumption tax of 8.5 percent. This new rate is roughly half that of the rest of the industrialized world.

It’s not clear that there’s any analysis to back up the effects of this proposal. Certainly it’s an extremely regressive shift. Real estate fans will flip when they find out that the mortgage interest deduction is gone (actually a good idea, I think).
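
For concreteness, here’s a minimal sketch of the two-rate schedule described above, using the bracket and deduction figures quoted from the proposal; credits, exclusions, and the business consumption tax are all ignored:

```python
def simplified_tax(taxable_income: float, joint: bool = True) -> float:
    """Two-rate schedule from the proposal text: 10% of taxable income
    up to the bracket threshold, 25% on the amount above it."""
    threshold = 100_000 if joint else 50_000
    if taxable_income <= threshold:
        return 0.10 * taxable_income
    return 0.10 * threshold + 0.25 * (taxable_income - threshold)

# Family of four filing jointly: $39,000 standard deduction + exemptions.
gross_income = 90_000
taxable = max(gross_income - 39_000, 0)
print(simplified_tax(taxable))  # 0.10 * 51,000 = $5,100
```

Even this toy version makes the flatness visible: the marginal rate tops out at 25% no matter how high income goes.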

On the outlay side, here’s the picture (CBO in solid lines; Ryan proposal with dashes):

Outlays

You can see several things here:

  • Social security is untouched until some time after 2050. CBO says that the proposal doesn’t change the program; Ryan’s web site partially privatizes it after about a decade and “eventually” raises the retirement age. There seems to be some disconnect here.
  • Health care outlays are drastically lower; this is clearly where the bulk of the savings originate. Even so, there’s not much change in the trend until at least 2025 (the initial absolute difference is definitional – inclusion of programs other than Medicare/Medicaid in the CBO version).
  • Other noninterest outlays also fall substantially – presumably this means that all other expenditures would have to fit into a box not much bigger than today’s defense budget, which seems like a heroic assumption even if you get rid of unemployment, SSI, food stamps, Section 8, and all similar support programs.

You can also look at the ratio of outlays under Ryan vs. CBO’s Extended Baseline:

OutlayRatios

Since health care carries the flag for savings, the question is, will the proposal work? I’ll take a look at that next.

April Fools in the MT Legislature

I was planning an April Fool’s Day post to mock the Montana legislature, but I really can’t top what’s actually been going on in Helena over the past few days. One bar-owning legislator proposed rolling back DUI laws, to preserve the sacred small town rite of driving home drunk from the bar. The same day, they seriously debated putting the state on the gold standard, which drew open laughter and an amendment to permit paying state transactions in coal. The gold bugs, who fancy themselves constitutional scholars, evidently weren’t around when the proposal to assert eminent domain power over federal lands was drafted. I could go on and on… It’s troubling, because I keep getting my news reader feed mixed up with The Onion.

A comment at the Bozeman Daily Chronicle captured widespread sentiment around here better than I can:

Hey members of the house- Thanks for wasting our money. Try to do something productive up there instead of making all Montanans look like a bunch of idiots. If I was as worthless as you I’d kick my own a_$. Put that in your cowboy code…

Dynamics of Fukushima Radiation

I like maps, but I love time series.

ScienceInsider has a nice roundup of radiation maps. I visited a few, and found current readings, but got curious about the dynamics, which were not evident.

So, I grabbed Marian Steinbach’s scraped data and filtered it to a manageable size. Here’s what I got for the 9 radiation measurement stations in Ibaraki prefecture, just south of Fukushima prefecture, where the Fukushima-Daiichi reactors are located:

IbarakiStationRadiation

The time series above (click it to enlarge) shows about 10 days of background readings, pre-quake, followed by some intense spikes of radiation, with periods of what looks like classic exponential decay behavior. “Intense” is relative, because fortunately those numbers are in nanoGrays, which are small.

The cumulative dose at these sites is not yet high, but climbing:

IbarakiStationCumDose

The Fukushima contribution to cumulative dose is about 0.15 milliGrays – according to this chart, roughly a chest x-ray. Of course, if you extrapolate to long exposure from living there, that’s not good, but fortunately the decay process is also underway.
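
For the units: the readings are dose rates in nanoGrays per hour, so cumulative dose is just their integral over time. A quick sanity check with made-up hourly values:

```python
# Integrate hourly dose-rate readings (nGy/h) into cumulative dose (mGy).
# These six values are invented; the real series is in the scraped data.
readings_ngy_per_h = [50, 3000, 1200, 600, 300, 150]
dose_mgy = sum(readings_ngy_per_h) / 1e6   # 1 mGy = 1,000,000 nGy
print(f"{dose_mgy:.4f} mGy")               # 0.0053 mGy over these six hours
```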

The interesting thing about the decay process is that it shows signs of having multiple time constants. That’s exactly what you’d expect, given that there’s a mix of isotopes with different half lives and a mix of processes (radioactive decay and physical transport of deposited material through the environment).

IbarakiRadHalfLife

The linear increases in the time constant during the long, smooth periods of decay presumably arise as fast processes play themselves out, leaving the longer time constants to dominate. For example, if you have a patch of soil with cesium and iodine in it, the iodine – half life 8 days – will be 95% gone in a little over a month, leaving the cesium – half life 30 years – to dominate the local radiation, with a vastly slower rate of decay.
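
A minimal simulation makes the point; the initial activities below are invented, but the half-lives are the real ones for iodine-131 and cesium-137:

```python
import numpy as np

# Two-isotope decay: I-131 (half life ~8 days) + Cs-137 (~30 years).
t = np.arange(0.0, 61.0, 1.0)                # days since deposition
lam_i = np.log(2) / 8.0                      # I-131 decay rate, 1/day
lam_cs = np.log(2) / (30.0 * 365.25)         # Cs-137 decay rate, 1/day
a = 10.0 * np.exp(-lam_i * t) + 1.0 * np.exp(-lam_cs * t)  # invented mix

# Apparent time constant tau = -a / (da/dt): short while iodine dominates,
# then drifting upward toward the cesium scale as the fast component dies.
tau = -a[:-1] / np.diff(a)                   # days, finite-difference estimate
print(tau[[0, 20, 40]])                      # ~13, ~18, ~48 days and rising
```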

Since the longer-lived isotopes will dominate the future around the plant, the key question then is what the environmental transport processes do with the stuff.

Update: Here’s the Steinbach data, aggregated to hourly (from 10-minute) frequency, with -888 entries removed, and trimmed in latitude range. Station_data Query hourly (.zip)
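
For anyone repeating the exercise, the cleanup amounts to something like this (a sketch only; the column names are guesses, not the actual layout of the file):

```python
import pandas as pd

# Drop the -888 missing-data sentinels and aggregate 10-minute readings
# to hourly means per station. Column names here are assumptions.
df = pd.read_csv("station_data.csv", parse_dates=["time"])
df = df[df["value"] != -888]
hourly = (df.set_index("time")
            .groupby("station")["value"]
            .resample("1h").mean()
            .reset_index())
hourly.to_csv("station_data_hourly.csv", index=False)
```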

Nuclear safety follies

I find panic-fueled iodine marketing and disingenuous comparisons of Fukushima to Chernobyl deplorable.

iodine

But those are balanced by pronouncements like this:

Telephone briefing from Sir John Beddington, the UK’s chief scientific adviser, and Hilary Walker, deputy director for emergency preparedness at the Department of Health: “Unequivocally, Tokyo will not be affected by the radiation fallout of explosions that have occurred or may occur at the Fukushima nuclear power stations.”

Surely the prospect of large scale radiation release is very low, but it’s not approximately zero, which is my interpretation of “unequivocally not.”

On my list of the seven deadly sins of complex systems management, number four is,

Certainty. Planning for it leads to fragile strategies. If you can’t imagine a way you could be wrong, you’re probably a fanatic.

Nuclear engineers disagree, but some seem to have a near-fanatic faith in plant safety. Normal Accidents documents some bizarrely cheerful post-accident reflections on safety, and I found another when reading up over the last few days.


Will complex designs win the nuclear race?

Areva pursues “defense in depth” for reactor safety:

Areva SA (CEI) Chief Executive Officer Anne Lauvergeon said explosions at a Japanese atomic power site in the wake of an earthquake last week underscore her strategy to offer more complex reactors that promise superior safety.

“Low-cost reactors aren’t the future,” Lauvergeon said on France 2 television station yesterday. “There was a big controversy for one year in France about the fact that our reactors were too safe.”

Lauvergeon has been under pressure to hold onto her job amid delays at a nuclear plant under construction in Finland. The company and French utility Electricite de France SA, both controlled by the state, lost a contract in 2009 worth about $20 billion to build four nuclear stations in the United Arab Emirates, prompting EDF CEO Henri Proglio to publicly question the merits of Areva’s more complex and expensive reactor design.

Areva’s new EPR reactors, being built in France, Finland and China, boast four independent safety sub-systems that are supposed to reduce core accidents by a factor of 10 compared with previous reactors, according to the company.

The design has a double concrete shell to withstand missiles or a commercial plane crash, systems designed to prevent hydrogen accumulation that may cause radioactive release, and a core catcher in the containment building in the case of a meltdown. To withstand severe earthquakes, the entire nuclear island stands on a single six-meter (19.6 feet) thick reinforced concrete base, according to Paris-based Areva.

via Bloomberg

I don’t doubt that the Areva design is far better than the reactors now in trouble in Japan. But I wonder if this is really the way forward. Big, expensive hardware that uses multiple redundant safety systems to offset the fundamentally marginal stability of the reaction might indeed work safely, but it doesn’t seem very deployable on the kind of scale needed for either GHG emissions mitigation or humanitarian electrification of the developing world. The financing comes in overly large bites, huge piles of concrete increase energy and emission payback periods, and it would take ages to ramp up construction and training enough to make a dent in the global challenge.

I suspect that the future – if there is one – lies with simpler designs that come in smaller portions and trade some performance for inherent stability and antiproliferation features. I can’t say whether their technology can actually deliver on the promises, but at least TerraPower – for example – has the right attitude:

“A cheaper reactor design that can burn waste and doesn’t run into fuel limitations would be a big thing,” Mr. Gates says.

However, even simple/small-is-beautiful may come rather late in the game from a climate standpoint:

While Intellectual Ventures has caught the attention of academics, the commercial industry–hoping to stimulate interest in an energy source that doesn’t contribute to global warming–is focused on selling its first reactors in the U.S. in 30 years. The designs it’s proposing, however, are essentially updates on the models operating today. Intellectual Ventures thinks that the traveling-wave design will have more appeal a bit further down the road, when a nuclear renaissance is fully under way and fuel supplies look tight. Technology Review

Not surprisingly, the evolution of the TerraPower design relies on models,

Myhrvold: When you put a software guy on an energy project he turns it into a software project. One of the reasons we’re innovating around nuclear is that we put a huge amount of energy into computer modeling. We do very extensive computer modeling and have better computer modeling of reactor internals than anyone in the world. No one can touch us on software for designing the reactor. Nuclear is really expensive to do experiments on, so when you have good software it’s way more efficient and a shorter design cycle.

Computing is something that is very important for nuclear. The first fast reactors, which TerraPower is, were basically designed in the slide rule era. It was stunning to us that the guys back then did what they did. We have these incredibly accurate simulations of isotopes and these guys were all doing it with slide rules. My cell phone has more computing power than the computers that were used to design the world’s nuclear plants.

It’ll be interesting to see whether current events kindle interest in new designs, or throw the baby out with the bathwater (is it a regular baby, or a baby Godzilla?). From a policy standpoint, the trick is to create a level playing field for competition among nuclear and non-nuclear technologies, where government participation in the fuel cycle has been overwhelming and risks are thoroughly socialized.

Fortunately, the core ended up on the floor

I’ve been sniffing around for more information on the dynamics of boiling water reactors, particularly in extreme conditions. Here’s what I can glean (caveat: I’m not a nuclear engineer).

It turns out that there’s quite a bit of literature on reduced-form models of reactor operations. Most of this, though, is focused on operational issues that arise from nonlinear dynamics, on a time scale of less than a second or so. (Update: I’ve posted an example of such a model here.)

reactorBlockDiagram

Source: Instability in BWR NPPs – F. Maggini 2004

Those are important – it was exactly those kinds of fast dynamics that led to disaster when operators took the Chernobyl plant into unsafe territory. (Fortunately, the Chernobyl design is not widespread.)

However, I don’t think those are the issues that are now of interest. The Japanese reactors are now far from their normal operating point, and the dynamics of interest have time scales of hours, not seconds. Here’s a map of the territory:

reactorShutdown2

Source: Instability in BWR NPPs – F. Maggini 2004
colored annotations by me.

The horizontal axis is coolant flow through the core, and the vertical axis is core power – i.e. the rate of heat generation. The green dot shows normal full-power operation. The upper left part of the diagram, above the diagonal, is the danger zone, where high power output and low coolant flow create the danger of a meltdown – like driving your car over a mountain pass, with nothing in the radiator.

It’s important to realize that there are constraints on how you move around this diagram. You can quickly turn off the nuclear chain reaction in a reactor, by inserting the control rods, but it takes a while for the power output to come down, because there’s a lot of residual heat from nuclear decay products.
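
Residual heat follows a well-known pattern: roughly 6% of full power at the moment of shutdown, around 1% an hour later, declining slowly after that. The classic Way-Wigner approximation captures the shape; the operating-time parameter below is an illustrative guess, not plant data:

```python
def decay_heat_fraction(t_s: float, t_op_s: float = 1.5e7) -> float:
    """Way-Wigner approximation: decay heat as a fraction of full power,
    t_s seconds after shutdown, following t_op_s seconds of operation
    (about six months here, an assumed figure)."""
    return 0.0622 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

for hours in (1, 8, 24, 72):
    print(f"{hours:>3} h: {decay_heat_fraction(3600 * hours):.2%} of full power")
```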

On the other hand, you can turn off the coolant flow pretty fast – turn off the electricity to the pumps, and the flow will stop as soon as the momentum of the fluid is dissipated. If you were crazy enough to turn off the cooling without turning down the power (yellow line), you’d have an immediate catastrophe on your hands.

In an orderly shutdown, you turn off the chain reaction, then wait patiently for the power to come down, while maintaining coolant flow. That’s initially what happened at the Fukushima reactors (blue line). Seismic sensors shut down the reactors, and an orderly cool-down process began.

After an hour, things went wrong when the tsunami swamped backup generators. Then the reactor followed the orange line to a state with near-zero coolant flow (whatever convection provides) and nontrivial power output from the decay products. At that point, things start heating up. The process takes a while, because there’s a lot of thermal mass in the reactor, so if cooling is quickly restored, no harm done.
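
You can get a feel for the time scale with a crude heat balance, using the same Way-Wigner decay-heat approximation; every parameter is an invented round number, and boil-off, venting, and latent heat (which matter a great deal in reality) are ignored:

```python
def decay_heat_fraction(t_s, t_op_s=1.5e7):
    # Way-Wigner approximation, as in the previous sketch
    return 0.0622 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

# Crude core heat-up after total loss of cooling, one hour post-shutdown.
P0 = 1.5e9            # full thermal power, W (invented round number)
C = 2.0e9             # effective thermal mass of core + water, J/K (invented)
T = 300.0             # core temperature, deg C, at loss of cooling
t, dt = 3600.0, 60.0  # seconds since shutdown; timestep

while T < 1200.0:     # rough onset of serious zirconium-steam trouble
    T += decay_heat_fraction(t) * P0 * dt / C
    t += dt

print(f"~{(t - 3600) / 3600:.0f} hours of margin in this toy model")
```

The point isn’t the specific number, which a toy model like this can’t deliver; it’s that decay heat buys the operators hours, not seconds.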

If cooling isn’t restored, a number of positive feedbacks (nasty vicious cycles) can set in. Boiling in the reactor vessel necessitates venting (releasing small amounts of mostly short-lived radioactive materials); if venting fails, the reactor vessel can fail from overpressure. Boiling reduces the water level in the reactor and makes heat transfer less efficient; fuel rods that boil dry heat up much faster. As fuel rods overheat, their zirconium cladding reacts with water to make hydrogen – which can explode when vented into the reactor building, as we apparently saw at reactors 1 & 3. That can cause collateral damage to systems or people, making it harder to restore cooling.

Things get worse as heat continues to accumulate. Melting fuel rods dump debris in the reactor, obstructing coolant flow, again making it harder to restore cooling. Ultimately, melted fuel could concentrate in the bottom of the reactor vessel, away from the control rods, making power output go back up (following the red line). At that point, it’s likely that the fuel is going to end up in a puddle on the floor of the containment building. Presumably, at that point negative feedback reasserts dominance, as fuel is dispersed over a large area, and can cool passively. I haven’t seen any convincing descriptions of this endgame, but nuclear engineers seem to think it benign – at least compared to Chernobyl. At Chernobyl, there was one less balancing feedback loop (ineffective containment) and an additional reinforcing feedback: graphite in the reactor, which caught fire.

So, the ultimate story here is a race against time. The bad news is that if the core is dry and melting, time is not on your side as you progress faster and faster up the red line. The good news is that, as long as that hasn’t happened yet, time is on the side of the operators – the longer they can hold things together with duct tape and seawater, the less decay heat they have to contend with. Unfortunately, it sounds like we’re not out of the woods yet.