Exponential Epi Pens

Mylan Pharmaceuticals is in the news for taking the price of EpiPens, which contain about $1 of active ingredient, to stratospheric levels. I think Bloomberg broke the story, and the NY Times has the latest.

Here’s the price trajectory:

[Chart: EpiPen price trajectory]

Data: epi pen data.xlsx

The rate of increase is not that far from the health care inflation rate in general, except that in this case, there’s no obvious underlying cost driver, hence the allegations of gouging.

Here’s a first cut at the structure of the problem:

[Diagram: EpiPen market feedback structure]

Econ 101 says that high profits should attract competition, putting downward pressure on prices (balancing loop B101). However, that’s not happening, because the FDA is the gatekeeper on product approval. It’s not clear to me whether the FDA simply makes the approval delay systematically long and uncertain, or whether it’s actually captive to Mylan lobbyists and holds new entrants to higher standards, as some hint (that would be a reinforcing loop, R2). Either way, the only loop that’s functioning is Mylan’s reinvestment in marketing and lobbying to create demand (R1).
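
To see why the blocked balancing loop matters, here’s a toy sketch in code (my own illustrative formulation and parameters, nothing from Mylan’s actual economics): with the approval delay effectively infinite, only the reinforcing reinvestment loop operates and price grows exponentially; give the balancing loop a finite delay and the growth eventually gets checked.

```python
# Toy sketch of the loop structure described above (illustrative only).
# R1: profits -> marketing/lobbying -> demand -> pricing power
# B101: profits -> entry attempts -> (FDA approval delay) -> competitors -> price pressure

def simulate(years=10, dt=0.25, approval_delay=1e9):
    price, competitors = 100.0, 0.0    # $/pen, rival products on the market
    entry_pipeline = 0.0               # products awaiting FDA approval
    history = []
    for _ in range(int(years / dt)):
        profit = price - 1.0                          # ~$1 of active ingredient
        entry_pipeline += dt * 0.1 * profit / 100     # high profits attract entrants
        approvals = entry_pipeline / approval_delay   # B101 only works if the delay is finite
        entry_pipeline -= dt * approvals
        competitors += dt * approvals
        # R1 pushes price up; competition (if any) pushes it down
        price += dt * price * (0.15 - 0.3 * competitors)
        history.append(round(price))
    return history

print(simulate(approval_delay=1e9)[::8])  # gatekeeper blocks entry: exponential price growth
print(simulate(approval_delay=2.0)[::8])  # finite approval delay: growth eventually checked
```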

This reminds me of California’s electricity market deregulation debacle, which created a wholesale power market without corresponding retail price elasticity. Utilities were stranded between hammer (floating generation prices) and anvil (fixed demand). The resulting mess was worse than might have occurred in either a more or less deregulated market.

Similarly, to bring this market under control, you’d either have to get the FDA out of the way, restoring the balancing loop, or regulate the price side of the market, constraining the reinforcing loop. In this case, it may be the court of public opinion that puts the brakes on, adding a balancing loop of bad press that has so far cost Mylan dearly in investor confidence, if nothing else.

Mylan responds to gouging allegations rather unconvincingly, I think. Their CEO argues that the problem is multiple markups in the supply chain, subsidization of Europe, and R&D. It’s hard to square those external-cause arguments with Mylan’s financials.

Problem Formulation

Nelson Repenning & colleagues have a nice new paper on problem formulation. It’s set in a manufacturing context, but the advice is as relevant for building models as for building motorcycles:

Anatomy of a Good Problem Statement
A good problem statement has five basic elements:
• it references something that the organization cares about and connects that element to a clear and specific goal or target;
• it contains a clear articulation of the gap between the current state and the goal;
• the key variables—the target, the current state and the gap—are quantifiable, if not immediately measurable;
• it is as neutral as possible concerning possible diagnoses or solutions;
• it is sufficiently small in scope that you can tackle it quickly.

A textbook death spiral

NPR has a nice article on self-regulation in the textbook industry. It turns out that textbook prices are up almost 100% from 2002, yet student spending on texts is nearly flat. (See the article for concise data.)

Here’s part of the structure that explains the data:

Starting with a price increase, students have a lot of options: they can manage textbooks more intensively (e.g., sharing, brown), they can simply choose to use fewer (substitution, blue), they can adopt alternatives that emerge after a delay (red), and they can extend the life of a given text by being quick to sell it back, or an agent can do that on their behalf by creating a rental fleet (green).

All of these options help students to hold spending to a desired level, but they have the unintended effect of triggering a variant of the utility death spiral. As unit sales (purchasing) fall, the unit cost of producing textbooks rises, due to the high fixed costs of developing and publishing the materials. That drives up prices, prompting further reductions in purchasing – a vicious cycle.
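
Here’s a toy version of that loop in code, with illustrative parameters of my own rather than data from the article:

```python
# Toy model of the textbook death spiral described above (my own
# illustrative parameters, not data from the NPR article).
fixed_cost = 1_000_000       # $ to develop, edit, and typeset a title
variable_cost = 20           # $ per copy printed
markup = 1.3                 # cost-plus pricing (assumed)
elasticity = -1.0            # new-copy demand elasticity (assumed)
price, sales = 100.0, 10_000.0

for year in range(6):
    unit_cost = fixed_cost / sales + variable_cost      # fixed costs spread over fewer copies
    new_price = markup * unit_cost
    sales *= (new_price / price) ** elasticity          # sharing, substitution, rentals, used sales
    price = new_price
    print(f"year {year}: price ${price:,.0f}, new copies {sales:,.0f}, spending ${price * sales:,.0f}")

# With unit elasticity, student spending stays flat while price climbs and
# new-copy sales collapse -- qualitatively like the data. The loop gain here
# is deliberately above one, so this toy spiral runs away faster than the
# real market, where other supply-side responses damp it.
```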

This isn’t quite the whole story – there’s more to the supply side to think about. If publishers are facing a margin squeeze from rising costs, are they offering fewer titles, for example? I leave that as an exercise.

Missing the point about efficiency rebounds … again

Breakthrough’s Nordhaus and Shellenberger (N&S) spot a bit of open-loop thinking about LED lighting:

On Tuesday, the Royal Swedish Academy of Sciences awarded the 2014 Nobel Prize in Physics to three researchers whose work contributed to the development of a radically more efficient form of lighting known as light-emitting diodes, or LEDs.

In announcing the award, the academy said, “Replacing light bulbs and fluorescent tubes with LEDs will lead to a drastic reduction of electricity requirements for lighting.” The president of the Institute of Physics noted: “With 20 percent of the world’s electricity used for lighting, it’s been calculated that optimal use of LED lighting could reduce this to 4 percent.”

The problem, of course, is that lighting energy use would fall from 20% to 4% only if there’s no feedback, so that LEDs replace incandescents 1 for 1 (and the multiplier can’t be that big anyway, because CFLs and other efficient technologies already supply a lot of light).
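
The arithmetic behind the quoted claim, and how a rebound fraction changes it (a back-of-envelope sketch; the 20% and 4% figures are from the quote above, the zero-rebound, 1-for-1 replacement assumption is what’s doing the work):

```python
# Back-of-envelope: how the "20% -> 4%" claim depends on assuming zero rebound.
lighting_share = 0.20          # share of world electricity used for lighting (quoted above)
no_rebound_share = 0.04        # claimed share after "optimal use of LED lighting"
potential_savings = lighting_share - no_rebound_share   # 16 points, assuming 1-for-1 replacement

for rebound in (0.0, 0.5, 1.0):
    realized_share = lighting_share - (1 - rebound) * potential_savings
    print(f"rebound {rebound:.0%}: lighting ends up at {realized_share:.0%} of electricity")

# rebound 0%   ->  4%  (the academy's no-feedback figure)
# rebound 50%  -> 12%  (the IEA/IPCC global estimate cited by N&S)
# rebound 100% -> 20%  (what N&S's "no reduction" claim actually requires)
```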

N&S go on to argue:

But it would be a mistake to assume that LEDs will significantly reduce overall energy consumption.

Why? Because rebound effects will eat up the efficiency gains:

“The growing evidence that low-cost efficiency often leads to faster energy growth was recently considered by both the Intergovernmental Panel on Climate Change and the International Energy Agency.”

“The I.E.A. and I.P.C.C. estimate that the rebound could be over 50 percent globally.”

Notice the sleight-of-hand: the first statement implies a rebound effect greater than 100%, while the evidence they’re citing describes a rebound of 50%, i.e. 50% of the efficiency gain is preserved, which seems pretty significant.

Presumably the real evidence they have in mind is http://iopscience.iop.org/0022-3727/43/35/354001 – authors Tsao & Saunders are Breakthrough associates. Saunders describes a 100% rebound for lighting here http://thebreakthrough.org/index.php/programs/energy-and-climate/understanding-energy-efficiency-rebound-interview-with-harry-saunders

Now the big non sequitur:

But LED and other ultraefficient lighting technologies are unlikely to reduce global energy consumption or reduce carbon emissions. If we are to make a serious dent in carbon emissions, there is no escaping the need to shift to cleaner sources of energy.

Let’s assume the premise is true – that the lighting rebound effect is 100% or more. That implies that lighting use is highly price elastic, which in turn means that an emissions price like a carbon tax will have a strong influence on lighting energy. Therefore pricing can play a major role in reducing emissions. It’s probably still true that a shift to clean energy is unavoidable, but it’s not an exclusive remedy, and a stronger rebound effect actually weakens the argument for clean sources.
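
To make the elasticity point concrete, here’s a rough sketch using a constant-elasticity demand for lighting services. The functional form, and the assumption that electricity dominates the cost of lighting service, are mine; the 100% rebound figure comes from the sources above, and the 12% price test anticipates the quote below.

```python
# Rough sketch: what a ~100% lighting rebound implies for price responsiveness.
# Assumes constant-elasticity demand for lighting *services*, with electricity
# the dominant cost of the service (my simplification, not from the op-ed).
def energy_use(efficiency_gain, price_increase, elasticity):
    """Relative lighting electricity use after an efficiency gain and a price change."""
    # the effective price of a lumen-hour falls with efficiency, rises with the electricity price
    service_price = (1 + price_increase) / (1 + efficiency_gain)
    service_demand = service_price ** elasticity
    return service_demand / (1 + efficiency_gain)   # electricity = service / efficiency

# an elasticity of -1 reproduces a ~100% rebound: a 5x efficiency gain leaves use unchanged
print(energy_use(efficiency_gain=4.0, price_increase=0.0, elasticity=-1.0))   # ~1.0
# ...but the same elasticity means a 12% price rise cuts lighting electricity by ~11%
print(energy_use(efficiency_gain=4.0, price_increase=0.12, elasticity=-1.0))  # ~0.89
```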

Their own colleagues point this out:

In fact, our paper shows that, for the two 2030 scenarios (with and without solid-state lighting), a mere 12% increase in real electricity prices would result in a net decline in electricity-for-lighting consumption.

What should the real takeaway be?

  • Subsidizing lighting efficiency is ineffective, and possibly even counterproductive.
  • Subsidizing clean energy lowers the cost of delivering lighting and other services, and therefore will also be offset by rebound effects.
  • Emissions pricing is a win-win, because it encourages efficiency, counteracts rebound effects and promotes substitution of clean sources.

Doing our bit for the cure … and the cause

I have a soft spot for breast cancer research, but I have to admit that it seemed a little silly when I started getting hay with pink baling twine.

But now it seems the Susan G. Komen foundation for breast cancer has really jumped the shark, with pink drill bits from oilfield service company Baker Hughes. Funding cancer care with revenue derived in part from pumping carcinogens into the ground, providing pinkwash for that practice, seems like rather unsystemic thinking. What’s next, pink cigarettes?

Not so fast?

Maybe Baker Hughes is deriving some enlightenment from the relationship. In a less-noticed bit of news:

As part of our ongoing commitment, we have adopted a new policy with respect to the information that we provide about the chemistry contained within our hydraulic fracturing fluid systems. Beginning October 1, 2014, Baker Hughes will provide a complete, detailed, and public listing of all chemical constituents for all wells that the company fractures using its hydraulic fracturing fluid products.

An unwinnable arms race

It seems that we Americans are engaged in an arms race with our own government. Bozeman is the latest to join in, with its recent acquisition of an armored vehicle:

[Image: Bozeman’s new armored vehicle]

Arms races are an instance of the escalation archetype, where generally the only winning strategy is not to play, but it’s particularly foolish to run an arms race against ourselves.

Here’s how it works:
[Diagram: weapon escalation structure]
The police (left) and citizens (right) each have stocks of weapons and associated skills and attitudes. Each “side” adjusts those stocks toward a desired level, which is set by various signals.

Citizens, for example, see media coverage of school shootings and less spectacular events, and arm themselves against their fellow citizens and against the eventuality of totalitarian government. A side effect of this is that, as the general availability of weapons increases, the frequency and scale of violent conflict increases, all else equal. This in itself reinforces the citizen perception of the need to arm.

The government (i.e. the police) respond to the escalation of violent conflict in their own locally rational way as well. They acquire heavy weapons and train tactical teams. But this has a number of side effects that further escalate conflict. Spending and training on paramilitary approaches necessarily comes at the expense of non-violent policing methods.
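
Here’s a minimal simulation of the escalation archetype, with toy parameters of my own: each side adjusts its stock toward a desired level, and each side’s desired level rises with the other side’s stock.

```python
# Escalation archetype: two stocks, each chasing a desired level that
# rises with the other side's stock (illustrative parameters only).
def escalate(years=20, dt=0.25, reactivity=0.6):
    police, citizens = 1.0, 1.0          # index of weapons/militarization
    adjustment_time = 2.0                # years to close the gap between desired and actual
    for _ in range(int(years / dt)):
        desired_police = 1.0 + reactivity * citizens    # respond to an armed citizenry
        desired_citizens = 1.0 + reactivity * police    # respond to militarized police
        police += dt * (desired_police - police) / adjustment_time
        citizens += dt * (desired_citizens - citizens) / adjustment_time
    return round(police, 1), round(citizens, 1)

print(escalate(reactivity=0.6))   # reactivity < 1: the arms race settles at an elevated level
print(escalate(reactivity=1.2))   # reactivity > 1: the reinforcing loop runs away
```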

Lester said he’s concerned about the potential overuse of such commanding vehicles among some police departments, a common criticism in the wake of the Ferguson protests. “When you bring that to the scene,” he said, “you bring an attitude that’s not necessarily needed.”

Accidents happen, and the mere availability of heavy armor encourages overkill, as we saw in Ferguson. And police departments are not immune to keeping up with the Joneses:

“For a community our size, we’re one of the last communities that does not have an armored rescue vehicle,” he said.

This structure is a nest of reinforcing feedback loops – I haven’t labeled them, because every loop above is positive, except the two inner loops in the acquisition/militarization stock control processes.

Strangely, this is happening at a time in which violent crime rates are trending down. This means that the driver of escalation must be more about perceptions and fear of potential harm than about actual shooting incidents.

Carrying the escalation to its conclusion, one of two things has to happen. The police win, and we have a totalitarian state. Or, the citizens win, and we have stateless anarchy. Neither outcome is really a “win.”

The alternative is to reverse the escalation, and make the reinforcing loops virtuous rather than vicious cycles. This is harder than it should be, because there’s a third party involved, that profits from escalation (red):
[Diagram: escalation structure with the arms-industry loop (red)]
Arms makers generate revenue from weapon sales and service, and reinvest it in marketing, to increase both parties’ desired weapons, and in lobbying to preserve the legality of assault weapons and fund the grant programs that enable small towns to have free armor.
[Diagram: escalation structure with voter engagement]
Fortunately, there is a remedy. Voters can (at least indirectly) fire the Bozeman officials who “forgot” to run the armored vehicle acquisition through any public process, and defund the Homeland (In)Security programs that bring heavy weapons to our doorsteps.

The difficult pill to swallow is that, for this to work, citizens have to de-escalate too. Reinstating the assault weapons ban is messy, and perhaps ineffective given the large stock of weapons now widely distributed. Maybe the first change should be cultural: recognizing that arming oneself to the teeth is a fear-driven antisocial response to our situation, and that ballots are a better solution than bullets.

The end is here

Facebook is down.

Runaway positive feedback is the culprit:

To make matters worse, every time a client got an error attempting to query one of the databases it interpreted it as an invalid value, and deleted the corresponding cache key. This meant that even after the original problem had been fixed, the stream of queries continued. As long as the databases failed to service some of the requests, they were causing even more requests to themselves. We had entered a feedback loop that didn’t allow the databases to recover.

The way to stop the feedback cycle was quite painful – we had to stop all traffic to this database cluster, which meant turning off the site. Once the databases had recovered and the root cause had been fixed, we slowly allowed more people back onto the site.

This got the site back up and running today, and for now we’ve turned off the system that attempts to correct configuration values. We’re exploring new designs for this configuration system following design patterns of other systems at Facebook that deal more gracefully with feedback loops and transient spikes.
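
A stylized sketch of the loop they describe (toy numbers of mine, not Facebook’s actual system): once error-triggered cache deletions generate more queries than the databases can serve, load feeds on itself until traffic is cut off.

```python
# Stylized version of the reinforcing loop in Facebook's postmortem:
# errors -> cache keys deleted -> more DB queries -> more errors.
# Toy parameters of my own, not Facebook's actual system.
capacity = 100.0          # queries/sec the database cluster can serve
base_load = 60.0          # normal query rate from cache misses
retry_multiplier = 3.0    # extra queries generated per failed query (cache key deleted)

load = base_load
for second in range(12):
    if second == 3:
        load += 80.0                            # transient spike from the bad config values
    errors = max(0.0, load - capacity)          # requests the cluster can't serve
    load = base_load + retry_multiplier * errors
    print(f"t={second}: load {load:.0f} qps, errors {errors:.0f} qps")

# Below capacity, nothing happens. But any spike past capacity creates errors,
# which create more load, which creates more errors -- growth that only stops
# when traffic is cut off, as in the postmortem.
```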

It’s faintly ironic, since positive feedback of a different sort is responsible for Facebook’s success.

Reflections on Virgin Earth

Colleagues just pointed out the Virgin Earth Challenge, “a US$25 million prize for an environmentally sustainable and economically viable way to remove greenhouse gases from the atmosphere.”

John Sterman writes:

I think it inevitable that we will see more and more interest in CO2 removal. And IF it can be done without undermining mitigation I’d be all for it. I do like biochar as a possibility; though I am very skeptical of direct air capture and CCS. But the IF in the prior sentence is clearly not true: if there were effective removal technology it would create moral hazard leading to less mitigation and more emissions.

Even more interesting, direct air capture is not thermodynamically favored; needs lots of energy. All the finalists claim that they will use renewable energy or “waste” heat from other processes to power their removal technology, but how about using those renewable sources and waste heat to directly offset fossil fuels and reduce emissions instead of using them to power less efficient removal processes? Clearly, any wind/solar/geothermal that is used to power a removal technology could have been used directly to reduce fossil emissions, and will be cheaper and offset more net emissions. Same for waste heat unless the waste heat is too low temp to be used to offset fossil fuels. Result: these capture schemes may increase net CO2 flux into the atmosphere.

Every business knows it’s always better to prevent the creation of a defect than to correct it after the fact. No responsible firm would say “our products are killing the customers; we know how to prevent that, but we think our money is best spent on settling lawsuits with their heirs.” (Oh: GM did exactly that, and look how it is damaging them). So why is it ok for people to say “fossil fuel use is killing us; we know how to prevent that, but we’ve decided to spend even more money to try to clean up the mess after the pollution is already in the air”?

To me, many of these schemes reflect a serious lack of systems thinking, and the desire for a technical solution that allows us to keep living the way we are living without any change in our behavior. Can’t work.

I agree with John, and I think there are some additional gaps in systemic thinking about these technologies. Here are some quick reflections, in pictures.

[Diagram: emitting vs. capturing]

A basic point for any system is that you can lower the level of a stock (all else equal) by reducing the inflow or increasing the outflow. So the idea of capturing CO2 is not totally bonkers. In fact, it lets you do at least one thing that you can’t do by reducing emissions. When emissions fall to 0, there’s no leverage to reduce CO2 in the atmosphere further. But capture could actively draw down the CO2 stock. However, we are very far from 0 emissions, and this is harder than it seems:

[Diagram: air capture pushback from natural sinks]

Natural sinks have been graciously absorbing roughly half of our CO2 emissions for a long time. If we reduce emissions dramatically, and begin capturing, nature will be happy to give us back that CO2, ton for ton. So, the capture problem is actually twice as big as you’d think from looking at the excess CO2 in the atmosphere.
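
Here’s a minimal stock-and-flow caricature of both points (a toy two-box sketch with made-up units and parameters; the only behavior borrowed from above is that natural sinks currently soak up roughly half of emissions):

```python
# Toy two-box carbon sketch (atmosphere + ocean/biosphere exchange), not a
# real carbon-cycle model; units and parameters are illustrative only.
def scenario(emissions, capture, years=50):
    atm, sinks = 1000.0, 1000.0       # "excess" carbon stocks (arbitrary units)
    k = 0.1                           # exchange rate between the boxes, 1/yr (assumed)
    for _ in range(years):
        exchange = k * (atm - sinks)  # flows toward whichever box holds less excess
        atm += emissions - capture - exchange
        sinks += exchange
    return round(atm), round(sinks)

print(scenario(emissions=10, capture=0))   # ongoing emissions: sinks absorb roughly half
print(scenario(emissions=0, capture=10))   # capture-only: the ocean/biosphere gives its
                                           # share back, so the atmosphere falls by only
                                           # about half of the gross amount captured
```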

Currently, there’s also a problem of scale. Emissions are something like two orders of magnitude larger than potential markets for CO2, so there’s a looong way to go. And capture doesn’t scale like a service running on Amazon Elastic Compute Cloud servers; it’s bricks and mortar.
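
Some round numbers to put that in perspective (ballpark figures I’m supplying, not data from the post):

```python
# Order-of-magnitude check on the scale gap (round figures of my own;
# treat them as ballpark, not data).
global_co2_emissions = 35e9     # tons CO2 per year, roughly
merchant_co2_market = 0.2e9     # tons CO2 per year used commercially, roughly
print(f"emissions are ~{global_co2_emissions / merchant_co2_market:.0f}x commercial CO2 demand")
# ~175x, i.e. roughly two orders of magnitude
```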

[Diagram: the scale mismatch between emissions and CO2 markets]

And where does that little cloud go, anyway? Several proposals gloss over this, as in:

The process involves a chemical solution (that naturally absorbs CO2) being brought into contact with the air. This solution, now containing the captured CO2, is sent through a regeneration cycle which simultaneously extracts the CO2 as a high-pressure pipeline-quality product (ready to be put to numerous commercial uses) …

The biggest commercial uses I know of are beverage carbonation and enhanced oil recovery (EOR). Consider the beverage system:

[Diagram: the beverage CO2 stock and flows]

CO2 sequestered in beverages doesn’t stay there very long! You’d have to start stockpiling vast quantities of Coke in salt mines to accumulate a significant quantity. This reminds me of Nike’s carbon-sucking golf ball. EOR is just as bad, because you put CO2 down a hole (hopefully it stays there), and oil and gas come back up, which are then burned … emitting more CO2. Fortunately the biochar solutions do not suffer so much from this problem.
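
A quick back-of-envelope on the beverage “sink” (round numbers of my own, just for scale):

```python
# Back-of-envelope: could beverage carbonation matter as a sink?
# Round numbers of my own (ballpark only, not from the post).
co2_per_liter = 6.0                 # grams of CO2 dissolved in a liter of soda, roughly
one_gigaton = 1e15                  # grams of CO2
liters = one_gigaton / co2_per_liter
print(f"{liters:.1e} liters of soda to hold one GtCO2")       # ~1.7e14 liters
print(f"that's about {liters / 1e12:.0f} cubic kilometers")   # 1 km^3 = 1e12 liters
# ...of permanently sealed soda, per gigaton -- while annual emissions are
# tens of gigatons, and the CO2 escapes as soon as the cap comes off.
```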

Next up, delays and moral hazard:

[Diagram: the moral hazard link in the mitigation/capture control system]

This is a cartoonish view of the control system driving mitigation and capture effort. The good news is that air capture gives us another negative loop (blue, top) by which we can reduce CO2 in the atmosphere. That’s good, especially if we mismanage the green loop. The moral hazard side effect is that the mere act of going through the motions of capture R&D reduces the perceived scale of the climate problem (red link), and therefore reduces mitigation, which actually makes the problem harder to solve.

Capture also competes with mitigation for resources, as in John’s process heat example:

[Diagram: capture competing with mitigation for process heat]

It’s even worse than that, because a lot of mitigation efforts have fairly rapid effects on emissions. There are certainly long-lived aspects of energy and infrastructure that must be considered, but behavior can change a lot of emissions quickly and with off-the-shelf technology. The delay between air capture R&D and actual capturing, on the other hand, is bound to be fairly long, because it’s in its infancy, and has to make it through multiple discover/develop/deploy hurdles.

One of those hurdles is cost. Why would anyone bother to pay for air capture, especially in cases where it’s a sure loser in terms of thermodynamics and capital costs? Altruism is not a likely candidate, so it’ll take a policy driver. There are essentially two choices: standards and emissions pricing.

A standard might mandate (as the EPA and California have) that new power plants above a certain emissions intensity must employ some kind of offsetting capture. If coal wants to stay in business, it has to ante up. The silly thing about this, apart from inevitable complexity, is that any technology that meets the standard without capture, like combined cycle gas electricity currently, pays 0 for its emissions, even though they too are harmful.

Similarly, you could place a subsidy or bounty on tons of CO2 captured. That would be perverse, because taxpayers would then have to fund capture – not likely a popular measure. The obvious alternative would be to price emissions in general – positive for emissions, negative for capture. Then all sources and sinks would be on a level playing field. That’s the way to go, but of course we ought to do it now, so that mitigation starts working, and air capture joins in later if and when it’s a viable competitor.
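
To make the level-playing-field point concrete, here’s a tiny comparison with hypothetical round numbers for intensities, the threshold, and the price (none of them from actual regulation): under an intensity standard, anything below the threshold pays nothing for its remaining emissions; under an emissions price, every ton emitted pays and every net ton captured earns.

```python
# Illustrative comparison of a performance standard vs. an emissions price.
# Emission intensities and the $ figures are hypothetical round numbers,
# chosen only to show the incentive structure, not actual regulation.
sources = {                     # tons CO2 per MWh (illustrative)
    "coal": 1.0,
    "coal + capture": 0.4,
    "combined-cycle gas": 0.4,
    "wind + air capture": -0.1,
}

standard = 0.5                  # tCO2/MWh threshold (hypothetical)
price = 50                      # $/tCO2 (hypothetical)

for name, intensity in sources.items():
    under_standard = "must add capture" if intensity > standard else "pays nothing"
    under_price = price * intensity          # negative = payment for net capture
    print(f"{name:>20}: standard -> {under_standard:16} price -> ${under_price:+.0f}/MWh")
```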

I think it’s fine if people work on carbon capture and sequestration, as long as they don’t pretend that it’s anywhere near a plausible scale, or even remotely possible without comprehensive changes in incentives. I won’t spend my own time on a speculative, low-leverage policy when there are more effective, immediate and cheaper mitigation alternatives. And I’ll certainly never advise anyone to pursue a geoengineered world, any more than I’d advise them to keep smoking but invest in cancer research.

Climate Interactive – #12 climate think tank

Climate Interactive is #12 (out of 210) in the International Center for Climate Governance’s Standardized Ranking of climate think tanks (by per capita productivity):

  1. Woods Hole Research Center (WHRC)
  2. Basque Centre for Climate Change (BC3)
  3. Centre for European Policy Studies (CEPS)*
  4. Centre for European Economic Research (ZEW)*
  5. International Institute for Applied Systems Analysis (IIASA)
  6. Worldwatch Institute
  7. Fondazione Eni Enrico Mattei (FEEM)
  8. Resources for the Future (RFF)
  9. Mercator Research Institute on Global Commons and Climate Change (MCC)
  10. Centre International de Recherche sur l’Environnement et le Développement (CIRED)
  11. Institut Pierre Simon Laplace (IPSL)
  12. Climate Interactive
  13. The Climate Institute
  14. Buildings Performance Institute Europe (BPIE)
  15. International Institute for Environment and Development (IIED)
  16. Center for Climate and Energy Solutions (C2ES)
  17. Global Climate Forum (GCF)
  18. Potsdam Institute for Climate Impact Research (PIK)
  19. Sandbag Climate Campaign
  20. Civic Exchange

That’s some pretty illustrious company! Congratulations to all at CI.