Androids rule the earth

Android activations are apparently growing 4.4% per week, from a base of around 100 million sales per year.

By the rule of 72 for exponential growth, that means sales are doubling every 16 weeks, or about three times per year.

If sales are growing exponentially, the installed base is also growing exponentially (because the integral of e^x is e^x). Half of the accumulated sales occur in the most recent doubling (because the geometric series 1+2+4+8+…+n sums to 2n−1, where n is the last term), so the integrated unit sales are roughly one doubling (16 weeks) ahead of the interval sales.

Extrapolating, there’s an Android for everyone on the planet in two years (6 doublings, or a factor of 64 increase).

Extrapolating a little further, sales equal the mass of the planet by about 2030 (ln(10^25/10^8)/ln(2)/3 = 19 years).
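Writing out the arithmetic behind these figures (everything here follows from the 4.4%/week growth rate and the ~10^8 units/year base above):

```latex
\begin{align*}
\text{doubling time} &\approx \tfrac{72}{4.4} \approx 16 \text{ weeks}
  \;\Rightarrow\; \tfrac{52}{16} \approx 3 \text{ doublings per year} \\
\text{last-doubling share} &:\; 1+2+4+\dots+2^{n} = 2^{n+1}-1 \approx 2\cdot 2^{n}
  \;\Rightarrow\; \text{half the cumulative total arrives in the latest doubling} \\
\text{two years} &= 6 \text{ doublings} \;\Rightarrow\; 2^{6}=64
  \;\Rightarrow\; 10^{8}\times 64 \approx 6.4\times 10^{9} \text{ units/year} \\
\text{planet-mass point} &:\; \frac{\ln\!\left(10^{25}/10^{8}\right)}{\ln 2} \approx 56 \text{ doublings}
  \;\Rightarrow\; 56/3 \approx 19 \text{ years} \approx 2030
\end{align*}
```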

Limits? What limits?

Delayed negative feedback on the financial crisis

The wheels of justice grind slowly, but they grind exceedingly fine:

Too Big to Fail or Too Big to Change

While the SEC has reached several settlements in connection with misconduct related to the financial meltdown, those settlements have been characterized as “cheap,” “hollow,” “bloodless,” and merely “cosmetic,” as noted by Columbia University law professor John C. Coffee in a recent article. Moreover, one of the SEC’s own Commissioners, Luis Aguilar, has recently admitted that the SEC’s penalty guidelines are “seriously flawed” and have “adversely impact[ed]” civil enforcement actions.

For example, Judge Jed Rakoff castigated the SEC for its attempted settlement of charges that Bank of America failed to disclose key information to investors in connection with its acquisition of Merrill Lynch (“Merrill”), including that Merrill was on the brink of insolvency (necessitating a massive taxpayer bailout), and that Bank of America had entered into a secret agreement to allow Merrill to pay its executives billions of dollars in bonuses prior to the close of the merger regardless of Merrill’s financial condition. The SEC agreed to settle its action against Bank of America for $33 million in August 2009, even though its acquisition of Merrill resulted in what The New York Times characterized as “one of the greatest destructions of shareholder value in financial history.” In rejecting the deal, Judge Rakoff declared that the proposed settlement was “misguided,” “inadequate” and failed to “comport with the most elementary notions of justice and morality.” …

It has increasingly fallen to institutional investors to hold mortgage lenders, investment banks and other large financial institutions accountable for their role in the mortgage crisis by seeking redress for shareholders injured by corporate misconduct and sending a powerful message to executives that corporate malfeasance is unacceptable. For example, sophisticated public pension funds are currently prosecuting actions involving billions of dollars of losses against Bank of America, Goldman Sachs, JPMorgan Chase, Lehman Brothers, Bear Stearns, Wachovia, Merrill Lynch, Washington Mutual, Countrywide, Morgan Stanley and Citigroup, among many others. In some instances, litigations have already resulted in significant recoveries for defrauded investors.

Historically, institutional investors have achieved impressive results on behalf of shareholders when compared to government-led suits. Indeed, since 1995, SEC settlements comprise only 5 percent of the monetary recoveries arising from securities frauds, with the remaining 95 percent obtained through private litigation ….

I think the problem here is that litigation works slowly. It’s not clear that punitive legal outcomes occur on a relevant time scale. Once bonuses have been paid and leaders have moved on, there are no heads left to roll, so organizations may only learn that they’d better have good lawyers.

Summer driving is an emergency?

A coordinated release of emergency oil stockpiles is underway. It’s almost as foolish as that timeless chain email, the Great American Gasout (now migrated to Facebook, it seems), and for the same stock-flow reasons.

Like the Gasout, strategic reserve operations don’t do anything about demand; they just shuffle supply around in time. Releasing oil does increase supply, by adding stored barrels on top of current production, which causes a short-term price break. But at some point you have to refill the reserve. All else equal, storing oil has to come at the expense of producing it for consumption, which means that price goes back up at some other time.

The implicit mental model here is that governments are going to buy low and sell high, releasing oil at high prices when there’s a crisis, and storing it when peaceful market conditions return. I rather doubt that political entities are very good at such things, but more importantly, where are the prospects for cheap refills, given tight supplies, strategic behavior by OPEC, and (someday) global recovery? It’s not even clear that agencies were successful at keeping the release secret, so a few market players may have captured a hefty chunk of the benefits of the release.

Setting dynamics aside, the strategic reserve release is hardly big enough to matter – the 60 million barrels planned isn’t even a day of global production. It’s only 39 days of Libyan production. Even if you have extreme views on price elasticity, that’s not going to make a huge difference – unless the release is extended. But extending the release through the end of the year would consume almost a quarter of world strategic reserves, without any clear emergency at hand.
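A quick check on the magnitudes (the ~88 million barrels/day figure for world output is my rough approximation; the Libyan figure is simply what the comparison above implies):

```latex
\begin{align*}
\text{implied pre-war Libyan output} &\approx \frac{60\ \text{Mbbl}}{39\ \text{days}} \approx 1.5\ \text{Mbbl/day} \\
\text{share of one day's world output} &\approx \frac{60\ \text{Mbbl}}{\sim 88\ \text{Mbbl/day}} \approx 0.7\ \text{days}
\end{align*}
```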

We should be saving those reserves for a real rainy day, and increasing the end-use price through taxes, to internalize environmental and security costs and recapture OPEC rents.

The overconfidence of nuclear engineers

Rumors that the Fort Calhoun nuclear power station is subject to a media blackout appear to be overblown, given that the NRC is blogging the situation.

Apparently floodwaters at the plant were at 1006 feet ASL yesterday, which is a fair margin from the 1014 foot design standard for the plant. That margin might have been a lot less, if the NRC hadn’t cited the plant for design violations last year, which it estimated would lead to certain core damage at 1010 feet.

Still, engineers say things like this:

“We have much more safety measures in place than we actually need right now,” Jones continued. “Even if the water level did rise to 1014 feet above mean sea level, the plant is designed to handle that much water and beyond. We have additional steps we can take if we need them, but we don’t think we will. We feel we’re in good shape.” – suite101

The “and beyond” sounds like pure embellishment. The design flood elevation for the plant is 1014 feet. I’ve read some NRC documents on the plant, and there’s no other indication that higher design standards were used. Presumably there are safety margins in systems, but those are designed to offset unanticipated failures, e.g. from design deviations like those discovered by the NRC. Surely the risk of unanticipated problems would rise dramatically above the maximum anticipated flood level of 1014 feet.
Overconfidence is a major contributor to accidents in complex systems. How about a little humility?

Currently the Missouri River forecast is pretty flat, so hopefully we won’t test the limits of the plant design.

Setting up Vensim compiled simulation on Windows

If you don’t use Vensim DSS, you’ll find this post rather boring and useless. If you do, prepare for heart-pounding acceleration of your big model runs:

  • Get Vensim DSS.
  • Get a C compiler. Most flavors of Microsoft compilers are compatible; MS Visual C++ 2010 Express is a good choice (and free). You could probably use gcc, but I’ve never set it up. I’ve heard reports of issues with 2005 and 2008 versions, so it may be worth your while to upgrade.
  • Install Vensim, if you haven’t already, being sure to check the Install external function and compiled simulation support box.
  • Launch the program and go to Tools>Options…>Startup and set the Compiled simulation path to C:\Documents and Settings\All Users\Vensim\comp32 (WinXP) or C:\Users\Public\Vensim\comp32 (Vista/7).
    • Check your mdl.bat in the location above to be sure that it points to the right compiler. This is a simple matter of checking that all options are commented out with “REM ” statements, except the one you’re using; a sketch of what that looks like follows this list.
  • Move to the Advanced tab and set the compilation options to Query or Compile (you may want to skip this for normal Simulation, and just do it for Optimization and Sensitivity, where speed really counts).
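Here is a minimal sketch of the REM pattern described in the mdl.bat step above. The file names below are placeholders for illustration, not the actual contents of the mdl.bat that ships with Vensim; open your own copy to see the real options.

```bat
REM mdl.bat -- selects which compiler Vensim uses for compiled simulation
REM Placeholder option names; your file will list the compilers Vensim actually supports.

REM call vc2005_setup.bat %1
REM call vc2008_setup.bat %1
call vc2010_setup.bat %1
```

The point is simply that exactly one line is active; everything else stays behind a REM.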

This is well worth the hassle if you’re working with a large model in SyntheSim or doing a lot of simulations for sensitivity analysis and optimization. The speedup is typically 4-5x.

Elk, wolves and dynamic system visualization

Bret Victor’s video of a slick iPad app for interactive visualization of the Lotka-Volterra equations has been making the rounds.

Coincidentally, this came to my notice around the same time that I got interested in the debate over wolf reintroduction here in Montana. Even simple models say interesting things about wolf-elk dynamics, which I’ll write about some other time (I need to get vaccinated for rabies first).

To ponder the implications of the video and predator-prey dynamics, I built a version of the Lotka-Volterra model in Vensim.

After a second look at the video, I still think it’s excellent. Victor’s two design principles, ubiquitous visualization and in-context manipulation, are powerful for communicating a model. Some aspects of what’s shown have been in Vensim since the introduction of SyntheSim a few years ago, though with less Tufte/iPad sexiness. But other features, like Causal Tracing, are not so easily discovered – they’re effective for pros, but not new users. The way controls appear at one’s fingertips in the iPad app is very elegant. The “sweep” mode is also clever, so I implemented a similar approach (randomized initial conditions across an array dimension) in my version of the model. My favorite trick, though, is the 2D control of initial conditions via the phase diagram, which makes discovery of the system’s equilibrium easy.
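For reference, the equilibrium that the phase-plane exploration reveals is where both rates vanish. With the standard parameterization (prey x, predators y, prey growth rate α, predation rate β, predator efficiency δ, predator mortality γ):

```latex
\frac{dx}{dt} = \alpha x - \beta x y = 0, \qquad
\frac{dy}{dt} = \delta x y - \gamma y = 0
\;\;\Longrightarrow\;\;
x^{*} = \frac{\gamma}{\delta}, \quad y^{*} = \frac{\alpha}{\beta}
```

Away from that point, trajectories orbit it in closed cycles, which is what makes the 2D initial-condition control such a nice way to explore the system.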

The slickness of the video has led some to wonder whether existing SD tools are dinosaurs. From a design standpoint, I’d agree in some respects, but I think SD has also developed many practices – only partially embodied in tools – that address learning gaps that aren’t directly tackled by the app in the video.

The future

IBM was founded a hundred years ago today. Its stock has appreciated by a factor of 40 since 1962 (about 5 doublings in 50 years, or roughly 7%/yr).

Perhaps more importantly, the Magna Carta turned 796 yesterday. It was a major milestone in the long ascent of the rule of law and civil liberties.

What will the next century and millennium bring?

Lotka-Volterra predator-prey system

The Lotka-Volterra equations, which describe a predator-prey system, must be among the most famous of dynamic systems. There are many generalizations and applications outside of biology.

Wikipedia has a nice article, which I used as the basis for this simple model.
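For reference, the standard form of the equations (as given on that Wikipedia page), with prey x, predators y, prey growth rate α, predation rate β, predator efficiency δ, and predator mortality γ:

```latex
\frac{dx}{dt} = \alpha x - \beta x y, \qquad
\frac{dy}{dt} = \delta x y - \gamma y
```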

Et tu, EJ?

I’m not a cap & trade fan, but I find it rather bizarre that the most successful opposition to California’s AB32 legislation comes from the environmental justice (EJ) movement, on the grounds that cap & trade might make emissions go up in areas that are already disadvantaged, and that the Air Resources Board (ARB) failed to adequately consider alternatives like a carbon tax.

I think carbon taxes did get short shrift in the AB32 design. Taxes were a second-place favorite among economists in the early days, but ultimately the MAC analysis focused on cap & trade, because it provided environmental certainty needed to meet legal targets (oops), but also because it was political suicide to say “tax” out loud at the time.

While cap & trade has issues with dynamic stability, allocation wrangling and complexity, it’s hard to imagine any way that those drawbacks would change the fundamental relationship between the price signal’s effect on GHGs and its effect on criteria air pollutants. In fact, GHGs and other pollutant emissions are highly correlated, so it’s quite likely that cap & trade will have ancillary benefits from other pollutant reductions.

To get specific, think of large point sources like refineries and power plants. For the EJ argument to make sense, you’d have to think that emitters would somehow meet their greenhouse compliance obligations by increasing their emissions of nastier things, or at least concentrating them all at a few facilities in disadvantaged areas. (An analogy might be removing catalytic converters from cars to increase efficiency.) But this can’t really happen, because the air quality permitting process is not superseded by the cap & trade system. In the long run, it’s also inconceivable that it could occur, because there’s no way you could meet compliance obligations for deep cuts by increasing emissions. A California with 80% cuts by 2050 isn’t going to have 18 refineries, and therefore it’s not going to emit as much.

The ARB concludes as much in a supplement to the AB32 scoping plan, released yesterday. It considers alternatives to cap & trade. There’s some nifty stuff in the analysis, including a table of existing emissions taxes (page 89).

It seems that ARB has tilted the playing field a bit by evaluating a dumb tax, i.e. one that doesn’t adapt its price level to meet environmental objectives without legislative intervention, and by heightening leakage concerns that strike me as equally applicable to cap & trade. But they do raise legitimate legal concerns – a tax is not a legal option for ARB without a vote of the legislature, which would likely fail because it requires a supermajority, and tax-equivalent fees are a dubious proposition.

If there’s no Plan B alternative to cap & trade, I wonder what the EJ opposition was after. Surely failure to address emissions is not compatible with a broad notion of justice.

Hand over your cell phones

Adam Frank @NPR says, “Science Deniers: Hand Over Your Cellphones!”

I’m sympathetic to the notion that attitudes toward science are often a matter of ideological convenience rather than skeptical reasoning. However, we don’t have a cell phone denial problem. Why? I think it helps to identify the factors that contribute to the circumstances in which denial occurs:

  • Non-experimental science (reliance on observations of natural experiments; no controls or randomized assignment)
  • Infrequent replication (few examples within the experience of an individual or community)
  • High noise (more specifically, low signal-to-noise ratio)
  • Complexity (nonlinearity, integrations or long delays between cause and effect, multiple agents, emergent phenomena)
  • “Unsalience” (you can’t touch, taste, see, hear, or smell the variables in question)
  • Cost (there’s some social or economic penalty imposed by the policy implications of the theory)
  • Commons (the risk of being wrong accrues to society more than the individual)

It’s easy to believe in radio waves used by cell phones, or the general relativity corrections built into GPS, because their only problematic feature is invisibility. Calling grandma is a pretty compelling experiment, which one can repeat as often as needed to dispel any doubts about those mysterious electromagnetic waves.

At one time, the debate over the structure of the solar system was subject to these problems. There was a big social cost to believing the heliocentric model (the Inquisition), and little practical benefit to being right. Theory relied on observations that were imprecise and not salient to the casual observer. Now that we have low-noise observations, replicated experiments (space probe launches), and so on, there aren’t too many geocentrists around.

Climate, on the other hand, has all of these problems. Of particular importance, the commons and long-time-scale aspects of the problem shelter individuals from selection pressure against wrong beliefs.