A billion prices

Econbrowser has an interesting article on the Billion Prices Project, which looks for daily price movements on items across the web. This yields a price index that’s free of quality change assumptions, unlike hedonic CPI measures, but introduces some additional issues due to the lack of control over the changing portfolio of measured items.

A couple of years ago we built the analytics behind the RPX index of residential real estate prices, and grappled with many of the same problems. The competition was the CSI – the Case-Shiller index – which uses the repeat-sales method. With that approach, every house serves as its own control, so changes in neighborhoods or other quality aspects wash out. However, the clever statistical control introduces some additional problems. First, it reduces the sample of viable data points, necessitating a 3x longer reporting lag. Second, the processing steps reduce transparency. Third, one step in particular downweights homes with (possibly implausibly) large price movements, which may have the side effect of reducing sensitivity to real extreme events. Fourth, users may actually want to see the effects of a changing sales portfolio.
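For concreteness, here's a minimal sketch of the repeat-sales idea, in the spirit of the Bailey-Muth-Nourse regression it descends from; the real S&P/Case-Shiller method adds interval weighting and other refinements, and the data here are invented:

```python
import numpy as np

# Each record: (period of first sale, period of second sale,
# first price, second price) for the same house. Invented data.
pairs = [
    (0, 2, 200_000, 220_000),
    (1, 3, 150_000, 171_000),
    (0, 3, 300_000, 345_000),
    (2, 3, 180_000, 189_000),
]
n_periods = 4

# For a pair sold in periods s and t: log(P_t / P_s) = b_t - b_s.
# Pin b_0 = 0 as the base period and solve by least squares.
X = np.zeros((len(pairs), n_periods - 1))
y = np.empty(len(pairs))
for i, (s, t, p0, p1) in enumerate(pairs):
    if s > 0:
        X[i, s - 1] = -1.0
    X[i, t - 1] = 1.0
    y[i] = np.log(p1 / p0)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
index = 100 * np.exp(np.concatenate(([0.0], b)))  # base period = 100
print(np.round(index, 1))
```

Each house is its own control, but note the cost: only twice-sold homes enter the sample, which is the root of the lag problem above.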

For the RPX, we chose instead a triple power law (TPL) estimate, ignoring quality and mix issues entirely. The TPL is basically a robust measure of the central tendency of prices. It’s not too different from the median, except that it provides some diagnostics of data quality issues from the distribution of the tails. The payoff is a much more responsive index, which can be reported daily with a short lag. We spent a lot of time comparing the RPX to the CSI, and found that, while changes in quality and mix of sales could matter in principle, in practice the two approaches yield essentially the same answer, even over periods of years. My (biased) inclination, therefore, is to prefer the RPX approach. Your mileage may vary.
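Radar Logic's actual TPL machinery is proprietary and more elaborate than anything I can show here, but a toy of the same flavor – a robust center plus tail diagnostics on a day's price-per-square-foot observations – might look like this (the function, tail fraction, and fitting method are all my inventions):

```python
import numpy as np

def daily_price_summary(ppsf, tail_frac=0.1):
    """Toy robust summary of one day's price-per-sq-ft observations:
    median as the central tendency, plus crude power-law tail slopes
    as a data-quality diagnostic (odd slopes flag odd data)."""
    x = np.sort(np.asarray(ppsf, dtype=float))
    k = max(int(len(x) * tail_frac), 3)

    # Upper-tail exponent: slope of log rank vs. log price (Zipf plot).
    alpha_hi = -np.polyfit(np.log(x[-k:]), np.log(np.arange(k, 0, -1)), 1)[0]
    # Lower-tail exponent: same idea from the bottom of the distribution.
    alpha_lo = np.polyfit(np.log(x[:k]), np.log(np.arange(1, k + 1)), 1)[0]

    return {"median": np.median(x), "alpha_hi": alpha_hi, "alpha_lo": alpha_lo}

rng = np.random.default_rng(0)
print(daily_price_summary(rng.lognormal(mean=5.0, sigma=0.4, size=2000)))
```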

One interesting lesson for me from the RPX project was that traders don’t want models. We went in thinking that sophisticated dynamics coupled to data would be a winner. Maybe it is, but people want their own sophisticated dynamics. They wanted us to provide only a datastream that maximized robustness and transparency, and minimized lag. Those are sensible design principles. But I have to wonder whether a little dynamic insight would have been useful as well, since, after all, many data consumers evidently did not have an adequate model of the housing market.

Painting ourselves into a green corner

At the Green California Summit & Expo this week, I saw a strange sight: a group of greentech manufacturers hanging out in the halls, griping about environmental regulations. Their point? That a surfeit of command-and-control measures makes compliance such a lengthy and costly process that it’s hard to bring innovations to market. That’s a nice self-defeating outcome!

Consider this situation:

[Figure: greenCorner – technologies a–e plotted by toxics and energy use, with red (too toxic) and blue (too inefficient) banned regions]
I was thinking of lighting, but it could be anything. Letters a-e represent technologies with different properties. The red area is banned as too toxic. The blue area is banned as too inefficient. That leaves only technology a. Maybe that’s OK, but what if a is made in Cuba, or emits harmful radiation, or doesn’t work in cold weather? That’s how regulations get really complicated and laden with exceptions. Also, if we revise our understanding of toxics, how should we update this to reflect the tradeoffs between toxics in the bulb and toxics from power generation, or using less toxic material per bulb vs. using fewer bulbs? Notice that the only feasible option here – a – is not even on the efficient frontier; a mix of e and b could provide the same light with slightly less power and toxics.
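To make that last point concrete, here are invented (power, toxics) coordinates per unit of light, consistent with the figure's story:

```python
# Invented coordinates: a escapes both bans; b is banned as too toxic,
# e as too inefficient.
a, b, e = (1.00, 0.60), (0.60, 0.90), (1.30, 0.20)

# A 50/50 mix of b and e delivers the same light as a...
mix = (0.5 * b[0] + 0.5 * e[0], 0.5 * b[1] + 0.5 * e[1])
print(tuple(round(v, 2) for v in mix))  # (0.95, 0.55): beats a on both axes
```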

Proliferation of standards creates a situation with high compliance costs, both for manufacturers and the bureaucracy that has to administer them. That discourages small startups, leaving the market for large firms, which in turn creates the temptation for the incumbents to influence the regulations in self-serving ways. There are also big coverage issues: standards have to be defined clearly, which usually means that there are fringe applications that escape regulation. Refrigerators get covered by Energy Star, but undercounter icemakers and other cold energy hogs don’t. Even when the standards work, lack of a price signal means that some of their gains get eaten up by rebound effects. When technology moves on, today’s seemingly sensible standard becomes part of tomorrow’s “dumb laws” chain email.

The solution is obviously not total laissez faire; then the environmental goals just don’t get met. There probably are some things that are most efficient to ban outright (but not the bulb), but for most things it would be better to impose upstream prices on the problems – mercury, bisphenol A, carbon, or whatever – and let the market sort it out. Then providers can make tradeoffs the way they usually do – which package of options makes the cheapest product? – without a bunch of compliance risk involved in bringing their product to market.

Here’s the alternative scheme:

[Figure: greenTradeoffs – the same technologies a–e with green and orange isocost lines for two different price schemes]

The green and orange lines represent isocost curves for two different sets of energy and toxic prices. If the unit prices of a-e were otherwise the same, you’d choose b with the green pricing scheme (cheap toxics, expensive energy) and e in the opposite circumstance (orange). If some of the technologies are uniquely valuable in some situations, pricing also permits that tradeoff – perhaps c is not especially efficient or clean, but has important medical applications.
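The same selection logic in a few lines, reusing the invented coordinates from the sketch above and adding made-up ones for c and d, with made-up prices that echo the figure:

```python
tech = {           # (energy use, toxics) per unit of light, invented
    "a": (1.00, 0.60),
    "b": (0.60, 0.90),
    "c": (1.10, 1.00),
    "d": (0.90, 1.20),
    "e": (1.30, 0.20),
}
schemes = {"green": (1.0, 0.2), "orange": (0.2, 1.0)}  # (p_energy, p_toxics)

for name, (pe, pt) in schemes.items():
    cost = {k: pe * energy + pt * tox for k, (energy, tox) in tech.items()}
    print(name, "->", min(cost, key=cost.get))  # green -> b, orange -> e
```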

With a system driven by prices and values, we could have very simple conversations about adaptive environmental control. Are NOx levels acceptable? If not, raise the price of emitting NOx until they are. End of discussion.
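A toy version of that adaptive loop, with an invented emissions response and invented gains, just to show the mechanism:

```python
target = 100.0   # acceptable NOx emissions level (invented units)
price = 10.0     # $/ton emitted
gain = 0.5       # fractional price adjustment per unit of relative excess

def emissions(p):
    # Hypothetical response: emitters abate more as the price rises.
    return 500.0 * p ** -0.5

for year in range(20):
    excess = (emissions(price) - target) / target
    price = max(price * (1 + gain * excess), 0.01)

print(round(price, 1), round(emissions(price), 1))  # settles near 25, 100
```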

Two related tidbits:

Federal green-buildings guru Kevin Kampschroer gave an interesting talk on the GSA’s greening efforts. He expressed hope that we could move from LEED (checklists) to LEEP (performance-based ratings).

I heard from a lighting manufacturer that the cost of making a CFL is under a buck, but running a recycling program (for mercury recapture) costs $1.50/bulb. There must be a lot of markup in the distribution channels to get them up to retail prices.

Real Estate Roundup

Ira Artman takes a look at residential real estate price indices – S&P/Case-Shiller (CSI), OFHEO, and RPX. The RPX comes out on top, for (marginally) better correlation with foreclosures and, more importantly, a much shorter reporting lag than CSI. This is cause for minor rejoicing, as we at Ventana helped create the RPX and are affiliated with Radar Logic. Perhaps most importantly, rumor has it that there’s more trading volume on RPX.

In spite of the lag it introduces, the CSI repeat-sales regression is apparently sexy to economists. Calculated Risk has been using it to follow developments in prices and price/rent ratios. Econbrowser today looks at the market bottom, as predicted by CSI forward contracts on CME. You can find similar forward curves in Radar’s monthly analysis. As of today, both RPX and CSI futures put the bottom of the market in Nov/Dec 2010, another 15% below current prices. Interestingly, the RPX forward curve looks a little more pessimistic than CSI – an arbitrage opportunity, if you can find the liquidity.

Artman notes that somehow the Fed, in its flow of funds reporting, missed most of the housing decline until after the election.

More Oil Price Forecasts

The history of long-term energy forecasting is a rather mixed bag. Supply and demand forecasts have generally been half decent in terms of percent error, but that’s primarily because GDP growth is steady, energy intensity is price-inelastic, and there’s a lot of momentum in energy-consuming and energy-producing capital. Energy price forecasts, on the other hand, have generally been terrible. Consider the Delphi panel forecasts conducted by the California Energy Commission (CEC):

[Figure: California Energy Commission Delphi price forecasts]

In 1988, John Sterman showed that energy forecasts, even those using sophisticated models, were well represented by a simple adaptive rule.
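Roughly, such a rule smooths the reported value, perceives a growth trend in it, and extrapolates that trend forward. Here's a toy sketch along those lines; the parameters and exact functional form are my guesses, not Sterman's published formulation:

```python
def adaptive_forecast(history, horizon, t_perceive=2.0, t_trend=5.0):
    """Smooth the input, adaptively perceive its fractional growth
    trend, and extrapolate. Annual data assumed (dt = 1)."""
    perceived, trend = history[0], 0.0
    for p in history[1:]:
        perceived += (p - perceived) / t_perceive   # first-order smoothing
        indicated = (p - perceived) / (perceived * t_trend)
        trend += (indicated - trend) / t_trend      # smooth the trend too
    return perceived * (1 + trend) ** horizon

prices = [3, 3, 4, 6, 12, 13, 14, 30, 35, 32]  # stylized 1970s oil prices
print(round(adaptive_forecast(prices, horizon=10), 1))
```

Feed it an era of rising prices and it forecasts continued rise; feed it a decline and it forecasts continued decline – the classic failure mode of trend-following forecasts.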

SRES – We've got a bigger problem now

Recently Pielke, Wigley and Green discussed the implications of autonomous energy efficiency improvements (AEEI) in IPCC scenarios, provoking many replies. Some found the hubbub around the issue surprising, because the assumptions concerned were well known, at least to modelers. I was among the surprised, but sometimes the obvious needs to be restated loud and clear. I believe that there are several bigger elephants in the room that deserve such treatment. AEEI is important, as are other hotly debated SRES choices like PPP vs. MER (purchasing power parity vs. market exchange rates), but at the end of the day, these are just parameter choices. In complex systems, parameter uncertainty generally plays second fiddle to structural uncertainty. Integrated assessment models (IAMs) as a group frequently employ similar methods, e.g. dynamic general equilibrium, and leave crucial structural assumptions untested. I find it strange that the hottest debates surround biogeophysical models, which are actually much better grounded in physical principles, when socio-economic modeling is so uncertain.


No Gas

Every year or two the “gas out” email arrives in my inbox. This year, it’s May 15th, when “all internet users are to not go to a gas station in protest of high gas prices.” Wait – am I supposed to avoid gas stations, or protest at gas stations? I’m amazed at the durability of this internet chain letter, which now claims a ten-year history: “In April 1997, there was a ‘gas out’ conducted nationwide in protest of gas prices. Gasoline prices dropped 30 cents a gallon overnight.” A Monty Python tune from The Meaning of Life jumps to mind:

So remember when you’re feeling very small and insecure
How amazingly unlikely is your birth
And pray that there’s intelligent life somewhere up in space
Because there’s bugger all down here on earth.


It's the crude price, stupid

The NYT reports that Hillary Clinton and John McCain have lined up to suspend federal excise taxes on fuel:

Senator Hillary Rodham Clinton lined up with Senator John McCain, the presumptive Republican nominee for president, in endorsing a plan to suspend the federal excise tax on gasoline, 18.4 cents a gallon, for the summer travel season. But Senator Barack Obama, Mrs. Clinton’s Democratic rival, spoke out firmly against the proposal, saying it would save consumers little and do nothing to curtail oil consumption and imports.

Mrs. Clinton would replace that money with the new tax on oil company profits, an idea that has been kicking around Congress for several years but has not been enacted into law. Mr. McCain would divert tax revenue from other sources to make the highway trust fund whole.

On April 22, EIA data put WTI crude at $119/bbl, which is $2.83/gal before accounting for refinery losses. Spot gasoline was at $2.90 to $3.14 (depending on geography and type), which is about what you’d expect with total taxes near $0.50 and retail gasoline at $3.55/gal. With refinery yields typically at something like 85%, you’d actually expect spot gasoline to be at about $3.30, so other, more expensive products (diesel, jet fuel, heating oil) or cheaper feedstocks must be making up the difference. The price breaks down roughly as follows:

[Figure: Gasoline price breakdown]
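For the record, the back-of-envelope arithmetic behind those numbers (the naive yield adjustment charges all crude cost to gasoline, which is why actual spot comes in lower, as noted above):

```python
wti = 119.0                            # $/bbl, WTI crude
crude_per_gal = wti / 42.0             # 42 gal/bbl -> ~$2.83/gal
implied_spot = crude_per_gal / 0.85    # ~85% gasoline yield -> ~$3.33/gal
spot_from_retail = 3.55 - 0.50         # retail minus taxes -> ~$3.05/gal
print(round(crude_per_gal, 2), round(implied_spot, 2),
      round(spot_from_retail, 2))
```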