Wednesday, September 17, 2014

An open letter to you climate people

Dear Climate People (yes, I mean you IPCC WG1 types):

I am a lowly social scientist. An economist to be precise. I am the type of person who is greatly interested in projecting impacts of climate change on human and natural systems. My friends and I are pretty darn good at figuring out how human and natural systems responded to observed changes in weather and climate. We use fancy statistics, spend tons of time and effort collecting good data on observed weather/climate and outcomes of interest. Crime? Got it. Yields for any crop you can think of? Got your back. Labor productivity? Please. Try harder.

But you know what's a huge pain in the neck for all of us? Trying to get climate model output in a format that is usable by someone without at least a computer science undergraduate degree. While you make a big deal out of having all of your climate model output in a public depository, we (yes, the lowly social scientists) do not have the skills to read your terabytes and terabytes of netCDF files into our MacBooks and put them in a format we can use.

What do I mean by that? The vast majority of us use daily data on Tmin, Tmax and precipitation at the surface. That's it. We don't really care what's going on high in the sky. If we get fancy, we use wet bulb temperature and cloud cover. But that's really pushing it. For a current project I am trying to get county-level climate model output for the CMIP5 models. All of them. For all 3007 US counties. This should not be hard. But it is. My RA finally got the CMIP5 output from a Swiss server and translated it into a format we can use (yes, ASCII. Laugh if you wish. The "A" in ASCII stands for awesome.) We now are slicing and dicing these data into the spatial units we can use. We had to buy a new computer and bang our heads against the wall for weeks.
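For the record, here is roughly the kind of thing my RA ended up cobbling together, shown as a minimal sketch in Python with xarray. The file name, the county-centroid table, and the nearest-grid-cell shortcut are all made up for illustration (a careful version would area-weight the grid cells overlapping each county); this is not our actual pipeline, just proof that the step is conceptually simple once someone hands you the tools.

```python
# Minimal sketch: pull daily Tmax for every US county from one CMIP5 file and
# dump it to a flat text file. The file name and county table are hypothetical.
import xarray as xr
import pandas as pd

ds = xr.open_dataset("tasmax_day_MODEL_rcp85_r1i1p1_20060101-21001231.nc")

# Hypothetical lookup table: one row per county with FIPS code and centroid.
counties = pd.read_csv("county_centroids.csv")  # columns: fips, lat, lon

records = []
for _, c in counties.iterrows():
    # Daily series at the grid cell nearest the county centroid
    # (CMIP5 longitudes usually run 0-360, hence the modulo).
    series = ds["tasmax"].sel(lat=c.lat, lon=c.lon % 360, method="nearest")
    df = series.to_dataframe().reset_index()[["time", "tasmax"]]
    df["tmax_c"] = df["tasmax"] - 273.15  # CMIP5 temperatures come in Kelvin
    df["fips"] = c.fips
    records.append(df[["fips", "time", "tmax_c"]])

# One long, plain-text table of county, date, daily Tmax -- the "ASCII" of this post.
pd.concat(records).to_csv("tmax_daily_by_county.csv", index=False)
```

The point is not that this is hard; it is that every impacts group currently reinvents this wheel, badly, on its own laptop.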

If you want more people working on impacts in human and natural systems, we need to make climate model output available to them at the spatial and temporal level of resolution they need. For the old climate model output, there was such a tool, which was imperfect, but better than what we have now. I got a preview of the update to this tool, but it chokes on larger requests.

Here's what I'm thinking, IPCC WG1: let's create a data depository that makes climate model output available to WG2 types like me in formats I can understand. I bet you the impacts literature would grow much more rapidly. The most recent AR5 points out that the largest gaps in our understanding are in human systems. I am not surprised. If this human system has trouble getting the climate data into a useful format, I worry about the folks who would do good work but are even more computationally challenged than I am.

Call me some time. I am happy to help you figure out what we could do. It could be amazing!

Your friend (really) Max.

Monday, September 8, 2014

Can we measure being a good scientific citizen?

This is a bit trivial, but I was recently on travel, and I often ponder a couple of things when traveling. One is how to use my work time more efficiently. Or more specifically, what fraction of requests to say yes to, and which ones to choose? It’s a question I know a lot of other scientists ask themselves, and it’s a moving target as the number of requests change over time, for talks, reviews, etc.

The other thing is that I usually get a rare chance to sit and watch Sportscenter, and I'm continually amazed by how many statistics are now used to discuss sports. Like “so-and-so has a 56% completion percentage when rolling left on 2nd down” or “she’s won 42% of points on her second serve when playing at night on points that last less than 8 strokes, and when someone in the crowd sneezes after the 2nd stroke.” Ok, I might be exaggerating a little, but not by much.

So it gets me wondering why scientists haven’t been more pro-active in using numbers to measure our perpetual time management issues. Take reviews for journals as an example. It would seem fairly simple for journals to report how many reviews different people perform each year, even without revealing who reviewed which papers. I’m pretty sure this doesn’t exist, but I could be wrong (the closest thing I’ve seen to this is that Nature sends an email at the end of each year saying something like “thanks for your service to our journal family, you have reviewed 8 papers for us this year”). It would seem that comparing the number of reviews you perform to the number of your papers that get reviewed by others (also something journals could easily report) would be a good measure of whether each person is doing their part.

Or more likely you’d want to share the load with your co-authors, but also account for the fact that a single paper usually requires about 3 reviewers. So we can make a simple “science citizen index” or “scindex” that would be
SCINDEX = A / (B x C/D) = A x D / (B x C)
where
A = # of reviews performed
B = # of your submissions that get reviewed (even if the paper ends up rejected)
C = average number of reviews needed per submission (assume = 3)
D = average number of authors per your submitted papers

Note that to keep it simple, none of this counts time spent as an editor of a journal. And it doesn’t adjust for being junior or senior, even though you could argue junior people should do fewer reviews and make up for it when they are senior. And I’m sure some would complain that measuring this will incentivize people to agree but then do lousy reviews. (Of course that never happens now.) Anyhow, if this number is equal to 1, then you are pulling your own weight. If it’s more than 1, you are probably not rejecting enough requests. So now I’m curious how I stack up. Luckily I have a folder where I save all reviews and can look at the number saved in a given year. Let’s take 2013. Apparently I wrote 27 reviews, not counting proposal- or assessment-related reviews. And Google Scholar can quickly tell me how many papers I was an author on in that year (14), and I can calculate the average number of authors per paper (4.2). Let’s also assume that a few of those were first rejected after review elsewhere (I don’t remember precisely, but that’s an educated guess), so that my total submissions were 17. So that makes my scindex 27 x 4.2 / (17 x 3) = 2.2. For 2012 it was 3.3.
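If you want to compute your own number, the arithmetic is trivial; here is a small sketch with my 2013 inputs plugged in purely for illustration.

```python
def scindex(reviews_performed, submissions_reviewed,
            avg_reviews_per_submission=3.0, avg_authors_per_paper=1.0):
    """Science citizen index: reviews performed divided by reviews 'owed',
    where reviews owed = submissions x reviews per submission / authors per paper."""
    reviews_owed = (submissions_reviewed * avg_reviews_per_submission
                    / avg_authors_per_paper)
    return reviews_performed / reviews_owed

# The 2013 numbers from above: 27 reviews performed, 17 submissions,
# 3 reviews needed per submission, 4.2 authors per paper.
print(round(scindex(27, 17, 3.0, 4.2), 1))  # -> 2.2
```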

Holy cow! I’m doing 2-3 times as many reviews as I should be reasonably expected to. And here I was thinking I had a reasonably good balance of accepting/rejecting requests. It also means that there must be lots of people out there who are not pulling their weight (I’m looking at you, Sol).


It would be nice if the standard citation reports added something like a scindex to the h-index and other standards. Not because I expect scientists to be rewarded for being a good citizen, though that would be nice, or because it would expose the moochers. But because it would help us make more rational decisions about how many of the thankless tasks to take on. Or maybe my logic is completely off here. If so, let me know. I’ll blame it on being tired from too much travel and doing too many reviews!

Sunday, August 31, 2014

Commodity Prices: Financialization or Supply and Demand?


I've often panned the idea that commodity prices have been greatly influenced by so-called financialization---the emergence of tradable commodity price indices and growing participation by Wall Street in commodity futures trading. No, Goldman Sachs did not cause the food and oil-price spikes in recent years. I've had good company in this view. See, for example, Kilian, Knittel and Pindyck, Krugman (also here), Hamilton, Irwin and coauthors, and I expect many others.

I don't deny that Wall Street has gotten deeper into the commodity game, a trend that many connect to  Gorton and Rouwenhorst (and much earlier similar findings).  But my sense is that commodity prices derive from more-or-less fundamental factors--supply and demand--and fairly reasonable expectations about future supply and demand.  Bubbles can happen in commodities, but mainly when there is poor information about supply, demand, trade and inventories.  Consider rice, circa 2008.

But most aren't thinking about rice. They're thinking about oil.

The financialization/speculation meme hasn't gone away, and now bigger guns are entering the fray, with some new theorizing and evidence.

Xiong theorizes (also see Cheng and Xiong and Tang and Xiong) that commodity demand might be upward sloping.  A tacit implication is that new speculation of higher prices could feed higher demand, leading to even higher prices, and an upward spiral.  A commodity price "bubble" could arise without accumulation of inventories, as many of us have argued.  Tang and Xiong don't actually write this, but I think some readers may infer it (incorrectly, in my view).

It is an interesting and counter-intuitive result.  After all, the Law of Demand is the first thing everybody learns in Econ 101:  holding all else the same, people buy less as price goes up.  Tang and Xiong get around this by considering how market participants learn about future supply and demand.  Here it's important to realize that commodity consumers are actually businesses that use commodities as inputs into their production processes.  Think of refineries, food processors, or, further down the chain, shipping companies and airlines.  These businesses are trying to read crystal balls about future demand for their final products.  Tang and Xiong suppose that commodity futures tell these businesses something about future demand.  Higher commodity futures may indicate stronger future demand for their finished products, so they buy more raw commodities, not less.

There's probably some truth to this view.  However, it's not clear whether or when demand curves would actually bend backwards.  And more pointedly, even if the theory were true, it doesn't really imply any kind of market failure that regulation might ameliorate. Presumably some traders actually have a sense of the factors causing prices to spike: rapidly growing demand in China and other parts of Asia, a bad drought, an oil prospect that doesn't pan out, conflict in the Middle East that might disrupt future oil exports, and so on.  Demand shifting out due to reasonable expectations of higher future demand for finished product is not a market failure or the makings of a bubble.  I think Tang and Xiong know this, but the context of their reasoning seems to suggest they've uncovered a real anomaly, and I don't think they have.  Yes, it would be good to have more and better information about product supply, demand and disposition.  But we already knew that.

What about the new evidence?

One piece of evidence is that commodity prices have become more correlated with each other, and with stock prices, with a big spike around 2008, and much more so for indexed commodities than off-index commodities.


This spike in correlatedness happens to coincide with the overall spike in commodity prices, especially oil and food commodities.  This fact would seem consistent with the idea that aggregate demand growth--real or anticipated--was driving both higher prices and higher correlatedness.  This view isn't contrary to Tang and Xiong's theory, or really contrary to any of the other experts I linked to above.  And none of this really suggests speculation or financialization has anything to do with it.  After all, Wall Street interest in commodities started growing much earlier, between 2004 and 2007, and we don't see much out of the ordinary around that time.

The observation that common demand factors---mainly China growth pre-2008 and the Great Recession since then---have been driving price fluctuations also helps to explain changing hedging profiles and risk premiums noted by Tang and Xiong and others.  When idiosyncratic supply shocks drive commodity price fluctuations (e.g., bad weather), we should expect little correlation with the aggregate economy, and risk premiums should be low, and possibly even negative for critical inputs like oil.  But when large demand shocks drive fluctuations, correlatedness becomes positive and so do risk premiums.

None of this is really contrary to what Tang and Xiong write.  But I'm kind of confused about why they see demand growth from China as an alternative explanation for their findings. It all looks the same to me.  It all looks like good old fashioned fundamentals.

Another critical point about correlatedness that Tang and Xiong overlook is the role of ethanol policy.  Ethanol started to become serious business around 2007 and going into 2008, making a real if modest contribution to our fuel supply, and drawing a huge share of the all-important US corn crop.


During this period, even without subsidies, ethanol was competitive with gasoline.  Moreover, ethanol concentrations hadn't yet hit the 10% blend wall, above which ethanol might damage some standard gasoline engines.  So, for a short while, oil and corn were effectively perfect substitutes, and this caused their prices to be highly correlated.  Corn prices, in turn, tend to be highly correlated with soybean and wheat prices, since they are substitutes in both production and consumption.

With ethanol effectively bridging energy and agricultural commodities, we got a big spike in correlatedness.  And it had nothing to do with financialization or speculation.

Note that this link effectively broke shortly thereafter. Once ethanol concentrations hit the blend wall, oil and ethanol went from being nearly perfect substitutes to nearly perfect complements in the production of gasoline.  They still shared some aggregate demand shocks, but oil-specific supply shocks and some speculative shocks started to push corn and oil prices in opposite directions.

Tang and Xiong also present new evidence on the volatility of hedgers' positions. Hedgers---presumably commodity sellers who are more invested in commodities and want to shift their risk onto Wall Street---have highly volatile positions relative to the volatility of actual output.



These are interesting statistics.  But it really seems like a comparison of apples and oranges.  Why should we expect hedgers' positions to scale with the volatility of output?  There are two risks for farmers: quantity and price.  For most farmers one is a poor substitute for the other.

After all, very small changes in quantity can cause huge changes in price due to the steep and possibly even backward-bending demand.  And it's not just US output that matters.  US farmers pay close attention to weather and harvest in Brazil, Australia, Russia, China and other places, too.

It also depends a little on which farmers we're talking about, since some farmers have a natural hedge if they are in a region with a high concentration of production (Iowa), while others don't (Georgia).  And farmers also have an ongoing interest in the value of their land that far exceeds the current crop, which they can partially hedge through commodity markets since prices tend to be highly autocorrelated.

Also, today's farmers, especially those engaged in futures markets, may be highly diversified into other non-agricultural investments.  It's not really clear what their best hedging strategy ought to look like.

Anyhow, these are nice papers with a bit of good data to ponder, and a very nice review of past literature.  But I don't see how any of it sheds new light on the effects of commodity financialization. All of it is easy to reconcile with existing frameworks.  I still see no evidence that speculation and Wall Street involvement in commodities is wreaking havoc.

Monday, August 25, 2014

What’s the goal and point of national biofuel regulation?

While preparing a lecture for the 4th Berkeley Summer School in Environmental and Energy Economics, I returned to contemplating the regulation of biofuels as part of a federal strategy to combat climate change and increase energy security. If we review the policies used to increase the share of biofuels in the transportation fuel supply across this great land, there are three main approaches: subsidies for the production of ethanol and biodiesel, renewable fuels standards (RFS) and low carbon fuels standards (LCFS).
The two main tools employed at the federal level are subsidies, which essentially provide a per-gallon payment for producing a gallon of a certain type of biofuel, and renewable fuels standards, which require the production of different classes and quantities of biofuels over a prescribed time path. California has employed a low carbon fuel standard, whose goal is to decrease the average carbon content of California’s gasoline by prescribed percentages over time. It relies on life cycle calculations of the carbon content of different fuels and allows producers to choose a mix of different fuels that decreases the average carbon content, thus providing more flexibility in terms of fuels compared to the RFS.
If I were elected the social planner, I would recognize that I am most likely not smarter than the market, but also would not trust the market to make the right decisions when it comes to carbon reductions (see the demonstrated record of markets since 1850). The standard way an economist would approach the problem, assuming that we know what the right amount of carbon abatement is, is to set a cap on emissions and issue tradable rights to pollute (a cap and trade). This would in theory lead to the desired level of emissions reductions at least cost. While preparing for my lecture, I was thinking I should set up a simple model where profit-maximizing producers of fuels face different policy constraints (e.g., subsidies, RFS, LCFS or a cap and trade), a reasonable demand curve and my giant computer. As so often happens to many of us environmental and energy economists, EI@Haas’ all-star team captain Chris Knittel (MIT) and coauthors had already written the paper, which is titled “Unintended Consequences of Transportation Carbon Policies: Land-Use, Emission, and Innovation”.
[Skip this paragraph if you are not a fan of wonk]. The paper, which is a great and relatively quick read, simulates the consequences of the 2022 US RFS and current ethanol subsidies, and constructs a fictional national LCFS and a cap-and-trade (CAT) system, each calibrated to achieve the same carbon savings as the RFS. The paper assumes profit-maximizing firms which face either no policy, the RFS, subsidies, the LCFS or the CAT. Using an impressive county-level dataset on agricultural production and waste, the authors set out to construct supply curves for corn ethanol and six different types of cellulosic ethanol. Chris’ daisy-chained Mac Pros then maximize profits of the individual firms by choosing plant location, production technology, and output conditional on fuel price, biomass resources, conversion and transportation costs. Changing fuel prices and re-optimizing gets them county-level supply curves. Assuming a perfectly elastic supply of gasoline and a constant elasticity demand curve for transportation fuels, they solve for market equilibria numerically.
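To make the mechanics a bit more concrete, here is a toy sketch of that last step only: constant-elasticity fuel demand, a perfectly elastic gasoline supply, and a step-function ethanol supply curve built from made-up (capacity, marginal cost) pairs. It leaves out everything that makes the paper interesting (plant siting, technology choice, life-cycle accounting, the actual policies), so treat it purely as illustration, not as the authors' model.

```python
# Toy fuel-market equilibrium: NOT the Knittel et al. model, just the flavor.
def fuel_demand(price, a=150.0, elasticity=0.35):
    """Constant-elasticity demand for total transportation fuel (bn gallons)."""
    return a * price ** (-elasticity)

def ethanol_supply(price, plants):
    """Ethanol offered at a given price: capacity of plants with mc <= price."""
    return sum(cap for cap, mc in plants if mc <= price)

# Hypothetical ethanol "plants": (capacity in bn gallons, marginal cost in $/gal).
plants = [(2.0, 1.8), (3.0, 2.2), (4.0, 2.8), (5.0, 3.5)]

def equilibrium(gas_price, carbon_tax=0.0):
    """With perfectly elastic gasoline supply, the fuel price is pinned at the
    (policy-inclusive) gasoline price; ethanol enters up to that marginal cost."""
    p = gas_price + carbon_tax
    q_total = fuel_demand(p)
    q_ethanol = min(ethanol_supply(p, plants), q_total)
    return {"price": p, "total_fuel": q_total,
            "ethanol": q_ethanol, "gasoline": q_total - q_ethanol}

print(equilibrium(gas_price=2.5))                  # no policy
print(equilibrium(gas_price=2.5, carbon_tax=0.5))  # crude stand-in for a CAT
```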
They use the results to compare the consequences of each policy type for a variety of measures we might care about. Here is what happens:
The CAT leads to the greatest increase in gas prices and the largest decrease in fuel consumption. It leads to no additional corn ethanol production and slight increases in second-generation biofuels. The RFS and LCFS both lead to less than half the price increase and fuel reduction compared to the CAT. Both policies see a four- to ninefold increase in corn ethanol production relative to no policy and a massive ramp-up in second-generation biofuels production. All three measures lead to the same reductions in carbon emissions. The subsidies leave fuel costs constant, do not change fuel consumption and lead to a massive increase in first- and second-generation biofuels, but only achieve two thirds of the carbon reductions of the other policies (which is due to the authors using current subsidy rates rather than artificially higher ones that would lead to the same carbon savings).
Biofuels lead to lower gas prices and equivalent carbon savings! This is the point where biofuels cheerleaders scream “everything is awesome!” But this ain’t a Lego movie. Especially since Legos are not made from corn. The paper evaluates the policies along a number of dimensions. First, compare the abatement cost curves for the CAT and the LCFS. When it comes to marginal abatement cost curves, the flatter, the better. What we see in the paper is a radically steeper marginal abatement cost curve for the LCFS compared to the CAT. In equilibrium the marginal abatement cost for the LCFS is almost five times higher than that of the CAT. What about those emissions reductions? What happens in practice is that the CAT gets most of its emissions reductions from reduced fuel consumption (from driving less or driving more efficient cars) and a little bit from fuel switching. Under the LCFS there is much more fuel switching and not much less driving.
What about land use? Well, since the non-CAT policies incentivize ethanol production, significant amounts of crop and marginal lands will be pulled into production.
[Figure 3 from the paper]
The paper shows that total land use for energy crops goes up about tenfold under the biofuels policies and only by about 30% under the CAT. The paper calculates that damages from erosion and habitat loss from these policies can reach up to 20% of the social cost of carbon, compared to essentially 0% for the CAT.
Further, ethanol policies create the wrong incentives for innovation: in some settings the incentives are too strong and in others they are too weak. A further, incredibly clever aspect of the paper is that the authors show that being wrong about the carbon intensity of different fuels (e.g., getting the indirect land use effect wrong, which is almost certainly the case) can lead to massive amounts of uncontrolled emissions. The carbon damage consequences of being wrong by 10% about the emissions intensity of corn ethanol are an order of magnitude (read: 10 times!) larger than the corresponding number for the cap and trade. Before I wonk you to death, I will close with some more general thoughts, but staffers of carbon regulators should read this paper. Now.
What this work shows is that, in the case of biofuels, setting a simple universal policy that lets market participants choose the least-cost ways of finding emissions reductions is vastly preferable to complex renewable fuels or low carbon fuels standards. While I understand that producers of ethanol enjoy their subsidies (much like I enjoy my home mortgage interest deduction), this paper argues that they are a bad deal for society. And so is the RFS, as would be a national LCFS. As we go ahead and design a national carbon policy, I would hope that we take the lessons from this paper and the decades of environmental economics insight it builds upon to heart. This does not mean that first- or second-generation biofuels are a bad idea, but if they want to compete for emissions reductions, they need to be fully cost competitive with other, currently lower-cost emissions reduction alternatives.
[This post originally appeared on the Energy Institute at Haas Blog]

Tuesday, August 12, 2014

Big swings and shock absorbers

Two years ago when we started this blog, the Midwest was going through a major drought and ended up eking out just above 123 bushels per acre (bu/ac) in corn yield. Today the USDA released its latest projection for 2014, with a forecast for record corn yields of 167.4 bu/ac, due to really good weather (as Wolfram summarized in the last post).

The difference of 44 bu/ac between two years so close together is bigger than anything experienced in the history of US agriculture. The closest thing was in 1994, when yields were 139 bu/ac after just 101 in the flood year of 1993. When expressed as a percentage swing, to account for the fact that yields overall have been going up, the swing from 2012 to 2014 (36%) is near historic highs but still less than some of the swings seen in the 1980s and early 1990s (see figure below).
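For the record, the arithmetic behind those swing numbers is nothing fancy (yield values as quoted above):

```python
# Swings in US corn yield, bu/ac, using the numbers quoted in the text.
def pct_swing(low_yield, high_yield):
    return 100 * (high_yield - low_yield) / low_yield

print(round(167.4 - 123.0, 1))         # 2012 -> 2014 absolute swing: ~44 bu/ac
print(round(pct_swing(123.0, 167.4)))  # 2012 -> 2014 percentage swing: ~36%
print(round(pct_swing(101.0, 139.0)))  # 1993 -> 1994 percentage swing: ~38%
```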


We’ve talked a lot on this blog about what contributes to big swings in production, and why it shouldn’t be surprising to see them increase over time. Partly it’s because really bad years like 2012 become more likely as warming occurs, and partly it’s because farmers are getting even better at producing high yields in good conditions. Sometimes I think of good weather like a hanging curveball – a decent hitter will probably manage a hit, but a really good hitter will probably hit a home run. So the difference between good and bad weather grows as farmers get better at hitting the easy pitches.

Moving on from bad analogies, the point of this post is to describe some of the changes we might see as the world comes to grips with more volatile production. What kind of shock absorbers will farmers and markets seek out? Four come to mind, though I can’t be sure which if any will actually turn out to be right. First, and most obvious, is that all forms of grain storage will increase. There are some interesting reports of new ways farmers are doing this on-site with enormous, cheap plastic bags. We have a working paper coming out soon (hopefully) on accounting for these storage responses in projections of price impacts.

Second will be new varieties that help reduce the sensitivity to drought. Mark Cooper and his team at Pioneer have some really interesting new papers here and here on their new Aquamax seeds, describing the approach behind them and how they perform across a range of conditions.

A third response is more irrigation in areas where it hasn’t been very common. I haven’t seen any great data on recent irrigation for grain crops throughout the Corn Belt, but I’ve heard some anecdotes that suggest it is becoming a fairly widespread insurance strategy to boost yields in dry years.

The fourth is a bit more speculative, but I wouldn’t be surprised to see new approaches to reducing soil evaporation, including cheap plastic mulches. There are some interesting new papers here and here testing this in China showing yield gains of 50% or more in dry years. Even in the Midwest, where farmers practice no-till and crops cover the ground fairly quickly, as much as a third of the water in the soil is commonly lost to evaporation rather than via plant uptake. That represents a big opportunity cost, and if the price of grain is high enough, and the cost of mulch is low enough, it’s possible that it will take hold as a way of raising yields in drier years. So between storage and mulching, maybe “plastics” is still the future.

Sunday, August 3, 2014

2014 - colder and slightly wetter than average US growing conditions

Futures prices for corn have been decreasing a lot - for example, the December 2014 contract for corn has decreased by more than 25% between the beginning of May and the beginning of August (chart of CME), indicating that the market either anticipates increased supply or a drop in demand, likely the former.  Maybe good weather will be giving the US, which produces 40% of the world's corn, a bumper crop.

Below are the weather updates for 2014 through the end of July.  Similar to earlier posts, I graphed the cumulative degree days above 29C, which have been shown to be very detrimental for corn growth. This measure counts how much temperatures exceed 29C and for how long. For example, being at 33C for half a day would result in 2 degree days above 29C (0.5 days x 4C). These are weighted averages across all counties in the United States, where the weight is proportional to expected production (trend yield times last reported growing area).  Areas with higher yields and larger growing areas get weighted more heavily.
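For the curious, here is a minimal sketch of how the measure works, with toy numbers. Real implementations interpolate the within-day temperature path from daily Tmin and Tmax (e.g., with a sine curve) rather than assuming the day splits into fixed fractions.

```python
# Toy version of "degree days above 29C" and the production-weighted aggregate.
def degree_days_above(threshold, exposures):
    """exposures: list of (fraction_of_day, temperature_C) pairs for one day."""
    return sum(frac * max(temp - threshold, 0.0) for frac, temp in exposures)

# The example from the text: half a day at 33C -> 0.5 x 4 = 2 degree days above 29C.
print(degree_days_above(29.0, [(0.5, 33.0), (0.5, 25.0)]))  # -> 2.0

def weighted_degree_days(county_dd, weights):
    """Production-weighted national aggregate (weights sum to 1)."""
    return sum(dd * w for dd, w in zip(county_dd, weights))

print(weighted_degree_days([2.0, 0.5, 1.0], [0.5, 0.3, 0.2]))  # made-up counties
```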

Grey lines show the historic distribution from 1950-2011, while the last three years are shown in color.  The red line shows how hot July 2012 was - notice the sharp increase of the red line in July.  By comparison, 2014 (green line) has the second-lowest total by the end of July that we have observed in the last 65 years.  This should be great for corn yields, as there hasn't been much damaging heat.

At the same time, it has also been slightly wetter than average.  Since both too much and too little rain are bad for crops, having close to an average amount of rain is good as well.

Taken together, these two graphs suggest that 2014 will be a very plentiful harvest.

Wednesday, June 25, 2014

Adaptation Illusions

For some reason, the topic of adaptation can sometimes turn me into a grumpy old man. Maybe it’s because I’ve sat through hours of presentations about how someone’s pet project should be funded under the name of climate adaptation. Or maybe it’s the hundreds of times people say that adaptation will just fix the climate problem, with no actual evidence to support that other than statements like “corn grows just fine in Alabama.” Or maybe it’s just that I’m really turning into a grumpy old man and I like excuses to act that way.

In any case, I was recently asked to write a piece for Global Food Security, a relatively new journal run by Ken Cassman that I’m on the editorial board for. I used it as a chance to articulate some of the reasons I am unconvinced by most studies trumpeting the benefits of adaptation. That’s not to say I don’t think adaptation can be effective, just that most of the “evidence” for it is currently quite weak. In fact, I think some adaptations could be quite effective, which is why it’s so important to not spend time on dead ends. The paper is here, and abstract below.


"Climate change adaptation in crop production: Beware of illusions"
A primary goal of studying climate change adaptation is to estimate the net impacts of climate change. Many potential changes in agricultural management and technology, including shifts in crop phenology and improved drought and heat tolerance, would help to improve crop productivity but do not necessarily represent true adaptations. Here the importance of retaining a strict definition of adaptation – as an action that reduces negative or enhances positive impacts of climate change – is discussed, as are common ways in which studies misinterpret the adaptation benefits of various changes. These “adaptation illusions” arise from a combination of faulty logic, model errors, and management assumptions that ignore the tendency for farmers to maximize profits for a given technology. More consistent treatment of adaptation is needed to better inform synthetic assessments of climate change impacts, and to more easily identify innovations in agriculture that are truly more effective in future climates than in current or past ones. Of course, some of the best innovations in agriculture in coming decades may have no adaptation benefits, and that makes them no less worthy of attention.

Tuesday, June 24, 2014

American Climate Prospectus, feeling the heat

I've had the pleasure of working with a team of brilliant people on the American Climate Prospectus: Economic Risks in the United States, which was released today. It is the independent research report commissioned by the Risky Business initiative, and it underlies that group's conclusions.  I'm excited to share and discuss with G-FEEDers the numerous ideas, questions, challenges and results that arose in the process of the analysis, since until now we haven't had a chance to discuss the work with the larger community. I will try to post on these various issues in the coming weeks, but for now I'd like to [sleep and] simply kick off the social media rodeo with my favorite graphic from the entire report (which has no economics in it!). If we've done our job right, it needs no explaining. The only hint for the non-wonks is that RCP 8.5 means "business as usual" climate change (or "A1B" if your wonk is out of date).


Source: American Climate Prospectus

I win 20 bucks if this goes viral, so please repost it.

Saturday, June 7, 2014

Regulating CO2 emissions and firm competitiveness

EPA announced this week the Clean Power Plan to reduce carbon emissions from coal-fired power plants by 30% by 2030 (relative to 2005 levels).  It has created a lot of discussion, with environmental groups praising it as an overdue step and some businesses pointing to the cost of such regulation.

One thing to note is that EPA was smart enough to give states a lot of flexibility in meeting the 30% reduction.  In general, more flexibility should allow regulated plants to meet the goal in the most cost-effective way. Max Auffhammer called this the Yoga Theorem, i.e., the "less flexible you are, the more you will suffer," which I can personally relate to (you don't want to see me stretch).

Most of the research we have talked about on this blog has estimated the effects of increasing temperatures that are caused by elevated CO2 levels (or other greenhouse gases like methane). Having spent so much time on the benefit side of greenhouse gas regulation (avoided damages), it might be interesting to look at some research on the cost side of such regulation. As any student of econ 101 remembers, in the optimum one should balance the costs of regulation against the benefits.

Making a factor of production (coal) more expensive will have two effects: first, there will be a shift away from coal, which will be especially felt in regions that rely on coal production (e.g., West Virginia).  Electricity generation had been shifting from coal to natural gas even before this regulation, and the regulation will likely accelerate the transition. So there will be losers compared to the status quo (one could also argue that coal has an unfair advantage in the status quo, as it imposes a sizable unpriced externality, which makes a factor of production artificially cheap).

The best research on the cost of environmental regulation is a recent paper by Reed Walker, which compares the wages of individuals who work in industries in a county that are subject to tougher regulation under the revised ozone standard with those of workers in the same county in industries that are not covered by the more stringent regulation. The combined loss in wages is sizable ($5 billion in present discounted value), but order(s) of magnitude lower than the estimated benefits.  The largest costs fall on people who used to work in firms subject to tougher regulation and lose their jobs - once they find a new job, wages tend to be lower for up to eight years.  Workers who stay at regulated firms see no drop in wages.

Second, regulating coal will make production that uses electricity (pretty much all production) more expensive, as electricity prices might rise as a result of the regulation.  How big is this second effect? We have some evidence from Europe, where the EU Emissions Trading System (ETS) created a market for carbon credits.  Similar to the EPA regulation, which only covers coal-fired power plants, the ETS only covered a subset of firms.  This offers a nice way for researchers to compare regulated and unregulated firms.

Sebastian Petrick and Ulrich Wagner have a great new paper that uses firm-level data from Germany and compares what happens to firms regulated under the ETS by matching them to comparable firms that were not regulated.  Not surprisingly, regulated firms reduce emissions by 20% more than unregulated firms (the regulation is effective), but there seems to be no reduction in the competitiveness of regulated firms, i.e., they find no reduction in employment, gross output, or exports.

In summary, my best guess is that the carbon regulation will have very little effect on overall competitiveness or employment in the US as a whole, but will be felt in some regions that heavily rely on coal.

Thursday, June 5, 2014

Adaptation with an Envelope


Economists like to emphasize how people and businesses will adapt to climate change.  On a geological scale the world is warming very fast.  But on a human scale it is warming slowly, so we can easily adjust infrastructure and management decisions to the gradually changing climate.  For example, in agriculture farmers can gradually adjust planting times, cultivars, and locations where we grow crops, and so on.

So how much does adaptation really buy us?  As it turns out, probably very little, at least in most contexts. 

Since it is economists who often emphasize this point, sometimes even intimating that otherwise negative impacts could turn positive with adaptation, perhaps we should pause for a moment to consider what basic microeconomic theory says about it.  And we have a ready-made tool for the job, called the envelope theorem (or here), that provides essential insight.

I'll try to make this intuitive, but it helps to be a little formal.  Suppose agricultural yield is:

y  =  f(x, r)

where r indicates climate and x represents farmers' decisions.  I'm just using notation from a generic case in the second link above.

Farmers' decisions are not random.  With time and experience, we should expect farmers to optimize decisions for their climate.  Call these optimal decisions x*(r).  So, the outcomes we observe in practice are

y*  =  f(x*(r), r)

Now, to obtain a first-order approximation of the effect of climate change on yield, we need to find dy*/dr, which is just a fancy way of saying the marginal change in observed yield for a small change in climate.  Multiply this marginal change by the total change in climate (the change in r), and we get a first-order approximation to the total impact.

If you've taken basic calculus, you learned the chain rule, which says that:

dy*/dr  =  (df/dx)(dx*/dr) + df/dr

If the farmer is optimizing, however, df/dx=0.  The farmer cannot improve yield outcomes by changing decisions, because s/he's already optimizing.  So

dy*/dr = df/dr

And this gives the heart of the envelope theorem: to a first approximation, we don't need to worry about changes in behavior (dx/dr, or adaptation) to evaluate the effect of climate change on output.  The fact that behavior is already optimized means that behavioral adjustments will be second-order.
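If you prefer numbers to calculus, here is a toy check with a made-up yield function, f(x, r) = r*x - x^2, so the optimal decision is x*(r) = r/2. It is not meant to represent any real crop, only to show that the gain from re-optimizing after a small change in r is second order while the total impact is first order.

```python
# Toy numerical check of the envelope theorem with an assumed yield function.
def f(x, r):
    return r * x - x ** 2      # made-up yield as a function of decision x, climate r

def x_star(r):
    return r / 2.0             # the optimal decision for this particular f

r0, dr = 2.0, 0.1              # baseline climate and a small climate change

y_no_adapt = f(x_star(r0), r0 + dr)       # keep yesterday's optimal decision
y_adapt    = f(x_star(r0 + dr), r0 + dr)  # fully re-optimize for the new climate

print(y_adapt - y_no_adapt)    # gain from adapting: (dr/2)**2 = 0.0025, second order
print(f(x_star(r0 + dr), r0 + dr) - f(x_star(r0), r0))  # total impact: ~0.1, first order in dr
```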

Here's an illustration of the math, lifted from the link above. The black and blue curves hold farmers' decisions fixed at different levels of x, each optimized at a particular r.  The f*(r) (or y*) we observe is the "upper envelope" of all the blue and black curves with different, optimized levels of x.


Now, if  f*(r) is highly nonlinear, and we are contemplating a very large change in r, then adaptation will come into play. But even then it's not going to be a primary consideration.

I don't expect this basic insight, drilled into every economist during their first year of grad school (and even some undergraduates), will stop some economists from over-emphasizing adaptation.  But our own basic theory nevertheless indicates it is a small deal.  And it seems to me that the evidence so far bears this out just as clearly as the theory does.