
Sunday, August 31, 2014

Commodity Prices: Financialization or Supply and Demand?


I've often panned the idea that commodity prices have been greatly influenced by so-called financialization---the emergence of tradable commodity price indices and growing participation by Wall Street in commodity futures trading. No, Goldman Sachs did not cause the food and oil-price spikes in recent years. I've had good company in this view. See, for example, Kilian, Knittel and Pindyck, Krugman (also here), Hamilton, Irwin and coauthors, and I expect many others.

I don't deny that Wall Street has gotten deeper into the commodity game, a trend that many connect to  Gorton and Rouwenhorst (and much earlier similar findings).  But my sense is that commodity prices derive from more-or-less fundamental factors--supply and demand--and fairly reasonable expectations about future supply and demand.  Bubbles can happen in commodities, but mainly when there is poor information about supply, demand, trade and inventories.  Consider rice, circa 2008.

But most aren't thinking about rice. They're thinking about oil.

The financialization/speculation meme hasn't gone away, and now bigger guns are entering the fray, with some new theorizing and evidence.

Xiong theorizes (also see Cheng and Xiong and Tang and Xiong) that commodity demand might be upward sloping.  A tacit implication is that speculation about higher prices could feed higher demand, leading to even higher prices, and an upward spiral.  A commodity price "bubble" could then arise without the accumulation of inventories that many of us have argued it would require.  Tang and Xiong don't actually write this, but I think some readers may infer it (incorrectly, in my view).

It is an interesting and counter-intuitive result.  After all, the Law of Demand is the first thing everybody learns in Econ 101:  holding all else the same, people buy less as price goes up.  Tang and Xiong get around this by considering how market participants learn about future supply and demand.  Here it's important to realize that commodity consumers are actually businesses that use commodities as inputs into their production processes.  Think of refineries, food processors, or, further down the chain, shipping companies and airlines.  These businesses are trying to read crystal balls about future demand for their final products.  Tang and Xiong suppose that commodity futures tell these businesses something about future demand.  Higher commodity futures prices may indicate stronger future demand for their finished products, so they buy more raw commodities, not less.
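To put the mechanism in a stylized form (my notation, not Tang and Xiong's): suppose a processor's purchases of the raw commodity depend on the spot price p and on its belief about final-product demand θ, and that the futures price f is read as a signal of θ. Then

$$ q = D\big(p,\ \mathbb{E}[\theta \mid f]\big), \qquad \frac{dq}{dp} \;=\; \underbrace{D_p}_{<0} \;+\; \underbrace{D_\theta\,\frac{\partial \mathbb{E}[\theta \mid f]}{\partial f}\,\frac{df}{dp}}_{\geq 0}, $$

and if the informational second term is strong enough it can outweigh the ordinary first term, so measured demand slopes upward over some range.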

There's probably some truth to this view.  However, it's not clear whether or when demand curves would actually bend backwards.  And more pointedly, even if the theory were true, it doesn't really imply any kind of market failure that regulation might ameliorate. Presumably some traders actually have a sense of the factors causing prices to spike: rapidly growing demand in China and other parts of Asia, a bad drought, an oil prospect that doesn't pan out, conflict in the Middle East that might disrupt future oil exports, and so on.  Demand shifting out due to reasonable expectations of higher future demand for finished product is not a market failure or the makings of a bubble.  I think Tang and Xiong know this, but the context of their reasoning seems to suggest they've uncovered a real anomaly, and I don't think they have.  Yes, it would be good to have more and better information about product supply, demand and disposition.  But we already knew that.

What about the new evidence?

One piece of evidence is that commodity prices have become more correlated with each other, and with stock prices, with a big spike around 2008, and much more so for indexed commodities than off-index commodities.


This spike in correlatedness happens to coincide with the overall spike in commodity prices, especially oil and food commodities.  This fact would seem consistent with the idea that aggregate demand growth--real or anticipated--was driving both higher prices and higher correlatedness.  This view isn't contrary to Tang and Xiong's theory, or really contrary to any of the other experts I linked to above.  And none of this really suggests speculation or financialization has anything to do with it.  After all, Wall Street interest in commodities started growing much earlier, between 2004 and 2007, and we don't see much out of the ordinary around that time.
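For readers who want to reproduce this kind of correlatedness series on their own data, a minimal sketch is below. It is not the authors' code; the DataFrame `prices` and its column names are hypothetical placeholders for daily settlement prices of two commodities and an equity index.

```python
import pandas as pd

# daily returns from a hypothetical DataFrame of settlement prices
returns = prices[["oil", "corn", "sp500"]].pct_change().dropna()

window = 252  # roughly one trading year

# rolling correlation between the two commodities, and of oil with equities
corn_oil = returns["corn"].rolling(window).corr(returns["oil"])
oil_equity = returns["oil"].rolling(window).corr(returns["sp500"])

# a spike in these series around 2008 is the "correlatedness" discussed above
```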

The observation that common demand factors---mainly China growth pre-2008 and the Great Recession since then---have been driving price fluctuations also helps to explain changing hedging profiles and risk premiums noted by Tang and Xiong and others.  When idiosyncratic supply shocks drive commodity price fluctuations (e.g., bad weather), we should expect little correlation with the aggregate economy, and risk premiums should be low, and possibly even negative for critical inputs like oil.  But when large demand shocks drive fluctuations, correlatedness becomes positive and so do risk premiums.

None of this is really contrary to what Tang and Xiong write.  But I'm kind of confused about why they see demand growth from China as an alternative explanation for their findings. It all looks the same to me.  It all looks like good old fashioned fundamentals.

Another critical point about correlatedness that Tang and Xiong overlook is the role of ethanol policy.  Ethanol started to become serious business around 2007 and going into 2008, making a real if modest contribution to our fuel supply, and drawing a huge share of the all-important US corn crop.


During this period, even without subsidies, ethanol was competitive with gasoline.  Moreover, ethanol concentrations hadn't yet hit the 10% blend wall, above which ethanol might damage some standard gasoline engines.  So, for a short while, oil and corn were effectively perfect substitutes, and this caused their prices to be highly correlated.  Corn prices, in turn, tend to be highly correlated with soybean and wheat prices, since they are substitutes in both production and consumption.
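As a back-of-the-envelope illustration of that substitution link (all the numbers below are rough assumptions of mine, not figures from the post, and byproduct credits like distillers grains are ignored), energy-parity pricing ties the value of a bushel of corn to the price of gasoline:

```python
# Rough, illustrative conversion factors -- assumptions, not data from the post.
GAL_ETHANOL_PER_BU_CORN = 2.8   # assumed ethanol yield per bushel of corn
ENERGY_RATIO = 0.67             # assumed ethanol/gasoline energy per gallon
PROCESSING_COST_PER_GAL = 0.55  # assumed non-corn processing cost, $/gal

def breakeven_corn_price(gasoline_price_per_gal):
    """Corn price ($/bu) at which making ethanol just breaks even, when
    ethanol and gasoline trade at energy parity (the 'perfect substitutes'
    regime described above)."""
    ethanol_value = gasoline_price_per_gal * ENERGY_RATIO
    return (ethanol_value - PROCESSING_COST_PER_GAL) * GAL_ETHANOL_PER_BU_CORN

print(breakeven_corn_price(3.00))  # ~4.1: corn worth about $4/bu as feedstock
```

In that regime the derived corn price moves nearly in lockstep with gasoline, which is the mechanical source of the oil-corn correlation.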

With ethanol effectively bridging energy and agricultural commodities, we got a big spike in correlatedness.  And it had nothing to do with financialization or speculation.

Note that this link effectively broke shortly thereafter. Once ethanol concentrations hit the blend wall, oil and ethanol went from being nearly perfect substitutes to nearly perfect complements in the production of gasoline.  They still shared some aggregate demand shocks, but oil-specific supply shocks and some speculative shocks started to push corn and oil prices in opposite directions.

Tang and Xiong also present new evidence on the volatility of hedgers' positions. Hedgers---presumably commodity sellers who are invested in commodities and want to shift their risk onto Wall Street---have highly volatile positions relative to the volatility of actual output.



These are interesting statistics.  But it really seems like a comparison of apples and oranges.  Why should we expect hedgers' positions to scale with the volatility of output?  There are two risks for farmers: quantity and price.  For most farmers one is a poor substitute for the other.

After all, very small changes in quantity can cause huge changes in price due to the steep and possibly even backward-bending demand.  And it's not just US output that matters.  US farmers pay close attention to weather and harvest in Brazil, Australia, Russia, China and other places, too.
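A stylized bit of arithmetic makes the point (the elasticity value is purely illustrative, not an estimate): for a small shock,

$$ \frac{\Delta p}{p} \;\approx\; \frac{1}{\varepsilon}\,\frac{\Delta q}{q}, $$

so with a short-run demand elasticity of, say, ε = -0.1, a 3 percent harvest shortfall translates into roughly a 30 percent price increase. A hedger's exposure scales with price swings like that, not with the modest variability of physical output.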

It also depends a little on which farmers we're talking about, since some farmers have a natural hedge if they are in a region with a high concentration of production (Iowa), while others don't (Georgia).  And farmers also have an ongoing interest in the value of their land that far exceeds the current crop, which they can partially hedge through commodity markets since prices tend to be highly autocorrelated.

Also, today's farmers, especially those engaged in futures markets, may be highly diversified into other non-agricultural investments.  It's not really clear what their best hedging strategy ought to look like.

Anyhow, these are nice papers with a bit of good data to ponder, and a very nice review of past literature.  But I don't see how any of it sheds new light on the effects of commodity financialization. All of it is easy to reconcile with existing frameworks.  I still see no evidence that speculation and Wall Street involvement in commodities is wreaking havoc.

Tuesday, August 12, 2014

Big swings and shock absorbers

Two years ago when we started this blog, the Midwest was going through a major drought and ended up eking out just above 123 bushels per acre (bu/ac) in corn yield. Today the USDA released its latest projection for 2014, with a forecast for record corn yields of 167.4 bu/ac, due to really good weather (as Wolfram summarized in the last post).

The difference of 44 bu/ac between two years so close together is bigger than anything experienced in the history of US agriculture. The closest thing was in 1994, when yields were 139 bu/ac after just 101 in the flood year of 1993. When expressed as a percentage swing, to account for the fact that yields overall have been going up, the swing from 2012 to 2014 (36%) is near historic highs but still less than some of the swings seen in the 1980s and early 1990s (see figure below).
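For reference, the 36% figure is presumably just the jump measured against the 2012 level:

$$ \frac{167.4 - 123}{123} \;\approx\; 0.36. $$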


We’ve talked a lot on this blog about what contributes to big swings in production, and why it shouldn’t be surprising to see them increase over time. Partly it’s because really bad years like 2012 become more likely as warming occurs, and partly it’s because farmers are getting even better at producing high yields in good conditions. Sometimes I think of good weather like a hanging curveball – a decent hitter will probably manage a hit, but a really good hitter will probably hit a home run. So the difference between good and bad weather grows as farmers get better at hitting the easy pitches.

Moving on from bad analogies, the point of this post is to describe some of the changes we might see as the world comes to grips with more volatile production. What kind of shock absorbers will farmers and markets seek out? Four come to mind, though I can’t be sure which, if any, will actually turn out to be right. First, and most obvious, is that all forms of grain storage will increase. There are some interesting reports of new ways farmers are doing this on-site with enormous, cheap plastic bags. We have a working paper coming out soon (hopefully) on accounting for these storage responses in projections of price impacts.

Second will be new varieties that help reduce the sensitivity to drought. Mark Cooper and his team at Pioneer have some really interesting new papers here and here on their new Aquamax seeds, describing the approach behind them and how they perform across a range of conditions.

A third response is more irrigation in areas where it hasn’t been very common. I haven’t seen any great data on recent irrigation for grain crops throughout the Corn Belt, but I’ve heard some anecdotes that suggest it is becoming a fairly widespread insurance strategy to boost yields in dry years.

The fourth is a bit more speculative, but I wouldn’t be surprised to see new approaches to reducing soil evaporation, including cheap plastic mulches. There are some interesting new papers here and here testing this in China showing yield gains of 50% or more in dry years. Even in the Midwest, where farmers practice no-till and crops cover the ground fairly quickly, as much as a third of the water in the soil is commonly lost to evaporation rather than via plant uptake. That represents a big opportunity cost, and if the price of grain is high enough, and the cost of mulch is low enough, it’s possible that it will take hold as a way of raising yields in drier years. So between storage and mulching, maybe “plastics” is still the future.

Thursday, April 4, 2013

Are yields becoming less or more sensitive to temperature?


One of the hopes for adaptation is that farmers and breeders can somehow make crop yields less sensitive to temperature. Given that many agricultural areas have already seen significant amounts of warming, it’s fair to ask whether this is in fact happening. At the same time, it’s worth considering the possibility that things may be going in the opposite direction – that yields may in fact be getting more sensitive to weather. There’s a lot of history on this topic, which I’ll try to summarize briefly here before delving into some recent research summaries.

Before history, though, it’s useful to keep in mind three distinctions. First, yield sensitivity is not the same as yield variability. The latter can change if weather variability changes, even if sensitivity to weather stays the same. Second, an increase in yield sensitivity (or, for that matter, variability) is not necessarily a bad thing from a farmer’s or even a consumer’s point of view. If increased variability is the price paid for increased average productivity, then both producers and consumers may benefit from the newer, more variable yield levels.
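The first of these distinctions is easy to formalize (a stylized linear sketch, not a claim about any particular crop): if field-level yield responds to a weather variable w as

$$ y = \alpha + \beta w + \varepsilon, \qquad \operatorname{Var}(y) = \beta^{2}\operatorname{Var}(w) + \operatorname{Var}(\varepsilon), $$

then yield variability can rise purely because weather variability Var(w) rises, with the sensitivity β unchanged, and vice versa.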

Third, this is potentially a VERY scale dependent question. Yield variability at a national scale, for example, can change for a lot of reasons apart from a change in yield variability in individual fields. For example, work by Peter Hazell 30 years ago showed that changes in national yield variability were often driven by increased correlation of yields across different parts of the country. This could happen if production becomes more concentrated in a given geographic region, if practices or varieties become more uniform, or if weather becomes more correlated. There are lots of interesting questions related to sensitivity of aggregate yields to weather, some of which may be a topic for future posts. But here I want to focus on the evidence for yield sensitivity at the field level.
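Hazell's point follows from the variance of an area-weighted sum (again just a stylized identity): with national yield $Y=\sum_i a_i y_i$,

$$ \operatorname{Var}(Y) \;=\; \sum_i a_i^{2}\operatorname{Var}(y_i) \;+\; \sum_{i\neq j} a_i a_j \operatorname{Cov}(y_i, y_j), $$

so aggregate variability can grow through the covariance terms alone---yields becoming more correlated across regions---even when every field-level variance stays the same.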

In thinking about changes in sensitivity, it might be useful to lump things into two bins – things that we would expect to reduce sensitivity to temperature, and things we’d expect to increase it. In the first category are many agronomic changes, particularly a stable supply of irrigation water and pest and disease control measures. It’s also possible that varietal changes could reduce sensitivity, for example if they are more resistant to pest or disease damage (especially if these disease pressures are different from year to year), or if they are more “hardy”, by which breeders often mean the ability to handle extreme moisture or temperature conditions. One example of this would be the flood tolerant rice lines recently developed. Finally, there is increased atmospheric CO2, which one would expect to lower impacts of droughts by improving water use efficiency.


In the other bin are things that increase sensitivity. Some agronomic changes could increase sensitivity, for example high fertilizer rates can allow crops to be more responsive to good weather conditions. Similarly, new varieties can be more responsive to good conditions. The “responsiveness” of varieties to water and nutrients is in fact one of the main reasons that current varieties are so high yielding on average.

The net change in yield sensitivity of a cropping system is the result of all of these factors pushing in different directions. So which changes matter most in modern cropping systems? I recently came across a very nice volume from an IFPRI meeting held 25 years ago, which included several papers on the topic of yield variability (here’s a link, but beware of large file size). It seems many of their comments are just as relevant today as then. In particular, most of them emphasize the “responsiveness” issue – that well-fertilized modern hybrids are increasingly capable of taking advantage of good years. Especially in rainfed systems, this seems to dominate the other factors.

For example, here’s Donald Duvick (a world renowned corn breeder) on the experience of US corn:
“The yield advantage of the new hybrids is greatest when environmental conditions are most favorable. When environmental factors are severely limiting, as in drought, the new hybrids outyield the old ones, but by a smaller margin. It is likely, therefore, that present-day hybrids introduce the possibility of greater year-to-year variation in U.S. maize yields than used to occur, since they can expand their yields so much further in environmentally favorable seasons. When environmental factors are overwhelmingly limiting, the fallback in yield of the new hybrids is correspondingly greater than it would have been for the older hybrids, even though the new hybrids, under poor conditions, yield more than the old ones.”

He goes on to conclude
“I expect that any changes in U.S. maize farming practices will be made in concert. The tendency for the nation's maize plantings to be handled like one big farm will continue. Reactions to varying climatic conditions will be amplified, and some measure of instability in year-to-year national expectations for maize yields must continue. This may be the price that must be paid for high average yield in the long term.”

Similar conclusions are made by other authors in the volume on tropical maize, temperate and tropical wheat, and other crops. More recent work has made similar points, for example David Connor and others have written about how Australian wheat yields are now more variable than in previous generations, because farmers and cultivars are now so good at taking advantage of high rainfall years, but still face low yields in dry years because of insurmountable water constraints (compare 1920-1940 and 1980-2000 in the figure below, taken from here. Not shown are the major recent droughts, which would further illustrate the remarkable amount of current variability in the system).


Of course, in some cropping systems things can go the other direction. For example, a recent study of maize yields in France showed a decline in rainfall sensitivity associated with increased irrigation. And another study at the grid-cell level for major maize and soybean regions found both areas of increases and decreases in temperature sensitivity. In that study, though, it seems plausible that a lot of the variation was due to noise, because cells of increasing sensitivity were often adjacent to cells of decreasing sensitivity.

In a recent study, which was a collaboration with CIMMYT (the International Maize and Wheat Improvement Center), we looked at how heat sensitivity has changed for one particular set of cropping systems of wide interest: irrigated spring wheat.  To make a long story very short, we looked at two breeding nurseries run by CIMMYT and evaluated yield changes under differing temperature conditions.  In the main nursery, aimed at producing “elite” varieties, there has been steady yield progress at cool temperatures but no significant progress at warmer temperatures (the left panel in the figure below shows yield trends for four bins defined by grain-filling temperatures; “climate corrected” trends account for shifts in locations and weather within each temperature bin over time).  This seems to be a similar story of cultivars becoming more responsive to good conditions (low temperatures in the case of wheat), with a resulting increase in sensitivity to bad conditions (high temperatures).  Again, this isn't necessarily a bad thing, but it does suggest that wheat farmers are becoming more, not less, sensitive to high temperatures.  Interestingly, in the nurseries targeted at drought stress (right panel), the gains appear more evenly distributed and maybe even larger under worse conditions (albeit with more noise because of a smaller sample size).  This is a reminder that it is possible to create hardier plants.  But those won’t necessarily be the ones that farmers choose to grow.
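For concreteness, here is a rough sketch of the kind of binned-trend calculation described above. It is not the code from the study and it leaves out the climate correction entirely; the DataFrame `trials` and its columns ("year", "yield", "gf_temp") are hypothetical placeholders for nursery trial results.

```python
import numpy as np
import pandas as pd

# illustrative grain-filling temperature cut points (degC), not the study's bins
bins = [0, 15, 18, 21, 40]
labels = ["<15", "15-18", "18-21", ">21"]
trials["temp_bin"] = pd.cut(trials["gf_temp"], bins=bins, labels=labels)

# ordinary least-squares trend of yield on year within each temperature bin
for label, grp in trials.groupby("temp_bin", observed=True):
    slope, intercept = np.polyfit(grp["year"], grp["yield"], deg=1)
    print(f"{label}: trend of {slope:.2f} yield units per year")
```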



Tuesday, August 28, 2012

Is temperature variance changing?


People in agriculture often talk about “weather getting more variable.” It’s usually hard to know exactly what they mean – sometimes they are talking about precipitation becoming less frequent and more intense, and sometimes they're talking about hot extremes becoming more frequent. But it’s well known that what was considered “extreme” historically can become more frequent just by shifting the mean of the weather distribution, without any change in variance. The IPCC SREX figure below shows that clearly in the first panel.
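A quick numerical sketch of that first panel's point (the numbers are illustrative assumptions, not values from the report): take daily anomalies as roughly normal and ask how often an old +2-standard-deviation threshold is exceeded once the mean shifts.

```python
from scipy.stats import norm

sigma = 2.0              # assumed standard deviation of daily anomalies, degC
threshold = 2 * sigma    # "extreme" defined relative to the old climate

p_old = norm.sf(threshold, loc=0.0, scale=sigma)   # ~0.023
p_new = norm.sf(threshold, loc=1.0, scale=sigma)   # mean shifted +1 degC: ~0.067

print(p_old, p_new, p_new / p_old)  # extremes ~3x more frequent, same variance
```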


We care about variance, though, not just because of its ability to increase the occurrence of historical hot extremes. If variance is increasing, this would mean more uncertainty faced each year by farmers and markets about what the growing season weather will be. Note that we are talking here strictly about weather variance. The variance in production can increase just from a shift in the distribution towards less favorable temperatures, as we showed in a recent study led by Dan Urban.
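One stylized way to see how a pure shift can raise production variance (a first-order approximation, not the derivation in that study): if yield is a nonlinear function of temperature, $y = f(T)$, then

$$ \operatorname{Var}(y) \;\approx\; f'(\mu_T)^{2}\,\operatorname{Var}(T), $$

so warming that moves the mean $\mu_T$ onto a steeper, more damaging part of the yield response raises the variance of production even if the variance of temperature itself is unchanged.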

The question of whether variance per se has been changing (or is projected to change) has received much less attention than whether extremes are becoming more common. This is partly because changes in variance are harder to measure than shifts in means or increases in extreme events. But an interesting analysis by Donat and Alexander in GRL sheds some light directly on the variance question. They looked at the distribution of daily temperature anomalies for two 30-year time periods: 1951-1980 and 1981-2010. The figure below from their paper maps the change between the two time periods for three parameters of the distribution (mean, variance, and skewness), both for minimum (left) and maximum (right) temperature.
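A minimal sketch of that kind of two-period comparison (not the authors' code; the Series `anoms` of daily temperature anomalies indexed by date is a hypothetical stand-in for one grid cell):

```python
from scipy.stats import skew

periods = {"1951-1980": anoms.loc["1951":"1980"],
           "1981-2010": anoms.loc["1981":"2010"]}

for name, x in periods.items():
    x = x.dropna()
    print(name, round(x.mean(), 2), round(x.var(), 2), round(skew(x), 2))
```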


Two things seem new to me here. (Certainly the shifts in mean are not new, but it’s interesting to note that the shifts are about equal for minimum and maximum temperature.) First, the variance changes are mixed around the world, and not statistically significant in most places (areas significant at the 10% level are shown with hatching). The authors also say that the variance changes depend a lot on what criteria they use to exclude grid cells without enough data.

The second interesting thing is that the skewness has increased in most parts, much more uniformly than the changes in variance. Just to be clear, we are talking about skewness in the statistical sense, not in the way it is sometimes used to mean “distorted” or “biased”. An increase in skewness means that the distribution is now less left-skewed or more right-skewed than before, which would mean that for a given average, there is a higher chance of having warm anomalies and a lower chance of having cool anomalies (see bottom panel of IPCC figure above).

It’s hard to know what exactly is driving the skewness, but I suspect their paper will spur some more focus on this issue. Maybe it has to do with the shift in rainfall distribution toward heavier events, with less rainfall during moderate events. For now it seems safe to say that temperatures are not clearly becoming more variable for most parts of the world, but they seem to be slightly more skewed toward hotness.