One function of this blog, other than to raise the level of guilt in Sol’s life (we are still waiting for his March post -- from 2013), is to help us work through ideas that are possibly wrong, possibly unoriginal, or very likely both at the same time. So here’s one idea I’d welcome feedback on. Let’s call it the “inconsistent farmer” problem.
In the early days of work on climate change and agriculture, the notion that modelers were being too pessimistic in how they treated farmers’ ability to adapt was captured by the phrase “dumb farmer.” Of course, the idea was not that farmers are actually dumb, but that modelers were treating them as such. Modelers would simulate a “reference” farmer without climate change, and then that same farmer, with the exact same practices and crops, but with climate change. The point of calling this a “dumb farmer” is that a real (i.e. smart) farmer would notice the change in climate and adjust. Obviously, a lot of work since has added simulations with hypothetical adjustments.
But let’s revisit the basic setup, in terms of the “reference” farmer. Generally speaking, these were meant to characterize the current crops and practices at the time of the study. But the farmer in question was being exposed to some future climate, say of 2050 or 2080. So even the reference farmer was in some sense “dumb” or “backwards,” in that 50 years had passed and they were still using the cropping systems of circa 2000.
All this is probably old hat for anyone who has read the literature. But what seems to go less noticed is that the impact models that then use the yield impacts derived from crop models are generally assuming some exogenous yield trend. For example, the recent AgMIP papers have some scenarios out to 2050, with the assumed yield increases summarized below in the table from Nelson et al.
So on the one hand the crop models assume current farmers, and on the other hand the economic impact models assume the more sophisticated future farmer. A farmer can’t be both things at the same time, so we have the “inconsistent farmer.”
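To make the accounting concrete, here is a minimal numeric sketch of the inconsistency. All of the numbers (base yield, trend rate, impact size) are hypothetical, chosen only to illustrate how the two model types get combined:

```python
# Hypothetical illustration of the "inconsistent farmer" accounting.
# The crop model estimates a percent climate impact on a circa-2000
# cropping system, while the economic model applies that impact on top
# of an exogenous yield trend out to 2050.

base_yield_2000 = 3.0      # t/ha, the "reference" farmer today (assumed)
exogenous_growth = 0.01    # 1%/yr exogenous yield trend (assumed)
years = 50                 # horizon out to ~2050

# Crop model: impact of 2050 climate on the year-2000 system
crop_model_impact = -0.15  # -15%, hypothetical

# Economic model: trend-boosted 2050 yield, then the crop-model
# impact applied to it
yield_2050_no_cc = base_yield_2000 * (1 + exogenous_growth) ** years
yield_2050_with_cc = yield_2050_no_cc * (1 + crop_model_impact)

print(round(yield_2050_no_cc, 2), round(yield_2050_with_cc, 2))  # 4.93 4.19
```

The hidden assumption in the last step is that a percent impact derived for the year-2000 cultivar and practices transfers unchanged to the very different 2050 farmer whose yields the trend describes.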
Now why would anyone possibly care about this? I’d say for two big reasons. The first is that future technologies could have a very different sensitivity to climate than current ones. This is the idea behind previous posts here and here and here, so I won’t spend much time on it here. But there is some new evidence along these lines, such as the studies on soybean here and here that show modern cultivars are more sensitive to hot weather, like what we saw for wheat. For example, below is one plot for soybean from Rincker et al. showing genetic yield gains (comparing newer vs. older varieties) as a function of the favorability of the environment. The stronger yield trends in good conditions mean that the difference between good and bad growing conditions is bigger for the newer cultivars, at least in absolute terms.
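A toy example of that last point, with made-up yields rather than Rincker et al.’s actual numbers: if genetic gains are concentrated in favorable years, the newer cultivar’s absolute swing between good and bad years grows.

```python
# Hypothetical yields (t/ha) for an older and a newer cultivar.
# The newer one gains more in favorable environments, so its absolute
# good-minus-bad spread is larger.
old = {"bad_year": 2.0, "good_year": 3.0}
new = {"bad_year": 2.2, "good_year": 3.8}  # bigger gain in good years

spread_old = old["good_year"] - old["bad_year"]  # 1.0 t/ha
spread_new = new["good_year"] - new["bad_year"]  # ~1.6 t/ha
```

So a reference simulation built on the older cultivar would understate the absolute yield swings, and hence the climate sensitivity, of its future descendant.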
Second, and maybe more important, is that there is potentially a lot of double counting going on when people examine adaptation. Or put differently, there’s a lot of overlap between the types of things that explain “exogenous” yield trends, and the types of things that crop modelers use as “adaptation” in their models. For example, I was recently in Eastern India looking at various strategies to get wheat sown earlier. When you ask farmers what the benefits of sowing early are, they generally tell you it's because wheat doesn't like the hot spring so yields are higher if you sow earlier. If you ask them whether the spring weather has been changing, they generally say it's getting warmer. But if you ask them if they are doing anything different because of that warming, they generally say no, they just get lower yields. They don't view the earlier sowing as a benefit specific to climate change. It's a change that would help them anyway.
This idea of double counting is similar to the notion of adaptation illusions that I wrote about earlier. But it depends on the degree to which these “adaptive” measures are already part of the baseline “exogenous” yield trends. To get at that, it’s important to really understand not just the types of things being considered as adaptations but also the source of recent yield growth and the likely drivers of future yield growth. And if the latter are going to be a big part of the “exogenous” trend, they should probably be out of bounds for modelers to incorporate as adaptations.
I realize that a lot of this seems like semantic detail. But I don’t think it is. My sense is that there are real risks of understating the climate change impacts: either because we are specifying a reference scenario that uses cropping systems less sensitive to climate than their future descendants will be, or because we are allowing technologies to be called on to reduce impacts (i.e. adapt) when in fact they would already have been deployed in the “exogenous” reference. I suppose the first factor could also cut the other way, in which case we would be overstating impacts because future crops will be less sensitive (e.g. more widely irrigated).
For the double counting, you can look at the types of adaptations that modelers employ and simply ask whether you really think these aren’t part of what will drive the “exogenous” yield trend. Drought tolerance, shifting sowing dates, more irrigation and fertilizer – these are all things that have been important sources of recent yield growth and will continue to play a role in future trends. Below is a quick schematic to try to explain this point. The reference farmer is generally assumed to continue on a trajectory of yield growth, shown here as linear to keep it simple (green line). Climate change can then affect this trajectory, and often impacts are calculated both without and with adaptation (red and blue lines). But if one lists the types of things that are implicit in the “exogenous” trend, and then the things generally invoked as adaptations, there is a lot of overlap. These are good things, but in scoping out the prospects for future supply, we shouldn’t count them twice.
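The double-counting ledger can be sketched with a few hypothetical numbers. Suppose earlier sowing is worth 400 kg/ha, and it is already one of the practices implicitly driving the “exogenous” baseline trend:

```python
# Toy accounting of the double-counting risk (all numbers hypothetical).
baseline_2050 = 4900        # kg/ha, trend already includes earlier-sowing gains
climate_impact = -800       # kg/ha, impact with no adaptation
early_sowing_benefit = 400  # kg/ha, credited again as "adaptation"

# If earlier sowing is treated as adaptation even though the baseline
# trend already counted it, its benefit enters the ledger twice:
with_adaptation = baseline_2050 + climate_impact + early_sowing_benefit

# A consistent accounting strips it from the baseline before crediting
# it as adaptation, so it nets out to a single appearance:
consistent = (baseline_2050 - early_sowing_benefit) + climate_impact + early_sowing_benefit

print(with_adaptation, consistent)  # 4500 4100
```

The 400 kg/ha gap between the two answers is pure double counting: the same practice showing up once in the trend and once as adaptation.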