
Wildfires in the American West can make for apocalyptic images, but they’re also routine, as dry-season heat can turn large swaths of forest into fires waiting to happen. One lightning strike – or one careless human – can start a fire that spans tens of thousands of acres.
Several factors contribute to the size of these wildfires. We’ve made a concerted effort to extinguish them as quickly as possible – a well-meaning and sometimes necessary policy for protecting ever-expanding human communities. But in many places, putting out the fires has disrupted a natural process of forest management. Because suppression lets small bits of fuel accumulate on the forest floor for longer, fires are less frequent but much more intense when they do occur.
The climate also plays a role. Variability from year to year causes some summers to be noticeably drier and hotter than others. And then there is climate change. What can we say about its influence on fires in the West?
Previous research has shown that wildfires are already getting worse due to global warming. In a new study, John Abatzoglou of the University of Idaho and Park Williams of the Lamont-Doherty Earth Observatory take this analysis a step further by attempting to quantify that impact.
To do that, the researchers used a number of metrics that track drought and the dryness of combustible forest material (fuel aridity), along with records of the forest area that actually burned each year. Those aridity metrics correlate very strongly with the area burned by wildfires, tracking its ups and downs from one year to the next. So if the aridity metrics can be tied to climate change, it should be possible to measure climate change’s contribution to the fires.
The human-caused trend was calculated from the smoothed average of a large ensemble of climate model simulations, focusing on changes in temperature and potential evapotranspiration. Each aridity metric was then calculated from that model average, allowing an apples-to-apples comparison. The difference between the smooth model trend and the wobbly observations represents the contribution of natural climate variability.
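To make that decomposition concrete, here is a minimal sketch of the trend-versus-variability split using made-up numbers rather than the study’s data; the array names (observed_aridity, anthropogenic_trend) and the toy aridity series are purely illustrative, not the authors’ code.

```python
# Illustrative sketch only: toy data stand in for observed fuel aridity and
# the smoothed climate-model ensemble mean (the human-caused trend).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2016)

anthropogenic_trend = 0.02 * (years - years[0])         # smooth model-mean signal (hypothetical)
natural_wobble = rng.normal(0.0, 0.3, years.size)       # year-to-year variability (hypothetical)
observed_aridity = anthropogenic_trend + natural_wobble

# Natural variability is whatever remains after removing the model-mean trend.
residual = observed_aridity - anthropogenic_trend

# Compare linear trends over the same period to estimate the share of the
# aridity increase attributable to climate change (meaningless for toy data).
obs_slope = np.polyfit(years, observed_aridity, 1)[0]
anthro_slope = np.polyfit(years, anthropogenic_trend, 1)[0]
print(f"Anthropogenic share of the aridity trend: {anthro_slope / obs_slope:.0%}")
```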
Fuel aridity has increased over the last century – and especially over the last 35 years. About 55 percent of the increase since 1979 is due to climate change, with the rest the result of natural variability such as the slow climate seesaw called the Pacific Decadal Oscillation. Climate change alone lengthened the fire season by about two weeks over that period. By the 2000s, the forest area at high fire risk was about 75 percent larger than it would have been without global warming.
As for the areas that ended up burning, the researchers estimated how much additional area burned because of climate change between 1984 and 2015. It works out to an area the size of Massachusetts and Connecticut combined – roughly doubling the total that would have burned due to natural weather variability alone.
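The same comparison can be pushed through to burned area. Below is a rough sketch, assuming (as the study’s framing suggests) that annual burned area rises roughly exponentially with fuel aridity; the burned_area function, its coefficients, and the aridity series are hypothetical stand-ins, not the paper’s fitted model.

```python
# Illustrative sketch: total burned area with and without the human-caused
# part of fuel aridity; all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1984, 2016)

anthro = 0.02 * (years - 1979)                 # hypothetical human-caused aridity trend
natural = rng.normal(0.0, 0.3, years.size)     # hypothetical natural variability
observed = anthro + natural

def burned_area(aridity, base_km2=2000.0, sensitivity=1.5):
    """Toy log-linear fire model: area burned grows exponentially with aridity."""
    return base_km2 * np.exp(sensitivity * aridity)

with_warming = burned_area(observed).sum()
without_warming = burned_area(observed - anthro).sum()   # remove the anthropogenic part

extra = with_warming - without_warming
print(f"Extra area attributed to climate change: {extra:,.0f} km^2 "
      f"({extra / without_warming:.1f}x the natural-variability total)")
```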
So climate change hasn’t merely “played a role” – it has been a major driver.
Like all estimates, this one is based on some simplifying assumptions. It focuses on human-induced changes in average temperature, disregarding possible influences on weather variability, precipitation and wind. It doesn’t take into account the effects of climate change on, say, the mountain snow that melts in summer, or the bark beetle population that has left vast areas of dead trees standing. And the warming could even lead to an increase in the number of lightning strikes that can start fires. So it could be that climate change is responsible for a slightly smaller proportion of forest fires – or an even larger proportion.
There are several things we can do to limit wildfires in the American West, but halting climate change is clearly one of them. The researchers write, “We expect anthropogenic climate change and associated increases in fuel aridity to impose an increasingly dominant and detectable effect on western US forest fire area in the coming decades while fuels remain abundant.”
PNAS, 2016. DOI: 10.1073/pnas.1607171113 (About DOIs).