The vicious 2012 wildfire season now unfolding in the interior West is hardly a surprise. Much of the region has been in a drought for more than a decade. This winter’s snowpack was sparse, particularly in Colorado, and it melted and ran off early. Temperatures have been high, and humidity has been low—making fuels from grasses to trees very dry and flammable.
All of that means conditions ripe for fires, which have come with a vengeance. New Mexico has had its biggest fire ever, and Colorado has seen the most destructive fires in its history, with the Waldo Canyon Fire near Colorado Springs and the High Park Fire near Fort Collins destroying more than 600 homes combined. Even as firefighters have brought the big Colorado fires near containment, other large blazes have broken out in Wyoming, Utah, and Montana.
Yes, it’s looking like a big fire year. And yes, this is part of the new normal. It’s pretty much exactly what climate experts have been predicting and what the data have been telegraphing for some time. While there are various proposals on the table to deal with increasingly destructive wildfires, they are likely to continue and become worse unless we tackle climate change.
More severe wildfires, right on schedule
Numerous studies in recent years have predicted that higher temperatures and drought conditions brought on by climate change will accelerate wildland fire activity in the West.
In 2004 U.S. Forest Service researchers studying past fires in the West constructed a model that predicted as much as a fivefold increase in burned areas by the end of the century.
Two years later a Scripps Institution of Oceanography study looked at the relatively recent spike in wildfire activity and determined it was due to changes in climate rather than forest management practices:
Robust statistical associations between wildfire and hydroclimate in western forests indicate that increased wildfire activity over recent decades reflects sub-regional responses to changes in climate. Historical wildfire observations exhibit an abrupt transition in the mid-1980s from a regime of infrequent large wildfires of short (average of 1 week) duration to one with much more frequent and longer burning (5 weeks) fires. This transition was marked by a shift toward unusually warm springs, longer summer dry seasons, drier vegetation (which provoked more and longer burning large wildfires), and longer fire seasons. Reduced winter precipitation and an early spring snowmelt played a role in this shift.
Three years ago, in a thorough report on the impacts of climate change across the country, the U.S. Global Change Research Program said that earlier melting of snow and drier soils and plants had already increased fire activity in the West, and that the situation would grow worse:
Wildfires in the United States are already increasing due to warming. In the West, there has been a nearly fourfold increase in large wildfires in recent decades, with greater fire frequency, longer fire durations, and longer wildfire seasons. This increase is strongly associated with increased spring and summer temperatures and earlier spring snowmelt, which have caused drying of soils and vegetation.
It’s impossible to link any one particular fire or weather event to climate change. In the case of fires in the West, there are other factors as well: more people living in fire-prone areas in and near forests, and unnaturally crowded forests brought on in large part by decades of misguided efforts to battle and suppress nearly all fires.
But federal scientists and officials whose responsibilities include management of the vast national forest system in the West are increasingly saying flat out that there is an undeniable link between wildfires and climate change.
The Agriculture Department official who oversees the U.S. Forest Service, Under Secretary Harris Sherman, noted recently that 10 states have had record fires in the past decade. “The climate is changing,” Sherman told The Washington Post, “and these fires are a very strong indicator of that.”
“There’s enough data that show fires are very clearly linked to warming,” U.S. Geological Survey Research Ecologist Craig Allen recently told a symposium sponsored by the Aspen Center for Environmental Studies. “Fire season’s about two months longer than it used to be.”
But the longer season is just the beginning. Data compiled by the National Interagency Fire Center in Boise, Idaho, show that the fires are becoming more destructive—the amount of acreage burned has skyrocketed over the past few decades:
- During the four decades of the 1960s through the 1990s, the annual acreage burned by wildfire averaged 2,879,054 acres. Between 2000 and 2009 the average year saw 6,941,952 acres burn.
- Between 1960 and 1995 there were just five years in which the acreage burned exceeded 5 million acres. Between 1996 and 2011, 11 of the 16 years exceeded 5 million acres burned, including 8 of the past 10 years.
To date in 2012, fires have burned about 2.4 million acres, according to the National Interagency Fire Center. And the outlook for the rest of the summer and early fall is not rosy, the center reports. Much of the West—from northern Arizona and northern New Mexico to southern Montana, across Nevada, and into parts of California—will have above-normal fire potential through the remainder of July. From August to October large swaths of Wyoming, Montana, Idaho, Utah, Nevada, and California will have above-average fire potential due to drought, fuel conditions, and El Niño, the recurring climate pattern marked by warmer-than-average sea surface temperatures in the tropical Pacific.
Past policies are partly to blame
Some of the conditions driving wildfire activity today were set decades ago in forest and rangeland management policies that have since been discredited, including fire suppression and poor grazing and timber harvest activities.
Aggressive federal firefighting efforts began in 1910, when, in the space of just two days, a huge firestorm in Montana and Idaho burned some 3 million acres. During the 1920s and 1930s, the Forest Service adopted policies to control fires before they reached 10 acres in size and before 10 a.m. on the day after the fires began.
But beginning in the 1970s, the aggressive suppression of most fires came under more scrutiny, with the recognition that fires are a natural process that in some ecosystems are vital to healthy forests. Federal policies now encourage allowing fires to burn where appropriate—for example in wilderness areas where fires won’t threaten communities.
The great irony of the decades of aggressive fire suppression is that in many parts of the West it has made fires larger and more destructive. For example, the mid-elevation ponderosa pine forests that cover much of the West historically had frequent fires of low intensity that removed undergrowth but spared larger, thick-barked trees. Firefighting efforts prevented those small cleansing fires and allowed forests to become overcrowded with small trees and underbrush that now fuel larger, more catastrophic fires.
Fighting climate change should be part of the solution
Various federal initiatives since the 1990s have sought to address the questions surrounding forest fuel loads and how to better manage them to moderate the wildfire threat, either by reintroducing fire, by thinning crowded forest stands using logging tools, or a combination of both methods. The results are questionable at best.
A recent Congressional Research Service paper on wildfire protection summed up the science on whether such interventions succeed in reducing wildfire extent and severity:
The presumption is that lower fuel loads and a lack of fuel ladders [underbrush and small trees that carry fire into the tops of larger trees] will reduce the extent of wildfires, the damages they cause, and the cost of controlling them. Numerous on-the-ground examples support this belief. However, little empirical research has documented this presumption. As noted in one research study, “scant information exists on fuel treatment efficacy for reducing wildfire severity.”
Despite that research ambiguity, fire years such as the current one almost always spur calls for large-scale efforts to thin overgrown forests and return them to a more natural condition, particularly in what is called the “wildland-urban interface.” That awkward phrase is sometimes defined as “where combustible homes meet combustible vegetation.”
Sherman, speaking to the recent Aspen conference, said, “We need to move forward with landscape-scale restoration. Too often we have conservation projects where we’re working on a hundred acres here or a hundred acres there. We need to move into an entirely new and expanded scope of work.”
That demand for larger restoration is partly driven by the extraordinary costs of fighting fires. Between fiscal year 2000 and fiscal year 2010, fire suppression appropriations by Congress rose from less than $300 million to nearly $1.4 billion, according to a 2011 Congressional Research Service paper on federal funding of wildfire activities. At the same time federal spending on fuel reduction rose from $117 million in fiscal 2000 to $400 million the next year and has largely remained in the $400-million-to-$500-million range since.
The cost of an ambitious forest restoration effort would be huge. In a 1999 report the U.S. General Accounting Office (now the Government Accountability Office) estimated it would cost $12 billion to treat the 39 million acres of Forest Service land then thought to be at high risk of catastrophic wildfire. Since then the Forest Service has raised its estimate to 51 million acres, and the assumed treatment cost of $300 per acre has probably become obsolete. Further, the original estimate did not include other federal lands beyond Forest Service areas.
As the Congressional Research Service paper on wildfire protection noted, “If a comprehensive program were undertaken to reduce fuels on all high-risk and moderate-risk federal lands, using GAO’s treatment cost rate of $300 per acre, the total cost would come to $69 billion.”
And that’s just for initial treatment. It doesn’t include repeat treatments that would be necessary in ecosystems where fire naturally returns on 5-year to 35-year cycles.
“There is a final, significant question,” the CRS report said. “Would it work?”
The CRS’s conclusion:
Reducing fuel loads might reduce acreage burned and the severity and damages of the wildfires that occur. Research is needed … to examine whether the cost of fuel reduction is justified by the lower fire risk and damage. However, it should also be recognized that … as long as there is biomass for burning, especially under severe weather conditions (drought and high wind), catastrophic wildfires will occasionally occur, with the attendant damages to resources, destruction of nearby homes, other economic and social impacts, and potential loss of life.
In a warming world we can expect those things will happen more often and with greater intensity, as we are seeing this summer. The bottom line is that climate change is a major cause of these fires, and climate solutions should become part of the effort to tame them.
Tom Kenworthy is a Senior Fellow at the Center for American Progress.