Why California burns – its forests have too many trees

NOT A LOT OF PEOPLE KNOW THAT

By Paul Homewood

Another forestry expert reminds us of the real reason why wildfires are getting worse in California:


The reason wildfires are burning California with unprecedented ferocity this year is because our public forests are so thick. It is our fault. We don’t manage our forests, we just let them grow. That is the simple truth. However, it is easier to deny the truth and blame a warming climate instead of admitting our guilt and taking action to prevent wildfires.

Hot, dry weather doesn’t cause catastrophic wildfires. It only makes them worse. In order for any fire to burn, it must have fuel. To spread wildly, it must have abundant fuel. Efforts in the 20th century to prevent fire and preserve forests have been too successful — they have disrupted the ecological balance and allowed more and more trees to grow.

Some California forests have more than 1,000 trees per…



Global Wildfire Area Has Declined, Contrary To Popular Myth

NOT A LOT OF PEOPLE KNOW THAT

By Paul Homewood

Another thorough assessment of wildfire trends wrecks alarmist claims:


ABSTRACT

Wildfire has been an important process affecting the Earth’s surface and atmosphere for over 350 million years and human societies have coexisted with fire since their emergence. Yet many consider wildfire as an accelerating problem, with widely held perceptions both in the media and scientific papers of increasing fire occurrence, severity and resulting losses. However, important exceptions aside, the quantitative evidence available does not support these perceived overall trends. Instead, global area burned appears to have overall declined over past decades, and there is increasing evidence that there is less fire in the global landscape today than centuries ago. Regarding fire severity, limited data are available. For the western USA, they indicate little change overall, and also that area burned at high severity has overall declined compared to pre-European settlement. Direct fatalities from fire and economic losses…


California acknowledges government policy failures in wildfires – not nebulous “climate change” excuses

Reblogged from Watts Up With That:

Guest essay by Larry Hamlin

Articles in the New York Times, Los Angeles Times, and Orange County Register address Governor Newsom’s declaration of a state of emergency allowing the state to waive environmental extremist laws and regulations, a step needed so that Cal Fire can proceed with clearing dead trees, removing excessive undergrowth, thinning out excessive tree growth and crowding, using prescribed fire, and other actions to improve forest health and decrease California wildfire risks.


California Governor Brown erroneously claimed that scientifically unsupported and nebulous “climate change” was driving the state’s wildfires, while ignoring decades of pleas from forestry and firefighting professionals to address the failed government and regulatory policies that allowed the buildup of excessive fuel and led to more intense and dangerous wildfires, as addressed in a WUWT article as follows:

“An October 18, 2015, L.A. Times article quoted wildfire experts unsupportive of Brown’s position, noting that:

“Today’s forest fires are indeed larger than those of the past, said National Park Service climate change scientist Patrick Gonzalez.

At a symposium sponsored by Brown’s administration, Gonzalez presented research attributing that trend to policies of fighting the fires, which create thick under layers of growth, rather than allowing them to burn.

“We are living right now with a legacy of unnatural fire suppression of approximately a century,” Gonzalez told attendees.”

This century-long policy of fire suppression and its impact on California wildfires is further reflected in a 2015 University of California, Berkeley study, which noted:

“National parks and other protected areas clearly provide an important function in removing carbon from the atmosphere and storing it,” said Battles. “But we also know from previous research that a century of fire suppression has contributed to a potentially unsustainable buildup of vegetation. This buildup provides abundant fuel for fires that contribute to carbon emissions.”

Before Governor Newsom’s state of emergency action, the state ignored the comprehensive Little Hoover Commission report of February 2017, which laid out in detail the failures of the state’s forest management and environmental policies that were found to be driving the increased wildfire intensity and damage.


Governor Newsom’s state of emergency declaration action is in accordance with recommendations contained in the most recent wildfire study by Cal Fire.


Governor Brown’s flawed and politically contrived claim of “climate change” driven wildfires was never supported by scientific data, which show that California has a centuries-long history of extensive and severe droughts and that recent droughts are in fact less severe than those the state has experienced in the past.


The Los Angeles Times article noted that some “experts” disagreed with the need for Governor Newsom to take these actions, but the Governor rebutted these challenges as follows:

“At the news conference, Newsom acknowledged the criticism, and rebutted it.

“Some people, you know, want to maintain our processes and they want to maintain our rules and protocols,” the governor said. “But I’m going to push back on that. Some of these projects quite literally, not figuratively, could take two years to get done, or we could get them done in the next two months. That’s our choice.”

Finally, California has begun to address reality in dealing with the state’s wildfire debacle by acknowledging its role in building this huge problem instead of continuing to make phony “climate change” excuses for these wildfires.

This action should never have taken so long to occur but instead should have been initiated many years ago.

Solar energy may have caused California’s wildfires

Reblogged from Watts Up With That:

From The Washington Examiner

by Kevin Mooney | March 04, 2019 12:59 PM


Taxpayer-subsidized, ratepayer-funded utilities that may be on the hook for billions of dollars in liabilities point to climate change as the major factor behind the recent California wildfires. PG&E CEO Geisha Williams has argued that dry, arid conditions associated with global warming were to blame for the wildfires that devastated parts of northern California in 2018. Edison International CEO Pedro Pizarro has said much the same with regard to the wildfires of 2017 that ignited in the southern part of the state.

But what if the blame belongs not with climate change, but with climate change policies that the utilities and their benefactors in government favor? There’s some evidence for this that insurance companies and displaced California residents might be interested in learning more about. As taxpayers and utility ratepayers, they are all spending part of their workday financing solar energy schemes that may have led to high-pressure conditions affecting electrical equipment, which in turn sparked the fires. How’s that?

Let’s look at just one example going back to December 2017, when wildfires devastated portions of Ventura and Santa Barbara counties. At the time, what became known as the Thomas Fire was the largest wildfire in California’s history. The fire erupted on Dec. 4, 2017, in the Santa Paula Canyon area just south of Thomas Aquinas College a little before 6:30 p.m., according to reports from the Ventura County Fire Department.

The latest figures show the Thomas Fire burned more than 280,000 acres before it was finally contained on Jan. 12, 2018. The fire destroyed more than 1,000 structures including hundreds of homes.

But there was also a second, related fire that broke out in Ojai, a small city in Ventura County, located a little northwest of Los Angeles. This one hasn’t received as much attention in the national press, but it could be the key to unraveling what’s really going down with California’s misguided, big government policies. That fire broke out about an hour later, after a transformer reportedly exploded in a residential area on Koenigstein Road. There are local witnesses who say they saw the flash of the explosion on the pole with the transformer, and others who say they heard the explosion.

Homeowners who have filed lawsuits against Southern California Edison in connection with the Thomas Fire argue that the utility, which is a subsidiary of Edison International, was negligent in terms of how it maintained the power lines. One of the lawsuits filed on behalf of an Ojai couple specifically addresses the explosion of the Edison transformer on Koenigstein Road, which was mounted on a pole. The couple lost their home in the fire.

Let’s take a hard look at the facts.

The transformer exploded around 7 p.m. at the end of a sunny day. At that time of day, because of the solar energy mandates implemented under former Gov. Jerry Brown, a Democrat, California’s power grid must ramp up conventional energy as the sun goes down. This cannot be done incrementally and gradually. Instead, California’s power grid experiences what is known as a “duck curve” as solar energy drops off and conventional energy ramps up.
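To make the “duck curve” concrete, here is a minimal sketch of how grid net load is computed; the hourly figures are hypothetical illustrations of the curve’s shape, not actual CAISO data:

```python
# Illustrative "duck curve": net load is total demand minus solar generation.
# All numbers below are hypothetical, chosen only to show the shape of the
# curve, not actual CAISO data.

demand_gw = [22, 21, 20, 20, 21, 23, 26, 28, 29, 29, 29, 29,
             29, 29, 29, 30, 31, 33, 35, 36, 35, 32, 28, 24]   # by hour 0-23
solar_gw  = [0, 0, 0, 0, 0, 0, 1, 4, 8, 11, 13, 14,
             14, 13, 11, 8, 4, 1, 0, 0, 0, 0, 0, 0]

net_load = [d - s for d, s in zip(demand_gw, solar_gw)]

# The evening ramp is what conventional generation must supply as solar
# output falls off while demand is still rising toward its evening peak.
ramps = [net_load[h + 1] - net_load[h] for h in range(23)]
worst = max(range(23), key=lambda h: ramps[h])
print(f"Steepest hourly ramp: +{ramps[worst]} GW, hour {worst} -> {worst + 1}")
```

The steep late-afternoon jump in net load is the “neck” of the duck that conventional generators must climb every evening.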

So, the key questions are: Did solar power cause the Thomas Fire? Did it cause other wildfires?

There’s no denying the pressure that was put on the Edison transformer, and for that matter on other transformers throughout the state. Certainly, correlation is not necessarily causation. But utilities and fire departments must have data associated with damaged transformers and other electrical equipment that could be insightful. The Thomas Fire is a good starting point for an investigation, but it is just one part of a larger story.

Keep in mind that Southern California Edison is also the subject of litigation filed in response to wildfires that broke out in 2018. So far, 170 homeowners and business owners who suffered damage in connection with the Woolsey Fire that broke out in November 2018 have filed suit in Los Angeles and Ventura Counties claiming the utility’s electrical equipment was responsible for the fire. Meanwhile, PG&E has announced that it will file for bankruptcy since it is now drowning under “at least $7 billion in claims from the Camp Fire,” according to news reports. The California Department of Forestry and Fire Protection has also blamed PG&E for some of the 2017 wildfires.

Without intervention from lawmakers and regulators, PG&E appears to be doomed. The Camp Fire is now on record as the deadliest and most destructive fire in state history. Williams, the CEO, has announced that she is stepping down.

Read the Full Story Here.

Kevin Mooney (@KevinMooneyDC) is a contributor to the Washington Examiner’s Beltway Confidential blog. He is an investigative reporter in Washington, D.C. who writes for several national publications.

California’s Wildfire History – in one map

Reblogged from Watts Up With That:

Here is an interesting interactive graphic that depicts perimeters of more than 100 years of California wildfires recorded by Cal Fire and the U.S. Geological Survey. The map below shows all the cumulative fires from 1878 to 2018. It seems as if there is very little of California that has not been touched by wildfire. Large areas of desert in the southeast are mostly untouched due to lack of vegetation.


From the About page:

This map shows the perimeters of wildfires that have burned in California from 1878 to 2018 using data from the California Department of Forestry and Fire Protection and the U.S. Geological Survey. The wildfires are categorized by the year in which they started. Perimeter information from fires that started between 1878 and 2017 comes from Cal Fire, while information on the Thomas Fire and fires that started in 2018 comes from the USGS.

Cal Fire says that their dataset — which runs from 1878 to 2017 as of January 2019 — is the most complete dataset of California wildfire perimeters before 1950. However, the pre-1950 information shown here is incomplete and should not be used for further analysis.

Cal Fire’s data on this map shows timber fires that burned more than 10 acres, brush fires that burned more than 50 acres and grass fires that burned more than 300 acres. The USGS data comes from the Bureau of Land Management and the U.S. Forest Service, which have lower acreage requirements for recording fire perimeters. Because of that, Cal Fire’s data is less comprehensive than the data of their federal partners, which was used for the 2018 fires shown on this map.

This map shows the perimeters of Cal Fire and the U.S. Geological Survey’s recorded wildfires, but it should be noted that not everything within a wildfire perimeter has burned. This means that the areas shown here do not necessarily represent burned areas.

CapRadio changed the names of two fires from the names reported by Cal Fire. Cal Fire’s names for the fires included a racial slur, so we have edited the word in accordance with Associated Press guidelines and our own standards.


Here is the interactive link:

http://projects.capradio.org/california-fire-history/#6/38.58/-121.49
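To see how the type-specific acreage cutoffs described in the About page shape the historical record, here is a minimal sketch of that inclusion rule; the record structure and field names are assumptions for illustration, not the actual Cal Fire schema:

```python
# Sketch of the inclusion rule described above: Cal Fire's historical dataset
# keeps timber fires over 10 acres, brush fires over 50 acres, and grass
# fires over 300 acres. Field names here are illustrative assumptions.

MIN_ACRES = {"timber": 10, "brush": 50, "grass": 300}

def meets_calfire_threshold(fire: dict) -> bool:
    """Return True if a fire record is large enough to appear in the dataset."""
    cutoff = MIN_ACRES.get(fire["fuel_type"])
    return cutoff is not None and fire["acres"] > cutoff

fires = [
    {"name": "Example A", "fuel_type": "grass",  "acres": 250},   # excluded
    {"name": "Example B", "fuel_type": "timber", "acres": 120},   # included
]
print([f["name"] for f in fires if meets_calfire_threshold(f)])
```

The federal (BLM and Forest Service) data uses lower cutoffs, which is why the 2018 perimeters drawn from the USGS are more comprehensive than Cal Fire’s earlier records.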

Inverse Hockey-Stick: climate-related death risk for individuals down 99% since 1920

Reblogged from Watts Up With That:

Bjørn Lomborg writes on Facebook about some new and surprising data that turn climate alarmist claims upside down.

Fewer and fewer people die from climate-related natural disasters.

This is clearly opposite of what you normally hear, but that is because we’re often just being told of one disaster after another – telling us how *many* events are happening. The number of reported events is increasing, but that is mainly due to better reporting, lower thresholds and better accessibility (the CNN effect). For instance, for Denmark, the database only shows events starting from 1976.

Instead, look at the number of dead per year, which is much harder to fudge. Given that these numbers fluctuate enormously from year to year (especially in the past, with huge droughts and floods in China), they are here presented as averages of each decade (1920-29, 1930-39 etc, with last decade as 2010-18). The data is from the most respected global database, the International Disaster Database. There is some uncertainty about complete reporting from early decades, which is why this graph starts in 1920, and if anything this uncertainty means the graph *underestimates* the reduction in deaths.

Notice, this does *not* mean that there is no global warming or that possibly a climate signal could eventually lead to further deaths. Instead, it shows that our increased wealth and adaptive capacity has vastly outdone any negative impact from climate when it comes to human climate vulnerability.

Notice that the reduction in absolute deaths has happened while the global population has increased four-fold. The individual risk of dying from climate-related disasters has declined by 98.9%. Last year, fewer people died in climate disasters than at any point in the last three decades (1986 was a similarly fortunate year).

Somewhat surprisingly, while climate-related deaths have been declining strongly for 70 years, non-climate deaths have not seen a similar decline, and should probably get more of our attention.

If we look at the death risk for an individual, seen below, the risk reduction is even bigger: it has dropped almost 99% since the 1920s.
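The per-capita arithmetic is straightforward: average deaths per year over a decade, divided by world population. A minimal sketch with rounded, illustrative figures (the precise values come from the EM-DAT database and historical population estimates):

```python
# Sketch of the individual-risk calculation: decadal-average annual deaths
# divided by world population. The figures below are rounded illustrations
# of the kind of values Lomborg cites, not the exact EM-DAT numbers.

deaths_1920s = 485_000   # avg climate-related disaster deaths/yr, 1920-29 (illustrative)
pop_1920s    = 1.9e9     # world population, ~1925

deaths_2010s = 18_000    # avg deaths/yr, 2010-18 (illustrative)
pop_2010s    = 7.5e9     # world population, ~2015

risk_then = deaths_1920s / pop_1920s
risk_now  = deaths_2010s / pop_2010s

decline = 1 - risk_now / risk_then
print(f"Individual risk decline: {decline:.1%}")   # roughly 99%
```

Because population quadrupled while absolute deaths fell, the per-person risk falls far faster than the raw death counts.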


Data Source: The International Disaster Database, http://emdat.be/emdat_db/

Geoscientists reconstruct ‘eye-opening’ 900-year Northeast climate record

Tallbloke's Talkshop

This supports the idea that temperature cycles of roughly 60 years are very likely a common feature of Earth’s climate.

Deploying a new technique for the first time in the region, geoscientists at the University of Massachusetts Amherst have reconstructed the longest and highest-resolution climate record for the Northeastern United States, which reveals previously undetected past temperature cycles and extends the record 900 years into the past, well beyond the previous early date of 1850, reports Phys.org.


UMass Amherst geoscientists reconstruct ‘eye-opening’ 900-year Northeast climate record

Reblogged from Watts Up With That:

Just a single study.  Just a single location.  Just a single technique.  Importance TBD~ctm

From EurekAlert!

Geoscientists at UMass Amherst have reconstructed the longest and highest-resolution climate record for the northeastern United States


Doctoral students Daniel Miller, in the water, with Helen Habicht and Benjamin Keisling, handle two recaptured sediment traps from an unusually deep lake in central Maine, where they collected 136 sediment samples spanning the 900-year time span to reconstruct the longest and highest-resolution climate record for the Northeastern United States to date. Credit: UMass Amherst

AMHERST, Mass. – Deploying a new technique for the first time in the region, geoscientists at the University of Massachusetts Amherst have reconstructed the longest and highest-resolution climate record for the Northeastern United States, which reveals previously undetected past temperature cycles and extends the record 900 years into the past, well beyond the previous early date of 1850.

First author Daniel Miller, with Helen Habicht and Benjamin Keisling, conducted this study as part of their doctoral programs with their advisors, geosciences professors Raymond Bradley and Isla Castañeda. As Miller explains, they used a relatively new quantitative method based on the presence of chemical compounds known as branched glycerol dialkyl glycerol tetraethers (branched GDGTs) found in lakes, soils, rivers and peat bogs around the world. The compounds can provide an independent terrestrial paleo-thermometer that accurately assesses past temperature variability.

Miller says, “This is the first effort using these compounds to reconstruct temperature in the Northeast, and the first one at this resolution.” He and colleagues collected a total of 136 samples spanning the 900-year period, many more than would be available with more traditional methods and from other locations, which typically yield just one sample per 30-100 years.

In their results, Miller says, “We see essentially cooling throughout most of the record until the 1900s, which matches other paleo-records for North America. We see the Medieval Warm Period in the early part and the Little Ice Age in the 1800s.” An unexpected observation was ten 50-to-60-year temperature cycles not seen before in records from the Northeast U.S., he adds, “a new finding and surprising. We’re trying to figure out what causes that. It may be caused by changes in the North Atlantic Oscillation or some other atmospheric patterns. We’ll be looking further into it.”
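Cycles like these are typically exposed with spectral analysis. Here is a minimal sketch of the idea using a periodogram on synthetic data; the actual Basin Pond series is not reproduced here, and the paper’s own analysis may differ:

```python
# Sketch of how multidecadal cycles can be detected in an annual temperature
# series: detrend, compute a periodogram, and look for power near 50-60-year
# periods. Synthetic data stands in for the actual Basin Pond reconstruction.
import numpy as np

t = np.arange(900)                              # ~900 annual samples
rng = np.random.default_rng(0)
series = (0.3 * np.sin(2 * np.pi * t / 55)      # embedded 55-year cycle
          - 0.0005 * t                          # slow cooling trend
          + rng.normal(0.0, 0.2, t.size))       # proxy noise

detrended = series - np.polyval(np.polyfit(t, series, 1), t)
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0)          # cycles per year

peak = 1 + np.argmax(power[1:])                 # skip the zero-frequency bin
print(f"Dominant period: {1 / freqs[peak]:.0f} years")   # ~56 here
```

A clear spectral peak near 1/55 cycles per year is what a 50-to-60-year oscillation looks like against the background noise.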

He adds, “We’re very excited about this. I think it’s a great story of how grad students who come up with a promising idea, if they have enough support from their advisors, can produce a study with really eye-opening results.” Details appear in a recent issue of the European Geophysical Union’s open-access online journal, Climate of the Past.

The authors point out that paleo-temperature reconstructions are essential for distinguishing human-made climate change from natural variability, but historical temperature records are not long enough to capture pre-human-impact variability. Further, conventional pollen- and land-based sediment samples used as climate proxies can reflect confounding parameters such as precipitation, humidity, evapo-transpiration and vegetation changes rather than temperature.

Therefore, additional quantitative paleo-temperature records are needed to accurately assess past temperature variability in the Northeast United States, the researchers point out. An independent terrestrial paleo-thermometer that relies on measuring two byproducts of processes carried out in branched GDGTs in lake sediment, a method first introduced two decades ago by researchers in The Netherlands, offered a promising alternative, Miller says.

Source organisms are not known for branched GDGTs, he points out, but they are thought to be produced in part by Acidobacteria. “These are compounds likely produced by different algae and bacteria communities in the membrane, or skin,” he notes. “Just like for humans, the skin regulates the organism’s body temperature and these compounds change in response to temperature. So if they grow in summer, they reflect that and the compounds are different than if they were produced in winter. We record the compounds to get the temperature curves. We found there seems to be a huge bloom of these organisms in the fall. After they die, they settle into the lake bottom. We think it’s mainly a fall temperature that we’re detecting.”
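In practice, the individual brGDGT compounds are measured as fractional abundances and combined into an index (the study’s abstract, below, refers to its MBT′5ME index), which a linear calibration then maps to temperature. A minimal sketch of that two-step computation, with illustrative abundances and placeholder calibration coefficients rather than the study’s values:

```python
# Sketch of a brGDGT paleothermometer: fractional abundances of the
# tetramethylated (I-series) and more-methylated (II/III-series) 5-methyl
# compounds are combined into the MBT'5ME index, then a linear calibration
# converts the index to temperature. Abundances and calibration coefficients
# here are illustrative placeholders, not the study's values.

def mbt5me(ab: dict) -> float:
    """MBT'5ME = (Ia + Ib + Ic) / (Ia + Ib + Ic + IIa + IIb + IIc + IIIa)."""
    tetra = ab["Ia"] + ab["Ib"] + ab["Ic"]
    more_methylated = ab["IIa"] + ab["IIb"] + ab["IIc"] + ab["IIIa"]
    return tetra / (tetra + more_methylated)

def index_to_temperature(index: float, slope=31.5, intercept=-8.6) -> float:
    """Linear calibration T = slope * index + intercept (placeholder values)."""
    return slope * index + intercept

sample = {"Ia": 0.40, "Ib": 0.10, "Ic": 0.02,
          "IIa": 0.30, "IIb": 0.10, "IIc": 0.03, "IIIa": 0.05}
idx = mbt5me(sample)
print(f"MBT'5ME = {idx:.2f}, reconstructed T = {index_to_temperature(idx):.1f} C")
```

Warmer conditions favor the less-methylated compounds, pushing the index (and the reconstructed temperature) higher; a down-core series of such indices yields the temperature curve.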

For this work, Miller and colleagues constructed large plastic sediment traps and deployed them about ten feet below the surface of a small, 106-foot-deep lake in central Maine in May 2014. They then dove down to collect a catchment bottle from the bottom of each trap monthly from June through September, and again in May 2015.

Miller says, “This lake is very deep for its small area, with very steep sides. It doesn’t seem to have much mixing of water layers by surface winds. We think that has helped to preserve a bottom water layer with no oxygen year-round, known as anoxia, which helps in the preservation of annual layers in the sediments at the bottom of the lake. It’s rare for a lake to have such fine, thin lines that represent annual deposition, so all you have to do is count the lines to count the years. We double-checked our results with radiocarbon dating and other methods, and it turns out that reconstructing the temperature record this way was successful.”

Miller and colleagues say this project enjoyed notable support from many quarters, including the UMass Amherst Alumni Association supporting student field work and data collection in Maine; the geology department at Bates College; funding from the U.S. Geological Survey; and at UMass Amherst, sophisticated biogeochemistry laboratory equipment and the Joe Hartshorn Memorial Award from the geosciences department, and other assistance from the Northeast Climate Adaptation Science Center.

The researchers conclude that this first paleo-temperature reconstruction coupled with site-specific knowledge from Basin Pond “informs our understanding of climatic variability in the Northeast U.S. beyond the era of human influence” and “contributes to our understanding of the production and fate of brGDGTs” in lake systems.

###

The paper is open access and can be found here.

Abstract

Paleotemperature reconstructions are essential for distinguishing anthropogenic climate change from natural variability. An emerging method in paleolimnology is the use of branched glycerol dialkyl glycerol tetraethers (brGDGTs) in sediments to reconstruct temperature, but their application is hindered by a limited understanding of their sources, seasonal production, and transport. Here, we report seasonally resolved measurements of brGDGT production in the water column, in catchment soils, and in a sediment core from Basin Pond, a small, deep inland lake in Maine, USA. We find similar brGDGT distributions in both water column and lake sediment samples but the catchment soils have distinct brGDGT distributions suggesting that (1) brGDGTs are produced within the lake and (2) this in situ production dominates the down-core sedimentary signal. Seasonally, depth-resolved measurements indicate that most brGDGT production occurs in late fall, and at intermediate depths (18–30 m) in the water column. We utilize these observations to help interpret a Basin Pond brGDGT-based temperature reconstruction spanning the past 900 years. This record exhibits trends similar to a pollen record from the same site and also to regional and global syntheses of terrestrial temperatures over the last millennium. However, the Basin Pond temperature record shows higher-frequency variability than has previously been captured by such an archive in the northeastern United States, potentially attributed to the North Atlantic Oscillation and volcanic or solar activity. This first brGDGT-based multi-centennial paleoreconstruction from this region contributes to our understanding of the production and fate of brGDGTs in lacustrine systems.

6 Conclusions

We find evidence for seasonally biased in situ production of branched glycerol dialkyl glycerol tetraethers (brGDGTs) in a lake in central Maine, NE US. BrGDGTs are mostly produced in September at Basin Pond, and their downward fluxes in the water column peak at 30 m in water depth. A down-core brGDGT-based reconstruction reveals both gradual and transient climate changes over the last 900 years and records cooling and warming events correlated with other Northern Hemisphere records and the NAO and AMO indices. This suggests inland Maine climate is sensitive to hemispheric climate forcing as well as changes in regional atmospheric pressure patterns and North Atlantic sea surface temperatures. Our new MBT′5ME temperature reconstruction, supported by a pollen record from the same site, reveals a prominent cooling trend from AD 1100 to 1900 in this area. Comparison with regional hydroclimate records suggests that despite increasingly cool and wet conditions persisting at Basin Pond over the last 900 years, fire activity has increased. Although recent fire activity is likely anthropogenically triggered (i.e., via land-use change), our results imply an independent relationship between climate and NE US fire occurrence over the study interval. Thus, the paleotemperature reconstruction presented here alongside site-specific knowledge from Basin Pond informs our understanding of climatic variability in the NE US beyond the era of human influence.

Wildfires USA – 80% Less Than The 1930s

sunshine hours

Bjorn Lomborg has been trying to quell the hysteria about forest fires in the USA.


As Lomborg writes on Facebook:

Some people have pointed out that the National Interagency Fire Center writes that “Prior to 1983, sources of these figures are not known, or cannot be confirmed, and were not derived from the current situation reporting process. As a result the figures prior to 1983 should not be compared to later data.”

This is convenient, since the NIFC for the longest time didn’t even want to acknowledge that there were data before 1960 (https://web.archive.org/web/20171206160413/https://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html). I’ve consistently pointed out that we had early data and where the data starting in 1926 comes from; it is the Wildfire Statistics from USDA, summarized in the official Historical Statistics of the United States – Colonial Times to 1970, p537: http://bit.ly/2hGp7XF.

So, we all know, very well, where this data is from.


What’s Natural? A Look at Wildfires

Reblogged from Watts Up With That:

Jim Steele writes: I am excited to announce that my local weekly paper, the Pacifica Tribune, has added me as a columnist. Every 2 weeks I will post my column “What’s Natural”. The publisher has 5 other papers in the SF Bay Area which might also carry the column. Publishing a more skeptical and scientific opinion deep in the heart of this blue state is a bold move that reveals a commitment to objectivity, and I am eager to see what kind of reaction it gets. Pacifica is just south of San Francisco. The next column will be a look at drought.

What’s Natural?

A Look at Wildfires

In early December I surveyed the horrific Camp Fire disaster in Paradise. Having been director for 25 years of a university field station located in the heart of the Tahoe National Forest, I’ve been a “student” of fire ecology for 30 years and wanted a closer look at why row after row of homes was completely incinerated while the surrounding trees were merely scorched, with leaves and needles browned but not burnt.


Large fires have recently ravaged about 1.8 million California acres a year, prompting media and politicians to proclaim a “new normal” that’s “evidence of global warming”. But UC Berkeley fire ecologists have calculated that before 1800, fires burned 4 million California acres each year (despite cooler temperatures). So what natural fire dynamics promote such extensive burning?

Wildfires have indeed increased since 1970, but that’s relative to previous decades of intensive fire prevention. As fire came to be recognized as a natural and necessary phenomenon for healthy ecosystems, a new era began. In the 70s the US Forest Service moved away from extinguishing all fires by 10 a.m. the day after detection, switching to a “let it burn” policy if human structures were not endangered.

Paradise, unfortunately, sprang up amidst a forest dominated by Ponderosa pines. Largely due to frequent lightning strikes and dry summers, Ponderosa habitat endures fires about every 11 years. Fortunately for California’s coastal residents, lightning is rare there. However, both regions are vulnerable to human ignitions, which start 85-95% of all fires. Recognizing this growing problem, a bipartisan bill was presented to Governor Brown two years ago to secure our power grid. Shockingly, he vetoed it. That was a bad choice, given that the Camp Fire, the Wine Country fires and many more were sparked by aging electrical infrastructure. Recent studies show larger fires result from a confluence of human ignitions and high winds. But it is not just random coincidence. The high winds that spread these massive fires also blow down the power lines that ignite them.

In 2008, Stephen Pyne, the world’s foremost expert on fire history, lamented that “global warming has furnished political cover to encourage certain fire management decisions while allowing climate to take the blame.” How true. Both PG&E and Governor Brown have blamed wildfires on climate change.

When you build a camp fire, you intuitively understand fire ecology basics. You do not hold a match to a log no matter how dry. You start a camp fire with kindling. Fire ecologists call forest kindling, like dead grass, leaves and small shrubs, “fine fuels”. In dry weather “fine fuels” become highly combustible in a matter of hours, or at most days, even during the winter. Furthermore, California’s summer climate is naturally dry for 3-4 months, creating highly combustible habitat each and every summer.

Additionally, camp fires only smolder without enough air, so we huff and puff to get a burst of flames. Likewise, high winds turn a spark into a major conflagration. It was strong winds that rapidly spread the Camp Fire. The fast-moving flames, feeding on “fine fuels” littering the forest floor, generated enough heat to ignite flammable homes that then burned from the inside out; but only enough heat to char the bark of most surrounding trees.

Miraculously spared buildings dotting a devastated landscape made the case for creating “defensible spaces” by managing the “fine fuels”. At one unscathed church I surveyed, the fire had clearly come within 100 feet, scorching the base of every encircling tree. But thanks to a parking lot and a well-manicured lawn, the lack of “fine fuels” stopped the fire in its tracks. Trees on the lawn were not even charred. The public would benefit greatly if wildfire news stories emphasized the need to create adequate defensible spaces.

With high deserts to the east and the ocean to the west, California’s winds shift with the seasons. Land temperatures always change faster than the ocean’s. In the summer, warmer land surfaces draw in moist sea breezes. The resulting fog moistens coastal landscapes and reduces fire danger there. Thus, any warming, whether natural or CO2 driven, should increase the fog.

In the autumn, the land cools faster than the ocean, causing the winds to reverse direction. The colder it gets, the stronger the winds blow from the high deserts towards the coast, peaking in December. These winds are called Santa Anas in southern California; the Wine Country fires were spread by the Diablo winds. But regardless of the name, the science is the same. Accordingly, it was November winds that fanned a spark into an inferno aimed directly at the heart of Paradise.

It has long been known that due to these autumn and winter winds, much of California endures a dangerous fire season year-round. On the optimistic side, any warming of the land during the cool seasons, whether natural or CO2 driven, should reduce these winds. Indeed, the natural drivers of wildfire are very complex, and maintaining a defensible space is our safest bet.

Jim Steele is author of “Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism”. Contact him at naturalclimatechange@earthlink.net