Category: Extreme (so to speak) Weather
July 2019 Was Not the Warmest on Record
Reblogged from Dr Roy Spencer.com:
August 2nd, 2019 by Roy W. Spencer, Ph.D.
July 2019 was probably the 4th warmest of the last 41 years. Global “reanalysis” datasets need to start being used for monitoring of global surface temperatures.
[NOTE: It turns out that the WMO, which announced July 2019 as a near-record, relies upon the ERA5 reanalysis which apparently departs substantially from the CFSv2 reanalysis, making my proposed reliance on only reanalysis data for surface temperature monitoring also subject to considerable uncertainty].
We are now seeing news reports (e.g. CNN, BBC, Reuters) that July 2019 was the hottest month on record for global average surface air temperatures.
One would think that the very best data would be used to make this assessment. After all, it comes from official government sources (such as NOAA, and the World Meteorological Organization [WMO]).
But current official pronouncements of global temperature records come from a fairly limited and error-prone array of thermometers which were never intended to measure global temperature trends. The global surface thermometer network has three major problems when it comes to getting global-average temperatures:
(1) The urban heat island (UHI) effect has caused a gradual warming of most land thermometer sites due to encroachment of buildings, parking lots, air conditioning units, vehicles, etc. These effects are localized, not indicative of most of the global land surface (which remains mostly rural), and not caused by increasing carbon dioxide in the atmosphere. Because UHI warming “looks like” global warming, it is difficult to remove from the data. In fact, NOAA’s efforts to make UHI-contaminated data look like rural data seem to have had the opposite effect. The best strategy would be to simply use only the best-sited (most rural) thermometers. This is currently not done.
(2) Ocean temperatures are notoriously uncertain due to changing temperature measurement technologies (canvas buckets thrown overboard to get a sea surface temperature sample long ago, ship engine water intake temperatures more recently, buoys, satellite measurements only since about 1983, etc.)
(3) Both land and ocean temperatures are notoriously incomplete geographically. How does one estimate temperatures in a 1 million square mile area where no measurements exist?
There’s a better way.
A more complete picture: Global Reanalysis datasets
(If you want to ignore my explanation of why reanalysis estimates of monthly global temperatures should be trusted over official government pronouncements, skip to the next section.)
Various weather forecast centers around the world have experts who take a wide variety of data from many sources and figure out which ones have information about the weather and which ones don’t.
But, how can they know the difference? Because good data produce good weather forecasts; bad data don’t.
The data sources include surface thermometers, buoys, and ships (as do the “official” global temperature calculations), but they also add in weather balloons, commercial aircraft data, and a wide variety of satellite data sources.
Why would one use non-surface data to get better surface temperature measurements? Since surface weather affects weather conditions higher in the atmosphere (and vice versa), one can get a better estimate of global average surface temperature if one also has satellite measurements of upper-air temperatures on a global basis, including in regions where no surface data exist. Knowing from satellite data whether a warm or cold airmass is present there is better than knowing nothing at all.
Furthermore, weather systems move. And this is the beauty of reanalysis datasets: because all of the various data sources have been thoroughly researched to see what mixture of them provides the best weather forecasts (including adjustments for possible instrumental biases and drifts over time), we know that the physical consistency of the various data inputs has also been optimized.
Part of this process is making forecasts to get “data” where no data exist. Because weather systems continuously move around the world, the equations of motion, thermodynamics, and moisture can be used to estimate temperatures where no data exist by doing a “physics extrapolation”: using data observed on one day in one area, then watching how those atmospheric characteristics are carried into an area with no data on the next day. This is how we knew there were going to be some exceedingly hot days in France recently: a hot Saharan air layer was forecast to move from the Sahara desert into western Europe.
This kind of physics-based extrapolation (which is what weather forecasting is) is much more realistic than (for example) using land surface temperatures in July around the Arctic Ocean to simply guess temperatures out over the cold ocean water and ice where summer temperatures seldom rise much above freezing. This is actually one of the questionable techniques used (by NASA GISS) to get temperature estimates where no data exists.
If you think the reanalysis technique sounds suspect, once again I point out it is used for your daily weather forecast. We like to make fun of how poor some weather forecasts can be, but the objective evidence is that forecasts out 2-3 days are pretty accurate, and continue to improve over time.
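To make the assimilation idea concrete, here is a minimal, hypothetical sketch of the kind of blend a reanalysis performs at each step: a model forecast and an observation are combined, each weighted by the inverse of its error variance. This is a one-variable illustration only, not the actual CFSv2 or ERA5 machinery, and all of the numbers are invented.

```python
# Toy illustration of the data-assimilation idea behind reanalysis datasets.
# A one-variable "optimal interpolation" blend; NOT the actual CFSv2/ERA5
# algorithm. All numbers below are invented for illustration.

def assimilate(forecast, forecast_var, observation, observation_var):
    """Blend a model forecast with an observation, each weighted by the
    inverse of its error variance (a scalar Kalman-style update)."""
    gain = forecast_var / (forecast_var + observation_var)
    analysis = forecast + gain * (observation - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Hypothetical case: the model carries yesterday's warm airmass into a
# data-sparse region (forecast anomaly 2.1 C, fairly uncertain), while a
# single satellite retrieval there suggests 1.5 C (somewhat less uncertain).
analysis, analysis_var = assimilate(forecast=2.1, forecast_var=0.8,
                                    observation=1.5, observation_var=0.4)
print(f"analysis anomaly: {analysis:.2f} C (error variance {analysis_var:.2f})")
```

The analysis lands between the forecast and the observation, closer to whichever is trusted more; repeated over millions of observations and model grid points, this is what yields a physically consistent reanalysis field.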
The Reanalysis picture for July 2019
The only reanalysis data I am aware of that is available in near real time to the public is from WeatherBell.com, and comes from NOAA’s Climate Forecast System Version 2 (CFSv2).
The plot of surface temperature departures from the 1981-2010 mean for July 2019 shows a global average warmth of just over 0.3 C (0.5 deg. F) above normal:

Note from that figure how distorted the news reporting was concerning the temporary hot spells in France, which the media reports said contributed to global-average warmth. Yes, it was unusually warm in France in July. But look at the cold in Eastern Europe and western Russia. Where was the reporting on that? How about the fact that the U.S. was, on average, below normal?
The CFSv2 reanalysis dataset goes back to only 1979, and from it we find that July 2019 was actually cooler than three other Julys: 2016, 2002, and 2017, and so was 4th warmest in 41 years. And being only 0.5 deg. F above average is not terribly alarming.
Our UAH lower tropospheric temperature measurements had July 2019 as the third warmest, behind 1998 and 2016, at +0.38 C above normal.
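For readers who want to see what the ranking step amounts to, here is a minimal sketch: compute each July's departure from the 1981-2010 baseline and sort. The anomaly values below are placeholders for illustration only, not the actual CFSv2 or UAH numbers.

```python
# Hypothetical sketch of ranking Julys by their global-mean departure from
# the 1981-2010 baseline. Values are placeholders, NOT CFSv2 or UAH data.

july_anomaly_c = {
    2016: 0.42,  # placeholder anomalies (deg C vs 1981-2010)
    2002: 0.38,
    2017: 0.35,
    2019: 0.31,
    1998: 0.30,
}

ranked = sorted(july_anomaly_c.items(), key=lambda kv: kv[1], reverse=True)
for rank, (year, anomaly) in enumerate(ranked, start=1):
    print(f"{rank}. July {year}: {anomaly:+.2f} C")
```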
Why don’t the people who track global temperatures use the reanalysis datasets?
The main limitation with the reanalysis datasets is that most only go back to 1979, and I believe at least one goes back to the 1950s. Since people who monitor global temperature trends want data as far back as possible (at least 1900 or before) they can legitimately say they want to construct their own datasets from the longest record of data: from surface thermometers.
But most warming has (arguably) occurred in the last 50 years, and if one is trying to tie global temperature to greenhouse gas emissions, the period since 1979 (the last 40+ years) seems sufficient since that is the period with the greatest greenhouse gas emissions and so when the most warming should be observed.
So, I suggest that the global reanalysis datasets be used to give a more accurate estimate of changes in global temperature for the purposes of monitoring warming trends over the last 40 years, and going forward in time. They are clearly the most physically-based datasets, having been optimized to produce the best weather forecasts, and are less prone to ad hoc fiddling with adjustments to get what the dataset provider thinks should be the answer, rather than letting the physics of the atmosphere decide.
Extreme
Dr. Judith Curry’s post at WUWT on preparing to testify before Congress.
Politics versus science in attributing extreme weather events to manmade global warming.
If you follow me on twitter, you may have noticed that I was scheduled to testify before the House Oversight and Reform Committee on June 12 [link]. The subject of the Hearing is Contending with Natural Disasters in the Wake of Climate Change.
Full Post HERE.
Conclusions
So where does all this leave us in the climate debate? There is very little in the way of extreme weather events that can convincingly be attributed to manmade global warming, even if you are assuming that all of the recent warming is manmade.
Global warming activists will continue to use extreme events as an argument against fossil fuels, even though there is little to no evidence to support this. Without this argument, there is very little left to…
Extremes
by Judith Curry
Politics versus science in attributing extreme weather events to manmade global warming.
Cooling Down the Hysteria About Global Warming
Reblogged from Watts Up With That:
Guest essay by Rich Enthoven
Recently, NASA released its annual report on global temperatures and reported that 2018 was the fourth hottest year on record, surpassed only by three recent years. This claim was accompanied by dire predictions of climate change and calls for immediate action to dramatically curtail CO2 emissions around the globe. Like every concerned citizen, I read this report with interest. I also read it as an informed and trained climate analyst – and I can tell that there are some serious problems with the report and its conclusions.
For starters, I can assure my readers that I am not a climate change “denier.” No one doubts the climate changed when it experienced the Ice Age that ended 12,000 years ago. I have read enough scientific literature to believe the well documented view that the planet experienced the Medieval Warm Period (950 – 1250 AD) and Little Ice Age (1550 – 1850 AD) when global temperatures changed materially. I have also read enough scientific literature to understand that solar and ocean cycles affect global climate.
NASA is now reporting significant changes to the global temperature. According to NASA (and others), the entire globe experienced a persistent warming trend in the early part of the 20th century (1911 – 1940). Then, this trend reversed, and the globe cooled until the 1970s.[1] Now, NASA is reporting that the global temperature increased 0.31 °C in the last 10 years and that this trend is different from the 0.31 °C increase NASA reports for the 1930s[2]. But a closer look at the data and methods used by NASA should make any reader skeptical of their results.
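The comparison behind that last claim is the simple anomaly arithmetic spelled out in footnote [2]; reproduced directly:

```python
# The arithmetic from footnote [2] (NASA GISS anomalies, deg C):
recent = 0.82 - 0.51          # 2018 anomaly minus 2008 anomaly
early = -0.03 - (-0.34)       # 1939 anomaly minus 1929 anomaly
print(round(recent, 2), round(early, 2))   # 0.31 and 0.31
```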
Land Temperatures
It turns out that, over long periods of time, it is actually quite difficult to measure climate-driven temperature changes consistently. The problems arise from changes in measurement technology (mercury bulbs then, semiconductors now) and changes in the sites surrounding the measurement locations. A good way to think about this problem is to consider Dallas Love Field Airport, where average temperatures have been reported monthly since 1940. During that time Love Field transformed from a tiny airport near a small city[3] to a large urban airport with 200 daily flights. These changes have generated massive heat at the airport. It is no wonder that the reported temperatures at Love Field have trended up by approximately 2.9 °F since 1940.[4]
But, when we look at the temperatures in Centerville, TX – much less affected by land use changes – we see the opposite trend. The average reported temperature in Centerville has been on a declining trend and now averages (on trend) 0.3 °F less than it was in 1940.[5]
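Footnotes [4] and [5] describe these figures as the author's least-squares trend analysis. Here is a minimal sketch of that kind of calculation, run on a synthetic annual series rather than the actual Love Field or Centerville records:

```python
# Minimal sketch of the least-squares trend analysis referred to in
# footnotes [4] and [5]. The series below is synthetic, NOT the actual
# Love Field or Centerville data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1940, 2019)                        # one value per year
temps = 65.0 + 0.02 * (years - 1940) + rng.normal(0, 1.0, years.size)  # deg F

slope, intercept = np.polyfit(years, temps, deg=1)   # least-squares line
total_change = slope * (years[-1] - years[0])
print(f"trend: {slope * 10:.2f} F per decade, "
      f"{total_change:.1f} F total over {years[0]}-{years[-1]}")
```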
As a result of this urban heat effect, scientists around the world have been identifying (or constructing) ‘pristine’ weather monitoring stations to get a clearer look at temperature changes. These stations are located in areas where urban development has not occurred and is not expected. These locations do not show any meaningful change in reported land temperatures. The best data comes from the National Oceanic and Atmospheric Administration (NOAA), which set up 114 rural temperature monitoring stations in the US in 2002 (USCRN). When we look at these, we see no persistent increase in US temperatures.[6] In fact, 2018 was 0.3 °F colder than the first two years measured. February and March 2019 combined to be the coldest two-month period (temperature anomaly) ever recorded by the USCRN.
[Chart: monthly temperature changes at USCRN stations]
And it is not just the US rural temperatures that are stable – all around the globe, temperature increases disappear once land use changes are excluded. Shown below are temperature graphs from rural areas in the Netherlands, Ireland, Chile, Antarctica, Japan[7], and China[8].
Further calling into question the global land temperature data used by NASA are climate scientists themselves. Seventeen leading climate scientists (including scientists at NOAA) recently co-authored a paper calling for a new network of global weather stations in which they lamented the “imperfect measurements and ubiquitous changes in measurement networks and techniques.”[9]
Even these efforts to measure temperature change may not be enough – even the ‘pristine’ USCRN temperature measurement locations continue to be biased towards warmer temperatures by land use changes. For example, a parking area and road were built next to the USCRN weather station[10] at the University of Rhode Island, leading to a 0.34 °C increase in measured temperatures at that location.[11][12]
Ocean and Satellite Temperature Measurement
The NASA global temperature estimate also relies heavily on estimates of temperatures in the ocean and the air above it. Ocean temperatures have been measured over the years with highly inconsistent methods (buckets off ships; water flowing through ship engine rooms; buoys; and lately, satellites). In addition to technology changes, there are short-term annual ocean cycles such as the well-publicized El Niño/La Niña and long-term (multi-decade) cycles such as the Pacific (and Atlantic) Decadal Oscillations, which affect ocean temperatures at many depths over decades. A recent report out of UC San Diego described the problem: “Determining changes in the average temperature of the entire world’s ocean has proven to be a nearly impossible task due to the distribution of different water masses.”[13]
Respected climate scientists are tackling the ocean measurement challenge and coming up with results very different from those in the NASA report. Satellite measurements from the University of Alabama show atmospheric temperatures over the ocean increasing since 1980 (the end of the last cooling period per NASA), but only at 0.13 °C per decade.[14] Both major satellite measurement groups report temperatures are lower now than they were in 1998, although by different amounts.[15] Harvard University oceanographer Carl Wunsch estimated the average temperature of the ocean grew by 0.02 degrees during 1994 – 2013.[16] The Scripps Institution of Oceanography recently estimated the ocean temperature growth at 0.1 °C total over the last 50 years. The science and history of measuring ocean temperatures is far from ‘settled’, and there are plenty of credible estimates that ocean temperatures are not changing rapidly or at anywhere near the rate that NASA is estimating.
Back to the NASA Temperature Estimate
To come up with their global temperature assessments, NASA faces all these problems and more. For starters, there is very little reliable global-scale land data before 1940, and there are still shortages of reliable data in many parts of the world (e.g., Africa, the Middle East). Most of the historical data has been affected by land use changes and measurement technology changes. As they have tried to deal with these problems, NASA has dramatically changed the locations and methods that they use to assess temperatures over the last several decades.[17] Some observers question whether the new locations and technologies have the same pattern as the old ones would have had.
Not only has NASA adjusted the locations it takes land measurements from, it also adjusts the data that goes into its estimates[18]. Here are examples from the NASA website for Darwin Airport, Australia, and Reykjavik, Iceland, that show the liberal data changes adopted by NASA.[19]
Readers should note several problematic elements of these graphs:
1) The unadjusted data does not indicate warming at these locations over the last 80 years.
2) The unadjusted data is shown in such a faint outline that it’s hard to see. Why would NASA present it this way?
3) As NASA changed each data set, they made the past appear cooler – the “adjusted, cleaned” data is cooler than the “unadjusted” data – and the “homogenized” data is cooler still. A cooler past allows NASA to claim current temperatures are dramatically higher.
NASA has “adjusted, cleaned, and homogenized” the data from these locations, along with thousands of others, to make up the data set that NASA uses. They then add data from satellites and use a gridding methodology to come up with a final temperature change result.
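To illustrate what a gridded global average involves, here is a generic sketch of one common ingredient: averaging gridded anomalies with weights proportional to the cosine of latitude, so that small high-latitude cells count less than large tropical ones. This illustrates the general technique only, not NASA's actual GISTEMP procedure, and the anomaly field below is random.

```python
# Generic sketch of an area-weighted global mean over a latitude-longitude
# grid (weights proportional to cos(latitude)). Illustrative only; NOT the
# actual GISTEMP method, and the anomaly field below is random noise.
import numpy as np

lats = np.arange(-88.0, 90.0, 4.0)      # cell-centre latitudes of a 4-degree grid
lons = np.arange(0.0, 360.0, 4.0)
anomalies = np.random.default_rng(1).normal(0.3, 0.5, (lats.size, lons.size))

weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(anomalies)
global_mean = np.average(anomalies, weights=weights)
print(f"area-weighted global-mean anomaly: {global_mean:.2f} C")
```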
Needless to say, the NASA changes have been the subject of considerable debate – within the climate scientist community, the climate “skeptic” community, and even NASA itself.[20] The “unadjusted” raw data has been adjusted meaningfully over the years as NASA recalculates.[21] The satellite measurements are very controversial according to Zeke Hausfather, climate researcher at Berkeley Earth – “If you don’t like adjustments, you really shouldn’t use the satellite record.”[22] A major problem is that the adjustments between raw and final data run strongly in one direction – they tend to cool the past – which makes the present temperatures seem warmer by comparison.[23] NASA itself is apparently unhappy with its current formulas and plans to release version four of its “adjustments” soon.[24]
Other Indicators of Global Temperatures
The debate about the temperature adjustments and estimates used by NASA can quickly get into mathematical manipulations that are well beyond the level of this article. Scientists are arguing about changes in the global temperature that are on the order of one percent of one degree centigrade. Fortunately, we can look at a variety of other climate indicators in an effort to verify whether temperatures are changing. According to the theory endorsed by NASA, humans have been increasing carbon dioxide (CO2) in the atmosphere for more than 70 years[25] – and this increased CO2 has led to demonstrably higher global temperatures which affect major aspects of global climate.
Fortunately for the planet, there is no evidence of change in large scale climate indicators that should be changing with the temperature. Here are some notable examples:
· US Land Temperatures: In 1986, James Hansen testified to Congress that rising CO2 levels would cause US temperatures to rise by three to four degrees by 2020.[26] This prediction was spectacularly wrong – US land temperatures have moved at most a fraction of that amount since 1986.[27]
· Sea Level Rise: NASA (and later Al Gore) have made it clear that a warmer planet would cause ice to melt and the seas to expand – rising by up to four feet by 2050[28]. An accelerating trend in sea levels would potentially inundate lower-elevation cities. But NOAA data make it clear that there is no change in the rate of sea level increase since measurements began.[29] If a warming globe would accelerate sea level changes, and we don’t see acceleration – it seems reasonable to suggest the globe isn’t warming. (A simple way to test a series for acceleration is sketched just after this list.)
· Hurricanes and Other Adverse Weather Events: By the early 2000s climate scientists told us to expect an increase in hurricanes due to higher temperatures in the ocean. Instead, the US experienced a major hurricane drought from 2006 to 2016.[30] In fact, global hurricane/typhoon activity has shown no upward trend in frequency or severity for the last fifty years.[31] The IPCC also reported in 2013 that there was no change in the frequency of other adverse events such as droughts, floods, and tornados.
· Glaciers: Observers often become concerned as they see glaciers melting and blame it on global warming. It is certainly true that, on average, glaciers in the northern hemisphere have been retreating lately. But glaciers have been retreating since the end of the Little Ice Age (1850), and numerous studies point out that many glaciers were actually melting faster during the early 1900s than they are today.[32] Glacier Bay in Alaska is a good example of the long-term melting trend.
· Snowfall: In 2001, the scientists at the IPCC (the world’s authority on climate change) said that rising global temperatures would result in a reduction in snowfall and even the end of the skiing industry.[33] However, according to both NOAA and Rutgers University, snowfall has been trending up across the northern hemisphere since 1970. If less snow is expected from higher temperatures – is more snow an indicator of lower temperatures?[34]
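As promised in the sea-level bullet above, here is a minimal sketch of how one might test a tide-gauge series for acceleration: fit a straight line and a quadratic, and see whether the quadratic term is meaningfully different from zero. The series below is synthetic (a steady 3 mm/yr rise plus noise), not an actual NOAA gauge record.

```python
# Sketch of an acceleration check for a sea-level series: compare a linear
# fit with a quadratic fit. Synthetic data (constant 3 mm/yr rise + noise),
# NOT an actual NOAA tide-gauge record.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 50, 1 / 12.0)                      # 50 years of monthly values
level_mm = 3.0 * t + rng.normal(0, 15.0, t.size)    # steady rise plus gauge noise

rate = np.polyfit(t, level_mm, 1)[0]                # mm per year (linear fit)
quad_coeff = np.polyfit(t, level_mm, 2)[0]          # half the acceleration, mm/yr^2
print(f"linear rate: {rate:.2f} mm/yr")
print(f"quadratic coefficient: {quad_coeff:.3f} mm/yr^2 "
      "(near zero means no detectable acceleration in this synthetic series)")
```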
These are large scale indicators that should not be subject to much measurement debate. They are not subject to “adjustments.” They all tell me that the NASA report is hopelessly biased in favor of reporting a temperature increase that is not happening.
Motivation for NASA to Report Higher Temperatures
Why would NASA come up with results so different from those of other climate observations? Consider the history of the NASA global temperature estimates. In 1986, James Hansen broadly publicized his global warming theory in testimony before the US Senate. For the next 27 years, Mr. Hansen led the NASA group in charge of preparing and presenting those estimates. Is it unreasonable to suggest that the “adjustments” and formulas he used after his Senate testimony were biased by an effort to make his predictions turn out to be correct? How much of the NASA estimate is a simple self-fulfilling prophecy?
It’s not just NASA that is subject to significant pressure which likely introduces bias into its results. Climate scientists may be in the same position as those in other fields (e.g., nutrition, pharmaceuticals, psychology) where the desire to produce a pre-selected result influences the inputs, methods, and findings of their science. Alarming results (“hottest ever!”, “disaster predicted”, “urgent action needed”) all generate headlines, speaking engagements, trips to climate conferences (IPCC), and additional funding for more research. When scientists find opposite results (“nothing is really changing”, “it’s just weather”, “random events as usual”), they get no publicity and no funding, and instead are attacked (“pro big oil”, “anti-environment” or, worst of all, a “climate change denier”).[35] There are indeed thousands of scientific papers that are at odds with NASA, but they don’t get nearly the media coverage and they are not included in NASA’s estimates.
Summary
It is time for much more open and fair reporting and debate about global temperatures and climate change. Every time an adverse weather event occurs, we have news media blaming it on climate change that isn’t happening. We now have people marching in the streets over a non-existent crisis. All around the globe, trillions of dollars are being spent to avert a perceived global temperature crisis that is not happening. These energies and funds could be spent on far better uses to protect our environment, educate our people, and actually help the planet. We could be spending money on keeping toxins out of our ecosystems; keeping our oceans clean and healthy; improving sustainable farming techniques; expanding and protecting our natural habitats. It’s time to take real action to protect and improve our planet – and stop the misplaced worry about climate change.
[1] https://climate.nasa.gov/vital-signs/global-temperature/
[2] Temperature anomalies per the NASA site: 2018 (+0.82 °C) minus 2008 (+0.51 °C) = +0.31 °C; 1939 (−0.03 °C) minus 1929 (−0.34 °C) = +0.31 °C.
[3] Dallas population 400,000; Love Field had three daily flights (per Wikipedia).
[4] Data per iweathernet.com. Author’s trend analysis (least squares regression).
[5] Data per iweathernet.com. Author’s trend analysis (least squares regression).
[6] https://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2004&endyear=2019&month=3 See also https://agupubs.onlinelibrary.wiley.com/doi/10.1002/2015GL067640 for discussion of this data series. Trend is not significant at any reasonable level of certainty. Measurements themselves are subject to ±0.3 °C at source.
[7] Temperatures from Japanese Meteorological Association.
[8] https://www.sciencedirect.com/science/article/pii/S0048969718331978
[9] International Journal of Climatology, 1 March 2018 – https://rmets.onlinelibrary.wiley.com/doi/10.1002/joc.5458
[10] Data available at: https://www1.ncdc.noaa.gov/pub/data/uscrn/products/monthly01/CRNM0102-RI_Kingston_1_W.txt
[11] https://iowaclimate.org/2018/04/09/infrastructure
[12] Moose, WY, in Grand Teton National Park is experiencing record park visitors. Are they affecting measured temperatures at the USCRN site there?
[13] https://www.sciencedaily.com/releases/2018/01/180103160129.htm
[14] https://www.nsstc.uah.edu/climate/2019/february2019/tlt_201902_bar.png Note this is closer to one third of the NASA estimated increase.
[15] http://www.drroyspencer.com/2014/10/why-2014-wont-be-the-warmest-year-on-record/
[16] https://www.tandfonline.com/doi/full/10.1080/16000870.2018.1471911
[17] https://data.giss.nasa.gov/gistemp/history/
[18] https://data.giss.nasa.gov/gistemp/history/
[19] https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show.cgi?id=501941200000&dt=1&ds=5
[20] Sample paper on the debate from Journal of Geophysical Research – “There remain important inconsistencies between surface and satellite records.” https://pielkeclimatesci.files.wordpress.com/2009/11/r-345.pdf
[21] https://realclimatescience.com/2019/03/nasa-tampering-with-reykjavik-raw-temperature-data/
[22] https://www.carbonbrief.org/explainer-how-surface-and-satellite-temperature-records-compare
[23] https://data.giss.nasa.gov/gistemp/history/
[24] https://data.giss.nasa.gov/gistemp/
[25] CO2 has risen from 315 ppm to 380 ppm per Mauna Loa Observatory, 1960 – 2018.
[26] https://reason.com/archives/2016/06/17/climate-change-prediction-fail/print
[27] https://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2004&endyear=2019&month=2
[28] https://www.nytimes.com/1988/06/24/us/global-warming-has-begun-expert-tells-senate.html?/pagewanted=all
[29] NOAA Tides & Currents – https://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?id=9414750
[30] US Hurricanes: https://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-17-0184.1
[31]Global Cyclone activity: https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2011GL047711
[32] http://appinsys.com/globalwarming/gw_4ce_glaciers.htm
https://www.the-cryosphere-discuss.net/tc-2018-22/tc-2018-22.pdf https://www.researchgate.net/publication/312185500_High_sensitivity_of_North_Iceland_Trollaskagi_debris-free_glaciers_to_climatic_change_from_the_%27Little_Ice_Age%27_to_the_present
[33] https://www.theguardian.com/environment/2001/jan/23/globalwarming.climatechange
[34] In 2019, Mother Nature is making this point emphatically with at or near record snowfall and cold temperatures across North America and Europe.
[35] Prof. Ross McKitrick http://www.rossmckitrick.com/uploads/4/8/0/8/4808045/gatekeeping_chapter.pdf and Judith Curry are well known commentators on this phenomenon.
California water supply dream
I’m dreaming of a wet California …
“With full reservoirs and a dense snowpack, this year is practically a California water supply dream,” California DWR Director Karla Nemeth said April 2, 2019, after latest Sierra snowpack measurement.
California state officials made their monthly snowpack measurement at Phillips Station in the Sierra and confirmed there will be no lack of water this year.
Snowpack at the station was at 200% of average while statewide snowpack is 162% of average.
“This is great news for this year’s water supply, but water conservation remains a way of life in California, rain or shine,” California Department of Water Resources said.
The state has experienced more than 30 atmospheric rivers since the start of the water year, six in February alone, and statewide snow water equivalent has nearly tripled since February 1, officials said.
Phillips Station now stands at 106.5 inches (270.5 cm) of snow…
The Little Ice Age – Back to the Future
Reblogged from Watts Up With That:
What’s Natural
By Jim Steele
Extreme scientists and politicians warn we will suffer catastrophic climate change if the earth’s average temperature rises 2.7°F above the Little Ice Age average. They claim we are in a climate crisis because average temperature has already warmed by 1.5°F since 1850 AD. Guided by climate fear, politicians fund whacky engineering schemes to shade the earth with mirrors or aerosols to lower temperatures. But the cooler Little Ice Age endured a much more disastrous climate.
The Little Ice Age coincides with the pre-industrial period. It spanned roughly 1300 AD to 1850 AD, though the exact timing varies. It was a time of great droughts, retreating tree lines, and agricultural failures leading to massive global famines and rampant epidemics. Meanwhile, advancing glaciers demolished European villages and farms, and extensive sea ice blocked harbors and prevented trade.
Dr. Michael Mann who preaches dire predictions wrought by global warming described the Little Ice Age as a period of widespread “famine, disease, and increased child mortality in Europe during the 17th–19th century, probably related, at least in part, to colder temperatures and altered weather conditions.” In contrast to current models suggesting global warming will cause wild weather swings, Mann concluded “the Little Ice Age may have been more significant in terms of increased variability of the climate”. Indeed, historical documents from the Little Ice Age describe wild climate swings with extremely cold winters followed by very warm summers, and cold wet years followed by cold dry years.
A series of Little Ice Age droughts lasting several decades devastated Asia between the mid 1300s and 1400s. The resulting famines caused significant societal upheaval within India, China, Sri Lanka, and Cambodia. Bad weather resulted in the Great Famine of 1315-1317, which decimated Europe, causing extreme levels of crime, disease, mass death, cannibalism, and infanticide. North American tree-ring data reveal megadroughts lasting several decades during the cool 1500s. The Victorian Great Drought from 1876 to 1878 brought great suffering across much of the tropics, with India devastated the most. More than 30 million people are thought to have died from famine worldwide at this time.
The Little Ice Age droughts and famines forced great societal upheaval, and the resulting climate change refugees were forced to seek better lands. But those movements also spread horrendous epidemics. Wild climate swings brought cold and dry weather to central Asia. That forced the Mongols to search for better grazing. As they invaded new territories they spread the Bubonic plague, which had devastated parts of Asia earlier. In the 1300s the Mongols passed the plague to Italian merchant ships, whose crews then brought it to Europe, where it quickly killed one third of Europe’s population. European explorers looking for new trade routes brought smallpox to the Americas, causing small native tribes to go extinct and decimating 25% to 50% of larger tribes. Introduced diseases rapidly reduced Mexico’s population from 30 million to 3 million.
By the 1700s a new killer began to dominate – accidental hypothermia. When indoor temperatures fall below 48°F for prolonged periods, the human body struggles to keep warm, setting off a series of reactions that causes stress and can result in heart attacks. As recently as the 1960s in Great Britain, 20,000 elderly and malnourished people who lacked central heating died from accidental hypothermia. As people with poor heating faced bouts of extreme cold in the 1700s, accidental hypothermia was rampant.
What caused the tragic climate changes of the Little Ice Age? Some scientists suggest lower solar output associated with periods of fewer sunspots. Increasing solar output then reversed the cooling and warmed the 20th century world. As solar output is now falling to the lows of the Little Ice Age, a natural experiment is now in progress testing that solar theory. However other scientists suggest it was rising CO2 that delivered the world from the Little Ice Age.
Increasing CO2 also has a beneficial fertilization effect that is greening the earth. The 20th century warming, whether natural or driven by rising CO2 concentrations, has lengthened the growing season. Famines are being eliminated. Tree-lines stopped retreating and trees are now reclaiming territory lost over the past 500 years. So why is it that now we face a climate crisis?
At the end of the 1300s’ Great Famine and Bubonic Plague epidemic, the earth sustained 350 million people. With today’s advances in technology and milder growing conditions, record high crop yields are now feeding a human population that has ballooned to over 7.6 billion.
So, the notion that cooler times represent the “good old days” and we are now in a warmer climate crisis seems truly absurd.
Jim Steele is the retired director of the Sierra Nevada Field Campus, SFSU,
and the author of Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism.
Natural climate processes overshadow recent human-induced Walker circulation trends
Reblogged from Watts Up With That:
Institute for Basic Science

Figure 1: Normal conditions (top), strengthening due to natural variability (middle), and weakening due to greenhouse warming (bottom). Black arrows represent horizontal and vertical winds, with the shading on the background map illustrating ocean temperatures. Over the past few decades, natural variability has strengthened the Pacific Walker circulation, leading to enhanced cooling in the equatorial central-to-eastern Pacific (middle). Climate models forced by increasing greenhouse gas concentrations simulate weakening of the Walker circulation (bottom). (Right) Temporal evolution of model-simulated Walker circulation trends, with the dark blue line and orange shading denoting anthropogenically-induced changes and the impact of natural processes, respectively. Credit: IBS
A new study, published this week in the journal Nature Climate Change, shows that the recent intensification of the equatorial Pacific wind system, known as Walker Circulation, is unrelated to human influences and can be explained by natural processes. This result ends a long-standing debate on the drivers of an unprecedented atmospheric trend, which contributed to a three-fold acceleration of sea-level rise in the western tropical Pacific, as well as to the global warming hiatus.
Driven by the east-west sea surface temperature difference across the equatorial Pacific, the Walker circulation is one of the key features of the global atmospheric circulation. It is characterized by ascending motion over the western Pacific and descending motion over the eastern equatorial Pacific. At the surface, trade winds blow from east to west, causing upwelling of cold water along the equator. From the early 1990s to about 2013, this circulation intensified dramatically, cooling the eastern equatorial Pacific and triggering shifts in global winds and rainfall (see Figure 1). These conditions further contributed to drying in California, exacerbating mega-drought conditions and impacting agriculture, water resources and wildfires. Given these widespread impacts on ecosystems and society, the recent Walker circulation trends have become the subject of intense research.
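For context on how such trends are quantified, one common diagnostic of Walker circulation strength in the literature (not necessarily the exact metric used in this study) is the sea-level-pressure difference between an eastern and a western equatorial Pacific box, together with its trend. A schematic sketch with invented monthly values:

```python
# Schematic sketch of a simple Walker-circulation strength index: the
# sea-level-pressure difference between an eastern and a western equatorial
# Pacific box, plus its linear trend. A common diagnostic in the literature,
# not necessarily this study's metric; all values below are made up.
import numpy as np

rng = np.random.default_rng(3)
months = 12 * 24                                     # 1990-2013, monthly
slp_east = 1012.0 + rng.normal(0, 1.0, months)       # hPa, eastern Pacific box
slp_west = 1009.0 + rng.normal(0, 1.0, months)       # hPa, western Pacific box

walker_index = slp_east - slp_west                   # larger => stronger circulation
t_years = np.arange(months) / 12.0
trend_per_decade = np.polyfit(t_years, walker_index, 1)[0] * 10.0
print(f"mean index: {walker_index.mean():.2f} hPa, "
      f"trend: {trend_per_decade:+.2f} hPa per decade")
```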
In contrast to the observed strengthening, the majority of climate computer models simulate a gradual weakening of the Walker circulation when forced by increasing greenhouse gas concentrations (see Figure 1). “The discrepancy between climate model projections and observed trends has led to speculations about the fidelity of the current generation of climate models and their representation of tropical climate processes”, said Eui-Seok Chung, researcher from the Center for Climate Physics, Institute for Basic Science, South Korea, and lead author of the study.
To determine whether the observed changes in the tropical atmospheric circulation are due to natural climate processes or caused by human-induced climate change, scientists from South Korea, the United States and Germany came together to conduct one of the most comprehensive big-data analyses of recent atmospheric trends to date. “Using satellite data, improved surface observations and a large ensemble of climate model simulations, our results demonstrate that natural variability, rather than anthropogenic effects, were responsible for the recent strengthening of the Walker circulation”, said Prof. Axel Timmermann, Director of the IBS Center for Climate Physics at Pusan National University and co-author of this study.
In their integrated analysis, the researchers found that the satellite-inferred strengthening of the Walker circulation is substantially weaker than implied by other surface observations used in previous studies. “Putting surface observations in context with latest satellite products was a key element of our study”, said co-author Dr. Lei Shi from NOAA’s National Centers for Environmental Information in the United States.
Analyzing 61 different computer model simulations forced with increasing greenhouse gas concentrations, the authors showed that, although the average response is a Walker circulation weakening, there are substantial discrepancies amongst the individual model experiments, in particular when considering shorter-term trends. “We found that some models are even consistent with the observed changes in the tropical Pacific, in stark contrast to other computer experiments that exhibit more persistent weakening of the Walker circulation during the observational period”, said co-author Dr. Viju John from EUMETSAT in Germany. The authors were then able to tease apart what caused the spread in the computer model simulations.
Co-author Prof. Kyung-Ja Ha from the IBS Center for Climate Physics and Pusan National University explains “Natural climate variability, associated for instance with the El Niño-Southern Oscillation or the Interdecadal Pacific Oscillation can account for a large part of diversity in simulated tropical climate trends”.
“The observed trends are not that unusual. In climate model simulations we can always find shorter-term periods of several decades that show similar trends to those inferred from the satellite data. However, in most cases, and when considering the century-scale response to global warming, these trends reverse their sign eventually”, said co-author Prof. Brian Soden from the Rosenstiel School of Marine and Atmospheric Science, at the University of Miami, United States.
The study concludes that the observed strengthening of the Walker circulation from about 1990 to 2013, and its impact on western Pacific sea level, eastern Pacific cooling, and drought in the Southwestern United States, was a naturally occurring phenomenon which does not stand in contrast to the notion of projected anthropogenic climate change. Given the high levels of natural decadal variability in the tropical Pacific, it would take at least two more decades to detect unequivocally the human imprint on the Pacific Walker Circulation (see Figure 1, right panel).
Solar variability weakens the Walker cell
Credit: PAR @ Wikipedia
This looks significant, pointing directly at solar influences on climate patterns. The researchers found evidence that atmosphere-ocean coupling can amplify the solar signal, having detected that wind anomalies could not be explained by radiative considerations alone.
An international team of researchers from the United Kingdom, Denmark, and Germany has found robust evidence for signatures of the 11-year sunspot cycle in the tropical Pacific, reports Phys.org.
They analyzed historical time series of pressure, surface winds and precipitation with specific focus on the Walker Circulation—a vast system of atmospheric flow in the tropical Pacific region that affects patterns of tropical rainfall.
They have revealed that during periods of increased solar irradiance, the trade winds weaken and the Walker circulation shifts eastward.
Global Wildfire Area Has Declined, Contrary To Popular Myth
By Paul Homewood
Another thorough assessment of wildfire trends wrecks alarmist claims:
ABSTRACT
Wildfire has been an important process affecting the Earth’s surface and atmosphere for over 350 million years and human societies have coexisted with fire since their emergence. Yet many consider wildfire as an accelerating problem, with widely held perceptions both in the media and scientific papers of increasing fire occurrence, severity and resulting losses. However, important exceptions aside, the quantitative evidence available does not support these perceived overall trends. Instead, global area burned appears to have overall declined over past decades, and there is increasing evidence that there is less fire in the global landscape today than centuries ago. Regarding fire severity, limited data are available. For the western USA, they indicate little change overall, and also that area burned at high severity has overall declined compared to pre-European settlement. Direct fatalities from fire and economic losses…