This is a comment by David Middleton from Jim Steele’s post on Watts Up With That:
Reblogged from Watts Up With That:
Guest opinion: Dr. Tim Ball
Two recent events triggered the idea for this article. On the surface, they appear unconnected, but that is an indirect result of the original goal and methods of global warming science. We learned from Australian Dr. Jennifer Marohasy of another manipulation of the temperature record in an article titled “Data mangling: BoM’s Changes to Darwin’s Climate History are Not Logical.” The second involved the claim of final, conclusive evidence of Anthropogenic Global Warming (AGW). The original article appeared in the journal Nature Climate Change. The fact that it appeared in this journal raises flags for me. The publishers of Nature created the journal, and it published as much as it could to promote the deceptive science used for the untested AGW hypothesis. However, they were limited by the rules and procedures required for academic research and publication. This would not be a problem if the issue of global warming were purely about science, but it never was. It was a political use of science for a political agenda from the start. The original article came from a group led by Ben Santer, a person with a long history of involvement in the AGW deception.
An article titled “Evidence that humans are responsible for global warming hits ‘gold standard’ certainty level” provides insight but includes Santer’s comment that “The narrative out there that scientists don’t know the cause of climate change is wrong,” he told Reuters. “We do.” It is a continuation of his work to promote the deception. He based his comment on the idea that we know the cause of climate change because of the work of the Intergovernmental Panel on Climate Change (IPCC). But the IPCC only looked at human causes, and it is impossible to determine those if you don’t know and understand natural climate change and its causes. If we did know and understand, forecasts would always be correct. And if we did know and understand, then Santer, all the other researchers, and the millions of dollars would no longer be necessary.
So why does Santer make such a claim? For the same reason they took every action in the AGW deception: to promote a stampede created by the urgency to adopt the political agenda. It is classic “the sky is falling” alarmism. Santer’s work follows the recent ‘emergency’ report of the IPCC, presented at COP 24 in Poland, claiming that we have 12 years left.
One of the earliest examples of this production of inaccurate science to amplify urgency concerned the residency time of CO2 in the atmosphere. In response to the IPCC’s claims for urgent action, several researchers pointed out that the levels, and the increase in levels, were insufficient to warrant urgent action. In other words, don’t rush to judgement. The IPCC’s response was to claim that even if production stopped, the problem would persist for decades because of CO2’s 100-year residency time. A graph produced by Lawrence Solomon appeared showing that the actual time was 4 to 6 years (Figure 1).
This pattern of underscoring urgency permeates the entire history of the AGW deception.
Sir Walter Scott said, “Oh, what a tangled web we weave, when first we practise to deceive.” Another great author expanded on that idea, but from a different perspective. Mark Twain said, “If you tell the truth, you don’t have to remember anything.” In a strange way, they contradict each other yet together explain how the deception spread, persisted, and achieved its damaging goal. The web becomes so tangled, and the connections between tangles so complicated, that people never see what is happening. This is particularly true if the deception involves an arcane topic unfamiliar to the majority of people.
All these observations apply to the biggest deception in history, the claim that human production of CO2 is causing global warming. The objective is unknown to most people even today, and that is a measure of the success. The real objective was to prove that overpopulation combined with industrial development was exhausting resources at an unsustainable rate. As Maurice Strong explained, the problem for the planet is the industrialized nations, and “isn’t it our responsibility to get rid of them?” The hypothesis this generated was that CO2, the byproduct of burning fossil fuel, was causing global warming and destroying the Earth. They had to protect the charge against CO2 at all cost, and that is where the tangled web begins.
At the start, the IPCC and the agencies supporting them had control over the two important variables, the temperature and the CO2. Phil Jones expressed the degree of control over temperature in his response to Warwick Hughes’ request for which stations he used and how they were adjusted in his graph. Hughes received the following reply on 21 February 2005.
“We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it.”
Control over the global temperature data continued until the first satellite data appeared in 1978. Despite the limitations, it provided more complete coverage; the claim is 97 to 98%. This compares with the approximately 15% coverage of the surface data.
Regardless of the coverage, the surface data had to approximate the satellite data as Figure 2 shows.
This only prevented changing the most recent 41 years of the record; it didn’t prevent altering the historical record. Dr. Marohasy’s article is just one more illustration of the pattern. Tony Heller produced the most complete analysis of the adjustments made. Those making the changes claim, as they have done again in response to Marohasy’s challenge, that the adjustments are necessary to correct for instrument errors and for site and situation changes such as the Urban Heat Island Effect (UHIE). The problem is that the changes are always in one direction, namely, lowering the historic levels. This alters the gradient of the temperature change by increasing the amount and rate of warming. One of the first examples of such adjustments occurred with the Auckland, New Zealand record (Figure 3). Notice the overlap in the most recent decades.
The IPCC took control of the CO2 record from the start, and it continues. They use the Mauna Loa record and data from other sites using similar instruments and techniques as the basis for their claims. Charles Keeling, one of the earliest proponents of AGW, was recognized and hired by Roger Revelle at the Scripps Institution. Yes, that is the same Revelle Al Gore glorifies in his movie An Inconvenient Truth. Keeling established a CO2 monitoring station that is the standard for the IPCC. The problem is that Mauna Loa is an oceanic-crust volcano; that is, its lava is less viscous and more gaseous than that of continental-crust volcanoes like Mt. Etna. A documentary titled Future Frontiers: Mission Galapagos reminded me of studies done at Mt. Etna years ago that showed high levels of CO2 emerging from the ground for hundreds of kilometers around the crater. The documentary is the usual ‘people are destroying the planet’ sensationalist BBC rubbish. However, at one point they dive in the waters around the tip of a massive volcanic island and are amazed to see CO2 visibly bubbling up all across the ocean floor.
Charles Keeling patented his instruments and techniques. His son Ralph continues the work at the Scripps Institution and is a member of the IPCC. His most recent appearance in the media involved an alarmist paper with a major error – an overestimate of 60%. Figure 4 shows him with the master of PR for the IPCC narrative, Naomi Oreskes.
Figure 5 shows the current Mauna Loa plot of CO2 levels. It shows a steady increase from 1958 with the supposed seasonal variation.
This increase is steady over 41 years, which is remarkable when you look at the longer record. For example, the Antarctic ice core record (Figure 6) shows remarkable variability.
The ice core record is made up of data from bubbles that take a minimum of 70 years to become enclosed. A 70-year smoothing average is then applied. The combination removes most of the variability, which eliminates any chance of understanding and predetermines the outcome.
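To see why a long smoothing window flattens a record, here is an illustrative sketch with synthetic data (not the actual ice-core processing): a 70-year moving average applied to a series with multidecadal swings and year-to-year noise.

```python
import numpy as np

# Synthetic CO2-like series: a multidecadal oscillation plus annual noise.
# All numbers here are made up for illustration only.
rng = np.random.default_rng(0)
years = np.arange(1000, 2000)
signal = 280 + 10 * np.sin(2 * np.pi * years / 60)    # multidecadal swings
noisy = signal + rng.normal(0, 5, size=years.size)    # year-to-year noise

# Apply a 70-year running mean (the window length quoted in the text).
window = 70
smoothed = np.convolve(noisy, np.ones(window) / window, mode="valid")

# The smoothed series retains far less variability than the raw series.
print(round(noisy.std(), 1), round(smoothed.std(), 1))
```

The standard deviation of the smoothed series collapses to a small fraction of the original, which is the point being made: decadal and multidecadal variability cannot survive a 70-year average.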
Figure 7 shows the degree of smoothing. It represents a comparison of 2000 years of CO2 measures using two different measuring techniques. You can see the difference not only in variability but also in total atmospheric levels: approximately 260 ppm versus 320 ppm.
However, we also have a more recent record that shows similar differences in variation and totals (Figure 8). It shows how the IPCC smoothed the record in order to control the CO2 record. The dotted line shows the Antarctic ice core record and how the Mauna Loa record was created to continue the smooth but inaccurate record. Zbigniew Jaworowski, an atmospheric chemist and ice core specialist, explained what was wrong with CO2 measures from ice cores. He set it all out in an article titled “CO2: The Greatest Scientific Scandal of Our Time.” Of course, they attacked him, yet the UN thought enough of his qualifications and abilities to appoint him head of the Chernobyl nuclear reactor disaster investigation.
Superimposed is the graph of over 90,000 actual atmospheric measures of CO2 that began in 1812. Publication of the level of oxygen in the atmosphere triggered collection of the CO2 data. Science wanted to identify the percentage of all the gases in the atmosphere. They were not interested in global warming or any other function of those gases – they just wanted to obtain accurate data, something the IPCC never did.
People knew about these records decades ago. The record was introduced into the scientific community by railway engineer Guy Callendar, in coordination with familiar names, as Ernst-Georg Beck noted:
“Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC.”
Callendar deliberately selected a unique subset of the data to claim the average level was 270 ppm, and changed the slope of the curve from an increase to a decrease (Figure 9). Jaworowski circled the data Callendar selected; I added the trend lines for all the data (red) and for Callendar’s selection (blue).
Tom Wigley, Director of the Climatic Research Unit (CRU) and one of the fathers of AGW, introduced the record to the climate community in a 1983 Climatic Change article titled, “The Pre-Industrial Carbon Dioxide Level.” He also claimed the record showed a pre-industrial CO2 level of 270 ppm. Look at the data!
The IPCC and its proponents established the pre-industrial CO2 level through cherry-picking and manipulation. They continue to control the atmospheric level through control of the Mauna Loa record, and they control the data on annual human production. Here is their explanation.
The IPCC has set up the Task Force on Inventories (TFI) to run the National Greenhouse Gas Inventory Programme (NGGIP) to produce this methodological advice. Parties to the UNFCCC have agreed to use the IPCC Guidelines in reporting to the convention.
How does the IPCC produce its inventory Guidelines? Utilising IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC. This process ensures that the widest possible range of views are incorporated into the documents.
In other words, they make the final decision about which data they would use for their reports and as input to their computer models.
This all worked for a long time; however, as with all deceptions, even the most tangled web unravels. They continue to increase the atmospheric level of CO2 and then confirm it to the world by controlling the Mauna Loa annual level. However, they lost control of the recent temperature record with the advent of satellite data. They couldn’t lower the CO2 data because that would expose the entire scam, so they are on a treadmill of perpetuating whatever is left of their deception and manipulation. All that was left was artificial lowering of the historical record, changing the name from global warming to climate change, and producing increasingly threatening narratives like the “12 years left” claim and Santer’s certainty of doom.
NOTE: I do not give the work of Ernst-Georg Beck in Figure 8 any credence for accuracy, because the chemical procedure is prone to error and the locations of the measurements (mostly in cities at ground level) have highly variable CO2 levels. Note how variable the data is. – Anthony
Reblogged from Watts Up With That:
Guest Post by Willis Eschenbach
As a result of a tweet by Steve McIntyre, I was made aware of an interesting dataset. This is a look by Vinther et al. at the last ~12,000 years of temperatures on the Greenland ice cap. The dataset is available here.
Figure 1 shows the full length of the data, along with the change in summer insolation at 75°N, the general location of the ice cores used to create the temperature dataset.
Figure 1. Temperature anomalies of the Greenland ice sheet (left scale, yellow/black line), and the summer insolation in watts per square metre at 75°N (right scale, blue/black line). The red horizontal dashed line shows the average ice sheet temperature 1960-1980.
I’ll only say a few things about each of the graphs in this post. Regarding Figure 1, the insolation swing shown above is about fifty watts per square metre. Over the period in question, the temperature dropped about two and a half degrees from the peak in about 5800 BCE. That would mean the change is on the order of 0.05°C for each watt per square metre change in insolation …
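That back-of-the-envelope sensitivity can be checked in a couple of lines; the 2.5 °C and 50 W/m² figures are the ones quoted above.

```python
# Rough sensitivity implied by the post: a ~2.5 °C temperature drop against
# a ~50 W/m² swing in summer insolation at 75°N.
delta_t = 2.5          # temperature change, °C
delta_insolation = 50  # insolation change, W/m²

sensitivity = delta_t / delta_insolation
print(sensitivity)     # °C per W/m²
```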
From about 8300 BCE to 800 BCE, the average temperature of the ice sheet, not the maximum temperature but the average temperature of the ice sheet, was greater than the 1960-1980 average temperature of the ice sheet. That’s 7,500 years of the Holocene when Greenland’s ice sheet was warmer than recent temperatures.
Next, Figure 2 shows the same temperature data as in Figure 1, but this time with the EPICA Dome C ice core CO2 data.
Figure 2. Temperature anomalies of the Greenland ice sheet (left scale, yellow/black line), and EPICA Dome C ice core CO2 data, 9000 BCE – 1515 AD (right scale, blue/black line)
Hmmm … for about 7,000 years, CO2 is going up … and Greenland temperature is going down … who knew?
Finally, here’s the recent Vinther data:
Figure 3. Recent temperature anomalies of the Greenland ice sheet.
Not a whole lot to say about that except that the Greenland ice sheet has been as warm or warmer than the 1960-1980 average a number of times during the last 2000 years.
Finally, I took a look to see if there were any solar-related or other strong cycles in the Vinther data. Neither a Fourier periodogram nor a CEEMD analysis revealed any significant cycles.
And that’s the story of the Vinther reconstruction … here, we’ve had lovely rain for a couple of days now. Our cat wanders the house looking for the door into summer. He goes out time after time hoping for a different outcome … and he is back in ten minutes, wanting to be let in again.
My best to all, rain or shine,
From Tony Heller’s RealClimateScience.com:
In November 1976, National Geographic had a fairly objective discussion of climate.
They showed how temperatures regularly take large swings, independent of human activities.
They showed that most of the recession of glaciers in the Alps occurred before 1940.
They discussed the next ice age and stated that climate change has always been occurring, and always will continue to occur.
They discussed the rapid cooling since 1940.
Now they publish mindless idiocy like this:
From Watts Up With That:
Dr. Leif Svalgaard sent this to me via email saying “Anthony, here is a short note I just submitted to arXiv. You are welcome to make of it what you want, if anything”. I choose to publish it without comment for our readers to consider.
Up to Nine Millennia of Multimessenger Solar Activity
Leif Svalgaard, Stanford University
A nine-millennia reconstruction of decadal sunspot numbers derived from 10Be and 14C terrestrial archives for 6755 BC to 1885 AD has been extended to the present using several other messengers (Observed Sunspot Number, Group Number, range of the diurnal variation of the geomagnetic field, and the InterDiurnal Variation of the geomagnetic Ring Current) and scaled to the modern SILSO Version 2 sunspot number. We find that there has been no secular uptick of activity over the last three hundred years and that recent activity has not been out of the ordinary. There is a sharp 87.6-year peak in the power spectrum, but no significant power at the Hallstatt 2300-year period. The reconciliation of the cosmogenic record with the modern sunspot record could be an important step towards providing a vetted solar activity record for use in climate research.
Wu et al. (2018) (hereafter WEA) present a multi-proxy reconstruction of solar activity over the last 9000 years, using all available long-span datasets of 10Be and 14C messengers in terrestrial archives. Cosmogenic isotopes are produced by cosmic rays in the Earth’s atmosphere, and their measured production/depositional flux reflects changes in the cosmic ray flux in the past. The cosmic ray flux is modulated by solar magnetic activity, which can be quantified in terms of the heliospheric modulation potential characterizing the energy spectrum of Galactic Cosmic Rays reaching the top of the atmosphere at a given time. The WEA reconstruction is given as decadal averages centered on the midpoint of each decade and runs from 6755.5 BC to 1885.5 AD. The reason for stopping in 1885 was that the Suess effect (Suess 1955) of extensive fossil fuel burning makes it problematic to use 14C data after the mid-19th century; in addition, radiocarbon data cannot be used after the 1950s because of nuclear explosions that led to massive production of 14C. The modulation potential series is not a stable proxy for solar activity, since the modulation potential is a relative index whose absolute value is model dependent (e.g. Herbst et al. 2017). Therefore, WEA converted the reconstructed modulation potential to a more practical and certainly more widely used index: the sunspot number, its current version designated SN (version 2, Clette et al. 2014). The conversion was done via the open solar magnetic flux following an ‘established’ procedure (e.g. Usoskin et al. 2003, 2007). As the ‘procedure’ was developed for version 1 of the sunspot number, the newer version 2 data were scaled down by a factor of 0.6 for the calibration, in spite of the so-called k-factor (the 0.6) not being constant over time (Clette & Lefèvre 2016). It seems a step backwards to cling to the obsolete version 1 of the sunspot number scale, so we undo the spurious down-scaling of version 2.
We shall not here quibble about details of the conversion procedure except to note that one would expect (even require) that the SN-reconstruction should match the actual observed SN-series for the time of overlap. WEA suggest that their reconstructed values be multiplied by 1.667 to place them on the SN V2-scale. Figure 1 shows that this is not enough. A factor of 2.0 seems to be necessary to match the two scales, likely meaning that the WEA calibration is too low by about 20%.
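The ~20% figure follows directly from the two scale factors quoted above, as a quick check confirms (both numbers are taken from the text).

```python
# If a factor of 2.0 is needed to match the scales where WEA suggested
# 1.667, the WEA calibration is low by roughly 20%.
suggested = 1.667
required = 2.0

shortfall = required / suggested - 1
print(round(shortfall * 100, 1))  # ≈ 20.0 (percent)
```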
When comparing two series it can be difficult to decide which one is too low or too high; either could simply be wrong. Luckily, there are several other messengers directly pertaining to solar activity: the independently derived sunspot Group Number, GN (Svalgaard & Schatten 2016), back to 1610; the range of the diurnal variation of the geomagnetic field, rY (Svalgaard 2016; Loomis 1873), good back to the 1810s; and the InterDiurnal Variation of the geomagnetic Ring Current, IDV (Svalgaard & Cliver 2005, 2010; Svalgaard 2014; Cliver & Herbst 2018; Owens et al. 2016; Bartels 1932), back to the 1830s. Decadal means for these are given in Table 1 together with the (linear) regression equations to convert them to the SN V2 scale. Applying the conversions we can now plot the messengers all on the same scale, Figure 2.
In scaling rY and IDV we first constructed a composite of SN V2 and GN* (on the SN V2 scale).
IDV*(V2) = 18.71 × IDV(nT) – 91.27. Column 12 gives the average of columns 7-11, with its standard deviation in column 13, based on the number of values, N, in column 14, going into the average. The table can also be found in the Excel file (see below) associated with this article.
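The quoted IDV regression can be sketched as a small conversion function. The coefficients are the ones given above; the sample IDV values below are made up for illustration and are not from Table 1.

```python
def idv_to_sn_v2(idv_nt: float) -> float:
    """Convert an IDV value (in nT) to the SN V2 scale using the
    linear regression quoted in the text: 18.71 * IDV - 91.27."""
    return 18.71 * idv_nt - 91.27

# Hypothetical IDV values, purely to show the shape of the conversion.
for idv in (8.0, 10.0, 12.0):
    print(idv, round(idv_to_sn_v2(idv), 2))
```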
We can now put our Multimessenger reconstruction in the context of solar activity over the last millennium, Figure 3. It is encouraging that our reconstruction matches the WEA reconstruction very well (R2 = 0.87) for their time of overlap, illustrating the power of the Multimessenger approach in reconciling various time series. We note that a ~100-year quasi-wave is clearly seen by eye over the last three centuries (only) and also that there has not been any significant secular change (e.g. an often claimed increase) over the same time interval, the lack of which had already been established (e.g. Clette et al. 2014).
This convergence of the recent cosmogenic and solar activity records (see also Muscheler et al. 2016) lends credence to the admissibility of making a leap of faith back to the beginning of the WEA reconstruction nine millennia ago, Figure 4, even if we have to admit that it is not clear if the very long-period variations are of solar origin. On the other hand, it seems clear that recent activity has not been extraordinary (Berggren et al. 2009).
The combined time series from 6755 BC to 2015 AD is available as an Excel file at
When you have 8770 years (878 data points) of data, the urge to look for cycles is overwhelming. Figure 5 shows the magnitude of the FFT of the full sunspot number time series (combining the WEA and Multimessenger series). Although there are better and more powerful methods (e.g. wavelets), any real periodic activity would show up in the FFT spectrum. We computed the FFT for the entire series and also for three subsets: the first half, the second half, and the middle half, in order to see if periods (‘cycles’) would be persistent and coincident in all of them. Three long-term cycles are often assumed to exist (e.g. Damon & Sonett, 1991): the ~2300-year Hallstatt (or Bray) Cycle, the 208-year de Vries (or Suess) Cycle, and the 88-year Gleissberg Cycle. Figure 5 shows that the Hallstatt Cycle (found in climate records) is not significant in the solar record. There does seem to be power at periods between 200 and 240 years, but the power is perhaps too broadly distributed to qualify as a strong periodicity, although there is a narrow peak at half the period (104 years), a variation also visible by eye in Figure 3. With lots of peaks between 250 and 1200 years, it is no surprise that some of them just coincide around 350 years. On the other hand, the 87.6-year Gleissberg peak is sharp and prevalent in the whole series and in all three sub-intervals.
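The subset-persistence check described above can be sketched as follows. This is not the authors' code: a synthetic decadal series (a ~90-year cycle plus noise) stands in for the real WEA/Multimessenger data, and a persistent cycle should produce roughly the same dominant FFT peak in the full series and in its sub-intervals.

```python
import numpy as np

# Synthetic stand-in for the 878-point decadal series: a ~90-year cycle
# (close to the Gleissberg period) plus noise. Values are illustrative.
rng = np.random.default_rng(1)
dt = 10.0                          # decadal sampling interval, years
t = np.arange(878) * dt            # ~8770 years of decadal points
series = np.sin(2 * np.pi * t / 90) + rng.normal(0, 0.5, t.size)

def peak_period(x: np.ndarray, dt: float) -> float:
    """Return the period (in years) of the strongest FFT peak."""
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt)
    k = np.argmax(power[1:]) + 1   # skip the zero-frequency bin
    return 1.0 / freqs[k]

# A real, persistent cycle shows up in the full series and in each half.
segments = [series, series[: series.size // 2], series[series.size // 2:]]
for seg in segments:
    print(round(peak_period(seg, dt), 1))
```

With decadal sampling the frequency resolution of the sub-intervals is coarse, so the recovered period lands on the nearest FFT bin rather than exactly on 90 years; a cycle that is only an artifact of one interval would not recur across the segments.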
The Wu et al. (2018) reconstruction of the sunspot number since 6755 BC combined with modern Multimessenger proxies covering the 19th century until today goes a long way to reconcile the cosmogenic solar activity record with recent assessments of long-term solar activity.
Bartels, J. 1932, Terr. Magn. Atmos. Electr., 37, 1-52.
Berggren, A.-M., Beer J., Possnert, G., et al. 2009, Geophys. Res. Lett., 36(11), L11801.
Clette, F., Svalgaard, L., Vaquero, J. M., et al. 2014, Space Sci. Rev., 186, 35-103.
Clette, F., Lefèvre, L. 2016, Solar Phys., 291(9-10), 2629-2651.
Cliver, E. W., Herbst, K. 2018, Space Sci. Rev., 214(2), Id 56.
Damon, P. E., Sonett, C. P. 1991, in The sun in time, Univ. Arizona Press, 360-388.
Herbst, K., Muscheler, R., Heber, B. 2017, J. Geophys. Res., 122(1), 23-34.
Loomis, E. 1873, Amer. J. Sci. Ser. 3, 5, 245-260.
Muscheler, R., Adolphi, F., Herbst, K., et al. 2016, Solar Phys., 291(9-10), 3025-3043.
Owens, M. J., Cliver, E. W., McCracken, K. G., et al. 2016, J. Geophys. Res., 121(7), 6048-6063.
Suess, H. E. 1955, Science, 122 (3166), 415-417.
Svalgaard, L. 2014, Ann. Geophysicae, 32(6), 633-641.
Svalgaard, L. 2016, Solar Phys., 291(9-10), 2981-3010.
Svalgaard, L., Cliver, E. W. 2005, J. Geophys. Res., 110(12), A12103.
Svalgaard, L., Cliver, E. W. 2010, J. Geophys. Res., 115(9), A09111.
Svalgaard, L., Schatten, K. H. 2016, Solar Phys., 291(9-10), 2653-2684.
Usoskin, I. G., Solanki, S. K., Kovaltsov, G. A. 2007, A&A, 471, 301.
Usoskin, I. G., Solanki, S. K., Schüssler, M., et al. 2003, Phys. Rev. Lett., 91, 211101.
Wu, C. J., Usoskin I. G., Krivova, N., et al. 2018, A&A, 615, A93.
Note: Excellent comment thread at WUWT: https://wattsupwiththat.com/2018/10/27/svalgaard-paper-reconstruction-of-9000-years-of-solar-activity/ –Hifast
Analysis of ice cores delivers continuous data for the first time on industrial soot from 1740 to today, reports HeritageDaily.
In the first half of the 19th century, a series of large volcanic eruptions in the tropics led to a temporary global cooling of Earth’s climate.
It was a natural process that caused Alpine glaciers to grow and subsequently recede again during the final phase of the so-called Little Ice Age.
This has now been proven by PSI researchers, on the basis of ice cores.
Bond Cycles and the Role of The Sun in Shaping Climate
Guest Post by Willis Eschenbach
A WUWT commenter emailed me with a curious claim. I have described various emergent phenomena that regulate the surface temperature. These operate on time scales ranging from minutes to hours (e.g. dust devils, thunderstorms) to multi-decadal (e.g. Atlantic Multidecadal Oscillation, Pacific Decadal Oscillation). He suggested that there is also a much slower thermostatic mechanism at play over thousands of years and longer. Here’s how I understand it.
He said that when it gets warmer, the atmosphere is more moist, so there is more snow on Antarctica. This translates into more ice on the ice cap, which puts increased pressure on the ice below. Now, the ice gain at the surface and the ice loss in the calving of the Antarctic glaciers are in some kind of long-term, very slow-moving steady state. Pressure at the top squeezes out the ice on all sides. So increasing the…
Good graphs and data