Climate hysterics skyrocket

Reblogged from Watts Up With That:

Increasingly absurd disaster rhetoric is consistently contradicted by climate and weather reality

Paul Driessen

Call it climate one-upmanship. It seems everyone has to outdo previous climate chaos rhetoric.

The “climate crisis” is the “existential threat of our time,” Speaker Nancy Pelosi told her House colleagues. We must “end the inaction and denial of science that threaten the planet and the future.”

Former California Governor Jerry Brown solemnly intoned that America has “an enemy, though different, but perhaps very much devastating in a similar way” as the Nazis in World War II.

Not to be outdone, two PhDs writing in Psychology Today declared that “the human race faces extinction” if we don’t stop burning fossil fuels. And yet “even people who experience extreme weather events often still refuse to report the experiences as a manifestation of climate change.” Psychologists, they lament, “have never had to face denial on this scale before.”

Then there’s Oxford University doctoral candidate Samuel Miller-McDonald. He’s convinced the only thing that could save people and planet from cataclysmic climate change is cataclysmic nuclear war that “shuts down the global economy but stops short of human extinction.”

All this headline-grabbing gloom and doom, however, is backed up by little more than computer models, obstinate assertions that the science is settled, and a steady litany of claims that temperatures, tornadoes, hurricanes, droughts et cetera are unprecedented, worse than ever before, and due to fossil fuels.

And on the basis of these hysterics, we are supposed to give up the carbon-based fuels that provide over 80% of US and global energy, gladly reduce our living standards – and put our jobs and economy at the mercy of expensive, unreliable, weather dependent, pseudo-renewable wind, solar and biofuel energy.

As in any civil or criminal trial, the burden of proof is on the accusers and prosecutors who want to sentence fossil fuels to oblivion. They need to provide more than blood-curdling charges, opening statements and summations. They need to provide convincing real-world evidence to prove their case.

They have refused to do so. They ignore the way rising atmospheric carbon dioxide is spurring plant growth and greening the planet. They blame every extreme weather event on fossil fuel emissions, but cannot explain the Medieval Warm Period, Little Ice Age or extreme weather events decades or centuries ago – or why we have had fewer extreme weather events in recent decades. They simply resort to trial in media and other forums where they can exclude exculpatory evidence, bar any case for the fossil fuel defense, and prevent any cross-examination of their witnesses, assertions and make-believe evidence.

Climate models are not evidence. At best, they offer scenarios of what might happen if the assumptions on which they are based turn out to be correct. However, the average prediction by 102 models is now a full degree F (0.55 C) above what satellites are actually measuring. Models that cannot be confirmed by actual observations are of little value and certainly should not be a basis for vital energy policy making.

The alarmist mantra seems to be: If models and reality don’t agree, reality must be wrong.

In fact, even as atmospheric carbon dioxide levels climbed to 405 parts per million (0.0405% of Earth’s atmosphere), except for short-term temperature spikes during El Niño ocean warming events, there has been very little planetary warming since 1998; nothing to suggest chaos or runaway temperatures.

Claims that tornadoes have gotten more frequent and intense are obliterated by actual evidence. NOAA records show that from 1954 to 1985 an average of 56 F3 to F5 tornadoes struck the USA each year – but from 1985 to 2017 there were only 34 per year on average. And in 2018, for the first time in modern history, not a single “violent” twister touched down in the United States.

Harvey was the first major (category 3-5) hurricane to make US landfall in a record twelve years. The previous record was nine years, set in the 1860s. (If rising CO2 levels are to blame for Harvey, Irma and other extreme weather events, shouldn’t they also be credited for this hurricane drought?)

Droughts differ little from historic trends and cycles – and the Dust Bowl, Anasazi and Mayan droughts, and other ancient dry spells were long and destructive. Moreover, modern agricultural and drip irrigation technologies enable farmers to deal with droughts far better than they ever could in the past.

Forest fires are fewer than in the recent past – and the worst of them are largely due to the failure to remove hundreds of millions of dead and diseased trees that provide ready tinder for massive conflagrations.

Arctic and Antarctic ice are largely within “normal” or “cyclical” levels for the past several centuries – and snow surface temperatures in the East Antarctic Plateau regularly reach -90 °C (-130 F) or lower. Average Antarctic temperatures would have to rise some 20-85 degrees F year-round for all its land ice to melt and cause oceans to rise at faster than their current 7-12 inches per century pace.

In fact, the world’s oceans have risen over 400 feet since the last Pleistocene glaciers melted. (That’s how much water those mile-high Ice Age glaciers took out of the oceans!) Sea level rise paused during the Little Ice Age but kicked in again the past century or so. Meanwhile, retreating glaciers reveal long-lost forests, coins, corpses and other artifacts – proving those glaciers have come and gone many times.

Pacific islands will not be covered by rising seas anytime soon – at 7-12 inches per century – not least because corals and atolls grow as seas rise. Land subsidence also plays a big role in perceived sea level rise – and US naval bases are safe from sea level rise, though maybe not from local land subsidence.

The Washington Post did report that “the Arctic Ocean is warming up, icebergs are growing scarcer, and in some places the seals are finding the water too hot.” But that was in 1922.

Moreover, explorers wrote about the cyclical absence of Arctic ice long before that. “We were astonished by the total absence of ice in Barrow Strait,” Sir Francis McClintock wrote in 1860. “I was here at this time in [mid] 1854 – still frozen up – and doubts were entertained as to the possibility of escape.”

Coral bleaching? That too has many causes – few having anything to do with manmade global warming – and the reefs generally return quickly to their former glory as corals adopt new zooxanthellae.

On and on it goes – with more scare stories daily, more attempts to blame humans and fossil fuels for nearly every interesting or as-yet-unexplained natural phenomenon, weather event or climate fluctuation. And yet countering the manmade climate apocalypse narrative is increasingly difficult – in large part because the $2-trillion-per-year climate “science” and “renewable” energy industry works vigorously to suppress such evidence and discussion … and is aided and abetted by its media and political allies.

Thus we have Chuck Todd, who brought an entire panel of alarmist climate “experts” to a recent episode of Meet the Press. He helped them expound ad nauseam on the alleged “existential threat of our time” – but made it clear that he was not going to give even one minute to experts on the other side.

“We’re not going to debate climate change, the existence of it,” Todd proclaimed. “The Earth is getting hotter. And human activity is a major cause, period. We’re not going to give time to climate deniers. The science is settled, even if political opinion is not.” The only thing left to discuss, from their perspective, was “solutions” – most of which would hugely benefit them and their cohorts, politically and financially.

Regular folks in developed and developing countries alike see this politicized, money-driven kangaroo court process for what it is. They also know that unproven, exaggerated and fabricated climate scares must be balanced against their having to give up (or never having) reliable, affordable fossil fuel energy. That is why we have “dangerous manmade climate change” denial on this scale.

That is why we must get the facts out by other means. It is why we must confront Congress, media people and the Trump Administration, and demand that they address these realities, hold debates, revisit the CO2 Endangerment Finding – and stop calling for an end to fossil fuels and modern living standards before we actually have an honest, robust assessment of supposedly “settled” climate science.

Paul Driessen is senior policy advisor for the Committee For A Constructive Tomorrow (CFACT) and author of articles and books on energy, environmental and human rights issues.

ARGO—Fit for Purpose?

Reblogged from Watts Up With That:

By Rud Istvan

This is the second of two guest posts on whether ‘big’ climate science missions are fit for purpose, inspired by seaside lunch speculations with ctm.

The first post dealt with whether satellite altimetry, specifically NASA’s newest Jason3 ‘bird’, was fit for the sea level rise (SLR) ‘acceleration’ purpose. It found, using NASA’s own Jason3 specs, that Jason3 (and so also its predecessors) likely was NOT fit – and never could have been – despite its SLR data being reported by NASA to 0.1 mm/yr. We already knew that annual SLR is low single-digit millimeters. The reasons satellite altimetry cannot provide that level of precision are very basic, and were known to NASA beforehand – Earth’s requisite reference ellipsoid is lumpy, oceans have varying waves, the atmosphere has varying humidity – so NASA never really had a chance of achieving what it aspired to: satalt missions measuring sea level rise to fractions of a millimeter per year, equivalent to tide gauges. NASA claims they can, but their specifications say they cannot. The post proved lack of fitness via overlap discrepancies between Jason2 and Jason3, plus the failure of NASA SLR estimates to close.

This second related guest post asks the same question of ARGO.

Unlike Jason3, ARGO had no good pre-existing tide-gauge-equivalent mission to compare against. Its novel oceanographic purposes (below) tried to measure several things ‘rigorously’ for the very first time. ‘Rigorously’ did NOT mean precisely. One, ocean heat content (OHC), was previously very inadequately estimated. OHC is much more than just sea surface temperatures (SSTs). SSTs (roughly, but not really, surface) were formerly measured by trade-route-dependent buckets and thermometers, or by engine cooling-water intake temperatures that depended on trade routes and ship lading. The deeper ocean was not measured at all until inherently depth-inaccurate XBT sensors were developed for the Navy.

Whether ARGO is fit for purpose involves a complex unraveling of design intent plus many related facts. The short ARGO answer is probably yes, although OHC error bars are provably understated in ARGO based scientific literature.

For those WUWT readers wishing a deeper examination of this guest post’s summary conclusions, a treasure trove of ARGO history, implementation, and results is available at the ARGO program website. Most of this post is either directly derived therefrom, or from references found therein, or from Willis Eschenbach’s previous WUWT ARGO posts (many searchable using ARGO), with the four most relevant directly linked below.

This guest post is divided into three parts:

1. What was the ARGO design intent? Unlike simple Jason3 SLR, ARGO has a complex set of overlapping oceanographic missions.

2. What were/are the ARGO design specs relative to its missions?

3. What do facts say about ARGO multiple mission fitness?

Part 1 ARGO Intent

ARGO was intended to explore a much more complicated set of oceanography questions than Jason’s simple SLR acceleration. The ideas were developed by oceanographers at Scripps circa 1998-1999 based on a decade of previous regional ocean research, and were formulated into two intent/design documents agreed by the implementing international ARGO consortium circa 2000. There were several ARGO intended objectives. The three most explicitly relevant to this summary post were:

1. Global ocean heat climatology (OHC with intended accuracy explicitly defined as follows)

2. Ocean ‘fresh water storage’ (upper ocean rainfall salinity dilution)

3. Map of non-surface currents

All providing intended “global coverage of the upper ocean on broad spatial scales and time frames of several months or longer.”

Unlike Jason3, no simple yes/no ‘fit for purpose’ for ARGO’s multiple missions is possible. It depends on which mission over what time frame.

Part 2 ARGO Design

The international design has evolved. Initially, the design was ~3000 floats providing a random roughly 3 degree lat/lon ocean spacing, explicitly deemed sufficient spatial resolution for all ARGO intended oceanographic purposes.

There is an extensive discussion of the array’s accuracy/cost tradeoffs in the original intent/design documentation. The ARGO design “is an ongoing exercise in balancing the array’s requirements against the practical limitations imposed by technology and resources”. Varying perspectives still provided (1998-99) “consistent estimates of what is needed.” Based on previous profiling float experiments, “in proximate terms an array with spacing of a few hundred kilometers is sufficient to determine surface layer heat storage (OHC) with an accuracy of about 10 W/m² over areas (‘pixels’) about 1000 km on a side.” Note the abouts.
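As a back-of-envelope check on that spacing claim, the sketch below divides an assumed global ocean area of ~3.6 × 10⁸ km² (my round number, not a figure from the design docs) among the original ~3000-float design target:

```python
# Back-of-envelope check of the ARGO design spacing: divide the ocean
# surface among 3000 floats and take the side of the equivalent square.

OCEAN_AREA_KM2 = 3.6e8   # assumed global ocean surface area, km^2
N_FLOATS = 3000          # original design target

area_per_float = OCEAN_AREA_KM2 / N_FLOATS   # km^2 of ocean per float
spacing_km = area_per_float ** 0.5           # side of an equivalent square

print(f"Area per float: {area_per_float:,.0f} km^2")
print(f"Implied spacing: {spacing_km:.0f} km")
```

This lands at roughly 350 km, which is indeed the “few hundred kilometers” (about 3 degrees of latitude) the design documents describe.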

The actual working float number is now about 3800. Each float was to last 4-5 years battery life; the actual is ~4.1 years. Each float was to survive at least 150 profiling cycles; this has been achieved (150 cycles*10 days per cycle/365 days per year equals 4.1 years). Each profile cycle was to be 10 days, drifting randomly at ~1000 meters ‘parking depth’ at neutral buoyancy for 9, then descending to 2000 meters to begin measuring temperature and salinity, followed by a ~6 hour rise to the surface with up to 200 additional measurement sets of pressure (giving depth), temperature, and salinity. This was originally followed by 6-12 hours on the surface transmitting data (now <2 hours using the Iridium satellite system) before sinking back to parking depth.

The basic ARGO float design remains:


And the basic ARGO profiling pattern remains:


‘Fit for purpose’ concerning OHC (via the 2000 meter temperature profile) presents two relevant questions. (1) Is 2000 meters deep enough? (2) Are the sensors accurate enough to estimate the 10 W/m² per 1000 km/side ‘pixel’?

With respect to depth, there are two differently sourced yet similar ‘yes’ answers for all mission intents.

For salinity, the ARGO profile suffices. Previous oceanographic studies showed (per the ARGO source docs) that salinity is remarkably unvarying below about 750 meters depth in all oceans. This fortunately provides a natural salinity ‘calibration’ for those empirically problematic sensors.

It also means seawater density is roughly constant over about 2/3 of the profile, so pressure is a sufficient proxy for depth (and pressure can also be calibrated by measured salinity above 750 meters translated to density).

For temperature, as the following figure of typical thermocline profiles (in °F, not °C) shows, the ARGO ΔT depth profile does not depend very much on latitude: 2000 meters (~6,500 feet) reaches the approximately constant deep-ocean equilibrium temperature at all latitudes, providing another natural ARGO ‘calibration’. The 2000-meter ARGO profile was a wise intent/design choice.


Part 3 Is ARGO fit for purpose?

Some further basics are needed as background to the ARGO objectives.

When an ARGO float surfaces to transmit its data, its position is ascertained via GPS to within about 100 meters. Given the vastness of the oceans, that is an overly precise position measurement for the ‘broad spatial scales’ of deep current drift and 1,000,000 km² OHC/salinity ‘pixels’.

Thanks to salinity stability below 750 meters, ARGO ‘salinity corrected’ instruments are accurate (after float-specific corrections) to ±0.01 psu, giving reasonable estimates of ‘fresh water storage’. A comparison of 350 retrieved ‘dead battery’ ARGO floats showed that 9% were still out of ‘corrected’ salinity calibration at end of life, unavoidably increasing salinity error a little.

The remaining big ‘sufficient accuracy’ question is OHC, and issues like Trenberth’s infamous “Missing Heat”, covered in the eponymous essay in the ebook Blowing Smoke. OHC is a very tricky sensor question, since the vast heat capacity of ocean water means a very large change in ocean heat storage translates into a very small change in absolute seawater temperature.
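That heat-capacity point can be made concrete. The sketch below converts the ~10 W/m² design accuracy into the column-average temperature change it implies over one year; the seawater density and specific heat are assumed typical values, not numbers from the ARGO docs:

```python
# Convert an ocean heat flux accuracy target into the equivalent
# column-average temperature change of a 2000 m ARGO profile over a year.

RHO = 1025.0              # assumed seawater density, kg/m^3
CP = 3990.0               # assumed seawater specific heat, J/(kg K)
DEPTH = 2000.0            # ARGO profile depth, m
SECONDS_PER_YEAR = 3.156e7

flux = 10.0               # W/m^2, the ARGO OHC accuracy target
energy_per_m2 = flux * SECONDS_PER_YEAR        # J accumulated per m^2
column_capacity = RHO * CP * DEPTH             # J per m^2 per kelvin
delta_t = energy_per_m2 / column_capacity      # K per year

print(f"10 W/m^2 for one year ~ {delta_t:.3f} K column-average warming")
```

About 0.04 °C per year: a “very large change in ocean heat storage” really does correspond to a very small temperature change.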

How good are the ARGO temperature sensors? On the surface, it might seem to depend, since as an international consortium, ARGO does not have one float design. There are presently five: Provor, Apex, Solo, S2A, and Navis.

However, those five designs only ever embodied two temperature sensors, FS1 and SBE. As it turns out (even better for accuracy), FS1 was retired late in 2006, when JPL’s Willis published the first ARGO OHC analysis after full (3000-float) deployment and found (over too short a time frame, IMO) that OHC was decreasing (!). Oops! Further climate-science analysis purportedly showed that the FS1 temperature profiles in a few hundred of the early ARGO floats were probably erroneous. Those floats were taken out of service, leaving just SBE sensors. All five ARGO float designs now use the current model SBE 38, from 2015.

SeaBird Scientific builds that sensor, and its specs can be found on the company’s website. The SBE 38 sensor spec is the following (sorry, but it doesn’t copy well from their website, where all docs are in a funky form of pdf probably intended to prevent partial duplication like for this post).

Measurement Range: -5 to +35 °C
Initial Accuracy¹: ±0.001 °C (1 mK)
Typical Stability: 0.001 °C (1 mK) in six months, certified
Response Time²: 500 msec
Self-heating Error: < 200 μK

¹ NIST-traceable calibration applying over the entire range.
² Time to reach 63% of final value following a step change in temperature.

That is a surprisingly good seawater temperature sensor: accurate to a NIST-calibrated ±0.001 °C, with a certified precision drift of ±0.001 °C per six months (1/8 of a float lifetime). UCSD says in its ARGO FAQs that the ARGO temperature data it provides is accurate to ±0.002 °C. This suffices to estimate the intended ~10 W/m² OHC per 1,000,000 km² ARGO ‘pixel’.
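As a cross-check on that sufficiency claim, the inverse conversion below asks what heat flux error the quoted ±0.002 °C accuracy corresponds to when spread over the 2000 m profile for one year (seawater density and specific heat are assumed typical values, not ARGO-doc numbers):

```python
# Express the ARGO temperature accuracy as an equivalent heat flux error
# for a 2000 m water column accumulated over one year.

RHO, CP, DEPTH = 1025.0, 3990.0, 2000.0   # assumed seawater properties
SECONDS_PER_YEAR = 3.156e7
ACCURACY_K = 0.002                        # UCSD-quoted ARGO accuracy, K

flux_equiv = ACCURACY_K * RHO * CP * DEPTH / SECONDS_PER_YEAR
print(f"+/-{ACCURACY_K} K ~ +/-{flux_equiv:.2f} W/m^2 over a year")
```

Roughly ±0.5 W/m², comfortably inside the ~10 W/m² per-pixel target, which is why the per-sensor answer is ‘probably yes’.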

BUT, there is still a major ‘fit for purpose’ problem despite all the ARGO strong positives. Climate papers based on ARGO habitually understate the actual resulting OHC uncertainty of about 10 W/m². (Judith Curry has called this one form of her ‘uncertainty monster’.) Willis Eschenbach has posted extensively here at WUWT (over a dozen guest posts already) on ARGO and its findings. His four posts most relevant to the ‘fit for purpose’ scientific-paper uncertainty question are from 2012-2015; the links, which WE kindly provided via email, need no explanation:

Decimals of Precision

An Ocean of Overconfidence

More Ocean-Sized Errors In Levitus Et Al.

Can We Tell If The Oceans Are Warming

And so we can conclude, concerning the ARGO ‘fit for purpose’ question: yes, it probably is, but only if ARGO-based science papers also correctly provide the associated ARGO intent uncertainty (error bars) for ‘rigorous albeit broad spatial resolution’.

Sea Level Rise (SLR) Satellite Altimetry—Fit for Purpose?

Reblogged from Watts Up With That:

by Rud Istvan

ctm and I were having lunch recently near our mutually admired South Florida coral reef system, and over conversation we started speculating about ARGO. I brought up Jason2 SLR as an analog. WUWT readers can see my 2016 WUWT guest post ‘Sea Level Rise, Acceleration, and Closure’ for details. That ctm lunch has inspired a lot more volunteer WUWT ‘sciency’ research on whether the most modern climate research instrument systems are fit for purpose. This post covers satellite altimetry measured sea level rise (SLR). The short answer is NOPE. The eventual companion post, whose results are TBD because ‘it’s complicated, folks’, will cover ARGO. Dunno any ARGO answer(s) yet.

There are strong evidentiary reasons to think satellite altimetry does NOT accurately represent SLR change over time. The two most irrefutable observational reasons are:

(1) Satellite altimetry measured trends are about 1.5x higher than those from differential GPS (vertical land motion) corrected long-record tide gauges (about 3.4 versus about 2.2 mm/yr).

(2) The dGPS tide gauge estimates roughly close, while Jason2 satellite altimetry estimates definitely do NOT. Per my previous above-referenced guest post, ‘closure’ is the simple arithmetic that SLR must approximately equal thermosteric rise (hotter seawater expands in volume) plus ice sheet losses (land-based ice, when melted, adds ocean water), while all other contributions, such as groundwater extraction, are arguably de minimis.
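The closure arithmetic can be sketched with illustrative numbers. Only the 2.2 and 3.4 mm/yr figures come from this post; the thermosteric and land-ice contributions below are assumed round numbers for demonstration, not measured values:

```python
# Closure sketch: SLR should roughly equal thermosteric rise plus
# land-ice melt. Compare that budget sum against the two observed rates.

thermosteric = 1.1     # mm/yr, assumed thermal-expansion contribution
land_ice = 1.0         # mm/yr, assumed ice-sheet + glacier contribution
budget_sum = thermosteric + land_ice

tide_gauge = 2.2       # mm/yr, dGPS-corrected tide gauges (from the post)
satalt = 3.4           # mm/yr, satellite altimetry (from the post)

print(f"Budget sum: {budget_sum:.1f} mm/yr")
print(f"Gap vs tide gauges: {abs(budget_sum - tide_gauge):.1f} mm/yr")
print(f"Gap vs satellite altimetry: {abs(budget_sum - satalt):.1f} mm/yr")
```

With these assumed contributions the budget closes to within a tenth of a millimeter against the tide gauges, while the altimetry rate leaves more than a millimeter per year unaccounted for.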

So, is the most recent ‘satalt bird’, Jason3, fit for purpose? Satellite altimetry uses radar signal returns reflected off a wavy ocean surface to estimate sea level from the timing of the signal from generation to receipt. This is no different in principle than any other range estimating radar system. And as many military and commercial aviation uses evidence, radar ranging generally IS fit for purpose.

Except, military cruise missile ranging is meters between Syrian aircraft bunkers, not millimeters. Commercial aviation aircraft avoidance is kilometers between planes, not millimeters. But sea level changes are measured in millimeters/yr. That is a different accuracy/precision ball game, and the logical essence of this post.

Rather than a bunch of footnotes and links, necessary for ‘whack-a-mole’ guest posts like my recent ‘Antarctic SLR contributions’, this guest post simply extracts irrefutable images and numeric values from the official NASA reference products and mission specs for Jason3, the newest satalt bird. The official ‘product’ documents, and the related Jason3 mission/physical instrument specs, are available online for any WUWT reader wanting to double-check. This guest post uses Jason3 product handbook version 1.5, issued 9/17/18.

Jason3 was launched into polar orbit on 1/17/2016.


It overlaps Jason2 and, interestingly, shows significantly less SLR in its overlap period. According to NASA, Jason3 instruments and processing algorithms correct for ‘known’ Jason2 deficiencies (noted in my Blowing Smoke ebook SLR essay ‘PseudoPrecision’): wave height, sigma naught, and tropospheric and ionospheric humidity. Jason3’s many new instruments now have a spatial aperture ‘pixel’ resolution of 11.2 km x 5.1 km, allowing closer calibration to land altitude reference pixels in order to better estimate temporal orbital decay.

The following schematic from the official Jason3 NASA information illustrates only some of the data processing problems Jason3 supposedly now ‘overcomes’.


The smaller Jason3 aperture improves its reference orbital accuracy. True. But it probably does not improve the processing algorithms for the reference Earth ellipsoid: thanks to geology, Earth’s gravity field is anything but a uniform reference ellipsoid. GRACE gravimetric data shows it is a lumpy mess, which means that even though water seeks its own level, ocean seawater is NOT level across a lumpy planetary gravity field.


So, what does the latest NASA information say about Jason3’s algorithmically processed radar return data? Jason3 product manual §1.4.4 says all distance units are reported in units of 1/10 millimeter. WOW! Sure sounds fit for purpose!?!

Not quite. Following is NASA Jason3 ‘product’ manual Table 2.3.1.


For those who are NASA-table challenged, the key number is the IGDR actual for Total Sea Surface Height RMS: a precision of 3.3 cm. Not mm! (It is listed as RSS in the table above; as noted on the associated EU Jason3 site [it’s a joint mission], that is just a NASA typo.) This is still an improvement over Jason2, which had an SLR RMS pixel precision of 3.4 cm.

This is defined by repeated pass aperture over the same site ‘pixel’ on the lumpy Earth ellipsoid. I know of no statistics that can reduce a minimum repeatable precision error of >3 cm to an ‘average accuracy’ of 0.1 mm without a ginormous error bar, which NASA ‘conveniently’ DOES NOT provide.
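For the statistically inclined, here is why: even if every pass error were independent, the standard error of a mean shrinks only as 1/√N, so averaging 3.3 cm noise down to a 0.1 mm standard error would take on the order of a hundred thousand independent passes per pixel:

```python
# How many independent measurements the standard error of the mean
# (sigma / sqrt(N)) needs to fall from 3.3 cm to 0.1 mm.

sigma_mm = 33.0     # single-pass repeatability (RMS), mm
target_mm = 0.1     # resolution NASA reports, mm

n_needed = (sigma_mm / target_mm) ** 2
print(f"Independent samples required: {n_needed:,.0f}")
```

Over 100,000 truly independent passes per pixel. And averaging only helps with random scatter; systematic errors such as instrument drift or a wrong reference ellipsoid do not shrink with N at all.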

An error term digression is perhaps useful for those who are not long time WUWT readers or Judith Curry “uncertainty monster” cognoscenti. There are two basic error types: precision and accuracy. The simplest layman’s explanation is from target shooting. A tight group is precision. A group on the bullseye is accuracy. An easily understood general illustration is:


NASA Jason3 information says it has BOTH precision and accuracy problems. Its ‘grouping’ precision is 3.3 cm, deceitfully reported to 0.1 mm. Its accuracy is 1.5x higher than what dGPS-corrected long-record tide gauges report. For climate purposes it is in quadrants 1/2 rather than 3/4. NOT good.

One further not-so-little satalt factoid. The Jason3 instrument drift spec (column GDR goal, last line) is identical to Jason2, ≤ 1 mm/year. So the SLR acceleration that Jason2 ‘sees’ that Jason3 does not (yet) is likely just ‘in spec’ instrument drift between the two.

Ineluctable conclusion: current satellite altimetry measurements of SLR are NOT fit for climate purpose.

Study reconstructing ocean warming finds ocean circulation changes may account for significant portion of sea level rise

Reblogged from Watts Up With That:

Study suggests that in the last 60 years up to half the observed warming and associated sea level rise in low- and mid- latitudes of the Atlantic Ocean is due to changes in ocean circulation.

Over the past century, increased greenhouse gas emissions have given rise to an excess of energy in the Earth system. More than 90% of this excess energy has been absorbed by the ocean, leading to increased ocean temperatures and associated sea level rise, while moderating surface warming.

The multi-disciplinary team of scientists has published estimates in PNAS that global ocean warming of 436 × 10²¹ joules has occurred from 1871 to the present (roughly 1000 times annual worldwide human primary energy consumption) and that comparable warming happened over the periods 1920-1945 and 1990-2015.
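That “roughly 1000 times” comparison can be checked with simple arithmetic; the ~6 × 10²⁰ J/yr world primary energy figure below is my assumed round number (about 165,000 TWh), not a value from the study:

```python
# Order-of-magnitude check of "roughly 1000 times annual worldwide
# human primary energy consumption".

OCEAN_WARMING_J = 436e21   # study's 1871-to-present ocean heat uptake
PRIMARY_ENERGY_J = 6e20    # assumed annual world primary energy (~165,000 TWh)

ratio = OCEAN_WARMING_J / PRIMARY_ENERGY_J
print(f"Ocean warming / annual primary energy ~ {ratio:.0f}x")
```

With that assumed figure the ratio comes out in the high hundreds, the same ballpark as the summary’s “roughly 1000 times”.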

The estimates support evidence that the oceans are absorbing most of the excess energy in the climate system arising from greenhouse gases emitted by human activities.

Prof Laure Zanna (Physics), who led the international team of researchers, said: ‘Our reconstruction is in line with other direct estimates and provides evidence for ocean warming before the 1950s.’

The researchers’ technique to reconstruct ocean warming is based on a mathematical approach originally developed by Prof Samar Khatiwala (Earth Sciences) to reconstruct manmade CO2 uptake by the ocean.

Prof Khatiwala said: ‘Our approach is akin to “painting” different bits of the ocean surface with dyes of different colors and monitoring how they spread into the interior over time. We can then apply that information to anything else – for example manmade carbon or heat anomalies – that is transported by ocean circulation. If we know what the sea surface temperature anomaly was in 1870 in the North Atlantic Ocean we can figure out how much it contributes to the warming in, say, the deep Indian Ocean in 2018. The idea goes back nearly 200 years to the English mathematician George Green.’
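The Green’s-function (“dye”) idea Prof Khatiwala describes can be caricatured in a few lines. Everything below is a toy with made-up numbers: a hypothetical exponential transport kernel G(τ) weights each past surface anomaly by how much of it has reached the interior after a lag of τ years:

```python
import numpy as np

# Toy illustration of the Green's-function reconstruction: the interior
# heat anomaly today is a lag-weighted sum of past surface anomalies.
# All numbers here are made up; G is a hypothetical transport kernel.

rng = np.random.default_rng(0)
years = 150
sst_anomaly = np.cumsum(rng.normal(0.0, 0.02, years))  # fake SST record, K

tau = np.arange(years)          # lag in years
G = np.exp(-tau / 30.0)         # hypothetical kernel: fraction of a surface
G /= G.sum()                    # anomaly arriving after lag tau (normalized)

# Interior anomaly now = sum over lags of G(tau) * SST(now - tau);
# sst_anomaly[-1] is "now", so reverse the record to index by lag.
interior_now = float(np.sum(G * sst_anomaly[::-1]))
print(f"Implied interior anomaly today: {interior_now:.3f} K")
```

The real method estimates the kernel from ocean transport observations rather than assuming an exponential, but the bookkeeping is the same: the surface history, convolved with a time-independent transport operator, yields the interior.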

The new estimate suggests that in the last 60 years up to half the observed warming and associated sea level rise in low- and mid- latitudes of the Atlantic Ocean is due to changes in ocean circulation. During this period, more heat has accumulated at lower latitudes than would have if circulation were not changing.

While a change in ocean circulation is identified, the researchers cannot attribute it solely to human-induced changes.

Much work remains to be done to validate the method and provide a better uncertainty estimate, particularly in the earlier part of the reconstruction. However the consistency of the new estimate with direct temperature measurements gives the team confidence in their approach.

Prof Zanna said: ‘Strictly speaking, the technique is only applicable to tracers like manmade carbon that are passively transported by ocean circulation. However, heat does not behave in this manner as it affects circulation by changing the density of seawater. We were pleasantly surprised how well the approach works. It opens up an exciting new way to study ocean warming in addition to using direct measurements.’

This work offers an answer to an important gap in knowledge of ocean warming, but is only a first step. It is important to understand the cause of the ocean circulation changes to help predict future patterns of warming and sea level rise.


Via Eurekalert

Full paper title: Zanna, L., Khatiwala, S., Gregory, J., Ison, J. and Heimbach, P. (2019) Global reconstruction of historical ocean heat storage and transport. Proceedings of the National Academy of Sciences of the United States of America (PNAS); doi: 10.1073/pnas.1808838115

(open access)


Most of the excess energy stored in the climate system due to anthropogenic greenhouse gas emissions has been taken up by the oceans, leading to thermal expansion and sea level rise. The oceans thus have an important role in the Earth’s energy imbalance. Observational constraints on future anthropogenic warming critically depend on accurate estimates of past ocean heat content (OHC) change. We present a novel reconstruction of OHC since 1871, with global coverage of the full ocean depth. Our estimates combine time series of observed sea surface temperatures, with much longer historical coverage than those in the ocean interior, together with a representation (a Green’s function) of time-independent ocean transport processes. For 1955-2017, our estimates are comparable to direct estimates made by infilling the available 3D time-dependent ocean temperature observations. We find that the global ocean absorbed heat during this period at a rate of 0.30 ± 0.06 W/m² in the upper 2000 m and 0.028 ± 0.026 W/m² below 2000 m, with large decadal fluctuations. The total OHC change since 1871 is estimated at 436 ± 91 × 10²¹ J, with an increase during 1921-1946 (145 ± 62 × 10²¹ J) that is as large as during 1990-2015. By comparing with direct estimates, we also infer that, during 1955-2017, up to half of the Atlantic Ocean warming and thermosteric sea level rise at low-to-mid latitudes emerged due to heat convergence from changes in ocean transport.



Sea level oscillations in Japan and China since the start of the 20th century and consequences for coastal management – Part 1: Japan

Reblogged from Watts Up With That:

Albert Parker


  • Japan has strong quasi-20 and quasi-60 years low frequencies sea level fluctuations.
  • These periodicities translate into specific length requirements for tide gauge records.
  • 1894/1906 to present, there is no sea level acceleration in the 5 long-term stations.
  • Those not affected by crustal movement (4 of 5) do not even show a rising trend.
  • Proper consideration of the natural oscillations should inform coastal planning.


In Japan tide gauges are abundant, recording sea levels since the end of the 19th century. Here I analyze the long-term tide gauges of Japan: the tide gauges of Oshoro, Wajima, Hosojima and Tonoura, which are affected to a lesser extent by crustal movement, and of Aburatsubo, which is more affected by crustal movement. Hosojima has an acceleration 1894 to 2018 of +0.0016 mm/yr². Wajima has an acceleration 1894 to 2018 of +0.0046 mm/yr². Oshoro has an acceleration 1906 to 2018 of −0.0058 mm/yr². Tonoura has an acceleration 1894 to 1984 of −0.0446 mm/yr². Aburatsubo has an acceleration 1894 to 2018 of −0.0066 mm/yr². There is no sign of any sea level acceleration around Japan since the start of the 20th century. The different tide gauges show low-frequency (>10 years) oscillations of quasi-20 and quasi-60 year periodicity. The latter periodicity is the strongest in four cases out of five. As the sea levels have been oscillating, but not accelerating, in the long-term-trend tide gauges of Japan since the start of the 20th century, the same as all the other long-term-trend tide gauges of the world, it is increasingly unacceptable to base coastal management on alarmist predictions that are not supported by measurements.

And the Conclusion.

In three of the four long-term tide gauges of Japan (Oshoro, Wajima and Hosojima) there is no sea level rise and there is no sea level acceleration. Hosojima has an acceleration 1894 to 2018 of +0.0016 mm/yr². Wajima has an acceleration 1894 to 2018 of +0.0046 mm/yr². Oshoro has an acceleration 1906 to 2018 of −0.0058 mm/yr².

In the fourth long-term tide gauge of Japan, an apparent sea level rise and acceleration are only the result of a composite record obtained by coupling the long-term record of Tonoura, which shows no acceleration and no sea level rise, with the short-term record of Hamada II, a subsiding site whose relative sea level is therefore rising at a much faster rate. Tonoura has an acceleration 1894 to 1984 of −0.0446 mm/yr².

The other long-term tide gauge of Japan, Aburatsubo, which is significantly affected by crustal movement, has an acceleration 1894 to 2018 of −0.0066 mm/yr².

There is therefore no sign of any sea level acceleration around Japan since the start of the 20th century.

All the long-term tide gauges considered show a clear multidecadal oscillation of quasi-60-year periodicity. This translates into the need for tide gauge records long enough to compute rates of rise (>60 years) and accelerations (>100 years).
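The record-length requirement can be illustrated with a synthetic series: a linear fit over a window shorter than the oscillation period aliases part of the cycle into the apparent trend. The numbers below (1 mm/yr trend, 20 mm quasi-60-year oscillation) are invented for illustration:

```python
import numpy as np

# Synthetic sea level: a true trend of 1.0 mm/yr plus a 60-year oscillation
# of 20 mm amplitude (values illustrative, not from any tide gauge).
t = np.arange(0, 120, 1/12.0)                 # 120 years, monthly
sl = 1.0*t + 20*np.cos(2*np.pi*t/60)

# A linear fit over a 30-year window aliases the falling limb of the
# oscillation into the trend, here even flipping its sign...
short = t < 30
trend_short = np.polyfit(t[short], sl[short], 1)[0]

# ...while a fit over two full cycles recovers the true 1.0 mm/yr.
trend_full = np.polyfit(t, sl, 1)[0]
```

With these values the 30-year fit returns a negative trend despite a genuinely rising sea level, which is exactly why rates need >60 years of data and accelerations need even more.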

Ocean and coastal management should be based on reliable data for sea level rise and acceleration, not on alarmist speculation.

Read the full paper here.

Sea levels, atmospheric pressure and land temperature during glacial maxima

Climate Etc.

by Alan Cannell

The new tropical lands: a carbon sink during formation and huge source of carbon dioxide and methane when lost to the sea.


Top 12 Debunked Climate Scares of 2018

Reblogged from Watts Up With That:

Reposted from The GWPF


January 2018:  Worst-case global warming scenarios not credible: Study

PARIS (AFP) – Earth’s surface will almost certainly not warm up four or five degrees Celsius by 2100, according to a study released Wednesday (Jan 17) which, if correct, voids worst-case UN climate change predictions.

A revised calculation of how greenhouse gases drive up the planet’s temperature reduces the range of possible end-of-century outcomes by more than half, researchers said in the report, published in the journal Nature.

February:  ‘Sinking’ Pacific nation Tuvalu is actually getting bigger, new research reveals

The Pacific nation of Tuvalu — long seen as a prime candidate to disappear as climate change forces up sea levels — is actually growing in size, new research shows.

A University of Auckland study examined changes in the geography of Tuvalu’s nine atolls and 101 reef islands between 1971 and 2014, using aerial photographs and satellite imagery.

It found eight of the atolls and almost three-quarters of the islands grew during the study period, lifting Tuvalu’s total land area by 2.9 percent, even though sea levels in the country rose at twice the global average.

March: BBC forced to retract false claim about hurricanes

You may recall the above report by the BBC, which described how bad last year’s Atlantic hurricane season was, before commenting at the end: “A warmer world is bringing us a greater number of hurricanes and a greater risk of a hurricane becoming the most powerful category 5.” I fired off a complaint, which at first they did their best to dodge. After my refusal to accept their reply, they have now been forced to back down.

April: Corals can withstand another 100-250 years of climate change, new study

Heat-tolerant genes may spread through coral populations fast enough to give the marine creatures a tool to survive another 100-250 years of warming in our oceans.

May: Climate change causes beaches to grow by 3,660 square kilometers

Since 1984 humans have gushed forth 64% of our entire emissions from fossil fuels. (Fully 282,000 megatons of deplorable carbon “pollution”.) During this time, satellite images show that 24% of our beaches shrank, while 28% grew. Thus we can say that thanks to the carbon apocalypse there are 3,660 sq kms more global beaches now than there were thirty years ago.

June: Antarctica not losing ice, NASA researcher finds

NASA glaciologist Jay Zwally says his new study will show, once again, the eastern Antarctic ice sheet is gaining enough ice to offset losses in the west.

July: National Geographic admits they were wrong about notorious starving polar bear-climate claims

The narrative behind the viral photo of a polar bear starving, reportedly thanks to climate change, has been called into question by the National Geographic photographer who took it in the first place.

August: New study shows declining risk and increasing resilience to extreme weather in France

This risk factor for French residents of cities stricken by a disaster has been falling with every passing decade.

September: Coral bleaching is a natural event that has gone on for centuries, new study

Coral bleaching has been a regular feature of the Great Barrier Reef for the past 400 years, with evidence of repeated mass events dating back to well before European settlement and the start of the industrial revolution.

October: Climate predictions could be wrong in UK and Europe

Current climate change predictions in the UK and parts of Europe may be inaccurate, a study conducted by researchers from the University of Lincoln, UK, and the University of Liège, Belgium, suggests.

November: Number and intensity of US hurricanes have remained constant since 1900

There’s been “no trend” in the number and intensity of hurricanes hitting the continental U.S. and the normalized damages caused by such storms over the past 117 years, according to a new study.

December: Alarmist sea level rise scenarios unlikely, says climate scientist Judith Curry

A catastrophic rise in sea levels is unlikely this century, with recent experience falling within the range of natural variability over the past several thousand years, according to a report on peer-reviewed studies by US climate scientist Judith Curry.

HT/GWPF and Marcus

University of Exeter research sheds new light on what drove last, long-term global climate shift

Reblogged from Watts Up With That:

Public Release: 19-Dec-2018

The quest to discover what drove the last, long-term global climate shift on Earth, which took place around a million years ago, has taken a new, revealing twist.

A team of researchers led by Dr Sev Kender from the University of Exeter have found a fascinating new insight into the causes of the Mid-Pleistocene Transition (MPT) – the phenomenon whereby the planet experienced longer, intensified cycles of extreme cold conditions.

While the causes of the MPT are not fully known, one of the most prominent theories suggests it may have been driven by reductions in glacial CO2 emissions.

Now, Dr Kender and his team have discovered that the closure of the Bering Strait during this period due to glaciation could have led the North Pacific to become stratified – or divided into distinct layers – causing CO2 to be removed from the atmosphere. This would, they suggest, have caused global cooling.

The team believe the latest discovery could provide a pivotal new understanding of how the MPT occurred, but also give a fresh insight into the driving factors behind global climate changes.

The research is published in Nature Communications on December 19th 2018.

Dr Kender, a co-author on the study from the Camborne School of Mines, based at the University of Exeter’s Penryn Campus in Cornwall said: “The subarctic North Pacific is composed of some of the oldest water on Earth, which has been separated from the atmosphere for such a long time that a high concentration of dissolved CO2 has built up at depth. When this water upwells to the surface, some of the CO2 is released. This is thought to be an important process in geological time, causing some of the global warming that followed past glaciations.

“We took deep sediment cores from the bottom of the Bering Sea that gave us an archive of the history of the region. By studying the chemistry of sediment and fossil shells from marine protists called foraminifera, we reconstructed plankton productivity, and surface and bottom water masses. We were also able to better date the sediments so that we could compare changes in the Bering Sea to other global changes at that time.

“We discovered that the Bering Sea region became more stratified during the MPT with an expanded intermediate-depth watermass, such that one of the important contributors to global warming – the upwelling of the subarctic North Pacific – was effectively curtailed.”

The Earth’s climate has always been subject to significant changes, and over the past 600,000 years and more it has commonly oscillated between warm periods, similar to today, and colder, ‘glacial’ periods when large swathes of continents are blanketed under several kilometres of ice.

These regular, natural changes in the Earth’s climate are governed by changes in how the Earth orbits around the sun, and variations in the tilt of its axis caused by gravitational interactions with other planets.

These changes, known as orbital cycles, can affect how solar energy is dispersed across the planet. Some orbital cycles can, therefore, lead to colder summers in the Northern Hemisphere which can trigger the start of glaciations, while later cycles can bring warmer summers, causing the ice to melt.

These cycles can be influenced by a host of factors that can amplify their effect. One of these is the level of CO2 in the atmosphere.

As the MPT occurred during a period when there were no apparent changes in the nature of the orbital cycles, scientists have long been attempting to discover what drove the transition.

For this research, Dr Kender and his team drilled for deep-sea sediment in the Bering Sea, in conjunction with the International Ocean Discovery Program, and measured the chemistry of the fossil shells and sediments.

The team were able to create a detailed reconstruction of oceanic water masses through time, and found that the closure of the Bering Strait caused the subarctic North Pacific to become stratified during this period of glaciation.

This stratification, they argue, would have removed CO2 from the atmosphere and caused global cooling.

Dr Kender added: “Today much of the cold water produced by sea ice action flows northward into the Arctic Ocean through the Bering Strait. As glaciers grew and sea levels fell around 1 million years ago, the Bering Strait would have closed, retaining colder water within the Bering Sea. This expanded watermass appears to have stifled the upwelling of deep CO2-rich water and allowed the ocean to sequester more CO2 out of the atmosphere. The associated cooling effect would have changed the sensitivity of Earth to orbital cycles, causing colder and longer glaciations that characterise climate ever since.

“Our findings highlight the importance of understanding present and future changes to the high latitude oceans, as these regions are so important for long term sequestration or release of atmospheric CO2.”

The end of the Little Ice Age

Reblogged from Euan Mearns’ Energy Matters:

The Little Ice Age (LIA) was a recent and significant climate perturbation that may still be affecting the Earth’s climate, but nobody knows what caused it. In this post I look into the question of why it ended when it did, concentrating on the European Alps, without greatly advancing the state of knowledge. I find that the LIA didn’t end because of increasing temperatures, decreasing precipitation or fewer volcanic eruptions. One possible contributor is a trend reversal in the Atlantic Multidecadal Oscillation; another is an increase in solar radiation, but in neither case is the evidence compelling. There is evidence to suggest that the ongoing phase of glacier retreat and sea level rise is largely a result of a “natural recovery” from the LIA, but no causative mechanism for this has been identified either.

The Little Ice Age (LIA) was a period of lower global temperatures defined by temperature reconstructions based mostly on tree ring proxies. Figure 1 shows the results of fifteen such reconstructions for the Northern Hemisphere with three instrumental records added after 1900 (data from NOAA/NCDC). The period of lower temperatures between about 1450 and 1900 roughly defines the LIA, but the high level of scatter (cunningly muted by plotting the more erratic reconstructions in lighter shades) makes it impossible to pick exact start and stop dates:

Figure 1: Northern Hemisphere temperature reconstructions over the last 2,000 years

Because of the problems with temperature reconstructions this post concentrates on the European Alps, where long-term instrumental records – some going back to the early 1700s – provide information on temperature and precipitation changes around the time the LIA came to an end. Another reason for concentrating on the Alps is that almost half of the world’s glaciers that have long-term monitoring data are located there.

Figure 2 is a plot of glacier lengths in the Alps since 1700, taken from Oerlemans et al. (2007) and based on historical data from 96 glaciers, a few of which are in the Pyrenees. According to this plot Alpine glaciers began to retreat at some time between 1800 and 1820, with the rate of retreat gradually increasing until about 1860 and becoming more or less constant after that. I have defined the beginning of the retreat as the end of the LIA in the Alps because there is no objective way of selecting a later date:

Figure 2: Alpine glacier length variations since 1700

To evaluate how this plot matches up with European temperatures I downloaded 21 GHCN v2 long-term European temperature records from KNMI Climate Explorer. I was concerned that these records would show too much scatter to be of use, but after discarding three bad ones (St. Petersburg, Wroclaw and Paris) the remaining 18 (Archangel, Vilnius, Trondheim, Stockholm, Warsaw, Bergen, Budapest, Vienna, Copenhagen, Klagenfurt, Kremsmuenster, Berlin Tempel, Berlin Dahlem, Munich, Hohenpeissen, Basel, Greenwich, de Bilt and St. Bernard) match up reasonably well. Figure 3 plots all 18 records. The orange line is a smoothed mean:

Figure 3: 18 long-term GHCN v2 European temperature records, annual means (the single record extending back to 1706 is de Bilt, Netherlands)

According to the orange line Europe didn’t begin to warm until around 1890, long after the Alpine glaciers began to retreat. And serious warming didn’t begin until 1980.
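The combination step behind the orange line can be sketched as follows: reduce each record to anomalies about its own mean (so differing station baselines cancel), average the 18 anomaly series, then smooth. The records below are synthetic stand-ins, not the GHCN v2 data, and the 11-year running mean is just one possible smoother:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1850, 2001)

# Synthetic stand-ins for station records: a shared warming signal plus a
# station-specific baseline offset and year-to-year noise (illustrative only).
signal = 0.005 * (years - 1850)
stations = [signal + rng.normal(0, 0.5) + rng.normal(0, 0.3, years.size)
            for _ in range(18)]

# Convert each record to anomalies relative to its own mean, so the different
# station baselines do not bias the average, then take the multi-station mean.
anoms = [s - s.mean() for s in stations]
mean_series = np.mean(anoms, axis=0)

# Smooth with a simple 11-year running mean (one choice of smoother).
w = 11
kernel = np.ones(w) / w
smoothed = np.convolve(mean_series, kernel, mode="valid")
```

Averaging anomalies rather than raw temperatures is what makes it possible to combine stations at very different altitudes and latitudes into one curve.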

Other data sets give similar results. According to the Central England temperature record the UK didn’t begin to warm until around 1900:

Figure 4: Central England annual temperatures

And according to the Hadley Centre’s CRUTEM4 gridded land temperature data set warming in the Alps didn’t begin until around 1890:

Figure 5: CRUTEM4 annual mean temperature anomalies in and around the Alps (44N to 48N, 5E to 15E)

Evidently the end of the LIA in the Alps wasn’t triggered by rising temperatures. What other possibilities are there? Well, the other key ingredient in making ice is water, so I downloaded ten European precipitation records with data extending back before 1800 (Milan, Paris, Padua, Marseilles, Montdidier, Hoofdorp, Kew, Podehole, Lund and Uppsala), plotted them up and obtained the results shown in Figure 6:

Figure 6: 10 long-term GHCN v2 European precipitation records, annual means (the single record extending back before 1700 is Kew, UK)

And found that average precipitation in Europe hasn’t changed significantly in almost 300 years. This is in itself an interesting result, but a discussion of the implications is beyond the scope of this post. Suffice it to say that decreased precipitation did not cause the LIA to end either.

If neither precipitation nor temperature ended the LIA in the Alps, what did? I looked next at volcanic eruptions, which are often claimed to have been a factor. Did Alpine glaciers begin to retreat because a period of frequent volcanic eruptions, which would have cooled the Earth down, was followed by a period of infrequent volcanic eruptions, which would have allowed the Earth to warm back up again? No. Quite the opposite, in fact:

Figure 6: Alpine glacier length change (blue) vs. volcanic eruptions of VEI 5 or greater (red, data from Bradley 1992). Note that the VEI index is logarithmic, so a magnitude 6 will be ten times more explosive than a magnitude 5

The fact that Alpine glacier retreat began somewhere around the time of the 1815 Tambora eruption – the largest in recorded history – is intriguing, but it’s hard to think of a mechanism whereby Tambora could have initiated a glacial retreat in the Alps. It seems that in this case the coincidence really is coincidental.

What next? About ten years ago I put together the following graph. It showed a correlation between Alpine glacier retreat and the Atlantic Multidecadal Oscillation Index, which is calculated from North Atlantic sea surface temperatures:

Figure 7: Percent of Alpine glaciers retreating vs. Atlantic Multidecadal Oscillation Index

According to the AMO reconstruction presented in a post by Bob Tisdale (Figure 8) the beginning of Alpine glacier retreat coincides broadly with a change in the trend of the AMO Index from down to up around 1820. Since higher AMO values coincide with glacier retreats (Figure 7) it’s possible that this event may have contributed to or even triggered the Alpine retreat, but the data are too erratic to be confident that it did:

Figure 8: Atlantic Multidecadal Oscillation Index (brown) vs. Glacier lengths (blue)

Next I looked at solar. Here I adopted a global perspective because solar variations would presumably impact glaciers all over the world, not just in the Alps. As shown in Figure 9, however, when the data from all the world’s glaciers are summed global glacier retreat mirrors Alpine glacier retreat, beginning in the early 1800s and picking up steam as the century progressed (graphic from Oerlemans et al. 2005):

Figure 9: Global and non-Alpine glacier length variations since 1700

Figure 10 now shows two versions of sunspot counts, which are a measure of solar activity. The first is the “original” Hoyt & Schatten (1998) version, which for a number of years was used as the basis for estimating total solar irradiance. According to this version the end of the LIA coincided with the recovery from the Dalton solar minimum around 1820, and since then the sunspot count has been increasing, albeit erratically. This suggests that a strengthening sun may well have contributed to ending the LIA. The second version is a revised version published by the Sunspot Index and Long-term Solar Observations group in 2015. It also shows the LIA ending during the recovery from the Dalton Minimum but no significant increase in the sunspot count since then. If this version is correct it’s difficult to see how the sun could have been responsible for ending the LIA:

Figure 10: Sunspot counts, original (1998) and revised (2015) versions, annual means

Counting sunspots, however, is more a black art than a science. (Try the quiz at the end of the post.)

Finally I looked at CO2. And at last I found something resembling a match. The Figure 11 plot from Quora broadly mirrors the main features of the glacial retreat plots shown in Figure 9, suggesting that increasing atmospheric CO2 levels were the culprit:

Figure 11: Atmospheric CO2 concentrations since 1700

But not to worry. Not even Skeptical Science claims that CO2 caused the LIA:

The Little Ice Age remains for the present the subject of speculation. The most likely influence during this period is variable output from the sun combined with pronounced volcanic activity.

Nor does RealClimate:

The cause of this relatively short lived cooling (it was not a true “ice age”) is likely due to an increase in volcanic eruptions and with some role for a slightly reduced solar activity.

And what does the IPCC have to say about it?

Both model simulations and results from detection and attribution studies suggest that a small drop in GHG concentrations may have contributed to the cool conditions during the 16th and 17th centuries. Note, however, that centennial variations of GHG during the late Holocene are very small relative to their increases since pre-industrial times. The role of solar forcing is less clear ….

In the 1,535-page AR5 Working Group 1 report this is all the space the LIA got.

Brief discussion:

AGW skeptics often claim that present-day sea level rise and glacier ice loss are a result of a natural recovery from the LIA. AGW believers counter that a “recovery from the LIA” is not an acceptable explanation; a causative mechanism must be identified. But they too are unable to come up with a mechanism which explains why sea levels have been rising and glaciers retreating for well over a hundred years, long before man-made CO2 emissions became significant. Global warming certainly doesn’t explain it. Figure 12, which matches up Church and White’s global sea level reconstruction – the one used by the IPCC – against the HadCRUT4 global temperature record, shows sea levels rising since 1870, and at a constant rate between 1925 and 2000, regardless of what temperatures did. Sea levels don’t respond immediately to changes in temperature, but the fact that sea level rise was unaffected by global temperature fluctuations again suggests that a factor other than temperature was the dominant cause:

Figure 12: Church & White sea level reconstruction vs. HadCRUT4 global surface temperatures since 1870, smoothed annual means

And what might this factor have been? A natural recovery from the LIA, of course.

There’s still a lot we don’t understand about how the Earth’s climate works.

Sunspot Quiz:

How many sunspots are there in this photo? Make your estimate and stay tuned for the “correct” count.

Sea Levels–Inside The Acceleration Factory

Reblogged from Watts Up With That:

Guest Post by Willis Eschenbach

Nerem and Fasullo have a new paper called OBSERVATIONS OF THE RATE AND ACCELERATION OF GLOBAL MEAN SEA LEVEL CHANGE, available here. In it, we find the following statement:

Both tide gauge sea level reconstructions and satellite altimetry show that the current rate of global mean sea level change is about 3 mm yr⁻¹, and both show that this rate is accelerating.

So the claim is that tide gauges show acceleration. Let’s start with a look at the Church and White (hereinafter C&W) estimate of sea level from tide gauges around the world, which is the one used in the Nerem and Fasullo paper. The C&W paper is here.

Figure 1. Church and White sea level rise estimate.

Not real scary …

However, there is an oddity. Let’s take a closer look at the C&W sea level estimate shown in Figure 1.

Figure 2. As in Figure 1, but with a different scale.

Now, when I looked at that, the curious part to me was the change in the recent trend. For the last quarter century, we’ve had satellite sea level data, which began in 1993. In the past, the trend of the satellite data (1993 – 2013, 2.8 ± 0.16 mm/year, or about an eighth of an inch per year) has been almost double the overall trend of the tide gauges (1.6 ± 0.14 mm/year).

But in this most recent C&W estimate, the recent tide gauge trend is much larger. How much larger? Well … a lot. In fact, the recent C&W estimate is greater than the satellite estimate for the overlap period …

Figure 3. As in Figure 2, showing trends for the 21-year periods before and during the satellite era.

Why the increase in trend? Well, since 1993 they’ve mixed satellite data in with the tide gauge data.

To combine the tide gauge and satellite datasets, … Church and White (2011) and Ray and Douglas (2011) use empirical orthogonal functions of the satellite data with principal components derived from the tide gauge records. Church and White analyze changes in sea level over time, enabling them to use many tide gauges, some with short records, without needing to relate the absolute level of different tide gauges. SOURCE

But is this approach justified? I mean, did the tide gauge data itself go up during that time, so that it would be reasonable to use satellite data to refine the results?

Now, that is a tough question to answer, because the tide gauge data is sparse spatially and temporally, and it is also affected by vertical land motion. But you know me … I’m a data guy. So I went and got the full set of 1,512 tide gauge records from the Permanent Service For Mean Sea Level. In passing let me say that I don’t think they could make it harder to collect the data. It is in 1,512 separate files. Not only that, but the so-called catalog looks like this:

Figure 4. PSMSL Catalog. It is great fun to convert this to a simple computer file … but I digress.
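Collating the 1,512 files is mostly a parsing exercise. The sketch below assumes the semicolon-separated monthly RLR layout (decimal year; level in mm; flags), with −99999 marking a missing month; the sample lines themselves are invented, not real station data:

```python
import csv
import io

# A few lines in the assumed PSMSL monthly RLR layout: "decimal year;
# level in mm; missing-days flag; quality flag". The values are made up.
raw = """\
1993.0417; 7121; 0;000
1993.1250; 7053; 0;000
1993.2083;-99999; 0;000
1993.2917; 7010; 0;000
"""

def parse_rlr(text):
    """Return (decimal_year, level_mm) pairs, skipping missing months."""
    out = []
    for row in csv.reader(io.StringIO(text), delimiter=";"):
        year, level = float(row[0]), int(row[1])
        if level != -99999:       # assumed missing-data sentinel
            out.append((year, level))
    return out

records = parse_rlr(raw)
```

Looping this parser over the per-station files and tagging each row with its station ID would produce the single CSV file offered at the end of the post.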

To highlight some of the problems with converting tide-gauge data to global sea level data, here are ten typical records in the dataset:

Figure 5. Typical tide gauge records.

I’m sure you can see the difficulties. Some places the land is steadily rising from post-glacial rebound, and it’s rising so fast that the sea levels are actually sinking relative to the land. In other places, the land is sinking due to subsidence and groundwater extraction. Many records are short and have gaps. Generally, it’s a mess.

So … here was my thought about how to get around these issues: You’ll note in Figure 3 above that the increase in trend between the 21 years before the satellite era and the 21-year overlap during the satellite era was 2.1 ± 0.5 mm per year. And while the trends in the tide gauges are all over the place … I can look at the difference in the trends for each individual dataset over the same period. This gets rid of the problem of vertical land movement, which is constant over such a geologically short time period. So here was my procedure.

First, from the 1,512 tide gauge records in the PSMSL dataset, I selected all the records with at least 90% data coverage over the 21-year period before the satellite era and at least 90% coverage over the succeeding 21-year period during the satellite era. This left me with 258 tide gauge datasets with coverage over the full 42-year period.

Next, I calculated the trend for each of these datasets for the period before and during the satellite era.

Then, for each tide station, I subtracted the pre-satellite trend from the satellite trend. And finally, I got the median and the uncertainty of those 258 trend differences. Figure 6 shows a graphic of those results.
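The three steps above can be sketched as follows, with synthetic stations standing in for the PSMSL records. The 0.5 mm/yr trend change, station count and noise levels are invented; the point is that differencing per-station trends cancels any constant vertical land motion:

```python
import numpy as np

rng = np.random.default_rng(2)

def window_trend(t, y, t0, t1):
    """Linear trend (mm/yr) of y over [t0, t1), or None if <90% coverage."""
    m = (t >= t0) & (t < t1) & ~np.isnan(y)
    if m.sum() < 0.9 * 12 * (t1 - t0):        # monthly data expected
        return None
    return np.polyfit(t[m], y[m], 1)[0]

# Synthetic monthly stations, each with its own vertical-land-motion rate,
# plus a common 0.5 mm/yr increase in trend in the second window (invented).
t = np.arange(1972, 2014, 1/12.0)
diffs = []
for _ in range(50):
    vlm = rng.normal(0, 3)                    # station-specific land motion
    rate1 = vlm + 1.5                         # pre-satellite-era trend
    rate2 = vlm + 2.0                         # trend rises by 0.5 mm/yr
    y = np.where(t < 1993, rate1*(t - 1972), rate1*21 + rate2*(t - 1993))
    y = y + rng.normal(0, 10, t.size)
    tr1 = window_trend(t, y, 1972, 1993)
    tr2 = window_trend(t, y, 1993, 2014)
    if tr1 is not None and tr2 is not None:
        diffs.append(tr2 - tr1)

# The constant land-motion term appears in both windows and cancels in the
# difference, so the median difference isolates the change in sea level trend.
median_diff = float(np.median(diffs))
```

Because land motion at a given gauge is essentially constant over four decades, this difference-of-trends statistic sidesteps the vertical-land-movement problem that plagues absolute tide-gauge trends.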

Figure 6. Comparison between the values and the errors of the difference between the 21-year trend before the 1993 start of the satellite record, and the succeeding 21-year trend from 1993 to the end of the Church and White records. The C&W trends are shown in Figure 3 above.

Since the error bars (orange and red) do not overlap, we can say that the C&W estimate does NOT agree with the tide gauge data. And that, of course, means that it has been artificially increased by cross-pollution with satellite data.

Let me close by saying that I think that it is very bad scientific practice to splice together a terrestrial and a satellite record unless they agree well during the period of overlap. In this case, they disagree greatly over the period of record. For the detrended values over the period of overlap (1993-2013), the R^2 value is 0.01 and the P-value is 0.37 … in other words, there is absolutely no significant correlation between the satellite data and the C&W estimate.
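The overlap-correlation check can be reproduced in outline: detrend both series over the overlap period and correlate the residuals. The series below are synthetic (independent noise on a common trend), so the low R² here merely illustrates the computation, not Willis’s actual 0.01 result:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1993, 2014, 1/12.0)

# Two synthetic series with the same trend but independent fluctuations
# (stand-ins for the satellite and C&W values; numbers are invented).
a = 3.0*(t - 1993) + rng.normal(0, 5, t.size)
b = 3.0*(t - 1993) + rng.normal(0, 5, t.size)

def detrend(t, y):
    """Remove the best-fit line from y."""
    slope, intercept = np.polyfit(t, y, 1)
    return y - (slope*t + intercept)

# Correlate the detrended residuals; shared trend no longer inflates r.
da, db = detrend(t, a), detrend(t, b)
r = np.corrcoef(da, db)[0, 1]
r_squared = r**2
```

Detrending first is essential: two series that share a trend will show a spuriously high raw correlation even if their year-to-year fluctuations are completely unrelated.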

And this makes it very likely that Church and White are manufacturing sea level acceleration where none exists … bad scientists, no cookies.

Finally, at the end of my research into this, I find that I’m not the only one to notice the discrepancy …

Figure 7. Different results for the satellite era depending on whether or not the satellite data is illegitimately spliced into the tide gauge records. SOURCE

My best to everyone. Here I’m staying indoors on a rainy Sunday, watching American football and researching the vagaries of sea level …


As Always: I politely request that when you comment, you quote the exact words you are discussing, to avoid misunderstandings.

Data: So that others won’t have the hassles I had extracting and collating the data, I’ve put the full PSMSL dataset as a single comma-separated values (CSV) file here, and the PSMSL catalog here.