Ocean SSTs Cooler in December

Science Matters

Note: A drogue is a sea anchor that slows a float's drift.

The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content, (unlike air temperatures), so give a better reading of heat content variations;
  • A major El Nino was the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source, the latest version being HadSST3.  More on what distinguishes HadSST3 from other SST products at the end.

The Current Context

The chart below shows SST monthly anomalies as reported in HadSST3 starting in 2015 through December 2018.

[Chart: HadSST3 monthly SST anomalies, 2015 through December 2018]

A global cooling pattern is seen clearly in the Tropics since its peak in 2016, joined by NH and SH…


ARGO—Fit for Purpose?

Reblogged from Watts Up With That:

By Rud Istvan

This is the second of two guest posts on whether ‘big’ climate science missions are fit for purpose, inspired by ctm as seaside lunch speculations.

The first post dealt with whether satellite altimetry, specifically NASA’s newest Jason3 ‘bird’, was fit for sea level rise (SLR) ‘acceleration’ purpose. It found using NASA’s own Jason3 specs that Jason3 (and so also its predecessors) likely was NOT fit–and never could have been–despite its SLR data being reported by NASA to 0.1mm/yr. We already knew that annual SLR is low single digit millimeters. The reasons satellite altimetry cannot provide that level of precision are very basic, and were known to NASA beforehand—Earth’s requisite reference ellipsoid is lumpy, oceans have varying waves, atmosphere has varying humidity—so NASA never really had a chance of achieving what they aspired to: satalt missions to measure sea level rise to fractions of a millimeter per year equivalent to tide gauges. NASA claims they can, but their specifications say they cannot. The post proved lack of fitness via overlap discrepancies between Jason2 and Jason3, plus failure of NASA SLR estimates to close.

This second related guest post asks the same question of ARGO.

Unlike Jason3, ARGO had no good pre-existing tide gauge equivalent for mission comparison. Its novel oceanographic purposes (below) tried to measure several things ‘rigorously’ for the very first time. ‘Rigorously’ did NOT mean precisely. One, ocean heat content (OHC), was previously very inadequately estimated. OHC is much more than just sea surface temperatures (SSTs). SSTs (roughly, but not really, surface) were formerly measured by trade route dependent buckets/thermometers, or by trade route and ship lading dependent engine intake cooling water temperatures. The deeper ocean was not measured at all until the inherently depth-inaccurate XBT sensors were developed for the Navy.

Whether ARGO is fit for purpose involves a complex unraveling of design intent plus many related facts. The short ARGO answer is probably yes, although OHC error bars are provably understated in ARGO based scientific literature.

For those WUWT readers wishing a deeper examination of this guest post’s summary conclusions, a treasure trove of ARGO history, implementation, and results is available at www.argo.ucsd.edu. Most of this post is either directly derived therefrom, from references found therein, or from Willis Eschenbach’s previous WUWT ARGO posts (many searchable using ARGO), with the four most relevant directly linked below.

This guest post is divided into three parts:

1. What was the ARGO design intent? Unlike simple Jason3 SLR, ARGO has a complex set of overlapping oceanographic missions.

2. What were/are the ARGO design specs relative to its missions?

3. What do facts say about ARGO multiple mission fitness?

Part 1 ARGO Intent

ARGO was intended to explore a much more complicated set of oceanography questions than Jason’s simple SLR acceleration. The ideas were developed by oceanographers at Scripps circa 1998-1999 based on a decade of previous regional ocean research, and were formulated into two intent/design documents agreed by the implementing international ARGO consortium circa 2000. There were several ARGO intended objectives. The three most explicitly relevant to this summary post were:

1. Global ocean heat climatology (OHC with intended accuracy explicitly defined as follows)

2. Ocean ‘fresh water storage’ (upper ocean rainfall salinity dilution)

3. Map of non-surface currents

All were intended to provide “global coverage of the upper ocean on broad spatial scales and time frames of several months or longer.”

Unlike Jason3, no simple yes/no ‘fit for purpose’ for ARGO’s multiple missions is possible. It depends on which mission over what time frame.

Part 2 ARGO Design

The international design has evolved. Initially, the design was ~3000 floats providing a random roughly 3 degree lat/lon ocean spacing, explicitly deemed sufficient spatial resolution for all ARGO intended oceanographic purposes.

There is an extensive discussion of the array’s accuracy/cost tradeoffs in the original intent/design documentation. The ARGO design “is an ongoing exercise in balancing the array’s requirements against the practical limitations imposed by technology and resources”. Varying perspectives still provided (1998-99) “consistent estimates of what is needed.” Based on previous profiling float experiments, “in approximate terms an array with spacing of a few hundred kilometers is sufficient to determine surface layer heat storage (OHC) with an accuracy of about 10W/m2 over areas (‘pixels’) about 1000km on a side.” Note the abouts.

The actual working float number is now about 3800. Each float was to last 4-5 years battery life; the actual is ~4.1 years. Each float was to survive at least 150 profiling cycles; this has been achieved (150 cycles*10 days per cycle/365 days per year equals 4.1 years). Each profile cycle was to be 10 days, drifting randomly at ~1000 meters ‘parking depth’ at neutral buoyancy for 9, then descending to 2000 meters to begin measuring temperature and salinity, followed by a ~6 hour rise to the surface with up to 200 additional measurement sets of pressure (giving depth), temperature, and salinity. This was originally followed by 6-12 hours on the surface transmitting data (now <2 hours using the Iridium satellite system) before sinking back to parking depth.
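The lifetime arithmetic in the paragraph above checks out, as a quick sketch shows:

```python
# Sanity check of the ARGO design arithmetic quoted above:
# 150 profiling cycles at 10 days per cycle vs. the ~4.1 year observed battery life.
cycles = 150
days_per_cycle = 10
lifetime_years = cycles * days_per_cycle / 365
print(f"{lifetime_years:.1f} years")  # prints "4.1 years"
```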

The basic ARGO float design remains:

[Figure: basic ARGO float design]

And the basic ARGO profiling pattern remains:

[Figure: basic ARGO profiling cycle]

‘Fit for purpose’ concerning OHC (via the 2000 meter temperature profile) presents two relevant questions. (1) Is 2000 meters deep enough? (2) Are the sensors accurate enough to estimate the 10W/m2 per 1000km/side ‘pixel’?

With respect to depth, there are two differently sourced yet similar ‘yes’ answers for all mission intents.

For salinity, the ARGO profile suffices. Previous oceanographic studies showed (per the ARGO source docs) that salinity is remarkably unvarying below about 750 meters depth in all oceans. This fortunately provides a natural salinity ‘calibration’ for those empirically problematic sensors.

It also means seawater density is roughly constant over about 2/3 of the profile, so pressure is a sufficient proxy for depth (and pressure can also be calibrated by measured salinity above 750 meters translated to density).

For temperature, as the typical thermocline profiles in the following figure (in °F, not °C) show, the ARGO temperature-depth profile does not depend much on latitude: 2000 meters (~6500 feet) reaches the approximately constant deep-ocean equilibrium temperature at all latitudes, providing another natural ARGO ‘calibration’. The 2000 meter ARGO profile was a wise intent/design choice.

[Figure: typical ocean thermocline temperature profiles by latitude (°F)]

Part 3 Is ARGO fit for purpose?

Some further basics are needed as background to the ARGO objectives.

When an ARGO float surfaces to transmit its data, its position is ascertained via GPS to within about 100 meters. Given the vastness of the oceans, that is more than enough positional precision for the ‘broad spatial scales’ of deep current drift and 1,000,000 km² OHC/salinity ‘pixels’.
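A back-of-envelope sketch puts the array's spatial coverage in perspective. The ocean-area figure is a standard round number (~3.6 × 10⁸ km²), not from the post:

```python
# How many 1000 km x 1000 km 'pixels' cover the ocean, and how many of the
# ~3800 working ARGO floats fall in each one, on average?
ocean_area_km2 = 3.6e8            # standard round figure for global ocean area
pixel_area_km2 = 1000 * 1000      # 1,000,000 km^2 per design 'pixel'
floats = 3800                     # current working float count from the post
pixels = ocean_area_km2 / pixel_area_km2
print(f"{pixels:.0f} pixels, {floats / pixels:.1f} floats per pixel")
# prints "360 pixels, 10.6 floats per pixel"
```

So each design pixel holds roughly ten floats, consistent with the "few hundred kilometers" spacing the design documents call sufficient.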

Thanks to salinity stability below 750 meters, ARGO ‘salinity corrected’ instruments are accurate (after float specific corrections) to ±0.01psu, giving reasonable estimates of ‘fresh water storage’. A comparison of 350 retrieved ‘dead battery’ ARGO floats showed that 9% were still out of ‘corrected’ salinity calibration at end of life, unavoidably increasing salinity error a little.

The remaining big ‘sufficient accuracy’ question is OHC, and issues like Trenberth’s infamous “Missing Heat” covered in the eponymous essay in ebook Blowing Smoke. OHC is a very tricky sensor question, since the vast heat capacity of ocean water means a very large change in ocean heat storage translates into a very small change in absolute seawater temperature.
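The point can be illustrated with round numbers. The seawater density and specific heat below are standard textbook values, not from the post:

```python
# A sustained 10 W/m^2 imbalance (the design OHC accuracy target) warming a
# 2000 m seawater column for a full year yields only hundredths of a degree.
rho = 1025.0        # kg/m^3, typical seawater density
cp = 3990.0         # J/(kg K), typical seawater specific heat
depth = 2000.0      # m, the ARGO profiling depth
flux = 10.0         # W/m^2
seconds_per_year = 365.25 * 24 * 3600
energy = flux * seconds_per_year              # J per m^2 over one year
delta_t = energy / (rho * cp * depth)         # resulting temperature change, K
print(f"{delta_t:.3f} K per year")            # prints "0.039 K per year"
```

Hence millikelvin-class sensors are not a luxury for OHC work; they are the minimum.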

How good are the ARGO temperature sensors? On the surface, it might seem to depend, since as an international consortium, ARGO does not have one float design. There are presently five: Provor, Apex, Solo, S2A, and Navis.

However, those 5 only ever embodied two temperature sensors, FS1 and SBE. Turns out—even better for accuracy—FS1 was retired late in 2006 when JPL’s Willis published the first ARGO OHC analysis after full (3000 float) deployment, finding (over too short a time frame, IMO) OHC was decreasing (!). Oops! Further climate science analysis purportedly showed FS1 temperature profiles in a few hundred of the early ARGO floats were probably erroneous. Those floats were taken out of service, leaving just SBE sensors. All five ARGO float designs use current model SBE38 from 2015.

SeaBird Scientific builds that sensor, and its specs can be found at www.seabird.com. The SeaBird SBE38 sensor spec is the following (sorry, but it doesn’t copy well from their website, where all docs are in a funky form of PDF probably intended to prevent partial duplication like for this post).

Measurement Range: -5 to +35 °C

Initial Accuracy (1): ±0.001 °C (1 mK)

Typical Stability: 0.001 °C (1 mK) in six months, certified

Resolution:

Response Time (2): 500 msec

Self-heating Error: < 200 μK

(1) NIST-traceable calibration applying over the entire range.
(2) Time to reach 63% of final value following a step change in temperature.

That is a surprisingly good seawater temperature sensor: accurate to a NIST-calibrated 0.001 °C, with a certified drift of ±0.001 °C per six months (1/8 of a float lifetime). UCSD says in its ARGO FAQs that the ARGO temperature data it provides is accurate to ±0.002 °C. This suffices to meet the intended ~10 W/m² OHC accuracy per 1,000,000 km² ARGO ‘pixel’.

BUT, there is still a major ‘fit for purpose’ problem despite all the ARGO strong positives. Climate papers based on ARGO habitually understate the actual resulting OHC uncertainty of about 10 W/m². (Judith Curry has called this one form of her ‘uncertainty monster’.) Willis Eschenbach has posted extensively here at WUWT (over a dozen guest posts already) on ARGO and its findings. His four posts most relevant to the ‘fit for purpose’ scientific paper uncertainty question are from 2012-2015; the links, which Willis kindly provided via email, need no explanation:


Decimals of Precision

An Ocean of Overconfidence

More Ocean-Sized Errors In Levitus Et Al.

Can We Tell If The Oceans Are Warming

And so, concerning the ARGO ‘fit for purpose’ question, we can conclude: yes, it probably is, but only if ARGO-based science papers also correctly report the associated ARGO design uncertainty (error bars) for ‘rigorous albeit broad spatial resolution’.

CO2 Not So Much, 60 Year Cycle Paper Actually Got Published

Reblogged from Musings from the Chiefio:

The article is cited in a couple of other places. I ran into it here:

https://tallbloke.wordpress.com/2019/01/13/geoscientists-reconstruct-eye-opening-900-year-northeast-climate-record/

Where Tallbloke points to the Elsevier / Science Direct origin (where it is paywalled…)

This supports the idea that temperature cycles in the region of 60 years are very likely a common feature of Earth’s climate.

Deploying a new technique for the first time in the region, geoscientists at the University of Massachusetts Amherst have reconstructed the longest and highest-resolution climate record for the Northeastern United States, which reveals previously undetected past temperature cycles and extends the record 900 years into the past, well beyond the previous early date of 1850, reports Phys.org.

And points at the description of the article at phys.org:

https://phys.org/news/2019-01-geoscientists-reconstruct-eye-opening-year-northeast.html

As Miller explains, they used a relatively new quantitative method based on the presence of chemical compounds known as branched glycerol dialkyl glycerol tetra ethers (branched GDGTs) found in lakes, soils, rivers and peat bogs around the world. The compounds can provide an independent terrestrial paleo-thermometer that accurately assesses past temperature variability.

Miller says, “This is the first effort using these compounds to reconstruct temperature in the Northeast, and the first one at this resolution.” He and colleagues were able to collect a total of 136 samples spanning the 900-year time span, many more than would be available with more traditional methods and from other locations that typically yield just one sample per 30-100 years.

I make that about 6 2/3 years per sample, so roughly 9 samples per 60-year cycle. A bit coarse, but it ought to resolve the cycle with 4 to 5 samples per arc of excursion.
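The resolution arithmetic above, spelled out:

```python
# Sample spacing of the UMass Amherst GDGT record quoted above:
# 136 samples over a 900-year span, versus a ~60-year cycle.
years = 900
samples = 136
per_sample = years / samples
print(f"{per_sample:.1f} years per sample, "
      f"{60 / per_sample:.0f} samples per 60-year cycle")
# prints "6.6 years per sample, 9 samples per 60-year cycle"
```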

I find it a bit amusing that they are all worked up about having rediscovered the Medieval Warm Period and the Little Ice Age; but OK, at least we’re finally getting back to reality. They are amazed at the “new” finding of the same 60ish year cycle that has been found just about everywhere anyone actually looks for it. OK… All bolding by me.

In their results, Miller says, “We see essentially cooling throughout most of the record until the 1900s, which matches other paleo-records for North America. We see the Medieval Warm Period in the early part and the Little Ice Age in the 1800s.” An unexpected observation was 10, 50-to-60-year temperature cycles not seen before in records from Northeast U.S., he adds, “a new finding and surprising. We’re trying to figure out what causes that. It may be caused by changes in the North Atlantic Oscillation or some other atmospheric patterns. We’ll be looking further into it.”

He adds, “We’re very excited about this. I think it’s a great story of how grad students who come up with a promising idea, if they have enough support from their advisors, can produce a study with really eye-opening results.” Details appear in a recent issue of the European Geophysical Union’s open-access online journal, Climate of the Past.

The authors point out that paleo-temperature reconstructions are essential for distinguishing human-made climate change from natural variability, but historical temperature records are not long enough to capture pre-human-impact variability. Further, using conventional pollen- and land-based sediment samples as climate proxies can reflect confounding parameters rather than temperature, such as precipitation, humidity, evapo-transpiration and vegetation changes.

Or put more succinctly, our thermometer record is short and lousy and our proxy record is pretty damn poor too.

Then TallBloke also points at a couple of other links. Here’s the paywall:

https://www.sciencedirect.com/science/article/pii/S0012825216300277

Anthropogenic CO2 warming challenged by 60-year cycle

Author François Gervais

Abstract

Time series of sea-level rise are fitted by a sinusoid of period ~ 60 years, confirming the cycle reported for the global mean temperature of the earth. This cycle appears in phase with the Atlantic Multidecadal Oscillation (AMO). The last maximum of the sinusoid coincides with the temperature plateau observed since the end of the 20th century. The onset of declining phase of AMO, the recent excess of the global sea ice area anomaly and the negative slope of global mean temperature measured by satellite from 2002 to 2015, all these indicators sign for the onset of the declining phase of the 60-year cycle. Once this cycle is subtracted from observations, the transient climate response is revised downwards consistent with latest observations, with latest evaluations based on atmospheric infrared absorption and with a general tendency of published climate sensitivity. The enhancement of the amplitude of the CO2 seasonal oscillations which is found up to 71% faster than the atmospheric CO2 increase, focus on earth greening and benefit for crops yields of the supplementary photosynthesis, further minimizing the consequences of the tiny anthropogenic contribution to warming.

I found a non-paywall copy up here:

http://www.skyfall.fr/wp-content/2016/05/Earth-Science-Reviews_FG_2016-.pdf

So download your copy while you can…

A nice summary of other 60 year-ish cycle evidence in this link also from TallBloke:
http://appinsys.com/globalwarming/SixtyYearCycle.htm

So what are these “branched glycerol dialkyl glycerol tetraethers”?
Found that answer here:

https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2010JG001365

From their Figure 2:

[Figure 2: branched GDGT molecular structures]

Strange things made in the membrane of various microbes in the water column that change with the temperature.

From the actual paper intro:

A cycle of period ~60 years has been reported in global mean temperature of the earth (Schlesinger and Ramankutty, 1994; Ogurtsov et al., 2002; Klyashtorin and Lyubushin, 2003; Loehle, 2004; Zhen-Shan and Xian, 2007; Carvalo et al., 2007; Swanson and Tsonis, 2009; Scafetta, 2009; Akasofu, 2010; D’Aleo and Easterbrook, 2010; Loehle and Scafetta, 2011; Humlum et al., 2011; Chambers et al., 2012; Lüdecke et al., 2013; Courtillot et al., 2013; Akasofu, 2013; Macias et al., 2014; Ogurtsov et al., 2015). This cycle and others of smaller amplitude were found to be correlated with the velocity of the motion of the sun with respect to the center of mass of the solar system (Scafetta, 2009). This cycle is also in phase with AMO index (Knudsen et al., 2011; McCarthy et al., 2015).

Section 2 will search for additional signatures of this 60-year cycle in major components and sensitive indicators of climate. The impact on climate of the CO2 emitted by burning of fossil fuels is a long-standing debate illustrated by 1637 papers found in the Web of Science by crossing the keywords

[anthropogenic] AND [greenhouse OR CO2] AND [warming]

This is to be compared to more than 1350 peer-reviewed papers which express reservations about dangerous anthropogenic CO2 warming and/or insist on the natural variability of climate (Andrew, 2014). The transient climate response (TCR) is defined as the change in global mean surface temperature at the time of doubling of atmospheric CO2 concentration. The range of uncertainty reported by AR5 (2013) is very wide, 1–2.5 °C. More recent evaluations, later than the publication of AR5 (2013), focus on low values lying between 0.6 °C and 1.4 °C (Harde, 2014; Lewis and Curry, 2014; Skeie et al., 2014; Lewis, 2015). The infrared absorption of CO2 is well documented since the availability of wide-band infrared spectrometry (Ångström, 1900).

OMG! Actually talking about solar motion and AMO connections! Next thing you know they will discover the lunar cycle involvement and how tides are directly shifting the ocean and air flows as part of those celestial motions…

The intro then goes into a discussion of the IR spectra and transmission where it finds the approved models lacking and also finds that yes, Virginia, we have had a pause in temperature rises…

The controversy has reached a novel phase because, contrary to CMIP3 and CMIP5 warming projections (AR5, 2013), global mean temperatures at the surface of the earth display a puzzling « plateau » or « pause » or « hiatus » since the end of the last century (McKitrick, 2014). This hiatus seems to have encouraged climate modelers to refrain from exaggerated warming projections.

The paper is full of many such goodies. It goes on to find a large 60 year cycle effect and a very muted CO2 effect.
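The paper's core move, fitting a ~60-year sinusoid plus a linear trend to a time series, can be sketched in a few lines. The data here are synthetic (not the paper's), purely to show that the technique recovers a known period:

```python
import numpy as np
from scipy.optimize import curve_fit

# Build a synthetic "temperature" series: a 60-year sinusoid on a linear
# trend, plus measurement noise. Then fit the same functional form back.
rng = np.random.default_rng(0)
t = np.arange(1880, 2016, dtype=float)                 # years
y = 0.12 * np.sin(2 * np.pi * t / 60.0) + 0.005 * (t - t[0])
y += rng.normal(0.0, 0.02, t.size)                     # noise

def model(t, amp, period, phase, slope, offset):
    # sinusoid of unknown period riding on a linear trend
    return amp * np.sin(2 * np.pi * t / period + phase) + slope * t + offset

p0 = [0.1, 55.0, 0.0, 0.005, -9.0]                     # rough initial guesses
params, _ = curve_fit(model, t, y, p0=p0)
print(f"fitted period: {params[1]:.1f} years")
```

Once such a cycle is fitted, subtracting it from the observations (as the paper does) leaves the residual trend from which a lower transient climate response is inferred.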

Skipping down to the summary:

Dangerous anthropogenic warming is questioned (i) upon recognition of the large amplitude of the natural 60–year cyclic component and (ii) upon revision downwards of the transient climate response consistent with latest tendencies shown in Fig. 1, here found to be at most 0.6 °C once the natural component has been removed, consistent with latest infrared studies (Harde, 2014). Anthropogenic warming well below the potentially dangerous range were reported in older and recent studies (Idso, 1998; Miskolczi, 2007; Paltridge et al., 2009; Gerlich and Tscheuschner, 2009; Lindzen and Choi, 2009, 2011; Spencer and Braswell, 2010; Clark, 2010; Kramm and Dlugi, 2011; Lewis and Curry, 2014; Skeie et al., 2014; Lewis, 2015; Volokin and ReLlez, 2015). On inspection of a risk of anthropogenic warming thus toned down, a change of paradigm which highlights a benefit for mankind related to the increase of plant feeding and crops yields by enhanced CO2 photosynthesis is suggested.

I strongly recommend a download and careful reading of the paper.

December Cooling by Sea, More than by Land

Science Matters

banner-blog

With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  UAH has updated their tlt (temperatures in lower troposphere) dataset for December.   Previously I have done posts on their reading of ocean air temps as a prelude to updated records from HADSST3. This month I will add a separate graph of land air temps because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually…


Little Ice Age Still Cooling Pacific

sunshine hours

Lag: a period of time between one event or phenomenon and another

Little Ice Age Still Cooling Pacific

As much of the ocean responds to the rising temperatures of today’s world, the deep, dark waters at the bottom of the Pacific Ocean appear to be doing the exact opposite.

A Harvard study has found that parts of the deep Pacific may be getting cooler as the result of a climate phenomenon that occurred hundreds of years ago.

Around the 17th century, Earth experienced a prolonged cooling period dubbed the Little Ice Age that brought chillier-than-average temperatures to much of the Northern Hemisphere.

Though it’s been centuries since this all played out, researchers say the deep Pacific appears to lag behind the waters closer to the surface, and is still responding to the Little Ice Age.

In the deep Pacific Ocean, however, temperatures are dropping. This effect could be seen at a…


University of Exeter research sheds new light on what drove last, long-term global climate shift

Reblogged from Watts Up With That:

Public Release: 19-Dec-2018

The quest to discover what drove the last, long-term global climate shift on Earth, which took place around a million years ago, has taken a new, revealing twist.

A team of researchers led by Dr Sev Kender from the University of Exeter, have found a fascinating new insight into the causes of the Mid-Pleistocene Transition (MPT) – the phenomenon whereby the planet experienced longer, intensified cycles of extreme cold conditions.

While the causes of the MPT are not fully known, one of the most prominent theories suggests it may have been driven by reductions in glacial CO2 emissions.

Now, Dr Kender and his team have discovered that the closure of the Bering Strait during this period due to glaciation could have led the North Pacific to become stratified – or divided into distinct layers – causing CO2 to be removed from the atmosphere. This would, they suggest, have caused global cooling.

The team believe the latest discovery could provide a pivotal new understanding of how the MPT occurred, but also give a fresh insight into the driving factors behind global climate changes.

The research is published in Nature Communications on December 19th 2018.

Dr Kender, a co-author on the study from the Camborne School of Mines, based at the University of Exeter’s Penryn Campus in Cornwall said: “The subarctic North Pacific is composed of some of the oldest water on Earth, which has been separated from the atmosphere for such a long time that a high concentration of dissolved CO2 has built up at depth. When this water upwells to the surface, some of the CO2 is released. This is thought to be an important process in geological time, causing some of the global warming that followed past glaciations.

“We took deep sediment cores from the bottom of the Bering Sea that gave us an archive of the history of the region. By studying the chemistry of sediment and fossil shells from marine protists called foraminifera, we reconstructed plankton productivity, and surface and bottom water masses. We were also able to better date the sediments so that we could compare changes in the Bering Sea to other global changes at that time.

“We discovered that the Bering Sea region became more stratified during the MPT with an expanded intermediate-depth watermass, such that one of the important contributors to global warming – the upwelling of the subarctic North Pacific – was effectively curtailed.”

The Earth’s climate has always been subjected to significant changes, and over the past 600,000 years and more it has commonly oscillated between warm periods, similar to today, and colder, ‘glacial’ periods when large swathes of continents are blanketed under several kilometres of ice.

These regular, natural changes in the Earth’s climate are governed by changes in how the Earth orbits around the sun, and variations in the tilt of its axis caused by gravitational interactions with other planets.

These changes, known as orbital cycles, can affect how solar energy is dispersed across the planet. Some orbital cycles can, therefore, lead to colder summers in the Northern Hemisphere which can trigger the start of glaciations, while later cycles can bring warmer summers, causing the ice to melt.

These cycles can be influenced by a host of factors that can amplify their effect. One of which is CO2 levels in the atmosphere.

As the MPT occurred during a period when there were no apparent changes in the nature of the orbit cycles, scientists have long been attempting to discover what drove the changes to take place.

For this research, Dr Kender and his team drilled for deep-sea sediment in the Bering Sea, in conjunction with the International Ocean Discovery Program, and measured the chemistry of the fossil shells and sediments.

The team were able to create a detailed reconstruction of oceanic water masses through time – and found that the closure of the Bering Strait caused the subarctic North Pacific to become stratified during this period of glaciation.

This stratification, they argue, would have removed CO2 from the atmosphere and caused global cooling.

Dr Kender added: “Today much of the cold water produced by sea ice action flows northward into the Arctic Ocean through the Bering Strait. As glaciers grew and sea levels fell around 1 million years ago, the Bering Strait would have closed, retaining colder water within the Bering Sea. This expanded watermass appears to have stifled the upwelling of deep CO2-rich water and allowed the ocean to sequester more CO2 out of the atmosphere. The associated cooling effect would have changed the sensitivity of Earth to orbital cycles, causing colder and longer glaciations that characterise climate ever since.

“Our findings highlight the importance of understanding present and future changes to the high latitude oceans, as these regions are so important for long term sequestration or release of atmospheric CO2.”

N. Atlantic SST Plunging

Science Matters

RAPID Array measuring North Atlantic SSTs.

For the last few years, observers have been speculating about when the North Atlantic will start the next phase shift from warm to cold.

Source: Energy and Education Canada

An example is this report in May 2015 The Atlantic is entering a cool phase that will change the world’s weather by Gerald McCarthy and Evan Haigh of the RAPID Atlantic monitoring project. Excerpts in italics with my bolds.

This is known as the Atlantic Multidecadal Oscillation (AMO), and the transition between its positive and negative phases can be very rapid. For example, Atlantic temperatures declined by 0.1ºC per decade from the 1940s to the 1970s. By comparison, global surface warming is estimated at 0.5ºC per century – a rate twice as slow.

In many parts of the world, the AMO has been linked with decade-long temperature and rainfall trends. Certainly – and perhaps…
