Comparison of global climatologies confirms warming of the global ocean

Reblogged from Watts Up With That:

Institute of Atmospheric Physics, Chinese Academy of Sciences


IMAGE: Deployment of an APEX float from a German research ship.

Credit: Argo

The global ocean represents the most important component of the Earth's climate system. The oceans accumulate heat energy and transport heat from the tropics to higher latitudes, responding very slowly to changes in the atmosphere. Digital gridded climatologies of the global ocean provide helpful background information for many oceanographic, geochemical, and biological applications. Because both the global ocean and the observational basis are changing, periodic updates of ocean climatologies are needed, in line with the World Meteorological Organization's recommendation to provide decadal updates of atmospheric climatologies.

“Constructing ocean climatologies consists of several steps, including data quality control, adjustments for instrumental biases, and filling the data gaps by means of a suitable interpolation method”, says Professor Viktor Gouretski of the University of Hamburg and a scholarship holder of the Chinese Academy of Sciences’ President’s International Fellowship Initiative (PIFI) at the Institute of Atmospheric Physics, Chinese Academy of Sciences, and the author of a report recently published in Atmospheric and Oceanic Science Letters.

“Sea water is essentially a two-component system, with a nonlinear dependency of density on temperature and salinity, with the mixing in the ocean interior taking place predominantly along isopycnal surfaces. Therefore, interpolation of oceanic parameters should be performed on isopycnals rather than on isobaric levels, to minimize production of artificial water masses. The differences between these two methods of data interpolation are most pronounced in the high-gradient regions like the Gulf Stream, Kuroshio, and Antarctic Circumpolar Current,” continues Professor Gouretski.

In his recent report, Professor Gouretski presents a new World Ocean Circulation Experiment/ARGO Global Hydrographic Climatology (WAGHC), with temperature and salinity averaged on local isopycnal surfaces. Based on high-quality ship-board data and temperature and salinity profiles from ARGO floats, the new climatology has a monthly resolution and is available on a 1/4° latitude-longitude grid.

“We have compared the WAGHC climatology with NOAA’s WOA13 gridded climatology. These climatologies represent alternative digital products, but the WAGHC has benefited from the addition of new ARGO float data and hydrographic data from the North Polar regions”, says Professor Gouretski. “The two climatologies characterize mean ocean states that are 25 years apart, and the zonally averaged section of the WAGHC-minus-WOA13 temperature difference clearly shows the ocean warming signal, with a mean temperature increase of 0.05°C for the upper 1500-m layer since 1984”.


Willis’ Favorite Airport

Reblogged from Watts Up With That:

By Steven Mosher

AC Osborn made an interesting comment about airports that will give me an opportunity to do two things: Pay tribute to Willis for inspiring me and give you all a few more details about airports and GHCN v4 stations. Think of this as a brief but necessary sideline before returning to the investigation of how many stations in GHCNv4 are “ruralish” or “urbanish”. In his comments AC was most interested in how placement at airports would bias the records and my response was that he was talking about microsite and I would get to that eventually. Also a few other folks had some questions about microsite versus LCZ, so let’s start with a super simple diagram.

fig01

We can define microsite bias as any disturbance/encroachment at the site location which biases the measurement up or down within the “footprint” of the sensor. For a thermometer at 1.5 meters, this range varies from a few meters in unstable conditions to hundreds of meters in stable conditions. In the recent NOAA study, they found bias up to 50 meters away from a disturbance. I’ve drawn this as the red circle, but in practice, depending on prevailing wind, it is an ellipse. The NOAA experiment (more on that in a future post) put sensors at 4 m, 50 m, and 124 m from a building and found

The mean urban bias for these conditions quickly dropped from 0.84 °C at tower-A (4 m) to 0.55 and 0.01 °C at towers-B′ and -C, located 50 and 124 meters from the small-scale built environment. Despite a mean urban signal near 0.9 °C at tower-A, the mean urban biases were not statistically significant given the magnitude of the towers' 2 standard deviations: 0.44, 0.40, 0.37, and 0.31 °C for towers-A, -B, -B′, and -C, respectively.

While not statistically significant, the authors still recommend precaution and suggest that the first 100 m of a site be free of encroachments. In field experiments on the effect of roads on air temperature measured at 1.5 m, a bias of 0.1 °C was found as far as 10 m away from roads. At airports this distance should probably be increased: where the runway is 50+ m wide, the effect of the asphalt on the air temperature is roughly 1.2 °C at the edge of the runway and diminishes to ~0.1 °C by 150 m away from the runway (Kinoshita, N. (2014). An Evaluation Method of the Effect of Observation Environment on Air Temperature Measurement. Boundary-Layer Meteorology). Exercising even more caution, I’ve extended this out to 500 m, although it should be noted that this could classify good sites as “bad” sites and reduce differences in a good/bad comparison. Obviously, this range can be tested by sensitivity analysis.

Outside the red circle I’ve depicted the “Local Climate Zone”. Per Oke/Stewart this region can extend for kilometers. In simple terms you can think of two kinds of biases: those that arise from the immediate vicinity within the view of the sensor and have a direct impact on the sensor, and those that are outside the view of the sensor and act indirectly, say a tall set of buildings 800 m away that disturbs the natural airflow to the site. In the previous post, we were discussing the local scale; this is the scale at which we would term the bias “UHI.”

There is another source of bias, from faraway areas, which I will cover in another post. For now, we will use airports to understand the difference between these two scales. Let’s do that by picturing some extremes in our mind: an airport in Hong Kong, and an airport on a small island in the middle of the ocean. Both airports might have microsite bias, but the Hong Kong temperature would be influenced by the urban local climate zone with its artificial ground cover, while the island airport is surrounded by nonurban ocean, with no UHI from the ocean. Simplistically, the total bias at a site might be seen as a combination of a micro bias, a local bias, and a distant bias.
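That decomposition can be written as a toy additive model. The numbers below are hypothetical, chosen only to illustrate the contrast between the two airports; nothing here comes from actual station data:

```python
def total_bias(micro, local, distant):
    """Total site bias (°C) as the sum of micro, local, and distant components."""
    return micro + local + distant

# A Hong Kong-style airport: some microsite bias, plus a strong urban
# local-zone (LCZ) contribution from the surrounding city.
hong_kong_like = total_bias(micro=0.3, local=0.8, distant=0.1)

# A small-island airport: similar microsite bias, but the surrounding
# ocean contributes no urban local-zone bias.
island_like = total_bias(micro=0.3, local=0.0, distant=0.0)

print(hong_kong_like, island_like)
```

In practice the components are not cleanly separable or strictly additive; the point is only that identical microsite conditions can sit inside very different local zones.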

There are, logically, six conditions we can outline:

• Rural–natural: no micro bias, warm micro bias, or cool micro bias

• Urban–artificial: no micro bias, warm micro bias, or cool micro bias

It is important to remember that micro disturbances can bias in both directions, cooling by shading for example. And note that logically you could find a well-sited station in an urban location. This was hypothesized by Peterson long ago:

“In a recent talk at the World Meteorological Organization, T. Oke (2001, personal communication) stated that there has been considerable advancement in the understanding of urban climatology in the last 15 years. He went on to say that urban heat islands should be considered on three different scales. First, there is the mesoscale of the whole city. The second is the local scale on the order of the size of a park. And the third scale is the microscale of the garden and buildings near the meteorological observing site. Of the three scales, the microscale and local-scale effects generally are larger than mesoscale effects….

Gallo et al. (1996) examined the effect of land use/land cover on observed diurnal temperature range, and the results support the notion that microscale influences of land use/land cover are stronger than mesoscale ones. A metadata survey provided land use information within three radii: 100 m, 1 km, and 10 km. The analysis found that the strongest effect of differences in land use/land cover was for the 100-m radius, while the land use/land cover effect “remains present even at 10,000 m”….

Recent research by Spronken-Smith and Oke (1998) also concluded that there was a marked park cool island effect within the UHI. They report that under ideal conditions the park cool island can be greater than 5 °C, though in midlatitude cities they are typically 1–2 °C. In the cities studied, the nocturnal cooling in parks is often similar to that of rural areas. They reported that the thermal influence of parks on air temperatures appears to be restricted to a distance of about one park width….

Park cool islands are not the only potential mitigating factor for in situ urban temperature observations. Oceans and large lakes can have a significant influence on the temperature of nearby land stations whether the station is rural or urban. The stations used in this analysis that were within 2 km of the shore of a large body of water disproportionally tended to be urban (5.8% of urban were coastal versus 2.4% of rural).

Looking at airports will also help you cement the difference between the micro and the LCZ in your thinking. With that in mind, we will turn to airports and look at various pictures to understand the difference between the micro and the local: the nearby city, the nearby ocean, or field.

First, a few details about airports. In my metadata I have airports classified as small, medium, and large.

First, the small: some are paved. Pixels (30m) detected as artificial surface are colored orange:

clip_image004

Some are dirt

clip_image006

Now large airports

clip_image008

We will get to medium, but first a few other airports by water. This is a 10 km view; the blue dot is the station, and the red squares are 30-meter urban cover.

clip_image010

Zooming in

clip_image012

The medium airport I chose was one of Willis’ favorite airports, discussed in this post. Before we get to that visual, I encourage you all to read that post, because it put me on a six-year journey. Willis is rather rare among those who question climate science: he does his own work, and he raises interesting, testable questions. He doesn’t merely speculate; he looks and reads and does actual work. He raised two points I want to highlight:

Many of the siting problems have nothing to do with proximity to an urban area.

Instead, many of them have everything to do with proximity to jet planes, or to air conditioner exhaust, or to the back of a single house in a big field, or to being located over a patch of gravel.

And sadly, even with a map averaged on a 500 metre grid, there’s no way to determine those things.

And that’s why I didn’t expect they would find any difference … because their division into categories has little to do with the actual freedom of the station from human influences on the temperature. Urban vs Rural is not the issue. The real dichotomy is Well Sited vs Poorly Sited.

It is for this reason that I think that the “Urban Heat Island” or UHI is very poorly named. I’ve been agitating for a while to call it the LHI, for the “Local Heat Island”. It’s not essentially urban in nature. It doesn’t matter what’s causing the local heat island, whether it’s shelter from the wind as the trees grow up or proximity to a barbecue pit.

Nor does the local heat island have to be large. A thermometer sitting above a small patch of gravel will show a very different temperature response from one just a short distance away in a grassy field. The local heat island only needs to be big enough to contain the thermometer; one air conditioner exhaust is plenty, as is a jet exhaust.

I think we both agree that the micro, what he calls local, is important. However, the area outside the immediate vicinity cannot be discounted: Hong Kong airport, next to a huge city, is going to be influenced by that locale, whereas a large airport on an island next to the sea (see above) is arguably not going to be biased as much.

The second point Willis made was about the problems with 500-meter data, in particular the MODIS classification system, which required multiple adjacent pixels before a pixel was classified as urban. At that time we did not have a world database at 30 m; today we can look at that station and calculate the artificial area using 30 m data. The next four images show the site at various scales: 500 m, 1000 m, 5000 m, and lastly 10,000 m. At the microscale (<500 meters) the site is classified as greater than 10% artificial, at 1 km greater than 10% artificial, and at 5 km and 10 km less than 10% artificial.

clip_image014clip_image016clip_image018clip_image020
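A calculation of the artificial fraction at each scale can be sketched roughly as below. The array, station location, and "runway" geometry are hypothetical stand-ins; the actual analysis uses the global 30 m land-cover product:

```python
import numpy as np

PIXEL_M = 30  # land-cover grid resolution, meters

def artificial_fraction(is_artificial, row, col, radius_m):
    """Fraction of pixels within radius_m of (row, col) that are artificial.

    is_artificial: 2-D boolean array, True where the 30 m pixel is
    classified as artificial surface.
    """
    rr, cc = np.indices(is_artificial.shape)
    dist_m = np.hypot(rr - row, cc - col) * PIXEL_M
    return is_artificial[dist_m <= radius_m].mean()

# Hypothetical 3 km x 3 km tile (100 x 100 pixels) with a paved strip
# near a station at the tile center.
tile = np.zeros((100, 100), dtype=bool)
tile[48:52, 20:80] = True  # a 120 m x 1800 m artificial strip

for radius in (500, 1000):
    frac = artificial_fraction(tile, 50, 50, radius)
    print(f"within {radius} m: {frac:.1%} artificial")
```

The same fraction computed at widening radii is what lets a runway dominate the microscale statistic yet fade into the background at 5 km and 10 km.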

There were some concerns about the temperature at this station being used. However, there has never been enough data from this station to include it in any global series, even Berkeley’s. Nevertheless, it lets us see the kind of improvements that can be made now that higher-resolution data are available for the entire world. Also, even when airports are included in the data analysis, the bias can in some cases be reduced; here a 2 °C bias is removed.

One last small airport, to give you some idea of the data we can produce today.

clip_image022

AC Osborn also wanted to know just how many airports are in GHCN v4, and I think it’s safe to say that many skeptics believe the record is dominated by airport stations. Well, is it? We can count them and see. For this count I will use 1 km as a distance cutoff. There are a couple of ways to “determine” whether a station is at an airport. The least accurate way is to look at the names of the stations; this misses a large number of airports. To answer the question I use GPS coordinates compiled for over 55,000 airports worldwide, including small airports, heliports, balloon ports, and seaplane ports. I then calculate the distance between all 27K stations and the 55K airports and select the closest airport. I then cross-check against those stations in GHCN whose “name” indicates an airport.
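The distance step can be sketched with a standard haversine great-circle calculation. The station and airport coordinates below are made up for illustration; the real run pairs all ~27K GHCN stations against the ~55K airport records:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    earth_radius_km = 6371.0
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_km * asin(sqrt(a))

def nearest_airport(station, airports):
    """Return (name, distance_km) of the airport closest to a station."""
    lat, lon = station
    return min(
        ((name, haversine_km(lat, lon, alat, alon)) for name, alat, alon in airports),
        key=lambda pair: pair[1],
    )

# Hypothetical station and candidate airports (name, lat, lon).
station = (51.480, -0.450)
airports = [("A", 51.477, -0.452), ("B", 52.450, -1.740), ("C", 48.350, 11.790)]

name, dist_km = nearest_airport(station, airports)
print(name, round(dist_km, 2))  # the station counts as "at an airport" if dist_km <= 1
```

For 27K × 55K pairs a brute-force loop is slow but feasible; a spatial index (e.g. a k-d tree on projected coordinates) is the usual speedup.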

For this we consider a 1km distance for being “at an airport”. While this is farther than the microsite boundary, the point of the exercise is to illustrate that not all the stations are at airports.

Using 1 km as a cutoff, I find there are 1,129 stations by small airports, 1,830 by medium airports, and 267 by large airports, out of a total of ~27,000 stations.

To assess the ability of the 30 m data to detect airport runways and other artificial surfaces, we can look at the stations within 500 meters of a large airport and ask: does our 30 m data show artificial surface? There are 131 stations within 500 m of a large airport. We know that no sensor data/image classification system is perfect, but we can see that in the aggregate the 30 m data performs well.

clip_image024

We can also ask how many large airports are embedded in Local Climate Zones that have less than 10% artificial cover out to 10 km. As expected, large airports sit in local areas that are also built up at levels above 10%. You don’t get large airports where there are no people.

clip_image026

Conversely, you get small airports embedded in local zones that are not heavily built out, with only a few cases of small airports embedded in Local Climate Zones that are heavily built out.

clip_image028

Summary

Here are the points that I would like to emphasize.

1. We can discuss or differentiate between at least two types/sources of bias: the close and immediate, and those sources more distant.

2. Bias at the short range (micro) can be more important than bias at the long range.

3. A good site can be embedded in a “bad” area or “good” area, similarly for a bad site.

4. 30m data is better than 500m data

5. Skeptics should not argue that all the sites or a majority are at airports. They are not.

6. There are different types of airports.

7. One way to tell if there is a bias is by comparing Airports with Non airports.

Climate data shows no recent warming in Antarctica, instead a slight cooling

Reblogged from Watts Up With That:

Below is a plot from a resource we have not used before on WUWT, “RIMFROST“. It depicts the average temperatures for all weather stations in Antarctica. Note that there is some recent cooling in contrast to a steady warming since about 1959.

Data and plot provided by http://rimfrost.no 

Contrast that with claims by Michael Mann, Eric Steig, and others who used statistical tricks to make Antarctica warm up. Fortunately, it wasn’t just falsified by climate skeptics, but rebutted in peer review too.

Data provided by http://rimfrost.no 

H/T to Kjell Arne Høyvik on Twitter

ADDED:

No warming has occurred at the South Pole from 1978 to 2019 according to satellite data (UAH V6). The linear trend is flat!
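A "flat linear trend" claim of this kind can be checked by regressing the monthly anomalies on time. The series below is synthetic noise, not the actual UAH V6 South Pole data:

```python
import numpy as np

def trend_per_decade(anomalies):
    """Least-squares linear trend of a monthly anomaly series, in °C per decade."""
    t_years = np.arange(len(anomalies)) / 12.0
    slope_per_year = np.polyfit(t_years, anomalies, 1)[0]
    return slope_per_year * 10.0

# Synthetic 1978-2019 monthly anomalies: noise around zero, no built-in trend.
rng = np.random.default_rng(0)
anomalies = rng.normal(0.0, 0.5, (2019 - 1978) * 12)

print(f"trend: {trend_per_decade(anomalies):+.3f} °C/decade")
```

A trend near zero from such a fit is what "flat" means here; whether it is distinguishable from zero also depends on the noise level and autocorrelation of the series.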

Analysis of new NASA AIRS study: 80% of U.S. Warming has been at Night

Reblogged from Watts Up With That:

By Dr. Roy Spencer

I have previously addressed the NASA study that concluded the AIRS satellite temperatures “verified global warming trends”. The AIRS is an infrared temperature sounding instrument on the NASA Aqua satellite, providing data since late 2002 (over 16 years). All results in that study, and presented here, are based upon infrared measurements alone, with no microwave temperature sounder data being used in these products.

That reported study addressed only the surface “skin” temperature measurements, but the AIRS is also used to retrieve temperature profiles throughout the troposphere and stratosphere — that’s 99.9% of the total mass of the atmosphere.

Since AIRS data are also used to retrieve a 2 meter temperature (the traditional surface air temperature measurement height), I was curious why that wasn’t used instead of the surface skin temperature. Also, AIRS allows me to compare to our UAH tropospheric deep-layer temperature products.

So, I downloaded the entire archive of monthly average AIRS temperature retrievals on a 1 deg. lat/lon grid (85 GB of data). I’ve been analyzing those data over various regions (global, tropical, land, ocean). While there are a lot of interesting results I could show, today I’m going to focus just on the United States.

AIRS temperature trend profiles averaged over the contiguous United States, Sept. 2002 through March 2019. Gray represents an average of day and night. Trends are based upon monthly departures from the average seasonal cycle during 2003-2018. The UAH LT temperature trend (and its approximate vertical extent) is in violet, and NOAA surface air temperature trends (Tmax, Tmin, Tavg) are indicated by triangles. The open circles are the T2m retrievals, which appear to be less trustworthy than the Tskin retrievals.

Because the Aqua satellite observes at nominal local times of 1:30 a.m. and 1:30 p.m., this allows separation of data into “day” and “night”. It is well known that recent warming of surface air temperatures (both in the U.S. and globally) has been stronger at night than during the day, but the AIRS data shows just how dramatic the day-night difference is… keeping in mind this is only the most recent 16.6 years (since September 2002):

The AIRS surface skin temperature trend at night (1:30 a.m.) is a whopping +0.57 C/decade, while the daytime (1:30 p.m.) trend is only +0.15 C/decade. This is a bigger diurnal difference than indicated by the NOAA Tmax and Tmin trends (triangles in the above plot). Admittedly, 1:30 a.m. and 1:30 p.m. are not when the lowest and highest temperatures of the day occur, but I wouldn’t expect as large a difference in trends as is seen here, at least at night.
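The "monthly departures from the average seasonal cycle" underlying these trends can be sketched as follows, with a synthetic series standing in for the AIRS retrievals:

```python
import numpy as np

def monthly_anomalies(temps):
    """Departures from the mean seasonal cycle of a monthly series.

    temps: 1-D array whose length is a multiple of 12, starting in January.
    """
    temps = np.asarray(temps, dtype=float)
    n_years = len(temps) // 12
    climatology = temps.reshape(n_years, 12).mean(axis=0)  # mean per calendar month
    return temps - np.tile(climatology, n_years)

# Synthetic 10-year series: a seasonal cycle plus a small warming trend.
t = np.arange(120)
temps = 10.0 + 8.0 * np.sin(2 * np.pi * t / 12) + 0.004 * t
anoms = monthly_anomalies(temps)

# With the seasonal cycle removed, a fit to the anomalies recovers the trend.
trend_per_decade = np.polyfit(t / 12.0, anoms, 1)[0] * 10
print(f"{trend_per_decade:+.2f} °C/decade")
```

Computing day and night retrievals through the same pipeline separately is what produces the paired trends compared above.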

Furthermore, these day-night differences extend up through the lower troposphere, to higher than 850 mb (about 5,000 ft altitude), even showing up at 700 mb (about 12,000 ft. altitude).

This behavior also shows up in globally-averaged land areas, and reverses over the ocean (but with a much weaker day-night difference). I will report on this at some point in the future.

If real, these large day-night differences in temperature trends are fascinating behavior. My first suspicion is that it has something to do with a change in moist convection and cloud activity during warming. For instance, more clouds would reduce daytime warming but increase nighttime warming. But I looked at the seasonal variations in these signatures and (unexpectedly) the day-night difference is greatest in winter (DJF), when there is the least convective activity, and weakest in summer (JJA), when there is the most convective activity.

One possibility is that there is a problem with the AIRS temperature retrievals (now at Version 6). But it seems unlikely that this problem would extend through such a large depth of the lower troposphere. I can’t think of any reason why there would be such a large bias between day and night retrievals when it can be seen in the above figure that there is essentially no difference from the 500 mb level upward.

It should be kept in mind that the lower tropospheric and surface temperatures can only be measured by AIRS in the absence of clouds (or in between clouds). I have no idea how much of an effect this sampling bias would have on the results.

Finally, note how well the AIRS low- to mid-troposphere temperature trends match the bulk trend in our UAH LT product. I will be examining this further for larger areas as well.

The Cooling Rains

Reblogged from Watts Up With That:

Guest Post by Willis Eschenbach

I took another ramble through the Tropical Rainfall Measurement Mission (TRMM) satellite-measured rainfall data. Figure 1 shows a Pacific-centered and an Atlantic-centered view of the average rainfall from the end of 1997 to the start of 2015 as measured by the TRMM satellite.

Figure 1. Average rainfall, meters per year, on a 1° latitude by 1° longitude basis. The area covered by the satellite data, forty degrees north and south of the Equator, is just under 2/3 of the globe. The blue areas by the Equator mark the InterTropical Convergence Zone (ITCZ). The two black horizontal dashed lines mark the Tropics of Cancer and Capricorn, the lines showing how far north and south the sun travels each year (23.45°, for those interested).

There’s lots of interesting stuff in those two graphs. I was surprised by how much of the planet in general, and the ocean in particular, are bright red, meaning they get less than half a meter (20″) of rain per year.

I was also intrigued by how narrowly the rainfall is concentrated at the average Inter-Tropical Convergence Zone (ITCZ). The ITCZ is where the two great global hemispheres of the atmospheric circulation meet near the Equator. In the Pacific and Atlantic on average the ITCZ is just above the Equator, and in the Indian Ocean, it’s just below the Equator. However, that’s just on average. Sometimes in the Pacific, the ITCZ is below the Equator. You can see kind of a mirror image as a light orange horizontal area just below the Equator.

Here’s an idealized view of the global circulation. On the left-hand edge of the globe, I’ve drawn a cross section through the atmosphere, showing the circulation of the great atmospheric cells.

Figure 2. Generalized overview of planetary atmospheric circulation. At the ITCZ along the Equator, tall thunderstorms take warm surface air, strip out the moisture as rain, and drive the warm dry air vertically. This warm dry air eventually subsides somewhere around 25-30°N and 25-30S of the Equator, creating the global desert belts at around those latitudes.

The ITCZ is shown in cross-section at the left edge of the globe in Figure 2. You can see the general tropical circulation. Surface air in both hemispheres moves towards the Equator. It is warmed there and rises. This thermal circulation is greatly sped up by air driven vertically at high rates of speed through the tall thunderstorm towers. These thunderstorms form all along the ITCZ. These thunderstorms provide much of the mechanical energy that drives the atmospheric circulation of the Hadley cells.

With all of that as prologue, here’s what I looked at. I got to thinking, was there a trend in the rainfall? Is it getting wetter or drier? So I looked at that using the TRMM data. Figure 3 shows the annual change in rainfall, in millimeters per year, on a 1° latitude by 1° longitude basis.

Figure 3. Annual change in the rainfall, 1° latitude x 1° longitude gridcells.

I note that the increase in rain is greater on the ocean vs land, is greatest at the ITCZ, and is generally greater in the tropics.

Why is this overall trend in rainfall of interest? It gives us a way to calculate how much this cools the surface. Remember the old saying, what comes down must go up … or perhaps it’s the other way around, same thing. If it rains an extra millimeter of water, somewhere it must have evaporated an extra millimeter of water.

And in the same way that our bodies are cooled by evaporation, the surface of the planet is also cooled by evaporation.

Now, we note above that on average, the increase is 1.33 millimeters of water per year. Metric is nice because volume and size are related. Here’s a great example.

One millimeter of rain falling on one square meter of the surface is one liter of water which is one kilo of water. Nice, huh?

So the extra 1.33 millimeters of rain per year is equal to 1.33 extra liters of water evaporated per square meter of surface area.

Next, how much energy does it take to evaporate that extra 1.33 liters of water per square meter so it can come down as rain? The calculations are in the endnotes. It turns out that this 1.33 extra liters per year represents an additional cooling of a tenth of a watt per square meter (0.10 W/m2).

And how does this compare to the warming from increased longwave radiation due to the additional CO2? Well, again, the calculations are in the endnotes. The answer is, per the IPCC calculations, CO2 alone over the period gave a yearly increase in downwelling radiation of ~ 0.03 W/m2. Generally, they double that number to allow for other greenhouse gases (GHGs), so for purposes of discussion, we’ll call it 0.06 W/m2 per year.

So over the period of this record, we have increased evaporative cooling of 0.10 W/m2 per year, and we have increased radiative warming from GHGs of 0.06 W/m2 per year.

Which means that over that period and that area at least, the calculated increase in warming radiation from GHGs was more than counterbalanced by the observed increase in surface cooling from increased evaporation.

Regards to all,

w.

As usual: please quote the exact words you are discussing so we can all understand exactly what and who you are replying to.

Additional Cooling

Finally, note that this calculation is only evaporative cooling. There are other cooling mechanisms at work that are related to rainstorms. These include:

• Increased cloud albedo reflecting hundreds of watts per square meter of sunshine back to space.

• Moving surface air to the upper troposphere, where it is above most GHGs and freer to cool to space.

• Increased ocean surface albedo from whitecaps, foam, and spume.

• Cold rain falling from a layer of the troposphere that is much cooler than the surface.

• Rain re-evaporating as it falls, cooling the atmosphere.

• Cold wind entrained by the rain blowing outwards at surface level, cooling surrounding areas.

• Dry descending air between rain cells and thunderstorms, allowing increased longwave radiation to space.

Together, these form a very strong temperature-regulating mechanism that prevents overheating of the planet.

Calculation of energy required to evaporate 1.33 liters of water.

# latent heat of evaporation, joules/kg @ salinity 35 psu, temperature 24°C

> library(gsw)   # TEOS-10 seawater toolbox; provides gsw_latentheat_evap_t()

> latevap = gsw_latentheat_evap_t( 35, 24 ) ; latevap

[1] 2441369

# joules/yr/m2 required to evaporate 1.33 liters/yr/m2

> evapj = latevap * 1.33 ; evapj

[1] 3247021

# convert joules/yr/m2 to W/m2 (seconds in a tropical year of 365.2422 days)

> secsperyear = 365.2422 * 24 * 3600

> evapwm2 = evapj / secsperyear ; evapwm2

[1] 0.1028941

Note: the exact answer varies depending on seawater temperature, salinity, and density. These only make a difference of a couple percent (say 0.1043 vs 0.1028941). I’ve used average values.

Calculation of downwelling radiation change from CO2 increase.

# starting CO2, ppmv, Dec 1997 (coshort holds the monthly CO2 record, Dec 1997 to Mar 2015)

> thestart = as.double( coshort[1] ) ; thestart

[1] 364.38

# ending CO2, ppmv, Mar 2015

> theend = as.double( last( coshort )) ; theend

[1] 401.54

# longwave increase, W/m2 per year, over the 17 years 4 months of record
# (using 3.7 W/m2 per doubling of CO2)

> 3.7 * log( theend / thestart, 2)/17.33

[1] 0.0299117

GHCN v3.3 vs. v4 Anomaly Australia / Pacific Islands

Reblogged from Musings from the Chiefio:

In prior postings I did a sample of various countries around the world, and a full set of North America, South America and Antarctica. This extends that set with Australia and the Pacific Islands. Note that these are often near the equator and may be on either side of it, so seasonality may vary by graph.

I’m going to group things into Australia, New Zealand, North of Australia (Indonesia, Papua New Guinea, Philippines, etc.), then those islands scattered across the center of the Pacific. Why? Because countries in those areas ought to look a lot more like each other in terms of anomaly than like those in other groups. The Pacific is dominated by ENSO and tropical conditions, while the countries north of Australia have Indian Ocean influences and share a current flow up the coast of Asia. To some extent New Zealand is “special” in that it is closest to the Southern Ocean, so it has more cold southern islands and Antarctic water exposure. Similarly, Singapore is on the Malay Peninsula and north of the equator, so it, and Malaysia, ought to reflect some of the Asian continent; but protruding into the mixed ocean area they will also reflect climate similar to Indonesia. Finally, Australia is unique in this group as it has a large hot desert in the center.

Here’s the Koppen Climate graph for the World (from the Wiki) so you have something for comparison.

Köppen-Geiger Climate Map for 1980-2016

From that you can easily see how Australia and New Zealand differ from the tropical ocean group.

What I find fascinating in these graphs is just how much the islands’ temperature recordings vary (often a lot) while they are in the same climate zone as a nearby neighbor and share a large body of nearly uniform-temperature water between them. I would expect differences between distant islands, but we often see them on neighboring islands. I think that is likely an instrument or siting-change issue: who has the large airport and tourists, and who doesn’t. But confirming that needs a historical retrospective photo essay on each of the places like that, and someone else will need to take that “Dig Here!”.

Which countries?

This bit of SQL programming gets us a table of countries in Region 5, the Australia & Pacific Islands group. (Remember that in Linux the command “cat” is “concatenate and print”, and with just one file name it prints out the contents; in this case, the program named “ApacList.sql”.)

chiefio@PiM3Devuan2:~/SQL/bin$ cat ApacList.sql 
SELECT cnum, abrev,region, cname 
FROM country WHERE region=5 ORDER BY cname;

So what does that give us? Here’s the result:

MariaDB [temps]> source bin/ApacList.sql
+------+-------+--------+------------------------------------------+
| cnum | abrev | region | cname                                    |
+------+-------+--------+------------------------------------------+
| 521  | AQ    | 5      | American Samoa [United States]           |
| 501  | AS    | 5      | Australia                                |
| 522  | BX    | 5      | Brunei                                   |
| 523  | KT    | 5      | Christmas Island [Australia]             |
| 524  | CK    | 5      | Cocos (Keeling) Islands [Australia]      |
| 525  | CW    | 5      | Cook Islands [New Zealand]               |
| 527  | FM    | 5      | Federated States of Micronesia           |
| 502  | FJ    | 5      | Fiji                                     |
| 528  | FP    | 5      | French Polynesia                         |
| 529  | GQ    | 5      | Guam [United States]                     |
| 503  | ID    | 5      | Indonesia                                |
| 530  | JQ    | 5      | Johnston Atoll [United States]           |
| 504  | KR    | 5      | Kiribati                                 |
| 505  | MY    | 5      | Malaysia                                 |
| 531  | RM    | 5      | Marshall Islands                         |
| 598  | MQ    | 5      | Midway Islands [United States]           |
| 506  | NR    | 5      | Nauru                                    |
| 532  | NC    | 5      | New Caledonia [France]                   |
| 507  | NZ    | 5      | New Zealand                              |
| 533  | NE    | 5      | Niue [New Zealand]                       |
| 534  | NF    | 5      | Norfolk Island [Australia]               |
| 535  | CQ    | 5      | Northern Mariana Islands [United States] |
| 536  | PS    | 5      | Palau                                    |
| 599  | LQ    | 5      | Palmyra Atoll [United States]            |
| 508  | PP    | 5      | Papua New Guinea                         |
| 509  | RP    | 5      | Philippines                              |
| 537  | PC    | 5      | Pitcairn Islands [United Kingdom]        |
| 541  | WS    | 5      | Samoa                                    |
| 511  | SN    | 5      | Singapore                                |
| 512  | BP    | 5      | Solomon Islands                          |
| 597  | TT    | 5      | Timor-Leste                              |
| 538  | TL    | 5      | Tokelau [New Zealand]                    |
| 517  | TN    | 5      | Tonga                                    |
| 518  | TV    | 5      | Tuvalu                                   |
| 520  | NH    | 5      | Vanuatu                                  |
| 539  | WQ    | 5      | Wake Island [United States]              |
| 540  | WF    | 5      | Wallis and Futuna [France]               |
+------+-------+--------+------------------------------------------+
37 rows in set (0.90 sec)

MariaDB [temps]> 

So 37 Countries. 74 total graphs. This is going to take a while…
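
For anyone without a MariaDB session handy, the same SELECT can be sketched in Python against a toy copy of the country table, here using the stdlib sqlite3 module (the illustrative rows are lifted from the listing above; the real table lives in the MariaDB “temps” database):

```python
import sqlite3

# Toy in-memory stand-in for the MariaDB "country" table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE country (cnum INT, abrev TEXT, region INT, cname TEXT)")
con.executemany("INSERT INTO country VALUES (?,?,?,?)", [
    (501, "AS", 5, "Australia"),
    (507, "NZ", 5, "New Zealand"),
    (112, "EG", 1, "Egypt"),   # different region, gets filtered out
])

# Same shape as ApacList.sql: pick Region 5, order by country name
rows = con.execute(
    "SELECT cnum, abrev, region, cname FROM country "
    "WHERE region=5 ORDER BY cname").fetchall()
for r in rows:
    print(r)
```

The real ApacList.sql runs the identical statement against the full table, which is what produces the 37-row listing above.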

First I’ll put up Australia and New Zealand as they are the two most different from everything else in terms of climate types.

Australia

GHCN v3.3 vs v4 Australia Difference

Interesting that the general trend is a roll off of heat. But a couple of years get a hot bump at the end.

GHCN v3.3 vs v4 Australia Anomaly

New Zealand

GHCN v3.3 vs v4 New Zealand Difference

About 1/2 C cooling of the deep past, but not much else.

GHCN v3.3 vs v4 New Zealand Anomaly

North Of Australia

I’m going to start this group with Indonesia, as it is the largest, then work my way around the nearby bits. These all ought to be substantially the same as they all share the same giant bath tub of water and currents.

Indonesia

GHCN v3.3 vs v4 Indonesia Difference

Either the historic Indonesia data are crap and need a lot of fixes, or they can’t decide what their temperature was in the past. Nice warming jump added at the recent end.

GHCN v3.3 vs v4 Indonesia Anomaly

Timor-Leste

Looks like these folks are missing data in v3.3:

MariaDB [temps]> SELECT year,AVG(deg_c) FROM anom3 AS A 
INNER JOIN country AS C ON A.country=C.cnum  
WHERE C.abrev='TT' GROUP BY year;
Empty set (0.09 sec)

So the anomaly difference graph report fails:

============ RESTART: /SG500/xfs/chiefio/Py3/Aapac/a3v4deltaTT.py ============
stuffed SQL statement for TT Timor-Leste 
Executed SQL
[]
Got data
This is the exception branch
All Done
>>> 

So, taking num3>0 and num4>0 out of the script (so it accepts years with no data), the result becomes:

============ RESTART: /SG500/xfs/chiefio/Py3/Aapac/a3v4deltaTT.py ============
stuffed SQL statement for TT Timor-Leste 
Executed SQL
[('1917', None), ('1918', None), ('1919', None), ('1920', None),
 ('1927', None), ('1928', None), ('1929', None), ('1930', None),
 ('1931', None), ('1932', None), ('1933', None), ('1934', None),
 ('1936', None), ('1938', None), ('1939', None), ('1940', None),
 ('1941', None), ('1951', None), ('1952', None), ('1953', None),
 ('1954', None), ('1955', None), ('1956', None), ('1957', None),
 ('1958', None), ('1959', None), ('1960', None), ('1961', None),
 ('1962', None), ('1963', None), ('1964', None), ('1965', None),
 ('1966', None), ('1967', None), ('1968', None), ('1969', None),
 ('1970', None), ('1971', None), ('1972', None), ('1973', None),
 ('1974', None), ('1975', None), ('1976', None), ('1977', None),
 ('1978', None), ('1979', None), ('1980', None), ('1981', None),
 ('1982', None), ('1983', None), ('1984', None), ('1985', None),
 ('1990', None)]
Got data
after the transpose

And we get an empty graph. All those “None” for difference data.
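
A simple guard avoids feeding a graph a column of Nones; a minimal sketch (the data list mimics the output above; the variable names are hypothetical, not the actual a3v4delta script):

```python
# (year, v3.3-minus-v4 anomaly difference); None means one version
# has no data for that year, so no difference can be computed
data = [("1917", None), ("1918", None), ("1990", None)]

# Keep only years where both versions actually reported
plottable = [(yr, d) for yr, d in data if d is not None]

if not plottable:
    print("No overlapping v3.3/v4 years; skip the difference graph")
else:
    years, diffs = zip(*plottable)  # the transpose step before plotting
```

With the Timor-Leste rows above, the filtered list comes out empty and the script can skip the graph instead of drawing a blank one.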

Printing the two sets of data, only the v4 data show up on the graph of anomalies:

GHCN v3.3 vs V4 Timor-Leste Anomaly

Where it looks like nobody has got around to molesting the data and making it toe the PC Line. We have a very hot 1930s, a cold 1960s, a return to ALMOST as hot in the 1980s, then a cold dip in the ’90s. Rather like we all experienced, and rather like recorded in the history of the times. Golly.

Papua New Guinea

GHCN v3.3 vs V4 Papua New Guinea Difference

Very little change over much of the history, then about 1/4 C cooler in recent years with some ‘fliers’ of 1/2 C higher.

GHCN v3.3 vs V4 Papua New Guinea Anomalies

VERY significant range compression in the GISS/Hadley baseline years (about 1950-1990) then it widens out again with a bit of “higher highs”, then most recently it gets a cold year. Not looking at all like general warming over the years from CO2.

Malaysia

GHCN v3.3 vs V4 Malaysia Differences

Wow! Really cooling the past there in Malaysia. A full degree C colder in many cases; rising to only 1/2 C colder just before the baseline period. Then the baseline period kept at zero. (Remember this is just change between version 3.3 and version 4 for what is supposedly the SAME place and the same instruments recorded at one time in the past…) Then the recent data gets about a 1/3 C “lift” (but freezing the past has already created the slope needed…)

GHCN v3.3 vs V4 Malaysia Anomalies

Here we can see that it is no warmer now than it was in the past in the old version; but only after cooling the past a full degree C does that unfortunate fact go away.

Singapore

GHCN v3.3 vs V4 Singapore Differences

An interesting cold “adjustment” in the 1870s, then that dip in the late 1990s; finally we end with an uptick of only about 1/4 C in the last datapoint.

GHCN v3.3 vs V4 Singapore Anomalies

Then the actual anomaly data shows a nice “dip” in the baseline period, but otherwise the actual temperature change has not been much at all over the years. Other than that one hot dot at the very end…

Brunei

Looks like Brunei was also not in GHCN v3.3, so no “difference in anomalies” graph can be made:

MariaDB [temps]> SELECT year,AVG(deg_C) FROM anom3 AS A 
INNER JOIN country AS C ON A.country=C.cnum 
WHERE C.abrev='BX' GROUP BY year;
Empty set (0.48 sec)

So all we’ll get is the v4 anomaly data on the Anomalies graph:

GHCN v3.3 vs v4 Brunei Anomalies

This is too short a record to say much at all about climate. It’s about 35 years, so only a tiny bit over one half the known 60 year cycle. Fitting a trend to cyclical data is a fool’s errand. I note in passing that recent years are about the same as the mid 1990s.

Philippines

GHCN v3.3 vs v4 Philippines Differences

Not much changed between the two data set versions. Looks like the W.W.II data changed a bit more.

GHCN v3.3 vs v4 Philippines Anomalies

So the 1800s were a bit cold, then we see about 1-1.5 C of range in the Yr/Yr data until the “Baseline period” where the range narrows (closer to 3/4 C), though near 1980 things are remarkably constant. In more recent years we have the return of some range (though it looks like minus some cold excursions) and the final temperature is very much like about 1965, 1942 or so, and around 1932. So while the slope of a fit line might well show a trend, the present temperature is not out of line with hot periods in the past. My best guess would be a bit of growth of the airport, UHI, and jet exhaust.

Palau

Oddly, Palau, right nearby the Philippines, has a different shape to its data…

GHCN v3.3 vs v4 Palau Differences

The deep past gets changed to a little cooler, then the present has a 1.5 C range to the CHANGES between version 3.3 and version 4 of what is supposedly the same place and data. Now Palau isn’t big enough to have a whole lot of thermometers to choose between and among, so just why is the data that “malleable”? Eh?

GHCN v3.3 vs v4 Palau Anomalies

The actual anomaly graphs have the usual compressed “waistline” with reduced range in the “Baseline Years”, then an otherwise almost constant spread and range of data from about 1950 to 1995, when suddenly the low ranges start to pull up. The spectacular bit, though, is the spike of roughly 2 C in the last few years. I’m sorry, but CO2 effects do not lurk for 40 years doing not much and then suddenly show up in one year and stay for 3 or 4. That’s something else. Jet exhaust maybe? Isn’t that a big US Military spot?

Pacific Island Arc

This set is all those islands and atolls scattered around the Pacific Ocean toward North America (compared to the prior set). As ENSO tends to oscillate between the E and W sides of this basin, and some of these are N of the equator while most are S, I’m generally going to lay them out from near New Zealand over toward the Americas, but with those north of the equator near the middle (some US owned atolls mostly) set out separately. (Provided I can keep straight which of these rocks is classified as what, and who has had which name change and…)

Up North & Scattered

Here’s a few islands and atolls in the more northern part of the Pacific and scattered around a bit in the Big Empty.

Midway Islands

Midway also has no data in the GHCN v3.3 set:

MariaDB [temps]> SELECT year,AVG(deg_C) FROM anom3 AS A 
INNER JOIN country AS C ON A.country=C.cnum 
WHERE C.abrev='MQ' GROUP BY year;
Empty set (0.06 sec)

MariaDB [temps]>

So once again all we will get is the GHCN v4 Anomalies graph:

GHCN v3.3 vs v4 Midway Islands Anomalies

Other than a couple of “fliers” recently, the temperatures are rather like the hot points in the 1930s-50s. I note that the W.W.II years are missing. Low excursions are about the same in the 1920s-1940 and in the 1955-1975 range, then just “go away”. Rather like a 1/2 C “step function” happened in 1979. Very strange. Wonder if there was any equipment change then?

Johnston Atoll

GHCN v3.3 vs v4 Johnston Atoll Difference

Not much going on with Johnston Atoll. Then again they already have 2 C range in the anomaly (see next graph) so maybe nothing more was needed…

GHCN v3.3 vs v4 Johnston Atoll Anomaly

Once again almost nothing really happening until 1980, then a jump up; followed by another big jump up in about 1995. Odd little atoll. Wonder what was going on then… From the Wiki:

Chemical weapon demilitarization mission 1990–2000
Johnston Atoll Chemical Agent Disposal System (JACADS) building
Main article: JACADS

The Army’s Johnston Atoll Chemical Agent Disposal System (JACADS) was the first full-scale chemical weapons disposal facility. Built to incinerate chemical munitions on the island, planning started in 1981, construction began in 1985, and was completed five years later. Following completion of construction and facility characterization, JACADS began operational verification testing (OVT) in June 1990. From 1990 until 1993, the Army conducted four planned periods of Operational Verification Testing (OVT), required by Public Law 100-456. OVT was completed in March 1993, having demonstrated that the reverse assembly incineration technology was effective and that JACADS operations met all environmental parameters. The OVT process enabled the Army to gain critical insight into the factors that establish a safe and effective rate of destruction for all munitions and agent types. Only after this critical testing period did the Army proceed with full-scale disposal operations at JACADS. Transition to full-scale operations started in May 1993 but the facility did not begin full-scale operations until August 1993.

All of the chemical weapons once stored on Johnston Island were demilitarized and the agents incinerated at JACADS with the process completing in 2000 followed by the destruction of legacy hazardous waste material associated with chemical weapon storage and cleanup. JACADS was demolished by 2003 and the island was stripped of its remaining infrastructure and environmentally remediated.

Oh… So a lot of stuff shipped in, big construction, then years of running an incinerator… a BIG incinerator. I’m sure that had nothing to do with it… /sarc;

Wake Island

GHCN v3.3 vs v4 Wake Island Differences

Nothing changed much for years, other than a roughly 1/4 C cooling of the past, then BAM, a 1.5 C range of changes in a few years, and then back to not much change.

GHCN v3.3 vs v4 Wake Island Anomalies

Looks to me like another “step function” of about 1 C in 1980, with a slight cooling trend over a cyclical spike of about 15 to 20 years. Not at all what steady increases in a warming gas would cause.

The more Southern Group

These are the islands that make an equatorial to South Pacific arc.

Northern Mariana Islands

GHCN v3.3 vs v4 Northern Mariana Islands Differences

Gosh, a 2.5 C range in the anomalies just from variation in instrument chosen or processing. When you can get that much essentially “random” variation from what is supposedly the same small place and the same data / instruments, where are the error bars on that 1/2 C of “Global Warming” fantasy?

Interesting that this pushes up the “New Ice Age Coming” 1970s and pulls down the present. Just how crazy bad was this “Global Warming” chart that they needed to take 2.3 C out of it?

GHCN v3.3 vs v4 Northern Mariana Islands Anomalies

Gee from warming black dots to dead flat red dots in one “fix”. I wonder who got caught doing what and had to fix it? ;-0

Guam

GHCN v3.3 vs v4 Guam Differences

Another small island with big changes in their “historical” data. Looks like a tiny rise in the early ’80s, then a big 1 C cut around 2000.

New data looks to be about as high as what was reduced. I guess it would look bad to have a “Halt” to “Global Warming”, so need to take a tuck in that older “hot” time and preserve the warming “trend” that way… I note in passing that the 1920s to 1940s are about as hot as “whichever hot now is really now”, and only the “baseline period” is nominally cool.

Federated States of Micronesia

GHCN v3.3 vs v4 Federated States Of Micronesia Differences

Another one with a change “dip” around 2000.

GHCN v3.3 vs v4 Federated States Of Micronesia Anomalies

Oddly, even though in the same giant bathtub of warm water as Guam, these Islands have a cold 1920s to 1940s. Then essentially dead flat from 1950 to about 2000-2005 and only then a jump up (or smooth rise depending on version). Doesn’t look at all like a gentle persistent rise of 1/2 C due to CO2 and looks a whole lot more like ENSO, cyclical changes with step functions, or diddled data / bad measuring.

Marshall Islands

GHCN v3.3 vs v4 Marshall Islands Difference

Not much in the change department. Bit of a minor down tweak at the end.

GHCN v3.3 vs v4 Marshall Islands Anomalies

Other than the “dip” or “sag” in the “baseline period” of about 1950 to 1990, not much in the anomalies either. OTOH, they have a nice 1 C range from bottom of the baseline to now pretty much baked in, so why change anything? Just ignore that pesky pre-baseline data and call it a warming trend.

Nauru

GHCN v3.3 vs v4 Nauru Differences

Changes all over the place and with a 2 C range. Big dropout from 1940 to 1960. Huge cooling of the hot 1930s.

GHCN v3.3 vs v4 Nauru Anomalies

Ah, that’s why. Turn a cooling down trend into a slight warming, then throw away anything newer than 1970. Can’t keep a place that’s getting cooler in the data now, can we?

Kiribati

GHCN v3.3 vs v4 Kiribati Differences

Another “dog’s breakfast” of changes. Almost 3 C of “fix ‘er up” done by cooling the 1920s to ’30s. Got to erase that pesky hot ’30s somehow. Then pull down the ’80s a little to erase the “pause” and make it a smoother trend.

GHCN v3.3 vs v4 Kiribati Anomalies

And “Bob’s Yer Uncle” a flat to cooling trend becomes a “warming out of the baseline period”. (Even though over all the data it isn’t warming, but no worries, nobody cares about data older than W.W.II).

Christmas Island

GHCN v3.3 vs v4 Christmas Island Differences

Again a big dropout of data in the baseline, then a nice 1/2 C of “Pop” added in 2000-2010.

GHCN v3.3 vs v4 Christmas Island Anomalies

So not erased the hot 30s & 40s here yet (which just begs the question how they could vary so much from nearby island to nearby island…) but did get rid of that annoying cold dip after 2000. Add a couple of juiced up hot years in the recent data and you too can turn a dead flat trendless Island into a Global Warming place. Just ignore that 30s & 40s data (don’t worry, it will be taken care of in v5, I’m sure… /sarc;)

Solomon Islands

GHCN v3.3 vs v4 Solomon Islands Differences

Another place with a 2 C range in the ‘fix up’ differences. Makes one wonder how bad the recent data are to need so much changing.

GHCN v3.3 vs v4 Solomon Islands Anomalies

Essentially trendless until after 2000. Even then not much (and mostly from removing low going excursions). Wonder if they moved the thermometer closer to a cement runway 😉

Tuvalu

GHCN v3.3 vs v4 Tuvalu Differences

About a 1 C range of what looks like a few semi-random changes.

GHCN v3.3 vs v4 Tuvalu Anomalies

Nothing much at all going on until the year 2000 then a sudden jump up of about 1/2 C consistent with the 1940 temperatures. This will create a false trend if you plot a trend line from the “baseline period” to the present when at best there’s a cyclical thing happening (and at worst it is an instrumentation issue).

Tokelau

Oh man is this one a challenge / amusing:

GHCN v3.3 vs v4 Tokelau Differences

That big pop up of up to 1.5 C in 1965-70 range shows that somebody did go back and get different data, yet the result (graph below) is still just crazy time.

GHCN v3.3 vs v4 Tokelau Anomalies

A full 4 C+ of range, all over the place, with the most recent data quite cool. No trend until the late 1970s, then a massive pop up of 1 C for near a decade+, a drop of 4 C, and then it returns with mostly cooler data but still bouncing around by 2 C. This one is a real “Dig Here!” issue.

Wallis & Futuna

GHCN v3.3 vs v4 Wallis & Futuna Differences

One degree C of changes in the data with no clear pattern nor reason. So one full degree C of “jitter” can be in the data with no connection at all to CO2 (by definition, since this is only the result of change in instruments or processing, and I doubt there were many instruments to change in Wallis & Futuna).

GHCN v3.3 vs v4 Wallis & Futuna Anomalies

Other than a “dip” in the baseline period (that roughly 1965-1985 low) it is essentially flat. Present temperatures are essentially the same as around 1960.

Samoa

GHCN v3.3 vs v4 Samoa Differences

How unusual: the past is warmed in the v4 data and the present is cooled. I guess having 2 C of warming in Samoa didn’t look very CO2 physical, as it was only supposed to be about 1/2 C.

GHCN v3.3 vs v4 Samoa Anomalies

We still have a nice 2 C of range, rising from -1 C in 1900 through 0 C (or equal to the average) in 1920 to 1980, then finally a bit of “lift” at the end with one year at +1 C and another at closer to +1.5 C. Yet the low years are about normal. Wonder what was in the missing years (and why “modern” data is missing but we have full data prior to 1995 or so…)

American Samoa

GHCN v3.3 vs v4 American Samoa Differences

GHCN v3.3 vs v4 American Samoa Anomalies

Vanuatu

GHCN v3.3 vs v4 Vanuatu Differences

Vanuatu looks like another of those “too hot to be CO2 physical need to cool it” charts. Nothing much changes in the past, but the recent (“highest quality”) data gets cooled up to 3/4 of a degree C.

GHCN v3.3 vs v4 Vanuatu Anomalies

Basically a flat chunk from about 1950 to 1990, then a sudden jump up by about 3/4 C to 1.5 C. Anyone want to bet it became a “destination” then, and the airport got bigger with more jet traffic and tarmac / concrete? But I can see where you would want to blend down that big jump into a more gentle rise. Doesn’t stand out as so odd then.

New Caledonia

GHCN v3.3 vs v4 New Caledonia Differences

A gentle cooling of the 1940s so they blend in with each side (can’t have them being about the same as now, can we?)

GHCN v3.3 vs v4 New Caledonia Anomalies

So now it looks like a steady flat period from about 1940 to 1965, then warming. Except most of the recent years data looks a lot like the 1930s.

Norfolk Island

GHCN v3.3 vs v4 Norfolk Island Differences

Nice little 1/2 C “POP” up in the recent years there. Wonder what that does?

GHCN v3.3 vs v4 Norfolk Island Anomalies

Oh, erases that cold dip… Realistically, this isn’t warming. A couple of recent years have a warm spike, but about the same as 1998 and the 19-teens, and with a (pre-erasure) cold dip in the 2010’s about like prior years too.

Fiji

Fiji is a bit of a trip. They change the recent data to about 1 C warmer and it is still cooling.

GHCN v3.3 vs v4 Fiji Differences

So about 1/2 C cooler in 1990 to 1/2 C warmer in the early 2000s. Looking at the graph below, it seems to have taken a “rolling off to cooler” in the black dots and turned it into a “continuing to warm”… Wonder if they manicure fingernails as well? /snark;

GHCN v3.3 vs v4 Fiji Anomalies

While it does look like a trend line from the “Baseline” years to the present would have a warming trend, the data overall do not. “Now” is no warmer than 1900 or 1930 or 1980. It does look like some low going excursions might be being clipped off. Airport cement anyone?

Tonga

GHCN v3.3 vs v4 Tonga Difference

Looks like about a 3/4 C range of mindless changes.

GHCN v3.3 vs v4 Tonga Anomaly

And more random coin toss than trend in the anomalies.

Niue

GHCN v3.3 vs v4 Niue Differences

Nobody changing much in Niue.

GHCN v3.3 vs v4 Niue Anomalies

And no “Global Warming” either… Guess that’s why the data get sparse after 1990, so it can be “re-imagined” and infilled via homogenizing from somewhere else.

French Polynesia

GHCN v3.3 vs v4 French Polynesia Difference

Again with the cooling of the baseline window… I think we’re getting a trend here… but not in the climate.

GHCN v3.3 vs v4 French Polynesia Anomaly

Pitcairn Islands

Poor Pitcairn Islands. Off near nowhere. Not important enough for anyone to diddle the data…

GHCN v3.3 vs v4 Pitcairn Islands Differences

Nearly nothing changed.

GHCN v3.3 vs v4 Pitcairn Islands Anomalies

No discernible trend to the anomalies / data… Guess “Global Warming” isn’t very global after all…

In Conclusion

IMHO the degree of change, between these “versions”, of what ought to be the same data from the same instruments indicates that any warming found is as likely to be error, or more likely to be error, than anything real.

Just looking at the anomaly profiles shows that islands located in the same body of water with nearly constant sea surface temperatures have very different profiles, or shapes of the plotted data. How do you do that when the environment is the same from island to island?

My best guess is that it is local siting issues (in particular measuring at airports with changes of size, materials, and traffic, from grass shack by a Pan Am Clipper seaport to 10,000 feet of concrete and Jet Age vacationing), or just flat out lousy measuring.

What I do NOT see in the data is a general and steady increase in warming, year over year, across many stations; the kind of thing CO2 and radiative blocking ought to cause.

There will not be a Tech Talk in this posting as it is in the prior postings and all that changes is the letter code used to select for the countries. If you want to know more about the data base used, the codes, and the processing done, see the prior postings.

UAH, RSS, NOAA, UW: Which Satellite Dataset Should We Believe?

Reblogged from DrRoySpencer.com:

April 23rd, 2019 by Roy W. Spencer, Ph. D.

NOTE: See the update from John Christy below, addressing the use of RATPAC radiosonde data.

This post has two related parts. The first has to do with the recently published study of AIRS satellite-based surface skin temperature trends. The second is our response to a rather nasty Twitter comment maligning our UAH global temperature dataset that was a response to that study.

The AIRS Study

NASA’s Atmospheric InfraRed Sounder (AIRS) has thousands of infrared channels and has provided a large quantity of new remote sensing information since the launch of the Aqua satellite in early 2002. AIRS has even demonstrated how increasing CO2 in the last 15+ years has reduced the infrared cooling to outer space at the wavelengths impacted by CO2 emission and absorption, the first observational evidence I am aware of that increasing CO2 can alter — however minimally — the global energy budget.

The challenge for AIRS as a global warming monitoring instrument is that it is cloud-limited, a problem that worsens as one gets closer to the surface of the Earth. It can only measure surface skin temperatures when there are essentially no clouds present. The skin temperature is still “retrieved” in partly- (and even mostly-) cloudy conditions from other channels higher up in the atmosphere, and with “cloud clearing” algorithms, but these exotic numerical exercises can never get around the fact that the surface skin temperature can only be observed with satellite infrared measurements when no clouds are present.

Then there is the additional problem of comparing surface skin temperatures to traditional 2 meter air temperatures, especially over land. There will be large biases at the 1:30 a.m./p.m. observation times of AIRS. But I would think that climate trends in skin temperature should be reasonably close to trends in air temperature, so this is not a serious concern with me (although Roger Pielke, Sr. disagrees with me on this).

The new paper by Susskind et al. describes a 15-year dataset of global surface skin temperatures from the AIRS instrument on NASA’s Aqua satellite. ScienceDaily proclaimed that the study “verified global warming trends“, even though the period addressed (15 years) is too short to say much of value about global warming trends, especially since there was a record-setting warm El Nino near the end of that period.

Furthermore, that period (January 2003 through December 2017) shows significant warming even in our UAH lower tropospheric temperature (LT) data, with a trend 0.01 warmer than the “gold standard” HadCRUT4 surface temperature dataset (all deg. C/decade):

AIRS: +0.24
GISTEMP: +0.22
ECMWF: +0.20
Cowtan & Way: +0.19
UAH LT: +0.18
HadCRUT4: +0.17

I’m pretty sure the Susskind et al. paper was meant to prop up Gavin Schmidt’s GISTEMP dataset, which generally shows greater warming trends than the HadCRUT4 dataset that the IPCC tends to favor more. It remains to be seen whether the AIRS skin temperature dataset, with its “clear sky bias”, will be accepted as a way to monitor global temperature trends into the future.

What Satellite Dataset Should We Believe?

Of course, the short period of record of the AIRS dataset means that it really can’t address the pre-2003 adjustments made to the various global temperature datasets which significantly impact temperature trends computed with 40+ years of data.

What I want to specifically address here is a public comment made by Dr. Scott Denning on Twitter, maligning our (UAH) satellite dataset. He was responding to someone who objected to the new study, claiming our UAH satellite data shows minimal warming. While the person posting this objection didn’t have his numbers right (and as seen above, our trend even agrees with HadCRUT4 over the 2003-2017 period), Denning took it upon himself to take a swipe at us (see his large-font response, below):

Scott-Denning-tweet-1-550x733

First of all, I have no idea what Scott is talking about when he lists “towers” and “aircraft”…there have been no comprehensive comparisons of such data sources to global satellite data, mainly because there isn’t nearly enough geographic coverage by towers and aircraft.

Secondly, in the 25+ years that John Christy and I have pioneered the methods that others now use, we made only one “error” (found by RSS, and which we promptly fixed, having to do with an early diurnal drift adjustment). The additional finding by RSS of the orbit decay effect was not an “error” on our part any more than our finding of the “instrument body temperature effect” was an error on their part. All satellite datasets now include adjustments for both of these effects.

Nevertheless, as many of you know, our UAH dataset is now considered the “outlier” among the satellite datasets (which also include RSS, NOAA, and U. of Washington), with the least amount of global-average warming since 1979 (although we agree better in the tropics, where little warming has occurred). So let’s address the remaining claim of Scott Denning’s: that we disagree with independent data.

The only direct comparisons to satellite-based deep-layer temperatures are from radiosondes and global reanalysis datasets (which include all meteorological observations in a physically consistent fashion). What we will find is that RSS, NOAA, and UW have remaining errors in their datasets which they refuse to make adjustments for.

From late 1998 through 2004, there were two satellites operating: NOAA-14 with the last of the old MSU series of instruments on it, and NOAA-15 with the first new AMSU instrument on it. In the latter half of this overlap period there was considerable disagreement that developed between the two satellites. Since the older MSU was known to have a substantial measurement dependence on the physical temperature of the instrument (a problem fixed on the AMSU), and the NOAA-14 satellite carrying that MSU had drifted much farther in local observation time than any of the previous satellites, we chose to cut off the NOAA-14 processing when it started disagreeing substantially with AMSU. (Engineer James Shiue at NASA/Goddard once described the new AMSU as the “Cadillac” of well-calibrated microwave temperature sounders).

Despite the most obvious explanation that the NOAA-14 MSU was no longer usable, RSS, NOAA, and UW continue to use all of the NOAA-14 data through its entire lifetime and treat it as just as accurate as NOAA-15 AMSU data. Since NOAA-14 was warming significantly relative to NOAA-15, this puts a stronger warming trend into their satellite datasets, raising the temperature of all subsequent satellites’ measurements after about 2000.

But rather than just asserting the new AMSU should be believed over the old (drifting) MSU, let’s look at some data. Since Scott Denning mentions weather balloon (radiosonde) data, let’s look at our published comparisons between the 4 satellite datasets and radiosondes (as well as global reanalysis datasets) and see who agrees with independent data the best:

Sat-datasets-vs-sondes-reanalyses-tropics-Christy-et-al-2018-550x413
Trend differences 1979-2005 between 4 satellite datasets and either radiosondes (blue) or reanalyses (red) for the MSU2/AMSU5 tropospheric channel in the tropics. The balloon trends are calculated from the subset of gridpoints where the radiosonde stations are located, whereas the reanalyses contain complete coverage of the tropics. For direct comparisons of full versus station-only grids see the paper.

Clearly, the RSS, NOAA, and UW satellite datasets are the outliers when it comes to comparisons to radiosondes and reanalyses, having too much warming compared to independent data.

But you might ask, why do those 3 satellite datasets agree so well with each other? Mainly because UW and NOAA have largely followed the RSS lead… using NOAA-14 data even when its calibration was drifting, and using similar strategies for diurnal drift adjustments. Thus, NOAA and UW are, to a first approximation, slightly altered versions of the RSS dataset.

Maybe Scott Denning was just having a bad day. In the past, he has been reasonable, being the only climate “alarmist” willing to speak at a Heartland climate conference. Or maybe he has since been pressured into toeing the alarmist line, and not being allowed to wander off the reservation.

In any event, I felt compelled to defend our work in response to what I consider (and the evidence shows) to be an unfair and inaccurate attack on our UAH dataset in social media.

UPDATE from John Christy (11:10 CDT April 26, 2019):

In response to comments about the RATPAC radiosonde data having more warming, John Christy provides the following:

The comparison with RATPAC-A referred to in the comments below is unclear (no area mentioned, no time frame). But be that as it may, if you read our paper, RATPAC-A2 was one of the radiosonde datasets we used. RATPAC-A2 has virtually no adjustments after 1998, so it contains warming shifts known to have occurred in the Australian and U.S. VIZ sondes, for example. The IGRA dataset used in Christy et al. 2018 utilized 564 stations, whereas RATPAC uses about 85 globally, and far fewer in the tropics, where the comparison shown in the post was made. RATPAC-A warms relative to the other radiosonde/reanalysis datasets since 1998 (which use over 500 sondes), but was included anyway in the comparisons in our paper. The warming bias relative to 7 other radiosonde and reanalysis datasets can be seen in the following plot:

RATPAC-vs-7-others-550x413

Adjusting Good Data To Make It Match Bad Data

Reblogged from RealClimateScience.com:

mwr-035-01-0007b.pdf

On election day in 2016, both satellite datasets (UAH and RSS) showed a 15-year-long hiatus in global warming and bore no resemblance to the warming trend being generated by NOAA and NASA. I captured this image in a November 16, 2016 blog post.

Gavin Schmidt Promises To Resign | The Deplorable Climate Science Blog

This is what the same graph looks like now.

Wood for Trees: Interactive Graphs

In the next image, I overlaid the current RSS graph on the 2016 image.  You can see how RSS was adjusted to match the NASA data.

I predicted this would happen on

Look for the satellite data to be adjusted to bring it into compliance with the fully fraudulent surface temperatures. The Guardian is now working to discredit UAH, so it seems likely that RSS will soon be making big changes – to match the needs of the climate mafia. Bookmark this post.

RSSChanges

Roy Spencer at UAH made the same prediction on January 9, 2017

“I expect there will soon be a revised TLT product from RSS which shows enhanced warming, too.

Here’s what I’m predicting:

1) neither John Christy nor I will be asked to review the paper

2) it will quickly sail through peer review (our UAH V6 paper is still not in print nearly 1 year after submission)

3) it will have many authors, including climate model people and the usual model pundits (e.g. Santer), which will supposedly lend legitimacy to the new data adjustments.

Let’s see how many of my 3 predictions come true.

-Roy”

Wood for Trees: Interactive Graphs

The reason I made this prediction was because Ted Cruz used an RSS graph in a Senate hearing in March of 2015. Carl Mears at RSS then came under intense pressure to make his data match the surface temperature data.

“My particular dataset (RSS tropospheric temperatures from MSU/AMSU satellites) show less warming than would be expected when compared to the surface temperatures. All datasets contain errors. In this case, I would trust the surface data a little more because the difference between the long term trends in the various surface datasets (NOAA, NASA GISS, HADCRUT, Berkeley etc) are closer to each other than the long term trends from the different satellite datasets. This suggests that the satellite datasets contain more ‘structural uncertainty’ than the surface dataset.”

Ted Cruz says satellite data show the globe isn’t warming

You can see what Mears did to bring his data into compliance. This was his web page in November 2016.

Note that after 1998, the observations are likely to be below the simulated values, indicating that the simulations as a whole are predicting too much warming.

Climate Analysis | Remote Sensing Systems

But under intense pressure,  Mears altered his own data to bring it into compliance.  The large discrepancy became a small discrepancy.

“…there is a small discrepancy between the model predictions and the satellite observations.”

Remote Sensing Systems

The image below overlays Mears’ old graph (V3) on his new one (V4). It is clear what he did: he eliminated the blue error interval and started using the high side of the interval as his temperature.

RSS V3 shows no warming since 2002.

The warming was all created by tampering with the data to eliminate the error interval.

Spreadsheet

The corruption is now complete.  NASA has announced that new satellite data matches their surface temperature data. This was done to keep the President’s Commission on Climate Security from having accurate data to work with.

All government climate data goes through the same transition in support of global warming alarm. The past keeps getting cooler, and recent years keep getting warmer.

NASA 1999   NASA 2016

Government climate agencies appear to be using Orwell’s 1984 as Standard Operating Procedure.

Basic Science: 4 Keys to Melt Fears About Ice Sheets Melting

Reblogged from Watts Up With That:

William Ward, April 18, 2019


[HiFast BLUF: Here’s the author’s summary/bottom line up front.] Despite the overwhelming number of popular news reports to the contrary, studies of ice sheets melting over the past century show remarkable ice stability. Using the proper scientific perspective, analysis of ice-melt rates and ice-mass losses shows the ice sheets will take hundreds of thousands of years to melt, assuming the next glacial period doesn’t start first. An application of basic physics shows that for every 1 °C of atmospheric heat exchanged with the ice sheets we get a maximum 0.4 inches of SLR and a correspondingly cooler atmosphere. Over the 20th century, we observed a worst-case 4:1 ratio of consumed heat to retained atmospheric heat. It is proposed that this ratio can be used to assess potential ice-melt-related SLR for a hypothetical atmospheric temperature increase scenario over the current century. Using a reasonable range for all of the variables, we can estimate an SLR of between 1.4 and 6.4 inches, but our current observations support the rise being toward the lower end of that range.

The atmosphere and oceans do not show the increase in energy necessary to cause catastrophic SLR from rapidly melting ice. Humankind does not possess the technology to melt a significant amount of ice because the energy required is enormous and only nature can meter out this energy over very long periods. With the proper scientific perspective about the amount of energy required to melt ice, it should be much more difficult for Climate Alarmists to scare the public with scenarios not supported by basic science.


 

The world is drowning in articles about catastrophic sea level rise (SLR), reminding us that if the ice sheets melt, 260 feet of water will flood our coastal cities. We know that sea level today is 20-30 feet lower than it was at the end of the last interglacial period 120,000 years ago. We also know that sea level has risen 430 feet since the end of the last glacial maximum 22,000 years ago. Research shows this rise was not monotonic but oscillatory, and during periods over the past 10,000 years, sea level has been several meters higher than today. So, evidence supports the possibility of higher sea levels, but does the evidence support the possibility of catastrophic sea level rise from rapidly melting ice?

In this paper, basic science is used to show that catastrophic SLR from melting ice cannot happen naturally over a short period. Additionally, humankind does not possess the capability to melt a large amount of ice quickly even through our most advanced technology. This news should relieve the public, which is routinely deceived by reporting that misrepresents the facts. The public is susceptible to unnecessary alarmism when melt rates and ice-melt masses are presented without perspective and juxtaposed against claims that scientists are worried. This paper uses the same facts but places them in perspective to show that catastrophic risks do not exist.

Ice Sheets Melting: Deceptive Reporting

The growing alarm over melting ice sheets is directly attributable to deceptive reporting. The sheer number of reports inundates the public with an incessant message of angst. A single scientific study can be the source for headlines in hundreds of news articles. With social media repeating the news and the subsequent chorus of lectures from celebrities and politicians, we find ourselves in the deafening echo chamber of Climate Alarmism. However, it is a mistake to assume the real risks are proportional to the frequency or intensity of the message.

The primary problem is that the news writers do not have the scientific background to report on the subject responsibly, and therefore they routinely corrupt and distort the facts. Take for example an article in Smithsonian dated September 1, 2016, entitled “Melting Glaciers Are Wreaking Havoc on Earth’s Crust.” The first two sentences of the article read:

“You’ve no doubt by now been inundated with the threat of global sea level rise. At the current estimated rate of one-tenth of an inch each year, sea level rise could cause large swaths of cities like New York, Galveston and Norfolk to disappear underwater in the next 20 years.”

A sea level rise rate of one-tenth of an inch per year yields 2 inches of SLR in 20 years. Topographical maps show the lowest elevations of these cities are more than ten feet above sea level. No portion of these cities will disappear underwater from 2 inches of SLR.

The news writers seem obligated to pepper the facts with their own opinions such as “… climate change is real, undeniable and caused by humans.” It is often difficult for the reader to discern the facts from the opinions. However, even the facts become troubling because they consist of numbers without the perspective to understand their significance and are wrapped in existential angst. Consider the following excerpt from a June 13, 2018 article in the Washington Post, entitled “Antarctic ice loss has tripled in a decade. If that continues, we are in serious trouble.”

“Antarctica’s ice sheet is melting at a rapidly increasing rate, now pouring more than 200 billion tons of ice into the ocean annually and raising sea levels a half-millimeter every year, a team of 80 scientists reported… The melt rate in Antarctica has tripled in the past decade, the study concluded. If the acceleration continues, some of scientists’ worst fears about rising oceans could be realized, leaving low-lying cities and communities with less time to prepare than they had hoped.”

As reported, the reader assumes a melt rate that has tripled must be dire, and billions of tons of melting ice must be extreme. However, this perception changes if the facts are analyzed to provide perspective. An analysis shows that the original annual melt rate of 1.3 parts-per-million (ppm) has increased to nearly 4 ppm over 26 years. The news writer failed to inform us of these facts which provide perspective. The new melt rate is analogous to losing 4 dollars out of 1 million dollars. Losing slightly less than 4 parts in 1 million each year means that it will take over 250,000 years to melt entirely. No natural process is static, so we should expect variation over time. Most change is cyclical. Sometimes the ice is increasing and sometimes it is decreasing. The average person’s body mass fluctuates by 20,000 to 40,000 ppm each day. By comparison, Antarctica varying by 1-4 ppm over a year should be considered rock-solid stability in the natural world.
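As a quick sanity check on the parts-per-million framing, here is a short Python sketch. The 1.3 and ~4 ppm/year loss rates are the article's own figures, not independently derived:

```python
# Sketch of the melt-rate perspective above, using the article's figures.
# The 1.3 and ~4 ppm/year Antarctic loss rates come from the text; treat
# them as illustrative inputs, not authoritative values.

def years_to_melt(annual_loss_ppm: float) -> float:
    """Years to melt an ice sheet entirely at a constant fractional loss rate."""
    return 1_000_000 / annual_loss_ppm

print(round(years_to_melt(4)))     # 250,000 years at ~4 ppm/year
print(round(years_to_melt(1.3)))   # ~769,000 years at the earlier 1.3 ppm/year

# For the body-mass comparison: a daily fluctuation of 2-4% of body mass
# is 20,000-40,000 ppm, versus 1-4 ppm per YEAR for the ice sheet.
```

The constant-rate assumption is of course the same simplification the article itself makes; the point is only the order of magnitude.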

Ice Sheets Melting: What Happened Over the Past Century

Antarctica holds 91% of the world’s land ice, Greenland 8%, and the remaining 1% is spread over the rest of the world. Therefore, by understanding what is happening to the ice sheets in Antarctica and Greenland, we understand what is happening to 99% of the world’s land ice.

NASA is a good source for research about what is happening in Antarctica. However, two NASA agencies have recently published studies with conflicting conclusions. The Goddard Space Flight Center recently published research concluding Antarctica is not contributing to SLR. According to the study, snow accumulation exceeded ice melting, resulting in a 0.5-inch sea level reduction since 1900. Contrarily, the Jet Propulsion Laboratory (JPL) reports that the rate of ice loss from Antarctica has tripled since 2012 and contributed 0.3 inches to SLR between 1992 and 2017. To cover the worst-case scenario, we can analyze the JPL study and provide the perspective to understand their results.

Over 26 years, Antarctica’s average annual mass loss was less than 0.00040% of its total. If Antarctica were a 220 lb man, his mass loss each year would be 0.4 grams or about eight tears. (Eight human tears weigh about 0.4 g.) At this alarming rate that makes our most elite climate scientists worried, it would take 250,185 years to melt all of the ice. It would take over 1,000 years of melting to yield 12 inches of SLR from Antarctica if we ignore natural variability and the cyclical nature of ice volume and assume the melt rate continues uninterrupted.
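The 220-lb-man analogy can be worked out directly. The 0.00040%-per-year loss fraction is the article's figure; the ~0.05 g mass of a human tear is an assumption used only for the comparison:

```python
# The 220-lb-man analogy, worked out. The annual loss fraction (0.00040%)
# comes from the article; the ~0.05 g per tear is an assumed round number.

LOSS_FRACTION_PER_YEAR = 0.00040 / 100       # 0.00040% -> 4.0e-6 per year

man_kg = 220 * 0.45359237                     # 220 lb in kilograms
loss_g = man_kg * LOSS_FRACTION_PER_YEAR * 1000
print(round(loss_g, 2))                       # ~0.4 g of mass lost per year

tears = loss_g / 0.05                         # assuming ~0.05 g per tear
print(round(tears))                           # ~8 tears

print(round(1 / LOSS_FRACTION_PER_YEAR))      # 250,000 years to melt entirely
```

The 250,000-year figure agrees with the article's more precise 250,185 years, which starts from the unrounded melt rate.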

The best information we have about Greenland comes from a study in the journal Nature, estimating Greenland’s ice losses between 1900 and 2010. Using current ice volume estimates from USGS, we calculate the ice mass in 2010 was between 99.5% and 99.8% of what it was in 1900. Ice melt from Greenland over those 111 years contributed 0.6–1.3 inches to SLR. It would take over 1,300 years of melting to yield 12 inches of SLR from Greenland if we ignore natural variability and the cyclical nature of ice volume and assume the melt rate continues uninterrupted.

The average annual inland temperature in Antarctica is -57 °C and most coastal stations average -5 °C to -15 °C. The much talked about Western Antarctica averages several degrees below 0 °C. Southern Greenland does experience summer temperatures above 0 °C and seasonal melting. Northern Greenland stays below 0 °C even in the summer months, and the average annual inland temperatures are -20 °C to -30 °C. The temperatures in Greenland and Antarctica are not warm enough to support significant rapid ice melt. In the past century, we have 1 °C of retained atmospheric heat, and enough heat exchanged with ice in Greenland and Antarctica to raise sea level by 0.9 – 1.6 inches. Despite all of the reports in the media to the contrary, we have no real observations of any ice melt crisis. The past 111 years have been remarkable because of ice stability – not because of ice melting. We are 19 years into the 21st century with no evidence supporting an outcome much different from the 20th century.

Ice Sheets Melting: The Process

The lifecycle of an ice sheet begins as snow. Snow falls in the higher elevations and over time it compacts and becomes ice. The ice thickness in Antarctica is over 12,000 feet in the center of the continent and over 9,000 feet over most of East Antarctica. The force of gravity initiates a thousand-year journey where the ice flows from its heights back to the sea. At the end of this journey, when its weight can no longer be supported by the sea, it “calves” and becomes an iceberg. Some icebergs can float around Antarctica for over 30 years before fully melting. So, young ice is born inland from snow, and old ice dies near the coast from seasonal melting or after drifting for years as an iceberg. This process is the natural cycle of ice and not one which should create panic. During some periods we have more snow accumulating than ice melting, such as the period between 1300 CE and 1850 CE, known as the “Little Ice Age.” During other periods we have more ice melting than snow accumulating, such as the Medieval Warm Period and our present time.

In our present time, sunlight alone is insufficient to cause significant changes to ice sheet mass. Sunlight must act in concert with other effects such as cloud cover, water vapor and other “greenhouse” gases such as CO2. Regardless of the mechanisms, the Earth system must do two things to melt more ice: 1) retain more heat energy and 2) via the atmosphere, transport this heat to the poles and transfer it to the ice. Additional heat energy in the system cannot melt ice unless this transport and transfer happen.

Ice Sheets Melting: Conservation of Energy

A 2007 study by Shepherd and Wingham published in Science shows the current melt rate from Greenland and Antarctica contributes 0.014 inches to SLR each year. For perspective, the thickness of 3 human hairs is greater than 0.014 inches. The results align reasonably well with the other studies mentioned. Despite the minuscule amount of actual SLR from melting ice, NOAA and the IPCC provide 21st-century SLR projections that range from a few inches to several meters. The wide range of uncertainty leads to angst about catastrophe; however, the use of basic science allows us to provide reasonable bounds to the possibilities.

Before the start of the American Revolution, Scottish scientist Joseph Black (and others) solved the mysteries of specific heat and latent heat, which gives us the relationship between heat energy, changing states of matter (solid/liquid) and change of temperature. Equations 1 and 2 give us the mathematical relationships for specific heat and latent heat respectively:

(1) E = mc∆T

(2) E = mL

Where E is thermal energy (Joules), m is the mass (kg), c is the “specific heat” constant (J/kg/°C), ∆T is the change in temperature (°C), and L is the latent heat constant (J/kg). Specific heat is the amount of heat energy that we must add (or remove) from a specified mass to increase (or decrease) the temperature of that mass by 1 °C. Latent heat is the thermal energy released or absorbed during a constant temperature phase change. If we know the mass of the ice, water or atmosphere, it is easy to calculate the amount of energy it takes to change its temperature, melt it or freeze it.
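Equations 1 and 2 can be applied to 1 kg of polar ice in a few lines. The constants here are standard textbook values (ice specific heat ~2060 J/kg/°C, seawater specific heat ~3990 J/kg/°C, latent heat of fusion ~334,000 J/kg), so the result should be read as a round-number sketch:

```python
# Equations 1 (E = m*c*dT) and 2 (E = m*L) applied to 1 kg of polar ice.
# Standard textbook constants; small variations in them shift the result
# by a few percent.

C_ICE = 2060.0        # specific heat of ice, J/kg/°C
C_SEAWATER = 3990.0   # specific heat of seawater, J/kg/°C
L_FUSION = 334_000.0  # latent heat of fusion of water, J/kg

m = 1.0  # kg

warm = m * C_ICE * 25.0   # Eq. 1: warm ice from -25 °C to 0 °C
melt = m * L_FUSION       # Eq. 2: melt it at constant 0 °C
E = warm + melt
print(f"{E:.3g} J")       # ~3.86e+05 J -- the quantity the text calls "E"

# Compare: heating 1 kg of seawater from 0 °C to 100 °C (Eq. 1 again).
boil = m * C_SEAWATER * 100.0
print(f"{boil:.3g} J")    # ~3.99e+05 J -- roughly the same "E"
```

Note how the latent heat term dominates: melting alone costs far more energy than the 25 °C of warming that precedes it.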

Understanding that energy is conserved when melting ice, the equations above can be used to calculate the temperature effects that must be observed in the oceans or atmosphere to support an ice melt scenario. We can provide reasonable bounds and reduce the uncertainty.

See the reference section at the end of the paper for all sources and calculations.

Key #1: Importance of the Latent Heat of Fusion

It is essential to understand the latent heat of fusion because of the enormous amount of heat energy that is required to change the state of H2O from solid to liquid. Figure 1 shows the specific heat and phase change diagram for water. The blue line shows the temperature of water in °C (y-axis) plotted against the change in thermal energy in kJ/kg (x-axis). It shows how temperature and energy are related as we go from cold solid ice to boiling liquid water. The average annual inland temperature of Greenland is -25 °C and this is the reason for Point 1 on the line. If we start at Point 1 and progress to Point 2, this shows how much heat energy must be added to change the temperature of 1kg of ice from -25 °C to 0 °C. It is important to note that at Point 2, the ice is still 100% solid at 0 °C.

Figure 1: Water Phase/Specific Heat Diagram

The diagram reveals something interesting about the behavior of water. As we progress from Point 2 to Point 3, the water undergoes a phase change from solid to liquid. There is no temperature change as the ice becomes liquid water; however, a large amount of heat energy must be added. The energy that must be added to change the phase of water from solid to liquid is the latent heat of fusion. For melting ice, temperature alone does not inform us about what is happening to the system. To assess ice melting, we must understand the net change of energy. Whether we melt 1kg of ice or the entire ice sheet in Greenland, using Equations 1 and 2, we can easily calculate the energy required to do so. Going from Point 1 to Point 3 requires 3.86×10⁵ Joules of energy for each kg of ice mass warmed and melted. For simplicity, we call this quantity of energy “E.”

Figure 1 also shows what happens as we move from Point 3 (0 °C liquid seawater) to Point 4 (seawater starting to boil at 100 °C). It takes a measure of energy “E” to move between Points 3 and 4, just as it does to move between Points 1 and 3. Therefore, as shown in Table 1, the energy required to melt the ice is equivalent to the energy required to heat the meltwater to a boil at 100 °C. (Note: the fresh water from the ice is assumed to flow to the oceans.)

Energy to melt 1kg of polar ice from -25 °C to 0 °C water = Energy to raise the temperature of 1kg of seawater from 0 °C to 100 °C

Table 1: Relating Energy Between Polar Ice Melt and Boiling Water

Key #2: Total Energy Required to Melt the Ice Sheets

Using Equations 1 and 2, we calculate that the total heat energy required to melt the ice sheets entirely is 1.32×10²⁵ J. This value can be given perspective by calculating the increase in ocean water temperature that would result from adding 1.32×10²⁵ J of heat. We know that deep ocean water below the thermocline is very stable in temperature between 0-3 °C. 90% of the ocean water mass is below the thermocline. The thermocline and surface layer above contain the ocean water that responds to changes in atmospheric heat, whether that be from seasonal changes or climate changes. Therefore, if we constrain the 1.32×10²⁵ J of heat energy to the upper 10% of the ocean mass, we calculate the temperature increase would be 25.6 °C, assuming equal heat distribution for simplicity of analysis. This increase would make the surface temperature of equatorial ocean water close to 55 °C, similar to a cup of hot coffee. Polar seas would be perfect for swimming at nearly 25 °C. According to NOAA, over the past 50 years, the average ocean surface temperature has increased approximately 0.25 °C.
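The ocean-warming figure can be checked with Equation 1 rearranged for ∆T. The total ocean mass (~1.35×10²¹ kg) and seawater specific heat (~3850 J/kg/°C) used here are assumed round numbers, which is why the result lands near, but not exactly on, the 25.6 °C quoted above:

```python
# Checking the ocean-warming figure with Eq. 1 rearranged: dT = E / (m*c).
# Assumed round numbers: total ocean mass ~1.35e21 kg, seawater specific
# heat ~3850 J/kg/°C; the melt energy is the 1.32e25 J from the text.

E_MELT_ALL_ICE = 1.32e25           # J, energy to melt the ice sheets entirely
upper_ocean_kg = 0.10 * 1.35e21    # kg, the upper 10% of the ocean mass
C_SEAWATER = 3850.0                # J/kg/°C

dT = E_MELT_ALL_ICE / (upper_ocean_kg * C_SEAWATER)
print(round(dT, 1))                # ~25 °C, close to the 25.6 °C in the text
```

Slightly different choices for ocean mass and specific heat reproduce the text's 25.6 °C exactly; the conclusion is insensitive to those choices.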

Another way to give perspective is to calculate the increase in atmospheric temperature that would result from adding 1.32×1025 J of heat to the atmosphere. First, we must understand some related facts about the atmosphere. Heat energy must be transported by the atmosphere to the polar regions, or no ice can melt. However, the atmosphere’s capacity to store heat energy is extremely low compared to the energy required to melt all of the ice. The ice sheets contain more than 900 times the thermal energy below 0 °C as the atmosphere contains above 0 °C, and therefore the atmospheric heat energy must be replenished continuously to sustain ice melting. Melting polar ice with heat from the atmosphere is analogous to filling a bathtub with a thimble. The low specific heat of air is one reason the atmosphere lacks heat carrying capacity. The other reason is its low mass.

Figure 2 shows the vertical profile of the Earth’s atmosphere. The red line in Figure 2 shows the temperature of the atmosphere in °C (x-axis) plotted against the altitude in km (y-axis). 75% of the mass of the atmosphere is contained in the Troposphere, where all life (outside of the oceans) exists on Earth. Figure 2 reveals that most of the atmosphere is far too cold to melt ice. We can ignore the Upper Thermosphere as the mass of atmosphere contained in that layer is negligibly small. Only the Lower Troposphere below 2.5 km altitude contains air at a warm enough temperature to melt ice. (See the region of the graph enclosed in the yellow oval.) 35% of the atmospheric mass exists below 2.5 km, and the average temperature is ~ 8 °C.

Figure 2: Vertical Profile of Earth’s Atmosphere

Using Equation 1 with E = 1.32×10²⁵ J, the mass of the atmosphere below 2.5 km and solving for ∆T, we can calculate what the temperature of the air below 2.5 km would be if it contained the energy required to melt all of the ice. The atmospheric temperature would have to be 7,300 °C, which is 1,522 °C hotter than the surface of the sun. Life on Earth would be in jeopardy from the increased atmospheric heat long before all of the ice melted. While there are no plausible thermodynamic pathways to heat the Earth’s atmosphere to such temperatures, the calculations of energy required are accurate. According to NASA, the global average temperature over the past 50 years has increased approximately 0.6 °C.
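The 7,300 °C figure follows from the same rearrangement of Equation 1. The total atmospheric mass (~5.15×10¹⁸ kg) and the specific heat of air (~1005 J/kg/°C) are assumed standard values; the 35% fraction below 2.5 km is the article's own figure:

```python
# Checking the 7,300 °C figure with Eq. 1 rearranged: dT = E / (m*c).
# Assumptions: total atmospheric mass ~5.15e18 kg (standard value), 35% of
# it below 2.5 km (from the text), c_air ~1005 J/kg/°C at constant pressure.

E_MELT_ALL_ICE = 1.32e25            # J, from Key #2
air_mass = 0.35 * 5.15e18           # kg of air below 2.5 km altitude
C_AIR = 1005.0                      # J/kg/°C

dT = E_MELT_ALL_ICE / (air_mass * C_AIR)
print(round(dT))                    # ~7,300 °C
```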

Key #3: SLR From Incremental Atmospheric Heat Exchange with Ice Sheets

It is said, “you can’t have your cake and eat it too.” Similarly, you can’t have atmospheric heat and melt with it too. If the ice consumes heat, then the atmosphere cools. If the atmosphere retains its heat, then no ice melts. So, let’s examine some scenarios where we trade energy from the atmosphere with ice to see how much corresponding SLR we can get.

Using Equation 1, we can determine the change in energy for a 1 °C temperature decrease in the atmosphere below 2.5km. We can then apply this energy to the ice, assume maximum melting volume and translate that to SLR. For every 1 °C of atmospheric energy transferred to the ice, we get 0.4 inches of SLR. Some IPCC scenarios project a 4 °C rise in “global average temperature” in the 21st century, due to increased atmospheric CO2. An increase in temperature does not melt any additional ice unless the heat is transferred to the ice. If 4 °C of energy from the atmosphere is transferred to the ice, we get a corresponding 1.7 inches of SLR and an atmosphere that is 4 °C cooler. If we transfer all of the energy in the atmosphere above 0 °C to the ice, then we get 3.4 inches of SLR and a world where the entire atmosphere is at or below 0 °C. The global average temperature would be 6 °C less than the coldest experienced during the depth of a glacial period.
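The 1 °C-of-air to inches-of-SLR conversion can be sketched end to end. All constants here are assumed round numbers (air mass below 2.5 km ~1.8×10¹⁸ kg, per-kilogram melt energy ~3.86×10⁵ J, ocean area ~3.62×10¹⁴ m²), which is why the result comes out slightly above the 0.4-inch figure used in the text:

```python
# Sketch of the 1 °C-of-air -> SLR conversion. Round-number assumptions:
# air mass below 2.5 km ~1.8e18 kg, c_air ~1005 J/kg/°C, melt energy
# ~3.86e5 J per kg of ice, ocean area ~3.62e14 m^2, meltwater ~1000 kg/m^3.

AIR_MASS_BELOW_2_5KM = 1.8e18   # kg
C_AIR = 1005.0                  # J/kg/°C
E_PER_KG_ICE = 3.86e5           # J to warm ice from -25 °C and melt it
OCEAN_AREA = 3.62e14            # m^2

def slr_inches_per_deg_c():
    energy = AIR_MASS_BELOW_2_5KM * C_AIR * 1.0   # J freed by 1 °C of cooling
    melted_kg = energy / E_PER_KG_ICE             # kg of ice that energy melts
    rise_m = (melted_kg / 1000.0) / OCEAN_AREA    # meltwater spread over oceans
    return rise_m * 39.37                         # metres -> inches

print(round(slr_inches_per_deg_c(), 2))  # ~0.5 inches of SLR per °C
```

The small gap between ~0.5 here and the 0.4 inches in the text comes down to rounding in the air-mass and melt-energy constants; either value supports the same argument.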

To raise sea level by 12 inches would require the atmosphere to heat up by 28 °C before exchanging that energy with the ice. As we would experience it, the atmosphere would have to heat up by some incremental value, then exchange that incremental value of energy with the ice, thus cooling the atmosphere, and then repeat this process until the 28 °C of atmospheric heat is consumed.

Key #4: Maximum Ice Melt Potential from Technology

Keys #1-3 don’t offer much to support the possibility of large quantities of ice being melted rapidly by natural causes. The next obvious question is, can humankind generate enough heat with our most advanced technology to melt a significant amount of ice rapidly?

The power of the atom is one of the most awesome powers humankind has harnessed. There are 8,400 operational nuclear warheads in the world’s nuclear arsenal, with a total yield of 2,425 Megatons of TNT. It is interesting to note that the energy contained in this nuclear arsenal is over 800 times the equivalent explosive power used in World War II. It is said that there are enough nuclear weapons to destroy the world a hundred times over. So, perhaps this is enough energy to melt the ice sheets entirely. For this exercise, we assume the nuclear weapons release their energy slowly – only fast enough to melt ice and no faster. For maximum melting, we evenly distribute all of the weapons in the ice. However, when we convert 2,425 MT to Joules, we get a number that is far below the energy required to melt all of the ice. The SLR we could get by using all of the world’s nuclear weapons for melting ice would be 0.002 inches. For reference, the diameter of a human hair is 2.5 times thicker than this. If we want all of the ice to melt, we need to duplicate each weapon more than 1,300,000 times. So, it looks like our current arsenal of nuclear weapons is no match for the ice.
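The nuclear-arsenal comparison can be worked out with the same constants. The 2,425 MT arsenal is the article's figure; 4.184×10¹⁵ J per megaton of TNT is the standard conversion:

```python
# The nuclear-arsenal comparison, worked out. The 2,425 MT total yield is
# the article's figure; 4.184e15 J per megaton of TNT is standard. Melt
# energy ~3.86e5 J/kg of ice and ocean area ~3.62e14 m^2 as before.

J_PER_MEGATON = 4.184e15
E_ARSENAL = 2425 * J_PER_MEGATON          # ~1.0e19 J in the whole arsenal
E_MELT_ALL_ICE = 1.32e25                  # J, from Key #2

melted_kg = E_ARSENAL / 3.86e5            # ice melted by the whole arsenal
rise_inches = (melted_kg / 1000.0) / 3.62e14 * 39.37
print(round(rise_inches, 3))              # ~0.003 inches of SLR

print(round(E_MELT_ALL_ICE / E_ARSENAL))  # ~1,300,000 arsenals to melt it all
```

The result is the same order as the text's 0.002 inches, and the ~1,300,000 duplication factor matches the text directly.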

What other sources of power does humankind have that could be used to melt a significant amount of ice? The annual global energy production of electric power is 25 petawatt-hours (25×10¹⁵ Wh) or 9×10¹⁹ Joules. If we could, through some advanced technology, transfer all electric energy generated over one year to heaters buried in the ice, and do this with no transmission or distribution losses, then how much ice could we melt? The answer is 0.02 inches of SLR (equivalent to 4 human hair diameters). This scenario would require that humans not use any electric power for that entire year, for anything other than melting ice. Humanity would have to forego the benefits of electric power for over 146,000 years to melt all of the ice, assuming static conditions in the ice.
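The electricity comparison uses the same arithmetic. The 9×10¹⁹ J annual figure is the article's; the other constants are the round numbers used earlier:

```python
# The global-electricity comparison, worked out. 9e19 J/year is the
# article's figure for annual world electricity production; melt energy
# ~3.86e5 J/kg of ice and ocean area ~3.62e14 m^2 as before.

E_ELECTRICITY_YEAR = 9e19     # J, one year of world electric output
E_MELT_ALL_ICE = 1.32e25      # J, from Key #2

melted_kg = E_ELECTRICITY_YEAR / 3.86e5
rise_inches = (melted_kg / 1000.0) / 3.62e14 * 39.37
print(round(rise_inches, 3))  # ~0.025 inches of SLR (the text rounds to 0.02)

print(round(E_MELT_ALL_ICE / E_ELECTRICITY_YEAR))  # ~147,000 years
```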

Ice Sheets Melting: Analysis

Since 1900 we have 1 °C of retained atmospheric heat, and enough heat consumed by the ice sheets to produce 0.9 – 1.6 inches of SLR. From Key #3 we learned 1.7 inches of SLR results from trading 4 °C of atmospheric heat for ice melting. Therefore, as a worst-case approximation, if there had been no net ice melt since 1900, the atmosphere would have heated by approximately 5 °C. We can conclude that ice melting consumed 4 °C of heat, leaving the atmosphere with 1 °C of retained heat. We observed a 4:1 ratio of consumed heat to retained heat in the 20th century, worst case. For the best-case approximation, we use the lower estimate of 0.9 inches of SLR, which yields a 2:1 ratio of consumed heat to retained heat over the same period. In one of the more extreme scenarios, the IPCC climate model projects 4 °C of atmospheric temperature rise in the 21st century. For a 4 °C rise scenario, using the worst-case ratio of consumed to retained heat, we can estimate a 6.4 inch SLR over that period. In a more moderate scenario, the IPCC projects a 1.5 °C temperature rise. For a 1.5 °C rise, using the best-case ratio of consumed to retained heat, we can estimate an SLR of 1.4 inches. Unfortunately, none of the climate models have been able to predict the climate accurately, and none of them backtest successfully. We are one-fifth of the way through the 21st century and do not appear to be on course for the IPCC’s worst-case temperature projections. Therefore, it is reasonable to assume the results for the 21st century will likely be very similar to the 20th century, with 1-2 inches of SLR.
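The consumed-to-retained-heat bookkeeping above reduces to one multiplication. The 0.4 in/°C conversion and the 4:1 and 2:1 ratios are the article's own figures; the small difference between the 1.2 inches computed here and the 1.4 inches quoted in the text is rounding in the per-degree conversion:

```python
# The ratio-based SLR scenarios, worked out. The 0.4 in/°C conversion and
# the 4:1 (worst-case) / 2:1 (best-case) consumed-to-retained ratios are
# the article's own figures.

SLR_PER_DEG_C = 0.4  # inches of SLR per 1 °C of atmospheric heat given to ice

def slr_for_scenario(retained_deg_c, consumed_to_retained_ratio):
    """SLR (inches) if the atmosphere retains `retained_deg_c` of warming
    while the ice consumes `ratio` times that much heat."""
    consumed = retained_deg_c * consumed_to_retained_ratio
    return consumed * SLR_PER_DEG_C

print(round(slr_for_scenario(4.0, 4), 1))  # worst case: 6.4 inches
print(round(slr_for_scenario(1.5, 2), 1))  # best case: ~1.2 inches
```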

Detailed analysis of the claimed Earth energy imbalance is beyond the scope of this paper. The analysis presented here exposes the effects that must occur from an imbalance that leads to catastrophic melting. The ice must absorb large quantities of heat energy for sustained periods. Therefore, inland temperatures over Antarctica and Greenland would need to be maintained well above 0 °C for significant portions of the year. Atmospheric heat lost to the ice would need to be continually replenished to perpetuate the process. The oceans store heat energy, but the large mass of the oceans with the high specific heat of seawater blunts the possible effects from that energy. The energy that would raise the first 2.5 km of atmospheric air by 1 °C would raise the first 1,000 feet of seawater by only 0.0035 °C. The 2nd law of thermodynamics requires a temperature difference to transfer heat energy. Small increases in ocean temperature cannot lead to large movements of heat energy to an already warmer atmosphere. Finally, the system must transport more heat energy to the polar regions. In reality, the Earth maintains a very large temperature gradient between the equator and the poles. Our observations do not show gradient changes that would support significant additional heat transport. Without the increased energy storage and transport, and sustained polar temperatures well above freezing, catastrophic ice melt scenarios are not possible.

Ice Sheets Melting: Summary

Despite the overwhelming number of popular news reports to the contrary, studies of ice sheets melting over the past century show remarkable ice stability. Using the proper scientific perspective, analysis of ice-melt rates and ice-mass losses shows the ice sheets will take hundreds of thousands of years to melt, assuming the next glacial period doesn’t start first. An application of basic physics shows that for every 1 °C of atmospheric heat exchanged with the ice sheets we get a maximum of 0.4 inches of SLR and a correspondingly cooler atmosphere. Over the 20th century, we observed a worst-case 4:1 ratio of consumed heat to retained atmospheric heat. It is proposed that this ratio can be used to assess potential ice-melt-related SLR for a hypothetical atmospheric temperature increase over the current century. Using a reasonable range for all of the variables, we can estimate an SLR of between 1.4 and 6.4 inches, with our current observations supporting a rise toward the lower end of that range.

The atmosphere and oceans do not show the increase in energy necessary to cause catastrophic SLR from rapidly melting ice. Humankind does not possess the technology to melt a significant amount of ice because the energy required is enormous and only nature can meter out this energy over very long periods. With the proper scientific perspective about the amount of energy required to melt ice, it should be much more difficult for Climate Alarmists to scare the public with scenarios not supported by basic science.

References

NASA Study: Mass Gains of Antarctic Ice Sheet Greater than Losses: https://www.nasa.gov/feature/goddard/nasa-study-mass-gains-of-antarctic-ice-sheet-greater-than-losses

Ramp-up in Antarctic ice loss speeds sea level rise: https://climate.nasa.gov/news/2749/ramp-up-in-antarctic-ice-loss-speeds-sea-level-rise/?fbclid=IwAR2Vnkbxxa-NTU_v0lRUUGGDffMs4Q6BGvHX-KHzcHM7-q2B7IO59wCEiQc

Sea Level and Climate (Fact Sheet 002-00): https://pubs.usgs.gov/fs/fs2-00/

Spatial and temporal distribution of mass loss from the Greenland Ice Sheet since AD 1900: https://www.nature.com/articles/nature16183

Recent Sea-Level Contributions of the Antarctic and Greenland Ice Sheets: http://science.sciencemag.org/content/315/5818/1529

All of the constants and calculations are provided in the associated Excel file located here: https://wattsupwiththat.com/wp-content/uploads/2019/04/Ice-Atmosphere-Ocean-Energy-20190407-1-1.xlsx

Greenland Temperature Data For 2018

NOT A LOT OF PEOPLE KNOW THAT

By Paul Homewood

IMAGE: Greenland temperatures

The DMI has just published its Greenland Climate Data Collection for last year, and it is worth looking at the temperature data:

There are six stations with long records, Upernavik, Nuuk, Ilulissat, Qaqortoq, Narsarsuaq and Tasilaq.

IMAGE: Map of the DMI weather stations in Greenland

IMAGE: Annual temperature records for the six stations

Throughout Greenland we find that temperatures in the last two decades are little different to the 1920s to 60s.

The only exceptions were 2010 on the west coast sites, which was an unusually warm year, and 2016 on the east coast at Tasilaq, another warm year there.

Noticeably, last year was actually colder than the 1981-2010 average at all of the west and south coast stations.

It is also noticeable that temperatures at all sites during the very cold interval of the 1970s and 80s were comparable to the late 19th century, when Greenland was beginning to struggle out of the Little Ice Age. This can be seen best in the…
