
New Statistical Technique Shows a Simpler Pattern of Ocean Warming

The oceans underwent some drastic changes in the early 20th century. While the Northwest Pacific cooled over several decades, for instance, the Northeast Pacific and North Atlantic appeared to warm roughly twice as much as the global average.

This chart shows annual sea surface temperature changes from different datasets in the North Pacific (top) and North Atlantic (bottom). The blue line indicates the corrected data from this research. It shows greater warming in the North Pacific and less warming in the North Atlantic relative to previous estimates. (Image credit: Harvard University)

Ocean and atmosphere models have struggled to reproduce these differing temperature trends, creating an enigma in climate science: what caused the oceans to warm and cool at such different rates in the early 20th century?

Now, a new study by the U.K.'s National Oceanography Centre and Harvard University provides an answer that is as complicated as global politics and as mundane as a truncated decimal point. The research, part history and part climate science, corrects decades of data and suggests that ocean warming occurred in a much more uniform way.

The study has been reported in Nature.

Humans have been measuring and recording sea surface temperature for centuries. Sea surface temperatures have long helped sailors find their bearings, verify their course, and anticipate turbulent weather. Until the 1960s, however, most sea surface temperature measurements were taken by dropping a bucket into the sea and measuring the temperature of the water inside.

A collection of sea surface temperature readings dating back to the early 19th century is maintained by the National Oceanic and Atmospheric Administration (NOAA) and the National Science Foundation's National Center for Atmospheric Research (NCAR).

The database includes more than 155 million observations from naval, fishing, research, and merchant ships around the globe. These observations are vital for interpreting changes in sea surface temperature over time, both anthropogenic and natural.

However, these observations are also a statistical nightmare.

For instance, how does one compare measurements from a British man-of-war in 1820 with those from a U.S. Navy ship in 1950 or a Japanese fishing vessel in 1920? And how does one know what type of bucket was used, and how much the water in it was cooled by evaporation or warmed by sunshine while being sampled?

When a canvas bucket filled with water is left on deck for about three minutes under typical weather conditions, the water will cool by roughly 0.5 °C more than water in a wooden bucket measured under the exact same conditions.

Because global warming over the 20th century amounted to roughly 1 °C, the biases arising from these diverse measurement protocols required meticulous accounting.

There are gigabytes of data in this database and every piece has a quirky story. The data is rife with peculiarities.

Peter Huybers, Study Senior Author, Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS)

Huybers is also a Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at SEAS.

A great deal of work has gone into detecting and correcting these peculiarities. In 2008, for instance, scientists found that a jump in sea surface temperatures in 1945 was the result of measurements taken from engine room intakes. Even with such corrections, however, the data were far from flawless, and mysterious changes in ocean surface temperature remained.

Huybers and his colleagues proposed a comprehensive approach to correcting the data, built on a new statistical technique that compares measurements taken by nearby ships.

Our approach looks at the differences in sea surface temperature measurements from distinct groups of ships when they pass nearby, within 300 kilometers and two days of one another. Using this approach, we found 17.8 million near crossings and identified some big biases in some groups.

Duo Chan, Study First Author and Graduate Student, Harvard Graduate School of Arts and Sciences
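
The pairing idea can be sketched in a few lines of Python. This is an illustrative brute-force version, not the authors' actual pipeline (the report format, group labels, and sample values below are invented for the example); it simply finds report pairs from different groups that fall within 300 km and two days of one another, so their temperature differences can expose relative biases between groups.

```python
import math
from itertools import combinations

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_crossings(reports, max_km=300.0, max_days=2.0):
    """Pairs of reports from different groups within the space/time window.

    Each report is a tuple: (group, day, lat, lon, sst)."""
    pairs = []
    for a, b in combinations(reports, 2):
        if a[0] == b[0]:
            continue  # only compare measurements from distinct groups
        if abs(a[1] - b[1]) > max_days:
            continue  # too far apart in time
        if haversine_km(a[2], a[3], b[2], b[3]) > max_km:
            continue  # too far apart in space
        pairs.append((a, b, a[4] - b[4]))  # SST difference hints at relative bias
    return pairs

# Hypothetical reports: two nearby in space and time, one too late to pair.
reports = [
    ("JP_fishing", 10.0, 35.0, 150.0, 18.2),
    ("JP_navy",    10.5, 35.5, 151.0, 17.7),
    ("US_navy",    14.0, 36.0, 152.0, 18.0),  # > 2 days from the others
]
print(near_crossings(reports))  # one near crossing, SST difference +0.5
```

Averaging such differences over millions of crossings is what lets systematic offsets between decks and nations stand out from weather noise.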

The team studied data collected between 1908 and 1941, grouped by "decks"—batches of marine observations named for the decks of punch cards on which they were once stored—and by the country of origin of the ship. One deck includes observations from both Robert Falcon Scott's and Ernest Shackleton's voyages to the Antarctic.

These data have made a long journey from the original logbooks to the modern archive and difficult choices were made to fit the available information onto punch cards or a manageable number of magnetic tape reels. We now have both the methods and the computer power to reveal how those choices have affected the data, and also pick out biases due to variations in observing practice by different nations, bringing us closer to the real historical temperatures.

Elizabeth Kent, Study Co-Author, the U.K. National Oceanography Centre

The scientists discovered two major new causes of the discrepancies in the North Atlantic and North Pacific. The first stems from changes in Japanese records. Before 1932, most records of sea surface temperature from Japanese vessels in the North Pacific came from fishing vessels.

This data, distributed across a number of different decks, was initially recorded in whole degrees Fahrenheit, subsequently changed to Celsius, and eventually rounded to tenths of a degree.
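
As a small illustration (not code from the study), converting whole-degree Fahrenheit readings to Celsius and rounding to tenths produces values with a characteristic, non-uniform spacing of 0.5 or 0.6 °C. Spacing patterns like this can serve as a fingerprint for spotting records that originated in Fahrenheit:

```python
# Whole-degree Fahrenheit readings, converted to Celsius and rounded to
# tenths of a degree, as described for the pre-1932 Japanese fishing decks.
fahrenheit = [60, 61, 62, 63, 64, 65]
celsius = [round((f - 32) * 5 / 9, 1) for f in fahrenheit]
print(celsius)  # [15.6, 16.1, 16.7, 17.2, 17.8, 18.3]
```

Note that consecutive values step by 0.5 or 0.6 °C rather than a uniform 0.1 °C, a quantization pattern that native-Celsius records would not show.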

In the lead-up to World War II, by contrast, an increasing share of Japanese readings came from naval ships, and these data were stored in a separate deck. When the collection was digitized by the U.S. Air Force, the data were truncated: the tenths-of-a-degree digit was chopped off and only whole degrees Celsius were recorded.

These previously unrecognized effects of truncation largely explain the rapid cooling apparent in estimates of Pacific sea surface temperatures from 1935 to 1941, Huybers said. Once the bias introduced by truncation is corrected, warming in the Pacific appears much more homogeneous.
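
A quick back-of-the-envelope simulation (again, not the authors' analysis; the temperature range and sample size are arbitrary) shows why truncation matters: chopping off a roughly uniform tenths digit systematically lowers positive readings by about 0.45 °C on average—a large offset against a century of roughly 1 °C warming.

```python
import math
import random

random.seed(0)

# Simulated "true" readings recorded to tenths of a degree Celsius,
# standing in for the pre-digitization naval records.
true_sst = [round(random.uniform(5.0, 25.0), 1) for _ in range(100_000)]

# Truncation drops the tenths digit (toward zero), as happened when
# the naval deck was digitized to whole degrees Celsius.
truncated = [math.trunc(t) for t in true_sst]

bias = sum(tr - t for tr, t in zip(truncated, true_sst)) / len(true_sst)
print(f"mean bias from truncation: {bias:+.2f} degC")
```

Because the discarded tenths digit averages 0.45, the truncated series reads systematically cold, which mimics a spurious cooling trend as truncated records become more common over time.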

While the Japanese data are key to understanding early-20th-century warming in the Pacific, German data play the most important role in interpreting North Atlantic sea surface temperatures over the same period.

In the late 1920s, German ships provided most of the data in the North Atlantic. A large share of these measurements come from a single deck that, when compared with nearby measurements, runs considerably warm. Adjusting for this bias slows the estimated warming of the North Atlantic.

With these adjustments, the researchers found that warming rates across the North Atlantic and North Pacific become much more similar, yielding a pattern closer to what would be expected from rising greenhouse gas concentrations.

Nevertheless, discrepancies remain, and the overall rate of warming in the measurements is still faster than model simulations estimate.

“Remaining mismatches highlight the importance of continuing to explore how the climate has been radiatively forced, the sensitivity of the climate, and its intrinsic variability. At the same time, we need to continue combing through the data—through data science, historical sleuthing, and a good physical understanding of the problem, I bet that additional interesting features will be uncovered,” stated Huybers.

David I. Berry from the U.K. National Oceanography Centre co-authored the study.

The Harvard Global Institute, the National Science Foundation, and the Natural Environment Research Council supported the research.


