In my first article in this series, “The History of Climate Science,” I began with the discovery of infrared radiant energy in 1800. In 1859, John Tyndall demonstrated how even small amounts of CO2 absorb radiant energy and warm the atmosphere. By 1900, Svante Arrhenius was making the first attempts at modeling the effects of CO2 on the temperature of the atmosphere. Let’s pick up from there.

Photo Credit: NASA Goddard Space Flight Center / Flickr

Progress in climate science (U of Chicago, highly recommended link) and technology (MIT, highly recommended link) has brought the picture of global warming, attribution (causes) and climate change into much sharper focus. (Both links in this paragraph take you to a series of university lectures covering the science and technology of climate research, from two of the world’s leading universities.)

So far, it appears that our species is the only organism capable of forecasting based upon generalizations. This gave our hunter-gatherer forebears an edge in finding food and avoiding predators. We take this for granted because we do it constantly, in nearly every activity. We use approximations based upon experience to predict outcomes. When we drive a car, we predict our ability to make a curve and adjust our speed accordingly, despite not knowing exactly what the maximum safe speed might be. We plan a trip on our assumptions and estimated requirements. We plan for retirement based upon estimates of future inflation, health and lifestyle. None of these are precisely true, but our world would collapse if we lost our ability to forecast based upon experience and estimates (a.k.a. models).

In 1976, statistician George E. P. Box wrote, “All models are wrong but some are useful.” Today we model everything from market strategies to strategic military scenarios to weather forecasting. How do we ensure their usefulness?

Hurricane Isabel (2003) as seen from orbit during Expedition 7 of the International Space Station. (Photo Credit: Mike Trenchard, Earth Sciences & Image Analysis Laboratory, Johnson Space Center.)

Box goes on to explain that a model can never exactly represent reality, but it can provide a useful approximation. He illustrated this with the commonly used formula PV = RT (pressure P multiplied by volume V is equal to a constant R multiplied by the temperature T of an “ideal” gas). This is not exactly true for any real gas, but it provides a useful approximation for understanding the general behavior of gas molecules.
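To make Box’s point concrete, here is a minimal Python sketch comparing the ideal-gas prediction with the more realistic van der Waals equation for one mole of CO2. (The van der Waals constants are standard textbook values; the volumes are arbitrary choices for illustration.)

import math  # not strictly needed; kept minimal

R = 0.082057           # gas constant, L·atm / (mol·K)
a, b = 3.592, 0.04267  # van der Waals constants for CO2

def pressure_ideal(n, V, T):
    """The 'wrong but useful' model: ignores molecular size and attraction."""
    return n * R * T / V

def pressure_vdw(n, V, T):
    """Corrects for molecular volume (b) and intermolecular attraction (a)."""
    return n * R * T / (V - n * b) - a * (n / V) ** 2

for V in (22.4, 2.0, 0.5):  # liters; smaller volume = denser gas
    p_i = pressure_ideal(1.0, V, 273.15)
    p_v = pressure_vdw(1.0, V, 273.15)
    print(f"V = {V:4.1f} L: ideal {p_i:6.2f} atm, van der Waals {p_v:6.2f} atm")

At ordinary densities the two agree to within a fraction of a percent; squeeze the gas hard enough and the simple model breaks down. Useful, within its limits.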

The question that remains is this: how much confidence should we have in climate scenario models that can never be 100 percent accurate?

This brings us to several relevant research principles. The first is accuracy and precision. The second is how much resolution (focus or detail) we need in order to be useful. The third is often referred to as QAQC, or quality assurance and quality control. These three factors are critical, but the outcome can still be bogus if due diligence in any one of them is incomplete. This brings us to the axiom “garbage in = garbage out.” The data we put into the equation must be authentic and relevant.

A good metaphor for accuracy is hitting the target. Precision refers to how tight the grouping is after identical, repeated procedures. If we do a, b and c, do we always get very close to “d,” or do we get diverse results? If we always group around “d,” then our accuracy and precision are good. If we get q, m or z, we need to take better aim.
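In code, the target metaphor becomes concrete: accuracy is how far the average result sits from the true value (bias), and precision is how tightly repeated results cluster (spread). A minimal Python sketch, using made-up measurement sets:

import statistics

TRUE_VALUE = 10.0  # the bullseye

# Hypothetical repeated readings from two instruments.
precise_but_biased = [12.1, 12.0, 11.9, 12.1, 12.0]  # tight group, off target
accurate_but_noisy = [8.5, 11.6, 9.9, 10.4, 9.7]     # centered, scattered

def describe(name, readings):
    bias = statistics.mean(readings) - TRUE_VALUE  # accuracy: distance from target
    spread = statistics.stdev(readings)            # precision: tightness of group
    print(f"{name}: bias = {bias:+.2f}, spread = {spread:.2f}")

describe("precise but biased", precise_but_biased)
describe("accurate but noisy", accurate_but_noisy)

A good measurement program, like a good model, needs both: small bias and small spread.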

Resolution is how clear or how big the target is. In some ways the climate is less difficult to model than daily weather. Weather forecasting models attempt to hit a precise location over a very short interval of time, where the variables change by the minute. Climate science looks at many decades, where patterns are far more consistent and changes usually occur more slowly.
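One way to see why decade-scale averages are more forgiving than tomorrow’s forecast is to simulate a noisy daily signal riding on a slow trend. The numbers in this toy sketch are invented: heavy day-to-day noise (weather) on top of a small warming trend (climate).

import random
import statistics

random.seed(42)  # reproducible toy example

# Invented daily anomalies: noise with a 3.0 °C standard deviation
# on top of a slow 0.02 °C-per-year warming trend, for 30 years.
daily = [0.02 * (day / 365) + random.gauss(0, 3.0)
         for day in range(30 * 365)]

print(f"a single day: {daily[0]:+.2f} °C")  # dominated by noise

# Decade-long averages smooth the noise and expose the trend.
for d in range(3):
    decade = daily[d * 3650:(d + 1) * 3650]
    print(f"decade {d + 1} mean: {statistics.mean(decade):+.3f} °C")

Any single day is unpredictable, but the decade averages line up with the underlying trend. That is the sense in which long-term patterns are easier targets than a daily forecast.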

The best explanation of Quality Assurance and Quality Control (QAQC) came from a U.S. EPA colleague of mine, Roy Jones, a Johns Hopkins graduate who did risk assessment for EPA Region 10 in Seattle. He summed up QAQC like this: “Quality assurance is doing the right thing. Quality control is doing the right thing right.”

Just as important as the amount of data is its quality: it must be authentic, meaning it is what it is reported to be, and relevant, meaning it is directly applicable to the task at hand.

Climate models are among the most statistically difficult approximations humans have ever attempted. They demand almost unimaginable number-crunching capacity, because to be useful, trillions of data points have to be entered. The climate is not static, so the data are constantly being updated and re-crunched.

The first models were built from incomplete data taken at surface level. Later, balloons were incorporated, but they drifted. The resolution (detail) of our models was like trying to make a mural from a one-megapixel camera. Today we have dozens of satellites measuring continually from the Earth’s surface to the top of the atmosphere, and supercomputers capable of trillions of floating-point operations per second. Even so, those supercomputers take months to crunch the vast amounts of data.

Over the decades since 1958 and the International Geophysical Year, the first coordinated geophysical study of the entire planet, the resolution of climate change models has come into clearer focus. Questions about the influence of Earth’s orbit, the wobble of its axis, volcanic activity and even bovine digestion have been studied and factored into the calculus of climate modeling.

Climate scientists are warning about irreversible catastrophe just over the time horizon. Why should we alter the course of our entire global civilization? Are these models, scenarios and forecasts accurate enough to be useful?

The Earth, as projected on NOAA’s Science on a Sphere. (Photo Credit: Rick Baraff)

We have a far better understanding of climate history than you might imagine. We know climate history by two methods. The first can be learned directly from human records; that history gets cloudier the further back we go. The second method is through “proxies”: non-human references such as tree rings, ice core analyses and geophysical and chemical records. Researchers take this historical data and enter it into the model to see if the model predicts a later known condition. The better the correlation between the known record and the model results, the more confidence we have in the model.
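This validation technique is often called “hindcasting.” A minimal sketch of the idea in Python, with invented numbers standing in for a proxy reconstruction and a model’s output over the same historical periods (statistics.correlation requires Python 3.10 or later):

import statistics

# Invented temperature anomalies (°C): a proxy reconstruction versus
# what a model "hindcasts" for the same eight historical periods.
proxy_record   = [-0.30, -0.20, -0.25, -0.10, 0.00, 0.15, 0.30, 0.45]
model_hindcast = [-0.28, -0.22, -0.20, -0.12, 0.05, 0.10, 0.33, 0.41]

# Pearson correlation: r near 1.0 means the model tracks the record closely.
r = statistics.correlation(proxy_record, model_hindcast)
print(f"proxy vs. hindcast correlation: r = {r:.3f}")

The closer the correlation comes to 1, the more confidence the model earns; a real validation would also examine the size of the errors, not just how well they track.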

There is a principle of investigation called Occam’s razor, which suggests that after all known explanations have been investigated, the simplest solution tends to be the best answer. Science has attempted to explain global warming and the resulting climate change by looking at Earth’s orbit, volcanic activity and every other plausible cause, but the clearest and simplest explanation is anthropogenic greenhouse emissions from burning fossil fuels. No other explanation, or combination of explanations, comes close to accounting for today’s warming planet.

Excessive CO2 is warping Earth’s energy balance and warming things up. The relationship between greenhouse gases and heat has been studied and known for over 200 years.
