In 1958, at a remote location on Mauna Loa, Hawaii, Charles Keeling began taking detailed measurements of atmospheric carbon dioxide (CO2) levels. Using an instrument he had developed as a postdoctoral fellow at the California Institute of Technology (Caltech), Keeling was the first to take accurate measurements of CO2 over an extended period of time, long enough to detect both short- and long-term trends in CO2 levels. What he found would change the world.


But what were CO2 levels like before Keeling started taking measurements? We now know from the examination of tiny air bubbles trapped in polar ice that the concentration of CO2 in the atmosphere for much of the Holocene epoch (the last 11,000 years; about as long as humans have been forming settlements) has been between 275 and 285 ppm. CO2 levels began to increase noticeably by the beginning of the 19th century, when the industrial revolution was in full swing in Europe and North America. The strong and consistent relationship between the release of carbon dioxide through the burning of fossil fuels (first coal, then oil and natural gas) and the global rise in atmospheric CO2 is undeniable.
Perhaps most unsettling is that CO2 levels aren’t simply rising; they appear to be rising at an ever-faster rate. According to the National Oceanic and Atmospheric Administration (NOAA), CO2 levels rose by an average of about 0.8 ppm per year between 1960 and 1970. Just two decades later, CO2 levels were increasing by about 1.6 ppm per year. And more recently, between 2000 and 2016, CO2 levels rose by 2 to 3 ppm annually. The trend is clear: CO2 levels are rising fast, and will likely continue to rise even faster in the future.
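
To get a rough sense of how those growth rates compound, here is a minimal back-of-the-envelope sketch in Python. The 1960 baseline of roughly 317 ppm and the fill-in rates for the decades not mentioned above are assumptions, so treat the output as an illustration rather than a reconstruction of the actual record.

```python
# Illustrative compounding of the approximate growth rates cited above.
# Starting value and the rates for unmentioned decades are assumptions.

segments = [
    (1960, 1970, 0.8),   # ~0.8 ppm/yr (cited for 1960-1970)
    (1970, 1980, 1.2),   # assumed transitional rate, not from the text
    (1980, 1990, 1.6),   # ~1.6 ppm/yr "two decades later"
    (1990, 2000, 1.8),   # assumed transitional rate, not from the text
    (2000, 2016, 2.2),   # within the cited 2-3 ppm/yr range
]

ppm = 317.0  # approximate Mauna Loa annual mean around 1960
for start, end, rate in segments:
    ppm += (end - start) * rate
    print(f"{end}: roughly {ppm:.0f} ppm")
# Ends in the neighborhood of 400 ppm by 2016, broadly consistent
# with the observed record.
```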

Although CO2 makes up only a tiny fraction of the atmosphere (about 0.041 percent as it turns out), it – like methane, water vapor, and ozone – is a greenhouse gas, capable of absorbing and re-emitting heat energy and thereby warming the atmosphere. Some of the solar radiation that reaches the surface of the Earth is reflected directly back into the atmosphere, and some is absorbed by the Earth and re-released as longer-wave heat energy. Greenhouse gases in the atmosphere can “trap” some of this radiation, which would otherwise escape back out into space.
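
To see how much that trapping matters, consider the textbook zero-dimensional energy-balance estimate (a standard calculation, not something from this article). The sketch below assumes round numbers for the solar constant and Earth’s reflectivity; it shows that without any greenhouse effect, the surface would settle at its effective radiating temperature of roughly 255 K, about 33 K colder than the observed global mean.

```python
# Zero-dimensional energy-balance estimate of Earth's effective temperature.
# Assumed round numbers: solar constant ~1361 W/m^2, planetary albedo ~0.30.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361  # incoming solar flux at Earth's distance, W m^-2
ALBEDO = 0.30          # fraction of sunlight reflected straight back to space

# Absorbed sunlight averaged over the whole sphere (factor of 4 for geometry)
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Temperature at which a blackbody radiates that much energy back to space
t_effective = (absorbed / SIGMA) ** 0.25

t_observed = 288.0  # approximate global mean surface temperature, K
print(f"Effective temperature without greenhouse gases: {t_effective:.0f} K")
print(f"Observed mean surface temperature: {t_observed:.0f} K")
print(f"Warming attributable to greenhouse gases: about "
      f"{t_observed - t_effective:.0f} K")
```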

While greenhouse gases help keep the surface of the Earth warm enough to sustain life (we wouldn’t be here without them!), too much of a good thing can be disastrous. One only needs to look at our planetary neighbor, Venus, to see what happens to a planet with a runaway greenhouse effect. Venus’ atmosphere is about 96 percent carbon dioxide. Much of the solar energy that reaches Venus is trapped within the planet’s dense atmosphere. As a result, the surface of Venus can reach temperatures of nearly 900 °F (480 °C) – hot enough to melt lead. Mercury, for comparison, is much closer to the sun but has only a trace atmosphere; its mean surface temperature at the equator is only about 167 °C (332 °F). The presence of an atmosphere makes a huge difference! As does the amount of greenhouse gases in that atmosphere.
As we continue to release more CO2 into the atmosphere, we can expect global temperatures to continue to rise. Earlier this year, NASA reported that the average surface temperature in 2016 was about 1.8 °F (1 °C) warmer than the mid-20th century mean. Since the late 1970s, the average global surface temperature has risen by about 0.28 °F (0.16 °C) per decade. The three hottest years on record are 2016, 2015, and 2014.

Clearly, we humans need to be careful about how much CO2 we release into the atmosphere. As we move into uncharted territory – with CO2 levels higher than at any other time since the dawn of our civilization – it is difficult to predict with any certainty what will happen. What is certain, however, is that our planet is changing quickly, even within the span of our (geologically) brief lifetimes.