It’s beginning to feel a bit like piling on to highlight the latest scientific study reinforcing the notion that byproducts of industrialization are causing our atmosphere to warm unnaturally.
But today’s news is noteworthy in that a) it comes from the National Science Foundation, not exactly a loony left-wing tree-hugging group, if you know what we mean, and b) it uses glacial ice cores, tree rings and lake sediments, along with computer simulations, to look back at the Arctic’s past climate, down to a decade-by-decade scale, going back 2,000 years. Previously, a climate record this fine-grained went back only about 400 years.

Scientists take sediment core in Alaska. Photo courtesy Darrell Kaufman, Arizona State University
Now, Dateline Earthers were reporting as early as 2003 on how global warming was already affecting the Pacific Northwest. But even today there are those who want to discount the notion that carbon dioxide and other greenhouse gases are enhancing the greenhouse effect.
The NSF study, though, traces temperatures in the Arctic, showing that they had actually grown steadily colder for 19 centuries, right up until the last century, and why they should have kept getting colder but for greenhouse gases emitted by modern machinery, farming methods, yadda yadda.
You see, the Earth’s trip around the sun is not a perfectly steady thing. No, the planet actually wobbles a little bit on its axis as it zooms around the sun. And when researchers trace the effects of that wobble, they can see why the Arctic started getting colder. What they can’t see, without manmade global warming, is why that trend reversed over the last 50 years or so.
USA Today’s Doyle Rice captured the essence:
The Arctic’s gradual cooling trend is due to a wobble in the tilt of the Earth, which, over the last 7,000 years, has shifted the Earth’s closest pass by the sun from September to January. This reduces the intensity of the sunlight that reaches the Arctic in the summer and has caused noticeably cooler summers over the past several centuries. That is, until the effects of global warming took over.
It’s been known for some time that Europe and North America hit some pretty cold times in the last thousand years or so, with three especially cold periods centered around 1650, 1770 and 1850. It’s known as the Little Ice Age.
What the NSF study — published today in Science — is telling us is that we really should have kept heading into an ice age.
So how do the tree rings, sediment cores and ice cores help fill in the picture? NSF’s press release explains that they help trace what actually happened to Arctic temperatures over those many centuries:
These reconstructions were based on evidence provided by sediments from Arctic lakes, including algal abundance, which reflects the length of the growing season, and the thickness of annually deposited sediment layers, which increases during warmer summers when deposits from glacial melt-water increase. The Kaufman et al. study also incorporated previously published data from glacial ice and tree rings that was calibrated against the instrumental temperature record.
Translated, the “instrumental temperature record” means temperatures as they were actually measured by thermometers over the last few hundred years. Scientists can see what happened to the algae, tree rings and so forth during the years when we could measure the temperature directly. Then the researchers in this study looked at what happened to those biological indicators over the last 2,000 years, and could reconstruct what happened with Arctic temperatures.
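For readers who like the nuts and bolts, that calibration step boils down to fitting proxy measurements against thermometer readings over the period when both exist, then applying the fitted relationship backward in time. Here’s a minimal sketch in Python; the numbers and the simple least-squares fit are entirely ours for illustration, not the study’s actual method or data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Overlap period: hypothetical annual sediment-layer thickness (mm)
# alongside thermometer-measured summer temperatures (deg C).
proxy_recent = [1.2, 1.5, 1.1, 1.8, 1.6]
temps_measured = [8.1, 8.9, 7.9, 9.6, 9.1]

# Calibrate: thicker layers should line up with warmer summers.
a, b = fit_line(proxy_recent, temps_measured)

# Pre-instrumental period: only proxy values survive, so apply the
# calibration to estimate the temperatures nobody measured.
proxy_old = [0.9, 1.0, 1.3]
temps_reconstructed = [a * x + b for x in proxy_old]
```

The real study juggles multiple proxy types and careful uncertainty estimates, but the core idea is the same: the modern overlap anchors the old record.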
OK, maybe that’s piling on, but with the widespread misinformation about what these studies show, we feel obligated to point out the most powerful pieces of evidence. This was one.
— Robert McClure