Through most of the 20th century, the uniformitarian principle was cherished by geologists as the very foundation of their science. In human experience, temperatures apparently did not rise or fall radically in less than millennia, so the uniformitarian principle declared that such rapid changes had never happened in the past.
If you’re positive something doesn’t exist, you’re not going to look for it, right? And because everyone was certain that global climate changes took at least a thousand years, nobody even bothered to look at the evidence in a way that could reveal faster change. Those Swedish scientists studying the layers of lake bottom clay who first postulated the “rapid” thousand-year onset of the Younger Dryas? They were looking at chunks of mud spanning centuries; they never looked at samples small enough to demonstrate faster change. The proof that the Younger Dryas descended on the Northern Hemisphere much more rapidly than they thought was right in front of their eyes – but they were blinded by their assumptions.
By the 1950s and 1960s, the uniformitarian vise started to lose its hold, or at least change its grip, as scientists began to understand the potential of catastrophic events to produce rapid change. In the late 1950s, Dave Fultz at the University of Chicago built a mock-up of the earth’s atmosphere using rotating fluids that simulated the behavior of atmospheric gases. Sure enough, the fluids moved in stable, repeating patterns – unless, that is, they were disturbed. Then, even the smallest interference could produce massive changes in the currents. It wasn’t proof by a long shot, but it certainly was a powerful suggestion that the real atmosphere was susceptible to significant change. Other scientists developed mathematical models that indicated similar possibilities for rapid shifts.
As new evidence was discovered and old evidence was re-examined, the scientific consensus evolved. By the 1970s there was general agreement that the temperature shifts and climate changes leading into and out of ice ages could occur over mere hundreds of years. Thousands were out, hundreds were in. Centuries were the new “rapid.”
There was a new consensus around when – but a total lack of agreement about how. Perhaps methane bubbled up from tundra bogs and trapped the heat of the sun. Perhaps ice sheets broke off from the Antarctic and cooled the oceans. Maybe a glacier melted into the North Atlantic, creating a massive freshwater lake that suddenly interrupted the ocean’s delivery of warm tropical water to the north.
It’s fitting that hard, cold proof was eventually found in hard, cold ice.
In the early 1970s, climatologists discovered that some of the best records of historic weather patterns were filed away in the glaciers and ice plateaus of northern Greenland. It was hard, treacherous work – if you’re imagining the stereotypical lab rat in a white coat, think again. This was Extreme Sports: Ph.D. – multinational teams trekking across miles of ice, climbing thousands of feet, hauling tons of machines, and enduring altitude sickness and freakish cold, all so they could bore a two-mile core out of the ice. But the prize was a pristine and unambiguous record of yearly precipitation and past temperature, unspoiled by millennia and willing to reveal its secrets with just a little chemical analysis. Once you paid it a visit, of course.
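That “little chemical analysis” is mostly isotope work: colder air deposits snow with a smaller share of heavy oxygen-18, so the oxygen-isotope ratio locked into each annual layer of ice doubles as a thermometer. Here’s a minimal sketch of the idea in Python – the linear calibration below uses illustrative stand-in numbers, not a published Greenland formula:

```python
# Toy sketch: reading temperature from the oxygen-isotope ratio in ice.
# delta-18O is the per-mil deviation of the 18O/16O ratio from a standard
# reference; colder snow carries a more negative delta-18O. The slope and
# intercept below are illustrative assumptions, not a real calibration.

def temperature_from_delta18o(delta18o_permil: float) -> float:
    """Estimate mean annual temperature (deg C) from delta-18O (per mil)."""
    SLOPE = 1.5        # deg C per per-mil of delta-18O (assumed)
    INTERCEPT = 20.0   # deg C offset (assumed)
    return SLOPE * delta18o_permil + INTERCEPT

# A shift from -35 to -40 per mil between nearby layers would register
# as a plunge of about 7.5 deg C in this toy calibration.
print(temperature_from_delta18o(-35.0))  # -32.5
print(temperature_from_delta18o(-40.0))  # -40.0
```

Because each year’s snowfall forms its own layer, counting layers down the core dates the sample, and the chemistry of each layer recovers the temperature – year by year, for over a hundred thousand years.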
By the 1980s, these ice cores definitively confirmed the existence of the Younger Dryas – a severe drop in temperature that began around 13,000 years ago and lasted more than a thousand years. But that was just, well, the tip of the iceberg.
In 1989 the United States mounted an expedition to drill a core all the way to the bottom of the two-mile Greenland ice sheet – representing 110,000 years of climate history. Just twenty miles away, a European team was conducting a similar study. Four years later, both teams got to the bottom – and the meaning of rapid was about to change again.
The ice cores revealed that the Younger Dryas – the final cold spell of the last ice age – ended in just three years. Ice age to no ice age – not in three thousand years, not in three hundred years, but in three plain years. What’s more, the ice cores revealed that the onset of the Younger Dryas took just a decade. The proof was crystal clear this time – rapid climate change was very real. It was so rapid that scientists stopped using the word rapid to describe it and started using words like abrupt and violent. Dr. Spencer Weart summed it up in his 2003 book, The Discovery of Global Warming:
Swings of temperature that scientists in the 1950s believed to take tens of thousands of years, in the 1970s to take thousands of years, and in the 1980s to take hundreds of years, were now found to take only decades.
In fact, there have been around a score of these abrupt climate changes over the last 110,000 years; the only truly stable period has been the last 11,000 years or so. Turns out, the present isn’t the key to the past – it’s the exception.
The most likely suspect for the onset of the Younger Dryas and the sudden return to ice age temperatures across Europe is the breakdown of the ocean “conveyor belt,” or thermohaline circulation, in the Atlantic Ocean. When it’s working normally – or at least the way we’re used to it – the conveyor carries warm tropical water on the ocean surface to the north, where it cools, becomes denser, sinks, and is carried south through the ocean depths back to the Tropics. Under those circumstances, Britain is temperate even though it’s on the same latitude as much of Siberia. But when the conveyor is disrupted – say, by a huge influx of fresh water from melting glacial ice – things change fast. Fresh water is less dense than salt water, so it floats on the surface instead of sinking, stalling the engine that pulls warm tropical water north. A shutdown like that could have a significant impact on global climate and turn Europe into a very, very cold place.
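A back-of-the-envelope calculation shows why freshness matters so much. The sketch below uses a simplified linear equation of state for seawater density – the reference values and expansion coefficients are rough, illustrative numbers, not measurements:

```python
# Toy illustration of why a freshwater influx can stall the conveyor.
# Simplified linear equation of state for seawater density:
#   rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
# All constants below are rough, illustrative values.

RHO0 = 1027.0          # reference density (kg/m^3)
T0, S0 = 10.0, 35.0    # reference temperature (deg C) and salinity (psu)
ALPHA = 1.7e-4         # thermal expansion coefficient (per deg C), approximate
BETA = 7.6e-4          # haline contraction coefficient (per psu), approximate

def density(temp_c: float, salinity_psu: float) -> float:
    """Approximate seawater density via a linear equation of state."""
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity_psu - S0))

# Cold, salty North Atlantic surface water: dense enough to sink.
salty = density(temp_c=2.0, salinity_psu=35.0)      # ~1028.4 kg/m^3

# The same water diluted by glacial meltwater: lighter, so it floats,
# and the sinking limb that drives the conveyor shuts down.
freshened = density(temp_c=2.0, salinity_psu=30.0)  # ~1024.5 kg/m^3

print(f"salty surface water: {salty:.2f} kg/m^3")
print(f"freshened water:     {freshened:.2f} kg/m^3")
```

In this toy calculation, a five-unit drop in salinity outweighs all the density the water gains by cooling to near freezing – the surface water simply stops sinking, and with it stops the delivery of tropical warmth to the north.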
Just before the Younger Dryas, our European ancestors were doing pretty well. Tracing human migration through DNA, scientists have documented a population explosion in Northern Europe as peoples who had long ago migrated north out of Africa now moved north again, into areas of Europe that had been uninhabitable during the last ice age (before the Younger Dryas). The average temperature was nearly as warm as it is today, grasslands flourished where glaciers had once stood, and human beings thrived.
And then the warming trend that had persisted since the end of the last ice age kicked rapidly into reverse. In just a decade or so, average yearly temperatures plunged nearly thirty degrees Fahrenheit. Sea levels dropped by hundreds of feet as water froze and stayed in the ice caps. Forests and grasslands went into a steep decline. Sea ice stretched for hundreds of miles off the coastlines. Icebergs were common as far south as Spain and Portugal. The great, mountainous glaciers marched south again. The Younger Dryas had arrived, and the world was changed.
Though humanity would survive, the short-term impact, especially for those populations that had moved north, was devastating. In less than a generation, virtually every learned method of survival – from the shelters they built to the hunting they practiced – was inadequate. Many thousands of humans almost certainly froze or starved to death. Radiocarbon dating from archaeological sites shows a sharp drop-off in settlements and other human activity – clear evidence that the human population of Northern Europe went into steep decline.
But humans clearly survived; the question is, how? Certainly some of our success was due to social adaptation – many scientists think that the Younger Dryas helped to spur the collapse of hunter-gatherer societies and the first development of agriculture. But what about biological adaptation and natural selection? Scientists believe some animals perfected their natural ability to survive cold spells during this period – notably the wood frog, which we’ll return to later. So why not humans? Just as the European population may have “selected” the hemochromatosis gene because it helped its carriers withstand the plague, might some other genetic trait have provided its carriers with superior ability to withstand the cold? To answer that, let’s take a look at the effect of cold on humans.
Immediately upon his death in July 2002, baseball legend Ted Williams was flown to a spa in Scottsdale, Arizona, checked in, and given a haircut, a shave, and a cold plunge. Of course, this wasn’t your typical Arizona spa – this was the Alcor Life Extension cryonics lab, and Williams was checking in for the foreseeable future. According to his son, he hoped that future medical science might be able to restore him to life.
Alcor separated Williams’s head from his body, drilled a couple of dime-size holes in it, and froze it in a bucket of liquid nitrogen at minus 320 degrees Fahrenheit. (His body got its own cold storage container.) Alcor brochures suggest that “mature nanotechnology” might