not the sort that The Sentence was meant to describe. No, the variety of future that we human beings manufacture–and that only we manufacture–is of another sort entirely.
The Ape That Looked Forward
Adults love to ask children idiotic questions so that we can chuckle when they give us idiotic answers. One particularly idiotic question we like to ask children is this: ‘What do you want to be when you grow up?’ Small children look appropriately puzzled, worried perhaps that our question implies they are at some risk of growing down. If they answer at all, they generally come up with things like ‘the candy guy’ or ‘a tree climber’. We chuckle because the odds that the child will ever become the candy guy or a tree climber are vanishingly small, and they are vanishingly small because these are not the sorts of things that most children will want to be once they are old enough to ask idiotic questions themselves. But notice that while these are the wrong answers to our question, they are the right answers to another question, namely, ‘What do you want to be now?’ Small children cannot say what they want to be later because they don’t really understand what later means.7 So, like shrewd politicians, they ignore the question they are asked and answer the question they can. Adults do much better, of course. When a thirtyish Manhattanite is asked where she thinks she might retire, she mentions Miami, Phoenix or some other hotbed of social rest. She may love her gritty urban existence right now, but she can imagine that in a few decades she will value bingo and prompt medical attention more than art museums and squeegee men. Unlike the child who can only think about how things are, the adult is able to think about how things will be. At some point between our high chairs and our rocking chairs, we learn about later.8
Later! What an astonishing idea. What a powerful concept. What a fabulous discovery. How did human beings ever learn to preview in their imaginations chains of events that had not yet come to pass? What prehistoric genius first realized that he could escape today by closing his eyes and silently transporting himself into tomorrow? Unfortunately, even big ideas leave no fossils for carbon dating, and thus the natural history of later is lost to us forever. But palaeontologists and neuroanatomists assure us that this pivotal moment in the drama of human evolution happened sometime within the last 3 million years, and that it happened quite suddenly. The first brains appeared on earth about 500 million years ago, spent a leisurely 430 million years or so evolving into the brains of the earliest primates, and another 70 million years or so evolving into the brains of the first protohumans. Then something happened–no one knows quite what, but speculation runs from the weather turning chilly to the invention of cooking–and the soon-to-be-human brain experienced an unprecedented growth spurt that more than doubled its mass in a little over two million years, transforming it from the one-and-a-quarter-pound brain of Homo habilis to the nearly three-pound brain of Homo sapiens.9
Now, if you were put on a hot-fudge diet and managed to double your mass in a very short time, we would not expect all of your various body parts to share equally in the gain. Your belly and buttocks would probably be the major recipients of newly acquired flab, while your tongue and toes would remain relatively svelte and unaffected. Similarly, the dramatic increase in the size of the human brain did not democratically double the mass of every part so that modern people ended up with new brains that were structurally identical to the old ones, only bigger. Rather, a disproportionate share of the growth centred on a particular part of the brain known as the frontal lobe, which, as its name implies, sits at the front of the head, squarely above the eyes (see figure 2). The low, sloping brows of our earliest ancestors were pushed forward to become the sharp, vertical brows that keep our hats on, and the change in the structure of our heads occurred primarily to accommodate this sudden change in the size of our brains. What did this new bit of cerebral apparatus do to justify an architectural overhaul of the human skull? What is it about this particular part that made nature so anxious for each of us to have a big one? Just what good is a frontal lobe?
Until fairly recently, scientists thought it was not much good at all, because people whose frontal lobes were damaged seemed to do pretty well without them. Phineas Gage was a foreman for the Rutland Railroad who, on a lovely autumn day in 1848, ignited a small explosion in the vicinity of his feet, launching a three-and-a-half-foot-long iron rod into the air, which Phineas cleverly caught with his face. The rod entered just beneath his left cheek and exited through the top of his skull, boring a tunnel through his cranium and taking a good chunk of frontal lobe with it (see figure 3). Phineas was knocked to the ground, where he lay for a few minutes. Then, to everyone’s astonishment, he stood up and asked if a coworker might escort him to the doctor, insisting all the while that he didn’t need a ride and could walk by himself, thank you. The doctor cleaned some dirt from his wound, a coworker cleaned some brain from the rod, and in a relatively short while, Phineas and his rod were back about their business.10 His personality took a decided turn for the worse–and that fact is the source of his fame to this day–but the more striking thing about Phineas was just how normal he otherwise was. Had the rod made hamburger of another brain part–the visual cortex, Broca’s area, the brain stem–then Phineas might have died, gone blind, lost the ability to speak or spent the rest of his life doing a convincing impression of a cabbage. Instead, for the next twelve years, he lived, saw, spoke, worked and travelled so uncabbagely that neurologists could only conclude that the frontal lobe did little for a fellow that he couldn’t get along nicely without.11 As one neurologist wrote in 1884, ‘Ever since the occurrence of the famous American crowbar case it has been known that destruction of these lobes does not necessarily give rise to any symptoms.’12
But the neurologist was wrong. In the nineteenth century, knowledge of brain function was based largely on the observation of people who, like Phineas Gage, were the unfortunate subjects of one of nature’s occasional and inexact neurological experiments. In the twentieth century, surgeons picked up where nature left off and began to do more precise experiments whose results painted a very different picture of frontal lobe function. In the 1930s, a Portuguese physician named António Egas Moniz was looking for a way to quiet his highly agitated psychotic patients when he heard about a new surgical procedure called frontal lobotomy, which involved the chemical or mechanical destruction of parts of the frontal lobe. This procedure had been performed on monkeys, who were normally quite angry when their food was withheld, but who reacted to such indignities with unruffled patience after experiencing the operation. Egas Moniz tried the procedure on his human patients and found that it had a similar calming effect. (It also had the effect of winning Egas Moniz the Nobel Prize for Medicine in 1949.) Over the next few decades, surgical techniques were improved (the procedure could be performed under local anaesthesia with an ice pick) and unwanted side effects (such as lowered intelligence and bed-wetting) were diminished. The destruction of some part of the frontal lobe became a standard treatment for cases of anxiety and depression that resisted other forms of therapy.13 Contrary to the conventional medical wisdom of the previous century, the frontal lobe did make a difference. The difference was that some people seemed better off without it.
But while some surgeons were touting the benefits