race. Criminals and the mentally ill were discouraged from having children, as it was believed their negative traits would muddy the gene pool. Eugenics, infected as it was by a number of unsettling racist overtones, reached a deplorable climax in Nazi Germany as the ideological drive behind Hitler’s Final Solution.
The aftermath of the Second World War dealt a fatal blow to the ugly science, and a grievous — though not quite deadly — wound to the concept of genetic predestination. Eager to distance themselves from the horrors of Nazism, some scientists promptly embraced eugenics’ polar opposite: behaviourism. The discipline, which had existed as a philosophy for centuries and was forged into its modern form by Ivan Pavlov at the start of the 20th century, was championed by noted psychologist B.F. Skinner, whose “radical behaviourism” proposed that human beings were little more than highly complex machines programmed by a series of external stimuli. As supporters of nature worked desperately to scrub the swastika-shaped stains from their banner, Skinner waved the nurture flag proudly.
Behaviourism was later eclipsed by the cognitive sciences, which adopted the same “mind is machine” metaphor, but drew the opposite conclusion. The mind was indeed a machine, they argued, and human behaviour was preprogrammed into it by genes.[3]
Though nature and nurture each retain a small collection of loyalists, the battle has, in recent years, become more illusion than reality. Modern academic opinion has, for the most part, conceded the importance of both factors. This is not to say that there is unanimous approval of a single theory. However, the developmental sciences have undergone a paradigm shift. The question is no longer whether nature trumps nurture or vice versa; rather, it is how the two variables interact to produce a unique individual.
Family Resemblance
Newspaper headlines regularly trumpet the discovery of the gene for this or that, hinting to the average reader that, with just a little more research, everything from obesity to alcoholism will be miraculously cured by the tweak of a few key nucleotides. Sadly, this is not the case. A closer reading of such articles reveals a more mundane truth. The so-called “gene for drug use” or “gene for aggression” or “gene for the obsessive collection of Elvis memorabilia”[4] does not apply a fatalistic tag to the individual, dooming them to a life of addiction or anger or hoarding ceramic figurines of the King. It can, at best, only predict one’s susceptibility to this kind of behaviour. And even then, there are other factors to consider.
The idea conveyed by the phrase “gene for X” is that a brief series of nucleotides — the tiny molecules that comprise DNA’s four-letter alphabet — commands an organism to develop a certain trait. By deleting this sequence or changing the order of the letters, one could remove the aberrant trait or replace it with something more desirable. This concept raises suspicions among a number of observers outside the scientific field. “This may be true for something relatively straightforward, like eye colour or height,” they might say, “but surely complex psychological traits like greed or anger cannot be the result of a single poorly worded genetic phrase.” A well-reasoned argument, but it is only half right. In truth, even those seemingly simple physiological traits arise from both environmental and genetic influences.
Certainly, some traits seem more genetically determined than others. When we see several generations of a single family gathered together, we often notice certain similarities between its members. Perhaps a majority of them have the same freckled skin. Or the same green eyes. Or the same stubby fingers. Maybe we spot a family resemblance in their high cheekbones or aquiline noses. Or we note that none of the adults are shorter than five foot eleven. However, even a brief observation will turn up differences as well. The grandfather sits at the kitchen table and delivers an impassioned argument to his youngest daughter, who responds in kind. Meanwhile, her older sister and her mother sit two chairs down, fidgeting awkwardly with the cutlery and sharing nervous glances, uneasy about the heated tone the conversation has taken. Among the youngest generation, a boy of about six dives off the couch and onto an easy chair while a girl, two years his senior, whines at him to stop before he hurts himself. Another girl, this one only four, stands in the corner and scribbles on the wallpaper with a crayon while her cousin, also four, watches her nervously, wondering whether or not he should tell the adults.
Were an outside observer asked to label which of the family’s traits were genetically determined, they would without hesitation point out the green eyes, the freckles, and the height. More astute individuals would likely also mention the nose or the cheekbones or the stubby fingers. But most would hesitate to attribute a genetic link to the argumentative dispositions of the grandfather and his daughter, or the awkward brooding of the mother-daughter combo two seats down, or the devil-may-care bravado of the couch-leaper and the wall-scribbler. We tend to see these behaviours as less genetically motivated than something like eye colour. After all, one cannot educate a child taller or discipline green eyes brown. However, to divide traits into genetically determined and environmentally determined compartments is to misunderstand how genes work.
Consider hair colour, a trait that, on the surface, seems to be determined solely by a person’s genes. A child’s hair is seldom a colour that does not have some familial precedent. By contrast, the influence of the environment on one’s hair colour seems nonexistent. Blonde Nordic children adopted by Chinese families do not spontaneously develop black hair. However, this does not mean genes alone are responsible for a person’s hair colour. After all, genes can really only do one thing: instruct cells, by way of an interpreter called RNA, to string together a series of amino acids, which link up to form proteins. Now, this one function is extremely, unbelievably important. Proteins are the body’s proletariat, the workers who carry out the myriad tasks which allow us, the society in which they dwell, to function. But genes cannot, on their own, dictate the colour of a person’s hair. Hair colour is determined by melanin, which is the end product of the amino acid tyrosine. Now, genes do code for tyrosine, hence the genetic influence. However, in hair, the degree of melanin accumulation is decided in part by the concentration of copper in the cells producing that hair. When those cells contain more copper, the hair is darker. Should the intake of copper drop below a certain threshold, hair generated by the same follicle will be lighter than it was previously, when copper supplies were plentiful.
Similar factors are responsible for every human trait imaginable. The reason height seems to be determined solely by genetics is that, thankfully, just about everyone in the first world receives the base nutritional intake necessary for those genes to take effect. Likewise, most people get enough copper in their diet, as it can be found in a wide range of dietary staples, including fish, whole grains, nuts, potatoes, leafy greens, dried fruit, cocoa, black pepper, and yeast. Because it is almost universally consumed in sufficient quantities, copper’s contribution to hair colour goes largely unnoticed. Somewhat paradoxically, the ubiquity of its influence renders it invisible. Such is the case with thousands of environmental factors we take for granted. It isn’t until a radical change in the environment depletes once-plentiful resources that we realize how much those resources contributed to our development. In the words of Joni Mitchell, you don’t know what you’ve got ’til it’s gone.
A dramatic example of this occurred in Europe in October of 1944. The tide of war had turned against the Germans, who found their once seemingly invincible army forced back on all sides. Allied forces had reclaimed the southern part of the Netherlands, but the Germans maintained control of the rest. In an attempt to demoralize the Dutch, who had been emboldened by the partial liberation of their country and threatened a violent uprising, the Germans placed an embargo on all food supplies heading into the country and flooded the surrounding fields, spoiling the season’s harvest. To make matters worse, November brought the start of a very harsh winter. The Dutch canals froze solid, thwarting Allied attempts to ship in supplies by barge. Thus began the Dutch Hunger Winter, a devastating famine that lasted into the spring of 1945 and was responsible for some 18,000 deaths from illness and starvation. Though the famine’s toll was harsh and immediately felt, the full brunt of its impact did not appear until long after the embargo was lifted and food supplies returned.
Almost immediately after it ended, researchers saw in the Hunger Winter the potential for a large-scale natural experiment. A population of well-fed people with documented medical histories had undergone severe malnutrition for a precisely delineated amount of time, and