David Eagleman

Livewired


The emotions and sensations in your life are encoded in trillions of signals zipping around in blackness, just as a beautiful screen saver on your computer screen is fundamentally built of zeros and ones.

      Imagine you went to an island of people born blind. They all read by Braille, feeling tiny patterns of inputs on their fingertips. You watch them break into laughter or melt into sobs as they brush over the small bumps. How can you fit all that emotion into the tip of your finger? You explain to them that when you enjoy a novel, you aim the spheres on your face toward particular lines and curves. Each sphere has a lawn of cells that record collisions with photons, and in this way you can register the shapes of the symbols. You’ve memorized a set of rules by which different shapes represent different sounds. Thus, for each squiggle you recite a small sound in your head, imagining what you would hear if someone were speaking aloud. The resulting pattern of neurochemical signaling makes you explode with hilarity or burst into tears. You couldn’t blame the islanders for finding your claim difficult to understand.

      You and they would finally have to acknowledge a simple truth: the fingertip or the eyeball is just the peripheral device that converts information from the outside world into spikes in the brain. The brain then does all the hard work of interpretation. You and the islanders would break bread over the fact that in the end it’s all about the trillions of spikes racing around in the brain—and that the method of entry simply isn’t the part that matters.

      Whatever information the brain is fed, it will learn to adjust to it and extract what it can. As long as the data have a structure that reflects something important about the outside world (along with some other requirements we will see in the next chapters), the brain will figure out how to decode it.

[Image: Sensory organs feed many different information sources to the brain.]

      There’s an interesting consequence to this: your brain doesn’t know, and it doesn’t care, where the data come from. Whatever information comes in, it just works out how to leverage it.

      This makes the brain a very efficient kind of machine. It is a general-purpose computing device. It just sucks up the available signals and determines—nearly optimally—what it can do with them. And that strategy, I propose, frees up Mother Nature to tinker around with different sorts of input channels.

      I call this the Potato Head model of evolution. I use this name to emphasize that all the sensors that we know and love—like our eyes and our ears and our fingertips—are merely peripheral plug-and-play devices. You stick them in, and you’re good to go. The brain figures out what to do with the data that come in.

      As a result, Mother Nature can build new senses simply by building new peripherals. In other words, once she has figured out the operating principles of the brain, she can tinker around with different sorts of input channels to pick up on different energy sources from the world. Information carried by the reflection of electromagnetic radiation is captured by the photon detectors in the eyes. Air compression waves are captured by the sound detectors of the ears. Heat and texture information is gathered by the large sheets of sensory material we call skin. Chemical signatures are sniffed or licked up by the nose or tongue. And it all gets translated into spikes running around in the dark vault of the skull.

[Image: The Potato Head hypothesis: plug in sensory organs, and the brain figures out how to use them.]

      This remarkable ability of the brain to accept any sensory input shifts the burden of research and development of new senses to the exterior sensors. In the same way that you can plug an arbitrary nose or eyes or mouth into Potato Head, nature plugs a wide variety of instruments into the brain for the purpose of detecting energy sources in the outside world.

      Consider plug-and-play peripheral devices for your computer. The importance of the designation “plug-and-play” is that your computer does not have to know about the existence of the XJ-3000 SuperWeb-Cam that will be invented several years from now; instead, it needs only to be open to interfacing with an unknown, arbitrary device and receiving streams of data when the new device gets plugged in. As a result, you do not need to buy a new computer each time a new peripheral hits the market. You simply have a single, central device that leaves its ports open for peripherals to be added in a standardized manner.4
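      To make the analogy concrete, here is a minimal sketch in Python of the plug-and-play idea: a central device that accepts data streams from arbitrary peripherals through one standard interface and processes them all the same way. Every class and method name below is illustrative only; nothing here is drawn from neuroscience or from any real driver API.

# A minimal sketch of the plug-and-play idea: a central "brain" that accepts
# data streams from arbitrary peripherals through one standard interface.
# All names here are illustrative, not taken from the book or any real API.
from typing import Iterable, Protocol


class Peripheral(Protocol):
    """Any sensor that can be 'plugged in': it only has to emit a stream of spikes."""

    def spikes(self) -> Iterable[float]:
        ...


class Eye:
    """Converts photon counts into a spike stream."""

    def __init__(self, photon_counts: list[int]) -> None:
        self.photon_counts = photon_counts

    def spikes(self) -> Iterable[float]:
        # Normalize raw photon counts into firing rates (toy conversion).
        peak = max(self.photon_counts) or 1
        return (count / peak for count in self.photon_counts)


class Ear:
    """Converts air-pressure samples into a spike stream."""

    def __init__(self, pressure_samples: list[float]) -> None:
        self.pressure_samples = pressure_samples

    def spikes(self) -> Iterable[float]:
        # Rectify pressure waves into positive firing rates (toy conversion).
        return (abs(sample) for sample in self.pressure_samples)


class Brain:
    """The central device: it never asks what kind of peripheral sent the data."""

    def __init__(self) -> None:
        self.channels: list[Peripheral] = []

    def plug_in(self, peripheral: Peripheral) -> None:
        self.channels.append(peripheral)

    def process(self) -> list[float]:
        # The same decoding step runs on every channel, whatever its origin.
        return [sum(channel.spikes()) for channel in self.channels]


brain = Brain()
brain.plug_in(Eye(photon_counts=[3, 0, 7, 2]))
brain.plug_in(Ear(pressure_samples=[0.2, -0.5, 0.1]))
print(brain.process())  # one uniform readout per plugged-in channel

      The point of the sketch is only this: Brain.process never branches on what kind of device produced the stream. Once the data arrive in a common format, the source no longer matters, which is the plug-and-play principle in miniature.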

      Viewing our peripheral detectors as individual, stand-alone devices might seem crazy; after all, aren’t thousands of genes involved in building these devices, and don’t those genes overlap with other pieces and parts of the body? Can we really look at the nose, eye, ear, or tongue as a device that stands alone? I dove deep into researching this problem. After all, if the Potato Head model was correct, wouldn’t that suggest that we might find simple switches in the genetics that lead to the presence or absence of these peripherals?

      As it turns out, all genes aren’t equal. Genes unpack in an exquisitely precise order, with the expression of one triggering the expression of the next ones in a sophisticated algorithm of feedback and feedforward. As a result, there are critical nodes in the genetic program for building, say, a nose. That program can be turned on or off.

      How do we know this? Look at what happens when there is a genetic hiccup. Take the condition called arhinia, in which a child is born without a nose. It is simply missing from the face. Baby Eli, born in Alabama in 2015, is completely missing a nose, and he also lacks a nasal cavity or system for smelling.5 Such a mutation seems startling and difficult to fathom, but in our plug-and-play framework arhinia is predictable: with a slight tweak of the genes, the peripheral device simply doesn’t get built.

[Image: Baby Eli was born with no nose.]

[Image: Baby Jordy was born with no eyes; beneath his lids one finds skin.]

      If our sensory organs can be viewed as plug-and-play devices, we might expect to find medical cases in which a child is born with, say, no eyes. And indeed, that is exactly what the condition of anophthalmia is. Consider baby Jordy, born in Chicago in 2014.6 Beneath his eyelids, one simply finds smooth, glossy flesh. While Jordy’s behavior and brain imaging indicate that the rest of his brain is functioning just fine, he has no peripheral devices for capturing photons. Jordy’s grandmother points out, “He will know us by feeling us.” His mother, Brania Jackson, got a special “I love Jordy” tattoo—in Braille—on her right shoulder blade so that Jordy can grow up feeling it.

      Some babies are born without ears. In the rare condition of anotia, children are born with a complete absence of the external portion of the ear.

[Image: A child with no ears.]

      Relatedly, a mutation affecting a single protein leaves the structures of the inner ear absent.7 Needless to say, children with these mutations are completely deaf, because they lack the peripheral devices that convert air pressure waves into spikes.

      Can you be born without a tongue, but otherwise healthy? Sure. That’s what happened to a Brazilian baby named Auristela. She spent years struggling to eat, speak, and breathe. Now an adult, she underwent an operation to put in a tongue, and at present she gives eloquent interviews on growing up tongueless.8

      The extraordinary list of the ways we can be disassembled goes on. Some children are born without any pain receptors in their skin and inner organs, so they are totally insensitive to the sting and agony of life’s lesser moments.9 (At first blush, it might seem as though freedom from pain would be an advantage. But it’s not: children unable to experience pain are covered