Bo Bennett

Reason: Book I



when “common sense” is anything but common, and often in conflict with reality. Further, what might be expressed as “personal freedoms” to reject the scientific consensus on certain issues can have a devastating impact on others. For these reasons and more, the problem of choosing “common sense” over scientific consensus is a serious one with far-reaching consequences.

      “Common Sense” and “Intuitions” Often Contradict Reality

      What is generally referred to as “common sense” is a subjective sense of our own knowledge. In psychology, there is a phenomenon known as the Dunning-Kruger Effect: the illusion that we are smarter than we actually are because we lack the cognitive capacity to recognize our own ineptitude. This is only one of hundreds of known cognitive biases (deviations from rationality or reason) that virtually assure that our perception of reality is greatly skewed and often incorrect.

      From an evolutionary perspective, these biases exist because they helped our ancestors pass on their genes. Evolution does not care whether we are smart, logical, reasonable, rational, or even right, as long as our intuitions and “common sense” do more to help us pass on our genes than to hinder that goal. The accurate information we do get from these non-conscious, non-deliberate processes comes partly from genetics and partly from our environment. For example, it is “common sense” not to jump off a cliff because our ancestors had a healthy fear of doing so... that is why they lived long enough to pass on their genes. It is “common sense” to look both ways before crossing a road not because of our primate ancestors, but because of learning in our current environment. Be careful not to associate every evolutionary tendency with good common sense. For example, eating as much as you can at every opportunity might have helped our ancestors survive, but today this tendency is sending us to our graves early.

      Here are just some examples where “common sense,” “gut-feelings,” “intuitions,” and “our own understanding” are clearly at odds with reality:

      •common sense tells us that the earth is flat (it’s not)

      •common sense tells us the earth is the center of the universe (it’s not)

      •common sense tells us that objects are solid (they are actually 99.9999999999999% empty space)

      •common sense tells us that heavier objects fall faster than lighter ones (they don’t)

      •common sense tells us that if we flip a fair coin 5 times in a row and it comes up heads, then tails is “due” (it’s not)

      •common sense tells us that the winning lottery numbers “23, 5, 14, 34, 8, 38” are far more likely than “1, 2, 3, 4, 5, 6” (they’re not; see the short sketch after this list)

      •common sense tells us that an airplane is too heavy to fly (it’s not)

      •common sense tells us that time passes at the same rate everywhere (it doesn’t)

      •... and virtually everything related to quantum mechanics, the most well-supported scientific theory ever developed, defies common sense
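
      The two probability items above are easy to check with a few lines of arithmetic. Below is a minimal sketch (not from the book; the fair coin, the simulation size, and the 6-of-49 lottery format are illustrative assumptions) showing that a fair coin has no memory and that any specific lottery combination is exactly as likely as any other.

```python
# Illustrative sketch (assumptions: a fair coin, a hypothetical 6-of-49 lottery).
import random
from math import comb

random.seed(42)  # fixed seed so the illustration is reproducible

# Gambler's fallacy: after five heads in a row, is tails "due"?
trials = 1_000_000
streaks = 0      # times the first five flips were all heads
tails_after = 0  # times the sixth flip came up tails in those cases

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):
        streaks += 1
        if not flips[5]:
            tails_after += 1

print(f"P(tails after 5 heads) ~ {tails_after / streaks:.3f}")  # about 0.5, not higher

# Lottery: every specific combination of 6 numbers is equally likely.
total = comb(49, 6)  # number of possible 6-of-49 tickets
print(f"P(23,5,14,34,8,38) = P(1,2,3,4,5,6) = 1/{total}")
```

      Under these assumptions the simulated probability comes out near 0.5, and both tickets share the same 1-in-13,983,816 odds, which is exactly the point of those two bullets.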

      It would be nice if we could just “know” things magically. In fact, the appeal to common sense is often based on a combination of laziness and a defense of our own intellectual limitations. The pain of not knowing something is reduced by simply thinking we know, and not just know but know better than those who spend their lives doing the work to really find out.

      Experts and Trust

      There is a clear relationship between trust and ignoring the claims of experts in favor of your own “common sense.” If we were to undergo brain surgery, very few of us would question the surgeon’s technique and choice of surgical instruments. Why? Our level of trust in the surgeon is high, and our level of confidence in our own understanding of the topic is very low. But what if the Internet were full of websites claiming that brain surgery was a conspiracy and just a surgeon’s way of separating you from your money? No matter how full of crap these sites were, you might be persuaded by their strong emotional appeals, anecdotes, and cherry-picked data. You would be under the illusion that your level of “knowledge” on the topic is strong, and, conversely, your level of trust in the surgeon would drop, to your own, and often society’s, detriment.

      Here are some things to keep in mind to prevent this intellectually dangerous downward spiral.

      •Don’t think that reading a few articles on the Internet makes you more qualified than a scientist who has spent six or more years studying the topic and many more researching it.

      •Realize that conspiracies are a lot less common than you might think they are.

      •Don’t trust people; trust the science. Separate the message from the messenger. Look at the sources and give more weight to meta-analyses that combine sometimes hundreds of studies into one.

      •Ask yourself if your views on the issue are politically motivated. Scientific conclusions are sometimes politically incorrect and provide evidence against the claims and positions of your political party. But science does not care about your politics, and neither should you when it comes to separating fact from fiction.

      Your “Freedom to Reject Science” Ends When It Puts the Safety and Lives of Others in Jeopardy

      Discussing freedom is a touchy issue, but it is important to understand that your “right” to reject certain findings in science is limited, just as your “right” to freedom of speech is limited: you cannot, for example, yell “bomb” in an airport. Some examples include:

      •Rejecting the science behind the safety of vaccines and putting at risk your own children and other people who cannot be vaccinated.

      •Lobbying against GMOs because they “just seem wrong” rather than understanding the science behind them, not realizing that GMOs can save the lives of millions of people who are starving and cannot afford to shop at Whole Foods.

      •Insisting that the universe is 6,000 years old and trying to have that idea taught in public schools, thereby undermining the younger generation’s trust in science.

      Use your own brain, but know its limitations. We are all subject to cognitive biases that give us a false sense of confidence in what we think we know and lead us to rationalize that the easy way out (expending no cognitive energy) gives us the intellectually superior high ground, when it clearly does not.

      Doubting Science Because of Unknown Possible Long-Term Effects

      First, let me say that it is reasonable to be wary of virtually any claim (including “I think, therefore I am”). However, “being wary” of something and “rejecting” it are two different things.

      Understanding the Science

      The claim that “there are no long-term negative effects” is not falsifiable (i.e., it cannot be proven false). Long-term negative effects (or risk factors) are a possibility with the introduction of every new medicine, technology, and ideology. Science does not work with certainties, nor does it make any guarantees. What it can do is use induction to predict possible risk factors and estimate the probabilities associated with them. By the time a new medicine is released, extensive testing has already been done to determine the risk factors observed since testing began; potential future risk factors are hypothesized based on theory. We don’t know that aspirin won’t make all heads explode on January 1, 2050, but to consider that a risk factor, we need a reason (or a theoretical framework) for why it might happen. In the absence of such a reason, we don’t need to consider exploding heads in the year 2050 a risk factor of aspirin. How about more reasonable long-term side effects, such as cancer or heart attack? The same criteria apply: what reason do we have to think this would be a risk factor? Without a reason, we have no risk factor.

      We Fear the Unknown

      There