There are many well-intentioned folks out there who believe that childhood vaccinations cause Autism. Last week I covered the origins of this belief system, as well as its subsequent debunking, in Vaccines and Autism. Despite the conclusive data that clearly establishes no causal link between vaccines and Autism, the belief lives on. Why is this? Why do smart people fall prey to such illusions? Chabris and Simons contend in their book, The Invisible Gorilla, that we fall prey to such myths because of the Illusion of Cause. Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”


From an evolutionary perspective, we have thrived, in part, as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to be eaten when the rustling was in fact a hungry predator. Clearly, from a survival perspective, it is better to infer agency and run away than to become lunch meat. The problem Shermer refers to is that this same system inclines us toward mystical and superstitious beliefs: granting agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection, notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and, as such, are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:


“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).


We are all inclined to detect patterns where there are none. Shermer refers to this tendency as patternicity; it is also called pareidolia. I’ve previously discussed this innate tendency, noting that “Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such extemporaneous stimuli.” It is precisely what leads us to see familiar and improbable shapes in puffy cumulus clouds or the Virgin Mary in a toasted cheese sandwich. Although this tendency can be fun, it can also lead to faulty and sometimes dangerous conclusions. Worse still, when we hold a belief, we are even more prone to perceive patterns that are consistent with, or confirm, that belief. We are all prone to Confirmation Bias – an inclination to take in, and accept as true, information that supports our belief systems, and to miss, ignore, or discount information that runs contrary to our beliefs.


Patternicity and confirmation bias are not the only factors that contribute to the illusion of cause. There are at least two other equally salient intuitive inclinations that lead us astray. First, we tend to infer causation from correlation. Second, the appeal of chronology, or the coincidence of timing, also leads us to draw such causal connections (Chabris & Simons, 2010).


A fundamental rule in science and statistics is that correlation does not imply causation. Just because two events occur in close temporal proximity does not mean that one leads to the other. Chabris and Simons note that this rule is in place because our brains automatically – intuitively – draw causal associations without any rational thought. We know that causation leads to correlation, but it is erroneous to assume that the opposite is true. Just because A and B occur together does not mean that A causes B or vice versa. There may be a third factor, C, that is responsible for both A and B. Chabris and Simons use ice cream consumption and drownings as an example. There is a sizable positive correlation between these two variables (as ice cream consumption goes up, so does the incidence of drowning), but it would be silly to assume that ice cream consumption causes drowning, or that increases in the number of drownings cause increases in ice cream consumption. Obviously, a third factor, summer heat, leads to both more ice cream consumption and more swimming. With more swimming there are more incidents of drowning.
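
To make the confounding concrete, here is a minimal simulation (Python, with entirely invented numbers) in which summer heat drives both ice cream sales and drownings while neither variable influences the other; the two still end up strongly correlated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented daily data for one year: heat is the hidden third factor (C).
heat = rng.uniform(0, 35, size=365)                    # daily high temperature
ice_cream = 50 + 10 * heat + rng.normal(0, 40, 365)    # cones sold (A)
drownings = 0.02 * heat + rng.normal(0, 0.1, 365)      # incidents (B)

# A never appears in the equation for B, and vice versa, yet the two correlate
# strongly because both are driven by the shared cause, heat.
r = np.corrcoef(ice_cream, drownings)[0, 1]
print(f"correlation(ice cream, drownings) = {r:.2f}")
```

Regressing drownings on ice cream sales in such data would produce a convincing-looking relationship; only accounting for the shared cause (or running an experiment) reveals that the apparent effect is an artifact.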


Likewise with vaccines and Autism: although there may be a correlation between the two (increases in the number of children vaccinated and increases in the number of Autism diagnoses), it is incidental – simply a coincidental relationship. But given our proclivity to draw inferences based on correlation, it is easy to see why people would be misled by this relationship.


Add to this the chronology: the MMR vaccine is recommended between 12 and 18 months of age, and the most prevalent symptoms of Autism typically become evident between 18 and 24 months. People are bound to infer causation. Given that millions of children are vaccinated each year, there are bound to be examples of tight chronology.
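
A quick back-of-envelope calculation shows why such coincidences are inevitable. Every figure below is an illustrative assumption (a rough yearly birth cohort, an assumed diagnosis rate, an assumed chance that symptom onset happens to fall shortly after the MMR dose), not official data:

```python
# Back-of-envelope estimate of purely coincidental "shot, then symptoms" stories.
# Every figure here is an illustrative assumption, not CDC data.
children_vaccinated_per_year = 4_000_000  # assumed size of a yearly birth cohort
autism_prevalence = 1 / 110               # assumed rate of eventual diagnosis
p_onset_near_shot = 0.5                   # assumed chance that first noticeable
                                          # symptoms fall within a few months of
                                          # the MMR dose, given the age windows

expected_coincidences = (children_vaccinated_per_year
                         * autism_prevalence
                         * p_onset_near_shot)
print(f"~{expected_coincidences:,.0f} coincidental pairings per year")
```

Even under these rough assumptions, many thousands of families each year would, by chance alone, see symptoms emerge shortly after vaccination – more than enough vivid anecdotes to sustain the illusion.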


So at work here are hyperactive agency detection (or overzealous patternicity), an inherent disposition to infer causality from correlation, and a propensity to “interpret events that happened earlier as the causes of events that happened or appeared to happen later” (Chabris & Simons, 2010, p. 184). Additionally, you have a doctor like Andrew Wakefield misrepresenting data in such a way as to solidify plausibility, and celebrities like Jenny McCarthy using powerful anecdotes to convince others of the perceived link. And anecdotes are powerful indeed. “…[W]e naturally generalize from one example to the population as a whole, and our memories for such inferences are inherently sticky. Individual examples lodge in our minds, but statistics and averages do not. And it makes sense that anecdotes are compelling to us. Our brains evolved under conditions in which the only evidence available to us was what we experienced ourselves and what we heard from trusted others. Our ancestors lacked access to huge data sets, statistics, and experimental methods. By necessity, we learned from specific examples…” (Chabris & Simons, 2010, pp. 177-178). When an emotional mother (Jenny McCarthy) is given a very popular stage (The Oprah Winfrey Show) and tells a compelling story, people buy it – intuitively – regardless of the veracity of the story. And when we empathize with others, particularly those in pain, we tend to become even less critical of the message conveyed (Chabris & Simons, 2010). These authors add that “Even in the face of overwhelming scientific evidence and statistics culled from studies of hundreds of thousands of people, that one personalized case carries undue influence” (p. 178).


Although science is unquestionably effective at answering questions like whether vaccines cause Autism, it appears that many people are incapable of accepting the findings of scientific inquiry (Chabris & Simons, 2010). Acceptance necessitates the arduous application of reason and the rejection of the influences rendered by the intuitive portion of our brain. This is harder than one might think. Again, it comes down to evolution. Although the ability to infer cause is a relatively recent development, we hominids are actually pretty good at it. And perhaps, in cases such as this one, we are too proficient for our own good (Chabris & Simons, 2010).


References


Centers for Disease Control and Prevention. (2009). Recommended Immunization Schedule for Persons Aged 0 Through 6 Years. http://www.cdc.gov/vaccines/recs/schedules/downloads/child/2009/09_0-6yrs_schedule_pr.pdf


Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.


Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762


Shermer, M. (2000). How We Believe. New York: W.H. Freeman / Henry Holt and Company.
