So really, what caused that earthquake and subsequent tsunami in Japan?  A quick Google search posing this very question yields a wide range of answers.  Fortunately, a majority of the hits acknowledge and explain how plate tectonics caused this tragedy.  Sprinkled throughout the scientifically accurate explanations are conspiracy theories suggesting that the US government caused it through hyper-excitation of radio waves in the ionosphere (HAARP), and perhaps even planned radiation releases.  Other theories invoke the “Supermoon’s” increased tug on the earth’s crust because the moon was at perigee (its closest proximity to the earth in its cyclical orbit).  Solar flares (coronal mass ejections) were also blamed, and some believe the flares, working in concert with the moon at perigee, triggered the quake.  Global warming gets its share of the blame as well (though its proponents suggest that the real cause is the removal of oil from the crust, leaving voids that ultimately trigger earthquakes).  Some have even suggested that a comet, or even God, may have done this.

 

The problem with the scientific explanation is that plate tectonics is invisible to most of us.  Its motion is so gradual that it does not, “on the surface,” seem plausible.  We seemingly need a clear causal agent that fits within our understanding of the world.  Scientifically literate individuals are inclined to grasp the agency of tectonics because the theory and the effects do, in fact, fit together in observable and measurable ways.  Others reach for causal explanations that better fit within their understanding of the world.

 

Our correlation calculators (brains) latch onto events temporally associated with such disasters, and we then conjure up narratives to help us make sense of it all.  It is easy to understand why folks might assume that the moon at perigee, or increased solar activity, or even an approaching comet might cause such events.  Those who are prone to conspiracy theories, and who believe that big brother is all-powerful and sadistic, will grab onto theories that fit their world views.  The same is true for those with literal religious inclinations.  Unfortunately, this drive often leads to narrative fallacies that misplace the blame and sometimes ultimately blame the victims.

 

History is filled with stories drawn up to explain such tragedies.  In the times of ancient Greece and Rome, many tales were spun to explain famine, plagues, and military failures.  All of this occurred prior to our increasingly complex understanding of the world (e.g., germ theory, plate tectonics, meteorology), and it made sense to blame such events on vengeful gods.  How else could they make sense of such tragedies?  This seems to be how we are put together.

 

A study published in 2006 in the journal Developmental Psychology by University of Arkansas psychologists Jesse Bering and Becky Parker looked at the development of such inclinations in children.  They pinpointed the age at which such thinking begins to flourish, and they provided a hypothesis to explain this developmental progression.  The study was summarized in a March 13, 2011 online article at Scientific American by the first author, titled Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events.

 

In this study of children three to nine years of age, the psychologists devised a clever technique to assess the degree to which individuals begin to assign agency to events in their environment and subsequently act on those signs.  What they found was that children between three and six years of age do not read communicative intent into unexplained events (e.g., lights flickering or pictures falling from the wall).  But at age seven, children start reading into and acting on such events.  So why is it that at the age of seven children start inferring agency from events in their environment?  Bering suggests that:

 

“The answer probably lies in the maturation of children’s theory-of-mind abilities in this critical period of brain development. Research by University of Salzburg psychologist Josef Perner, for instance, has revealed that it’s not until about the age of seven that children are first able to reason about “multiple orders” of mental states. This is the type of everyday, grown-up social cognition whereby theory of mind becomes effortlessly layered in complex, soap opera–style interactions with other people. Not only do we reason about what’s going on inside someone else’s head, but we also reason about what other people are reasoning is happening inside still other people’s heads!”

 

So as it turns out, this tendency to read signs into random events is associated with the maturation of cognitive processes. Children with less mature “Theory of Mind” (click here for a very basic description of Theory of Mind) capabilities fail to draw the conclusion that a supernatural being, or any being for that matter, knows what they are thinking and can act in a way that will communicate something.

 

“To interpret [capricious] events as communicative messages, … demands a sort of third-person perspective of the self’s actions: ‘What must this other entity, who is watching my behavior, think is happening inside my head?’ [These] findings are important because they tell us that, before the age of seven, children’s minds aren’t quite cognitively ripe enough to allow them to be superstitious thinkers. The inner lives of slightly older children, by contrast, are drenched in symbolic meaning. One second-grader was even convinced that the bell in the nearby university clock tower was Princess Alice ‘talking’ to him.”

 

When a capricious event has great significance, we are seemingly driven by a ravenous appetite to look for “signs” or “reasons.”  We desperately need to understand.  Our searches for those “reasons” are largely shaped by previously held beliefs and cultural influences. Divine interventions, for example, have historically been ambiguous; therefore, a multitude of unexplained events can be interpreted as having a wide variety of meanings. And those meanings are guided by one’s beliefs.

 

“Misfortunes appear cryptic, symbolic; they seem clearly to be about our behaviors. Our minds restlessly gather up bits of the past as if they were important clues to what just happened. And no stone goes unturned. Nothing is too mundane or trivial; anything to settle our peripatetic [wandering] thoughts from arriving at the unthinkable truth that there is no answer because there is no riddle, that life is life and that is that.”

 

The implications of this understanding are profound.  We are by our very nature driven to search for signs and reasons to explain major life events, and we are likewise inclined to see major events as signs themselves. The ability to do so ironically depends on cognitive maturation. But, given the complexity and remoteness of scientific explanations, we often revert to familiar and culturally sanctioned explanations that have stood the test of time.  We do this because it gives us comfort, regardless of actual plausibility.  As I often say, we are a curious lot, we humans.

 

References:

 

Bering, J. (2011). Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events. Scientific American.  http://www.scientificamerican.com/blog/post.cfm?id=signs-signs-everywhere-signs-seeing-2011-03-13&print=true

 

Bering, J., & Parker, B. (2006). Children’s attributions of intentions to an invisible agent. Developmental Psychology, 42(2), 253–262.

 


Narrative Fallacy

13 March 2011

Evolution has conferred upon us a brain that is capable of truly amazing things.  We have, for thousands of years, been capable of creating incredibly beautiful art, telling compelling tales, and building magnificent structures.  We have risen from small and dispersed tribal bands to become perhaps the dominant life force on the planet.  Our feats have been wondrous.  We have put men on the moon, our space probes have reached the outer limits of our solar system, and we have people living and working in space.  We have literally doubled the life expectancy of human beings, figured out how to feed billions of people, and eradicated some of the most dreadful diseases known to humankind.  We can join together in virtual social communities from remote corners of the world, and even change nations using Facebook and Twitter.  This list could go on and on.  We are very capable and very smart beings.

 

Our mark on this planet, for the moment, is indelible.  Yet, despite our great powers of intellect and creativity, we are incredibly vulnerable.  I am not referring to our susceptibility to the great powers of nature as evidenced in Japan this last week.  I am referring to an inherent mode of thinking that is core to our human nature.

 

It is pretty certain that nature-nature will destroy our species at some point in the future, be it via asteroid impact, super-volcanoes, climate change, microbiome evolution, or the encroachment of the sun’s surface as it goes red giant in five billion years.  Of all the species that have ever lived on this planet, over 99% have gone extinct.  What’s living today will someday be gone – there really is no question about it.  But the question that remains is: “Will nature-nature do us in – or will human-nature do it first?”

 

We have evolved over billions of years to our current Homo sapiens (“wise man”) form, and for the vast majority of that evolutionary period we have had very limited technology.  The development of primitive stone and wooden tools dates back only tens of thousands of years, and reading and writing date back only several thousand years.  What we do and take for granted every day has been around for only a minuscule amount of time relative to the vastness of incomprehensible evolutionary and geological time. These facts are relevant because our brains, for the most part, developed under selective pressures that were vastly different from those we live under today.

 

Much as our appendix and coccyx hair follicles are remnants of our evolutionary past, so too are some of our core thought processes.  These vestigial cognitions play out both as adaptive intuitions and as potentially quite destructive errors of judgment.  We would like to think that, as an advanced thinking species, our ability to use reason is our dominant mental force.  Unfortunately, this most recent evolutionary development takes a back seat to lower and more powerful brain functions that have sustained us for millions of years.  I have previously written about this reason versus intuition/emotion paradigm, so I won’t go into the issue in detail here; but suffice it to say, much of what we do is guided by unconscious thought processes outside of our awareness and outside our direct control.  And again, these life-guiding processes are mere remnants of what it took to survive as roaming bands of hunters and gatherers.

 

Our brains came to their current form when we were not in possession of the tools and technologies that help us truly understand the world around us today.  Early survival depended on our ability to see patterns in randomness (pareidolia, or patternicity) and to make snap judgments.  Rational thought, which is slow and arduous, has not played out in a dominant way because it failed to provide our ancestors with the survival advantages that emotional and rapid cognitions did.  As such, our brains have been programmed by evolution to make all kinds of rapid cognitions that, in this modern time, are simply prone to error.

 

We are uncomfortable with randomness and chaos and are driven to pull together causal stories that help us make sense of the world.  Our brains are correlation calculators, belief engines, and hyperactive agency detection devices – inclinations that led us to develop polytheism to help explain the whims of “mother nature.”  All cultures, for example, have developed creation myths to help explain how we came to be.  We are a superstitious lot, driven by these vestigial remnants.

 

It is easy to see how powerful this inclination is.  Look at the prevalence of beliefs about things like full moons and bad behavior.  And how about bad behavior and acts of nature?  Pat Robertson blamed Katrina on homosexuality and hedonism.  One wonders what the Japanese did to deserve their most recent tragedy.  I’ve already heard talk of the attack on Pearl Harbor as an antecedent.  As if mother nature would align with the United States to punish long-past deeds against us!  If mother nature cares at all about herself, I wonder what we have coming for Nagasaki and Hiroshima.  Likewise, people blame vaccines for autism and credit homeopathy for their wellness.  I could go on and on about our silly inclinations.  We are prone to Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Illusions of Attention, and the Illusions of Knowledge and Confidence.  In the same vein, we are manipulated by the Illusion of Narrative, also known as the Narrative Fallacy.

 

Nassim Nicholas Taleb (philosopher, author, and statistician) coined the phrase “Narrative Fallacy,” which encapsulates this very discussion.  We have a deep need to make up a narrative that serves to make sense of a series of connected or disconnected facts.  Our correlation calculators pull together these cause-and-effect stories to help us understand the world around us, even when chance has dictated our circumstances.   We fit these stories around the observable facts, and sometimes bend the facts to make them fit the story.  This is particularly true, for example, in the case of Intelligent Design.

 

Now that I am aware of this innate proclivity, I enjoy watching it play out in my own mind.  For example, several weekends ago I went cross-country skiing with my wife, Kimberly.  We were at Allegany State Park, in Western New York, where there are nearly 20 miles of incredibly beautiful and nicely groomed Nordic ski trails.  Kimberly and I took a slightly different route than we normally do, and at a junction of two trails we serendipitously ran into a friend we hadn’t seen in quite some time.  It was an incredible and highly improbable meeting.  Any number of different events or decisions could have resulted in forgoing this meet-up.  Such events compel us to string together a narrative to make sense of the sheer randomness.  Was it fate, divine intervention, or just coincidence?  I am certain it was the latter – but it sure was fun dealing with the cognitions pouring forth to explain it.

 

I would really like to hear about your dealings with this inclination.  Please post comments detailing events that have happened to you and the narratives you spun to make sense of them.  This is a great exercise to help us understand this pattern detection mechanism, so have some fun with it and share your stories.  At the very least, pay attention to how this tendency plays out in your life and think about how it plays out in your belief systems (and ideological paradigms).  I’m guessing that it will be informative.


We all love a good story.  Children are mesmerized by them, and adults constantly seek them out – through books, TV, movies, sports, gossip, tabloids, and the news, to mention a few.  Storytelling is core to our identity and a vital part of our nature.  It is both how we entertain ourselves and how we make sense of the world.   This latter tendency troubles me.  Why?  Specifically because we are inclined to value narratives over aggregated data, and we are imbued with a plethora of cognitive biases and errors that all mesh together in a way that leaves us vulnerable to believing very silly things.

 

This may be hard to swallow, but all of us – yes, even you – are by default gullible and biased: disinclined to move away from the narratives we unconsciously string together in order to make sense of an incredibly complex world.  Understanding this is paramount!

 

I have discussed many of the innate illusions, errors, and biases that we are inclined toward throughout this blog.  I have also discussed the genetic and social determinants that play out in our thought processes and beliefs.  And throughout all this I have worked diligently to remain objective and evidence-based.  I do accept that I am inclined toward biases programmed into my brain.  This knowledge has forced me to question my beliefs and open my mind to different points of view.  I believe that the evidence I have laid down in my writings substantiates my objectivity.  But I am also tired, very tired in fact, of making excuses for, and offering platitudes to, others who do not open their minds to this not-so-obvious reality.

 

I am absolutely convinced that there is no resolution to the core political, economic, religious, and social debates that pervade our societies unless we can accept this reality.  Perhaps the most important thing we can do as a species is come to an understanding of our failings and realize that, in a multitude of ways, our brains lie to us.  Our brains deceive us in ways that require us to step away from our gut feelings and core beliefs in order to seek out the truth.  Only when we understand and accept our shortcomings will we be open to the truth.

 

Because of these flawed tendencies we join together in tribal moral communities, turning a blind eye to evidence that casts doubt on our core and sacred beliefs.  We cast aspersions of ignorance, immorality, or partisanship on those who espouse viewpoints that differ from our own.  I cannot emphasize this enough: this is our nature.  But I, for one, cannot and will not accept this as “just the way it is.”

 

We as a species are better than that.  We know how to overcome these inclinations.  We have the technology to do so.  It necessitates that we step back from ideology and look at things objectively.  It requires asking questions, taking measurements, and conducting analyses (none of which come naturally to us).  It necessitates the scientific method.  It requires open peer review and repeated analyses.  It requires objective debate and the outright rejection of ideology as a guiding principle.  It requires us to take a different path – a path that is not automatic, one that is not always fodder for good narrative.

 

I am no more inclined to believe the narrative of Muammar Muhammad al-Gaddafi suggesting that “his people love him and would die for him” than I am to accept the narrative from Creationists about the denial of evolution or those that deny anthropogenic global warming based on economic interests.  Likewise, I am not willing to accept the arguments from the anti-vaccine community or the anti-gay marriage community.

 

My positions are not based on ideology!  They are based on evidence: both the credible and substantive evidence that backs my position and the lack of any substantive evidence for the opposing views.

 

Granted, my positions are in line with what some may define as an ideology or tribal moral community; but there is a critical difference.  My positions are based on evidence, not on ideology, not on bronze-age moral teachings, and certainly not on fundamental flaws in thinking.  This is a huge and critical difference.  Another irrefutable difference is my willingness to abandon my position if the data suggest a more credible one.  Enough already!  It’s time to step back, take a long and deep breath – look at how our flawed neurology works – and stop filling in the gaps with narrative that is devoid of reality.  Enough is enough!

 


Have you ever heard someone make an argument that leaves you shaking your head in disbelief?  Does it seem to you like some people are coming from a completely different reality than your own?  If so, then this blog is for you.  I have spent the last year trying to develop an understanding of the common thought patterns that drive the acrimonious spirit of our social and political dialogue.  I am continually amazed by what I hear coming from seemingly informed people.  I have assumed that some folks are either deluded, disingenuous, or downright ignorant.  There is yet another possibility, however: different moral schemas or belief systems may be driving their thinking.  And if this is the case, how do these divergent processes come to be?  I have learned a lot through this exploration and feel compelled to provide a recap of the posts I have made.  I want to share with you those posts that have gathered the most traction, and some that I believe warrant a bit more attention.

 

Over the past year I have posted 52 articles, often dealing with Erroneous Thought Processes, Intuitive Thinking, and Rational Thought.  Additionally, I have explored the downstream implications of these processes with regard to politics, morality, religion, parenting, memory, willpower, and general perception.  I have attempted to be evidence-based and objective in this process – striving to avoid the very trappings of confirmation bias and the erroneous processes that I am trying to understand.   As it turns out, the brain is very complicated, and although it is the single most amazing system known to humankind, it can and does lead us astray in very surprising and alarming ways.

 

As for this blog, the top ten posts, based on the sheer number of hits, are as follows:

  1. Attribution Error
  2. Nonmoral Nature, It is what it is.
  3. Multitasking: The Illusion of Efficacy
  4. Moral Instinct
  5. Pareidolia
  6. IAT: Questions of Reliability
  7. Are You a Hedgehog or a Fox?
  8. What Plato, Descartes, and Kant Got Wrong: Reason Does not Rule
  9. Illusion of Punditry
  10. Emotion vs. Reason: And the winner is?

What started out as ramblings from a curious guy in a remote corner of New York State ended up being read by folks from all over the planet.  It has been a difficult process at times, consuming huge amounts of time, but it has also been exhilarating and deeply fulfilling.

 

I have been heavily influenced by several scientists and authors in this exploration.  Of particular importance have been Steven Pinker, Daniel Simons, Christopher Chabris, Jonah Lehrer, Bruce Hood, Carl Sagan, and Malcolm Gladwell.  Exploring the combined works of these men has been full of twists and turns that in some cases necessitated deep re-evaluation of long held beliefs.  Holding myself to important standards – valuing evidence over ideology – has been an important and guiding theme.

 

Several important concepts have floated to the top as I poked through the diverse literature pertaining to thought processes. Of critical importance has been the realization that what we have, when it comes to our thought processes, is a highly developed yet deeply flawed system that has been shaped by natural selection over millions of years of evolution.  Also important has been my increased understanding of the importance of genes, the basic element of selective pressures, as they play out in morality and political/religious beliefs.  These issues are covered in the top ten posts listed above.

 

There are other worthy posts that did not garner as much attention as those listed above.  Some of my other favorites include a review of Steven Pinker’s article in the New York Times (also titled Moral Instinct), a look at Jonathan Haidt’s Moral Foundations Theory in Political Divide, as well as the tricks of Retail Mind Manipulation and the Illusion of Attention.  This latter post and my series on Vaccines and Autism (Part 1, Part 2, Part 3) were perhaps the most important of the lot.  Having their content become general knowledge would make the world a safer place.

 

The evolution of my understanding regarding the power of Intuitive relative to Rational Thinking was humbling at times, and Daniel Simons’ and Christopher Chabris’ book, The Invisible Gorilla, certainly provided a mind-opening experience.  Our intuitive capabilities are incredible (as illustrated by Gladwell in Blink and Lehrer in How We Decide), but the downfalls are amazingly humbling.  I’ve covered other topics as well, such as happiness, superstition, placebos, and the debate over human nature.

 

The human brain, no matter how remarkable, is flawed in two fundamental ways.  First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought.  Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge base regarding the workings of the world and the universe.  These leaps of knowledge have rendered those error-prone proclivities unessential for survival.  Regardless, they have remained a dominant cognitive force.  Although our intuition and rapid cognitions have sustained us, and in some ways still do, the everyday illusions impede us in important ways.

 

Second, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world. Time after time I have written of the dangers of ideology with regard to its capacity to put blindfolds on adherents.  Often the blindfolds are absolutely essential to sustain the ideology.  And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized.  As I discussed in Spinoza’s Conjecture: “We all look at the world through our personal lenses of experience.  Our experiences shape our understanding of the world, and ultimately our understanding of [it] then filters what we take in.  The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”

 

Because of our genetically inscribed tendencies toward mysticism and gullibility, we must make extra effort in order to find truth. As Dr. Steven Novella once wrote:

“We must realize that the default mode of human psychology is to grab onto comforting beliefs for purely emotional reasons, and then justify those beliefs to ourselves with post-hoc rationalizations. It takes effort to rise above this tendency, to step back from our beliefs and our emotional connection to conclusions and focus on the process.”

We must therefore be humble with regard to our beliefs and be willing to accept that we are vulnerable to error-prone influences outside our awareness.  Recognition and acceptance of these proclivities are important first steps.   Are you ready to move forward?  How do you think?


Halloween seems like an appropriate time to discuss superstition – what with ghosts and goblins and black cats and witches and all.  But wouldn’t Easter or Christmas, or any other evening that a five-year-old loses a tooth, be an equally appropriate time?  In actuality, we nurture magical thinking in our children with notions of Santa Claus, the Easter Bunny, and the tooth fairy.  And recall, if you will, some of your favorite children’s books and the supernatural forces employed to delight your youthful whimsies.  Magic, along with the thinking employed to delight in it, is seemingly a rite of childhood, and in some ways the essence of what it is to be a child.

 

Much as magical thinking has its roots in childhood fantasies, superstition has its roots in our species’ youth.  In that nascent time we lacked the capacity to understand the forces and whims of the natural world around us.  Our ancestors struggled to survive, and living another day depended in part on their ability to make sense of the forces that aided or impinged upon them.  We must not forget that our forefathers lived much like the non-domesticated animals around us today.  Survival was a day-to-day reality dependent upon the availability of life-sustaining resources like food, water, and shelter, and was often threatened by predation or the forces of nature.  Death was a real possibility and survival a real struggle.  The stakes were high and the hazards were plentiful.  As it turns out, these are the very conditions under which superstition is likely to thrive.

 

So what is superstition?  Bruce Hood, author of The Science of Superstition, notes that superstition is a belief “that there are patterns, forces, energies, and entities operating in the world that are denied by science…”  He adds that “the inclination or sense that they may be real is our supersense.” It involves an inclination to attempt to “control outcomes through supernatural influence.”  It is the belief that if you knock on wood or cross your fingers you can influence outcomes in your favor.  It is the belief that faithfully carrying out rituals as part of a wedding ceremony (e.g., wearing something blue, something new, something borrowed), or before going to bat, or before giving a big speech will improve outcomes.  It is also the belief that negative outcomes can result from stepping on a crack, breaking a mirror, or spilling salt.  Hood argues that supersense goes beyond these obvious notions and surfaces in more subtle ways associated with touching an object or entering a place that we feel has a connection with somebody bad or evil.  For example, how would you feel if you were told that you had to wear Jeffrey Dahmer’s T-shirt, or that you were living in a house where ritualistic torture and multiple murders took place?  Most of us would recoil at the thought.  Most of us also believe (erroneously) that we can sense when someone is looking at us, even when we cannot see them doing so.  These beliefs, and much of the value we place on sentimental objects, stem from this style of thinking.

 

I explored the deep evolutionary roots of superstitious thinking in a previous post, The Illusion of Cause: Vaccines and Autism.   The principle underpinnings are the same.  In that post I noted the following:

 

Michael Shermer (2000), in his book How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons (2009) note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”

From an evolutionary perspective, we have thrived in part as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to later get eaten when in fact the rustling was a hungry predator. Clearly, from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem that Shermer refers to regarding this system is that we are subsequently inclined toward mystical and superstitious beliefs: giving agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection, notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and, as such, are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

 

Bruce Hood takes this notion further, adding that the cultural factors discussed at the opening of this piece, along with other intuitive inclinations such as dualism (a belief in the separation of mind and body), essentialism (the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity), vitalism (the insistence that there is some big, mysterious extra ingredient in all living things), holism (the idea that everything is connected by forces), and animism (the belief that the inanimate world is alive), shape adult superstition.  These latter belief mechanisms are developmental and naturally occurring in children: they are the tendencies that make magic and fantasy so compelling for children.  It is when they lurk in our intuition or are sustained in our rational thought that we as adults fall victim to this type of illusion.

 

It is interesting to note that much like our ancestors, we are more prone to this type of thinking when faced with high stakes, a low probability of success, and incomprehensible controlling circumstances.  Think about it.  In baseball, batters often have complex superstitious rituals associated with batting.  The best hitters experience success only one in three times at bat.  And the speed at which they have to decide to swing or not and where to position the swing defies the rational decision making capacity of humans.  On the other hand, these very same athletes have no rituals when it comes to fielding a ball (which is a high probability event for the proficient).

 

Superstition is a natural inclination with deep evolutionary and psychological roots embedded in normal child development.  These tendencies are nurtured and socialized as a part of child rearing and spill over into adult rituals in predictable circumstances (particularly when there is a low degree of personal control).   When one deconstructs this form of thinking, it makes complete sense.  This is not to suggest that reliance on superstitions is sensible.  Often, however, the costs are low and the rituals therein can be fun.  Still, there are potential costs associated with such thinking.  Some of the dangers materialize in notions such as the beliefs that vaccines cause autism or that homeopathy, in lieu of scientific medicine, will cure what ails you.  Resignation of personal power in deference to supernatural forces is a depressive response pattern.  Reliance on supernatural forces is essentially reliance on chance, and in some cases its application actually stacks the deck against you.  So be careful when employing such tactics.  But, if you’re in the neighborhood, NEVER EVER walk under my ladder.  I’ve been known to drop my hammer.

 

References

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Dawkins, R. (2009). The Greatest Show on Earth: The Evidence for Evolution. Free Press: New York.

 

Gelman, S. A. (2004). Psychological Essentialism in Children. TRENDS in Cognitive Sciences, 8, 404-409.

 

Hood, B. (2008). The Science of Superstition (Formerly Titled: Supersense: Why We Believe in the Unbelievable). HarperCollins Publishers: New York.

 

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

 

Shermer, M. (2000). How We Believe. W.H. Freeman/Henry Holt and Company: New York.


Why do you sometimes choose that scrumptious chocolate dessert even when you are full?  Why is it that you are sometimes drawn in by the lure of the couch and TV when you should be exercising or at least reading a good book?  And why do you lose your patience when you are hungry or tired? Do these situations have anything to do with a weak will?

 

What is willpower anyway?  Perhaps it is your ability to heed the advice proffered by that virtuous and angelic voice in your head as you silence the hedonistic, diabolical voice that goads you toward the pleasures of sloth or sin.   Or perhaps, as Sigmund Freud once contended, it is your ego strength that enables you to forgo the emotionally and impulsively driven urges of the id.   These images resonate so well with us because it often feels as though there is a tug-of-war going on inside our heads as we consider difficult or sometimes even routine choices.  Often reason prevails, and other times it does not.  What is really at play here? Is it truly willpower? Is it really a matter of strength or even of choice?

 

As it turns out, like all issues of the human mind, it is complicated.  Studies within the disciplines of psychology and neuroscience are offering increased clarity regarding this very issue.  It is important to understand, however, that the human brain is composed of a number of modules, each of which strives to guide your choices.  There really isn’t a top-down hierarchy inside your brain with a chief executive pulling and pushing the levers that control your behavior.  Instead, at various times, different modules assert greater amounts of control than others, and thus the choices we make likewise vary in quality over time.  As a result of advances in technology and understanding, we are becoming increasingly aware of the key variables associated with this variation.

 

At a very basic level we know of two major (angelic v. diabolical) driving forces that guide our decisions.  Within and across these forces there are multiple modules emitting neurotransmitters that ultimately influence the choices that we make.  Broadly, the two forces are reason and emotion.  As I discussed in previous posts, What Plato, Descartes, and Kant Got Wrong: Reason Does not Rule and Retail Mind Manipulation, there is not actually a true competitive dichotomy between these two forces; instead, there appears to be a collaborative interplay among them. Regardless of their collaborative nature, we do experience a dichotomy of sorts when we choose the cheeseburger and fries over the salad, the chocolate cake over the fruit salad, or abstinence over indulgence.

 

Now that I have clouded the picture a bit, let’s look at one study that may help reintroduce some of the clarity I mentioned.

 

At Stanford University, Professor Baba Shiv, under the ruse of a study on memory, solicited several dozen undergraduate students. He randomly assigned the students to two groups. For convenience’s sake, I will label the groups the 2 Digit Group and the 7 Digit Group.  The students in the 2 Digit Group were given a two-digit number (e.g., 17) to memorize, whereas those in the 7 Digit Group were tasked with a seven-digit number (e.g., 2583961).  In Room-A, each individual, one subject at a time, was given a number to memorize.  Once provided with the number, they were given as much time as they needed to commit it to memory.  They were also told that once they had memorized the number, they were to go to Room-B, down the hall, where their ability to recall the number would be tested.  As each student made the transition from the first room to the testing room, they were intercepted by a researcher offering them a gratuity for their participation. The offer was unannounced and provided prior to entering the testing room (Room-B).   The offer included either a large slice of chocolate cake or a bowl of fruit salad.

 

One would expect, given the random nature of group assignment, that those in the 2 Digit Group would select the cake or fruit salad in the same proportions as those in the 7 Digit Group.  As it turned out, there was a striking difference between the groups.  Those in the 2 Digit Group selected the healthy fruit salad 67% of the time.  On the other hand, those in the 7 Digit Group selected the scrumptious, but not so healthy, cake 59% of the time.  The only difference between the groups was the five-digit discrepancy in the memorization task.  How could this seemingly small difference between the groups possibly explain why those saddled with the easier task would make a “good” rational choice 67% of the time while those with a more challenging task made the same healthy choice only 41% of the time?
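For readers who like to check such claims, a quick back-of-the-envelope test suggests the gap is unlikely to be chance. The group sizes below are my own assumption (the article says only “several dozen” students); the percentages are from the study as reported above.

```python
import math

# Hypothetical group sizes: the study recruited "several dozen" students,
# so assume roughly 50 per group for illustration.
n1, n2 = 50, 50
p1 = 0.67  # proportion choosing fruit salad in the 2 Digit Group
p2 = 0.41  # proportion choosing fruit salad in the 7 Digit Group (100% - 59%)

# Two-proportion z-test: how large is the difference relative to
# the variability expected under random assignment alone?
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(f"z = {z:.2f}")  # values beyond 1.96 are significant at the 5% level
```

With these assumed sample sizes, z comes out above 2.5, so a 26-point gap would be quite unlikely if the memorization load made no difference.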

 

The answer likely lies in the reality that memorizing a seven-digit number is actually more taxing than you might think.  In 1956, psychologist George Miller published a classic paper entitled “The Magical Number Seven, Plus or Minus Two,” in which he provided evidence that the limit of short-term memory for most people is in fact about seven items. This is why phone numbers and license plates are typically seven digits in length. Strings of letters or numbers that are not logically grouped in some other way, when approaching seven items in length, tend to max out one’s rational processing ability.  With seven digits, one is likely to have to recite the sequence over and over in order to keep it in short-term memory.  It appears that those in the 7 Digit Group, relative to the 2 Digit Group, had reached the limits of their rational capacity and were less likely to employ good reason-based decision making with regard to the sweets. Those in the 2 Digit Group were not so preoccupied and were likely employing a more rationally based decision-making apparatus.  They made the healthy choice simply because they had the mental capacity to weigh the pros and cons of the options.
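Miller’s point about grouping can be made concrete with a toy sketch. The phone-number-style chunking below is my own illustration, not part of the original study.

```python
number = "2583961"  # the seven-digit string from the study above

# Held digit-by-digit, the string is seven separate items to rehearse --
# right at the edge of Miller's 7 +/- 2 short-term memory limit.
as_digits = list(number)

# Grouped phone-number style, the same information is only two chunks.
as_chunks = [number[:3], number[3:]]

print(len(as_digits), as_digits)  # 7 items
print(len(as_chunks), as_chunks)  # 2 items
```

Same digits either way; what changes is the number of items short-term memory has to juggle.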

 

An overtaxed brain is likely to fall back on emotional, non-rational mechanisms to make choices, and the outcomes are not always good.  When you are cognitively stressed – actively engaged in problem solving – you are less likely to make sound, reason-based decisions regarding tangential or unrelated issues. That is one of the reasons why we “fall off the wagon” when we are overwhelmed.

 

And if you compound cognitive preoccupation with fatigue and hunger, you may have even more problems.  You know those times at the end of the day when you are tired, hungry, and really irritable?   Your muscles are not the only tissues that fatigue when they are not well nourished.  Your brain is a major consumer of nutritional resources – and many scientists believe that the reasoning portion of the brain, in particular, does not tolerate glucose deficits.  Your grumpiness may be the result of the diminished capacity of your brain to employ reason in order to work out and cope with the little annoyances that you typically shrug off.

 

So, it seems, willpower is your ability to use the reasoning portion of your brain to make sound, healthy decisions.  Studies like the one above suggest that willpower is not a static force.  We must accept the limits of our willpower and realize that this source of control is in a near constant state of fluctuation – depending on one’s state of cognitive preoccupation, fatigue, and perhaps blood glucose levels.  It is very important that you know your limits and understand the dynamic nature of your rational capacity – and if you do, you may proactively avoid temptation and thus stay in better control of your choices.  Relying on your willpower alone does not provide you with a dependable safety net.  Be careful not to set yourself up for failure.

 

References:

 

Krakovsky, M. (2008). How Do We Decide? Inside the ‘Frinky’ Science of the Mind. Stanford Graduate School of Business Alumni Magazine. February Issue

 

Krulwich, R. & Abumrad, J. (2010). Willpower And The ‘Slacker’ Brain. National Public Radio: Radio Lab. http://www.npr.org/templates/story/story.php?storyId=122781981

 

Lehrer, J. (2009). How We Decide. Houghton Mifflin Harcourt: New York.

 

Miller, G. (1956). The Magical Number Seven, Plus or Minus Two. The Psychological Review. Vol. 63, pp. 81-97.


The Implicit Association Test (IAT) is a very popular method for measuring implicit (implied though not plainly expressed) biases. Greenwald, one of the primary test developers, suggests that “It has been self-administered online by millions, many of whom have been surprised—sometimes unpleasantly—by evidence of their own unconscious attitudes and stereotypes regarding race, age, gender, ethnicity, religion, or sexual orientation.” (2010). It purports to tap into our unconscious or intuitive attitudes at a deeper level than those we are able to rationally express. The best way to get an idea of just what the IAT is, is to take it. If you haven’t done so already, go to the Implicit Association Test website and participate in a demonstration of the Race Test. It takes about ten minutes.

 

I tend to have a skeptical inclination. This stems in part from the training I benefited from in acquiring my PhD in psychology. But it is also just part of who I am. Psychology is, in itself, a rather soft science – full of constructs and variables that are inherently difficult to measure with any degree of certainty. I learned early in my training that there are dangers associated with inference and with measuring intangibles. In fact, my training in personality and projective measures essentially focused on why not to use them – especially when tasked with helping to make important life decisions. Why is this? All psychological measures contain small and predictable amounts of unavoidable error – but those based on constructs and inference are particularly untenable.

 

This is relevant because as we look at thinking processes, we are dealing with intangibles. This is especially true when we are talking about implicit measures. Any discussion of implicit thought necessitates indirect or inferential measures and the application of theoretical constructs. So, with regard to the Implicit Association Test (IAT), one needs to be careful.

 

Currently, increasing evidence suggests that our intuition has a powerful influence over our behavior and moment to moment decision making. Books like Blink by Malcolm Gladwell and How We Decide by Jonah Lehrer point out the power of intuition and emotion in this regard. Chabris and Simons in their book, The Invisible Gorilla, make a strong argument that intuition itself sets us up for errors. Gladwell perhaps glorifies intuition – but the reality is, it (intuition) is a powerful force. Gladwell uses the story of the IAT as evidence of such power. Essentially, if the IAT is a valid and reliable measure, it provides strong evidence of the problems of intuition.

 

I am motivated to shed some light on the IAT – not because of my personal IAT results, which were disappointing, but because the IAT risks gaining widespread application without sufficient technical adequacy. Just think of the ubiquitous Myers-Briggs Type Indicator and the breadth and depth of popular use and appeal that it has garnered (without a shred of legitimate science to back it up). Real decisions are made based on the results of this instrument, and frankly it is dangerous. The Myers-Briggs is based on unsubstantiated and long out-of-date Jungian constructs and was built by individuals with little to no training in psychology or psychometrics. This is not the case for the IAT, for sure, but the risks of broad and perhaps erroneous application are similar.

 

The authors of the IAT have worked diligently over the years to publish studies and facilitate others’ research in order to establish the technical adequacy of the measure. This is a tough task because the IAT is not one test, but rather, it is a method of measurement that can be applied to measure a number of implicit attitudes. At the very foundation of this approach there is a construct, or belief, that necessitates a leap of faith.

 

So what is the IAT? Gladwell (2005) summarizes it in the following way:

The Implicit Association Test (IAT)…. measures a person’s attitude on an unconscious level, or the immediate and automatic associations that occur even before a person has time to think. According to the test results, unconscious attitudes may be totally different or incompatible with conscious values. This means that attitudes towards things like race or gender operate on two levels:
1. Conscious level- attitudes which are our stated values and which are used to direct behavior deliberately.
2. Unconscious level- the immediate, automatic associations that tumble out before you have time to think.
Clearly, this shows that aside from being a measurement of attitudes, the IAT can be a powerful predictor of how one [may] act in certain kinds of spontaneous situations.

So here is one of the difficulties I have with the measure. Take this statement: “The IAT measures a person’s attitude on an unconscious level, or the immediate and automatic associations that occur even before a person has time to think.” Tell me how one would directly and reliably measure “unconscious attitude” without using inference or indirect measures that are completely dependent on constructs? I am not alone in this concern. In fact, Texas A&M University psychologist Hart Blanton, PhD, worries that the IAT has been used prematurely in research without sufficient technical adequacy. Blanton has in fact published several articles (Blanton, et al., 2007; Blanton, et al., 2009) detailing the IAT’s multiple psychometric failings. He suggests that perhaps the greatest problem with this measure concerns the way that the test is scored.

 

First you have to understand how it all works. The IAT purports to measure the fluency of people’s associations between concepts. On the Race IAT, a comparison is made between how fluently the respondent pairs pictures of European-Americans with words carrying a connotation of “good” and pictures of African-Americans with words connoting “bad.” The task measures the latency of such pairings and draws a comparison to the fluency of responding when the associations are reversed (i.e., how quickly the respondent pairs European-Americans with words carrying a “bad” connotation and African-Americans with words connoting “good”). If one is quicker at pairing European-Americans with “good” and African-Americans with “bad,” then it is inferred that the respondent has a European-American preference. The degree of preference is determined by the measure of fluency and dysfluency in making those pairings: bigger differences in pairing times result in stronger ratings of one’s bias. Blanton questions the arbitrary nature of where the cutoffs for mild, moderate, and strong preferences are set when there is no research showing where the cutoffs should be. The bottom line, Blanton argues, is that the cutoffs are arbitrary. This is a common problem in social psychology.
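A simplified sketch of this scoring logic makes Blanton’s complaint easy to see. The reaction times, the 50/150/250 ms cutoffs, and the labels below are all invented for illustration; the actual IAT uses a more involved standardized D-score, but the arbitrariness of the category boundaries is the same issue.

```python
from statistics import mean

# Hypothetical response latencies in milliseconds.
congruent = [620, 650, 700, 640, 680]    # e.g., European-American + "good" pairings
incongruent = [780, 810, 760, 900, 830]  # the same pairings reversed

# Slower responding on the reversed pairings is read as less fluent association.
diff = mean(incongruent) - mean(congruent)

# Arbitrary cutoffs of the kind Blanton criticizes: nothing in the data
# says 150 ms marks the line between "slight" and "moderate."
if diff < 50:
    label = "little or no preference"
elif diff < 150:
    label = "slight preference"
elif diff < 250:
    label = "moderate preference"
else:
    label = "strong preference"

print(f"{diff:.0f} ms difference -> {label}")
```

Shift the cutoffs by 25 ms and the same respondent earns a different label, which is precisely the scoring problem Blanton raises.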

 

Another issue of concern is the stability of the construct being measured. One has to question whether one’s bias, or racial preference, is a trait (a stable attribute over time) or a state (a temporary attitude based on acute environmental influences). The test-retest reliability of the IAT is itself relatively weak. According to Greenwald: “The IAT has also shown reasonably good reliability over multiple assessments of the task. …. in 20 studies that have included more than one administration of the IAT, test–retest reliability ranged from .25 to .69, with mean and median test–retest reliability of .50.” Satisfactory test-retest reliability values are in the .70 to .80 range. To me, that leaves a fair amount of variance unaccounted for, along with a wide range of values (suggesting weak consistency). My IATs have bounced all over the map. And boy did I feel bad when my score suggested a level of preference that diverges significantly from my deeply held values. Thank goodness I have some level of understanding of the limitations of such metrics. Not everyone has that luxury.
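Test-retest reliability is nothing more exotic than the correlation between two administrations of the same measure, and its interpretation is worth sketching. The scores below are hypothetical, tuned to land near the reported mean of .50.

```python
from statistics import mean, stdev

# Hypothetical IAT-style scores for eight respondents, tested twice.
t1 = [0.42, 0.10, 0.65, 0.30, 0.55, 0.05, 0.70, 0.25]
t2 = [0.20, 0.35, 0.50, 0.60, 0.45, 0.30, 0.55, 0.15]

# Pearson correlation between the two administrations.
n = len(t1)
m1, m2 = mean(t1), mean(t2)
cov = sum((a - m1) * (b - m2) for a, b in zip(t1, t2)) / (n - 1)
r = cov / (stdev(t1) * stdev(t2))

# r ~= .50 means r**2 ~= .25: the first score accounts for only about a
# quarter of the variance in the second -- the rest is noise or change.
print(f"r = {r:.2f}, variance explained = {r * r:.0%}")
```

Squaring the correlation is what justifies the “variance unaccounted for” complaint: a reliability of .50 leaves roughly three quarters of the score unexplained by the previous administration.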

 

As I noted previously, the IAT authors have worked diligently to establish the technical adequacy of this measure, and they report statistics attesting to its internal consistency, test-retest reliability, predictive validity, convergent validity, and discriminant validity, almost always suggesting that results are robust (Greenwald, 2009; Greenwald, 2010; Greenwald et al., 2009; Lane et al., 2007). There are other studies, including those carried out by Blanton and colleagues, that suggest otherwise. To me, these analyses are important and worthwhile – however, at the foundation, there is the inescapable problem of measuring unconscious thought.

 

Another core problem is that the validity analyses employ other equally problematic measures of intangibles in order to establish credibility. I can’t be explicit enough – when one enters the realm of the implicit – one enters a realm of intangibles: and like it or not, until minds can be read explicitly, the implicit is essentially immeasurable with any degree of certainty. The IAT may indeed measure what it purports to measure, but the data on this is unconvincing. Substantial questions of reliability and validity persist. I would suggest that you do not take your IAT scores to heart.

 

References

 

Azar, B. (2008). IAT: Fad or fabulous? Monitor on Psychology. July. Vol 39, No. 7,  page 44.

 

Blanton, H., Jaccard, J., Christie, C., and Gonzales, P. M. (2007). Plausible assumptions, questionable assumptions and post hoc rationalizations: Will the real IAT, please stand up? Journal of Experimental Social Psychology. Volume 43, Issue 3, Pages 399-409.

 

Blanton, H., Klick, J., Mitchell, G., Jaccard, J., Mellers, B., & Tetlock, P. E. (2009). Strong Claims and Weak Evidence: Reassessing the Predictive Validity of the IAT. Journal of Applied Psychology. Vol. 94, No. 3, 567–582.

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Greenwald, A. G. (2010).  I Love Him, I Love Him Not: Researchers adapt a test for unconscious bias to tap secrets of the heart. Scientific American.com: Mind Matters.   http://www.scientificamerican.com/article.cfm?id=i-love-him-i-love-him-not

 

Greenwald, A. G. (2009). Implicit Association Test: Validity Debates. http://faculty.washington.edu/agg/iat_validity.htm

 

Greenwald, A. G., Poehlman, T. A., Uhlmann, E., & Banaji, M. R. (2009). Understanding and using the Implicit Association Test: III. Meta-analysis of predictive validity. Journal of Personality and Social Psychology. 97, 17–41.

 

Lane, K. A., Banaji, M. R., Nosek, B. A., & Greenwald, A. G. (2007). Understanding and using the Implicit Association Test: IV. What we know (so far) (Pp. 59–102).  In B. Wittenbrink & N. S. Schwarz (Eds.). Implicit measures of attitudes: Procedures and controversies. New York: Guilford Press.

 

Lehrer, J. (2009). How We Decide. Houghton Mifflin Harcourt: New York.


There are many well intentioned folks out there who believe that childhood vaccinations cause Autism. Last week I covered the origins of this belief system, as well as its subsequent debunking, in Vaccines and Autism. Despite the conclusive data that clearly establishes no causal link between vaccines and Autism, the belief lives on. Why is this? Why do smart people fall prey to such illusions? Chabris and Simons contend in their book, The Invisible Gorilla, that we fall prey to such myths because of the Illusion of Cause. Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”

 

From an evolutionary perspective, we have thrived, in part, as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to later get eaten when in fact the rustling was a hungry predator. Clearly, from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem that Shermer refers to regarding this system is that we are subsequently inclined toward mystical and superstitious beliefs: giving agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection, notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and, as such, are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:

 

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

 

We all are inclined to detect patterns where there are none. Shermer refers to this tendency as patternicity. It is also called pareidolia. I’ve previously discussed this innate tendency noting that “Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such extemporaneous stimuli.” It is precisely what leads us to see familiar and improbable shapes in puffy cumulus clouds or the Virgin Mary in a toasted cheese sandwich. Although this tendency can be fun, it can also lead to faulty and sometimes dangerous conclusions. And what is even worse is that when we hold a belief, we are even more prone to perceive patterns that are consistent with or confirm that belief. We are all prone to Confirmation Bias – an inclination to take in, and accept as true, information that supports our belief systems and miss, ignore, or discount information that runs contrary to our beliefs.

 

Patternicity and confirmation bias alone are not the only factors that contribute to the illusion of cause. There are at least two other equally salient intuitive inclinations that lead us astray. First, we tend to infer causation based on correlation. And second, the appeal of chronology, or the coincidence of timing, also leads us toward drawing such causal connections (Chabris & Simons, 2010).

 

A fundamental rule in science and statistics is that correlation does not imply causation. Just because two events occur in close temporal proximity does not mean that one leads to the other. Chabris and Simons note that this rule is in place because our brains automatically – intuitively – draw causal associations, without any rational thought. We know that causation leads to correlation – but it is erroneous to assume that the opposite is true. Just because A and B occur together does not mean A causes B or vice-versa. There may be a third factor, C, that is responsible for both A and B. Chabris and Simons use ice cream consumption and drownings as an example. There is a sizable positive correlation between these two variables (as ice cream consumption goes up, so do the incidences of drowning), but it would be silly to assume that ice cream consumption causes drowning, or that increases in the number of drownings cause increases in ice cream consumption. Obviously, a third factor, summer heat, leads to both more ice cream consumption and more swimming. With more swimming there are more incidents of drowning.
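The ice cream example is easy to demonstrate with a small simulation. Everything below is made up for illustration: daily heat drives both ice cream sales and swimming-related drownings, with no causal arrow between the two, yet a strong correlation appears anyway.

```python
import random

random.seed(1)  # reproducible illustration

days = 200
heat = [random.gauss(75, 15) for _ in range(days)]  # daily temperature (F)

# Both variables depend only on heat (plus noise); neither causes the other.
ice_cream = [2.0 * h + random.gauss(0, 10) for h in heat]    # cones sold
drownings = [0.05 * h + random.gauss(0, 0.5) for h in heat]  # incidents

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A sizable correlation emerges even though conditioning on heat
# (the confounder) would make it vanish.
print(f"r(ice cream, drownings) = {pearson(ice_cream, drownings):.2f}")
```

The correlation here is an artifact of the shared cause; hold heat constant and the relationship between the two variables disappears.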

 

Likewise, with vaccines and Autism, although there may be a correlation between the two (increases in the number of children vaccinated and increases in the number of Autism diagnoses), it is incidental, simply a coincidental relationship. But given our proclivity to draw inferences based on correlation, it is easy to see why people would be misled by this relationship.

 

Add to this the chronology of the MMR vaccine (recommended between 12 and 18 months) and the typical time at which the most prevalent symptoms of Autism become evident (18-24 months), and people are bound to infer causation. Given that millions of children are vaccinated each year, there are bound to be examples of tight chronology.

 

So what is at work here are hyperactive agency detection (or overzealous patternicity), an inherent disposition to infer causality from correlation, and a propensity to “interpret events that happened earlier as the causes of events that happened or appeared to happen later” (Chabris & Simons, 2010, p. 184).  Additionally, you have a doctor like Andrew Wakefield misrepresenting data in such a way as to solidify plausibility, and celebrities like Jenny McCarthy using powerful anecdotes to convince others of the perceived link. And anecdotes are powerful indeed. “…[W]e naturally generalize from one example to the population as a whole, and our memories for such inferences are inherently sticky. Individual examples lodge in our minds, but statistics and averages do not. And it makes sense that anecdotes are compelling to us. Our brains evolved under conditions in which the only evidence available to us was what we experienced ourselves and what we heard from trusted others. Our ancestors lacked access to huge data sets, statistics, and experimental methods. By necessity, we learned from specific examples…” (Chabris & Simons, 2010, pp. 177-178).  When an emotional mother (Jenny McCarthy) is given a very popular stage (The Oprah Winfrey Show) and tells a compelling story, people buy it – intuitively – regardless of the veracity of the story. And when we empathize with others, particularly those in pain, we tend to become even less critical of the message conveyed (Chabris & Simons, 2010). These authors add that “Even in the face of overwhelming scientific evidence and statistics culled from studies of hundreds of thousands of people, that one personalized case carries undue influence” (p. 178).

 

Although the efficacy of science in answering questions like that of the relationship between vaccines and Autism is unquestionable, it appears that many people are incapable of accepting the reality of scientific inquiry (Chabris & Simons, 2010). Acceptance necessitates the arduous application of reason and the rejection of the influences rendered by the intuitive portion of our brain. This is harder than one might think. Again, it comes down to evolution. Although the ability to infer cause is a relatively recent development, we hominids are actually pretty good at it. And perhaps, in cases such as this one, we are too proficient for our own good (Chabris & Simons, 2010).

 

References

 

Centers for Disease Control and Prevention. (2009). Recommended immunization schedule for persons aged 0 through 6 years. http://www.cdc.gov/vaccines/recs/schedules/downloads/child/2009/09_0-6yrs_schedule_pr.pdf

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

 

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

 

Shermer, M. (2000). How We Believe. New York: W. H. Freeman / Henry Holt.


I find myself in an untenable situation. I have plenty to write about, but the choices I am making right now, in the splendor of summer, leave me limited time and energy to write. I’ve decided to take a short hiatus.

 

Over the last seven months my writing has been spurred on by relentless curiosity about belief systems that are held despite mountains of overwhelming evidence to the contrary. This cognitive conservatism absolutely befuddles me. And I am further driven to understand why ideology carries such overwhelming power over people, and how it drives them to attack evidence, or science in general. In a similar vein, I struggle with politics. The efforts made by the United States on the world’s stage seem to me a desperate attempt to slay the Hydra by decapitation. People close to me, whom I love and deeply respect, look at this war, and even the environment, in vastly different ways than I do.

 

Looking back, I have learned a great deal about the thinking processes that drive these different world views. Essentially, we have what Michael Shermer calls a Belief Engine for a brain. We are hard-wired to believe, and to make copious errors that incline us to believe – even silly things – regardless of evidence. For hundreds of thousands of years we evolved in a world devoid of statistics and analysis, thriving all the while on snap judgments. Evolution itself, as a process, has inhibited our ability to accept its veracity. Stepping away from the belief engine demands a level of analysis that is foreign and often unpalatable. It is hard to be a skeptic, yet oh so easy to go with our hard-wired intuitive thinking. If you are new to my blog, look back at entries that explore erroneous thinking, rational thought, the adaptive unconscious, memory, morality, and even religion.

 

Looking forward, I plan to delve further into our enigmatic Belief Engine. I want to further explore the errors of intuition – specifically the illusion of cause and implicit associations – as well as Jonathan Haidt’s work on political affiliation. Later I hope to switch gears and delve into the unique attributes of our planet that make it hospitable for complex life.


In psychology there are some pretty famous studies that have penetrated popular culture. Many folks are at least familiar with Skinner’s rat box, Pavlov’s salivating dogs, Milgram’s obedience studies, Bandura’s Bobo dolls, and Harlow’s rhesus monkeys reared by wire-frame and terry-cloth surrogate mothers. In recent history, perhaps the best-known study pertains to inattentional blindness. If you have never heard of it, or never seen the video of six college students – three in black shirts and three in white shirts – bouncing a couple of basketballs back and forth, watch the following video before you proceed.

 

 

So, of course, I am referring to Daniel Simons’ Invisible Gorilla study. Just about everyone I know has seen this video, and I don’t recall any of them telling me that they saw the gorilla. I didn’t, and I was absolutely flabbergasted – because I tend to be a pretty vigilant guy. This video is a graphic illustration of what Chabris and Simons (2010) refer to as the Illusion of Attention: about 50% of those who watch the video while counting passes among the white-shirted players miss the gorilla.

 

This particular illusion concerns me because I spend a fair amount of time riding a bicycle on the roads of Western New York. So why should I – or anyone who rides a bicycle or motorcycle, or anyone who drives while texting or talking on a cell phone – be concerned?

 

The cold hard truth is that we may completely miss events or stimuli that we do not expect to see. If you don’t expect to see, and therefore fail to look for, bicycles and motorcycles, you may look right at them yet fail to see them. LOOKING IS NOT SEEING, just as hearing is not listening. This hearing/listening analogy is dead on. How often have you been caught hearing someone but not listening to what was actually being said? In their book, The Invisible Gorilla, Chabris and Simons discuss a study conducted by Daniel Memmert of Heidelberg University that demonstrated (using an eye-tracker) that virtually everyone who missed the gorilla looked directly at it at some point in the video, often for a full second. Bikers are the invisible gorillas of the roadways.

 

And as for drivers, if you are distracted by a cell phone conversation or by texting, you are less likely to see unexpected events (e.g., bicycles, motorcycles, pedestrians, wildlife).

 

Most drivers who text and talk on cell phones do not have problems. In fact, most driving is uneventful – as a result, most people get away with these behaviors. It is when an unexpected event occurs that mobile phone users struggle to see it and respond fluently. If you have not been in an accident, you are under the same illusion as everybody else: everyone believes, until they hit or kill somebody, that they are proficient drivers even while texting or talking on the phone. And by the way, hands-free headsets make no difference. Driving while talking on a cell phone impairs you as much as alcohol does.

 

Think about driving down a road, not seeing, and subsequently hitting a young child on a bike. Think about having to live with killing a middle-aged couple, with three kids in college, who were lawfully riding down the road on a tandem bicycle. You hit the invisible gorilla. Live with that!

 

In a recently published study, Daniel Simons also suggests that even if you are expecting an unexpected event, you will likely miss other unanticipated events. Check out The Monkey Business Illusion video, even if you have seen the invisible gorilla video. Test yourself.

 

 

I have long known that I am at risk while riding my bike on the road. I have recently taken to wearing bright hi-vis attire as I ride. Doing so is completely inconsistent with my style, but I have done so in an effort to be safer. I was surprised to learn that research shows hi-vis gear increases your visibility to those who are looking for you – but that it likely makes no difference at all to inattentionally blind drivers. For drivers who do not expect to see cyclists, hi-vis clothing will not likely increase the likelihood that you will be seen. Head and tail lights work on a similar level: they increase visibility, but only for those looking for such sights. The best way to increase one’s safety while riding is to look like a car.

 

It is also important to note that riding in areas where there are more bikers helps too. Chabris and Simons (2010) noted a report by Peter Jacobson, a public health consultant in California, who analyzed data on accidents involving automobiles striking pedestrians or cyclists. He found that in cities with more walkers and cyclists there were actually fewer accidents. More folks walking or riding bikes seems to raise drivers’ expectation of seeing such individuals – thus making one less likely to be victimized by inattentional blindness. It was further noted that drivers who also ride bikes may be more aware – if only more people would get out of their cars and get back on bicycles.

 

The bottom line is that our intuition about our attention is problematic. Intuitively, we believe that we attend to, and see, what is right before us. Research and real-world data show that this is not the case. At the very least, when driving, we need to be aware of this erroneous assumption and work diligently to avoid distractions like talking on the phone or texting. As for cyclists (motor-powered or not), we must anticipate that we won’t be seen and behave accordingly. Although hi-vis clothing and lights may not make you visible to some drivers, they will to those who are looking out for you.

 

Chabris and Simons contend that this illusion is a by-product of modernity and the fast-paced, highly distracting world we now live in. We evolved for millions of years, by natural selection, in a middle-sized, slow-paced world. Traveling faster than a few miles an hour is a relatively new development for our species. Today we travel in motor vehicles at breakneck speeds. On top of that, we distract ourselves with cell phones, Blackberries, iPhones, iPods, and GPS units. Although the consequences of these factors can be grave, in most cases we squeak by – which is a double-edged sword, because squeaking by reinforces both the illusion and the behavior.

 

References:

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

 

Simons, D. J. (2010). Monkeying around with the gorillas in our midst: Familiarity with an inattentional-blindness task does not improve the detection of unexpected events. i-Perception, 1(1), 3–6.
