The year 2011 proved to be a challenging one.  A number of serious health issues in close family members took center stage.  The frequency of my posts declined partly because of these distractions, but other factors also played a major role.  Although I published fewer articles, the number of visits to my blog increased substantially.

 

Over the course of the year, I had 18,305 hits on my website from 15,167 unique visitors, accounting for over 25,000 page views.  I had visitors from every state in the Union and from 140 nations around the world.  Visitors from the United States accounted for the vast majority of those hits, but the UK, Canada, and Australia also brought in large contingents of visitors.

 

One article in particular far outpaced all other posts.  My post on Brain Waves and Other Brain Measures accounted for as many visits as the next three most popular posts combined.  Of my posts published in 2011, only four made it to this year’s top ten list.  The other six were published in 2010.  Of those six from 2010, four were also on the top ten list last year.

 

Great interest persisted in my post entitled Nonmoral Nature: It is what it is.  This review of Stephen Jay Gould’s most famous article sustained a number two ranking for a second straight year.  In 2010 I had also reviewed a very popular New York Times article by Steven Pinker entitled The Moral Instinct.  That review moved up a notch this year, ultimately ranking number three.  My critical article on the Implicit Associations Test ranked number four this year, versus a number six ranking last year.  And my hedgehog versus fox mindset piece ranked number ten this year, compared to a number seven ranking last year.

 

So here is the Top Ten list for 2011.

  1. Brain Waves and Other Brain Measures (2011)
  2. Nonmoral Nature: It is what it is (2010)
  3. Moral Instinct (2010)
  4. IAT: Questions of Reliability and Validity (2010)
  5. Where Does Prejudice Come From? (2011)
  6. Cognitive Conservatism, Moral Relativism, Bias, and Human Flourishing (2011)
  7. What Plato, Descartes, and Kant Got Wrong: Reason Does Not Rule (2010)
  8. Intuitive Thought (2010)
  9. Effects of Low SES on Brain Development (2011)
  10. Are you a Hedgehog or a Fox? (2010)

It’s interesting to me that this list includes the very foundational issues that have driven me in my quest.  And each was posted with great personal satisfaction.  This encompassing cross section of my work is, in fact, a good starting point for those who are new to my blog.  Several popular 2011 posts ranked outside the top ten yet performed well relative to other posts published that year.  I highlight them below.

One article I published late in 2011 has attracted significant attention.  I believe it is perhaps one of the most important posts I’ve written.  As I was writing this retrospective, Conspicuous Consumption and the Peacock’s Tail was far outpacing all other posts.

 

The most emotional and personally relevant articles pertained to significant problems in healthcare in the United States and my wife’s battle with breast cancer.  These articles include: (a) What not to say to someone with cancer: And what helps; (b) Up and Ever Onward: My Wife’s Battle With Cancer; (c) Cancer, Aging, & Healthcare: America, We Have a Problem; (d) We’re Number 37! USA USA USA!; and (e) Tears of Strength in Cancer’s Wake.  The latter pertains to perhaps the proudest parental moment of my life.

 

Another very important issue that I wrote a fair amount about is the pernicious effect of poverty on child development.  Clicking here takes you to a page that lists all of the articles on this topic.  Knowing the information in this series should motivate us, as a society, to truly evaluate our current political and economic policies.

 

One of my favorite articles tackled my long-standing curiosity about the geology of the place I live.  The article itself did not get a lot of attention, but I sure loved writing it.

 

This two-year journey has, thus far, resulted in perhaps unparalleled personal and intellectual growth.  It has changed the way I look at life, the world around me, and my fellow human beings.  It is my sincerest hope that those who have seen fit to read some of my material have experienced shifts of perception or at least a modicum of enlightenment.

 

The bottom line:

 

The human brain, no matter how remarkable, is flawed in two fundamental ways.  First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought.  Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge base regarding the workings of the world and the universe.  These leaps of knowledge have rendered those error-prone proclivities unessential for survival.  Regardless, they have remained a dominant cognitive force.  Although our intuition and rapid cognitions have sustained us, and in some ways still do, the subsequent everyday illusions impede us in important ways.

 

Secondly, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world. Time after time I have written of the dangers of ideology with regard to its capacity to blindfold its disciples.  Often those blindfolds are absolutely essential to sustain the ideology.  And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized.  As I discussed in Spinoza’s Conjecture:

“We all look at the world through our personal lenses of experience.  Our experiences shape our understanding of the world, and ultimately our understanding of [it], then filters what we take in.  The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”

Because of these innate tendencies, we must make additional effort in order to discover the truth.

 


When I hit the publish button for my last post, Cognitive Conservatism, Moral Relativism, Bias, and Human Flourishing, I felt a tinge of angst.  It took a few days for my rational brain to figure out (or perhaps confabulate) a reason, but I think I may have.  Perhaps it should have been immediately obvious, but my outrage likely clouded my judgment.  Anyway, that angst wasn’t due to the potential controversy of the article’s content – I had previously posted more provocative pieces.  What I have come to conclude is that the nature of the controversy could be construed as being more personal.

 

There is a very real possibility that people I love may have been hurt by what I wrote.  This left me feeling like a hypocrite, because what I have continually aspired to communicate is that “true morality” should promote human flourishing for everyone.  Although the overarching message was consistent with my goal, the tone and tenor were not.

 

I was inspired by a blog post written by a family member that touched the nerves of my liberal sensitivities.  Further, and more importantly, I believe that what he wrote was likely hurtful to others in my family.   A couple of my tribal communities (moral and kin) were assaulted, and I responded assertively.

 

The whole purpose of my blog How Do You Think? has been driven toward understanding the diverse and mutually incompatible beliefs that run through my family and the world in general.  In this particular situation, however, I placed several family members at the crux of just such a moral juxtaposition.

 

I am certain that much of what I have written over the last year may be construed as offensive to some from a variety of different tribal moral communities.  But one thing I am equally certain of is that attacking one’s core moral holdings is not an effective means of facilitating enlightenment.

 

I responded to my relative’s pontifications with moral outrage and indignation.  I was offended and mad.  That is what happens when core beliefs are challenged.  We circle the wagons and lash back.  But this does nothing to further the discussion.  I should have known better.  And that error of judgment may have lasting familial consequences.  This saddens me, and I am sorry.

 

So then, how are we to cope with such diametrically opposed perspectives?

 

If you have consistently read my posts you are likely to have come away with an understanding of the workings of the human brain, and as such, realize that it is an incredible but highly flawed organ.  What is more important to recognize is that these flaws leave us prone to a variety of errors that are both universal and systematic.  The consequences of these errors include Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Pareidolia, Superstition, Essentialism, Cognitive Conservatism, and Illusions of all sorts (e.g., Attention, Cause, Confidence, Memory, Efficacy, Willpower, and Narrative).  The downstream consequences of these errors, paired with our tribal nature and our innate moral inclinations, lead us to form tribal moral communities.  These communities unite around ideologies and sacred items, beliefs, or shared histories.  Our genetically conferred Moral Instincts, which are a part of our Human Nature, lay the groundwork for us to seek out others who share our beliefs and separate ourselves from others who do not.  This is how the divide occurs.  And our brain is instrumental in this division and the subsequent acrimony between groups.

 

This is perhaps the most important concept that I want to share.  Systematic brain errors divide us.  Understanding this – I mean truly understanding all of these systematic errors – is essential to uniting us.  Education is the key, and this is what I hope to provide.  Those very brain errors are themselves responsible for closing minds to the reality of these facts.  Regardless, my hopes for universal enlightenment persist, and I endeavor ever onward, opening minds without providing cause to close them.  I fear that I have taken a misstep – spreading the divide rather than closing it.

 

Please know that Human Flourishing for all is my number one goal.  Never do I intend to come off as judgmental, hurtful, or otherwise arrogant or elitist.  When I do – please push back and offer constructive criticism.   We are all in this together – and time, love, life, peace, and compassion are precious.   This is the starting point – something that I am certain we share.  Don’t you think?


Narrative Fallacy

13 March 2011

Evolution has conferred upon us a brain that is capable of truly amazing things.  We have, for thousands of years, been capable of creating incredibly beautiful art, telling compelling tales, and building magnificent structures.  We have risen from small and dispersed tribal bands to perhaps the dominant life force on the planet.  Our feats have been wondrous.  We have put men on the moon, our space probes have reached the outer limits of our solar system, and we have people living and working in space.  We have literally doubled the life expectancy of human beings, figured out how to feed billions of people, and eradicated some of the most dreadful diseases known to humankind.  We can join together in virtual social communities from remote corners of the world, and even change nations using Facebook and Twitter.  This list could go on and on.  We are very capable and very smart beings.

 

Our mark on this planet, for the moment, is indelible.  Yet, despite our great powers of intellect and creativity, we are incredibly vulnerable.  I am not referring to our susceptibility to the great powers of nature as evidenced in Japan this last week.  I am referring to an inherent mode of thinking that is core to our human nature.

 

It is pretty certain that nature-nature will destroy our species at some point in the future, be it via asteroid impact, super-volcanoes, climate change, microbiome evolution, or the encroachment of the sun’s surface as it goes red giant in five billion years.  Of all the species that have ever lived on this planet, over 99% have gone extinct.  What’s living today will someday be gone – there really is no question about it.  But the question that remains is: “Will nature-nature do us in – or will human-nature do it first?”

 

We have evolved over billions of years to our current Homo sapiens (“wise man”) form, and for the vast majority of that evolutionary period, we have had very limited technology.  Even simple stone tools date back only a couple of million years, and reading and writing date back only several thousand years.  What we do and take for granted every day has been around for a minuscule amount of time relative to the vastness of incomprehensible evolutionary and geological time.  These facts are relevant because our brains, for the most part, developed under selective pressures that were vastly different than those we live under today.

 

Much as our appendix and coccyx are remnants of our evolutionary past, so too are some of our core thought processes.  These vestigial cognitions play out both as adaptive intuitions and as potentially quite destructive errors of judgment.  We would like to think that, as an advanced thinking species, our ability to use reason is our dominant mental force.  Unfortunately, this most recent evolutionary development takes a back seat to lower and more powerful brain functions that have sustained us for millions of years.  I have previously written about this reason versus intuition/emotion paradigm, so I won’t go into the issue in detail here; but suffice it to say, much of what we do is guided by unconscious thought processes outside of our awareness and outside our direct control.  And again, these life-guiding processes are mere remnants of what it took to survive as roaming bands of hunters and gatherers.

 

Our brains came to their current form when we were not in possession of the tools and technologies that help us truly understand the world around us today.  Early survival depended on our ability to see patterns in randomness (pareidolia or patternicity) and to make snap judgments.  Rational thought, which is slow and arduous, has not played out in a dominant way because it failed to provide our ancestors with the survival advantages that emotional and rapid cognitions did.  As such, our brains have been programmed by evolution to make all kinds of rapid cognitions that, in this modern time, are simply prone to error.

 

We are uncomfortable with randomness and chaos and are driven to pull together causal stories that help us make sense of the world.  Our brains are correlation calculators, belief engines, and hyperactive agency detection devices – inclinations that led us to develop polytheism to help explain the whims of “mother nature.”  All cultures, for example, have developed creation myths to help explain how we came to be.  We are a superstitious lot, driven by these vestigial remnants.

 

It is easy to see how powerful this inclination is.  Look at the prevalence of beliefs about things like full moons and bad behavior.  And how about bad behavior and acts of nature?  Pat Robertson blamed Katrina on homosexuality and hedonism.  One wonders what the Japanese did to deserve their most recent tragedy.  I’ve already heard talk of the attack on Pearl Harbor as an antecedent.  As if mother nature would align with the United States to punish long-past deeds against us!  If mother nature cares at all about herself, I wonder what we have coming for Nagasaki and Hiroshima?  Likewise, people blame vaccines for autism and credit homeopathy for their wellness.  I could go on and on about our silly inclinations.  We are prone to Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Illusions of Attention, and the Illusions of Knowledge and Confidence.  In the same vein, we are manipulated by the Illusion of Narrative, also known as the Narrative Fallacy.

 

Nassim Nicholas Taleb (philosopher, author, and statistician) coined the phrase “Narrative Fallacy,” which encapsulates this very discussion.  We have a deep need to make up a narrative that serves to make sense of a series of connected or disconnected facts.  Our correlation calculators pull together these cause-and-effect stories to help us understand the world around us, even when chance has dictated our circumstances.  We fit these stories around the observable facts, and sometimes bend the facts to make them fit the story.  This is particularly true, for example, in the case of Intelligent Design.

 

Now that I am aware of this innate proclivity, I enjoy watching it play out in my own mind.  For example, several weekends ago I went cross-country skiing with my wife, Kimberly.  We were at Allegany State Park, in Western New York, where there are nearly 20 miles of incredibly beautiful and nicely groomed Nordic ski trails.  Kimberly and I took a slightly different route than we normally do, and at a junction of two trails, we serendipitously ran into a friend we hadn’t seen in quite some time.  It was an incredible and highly improbable meeting.  Any number of different events or decisions could have resulted in forgoing this meet-up.  Such events compel us to string together a narrative to make sense of the sheer randomness.  Was it fate, divine intervention, or just coincidence?  I am certain it was the latter – but it sure was fun dealing with the cognitions pouring forth to explain it.

 

I would really like to hear about your dealings with this inclination.  Please post comments detailing events that have happened to you and the narratives you constructed to make sense of them.  This is a great exercise to help us understand this pattern detection mechanism, so have some fun with it and share your stories.  At the very least, pay attention to how this tendency plays out in your life and think about how it plays out in your belief systems (and ideological paradigms).  I’m guessing that it will be informative.


Have you ever heard someone make an argument that leaves you shaking your head in disbelief?  Does it seem to you like some people are coming from a completely different reality than your own?  If so, then this blog is for you.  I have spent the last year trying to develop an understanding of the common thought patterns that drive the acrimonious spirit of our social and political dialogue.  I am continually amazed by what I hear coming from seemingly informed people.  I have assumed that some folks are either deluded, disingenuous, or downright ignorant.  But there is another possibility: different moral schemas or belief systems may be driving their thinking.  And if this is the case, how do these divergent processes come to be?  I have learned a lot through this exploration and feel compelled to provide a recap of the posts I have made.  I want to share with you those posts that have gathered the most traction and some that I believe warrant a bit more attention.

 

Over the past year I have posted 52 articles, often dealing with Erroneous Thought Processes, Intuitive Thinking, and Rational Thought.  Additionally, I have explored the downstream implications of these processes with regard to politics, morality, religion, parenting, memory, willpower, and general perception.  I have attempted to be evidence-based and objective in this process – striving to avoid the very trappings of confirmation bias and the erroneous processes that I am trying to understand.  As it turns out, the brain is very complicated; and although it is the single most amazing system known to humankind, it can and does lead us astray in very surprising and alarming ways.

 

As for this blog, the top ten posts, based on the sheer number of hits, are as follows:

  1. Attribution Error
  2. Nonmoral Nature: It is what it is
  3. Multitasking: The Illusion of Efficacy
  4. Moral Instinct
  5. Pareidolia
  6. IAT: Questions of Reliability
  7. Are You a Hedgehog or a Fox?
  8. What Plato, Descartes, and Kant Got Wrong: Reason Does not Rule
  9. Illusion of Punditry
  10. Emotion vs. Reason: And the winner is?

What started out as ramblings from a curious guy in a remote corner of New York State ended up being read by folks from all over the planet.  It has been a difficult process at times, consuming huge amounts of time, but it has also been exhilarating and deeply fulfilling.

 

I have been heavily influenced by several scientists and authors in this exploration.  Of particular importance have been Steven Pinker, Daniel Simons, Christopher Chabris, Jonah Lehrer, Bruce Hood, Carl Sagan, and Malcolm Gladwell.  Exploring the combined works of these men has been full of twists and turns that in some cases necessitated deep re-evaluation of long held beliefs.  Holding myself to important standards – valuing evidence over ideology – has been an important and guiding theme.

 

Several important concepts have floated to the top as I poked through the diverse literature pertaining to thought processes.  Of critical importance has been the realization that what we have, when it comes to our thought processes, is a highly developed yet deeply flawed system that has been shaped by natural selection over millions of years of evolution.  Also important has been my increased understanding of the importance of genes, the basic units upon which selective pressures operate, as they play out in morality and political/religious beliefs.  These issues are covered in the top ten posts listed above.

 

There are other worthy posts that did not garner as much attention as those listed above.  Some of my other favorites included a review of Steven Pinker’s article in the New York Times (also titled Moral Instinct), a look at Jonathan Haidt’s Moral Foundations Theory in Political Divide, as well as the tricks of Retail Mind Manipulation and the Illusion of Attention.  This latter post and my series on Vaccines and Autism (Part 1, Part 2, Part 3) were perhaps the most important of the lot.  Having their content become general knowledge would make the world a safer place.

 

The evolution of my understanding regarding the power and importance of Intuitive relative to Rational Thinking was humbling at times, and Daniel Simons’ and Christopher Chabris’ book, The Invisible Gorilla, certainly provided a mind-opening experience.  Hey, our intuitive capabilities are incredible (as illustrated by Gladwell in Blink and Lehrer in How We Decide), but the downfalls are amazingly humbling.  I’ve covered other topics such as happiness, superstition, placebos, and the debate over human nature.

 

The human brain, no matter how remarkable, is flawed in two fundamental ways.  First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought.  Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge base regarding the workings of the world and the universe.  These leaps of knowledge have rendered those error-prone proclivities unessential for survival.  Regardless, they have remained a dominant cognitive force.  Although our intuition and rapid cognitions have sustained us, and in some ways still do, the everyday illusions impede us in important ways.

 

Secondly, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world.  Time after time I have written of the dangers of ideology with regard to its capacity to put blindfolds on adherents.  Often the blindfolds are absolutely essential to sustain the ideology.  And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized.  As I discussed in Spinoza’s Conjecture: “We all look at the world through our personal lenses of experience.  Our experiences shape our understanding of the world, and ultimately our understanding of [it], then filters what we take in.  The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”

 

Because of our genetically inscribed tendencies toward mysticism and gullibility, we must make extra effort in order to find truth. As Dr. Steven Novella once wrote:

“We must realize that the default mode of human psychology is to grab onto comforting beliefs for purely emotional reasons, and then justify those beliefs to ourselves with post-hoc rationalizations. It takes effort to rise above this tendency, to step back from our beliefs and our emotional connection to conclusions and focus on the process.”

We must therefore be humble with regard to beliefs and be willing to accept that we are vulnerable to error prone influences outside our awareness.  Recognition and acceptance of these proclivities are important first steps.   Are you ready to move forward?  How do you think?


Halloween seems like an appropriate time to discuss superstition.  What with ghosts and goblins and black cats and witches and all.  But would not Easter or Christmas, or any other evening that a five-year-old loses a tooth, be an equally appropriate time?  In actuality, we nurture magical thinking in our children with notions of Santa Claus, the Easter Bunny, and the tooth fairy.  And recall, if you will, some of your favorite children’s books and the supernatural forces employed to delight your youthful whimsies.  Magic, along with the thinking employed to delight in it, is seemingly a rite of childhood, and in some ways the essence of what it is to be a child.

 

Much as magical thinking has its roots in childhood fantasies, superstition too has its roots in our species’ youth.  In that nascent time we lacked the capacity to understand the forces and whims of the natural world around us.  Our ancestors struggled to survive, and living another day depended in part on their ability to make sense of the forces that aided or impinged upon them.  We must not forget that our forefathers lived much like the non-domesticated animals around us today.  Survival was a day-to-day reality dependent upon the availability of life-sustaining resources like food, water, and shelter, and was often threatened by predation or the forces of nature.  Death was a real possibility and survival a real struggle.  The stakes were high and the hazards were plentiful.  As it turns out, these are the very conditions under which superstition is likely to thrive.

 

So what is superstition?  Bruce Hood, author of The Science of Superstition, notes that superstition is a belief “that there are patterns, forces, energies, and entities operating in the world that are denied by science…”  He adds that “the inclination or sense that they may be real is our supersense.”  It involves an inclination to attempt to “control outcomes through supernatural influence.”  It is the belief that if you knock on wood or cross your fingers you can influence outcomes in your favor.  It is the belief that faithfully carrying out rituals as part of a wedding ceremony (e.g., wearing something blue, something new, something borrowed) or before going to bat or before giving a big speech will improve outcomes.  It is also the belief that negative outcomes can come as a result of stepping on a crack, breaking a mirror, or spilling salt.  Hood argues that supersense goes beyond these obvious notions and surfaces in more subtle ways associated with touching an object or entering a place that we feel has a connection with somebody bad or evil.  For example, how would you feel if you were told that you had to wear Jeffrey Dahmer’s T-shirt, or that you were living in a house where ritualistic torture and multiple murders took place?  Most of us would recoil at the thought.  Most of us also believe (erroneously) that we can sense when someone is looking at us, even when we cannot see them doing so.  These beliefs, and much of the value we place on sentimental objects, stem from this style of thinking.

 

I explored the deep evolutionary roots of superstitious thinking in a previous post, The Illusion of Cause: Vaccines and Autism.  The principal underpinnings are the same.  In that post I noted the following:

 

Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons (2009) note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”

From an evolutionary perspective, we have thrived in part, as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to later get eaten when in fact the rustling was a hungry predator. Clearly from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem that Shermer refers to regarding this system is that we are subsequently inclined toward mystical and superstitious beliefs: giving agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and as such, are more prone to superstitious thinking, conspiratorial thinking, and more mystical thinking. It is important to understand as Shermer (2000) makes clear:

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

 

Bruce Hood takes this notion further and adds that the cultural factors discussed at the opening of this piece and other intuitive inclinations such as dualism (a belief in the separation of mind and body), essentialism (the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity), vitalism (the insistence that there is some big, mysterious extra ingredient in all living things), holism (the belief that everything is connected by forces), and animism (the belief that the inanimate world is alive) shape adult superstition.  These latter belief mechanisms are developmental and naturally occurring in children: they are the tendencies that make magic and fantasy so compelling for children.  It is when they lurk in our intuition or are sustained in our rational thought that we as adults fall victim to this type of illusion.

 

It is interesting to note that, much like our ancestors, we are more prone to this type of thinking when faced with high stakes, a low probability of success, and incomprehensible controlling circumstances.  Think about it.  In baseball, batters often have complex superstitious rituals associated with batting.  The best hitters experience success only one in three times at bat.  And the speed at which they have to decide whether to swing and where to position the swing defies the rational decision-making capacity of humans.  On the other hand, these very same athletes have no rituals when it comes to fielding a ball (which is a high-probability event for the proficient).

 

Superstition is a natural inclination with deep evolutionary and psychological roots embedded in our natural child development.  These tendencies are nurtured and socialized as a part of child rearing and spill over into adult rituals in predictable circumstances (particularly when there is a low degree of personal control).  When one deconstructs this form of thinking, it makes complete and total sense.  This is not to suggest that reliance on superstitions is sensible.  Often, however, the costs are low and the rituals therein can be fun.  But there are some potential costs associated with such thinking.  Some of the dangers are materialized in notions that vaccines cause autism or that homeopathy, in lieu of scientific medicine, will cure what ails you.  Resignation of personal power in deference to supernatural forces is a depressive response pattern.  Reliance on supernatural forces is essentially reliance on chance, and in some cases its applications actually stack the deck against you.  So be careful when employing such tactics.  But, if you’re in the neighborhood, NEVER EVER walk under my ladder.  I’ve been known to drop my hammer.

 

References

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Dawkins, R. (2009). The Greatest Show on Earth: The Evidence for Evolution. Free Press: New York.

 

Gelman, S. A. (2004). Psychological Essentialism in Children. TRENDS in Cognitive Sciences, 8, 404-409.

 

Hood, B. (2008). The Science of Superstition (Formerly Titled: Supersense: Why We Believe in the Unbelievable). HarperCollins Publishers: New York.

 

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

 

Shermer, M. (2000). How We Believe. W.H. Freeman/Henry Holt and Company: New York.


The Implicit Associations Test (IAT) is a very popular method for measuring implicit (implied though not plainly expressed) biases. Greenwald, one of the primary test developers, suggests that “It has been self-administered online by millions, many of whom have been surprised—sometimes unpleasantly—by evidence of their own unconscious attitudes and stereotypes regarding race, age, gender, ethnicity, religion, or sexual orientation.” (2010). It purports to tap into our unconscious or intuitive attitudes at a deeper level than those that we are able to rationally express. The best way to get an idea of just what the IAT is, is to take it. If you haven’t done so already, go to the Implicit Associations Test website and participate in a demonstration of the Race Test. It takes about ten minutes.

 

I tend to have a skeptical inclination. This stems in part from the training I received while earning my PhD in psychology. But it is also just part of who I am. Psychology is, in itself, a rather soft science – full of constructs and variables that are inherently difficult to measure with any degree of certainty. I learned early in my training that there are dangers associated with inference and measuring intangibles. In fact, my training in personality and projective measures essentially focused on why not to use them – especially when tasked with helping to make important life decisions. Why is this? All psychological measures contain small and predictable amounts of unavoidable error – but those based on constructs and inference are particularly untenable.

 

This is relevant because as we look at thinking processes, we are dealing with intangibles. This is especially true when we are talking about implicit measures. Any discussion of implicit thought necessitates indirect or inferential measures and application of theoretical constructs. So, with regard to the Implicit Associations Test (IAT), one needs to be careful.

 

Currently, increasing evidence suggests that our intuition has a powerful influence over our behavior and moment-to-moment decision making. Books like Blink by Malcolm Gladwell and How We Decide by Jonah Lehrer point out the power of intuition and emotion in this regard. Chabris and Simons, in their book The Invisible Gorilla, make a strong argument that intuition itself sets us up for errors. Gladwell perhaps glorifies intuition – but the reality is that intuition is a powerful force. Gladwell uses the story of the IAT as evidence of such power. Essentially, if the IAT is a valid and reliable measure, it provides strong evidence of the problems of intuition.

 

I am motivated to shed some light on the IAT – not because of my personal IAT results, which were disappointing, but because the IAT runs the risk of gaining widespread application without sufficient technical adequacy. Just think of the ubiquitous Myers-Briggs Type Indicator and the breadth and depth of popular use and appeal that it has garnered (without a shred of legitimate science to back it up). Real decisions are made based on the results of this instrument, and frankly it is dangerous. The Myers-Briggs is based on unsubstantiated and long out-of-date Jungian constructs and was built by individuals with little to no training in psychology or psychometrics. This is not the case for the IAT for sure, but the risks of broad and perhaps erroneous application are similar.

 

The authors of the IAT have worked diligently over the years to publish studies and facilitate others’ research in order to establish the technical adequacy of the measure. This is a tough task because the IAT is not one test, but rather, it is a method of measurement that can be applied to measure a number of implicit attitudes. At the very foundation of this approach there is a construct, or belief, that necessitates a leap of faith.

 

So what is the IAT? Gladwell (2005) summarizes it in the following way:

The Implicit Association Test (IAT)… measures a person’s attitude on an unconscious level, or the immediate and automatic associations that occur even before a person has time to think. According to the test results, unconscious attitudes may be totally different or incompatible with conscious values. This means that attitudes towards things like race or gender operate on two levels:
1. Conscious level- attitudes which are our stated values and which are used to direct behavior deliberately.
2. Unconscious level- the immediate, automatic associations that tumble out before you have time to think.
Clearly, this shows that aside from being a measurement of attitudes, the IAT can be a powerful predictor of how one [may] act in certain kinds of spontaneous situations.

So here is one of the difficulties I have with the measure. Take this statement: “The IAT measures a person’s attitude on an unconscious level, or the immediate and automatic associations that occur even before a person has time to think.” Tell me how one would directly and reliably measure an “unconscious attitude” without using inference or indirect measures that are completely dependent on constructs. I am not alone in this concern. In fact, Texas A&M University psychologist Hart Blanton, PhD, worries that the IAT has been used prematurely in research without sufficient technical adequacy. Blanton has in fact published several articles (Blanton et al., 2007; Blanton et al., 2009) detailing the IAT’s multiple psychometric failings. He suggests that perhaps the greatest problem with this measure concerns the way that the test is scored.

 

First you have to understand how it all works. The IAT purports to measure the fluency of people’s associations between concepts. On the Race IAT, a comparison is made between how fluently the respondent pairs pictures of European-Americans with words carrying a connotation of “good” and pictures of African-Americans with words connoting “bad.” The task measures the latency between such pairings and draws a comparison to the fluency of responding when the associations are reversed (e.g., how quickly the respondent pairs European-Americans with words carrying a “bad” connotation and African-Americans with words connoting “good”). If one is quicker at pairing European-Americans with “good” and African-Americans with “bad,” then it is inferred that the respondent has a European-American preference. The degree of preference is determined by the measure of fluency and dysfluency in making those pairings. Bigger differences in pairing times result in stronger ratings of one’s bias. Blanton questions the arbitrary nature of where the cutoffs for mild, moderate, and strong preferences are set when there is no research showing where the cutoffs should be. The bottom line, Blanton argues, is that the cutoffs are arbitrary. This is a common problem in social psychology.
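To make the scoring logic concrete, here is a minimal sketch in Python of a D-style score: the mean latency difference between the two pairing conditions, scaled by the pooled variability of all trials. The latency values are invented, and the published scoring algorithm adds steps omitted here (such as trial filtering and error penalties), so treat this as an illustration of the idea rather than the actual IAT computation.

import statistics

def iat_d_score(compatible_ms, incompatible_ms):
    # Simplified D-style score: difference in mean response latency,
    # divided by the pooled standard deviation of all trials.
    diff = statistics.mean(incompatible_ms) - statistics.mean(compatible_ms)
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)
    return diff / pooled_sd

# Invented latencies (milliseconds) for one respondent.
compatible = [650, 700, 620, 680, 710, 640]    # fluent pairings
incompatible = [820, 900, 780, 860, 840, 890]  # reversed pairings

print(f"D = {iat_d_score(compatible, incompatible):.2f}")
# Where to cut this continuous score into 'slight', 'moderate', and
# 'strong' preference labels is exactly the arbitrariness Blanton flags.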

 

Another issue of concern is the stability of the construct being measured. One has to question whether one’s bias, or racial preferences, are a trait (a stable attribute over time) or a state (a temporary attitude based on acute environmental influences). The test-retest reliability of the IAT is itself relatively weak. Regardless, according to Greenwald: “The IAT has also shown reasonably good reliability over multiple assessments of the task. … in 20 studies that have included more than one administration of the IAT, test–retest reliability ranged from .25 to .69, with mean and median test–retest reliability of .50.” Satisfactory test-retest reliability values are in the .70 to .80 range. A test-retest correlation of .50 means that only a quarter of the variance (r squared = .25) is shared between administrations; to me, that leaves a fair amount of variance unaccounted for, and the wide range of values suggests weak consistency. My IATs have bounced all over the map. And boy did I feel bad when my score suggested a level of preference that diverged significantly from my deeply held values. Thank goodness I have some level of understanding of the limitations of such metrics. Not everyone has such luxury.
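To see mechanically what those figures mean, the sketch below computes test-retest reliability as the Pearson correlation between two administrations, using made-up scores (statistics.correlation requires Python 3.10 or later). Squaring r gives the shared variance, which is the sense in which an r of .50 leaves most of the variance unaccounted for.

from statistics import correlation  # available in Python 3.10+

# Invented D-scores for eight respondents, each tested twice.
first_administration = [0.45, 0.10, 0.62, 0.30, 0.55, 0.20, 0.48, 0.35]
second_administration = [0.30, 0.25, 0.50, 0.05, 0.60, 0.40, 0.20, 0.45]

r = correlation(first_administration, second_administration)
print(f"test-retest r = {r:.2f}; shared variance r^2 = {r * r:.2f}")
# An r near .50 means roughly 75% of the score variance is not shared
# across administrations -- weak ground for treating the score as a
# stable trait.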

 

As I noted previously, the IAT authors have worked diligently to establish the technical adequacy of this measure, and they report statistics attesting to its internal consistency, test-retest reliability, predictive validity, convergent validity, and discriminant validity, almost always suggesting that results are robust (Greenwald, 2009; Greenwald, 2010; Greenwald et al., 2009; Lane et al., 2007). There are other studies, including those carried out by Blanton and colleagues, that suggest otherwise. To me, these analyses are important and worthwhile – however, at the foundation, there is the inescapable problem of measuring unconscious thought.

 

Another core problem is that the validity analyses employ other equally problematic measures of intangibles in order to establish credibility. I can’t be explicit enough: when one enters the realm of the implicit, one enters a realm of intangibles. Like it or not, until minds can be read explicitly, the implicit is essentially immeasurable with any degree of certainty. The IAT may indeed measure what it purports to measure, but the data on this are unconvincing. Substantial questions of reliability and validity persist. I would suggest that you not take your IAT scores to heart.

 

References

 

Azar, B. (2008). IAT: Fad or fabulous? Monitor on Psychology, 39(7), 44.

 

Blanton, H., Jaccard, J., Christie, C., & Gonzales, P. M. (2007). Plausible assumptions, questionable assumptions and post hoc rationalizations: Will the real IAT, please stand up? Journal of Experimental Social Psychology, 43(3), 399-409.

 

Blanton, H., Klick, J., Mitchell, G., Jaccard, J., Mellers, B., & Tetlock, P. E. (2009). Strong Claims and Weak Evidence: Reassessing the Predictive Validity of the IAT. Journal of Applied Psychology, 94(3), 567–582.

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Greenwald, A. G. (2010).  I Love Him, I Love Him Not: Researchers adapt a test for unconscious bias to tap secrets of the heart. Scientific American.com: Mind Matters.   http://www.scientificamerican.com/article.cfm?id=i-love-him-i-love-him-not

 

Greenwald, A. G. (2009). Implicit Association Test: Validity Debates. http://faculty.washington.edu/agg/iat_validity.htm

 

Greenwald, A. G., Poehlman, T. A., Uhlmann, E., & Banaji, M. R. (2009). Understanding and using the Implicit Association Test: III. Meta-analysis of predictive validity. Journal of Personality and Social Psychology. 97, 17–41.

 

Lane, K. A., Banaji, M. R., Nosek, B. A., & Greenwald, A. G. (2007). Understanding and using the Implicit Association Test: IV. What we know (so far) (pp. 59–102). In B. Wittenbrink & N. S. Schwarz (Eds.), Implicit measures of attitudes: Procedures and controversies. New York: Guilford Press.

 

Lehrer, J. (2009). How We Decide. Houghton Mifflin Harcourt: New York.


There are many well-intentioned folks out there who believe that childhood vaccinations cause Autism. Last week I covered the origins of this belief system as well as its subsequent debunking in Vaccines and Autism. Despite the conclusive data that clearly establishes no causal link between vaccines and Autism, the belief lives on. Why is this? Why do smart people fall prey to such illusions? Chabris and Simons contend in their book, The Invisible Gorilla, that we fall prey to such myths because of the Illusion of Cause. Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”

 

From an evolutionary perspective, we have thrived in part, as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to later get eaten when in fact the rustling was a hungry predator. Clearly from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem that Shermer refers to regarding this system is that we are subsequently inclined toward mystical and superstitious beliefs: giving agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and as such, are more prone to superstitious thinking, conspiratorial thinking, and more mystical thinking. It is important to understand as Shermer (2000) makes clear:
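The logic here is a cost asymmetry, and a toy expected-cost calculation makes it concrete. Every number below is invented purely for illustration; the point is only that when a miss is vastly costlier than a false alarm, a hair-trigger detector beats a skeptical one even when real predators are rare.

# Expected cost per rustle under two strategies (invented numbers).
p_tiger = 0.01          # chance the rustle really is a predator
cost_false_alarm = 1    # energy wasted fleeing from mere wind
cost_miss = 1000        # expected cost of ignoring a real tiger

# "Always flee": pay the small flight cost on every rustle.
cost_always_flee = cost_false_alarm

# "Always ignore": pay the catastrophic cost whenever it is a tiger.
cost_always_ignore = p_tiger * cost_miss

print(f"always flee: {cost_always_flee}, always ignore: {cost_always_ignore}")
# always flee: 1, always ignore: 10.0 -- over-detecting agency is the
# cheaper error, so selection favors the hyperactive detector.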

 

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

 

We all are inclined to detect patterns where there are none. Shermer refers to this tendency as patternicity. It is also called pareidolia. I’ve previously discussed this innate tendency noting that “Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such extemporaneous stimuli.” It is precisely what leads us to see familiar and improbable shapes in puffy cumulus clouds or the Virgin Mary in a toasted cheese sandwich. Although this tendency can be fun, it can also lead to faulty and sometimes dangerous conclusions. And what is even worse is that when we hold a belief, we are even more prone to perceive patterns that are consistent with or confirm that belief. We are all prone to Confirmation Bias – an inclination to take in, and accept as true, information that supports our belief systems and miss, ignore, or discount information that runs contrary to our beliefs.

 

Patternicity and confirmation bias alone are not the only factors that contribute to the illusion of cause. There are at least two other equally salient intuitive inclinations that lead us astray. First, we tend to infer causation based on correlation. And second, the appeal of chronology, or the coincidence of timing, also leads us toward drawing such causal connections (Chabris & Simons, 2010).

 

A fundamental rule in science and statistics is that correlation does not imply causation. Just because two events occur in close temporal proximity does not mean that one leads to the other. Chabris and Simons note that this rule is in place because our brains automatically – intuitively – draw causal associations, without any rational thought. We know that causation leads to correlation – but it is erroneous to assume that the opposite is true. Just because A and B occur together does not mean A causes B or vice-versa. There may be a third factor, C, that is responsible for both A and B. Chabris and Simons use ice cream consumption and drownings as an example. There is a sizable positive correlation between these two variables (as ice cream consumption goes up, so do the incidences of drowning), but it would be silly to assume that ice cream consumption causes drowning, or that increases in the number of drownings cause increases in ice cream consumption. Obviously, a third factor, summer heat, leads to both more ice cream consumption and more swimming. With more swimming behavior there are more incidents of drowning.
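A minimal simulation makes the confound concrete (all numbers are invented, and statistics.correlation requires Python 3.10 or later). Heat drives both series, and a strong correlation emerges between two quantities that never influence one another.

import random
from statistics import correlation  # available in Python 3.10+

random.seed(1)
# Daily temperatures act as the hidden third factor, C.
heat = [random.uniform(0, 35) for _ in range(365)]
ice_cream_sales = [10 * h + random.gauss(0, 20) for h in heat]  # rises with heat
drownings = [0.1 * h + random.gauss(0, 1) for h in heat]        # rises with heat

print(f"r(ice cream, drownings) = {correlation(ice_cream_sales, drownings):.2f}")
# Prints a strong positive correlation produced entirely by the shared
# cause (heat), with no causal link between the two variables.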

 

Likewise, with vaccines and Autism, although there may be a correlation between the two (increases in the number of children vaccinated and increases in the number of Autism diagnoses), it is incidental – simply a coincidental relationship. But given our proclivity to draw inferences based on correlation, it is easy to see why people would be misled by this relationship.

 

Add to this the chronology of the provision of the MMR vaccine (recommended between 12 and 18 months) and the typical time at which the most prevalent symptoms of Autism become evident (18–24 months), and people are bound to infer causation. Given the fact that millions of children are vaccinated each year, there are bound to be examples of tight chronology.
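A back-of-the-envelope calculation shows just how many tight chronologies chance alone guarantees. The figures below are round, illustrative numbers rather than epidemiological claims.

# Round, illustrative figures -- not epidemiological claims.
births_per_year = 4_000_000   # roughly the size of a US birth cohort
autism_rate = 1 / 150         # illustrative prevalence
mmr_coverage = 0.90           # illustrative vaccination rate

# Under complete independence, this many children per cohort are both
# vaccinated and later diagnosed -- each one a compelling anecdote.
coincidences = births_per_year * autism_rate * mmr_coverage
print(f"~{coincidences:,.0f} coincidental pairings per birth cohort")
# ~24,000 families for whom chronology alone suggests a causal story.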

 

So what is at work here are hyperactive agency detection (or overzealous patternicity), an inherent disposition to infer causality from correlation, and a propensity to “interpret events that happened earlier as the causes of events that happened or appeared to happen later” (Chabris & Simons, 2010, p. 184). Additionally, you have a doctor like Andrew Wakefield misrepresenting data in such a way as to solidify plausibility, and celebrities like Jenny McCarthy using powerful anecdotes to convince others of the perceived link. And anecdotes are powerful indeed. “…[W]e naturally generalize from one example to the population as a whole, and our memories for such inferences are inherently sticky. Individual examples lodge in our minds, but statistics and averages do not. And it makes sense that anecdotes are compelling to us. Our brains evolved under conditions in which the only evidence available to us was what we experienced ourselves and what we heard from trusted others. Our ancestors lacked access to huge data sets, statistics, and experimental methods. By necessity, we learned from specific examples…” (Chabris & Simons, 2010, pp. 177-178). When an emotional mother (Jenny McCarthy) is given a very popular stage (The Oprah Winfrey Show) and tells a compelling story, people buy it – intuitively – regardless of the veracity of the story. And when we empathize with others, particularly those in pain, we tend to become even less critical of the message conveyed (Chabris & Simons, 2010). These authors add that “Even in the face of overwhelming scientific evidence and statistics culled from studies of hundreds of thousands of people, that one personalized case carries undue influence” (p. 178).

 

Although the efficacy of science is unquestionable in terms of answering questions like the veracity of the relationship between vaccines and Autism, it appears that many people are incapable of accepting the reality of scientific inquiry (Chabris & Simons, 2010). Acceptance necessitates the arduous application of reason and the rejection of the influences rendered by the intuitive portion of our brain. This is harder than one might think. Again, it comes down to evolution. Although the ability to infer cause is a relatively recent development, we hominids are actually pretty good at it. And perhaps, in cases such as this one, we are too proficient for our own good (Chabris & Simons, 2010).

 

References

 

Centers for Disease Control and Prevention. (2009). Recommended Immunization Schedule for Persons Aged 0 Through 6 Years. http://www.cdc.gov/vaccines/recs/schedules/downloads/child/2009/09_0-6yrs_schedule_pr.pdf

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

 

Shermer, M. (2000). How We Believe. W.H. Freeman/Henry Holt and Company: New York.


I find myself in an untenable situation. I have plenty to write about but I am finding that the choices I am making right now, in the splendor of summer, give me limited time and energy to write. I’ve decided to take a short hiatus.

 

Over the last seven months my writing has been spurred on by relentless curiosity about belief systems that are held despite mountains of overwhelming evidence to the contrary. This cognitive conservatism absolutely befuddles me. And I am further driven to understand why ideology carries such overwhelming power over people and how it drives them to attack evidence, or science in general. In a similar vein, I struggle with politics. The efforts made by the United States on the world’s stage seem to me a desperate attempt to slay the Hydra by means of decapitation. People close to me, whom I love and deeply respect, look at this war and even the environment in vastly different ways than I do.

 

Looking back, I have learned a great deal about the thinking processes that drive these different world views. Essentially we have what Michael Shermer calls a Belief Engine for a brain. We are hard-wired to believe, and we make copious errors that incline us to believe – even silly things – regardless of evidence. We successfully evolved for hundreds of thousands of years in a world devoid of statistics and analysis, all the while thriving on snap judgments. Evolution itself, as a process, has inhibited our ability to accept its veracity. Stepping away from the belief engine demands a level of analysis that is foreign and often unpalatable. It is hard to be a skeptic, yet oh so easy to go with our hard-wired intuitive thinking. If you are new to my blog, look back at entries that explore erroneous thinking, rational thought, the adaptive unconscious, memory, morality, and even religion.

 

Looking forward, I plan on delving further into our enigmatic Belief Engine. I want to further explore the errors of intuition, specifically the illusion of cause, implicit associations, as well as Jonathan Haidt’s work on political affiliation. Later I hope to switch gears and delve into the unique attributes of our planet that make it hospitable for complex life.


Last week, in The Illusion of Punditry, I discussed Philip Tetlock’s work revealing the utter meaninglessness of punditry. It is important to note that although professional pundits, on average, were less accurate than random chance, a few outliers actually performed well above average. Tetlock closely examined the variables associated with the distribution of accuracy scores and discovered that experts were often blinded by their preconceptions, essentially led astray by how they think. To elucidate his point, Tetlock employed Isaiah Berlin’s famous metaphor, The Hedgehog and the Fox. Berlin, a historian of ideas, drew the title of his essay from the classical Greek poet Archilochus, who wrote: “The fox knows many things, but the hedgehog knows one big thing.”

Berlin contended that there are two types of thinkers: hedgehogs and foxes. To make sense of this metaphor, one has to understand a bit about these creatures. A hedgehog is a small spiny mammal that, when attacked, rolls into a ball with its spines protruding outward. This response is its sole defensive maneuver, its “one big thing,” employed at any indication of threat. By extension, Berlin suggested that hedgehog thinkers “… relate everything to a single central vision, one system less or more coherent or articulate, in terms of which they understand, think and feel—a single, universal, organizing principle in terms of which alone all that they are and say has significance…” The cunning fox, by contrast, survives by adapting from moment to moment, staying flexible and employing whatever survival strategy makes sense in the current situation. Fox thinkers “pursue many ends, often unrelated and even contradictory, … their thought is scattered or diffused, moving on many levels, seizing upon the essence of a vast variety of experiences and objects.”

John W. Dean, former counsel to President Richard Nixon, used Berlin’s metaphor to classify a number of US presidents as hedgehogs or foxes. In his column he wrote:

“With no fear of contradiction, Barack Obama can be described as a fox and George W. Bush as clearly a hedgehog. It is more difficult than I thought to describe all modern American presidents as either foxes or hedgehogs, but labeling FDR, JFK, and Clinton as foxes and LBJ and Reagan as hedgehogs is not likely to be contested. Less clear is how to categorize Truman, Nixon, Carter and Bush I. But Obama and Bush II are prototypical of these labels.”

Tetlock, referring to pundit accuracy scores, wrote:

“Low scorers look like hedgehogs: thinkers who “know one big thing,” aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who “do not get it,” and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery” that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.”

Tetlock was careful to point out that there was no correlation between political affiliation and either the hedgehog or the fox classification. What he did note was that the most accurate pundits were foxes, and that the key variable associated with their success was introspection. Those who studied their own decision-making processes, were open to dealing with dissonance, and were not blinded by their preconceptions were far more capable of making accurate predictions. Successful pundits were also cautious about their predictions and inclined to draw information from a wide variety of sources.

Hedgehogs, on the other hand, were prone to certainty and grand “irrefutable” ideas. They tend to boil problems down to simple grand theories or conflicts (e.g., good versus evil, socialism versus capitalism, free markets versus government regulation, and so on) and to view these big issues as the driving force of history. They are prone to oversimplify situations and to miss the many diverse forces that ultimately shape events, attributing historical change instead to single great men with simple great ideas (e.g., Ronald Reagan was responsible for the fall of the USSR, and without his leadership the Cold War might still be raging).

So what are you, a hedgehog or a fox? Both thinking approaches have strengths and weaknesses, and more and less appropriate applications. What were Copernicus, da Vinci, Galileo, Newton, Einstein, and Darwin? When do you suppose it is good to be a hedgehog, and when a fox? I suppose it comes down to the task at hand: big unifying problems such as gravity, relativity, evolution, and quantum mechanics may indeed necessitate hedgehog thinking, where single-minded determination is likely essential to persevere. Although, having read Darwin’s On the Origin of Species, I am inclined to think that Darwin was a fox. Da Vinci, too, was likely a fox, considering the vastness of his contributions. And Galileo was similarly a broad thinker. Knowing little of Newton and Einstein, I care not to speculate. It seems to me that with the specialization of science these days, one must be a hedgehog. Early science history is replete with foxes. I don’t know about you, but I have a romantic notion about the lifestyles of men like Galileo and Darwin, following their curiosities, dabbling hither and yon.

References:

Berlin, I. (1953). The Hedgehog and the Fox. The Isaiah Berlin Virtual Library. http://berlin.wolf.ox.ac.uk/published_works/rt/HF.pdf

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

Dean, J. (2009). Barack Obama Is a “Fox,” Not a “Hedgehog,” and Thus More Likely To Get It Right. http://writ.news.findlaw.com/dean/20090724.html

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.

Menand, L. (2005). Everybody’s an Expert. The New Yorker. http://www.newyorker.com/archive/2005/12/05/051205crbo_books1?printable=true

Tetlock, P.E. (2005). Expert political judgment: How good is it? How can we know? Princeton: Princeton University Press.


Have you ever wondered what makes a pundit a pundit? I mean really! Is there pundit school, or a degree in punditry? Given what I hear, I can only imagine that what would be conferred upon graduation is a B.S. of a different, more effluent sort. I mean REALLY!

I am certain that many of you have heard the rhetoric spewed by many of the talking heads on television and talk radio. This is true regardless of their alleged political ideology. And even more alarming, it seems to me, is that the more bombastic they are, the more popular they become. A pundit is supposed to be an expert – one with greater knowledge and insight than the general population – and consequently one who should possess the capacity to analyze current scenarios and draw better conclusions about the future than typical folk.

However, what we typically hear is two or more supremely confident versions of reality. You name the issue – anthropogenic global warming, health care reform, the value of free market systems – and virtually no two pundits can agree, unless of course they are political brethren.

Have you ever wondered if anyone has ever put the predictive reliability of these so-called experts to the test? Well, Philip Tetlock, a psychology professor at UC Berkeley, has done just that. In 1984 Tetlock undertook such an analysis, and his initial data were so alarming (everybody had called the future wrong with regard to the Cold War and the demise of the USSR) that he decided to embark on what would eventually become a two-decade-long quantitative analysis of, and report card on, the true predictive capabilities of professional pundits.

In 2005 Tetlock published his findings in his book, Expert political judgment: How good is it? How can we know? The results were again surprising. He analyzed predictions made by over 280 professional experts. He gave each a series of professionally relevant, real-life situations and asked them to assign probabilities to three possible outcomes (often of the form: things will stay the same, get better, or get worse). Further, Tetlock interviewed each expert to evaluate the thought processes used to draw their conclusions.

In the end, after nearly twenty years of predictions and of real life playing itself out, Tetlock was able to analyze the accuracy of over 82,000 predictions. The results were conclusive – the pundits performed worse than random chance in predicting outcomes within their supposed areas of expertise. These experts accurately predicted the future less than 33% of the time, and non-specialists did just as well. To make matters worse, the most famous pundits were the least accurate. A clear pattern emerged – confidence was highly correlated with error. Those who were most confident about their predictions were most often the least accurate, and yet the most confident, despite their inaccuracy, were in fact the most popular! Tetlock noted that they were essentially blinded by their certainty.
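
To make the scoring concrete, here is a minimal sketch in Python of how a pundit’s hit rate might be compared against the one-in-three chance baseline. The data and names here are invented for illustration – this is not Tetlock’s actual code or data, and his scoring methods were considerably more sophisticated.

```python
import random

OUTCOMES = ["get worse", "stay the same", "get better"]
CHANCE = 1 / len(OUTCOMES)  # blind guessing converges on ~33%

def accuracy(predictions, actuals):
    """Fraction of predictions that matched what actually happened."""
    hits = sum(p == a for p, a in zip(predictions, actuals))
    return hits / len(predictions)

# Invented data: 1,000 questions, a uniformly random world,
# a hedgehog pundit who always forecasts doom, and a dart-throwing chimp.
random.seed(42)
actuals = [random.choice(OUTCOMES) for _ in range(1000)]
pundit = ["get worse"] * 1000
chimp = [random.choice(OUTCOMES) for _ in range(1000)]

print(f"chance baseline: {CHANCE:.2f}")
print(f"pundit accuracy: {accuracy(pundit, actuals):.2f}")
print(f"random guessing: {accuracy(chimp, actuals):.2f}")
```

Bookkeeping of this sort, kept over thousands of questions, is what allows a claim like “worse than chance” to be checked rather than merely asserted.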

Jonah Lehrer, in How We Decide, wrote of Tetlock’s study: “When pundits were convinced that they were right, they ignored any brain areas that implied that they might be wrong. This suggests that one of the best ways to distinguish genuine from phony expertise is to look at how a person responds to dissonant data. Does he or she reject the data out of hand? Perform elaborate mental gymnastics to avoid admitting error?” He also suggested that people should “ignore those commentators that seem too confident or self-assured. The people on television who are most certain are almost certainly going to be wrong.”

You might be surprised that the vast majority of the pundits actually believed that they were engaging in objective and rational analysis when drawing their conclusions.

So, experts, rationally analyzing data, drawing conclusions with less than random chance accuracy? One has to question either their actual level of expertise or the objectivity of their analysis. Tetlock suggests that they are “prisoners of their preconceptions.”

This raises the question: Is this an error of reason or an error of intuition? Jonah Lehrer suggests that this error plays out as one cherry-picks which feelings to acknowledge and which to ignore. Lehrer noted: “Instead of trusting their gut feelings, they found ways to disregard the insights that contradicted their ideologies… Instead of encouraging the arguments inside their heads, these pundits settled on answers and then came up with reasons to justify those answers.”

Chabris and Simons, in The Invisible Gorilla, discuss why we are taken in by the pundits despite their measurable incompetence, and why they likely make the errors that they do. The bottom line is that such ubiquitous errors (made by novices and experts alike) are in fact illusions of knowledge perpetrated by intuition – and that we are suckers for confidence.

First of all, our intuitive inclination is to overgeneralize and to assume that a person’s confidence is a measure of his or her competence. Such an assumption is reasonable in situations where one personally knows the limits of the individual’s capabilities. When it comes to pundits, however, few people know the supposed expert well enough to assess whether he or she is worthy of that confidence. Regardless, people prefer and are drawn toward confidence. Our intuitive attraction to, and trust in, confidence sets us up for error. It is the illusion of confidence.

Chabris and Simons then review numerous stories and studies that “show that even scientific experts can dramatically overestimate what they know.” They demonstrate how we confuse familiarity with knowledge – and that when our knowledge is put to the test “…our depth of understanding is sufficiently shallow that we may exhaust our knowledge after just the first question. We know that there is an answer, and we feel that we know it, but until asked to produce it we seem blissfully unaware of the shortcomings in our own knowledge.” They add:

And even when we do check our knowledge, we often mislead ourselves. We focus on those snippets of information that we do possess, or can easily obtain, but ignore all of the elements that are missing, leaving us with the impression that we understand everything we need to.

So what can we safely conclude?

For certain, we should be aware of the limits of our own knowledge and remain ever vigilant and skeptical about what experts espouse (particularly if they come off as being very confident). Tetlock suggests that responsible pundits should state their predictions in measurable terms – so that they are subject to analysis – both for error correction and learning, and for accountability. Further, he discusses the importance of placing predictions within error bars denoting their probability of accuracy.

Chabris and Simons contend that only through rational, analytic thought can we overcome the illusion of knowledge. We have to stave off our intuitive inclination to trust bold, black-and-white predictions; we have to accept that complicated issues demand complicated solutions and that predicting the future is very difficult. As such, we need to get more comfortable with probabilities and more skeptical of certainties. As for the pundits – they are not worth listening to; they are almost always wrong; and all they really do is polarize the process and the nation. We need to inform one another of this – and ultimately make an active, rational choice to stop victimizing ourselves.
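
As a hypothetical illustration of what “measurable terms” might look like, here is a small Python example of probability scoring in the spirit of a Brier score. The forecasts below are invented; the point is only that a forecaster who hedges honestly is penalized far less for a miss than one who was supremely confident and wrong.

```python
def brier_score(forecast, outcome):
    """Sum of squared errors between stated probabilities and reality.

    forecast: dict mapping each possible outcome to a probability.
    outcome:  the outcome that actually occurred.
    0.0 is a perfect score; confident-and-wrong forecasts score worst.
    """
    return sum((p - (1.0 if o == outcome else 0.0)) ** 2
               for o, p in forecast.items())

# A cautious forecast versus a supremely confident one, when "get better"
# is what actually happens.
cautious = {"get worse": 0.2, "stay the same": 0.3, "get better": 0.5}
confident = {"get worse": 0.9, "stay the same": 0.1, "get better": 0.0}

print(f"cautious Brier score:  {brier_score(cautious, 'get better'):.2f}")   # 0.38
print(f"confident Brier score: {brier_score(confident, 'get better'):.2f}")  # 1.82
```

Because the penalty grows with the square of the error, bold black-and-white predictions are precisely the ones that score worst when reality disagrees – which is exactly why stating predictions in measurable terms imposes accountability.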

References:

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.

Menand, L. (2005). Everybody’s an Expert. The New Yorker. http://www.newyorker.com/archive/2005/12/05/051205crbo_books1?printable=true

Tetlock, P.E. (2005). Expert political judgment: How good is it? How can we know? Princeton: Princeton University Press.
