Surprise at Chautauqua

23 July 2010

There are moments in life when you hear something that absolutely blows you away. I experienced such a moment on July 1st at Chautauqua Institution in Western New York. It wasn’t just the words I heard that touched me so. It was the words within their context, and the embrace of those words by the people who surrounded me.

 

First you have to understand the unique setting that is Chautauqua: an amusement park for the mind. It was initially built on the shores of Chautauqua Lake in 1874 as an “educational experiment in out-of-school, vacation learning.” Although initially the courses were for Sunday school teachers, its success and popularity precipitated a broadening of the curriculum to include academic subjects, music, art and physical education. Today on their website they note that “7,500 persons are in residence on any day during a nine-week season, and a total of over 142,000 attend scheduled public events. Over 8,000 students enroll annually in the Chautauqua Summer Schools which offer courses in art, music, dance, theater, writing skills and a wide variety of special interests.”

 

For those longing for intellectual and artistic stimulation in a peaceful setting, it constitutes a veritable fantasy land adorned with quaint Victorian era cottages often fronted with beautiful and pristine landscaping. Among its many homes, inns, and entertainment facilities arranged in a cozy village-like setting are church houses where people congregate from all over the United States for religious retreats. The four pillars of Chautauqua are Art, Education, Religion, and Recreation. Needless to say, religion (particularly Christianity) is a big part of this community. But so are education and art. They have a quality symphony orchestra, a theater group, an opera company, and a dance ensemble. Nightly, they provide top-notch entertainment in the sizable amphitheater. Throughout each day, every day, there are lectures and events galore.

 

For the last three years my wife and I have ventured to Chautauqua for science-themed days, where we attended lectures by people like Donald Johanson and Carl Zimmer. NASA had a mock-up of a Mars Rover there last year. This year I was drawn by Alan Alda, who is a true science geek like me.

 

Each afternoon the Department of Religion hosts a lecture series. Although I often miss these events for a number of reasons, this year my innkeeper, knowing my proclivities, strongly recommended that I consider listening to this week’s speaker. I took her advice, and my wife and I skeptically sat in the Hall of Philosophy among an overflow crowd that I could only guess exceeded 1,000 people. The lecturer was John Shelby Spong, a Bishop in the Episcopal Church. Rabbi Samuel Stahl introduced Bishop Spong, and the Rabbi’s words drew me in, making me feel as though my mind was being read. It was a spine-tingling experience from the outset, and Spong’s words were unlike any I had ever heard from a man of God.

 

I certainly will not be able to capture and share in this medium the true essence of his message – but I will attempt to briefly summarize it. I STRONGLY encourage any person of faith as well as any person like myself who falls into the agnostic or atheist camp to listen to this lecture: Transcending Religion without Transcending God. You can sign up for a 15 Day Free Trial / Download Account and listen to this lecture online or pay $9.95 for a download to your iPod or MP3 player.

 

I’m guessing that anyone who listens to this talk with an open mind will be in some way moved by his words. I am also guessing that personal reactions will run the gamut from “this guy is a heretic” to “finally a voice of reason coming from the religious community.”  If you are likely to be among the former, Spong proclaims that he wishes to destroy no one’s faith, but boldly states that “If I can take away your God, you had very little, if you can lose it all in one hour.”

 

If you are religious, keep in mind as you consider listening that this lecture was part four of a five-part series. Spong had lectured in a similar vein for three consecutive days at the Hall of Philosophy to a pretty religious group of people, and this day’s crowd was the biggest I had ever seen gathered (excluding events at the amphitheater). This was not an angry or defensive crowd, but a thoughtful and attentive one. What Spong said deeply challenged conventional definitions of religion, but the people came back for more. And if you are a rationalist, more inclined toward science than mysticism, you will be refreshed by Spong’s embrace of science and his urging away from the traditional notions of religion that many find hard to accept. Even Richard Dawkins seems to respect Spong.

 

Spong derides religious zealots who promote racism, sexism, antisemitism, and homophobia based upon quotations from the Holy Scriptures. His rational embrace of science and the realities of human suffering (often a result of religion’s influence) have guided his journey toward a reinterpretation of the faith story. He strongly asserts that he wants nothing to do with any institution that diminishes the humanity of any child of God. He deplores how the Bible and the Church have harbored those who have relegated blacks to subhuman status, treated women as second-class citizens, and branded gay and lesbian people as essentially immoral. He explains the human experience within a context of understanding derived from biology and anthropology. He links our instinctual drive to survive to all living organisms and, with this understanding, supplants the notion of original sin. He embraces the teachings of Darwin and reinterprets salvation – not as a rescue from the fall from perfection but as a new understanding of what it is to be fully human. After all, we haven’t fallen – we have evolved.

 

Salvation, he argues, is not to be made religious. It is not to be forced into a particular creed or to follow a particular faith story. Salvation is to be made whole – to be called beyond our limits, our fears, our boundaries, and to be called into a new consciousness, a new humanity – where we can be called beyond our selfish drive to survive, and begin to truly give of our lives and our love.

 

Spong challenges both the notions of a personal God with supernatural powers and the traditional Jesus story.  He derides the traditional notion that humans are inherently depraved – and looks at our understanding of human development and asks if it is a wise parenting strategy to tell a child that he is bad, evil, and depraved in an attempt to turn that child into a healthy adult. He looks at how religion victimizes its followers and how in turn its practice facilitates hate and division.

 

Spong provides a sobering account of religion in general – particularly the prejudicial inspiration it has historically provided and the violence it has incited in the name of one’s preferred deity. Again, rather than reject science as a threat to an ideology, he embraces evidence and searches for a new spiritual transcendence of God – to fill what he describes as a God-shaped hole in every living person. His ultimate mysticism is a bit of a stretch for me – but all in all, the 80 minutes required to listen to his message is indeed time well spent. The experience itself, for me, set in the Chautauqua Institution context, was deeply moving and inspired hope that we can move away from the unnecessary corrosive derision whereby some religious zealots dumb down the masses to protect their fragile foothold or engage in promulgating the dehumanization of those who are different. It gives me hope that those who have spiritual needs unfulfilled by the wonders of the universe can find peace with God in a way that bolsters our humanity rather than in a way that divides us. Please give Spong a listen and let me know what you experience through his message.

 


In psychology there are some pretty famous studies that have penetrated popular culture. Many folks are at least familiar with Skinner’s rat box, Pavlov’s salivating dogs, Milgram’s obedience studies, Bandura’s Bobo dolls, and Harlow’s rhesus monkeys reared by wire-frame and terry-cloth surrogate mothers. In recent history, perhaps the best-known study pertains to inattentional blindness. If you have never heard of or seen the video of six college students, three in black shirts and three in white shirts, bouncing a couple of basketballs back and forth, see the following video before you proceed.

 

 

So, of course I am referring to Daniel Simons’ Invisible Gorilla study. Just about everyone I know has seen this video, and I don’t recall any of them telling me that they did see the gorilla. I didn’t, and I was absolutely flabbergasted – because I tend to be a pretty vigilant guy. This video is a graphic illustration of what Chabris and Simons (2010) refer to as the Illusion of Attention, and about 50% of those who watch the video while counting passes among white-shirted players miss the gorilla.

 

This particular illusion concerns me because I spend a fair amount of time riding a bicycle on the roads of Western New York. So why should I, or anyone who rides a bicycle or motorcycle, or anyone who drives while texting or talking on a cell phone, be concerned?

 

The cold hard truth is that we may completely miss events or stimuli that we do not expect to see. If you don’t expect to see, and therefore fail to look for, bicycles and motorcycles, you may look right at them but fail to see them. LOOKING IS NOT SEEING just as hearing is not listening. This hearing/listening analogy is dead on.  How often have you been caught hearing someone but not listening to what was actually being said?  Chabris and Simons discuss in their book, The Invisible Gorilla, a study conducted by Daniel Memmert of Heidelberg University that demonstrated (using an eye-tracker) that virtually everyone who missed the gorilla looked directly at it at some point in the video (often for a full second). Bikers are the invisible gorillas of the roadways.

 

And as for drivers, if you are distracted by a cell phone conversation or by texting, you are less likely to see unexpected events (e.g., bicycles, motorcycles, pedestrians, wildlife).

 

Most drivers who text and talk on cell phones do not have problems. In fact, most driving is uneventful – as a result, most people get away with these behaviors. However, when an unexpected event does occur, phone-using drivers struggle to see it and respond to it fluently. You are under the same illusion as everybody else who has not been in an accident. Everyone believes, until they hit or kill somebody, that they are proficient drivers even while texting or talking on the phone. And by the way, hands-free headsets make no difference. Driving while talking on a cell phone impairs you as much as alcohol does.

 

Think about driving down a road not seeing and subsequently hitting a young child on a bike. Think about having to live with killing a middle-aged couple with three kids in college who were lawfully riding down the road on a tandem bicycle. You hit the invisible gorilla. Live with that!

 

Daniel Simons, in a recently published study, also suggests that even if you are expecting an unexpected event,  it is likely that you will miss other unanticipated events. Check out The Monkey Business Illusion video even if you have seen the invisible gorilla video. Test yourself.

 

 

I have long known that I am at risk while riding my bike on the road. I have recently taken to wearing bright hi-vis attire when I ride. Doing so is completely inconsistent with my style, but I have done so in an effort to be safer. I was surprised to learn that research shows that doing so will increase your visibility for those who are looking for you – but that it will likely make no difference at all for inattentionally blind drivers. For those drivers who do not expect to see cyclists, hi-vis clothing will not likely increase the likelihood that you will be seen. Head and tail lights work on a similar level: they do increase visibility, but only for those looking for such strange sights. The best way to increase one’s safety while riding is to look like a car.

 

It is also important to note that riding in areas where there are more bikers helps too. Chabris and Simons (2010) noted a report by Peter Jacobson, a public health consultant in California, who analyzed data on accidents involving automobiles striking pedestrians or cyclists. He found that in cities where more people walked and cycled, the rate of such accidents per walker or cyclist was actually lower. More folks walking or riding bikes seems to increase the level of driver expectation for seeing such individuals – thus making one less at risk of being victimized by inattentional blindness. It was further noted that drivers who also ride bikes may actually be more aware – if only more people would get out of their cars and get back on bicycles.

 

The bottom line is that our intuition about our attention is problematic. Intuitively, we believe that we attend to, and see, what is right before us. Research and real-world data show us that this is not the case. At the very least, when driving, we need to be aware of this erroneous assumption and work diligently to avoid distractions like talking on the phone or texting. As for cyclists (motor powered or not), we must anticipate that we won’t be seen and behave accordingly. Although hi-vis clothing and lights may not aid your visibility for some drivers, they will for those who are looking out for you.

 

Chabris and Simons contend that this illusion is a by-product of modernity and the fast-paced, highly distracting world we now live in. We have evolved for millions of years by process of natural selection in a middle-sized, slow-paced world. Traveling faster than a few miles an hour is a relatively new development for our species. Today we travel in motor vehicles at breakneck speeds. On top of that we distract ourselves with cell phones, Blackberries, iPhones, iPods and GPS units. Although the consequences of these factors can be grave – in most cases we squeak by – which is a double-edged sword because it essentially reinforces the illusion and the behavior.

 

References:

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

 

Simons, D. J. (2010). Monkeying around with the gorillas in our midst: Familiarity with an inattentional-blindness task does not improve the detection of unexpected events. i-Perception, 1(1), 3–6.


Imagine yourself walking down a familiar street, approaching a stranger who is obviously lost, staring hopelessly at a map. As you saunter by, you make eye contact and signal a willingness to help. He asks you for directions. As you begin to offer your advice, you are interrupted by a construction crew carrying a large door. They walk right between you and the stranger. Now imagine that while the construction crew blocked your view of the stranger, a new and different person covertly took on the same lost role. This new stranger is wearing different clothes, is taller by three inches, and has a different build and different vocal qualities. Do you think you would notice?

 

Chabris and Simons (2010) in The Invisible Gorilla share the results of a study carried out by Dan Simons and a colleague in which they tested whether people would notice such changes in a scenario very much like the one I just described. When the scenario was described to undergraduates, 95% believed that they would certainly notice such a change (as is likely the case for you as well). Yet when this experiment was carried out in the real world, nearly 50% of the participants did not notice the switch!

 

These particularly startling data are indicative of change blindness, defined by Chabris and Simons (2010) as the failure to notice changes between what was in view moments before and what is in view currently. Essentially, we do not compare what is in view from moment to moment, and so we fail to notice changes. As a result we tend to be “blind” in many cases to pretty obvious changes. And what is equally salient is that we are unaware of this blindness. If you are like most people you said “No way I’d miss that!” Yet it is likely that about half of you would miss such changes.

 

Unconvinced? So were a group of Harvard undergraduates who had just attended a lecture that covered the above “door study” and change blindness. After the lecture, students were recruited to participate in further research. Interested students were directed to a different floor where they were greeted by an experimenter behind a counter. As the recruits proceeded to review and complete the necessary paperwork, the experimenter who greeted and instructed them regarding the paperwork ducked down behind the counter, presumably to file some papers, only to depart as a new and different experimenter took over the role. Even after being primed with the knowledge of change blindness, not one of the students noticed the swap! This was true even for some of the students who had just moments before boldly stated that they would notice such a change. We are in fact largely blind to our change blindness regardless of our confidence regarding our vigilance.

 

These results, contend Chabris and Simons, comprise conclusive evidence for the illusion of memory (the disconnect between how our memory works and how we think it works).

 

Most of us are all too aware of the failings of our short-term memory. We often forget where we put the car keys, cell phone, or sunglasses. These authors note that we are generally pretty accurate when it comes to knowing the limits of this type of memory. License plates and phone numbers have only seven digits because most of us can only hold that much data in short-term memory. However, when it comes to understanding the limits of our long-term memory we tend to hold entirely unrealistic, fallacious, and illusory expectations.

In a national survey of fifteen hundred people [Chabris and Simons] commissioned in 2009, we included several questions designed to probe how people think memory works. Nearly half (47%) of the respondents believed that “once you have experienced an event and formed a memory of it, that memory doesn’t change.” An even greater percentage (63%) believed that “human memory works like a video camera, accurately recording the events we see and hear so that we can review and inspect them later.” (Chabris & Simons, 2010, pp. 45–46).

They added:

People who agreed with both statements apparently think that memories of all our experiences are stored permanently in our brains in an immutable form, even if we can’t access them. It is impossible to disprove this belief… but most experts on human memory find it implausible that the brain would devote energy and space to storing every detail of our lives… (p. 46).

So, as it turns out, our memories of even significant life events are quite fallible. Although we perceive such memories as being vivid and clear, they are individual constructions based on what we already know, our previous experiences, and other cognitive and emotional associations that we ultimately pair with the event. “These associations help us discern what is important and to recall details about what we’ve seen. They provide ‘retrieval cues’ that make our memories more fluent. In most cases, such cues are helpful. But these associations can also lead us astray, precisely because they lead to an inflated sense of precision of memory.” (Chabris & Simons, 2010, p. 48). In other words, our memories are not exact recordings; they are instead modified and codified personal replicas that are anything but permanent.

 

I cannot do justice to the impressive and exhaustive detailing that Chabris and Simons provide in The Invisible Gorilla regarding the illusion of memory. However, suffice it to say that we give way too much credit to the accuracy of our own long-term memories and have unrealistic expectations regarding others’ recall. People recall what they expect to remember, and memories are modified over time based on malleable belief systems. Memories fade and morph over time depending on the “motives and goals of the rememberer.” (Chabris & Simons, 2010, p. 51).

“Although we believe that our memories contain precise accounts of what we see and hear, in reality these records can be remarkably scanty. What we retrieve often is filled in based on gist, inference, and other influences; it is more like an improvised riff on a familiar melody than a digital recording of an original performance. We mistakenly believe that our memories are accurate and precise, and we cannot readily separate those aspects of our memory that accurately reflect what happened from those that were introduced later.” (Chabris & Simons, 2010, pp. 62–63).

They detail, with riveting stories, continuity errors in movies, source memory errors (is it your memory or mine?), flashbulb memories, and false memories in a way that really drives home the point that our memories are not to be trusted as faithful depictions of historical fact. They raise the question: Can you trust your memory?

 

The answer: Partially, but you must be aware that your memory is not immutable. It is erroneous to assume that your memories are factual, and it is equally fallacious to presume that others’ memories are infallible. Two people witnessing the same event from the same perspective are likely to recall it differently because of their unique personal histories, capabilities, and internal cognitive associations as they store into memory the bits and pieces of the event.

 

Isn’t it amazing and scary that we give so much credit and power to eyewitness testimony in a court of law? Such power is conferred based on the pervasive and deeply held belief in the accuracy of memory – which you must know by now is an illusion. This is just another example pertaining to the illusion of justice in this country.

 

On a more personal level, next time you and your significant other get into a debate about how some past event went down, you have to know that you both are probably wrong (and right) to some degree. There is your truth, their truth, and the real truth. These can be illustrated in a Venn Diagram with three circles that from time to time have various degrees of mutual overlap. We must admit that over time the real truth is likely to become a smaller piece of the story. This necessitates that we get comfortable with the reality that we don’t possess a DVR in our brains and that we part ways with yet another illusion of the importance and power of our uniquely human intuition.

 

Reference:

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.


Last week I discussed Philip Tetlock’s work that revealed the utter meaninglessness of punditry in The Illusion of Punditry. It is important to note that although professional pundits, on average, were less accurate than random chance, a few outliers actually performed well above average. Tetlock closely examined the variables associated with the distribution of accuracy scores and discovered that experts were often blinded by their preconceptions, essentially led astray by how they think. To elucidate his point, Tetlock employed Isaiah Berlin’s famous metaphor, The Hedgehog and the Fox. Berlin, a historian, drew inspiration for the title of this essay from the classical Greek poet Archilochus, who wrote: “The fox knows many things, but the hedgehog knows one big thing.”

 

Berlin contended that there are two types of thinkers: hedgehogs and foxes. To make sense of this metaphor, one has to understand a bit about these creatures. A hedgehog is a small spiny mammal that, when attacked, rolls into a ball with its spines protruding outward. This response is its sole defensive maneuver, its “one big thing,” employed at any indication of threat. By extension, Berlin suggested that hedgehog thinkers “… relate everything to a single central vision, one system less or more coherent or articulate, in terms of which they understand, think and feel—a single, universal, organizing principle in terms of which alone all that they are and say has significance…” The cunning fox, by contrast, survives by adapting from moment to moment, staying flexible and employing whatever strategy makes sense in the current situation. Fox thinkers “pursue many ends, often unrelated and even contradictory, … their thought is scattered or diffused, moving on many levels, seizing upon the essence of a vast variety of experiences and objects.”

 

John W. Dean, former presidential counsel to Richard Nixon, used the Berlin metaphor to classify a number of US presidents as hedgehogs or foxes. In his column he wrote:

“With no fear of contradiction, Barack Obama can be described as a fox and George W. Bush as clearly a hedgehog. It is more difficult than I thought to describe all modern American presidents as either foxes or hedgehogs, but labeling FDR, JFK, and Clinton as foxes and LBJ and Reagan as hedgehogs is not likely to be contested. Less clear is how to categorize Truman, Nixon, Carter and Bush I. But Obama and Bush II are prototypical of these labels.”

 

Tetlock, referring to pundit accuracy scores, wrote:

“Low scorers look like hedgehogs: thinkers who “know one big thing,” aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who “do not get it,” and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery” that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.”

 

Tetlock was careful to point out that there was no correlation between political affiliation and either hedgehog or fox classification. But what he did note was that the most accurate pundits were foxes and that the key variable associated with their success was introspection. Those who studied their own decision-making process, who were open to dealing with dissonance, and who were not blinded by their preconceptions were far more capable of making accurate predictions. Successful pundits were also cautious about their predictions and were inclined to take information from a wide variety of sources.

 

Hedgehogs, on the other hand, were prone to certainty and grand “irrefutable” ideas. They tend to boil problems down to simple grand theories or conflicts (e.g., good versus evil, socialism versus capitalism, free markets versus government regulation, and so on) and view these big issues as being the driving force of history. They are prone to oversimplify situations and miss the many and diverse issues that ultimately shape history. They are instead more likely to attribute historical changes to single great men with simple great ideas (e.g., Ronald Reagan was responsible for the fall of the USSR, and without his leadership the Cold War might still be raging).

 

So what are you, a hedgehog or a fox? Both thinking approaches have strengths and weaknesses and more and less appropriate applications. What were Copernicus, da Vinci, Galileo, Newton, Einstein, and Darwin? When do you suppose it is good to be a hedgehog and when a fox? I suppose it comes down to the task at hand: big unifying issues such as gravity, relativity, evolution, and quantum mechanics may indeed necessitate hedgehog thinking. Here such single-minded determination is likely essential to persevere. Although, having read Darwin’s On the Origin of Species, I am inclined to think that Darwin was a fox. Da Vinci, too, was likely a fox, considering the vastness of his contributions. And Galileo was similarly a broad thinker. Knowing little of Newton and Einstein, I care not to speculate. It seems to me that with the specialization of science these days, one must be a hedgehog. Early science history is replete with foxes. I don’t know about you, but I have a romantic notion about the lifestyles of men like Galileo and Darwin, following their curiosities, dabbling hither and yon.

 

References:

Berlin, I. (1953). The Hedgehog and the Fox. The Isaiah Berlin Virtual Library. http://berlin.wolf.ox.ac.uk/published_works/rt/HF.pdf

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

Dean, J. (2009). Barack Obama Is a “Fox,” Not a “Hedgehog,” and Thus More Likely To Get It Right. http://writ.news.findlaw.com/dean/20090724.html

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.

Menand, L. (2005). Everybody’s an Expert. The New Yorker. http://www.newyorker.com/archive/2005/12/05/051205crbo_books1?printable=true

Tetlock, P.E. (2005). Expert political judgment: How good is it? How can we know? Princeton: Princeton University Press.


Have you ever wondered what makes a pundit a pundit? I mean really! Is there pundit school, or a degree in punditry? Given what I hear, I can only imagine that what would be conferred upon graduation is a B.S. of a different, more effluent sort. I mean REALLY!

 

I am certain that many of you have heard the rhetoric spewed by many of the talking heads on television and talk radio. This is true regardless of their alleged political ideology. And even more alarming, it seems to me, is that the more bombastic they are, the more popular they are. A pundit is supposed to be an expert – one with greater knowledge and insight than the general population – and consequently one who should possess the capacity to analyze current scenarios and draw better conclusions about the future than typical folk.

 

However, what we typically hear is two or more supremely confident versions of reality. Name the issue – anthropogenic global warming, health care reform, the value of free market systems – and virtually no two pundits can agree, unless of course they are political brethren.

 

Have you ever wondered if anyone has ever put the predictive reliability of these so-called experts to the test? Well, Philip Tetlock, a psychology professor at UC Berkeley, has done just that. In 1984 Tetlock undertook such an analysis, and his initial data were so alarming (everybody had called the future wrong with regard to the Cold War and the demise of the USSR) that he decided to embark on what eventually became a two-decade-long quantitative analysis of, and report card on, the true predictive capabilities of professional pundits.

 

In 2005 Tetlock published his findings in his book, Expert Political Judgment: How Good Is It? How Can We Know? The results were again surprising. He analyzed the predictions made by over 280 professional experts. He gave each a series of professionally relevant real-life situations and asked them to make probability predictions across three possible outcomes (often of the form: things will stay the same, get better, or get worse). Further, Tetlock interviewed each expert to evaluate the thought processes used to draw their conclusions.

 

In the end, after nearly twenty years of predictions and real life playing itself out, Tetlock was able to analyze the accuracy of over 82,000 predictions. And the results were conclusive – the pundits performed worse than random chance in predicting outcomes within their supposed areas of expertise. These experts accurately predicted the future less than 33% of the time, and non-specialists did just as well. To make matters worse, the most famous pundits were the least accurate. A clear pattern emerged – confidence in one’s predictions was highly correlated with error. Those who were most confident about their predictions were most often the least accurate. He noted that the most confident, despite their inaccuracy, were in fact the most popular! Tetlock noted that they were essentially blinded by their certainty.
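
To make that “worse than chance” benchmark concrete: with three mutually exclusive outcomes, a forecaster who simply guesses at random should be right about a third of the time. The minimal sketch below is my own toy simulation in Python – not Tetlock’s actual scoring methodology, which relied on calibration and discrimination measures – and the outcome labels and question count are illustrative assumptions.

```python
import random

def random_guess_accuracy(num_questions: int = 100_000, seed: int = 42) -> float:
    """Simulate a forecaster who picks one of three outcomes uniformly at random.

    The 'true' outcome is also drawn at random purely for illustration; real-world
    base rates differ, but a uniform guesser's expected hit rate is still ~1/3.
    """
    rng = random.Random(seed)
    outcomes = ("get worse", "stay the same", "get better")
    hits = 0
    for _ in range(num_questions):
        truth = rng.choice(outcomes)   # what actually happens
        guess = rng.choice(outcomes)   # the blind guess
        hits += (guess == truth)
    return hits / num_questions

if __name__ == "__main__":
    print(f"Random-chance accuracy over three outcomes: {random_guess_accuracy():.1%}")
    # Prints roughly 33% - the bar the experts, on average, failed to clear.
```

A forecaster whose hit rate sits persistently below that line is adding negative value relative to blind guessing, which is exactly what the sub-33% figure above implies.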

 

Jonah Lehrer, in How We Decide, wrote of Tetlock’s study: “When pundits were convinced that they were right, they ignored any brain areas that implied that they might be wrong. This suggests that one of the best ways to distinguish genuine from phony expertise is to look at how a person responds to dissonant data. Does he or she reject the data out of hand? Perform elaborate mental gymnastics to avoid admitting error?” He also suggested that people should “ignore those commentators that seem too confident or self-assured. The people on television who are most certain are almost certainly going to be wrong.”

 

You might be surprised that the vast majority of the pundits actually believed that they were engaging in objective and rational analysis when drawing their conclusions.

 

So, experts, rationally analyzing data, drawing conclusions with less than random chance accuracy? One has to question either their actual level of expertise or the objectivity of their analysis. Tetlock suggests that they are “prisoners of their preconceptions.”

 

This raises the question: Is this an error of reason or an error of intuition? Jonah Lehrer suggests that this error actually plays out as one cherry-picks which feelings to acknowledge and which to ignore. Lehrer noted: “Instead of trusting their gut feelings, they found ways to disregard the insights that contradicted their ideologies… Instead of encouraging the arguments inside their heads, these pundits settled on answers and then came up with reasons to justify those answers.”

 

Chabris and Simons in The Invisible Gorilla discuss why we are taken in by the pundits despite their measurable incompetence, and why they likely make the errors that they do. The bottom line is that such ubiquitous errors (made by novices and experts alike) are in fact illusions of knowledge fostered by intuition – and, further, that we are suckers for confidence.

 

First of all, our intuitive inclination is to overly generalize and assume that one’s confidence is a measure of one’s competence. Such an assumption is appropriate in situations where one personally knows the limits of the individual’s capabilities. When it comes to pundits, few people know the supposed expert well enough to accurately assess whether he or she is worthy of their confidence. Regardless, people prefer and are drawn toward confidence. Our intuitive attraction to, and trust in, confidence sets us up for error. It is the illusion of confidence.

 

Chabris and Simons then review numerous stories and studies that “show that even scientific experts can dramatically overestimate what they know.” They demonstrate how we confuse familiarity with knowledge – and that when our knowledge is put to the test “…our depth of understanding is sufficiently shallow that we may exhaust our knowledge after just the first question. We know that there is an answer, and we feel that we know it, but until asked to produce it we seem blissfully unaware of the shortcomings in our own knowledge.” They add:

And even when we do check our knowledge, we often mislead ourselves. We focus on those snippets of information that we do possess, or can easily obtain, but ignore all of the elements that are missing, leaving us with the impression that we understand everything we need to.

 

So what can we safely conclude?

 

For certain, we should be aware of the limits of our own knowledge and be ever vigilant, remaining skeptical about what experts espouse (particularly if they come off as being very confident). Tetlock suggests that responsible pundits should state their predictions in measurable terms – so that they are subject to analysis – both for error correction/learning and for accountability purposes. Further, he discusses the importance of placing predictions within error bars denoting the probability of accuracy. Chabris and Simons contend that only through rational analytic thought can we overcome the illusion of knowledge. We have to stave off our intuitive inclination to trust bold, black-and-white predictions; we have to accept that complicated issues demand complicated solutions and that predicting the future is very difficult. As such, we need to get more comfortable with probabilities and become more skeptical of certainties. As for the pundits – they are not worth listening to – they are almost always wrong – and all they really do is polarize the process and the nation. We need to inform one another of this – and ultimately make an active, rational choice to stop victimizing ourselves.

 

References:

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.

Menand, L. (2005). Everybody’s an Expert. The New Yorker. http://www.newyorker.com/archive/2005/12/05/051205crbo_books1?printable=true

Tetlock, P.E. (2005). Expert political judgment: How good is it? How can we know? Princeton: Princeton University Press.


Over the last couple of months I have submitted posts proclaiming the potency of intuition. One of my major resources has been Malcolm Gladwell’s Blink: The Power of Thinking Without Thinking. Among Gladwell’s tenets, the most prominent was the power of intuition and its relative supremacy, in certain situations, over rational thought. I have also heavily referenced Jonah Lehrer’s How We Decide. Lehrer argues that there is not, in fact, a Platonic dichotomy that establishes rationality in a supreme and distinct role over intuition. Instead, he suggests that emotion plays a key role in making decisions, much more so than has historically been acknowledged. Lehrer, however, uses more scientific scrutiny and relies more heavily on research than does Gladwell.

 

Currently I am reading The Invisible Gorilla by Daniel J. Simons and Christopher F. Chabris. These cognitive psychologists are best known for their Invisible Gorilla study illustrating selective attention. These authors appear to be on a mission to resurrect rational thought by highlighting the inherent weaknesses of intuition. Gladwell in particular comes under scrutiny by these authors for his alleged glorification of rapid cognition.

 

Not only have Gladwell’s hypotheses come under attack, so too has his journalistic approach. Simons and Chabris efficiently deconstruct a couple of Gladwell’s anecdotes as examples of illusions manifested by intuition. Contrary to the message of Blink, Simons and Chabris contend that intuition is inherently problematic, and they detail automatic illusions that spring forth from the adaptive unconscious.

 

Anecdotal evidence is inherently flawed yet amazingly compelling. Gladwell, they acknowledge, is a master story teller, and he uses this talent to effectively support his contentions. They argue, however, that he falls prey to the very illusions of intuition that he is ultimately celebrating.

 

Jonah Lehrer seems to escape Simons’ and Chabris’ scrutiny – yet this may simply be an artifact of release date. How We Decide was released in 2009, while Gladwell’s Blink was released in 2005. Whereas Blink appears on the surface to be a celebration of intuition, Lehrer instead puts a microscope on the brain and the interplay of reason and emotion. He identifies the regions in the brain thought to be involved in these functions and highlights the research that systematically debunks the notion of reason and emotion being distinct epic foes battling it out for supremacy. Lehrer does not seem to celebrate the relative power of intuition over reason, but instead makes it clear that emotion, acting as a messenger of intuition, actually plays a crucial role in reason itself.

 

Rarely are parts in complex systems clearly distinct. Dividing brain function into dichotomous terms like reason and intuition is just another example of a flawed human inclination to pigeonhole nature or make issues black and white. Although Gladwell puts a more positive spin on intuition than has historically been the case, he also makes an effort to identify at least some of its shortcomings. Lehrer brings into focus the complexity and interconnectedness of the system and dispels the traditional dichotomy. Simons and Chabris scientifically scrutinize the Gladwellian notion of the supremacy of intuition. Their skeptical message lacks the sex appeal of thinking without thinking, but it is very important just the same. I look forward to detailing parts of The Invisible Gorilla in the weeks to come.

 

References:

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

 

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. New York: Little, Brown and Company.

 

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.


Believe it or not, free will, to a large extent, is an illusion. For the most part, what you do as you go through your day is based on decisions made outside of your conscious awareness. Many of these decisions involve a complicated and largely unconscious interplay among various brain regions that each struggle for control of your behavior.

 

One has to be careful to avoid anthropomorphic tendencies when trying to understand this epic struggle. It is not as though there are specific Freudian (Id, Ego, Superego) forces at play, each with a specific and unique mission. In reality it is more like chemical warfare going on in your brain – neurotransmitters are released by the relevant brain centers based on current environmental circumstances (what your senses perceive in the world), your previous experiences in similar circumstances, and your treasure trove of knowledge. The subsequent emotions triggered by those neurotransmitters are then weighed in the orbitofrontal cortex (OFC) in what is essentially a tug-of-war involving varying measures of reinforcement and punishment.

 

Most of us are unaware of this neurological process and are under the illusion that we go through life making rational, reason-based decisions. Although we may live within this illusion, the people who lay out supercenter floor plans or produce advertisements know the truth. This discrepancy in knowledge makes you vulnerable. They use their knowledge of how the brain works in a manipulative and concerted effort to help you part ways with your hard-earned money. It is not really a conspiracy; it is just an effort to gain a competitive advantage. It’s business.

 

Following is an abbreviated explanation of the brain systems in play, and then an exposé of how marketers use our brains against us. This information is drawn from Jonah Lehrer’s excellent book entitled How We Decide.

 

First there is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most salient roles is played out as a result of activation of the nucleus accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine, and as a result we experience pleasure. Desire for an item activates the NAcc. Being in the presence of the desired item activates it further. The greater the arousal of the NAcc, the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you eat a piece of chocolate cake, or listen to your favorite song, or watch your sports team win an exciting game (Lehrer, 2009).

 

Then there is the insula – a brain region that produces, among other sensations, aversive feelings. In a New York Times article on the insula, Sandra Blakeslee (2007) noted that this center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in social settings, and decide not to buy an item. In many cases we avoid exciting the insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money.

 

Superstores are designed to excite your NAcc and quiet the insula. You can’t help but notice when you walk into a Target, Walmart, Lowes, or even Pier 1 Imports just how much stuff is there – most of which you do not possess. Just by entering the store you have aroused your NAcc and the associated cravings. Lehrer (2009) notes:

“Just look at the interior of a Costco warehouse. It’s no accident that the most coveted items are put in the most prominent places. A row of high-definition televisions lines the entrance. The fancy jewelry, Rolex watches, iPods, and other luxury items are conspicuously placed along the corridors with the heaviest foot traffic. And then there are the free samples of food, liberally distributed throughout the store. The goal of a Costco is to constantly prime the pleasure centers of the brain, to keep us lusting after things we don’t need. Even though you probably won’t buy the Rolex, just looking at the fancy watch makes you more likely to buy something else, since the desired item activates the NAcc. You have been conditioned to crave a reward.”

He further noted:

“But exciting the NAcc is not enough; retailers must also inhibit the insula. This brain area is responsible for making sure you don’t get ripped off, and when it’s repeatedly assured by retail stores that low prices are “guaranteed,” or that a certain item is on sale, or that it’s getting the “wholesale price,” the insula stops worrying so much about the price tag.  In fact, researchers have found that when a store puts a promotional sticker next to a price tag – something like “Bargain Buy!” or “Hot Deal!” – but doesn’t actually reduce the price, sales of that item still dramatically increase.  The retail tactics lull the brain into buying more things, since the insula is pacified.  We go broke convinced that we are saving money.”

I hypothesize that the frequently redundant catalogs that routinely fill our mailboxes from retailers like LLBean and Lands End work on our brains much like supercenters do. They excite the NAcc with idealized images modeled by perfect, pretty people. They pacify the insula by noting improved features, sales, and deep discounts on closeouts. The necessary use of credit cards, Lehrer (2009) notes, has an additional inhibitory effect on the insula. When the insula is calm and you are primed with dopamine, the pleasure center has a disproportionate amount of control. You may think you have complete rational control over this – but all of this takes place outside of your direct awareness and plays out as feelings that guide your behavior. I further hypothesize that online retail stores work in a similar way (although for some, the insula may be aroused by security concerns about using a credit card online). Regardless, substantial marketing attempts by companies like EMS, REI, Victoria’s Secret, LLBean, and Bath & Body Works fill my inbox, always hoping to draw in my NAcc, pacify my insula, and subsequently open my wallet. You have to guess that the amount of money devoted to catalogs and internet marketing pays off for these companies or they wouldn’t do it.

 

Being aware of one’s neurology and how we are manipulated may help us mitigate these unconscious forces and thus make better decisions. I myself try to avoid malls and stores like Target because of the feelings they create in me. And for this very reason, I’ve stopped routinely looking at catalogs. I try to shop based only on need – not want. I’m making progress – but it is hard – these patterns have been in place and reinforced for a long time.

 

References

 

Blakeslee, S. (2007). A Small Part of the Brain, and Its Profound Effects. The New York Times. http://www.nytimes.com/2007/02/06/health/psychology/06brain.html?emc=eta1&pagewanted=all

 

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.


For nearly as long as humans have been thinking about thinking, one of the most intriguing issues has been the interplay of reason and emotion. For the greatest thinkers throughout recorded history, reason has reigned supreme. The traditional paradigm has been one of a dichotomy in which refined and uniquely human REASON wages an ongoing battle for control over animalistic and lustful EMOTIONS. It has been argued by the likes of Plato, Descartes, Kant, and even Thomas Jefferson that reason is the means to enlightenment and that emotion is the sure road to human suffering (Lehrer, 2009).

 

This Platonic dichotomy remains a pillar of Western thought (Lehrer, 2009). Suppressing your urges is a matter of will – recall the mantras “Just say no!” or “Just do it!” My guess is that most people today continue to think of the brain in these terms. Until recently, even the cognitive sciences reinforced this notion. Only through very recent advances in the tools used to study the brain (e.g., fMRI) and other ingenious studies (e.g., Damasio’s Iowa Gambling Task) has any evidence been generated to place this traditional paradigm in doubt. As it turns out, emotion plays a very crucial role in decision making. Without it, our ability to reason effectively is seriously compromised. I have long believed that feelings and emotions should be under the control of our evolutionary gift – the frontal cortex. Reason, after all, is what sets us apart from the other animals. Instead, we have learned that these forces are NOT foes but essentially collaborative and completely interdependent.

 

The implications of this recent knowledge certainly do not suggest that it is fruitless to employ our reason and critical thinking capabilities as we venture through life. Reason is crucial and it does set us apart from other life forms that lack such fully developed frontal cortices. This part of the outdated concept is correct. However, we are wrong to suppose that emotion with regard to decision making lacks value or that it is a villainous force.

 

Jonah Lehrer, in his book, How We Decide discusses this very issue and notes that: “The crucial importance of our emotions – the fact that we can’t make decisions without them – contradicts the conventional view of human nature, with its ancient philosophical roots.” He further notes:

 

“The expansion of the frontal cortex during human evolution did not turn us into purely rational creatures, able to ignore our impulses. In fact, neuroscience now knows that the opposite is true: a significant part of our frontal cortex is involved with emotion. David Hume, the eighteenth-century Scottish philosopher who delighted in heretical ideas, was right when he declared that reason was ‘the slave of the passions.’”

 

So how does this work? How do emotion and critical thinking join forces? Neuroscientists now know that the orbitofrontal cortex (OFC) is the brain center where this interplay takes place. Located in the lower frontal cortex (the area just above and behind your eyes), your OFC integrates a multitude of information from various brain regions along with visceral emotions in an attempt to facilitate adaptive decision making. Current neuroimaging evidence suggests that the OFC is involved in monitoring and learning, as well as in memorizing the potency of both reinforcers and punishers. It operates within your adaptive unconscious – analyzing the available options and communicating its decisions by creating emotions that are meant to guide your behavior.
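
As a rough illustration of what “memorizing the potency of reinforcers and punishers” and then letting the result steer a choice could look like in computational terms, here is a minimal toy sketch in Python. It is my own hypothetical illustration, not a model drawn from Lehrer or from the neuroscience literature: a running value estimate for each option is nudged up by rewarding outcomes and down by punishing ones, and the option with the highest learned value wins the next choice.

```python
import random

class ToyValueLearner:
    """Toy value learning: not a model of the OFC, just an illustration of a
    running estimate that rewards push up and punishments push down."""

    def __init__(self, options, learning_rate=0.1, seed=0):
        self.values = {opt: 0.0 for opt in options}  # learned "gut feeling" per option
        self.learning_rate = learning_rate
        self.rng = random.Random(seed)

    def update(self, option, outcome):
        # Nudge the stored value toward the experienced outcome
        # (positive = reinforcing, negative = punishing).
        error = outcome - self.values[option]
        self.values[option] += self.learning_rate * error

    def choose(self):
        # Favor the option with the highest learned value; break ties randomly.
        best = max(self.values.values())
        return self.rng.choice([o for o, v in self.values.items() if v == best])

# Hypothetical example: an impulse buy feels good at first, then brings remorse.
learner = ToyValueLearner(["impulse buy", "walk away"])
for option, outcome in [("impulse buy", 1.0), ("impulse buy", -2.0), ("walk away", 0.5)]:
    learner.update(option, outcome)
print(learner.values, "->", learner.choose())  # the learner now leans toward "walk away"
```

The point of the sketch is only that past reinforcement and punishment can be compressed into a feeling-like value that biases the next choice without any explicit deliberation.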

 

Next time you are faced with a decision, and you experience an associated emotion – it is the result of your OFC’s attempt to tell you what to do. Such feelings actually guide most of our decisions.

 

Most animals lack an OFC, and in our primate cousins this cortical area is much smaller. As a result, these other organisms lack the capacity to use emotions to guide their decisions. Lehrer notes: “From the perspective of the human brain, Homo sapiens is the most emotional animal of all.”

 

I am struck by the reality that natural selection has hit upon this opaque approach to guide behavior. This just reinforces the notion that evolution is not goal-directed. Had evolution been goal-directed, or had we been intelligently designed, don’t you suppose a more direct or more obviously rational process would have been devised? The reality of the OFC even draws into question the notion of free will – which is a topic all its own.

 

This largely adaptive brain system of course has drawbacks and limitations – many of which I have previously discussed (e.g., implicit associations, cognitive conservatism, attribution error, cognitive biases, essentialism, pareidolia). This is true, in part, because these newer and “higher” brain functions are relatively recent evolutionary developments and the kinks have yet to be worked out (Lehrer, 2009). I also believe that perhaps the complexities and diversions of modernity exceed our neural specifications. Perhaps in time natural selection will take us in a different direction, but none of us will ever see this. Regardless, by learning about how our brains work, we certainly can take an active role in shaping how we think. How do you think?

 

References:

 

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. New York: Little, Brown and Company.

 

Lehrer, J. (2009). How We Decide. New York: Houghton Mifflin Harcourt.


Recently, Fox News aired a story posing the question of whether Fred Rogers was evil. Why, you may ask, would anyone use the word evil in reference to such a gentle man? They were suggesting that his “you’re special” message fostered unwarranted self-esteem and in effect ruined an entire generation of children. This accusation inspired a fair amount of discourse, which in some cases boiled down to the question of why children today have such hollow, needy shells. An example of the discourse on this topic can be seen at Bruce Hood’s blog in an article entitled Mr. Rogers is Evil According to Fox News.

 

The consensus among skeptics was that Mr. Rogers was not, in fact, evil and that he is not responsible for the younger generation’s need for copious praise and attention for relatively meaningless contributions. There was almost universal acknowledgment of the problem, however, and discussions led to troubling issues such as grade inflation at schools and universities and poor performance in the workplace. An intriguing article by Carol Mithers in the Ladies Home Journal entitled Work Place Wars addresses the workplace implications of this phenomenon. Mithers notes:

“… the Millennials — at a whopping 83 million, the biggest generation of all … are technokids, glued to their cell phones, laptops, and iPods. They’ve grown up in a world with few boundaries and think nothing of forming virtual friendships through the Internet or disclosing intimate details about themselves on social networking sites. And, many critics charge, they’ve been so coddled and overpraised by hovering parents that they enter the job market convinced of their own importance. Crane calls them the T-ball Generation for the childhood sport where ‘no one fails, everyone on the team’s assured a hit, and every kid gets a trophy, just for showing up.’”

 

Workers of this generation are known for their optimism and energy — but also their demands: “They want feedback, flexibility, fun, the chance to do meaningful work right away and a ‘customized’ career that allows them to slow down or speed up to match the different phases of life,” says Ron Alsop, author of The Trophy Kids Grow Up: How the Millennial Generation Is Shaking Up the Workplace.

I find it ironic that the very people today who struggle with the behavior of the Millennials are the ones who shaped the behaviors of concern. I personally have struggled with the rampant misapplication of praise, attention, and the provision of reinforcement for meaningless achievements. I have seen this everywhere – in homes, schools, youth athletic clubs, you name it. It has been the most recent parenting zeitgeist. But where did this philosophy come from?

 

Throughout my doctoral training in psychology (late ’80s and early ’90s) I learned that reinforcement is a powerful tool, but it was clear to me that it has to be applied following behaviors you WANT to increase. Nowhere in my studies did I read of the importance of raising children through the application of copious amounts of reinforcement just to bolster their self-esteem. I am aware of no evidence-based teachings that suggest this approach. However, given the near universal application of these practices, it must have come from somewhere. This very question, I’m sure, led to the placement of responsibility squarely on the shoulders of poor Mr. Rogers.

 

Although the source of this approach remains a mystery to me, Dr. Carol Dweck’s work clarifies how it plays out. In an interview with Highlights, Dr. Dweck discusses Developing a Growth Mindset.  Dr. Dweck has identified two basic mindsets that profoundly shape the thinking and behavior we as adults exhibit and foster in our children.  She refers to these as the Fixed Mindset and the Growth Mindset. People with a Fixed Mindset, Dr. Dweck notes in the Highlights article, “believe that their achievements are based on innate abilities. As a result, they are reluctant to take on challenges.” Dweck further notes that “People with Growth Mindsets believe that they can learn, change, and develop needed skills.  They are better equipped to handle inevitable setbacks, and know that hard work can help them accomplish their goals.” In the same article, she suggests that we should think twice about praising kids for being “smart” or “talented,” since this may foster a Fixed Mindset. Instead, if we encourage our kids’ efforts, acknowledging their persistence and hard work, we will support their development of a Growth Mindset – better equipping them to learn, persist, and pick themselves up when things don’t go their way.

 

Dweck’s conclusions are based on extensive research that clearly supports this notion. Jonah Lehrer, in his book How We Decide, discusses the relevance of Dweck’s most famous study. The study involved more than 400 fifth-grade students in New York City, each of whom was individually given a set of relatively simple non-verbal puzzles. Upon completing the puzzles, the students received one of two one-sentence praise statements. Half of the participants were praised for their innate intelligence (e.g., “You must be smart at this.”).  The other half were praised for their effort (e.g., “You must have worked really hard.”).

 

All participants were then given a choice between two subsequent tasks – one described as a more challenging set of puzzles (paired with the assurance that they would learn a lot from attempting them) and a set of easier puzzles like the ones they had just completed.  In summarizing Dweck’s results, Lehrer noted, “Of the group of kids that had been praised for their efforts, 90 percent chose the harder set of puzzles. However, of the kids that were praised for their intelligence, most went for the easier test.”  Dweck concludes that praise focused on intelligence encourages risk avoidance. The “smart” children do not want to risk having their innate intelligence come under suspicion.  It is better to take the safe route and maintain the perception and feeling of being smart.

 

Dweck went on to demonstrate how this fear of failure can inhibit learning.  The same participants were then given a third set of puzzles that were intentionally very difficult, in order to see how the children would respond to the challenge.   Those who had been praised for their effort on the initial puzzles worked diligently on the very difficult puzzles, and many of them remarked on how much they enjoyed the challenge. The children who had been praised for their intelligence were easily discouraged and quickly gave up.  Their innate intelligence had been challenged – perhaps they were not so smart after all.  All subjects were then given a final round of testing, a set of puzzles comparable in difficulty to the first relatively simple set. Those participants praised for their effort showed marked improvements in their performance; on average their scores improved by 30 percentage points.   Those who were praised for their intelligence, the very children who had just had their confidence shaken by the very difficult puzzles, on average scored 20 percentage points lower than they had on the first set.  Lehrer noted, in reference to the participants praised for their effort, that “Because these kids were willing to challenge themselves, even if it meant failing at first, they ended up performing at a much higher level.” With regard to the participants praised for intelligence, Lehrer writes, “The experience of failure had been so discouraging for the ‘smart’ kids that they actually regressed.”

 

In the Highlights interview Dweck suggests:

“It’s a mistake to think that when children are not challenged they feel unconditionally loved. When you give children easy tasks and praise them to the skies for their success, they come to think that your love and respect depend on their doing things quickly and easily. They become afraid to do hard things and make mistakes, lest they lose your love and respect. When children know you value challenges, effort, mistakes, and learning, they won’t worry about disappointing you if they don’t do something well right away.”

She further notes:

“The biggest surprise has been learning the extent of the problem—how fragile and frightened children and young adults are today (while often acting knowing and entitled). I watched as so many of our Winter Olympics athletes folded after a setback. Coaches have complained to me that many of their athletes can’t take constructive feedback without experiencing it as a blow to their self-esteem. I have read in the news, story after story, how young workers can hardly get through the day without constant praise and perhaps an award. I see in my own students the fear of participating in class and making a mistake or looking foolish. Parents and educators tried to give these kids self-esteem on a silver platter, but instead seem to have created a generation of very vulnerable people.”

So, we have an improved understanding of what has happened – but not necessarily of how the thinking that drives such parenting behavior came to be. Regardless, it is what it is, and all we can do is change our future behavior. Here are some cogent words of advice from Dr. Dweck (again from the Highlights article):

  1. “Parents can also show children that they value learning and improvement, not just quick, perfect performance. When children do something quickly and perfectly or get an easy A in school, parents should not tell the children how great they are. Otherwise, the children will equate being smart with quick and easy success, and they will become afraid of challenges. Parents should, whenever possible, show pleasure over their children’s learning and improvement.”
  2. “Parents should not shield their children from challenges, mistakes, and struggles. Instead, parents should teach children to love challenges. They can say things like ‘This is hard. What fun!’ or ‘This is too easy. It’s no fun.’ They should teach their children to embrace mistakes: ‘Oooh, here’s an interesting mistake. What should we do next?’ And they should teach them to love effort: ‘That was a fantastic struggle. You really stuck to it and made great progress’ or ‘This will take a lot of effort—boy, will it be fun.’”
  3. “Finally, parents must stop praising their children’s intelligence. My research has shown that, far from boosting children’s self-esteem, it makes them more fragile and can undermine their motivation and learning. Praising children’s intelligence puts them in a fixed mindset, makes them afraid of making mistakes, and makes them lose their confidence when something is hard for them. Instead, parents should praise the process—their children’s effort, strategy, perseverance, or improvement. Then the children will be willing to take on challenges and will know how to stick with things—even the hard ones.”

 

References

 

Dweck, C. Developing a Growth Mindset. HighlightsParents.com interview. http://www.highlightsparents.com/parenting_perspectives/interview_with_dr_carol_dweckdeveloping_a_growth_mindset.html

 

Hood, B. Mr Rogers is Evil According to Fox News. http://brucemhood.wordpress.com/2010/05/03/mr-rogers-is-evil-according-to-fox-news/

 

Lehrer, J. 2009.  How We Decide. Houghton Mifflin Harcourt: New York.

 

Mithers, C. Workplace Wars. Ladies’ Home Journal. http://www.lhj.com/relationships/work/worklife-balance/generation-gaps-at-work/


The capabilities of our adaptive unconscious are really quite amazing. In an earlier post, entitled Intuitive Thought, I covered the general strengths of this silent supercomputer running outside of our awareness. It has long been believed that rational thought, the application of logic and reason rather than intuition, is the key to a successful life. One wonders, given the recent revelations about the importance of emotion and intuition, how reasoning capabilities would fare in a head-to-head (pun intended) competition with emotion.

 

Believe it or not, a research team from the University of Iowa devised a rather ingenious way of holding such a competition. In 1994, neuroscientists Antonio Damasio, Antoine Bechara, Daniel Tranel, and Steven Anderson developed the Iowa Gambling Task (IGT) to facilitate the identification of decision-making errors in individuals with prefrontal cortex damage. Both Malcolm Gladwell (Blink) and Jonah Lehrer (How We Decide) highlight this study in their books on how we think. The IGT website describes it as “a computerized experiment that is carried out in real time and resembles real-world contingencies. The task allows participants to select cards from four decks displayed on-screen. Participants are instructed that the selection of each card will result in winning or losing money. The objective is to attempt to win as much money as possible.” Sounds straightforward – although there is a catch. The participants are not aware that the decks are rigged: two decks consistently offer modest payouts ($50) and only rare penalties. These are the “good decks.” The two other decks, the “bad decks,” provide bigger payouts ($100) but also devastating penalties (as much as $1,250). Playing the good decks is a slow but sure road to substantial winnings. The bad decks lead to disaster.
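For readers who like to see the contingencies spelled out, here is a minimal Python sketch of an IGT-style setup. The reward amounts, penalty sizes, and penalty frequencies below are simplified assumptions I chose for illustration; they are not the precise schedules used in the original task.

import random

# Illustrative (assumed, not exact) payoff schedules for an IGT-style task:
# "good" decks pay modest rewards with rare, small penalties;
# "bad" decks pay larger rewards but carry occasional devastating penalties.
DECKS = {
    "A (bad)":  {"reward": 100, "penalty": 1250, "penalty_prob": 0.10},
    "B (bad)":  {"reward": 100, "penalty": 1250, "penalty_prob": 0.10},
    "C (good)": {"reward": 50,  "penalty": 50,   "penalty_prob": 0.10},
    "D (good)": {"reward": 50,  "penalty": 50,   "penalty_prob": 0.10},
}

def draw(deck_name):
    """Net payoff of a single card drawn from the named deck."""
    deck = DECKS[deck_name]
    payoff = deck["reward"]
    if random.random() < deck["penalty_prob"]:
        payoff -= deck["penalty"]
    return payoff

def expected_value(deck_name):
    """Long-run average payoff per card for a deck."""
    deck = DECKS[deck_name]
    return deck["reward"] - deck["penalty_prob"] * deck["penalty"]

for name in DECKS:
    print(name, "expected value per card:", expected_value(name))

# A quick simulated run: 20 draws from a bad deck versus 20 from a good deck.
print("20 draws from A (bad): ", sum(draw("A (bad)") for _ in range(20)))
print("20 draws from C (good):", sum(draw("C (good)") for _ in range(20)))

With these assumed numbers, each card from a good deck is worth about +$45 on average, while each card from a bad deck loses about $25 on average – exactly the slow-but-sure versus flashy-but-ruinous contrast described above.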

 

As participants began selecting cards, they tended to draw from all four decks more or less at random. As card selection proceeded and the consequences of their choices accumulated, it took the typical participant about 50 cards before he or she started drawing exclusively from the “good decks.”  Around that point, most participants had developed a hunch that there were deck-specific patterns in the rewards and penalties, and they began responding to those patterns. But it took, on average, about 80 cards before the typical subject could explain why they favored the good decks – 80 draws before most people could conclude, rationally and logically, that there were good and bad decks.

 

In their original study, Damasio and his colleagues were interested in the emotional responses the subjects had to the task.  Participants were hooked up to a machine that monitored their stress response (nervousness and anxiety) associated with each and every card selection.   What they discovered was that subjects responded emotionally to the bad decks long before they changed their behavior or developed any rational understanding of the card distribution. On average, subjects exhibited a stress response to the bad decks after only about ten draws – a full 40 draws before their behavior changed and 70 draws before they could articulate the reason for avoiding the bad decks. Lehrer noted that “Although the subject still had little inkling of which card piles were the most lucrative, his emotions had developed an accurate sense of fear. The emotions knew which decks were dangerous. The subject’s feelings figured out the game first.”

 

On the IGT, neurotypical individuals almost always came out well ahead financially. Ultimately, the emotions they experienced in association with draws from the various decks clued them in to the correct pattern of responding. However, individuals who were incapable of experiencing any emotional response – typically due to damaged orbitofrontal cortices – proved incapable of identifying the patterns and often went bankrupt. As it turns out, our emotional responses serve a very crucial role in good decision making – much more so than reason and logic. Again from Lehrer: “When the mind is denied the emotional sting of losing, it never figures out how to win.” The adaptive unconscious and the underlying emotional capacity of the brain serve an essential role in the decision-making process. “Even when we think we know nothing, our brains know something. That’s what our feelings are trying to tell us” (Lehrer, 2009).

 

It really is quite amazing that we strive for, and so greatly value, rational thought as a savior of sorts; yet it is our intuition and emotions that really serve as our most effective advisers. The acceptance of the inferiority of rationality is literally and figuratively counter-intuitive. Of course this does not mean we should devalue rationality and go with all our impulses. There are limits and dangers associated with such thinking, and our emotions are kept in balance by our reasoning capabilities. It is crucial that we understand the capacity and strengths of both reason and intuition, as well as their downfalls. I am devoted to this pursuit with growing passion and will continue to share my insights.

 

References:

 

Gladwell, M. 2005. Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Lehrer, J. 2009.  How We Decide. Houghton Mifflin Harcourt: New York.
