Halloween seems like an appropriate time to discuss superstition, what with ghosts and goblins and black cats and witches and all.  But wouldn't Easter or Christmas, or any evening that a five-year-old loses a tooth, be an equally appropriate time?  In actuality, we nurture magical thinking in our children with notions of Santa Claus, the Easter Bunny, and the Tooth Fairy.  And recall, if you will, some of your favorite children's books and the supernatural forces employed to delight your youthful whimsies.  Magic, along with the thinking employed to delight in it, is seemingly a rite of childhood, and in some ways the essence of what it is to be a child.

 

Much as magical thinking has its roots in childhood fantasies, superstition has its roots in our species' youth.  In that nascent time we lacked the capacity to understand the forces and whims of the natural world around us.  Our ancestors struggled to survive, and living another day depended in part on their ability to make sense of the forces that aided or impinged upon them.  We must not forget that our forefathers lived much like the non-domesticated animals around us today.  Survival was a day-to-day reality dependent upon the availability of life-sustaining resources like food, water, and shelter, and was often threatened by predation or the forces of nature.  Death was a real possibility and survival a real struggle.  The stakes were high and the hazards were plentiful.  As it turns out, these are the very conditions under which superstition is likely to thrive.

 

So what is superstition?  Bruce Hood, author of The Science of Superstition, notes that superstition is a belief "that there are patterns, forces, energies, and entities operating in the world that are denied by science…"  He adds that "the inclination or sense that they may be real is our supersense." It involves an inclination to attempt to "control outcomes through supernatural influence."  It is the belief that if you knock on wood or cross your fingers you can influence outcomes in your favor.  It is the belief that faithfully carrying out rituals as part of a wedding ceremony (e.g., wearing something blue, something new, something borrowed), or before going to bat, or before giving a big speech will improve outcomes.  It is also the belief that negative outcomes can result from stepping on a crack, breaking a mirror, or spilling salt.  Hood argues that supersense goes beyond these obvious notions and surfaces in more subtle ways associated with touching an object or entering a place that we feel has a connection with somebody bad or evil.  For example, how would you feel if you were told that you had to wear Jeffrey Dahmer's T-shirt, or that you were living in a house where ritualistic torture and multiple murders took place?  Most of us would recoil at the thought.  Most of us also believe (erroneously) that we can sense when someone is looking at us, even when we cannot see them doing so.  These beliefs, and much of the value we place on sentimental objects, stem from this style of thinking.

 

I explored the deep evolutionary roots of superstitious thinking in a previous post, The Illusion of Cause: Vaccines and Autism.   The principal underpinnings are the same.  In that post I noted the following:

 

Michael Shermer (2000), in his book How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that "Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants." (Shermer, p. 38). Chabris and Simons (2010) note that this refined ability "serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations." (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that "The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not."

From an evolutionary perspective, we have thrived in part as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those inclined to ignore such stimuli were more likely to get eaten when the rustling was in fact a hungry predator. Clearly, from a survival perspective, it is better to infer agency and run away than to become lunch meat. The problem Shermer refers to is that this system subsequently inclines us toward mystical and superstitious beliefs: attributing agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, notes in his blog post entitled Hyperactive Agency Detection that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and as such are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

 

Bruce Hood takes this notion further, adding that the cultural factors discussed at the opening of this piece, along with other intuitive inclinations such as dualism (a belief in the separation of mind and body), essentialism (the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity), vitalism (the insistence that there is some big, mysterious extra ingredient in all living things), holism (the belief that everything is connected by forces), and animism (the belief that the inanimate world is alive), shape adult superstition.  These latter belief mechanisms are developmental and naturally occurring in children: they are the tendencies that make magic and fantasy so compelling for children.  It is when they lurk in our intuition or are sustained in our rational thought that we as adults fall victim to this type of illusion.

 

It is interesting to note that, much like our ancestors, we are more prone to this type of thinking when faced with high stakes, a low probability of success, and incomprehensible controlling circumstances.  Think about it: in baseball, batters often have complex superstitious rituals associated with batting.  The best hitters succeed only about one in three times at bat.  And the speed at which they must decide whether to swing, and where to position the swing, defies the rational decision-making capacity of humans.  On the other hand, these very same athletes have no rituals when it comes to fielding a ball (which, for the proficient, is a high-probability event).

 

Superstition is a natural inclination with deep evolutionary and psychological roots, embedded in normal child development.  These tendencies are nurtured and socialized as part of child rearing and spill over into adult rituals in predictable circumstances (particularly when there is a low degree of personal control).   When one deconstructs this form of thinking, it makes complete sense.  This is not to suggest that reliance on superstition is sensible.  Often, however, the costs are low and the rituals can be fun.  But there are potential costs associated with such thinking.  Some of the dangers materialize in notions such as the belief that vaccines cause autism, or that homeopathy will cure what ails you in lieu of scientific medicine.  Resignation of personal power in deference to supernatural forces is a depressive response pattern.  Reliance on supernatural forces is essentially reliance on chance, and in some cases it actually stacks the deck against you.  So be careful when employing such tactics.  But if you're in the neighborhood, NEVER EVER walk under my ladder.  I've been known to drop my hammer.

 

References

 

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. Random House: New York.

 

Dawkins, R. (2009). The Greatest Show on Earth: The Evidence for Evolution. Free Press: New York.

 

Gelman, S. A. (2004). Psychological Essentialism in Children. TRENDS in Cognitive Sciences, 8, 404-409.

 

Hood, B. (2008). The Science of Superstition (Formerly Titled: Supersense: Why We Believe in the Unbelievable). HarperCollins Publishers: New York.

 

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

 

Shermer, M. (2000). How We Believe. W.H. Freeman/Henry Holt and Company: New York.


I'm sure you have heard of subliminal messages. You know the classic story in which it was alleged that flashing the words DRINK COKE on a movie screen for a fraction of a second would increase cola-buying behavior at the concession stand.  Well, that was a hoax, but you should know that I can, in other ways, tap into your subconscious thoughts and make you smarter, dumber, more assertive, or more passive for a short period of time.

 

This is not brainwashing!  It has a different name.  In the field of psychology, this interesting phenomenon is referred to as priming.  John Bargh (now at Yale University) and colleagues, then at New York University, demonstrated the legitimacy of priming in a very interesting paper entitled Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action (Bargh, Chen, & Burrows, 1996).  These researchers contend "that social behavior is often triggered automatically on the mere presence of relevant situational features [and that] this behavior is unmediated by conscious perceptual or judgmental processes."  One of the studies they used to empirically demonstrate the implications of automatic social behavior (priming) involved a group of NYU undergraduates who were given the scrambled sentence test.  The test presents a series of groupings of five scrambled words; from each grouping one is to devise a grammatical four-word sentence.  For example, one of the groupings might include the words: blue the from is sky.  From this grouping your job would be to write The sky is blue.  A typical scrambled sentence test takes about five minutes.

 

The scrambled sentence test is a diversion and a means to present words that may influence or prime the subject's behavior, thoughts, or capabilities.  In this study the subjects were randomly assigned to one of two groups.  One group was presented with scrambled sentences sprinkled with words like "bold," "intrude," "bother," "rude," "infringe," and "disturb."  The second group was presented with scrambled sentences containing words like "patiently," "appreciate," "yield," "polite," and "courteous."  Each student independently completed the test in one room and was told, upon completion, to walk down the hall to get the next task from an experimenter in another office.  For every subject, however, there was another student (a stooge) at the experimenter's office asking a series of questions, forcing the subject to wait.   Bargh and colleagues predicted that those primed with words like "rude" and "intrude" would interrupt the stooge and barge in more quickly than those primed with words like "polite" and "yield."    Bargh anticipated that the difference between the groups would be measured in milliseconds or, at most, seconds.  These were New Yorkers, after all, with a proclivity to be very assertive (Gladwell, 2005).  The results were surprisingly dramatic!

 

Those primed with the "rude" words interrupted after about five minutes.  Interestingly, the university board responsible for approving experiments involving human subjects limited the wait period in the study to a maximum of ten minutes. The vast majority (82%) of those primed with the "polite" words never interrupted at all; it is unknown how long they would have waited.  The difference between these groups, based simply on the nature of the priming words, was huge!  In the same paper, Bargh et al. (1996) reported that students primed with words denoting old age (e.g., worried, Florida, lonely, gray, bingo, forgetful) walked more slowly leaving the office after completing the scrambled sentence test than they did on their way to the testing office.  It is suggested that the subjects modified their behavior as a result of thoughts about old age planted in their subconscious.  These thoughts, in this case, resulted in the subjects behaving older (e.g., walking more slowly).
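Because the review board capped the wait at ten minutes, the polite-primed group's data are censored: for the 82% who never interrupted, all we know is that their wait exceeded the cap, so any average computed for that group is really a lower bound. Below is a minimal sketch of one way such censored data might be summarized; the wait times and group sizes are made-up placeholders of my own, not Bargh et al.'s measurements.

```python
# A minimal sketch (my illustration, not Bargh et al.'s analysis) of summarizing
# wait-time data when some observations are censored at the 10-minute cap.
# The wait times below are made-up placeholders, not the study's raw data.

CAP_MINUTES = 10.0

# Hypothetical wait times in minutes; None means the subject never interrupted.
rude_primed = [4.5, 5.0, 6.2, 5.5, None, 4.8]
polite_primed = [9.1, None, None, None, 8.7, None]

def summarize(times, cap=CAP_MINUTES):
    """Return a lower bound on the mean wait and the share who never interrupted."""
    never_interrupted = sum(t is None for t in times)
    mean_wait = sum(cap if t is None else t for t in times) / len(times)
    return mean_wait, 100.0 * never_interrupted / len(times)

for label, data in [("rude-primed", rude_primed), ("polite-primed", polite_primed)]:
    mean_wait, pct_never = summarize(data)
    print(f"{label}: mean wait >= {mean_wait:.1f} min; {pct_never:.0f}% never interrupted")
```

The point of the sketch is simply that the polite-primed average can only be reported as "at least" some value, which is why the text can say it is unknown how long they would have waited.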

 

Priming one to be more or less polite or spry is interesting, but this phenomenon has disturbing and perhaps very damaging implications.

 

Dijksterhuis and van Knippenberg, a research team from Holland, looked at how priming might affect intellectual performance (1998).  Their subjects were divided into two random groups.  The first group was tasked for five minutes with thinking about and writing down attributes of a college professor.  The second group was tasked with thinking about and listing the attributes of soccer hooligans.  Following this thinking and writing task, the subjects were given 47 challenging questions from the board game Trivial Pursuit.  Those in the "professorial" priming group got 55.6% of the items correct, while those primed with soccer hooliganism got only 42.6% correct.  One group was not smarter than the other – rather, it is contended that those in the "smart" frame of mind were better able to tap into their cognitive resources than those in a less erudite frame of mind.

 

And then there is the research from Claude Steele and Joshua Aronson (1995).  These psychologists investigated the impact on African Americans of reporting one’s race before taking a very difficult test.  They employed African American college students and a test made up of 20 questions from the Graduate Record Exam (GRE).  The students were randomly split into two groups.  One group had to indicate their race on the test while the others did not.  Those who indicated their race got half as many of the GRE items correct as their non-race-reporting counterparts.  Simply reporting that they were African American seemed to prime them for lower achievement.

 

All of these effects were accomplished entirely outside the awareness of the involved parties.  In fact, this is an essential attribute: effective priming necessitates that it be done outside the subject's awareness.  Awareness negates the effect.

 

Regardless, consider the implications, intended or otherwise, of such priming.  Malcolm Gladwell, in his book Blink, notes: "The results from these experiments are, obviously, quite disturbing.  They suggest that what we think of as free will is largely an illusion: much of the time, we are simply operating on automatic pilot, and the way we think and act – and how well we think and act on the spur of the moment – are a lot more susceptible to outside influences than we realize." (p. 58).

 

Yes, it is disturbing on a personal level with regard to the vulnerability of rational decision making, but I am more concerned about the ethical implications of our insight into this tool. Priming may be used by those with the power, influence, and intent to manipulate outcomes to serve ideological purposes.  On yet another level, the reality of this phenomenon supports my contention in Do we all get a fair start? that there is no true equal starting point.  Societal mores, and the media in particular, shape how we think about others and ourselves in profound ways.  We are all susceptible to stereotypes, prejudices, and biases, and these tendencies can cut in multiple directions.  They can also be used to bolster negative attitudes or weaken individuals in destructive ways.  I am not suggesting that the sky is falling or that there is a huge ideological conspiracy going on, but we must be aware of our vulnerabilities in this regard.  And we must act to avoid constraining individuals as a function of subgroup affiliation.

 

References

 

Bargh, J. A., Chen, M., & Burrows, L. (1996). Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action. Journal of Personality and Social Psychology, Vol. 71, No. 2, 230-244.

 

Dijksterhuis, A., & van Knippenberg, A. (1998). The relation between perception and behavior or how to win a game of Trivial Pursuit. Journal of Personality and Social Psychology, Vol. 74, 865-877.

 

Gladwell, M. (2005).  Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Steele, C. M., & Aronson, J. (1995). Stereotype threat and the intellectual test performance of African Americans. Journal of Personality and Social Psychology, Vol. 69, No. 5, 797–811.


Do we all get a fair start?

16 October 2010

I had an interesting conversation with a close family member the other day.  He was struggling to understand why people in the lower echelons of socioeconomic status do not understand or act on their ability to change their circumstances.  He firmly held the belief that the drive to achieve is universal and that we all have the same potential.  Essentially he was convinced that anyone can rise up by working hard in school or the workplace.  Those who do not achieve, he contended, are making an explicitly different choice.  Many refer to these folks as lazy, as freeloaders, and/or as cheaters.  He recounted stories from his days working at the local grocery, where people would use their public assistance checks to buy beer, cigarettes, and other nonessential items.  This is the same story I've heard from countless people who contend that public assistance is for lazy people content with, or highly skilled at, manipulating the system for a free ride.  I had a similar conversation with another family member recently, who was enraged about Obama shoving publicly supported health care down the throats of the American taxpayer.

 

We are inherently tribal people, and part of our human nature, it seems, is to be on the lookout for freeloaders.  As Jonathan Haidt's work points out, such vigilance is inherent to varying degrees in all of us, as part of the ingroup-loyalty moral drive that is fundamental to social cohesion.   Freeloaders detract from the viability and survivability of the group.  This deeply emotional moral position has clear evolutionary roots that remain strong today.

 

No doubt, there are freeloaders among us.  There are people who scam the system and I am guessing that there will always be those who are comfortable with, or even proud of, their ability to live off the diligence and contributions made by others.  Some argue that entitlement programs enable the freeloaders among us to prosper and propagate.   This may be true for some.  But we need to keep it all in perspective.  To do so there are a number of other factors to consider.

 

First, isn't it interesting that we frame freeloaders at the lower end of the economic spectrum differently than we classify white-collar criminals?  Do they not accomplish essentially the same thing?  Both illegitimately acquire resources they are not entitled to.  And I am guessing that the true costs of white-collar crime exceed those of "welfare fraud."  Keep in mind that the major frauds in the Medicaid system are generally perpetrated by white-collar criminals – doctors or administrators billing for services never rendered.  Also think back to the impact of people like Bernie Madoff, who essentially stole $21 billion.  They are criminals indeed, but their crimes do not result in everyone within their income bracket being labeled untrustworthy.  Granted, all crime is bad, but I have to challenge the implications of labeling an entire subset of the population as "bad" because some of them cheat.

 

Second, isn't it also interesting that our hypervigilance for cheaters targets the less fortunate among us rather than the corporations that bilk the system of billions of your hard-earned dollars?  Why do we turn our anger against our fellow human beings when corporations like ExxonMobil get huge tax subsidies while raking in billions of dollars of quarterly profit?  Then consider the financial meltdown and the huge bailouts provided to corporations deemed "too big to fail."  The costs to our society as a result of welfare cheaters are a pittance in comparison to the impact of the deregulated marketplace.

 

Third, although nobody likes a cheater, when given a chance and a low probability of getting caught, almost everybody will cut corners or scam the system to save a buck.  And everybody knows someone who works or gets paid "under the table."  Somehow these folks are given a pass and escape the freeloader stigma.  My guess is that the proportion of people who cheat the system spans all income brackets, and that the actual social costs rise commensurately with income.   The disdain we direct toward the less fortunate among us, I argue, is too convenient and hugely disproportionate.   Part of this may stem from the perception that welfare fraud is more visible to us than white-collar crime.  And while white-collar crime is perpetrated by people who look and think like we do (or by faceless corporations), welfare fraud is sometimes perpetrated by people whose faces and lifestyles differ from ours.  We see these cheaters and often hear of their exploits.  I contend that much of what we hear amounts to rehashed urban myths.

 

The stereotype that many of us hold about the poor is inaccurate and is maintained by both attribution error and confirmation bias.  And the belief that many white, middle-class, college-educated people hold – that they alone are responsible for their position in life – is reflective of self-serving bias.  Each generation launches from the shoulders of its parents, who each launched from the shoulders of their respective parents.   My children are launching from a place vastly different from that of a poor African American child from the east side of Buffalo, New York, or a poor Latino child from East L.A., or a poor white child raised in remote rural Appalachia, or a white boarding-school attendee from a heavily connected, affluent Manhattan family.  The educational, social, and economic opportunities across these launching points differ in ways that profoundly shape perceptions, aspirations, and realities.   Heritage, and thus opportunity, plays the biggest role in one's socioeconomic status – although "the system" benefits from people believing that hard work and intelligence drive wealth distribution.  Believing in the American Dream keeps the masses contented.  It keeps people striving, believing that they can rise up if only they are smart enough and diligent enough.   A significant part of our population has figured this out – they are the disenfranchised.  Without hope or opportunity it is hard to buy into the myth that one can rise out of the ghetto by working hard.  It is difficult to continually swim against the current; and for the fortunate, it is sometimes hard to see that there is in fact a current when one is floating along with it.


I don't know if you caught it the other night while you were watching the news, skimming your email, checking your Twitter and RSS feeds, and updating your Facebook status, but there was an interesting story about multitasking.  Silly me – who actually watches the news anymore? Anyway, much of the recent buzz on this endemic behavior (among the technologically savvy) is not good.  Multitasking is a paradox of sorts: we tend to romanticize and overestimate our ability to split attention among multiple competing demands. The belief goes something like this: "I've got a lot to do, and if I work on all my tasks simultaneously I'll get them done faster."   However, what most of us fail to realize is that when we split our attention, we are actually dividing an already limited and finite capacity in a way that hinders overall performance. And some research suggests that chronic multitasking may have deleterious effects on one's ability to process information even when one is not multitasking (Nass, 2009).

 

Advances in computer technology seem to fuel this behavior.  If you do a Google search on multitasking, you will get information on the technological wonders of machines that can multitask (a.k.a. computers) mixed with news about how bad media multitasking is for you.

 

Think about it.  There has been increasing pressure on the workforce to be more productive, and gains in productivity have been made in lockstep with increases in personal computing power. Applications have been developed on the back of the rising tide of computer capacity, making human multitasking ever more possible.  These advances include faster microprocessors, increased RAM, larger monitors, the internet itself, browsers that support multiple tabs, and relatively inexpensive computers with sufficient power to keep open email, word processing programs, Facebook, Twitter, iTunes, and YouTube. Compound these tools with hardware that allows you to do these things on the go. No longer are you tethered to the desktop computer by an Ethernet cable.  Wi-Fi and 3G connectivity allow all of the above almost anywhere via a smartphone, laptop, iPad, or notebook computer.  Also in the mix are Bluetooth headsets and other headphones that offer hands-free operation of telephones.

 

Currently, technology offers the ability to divide one's attention in ways inconceivable only a decade ago. The ease of doing so has led this behavior to generalize across settings and situations, including talking on a cell phone while driving, texting while driving, texting during face-to-face personal interactions, and even cooking dinner while talking on the phone. Some of these behaviors are dangerous, some are rude, and all likely lead to inferior outcomes.

 

Don't believe it? If not, you are likely among the least skilled of those who multitask. "Not me!" you may claim. Well, research has shown that those who routinely multitask are also the most confident in their ability to do so (Nass, 2009).  But when you look at the products of these "confidently proficient" multitaskers, you find the poorest outcomes.

 

Multitasking involves shifting attention from one task to another, refocusing, sustaining attention, and exercising ongoing judgment about the pertinence and salience of competing demands. Doing this successfully is exceptionally difficult and likely well beyond the capacity of most human beings. Our brains can generally concentrate on only one task at a time, so multitasking necessitates devoting shorter periods of time to dissimilar tasks.  As a result, overall effectiveness on all tasks is reduced.

 

Researchers at the University of Michigan Brain, Cognition and Action Laboratory, including Professor David E. Meyer, point out that the act of switching focus itself has deleterious effects. When you switch from task A to task B, you lose time in making the transition, and the transition time itself increases with the complexity of the tasks involved. Depending on how often you switch between tasks, you can waste as much as 40% of your productive time on task switching alone (APA, 2006).
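To get a feel for how that overhead adds up, here is a small back-of-the-envelope sketch in Python. The per-switch cost and the way the hour is sliced are assumptions of my own, chosen only for illustration; they are not Meyer's figures.

```python
# Rough illustration (not Meyer's model) of how a fixed cost per task switch
# erodes productive time. All numbers are invented for demonstration only.

def productive_fraction(work_blocks_min, switch_cost_min):
    """Fraction of elapsed time spent working, given a fixed cost for each switch."""
    work = sum(work_blocks_min)
    switches = max(len(work_blocks_min) - 1, 0)
    elapsed = work + switches * switch_cost_min
    return work / elapsed

# Single-tasking: one uninterrupted 60-minute block, no switches.
print(round(productive_fraction([60], 1.5), 2))      # 1.0  -> no time lost

# Heavy multitasking: the same 60 minutes chopped into 20 three-minute slices.
print(round(productive_fraction([3] * 20, 1.5), 2))   # ~0.68 -> roughly a third of elapsed time lost
```

Even with a modest assumed cost of a minute and a half per switch, slicing an hour of work into twenty pieces loses roughly a third of the elapsed time, which is in the same ballpark as the figure cited above.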

 

Shorter periods of focus reduce overall time on task, and each transition reduces this time further. In 2005, Dr. Glenn Wilson at the Institute of Psychiatry, University of London, found that his subjects experienced a 10-point fall in IQ when distracted by incoming email and phone calls. This effect was "more than twice that found in studies of the impact of smoking marijuana" and was similar to the effect of losing a night's sleep (BBC, 2005).

 

As for the negative long-term effects of multitasking, Dr. Nass noted:

 

“We studied people who were chronic multitaskers, and even when we did not ask them to do anything close to the level of multitasking they were doing, their cognitive processes were impaired. So basically, they are worse at most of the kinds of thinking not only required for multitasking but what we generally think of as involving deep thought.”

 

Nass (2009) has found that these habitual multitaskers have chronic filtering difficulties, impaired capacity to manage working memory, and slower task switching abilities. One must be careful to avoid the Illusion of Cause in this situation. Correlation is not causation and we must avoid inferring that multitasking causes these cognitive declines. The reverse may be true or other undetected variables may cause both.

 

Much of the research in this area is in its infancy, and thus limited in scope and depth, so it is prudent to be somewhat skeptical about whether multitasking is bad for you. But with regard to the efficacy of multitasking: when you look at the issue anecdotally, apply the tangentially related evidence logically, and then consider the data, you have to conclude that multitasking on important jobs is not a good idea.  If you have important tasks to accomplish, it is best to focus your attention on one task at a time and to minimize distractions.  To do so, avoid the temptation to text, tweet, watch TV, check your email, talk on the phone, instant message, chat on Facebook, Skype, or otherwise divide your attention. If you believe these other distractions help you do better, you are deluding yourself and falling victim to the reinforcement systems that make multitasking enjoyable. Socializing, virtually or otherwise, is more pleasurable than the arduous processes involved in truly working or studying.

 

You can likely apply the same principles to plumbing, cooking, housework, woodworking, etc.  The key to success, it seems, is to FOCUS on one task at a time, FINISH the job, and then move on.  You'll save time, be more efficient, and do a better job! Remember – FOCUS & FINISH!

 

References

 

American Psychological Association. (March 20, 2006). Multitasking: Switching Costs.
http://www.apa.org/research/action/multitask.aspx

 

BBC News (2005). ‘Infomania’ worse than marijuana. http://news.bbc.co.uk/2/hi/uk_news/4471607.stm

 

Keim, B. (2009). Multitasking muddles Brains, even when the computer is off. Wired Science News for Your Neurons. http://www.wired.com/wiredscience/2009/08/multitasking/#ixzz11LfOUISp

 

Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive Control in Media Multitaskers. Proceedings of the National Academy of Sciences. v. 106, no. 37. http://www.pnas.org/content/106/37/15583

 

Nass, C. (August 28, 2009).  Talk of the Nation: National Public Radio:  Multitasking May Not Mean Higher Productivity. http://www.npr.org/templates/story/story.php?storyId=112334449

 

Seldon, B. (2009). Multitasking, marijuana, managing? http://www.management-issues.com/2009/9/21/opinion/multitasking–marijuana–managing.asp


Why do you sometimes choose that scrumptious chocolate dessert even when you are full?  Why are you sometimes drawn in by the lure of the couch and TV when you should be exercising, or at least reading a good book?  And why do you lose your patience when you are hungry or tired? Do these situations have anything to do with a weak will?

 

What is willpower anyway?  Perhaps it is your ability to heed the advice proffered by that virtuous, angelic voice in your head as you silence the hedonistic, diabolical voice that goads you toward the pleasures of sloth or sin.   Or perhaps, as Sigmund Freud once contended, it is your ego strength that enables you to forgo the emotionally and impulsively driven urges of the id.   These images resonate with us because it often feels as though a tug-of-war is going on inside our heads as we weigh difficult, or sometimes even routine, choices.  Often reason prevails; other times it does not.  What is really at play here? Is it truly willpower? Is it really a matter of strength, or even of choice?

 

As it turns out, like all issues of the human mind, it is complicated.  Studies within psychology and neuroscience are offering increased clarity on this very issue.  It is important to understand, however, that the human brain is composed of a number of modules, each of which strives to guide your choices.  There really isn't a top-down hierarchy inside your brain with a chief executive pulling and pushing the levers that control your behavior.  Instead, at various times, different modules assert greater amounts of control than others, and thus the quality of the choices we make varies over time.  As a result of advances in technology and understanding, we are becoming increasingly aware of the key variables associated with this variation.

 

At a very basic level we know of two major (angelic v. diabolical) driving forces that guide our decisions.  Within and across these forces there are multiple modules emitting neurotransmitters that ultimately influence the choices that we make.  Broadly, the two forces are reason and emotion.  As I discussed in previous posts, What Plato, Descartes, and Kant Got Wrong: Reason Does not Rule and Retail Mind Manipulation, there is not actually a true competitive dichotomy between these two forces; instead, there appears to be a collaborative interplay among them. Regardless of their collaborative nature, we do experience a dichotomy of sorts when we choose the cheeseburger and fries over the salad, the chocolate cake over the fruit salad, or abstinence over indulgence.

 

Now that I have clouded the picture a bit, let's look at one study that may help restore some of the clarity I mentioned.

 

At Stanford University, Professor Baba Shiv, under the ruse of a study on memory, recruited several dozen undergraduate students. He randomly assigned the students to two groups; for convenience's sake, I will label them the 2 Digit Group and the 7 Digit Group.  The students in the 2 Digit Group were given a two-digit number (e.g., 17) to memorize, whereas those in the 7 Digit Group were tasked with a seven-digit number (e.g., 2583961).  In Room A, each subject, one at a time, was given a number to memorize.  Once provided with the number, they were given as much time as they needed to commit it to memory.  They were also told that once they had memorized the number, they were to go to Room B, down the hall, where their recall would be tested.  As each student made the transition from the first room to the testing room, they were intercepted by a researcher offering a gratuity for their participation. The offer was unannounced and made before the subject entered the testing room (Room B).   It consisted of a choice between a large slice of chocolate cake and a bowl of fruit salad.

 

One would expect, given the random assignment, that those in the 2 Digit Group would select the cake or the fruit salad in roughly the same proportions as those in the 7 Digit Group.  As it turned out, there was a striking difference between the groups.  Those in the 2 Digit Group selected the healthy fruit salad 67% of the time.  Those in the 7 Digit Group, on the other hand, selected the scrumptious but not-so-healthy cake 59% of the time.  The only difference between the groups was the five-digit discrepancy in the memorization task.  How could this seemingly small difference possibly explain why those saddled with the easier task made the "good" rational choice 67% of the time while those with the more challenging task made the same healthy choice only 41% of the time?
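As an aside, one can ask how surprising a 67% versus 41% split would be if chance alone were at work. The sketch below runs a standard two-proportion comparison; the group size is my own guess at what "several dozen" students might mean, so treat the whole thing as illustrative rather than as Shiv's published analysis.

```python
# Illustrative two-proportion z-test (my sketch, not the study's analysis).
# n_per_group is an assumed value; the text only says "several dozen" students.
from math import sqrt

n_per_group = 30                        # hypothetical size of each group
p_low_load, p_high_load = 0.67, 0.41    # fruit-salad (healthy) choice rates from the text

# Pooled proportion and standard error under the null hypothesis of equal rates,
# assuming equal group sizes.
p_pool = (p_low_load + p_high_load) / 2
se = sqrt(p_pool * (1 - p_pool) * (2 / n_per_group))
z = (p_low_load - p_high_load) / se
print(f"z = {z:.2f}")   # ~2.0 with these assumptions; larger groups would make the gap more decisive
```

With these assumed numbers the gap sits right around the conventional threshold for chance being an unlikely explanation, which is consistent with the text's characterization of the difference as striking.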

 

The answer likely lies in the fact that memorizing a seven-digit number is more taxing than you might think.  In 1956, psychologist George Miller published a classic paper entitled "The Magical Number Seven, Plus or Minus Two," in which he provided evidence that the limit of short-term memory for most people is about seven items. This is why phone numbers and license plates are typically seven digits long. Strings of letters or numbers that are not logically grouped in some other way tend, as they approach seven items in length, to max out one's processing capacity.  With seven digits, one likely has to recite the sequence over and over to keep it in short-term memory.  It appears that those in the 7 Digit Group, relative to the 2 Digit Group, had reached the limits of their rational capacity and were less likely to employ good reason-based decision making with regard to the sweets. Those in the 2 Digit Group were not so preoccupied and were likely employing a more rationally based decision-making apparatus.  They made the healthy choice simply because they had the mental capacity to weigh the pros and cons of the options.

 

An overtaxed brain is likely to fall back on emotional, non-rational mechanisms to make choices and the outcomes are not always good.  When you are cognitively stressed – actively engaged in problem solving – you are less likely to make sound, reason-based decisions regarding tangential or unrelated issues. That is one of the reasons why we “fall off the wagon” when we are overwhelmed.

 

And if you compound cognitive preoccupation with fatigue and hunger, you may have even more problems.  You know those times at the end of the day when you are tired, hungry, and really irritable?   Your muscles are not the only tissues that fatigue when they are not well nourished.  Your brain is a major consumer of nutritional resources – and many scientists believe that the reasoning portion of your brain in particular does not tolerate glucose deficits.  Your grumpiness may be the result of your brain's diminished capacity to employ reason to work through and cope with the little annoyances you typically shrug off.

 

So, it seems, willpower is your ability to use the reasoning portion of your brain to make sound, healthy decisions.  Studies like the one above suggest that willpower is not a static force.  We must accept the limits of our willpower and realize that this source of control is in a near constant state of fluctuation, depending on one's state of cognitive preoccupation, fatigue, and perhaps blood glucose level.  It is very important to know your limits and understand the dynamic nature of your rational capacity – if you do, you may proactively avoid temptation and thus stay in better control of your choices.  Relying on willpower alone does not provide a dependable safety net.  Be careful not to set yourself up for failure.

 

References:

 

Krakovsky, M. (2008). How Do We Decide? Inside the ‘Frinky’ Science of the Mind. Stanford Graduate School of Business Alumni Magazine. February Issue

 

Krulwich, R. & Abumrad, J. (2010). Willpower And The ‘Slacker’ Brain. National Public Radio: Radio Lab. http://www.npr.org/templates/story/story.php?storyId=122781981

 

Lehrer, J. (2009). How We Decide. Houghton Mifflin Harcourt: New York.

 

Miller, G. (1956). The Magical Number Seven, Plus or Minus Two. The Psychological Review. Vol. 63, pp. 81-97.
