So really, what caused that earthquake and subsequent tsunami in Japan?  A quick Google search posing this very question yields a wide range of answers.  Fortunately, a majority of the hits acknowledge and explain how plate tectonics caused this tragedy.  Sprinkled throughout the scientifically accurate explanations are conspiracy theories suggesting that the US government caused it through hyper-excitation of radio waves in the ionosphere (HAARP), and perhaps even planned radiation releases.  Other theories include the “Supermoon’s” increased tug on the earth’s crust due to the fact that it was at perigee (its closest proximity to the earth in its cyclical orbit).  Solar flares (coronal mass ejections) were also blamed, and some believe the flares, working in concert with the moon at perigee, triggered the quake.  Global warming also gets its share of the blame (though its proponents suggest that the real cause is the removal of oil from the crust, leaving voids that ultimately trigger earthquakes).  Some have even suggested that a comet, or even God, may have done this.

 

The problem with the scientific explanation is that plate tectonics is invisible to most of us.  Its motion is so gradual that it does not “on the surface” seem plausible.  We seemingly need a clear causal agent that fits within our understanding of the world.  Scientifically literate individuals are inclined to grasp the agency of tectonics because the theory and the effects do, in fact, fit together in observable and measurable ways.  Others reach for causal explanations that better fit within their understanding of the world.

 

Our correlation calculators (brains) grab onto events temporally associated with such tragedies, and we then conjure up narratives to help us make sense of it all.  It is easy to understand why folks might assume that the moon at perigee, or increased solar activity, or even an approaching comet might cause such events.  Others, prone to conspiracy theories and holding a corresponding belief that big brother is all-powerful and sadistic, will grab onto theories that fit their world views.  The same is true for those with literal religious inclinations.  Unfortunately, this drive often leads to narrative fallacies that misplace the blame and sometimes ultimately blame the victims.

 

History is filled with stories drawn up to explain such tragedies.  In the times of ancient Greece and Rome, many tales were spun to explain famine, plagues, and military failures.  All of this occurred prior to our increasingly complex understanding of the world (e.g., germ theory, plate tectonics, meteorology), and it made sense to blame such events on vengeful gods.  How else could they make sense of such tragedies?  This seems to be how we are put together.

 

A study published in 2006 in the journal Developmental Psychology by University of Arkansas psychologists Jesse Bering and Becky Parker looked at the development of such inclinations in children.  They pinpointed the age at which such thinking begins to flourish, and they provided a hypothesis to explain this developmental progression.  The study was summarized in a March 13, 2011 online article at Scientific American by the first author, titled: Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events.

 

In this study of children ages three to nine, the psychologists devised a clever technique to assess the degree to which individuals begin to assign agency to events in their environment and subsequently act on those signs.  What they found was that children between three and six years of age do not read communicative intent into unexplained events (e.g., lights flickering or pictures falling from the wall).  But at age seven, children start reading into, and acting on, such events.  So why is it that at the age of seven children start inferring agency from events in their environment?  Bering suggests that:

 

“The answer probably lies in the maturation of children’s theory-of-mind abilities in this critical period of brain development. Research by University of Salzburg psychologist Josef Perner, for instance, has revealed that it’s not until about the age of seven that children are first able to reason about “multiple orders” of mental states. This is the type of everyday, grown-up social cognition whereby theory of mind becomes effortlessly layered in complex, soap opera–style interactions with other people. Not only do we reason about what’s going on inside someone else’s head, but we also reason about what other people are reasoning is happening inside still other people’s heads!”

 

So as it turns out, this tendency to read signs into random events is associated with the maturation of cognitive processes. Children with less mature “Theory of Mind” (click here for a very basic description of Theory of Mind) capabilities fail to draw the conclusion that a supernatural being, or any being for that matter, knows what they are thinking and can act in a way that will communicate something.

 

“To interpret [capricious] events as communicative messages, … demands a sort of third-person perspective of the self’s actions: ‘What must this other entity, who is watching my behavior, think is happening inside my head?’ [These] findings are important because they tell us that, before the age of seven, children’s minds aren’t quite cognitively ripe enough to allow them to be superstitious thinkers. The inner lives of slightly older children, by contrast, are drenched in symbolic meaning. One second-grader was even convinced that the bell in the nearby university clock tower was Princess Alice ‘talking’ to him.”

 

When a capricious event has great significance, we are seemingly driven by a ravenous appetite to look for “signs” or “reasons.”  We desperately need to understand.  Our searches for those “reasons” are largely shaped by previously held beliefs and cultural influences.  Divine interventions, for example, have historically been ambiguous; therefore, a multitude of surreptitious events can be interpreted as having a wide variety of meanings.  And those meanings are guided by one’s beliefs.

 

“Misfortunes appear cryptic, symbolic; they seem clearly to be about our behaviors. Our minds restlessly gather up bits of the past as if they were important clues to what just happened. And no stone goes unturned. Nothing is too mundane or trivial; anything to settle our peripatetic [wandering] thoughts from arriving at the unthinkable truth that there is no answer because there is no riddle, that life is life and that is that.”

 

The implications of this understanding are profound.  We are by our very nature driven to search for signs and reasons to explain major life events, and we are likewise inclined to see major events as signs themselves. The ability to do so ironically depends on cognitive maturation. But, given the complexity and remoteness of scientific explanations, we often revert to familiar and culturally sanctioned explanations that have stood the test of time.  We do this because it gives us comfort, regardless of actual plausibility.  As I often say, we are a curious lot, we humans.

 

References:

 

Bering, J. (2011). Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events. Scientific American.  http://www.scientificamerican.com/blog/post.cfm?id=signs-signs-everywhere-signs-seeing-2011-03-13&print=true

 

Bering, J., & Parker, B. (2006). Children’s attributions of intentions to an invisible agent. Developmental Psychology, 42(2), 253–262.

 


Narrative Fallacy

13 March 2011

Evolution has conferred upon us a brain that is capable of truly amazing things.  We have, for thousands of years, been capable of creating incredibly beautiful art, telling compelling tales, and building magnificent structures.  We have risen from small and dispersed tribal bands to perhaps the dominant life force on the planet.  Our feats have been wondrous.  We have put men on the moon, our space probes have reached the outer limits of our solar system, and we have people living and working in space.  We have literally doubled the life expectancy of human beings, figured out how to feed billions of people, and eradicated some of the most dreadful diseases known to humankind.  We can join together in virtual social communities from remote corners of the world, and even change nations using Facebook and Twitter.  This list could go on and on.  We are very capable and very smart beings.

 

Our mark on this planet, for the moment, is indelible.  Yet, despite our great powers of intellect and creativity, we are incredibly vulnerable.  I am not referring to our susceptibility to the great powers of nature as evidenced in Japan this last week.  I am referring to an inherent mode of thinking that is core to our human nature.

 

It is pretty certain that nature-nature will destroy our species at some point in the future, be it via asteroid impact, super-volcanoes, climate change, microbial evolution, or the encroachment of the sun’s surface as it goes red giant in five billion years.  Of all the species that have ever lived on this planet, over 99% have gone extinct.  What’s living today will someday be gone – there really is no question about it.  But the question that remains is: “Will nature-nature do us in – or will human-nature do it first?”

 

We have evolved over billions of years to our current Homo sapiens (“wise man”) form, and for the vast majority of that evolutionary period, we have had very limited technology.  The development of primitive stone and wooden tools dates back only tens of thousands of years, and reading and writing date back only several thousand years.  What we do and take for granted every day has been around for only a minuscule sliver of incomprehensibly vast evolutionary and geological time.  These facts are relevant because our brains, for the most part, developed under selective pressures that were vastly different than those we live under today.

 

Much as our appendix and coccyx hair follicle are remnants of our evolutionary past, so too are some of our core thought processes.  These vestigial cognitions play out both as adaptive intuitions and as potentially quite destructive errors of judgment.  We would like to think that, as an advanced thinking species, our ability to use reason is our dominant mental force.  Unfortunately, this most recent evolutionary development takes a back seat to lower and more powerful brain functions that have sustained us for millions of years.  I have previously written about this reason versus intuition/emotion paradigm, so I won’t go into the issue in detail here; but suffice it to say, much of what we do is guided by unconscious thought processes outside of our awareness and outside our direct control.  And again, these life-guiding processes are mere remnants of what it took to survive as roaming bands of hunters and gatherers.

 

Our brains came to their current form when we were not in possession of the tools and technologies that help us truly understand the world around us today.  Early survival depended on our ability to see patterns in randomness (pareidolia, or patternicity) and to make snap judgments.  Rational thought, which is slow and arduous, has not played out in a dominant way because it failed to provide our ancestors with the survival advantages that emotional and rapid cognitions did.  As such, our brains have been programmed by evolution to make all kinds of rapid cognitions that, in this modern time, are simply prone to error.

 

We are uncomfortable with randomness and chaos and are driven to pull together causal stories that help us make sense of the world.  Our brains are correlation calculators, belief engines, and hyperactive agency detection devices – inclinations that led us to develop polytheism to help explain the whims of “mother nature.”  All cultures, for example, have developed creation myths to help explain how we came to be.  We are a superstitious lot, driven by these vestigial remnants.

 

It is easy to see how powerful this inclination is.  Look at the prevalence of beliefs about things like full moons and bad behavior.  And how about bad behavior and acts of nature?  Pat Robertson blamed Katrina on homosexuality and hedonism.  One wonders what the Japanese did to deserve their most recent tragedy.  I’ve already heard talk of the attack on Pearl Harbor as an antecedent.  As if mother nature would align with the United States to punish long-past deeds against us!  If mother nature cares at all about herself, I wonder what we have coming for Nagasaki and Hiroshima?  Likewise, people blame vaccines for autism and credit homeopathy for their wellness.  I could go on and on about our silly inclinations.  We are prone to Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Illusions of Attention, and the Illusions of Knowledge and Confidence.  In the same vein, we are manipulated by the Illusion of Narrative, also known as the Narrative Fallacy.

 

Nassim Nicholas Taleb (a philosopher, author, statistician) coined the phrase “Narrative Fallacy,” which is an encapsulation of this very discussion.  We have a deep need to make up a narrative that serves to make sense of a series of connected or disconnected facts.  Our correlation calculators pull together these cause and effect stories to help us understand the world around us even if chance has dictated our circumstances.   We fit these stories around the observable facts and sometimes render the facts to make them fit the story.  This is particularly true, for example, in the case of Intelligent Design.

 

Now that I am aware of this innate proclivity, I enjoy watching it play out in my own mind.  For example, several weekends ago I went cross-country skiing with my wife, Kimberly.  We were at Allegany State Park in Western New York, where there are nearly 20 miles of incredibly beautiful and nicely groomed nordic ski trails.  Kimberly and I took a slightly different route than we normally do, and at a junction of two trails we serendipitously ran into a friend we hadn’t seen in quite some time.  It was an incredible and highly improbable meeting.  Any number of different events or decisions could have resulted in forgoing this meet-up.  Such events compel us to string together a narrative to make sense of the sheer randomness.  Was it fate, divine intervention, or just coincidence?  I am certain it was the latter – but it sure was fun dealing with the cognitions pouring forth to explain it.

 

I would really like to hear about your dealings with this inclination.  Please post comments detailing events that have happened to you and the narratives you formulated to make sense of them.  This is a great exercise to help us understand this pattern detection mechanism, so have some fun with it and share your stories.  At the very least, pay attention to how this tendency plays out in your life and think about how it plays out in your belief systems (and ideological paradigms).  I’m guessing that it will be informative.


We all love a good story.  Children are mesmerized by them, and adults – whether through books, TV, movies, sports, gossip, tabloids, or the news, to mention a few – constantly seek them out.  Storytelling is core to our identity and a vital part of our nature.  It is both how we entertain ourselves and how we make sense of the world.  This latter tendency troubles me.  Why?  Because we are inclined to value narratives over aggregated data, and we are imbued with a plethora of cognitive biases and errors that all mesh together in a way that leaves us vulnerable to believing very silly things.

 

This may be hard to swallow, but all of us, yes even you, are by default, gullible and biased: disinclined to move away from narratives that you unconsciously string together in order to make sense of an incredibly complex world.  Understanding this is paramount!

 

I have discussed many of the innate illusions, errors, and biases that we are inclined toward throughout this blog.  I have also discussed the genetic and social determinants that play out in our thought processes and beliefs.  And throughout all this I have worked diligently to remain objective and evidence-based.  I do accept that I am inclined toward biases programmed into my brain.  This knowledge has forced me to question my beliefs and open my mind to different points of view.  I believe that the evidence I have laid down in my writings substantiates my objectivity.  But I am also tired, very tired in fact, of making excuses for, and offering platitudes to, others who do not open their minds to this not-so-obvious reality.

 

I am absolutely convinced that there is no resolution to the core political, economic, religious and social debates that pervade our societies, unless we can accept this reality.  Perhaps, the most important thing we can do as a species is come to an understanding of our failings and realize that in a multitude of ways, our brains lie to us.  Our brains deceive us in ways that necessitate us to step away from our gut feelings and core beliefs in order to seek out the truth.  Only when we understand and accept our shortcomings will we be open to the truth.

 

Because of these flawed tendencies we join together in tribal moral communities, turning a blind eye to evidence that casts doubt on our core and sacred beliefs.  We cast aspersions of ignorance, immorality, or partisanship on those who espouse viewpoints that differ from our own.  I cannot emphasize this enough: this is our nature.  But I, for one, cannot, and will not, accept this as “just the way it is.”

 

We as a species are better than that.  We know how to overcome these inclinations.  We have the technology to do so.  It necessitates that we step back from ideology and look at things objectively.  It requires asking questions, taking measurements, and conducting analyses (none of which are part of our nature).  It necessitates the scientific method.  It requires open peer review and repeated analyses.  It requires objective debate and outright rejection of ideology as a guiding principle.  It requires us to take a different path – a path that is not automatic, one that is not always fodder for good narrative.

 

I am no more inclined to believe the narrative of Muammar Muhammad al-Gaddafi suggesting that “his people love him and would die for him” than I am to accept the Creationists’ denial of evolution or the economically motivated denial of anthropogenic global warming.  Likewise, I am not willing to accept the arguments of the anti-vaccine community or the anti-gay-marriage community.

 

My positions are not based on ideology!  They are based on evidence: both the credible and substantive evidence that backs my position and the lack of any substantive evidence for the opposing views.

 

Granted, my positions are in line with what some may define as an ideology or tribal moral community; but there is a critical difference.  My positions are based on evidence, not on ideology, not on bronze-age moral teachings, and certainly not on fundamental flaws in thinking.  This is a huge and critical difference.  Another irrefutable difference is my willingness to abandon my position if the data suggest a more credible one.  Enough already!  It’s time to step back, take a long and deep breath – look at how our flawed neurology works – and stop filling in the gaps with narrative that is devoid of reality.  Enough is enough!

 


Tragedies like the events of January 8th in Tucson shake the nation.  We grieve for the victims and struggle to make sense of it all.  The dialogue that has followed the event is not surprising.  People want and need to understand why a person would do such a thing.  These events are mind-boggling, and the human brain does not tolerate the ambiguity and senselessness of such acts.  We gain solace by filling in the blanks with assumptions about the gunman’s sanity or motives.

 

We respond by presuming that only a madman could commit such heinous acts.  Or we conclude that because the principal target was a politician, his behavior must be ideologically driven.  These assumptions provide a framework within which the event is easier to comprehend.  The notion of insanity simplifies the situation: mental illness becomes the culprit.  The notion of a politically motivated act also allows us to point a finger.

 

In fact, however, we don’t know what brought this young man to make such a terrible choice.  The incomplete mosaic of the shooter’s life drawn from disparate snapshots by relative strangers suggests erratic behavior and disjointed thoughts.  Was he abusing substances or evidencing symptoms of psychosis?  As of right now we just don’t know.

 

We do know, however, that he and his family lived a reclusive life and that he struggled to exhibit sufficient adaptive skills to successfully navigate the worlds of work and college.  I suggest that although it may be easy to conclude that Loughner is deranged, it is important to remember that insanity is not a prerequisite for such atrocious behavior.  You may assume that only insane people would commit such crimes – but the reality is that ordinary people are capable of doing equally terrible things if their beliefs and their culture condone it or even honor it.

 

It is for example quite wrong to assume that the 9/11 terrorists were insane.  Their faith in their god and the teachings of their religious book as well as the value put on such beliefs by the narrow sect of their particular extremist culture made them heroes and martyrs, destined for eternal bliss.

 

There are other situations where the sanity line is a bit murkier.  Timothy McVeigh was apparently paranoid and quite capable of rationalizing his behavior by revising or cherry picking historical facts: but he too was highly motivated to act out his own form of justice for crimes that he believed were committed by the Federal Government (e.g., Waco, Ruby Ridge).  His perspective about right and wrong was different than most of ours, but that doesn’t necessarily qualify him as insane.  We don’t understand or align with his thinking and thus conclude that he must be mentally ill.  The brutality of his behavior certainly bolsters such a conclusion.  But recall that he was a decorated soldier in the Gulf War.  He was a trained killer.  The enemy, at some point following his discharge from the Army, shifted from Saddam Hussein to the US Federal Government.  His beliefs justified his behavior in his eyes.

 

There are legitimate examples of heinous crimes committed by individuals with clear mental illness issues such as Jeffrey Dahmer, Ted Bundy, and John Wayne Gacy.   Seung-Hui Cho (of Virginia Tech) also comes to mind.  Jared Loughner on the surface appears to be more in this category, but it is a presumption at this point.  There is no evidence that he was driven by a different set of moral imperatives spurred on by rancorous political hate speech.

 

Regardless, as many pundits have proclaimed, there is a fear that the vitriol that abounds in our political discourse may inspire and incite the Timothy McVeighs of the world.  Frankly, my first assumption was that the attack on Giffords was inspired by the very hate and fear emanating from the likes of Glenn Beck and Sarah Palin.  Again, it is imprudent to draw such conclusions, but when someone has a differing political perspective and you target them as enemies of the state, destined to destroy America, then you have touched your toes on the line!  And when you incite hate and associate the opposing side with Hitler and Soviet Stalinists, then you have crossed the line.  Compound such rhetoric with images of violence and you have become grossly irresponsible.

 

I implore all US citizens to embrace civility and reject those who employ hatred to further their ideology.  We have all too real and tragic examples of the consequences of this behavior.  Let’s devote our attention to civil dialogue about the issues that challenge our people and our planet.  The issues, and we the people, deserve better!


Have you ever heard someone make an argument that leaves you shaking your head in disbelief?  Does it seem to you like some people are coming from a completely different reality than your own?  If so, then this blog is for you.  I have spent the last year trying to develop an understanding of the common thought patterns that drive the acrimonious spirit of our social and political dialogue.  I am continually amazed by what I hear coming from seemingly informed people.  I have assumed that some folks are either deluded, disingenuous, or downright ignorant.  But there is another possibility: different moral schemas or belief systems may be driving their thinking.  And if that is the case, how do these divergent processes come to be?  I have learned a lot through this exploration and feel compelled to provide a recap of the posts I have made.  I want to share with you those posts that have gathered the most traction and some that I believe warrant a bit more attention.

 

Over the past year I have posted 52 articles, often dealing with Erroneous Thought Processes, Intuitive Thinking, and Rational Thought.  Additionally, I have explored the downstream implications of these processes with regard to politics, morality, religion, parenting, memory, willpower, and general perception.  I have attempted to be evidence-based and objective in this process – striving to avoid the very trappings of confirmation bias and the erroneous processes that I am trying to understand.  As it turns out, the brain is very complicated: and although it is the single most amazing system known to humankind, it can and does lead us astray in very surprising and alarming ways.

 

As for this blog, the top ten posts, based on the sheer number of hits, are as follows:

  1. Attribution Error
  2. Nonmoral Nature, It is what it is.
  3. Multitasking: The Illusion of Efficacy
  4. Moral Instinct
  5. Pareidolia
  6. IAT: Questions of Reliability
  7. Are You a Hedgehog or a Fox?
  8. What Plato, Descartes, and Kant Got Wrong: Reason Does not Rule
  9. Illusion of Punditry
  10. Emotion vs. Reason: And the winner is?

What started out as ramblings from a curious guy in a remote corner of New York State ended up being read by folks from all over the planet.  It has been a difficult process at times, consuming huge amounts of time, but it has also been exhilarating and deeply fulfilling.

 

I have been heavily influenced by several scientists and authors in this exploration.  Of particular importance have been Steven Pinker, Daniel Simons, Christopher Chabris, Jonah Lehrer, Bruce Hood, Carl Sagan, and Malcolm Gladwell.  Exploring the combined works of these men has been full of twists and turns that in some cases necessitated deep re-evaluation of long held beliefs.  Holding myself to important standards – valuing evidence over ideology – has been an important and guiding theme.

 

Several important concepts have floated to the top as I poked through the diverse literature pertaining to thought processes. Of critical importance has been the realization that what we have, when it comes to our thought processes, is a highly developed yet deeply flawed system that has been shaped by natural selection over millions of years of evolution.  Also important has been my increased understanding of the importance of genes, the basic element of selective pressures, as they play out in morality and political/religious beliefs.  These issues are covered in the top ten posts listed above.

 

There are other worthy posts that did not garner as much attention as those listed above.  Some of my other favorites included a review of Steven Pinker’s article in the New York Times (also titled Moral Instinct), a look at Jonathan Haidt’s Moral Foundations Theory in Political Divide, as well as the tricks of Retail Mind Manipulation and the Illusion of Attention.  This latter post and my series on Vaccines and Autism (Part 1, Part 2, Part 3) were perhaps the most important of the lot.  Having their content become general knowledge would make the world a safer place.

 

The evolution of my understanding regarding the power and importance of Intuitive relative to Rational Thinking was humbling at times, and Daniel Simons’ and Christopher Chabris’ book, The Invisible Gorilla, certainly provided a mind-opening experience.  Hey, our intuitive capabilities are incredible (as illustrated by Gladwell in Blink and Lehrer in How We Decide), but the downfalls are amazingly humbling.  I’ve covered other topics such as happiness, superstition, placebos, and the debate over human nature.

 

The human brain, no matter how remarkable, is flawed in two fundamental ways.  First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought.  Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge base regarding the workings of the world and the universe.  These leaps of knowledge have rendered those error-prone proclivities unessential for survival.  Regardless, they have remained a dominant cognitive force.  Although our intuition and rapid cognitions have sustained us, and in some ways still do, the everyday illusions impede us in important ways.

 

Secondly, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world.  Time after time I have written of the dangers of ideology with regard to its capacity to put blindfolds on adherents.  Often the blindfolds are absolutely essential to sustain the ideology.  And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized.  As I discussed in Spinoza’s Conjecture: “We all look at the world through our personal lenses of experience.  Our experiences shape our understanding of the world, and ultimately our understanding of [it] then filters what we take in.  The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”

 

Because of our genetically inscribed tendencies toward mysticism and gullibility, we must make extra effort in order to find truth. As Dr. Steven Novella once wrote:

“We must realize that the default mode of human psychology is to grab onto comforting beliefs for purely emotional reasons, and then justify those beliefs to ourselves with post-hoc rationalizations. It takes effort to rise above this tendency, to step back from our beliefs and our emotional connection to conclusions and focus on the process.”

We must therefore be humble with regard to our beliefs and be willing to accept that we are vulnerable to error-prone influences outside our awareness.  Recognition and acceptance of these proclivities are important first steps.  Are you ready to move forward?  How do you think?


As I read Steven Pinker’s book The Blank Slate: The Modern Denial of Human Nature, I was, for lack of a better word, flabbergasted by the extent of acrimony that seemingly persists regarding the nature versus nurture debate.  This parley, from my naive perspective, was over long ago.  Yet Pinker detailed the extensive history to which some intellectuals, even today, attack the notion of any genetic contribution to traits such as IQ, behavior, political views, religious views, and personality.

 

For me there is very little question about the impact of genes.  It is clear as day in my family.  My daughter, for example, is very much like me.  And I see the influence of genes nearly every day in my practice.  As a psychologist with a specialty in evaluating and treating difficult-to-manage children (i.e., those with Autism Spectrum Disorder, Oppositional Defiant Disorder, and ADHD), I often work with families who have an exceptionally strong-willed and self-directed child.  The children who have these latter traits, without autistic-like symptoms, are often classified as Oppositional Defiant.  Along with such independent-mindedness typically comes an explosive temperament and a highly sensitive and precocious level of personal dignity.  It is important to note that a vast majority of the time, the child is a proverbial chip off the old block: usually, the father was similarly quite difficult to manage as a youngster.

 

One with a nurture bias might suggest that my daughter and those oppositional children I see are simply products of their environment.  But here is what is interesting.  Often in the families I serve, there are other well-behaved, well-adjusted, and polite children.  To suggest that the environment uniquely and exclusively shaped the behavior and affect of the troubled child would require a substantial level of differential parenting in the home.  This scenario is far too common to be a product of differentiated parenting style, and thorough behavioral analysis almost always rules out this variable.  Socially, the parents are blamed for their bad kid, not because of their genetic contribution, but because of their alleged poor parenting practices.  Well, most often, poor parenting is not the cause of the problem!  And my daughter’s similarity to me unfolded despite my attempts to foster in her a unique identity of her own.

The argument really is moot.  Genes do matter!  The evidence is substantial, and it transcends the anecdotes I just shared.  Only those with an ideological position inconvenienced by this reality argue otherwise.  I would actually prefer the idea that genes don’t matter.  It would give me greater capacity to effect change in homes, given my behavior-analytic skills.  It would also give me more hope that my daughter will not develop the same geeky interests that I have.  Too late!  She is a geology major.  Like me, she loves rocks.  It would also give me hope that she won’t develop the same GI ailments that have incapacitated me, my mother, and my grandfather.  Again, too late.  Sadly, the other day she had to buy some Tums.

People are uncomfortable with the idea that traits such as personality and IQ are under any degree of genetic determination.  It seems too limiting, too materialistic, and too deterministic.  People, I think, are more comfortable with the idea that they can effect change, that they can arrange outcomes, that the power is in their hands.  But the real power, it seems, is spread out, residing both in our hands and in our genes.  Environmental determinism is in fact more consistent with my political and social views, but no matter how inconvenient, I am compelled by the evidence to soften my stance on this romantic notion.  How I wish that DNA did not enter the picture on such issues.  Or do I?  Had it not, we wouldn’t be here to write and read such musings.  You’ve heard of the whole evolution by means of natural selection thing, haven’t you?

As it turns out, we are products of both our genes and our environment.  No duh!  Debate over!  Right?  Nope!  I had assumed that it was commonly accepted that genes matter.  I had no idea that acknowledging this reality was, in a sense, sacrilegious to some.  Although Pinker made the debate clear, I suspected that perhaps this was an esoteric intellectual war of words limited to philosophical types with highbrow notions about macroeconomic models and so on.  But I became more aware of the lingering embers of environmental determinism as a result of a firestorm that erupted last week over an essay written by an environmental advocacy group, spread about on Twitter, and a subsequent article posted in the Huffington Post.  These articles essentially minimized genetic determinism in major health issues because of the failure of the Human Genome Project to isolate specific genes responsible for specific illnesses.  Out with the genes, in with the environment, the proponents celebrated.  Environmental determinists pounced on the absence of evidence as if it were evidence of absence (Carmichael, 2010).  As it turns out, genes are really complex, and diseases appear to be influenced by cohorts of genes rather than by any one specific gene.  I am less familiar with the research regarding genetic influence on disease, but the tone of the banter reminded me of the debate about human nature detailed by Pinker.

I have discussed in several recent posts the impact of genes on important issues such as personality, adaptive functioning, and even political perspectives.  The psychologist Eric Turkheimer pulled together the unusually robust evidence from extensive studies of twins (fraternal and identical) reared together and apart, as well as studies of adopted children relative to biological children, and concluded that three laws help explain the development of personality characteristics and intelligence.  The three laws are as follows:

  1. All human traits are heritable;
  2. The effect of being raised in the same family is smaller than the effect of the genes; and
  3. A substantial portion of the variation in complex human behavioral traits is not accounted for by the effects of genes or families.


Based on current research from behavioral genetics, these laws are best summarized as follows:

  1. Heredity accounts for about 50% of the variance in the adaptive functioning outcomes of children.
  2. The home environment, as it is influenced by parents, accounts for 0 to 10%, and
  3. The child’s peer group accounts for the remainder (40-50%) (Pinker, 2002).
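The logic behind twin-study estimates like these can be sketched with Falconer’s classic formulas, which partition trait variance using the correlation between identical (MZ) twins and the correlation between fraternal (DZ) twins.  The correlations below are illustrative placeholders, not values from any particular study:

```python
# Falconer's back-of-the-envelope variance partition from twin correlations.
# r_mz and r_dz below are illustrative numbers, not data from a real study.
def falconer(r_mz, r_dz):
    """Split trait variance into heritability (a2), shared/family
    environment (c2), and nonshared environment (e2)."""
    a2 = 2 * (r_mz - r_dz)  # genetic component
    c2 = r_mz - a2          # shared (family) environment: 2*r_dz - r_mz
    e2 = 1 - r_mz           # everything else, incl. measurement error
    return a2, c2, e2

# Hypothetical correlations in the ballpark often reported for personality:
a2, c2, e2 = falconer(r_mz=0.50, r_dz=0.25)
print(a2, c2, e2)  # 0.5 0.0 0.5 -> genes ~50%, family ~0%, the rest ~50%
```

Note how even modest MZ/DZ differences of this kind imply the pattern in the laws above: a large genetic share, a near-zero shared-family share, and a large remainder.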


Corresponding laws regarding the variants affecting diseases are perhaps unclear at this time.  But denial of genetic influence is much like denial of the heliocentric theory of the solar system, or like the arguments put forth by Creationists and anti-vaccine advocates.  They are guided by ideological notions that hang by a thin thread.  Something near and dear to the hearts of the proponents of exclusive environmental determinism is threatened by evidence.  The only recourse is denial.  It’s an old and tired song and dance.  Genes matter, but not exclusively.  Environment matters, but not exclusively.  Get used to it.

References:

Carmichael, M. (2010). DNA, Denial, and the Rise of “Environmental Determinism”. Wild Type. http://marycarmichael.wordpress.com/2010/12/22/dna-denial-and-the-rise-of-environmental-determinism/#comments

Katz, D. (2010). Is There a Genie in the Genome? The Huffington Post. http://www.huffingtonpost.com/david-katz-md/is-there-a-genie-in-the-g_b_792844.html

Latham, J., & Wilson, A. (2010). The Great DNA Data Deficit: Are Genes for Disease a Mirage? The Bioscience Resource Project Commentaries. http://www.bioscienceresource.org/commentaries/article.php?id=46

Pinker, S. (2002). The Blank Slate: The Modern Denial of Human Nature. New York: Penguin Books.


Halloween seems like an appropriate time to discuss superstition, what with ghosts and goblins and black cats and witches and all.  But wouldn’t Easter or Christmas, or any evening on which a five-year-old loses a tooth, be an equally appropriate time?  In actuality, we massage magical thinking into our children with notions of Santa Claus, the Easter Bunny, and the tooth fairy.  And recall, if you will, some of your favorite children’s books and the supernatural forces employed to delight your youthful whimsies.  Magic, along with the thinking employed to delight in it, is seemingly a rite of childhood, and in some ways the essence of what it is to be a child.

Much as magical thinking has its roots in childhood fantasies, superstition has its roots in our species’ youth.  In that nascent time we lacked the capacity to understand the forces and whims of the natural world around us.  Our ancestors struggled to survive, and living another day depended in part on their ability to make sense of the forces that aided or impinged upon them.  We must not forget that our forefathers lived much like the non-domesticated animals around us today.  Survival was a day-to-day reality dependent upon the availability of life-sustaining resources like food, water, and shelter, and it was often threatened by predation or the forces of nature.  Death was a real possibility and survival a real struggle.  The stakes were high and the hazards were plentiful.  As it turns out, these are the very conditions under which superstition is likely to thrive.

So what is superstition?  Bruce Hood, author of The Science of Superstition, notes that superstition is a belief “that there are patterns, forces, energies, and entities operating in the world that are denied by science…”  He adds that “the inclination or sense that they may be real is our supersense.”  It involves an inclination to attempt to “control outcomes through supernatural influence.”  It is the belief that if you knock on wood or cross your fingers you can influence outcomes in your favor.  It is the belief that faithfully carrying out rituals as part of a wedding ceremony (e.g., wearing something blue, something new, something borrowed), or before going to bat, or before giving a big speech will improve outcomes.  It is also the belief that negative outcomes can result from stepping on a crack, breaking a mirror, or spilling salt.  Hood argues that supersense goes beyond these obvious notions and surfaces in more subtle ways associated with touching an object, or entering a place, that we feel has a connection with somebody bad or evil.  For example, how would you feel if you were told that you had to wear Jeffrey Dahmer’s T-shirt, or that you were living in a house where ritualistic torture and multiple murders took place?  Most of us would recoil at the thought.  Most of us also believe (erroneously) that we can sense when someone is looking at us, even when we cannot see them doing so.  These beliefs, and much of the value we place on sentimental objects, stem from this style of thinking.

I explored the deep evolutionary roots of superstitious thinking in a previous post, The Illusion of Cause: Vaccines and Autism.  The principal underpinnings are the same.  In that post I noted the following:

Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons (2010) note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”

From an evolutionary perspective, we have thrived, in part, as a result of our tendency to infer cause or agency regardless of the reality of threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions and thus less likely, in general, to succumb to predation. Those who were inclined to ignore such stimuli were more likely to get eaten when the rustling was in fact a hungry predator. Clearly, from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem that Shermer refers to regarding this system is that we are subsequently inclined toward mystical and superstitious beliefs: giving agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, in his blog post entitled Hyperactive Agency Detection, notes that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and, as such, are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:

“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows, for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).

Bruce Hood takes this notion further, adding that the cultural factors discussed at the opening of this piece shape adult superstition, along with other intuitive inclinations such as dualism (a belief in the separation of mind and body), essentialism (the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity), vitalism (the insistence that there is some big, mysterious extra ingredient in all living things), holism (the belief that everything is connected by forces), and animism (the belief that the inanimate world is alive).  These latter belief mechanisms are developmental and naturally occurring in children: they are the tendencies that make magic and fantasy so compelling for children.  It is when they lurk in our intuition, or are sustained in our rational thought, that we as adults fall victim to this type of illusion.

It is interesting to note that, much like our ancestors, we are more prone to this type of thinking when faced with high stakes, a low probability of success, and incomprehensible controlling circumstances.  Think about it.  In baseball, batters often have complex superstitious rituals associated with batting.  The best hitters succeed only one in three times at bat, and the speed at which they have to decide whether to swing, and where to position the swing, defies the rational decision-making capacity of humans.  Yet these very same athletes have no rituals when it comes to fielding a ball (a high-probability event for the proficient).

Superstition is a natural inclination with deep evolutionary and psychological roots, embedded in normal child development.  These tendencies are nurtured and socialized as a part of child rearing and spill over into adult rituals in predictable circumstances (particularly when there is a low degree of personal control).  When one deconstructs this form of thinking, it makes complete and total sense.  This is not to suggest that reliance on superstition is sensible.  Often the costs are low and the rituals therein can be fun.  But there are potential costs associated with such thinking.  Some of the dangers are materialized in notions such as “vaccines cause autism” and “homeopathy will cure what ails you” in lieu of scientific medicine.  Resignation of personal power in deference to supernatural forces is a depressive response pattern.  Reliance on supernatural forces is essentially reliance on chance, and in some cases its application actually stacks the deck against you.  So be careful when employing such tactics.  But if you’re in the neighborhood, NEVER EVER walk under my ladder.  I’ve been known to drop my hammer.

References

Chabris, C. F., & Simons, D. J. (2010). The Invisible Gorilla. New York: Random House.

Dawkins, R. (2009). The Greatest Show on Earth: The Evidence for Evolution. New York: Free Press.

Gelman, S. A. (2004). Psychological Essentialism in Children. Trends in Cognitive Sciences, 8, 404-409.

Hood, B. (2008). The Science of Superstition (Formerly Titled: Supersense: Why We Believe in the Unbelievable). New York: HarperCollins Publishers.

Novella, S. (2010). Hyperactive Agency Detection. NeuroLogica Blog. http://www.theness.com/neurologicablog/?p=1762

Shermer, M. (2000). How We Believe. New York: W.H. Freeman/Henry Holt and Company.


I’m sure you have heard of subliminal messages. You know the classic story alleging that flashing the words DRINK COKE on a movie screen for a fraction of a second would increase cola-buying behavior at the concession stand.  Well, that was a hoax, but you should know that I can, in other ways, tap into your subconscious thoughts and make you smarter, dumber, more assertive, or more passive for a short period of time.

This is not brainwashing!  It has a different name.  In the field of psychology, this interesting phenomenon is referred to as priming.  John Bargh (now at Yale University) and colleagues, formerly at New York University, demonstrated the legitimacy of priming in a very interesting paper entitled Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action (Bargh, Chen, & Burrows, 1996).  These researchers contend “that social behavior is often triggered automatically on the mere presence of relevant situational features [and that] this behavior is unmediated by conscious perceptual or judgmental processes.”  One of the studies they used to empirically demonstrate the implications of automatic social behavior (priming) involved a group of undergraduates from NYU who were given the scrambled sentence test.  The test involves the presentation of a series of scrambled five-word groupings.  From each grouping, one is to devise a grammatical four-word sentence.  For example, one of the groupings might include the words: blue the from is sky.  From this grouping your job would be to write The sky is blue.  A typical scrambled sentence test takes about five minutes.

The scrambled sentence test is a diversion, a means to present words that may influence, or prime, the subject’s behavior, thoughts, or capabilities.  In this study the subjects were randomly assigned to one of two groups.  One group was presented with scrambled sentences sprinkled with words like “bold,” “intrude,” “bother,” “rude,” “infringe,” and “disturb.”  The second group was presented with scrambled sentences containing words like “patiently,” “appreciate,” “yield,” “polite,” and “courteous.”  Each student independently completed the test in one room and was told upon completion to walk down the hall to get the next task from an experimenter in another office.  For every subject, however, there was another student (a stooge) at the experimenter’s office asking a series of questions, forcing the subject to wait.  Bargh and colleagues predicted that those primed with words like “rude” and “intrude” would interrupt the stooge and barge in more quickly than those primed with words like “polite” and “yield.”  Bargh anticipated that the difference between the groups would be measured in milliseconds or, at most, seconds.  These were New Yorkers, after all, with a proclivity to be very assertive (Gladwell, 2005).  The results were surprisingly dramatic!

Those primed with the “rude” words interrupted after about five minutes.  Interestingly, the university board responsible for approving experiments involving human subjects limited the wait period in the study to a maximum of ten minutes.  The vast majority (82%) of those primed with the “polite” words never interrupted at all; it is unknown how long they would have waited.  The difference between the groups, based simply on the nature of the priming words, was huge!  In the same paper, Bargh et al. (1996) reported that students primed with words denoting old age (e.g., worried, Florida, lonely, gray, bingo, forgetful) walked more slowly leaving the office after completing the scrambled sentence test than they did on their way to the testing office.  It is suggested that the subjects mediated their behavior as a result of thoughts planted in their subconscious pertaining to being old.  These thoughts, in this case, resulted in the subjects behaving older (e.g., walking more slowly).

Priming one to be more or less polite or spry is interesting, but there are disturbing, and perhaps very damaging, implications of this phenomenon.

Dijksterhuis and van Knippenberg (1998), a research team from Holland, looked at how priming might affect intellectual performance.  Their subjects were divided into two random groups.  The first group was tasked for five minutes with thinking about and writing down attributes pertaining to being a college professor.  The second group was tasked with thinking about and listing the attributes of soccer hooligans.  Following this thinking and writing task, the subjects were given 47 challenging questions from the board game Trivial Pursuit.  Those in the “professorial” priming group got 55.6% of the items correct, while those primed with soccer hooliganism got only 42.6% correct.  One group was not smarter than the other; rather, it is contended that those in the “smart” frame of mind were better able to tap into their cognitive resources than those with a less erudite frame of mind.

And then there is the research by Claude Steele and Joshua Aronson (1995).  These psychologists investigated the impact on African Americans of reporting one’s race before taking a very difficult test.  They recruited African American college students and used a test made up of 20 questions from the Graduate Record Exam (GRE).  The students were randomly split into two groups.  One group had to indicate their race on the test while the other did not.  Those who indicated their race got half as many of the GRE items correct as their non-race-reporting counterparts.  Simply reporting that they were African American seemed to prime them for lower achievement.

All of these effects were accomplished completely outside the awareness of the involved parties.  In fact, this is an essential attribute: effective priming necessitates that it be done outside the subject’s awareness.  Awareness negates the effect.

Regardless, consider the implications, intended or otherwise, of such priming.  Malcolm Gladwell, in his book Blink, notes: “The results from these experiments are, obviously, quite disturbing.  They suggest that what we think of as free will is largely an illusion: much of the time, we are simply operating on automatic pilot, and the way we think and act – and how well we think and act on the spur of the moment – are a lot more susceptible to outside influences than we realize.” (p. 58).

Yes, it is disturbing on a personal level with regard to the vulnerability of rational decision making, but I am more concerned about the ethical implications of our insight into this tool.  Priming may be used by those with the power, influence, and intention to manipulate outcomes to serve ideological purposes.  On yet another level, the reality of this phenomenon supports my contention in Do we all get a fair start? that there is no true equal starting point.  Societal mores, and the media in particular, shape how we think about others and ourselves in profound ways.  We are all susceptible to stereotypes, prejudices, and biases, and these tendencies can cut in multiple directions.  They can also be used to bolster negative attitudes or weaken individuals in destructive ways.  I am not suggesting that the sky is falling or that there is a huge ideological conspiracy going on, but we must be aware of our vulnerabilities in this regard.  And we must act to avoid constraining individuals as a function of subgroup affiliation.

References

Bargh, J. A., Chen, M., & Burrows, L. (1996). Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action. Journal of Personality and Social Psychology, 71(2), 230-244.

Dijksterhuis, A., & van Knippenberg, A. (1998). The relation between perception and behavior, or how to win a game of Trivial Pursuit. Journal of Personality and Social Psychology, 74, 865-877.

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. New York: Little, Brown and Company.

Steele, C. M., & Aronson, J. (1995). Stereotype threat and the intellectual test performance of African Americans. Journal of Personality and Social Psychology, 69(5), 797-811.


Do we all get a fair start?

16 October 2010

I had an interesting conversation with a close family member the other day.  He was struggling to understand why people in the lower echelons of socioeconomic status do not understand, or act on, their ability to change their circumstances.  He firmly held the belief that the drive to achieve is universal and that we all have the same potential.  Essentially he was convinced that anyone can rise up by working hard in school or the workplace.  Those who do not achieve, he contended, are making an explicitly different choice.  Many refer to these folks as lazy freeloaders and/or cheaters.  He recounted stories from his days working at the local grocery, where people would use their public assistance checks to buy beer, cigarettes, and other nonessential items.  This is the same story I’ve heard from countless people who contend that public assistance is for lazy people content with, or highly skilled at, manipulating the system for a free ride.  I had a similar conversation with another family member recently, who was enraged about Obama shoving publicly supported health care down the throats of the American taxpayer.

We are inherently tribal people, and part of our human nature, it seems, is to be on the lookout for freeloaders.  As Jonathan Haidt’s work points out, such vigilance is inherent, to varying degrees, in all of us, as part of the ingroup-loyalty moral drive that is fundamental to social cohesion.  Freeloaders detract from the viability and survivability of the group.  This deeply emotional moral position has clear evolutionary roots that remain strong today.

No doubt, there are freeloaders among us.  There are people who scam the system, and I am guessing there will always be those who are comfortable with, or even proud of, their ability to live off the diligence and contributions of others.  Some argue that entitlement programs enable the freeloaders among us to prosper and propagate.  This may be true for some.  But we need to keep it all in perspective, and to do so there are a number of other factors to consider.

First, isn’t it interesting that we frame freeloaders at the lower end of the spectrum differently than we classify white-collar criminals?  Do they not accomplish essentially the same thing?  Both illegitimately acquire resources they are not entitled to.  And I am guessing that the true costs of white-collar crime exceed those of “welfare fraud.”  Keep in mind that the major frauds in the Medicaid system are generally perpetrated by white-collar criminals: doctors or administrators billing for un-rendered services.  Also think back to the impact of people like Bernie Madoff, who essentially stole $21 billion.  They are criminals indeed, but their crimes do not result in everyone within their income bracket being likewise identified as untrustworthy.  Granted, all crime is bad, but I have to challenge the implications of labeling an entire subset of a population as “bad” because some of them cheat.

Second, isn’t it also interesting that our hypervigilance for cheaters targets the less fortunate among us rather than the corporations that bilk the system of billions of your hard-earned dollars?  Why do we turn our anger against our fellow human beings when corporations like Exxon Mobil get huge tax subsidies while raking in billions of dollars of quarterly profit?  Then consider the financial meltdown and the huge bailouts provided to corporations deemed “too big to fail.”  The costs to our society from welfare cheaters are a pittance in comparison to the impact of the deregulated marketplace.

Third, although nobody likes a cheater, when given a chance, and a low probability of getting caught, almost everybody will cut corners or scam the system to save a buck.  And everybody knows someone who works or gets paid “under the table.”  Somehow these folks are given a pass and escape the stigma of freeloader.  My guess is that the proportion of people who cheat the system spans all income brackets, and that the actual social costs rise commensurately with income.  The disdain that we target toward the less fortunate among us, I argue, is too convenient and hugely disproportionate.  Part of this may stem from the perception that welfare fraud is more visible to us than white-collar crime.  And while white-collar crime is perpetrated by people who look and think like we do (or by faceless corporations), welfare fraud is sometimes perpetrated by people whose faces and lifestyles are different from ours.  We see these cheaters and often hear of their exploits.  I contend that much of what we hear amounts to rehashed urban myths.

The stereotype that many of us hold about the poor is inaccurate, maintained by both attribution error and confirmation bias.  And the belief that many white, middle-class, college-educated people hold – that they alone are responsible for their position in life – is reflective of self-serving bias.  Each generation launches from the shoulders of their parents, who each launched from the shoulders of their respective parents.  My children are launching from a place that is profoundly different from that of a poor African American child from the east side of Buffalo, New York, or a poor Latino child from East L.A., or a poor white child raised in remote rural Appalachia, or a white boarding-school attendee from a heavily connected, affluent Manhattan family.  The educational, social, and economic opportunities across these launching points vary in important ways that shape their perceptions, aspirations, and realities.  Heritage, and thus opportunity, play the biggest role in one’s socioeconomic status, although “the system” benefits from people believing that it is hard work and intelligence that drive wealth distribution.  Believing the American Dream keeps the masses contented.  It keeps people striving, believing that they can rise up if only they are smart enough and diligent enough.  A significant part of our population has figured this out: they are the disenfranchised.  Without hope or opportunity it is hard to buy into the myth that one can rise out of the ghetto by working hard.  It is difficult to continually swim against the current; and for the fortunate, it is sometimes hard to see that there is in fact a current when one is floating along with it.


I don’t know if you caught it the other night while you were watching the news, skimming your email, checking your Twitter and RSS feeds, and updating your Facebook status, but there was an interesting story about multitasking.  Silly me, who actually watches the news anymore?  Anyway, much of the recent buzz on this endemic behavior (among the technologically savvy) is not good.  Multitasking is a paradox of sorts, in that we tend to romanticize and overestimate our ability to split attention among multiple competing demands.  The belief goes something like this: “I’ve got a lot to do, and if I work on all my tasks simultaneously I’ll get them done faster.”  What most of us fail to realize is that when we split our attention, we are actually dividing an already limited and finite capacity in a way that hinders overall performance.  And some research is showing that chronic multitasking may have deleterious effects on one’s ability to process information even when one is not multitasking (Nass, 2009).

Advances in computer technology seem to fuel this behavior.  If you do a Google search on multitasking, you will get a mix of information on the technological wonders of machines that can multitask (a.k.a. computers) and news regarding how bad media multitasking is for you.

Think about it.  There has been increasing pressure on the workforce to be more productive, and gains in productivity have been made in lockstep with increases in personal computing power.  Applications developed on the back of this rising tide of computing capacity have made human multitasking ever more possible.  These advances include faster microprocessors, increased RAM, larger monitors, the internet itself, browsers that facilitate the use of multiple tabs, and relatively inexpensive computers with sufficient power to keep open email, word processors, Facebook, Twitter, iTunes, and YouTube.  Compound these tools with hardware that allows you to do these things on the go.  No longer are you tethered to the desktop computer with an Ethernet cable.  Wi-Fi and 3G connectivity allow all of the above activities almost anywhere via a smartphone, laptop, iPad, or notebook computer.  Also in the mix are Bluetooth headsets and other headphones that offer hands-free operation of telephones.

Currently, technology offers one the ability to divide one’s attention in ways inconceivable only a decade ago. The ease of doing so has resulted in the generalization of this behavior across settings and situations, including talking on cell phones while driving, texting while driving, texting while engaged in face-to-face personal interactions, and even cooking dinner while talking on the phone. Some of these behaviors are dangerous, some rude, and all likely lead to inferior outcomes.

 

Don’t believe it? If you don’t, you are likely among the least skilled of those who multitask. “Not me!” you may claim. Well, research has shown that those who routinely multitask are also the most confident in their ability to do so (Nass, 2009).  But when you look at the products of these “confidently proficient” multitaskers, you find the poorest outcomes.

 

Multitasking involves shifting attention from one task to another, refocusing attention, sustaining attention, and exercising ongoing judgment about the pertinence and salience of various competing demands. Doing this successfully is exceptionally difficult and likely well beyond the capacity of most human beings. Our brains can generally concentrate on only one task at a time, and as such, multitasking necessitates devoting shorter periods of time to dissimilar tasks.  As a result, overall effectiveness on all tasks is reduced.

 

Researchers at the University of Michigan Brain, Cognition and Action Laboratory, including Professor David E. Meyer, point out that the act of switching focus itself has deleterious effects. When you switch from task A to task B, you lose time in making the transition, and the time the transition takes increases with the complexity of the tasks involved. Depending on how often you transition between stimuli, you can waste as much as 40% of your productive time on task switching alone (APA, 2006).

 

Shorter periods of focus reduce overall time on task, and each transition reduces this time further. In 2005, Dr. Glenn Wilson at the Institute of Psychiatry, University of London, found that his subjects experienced a 10-point fall in IQ when distracted by incoming email and phone calls. This effect was “more than twice that found in studies of the impact of smoking marijuana” and was similar to the effect of losing a night’s sleep (BBC, 2005).

 

As for the negative long-term effects of multitasking, Dr. Nass noted that:

 

“We studied people who were chronic multitaskers, and even when we did not ask them to do anything close to the level of multitasking they were doing, their cognitive processes were impaired. So basically, they are worse at most of the kinds of thinking not only required for multitasking but what we generally think of as involving deep thought.”

 

Nass (2009) has found that these habitual multitaskers have chronic filtering difficulties, impaired capacity to manage working memory, and slower task switching. One must be careful to avoid the Illusion of Cause here. Correlation is not causation, and we must avoid inferring that multitasking causes these cognitive declines. The reverse may be true, or other undetected variables may cause both.

 

Much of the research in this area is in its infancy, and thus limited in scope and depth, so it is prudent to be a bit skeptical about whether multitasking is bad for you. But with regard to the efficacy of multitasking – when you look at the issue from an anecdotal perspective, apply the tangentially related evidence logically, and then consider the data, you have to conclude that multitasking on important jobs is not a good idea.  If you have important tasks to accomplish, it is best to focus your attention on one task at a time and to minimize distractions.  To do so, avoid the temptation to text, tweet, watch TV, check your email, talk on the phone, instant message, chat on Facebook, Skype, or otherwise divide your attention. If you believe these other distractions help you do better, you are deluding yourself and falling victim to the reinforcement systems that make multitasking enjoyable. Socializing, virtually or otherwise, is more pleasurable than the arduous process of truly working or studying.

 

You can likely apply the same principles to plumbing, cooking, housework, woodworking, etc.  The key to success, it seems, is to FOCUS on one task at a time, FINISH the job, and then move on.  You’ll save time, be more efficient, and do a better job! Remember – FOCUS & FINISH!

 

References

 

American Psychological Association. (2006, March 20). Multitasking: Switching costs. http://www.apa.org/research/action/multitask.aspx

 

BBC News. (2005). ‘Infomania’ worse than marijuana. http://news.bbc.co.uk/2/hi/uk_news/4471607.stm

 

Keim, B. (2009). Multitasking muddles brains, even when the computer is off. Wired Science News for Your Neurons. http://www.wired.com/wiredscience/2009/08/multitasking/#ixzz11LfOUISp

 

Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, 106(37). http://www.pnas.org/content/106/37/15583

 

Nass, C. (2009, August 28). Talk of the Nation, National Public Radio: Multitasking may not mean higher productivity. http://www.npr.org/templates/story/story.php?storyId=112334449

 

Seldon, B. (2009). Multitasking, marijuana, managing? http://www.management-issues.com/2009/9/21/opinion/multitasking–marijuana–managing.asp
