The Power of an Apology

15 November 2012

Saying “I’m sorry” can be very difficult for some of us. We routinely make mistakes. As Alexander Pope wrote: “To err is human; to forgive, divine.” Within any interpersonal relationship there will be inadvertent missteps or even acts of anger that hurt those close to us. It’s not a matter of if; it’s a matter of when. Forgiving is important, as Pope emphasizes, and it is also quite often a difficult thing to do. But the act of apologizing, it seems to me, can be even harder.

 

But why?

 

Obviously it necessitates swallowing one’s pride and accepting responsibility for one’s misdeeds. It also requires a departure from one’s unique view of the world and the adoption of another person’s perspective. Swallowing one’s pride is hard enough, and perspective-taking stirs up feelings of guilt. For these reasons alone, I believe that saying the two simple words “I’m sorry” is perhaps one of the bravest things a person can do.

 

There are other factors that contribute to the difficulty associated with an apology. Some view it as a tacit acknowledgement of one’s weakness. It does tend to elicit a personal feeling of vulnerability and perhaps pangs of subjugation, defeat, and loss of status. It can entwine and envelop one in an aura of incompetence and humility. No one likes such feelings; none of them elevate one’s sense of well-being. The opposite is true: they elicit dysphoric feelings that essentially punish the inclination to apologize. Thus, many avoid, ignore, or steep themselves in denial. Pointing outward and blaming the other party for causing the problem strips one of responsibility and allows escape from the unpleasantness of having to apologize. It is the easy way out, and ultimately it tends to bankrupt a relationship.

 

I really like how Stephen Covey, author of The 7 Habits of Highly Effective People, conceptualizes relationships. He analogizes a relationship to a bank account. When you treat another person with dignity and respect, you make deposits in their emotional bank account. When you hurt someone, you essentially make a withdrawal. By virtue of being in a sustained relationship, you will, over time, make a series of deposits and withdrawals. When you hurt another person and then deny your responsibility for having done so, you compound the withdrawal. And too many withdrawals can drain that person’s emotional bank account. A drained account stirs contempt and lays the foundation for the end of the relationship. A genuine apology is typically a deposit, and it can go a long way toward bringing the account back into balance. To be effective, it must be heartfelt, with an acknowledgment of the depth of harm done and full acceptance of responsibility. The result should help heal wounds, and it may even strengthen the relationship. It is a gift, because it can make forgiveness easier for the injured party. Denial, on the other hand, deepens the wound and widens the gap.

 

Saying “I’m sorry” is supposed to be difficult. It is an act of contrition, whereby one bears the full weight of the misstep and takes responsibility for it. This courageous endeavor is essential for sustaining a loving and caring relationship. The world in general, and your relationships specifically, will be better if you endeavor to be brave enough to utter these simple words. Doing the right thing is ultimately far more important than being right (Ludwig, 2009). To err is human; to apologize, heroic.

 

References:

 

Belkin, L. (2010). Why Is It So Hard to Apologize Well? The New York Times.

 

Lazare, A. (2004). Making Peace Through Apology. GreaterGood.berkeley.edu.

 

Ludwig, R. (2009). Why Is It So Hard to Say “I’m Sorry”? NBCNews.com.

 

Mumford & Sons (2010). Little Lion Man

 

O’Leary, T. (2007). 5 Steps to an Effective Apology. PickTheBrain.com.

 

 

Categories: Happiness, Psychology

Sometimes the quietest moments are the most troubling. Serenity seems to occasionally pave the way for a sequence of thoughts triggered by a song or a smell, or anything really, that ushers in a blast from the past. A cavalcade of memories then flows forth, both effortlessly and seamlessly. And all of this occurs outside of conscious control. For me, it often begins with a pleasant memory, but it can take a circuitous route, bringing me to memories that I would prefer remain inaccessible. The ending point is usually a moment in time where I come face to face with a mistake I made – usually a long-forgotten, unintentional misstep that revealed a less sensitive or perceptive side of my persona.

 

Does this sound familiar? I have long struggled to make sense of this sequence of thoughts. It’s not as though these distant missteps weigh heavily on my conscious mind. And most of the time they have little or no current relevance. Almost always the events involve a situation where I had no intention of being hurtful. So why would my brain dredge up painful events and spoil a perfectly pleasant moment? It makes little sense to me.

 

I have long felt like there is a dark and deeply self-effacing entity lurking in the shadows of my mind, just waiting for an opportunity to rain guilt on me. Really, it does feel like there is something lurking inside my mind, stalking my thoughts, waiting for a memory that can be linked back to an event that will make me feel bad about myself. Freud’s notion of the Superego seems particularly relevant, but there is no evidence of such embodied moralistic forces battling it out in the brain. There are, however, brain systems that interact in ways compellingly similar to Freud’s model with regard to active decision making. But it is not clear to me how, or why, these systems would reach back in time to spoil a moment of serenity.

 

As I understand it, the brain is composed of a complex combinatorial neuronal network that has evolved over millions of years. With this being the case, there must be either some adaptive value to this capacity to stir up guilty feelings, or it may be a side effect of some other adaptive neurological system. These hypotheses assume that this propensity is neither pathological nor unique to me. Given that these recall events do not adversely affect my life in any substantive way, beyond briefly bumming me out, and the likelihood that I am not alone in experiencing this, it must be adaptive at some level.

 

As it turns out, there appears to be evidence for a relationship between dispositional empathy and one’s proneness to feelings of guilt. In a study titled Empathy, Shame, Guilt, and Narratives of Interpersonal Conflicts: Guilt-Prone People Are Better at Perspective Taking, Karen P. Leith and Roy F. Baumeister found that guilt:

“… seems to be linked to the important cognitive components of empathy, particularly the ability to appreciate another person’s perspective (or at least to recognize that the other’s perspective differs from one’s own). Guilt-proneness is linked to both the ability and the willingness to consider the other’s perspective.”

 

So these feelings of remote guilt may indeed be adaptive in that they fuel my perspective taking capacity.  In other words, they compel me to be all the more careful and sensitive so as to facilitate better outcomes with regard to current social relationships (and thus avoid future negative recollections).  I am inherently driven to look at the other person’s perspective in most of my encounters with people. It seems that those situations that spring forth from the depths of my memory are those occasions when I did not effectively employ good perspective taking.

 

Empathy is widely accepted as an adaptive skill, and perhaps guilt-proneness provides positive feedback, thus driving one toward more effective empathy. Or perhaps the guilty feelings dredged up are experiential outliers – the memories with stronger visceral tags – the ones that are more easily dragged to the forefront as my brain meanders down memory lane. Leith and Baumeister’s research did not address the retrospective nature of experiences like mine; therefore, I continue to speculate. But this link between empathy and guilt makes sense. Or maybe this is a self-serving bias.

 
If you have a moment, please click on the link below to answer some questions that will give me some preliminary information on this empathy-guilt relationship. It’s only 5 questions – and really, it should only take a minute or so.
Click here to take survey


We humans like to think of ourselves as strong and dominant forces. Why shouldn’t we? After all, we have conquered many of our natural foes and reign supreme as rational and commanding masters of our destiny. That is what we like to think. But this may be an illusion because, as it turns out, we share our bodies with an unimaginably vast array of organisms that seem to play a substantial role in our well-being.

 

In and on your body, there are ten microorganisms for every single human cell.  They are invisible to the naked eye – microscopic actually.  For the most part they are bacteria, but also protozoans, viruses, and fungi.  This collection of organisms is referred to as the microbiome and it accounts for about three pounds of your total body weight: about the same weight as your brain.  In all, there are an estimated 100 trillion individuals thriving on your skin, in your mouth, in your gut, and in your respiratory system, among other places.  And it is estimated that there are one to two thousand different species making up this community.(2)

Image of Microscopic Bacteria

 

Since widespread acceptance of the germ theory in the late nineteenth century, we have considered bacteria the enemy. These organisms are germs, after all, and germs make us sick. This is accurate in many ways: acceptance and application of the germ theory vastly extended human life expectancy (from 30 years in the Dark Ages to 60 years in the 1930s). Other advances have since increased that expectancy to about 80 years.

 

But, as we are increasingly becoming aware, this microbiome plays a crucial role in our ability to live in the first place.  There are “good” and “bad” microbes.  But this dichotomy is not so black and white.  Some good microbes turn problematic only if they get in the wrong place (e.g., sepsis and peritonitis).  But what we must accept is that we would not survive without the good ones.  We are just beginning to learn of the extent to which they control our health and even our moods.

 

For example, some of our nutritive staples would be of very limited value if it weren’t for Bacteroides thetaiotaomicron. This microbe in our gut has the job of breaking down complex carbohydrates found in foods such as oranges, apples, potatoes, and wheat germ. Without this microbe we simply do not have the capability to digest such carbohydrates.(1) And this is just the tip of the proverbial iceberg.

 

The “beneficial” bacteria in our guts are clearly very important. They compete with the harmful bacteria, they help us digest our food, and they help our bodies produce vitamins that we could not synthesize on our own.(3) Surprisingly, these microbes may also play a significant role in our mood. In a recent study, mice fed the bacterium Lactobacillus showed a significant release of the neurotransmitter GABA, which is known to have a calming effect. When this relationship was tested in humans, a link was found between such gut bacteria and calmness at a level consistent with the efficacy of anti-anxiety pharmaceuticals.(2) This alone is amazing.

 

But wait, there’s more. Take, for example, Helicobacter pylori (H. pylori), whose job seems to be regulating acid levels in the stomach. It acts much like a thermostat, producing proteins that communicate with our cells and signal the need to tone down acid production. Sometimes things go wrong, and these proteins actually provoke gastric ulcers. This discovery resulted in an all-out war on H. pylori through the use of antibiotics. Two to three generations ago, more than 80% of Americans hosted this bacterium. Now, since the discovery of the connection with gastric ulcers, less than 6% of American schoolchildren test positive for it.(1) This is a good thing! Right?

 

Perhaps not. As we have recently come to discover, H. pylori plays an important role in our experience of hunger. Our stomach produces two hormones that regulate food intake. Ghrelin (the hunger hormone) tells your brain that you need food. Leptin, the second hormone, signals that your stomach is full. Ghrelin is ramped up when you have not eaten for a while; exercise also seems to boost ghrelin levels, and eating food diminishes them. Studies have shown that H. pylori significantly regulates ghrelin levels and that, without it, your ghrelin levels may go unregulated, leading to a greater appetite and excessive caloric intake.(1) Sound like a familiar crisis?

 

The long and the short of this latter example is that we really do not understand the downstream consequences of our widespread use of antibiotics. Obesity may be one of those consequences. When we take antibiotics, they do not specifically target the bad bacteria; they affect the good bacteria as well. It’s not just medical antibiotics that cause problems – we have increasingly created a hygienic environment that is hostile to our microbiome. We are increasingly isolating ourselves from exposure to good and bad bacteria alike, and some suggest that this is just making us sicker. See the Hygiene Hypothesis.

 

We have co-evolved with our microbiome and as such have developed an “immune system that depends on the constant intervention of beneficial bacteria... [and] over the eons the immune system has evolved numerous checks and balances that generally prevent it from becoming either too aggressive (and attacking its own tissue) or too lax (and failing to recognize dangerous pathogens).”(1) Bacteroides fragilis (B. fragilis), for example, has been found to have a profoundly important and positive impact on the immune system by keeping it in balance through “boosting its anti-inflammatory arm.” Autoimmune diseases such as Crohn’s disease, Type 1 diabetes, and multiple sclerosis have recently increased by a factor of 7-8. Concurrently, we have changed our relationship with the microbiome.(1) This relationship is not definitively established, but it clearly merits more research.

 

Gaining a better understanding of the microbiome is imperative and is, I dare say, the future of medicine. We humans are big and strong, but we can be taken down by single-celled organisms. And if we are not careful stewards of our partners in life, these meek organisms may destroy us. It is certain that they will live on well beyond our days. Perhaps they shall reclaim the biotic world they created.

 

Author’s Note: This article was written in part as a summary of (1) Jennifer Ackerman’s article The Ultimate Social Network in Scientific American (June 2012). Information was also drawn from (2) a Radiolab podcast titled “Guts” from April 2012 and (3) a story on NPR by Allison Aubrey called Thriving Gut Bacteria Linked to Good Health from July 2012.

Categories: Biology, Healthcare, Psychology

What drives you crazy about your partner? Dirty dishes left piled in the sink. Several days’ worth of laundry strewn about the bedroom. The toilet paper roll that is never replenished. She talks too much – he doesn’t talk enough. He’s always late – she’s a compulsive neat freak. These are a few of the common complaints that spouses have about their loved ones. It is well known that close intimate relationships can be very tough to sustain over time. There is something about living with someone for a long period of time that turns idiosyncratic quirks into incendiary peeves. Why is this?

 

I’ve recently finished reading Annoying: The Science of What Bugs Us by Joe Palca and Flora Lichtman. This fascinating read dives into a topic that has escaped much direct scientific scrutiny. That is amazing because “although everyone can tell you what’s annoying, few, if any, can explain why” (Palca & Lichtman, 2011). One of the topics the authors explore is the bothersome habits of intimate partners. It’s exceedingly common – if your partner drives you crazy, you are not alone.

 

What is very curious is that often the very things that attracted you to your partner are the things that, in the end, foster contempt. Palca and Lichtman explore the concept of Fatal Attraction, coined by sociologist Diane Felmlee of UC Davis. Felmlee has explored this concept for years, and she has seen this tendency in couples all over the world. In the first stage of love (Romantic Love), we are drawn in, in part, by the cute little things, the person’s novel traits, that trigger affection. But over time, those initially positive attractors often reveal an annoying flip side.

 

Why does something that attracted you to your partner get flipped into a detractor? Felmlee attributes this disillusionment to Social Exchange Theory, whereby “extreme traits have [their] rewards, but they also have costs associated with them, especially when you are in a relationship.”

  • If you were drawn to your partner because he was nice and agreeable, he may later be seen as passive and prone to letting people walk all over him.
  • If you were attracted to your partner because of her assertiveness, confidence, and self-directed demeanor, you may later find her to be stubborn and unreasonable.
  • If you were swept off your feet by his strong work ethic and motivation to be successful, you may later be disappointed because you now have an inattentive, inaccessible workaholic.
  • Someone who is a romantic, attentive, and caring suitor may later be viewed as a needy and clingy partner.
  • The passionate may become the dramatic or explosive hot-head.
  • The calm, cool, and collected becomes the aloof stoic.
  • The laid back guy becomes the lazy slob.
  • The exciting risk taker becomes the irresponsible adrenaline junkie.
  • The gregarious life of the party becomes the clown who takes nothing seriously.

 

And so it goes. Repetition seems to be a crucial contributor, notes Elaine Hatfield, a psychologist from the University of Hawaii. “The same thing keeps happening over and over again in a marriage,” she notes. Michael Cunningham, a psychologist from the University of Louisville, has come to refer to these annoying attributes as Social Allergens. The analogy with an allergen is played out in the dose effect. He notes that “small things don’t elicit much of a reaction at first” but that, with repeated exposure over time, they “can lead to emotional explosions.” Palca and Lichtman note that:

People frequently describe their partners as both “the love of my life” and “one of the most annoying people I know.”

 

Elaine Hatfield also believes that these social allergens get amplified when there is an imbalance in equity within a relationship. Equity Theory, she notes, suggests that when there is an imbalance of power, commitment, or contribution in a relationship, these quirks take on a disproportionate amount of negative value. However, if there is balance in the relationship (equity), the annoyance value of a partner’s quirks is more easily tolerated. So, if your partner is a good contributor and there is a balance of power, you are less likely to be annoyed. If, on the other hand, your needs are left unmet, or you do the lion’s share of the work around the house, or you feel unappreciated or diminished by your spouse, there is likely to be more annoyance associated with his or her quirks.

 

It is also important to note that the nature of a relationship changes over time. During the initial passionate Romantic Love stage, the couple tends to be on their best behavior. Once commitment and comfort are attained, one’s truer attributes tend to come to the surface. There tends to be less effort to conceal one’s quirks and thus increased occurrences of these social allergens.

 

Over time, increased and accelerated exposure take their toll and if there are equity issues, it’s a recipe for disaster. So, what is one to do?

 

The first step is to consider how the attributes that get to you may also have a positive side. We all have our strengths and our quirks – yes, you too have your annoying tendencies! Michael Cunningham suggests that you try to be accepting of your partner’s quirks. These behaviors are a part of who the person is. He notes that “You’ve got to take this if you want all of the other good things.”

 

Own your feelings and explore them at a deeper level, particularly with regard to the equity issues in your relationship. Arthur Aron, a psychology professor at the State University of New York at Stony Brook, urges couples to nurture their relationship. “Celebrate when something good happens to your partner,” he notes. Attend to and accentuate the positive. He also suggests engaging in novel, challenging, and exciting activities fairly often. “Anything you can do that will make your relationship better will tend to make your partner less annoying.” My suggestion is to think of a relationship as a garden that needs attention, maintenance, and nurturance. It’s impossible to rid the garden of all its weeds and pests, but the more attention and nurturance you provide, the more it will flourish. As Stephen Covey is fond of saying: “Love is a verb. Love the feeling is the fruit of love the verb.” So do loving things.

Categories: Psychology

The more I learn about the workings of the human brain, the more I am stirred by feelings that Freud may have been right. Although his theories have long since been discredited, he characterized the brain as a battleground where three forces jockeyed for control over your decision making. There was the Id, whose hedonistic impulse drove us toward self-gratification. Then there was the conscientious Superego, whose role was to compel us to make moral decisions. Finally, he believed there was the Ego, whose job was to mediate between the drives of the Id and Superego so as to facilitate adaptive navigation of the real world.

Sigmund Freud

 

Freud’s theories have always been compelling because they feel right. I often feel as if there is a tug of war going on inside my head. The struggles occur in the form of decisions to be made – whether it’s about ordering french fries or a salad, fish or steak, having a cookie or an apple, exercising or relaxing, jumping over that crevasse or avoiding it, buying a new coat or saving the money. These battles are seemingly between good choices and bad ones. But where you place the good and the bad is highly contingent on your priorities in the moment. The fries, steak, cookie, relaxing, and that new coat all seem like good ideas in the moment – they’d bring me pleasure. On the other hand, there are the downstream consequences of unnecessary calories from fat and sugar or squandered resources. It’s a classic Id versus Superego battle.

 

But of course there are no entities in the human brain whose express duties are defined as Freud characterized them.

 

Or are there?

 

Well actually, there are brain regions that wage contentious battles for control over your behaviors. Across time, different modules assert greater amounts of control than others, and thus the choices we make vary in quality. As a result of advances in technology and understanding, we are becoming increasingly aware of the key factors associated with this variation.

Nucleus-Accumbens (NAcc) highlighted in red

 

One of the systems at play in our multi-component brain is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most significant roles plays out as a result of activation of the Nucleus Accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine and we experience pleasure. Desire for an item activates the NAcc. Being in the presence of the desired item activates it further. The greater the arousal of the NAcc, the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you anticipate and then eat those fries or that steak, or buy that new coat. It is also responsible for that rush you feel when your team wins the big game (Lehrer, 2009).

Insula highlighted in teal

 

Then there is the Insula – a brain region that produces, among other sensations, unpleasantness. This center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in a social setting, or decide not to buy an item. In many cases we avoid exciting the Insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money (Blakeslee, 2007; Lehrer, 2009). When you are jonesing for that coffee or nicotine fix, it is your Insula that is making you feel badly – compelling you to feed the habit. And when you satisfy the craving, it is your NAcc that gives you that Ahhhhh! – that sense of well-being.

 

Perhaps the NAcc is Freud’s Id and the Insula Freud’s Superego?  It is actually much more complicated than this, but the overlap is interesting.

 

In an article I posted last month I wrote about the concept of an Alief.  An Alief is a primal and largely irrational fear (emotion) that arises from the deep unconscious recesses of your brain and plays a significant role in guiding some of the decisions you make.  At a very basic level, we know of two major driving forces that guide our decisions.  Broadly, the two forces are reason and emotion.  So how does this work? How do we process and deal with such diverse forces?

Orbitofrontal-Cortex (OFC) highlighted in pink

 

Neuroscientists now know that the Orbitofrontal Cortex (OFC) is the brain center that integrates a multitude of information from various brain regions, along with visceral emotions, in an attempt to facilitate adaptive decision making. Current neuroimaging evidence suggests that the OFC is involved in monitoring and learning, as well as in memorizing the potency of both reinforcers and punishers. It analyzes the available options and communicates its decisions by creating emotions that are supposed to help you decide. Next time you are faced with a difficult decision and you experience an associated emotion, this is the result of your OFC’s attempt to tell you what to do. Such feelings actually guide most of our decisions without us even knowing that it is happening.

 

The OFC operates outside your awareness, opaquely communicating with your rational decision-making center using the language of feelings. Our rational center, the Prefrontal Cortex – the more apt Freudian Ego analogy – is not as predominant as Freud suggested. In fact, it is limited in capacity: easily fatigued and easily overtaxed. See my post on Willpower for a deeper discussion of this issue.

 

So, as outlandish as Freud’s notions seem today, some aspects of his explanation of human behavior were rooted in actual brain systems. As I previously noted, these systems are much more complicated than I have described above, but in essence, there are battles waged in your head between forces that manipulate you and your choices through the use of chemical neurotransmitters. A portion of these battles occurs outside your awareness, but it is the influence of the emotions that stem from these unconscious battles that ultimately makes you feel as though there is a devil (Id) on one shoulder and an angel (Superego) on the other as your Prefrontal Cortex (Ego) struggles to make the best possible decision.

 

By understanding these systems you may become empowered to make better decisions, avoid bad choices, and ultimately take more personal responsibility for the process. It’s not the Devil that made you do it, and it’s not poor Ego Strength necessitating years of psychotherapy. It is the influence of deeply stirred emotions and manipulation occurring inside of you, and perhaps some overdependence on a vulnerable and easily overburdened Prefrontal Cortex, that leads you down that gluttonous path.

 

References

 

Blakeslee, Sandra. 2007. Small Part of the Brain, and Its Profound Effects. New York Times.

 

Gladwell, M. 2005.  Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Guild, G. 2010. Retail Mind Manipulation.  How Do You Think?

 

Guild, G. 2010. What Plato, Descartes, and Kant Got Wrong: Reason Does Not Rule.  How Do You Think?

 

Guild, G. 2010. Willpower: What is it really? How Do You Think?

 

Guild, G. 2011. Irrational Fear: It’s Just an Alief. How Do You Think?

 

Lehrer, J. 2009. How We Decide. Houghton Mifflin Harcourt: New York.


I have always said that there is a fine line between intelligence and fear.  Some fear is adaptive and entirely reasonable: particularly when the catalyst truly involves danger. There are some anxieties however, that take hold and profoundly affect behavior in unreasonable ways.

 

One personal example comes to mind to illustrate this. Last winter I was backpacking on a trail that traversed some rock city formations with deep, but relatively narrow, crevasses. Many of the cracks were unintimidating and easily traversed. There was one, however, that stopped me in my tracks. The gap was 36-40 inches across, with a sheer 25-foot drop. Under more typical circumstances, this gap would not have fazed me. Yet, in this situation, I was completely frozen.

Rock City Crevasse

To be clear there was some risk associated with this crossing. But, in my mind, the risk took on unreasonable proportions.

 

Frankly, I was both embarrassed and befuddled by this situation. Were it a stream of equal width, I would have easily hopped over it.

 

I stood there at battle with myself for what seemed like an eternity. In reality, it was probably only a minute or two. My body was hostage to a cognitive tug-of-war, my rational brain urging me to leap. “Come on,” I uttered to myself, “it’s only three feet across! You can do this!”

 

Another force in my brain countered with incapacitating doubt. Kevin, my backpacking companion, patiently waited on the other side of the crevasse after easily leaping across. I saw him do it with no difficulty. I had clear evidence that the crossing was easily within my capabilities; but the cost of a slip and a fall far overshadowed my confidence. The frustration I felt over this coup of sorts was immense. Finally, I was able to muster up enough confidence to take the leap. It was, in fact, quite easy. We hiked on, and no further mention of this humbling pause was made.

 

Many fears are like this, whether it is a fear of mice, bees, spiders, or snakes. These stimuli impose, in most circumstances, no grave threat, but the flight response they trigger in the phobic is immense. Even when a person knows that there is no reason for fear, it persists.

 

This response is akin to the reluctance that most people have about eating chocolate fudge in the shape of dog feces, or eating soup from a clean unused bedpan, or drinking juice from a glass in which a sterile cockroach has been dipped. Psychologist Paul Rozin, in his famous studies on disgust, discovered that when presented with these circumstances, most people choose not to eat the fudge or the soup, or drink from the glass – even knowing there is no real danger in doing so.  It is the irrational essence of contagion that drives these inhibitions.

 

These situations are all very different from rock climbing without ropes, where there is clear and present danger. When we are compelled to flee a truly benign stimulus, we are likely driven by an internal cognitive force that screams “RISK!” even when there is no true danger. Intriguing, isn’t it, that this innate force is so powerful that even our capacity to use reason and evidence pales in comparison.

 

Philosopher Tamar Gendler has coined the word “alief” to describe this cognitive phenomenon. She fashioned it around the word “belief,” which is a conscious manifestation of how we suppose things to be. An alief is a deep and powerful feeling of sorts that can and does play an important role in decision-making, but it is not based in reason or evidence. Beliefs can be more susceptible to such rational forces. But aliefs defy reason and exert powerful influence despite one’s attempts to rationally dispel them. This voice is intuitive, its origins are outside your awareness, and it typically appears in an attempt to facilitate self-preservation.

 

You may believe that the feces-shaped fudge is “JUST FUDGE!” but it is your alief that the fudge is excrement (as a result of its characteristic size, shape, and color) that makes it very hard to eat. I believed that hopping over the crevasse was easily within my capabilities, but it was my alief – that leaping over the gap is DANGEROUS – that kept me frozen in my tracks.

 

You see, you can simultaneously hold opposing beliefs and aliefs, and it was, in fact, these opposing forces that waged war as I stood at the edge of the precipice. You might believe that a bee is generally harmless and unlikely to sting you unless you threaten it. But it is your alief – that the bee will sting and hurt you – that triggers the autonomic arousal compelling you to flee. It is this deeply primal alief that often wins, no matter how rational you attempt to be.

 

In my situation, my belief in my leaping ability ultimately prevailed.  Perhaps this was due to my machismo or humiliation, but ultimately I fought down and defeated the alief.  It was a hard fought battle that left me feeling like a chicken despite my “victory.”

 

In retrospect, getting an understanding of this internal process has helped me come to grips with my hesitation.  And as such, I stand in awe of the internal brain systems that play out in such circumstances.

 

Perhaps in the future, when in a similar situation, I will be better prepared to deal with self-doubt as it springs forth from my lizard brain, so that I can cope with it before it builds incapacitating momentum. After all – it’s just an alief!


The Brain’s False Idols

4 December 2011

I’ve been exploring the subtleties of human cognition for nearly two years now. The most amazing and persistent lesson I’ve learned is that our ability to understand the world is limited by the way our brains work. All of us are constrained by fundamentally flawed cognitive processes, and advanced studies of human cognition, perception, and neuroanatomy all reveal this to be true. Although this lesson feels incredibly fresh to me, it is not news to mankind. Long ago, serious thinkers understood this to be true without the aid of sensitive measurement devices (e.g., fMRI) or statistical analysis.

 

It pains me a bit to have been scooped by Sir Francis Bacon, who knew this well in the early 17th century. After all, it took me two years of intensive, self-driven investigation, 18 years after getting a PhD in psychology, to come to grips with this. I have to ask, “Why isn’t this common knowledge?” and “Why wasn’t this central to my training as a psychologist?”

 

Bacon, an English lawyer, statesman, and thinker who devoted his intellect to advancing the human condition, astutely identified the innate fallibility of the human brain in his book New Organon, published in 1620. He referred to these cognitive flaws as The Four Idols. He derived the word idol from the Greek eidolon, which translates to English as a phantom or an apparition – something that, he argued, blunts or blurs logic and stands in the way of truly understanding external reality. What we know today adds greater understanding of the mechanisms of these errors, but they stand intact.

 

The terms Bacon used to describe these flaws probably made more sense in his day, but they are opaque today.  My preference is to use a more current vernacular to explain his thoughts and then back-fill with Bacon’s descriptors.  My intention is not to provide an abstract of his thesis, but rather to drive home the notion that long ago the brain’s flaws had been identified and acknowledged as perhaps the biggest barrier to the forward progress of mankind.  Much has changed since Bacon’s day, but these idols remain as true and steadfast today as they were 400 years ago.  It is important to note that Bacon’s thesis was foundational in the development of the scientific process that has ultimately reshaped the human experience.

 

I have previously written about some of the flaws that Bacon himself detailed long ago.  Bacon’s first idol can be summed up as the universal transcendent human tendencies toward Pareidolia, Confirmation Bias, and Spinoza’s Conjecture.  In other words, humans instinctively: (a) make patterns out of chaos; (b) accept things as being true because they fit within their preconceived notions of the world; (c) reject things that don’t fit within their current understanding; and (d) tend to avoid the effort to skeptically scrutinize any and all information.   These tendencies, Bacon described as the Idols of the Tribe.  To him the tribe was us as a species.  He noted that these tendencies are in fact, universal.

 

The second set of attributes seems more tribal to me because, although the first set is universal, the second varies by what we today more commonly refer to as tribes. Cultural biases and ideological tendencies shared within subsets of people make up this second idol – the Idols of the Cave. People with shared experiences tend to have specific perspectives and blind spots. Those within such tribal moral communities share these similarities and differentiate their worldviews from outsiders. People within these subgroups tend to close their minds to openness and diverse input. As such, most people innately remain loyal to the sentiments and teachings of the in-group and resist questioning tradition. Cohabitants within their respective “caves” are more cohesive as a result – but more likely to be in conflict with out-groups.

 

The third idol is more a matter of faulty, misguided, or sloppy semantics. Examples include the overuse or misapplication of vague terms or jargon. Even the perpetual “spin” we now hear is an example of this. In such situations, language is misused (e.g., quotes taken out of context) or talking points are told and retold as a means to drive a specific ideological agenda, regardless of whether there is any overlap with the facts. It is important to note that this does not necessarily have to be an act of malice; it can be unintentional. Because language can be vague, and specific words, depending on context, can have vastly different meanings, we are inherently vulnerable to the vagaries of language itself. These are the Idols of the Marketplace, where people consort, engage in discourse, and learn the news of the day. Today we would probably refer to this as the Idols of the 24-Hour News Channel or the Idols of the Blogosphere.

 

The final idol reflects the destructive power of ideology. At the core of ideology are several human inclinations that feed and sustain many of the perpetual conflicts that consume our blood and treasure and in other ways gravely harm our brothers and sisters. Deeper still, at the root of erroneous human inclinations, is the tendency that makes us vulnerable to the draw of ideologies that sustain beliefs without good reason. Such are the Idols of the Theater, where theologians, politicians, and philosophers play out their agendas to their vulnerable and inherently gullible disciples. Beliefs ultimately filter what we accept as true and false. This is how the brain works. This proclivity is so automatic and so intrinsic that in order to overcome it, we have to overtly fight it. What is most troubling is that most people don’t even know that this is occurring within them. It is this intuitive, gut-level thinking that acts as a filter and kicks out, or ignores, incongruity. And our beliefs become so core to us that, when they are challenged, it is as if we ourselves have been threatened.

 

It takes knowledge of these idols, and subsequently overt effort, to overcome them, so that we don’t become ignorant victims of our own neurology – or worse, victims of the cynical and malicious people who do understand these things to be true. We are inherently vulnerable: be aware, be wary, and strive to strike down your brain’s false idols.

 


Do you believe that economic success is just a matter of having a good work ethic and strong personal motivation?  Most people do.  But in reality this is a perfect example of the Fundamental Attribution Error and the Self Serving Bias.

 

The Fundamental Attribution Error occurs when we negatively judge the unfortunate circumstances of others as a reflection of their character rather than as a result of environmental circumstances (e.g., growing up in poverty). What is even more interesting is that when we mess up, we tend to blame environmental factors rather than accepting personal responsibility. When we are successful, however, we take credit for the outcome, assigning it to internal personal attributes and devaluing environmental contributors. This latter error is the Self Serving Bias.

 

This erroneous thinking is universal, automatic, and it is what drives a wedge between people on different points of the socio-economic spectrum.  If you believe that poor people are impoverished simply because they are lazy free-loaders, you are likely a victim of this thinking error.  The same is true if you believe that your success is completely of your own doing.

 

I have written numerous articles on the impact of poverty on early childhood development (e.g., The Effects of Low SES on Brain Development), and the bottom line is that economic deprivation weakens the social and neurobiological foundation of children in ways that have life-long implications. In this post I will summarize a review article by Knudsen, Heckman, Cameron, and Shonkoff entitled Economic, Neurobiological, and Behavioral Perspectives on Building America’s Future Workforce. This 2006 article, published in the Proceedings of the National Academy of Sciences, provides an excellent review of the research across many fields, including developmental psychology, neuroscience, and economics. It highlights the core concepts that converge on the fact that the quality of the early childhood environment is a strong predictor of adult productivity. The authors point to evidence that robustly supports the following notions:

 

  1. Genes and environment play out in an interdependent manner. Knudsen et al., (2006) noted that “… the activation of neural circuits by experience also can cause dramatic changes in the genes that are expressed (“turned on”) in specific circuits (58-60). The protein products of these genes can have far reaching effects on the chemistry of neurons and, therefore, on their excitability and architecture.”  Adverse experiences can and do fundamentally alter one’s temperament and capacity to learn throughout life.
  2. Essential cognitive skills are built in a hierarchical manner, whereby fundamental skills are laid down in early childhood and these foundational neural pathways serve as a basis upon which important higher level skills are built.
  3. Cognitive, linguistic, social, and emotional competencies are interdependent – all nascent in early childhood, when adverse environmental perturbations wreak havoc on, and across, each of these fundamental skill sets.
  4. There are crucial and time-sensitive windows of opportunity for building these fundamental competencies.  Should one fail to develop these core skills during this crucial early developmental stage, it becomes increasingly unlikely that later remediation will approximate the potential one had, if those skills were developed on schedule.  A cogent analogy here is learning a new language – it is far easier to learn a new language early in development when the language acquisition window is open, than it is later in life when this window is nearly closed.

 

In my last two posts (Halting the Negative Feedback Loop of Poverty: Early Intervention is the Key and Poverty Preventing Preschool Programs: Fade-Out, Grit, and the Rich get Richer) I discussed two successful early intervention programs (the Perry Preschool Program and the Abecedarian Project) that demonstrated positive long-term benefits with regard to numerous important social and cognitive skills. Knudsen et al. (2006) noted:

 

“At the oldest ages tested (Perry, 40 yrs; Abecedarian, 21 yrs), individuals scored higher on achievement tests, reached higher levels of education, required less special education, earned higher wages, were more likely to own a home, and were less likely to go on welfare or be incarcerated than individuals from the control groups.”

 

These findings converge with research on animal analogues investigating the neurodevelopmental impact of early stimulation versus deprivation across species. Knudsen et al. (2006) point out that:

 

  1. There are indeed cross-species negative neurodevelopmental consequences associated with adverse early developmental perturbations.
  2. There clearly are time-sensitive windows during which failure to develop crucial skills has life-long consequences.  Neural plasticity decreases with age.
  3. However, there are time-sensitive windows of opportunity during which quality programs and therapies can reverse the consequences of adverse environmental circumstances (e.g., poverty, stress, violence).

 

Early learning clearly shapes the architecture of the brain. Appropriate early stimulation fosters neural development, while impoverished environments diminish adaptive neural stimulation and thus hinder neural development. Timing, it seems, is everything. Although we learn throughout our lifespan, our capacity to learn is built upon a foundation that can be strengthened or impaired by early environmental experiences. It is very difficult to make up for lost time later in life – much as it is difficult to build a stable building on an inadequate foundation. Stimulating environments during these crucial early neurodevelopmental periods are far more efficient than remediation after the fact. These realities provide further justification for universally available, evidence-based preschool services for children at the lower end of the socio-economic spectrum. Proactive stimulation fosters stronger and more productive citizens – yet we continue to respond in a reactive manner with remedial and/or punitive measures that miss the mark. The necessary proactive response is clear.

 

References:

 

Knudsen, E. I., Heckman, J. J., Cameron, J. L., and Shonkoff, J. P. (2006). Economic, neurobiological, and behavioral perspectives on building America’s future workforce.  Proceedings of the National Academy of Sciences.  v. 103, n. 27. 10155-10162.

 


In my last post, Halting the Negative Feedback Loop of Poverty: Early Intervention is the Key, I looked at the evidence from two quality studies of preschool intervention programs that substantiated a capacity to counteract the impairing impact of growing up in economic deprivation. Both studies, the Perry Preschool Program and the Abecedarian Project, demonstrated positive long-term benefits with regard to numerous important social and cognitive skills. In this post I shall discuss some interesting issues and concepts that underlie the gains made at Perry and Abecedarian, including fade-out, grit, and positive and negative feedback loops.

 

The issue of fade-out, and its implications, is very important. In both the Perry and Abecedarian programs there were substantial positive outcomes with regard to immediate IQ and other cognitive scores. Once the children entered typical school-age programs, some of their gains, particularly their IQ (which had received a 10-15 point boost during treatment), faded away. This fade-out was strikingly true for the Perry Preschool Program but not so for the Abecedarian Project, which had a substantially more intensive program involving both longer school days and more school days per year. See Figure 1 below.

 

Figure 1

 

Despite this apparent fade-out, when the recipients of this specialized programming were assessed decades later, they did much better than non-recipients on relative life issues such as high school graduation, four-year college attendance, and home ownership. These results are encouraging on the one hand, yet puzzling on the other. Such fade-out renders programs like Head Start vulnerable to those who cherry-pick data in order to advance ideologically driven political agendas. Regardless, this does raise some important questions.

 

  1. Why do gains in IQ appear to fade-out?
  2. What skill gains account for the long-term gains made?

 

Some prominent researchers (e.g., David Barnett) question whether there is actually any true fade-out at all, suggesting that faulty research design and attrition may better explain these results. Regardless, IQ is not the sole variable at play here – if anything, these data highlight the questionable validity of the IQ construct itself relative to important life skills. If improved IQ is not the variable that results in improved social outcomes, we need to understand what happens to these children as a result of the programming they receive. One likely hypothesis has been proffered to explain these data:

 

“the intervention programs may have induced greater powers of self-regulation and self-control in the children, and … these enhanced executive skills may have manifested themselves in greater academic achievement much later in life.” (Raizada & Kishiyama, 2010).

 

Evidence for this hypothesis comes from Duckworth et al. (2005, 2007, 2009), who demonstrated that self-discipline and perseverance, or “grit,” are more predictive of academic performance than IQ and other conventional measures of cognitive ability (Raizada & Kishiyama, 2010). It appears that enhancing one’s grit triggers long-term capabilities that are self-reinforcing. Improved self-control and attentiveness foster achievement that ultimately feeds back in a positive way, making traditional school more rewarding and thus promoting even more intellectual growth (Raizada & Kishiyama, 2010). Poor children without intervention, on the other hand, appear less able to focus, attend, and sustain effort on learning, and thus enter a negative feedback loop of struggle, failure, and academic disenchantment.

The bottom line is that success begets success and failure begets failure. Stanovich (1986) offered an analogous explanation for reading proficiency: “…learning to read can produce precisely such effects: the better a child can read, the more likely they are to seek out and find new reading material, thereby improving their reading ability still further” (Raizada & Kishiyama, 2010).

 

Both the Perry Preschool and Abecedarian Programs have impressive long-term outcome data.  See figures 2 & 3 below for a summary of those data.

 

Figure 2

Figure 3

 

The efficacy of each program has spawned other programs such as the Knowledge Is Power Program and the Harlem Children’s Zone. Both of these intensive programs lack randomized assignment to treatment and non-treatment (control) groups. As a result, it is difficult to make any claims about their treatment impact on important cognitive and social skills. Given what we learned from the Perry and Abecedarian programs, I have to wonder whether it would be ethical to withhold such treatment from children randomly assigned to a control group. It now seems to me that we have an ethical obligation to short-circuit the negative feedback loop of poverty and put into place universally accessible programs that diminish and/or eradicate poverty’s crippling life-long impact.

 

We all pay a heavy price for poverty, but no one pays a greater cost than those children, who have been thrust into their circumstances, with little hope of rising out of poverty unless we join together to give them a fair shot at economic and social equality.

 

Yes, such programs cost money, but the long-term economic costs of the status quo are much greater. Pay now and build positive contributors to society, or pay later and bear greater costs for special education, prisons, Medicaid, and public assistance. It certainly pays to step back from ideology and look at the real costs – both in terms of human lives and in terms of dollars and cents. It makes no sense to continually blame the victims here. Early intervention is good fiscal policy and it is the right thing to do. It just makes sense!

 

NOTE: In a future post I will look at the evidence put forward by cognitive neuroscience for such programs.  Also see The Effects of Low SES on Brain Development for further evidence of the negative impact low SES has on children.

 

References:

 

Knudsen, E. I., Heckman, J. J., Cameron, J. L., and Shonkoff, J. P. (2006). Economic, neurobiological, and behavioral perspectives on building America’s future workforce.  Proceedings of the National Academy of Sciences.  v. 103, n. 27. 10155-10162.

 

Raizada, R. D. S., and Kishiyama, M. M. (2010). Effects of socioeconomic status on brain development, and how cognitive neuroscience may contribute to leveling the playing field.  Frontiers in Human Neuroscience. v. 4 article 3.


We humans are very good at dividing ourselves up into groups. We accomplish this in a multitude of ways. Even within homogeneous groupings we tend to find subtle ways to carve people out. It is far easier, however, when people vary by gender, ethnicity, race, class, neighborhood, region, nationality, religion, and/or sexual orientation. For some reason we are drawn to and comforted by others who share physical resemblance, culture, attitude, values, history, important symbols, and affiliations. Conversely, we are threatened by those in the outgroup. Why is this? What drives us to carve out, cast away, and divide our fellow human beings into camps of “us” and “them”? Is it a byproduct of socialization or perhaps a part of our nature?

 

I saw this very clearly growing up in a small rural town in Western New York.  Even though we were all white middle class Christian kids for the most part, we effectively divided ourselves into camps – some actively participating in the parceling and others passively falling victim to it.  There were the popular kids, the tough kids, the village kids, and the farm kids.  And as we became more “sophisticated,” the parcels emerged with more universal group titles such as the heads, the jocks, the brains, the nerds, etc.  Some kids traversed multiple groups quite effectively while others fit into no group at all.

 

It wasn’t until I went to college that I found myself among young adults who parceled out their peers in even more “enlightened” ways. I went to SUNY Geneseo, where the student body was very similar to that of my home town – again, largely a white middle-class subset of New York State, but a bit more diverse geographically and religiously. The most striking division was imposed by students from Westchester County, Long Island, and New York City, who looked at fellow New Yorkers from any location west of the Hudson River as inferior. This “geographism” was shocking to me. I was clearly in the inferior outgroup.

 

On top of that, there were sorority and fraternity groupings, valuations made by respect for one’s major, and, more subtly, by the size of the town one came from. All this being said, I enjoyed college, learned a lot, and have great respect for the institution today. I am not singling out any one town or university – I suspect that my experience was no different from what most kids encounter growing up. The point is this: we are seemingly driven to parcel ourselves. Even during my doctoral training in Cincinnati there was “geographism,” whereby people from Kentucky (just across the Ohio River) were cast in a relatively negative light by Ohioans, much as New Yorkers look down on people from Pennsylvania or New Jersey. On another level, think about the antipathy between cat lovers and dog lovers. Then there are Yankee fans and Red Sox fans (insert any sports teams whose fans divide themselves with similar acrimony). It is everywhere!

 

I was very fortunate to have a mother who encouraged me to respect diversity and not to judge others by group affiliation. She spoke out, or talked with me privately, so that I would not emulate other role models who were not so open-minded. I have always been thankful for her influence, and because of her I have always tried, in maturity, to follow her example. It’s not always easy – but I do try. Something tells me that one’s level of prejudice is not simply a function of having a great role model or a bad one. This tendency is universal, and it plays out in subtle ways that are not always evidenced as explicit, overt racism or sexism.

 

Evidence, as it turns out, is increasingly supporting my hunch.  Group prejudices are evident even in pre-vocal babies (Mahajan, 2011). This growing body of research has been supplemented by an ingenious set of studies of prejudice in nonhuman primates published recently in the Journal of Personality and Social Psychology.  The primary author, Neha Mahajan, from Yale University, was kind enough to share with me her paper entitled The Evolution of Intergroup Bias: Perceptions and Attitudes in Rhesus Macaques.

 

The researchers conducted seven different in-vivo experiments to explore whether Old World monkeys, with whom we shared a common ancestor more than 30 million years ago (Hedges & Blair, Eds., 2009), evidence human-like intergroup bias. This preliminary work establishes that we do share this trait, suggesting that prejudice may in fact be a part of our very nature. It appears that prejudicial thinking has been adaptive from an evolutionary perspective, or at least has been a vestigial stowaway linked with some other trait that has been naturally selected.

 

There is some danger in this notion. If we accept prejudice as a part of our nature, we may be inclined to devote less effort to addressing it from a social perspective. The authors are careful to point out, however, that previous research has established that prejudices can be remediated through exposure and teaching, or conversely entrenched through poor modeling. These results do not diminish the influence of nurture; instead the authors highlight the importance of understanding that our brains are pre-wired for prejudice. I have discussed human prejudice before within the context of the Implicit Associations Test (IAT), which suggests that our biases are implicit (unconscious). Although implicit attributes are difficult to measure, there is good reason to believe that we do universally, inherently, and unknowingly harbor biases. We must accept this and build programs upon this understanding, with targeted, evidence-based strategies to combat such erroneous thinking. It is part of who we are – and once again, evidence of how flawed the human brain is. Hate, bullying, homophobia, and racism – they are all part of our “monkey-brain.” Here’s hoping we can rise above it.

 

References:

 

Grewal, D. (2011).  The Evolution of Prejudice: Scientists See the Beginnings of Racism in Monkeys. Scientific American: MIND. April 5.

 

Hedges, S. B., & Blair, S. (Eds.).  (2009).  The Timetree of Life. New York, NY: Oxford University Press.

 

Mahajan, N., Martinez, M. A., Gutierrez, N. L., Diesendruck, G., Banaji, M., & Santos, L. R.  (2011).  The Evolution of Intergroup Bias: Perceptions and Attitudes in Rhesus Macaques. Journal of Personality and Social Psychology. Vol. 100, No. 3. 387-405.
