Autism and the DSM-5

19 December 2012

There has been a lot of talk in the media about the forthcoming DSM-5 and the diagnosis of Autism.  The DSM-5 is the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders, the manual clinicians use to make diagnoses pertaining to Autism and other behavioral and mental health disorders.  There are two major changes in this newest edition regarding Autism.  The first has to do with the name of the diagnosis; the second has to do with the actual diagnostic criteria used to make a diagnosis.

 

Currently, when presented with a child who exhibits some characteristics of Autism, a clinician has to determine whether or not the child exhibits a sufficient array of clinically significant symptoms to warrant a diagnosis.  This process requires the clinician to rule out other disorders that may instead be causing the problematic symptoms.  The clinician also has to make a differential diagnosis to determine which of the Pervasive Developmental Disorders best describes the child.  Many professionals, myself included, believe that the dividing lines between the various forms of Autism are difficult to distinguish.  The new DSM does away with this problem by eliminating the different labels (Autistic Disorder, Asperger’s Disorder, PDD-NOS, Childhood Disintegrative Disorder) and putting in place a more general term – Autism Spectrum Disorder (ASD).  Many researchers and clinicians agree that this change is warranted.

 

When the DSM-5 is published in May of 2013, children who previously would have been diagnosed with Autistic Disorder, Asperger’s Disorder, or PDD-NOS will be given the new diagnosis – Autism Spectrum Disorder (ASD).  A differentiation will then be made by indicating the degree of symptom severity.  Those with more classical Autism will generally be diagnosed with ASD-Severe.  At the other end of the spectrum, children who would have been diagnosed with Pervasive Developmental Disorder – Not Otherwise Specified (PDD-NOS) will likely get an ASD-Mild designation.  Those with Asperger’s may fall anywhere from ASD-Severe to ASD-Mild depending on the degree of impairment, with many likely falling in the Moderate range.  To be clear, however, classical Autism may span Severe to Mild ASD, while PDD-NOS will likely span Moderate to Mild ASD; the severity designation depends on the number and severity of symptoms present.  If your child already carries a diagnosis, little will change except perhaps how professionals refer to the disorder itself.  Your child will be referred to as being on the Autism Spectrum.

 

The second change involves a modification of the diagnostic criteria used to provide a diagnosis.  When making a diagnosis, a clinician such as myself has to have evidence of a sufficient array of the behaviors listed in the DSM.  The behaviors commonly associated with Autism make up the list of diagnostic criteria in the manual.  The new DSM includes an update of the behaviors used as these criteria.  It defines ASD by two sets of core features, namely: 1) impaired social communication and social interaction; and 2) restricted and repetitive behaviors and interests. It reorganizes the symptoms in these domains more appropriately and adds sensory interests and sensory aversions to the list.

 

The new version is touted as an improvement because it adds to and reorganizes the diagnostic criteria so that they better address the needs of people with ASD across all developmental levels and ages.  It also includes improvements intended to better capture the atypical symptom presentation of girls.  The goal of the DSM-5 is to apply what is detailed in the scientific literature so as to add precision and validity to the diagnostic process.

 

As with any change, there have been some concerns expressed in the media.  Perhaps the most frequently heard concern is the fear that those at the mildest end of the spectrum with strong cognitive capabilities will no longer qualify for the diagnosis and thus may lose services.  Advocacy groups such as Autism Speaks have been actively engaged in this revision process, and the American Psychiatric Association (the publisher of the DSM) has made statements aimed at calming these concerns.  They note that clinical judgment remains a crucial piece of the diagnostic process and that the new criteria are designed to be inclusive of those diagnosed using the current DSM-IV.  The research released by the American Psychiatric Association shows improved reliability and validity of diagnoses made using the DSM-5 and strong inclusiveness of those already diagnosed under the DSM-IV.  I have reviewed the proposed diagnostic criteria and did not find any serious concerns with regard to how they will affect my ability to make diagnoses.

 

The bottom line is that for most parents, there will be no appreciable change other than how we refer to your child.  In anticipation of this change, we have already been using the phrase Autism Spectrum Disorder, or “on the spectrum,” for quite some time now.  Diagnoses in the near term will still be made using the current DSM-IV, and thus we will still be using the terms Autistic Disorder, Asperger’s Disorder, and PDD-NOS.  It is advisable for clinicians and diagnosticians to begin using both sets of terminology so as to minimize confusion in the future.  Sharing a document such as this one with the parents of the newly diagnosed is also advised.


The Power of an Apology

15 November 2012

Saying “I’m sorry” can be very difficult for some of us.  We routinely make mistakes.  As Alexander Pope put it: “To err is human; to forgive, divine.”  Within any interpersonal relationship there will be inadvertent missteps, or even acts of anger, that hurt those close to us.  It’s not a matter of if; it’s a matter of when.  Forgiving is important, as Pope emphasizes, and it is quite often a difficult thing to do.  But the act of apologizing, it seems to me, can be even harder.

 

But why?

 

Obviously it necessitates swallowing one’s pride and accepting responsibility for one’s misdeeds.  It also requires a departure from one’s own view of the world and the adoption of another person’s perspective.  Swallowing one’s pride is hard enough, and perspective taking stirs up feelings of guilt.  For these reasons alone, I believe that saying the two simple words “I’m sorry” is perhaps one of the bravest things a person can do.

 

There are other factors that contribute to the difficulty of an apology.  Some view it as a tacit acknowledgement of one’s weakness.  It does tend to elicit a personal feeling of vulnerability, and perhaps pangs of subjugation, defeat, and loss of status.  It can entwine and envelop one in an aura of incompetence and humility.  No one likes such feelings: none of them elevate one’s sense of well-being.  The opposite is true: they elicit dysphoric feelings that essentially punish the inclination to apologize.  Thus, many avoid, ignore, or steep themselves in denial.  Pointing outward and blaming the other party for causing the problem strips one of responsibility and allows escape from the unpleasantness of having to apologize.  It is the easy way out, and ultimately it tends to bankrupt a relationship.

 

I really like how Stephen Covey, author of The Seven Habits of Highly Effective People, conceptualizes relationships.  He likens a relationship to a bank account.  When you treat another person with dignity and respect, you make deposits in their emotional bank account.  When you hurt someone, you essentially make a withdrawal.  By virtue of being in a sustained relationship, you will, over time, make a series of deposits and withdrawals.  When you hurt another person and then deny your responsibility for having done so, you compound the withdrawal.  And too many withdrawals can drain that person’s emotional bank account.  A drained account stirs contempt and lays the foundation for the end of that relationship.  A genuine apology is typically a deposit, and it can go a long way toward bringing the account back into balance.  To be effective, it must be heartfelt, with an acknowledgment of the depth of harm done and full acceptance of responsibility.  The result should help heal wounds, and it may even strengthen the relationship.  It is a gift, because it can make forgiveness easier for the injured party.  Denial, on the other hand, deepens the wound and widens the gap.

 

Saying “I’m sorry” is supposed to be difficult.  It is an act of contrition, whereby one bears the difficult weight of the misstep and takes responsibility for it.  This courageous endeavor is essential for sustaining a loving and caring relationship.  The world in general, and your relationships specifically, will be better if you endeavor to be brave enough to utter these simple words.  Doing the right thing is ultimately far more important than being right (Ludwig, 2010). To err is human; to apologize, heroic.

 

References:

 

Belkin, L. (2010). Why Is It So Hard to Apologize Well? The New York Times.

Lazare, A. (2004). Making Peace Through Apology. GreaterGood.berkeley.edu.

Ludwig, R. (2009). Why Is It So Hard to Say “I’m Sorry”? NBCNews.com.

Mumford & Sons (2010). Little Lion Man.

O’Leary, T. (2007). 5 Steps to an Effective Apology. PickTheBrain.com.

 

 


Sometimes the quietest moments are the most troubling.  Serenity seems to occasionally pave the way for a sequence of thoughts triggered by a song or a smell, or anything really, that ushers in a blast from the past.  A cavalcade of memories then flows forth both effortlessly and seamlessly.  And all of this occurs outside of conscious control.  For me, it often begins with a pleasant memory, but it can take a circuitous route, bringing me to memories that I would prefer remain inaccessible.  The ending point is usually a moment in time where I come face to face with a mistake I made – usually a long-forgotten, unintentional misstep that revealed a less sensitive or perceptive side of my persona.

 

Does this sound familiar?  I have long struggled to make sense of this sequence of thoughts.  It’s not as though these distant missteps weigh heavily on my conscious mind.  Most of the time they have little or no current relevance.  Almost always the events involve a situation where I had no intention of being hurtful.  So why would my brain dredge up painful events and spoil a perfectly pleasant moment?  It makes little sense to me.

 

I have long felt as though there is a dark and deeply self-critical entity lurking in the shadows of my mind, just waiting for an opportunity to rain guilt on me.  Really, it does feel like there is something lurking inside my mind, stalking my thoughts, waiting for a memory that can be linked back to an event that will make me feel bad about myself.  Freud’s notion of the Superego seems particularly relevant, but there is no evidence of such embodied moralistic forces battling it out in the brain.  There are, however, brain systems that interact in ways compellingly similar to Freud’s model with regard to active decision making.  But it is not clear to me how, or why, these systems would reach back in time to spoil a moment of serenity.

 

As I understand it, the brain is a complex combinatorial neuronal network that has evolved over millions of years.  That being the case, there must be either some adaptive value to this capacity to stir up guilty feelings, or it may be a side effect of some other adaptive neurological system.  These hypotheses assume that this propensity is neither pathological nor unique to me.  Given that these recall events do not adversely affect my life in any substantive way, beyond briefly bumming me out, and that I am likely not alone in experiencing this, it must be adaptive at some level.

 

As it turns out, there appears to be evidence for a relationship between dispositional empathy and one’s proneness to feelings of guilt.  In a study titled Empathy, Shame, Guilt, and Narratives of Interpersonal Conflicts: Guilt-Prone People Are Better at Perspective Taking, Karen P. Leith and Roy F. Baumeister found that guilt:

“… seems to be linked to the important cognitive components of empathy, particularly the ability to appreciate another person’s perspective (or at least to recognize that the other’s perspective differs from one’s own). Guilt-proneness is linked to both the ability and the willingness to consider the other’s perspective.”

 

So these feelings of remote guilt may indeed be adaptive in that they fuel my perspective-taking capacity.  In other words, they compel me to be all the more careful and sensitive so as to facilitate better outcomes in current social relationships (and thus avoid future negative recollections).  I am inherently driven to look at the other person’s perspective in most of my encounters with people. It seems that the situations that spring forth from the depths of my memory are those occasions when I did not employ good perspective taking.

 

Empathy is widely accepted as an adaptive skill, and perhaps guilt-proneness provides feedback that drives one toward more effective empathy.  Or perhaps the guilty feelings dredged up are experiential outliers – the memories with stronger visceral tags – the ones that are more easily dragged to the forefront as my brain meanders down memory lane.  Leith and Baumeister’s research did not address the retrospective nature of experiences like mine; therefore, I continue to speculate.  But this link between empathy and guilt makes sense.  Or maybe that is just a self-serving bias.

 
If you have a moment, please click on the link below to answer some questions that will give me some preliminary information on this empathy-guilt relationship. It’s only 5 questions – and really, it should only take a minute or so.
Click here to take survey
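
For the curious, the analysis behind such a survey is simple: score the empathy items and the guilt items, then look at the correlation between the two. Here is a minimal sketch of that computation using made-up example scores (not real survey data):

```python
# Sketch of an empathy-guilt correlation using made-up scores (not real survey data).
import statistics

empathy = [4, 5, 3, 2, 5, 4, 3, 1, 4, 5]   # hypothetical 1-5 empathy ratings
guilt   = [3, 5, 3, 2, 4, 4, 2, 1, 3, 5]   # hypothetical 1-5 guilt-proneness ratings

r = statistics.correlation(empathy, guilt)  # Pearson correlation (Python 3.10+)
print(f"Pearson r = {r:.2f}")  # a positive r would echo Leith and Baumeister's finding
```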


We humans like to think of ourselves as strong and dominant forces.  Why shouldn’t we?  After all, we have conquered many of our natural foes and reign supreme as rational and commanding masters of our destiny.  That is what we like to think.  But this may be an illusion, because, as it turns out, we share our bodies with an unimaginably vast array of organisms that seem to play a substantial role in our well-being.

 

In and on your body, there are roughly ten microorganisms for every single human cell.  They are invisible to the naked eye – microscopic, actually.  For the most part they are bacteria, but there are also protozoans, viruses, and fungi.  This collection of organisms is referred to as the microbiome, and it accounts for about three pounds of your total body weight: about the same as your brain.  In all, there are an estimated 100 trillion individuals thriving on your skin, in your mouth, in your gut, and in your respiratory system, among other places.  And it is estimated that one to two thousand different species make up this community.(2)
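
As a rough, back-of-the-envelope check of these figures (a sketch based only on the numbers cited above, not on independent measurements), the ten-to-one ratio together with the 100 trillion count implies about 10 trillion human cells, and three pounds of biomass spread across 100 trillion microbes works out to a tiny average mass per organism:

```python
# Back-of-the-envelope arithmetic on the figures cited above (illustrative only).
microbes = 100e12                 # ~100 trillion microorganisms
microbes_per_human_cell = 10      # the ten-to-one ratio cited above
biomass_kg = 3 * 0.4536           # ~3 pounds of microbial biomass, in kilograms

human_cells = microbes / microbes_per_human_cell
avg_mass_pg = biomass_kg / microbes * 1e15   # kilograms -> picograms

print(f"implied human cells: {human_cells:.1e}")           # ~1.0e+13 (10 trillion)
print(f"average mass per microbe: ~{avg_mass_pg:.0f} pg")  # ~14 picograms
```

The arithmetic is only as good as the cited estimates, but it shows that the numbers hang together at the right order of magnitude.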

Image of Microscopic Bacteria

 

Since the widespread acceptance of germ theory in the late nineteenth century, we have considered bacteria the enemy.  These organisms are germs, after all, and germs make us sick.  This is accurate in many ways: acceptance and application of germ theory vastly extended human life expectancy (from roughly 30 years in the Dark Ages to 60 years in the 1930s).  Other advances have since increased that expectancy to about 80 years.

 

But, as we are increasingly becoming aware, this microbiome plays a crucial role in our ability to live in the first place.  There are “good” and “bad” microbes, but the dichotomy is not so black and white.  Some good microbes turn problematic only if they get into the wrong place (e.g., sepsis and peritonitis).  What we must accept is that we would not survive without the good ones.  We are just beginning to learn the extent to which they control our health and even our moods.

 

For example, some of our nutritive staples would be of very limited value were it not for Bacteroides thetaiotaomicron.  This microbe in our gut has the job of breaking down complex carbohydrates found in foods such as oranges, apples, potatoes, and wheat germ.  Without it we simply do not have the capability to digest such carbohydrates.(1)  And this is just the tip of the proverbial iceberg.

 

The “beneficial” bacteria in our guts are clearly very important.  They compete with the harmful bacteria, they help us digest our food, and they help our bodies produce vitamins that we could not synthesize on our own.(3)  Surprisingly, these microbes may also play a significant role in our mood.  In a recent study, mice fed the bacterium Lactobacillus showed a significant release of the neurotransmitter GABA, which is known to have a calming effect.  When this relationship was examined in humans, a link was found between such gut bacteria and calmness at a level consistent with the efficacy of anti-anxiety pharmaceuticals.(2)  This alone is amazing.

 

But wait, there’s more.  Take for example Helicobacter pylori (H. pylori), whose job seems to be regulating acid levels in the stomach.  It acts much like a thermostat, producing proteins that communicate with our cells and signal the need to tone down acid production.  Sometimes things go wrong and these proteins actually provoke gastric ulcers.  This discovery resulted in an all-out war on H. pylori through the use of antibiotics.  Two to three generations ago, more than 80% of Americans hosted this bacterium.  Now, since the discovery of the connection with gastric ulcers, fewer than 6% of American schoolchildren test positive for it.(1)  This is a good thing! Right?

 

Perhaps not.  As we have recently come to discover, H. pylori plays an important role in our experience of hunger.  Our stomach produces two hormones that regulate food intake.  Ghrelin (the hunger hormone) tells your brain that you need food.  Leptin, the second hormone, signals that your stomach is full.  Ghrelin ramps up when you have not eaten for a while; exercise also seems to boost it, and eating diminishes it.  Studies have shown that H. pylori significantly regulates ghrelin levels and that without it your ghrelin levels may go unregulated, leading to a greater appetite and excessive caloric intake.(1)  Sound like a familiar crisis?
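
To make the thermostat analogy concrete, here is a toy model of the feedback idea (made-up numbers only, not actual physiology): a hunger signal rises between meals and drops after eating, and a “moderation” term stands in for the damping influence attributed to H. pylori. Remove the moderator and the signal drifts higher, which is the pattern the article suggests for appetite.

```python
# Toy model of a hunger signal under negative feedback (made-up numbers, not physiology).
def simulate(hours, meal_times, moderation):
    """'moderation' stands in for the damping influence attributed to H. pylori."""
    signal = 1.0
    trace = []
    for hour in range(hours):
        signal += 0.5                  # the signal ramps up while fasting
        signal -= moderation * signal  # feedback term tones the signal back down
        if hour in meal_times:
            signal *= 0.3              # eating sharply lowers the signal
        trace.append(round(signal, 2))
    return trace

meals = {8, 13, 19}
print("with moderation:   ", simulate(24, meals, moderation=0.3))
print("without moderation:", simulate(24, meals, moderation=0.0))
```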

 

The long and the short of this latter example is that we really do not understand the downstream consequences of our widespread use of antibiotics.  Obesity may be one of those consequences.  When we take antibiotics, they do not specifically target the bad bacteria; they affect the good bacteria as well.  And it’s not just medical antibiotics that cause problems – we have increasingly created a hygienic environment that is hostile to our microbiome.  We are increasingly isolating ourselves from exposure to good and bad bacteria alike, and some suggest that this is just making us sicker.  See the Hygiene Hypothesis.

 

We have co-evolved with our microbiome and as such have developed an “immune system that depends on the constant intervention of beneficial bacteria... [and] over the eons the immune system has evolved numerous checks and balances that generally prevent it from becoming either too aggressive (and attacking its own tissue) or too lax (and failing to recognize dangerous pathogens).”(1)  Bacteroides fragilis (B. fragilis), for example, has been found to have a profoundly important and positive impact on the immune system, keeping it in balance by “boosting its anti-inflammatory arm.”  Autoimmune diseases such as Crohn’s disease, Type 1 diabetes, and multiple sclerosis have recently increased by a factor of seven to eight.  Concurrently, we have changed our relationship with the microbiome.(1)  This relationship is not definitively established, but it clearly merits more research.

 

Gaining a better understanding of the microbiome is imperative, and it is, I dare say, the future of medicine.  We humans are big and strong, but we can be taken down by single-celled organisms. And if we are not careful stewards of our partners in life, these meek organisms may destroy us.  It is certain that they will live on well beyond our days.  Perhaps they shall reclaim the biotic world they created.

 

Author’s Note:  This article was written in part as a summary of (1) Jennifer Ackerman’s article The Ultimate Social Network in Scientific American (June 2012).  Information was also drawn from (2) a Radiolab podcast titled Guts from April 2012 and (3) a story on NPR by Allison Aubrey called Thriving Gut Bacteria Linked to Good Health from July 2012.


What drives you crazy about your partner? Dirty dishes left piled in the sink. Several days’ worth of laundry strewn about the bedroom. The toilet paper roll that is never replenished. She talks too much – he doesn’t talk enough. He’s always late – she’s a compulsive neat freak. These are a few of the common complaints that spouses have about their loved ones. It is well known that close intimate relationships can be very tough to sustain over time. There is something about living with someone for a long period of time that turns idiosyncratic quirks into incendiary peeves. Why is this?

 

I’ve recently finished reading Annoying: The Science of What Bugs Us by Joe Palca and Flora Lichtman. This fascinating read dives into a topic that has escaped much direct scientific scrutiny, which is surprising because “although everyone can tell you what’s annoying, few, if any, can explain why” (Palca & Lichtman, 2011). One of the topics the authors explore is the bothersome habits of intimate partners. It’s exceedingly common – if your partner drives you crazy, you are not alone.

 

What is very curious is that often the very things that attracted you to your partner are the things that, in the end, foster contempt. Palca and Lichtman explore the concept of Fatal Attraction coined by sociologist Diane Felmlee of UC Davis. Felmlee has explored this concept for years and has seen the tendency in couples all over the world. In the first stage of love (Romantic Love), we are drawn in, in part, by the cute little things, the person’s novel traits, that trigger affection. But, over time, those initially positive attractors often reveal an annoying flip side.

 

Why does something that attracted you to your partner get flipped into a detractor? Felmlee believes that this disillusionment can be explained by Social Exchange Theory, whereby “extreme traits have [their] rewards, but they also have costs associated with them, especially when you are in a relationship.”

  • If you were drawn to your partner because he was nice and agreeable, he may later be seen as passive and prone to letting people walk all over him.
  • If you were attracted to your partner because of her assertiveness, confidence, and self-directed demeanor, you may later find her to be stubborn and unreasonable.
  • If you swooned over his strong work ethic and motivation to be successful, you may later be disappointed to find you have an inattentive, inaccessible workaholic.
  • Someone who is a romantic, attentive, and caring suitor may later be viewed as a needy and clingy partner.
  • The passionate may become the dramatic or explosive hot-head.
  • The calm, cool, and collected becomes the aloof stoic.
  • The laid back guy becomes the lazy slob.
  • The exciting risk taker becomes the irresponsible adrenaline junkie.
  • The gregarious life of the party becomes the clown who takes nothing seriously.

 

And so it goes. Repetition seems to be a crucial contributor, notes Elaine Hatfield, a psychologist from the University of Hawaii. “The same thing keeps happening over and over again in a marriage,” she says. Michael Cunningham, a psychologist from the University of Louisville, has come to refer to these annoying attributes as Social Allergens. The analogy with an allergen plays out in the dose effect. He notes that “small things don’t elicit much of a reaction at first” but that with repeated exposure over time, they “can lead to emotional explosions.” Palca and Lichtman note that:

People frequently describe their partners as both “the love of my life” and “one of the most annoying people I know.”

 

Elaine Hatfield also believes that these social allergens get amplified when there is an imbalance in equity within a relationship. Equity Theory, she notes, suggests that when there is an imbalance of power, commitment, or contribution in a relationship, these quirks take on a disproportionate amount of negative value. However, if there is balance in the relationship (equity), the annoyance value of a partner’s quirks is more easily tolerated. So, if your partner is a good contributor and there is a balance of power, you are less likely to be annoyed. If, on the other hand, your needs are left unmet, or you do the lion’s share of the work around the house, or you feel unappreciated or diminished by your spouse, there is likely to be more annoyance associated with his or her quirks.

 

It is also important to note that the nature of a relationship changes over time. During the initial passionate Romantic Love stage, the couple tends to be on their best behavior. Once commitment and comfort are attained, one’s truer attributes tend to come to the surface. There tends to be less effort to conceal one’s quirks and thus increased occurrences of these social allergens.

 

Over time, increased and accelerated exposure takes its toll, and if there are equity issues, it’s a recipe for disaster. So, what is one to do?

 

The first step is to reconsider the traits that get to you and recall their positive side. We all have our strengths and our quirks – yes, you too have your annoying tendencies! Michael Cunningham suggests that you try to be accepting of your partner’s quirks. These behaviors are a part of who the person is. He notes, “You’ve got to take this if you want all of the other good things.”

 

Own your feelings and explore them at a deeper level, particularly with regard to the equity issues in your relationship. Arthur Aron, a psychology professor at the State University of New York at Stony Brook, urges couples to nurture their relationship. “Celebrate when something good happens to your partner,” he notes. Attend to and accentuate the positive. He also suggests engaging in novel, challenging, and exciting activities fairly often: “Anything you can do that will make your relationship better will tend to make your partner less annoying.” My suggestion is to think of a relationship as a garden that needs attention, maintenance, and nurturance. It’s impossible to rid the garden of all its weeds and pests, but the more attention and nurturance you provide, the more it will flourish. As Stephen Covey is fond of saying: “Love is a verb. Love the feeling is the fruit of love the verb.” So do loving things.


The more I learn about the workings of the human brain, the more I am stirred by feelings that Freud may have been right.  Although his theories have long since been discredited, he characterized the brain as a battleground where three forces jockeyed for control over your decision making.  There was the Id, whose hedonistic impulse drove us toward self-gratification.  Then there was the conscientious Superego, whose role was to compel us to make moral decisions.  Finally, he believed there was the Ego, whose job was to mediate between the drives of the Id and the Superego so as to facilitate adaptive navigation of the real world.

Sigmund Freud

 

Freud’s theories have always been compelling because they feel right.  I often feel as if there is a tug of war going on inside my head.  The struggles occur in the form of decisions to be made – whether it’s about ordering french fries or a salad, fish or steak, having a cookie or an apple, exercising or relaxing, jumping over that crevasse or avoiding it, buying a new coat or saving the money.  These battles are seemingly between good choices and bad ones.  But where you place the good and the bad is highly contingent on your priorities in the moment.  The fries, the steak, the cookie, relaxing, and that new coat all seem like good ideas in the moment – they’d bring me pleasure.  On the other hand, there are the downstream consequences of unnecessary calories from fat and sugar, or of squandered resources.  It’s a classic Id versus Superego battle.

 

But of course there are no entities in the human brain whose express duties are defined as Freud characterized them.

 

Or are there?

 

Well, actually, there are brain regions that do wage contentious battles for control over your behaviors.  Across time, different modules assert greater amounts of control than others, and thus the choices we make likewise vary in quality.  As a result of advances in technology and understanding, we are becoming increasingly aware of the key factors associated with this variation.

Nucleus-Accumbens (NAcc) highlighted in red

 

One of the key players in our multi-component brain is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most significant roles plays out through activation of the Nucleus Accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine and we experience pleasure. Desire for an item activates the NAcc; being in the presence of the desired item activates it further. The greater the arousal of the NAcc, the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you anticipate and eat those fries or that steak, or buy that new coat.  It is also responsible for the rush you feel when your team wins the big game (Lehrer, 2009).

Insula highlighted in teal

 

Then there is the Insula – a brain region that produces, among other sensations, unpleasantness. This center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in a social setting, or decide not to buy an item. In many cases we avoid exciting the Insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money (Blakeslee, 2007; Lehrer, 2009).  When you are jonesing for that coffee or nicotine fix, it is your Insula that is making you feel bad, compelling you to feed the habit.  And when you satisfy the craving, it is your NAcc that gives you that “Ahhhhh!” sense of well-being.

 

Perhaps the NAcc is Freud’s Id and the Insula Freud’s Superego?  It is actually much more complicated than this, but the overlap is interesting.

 

In an article I posted last month, I wrote about the concept of an alief.  An alief is a primal and largely irrational fear (an emotion) that arises from the deep unconscious recesses of your brain and plays a significant role in guiding some of the decisions you make.  At a very basic level, we know of two major forces that guide our decisions: reason and emotion.  So how does this work? How do we process and reconcile such diverse forces?

Orbitofrontal-Cortex (OFC) highlighted in pink

 

Neuroscientists now know that the orbitofrontal cortex (OFC) is the brain center that integrates a multitude of information from various brain regions, along with visceral emotions, in an attempt to facilitate adaptive decision making.  Current neuroimaging evidence suggests that the OFC is involved in monitoring and learning the potency of both reinforcers and punishers, and in remembering it.  It analyzes the available options and communicates its conclusions by creating emotions that are supposed to help you decide.  Next time you are faced with a difficult decision and you experience an associated emotion, that is the result of your OFC’s attempt to tell you what to do.  Such feelings actually guide most of our decisions without our even knowing that it is happening.

 

The OFC operates outside your awareness, opaquely communicating with your rational decision-making center using the language of feelings.  Our rational center, the Prefrontal Cortex, the more apt analogy for Freud’s Ego, is not as predominant as he suggested.  In fact, it is limited in capacity, easily fatigued, and easily overtaxed.  See my post on Willpower for a deeper discussion of this issue.

 

So, as outdated as Freud’s notions seem today, some aspects of his explanation of human behavior were rooted in actual brain systems.  As I previously noted, these systems are much more complicated than I have described above, but in essence, there are battles waged in your head between forces that manipulate you and your choices through the use of chemical neurotransmitters.  A portion of these battles occurs outside your awareness, but it is the influence of the emotions that stem from these unconscious battles that ultimately makes you feel as though there is a devil (Id) on one shoulder and an angel (Superego) on the other, as your Prefrontal Cortex (Ego) struggles to make the best possible decision.

 

By understanding these systems you may become empowered to make better decisions, avoid bad choices, and ultimately take more personal responsibility for the process.  It’s not the Devil that made you do it, and it’s not poor Ego Strength necessitating years of psychotherapy.  It is the influence of deeply stirred emotions and manipulation occurring inside of you, and perhaps some overdependence on a vulnerable and easily overburdened Prefrontal Cortex, that leads you down that gluttonous path.

 

References

 

Blakeslee, Sandra. 2007. Small Part of the Brain, and Its Profound Effects. New York Times.

 

Gladwell, M. 2005.  Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Guild, G. 2010. Retail Mind Manipulation.  How Do You Think?

 

Guild, G. 2010. What Plato, Descartes, and Kant Got Wrong: Reason Does Not Rule.  How Do You Think?

 

Guild, G. 2010. Willpower: What is it really? How Do You Think?

 

Guild, G. 2011. Irrational Fear: It’s Just an Alief. How Do You Think?

 

Lehrer, J. 2009. How We Decide. Houghton Mifflin Harcourt: New York.


I have always said that there is a fine line between intelligence and fear.  Some fear is adaptive and entirely reasonable, particularly when the catalyst truly involves danger. There are some anxieties, however, that take hold and profoundly affect behavior in unreasonable ways.

 

One personal example comes to mind to illustrate this. Last winter I was backpacking on a trail that traversed some rock city formations with deep, but relatively narrow, crevasses. Many of the cracks were unintimidating and easily traversed. There was one, however, that stopped me in my tracks. The gap was 36 to 40 inches across, with a sheer 25-foot drop. Under more typical circumstances, this gap would not have fazed me. Yet, in this situation, I was completely frozen.

Rock City Crevasse

To be clear, there was some risk associated with this crossing. But, in my mind, the risk took on unreasonable proportions.

 

Frankly, I was both embarrassed and befuddled by this situation. Were it a stream of equal width, I would have easily hopped over it.

 

I stood there at battle with myself for what seemed like an eternity. In reality, it was probably only a minute or two.  My body was hostage to a cognitive tug-of-war. My rational brain urged me to leap: “Come on,” I muttered to myself, “it’s only three feet across! You can do this!”

 

Another force in my brain countered with incapacitating doubt.  Kevin, my backpacking companion, patiently waited on the other side of the crevasse after easily leaping across. I saw him do it with no difficulty.  I had clear evidence that the crossing was easily within my capabilities; but the cost of a slip and a fall far overshadowed my confidence. The frustration I felt over this coup of sorts was immense. Finally, I was able to muster enough confidence to take the leap. It was, in fact, quite easy.  We hiked on, and no further mention of this humbling pause was made.

 

Many fears are like this, whether of mice, bees, spiders, or snakes. These stimuli pose, in most circumstances, no grave threat, but the flight response they trigger in the phobic is immense. Even when a person knows that there is no reason for fear, it persists.

 

This response is akin to the reluctance most people have about eating chocolate fudge in the shape of dog feces, eating soup from a clean, unused bedpan, or drinking juice from a glass in which a sterile cockroach has been dipped. Psychologist Paul Rozin, in his famous studies on disgust, found that when presented with these circumstances, most people choose not to eat the fudge or the soup, or drink from the glass, even knowing there is no real danger in doing so.  It is the irrational essence of contagion that drives these inhibitions.

 

These situations are all very different from rock climbing without ropes, where there is clear and present danger. When we are compelled to flee a truly benign stimulus, we are likely driven by an internal cognitive force that screams “RISK!” even when there is no true danger.  Intriguing, isn’t it, that this innate force is so powerful that even our capacity to use reason and evidence pales in comparison.

 

Philosopher Tamar Gendler has coined the word “alief” to describe this cognitive phenomenon.  She fashioned it after the word “belief,” which is a conscious manifestation of how we suppose things to be.  An alief is a deep and powerful feeling of sorts that can and does play an important role in decision-making, but it is not based in reason or evidence.  Beliefs can be more susceptible to such rational forces, but aliefs defy reason and exert powerful influence despite one’s attempts to rationally dispel them.  This voice is intuitive, its origins are outside your awareness, and it typically appears in the service of self-preservation.

 

You may believe that the feces-shaped fudge is “JUST FUDGE!” but it is your alief that the fudge is excrement (as a result of its characteristic size, shape, and color) that makes it very hard to eat.  I believed that hopping over the crevasse was easily within my capabilities, but it was my alief that leaping over the gap was DANGEROUS that kept me frozen in my tracks.

 

You see, you can simultaneously hold opposing beliefs and aliefs, and it was, in fact, these opposing forces that waged war as I stood at the edge of the precipice.  You might believe that a bee is generally harmless and unlikely to sting you unless you threaten it.  But it is your alief that the bee will sting and hurt you that triggers the autonomic arousal compelling you to flee.  It is this deeply primal alief that often wins, no matter how rational you attempt to be.

 

In my situation, my belief in my leaping ability ultimately prevailed.  Perhaps this was due to machismo or humiliation, but ultimately I fought down and defeated the alief.  It was a hard-fought battle that left me feeling like a chicken despite my “victory.”

 

In retrospect, getting an understanding of this internal process has helped me come to grips with my hesitation.  And as such, I stand in awe of the internal brain systems that play out in such circumstances.

 

Perhaps in the future, when in a similar situation, I will be better prepared to deal with self-doubt as it springs forth from my lizard brain, so that I can cope with it before it builds incapacitating momentum.  After all, it’s just an alief!


Science has a PR problem.  Perhaps it is because science is responsible for some technological developments that have outpaced our moral capacity.  Or perhaps it is because the knowledge bestowed upon us through the scientific process increasingly pushes God out of the gaps.  But some are irritated by “scientists” who arrogantly assert absolute truths about the universe when, in actuality, underneath their assertions there are only probabilities with error bars.

 

I believe that one of the most fundamental problems with science is that we cannot see it.  The vastness of time and space, and the minuteness of science’s current edge, defy the senses.  We do not have the capacity to imagine the scope and breadth of time involved in the formation of the universe, or even the time scale of the evolution of complex life.  It is beyond our capacity to imagine how insignificant our place in the cosmos is.  Likewise, the realities of life at the cellular level and the complexity of interactions at the subatomic level escape logic and defy the rules by which we live our lives.

 

Science is a juggernaut of increasing and unapproachable complexity.  No longer are great discoveries made with home-made telescopes or in monastery greenhouses.  Science has become so specialized, and its focus so minute or so vast, that it lies beyond everyday human experience.  The technical and mathematical skills required, and the sophistication of the instruments employed, take us deeper and deeper, and further and further, beyond anything that most of us can comprehend.

 

These realities bring science to the level of science fiction.  I once read a bumper sticker that said “I don’t have enough faith to believe in science.”  Although that sticker was posted by a Christian troubled by science’s role in the diminishment of God, it strikes me that it may, on another level, reflect the degree of detachment science has achieved through its very own progress.  If one does not truly understand the scientific process and the intense intellectual scrutiny built into that process, it is easy to assume that faith is necessary to believe in science. To the average person, buying what science tells us does require a leap of faith.

 

Yet, there is a fundamental difference between science and faith.  I once heard Donald Johanson talk about Lucy, his famous find.  In 1974 Johanson found a fossil that dramatically changed the way we conceptualized hominid evolution.  Lucy was a 3.2-million-year-old Australopithecus afarensis fossil that provided evidence that hominids walked upright before the brain got bigger.  It had been believed up until then that in hominids a bigger brain evolved first, giving our ancestral kin the smarts needed to survive a ground-based and bipedal existence. The paradigm shifted based on this new evidence.  Such is the way of science.  In his talk, Dr. Johanson clearly and simply differentiated science and faith.  What he said was:

Science is evidence without certainty, while faith is certainty without evidence.

 

I guess it boils down to what degree one values evidence.

 

A related issue is that the results of science are sometimes portrayed with too much certainty, and sometimes writers overreach in their interpretation of findings.  This is a legitimate concern.  The more scrutiny I give science, the more I see that this problem generally emanates from science writers (journalists) rather than from the scientific community.  Humility and acknowledgement of the limits of one’s findings (i.e., error bars) are the hallmarks of good science.  This becomes increasingly important as we investigate deeply remote phenomena, be it the quantum realm, the formation of the universe, or even the geological evolution of our planet.  Science attempts to form a clear picture when only intermittent pixels are accessible.

 

A wonderful example of such humility is evidenced in Charles Darwin’s On the Origin of Species. Some people use his own skeptical analysis as a refutation of his theory; reading the book negates such an argument.  Every paper published in a reputable peer-reviewed journal includes a Discussion section where the authors detail the potential flaws and confounds, as well as suggested areas of improvement for future research.  If one accesses the actual science itself, this humility is evident.  But in the media, overreaching is commonplace, and it warrants reasonable suspicion.

 

There are, however, areas of science where the evidence is so broad and so complete that certainty is justifiably asserted.  Evolution by means of natural selection is one of those areas.  Yet evolution and the age of the Earth, for example, run into controversy where they intersect with the beliefs of those who hold to a literal interpretation of the Bible. This is where two worldviews diverge, or more aptly, collide.

 

Long ago, when we lacked an understanding of geology, meteorology, the germ theory of disease, and neurology, people tried to make sense of events like floods, earthquakes, tsunamis, hurricanes, droughts, plagues, seizures, depression, mania, and dementia.  When substantial, catastrophic, and seemingly random events occur, it is our nature to seek out patterns that help us make sense of them.  Vengeful deities were historically the agents of such destructive forces.  Just as we are universally driven to explain our origins, as evidenced by a plethora of diverse creation stories, we are compelled to make sense of our destruction.  As we have developed a better understanding of the world around us, little by little, God as a creative and destructive force has been displaced.

 

This increased material understanding of our world poses a serious threat to literal religion.  For most scientists, though, the target is not the destruction of God; on the contrary, knowledge is the goal.  Unfortunately, because of this looming and powerful threat, science and knowledge have become targets for some religious people.  The problem with science is that it threatens deeply held ideological belief systems that, at their core, value faith over evidence.

 

It comes back to that evidence question again.  As humans, we are more compelled by stories that provide comfort and give significance to our existence than by data that demand humility.  This is not a problem with science; it is a problem with the human brain.

 


The Brain’s False Idols

4 December 2011

I’ve been exploring the subtleties of human cognition for nearly two years now.  The most amazing and persistent lesson I’ve learned is that our ability to understand the world is limited by the way our brains work.  All of us are constrained by fundamentally flawed cognitive processes, and advanced studies of human cognition, perception, and neuroanatomy all reveal this to be true.  Although this lesson feels incredibly fresh to me, it is not news to mankind.  Long ago, serious thinkers understood this to be true without the aid of sensitive measurement devices (e.g., fMRI) or statistical analysis.

 

It pains me a bit to have been scooped by Sir Francis Bacon, who knew this well in the early 17th century.  After all, it took me two years of intensive, self-driven investigation, 18 years after getting a PhD in psychology, to come to grips with this.  I have to ask: “Why isn’t this common knowledge?” and “Why wasn’t this central to my training as a psychologist?”

 

Bacon, an English lawyer, statesman, and thinker who devoted his intellect to advancing the human condition, astutely identified the innate fallibility of the human brain in his book New Organon, published in 1620.  He referred to these cognitive flaws as the Four Idols.  He derived the word idol from the Greek eidolon, meaning a phantom or an apparition that, he argued, blunts or blurs logic and stands in the way of truly understanding external reality.  What we know today adds greater understanding of the mechanisms of these errors, but they stand intact.

 

The terms Bacon used to describe these flaws probably made more sense in his day, but they are opaque today.  My preference is to explain his thoughts in a more current vernacular and then back-fill with Bacon’s descriptors.  My intention is not to provide an abstract of his thesis, but rather to drive home the notion that long ago the brain’s flaws had been identified and acknowledged as perhaps the biggest barrier to the forward progress of mankind.  Much has changed since Bacon’s day, but these idols remain as true and steadfast today as they were 400 years ago.  It is important to note that Bacon’s thesis was foundational in the development of the scientific process that has ultimately reshaped the human experience.

 

I have previously written about some of the flaws that Bacon himself detailed long ago.  Bacon’s first idol can be summed up as the universal human tendencies toward Pareidolia, Confirmation Bias, and Spinoza’s Conjecture.  In other words, humans instinctively: (a) make patterns out of chaos; (b) accept things as being true because they fit within their preconceived notions of the world; (c) reject things that don’t fit within their current understanding; and (d) tend to avoid the effort of skeptically scrutinizing any and all information.  These tendencies Bacon described as the Idols of the Tribe.  To him, the tribe was us as a species; he noted that these tendencies are, in fact, universal.

 

The second set of attributes seems more tribal to me because, although the first set is universal, the second varies by what we today more commonly refer to as tribes.  Cultural biases and ideological tendencies shared within subsets of people make up this second idol – the Idols of the Cave.  People with shared experiences tend to have specific perspectives and blind spots.  Those within such tribal moral communities share these similarities and differentiate their worldviews from those of outsiders.  People within these subgroups tend to close their minds to diverse input.  As such, most people innately remain loyal to the sentiments and teachings of the in-group and resist questioning tradition.  Cohabitants within their respective “caves” are more cohesive as a result – but more likely to be in conflict with out-groups.

 

The third idol is more a matter of faulty, misguided, or sloppy semantics.  Examples include the overuse or misapplication of vague terms or jargon.  Even the perpetual “spin” we now hear is an example of this.  In such situations, language is misused (e.g., quotes used out of context) or talking points are told and retold as a means to drive a specific ideological agenda, regardless of whether there is any overlap with the facts.  This does not have to be an act of malice; it can be unintentional.  Because language can be vague, and specific words can have vastly different meanings depending on context, we are inherently vulnerable to the vagaries of language itself.  These are the Idols of the Marketplace, where people consort, engage in discourse, and learn the news of the day.  Today we would probably refer to them as the Idols of the 24-Hour News Channel or the Idols of the Blogosphere.

 

The final idol reflects the destructive power of ideology.  At the core of ideology are several human inclinations that feed and sustain many of the perpetual conflicts that consume our blood and treasure and in other ways gravely harm our brothers and sisters.  Deeper still, at the root of erroneous human inclinations, is the tendency that makes us vulnerable to the draw of ideologies that sustain beliefs without good reason.  Such are the Idols of the Theater, where theologians, politicians, and philosophers play out their agendas to their vulnerable and inherently gullible disciples.  Beliefs ultimately filter what we accept as true and false.  This is how the brain works.  This proclivity is so automatic and so intrinsic that in order to overcome it, we have to overtly fight it.  What is most troubling is that most people don’t even know that this is occurring within them.  It is this intuitive, gut-level thinking that acts as a filter and kicks out or ignores incongruity.  And our beliefs become so core to us that when they are challenged, it is as if we ourselves have been threatened.

 

It takes knowledge of these idols, and subsequently overt effort, to overcome them, so that we don’t become ignorant victims of our own neurology, or worse, victims of the cynical and malicious people who do understand these things to be true.  We are inherently vulnerable – be aware, be wary, and strive to strike down your brain’s false idols.

 


Mahatma Gandhi once said that “poverty is the worst form of violence.”  At the very least, poverty appears to be a neurotoxin.  Evidence continues to build a solid case for the notion that poverty is self-propagating and that the mechanism of this replication takes place in the neuroanatomy of the innocent children reared in environmental deprivation.

 

In my article titled The Effects of Low SES on Brain Development, I review a study providing clear quantitative data indicating that children raised in low-SES environments show diminished brain activity relative to their more affluent peers.  The impact of low SES on brain activity was so profound that the brains of these poor kids were comparable to those of individuals who had suffered actual physical brain damage.  These data, gathered through EEG, are a non-specific measure that provides no clear understanding of what underlies the diminished functioning.  In other words, they evidence diminished brain activity, but they do not specifically identify what has occurred in the brain to produce these differences.

 

Jamie Hanson and colleagues from the University of Wisconsin-Madison and Harvard University published a paper titled Association Between Income and the Hippocampus in the peer-reviewed online journal PLoS ONE that points to one possible culprit.  Their study shows, in a measurable way, how poverty may hinder growth of the hippocampus, a very important brain region associated with learning and memory.

 

In non-human animal studies, it has been shown that environmental enrichment is associated with “greater dendritic branching and wider dendritic fields; increased astrocyte number and size, and improved synaptic transmission in portions of the hippocampus” (Hanson et al., 2011).  This essentially means that environmental enrichment enhances the density and functional capacity of the hippocampus.  In humans, parental nurturance, contact, and environmental stimulation have been associated with improved performance on tasks (such as long-term memory formation) greatly influenced by the hippocampus. On the flip side, it has also been demonstrated that stress, inadequate environmental nurturance, and low stimulation have the opposite effect (thinning hippocampal density).

 

Hanson et al. (2011) hypothesized that hippocampal density would be positively related to gradients in parental income: affluent children would evidence greater hippocampal density (associated with better learning, memory, and emotional control), while their low-income counterparts would evidence diminished density.  They used data from MRI studies to measure actual hippocampal gray matter density in a large cross-section of children (ages 4 to 18) across the United States.  They also collected data on the income and education level of each participant’s parents.  As control measures, they quantified whole-brain volume and the density of the amygdala, a brain region that does not vary as a function of environmental perturbation or enrichment.  These latter variables were important because they help rule out brain-size variation associated with other confounding factors.  The researchers hypothesized that these control measures would not vary with income.
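
To illustrate the logic of that design, here is a sketch using simulated numbers (not the authors’ data or their actual analysis pipeline). Regressing hippocampal density on parental income while controlling for whole-brain volume lets the control variable soak up overall brain size, so any remaining income effect is specific to the hippocampus; the toy data below build in an income effect for the hippocampus but not for the amygdala, which is the pattern the authors report in the next paragraph.

```python
# Illustrative sketch of the analytic logic with simulated data
# (not Hanson et al.'s data or their actual pipeline).
import numpy as np

rng = np.random.default_rng(0)
n = 300
income      = rng.normal(50, 20, n)      # hypothetical parental income (thousands of $)
whole_brain = rng.normal(1200, 80, n)    # hypothetical whole-brain volume (cm^3)
hippocampus = 3.5 + 0.004 * income + 0.002 * whole_brain + rng.normal(0, 0.3, n)
amygdala    = 1.5 + 0.001 * whole_brain + rng.normal(0, 0.1, n)  # no income effect built in

def income_coefficient(region):
    # Least-squares fit: region ~ intercept + income + whole_brain
    X = np.column_stack([np.ones(n), income, whole_brain])
    beta, *_ = np.linalg.lstsq(X, region, rcond=None)
    return beta[1]  # coefficient on income, holding whole-brain volume constant

print("income effect on hippocampus:", round(income_coefficient(hippocampus), 4))  # ~0.004
print("income effect on amygdala:   ", round(income_coefficient(amygdala), 4))     # ~0
```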

 

The accompanying figure shows sagittal, axial, and coronal brain slices with the hippocampus highlighted in yellow and the amygdala in turquoise.

 

Their measures confirmed each of their hypotheses.  Amygdala and whole-brain volume did not vary with parental income, but hippocampal density did.  Children with parents at the lower end of the income spectrum evidenced lower hippocampal density than children from more affluent families.  The authors wrote that “taken together, these findings suggest that differences in the hippocampus, perhaps due to stress tied to growing up in poverty, might partially explain differences in long-term memory, learning, control of endocrine functions, and modulation of emotional behavior” (Hanson et al., 2011).

 

The authors carefully noted that this correlation is not necessarily indicative of causation, and that more specific longitudinal measures, along with direct measures of cognitive functioning, environmental stress, and stimulation, are necessary to truly understand the association between income and these neurobiological outcomes.  They also noted that the data set was limited to children unaffected by mental health issues or low intelligence.  As such, the data set likely underestimates the actual variation in hippocampal volume, because children at the lower end of the income spectrum have disproportionately high rates of these issues.

 

These results confirm and fit with a growing and already substantial set of findings that implicate poverty as a neurotoxin that sets up a self-sustaining feedback loop.  Poverty seems to weaken the foundation on which are built the fundamental skills and capabilities that ultimately facilitate adaptive functioning and positive contributions to society.  A weak foundation hinders such capacities.

 

I have previously posted articles titled Halting the Negative Feedback Loop of Poverty: Early Intervention is the Key; Poverty Preventing Preschool Programs: Fade-Out, Grit, and the Rich get Richer; and The Economic, Neurobiological, and Behavioral Implications of Poverty.  In these articles I review various other studies that address this issue, but I also highlight the steps that can be taken to remediate the problem.  There really is not much question about the steps we as a society should take.  A recent series of articles published in the UK’s Lancet drives this point home!

 

In one particular article, titled Strategies for reducing inequalities and improving developmental outcomes for young children in low-income and middle-income countries, the authors noted that:

 

“A conservative estimate of the returns to investment in early child development is illustrated by the effects of improving one component, preschool attendance. Achieving enrolment rates of 25% per country in 1 year would result in a benefit of US$10·6 billion and achieving 50% preschool enrolment could have a benefit of more than $33 billion (in terms of the present discounted value of future labour market productivity) with a benefit-to-cost ratio of 17·6. Incorporating improved nutrition and parenting programmes would result in a larger gain.”

 

The monetary value alone seems sufficient to motivate implementation.  For each dollar spent on quality preschool programs, we ultimately gain up to $17.60 in labor market productivity alone.  This does not account for decreased expenditures on special education, incarceration, and other social safety net programs.  Quality preschool programming has been shown to increase high school graduation rates and home ownership rates.  If we as a society are truly driven to promote human flourishing, equal opportunity for all, and a level playing field, then we must, I argue, take action to provide universal access to quality preschool programs, particularly for poor children.  What I propose is not a hand-out, but a fiscally responsible hand-up that benefits each and every one of us.
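
Reading the quoted figures the other way around gives a rough sense of the price tag. This is simple arithmetic on the numbers in the Lancet passage (assuming the 17.6 benefit-to-cost ratio applies to both enrollment scenarios), not additional data:

```python
# Simple arithmetic on the Lancet figures quoted above (no additional data).
benefit_to_cost = 17.6          # benefit-to-cost ratio reported by Engle et al. (2011)

for benefit_billions in (10.6, 33.0):           # benefits at 25% and 50% enrollment
    implied_cost = benefit_billions / benefit_to_cost
    print(f"${benefit_billions}B benefit implies roughly ${implied_cost:.1f}B invested")

print(f"each $1 invested returns about ${benefit_to_cost:.2f} in future productivity")
```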

 

References:

 

Engle, P. L., Fernald, L. C. H., Alderman, H., Behrman, J., O’Gara, C., Yousafzai, A., de Mello, M. C., Hidrobo, M., Ulkuer, N., Ertem, I., Iltus, S., & the Global Child Development Steering Group (2011). Strategies for reducing inequalities and improving developmental outcomes for young children in low-income and middle-income countries. The Lancet, Early Online Publication, 23 September 2011. doi:10.1016/S0140-6736(11)60889-1

 

Hanson, J. L., Chandra, A., Wolfe, B. L., & Pollak, S. D. (2011). Association between Income and the Hippocampus. PLoS ONE 6(5): e18712. doi:10.1371/journal.pone.0018712
