The more I learn about the workings of the human brain, the more I am stirred by the feeling that Freud may have been right.  Although his theories have long since been discredited, he characterized the brain as a battleground where three forces jockeyed for control over your decision making.  There was the Id, whose hedonistic impulses drove us toward self-gratification.  Then there was the conscientious Superego, whose role was to compel us to make moral decisions.  Finally, he believed, there was the Ego, whose job was to mediate between the drives of the Id and the Superego so as to facilitate adaptive navigation of the real world.

Sigmund Freud

 

Freud’s theories have always been compelling because they feel right.  I often feel as if there is a tug of war going on inside my head.  The struggles occur in the form of decisions to be made – whether it’s ordering french fries or a salad, fish or steak, having a cookie or an apple, exercising or relaxing, jumping over that crevasse or avoiding it, buying a new coat or saving the money.  These battles are seemingly between good choices and bad ones.  But where you place the good and the bad is highly contingent on your priorities in the moment.  The fries, the steak, the cookie, relaxing, and that new coat all seem like good ideas in the moment – they’d bring me pleasure.  On the other hand, there are the downstream consequences of unnecessary calories from fat and sugar, or of squandered resources.  It’s a classic Id versus Superego battle.

 

But of course there are no entities in the human brain whose express duties are defined as Freud characterized them.

 

Or are there?

 

Well actually, there are brain regions that do wage contentious battles for control over your behaviors.  Across time, different modules assert greater control than others, and thus the choices we make vary in quality.  As a result of advances in technology and understanding, we are becoming increasingly aware of the key factors associated with this variation.

Nucleus Accumbens (NAcc) highlighted in red

 

One of the systems at play in our multi-component brain is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most significant roles plays out through activation of the Nucleus Accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine and we experience pleasure. Desire for an item activates the NAcc. Being in the presence of the desired item activates it further. The greater the arousal of the NAcc, the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you anticipate and then eat those fries or that steak, or buy that new coat.  It is also responsible for the rush you feel when your team wins the big game (Lehrer, 2009).

Insula highlighted in teal

 

Then there is the Insula – a brain region that produces, among other sensations, unpleasantness. This center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in a social setting, or decide not to buy an item. In many cases we avoid exciting the Insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money (Blakeslee, 2007; Lehrer, 2009).  When you are jonesing for that coffee or nicotine fix, it is your Insula that is making you feel bad – compelling you to feed the habit.  And when you satisfy the craving, it is your NAcc that gives you that Ahhhhh! – that sense of well-being.

 

Perhaps the NAcc is Freud’s Id and the Insula Freud’s Superego?  It is actually much more complicated than this, but the overlap is interesting.

 

In an article I posted last month I wrote about the concept of an alief.  An alief is a primal and largely irrational fear (emotion) that arises from the deep unconscious recesses of your brain and plays a significant role in guiding some of the decisions you make.  At a very basic level, we know of two major forces that guide our decisions: reason and emotion.  So how does this work? How do we process and deal with such diverse forces?

Orbitofrontal Cortex (OFC) highlighted in pink

 

Neuroscientists now know that the Orbitofrontal Cortex (OFC) is the brain center that integrates a multitude of information from various brain regions, along with visceral emotions, in an attempt to facilitate adaptive decision making.  Current neuroimaging evidence suggests that the OFC is involved in monitoring, learning, and remembering the potency of both reinforcers and punishers.  It analyzes the available options and communicates its conclusions by generating emotions that are supposed to guide your choices.  The next time you are faced with a difficult decision and experience an associated emotion, that is the result of your OFC’s attempt to tell you what to do.  Such feelings guide most of our decisions without our even knowing that it is happening.

 

The OFC operates outside your awareness, opaquely communicating with your rational decision-making center in the language of feelings.   Our rational center, the Prefrontal Cortex – the more apt analogue of Freud’s Ego – is not as dominant as he suggested.  In fact, it is limited in capacity: easily fatigued and easily overtaxed.  See my post on Willpower for a deeper discussion of this issue.

 

So, as far-fetched as Freud’s notions seem today, some aspects of his explanation of human behavior were rooted in actual brain systems.  As I previously noted, these systems are much more complicated than I have described above, but in essence there are battles waged in your head between forces that manipulate you and your choices through chemical neurotransmitters.  A portion of these battles occurs outside your awareness, but it is the influence of the emotions stemming from these unconscious battles that ultimately makes you feel as though there is a devil (Id) on one shoulder and an angel (Superego) on the other, as your Prefrontal Cortex (Ego) struggles to make the best possible decision.

 

By understanding these systems you may become empowered to make better decisions, avoid bad choices, and ultimately take more personal responsibility for the process.  It’s not the Devil that made you do it, and it’s not poor Ego Strength necessitating years of psychotherapy.  It is the influence of deeply stirred emotions and manipulation occurring inside you – and perhaps some overdependence on a vulnerable and easily overburdened Prefrontal Cortex – that leads you down that gluttonous path.

 

References

 

Blakeslee, Sandra. 2007. Small Part of the Brain, and Its Profound Effects. New York Times.

 

Gladwell, M. 2005.  Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.

 

Guild, G. 2010. Retail Mind Manipulation.  How Do You Think?

 

Guild, G. 2010. What Plato, Descartes, and Kant Got Wrong: Reason Does Not Rule.  How Do You Think?

 

Guild, G. 2010. Willpower: What is it really? How Do You Think?

 

Guild, G. 2011. Irrational Fear: It’s Just an Alief. How Do You Think?

 

Lehrer, J. 2009. How We Decide. Houghton Mifflin Harcourt: New York.


I have always said that there is a fine line between intelligence and fear.  Some fear is adaptive and entirely reasonable, particularly when the catalyst truly involves danger. There are some anxieties, however, that take hold and profoundly affect behavior in unreasonable ways.

 

One personal example comes to mind to illustrate this. Last winter I was backpacking on a trail that traversed some rock city formations with deep, but relatively narrow, crevasses. Many of the cracks were unintimidating and easily traversed. There was one, however, that stopped me in my tracks. The gap was 36 to 40 inches across, over a sheer 25-foot drop. Under more typical circumstances, this gap would not have fazed me. Yet, in this situation, I was completely frozen.

Rock City Crevasse

To be clear, there was some risk associated with this crossing. But in my mind, the risk took on unreasonable proportions.

 

Frankly, I was both embarrassed and befuddled by this situation. Were it a stream of equal width, I would have easily hopped over it.

 

I stood there at battle with myself for what seemed like an eternity. In reality, it was probably only a minute or two.  My body was hostage to a cognitive tug-of-war. My rational brain urged me to leap: “Come on,” I uttered to myself, “it’s only three feet across! You can do this!”

 

Another force in my brain countered with incapacitating doubt.  Kevin, my backpacking companion, patiently waited on the other side of the crevasse after easily leaping across. I had seen him do it with no difficulty.  I had clear evidence that the crossing was easily within my capabilities, but the cost of a slip and a fall far overshadowed my confidence. The frustration I felt over this coup of sorts was immense. Finally, I was able to muster enough confidence to take the leap. It was, in fact, quite easy.  We hiked on and made no further mention of this humbling pause.

 

Many fears are like this – whether of mice, bees, spiders, or snakes. In most circumstances these stimuli pose no grave threat, but the flight response they trigger in the phobic is immense. Even when a person knows that there is no reason for fear, it persists.

 

This response is akin to the reluctance that most people have about eating chocolate fudge in the shape of dog feces, or eating soup from a clean unused bedpan, or drinking juice from a glass in which a sterile cockroach has been dipped. Psychologist Paul Rozin, in his famous studies on disgust, discovered that when presented with these circumstances, most people choose not to eat the fudge or the soup, or drink from the glass – even knowing there is no real danger in doing so.  It is the irrational essence of contagion that drives these inhibitions.

 

These situations are all very different from rock climbing without ropes, where there is clear and present danger. When we are compelled to flee a truly benign stimulus, we are likely driven by an internal cognitive force that screams “RISK!” even when there is no true danger.  Intriguing, isn’t it, that this innate force is so powerful that even our capacity for reason and evidence pales in comparison?

 

Philosopher Tamar Gendler has coined the word “alief” to describe this cognitive phenomenon.  She fashioned it around the word “belief,” which is a conscious manifestation of how we suppose things to be.  An alief is a deep and powerful feeling of sorts that can and does play an important role in decision-making, but it is not based in reason or evidence.  Beliefs can be more susceptible to such rational forces, but aliefs defy reason and exert powerful influence despite one’s attempts to rationally dispel them.  This voice is intuitive, its origins are outside your awareness, and it typically arises in an attempt to facilitate self-preservation.

 

You may believe that the feces-shaped fudge is “JUST FUDGE!” but it is your alief that the fudge is excrement (as a result of its characteristic size, shape, and color) that makes it very hard to eat.  I believed that hopping over the crevasse was easily within my capabilities, but it was my alief – leaping over the gap is DANGEROUS – that kept me frozen in my tracks.

 

You see, you can simultaneously hold opposing beliefs and aliefs, and it was, in fact, these opposing forces that waged war as I stood at the edge of the precipice.  You might believe that a bee is generally harmless and unlikely to sting you unless you threaten it.  But it is your alief – that the bee will sting and hurt you – that triggers the autonomic arousal that compels you to flee.  It is this deeply primal alief that often wins, no matter how rational you attempt to be.

 

In my situation, my belief in my leaping ability ultimately prevailed.  Perhaps this was due to machismo or humiliation, but I fought down and defeated the alief.  It was a hard-fought battle that left me feeling like a chicken despite my “victory.”

 

In retrospect, understanding this internal process has helped me come to grips with my hesitation.  And I stand in awe of the internal brain systems that play out in such circumstances.

 

Perhaps in the future, in a similar situation, I will be better prepared to deal with self-doubt as it springs forth from my lizard brain – to cope with it before it builds incapacitating momentum.  After all, it’s just an alief!


The Brain’s False Idols

4 December 2011

I’ve been exploring the subtleties of human cognition for nearly two years now.  The most amazing and persistent lesson I’ve learned is that our ability to understand the world is limited by the way our brains work.  All of us are constrained by fundamentally flawed cognitive processes, and the advanced study of human cognition, perception, and neuroanatomy reveals this to be true.  Although this lesson feels incredibly fresh to me, it is hardly news to mankind.   Long ago, serious thinkers understood this without the aid of sensitive measurement devices (e.g., fMRI) or statistical analysis.

 

It pains me a bit to have been scooped by Sir Francis Bacon, who knew this well in the early 17th century.  After all, it took me two years of intensive, self-driven investigation, 18 years after getting a PhD in psychology, to come to grips with this.  I have to ask, “Why isn’t this common knowledge?” and “Why wasn’t this central to my training as a psychologist?”

 

Bacon, an English lawyer, statesman, and thinker who devoted his intellect to advancing the human condition, astutely identified the innate fallibility of the human brain in his book New Organon, published in 1620.  He referred to these cognitive flaws as the Four Idols.  He derived the word idol from the Greek eidolon, which translates as a phantom or an apparition – something that, he argued, blunts or blurs logic and stands in the way of truly understanding external reality.  What we know today adds greater understanding of the mechanisms of these errors, but the idols themselves stand intact.

 

The terms Bacon used to describe these flaws probably made more sense in his day, but they are opaque today.  My preference is to use a more current vernacular to explain his thoughts and then back-fill with Bacon’s descriptors.  My intention is not to provide an abstract of his thesis, but rather to drive home the notion that long ago the brain’s flaws had been identified and acknowledged as perhaps the biggest barrier to the forward progress of mankind.  Much has changed since Bacon’s day, but these idols remain as true and steadfast today as they were 400 years ago.  It is important to note that Bacon’s thesis was foundational in the development of the scientific process that has ultimately reshaped the human experience.

 

I have previously written about some of the flaws that Bacon himself detailed long ago.  Bacon’s first idol can be summed up as the universal, transcendent human tendencies toward Pareidolia, Confirmation Bias, and Spinoza’s Conjecture.  In other words, humans instinctively: (a) make patterns out of chaos; (b) accept things as true because they fit within their preconceived notions of the world; (c) reject things that don’t fit within their current understanding; and (d) tend to avoid the effort of skeptically scrutinizing any and all information.   These tendencies Bacon described as the Idols of the Tribe.  To him, the tribe was us as a species; he noted that these tendencies are, in fact, universal.

 

The second set of attributes seems more tribal to me because, although the first set is universal, the second varies by what we today more commonly refer to as tribes.  Cultural biases and ideological tendencies shared within subsets of people make up this second idol – the Idols of the Cave.  People with shared experiences tend to have specific perspectives and blind spots.  Those within such tribal moral communities share these similarities and differentiate their worldviews from those of outsiders.  People within these subgroups tend to close their minds to openness and diverse input.  As such, most people innately remain loyal to the sentiments and teachings of the in-group and resist questioning tradition.  Cohabitants of their respective “caves” are more cohesive as a result – but more likely to be in conflict with out-groups.

 

The third idol is more a matter of faulty, misguided, or sloppy semantics.  Examples include the overuse or misapplication of vague terms or jargon.  Even the perpetual “spin” we now hear is an example of this.  In such situations, language is misused (e.g., quotes taken out of context) or talking points are told and retold as a means to drive a specific ideological agenda, regardless of whether there is any overlap with the facts.  It is important to note that this does not have to be an act of malice; it can be unintentional.  Because language can be vague, and specific words, depending on context, can have vastly different meanings, we are inherently vulnerable to the vagaries of language itself.  These are the Idols of the Marketplace, where people consort, engage in discourse, and learn the news of the day.  Today we would probably refer to this as the Idols of the 24-Hour News Channel or the Idols of the Blogosphere.

 

The final idol reflects the destructive power of ideology.  At the core of ideology are several human inclinations that feed and sustain many of the perpetual conflicts that consume our blood and treasure and in other ways gravely harm our brothers and sisters.  Deeper still, at the root of erroneous human inclinations, is the tendency that makes us vulnerable to the draw of ideologies that sustain beliefs without good reason.  Such are the Idols of the Theater, where theologians, politicians, and philosophers play out their agendas to their vulnerable and inherently gullible disciples.  Beliefs ultimately filter what we accept as true and false.  This is how the brain works.  This proclivity is so automatic and so intrinsic that in order to overcome it, we have to overtly fight it.  What is most troubling is that most people don’t even know this is occurring within them.  It is this intuitive, gut-level thinking that acts as a filter and kicks out, or ignores, incongruity.  And our beliefs become so core to us that when they are challenged, it is as if we ourselves have been threatened.

 

It takes knowledge of these idols, and then overt effort, to overcome them so that we don’t become ignorant victims of our own neurology – or worse, victims of the cynical and malicious people who do understand these things to be true.  We are inherently vulnerable: be aware, be wary, and strive to strike down your brain’s false idols.

 


Do you believe that economic success is just a matter of having a good work ethic and strong personal motivation?  Most people do.  But in reality this is a perfect example of the Fundamental Attribution Error and the Self-Serving Bias.

 

Attribution Error occurs when we negatively judge the unfortunate circumstances of others as a reflection of their character traits rather than as a result of environmental circumstances (e.g., growing up in poverty).  What is even more interesting is that when we mess up, we tend to blame environmental factors rather than accepting personal responsibility.  When we are successful, however, we take credit for the outcome, attributing it to internal personal attributes and devaluing environmental contributors.  This latter error is the Self-Serving Bias.

 

This erroneous thinking is universal and automatic, and it is what drives a wedge between people at different points on the socio-economic spectrum.  If you believe that poor people are impoverished simply because they are lazy freeloaders, you are likely a victim of this thinking error.  The same is true if you believe that your success is completely of your own doing.

 

I have written numerous articles on the impact of poverty on early childhood development (e.g., The Effects of Low SES on Brain Development), and the bottom line is that economic deprivation weakens the social and neurobiological foundation of children in ways that have life-long implications.  In this post I will summarize a review article by Knudsen, Heckman, Cameron, and Shonkoff entitled Economic, Neurobiological, and Behavioral Perspectives on Building America’s Future Workforce.  This 2006 article, published in the Proceedings of the National Academy of Sciences, provides an excellent review of the research across many fields, including developmental psychology, neuroscience, and economics.  It highlights the core concepts that converge on the finding that the quality of the early childhood environment is a strong predictor of adult productivity.  The authors point to the evidence that robustly supports the following notions:

 

  1. Genes and environment play out in an interdependent manner. Knudsen et al., (2006) noted that “… the activation of neural circuits by experience also can cause dramatic changes in the genes that are expressed (“turned on”) in specific circuits (58-60). The protein products of these genes can have far reaching effects on the chemistry of neurons and, therefore, on their excitability and architecture.”  Adverse experiences can and do fundamentally alter one’s temperament and capacity to learn throughout life.
  2. Essential cognitive skills are built in a hierarchical manner, whereby fundamental skills are laid down in early childhood and these foundational neural pathways serve as a basis upon which important higher level skills are built.
  3. Cognitive, linguistic, social, and emotional competencies are interdependent – all nascent in early childhood, when adverse environmental perturbations wreak havoc on, and across, each of these fundamental skill sets.
  4. There are crucial and time-sensitive windows of opportunity for building these fundamental competencies.  Should one fail to develop these core skills during this crucial early developmental stage, it becomes increasingly unlikely that later remediation will approximate the potential one would have had if those skills had been developed on schedule.  A cogent analogy here is learning a new language – it is far easier to learn a new language early in development, when the language acquisition window is open, than later in life, when this window is nearly closed.

 

In my last two posts (Halting the Negative Feedback Loop of Poverty: Early Intervention is the Key and Poverty Preventing Preschool Programs: Fade-Out, Grit, and the Rich get Richer) I discussed two successful early intervention programs (the Perry Preschool Program and the Abecedarian Project) that demonstrated positive long-term benefits with regard to numerous important social and cognitive skills. Knudsen et al. (2006) noted:

 

“At the oldest ages tested (Perry, 40 yrs; Abecedarian, 21 yrs), individuals scored higher on achievement tests, reached higher levels of education, required less special education, earned higher wages, were more likely to own a home, and were less likely to go on welfare or be incarcerated than individuals from the control groups.”

 

These findings converge with research on animal analogues investigating the neurodevelopmental impact of early stimulation versus deprivation across species.  Knudsen et al. (2006) point out that:

 

  1. There are indeed cross-species negative neurodevelopmental consequences associated with adverse early developmental perturbations.
  2. There clearly are time-sensitive windows during which failure to develop crucial skills has life-long consequences.  Neural plasticity decreases with age.
  3. However, there are time-sensitive windows of opportunity during which quality programs and therapies can reverse the consequences of adverse environmental circumstances (e.g., poverty, stress, violence).

 

Early learning clearly shapes the architecture of the brain.  Appropriate early stimulation fosters neural development, while impoverished environments diminish adaptive neural stimulation and thus hinder neural development.  Timing, it seems, is everything.  Although we learn throughout our lifespan, our capacity to learn is built upon a foundation that can be strengthened or impaired by early environmental experiences.  It is very difficult to make up for lost time later in life – much as it is difficult to build a stable building on an inadequate foundation.  Stimulating environments during these crucial early neurodevelopmental periods are far more efficient than remediation after the fact.  These realities provide further justification for universally available, evidence-based preschool services for children at the lower end of the socio-economic spectrum.  Proactive stimulation fosters stronger and more productive citizens – yet we continue to respond in a reactive manner with remedial and/or punitive measures that miss the mark.  The necessary proactive response is clear.

 

References:

 

Knudsen, E. I., Heckman, J. J., Cameron, J. L., and Shonkoff, J. P. (2006). Economic, neurobiological, and behavioral perspectives on building America’s future workforce.  Proceedings of the National Academy of Sciences.  v. 103, n. 27. 10155-10162.

 


In my last post, Halting the Negative Feedback Loop of Poverty: Early Intervention is the Key, I looked at the evidence from two quality studies of preschool intervention programs that substantiated a capacity to counteract the impairing impact of growing up in economic deprivation.  Both studies, the Perry Preschool Program and the Abecedarian Project, demonstrated positive long-term benefits with regard to numerous important social and cognitive skills.  In this post I shall discuss some interesting issues and concepts that underlie the gains made at Perry and Abecedarian, including fade-out, grit, and positive and negative feedback loops.

 

The issue of fade-out, and its implications, is very important.  In both the Perry and Abecedarian Programs there were substantial positive outcomes with regard to immediate IQ and other cognitive scores.  Once the children entered typical school-age programs, some of their gains, particularly in IQ (which had been boosted 10-15 points during treatment), faded away.  This fade-out was strikingly true for the Perry Preschool Program but not so for the Abecedarian Project, which had a substantially more intensive program involving both longer school days and more school days per year.  See Figure 1 below.

 

Figure 1

 

Despite this apparent fade-out, when the recipients of this specialized programming were assessed decades later, they did much better than non-recipients on important life outcomes such as high school graduation, four-year college attendance, and home ownership.  These results are encouraging on the one hand, yet puzzling on the other.  Such fade-out renders programs like Head Start vulnerable to those who cherry-pick data in order to advance ideologically driven political agendas.  Regardless, this does raise some important questions.

 

  1. Why do gains in IQ appear to fade-out?
  2. What skill gains account for the long-term gains made?

 

Some prominent researchers (e.g., David Barnett) question whether there is actually any true fade-out at all, suggesting that faulty research design and attrition may better explain these results.  Regardless, IQ is not the sole variable at play here – if anything, these data highlight the questionable validity of the IQ construct itself relative to important life skills.  If improved IQ is not the variable that results in improved social outcomes, we need to understand what happens to these children as a result of the programming they receive.  One likely hypothesis has been proffered to explain these data:

 

“[T]he intervention programs may have induced greater powers of self-regulation and self-control in the children, and … these enhanced executive skills may have manifested themselves in greater academic achievement much later in life.” (Raizada & Kishiyama, 2010).

 

This hypothesis has been substantiated by Duckworth et al. (2005, 2007, 2009), who demonstrated that self-discipline and perseverance, or “grit,” are more predictive of academic performance than IQ and other conventional measures of cognitive ability (Raizada & Kishiyama, 2010).  It appears that enhancing one’s grit triggers long-term capabilities that are self-reinforcing.  Improved self-control and attentiveness foster achievement that feeds back in a positive way, making traditional school more rewarding and thus promoting even more intellectual growth (Raizada & Kishiyama, 2010).  Poor children without intervention, on the other hand, appear less able to focus, attend, and sustain effort on learning, and thus enter a negative feedback loop of struggle, failure, and academic disenchantment.

The bottom line is that success begets success and failure begets failure.  Stanovich (1986) offered an analogous explanation for reading proficiency: “…learning to read can produce precisely such effects: the better a child can read, the more likely they are to seek out and find new reading material, thereby improving their reading ability still further.” (Raizada & Kishiyama, 2010).
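
To make the compounding character of such feedback loops concrete, here is a minimal, purely illustrative sketch; the starting values and growth rate are hypothetical and are not drawn from the Perry, Abecedarian, or Stanovich data.

```python
# Toy model of "success begets success": skill drives how much practice a
# child seeks out, and that practice in turn compounds skill.
# All numbers are invented for illustration only.

def simulate_skill(starting_skill, years, engagement=0.05):
    """Each year, practice sought is proportional to current skill,
    and skill grows in proportion to that practice."""
    skill = starting_skill
    trajectory = [round(skill, 1)]
    for _ in range(years):
        practice = engagement * skill   # stronger readers read more
        skill += practice               # more reading builds more skill
        trajectory.append(round(skill, 1))
    return trajectory

# Two hypothetical children separated by a modest initial gap:
print(simulate_skill(100, 12))  # e.g., received early enrichment
print(simulate_skill(80, 12))   # e.g., did not

# The absolute gap widens every year even though the loop is identical for
# both children - the "rich get richer" dynamic described above.
```

Reverse the sign of the loop (struggle reduces engagement, which deepens the struggle) and you get the negative feedback loop of academic disenchantment described above.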

 

Both the Perry Preschool and Abecedarian Programs have impressive long-term outcome data.  See Figures 2 and 3 below for a summary of those data.

 

Figure 2

Figure 3

 

The efficacy of each program has spawned other programs such as the Knowledge is Power Program and the Harlem Children’s Zone.  Both of these intensive programs lack randomized assignment to treatment and non-treatment (control) groups.  As a result, it is difficult to make any claims about their treatment impact on important cognitive and social skills.  Given what we learned from the Perry and Abecedarian Programs, I have to wonder whether it would even be ethical to withhold such treatment from children randomly assigned to a control group.  It now seems to me that we absolutely have an ethical obligation to short-circuit the negative feedback loop of poverty and put into place universally accessible programs that diminish and/or eradicate poverty’s crippling life-long impact.

 

We all pay a heavy price for poverty, but no one pays a greater cost than those children who have been thrust into their circumstances, with little hope of rising out of poverty unless we join together to give them a fair shot at economic and social equality.

 

Yes, such programs cost money, but the long-term economic costs of the status quo are much greater.  Pay me now and build positive contributors to society, or pay me later and pay greater costs for special education, prisons, Medicaid, and public assistance.  It certainly pays to step back from ideology and look at the real costs – both in terms of human lives and in terms of dollars and cents.  It makes no sense to continually blame the victims here.  Early intervention is good fiscal policy and it is the right thing to do.  It just makes sense!

 

NOTE: In a future post I will look at the evidence put forward by cognitive neuroscience for such programs.  Also see The Effects of Low SES on Brain Development for further evidence of the negative impact low SES has on children.

 

References:

 

Knudsen, E. I., Heckman, J. J., Cameron, J. L., and Shonkoff, J. P. (2006). Economic, neurobiological, and behavioral perspectives on building America’s future workforce.  Proceedings of the National Academy of Sciences.  v. 103, n. 27. 10155-10162.

 

Raizada, R. D. S., and Kishiyama, M. M. (2010). Effects of socioeconomic status on brain development, and how cognitive neuroscience may contribute to leveling the playing field.  Frontiers in Human Neuroscience. v. 4 article 3.


We humans are very good at dividing ourselves up into groups.  We accomplish this in a multitude of ways.  Even within homogeneous groupings we tend to find subtle ways to carve people out.  It is far easier, however, when people vary by gender, ethnicity, race, class, neighborhood, region, nationality, religion, and/or sexual orientation.  For some reason we are drawn to and comforted by others who share our physical resemblance, culture, attitudes, values, history, important symbols, and affiliations.  Conversely, we are threatened by those in the outgroup.  Why is this?  What drives us to carve out, cast away, and divide our fellow human beings into camps of “us” and “them”? Is it a byproduct of socialization or perhaps a part of our nature?

 

I saw this very clearly growing up in a small rural town in Western New York.  Even though we were all white middle class Christian kids for the most part, we effectively divided ourselves into camps – some actively participating in the parceling and others passively falling victim to it.  There were the popular kids, the tough kids, the village kids, and the farm kids.  And as we became more “sophisticated,” the parcels emerged with more universal group titles such as the heads, the jocks, the brains, the nerds, etc.  Some kids traversed multiple groups quite effectively while others fit into no group at all.

 

It wasn’t until I went to college that I was immersed among young adults who parceled out their peers in even more “enlightened” ways.  I went to SUNY Geneseo, where the student body was very similar to that of my home town – again, largely a white middle-class subset of New York State, but a bit more diverse geographically and religiously.  The most striking division was imposed by students from Westchester County, Long Island, and New York City, who looked at fellow New Yorkers from anywhere west of the Hudson River as inferior.  This “geographism” was shocking to me.  I was clearly in the inferior outgroup.

 

On top of that, there were sorority and fraternity groupings, valuations based on the respect accorded one’s major, and, more subtly, the size of the town one came from.  All this being said, I enjoyed college, learned a lot, and have great respect for the institution today.  I am not singling out any one town or university – I suspect that my experience was no different from what most kids encounter growing up.  The point is this: we are seemingly driven to parcel ourselves.  Even during my doctoral training in Cincinnati there was “geographism,” whereby people from Kentucky (just across the Ohio River) were cast in a relatively negative light by Ohioans, much as New Yorkers look down on people from Pennsylvania or New Jersey.  On another level, think about the antipathy between cat lovers and dog lovers.  Then there are Yankees fans and Red Sox fans (insert any sports teams whose fans divide themselves with similar acrimony).  It is everywhere!

 

I was very fortunate to have a mother who encouraged me to respect diversity and not to judge others by group affiliation.  She spoke out against, or talked with me privately about, role models who were not so open-minded, so that I would not emulate them.  I have always been thankful for her influence, and because of her I have, in maturity, always tried to follow her example.  It’s not always easy – but I do try.  Something tells me that one’s level of prejudice is not simply a function of having a great role model or a bad one.  This tendency is so universal, and it plays out in very subtle ways that are not always evidenced as explicit, overt racism or sexism.

 

Evidence, as it turns out, is increasingly supporting my hunch.  Group prejudices are evident even in pre-vocal babies (Mahajan, 2011). This growing body of research has been supplemented by an ingenious set of studies of prejudice in nonhuman primates published recently in the Journal of Personality and Social Psychology.  The primary author, Neha Mahajan, from Yale University, was kind enough to share with me her paper entitled The Evolution of Intergroup Bias: Perceptions and Attitudes in Rhesus Macaques.

 

The researchers conducted seven different in-vivo experiments to explore whether Old World monkeys, with whom we shared a common ancestor more than 30 million years ago (Hedges & Blair, Eds., 2009), evidence human-like intergroup bias.  This preliminary work establishes that we do share this trait, suggesting that prejudice may in fact be a part of our very nature.  It appears that prejudicial thinking has been adaptive from an evolutionary perspective, or at least has been a vestigial stowaway linked with some other trait that has been naturally selected.

 

There is some danger in this notion.  If we accept prejudice as a part of our nature, we may be inclined to devote less effort to addressing it from a social perspective.  The authors are careful to point out, however, that previous research has established that prejudices can be remediated through exposure and teaching, or conversely entrenched through poor modeling.  These results do not diminish the influence of nurture; instead, the authors highlight the importance of understanding that our brains are pre-wired for prejudice.  I have discussed human prejudice before within the context of the Implicit Association Test (IAT), which suggests that our biases are implicit (unconscious).  Although implicit attributes are difficult to measure, there is good reason to believe that we do universally, inherently, and unknowingly harbor biases.  We must accept this and build programs upon this understanding, with targeted evidence-based strategies to combat such erroneous thinking.  It is part of who we are – and once again, evidence of how flawed the human brain is.  Hate, bullying, homophobia, and racism – they are all part of our “monkey brain.”  Here’s hoping we can rise above it.

 

References:

 

Grewal, D. (2011).  The Evolution of Prejudice: Scientists See the Beginnings of Racism in Monkeys. Scientific American: MIND. April 5.

 

Hedges, S. B., & Blair, S. (Eds.).  (2009).  The Timetree of Life. New York, NY: Oxford University Press.

 

Mahajan, N., Martinez, M. A., Gutierrez, N. L., Diesendruck, G., Banaji, M., & Santos, L. R.  (2011).  The Evolution of Intergroup Bias: Perceptions and Attitudes in Rhesus Macaques. Journal of Personality and Social Psychology. Vol. 100, No. 3, 387-405.


When I hit the publish button for my last post, Cognitive Conservatism, Moral Relativism, Bias, and Human Flourishing, I felt a tinge of angst.  It took a few days for my rational brain to figure out (or perhaps confabulate) a reason, but I think I may have.  Perhaps it should have been immediately obvious, but my outrage likely clouded my judgment.  Anyway, that angst wasn’t due to the potential controversy of the article’s content – I had previously posted more provocative pieces.  What I have come to conclude is that the nature of the controversy could be construed as being more personal.

 

It is not hard to imagine that there is a very real possibility that people I love may have been hurt by what I wrote.  This left me feeling like a hypocrite, because what I have continually aspired to communicate is that “true morality” should promote human flourishing for everyone.  Although the overarching message was consistent with my goal, the tone and tenor were not.

 

I was inspired by a blog post written by a family member that touched the nerves of my liberal sensitivities.  Further, and more importantly, I believe that what he wrote was likely hurtful to others in my family.   A couple of my tribal communities (moral and kin) were assaulted, and I responded assertively.

 

The whole purpose of my blog, How Do You Think?, has been to understand such diverse and mutually incompatible beliefs – beliefs that in fact span my family and the world in general.  In this particular situation, however, I placed several family members at the crux of just such a moral juxtaposition.

 

I am certain that much of what I have written over the last year may be construed as offensive to some from a variety of different tribal moral communities.  But one thing I am equally certain of is that attacking one’s core moral holdings is not an effective means of facilitating enlightenment.

 

I responded to my relative’s pontifications with moral outrage and indignation.  I was offended and mad.  That is what happens when core beliefs are challenged.  We circle the wagons and lash back.  But this does nothing to further the discussion.  I should have known better.  And, that error of judgment may have lasting familial consequences.  This saddens me, and I am sorry.

 

So then, how are we to cope with such diametrically opposed perspectives?

 

If you have consistently read my posts, you are likely to have come away with an understanding of the workings of the human brain, and as such realize that it is an incredible but highly flawed organ.   What is more important to recognize is that these flaws leave us prone to a variety of errors that are both universal and systematic.  The consequences of these errors include Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Pareidolia, Superstition, Essentialism, Cognitive Conservatism, and Illusions of all sorts (e.g., Attention, Cause, Confidence, Memory, Efficacy, Willpower, and Narrative).  The downstream consequences of these errors, paired with our tribal nature and our innate moral inclinations, lead us to form tribal moral communities.  These communities unite around ideologies and sacred items, beliefs, or shared histories.  Our genetically conferred Moral Instincts, which are a part of our Human Nature, lay the groundwork for us to seek out others who share our beliefs and separate ourselves from others who do not.  This is how the divide occurs.  And our brain is instrumental in this division and the subsequent acrimony between groups.

 

This is perhaps the most important concept that I want to share.  Systematic brain errors divide us.  Understanding this – I mean truly understanding all of these systematic errors – is essential to uniting us.  Education is the key, and this is what I hope to provide.  Those very brain errors are themselves responsible for closing minds to the reality of these facts.  Regardless, the hopes that I have for universal enlightenment persist, and I hope to endeavor ever onward, opening minds without providing cause to close them.   I fear that I have taken a misstep – widening the divide rather than closing it.

 

Please know that Human Flourishing for all is my number one goal.  Never do I intend to come off as judgmental, hurtful, or otherwise arrogant or elitist.  When I do – please push back and offer constructive criticism.   We are all in this together – and time, love, life, peace, and compassion are precious.   This is the starting point – something that I am certain we share.  Don’t you think?


So really, what caused that earthquake and subsequent tsunami in Japan?  A quick Google search posing this very question yields a wide range of answers.  Fortunately, a majority of the hits acknowledge and explain how plate tectonics caused this tragedy.  Sprinkled throughout the scientifically accurate explanations are conspiracy theories suggesting that the US government caused it through hyper-excitation of radio waves in the ionosphere (HAARP), and perhaps even planned radiation releases.  Other theories include the “Supermoon’s” increased tug on the earth’s crust due to the fact that it is at perigee (its closest proximity to the earth in its orbit).  Solar flares (coronal mass ejections) were also blamed; and by some, the flares working in concert with the moon at perigee are believed to have triggered the quake.  Global warming also gets its share of the blame (though its proponents suggest that the real cause is the removal of oil from the crust, leaving voids that ultimately trigger earthquakes).   Some have even suggested that a comet or even God may have done this.

 

The problem with the scientific explanation is that plate tectonics is invisible to most of us.  Its motion is so gradual that it does not, on the surface, seem plausible.  We seemingly need a clear causal agent that fits within our understanding of the world.  Scientifically literate individuals are inclined to grasp the agency of tectonics because the theory and the effects do in fact fit together in observable and measurable ways.  Others reach for causal explanations that better fit within their understanding of the world.

 

Our correlation calculators (our brains) grab onto events temporally associated with such tragedies, and we then conjure up narratives to help us make sense of it all.  It is easy to understand why folks might assume that the moon at perigee, or increased solar activity, or even an approaching comet might cause such events.  Others, prone to conspiracy theories and holding a corresponding belief that Big Brother is all-powerful and sadistic, will grab onto theories that fit their worldviews.  The same is true for those with literal religious inclinations.  Unfortunately, this drive often leads to narrative fallacies that misplace the blame and sometimes ultimately blame the victims.

 

History is filled with stories drawn up to explain such tragedies.  In the times of ancient Greece and Rome, many tales were spun to explain famine, plagues, and military failures.  All of this occurred prior to our increasingly complex understanding of the world (e.g., germ theory, plate tectonics, meteorology), and it made sense to blame such events on vengeful gods.  How else could they make sense of such tragedies?  This seems to be how we are put together.

 

A study published in 2006 in the journal Developmental Psychology by University of Arkansas psychologists Jesse Bering and Becky Parker looked at the development of such inclinations in children.  They pinpointed the age at which such thinking begins to flourish.   They also provided a hypothesis to explain this developmental progression.  The study was summarized in a March 13, 2011 online article at Scientific American by the first author, titled Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events.

 

In this study of children ages three to nine, the psychologists devised a clever technique to assess the degree to which individuals begin to assign agency to events in their environment and subsequently act on those signs.  What they found was that children between three and six years of age do not read communicative intent into unexplained events (e.g., lights flickering or pictures falling from the wall).  But at age seven, children start reading into and acting on such events.  So why is it that at the age of seven, children start inferring agency from events in their environment?  Bering suggests that:

 

“The answer probably lies in the maturation of children’s theory-of-mind abilities in this critical period of brain development. Research by University of Salzburg psychologist Josef Perner, for instance, has revealed that it’s not until about the age of seven that children are first able to reason about “multiple orders” of mental states. This is the type of everyday, grown-up social cognition whereby theory of mind becomes effortlessly layered in complex, soap opera–style interactions with other people. Not only do we reason about what’s going on inside someone else’s head, but we also reason about what other people are reasoning is happening inside still other people’s heads!”

 

So as it turns out, this tendency to read signs into random events is associated with the maturation of cognitive processes. Children with less mature “Theory of Mind” (click here for a very basic description of Theory of Mind) capabilities fail to draw the conclusion that a supernatural being, or any being for that matter, knows what they are thinking and can act in a way that will communicate something.

 

“To interpret [capricious] events as communicative messages, … demands a sort of third-person perspective of the self’s actions: ‘What must this other entity, who is watching my behavior, think is happening inside my head?’ [These] findings are important because they tell us that, before the age of seven, children’s minds aren’t quite cognitively ripe enough to allow them to be superstitious thinkers. The inner lives of slightly older children, by contrast, are drenched in symbolic meaning. One second-grader was even convinced that the bell in the nearby university clock tower was Princess Alice ‘talking’ to him.”

 

When a capricious event has great significance, we are seemingly driven by a ravenous appetite to look for “signs” or “reasons.”  We desperately need to understand.  Our searches for those “reasons” are largely shaped by previously held beliefs and cultural influences. Divine interventions, for example, have historically been ambiguous; therefore, a multitude of unexplained events can be interpreted as having a wide variety of meanings. And those meanings are guided by one’s beliefs.

 

“Misfortunes appear cryptic, symbolic; they seem clearly to be about our behaviors. Our minds restlessly gather up bits of the past as if they were important clues to what just happened. And no stone goes unturned. Nothing is too mundane or trivial; anything to settle our peripatetic [wandering] thoughts from arriving at the unthinkable truth that there is no answer because there is no riddle, that life is life and that is that.”

 

The implications of this understanding are profound.  We are by our very nature driven to search for signs and reasons to explain major life events, and we are likewise inclined to see major events as signs themselves. The ability to do so ironically depends on cognitive maturation. But, given the complexity and remoteness of scientific explanations, we often revert to familiar and culturally sanctioned explanations that have stood the test of time.  We do this because it gives us comfort, regardless of actual plausibility.  As I often say, we are a curious lot, we humans.

 

References:

 

Bering, J. (2011). Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events. Scientific American.  http://www.scientificamerican.com/blog/post.cfm?id=signs-signs-everywhere-signs-seeing-2011-03-13&print=true

 

Bering, J., & Parker, B. (2006). Children’s Attributions of Intentions to an Invisible Agent. Developmental Psychology. Vol. 42, No. 2, 253–262

 


Narrative Fallacy

13 March 2011

Evolution has conferred upon us a brain that is capable of truly amazing things.  We have, for thousands of years, been capable of creating incredibly beautiful art, telling compelling tales, and building magnificent structures.  We have risen from small and dispersed tribal bands to become perhaps the dominant life force on the planet.  Our feats have been wondrous.  We have put men on the moon, our space probes have reached the outer limits of our solar system, and we have people living and working in space.  We have literally doubled the life expectancy of human beings, figured out how to feed billions of people, and eradicated some of the most dreadful diseases known to humankind.  We can join together in virtual social communities from remote corners of the world, and even change nations using Facebook and Twitter.  This list could go on and on.  We are very capable and very smart beings.

 

Our mark on this planet, for the moment, is indelible.  Yet, despite our great powers of intellect and creativity, we are incredibly vulnerable.  I am not referring to our susceptibility to the great powers of nature as evidenced in Japan this last week.  I am referring to an inherent mode of thinking that is core to our human nature.

 

It is pretty certain that nature-nature will destroy our species at some point in the future, be it via asteroid impact, super-volcanoes, climate change, microbiome evolution, or the encroachment of the sun’s surface as it goes red giant in five billion years.  Of all the species that have ever lived on this planet, over 99% have gone extinct.  What’s living today will someday be gone – there really is no question about it.  But the question that remains is: will nature-nature do us in, or will human-nature do it first?

 

We have evolved over billions of years to our current Homo sapiens (“wise man”) form, and for the vast majority of that evolutionary period, we have had very limited technology.  The development of primitive stone and wooden tools dates back only tens of thousands of years, and reading and writing date back only several thousand years.  What we do and take for granted every day has been around for only a minuscule amount of time relative to the incomprehensible vastness of evolutionary and geological time. These facts are relevant because our brains, for the most part, developed under selective pressures that were vastly different from those we live under today.

 

Much as our appendix, coccyx, and body-hair follicles are remnants of our evolutionary past, so too are some of our core thought processes.  These vestigial cognitions play out both as adaptive intuitions and as potentially quite destructive errors of judgment.  We would like to think that, as an advanced thinking species, our ability to use reason is our dominant mental force.  Unfortunately, this most recent evolutionary development takes a back seat to lower and more powerful brain functions that have sustained us for millions of years.  I have previously written about this reason versus intuition/emotion paradigm, so I won’t go into the issue in detail here; but suffice it to say, much of what we do is guided by unconscious thought processes outside of our awareness and outside our direct control.  And again, these life-guiding processes are mere remnants of what it took to survive as roaming bands of hunters and gatherers.

 

Our brains came to their current form when we were not in possession of the tools and technologies that help us truly understand the world around us today.  Early survival depended on our ability to see patterns in randomness (pareidolia, or patternicity) and to make snap judgments.  Rational thought, which is slow and arduous, has not played out in a dominant way because it failed to provide our ancestors with the survival advantages that emotional and rapid cognitions did.  As such, our brains have been programmed by evolution to make all kinds of rapid cognitions that, in this modern time, are simply prone to error.

 

We are uncomfortable with randomness and chaos and are driven to pull together causal stories that help us make sense of the world.  Our brains are correlation calculators, belief engines, and hyperactive agency detection devices – inclinations that led us to develop polytheism to help explain the whims of “mother nature.”  All cultures, for example, have developed creation myths to help explain how we came to be.  We are a superstitious lot, driven by these vestigial remnants.

 

It is easy to see how powerful this inclination is.  Look at the prevalence of beliefs about things like full moons and bad behavior.  And how about bad behavior and acts of nature?  Pat Robertson blamed Katrina on homosexuality and hedonism.  One wonders what the Japanese did to deserve their most recent tragedy.  I’ve already heard talk of the attack on Pearl Harbor as an antecedent.  As if mother nature would align with the United States to punish long-past deeds against us!  If mother nature cares at all about herself, I wonder what we have coming for Nagasaki and Hiroshima?  Likewise, people blame vaccines for autism and credit homeopathy for their wellness.  I could go on and on about our silly inclinations.  We are prone to Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Illusions of Attention, and the Illusions of Knowledge and Confidence.  In the same vein, we are manipulated by the Illusion of Narrative, also known as the Narrative Fallacy.

 

Nassim Nicholas Taleb (a philosopher, author, and statistician) coined the phrase “Narrative Fallacy,” which encapsulates this very discussion.  We have a deep need to make up a narrative that serves to make sense of a series of connected or disconnected facts.  Our correlation calculators pull together these cause-and-effect stories to help us understand the world around us, even when chance has dictated our circumstances.  We fit these stories around the observable facts and sometimes render the facts to make them fit the story.  This is particularly true, for example, in the case of Intelligent Design.
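
As an aside for the statistically inclined, here is a minimal, purely illustrative sketch of why a correlation calculator with enough inputs will always find something: among many unrelated random series, some pair will look strikingly “linked” by chance alone.  The numbers and series here are invented for the demonstration.

```python
import random

def correlation(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

random.seed(1)
# Fifty short, completely unrelated random series ("coffee sales",
# "earthquakes", "my mood", ...) - none has any causal link to any other.
series = [[random.gauss(0, 1) for _ in range(20)] for _ in range(50)]

# Scan every pair and report the most strongly correlated one.
best = max(
    ((i, j, correlation(series[i], series[j]))
     for i in range(len(series)) for j in range(i + 1, len(series))),
    key=lambda item: abs(item[2]),
)
print(best)  # some pair of unrelated series will look strikingly "linked"
```

The point is not that correlations are meaningless, but that a pattern-hungry brain scanning a world full of coincidences will never run out of raw material for a story.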

 

Now that I am aware of this innate proclivity, I enjoy watching it play out in my own mind.  For example, several weekends ago I went cross-country skiing with my wife, Kimberly.  We were at Allegany State Park, in Western New York, where there are nearly 20 miles of incredibly beautiful and nicely groomed Nordic ski trails.  Kimberly and I took a slightly different route than we normally do, and at a junction of two trails we serendipitously ran into a friend we hadn’t seen in quite some time.  It was an incredible and highly improbable meeting.  Any number of different events or decisions could have prevented this meet-up.  Such events compel us to string together a narrative to make sense of the sheer randomness.  Was it fate, divine intervention, or just coincidence?  I am certain it was the latter – but it sure was fun watching the cognitions pour forth to explain it.

 

I would really like to hear about your dealings with this inclination.  Please post comments detailing events that have happened to you and the narratives you formed to make sense of them.  This is a great exercise to help us understand this pattern-detection mechanism, so have some fun with it and share your stories.  At the very least, pay attention to how this tendency plays out in your life and think about how it plays out in your belief systems (and ideological paradigms).  I’m guessing that it will be informative.


We all love a good story.  Children are mesmerized by them, and adults – whether through books, TV, movies, sports, gossip, tabloids, or the news, to mention a few – constantly seek them out.  Storytelling is core to our identity and a vital part of our nature.  It is both how we entertain ourselves and how we make sense of the world.   This latter tendency troubles me.  Why?  Because we are inclined to value narratives over aggregated data, and we are imbued with a plethora of cognitive biases and errors that mesh together in a way that leaves us vulnerable to believing very silly things.

 

This may be hard to swallow, but all of us – yes, even you – are by default gullible and biased: disinclined to move away from the narratives we unconsciously string together in order to make sense of an incredibly complex world.  Understanding this is paramount!

 

I have discussed throughout this blog many of the innate illusions, errors, and biases that we are inclined toward.  I have also discussed the genetic and social determinants that play out in our thought processes and beliefs.  And throughout all this I have worked diligently to remain objective and evidence-based.  I do accept that I am inclined toward biases programmed into my brain.  This knowledge has forced me to question my beliefs and open my mind to different points of view.  I believe that the evidence I have laid down in my writings substantiates my objectivity.  But I am also tired, very tired in fact, of making excuses for, and offering platitudes to, others who do not open their minds to this not-so-obvious reality.

 

I am absolutely convinced that there is no resolution to the core political, economic, religious, and social debates that pervade our societies unless we can accept this reality.  Perhaps the most important thing we can do as a species is come to an understanding of our failings and realize that, in a multitude of ways, our brains lie to us.  Our brains deceive us in ways that require us to step away from our gut feelings and core beliefs in order to seek out the truth.  Only when we understand and accept our shortcomings will we be open to the truth.

 

Because of these flawed tendencies we join together in tribal moral communities, turning a blind eye to evidence that casts doubt on our core and sacred beliefs.  We cast aspersions of ignorance, immorality, or partisanship on those who espouse viewpoints that differ from our own.  I cannot emphasize this enough: this is our nature.  But I, for one, cannot and will not accept this as “just the way it is.”

 

We as a species are better than that.  We know how to overcome these inclinations.  We have the technology to do so.  It necessitates that we step back from ideology and look at things objectively.  It requires asking questions, taking measurements, and conducting analyses (none of which come naturally to us).  It necessitates the scientific method.  It requires open peer review and repeated analyses.  It requires objective debate and outright rejection of ideology as a guiding principle.  It requires us to take a different path – a path that is not automatic, one that is not always fodder for good narrative.

 

I am no more inclined to believe the narrative of Muammar Muhammad al-Gaddafi suggesting that “his people love him and would die for him” than I am to accept the narrative from Creationists denying evolution, or from those who deny anthropogenic global warming based on economic interests.  Likewise, I am not willing to accept the arguments from the anti-vaccine community or the anti-gay-marriage community.

 

My positions are not based on ideology!  They are based on evidence: both the credible and substantive evidence that backs my position and the lack of any substantive evidence for the opposing views.

 

Granted, my positions are in line with what some may define as an ideology or a tribal moral community; but there is a critical difference.  My positions are based on evidence – not on ideology, not on Bronze Age moral teachings, and certainly not on fundamental flaws in thinking.  This is a huge and critical difference.  Another irrefutable difference is my willingness to abandon my position if the data suggest a more credible one.  Enough already! It’s time to step back, take a long and deep breath – look at how our flawed neurology works – and stop filling in the gaps with narrative that is devoid of reality.  Enough is enough!

 
