It is widely believed that, as a society, we are heavily burdened by freeloaders who are content to live off the fruits of others’ labor. Inherent in this belief is the idea that the poor are more likely to be cheaters. This notion is core to the ideology that fuels the discontent of many on the conservative end of the political spectrum. I have recently written about cheating behavior in general, and how pervasive it is, particularly at the upper end of the economic spectrum. I have also written about corporate and white-collar crime and the egregious costs we all bear as a result of misconduct among the nation’s economic elite. When I sat down to write each of those two previous articles, my intent was to write about a recent peer-reviewed paper that looked empirically at the relationship between cheating behavior and income level. The evidence substantiated in this series of studies challenges the belief that poor people are more inclined to cheat. In fact, the results turn this misconception upside down.
In this very interesting 2012 paper, titled Higher Social Class Predicts Increased Unethical Behavior and published in the Proceedings of the National Academy of Sciences (PNAS), the authors, Paul Piff, Daniel Stancato, Stéphane Côté, Rodolfo Mendoza-Denton, and Dacher Keltner, empirically examine the relationship between relative wealth, the propensity to engage in unethical behavior, and attitudes about greed. Piff et al. reviewed the relevant literature and hypothesized, based on a substantial body of prior evidence, that affluent people, relative to low-income people, would be more likely to engage in and condone unethical behavior and to value greed. In their review of the literature they note that:
“Abundant resources and elevated rank allow upper-class individuals increased freedom and independence, giving rise to self-focused patterns of social cognition and behavior. Relative to lower-class individuals, upper-class individuals have been shown to be less cognizant of others and worse at identifying the emotions that others feel. Furthermore, upper-class individuals are more disengaged during social interactions — for example, checking their cell phones or doodling on a questionnaire — compared with their lower-class peers. Individuals from upper-class backgrounds are also less generous and altruistic. In one study, upper-class individuals proved more selfish in an economic game, keeping significantly more laboratory credits — which they believed would later be exchanged for cash — than did lower-class participants, who shared more of their credits with a stranger. These results parallel nationwide survey data showing that upper-class households donate a smaller proportion of their incomes to charity than do lower-class households. These findings suggest that upper-class individuals are particularly likely to value their own welfare over the welfare of others and, thus, may hold more positive attitudes toward greed.”
To test their hypotheses, the investigators devised seven studies to examine these relationships across a variety of contexts. Research subjects included more than 1,000 people from all walks of life. Studies 1 and 2 were naturalistic field studies whereby “Observers stood near the intersection, coded the status of approaching vehicles, and recorded whether the driver cut off other vehicles by crossing the intersection before waiting their turn” and “whether upper-class drivers are more likely to cut off pedestrians at a crosswalk.” Affluence was calibrated based on the make, age, and appearance of the vehicles driven, because vehicles have been established as a reliable indicator of a person’s “social rank and wealth.” People driving expensive (premium brands such as BMW, Lexus, Mercedes-Benz, etc.), late-model (newish), and well cared for automobiles were deemed to be affluent, while those driving older, less expensive (e.g., Chevy, Dodge), and more poorly maintained automobiles were deemed to be low-income individuals.
Study 3 directly assessed participants’ relative level of affluence and their subsequent proclivity toward a variety of unethical decisions (e.g., “participants read eight different scenarios that implicated an actor in unrightfully taking or benefiting from something, and reported the likelihood that they would engage in the behavior described”). Study 4 assessed whether there was a correlation between affluence and the willingness to take candy from a jar purportedly reserved for children participating in a different study. Study 5 assessed the relationship between affluence and honesty in a role play involving a hypothetical scenario where participants were asked to negotiate with a job candidate seeking long-term employment. The participants were told that the job they were filling was likely to be eliminated, and their honesty in sharing the instability of the job with applicants was assessed. Study 6 looked at actual cheating behavior in a game of chance “in which the computer presented them with one side of a six-sided die, ostensibly randomly, on five separate rolls. Participants were told that higher rolls would increase their chances of winning a cash prize and were asked to report their total score at the end of the game. In fact, die rolls were predetermined to sum up to 12. The extent to which participants reported a total exceeding 12 served as a direct behavioral measure of cheating.” The tendency to cheat in this game was also assessed as a function of affluence.
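The logic of the Study 6 measure is simple arithmetic, and it can be sketched in a few lines of code. This is purely an illustration of the scoring rule described above; the function names and the way a participant’s inflation is modeled are my own assumptions, not the authors’ materials:

```python
# The five rigged die rolls always sum to 12, so any reported
# total above 12 is, by construction, evidence of cheating.
PREDETERMINED_TOTAL = 12

def reported_score(inflation: int) -> int:
    """Simulate one participant's self-report.

    inflation is how many points the participant adds to the true
    total (a hypothetical modeling choice); 0 means honest reporting.
    """
    return PREDETERMINED_TOTAL + inflation

def cheating_measure(reported: int) -> int:
    """The behavioral measure: the excess over the rigged total of 12."""
    return max(0, reported - PREDETERMINED_TOTAL)

# An honest participant reports 12, yielding a cheating score of 0;
# a participant who inflates by 5 reports 17, yielding a score of 5.
print(cheating_measure(reported_score(0)))  # 0
print(cheating_measure(reported_score(5)))  # 5
```

Because the true total is fixed, the measure requires no inference about intent: the reported number alone quantifies the cheating.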
In each of these first six studies, the findings suggest that upper-income people, relative to low-income people, were statistically more likely to: (Study 1) cut off other drivers, (Study 2) disregard people in crosswalks, (Study 3) condone and report a likelihood to engage in similar unethical conduct, (Study 4) take candy from children, (Study 5) be dishonest in the role of hiring someone regarding the permanence of the position, and (Study 6) cheat on a game of chance. In addition, in studies 5 and 6, people of greater wealth were more likely to favor and value greed relative to their less affluent compatriots.
Study 7 was a bit more complicated and assessed the degree to which attitudes toward greed were responsive to pro-greed priming. Individuals from both high- and low-income groups were assigned to one of two conditions: (a) a greed-neutral activity (listing three things they did that day), or (b) pro-greed priming (an activity in which they were asked to list several positive attributes of greed). Participants were then assessed regarding their attitudes toward greed and their self-reported propensity to engage in unethical behavior at work. Regardless of level of affluence, those exposed to pro-greed priming reported a greater propensity to engage in unethical behavior. Attitudes about greed thus seem to play a crucial role in driving ethical behavior. The authors note that “… upper-class individuals’ more favorable attitudes toward greed can help explain their propensity toward unethical behavior.” They also assert that, throughout their lives, richer people are more likely to be educated and primed to be assertive with regard to accomplishing their own goals. Poorer people generally hold negative attitudes toward greed and are thus less likely to behave unethically.
In these naturalistic and laboratory studies, affluent individuals were more likely than poor individuals to cheat or act unethically, and to hold positive attitudes toward greed. These results generalized across self-reported as well as objective measures of affluence. The implication of these findings is not that affluent people, as a whole, are unethical and greedy, nor that the poor are uniformly ethical and less greedy. The bottom line is that, relative to poor people, affluent people have a greater likelihood of engaging in unethical behavior and endorsing greed. These conclusions contrast with a popular misconception about the poor and expand how we should think about cheating behavior in general.
It is important to note that this study will need replication in order to become firmly established; however, these findings are unidirectional and unambiguous. They are also consistent with what has been verified in the literature to date. Although there are examples of extraordinary philanthropy by affluent people such as Warren Buffett and Bill Gates, there are many other examples of systematic corruption and crime among the economic elite. At the other end of the spectrum, there are poor individuals who proudly game the system in such a way as to take more than they contribute. I hear stories of such individuals with such regularity that these narratives take on the feel of urban legend. Yet I routinely work with hard-working individuals from the lowest end of the economic spectrum, and in my more than 20 years of exposure to this population, I have come across only one family that fits this description. Meanwhile, my professional colleagues have to devote huge amounts of time to documenting Early Intervention and Preschool services as a result of Medicaid fraud perpetrated by affluent and unethical service providers who bill for services never rendered. In other words, my extensive personal anecdotes align with the findings of this series of studies.
It seems to me that it is indeed time to challenge the meme that poor people are lazy, freeloading cheaters. At the same time, it seems prudent to open our eyes to the misconduct of the affluent. The evidence certainly supports such a conclusion. This brings to mind a quote from John Adams:
“Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.”
Piff, P. K., Stancato, D. M., Côté, S., Mendoza-Denton, R., & Keltner, D. (2012). Higher social class predicts increased unethical behavior. Proceedings of the National Academy of Sciences, 109(11), 4086–4091.
Out to dinner recently, a friend and I were discussing an organization whose name implies one thing when, in actuality, what it promotes is entirely the opposite. We both racked our brains to come up with the name of that organization, with no success. Days later, without any recent thought of the elusive name, the words Discovery Institute sprang forward in my mind. It was a spontaneous and surprising recall that brought me relief and pleasure. “Ah ha! That’s what we were trying to remember the other night. Yes!” I said to myself. These types of memories are called Mind Pops.
They are also referred to as involuntary semantic memories. As was the case in my example, they are completely involuntary in that this type of recall occurs without any current conscious, active thought. In the more scholarly term (involuntary semantic memories), the word semantic indicates that the recalled material springs from one’s semantic knowledge; most commonly, the item recalled is a word, phrase, image, melody, or proper name that one has learned or previously been exposed to. These recall events pop into conscious thought (i.e., your “mind”) without current, conscious, active pursuit, thus the origin of the more compelling descriptor Mind Pops.
These memory events are a relatively new topic of research, which is revealing, as was the case in my example, that such events are not always truly random. Although a memory may be irrelevant at the exact moment it pops into awareness, it is usually linked to one’s past experiences. Sometimes these events occur with no conscious awareness of the trigger itself. In my example, there was an event that consciously set the stage for my Mind Pop (i.e., striving to recall the Discovery Institute), but some Mind Pops are more mysterious.
Kvavilashvili and her colleague George Mandler propose that the completely “out of the blue” Mind Pops are often explained by “long-term priming.” Priming itself is an interesting topic, but essentially it is a phenomenon whereby your behavior can be altered by exposure to stimuli that enter your unconscious (implicit) memory. Research has demonstrated that people can be primed to be more polite and patient if unwittingly exposed, in an unrelated task, to words associated with being polite and patient. People will walk more slowly if they are implicitly primed with words associated with the elderly. Furthermore, recall of trivia is better if people are asked to think about the role of a college professor before being asked the trivia questions, relative to folks asked to first think about being a soccer hooligan (with other variables held constant).
This unconscious priming sets the stage for these mysterious out-of-the-blue Mind Pops. Subconscious exposure to an image, a word, a song, or a scene serves as the trigger for later Popping. As the word subconscious implies, the exposure occurs completely outside of conscious awareness. When Kvavilashvili and Mandler asked subjects to journal their Mind Pops, there were numerous examples where the Pops had no clear triggers, or only very subtle ones. “Most of the information we encounter on a daily basis activates certain representations in the mind,” Kvavilashvili explains. “If you go past a fish and chips shop, not only the concept of fish may get activated but lots of things related to fish, and they may stay activated for a certain amount of time—for hours or even days. Later on, other things in the environment may trigger these already active concepts, which have the feeling of coming out of nowhere.” Kvavilashvili noted: “I got curious about [Mind Pops] because they seemed so random and out of the blue, but these mind pops are genuine fragments of knowledge about the world. What it shows us is that our subconscious often knows the meaning of an experience, even if consciously we don’t.”
Researchers like Dr. Lia Kvavilashvili are finding that Mind Pops are quite common. You have likely experienced such events yourself. Kvavilashvili suggests that they are most often words or phrases rather than images or sounds, and that they usually occur in the midst of some routine activity, such as engaging in self-care. In other words, they are most likely to occur when your mind is not focused on the task at hand and is free to wander. A variant of this phenomenon is the Tip of the Tongue (TOT) experience, where you may be struggling to remember a name or a word and it feels as though it is right on the tip of your tongue; yet you just can’t spit it out. Then later, when you have stopped actively pursuing it, the word surfaces. Letting go of the pursuit allows your implicit (unconscious) memory to do its work.
Although almost everyone experiences Mind Pops, there seems to be an increased frequency of Mind Popping in individuals with mental health issues. Researchers Keith Laws, Lia Kvavilashvili, and Ia Elua conducted preliminary research comparing the frequency of Mind Pops in 37 individuals with schizophrenia, 31 people with depression, and 26 individuals with no mental health issues. On average, individuals with schizophrenia reported 3–4 Mind Pops a week, individuals with depression reported 1–2 a month, and healthy individuals reported 1–2 every six months. Intrusive thoughts that bleed into consciousness are indeed among the prominent features of schizophrenia and depression, so these categorical differences make sense.
In my personal correspondence with Dr. Kvavilashvili, she differentiated Mind Pops from the Involuntary Autobiographical Memories I described in a previous post titled The Guilt-Empathy Connection. In that post I discussed a similar phenomenon whereby “serenity seems to occasionally pave the way for a sequence of thoughts triggered by a song or a smell, or anything really, that ushers in a blast from the past. A cavalcade of memories then flow forth both effortlessly and seamlessly. And all of this occurs outside of conscious control. For me, it often begins with a pleasant memory, but it can take a circuitous route, bringing me to memories that I would prefer remain inaccessible. The ending point is usually a moment in time where I come face to face with a mistake I made – usually a long forgotten unintentional misstep that revealed a less sensitive or perceptive side of my persona.” Dr. Kvavilashvili noted that there seem to be “personality and individual difference variables at play” in my type of guilt-based Involuntary Autobiographical Memories.
In a cursory review of the literature, I came across a study by Dr. Dorthe Berntsen, who wrote that “The involuntary [autobiographical] memories more frequently referred to specific episodes, came with more physical reaction, had more impact on mood, and dealt with more unusual and less positive events.” This coincides with my anecdotal experiences (for whatever that is worth). For me, these events were indeed outliers; they were negative, viscerally so, and they did significantly affect my mood. Mind Pops are quite different from such Involuntary Autobiographical Memories in that the Pops are more semantic in nature (rather than biographical or experiential), and they tend to be more positively experienced.
Although Mind Pops and Involuntary Autobiographical Memories are commonplace, they are striking manifestations of our incredibly complex brain. Please share your interesting Mind Pops or Involuntary Autobiographical Memories in the Comments section below. And when you have one of those “out of the blue” Mind Pops, look deep to find the subconscious trigger; you might be amazed by your inattentional blindness, or by the vastness of what your mind’s eye takes in beyond what you consciously see.
Berntsen, D., & Hall, N. M. (2004). The episodic nature of involuntary autobiographical memories. Memory & Cognition, 32(5), 789–803.
Cowen, M. (2012). ‘Mind-pop’ frequency increased in schizophrenia patients. MedWire News.
Elua, I., Laws, K., & Kvavilashvili, L. (2012). From mind-pops to hallucinations? A study of involuntary semantic memories in schizophrenia. Psychiatry Research, 196(2), 165–170.
Guild, G. (2010). Are You a Robot? Can I Program Your Responses? How Do You Think? http://geraldguild.com
Guild, G. (2012). The Guilt-Empathy Connection. How Do You Think? http://geraldguild.com
Jabr, F. (2012). Mind-Pops: Psychologists Begin to Study an Unusual Form of Proustian Memory. Scientific American.
Kvavilashvili, L., & Mandler, G. (2003). Out of one’s mind: A study of involuntary semantic memories. Paper shared by the author in personal correspondence.
Science Daily (2012). Mind-Pops More Likely With Schizophrenia. ScienceDaily.
Although I did not make a substantial number of posts in 2012, the traffic to my site doubled. Throughout 2012 my blog had 35,819 hits from 31,960 unique visitors, accounting for over 46,720 page views. I had visitors from every state in the US and from 165 nations around the world. Visitors from the United States accounted for the vast majority of those hits, but the UK, Canada, India, and Australia also brought in large contingents.
This year the top-ranked article was my 2011 post on Conspicuous Consumption and the Peacock’s Tail, which accounted for 50% more hits than this year’s number-two-ranked article (Brainwaves and Other Brain Measures, the number-one post from last year). The piece on conspicuous consumption is, in my opinion, one of my all-time most important pieces. It addresses our inherent drive to advance our social standing while actually going nowhere on the hedonic treadmill. It delves into the environmental costs of buying into the illusion of consumer materialism and that illusion’s biological origins (a signaling instinct much like that of the peacock). The Brainwaves piece, also from 2011, compares and contrasts the different measures used to peer into the workings of the brain.
Of my posts published in 2012, only two made it to this year’s top ten list: five were from 2010 and three were published in 2011. Of those eight from previous years, five were also on the top ten list last year.
My 2012 review and discussion of the Broadway Musical Wicked topped the list of posts actually written in 2012, but it came in third overall this year relative to all other posts. This article explores the theme that “things are not as they seem.” I relate the story told in the show to the political and historical manipulation American citizens are subjected to, and it stirs up unpleasant and inconvenient realities that many would prefer remain unknown.
Great interest persists in my post entitled Nonmoral Nature: It is what it is. This review of Stephen Jay Gould’s most famous article received a number-four ranking, down from a number-two ranking over the last two years. In 2010 I had also reviewed a very popular New York Times article by Steven Pinker entitled The Moral Instinct. That article moved down two notches this year, ultimately ranking number five. My critical article on the Implicit Associations Test ranked number six this year, versus a number-four ranking last year. My 2011 post Where Does Prejudice Come From? ranked number seven this year, down two spots from last year. One of my all-time favorite posts from 2010, Emotion vs. Reason: And the Winner is?, returned to the top-ten list this year, coming in eighth. In 2010 it ranked number ten, but it fell off the list last year. My Hedgehog versus the Fox mindset piece ranked number nine this year, compared to a number-ten ranking last year. Finally, in the number-ten slot this year is my 2012 article Happiness as Measured by GDP: Really? This post was perhaps the most important post of the year.
So here is the Top Ten list for 2012.
- Conspicuous Consumption and the Peacock’s Tail (2011)
- Brainwaves and Other Brain Measures (2011)
- Wicked! Things are NOT as they Seem (2012)
- Non Moral Nature: It is what it is (2010)
- Moral Instinct (2010)
- IAT: Questions of Reliability and Validity (2010)
- Where Does Prejudice Come From? (2011)
- Emotion vs. Reason: And the Winner is? (2010)
- Are you a Hedgehog or a Fox? (2010)
- Happiness as Measured by GDP: Really? (2012)
Again this year, the top ten articles represent the foundational issues that have driven me in my quest to understand how people think. This cross section of my work is, in fact, a good starting point for those who are new to my blog. There are several other 2012 posts that ranked outside the top ten; regardless, I believe they are important. These other posts include:
This latter article, The Meek Shall Inherit The Earth, pertains to the microbiome, the collection of an estimated 100 trillion individual organisms thriving in and on your body that account for about three pounds of your total body weight (about the same weight as your brain). These little creatures play a huge role in your physical and mental well-being, and we are just beginning to understand the extent of their reach. Modern medicine will likely come to embrace the microbiome as a means of preventing and treating many illnesses (including some mental illnesses).
Although not among the most popular articles this year, my 2011 pieces on the pernicious effects of poverty on child development warrant ongoing attention. If we truly wish to halt the cycle of poverty, then we need to provide early, evidence-based intervention services for children and families living in poverty. As it turns out, poverty is a neurotoxin. Knowing the information in this series should motivate us, as a society, to truly evaluate our current political and economic policies.
The bottom line:
The human brain, no matter how remarkable, is flawed in two fundamental ways. First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought. Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge base regarding the workings of the world and the universe. These leaps of knowledge have rendered those error-prone proclivities unessential for survival. Regardless, they have remained a dominant cognitive force. Although our intuition and rapid cognitions have sustained us, and in some ways still do, the everyday illusions they produce impede us in important ways.
Second, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world. Time after time I have written of the dangers of ideology with regard to its capacity to blindfold its disciples. Often those blindfolds are absolutely essential to sustaining the ideology. And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized. As I discussed in Spinoza’s Conjecture:
“We all look at the world through our personal lenses of experience. Our experiences shape our understanding of the world, and ultimately our understanding of [it], then filters what we take in. The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”
Because of these innate tendencies, we must make additional effort to step away from what we believe to be true in order to discover the truth.
Posted by Gerald Guild
There has been a lot of talk in the media about the forthcoming DSM-5 and the diagnosis of Autism. The DSM-5 is the fifth edition of the Diagnostic and Statistical Manual, used by doctors to make diagnoses pertaining to Autism and other behavioral and mental health disorders. There are two major changes in this newest edition regarding Autism. The first has to do with the name of the diagnosis. The second has to do with the actual diagnostic criteria used to make a diagnosis.
Currently, when presented with a child who exhibits some characteristics of Autism, doctors have to determine whether the child exhibits a sufficient array of clinically significant symptoms to warrant a diagnosis. This process requires the clinician to rule out other disorders that may instead be causing the problematic symptoms. The clinician also has to make a differential diagnosis to determine which of the Pervasive Developmental Disorders best describes the child. Many professionals, me included, believe that the dividing lines between the various forms of Autism are difficult to distinguish. The new DSM does away with this problem by eliminating the different labels (Autistic Disorder, Asperger’s Disorder, PDD-NOS, Childhood Disintegrative Disorder) and instead putting in place a more general term: Autism Spectrum Disorder (ASD). Many researchers and clinicians agree that this change is warranted.
When the DSM-5 is published in May of 2013, children who previously would have been diagnosed with Autistic Disorder, Asperger’s Disorder, or PDD-NOS will be given the new diagnosis: Autism Spectrum Disorder (ASD). A differentiation will then be made by indicating the degree of symptom severity. Those with more classical Autism will typically be diagnosed with ASD-Severe. At the other end of the spectrum, children who would have been diagnosed with Pervasive Developmental Disorder – Not Otherwise Specified (PDD-NOS) will likely get an ASD-Mild designation. Those with Asperger’s may fall anywhere from ASD-Severe to ASD-Mild, depending on the degree of impairment; many will likely fall in the Moderate range. To be clear, however, classical Autism may span Severe to Mild ASD, while PDD-NOS will likely span Moderate to Mild ASD. The severity designation depends on the number and severity of symptoms present. If your child already carries a diagnosis, little will change, except perhaps how professionals refer to the disorder itself: your child will be referred to as being on the Autism Spectrum.
The second change involves a modification of the diagnostic criteria used to make a diagnosis. When making a diagnosis, a clinician such as myself has to have evidence of a sufficient array of the behaviors listed in the DSM. The behaviors commonly associated with Autism make up the list of diagnostic criteria in the manual. The new DSM updates these criteria. It defines ASD by two sets of core features, namely: 1) impaired social communication and social interactions; and 2) restricted and repetitive behavior and interests. It reorganizes the symptoms in these domains more appropriately and adds sensory interests and sensory aversions to the list.
The new version is touted as an improvement because it adds to and reorganizes the diagnostic criteria so that they better address the needs of people with ASD across all developmental levels and ages. It also includes improvements intended to better address the atypical symptom presentation of girls. The goal of the DSM-5 is to apply what is detailed in the scientific literature so as to add precision and validity to the diagnostic process.
As with any change, there have been some concerns expressed in the media. Perhaps the most frequently heard concern is the fear that those at the mildest end of the spectrum, with strong cognitive capabilities, will no longer qualify for the diagnosis and thus may lose services. Advocacy groups such as Autism Speaks have been actively engaged in this reorganization process, and the American Psychiatric Association (the publisher of the DSM) has made statements aimed at calming these concerns. They suggest that clinical judgment remains a crucial piece of the diagnostic process and that the new criteria are designed to be completely inclusive of those diagnosed using the current DSM-IV. The research released by the American Psychiatric Association shows improved reliability and validity of diagnoses using the DSM-5 and strong inclusiveness of those already diagnosed using the DSM-IV. I have reviewed the proposed diagnostic criteria and did not find any serious concerns with regard to how they will affect my ability to make diagnoses.
The bottom line is that, for most parents, there will be no appreciable change other than how we refer to your child. In anticipation of this change, we have already been using the phrase Autism Spectrum Disorder, or “on the spectrum,” for quite some time now. Diagnoses in the near term will still be made using the current DSM-IV, and thus we will still be using the terms Autistic Disorder, Asperger’s Disorder, and PDD-NOS. It is advisable for clinicians and diagnosticians to begin using both sets of terminology so as to minimize confusion in the future. Sharing a document such as this one with the parents of the newly diagnosed is also advised.
Saying “I’m sorry” can be very difficult for some of us. We routinely make mistakes. As Alexander Pope wrote: “To err is human; to forgive, divine.” Within any interpersonal relationship there will be inadvertent missteps, or even acts of anger, that hurt those close to us. It’s not a matter of if; it’s a matter of when. Forgiving is important, as Pope emphasizes, and it is also quite often a difficult thing to do. But the act of apologizing, it seems to me, can be even harder.
Obviously, apologizing necessitates swallowing one’s pride and accepting responsibility for one’s misdeeds. It also requires a departure from one’s own view of the world and the adoption of another person’s perspective. Swallowing one’s pride is hard enough, and perspective taking stirs up feelings of guilt. For these reasons alone, I believe that saying the two simple words “I’m sorry” is perhaps one of the bravest things a person can do.
There are other factors that contribute to the difficulty associated with an apology. Some view it as a tacit acknowledgement of one’s weakness. It does tend to elicit a personal feeling of vulnerability and perhaps pangs of subjugation, defeat, and loss of status. It can entwine and envelop one in an aura of incompetence and humility. No one likes such feelings: none of them elevate one’s sense of well-being. Quite the opposite: they elicit dysphoric feelings that essentially punish the inclination to apologize. Thus, many avoid, ignore, or steep themselves in denial. Pointing outward and blaming the other party for causing the problem strips one of responsibility and allows escape from the unpleasantness of having to apologize. It is the easy way out, and ultimately it tends to bankrupt a relationship.
I really like how Stephen Covey, author of The 7 Habits of Highly Effective People, conceptualizes relationships. He likens a relationship to a bank account. When you treat another person with dignity and respect, you make deposits in their emotional bank account. When you hurt someone, you essentially make a withdrawal. By virtue of being in a sustained relationship, you will, over time, make a series of deposits and withdrawals. When you hurt another person and then deny your responsibility for having done so, you compound the withdrawal. And too many withdrawals can drain that person’s emotional bank account. A drained account stirs contempt and lays the foundation for the end of that relationship. A genuine apology is typically a deposit, and it can go a long way toward bringing the account back into balance. To be effective, it must be heartfelt, with an acknowledgment of the depth of harm done, and with full acceptance of responsibility. The result should help heal wounds, and it may even strengthen the relationship. It is a gift, because it can make forgiveness easier for the injured party. Denial, on the other hand, deepens the wound and widens the gap.
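Covey’s metaphor is qualitative, but the bookkeeping it implies can be sketched in a few lines of toy code. Everything here (the class name, the numbers, the doubling penalty for denial) is my own invented illustration, not anything specified in Covey’s book:

```python
class EmotionalBankAccount:
    """Toy model of Covey's emotional bank account metaphor."""

    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        """Acts of dignity and respect add to the balance."""
        self.balance += amount

    def withdraw(self, amount, denied=False):
        """Hurtful acts subtract; denying responsibility compounds the loss."""
        self.balance -= amount * (2 if denied else 1)

    def apologize(self, depth):
        """A genuine apology is itself a deposit."""
        self.balance += depth


account = EmotionalBankAccount()
account.deposit(5)                 # a kind gesture
account.withdraw(3, denied=True)   # a hurt, made worse by denial
account.apologize(4)               # a heartfelt apology restores some balance
print(account.balance)             # 3
```

The point of the sketch is simply that denial multiplies the damage of the original withdrawal, while an apology is a deposit in its own right.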
Saying “I’m sorry” is supposed to be difficult. It is an act of contrition, whereby one bears the difficult weight of the misstep and takes responsibility for it. This courageous endeavor is essential for sustaining a loving and caring relationship. The world in general, and your relationships specifically, will be better if you endeavor to be brave enough to utter these simple words. Doing the right thing is ultimately far more important than being right (Ludwig, 2009). To err is human; to apologize, heroic.
Belkin, L. (2010). Why Is It So Hard to Apologize Well? The New York Times.
Lazare, A. (2004). Making Peace Through Apology. GreaterGood.berkeley.edu.
Ludwig, R. (2009). Why Is It So Hard to Say “I’m Sorry?” NBCNews.com.
Mumford & Sons (2010). Little Lion Man.
O’Leary, T. (2007). 5 Steps to an Effective Apology. PickTheBrain.com.
Sometimes the quietest moments are the most troubling. Serenity seems to occasionally pave the way for a sequence of thoughts triggered by a song or a smell, or anything really, that ushers in a blast from the past. A cavalcade of memories then flows forth both effortlessly and seamlessly. And all of this occurs outside of conscious control. For me, it often begins with a pleasant memory, but it can take a circuitous route, bringing me to memories that I would prefer remain inaccessible. The ending point is usually a moment in time where I come face to face with a mistake I made – usually a long-forgotten unintentional misstep that revealed a less sensitive or perceptive side of my persona.
Does this sound familiar? I have long struggled to make sense of this sequence of thoughts. It’s not as though these distant missteps weigh heavily on my conscious mind. And most of the time they have little or no current relevance. Almost always the events involve a situation where I had no intention of being hurtful. So why would my brain dredge up painful events and spoil a perfectly pleasant moment? It makes little sense to me.
I have long felt like there is a dark and deeply self-effacing entity lurking in the shadows of my mind, just waiting for an opportunity to rain guilt on me. Really, it does feel like there is something lurking inside my mind, stalking my thoughts, waiting for a memory that can be linked back to an event that will make me feel bad about myself. Freud’s notion of the Superego seems particularly relevant, but there is no evidence of such embodied moralistic forces battling it out in the brain. There are, however, brain systems that interact in ways compellingly similar to Freud’s model with regard to active decision making. But it is not clear to me how, or why, these systems would reach back in time to spoil a moment of serenity.
As I understand it, the brain is composed of a complex combinatorial neuronal network that has evolved over millions of years. This being the case, there must be either some adaptive value to this capacity to stir up guilty feelings, or it may be a side effect of some other adaptive neurological system. These hypotheses assume that this propensity is neither pathological nor unique to me. Given that these recall events do not adversely affect my life in any substantive way, beyond briefly bumming me out, and the likelihood that I am not alone in experiencing this, it must be adaptive at some level.
As it turns out there appears to be evidence for a relationship between dispositional empathy and one’s proneness to feelings of guilt. In a study titled Empathy, Shame, Guilt, and Narratives of Interpersonal Conflicts: Guilt-Prone People Are Better at Perspective Taking by Karen P. Leith and Roy F. Baumeister they found that Guilt:
“… seems to be linked to the important cognitive components of empathy, particularly the ability to appreciate another person’s perspective (or at least to recognize that the other’s perspective differs from one’s own). Guilt-proneness is linked to both the ability and the willingness to consider the other’s perspective.”
So these feelings of remote guilt may indeed be adaptive in that they fuel my perspective taking capacity. In other words, they compel me to be all the more careful and sensitive so as to facilitate better outcomes with regard to current social relationships (and thus avoid future negative recollections). I am inherently driven to look at the other person’s perspective in most of my encounters with people. It seems that those situations that spring forth from the depths of my memory are those occasions when I did not effectively employ good perspective taking.
Empathy is widely accepted as an adaptive skill, and perhaps guilt-proneness provides positive feedback, driving one toward more effective empathy. Or perhaps the guilty feelings dredged up are experiential outliers – the memories with stronger visceral tags – the ones that are more easily dragged to the forefront as my brain meanders down memory lane. Leith and Baumeister’s research did not address the retrospective nature of experiences like mine; therefore, I continue to speculate. But this link between empathy and guilt makes sense. Or maybe this is a self-serving bias.
If you have a moment, please click on the link below to answer some questions that will give me some preliminary information on this empathy-guilt relationship. It’s only 5 questions – and really, it should only take a minute or so.
Click here to take survey
We humans like to think of ourselves as strong and dominant forces. Why shouldn’t we? After all, we have conquered many of our natural foes and reign supreme as rational and commanding masters of our destiny. That is what we like to think. But this may be an illusion because as it turns out, we share our bodies with an unimaginably vast array of organisms that seem to play a substantial role in our well-being.
In and on your body, there are ten microorganisms for every single human cell. They are invisible to the naked eye – microscopic actually. For the most part they are bacteria, but also protozoans, viruses, and fungi. This collection of organisms is referred to as the microbiome and it accounts for about three pounds of your total body weight: about the same weight as your brain. In all, there are an estimated 100 trillion individuals thriving on your skin, in your mouth, in your gut, and in your respiratory system, among other places. And it is estimated that there are one to two thousand different species making up this community.(2)
Image of Microscopic Bacteria
Since the widespread acceptance of the germ theory in the late nineteenth century, we have considered bacteria the enemy. These organisms are germs, after all, and germs make us sick. This is accurate in many ways: acceptance and application of the germ theory vastly extended human life expectancy (from 30 years in the Dark Ages to 60 years in the 1930s). Other advances have since increased that expectancy to about 80 years.
But, as we are increasingly becoming aware, this microbiome plays a crucial role in our ability to live in the first place. There are “good” and “bad” microbes. But this dichotomy is not so black and white. Some good microbes turn problematic only if they get in the wrong place (e.g., sepsis and peritonitis). But what we must accept is that we would not survive without the good ones. We are just beginning to learn of the extent to which they control our health and even our moods.
For example, some of our nutritive staples would be of very limited value if it weren’t for Bacteroides thetaiotaomicron. This microbe in our gut has the job of breaking down complex carbohydrates found in foods such as oranges, apples, potatoes, and wheat germ. Without it, we simply do not have the capability to digest such carbohydrates.(1) And this is just the tip of the proverbial iceberg.
The “beneficial” bacteria in our guts are clearly very important. They compete with the harmful bacteria, they help us digest our food, and they help our bodies produce vitamins that we could not synthesize on our own.(3) Surprisingly, these microbes may also play a significant role in our mood. In a recent study, mice fed the bacterium Lactobacillus showed a significant release of the neurotransmitter GABA, which is known to have a calming effect. When this relationship was tested in humans, a link was found between such gut bacteria and calmness at a therapeutic level consistent with the efficacy of anti-anxiety pharmaceuticals.(2) This alone is amazing.
But wait, there’s more. Take for example Helicobacter pylori (H. pylori), whose job seems to be regulating acid levels in the stomach. It acts much like a thermostat, producing proteins that signal our cells to tone down acid production. Sometimes things go wrong, and these proteins actually provoke gastric ulcers. This discovery resulted in an all-out war on H. pylori through the use of antibiotics. Two to three generations ago, more than 80% of Americans hosted this bacterium. Now, since the discovery of the connection with gastric ulcers, less than 6% of American school children test positive for it.(1) This is a good thing! Right?
Perhaps not. As we have recently come to discover, H. pylori plays an important role in our experience of hunger. Our stomach produces two hormones that regulate food intake. Ghrelin (the hunger hormone) tells your brain that you need food. Leptin, the second hormone, signals that your stomach is full. Ghrelin ramps up when you have not eaten for a while, and exercise also seems to boost it; eating diminishes it. Studies have shown that H. pylori significantly regulates ghrelin levels, and that without it your ghrelin levels may go unregulated, leading to a greater appetite and excessive caloric intake.(1) Sound like a familiar crisis?
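As a toy illustration of this regulatory idea, one can sketch hunger signaling as a simple damped signal. All the names, numbers, and the linear rule below are my own inventions for illustration; the actual endocrinology is far more complicated:

```python
def hunger_signal(hours_since_meal, h_pylori_present=True):
    """Toy sketch: ghrelin rises the longer you go without food.
    Here H. pylori damps the signal, standing in (very loosely)
    for its regulatory role; the 10 and 0.6 are arbitrary."""
    ghrelin = hours_since_meal * 10
    if h_pylori_present:
        ghrelin *= 0.6  # damped, regulated appetite
    return ghrelin


print(hunger_signal(5, h_pylori_present=True))   # 30.0
print(hunger_signal(5, h_pylori_present=False))  # 50 -> stronger appetite
```

The sketch only captures the qualitative claim: remove the regulator and the same fast produces a stronger hunger signal.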
The long and the short of this latter example is that we really do not understand the downstream consequences of our widespread use of antibiotics. Obesity may be one of those consequences. When we take antibiotics, they do not specifically target the bad bacteria; they affect the good bacteria as well. It’s not just medical antibiotics that cause problems – we have increasingly created a hygienic environment that is hostile to our microbiome. We are increasingly isolating ourselves from exposure to good and bad bacteria alike, and some suggest that this is just making us sicker. See the Hygiene Hypothesis.
We have co-evolved with our microbiome and as such have developed an “immune system that depends on the constant intervention of beneficial bacteria... [and] over the eons the immune system has evolved numerous checks and balances that generally prevent it from becoming either too aggressive (and attacking its own tissue) or too lax (and failing to recognize dangerous pathogens).”(1) Bacteroides fragilis (B. fragilis), for example, has been found to have a profoundly important and positive impact on the immune system, keeping it in balance by “boosting its anti-inflammatory arm.” Autoimmune diseases such as Crohn’s disease, Type 1 diabetes, and multiple sclerosis have recently increased by a factor of 7 to 8. Concurrently, we have changed our relationship with the microbiome.(1) This relationship is not definitively established, but it clearly merits more research.
Gaining a better understanding of the microbiome is imperative, and is, I dare say, the future of medicine. We humans are big and strong, but we can be taken down by single celled organisms. And if we are not careful stewards of our partners in life, these meek organisms may destroy us. It is certain that they will live on well beyond our days. Perhaps they shall reclaim the biotic world they created.
Author’s Note: This article was written in part as a summary of (1) Jennifer Ackerman’s article The Ultimate Social Network in Scientific American (June 2012). Information was also drawn from (2) a Radiolab podcast titled GUTS from April 2012 and (3) a story on NPR by Allison Aubrey called Thriving Gut Bacteria Linked to Good Health in July 2012.
What drives you crazy about your partner? Dirty dishes left piled in the sink. Several days’ worth of laundry strewn about the bedroom. The toilet paper roll that is never replenished. She talks too much – he doesn’t talk enough. He’s always late – she’s a compulsive neat freak. These are a few of the common complaints that spouses have about their loved ones. It is well known that close intimate relationships can be very tough to sustain over time. There is something about living with someone for a long period of time that turns idiosyncratic quirks into incendiary peeves. Why is this?
I’ve recently finished reading Annoying: The Science of What Bugs Us by Joe Palca and Flora Lichtman. This fascinating read dives into a topic that has escaped much direct scientific scrutiny. This fact is amazing because “although everyone can tell you what’s annoying, few, if any, can explain why” (Palca & Lichtman, 2011). One of the topics that these authors explore is this issue of the bothersome habits of intimate partners. It’s exceedingly common – if your partner drives you crazy – you are not alone.
What is very curious is that often the very things that attracted you to your partner are the things that, in the end, foster contempt. Palca and Lichtman explore the concept of Fatal Attraction, coined by sociologist Diane Felmlee of UC Davis. Felmlee has explored this concept for years, and she has seen this tendency in couples all over the world. In the first stage of love (Romantic Love), we are drawn in, in part, by the cute little things, the person’s novel traits, that trigger affection. But, over time, those initially positive attractors often have an annoying flip side.
Why does something that attracted you to your partner get flipped into a detractor? Felmlee believes that this disillusionment occurs due to Social Exchange Theory where “extreme traits have [their] rewards, but they also have costs associated with them, especially when you are in a relationship.”
- If you were drawn to your partner because he was nice and agreeable, he may later be seen as passive and prone to letting people walk all over him.
- If you were attracted to your partner because of her assertiveness, confidence, and self-directed demeanor, you may later find her to be stubborn and unreasonable.
- If you swooned over his strong work ethic and motivation to be successful, you may later be disappointed because you now have an inattentive, inaccessible workaholic.
- Someone who is a romantic, attentive, and caring suitor may later be viewed as a needy and clingy partner.
- The passionate may become the dramatic or explosive hot-head.
- The calm, cool, and collected becomes the aloof stoic.
- The laid back guy becomes the lazy slob.
- The exciting risk taker becomes the irresponsible adrenaline junkie.
- The gregarious life of the party becomes the clown who takes nothing seriously.
And so it goes. Repetition seems to be a crucial contributor, notes Elaine Hatfield, a psychologist at the University of Hawaii: “The same thing keeps happening over and over again in a marriage.” Michael Cunningham, a psychologist at the University of Louisville, has come to refer to these annoying attributes as Social Allergens. The analogy with an allergen plays out in the dose effect. He notes that “small things don’t elicit much of a reaction at first” but that with repeated exposure over time, they “can lead to emotional explosions.” Palca and Lichtman note that:
People frequently describe their partners as both “the love of my life” and “one of the most annoying people I know.”
Elaine Hatfield also believes that these social allergens get amplified when there is an imbalance in equity within a relationship. Equity Theory, she notes, suggests that when there is an imbalance of power, commitment, or contribution in a relationship, these quirks take on a disproportionate amount of negative value. However, if there is balance in the relationship (equity), the annoyance value of a partner’s quirks is more easily tolerated. So, if your partner is a good contributor and there is a balance of power, you are less likely to be annoyed. If, on the other hand, your needs are left unmet, or you do the lion’s share of the work around the house, or you feel unappreciated or diminished by your spouse, there is likely to be more annoyance associated with his or her quirks.
It is also important to note that the nature of a relationship changes over time. During the initial passionate Romantic Love stage, the couple tends to be on their best behavior. Once commitment and comfort are attained, one’s truer attributes tend to come to the surface. There tends to be less effort to conceal one’s quirks and thus increased occurrences of these social allergens.
Over time, increased and accelerated exposure take their toll and if there are equity issues, it’s a recipe for disaster. So, what is one to do?
The first step is to think about the attributes that get to you and consider how they may have a positive side. We all have our strengths and our quirks – yes, you too have your annoying tendencies! Michael Cunningham suggests that you try to be accepting of your partner’s quirks. These behaviors are a part of who the person is. He notes that “You’ve got to take this if you want all of the other good things.”
Own your feelings and explore them at a deeper level, particularly with regard to the equity issues in your relationship. Arthur Aron, a psychology professor at the State University of New York at Stony Brook, urges couples to nurture their relationship. “Celebrate when something good happens to your partner,” he notes. Attend to and accentuate the positive. He also suggests engaging in novel, challenging, and exciting activities fairly often. “Anything you can do that will make your relationship better will tend to make your partner less annoying.” My suggestion is to think of a relationship as a garden that needs attention, maintenance, and nurturance. It’s impossible to rid the garden of all its weeds and pests. But the more attention and nurturance you provide, the more it will flourish. As Stephen Covey is fond of saying: “Love is a verb. Love the feeling is the fruit of love the verb.” So do loving things.
The more I learn about the workings of the human brain, the more I am stirred by feelings that Freud may have been right. Although his theories have long since been discredited, he characterized the brain as a battleground where three forces jockeyed for control over your decision making. There was the Id, whose hedonistic impulses drove us toward pleasure seeking. Then there was the conscientious Superego, whose role was to compel us to make moral decisions. Finally, he believed there was the Ego, whose job was to mediate between the drives of the Id and the Superego so as to facilitate adaptive navigation of the real world.
Freud’s theories have always been compelling because they feel right. I often feel as if there is a tug of war going on inside my head. The struggles occur in the form of decisions to be made – whether it’s about ordering french fries or a salad, fish or steak, having a cookie or an apple, exercising or relaxing, jumping over that crevasse or avoiding it, buying a new coat or saving the money. These battles are seemingly between good choices and bad ones. But where you place the good and the bad is highly contingent on one’s priorities in the moment. The fries, the steak, the cookie, relaxing, and that new coat all seem like good ideas in the moment – they’d bring me pleasure. On the other hand, there are the downstream consequences of unnecessary calories from fat and sugar, or of squandered resources. It’s a classic Id versus Superego battle.
But of course there are no entities in the human brain whose express duties are defined as Freud characterized them.
Or are there?
Well actually, there are brain regions that do wage contentious battles for control over your behaviors. Across time, different modules assert greater amounts of control than others, and thus, the choices we make, do likewise vary in terms of quality. As a result of advances in technology and understanding, we are becoming increasingly aware of the key factors associated with this variation.
Nucleus-Accumbens (NAcc) highlighted in red
One of the key players in our multi-component brain is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most significant roles plays out as a result of activation of the Nucleus Accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine and we experience pleasure. Desire for an item activates the NAcc. Being in the presence of the desired item activates it further. The greater the arousal of the NAcc, the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you anticipate and eat those fries or that steak, or buy that new coat. It is also responsible for that rush you feel when your team wins the big game (Lehrer, 2009).
Insula highlighted in teal
Then there is the Insula – a brain region that produces, among other sensations, unpleasantness. This center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in social settings, or decide not to buy an item. In many cases we avoid exciting the Insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money (Blakeslee, 2007; Lehrer, 2009). When you are jonesing for that coffee or nicotine fix, it is your Insula that is making you feel badly, necessarily compelling you to feed the habit. And when you satisfy the craving it is your NAcc that gives you that Ahhhhh! – that sense of well-being.
Perhaps the NAcc is Freud’s Id and the Insula Freud’s Superego? It is actually much more complicated than this, but the overlap is interesting.
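As a crude caricature of that push and pull, the competition can be sketched as two signals feeding a single choice. The function name, the numbers, and the simple subtraction below are invented for illustration only; actual neural valuation is vastly more complex than a one-line comparison:

```python
def decide(anticipated_pleasure, anticipated_pain):
    """Toy sketch of competing valuation signals: a stand-in for
    NAcc (pleasure) vs. Insula (pain). Act when expected pleasure
    outweighs expected pain; otherwise hold back."""
    net = anticipated_pleasure - anticipated_pain
    return "approach" if net > 0 else "avoid"


print(decide(anticipated_pleasure=8, anticipated_pain=3))  # approach (the fries win)
print(decide(anticipated_pleasure=2, anticipated_pain=6))  # avoid (spending feels painful)
```

Even this cartoon captures the key intuition: the same option can flip from “good idea” to “bad idea” simply because the relative strength of the two signals shifts from moment to moment.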
In an article I posted last month I wrote about the concept of an Alief. An Alief is a primal and largely irrational fear (emotion) that arises from the deep unconscious recesses of your brain and plays a significant role in guiding some of the decisions you make. At a very basic level, we know of two major driving forces that guide our decisions. Broadly, the two forces are reason and emotion. So how does this work? How do we process and deal with such diverse forces?
Orbitofrontal-Cortex (OFC) highlighted in pink
Neuroscientists now know that the Orbitofrontal Cortex (OFC) is the brain center that integrates a multitude of information from various brain regions, along with visceral emotions, in an attempt to facilitate adaptive decision making. Current neuroimaging evidence suggests that the OFC is involved in monitoring and learning, as well as in memorizing the potency of both reinforcers and punishers. It analyzes the available options and communicates its decisions by creating emotions that are supposed to help you make choices. Next time you are faced with a difficult decision and you experience an associated emotion, this is the result of your OFC’s attempt to tell you what to do. Such feelings actually guide most of our decisions without our even knowing that it is happening.
The OFC operates outside your awareness, opaquely communicating with your rational decision-making center using the language of feelings. Our rational center, the Prefrontal Cortex – the more apt Freudian Ego analogy – is not as predominant as Freud suggested. In fact, it is limited in capacity: both easily fatigued and easily overtaxed. See my post on Willpower for a deeper discussion of this issue.
So, as crazed as we view Freud’s notions today, there were some aspects of his explanation of human behavior that were rooted in actual brain systems. As I previously noted, these systems are much more complicated than I have described above, but in essence, there are battles waged in your head between forces that manipulate you and your choices through the use of chemical neurotransmitters. A portion of these battles occur outside your awareness, but it is the influence of the emotions that stem from these unconscious battles that ultimately make you feel as though there is a Devil (Id) on one shoulder and an angel (Superego) on the other as your Prefrontal Cortex (Ego) struggles to make the best possible decision.
By understanding these systems you may become empowered to make better decisions, avoid bad choices, and ultimately take more personal responsibility for the process. It’s not the Devil that made you do it, and it’s not poor Ego Strength necessitating years of psychotherapy. It is the influence of deeply stirred emotions and manipulation occurring inside of you, and perhaps some overdependence on a vulnerable and easily overburdened Prefrontal Cortex, that leads you down that gluttonous path.
Blakeslee, Sandra. 2007. Small Part of the Brain, and Its Profound Effects. New York Times.
Gladwell, M. 2005. Blink: The Power of Thinking Without Thinking. Little, Brown and Company: New York.
Guild, G. 2010. Retail Mind Manipulation. How Do You Think?
Guild, G. 2010. What Plato, Descartes, and Kant Got Wrong: Reason Does Not Rule. How Do You Think?
Guild, G. 2010. Willpower: What is it really? How Do You Think?
Guild, G. 2011. Irrational Fear: It’s Just an Alief. How Do You Think?
Lehrer, J. 2009. How We Decide. Houghton Mifflin Harcourt: New York.
I have always said that there is a fine line between intelligence and fear. Some fear is adaptive and entirely reasonable: particularly when the catalyst truly involves danger. There are some anxieties however, that take hold and profoundly affect behavior in unreasonable ways.
One personal example comes to mind to illustrate this. Last winter I was backpacking on a trail that traversed some rock city formations with deep, but relatively narrow, crevasses. Many of the cracks were unintimidating and easily traversed. There was one, however, that stopped me in my tracks. The gap was 36 to 40 inches across, with a sheer 25-foot drop. Under more typical circumstances, this gap would not have fazed me. Yet, in this situation, I was completely frozen.
Rock City Crevasse
To be clear there was some risk associated with this crossing. But, in my mind, the risk took on unreasonable proportions.
Frankly, I was both embarrassed and befuddled by this situation. Were it a stream of equal width, I would have easily hopped over it.
I stood there at battle with myself for what seemed like an eternity. In reality, it was probably only a minute or two. My body was hostage to a cognitive tug-of-war. My rational brain urged me to leap: “Come on,” I uttered to myself, “It’s only three feet across! You can do this!”
Another force in my brain countered with incapacitating doubt. Kevin, my backpacking companion, patiently waited on the other side of the crevasse after easily leaping across. I saw him do it with no difficulty. I had clear evidence that the crossing was easily within my capabilities; but the cost of a slip and a fall far overshadowed my confidence. The frustration I felt over this coup of sorts was immense. Finally, I was able to muster up enough confidence to take the leap. It was, in fact, quite easy. We hiked on, and no further mention of this humbling pause was made.
Many fears are like this, whether of mice, bees, spiders, or snakes. These stimuli pose, in most circumstances, no grave threat, but the flight response they trigger in the phobic is immense. Even when a person knows that there is no reason for fear, it persists.
This response is akin to the reluctance that most people have about eating chocolate fudge in the shape of dog feces, or eating soup from a clean unused bedpan, or drinking juice from a glass in which a sterile cockroach has been dipped. Psychologist Paul Rozin, in his famous studies on disgust, discovered that when presented with these circumstances, most people choose not to eat the fudge or the soup, or drink from the glass – even knowing there is no real danger in doing so. It is the irrational essence of contagion that drives these inhibitions.
These situations are all very different than rock climbing without ropes, where there is clear and present danger. When we are compelled to flee a truly benign stimulus, we are likely driven by an internal cognitive force that screams “RISK!” even when there is no true danger. Intriguing isn’t it, that this innate force is so powerful that even our capacity to use reason and evidence pales in comparison.
Philosopher Tamar Gendler has coined the word “alief” to describe this cognitive phenomenon. She fashioned the word around “belief,” which is a conscious manifestation of how we suppose things to be. An alief is a deep and powerful feeling of sorts that can and does play an important role in decision making, but it is not based in reason or evidence. Beliefs are more susceptible to such rational forces; aliefs defy reason and exert powerful influence despite one’s attempts to rationally dispel them. This voice is intuitive, its origins lie outside your awareness, and it typically arises in the service of self-preservation.
You may believe that the feces-shaped fudge is “JUST FUDGE!” but it is your alief that the fudge is excrement (as a result of its characteristic size, shape, and color) that makes it very hard to eat. I believed that hopping over the crevasse was easily within my capabilities, but it was my alief that leaping over the gap was DANGEROUS that kept me frozen in my tracks.
You see, you can simultaneously hold opposing beliefs and aliefs and it was, in fact, these opposing forces that waged war as I stood at the edge of the precipice. You might believe that a bee is generally harmless and unlikely to sting you unless you threaten it. But, it is your alief, that the bee will sting and hurt you that triggers the autonomic arousal that compels you to flee. It is this deeply primal alief that often wins, no matter how rational you attempt to be.
In my situation, my belief in my leaping ability ultimately prevailed. Perhaps this was due to my machismo or humiliation, but ultimately I fought down and defeated the alief. It was a hard fought battle that left me feeling like a chicken despite my “victory.”
In retrospect, getting an understanding of this internal process has helped me come to grips with my hesitation. And as such, I stand in awe of the internal brain systems that play out in such circumstances.
Perhaps in the future, when in a similar situation, I will be better prepared to deal with self doubt as it springs forth from my lizard brain so that I will more effectively cope with it before it builds incapacitating momentum. After all – it’s just an alief!