“I saw it with my own two eyes!” Does this argument suffice? As it turns out – no, that’s not quite good enough. Seeing should not necessarily lead to believing. Need proof? Play the video below.



As should be evident from this video, what we perceive can’t necessarily be fully trusted. Our brains complete patterns, fill in missing data, interpret, and make sense of chaos in ways that do not necessarily coincide with reality. Need more proof? Check these out.


Visual Illusion - A & B are the same shade of gray


Illusion - Notice the perceived motion around the green circles.



Convinced? The software in our brains is responsible for these phenomena. And this software was coded through progressive evolutionary steps that conferred survival benefits on those with such capabilities. Just as pareidolia confers a survival advantage on those who assign agency to things that go bump in the night, there are survival advantages for those who possess the adaptations responsible for these errors.


So really, you can’t trust what you see. Check out the following video for further implications.



Many of you are likely surprised by what you missed. We tend to see what we are looking for, and we may miss other important pieces of information. The implications of this video seriously challenge the value of eyewitness testimony.


To add insult to injury, you have to know that even our memory is vulnerable. Memory is a reconstructive process, not a reproductive one.2 During memory retrieval we piece together fragments of information; however, due to our own biases and expectations, errors creep in.2 Most often these errors are minimal, so despite these small deviations from reality, our memories are usually pretty reliable. Sometimes, however, too many errors are inserted and our memory becomes unreliable.2 In extreme cases, our memories can be completely false2 (even though we are convinced of their accuracy). This confabulation, as it is called, is most often unintentional and can occur spontaneously as a result of the power of suggestion (e.g., leading questions or exposure to a manipulated photograph).2 Frontal lobe damage (due to a tumor or traumatic brain injury) is known to make one more vulnerable to such errors.2


Even when our brain is functioning properly, we are susceptible to such departures from reality. We are more vulnerable to illusions and hallucinations, be they hypnagogic or otherwise, when we are ill (e.g., have a high fever, are sleep-deprived or oxygen-deprived, or have neurotransmitter imbalances). All of us are likely to experience at least one, if not many, illusions or hallucinations throughout our lifetime. In most cases the occurrence is perfectly normal – simply an acute neurological misfiring. Regardless, many individuals experience religious conversions or become convinced of personal alien abductions as a result of these aberrant neurological phenomena.


We are most susceptible to these particular inaccuracies when we are ignorant of them. On the other hand, improved decisions are likely if we understand these mechanisms, as well as the limitations of the brain’s capacity to process incoming sensory information. Bottom line – you can’t necessarily believe what you see. The same is true for your other senses as well, and these sensory experiences are tightly associated and integrated into long-term memory storage. When you consider the vulnerabilities of our memory, it leaves one wondering to what degree we reside within reality.


For the most part, our perceptions of the world are real. If you think about it, were it otherwise we would be at a survival disadvantage. The errors in perception we experience are in part a result of the rapid cognitions we make in our adaptive unconscious (intuitive brain) so that we can quickly process and successfully react to our environment. For the most part it works very well. But sometimes we experience aberrations, and it is important that we understand the workings of these cognitive missteps. This awareness absolutely necessitates skepticism. Be careful what you believe!




1.  169 Best Illusions – A Sampling. Scientific American: Mind & Brain, May 10, 2010.


2.  Mo. Anatomy of a False Memory. June 13, 2008.


3.  Simons, Daniel J. 1999. Selective Attention Test. Visual Cognition Lab, University of Illinois. http://viscog.beckman.illinois.edu/flashmovie/15.php


4.  Sugihara, Kokichi. 2010. Impossible Motion: Magnet-Like Slopes. Meiji Institute for Advanced Study of Mathematical Sciences, Japan. http://illusioncontest.neuralcorrelate.com/2010/impossible-motion-magnet-like-slopes/


There is a learning curve to the application of skepticism. Raw, unchecked challenges to others’ beliefs, in a social context, are not well tolerated. People tend to find such notions rather off-putting. In fact, as I have certainly encountered, it elicits defensiveness and sometimes hurt feelings. People often own their ideas and beliefs in a way that is essentially linked to their identity. As Carl Sagan wrote in The Demon-Haunted World: “All of us cherish our beliefs. They are, to a degree, self-defining. When someone comes along who challenges our belief system as insufficiently well-based — or who, like Socrates, merely asks embarrassing questions that we haven’t thought of, or demonstrates that we’ve swept key underlying assumptions under the rug — it becomes much more than a search for knowledge. It feels like a personal assault.”


These assaults repel people and in effect insulate them from the rational inquiry you may wish to posit. People are inclined to respond to uninvited or poorly crafted skepticism much as one would respond to contemptuous arrogance.


Throughout most of human history, the social consequences of skeptical inquiry were likely quite costly. This was most certainly true in the pre-agrarian stages of our evolution. It is believed that throughout early human evolution, individual survival was linked to social cohesion. Although this is not as true today, in prehistory skepticism likely hindered, rather than promoted, survival. With this in mind, it certainly makes sense that we as a species are inclined toward unquestioning belief rather than skepticism. This inclination also makes us vulnerable to mysticism and superstition. Natural selection, it seems, has selected for gullibility.


Sensitive, judicious, and sparing use of skepticism, in social contexts, is prudent. This is true unless you just don’t care about how others feel about you, how they feel about interacting with you, and even how they feel about themselves. There is a time and place for everything. Choosing those times carefully, and selecting one’s words even more cautiously, will more likely get better results.


I admire great thinkers like Bruno, Copernicus, and Galileo, who faced more than mere social consequences for putting forward their theories. Bruno, in fact, paid with his life. Darwin too faced significant costs. However, their rejection of accepted explanations (stemming from skeptical inquiry) moved us forward. We owe much to these men for their courage and steadfast dedication to the truth. We move forward when we step away from blind acceptance; but let’s not turn a blind eye toward the social consequences of our own personal skepticism.



Essentialism

12 March 2010

Essentialism, within the purview of psychology, is a cognitive bias whose roots form in early childhood (Gelman, 2004). This concept pertains to the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity – its essence (Dawkins, 2009; Hood, 2008). To put it another way:

“people believe that natural category members share some hidden, unobservable, empirically discoverable deep structure or essence, whose possession is necessary and sufficient for category membership” (Jylkkä, Railo, and Haukioja, 2008).

In our early childhood, as we were developing language, essentialism played a crucial role in the expansion of our vocabulary, the generalization of our knowledge, in discriminating among objects, and in our ability to construct causal explanations (Gelman, 2004).  In our struggle to understand the vast and complicated world, our brain forced us to partition things into categories so we chopped and divided what we surveyed into distinct groupings based on defining characteristics driven by our internalized understanding of the essence of those groupings.  This was initially a very simplistic process (dog, cat, cow), then more complex (mammal, reptile, insect),  and then even more sophisticated for those who progressed in the biological sciences (kingdom, phylum, class, order, family, genus, species). This is necessarily a dynamic process because as we mature and take in increasing complexity we need increased specificity when parsing the world up into discrete categories.


This pattern of thinking/learning transcends all cultures and is central to our language development (Hood, 2008). Given this central role, it forms the foundation of our thought processes (Hood, 2008; Dawkins, 2009). The overgeneralization of this process is what gets us into difficulty. Bruce Hood, author of Supersense (2008), convincingly argues that this innate tendency forms the core of our superstitious and supernatural thinking. Richard Dawkins (2009), an evolutionary biologist, suggests that such an inclination explains why people have such great difficulty grasping and accepting the concept of evolution by means of natural selection. I suggest that, like evolution (which necessitates quintessentially anti-essentialist thinking), the concepts of plate tectonics, deep geological time, and deep space-time are also very hard to grasp for the same reasons. We are inclined to think that what we see are constants – that the world as we see it has been eternally so, and so shall it always remain.


In biology, essentialism sustains the notion that all animals are clear and distinct, belonging to a specific species. In fact, as Dawkins  suggests: “On the ‘population-thinking’ evolution view, every animal [living form] is linked to every other animal [living form], say rabbit to leopard, by a chain of intermediates, each so similar to the next that every link could in principle mate with its neighbors in the chain and produce fertile offspring” (2009, p. 24).  This is true for all conceivable pairings including bacteria and viruses, giant sequoias and lichen, spiders and flies, cats and dogs, birds and snakes, foxes and chickens, and even humans and turnips.


Plato demonstrated essentialist thinking in The Republic in his cave allegory, where he suggested that the world as we experience it is only a composite of mere shadows tethered to their true and perfect forms (essences) floating about somewhere in the heavens (Dawkins, 2009; Hood, 2008). Many people still believe that there is something more to the physical world than what we see. As Hood (2008) put it, “Humans like to think that special things are unique by virtue of something deep and irreplaceable.” This thinking, and other intuitive errors such as vitalism (that vital life energies cause things to be alive) and holism (that everything is connected by forces) are likely artifacts of our natural involuntary inclinations (Hood, 2008).


Essentialism is more than a heuristic, and it has ramifications beyond making us less inclined to believe in evolution or more inclined toward superstition. It is what makes rape more than a physical crime: the defilement and contamination the victim feels is a psychological violation of one’s essential integrity. Genocide is perpetrated by individuals who dehumanize the victims or define them as essentially different and/or contaminated. Essentialism is what makes original works of art more valuable than exact duplicates (Hood, 2008). It also drives the belief systems that sustain homeopathy.


It is interesting that this intuitive process plays such an important and fundamental role in our development and sustains both powerfully positive and hugely negative influences on us as adults.  When you get right down to the essence of this concept, you must accept that these inclinations have their roots in the same thinking that makes a preschool child believe that a Mommy can’t be a firefighter (Gelman, 2004).




Dawkins, R. 2009. The Greatest Show on Earth: The Evidence for Evolution. New York: Free Press.


Gelman, S. A. 2004. ‘Psychological Essentialism in Children’, Trends in Cognitive Sciences, 8, 404–409.


Hood, B. 2008. Supersense: Why We Believe in the Unbelievable. New York: HarperCollins Publishers.


Jylkkä, J., Railo, H., & Haukioja, J. 2008. ‘Psychological Essentialism and Semantic Externalism: Evidence for Externalism in Lay Speakers’ Language Use’. Philosophical Psychology.


The Intuitive Cling

5 March 2010

The more I talk with others about the erroneous inclinations of the intuitive brain, the more I face responses that are incredulous, emotional, and sometimes irrational. When it comes to intuition, it seems, people are quite fond of theirs. Rational thought, I am often reminded, elicits annoyance. Ramp up the annoyance when you remind people of the biases that underlie their silly beliefs. Rational and scientific thinking in the public domain is no way to win friends either. And it seems that its use is not necessarily an effective way to win an argument. In a recent conversation with a colleague about the magical power of full moons, I said something silly like “the data doesn’t support a relationship between the phase of the moon and problem behaviors in classrooms.” The response was “I don’t believe in data.” How do you respond to that? How do you respond to the rejection of reality?


I don’t have anything against intuitive thinking – well, that may not be completely true, as it clearly is prone to error. It is, however, the source of creativity. My wife suggests that intuition is part of the essence of being a woman: that women are socialized to value it as if it were foundational. Rejecting it is like rejecting a core piece of oneself.


I can’t imagine a world devoid of intuition. I’m not sure I want to. On the other hand, the costs of it are ever present and often very destructive. When I strive to find the balance, I struggle. Perhaps you can help me find that balance, or perhaps bolster the value of this sticky propensity. Please tell me what you think.


Rules of Thought

12 February 2010

We are innately intuitive thinkers inclined toward making all sorts of cognitive errors as we muddle through our lives. The consequences in many cases are benign enough; however, I dare say that many an interpersonal conflict stems from such thinking. But in some circumstances the consequences of this type of thinking can be huge. For example, when these biases are carried out by those who, from a position of power (or vulnerability), deny anthropogenic climate change, we all suffer. Other deleterious errors play out in political debates over such issues as health care reform and the privatization of Social Security, as well as in the struggles between creationists and science-minded folk over whether to teach intelligent design as part of the science curriculum.


It really doesn’t matter on which side of the issue you stand – we are all subject to errors and biases that ultimately widen the gap between the antagonists rather than bring them closer to resolution. There is little debate about the relative impact of these biases and errors as they play out in the conversations about such complicated and contentious issues. All you have to do is listen to the soundbites and spin – regardless of the side you are on, it is plainly evident that the opposing pundits and/or “experts” come from completely different realities. Sometimes it is evident that there can be no resolution because of the lack of a foundational agreement as to the terms or rules of the discussion.


My quest for some rules of thought to serve as an inoculation, of sorts, against these pervasive and seemingly instinctual erroneous inclinations has proven difficult. Instincts, it seems, are hard to resist. Virtually all of the errors I have discussed have their origins in the intuitive brain, away from the higher-order thinking areas of the cerebral cortex. Millions of years of evolution have honed these processes, conferring a survival advantage on those who attend closely to things that go bump in the night. In the arms race for survival faced by our ancestors, quick decisions were absolutely essential. Arduous skepticism was likely lethal, if not by means of predation then certainly by means of ostracism. It takes an additional cognitive step – involving higher-order thinking – to bypass these inclinations. And as Spinoza suggested, we as a species are not inclined to take this additional step. Skepticism is difficult and perhaps even viscerally unpalatable. We must make the extra effort to employ critical thinking – particularly when the stakes are high!


It is crucially important to note that the following guidelines will only be fruitful if both sides agree to them. If not, the parties will go round and round – never really accomplishing anything.


First, we have to acknowledge the following:

A. Our default thoughts are likely intuitive thoughts and they are thus likely biased by cognitive errors. Gut-level thinking just doesn’t cut it for complex issues.

B. Things that make immediate intuitive sense are likely to be uncritically accepted. Agreeable data should not escape scrutiny.

C. Jumping to conclusions about the internal attributes of others (individuals or groups) as an explanation of behavior or circumstances is likely short-sighted. We should always seek a greater understanding of the true circumstances.


As such, we must:
1. Give equal time and scrutiny to the pursuit of disconfirming information, particularly regarding agreeable facts, because we are inclined toward finding data that support our preconceptions.

2. No matter how much you like your hypothesis – you must always be willing to abandon it.

3. Use substantive, observable, measurable data, always being wary of expectancy and placebo effects. For evaluation of treatment efficacy, double-blind, randomized, placebo-controlled studies are the gold standard. And one study is rarely conclusive – multiple confirming replications are necessary.

4. Universal application of these rules is absolutely essential. It is imprudent to apply these guidelines only as they serve your purpose(s).

5. In order to use scientific methods to investigate any concept, the concept itself must be falsifiable.

6. Be parsimonious. The simplest among equally plausible explanations is usually the best explanation.
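The replication point in rule 3 lends itself to a quick numerical illustration. The following is a minimal sketch (all numbers hypothetical, using a conventional 5%-style significance threshold) of why a single "significant" study proves little: even when there is no true effect at all, roughly one study in twenty will look significant by chance.

```python
import random
import statistics

random.seed(42)

def fake_study(n=30):
    """Simulate one two-group study with NO true effect.

    Both groups are drawn from the same distribution, so any
    'significant' difference between them is a false positive.
    """
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]  # same population
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(control) / n + statistics.variance(treated) / n) ** 0.5
    return abs(diff / se) > 1.96  # roughly p < 0.05

trials = 1000
false_positives = sum(fake_study() for _ in range(trials))
print(f"{false_positives} of {trials} null studies looked 'significant'")
# We expect roughly 50 of 1000 null studies to cross the threshold by
# chance alone, which is why multiple confirming replications matter.
```

Run it with different seeds and the false-positive count hovers around five percent: a lone positive result is exactly what we would see some of the time even if the treatment did nothing.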


Some issues cannot be rationally discussed, particularly due to guidelines 2, 4, and 5. Issues that necessitate violation of these tenets are often ideologically driven and thus preclude rational or scientific debate. Some really big issues, such as the existence of God or the merits of creationism, most often cannot be reasonably debated following these guidelines – again, because it is unlikely that both parties will agree to them. A big sticking point is that God’s existence, in particular, is not falsifiable; it is therefore not the domain of science to either prove or disprove it. But other big issues, such as anthropogenic climate change or the merits of health care reform, can and should be subjected to these guidelines.


In a recent article at dbskeptic.com titled “Five Habits of the Skeptical Mind,” Nicholas Covington wisely detailed his suggestions for good skeptical hygiene. He included: (1) your belief will not change reality; (2) look for the best overall explanation of the facts; (3) use authorities carefully; (4) don’t confuse a possibility with a probability; and (5) dissect your thoughts. R. C. Moore, in a comment on Covington’s article, added some additional strong points, including: (1) objective evidence results when all observers who follow the same protocol achieve the same results, regardless of their personal beliefs; (2) statistical error never improves with the repetition of independent samples; (3) uncalibrated experimentation is useless; and (4) while logic is very useful for modeling the behaviour of the universe, in no way does it control its behaviour. Both of these lists are helpful and wise (although I have not done them justice here). Carl Sagan’s Baloney Detection Kit is another great list.


I ask you, my readers, to add to this list. What are your rules of thought?


My previous posts addressed several common cognitive biases while briefly touching on their consequences. In review, the Fundamental Attribution Error leads us to make hasty and often erroneous conclusions about others’ personal attributes based on our superficial observations. Generally such conclusions are erroneous because we lack a sufficient understanding of the situational or external circumstances associated with the behavior in question. One particularly counterproductive manifestation of this tendency is the prejudice many individuals have regarding the plight of the poor. The commonly held misbelief is that the poor are so because they are lazy or stupid or otherwise worthy of their circumstance. Further, the Self-Serving Bias is manifested in the degree to which the more fortunate overvalue internal attributions for their own social and economic position. The reality is that our socioeconomic status has more to do with heritage than with personal attributes such as hard work and discipline.


Confirmation Bias, like Spinoza’s Conjecture, facilitates the internalization of information that fits our beliefs and leads us to miss, ignore, or dismiss information that challenges deeply held beliefs. We are thus likely to dismiss pertinent and valid information that might move us away from those beliefs. And, perhaps most importantly, these tendencies disincline us from taking the additional steps necessary to critically scrutinize intuitively logical information. Thus we filter and screen information in a way that sustains our preconceptions – rarely truly opening our minds to alternative notions.


These biases are evident throughout society but are plain to see in those who hold strong attitudes about issues such as religion and politics.  The overarching implications are that we tend to cherry pick and integrate information in order to stay in our comfortable belief paradigms. For example, some Conservatives are reassured by watching Fox News because the information aired is presorted based on the core political ideology of political conservatism. Its viewers are presented with information that avoids the unpleasantness of having to legitimately deal with divergent perspectives. Similarly, creationists ignore or negate the overwhelming evidence that substantiates the theory of evolution.


It is interesting to me that the positions held by divergent individuals, liberals or conservatives and skeptics or believers are often quite emotionally based and staunchly guarded.  And rarely are “facts” universally regarded as such.  We are even more likely to cling to these attitudes and values and thus be more prone to such errors in times of distress or threat.  It takes careful rational discipline on both sides to constructively debate these issues.


The tendency to firmly hold onto one’s beliefs, be they religious, political, or intellectual, even in the face of compellingly disconfirming evidence, is referred to as “cognitive conservatism” (Herrnstein Smith, 2010). Between groups or individuals with divergent “belief” systems, the entrenched rarely concede points, and even less frequently do they change perspectives. The polar opposites jab and attack, looking for the weakest point in the argument of their nemesis. These generally fruitless exchanges include ad hominem attacks and the copious use of logical fallacies.


This is clearly evident today in debates between Republicans and Democrats as they battle over public policy. The case is the same between skeptics and believers as they pointlessly battle over the existence of God (as if existence were a provable or disprovable fact). And it is interesting that some individuals and groups selectively employ skepticism only when it serves their particular interests. This is especially evident in those who make desperate attempts to discredit the evidence for evolution while demanding that different standards be employed with regard to the question of God’s existence.


Because it seems that we as humans are hard-wired with a default for intuitive thinking we are particularly susceptible to magical, supernatural, and superstitious thinking. Compound that default with a tendency to make the above discussed cognitive errors and it is no wonder that we have pervasive and intractable political partisanship and deadly religious conflicts. Further ramifications include the widespread use of homeopathic and “alternative” medicine, the anti-vaccine movement, racism, sexism, classism, and as mentioned previously, ideologically driven denial of both evolution and anthropogenic global climate change.


It is fascinating to me that how people think and at what level they think (intuitive versus rational) plays out in such globally destructive ways. How do you think?


Confirmation Bias

29 January 2010

“The kids are crazy today – it must be a full moon.” This and other similar notions are widely held. For example, people working in Emergency Departments (EDs) assume that spikes in ED admissions are linked to the phase of the moon. Again, the thinking is that the full moon brings out craziness in people’s behavior. Similar links are firmly held regarding the relationship between the consumption of sugar and bad behavior in children. The belief is that when children eat sugar, it is like consuming an amphetamine – they get wild!


Such cause-and-effect notions are easily dismissed when you look closely at the laws of physics or the biological plausibility of the effects of sugar on behavior. Further, if you actually compare the numbers of ED admissions or behavior problems in schools against the phases of the moon or sugar consumption, there are no relationships. PERIOD! End of story! Yet these beliefs are firmly held despite the evidence, which is not necessarily widely available. Why is it that we hold onto such notions?


The answer is Confirmation Bias.  We are inclined to take in, and accept as true, information that supports our belief systems and miss, ignore, or discount information that runs contrary to our beliefs.  For example, a full moon provides a significant visual reference to which memories can be linked.  And because there is a widely held mythical belief that full moons affect behavior, we also remember those confirmations more clearly.  We are less likely to remember similarly bad days that lack such a strikingly visual reference point and that do not support our beliefs.  As a result, we are less likely to use that data to challenge the myth.
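The selective-memory mechanism described above can even be sketched numerically. In this hypothetical Python simulation (all rates and counts invented for illustration), bad days occur at the same 20% rate whether or not there is a full moon, yet an observer who remembers only the full-moon coincidences ends up with a pile of "confirmations" and no recollection of the disconfirming days:

```python
import random

random.seed(0)

DAYS = 365 * 10           # ten years of days (hypothetical)
FULL_MOON_EVERY = 29      # a full moon roughly every 29 days

remembered_hits = 0       # bad day AND full moon: memorable, recalled
forgotten_evidence = 0    # bad day, ordinary night: unremarkable, forgotten

for day in range(DAYS):
    full_moon = day % FULL_MOON_EVERY == 0
    bad_day = random.random() < 0.2   # 20% of days are "crazy", moon or not
    if bad_day and full_moon:
        remembered_hits += 1
    elif bad_day:
        forgotten_evidence += 1

print(f"Bad days on full moons (remembered): {remembered_hits}")
print(f"Bad days on ordinary days (forgotten): {forgotten_evidence}")
# The bad-day rate is identical on both kinds of day, but if only the
# coincidences are recalled, the moon looks like the cause.
```

The disconfirming days vastly outnumber the coincidences, but they carry no striking visual anchor, so the myth survives contact with its own counterevidence.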


This bias is not limited to full moons and sugar.  It transcends rational thought and is pervasive throughout the human race.  It shapes our religious and political beliefs, our parenting choices, our teaching strategies, and our romantic and social relationships.  It also plays a significant role in the development of stereotypes and the maintenance of prejudices.  These beliefs, good or bad, when challenged, tend to elicit emotional responses (this is a topic all its own).  Much has been written about these phenomena, pertaining to issues related to how and why this occurs.  There are other factors as well that play a role in this erroneous thought process (e.g., communal reinforcement, folklore, the media, attribution error, expectancy effect, and Spinoza’s Conjecture); however, my goal is to raise your awareness of this bias, because knowing that we are prone to it may help us avoid drawing mistaken conclusions. Bottom line – it may help us open and widen our minds to different ideas and maybe even challenge some long held mistaken beliefs.


Spinoza’s Conjecture

22 January 2010

Last week I discussed the Fundamental Attribution Error, leaving Confirmation Bias and Spinoza’s Conjecture to explore. Today I’m going to delve into the latter. Benedict Spinoza, a 17th-century Dutch philosopher, wrote with great insight that “mere comprehension of a statement entails the tacit acceptance of its being true, whereas disbelief requires a subsequent process of rejection.” What this suggests is that we are likely to accept as true a statement that makes immediate sense to us. But we can also infer from this statement that we are, in general, unlikely to critically scrutinize such logical statements. A further implication is that we are likely to reject statements that don’t make immediate sense to us.


Sam Harris, a noted neuroscientist and author, and several colleagues at the University of California recently published the results of a study noting that we tend to process statements we accept as true very quickly, while we process false or uncertain statements more slowly. And what is even more interesting is that we process ambiguous or uncertain statements in regions of the brain (specifically, the left inferior frontal gyrus, anterior insula, and dorsal anterior cingulate) that are associated with processing pain and disgust. Hmmm – critical thinking hurts! This is just one example of the ample evidence suggesting that our brains work this way.


We all look at the world through our personal lenses of experience. Our experiences shape our understanding of the world, and ultimately our understanding of the world then filters what we take in. The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs. Subsequently, we may not grow or expand our understanding of the world, and we may become intellectually or professionally stagnant.


It is important to remember this tendency when we are taking in novel information. New ideas that run contrary to long-held beliefs are hard to embrace, regardless of their merit. And we are disinclined to question the legitimacy of new information, particularly if it fits our preconceptions. Challenging and/or ambiguous information, like quantum mechanics, may in some people elicit feelings similar to pain or even disgust. Perhaps this also explains that uneasy feeling many people experience when they think about such mind-blowing concepts as our size and importance relative to the vastness of time and space. The slowed, arduous, and perhaps even painful process of thinking about such ambiguous or incongruous information may certainly discourage the endeavor. Perhaps the cliché “no pain, no gain” reasonably applies.


Cognitive Biases

8 January 2010

Did you know that you are likely to accept as true those pieces of information that make immediate sense to you? In a similar vein, did you know that you are more likely to take in information that supports your beliefs and to reject or ignore information that runs counter to them? Lastly, did you know that you are likely to use entirely different criteria to evaluate someone else’s behavior than you use to evaluate your own?


These three tendencies are pervasive cognitive biases. They are so universal that it seems they are hard-wired into our brains. I want to spend some time exploring these biases because they commonly lead to mistakes, or at least to the maintenance and/or promulgation of misinformation. Over the next several weeks I will delve into these biases, one at a time, and hopefully help you avoid the erroneous trappings of your own neurology.


The first bias is known as Spinoza’s Conjecture. The 17th-century Dutch philosopher Benedict Spinoza wrote that “mere comprehension of a statement entails the tacit acceptance of its being true, whereas disbelief requires a subsequent process of rejection.” Sam Harris, a noted neuroscientist, has written that most people have difficulty tolerating vagueness. On the other hand, he has stated that “belief comes quickly and naturally.” The end result is that “skepticism is slow and unnatural.”


The second bias, known as Confirmation Bias, refers to a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs (Skeptic’s Dictionary). In other words, we hear what we want to hear.


The third bias is the Fundamental Attribution Error. This bias refers to our tendency to overestimate the influence of the internal or personal attributes of an individual and to underestimate the external or situational factors when explaining the behaviors of others. This is particularly true when we don’t know the other person very well. So other people mess up because they are stupid or lazy, while we make mistakes because we are tired, stressed, or have been shortchanged in some way.


As we will explore later, there are personal, organizational, and societal costs associated with each of these biases.  This is particularly true if we are unaware of these tendencies.  I’ll discuss this more next time.