There is a learning curve to the application of skepticism. Raw, unchecked challenges to others' beliefs, in a social context, are not well tolerated. People tend to find such challenges rather off-putting. In fact, as I have certainly encountered, they elicit defensiveness and sometimes hurt feelings. People often own their ideas and beliefs in a way that is essentially linked to their identity. As Carl Sagan wrote in 'The Demon-Haunted World': "All of us cherish our beliefs. They are, to a degree, self-defining. When someone comes along who challenges our belief system as insufficiently well-based — or who, like Socrates, merely asks embarrassing questions that we haven't thought of, or demonstrates that we've swept key underlying assumptions under the rug — it becomes much more than a search for knowledge. It feels like a personal assault."

 

These assaults repel people and in effect insulate them from the rational inquiry you may wish to pursue. People are inclined to respond to uninvited or poorly crafted skepticism much as they would respond to contemptuous arrogance.

 

Throughout most of human history, the social consequences of skeptical inquiry were likely quite costly. This was most certainly true in the pre-agrarian stages of our evolution. It is believed that throughout early human evolution individual survival was linked to social cohesion. Although this is not as true today, in prehistory skepticism likely hindered, rather than promoted, survival. With this in mind, it makes sense that we as a species are inclined toward unquestioning belief rather than skepticism. This inclination also makes us vulnerable to mysticism and superstition. Natural selection, it seems, has selected for gullibility.

 

Sensitive, judicious, and sparing use of skepticism, in social contexts, is prudent. This is true unless you just don't care how others feel about you, how they feel about interacting with you, or even how they feel about themselves. There is a time and place for everything. Choosing those times carefully, and selecting one's words even more cautiously, is more likely to get good results.

 

I admire great thinkers like Bruno, Copernicus, and Galileo, who faced more than mere social consequences for putting forward their theories. Bruno, in fact, paid with his life. Darwin too faced significant costs. However, their rejection of accepted explanations (stemming from skeptical inquiry) moved us forward. We owe much to these men for their courage and steadfast dedication to the truth. We move forward when we step away from blind acceptance; but let's not turn a blind eye to the social consequences of our own personal skepticism.


How one chooses to live one's life is complicated by the uncertainties of tomorrow. Often there is an internal tug of war between the interests du jour and those that will be realized tomorrow. Due to the wonders of compound interest, it is wise to save as much as you can – as early as you can. However, another powerful reality is that there may be no tomorrow – or that tomorrow may manifest itself in unimaginable ways.
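To make the compounding arithmetic concrete, here is a minimal sketch in Python. The numbers are hypothetical – a $5,000 yearly contribution and a 7% average annual return – chosen only to show how much an early start matters.

```python
# Rough illustration of compound growth; the contribution, rate, and time
# horizons below are hypothetical, not a recommendation.
def future_value(annual_contribution, annual_rate, years):
    """Balance after contributing at the start of each year and compounding annually."""
    total = 0.0
    for _ in range(years):
        total = (total + annual_contribution) * (1 + annual_rate)
    return total

early = future_value(5000, 0.07, 40)  # start 40 years before retirement
late = future_value(5000, 0.07, 20)   # wait 20 years, then save for 20

print(f"Start early: ${early:,.0f}")  # roughly $1.07 million
print(f"Start late:  ${late:,.0f}")   # roughly $219,000
```

Half the years of saving yields roughly a fifth of the nest egg – which is exactly the tug of war described above.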

 

I am surrounded by reminders that saving your better days for tomorrow is unwise. Over the last decade, I have witnessed numerous loved ones and colleagues ravaged by disease. Most of them died, but those who survived are essentially incapacitated. They live on, but are unable to experience life as they would prefer. Of those who are no longer with us, some were quite young and some were reaching, or had just reached, retirement age. Most lived their lives well, some did not; regardless, their peril raised the value of their time, and they certainly had much left to live for.

 

Then there are the statistical realities of the threats that my loved ones and I face. These threats include cancer and car accidents, and even the more improbable, but not impossible, threats associated with catastrophic volcanism and asteroid strikes. The latter two events may seem ridiculous to consider, but the fact of the matter is that both are likely within the geologically near term. Some facts to contemplate:

 

Volcanoes – In a Discovery Channel piece on the supervolcano at Yellowstone, it was indicated that "A modern full-force Yellowstone eruption could kill millions, directly and indirectly, and would make every volcano in recorded human history look minor by comparison. Fortunately, 'super-eruptions' from supervolcanoes have occurred on a geologic time scale so vast that a study by the Geological Society of London declared an eruption on the magnitude of Yellowstone's biggest (the Huckleberry Ridge eruption 2.1 million years ago) occurs somewhere on the planet only about once every million years." It was also reported: "But at this hot spot's current position under Yellowstone there have been three massive eruptions: 2.1 million, 1.3 million and 640,000 years ago. While those eruptions have been spaced roughly 800,000 and 660,000 years apart, the three events are not enough statistically to declare this an eruption pattern…" The risk is low but the threat is very real.

 

Asteroids – Although small (relatively harmless) bodies frequently enter the Earth's atmosphere, it is estimated that asteroids 1 km (0.62 mi) in diameter hit our planet on average once every 500,000 years. Larger asteroids (5 km or 3 mi) strike Earth approximately once every ten million years. Even more rare are the large-body impacts (10 km or 6.2 mi). The last known major impact was the dinosaur-killing K-T extinction event 65 million years ago. Although it is unlikely that an Earth-shattering asteroid will end or drastically alter my life, were it to happen, life as we know it would end. And we are past due. According to NASA: "Statistically, the greatest danger is from an NEO [Near Earth Object] with about 1 million megatons energy (roughly 2 km in diameter). On average, one of these collides with the Earth once or twice per million years, producing a global catastrophe that would kill a substantial (but unknown) fraction of the Earth's human population. Reduced to personal terms, this means that you have about one chance in 40,000 of dying as a result of a collision."
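That last "one chance in 40,000" figure can be roughly reconstructed with back-of-the-envelope arithmetic. The sketch below is my own, with assumed inputs (a 75-year lifespan and a one-quarter kill fraction) that NASA does not specify; it simply shows that the published number is in the right ballpark.

```python
# Back-of-the-envelope personal risk from a ~2 km NEO impact.
# All inputs are rough assumptions for illustration only.
impacts_per_million_years = 1.5  # "once or twice per million years"
lifespan_years = 75              # assumed average lifespan
fraction_killed = 0.25           # assumed fraction of humanity killed (unknown)

p_impact_in_lifetime = impacts_per_million_years * lifespan_years / 1_000_000
p_personal_death = p_impact_in_lifetime * fraction_killed

print(f"Chance of such an impact in a lifetime: {p_impact_in_lifetime:.6f}")
print(f"Personal odds of dying this way: about 1 in {1 / p_personal_death:,.0f}")
```

With these assumptions the result is about 1 in 36,000 – close enough to NASA's 1 in 40,000 to show where a figure like that comes from.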

 

I am careful not to "blow" these threats out of proportion, but they have figured into my thinking. Taking all this into consideration, I find it prudent to plan for tomorrow (by saving for retirement), but I find it equally important to live for today. Thus tomorrow, my wife and I jet off to Europe for a two-week exploration of Paris, Venice, Florence and Rome. This is something that my wife has dreamed of her entire life. We are relatively young and able-bodied and can afford it (kind of); putting it off any longer seems unwise. Next Friday we will be in Venice, but I think I'll wait to make my next post until the weekend, when I'm in Florence, where I'll post a picture of Galileo's middle finger. 😉


Intuitive Thought

2 April 2010

What is Intuitive Thought?

 

I have devoted numerous posts to a general category of cognitive errors and biases that are broadly lumped together as errors of the intuitive mind. Lay notions of intuition often refer to gut instincts, which are generally considered emotional and irrational responses. It is in this context that intuition is vilified. Such impulsive reactions are countered with teachings typified by adages such as: "Look before you leap;" "Don't judge a book by its cover;" "Haste makes waste;" and "The hurrier you go the behinder you get." Although this narrow understanding of intuition is in part correct, it largely misses the mark regarding this very complicated and sophisticated neural system. Intuition is largely misunderstood and, frankly, has never been well understood to begin with. Herein I hope to offer a cursory explanation of intuition and broadly differentiate it from rational thought. The vast majority of the following content is drawn from Malcolm Gladwell's intriguing 2005 book, 'Blink: The Power of Thinking Without Thinking.' Gladwell draws together a vast array of research from cognitive and social psychology and a number of other sciences in an attempt to elucidate this ambiguous concept.

 

Rational thought serves as a good starting place because it offers a point of comparison that helps bring intuition into slightly better focus. Reason is the hallmark of rational thought. It involves an active application of the cerebral cortex, whereby personal history, knowledge, and active cognitions are employed in a conscious manner to solve problems. The key words here are active and conscious. When we engage in reasoning we are generally aware of the cognitive effort directed toward this process. Another relevant aspect of this process is the passage of time. Reason-based thought is not generally instantaneous. Although solutions may seem to pop into awareness out of the blue, generally some measure of time passes as we strive for enlightenment. Think of an occasion when you had word-finding difficulties. You probably actively thought about the word, the context of the word, and so on. If you failed to recall the word, you may have cognitively moved on to something else, only to have the word come to you later. The former was rational thought; the latter, the result of intuitive thought.

 

Intuition differs from rational thought with regard to those key variables. First, this instantaneous process is seemingly unconscious. Second, it is automatic (or at least seemingly so), consuming no apparent effort or time. The popular and scientific literature is replete with descriptive names for this seemingly mystical capacity. Gladwell uses a full complement of these terms and sprinkles them throughout his text. Terms that emanate from the sciences include the adaptive unconscious, unconscious reasoning, rapid cognition, and thin slicing. Other descriptive terms include snap judgments, fast and frugal thinking, and, most eloquently, the "mind behind the locked door." Regardless of what we call it, intuition is constantly at work, drawing instantaneous conclusions outside of our awareness.

 

Because of the nature of this process, Gladwell notes that people are often ignorant of the secret decisions that affect their behavior, yet they do not feel ignorant. We often behave in ways driven by the adaptive unconscious and later try to justify those behaviors, invoking the rational brain to do so. This fact is what calls into question the reality of free will. Intriguing, isn't it? It is as though there is a covert super-powerful, super-fast computer running in tandem with our overt reasoning computer; outside our awareness, this covert computer remains ever vigilant, soaking in the world through our senses and actively directing our behavior.

 

Although the adaptive unconscious lies outside our direct control, life experiences, practice, and our intellectual pursuits contribute to the data set that is used when snap judgments are made. The more informed, erudite, and experienced one is, the more accurate one's rapid cognitions become. Just think about driving. When learning to drive there are an overwhelming number of things to think about – so many, in fact, that mistakes are likely due to "analysis paralysis." Too much to compute! Through practice and repetition, all those things we previously had to actively think about become more automatic. We don't think about the countless micro-adjustments we make on the steering wheel as we drive down the highway. Novice drivers must think about these adjustments, along with attending to their speed (generally with gross applications of the accelerator and brakes) and myriad other factors that seasoned drivers do not overtly contemplate. The novice's driving is chunky; experienced drivers, with the benefit of many miles in the driver's seat, are generally smoother and more refined in their driving.

 

Experts in their given fields become more intuitive or automatic with regard to their area of expertise over time as a result of exposure, learning, and practice.   Their thoughts become seemingly automatic, their judgments and reactions more spontaneous – all of this in many situations without the expert even having to actively think.  In these cases (where there is sufficient expertise) snap judgments can be even more accurate than the arduous process of working through problems rationally.   On the other hand, this intuitive process can lead to problems because it is remarkably susceptible to prejudices and errors.  This is particularly true, as you might surmise, in areas where the individual lacks experience or knowledge.

 

Under certain circumstances the adaptive unconscious serves our purposes very well. In addition to those situations where one's expertise applies, we tend to use snap judgments effectively in social situations, in complicated situations, or in life-or-death situations that necessitate quick decisions. This is where evolution has played a role in shaping this capacity, with the effect of contributing to the survival of our species. He who can make effective snap judgments in life-or-death situations is more likely to pass on this very capacity. And tens of thousands of years of such natural selection have refined it.

 

The catch is that there are erroneous thought processes that are artifacts, residuals, or direct consequences of the adaptive unconscious. Issues such as essentialism, pareidolia, and superstition fall into this category, as they have been ushered along with the survival advantage that the adaptive unconscious has conferred. Cognitive errors and biases hamper the effectiveness of the adaptive unconscious because of its inclination toward implicit associations and other accidental, error-imposing tendencies. Implicit associations are automatic and non-deliberate pairings we make between concepts, people, things, etc. (e.g., African Americans are athletic, blonds are scatterbrained, gay men are effeminate) as they are folded into memory. This is an intriguing concept, one deserving its own post, but you have to take the Implicit Associations Test, particularly the race test, to get a true sense of this powerful bias. Confirmation bias, self-serving bias, and numerous other cognitive biases are likewise linked to this influential super-computer. However, just because we cannot directly and purposefully access this incredible system does not mean we have to bow entirely to its influence. In fact, we can proactively prime this system through active learning. And we can be aware of this powerful system and the advantages and disadvantages it confers. We can learn of the errors it inclines us toward and monitor ourselves when it comes to our biases and prejudices. We can impose certain rules of thought when it comes to important issues. I believe that we all should take these very important steps, both to make our intuitive brain more accurate and to buffer its influences in those situations where it is likely to lead us astray.

 

References:

 

Gladwell, M. 2005. 'Blink: The Power of Thinking Without Thinking.' New York: Little, Brown and Company.


Historically, morality has not been considered a topic of discussion within the domain of science. Instead, this issue has almost exclusively been within the purview of religion. Increasingly, however, concepts such as moral instinct have gained legitimacy as discussed by scientists such as Steven Pinker and Jonathan Haidt, who argue that there are neurological factors associated with morality and that natural selection has played a fundamental role in shaping universal instinctual moral truths. The evidence for this position is compelling. The question remains: "Can science offer moral guidance?" In other words, should science play a role in helping us discern what is right or wrong? Or does science have to relinquish issues of morality to other social systems based solely on historical precedent?

 

First, a definition of morality has to be agreed upon. Dictionary.com defines morality as "conformity to the rules of right conduct; moral or virtuous conduct." The Stanford Encyclopedia of Philosophy definition of morality reads as follows: "descriptively to refer to a code of conduct put forward by a society or some other group, such as a religion, or accepted by an individual for her own behavior; or normatively to refer to a code of conduct that, given specified conditions, would be put forward by all rational persons." These definitions are devoid of the de facto notion that this concept is values-based. Sam Harris argues, and I believe most people would agree, that human values pertain to circumstances that have the positive effect of enhancing the well-being of conscious beings. As such, it does not seem like a reach to suggest that science can play a role in setting the parameters of morality.

 

Quite simply, it can be suggested that there are certain conditions under which humans are more likely to prosper and other conditions under which they are more likely to falter. For instance, it is known that children raised in a loving environment, where life's basic needs are provided for, are more likely to grow into happy and productive adults than those raised in hostile and deprived environments. We may intuitively know this, but it is science that provides the evidence for such claims. The profession of psychology devotes considerable resources to this pursuit. As a psychologist myself, I employ evidence-based practices as I endeavor to facilitate the betterment of my clients' lives. Why is it, then, that we dismiss the influences of science when we discuss morals? At a recent TED Conference Sam Harris posed this very question.

 

I suggest, as did Harris, that science is very capable of pointing us, as a society, in the right direction when it comes to morals and values. Russell Blackford wrote in his post on Harris' speech that "…science can give us information about what individual conduct, moral systems, laws, and so on are likely to lead to such plausible goals for ….. individual and collective human flourishing, social survival, and reduction of suffering. Any source of information about what will lead to goals such as these has some moral authority."

 

Harris argues that it boils down to understanding the conditions that lead to human flourishing – and accepting that these conditions are fundamental facts that should serve as the basis of universal morals. He further contends that there are distinctly problematic values within our current human systems that run counter to human flourishing. For example, he discusses the costs of the extremist cultural expectation that Muslim women wear burkas (and the brutal costs of non-compliance). He contrasts this with the unrealistically perfect portrayal of the female body in modern Western cultures. Neither of these circumstances promotes healthy thriving for young women.

 

He also argues that religions should not be given a pass on the values they promote just because of their religious status. The natural deference given to religion in our "pluralistic" society in fact promotes many clearly harmful practices (including the prohibition of birth control, the denial of civil liberties for homosexual couples, the sanctioned murder of rape victims to preserve the honor of the family, male circumcision and, in some cultures, clitoral circumcision, and the application of prayer in lieu of modern medical services, particularly for ill children). Values rendered in distant Bronze Age cultures and sustained by ideology are far from being in touch with those values that are likely to promote healthy human development today.

 

Individuals suffer, indeed society as a whole suffers, when these or similar prohibitions and/or expectations thrive. Science, it seems to me, is far more capable of really looking at the human and societal costs of such "values." Harris suggests that "Morality is certainly a domain where knowledge and expertise applies." We need to "bring into our dialogue the issues of these truths of right and wrong." If we accept that values are drawn from quality-of-life issues pertaining to the greater good of all, and that there are certain truths about life experiences that either enhance or impinge upon the well-being of conscious beings, then isn't it the domain of science to draw out these truths?

 

References:

 

Blackford, Russell. 2010. Sam Harris on Science and Morality. Metamagician and the Hellfire Club. http://metamagician3000.blogspot.com/2010/03/sam-harris-on-science-and-morality.html

 

Harris, Sam. 2010. Science can answer moral questions. TED Conference. http://www.ted.com/talks/sam_harris_science_can_show_what_s_right.html


Nature is harsh. This reality is evident, with potential discomfort, to those who care to open their eyes to what goes on around us. Most living creatures struggle to survive, facing either limited resources or predation on a continual basis. In most developed nations many humans escape this reality, but not too long ago even we had to struggle to survive.

 

I remember the reality of this struggle burning into my memory cells as a child while watching nature shows like The Underwater Odyssey of Commander Cousteau and Wild Kingdom. I vividly recall the horror and intrigue I experienced watching cheetahs and lions chasing down and killing antelope or gazelles. To this day I experience a visceral response when I witness this predation carried to its conclusion, with the blood-soaked carnivore licking its chops. Harsh indeed!

 

The moral implications of nature's harshness have stirred our intellect for quite some time. It certainly weighed heavily on Darwin as he developed his theory of evolution by means of natural selection. A pressing question in natural theology asked how a benevolent and loving God could create such a system of pervasive suffering. Stephen Jay Gould, in perhaps his most famous essay, titled 'Nonmoral Nature,' addressed this very issue.

 

Gould (1982) provides a historical review of this controversy dating back to the mid-nineteenth century. One particular scholar from that era, William Buckland, gained comfort from the notion that predation is moral because carnivores increase "the aggregate of animal enjoyment" and "diminish that of pain" because:

Death after all, is swift and relatively painless, victims are spared the ravages of decrepitude and senility, and populations do not outrun their food supply to the greater sorrow of all.”

Buckland concluded that predation on a grand scale is moral. But to some, the real challenge to the morality of nature lies outside run-of-the-mill predation. The reproductive cycle of the ichneumon fly epitomizes this challenge.

 

The ichneumon fly is actually a wasp belonging to the superfamily Ichneumonoidea. This diverse group of insects lays its eggs on or in other insects, setting into motion a synchronized chain of events that defies any sense of morality. The endoparasitic ichneumon wasps insert their eggs into the body of their host (e.g., caterpillars, aphids, or spiders). The larvae, upon hatching, carefully ingest their host's internal organs – first devouring the non-essential tissues and saving the vital organs for last so as to prolong the life of their meal. The ectoparasitic ichneumons sting and paralyze the host before laying eggs on the exterior of the host's body. The paralysis is permanent, but the host remains alive. Once the eggs hatch, the larvae penetrate the host's body and again selectively devour the incapacitated but fully alive host little by little, sustaining the live, fresh meal as long as possible.

 

This process is, to say the least, horrifying to contemplate. We humans do not cope well with the notion of parasites on or in our bodies. Think of the circus that ensues when a child comes home from school with head lice. Think of the horror and shame associated with pubic lice. How about scabies or tapeworms? People don't even like to hear that approximately 10% of our body mass is that of our essential parasitic partners (bacteria). One does not have to use much imagination to shudder at the notion of being slowly devoured from within. 'Alien' – need I say more?

 

The ichneumon reproductive contrivance became the supreme challenge to the morality of the designer. Gould wrote of the 19th-century theologians who attempted to resolve this dilemma by anthropomorphizing the mother's love for her progeny and by downplaying the implications of the plight of the host. They also suggested that this arrangement may be adaptive for humans, as the predation has the effect of minimizing crop loss due to the ravenous appetites of living caterpillars. Finally, they argued that animals are not moral agents and thus must feel little, if any, pain. They suggested that lower life forms and even "primitive" people suffer less than advanced and cultured folk. It was also believed during this Victorian era that consciousness was only within the realm of man. Needless to say, these arguments fail to resolve the dilemma if one contends that there is a "lurking goodness behind everything." Darwin wrote in an 1856 note to Joseph Hooker:

What a book a devil’s chaplain might write on the clumsy, wasteful, blundering, low, and horribly cruel works of nature!”

Gould wrote that in the face of this conundrum intellectuals had two options:

  1. Retain the notion “that nature holds moral messages” and that morality involves knowing the ways of nature and doing the opposite. Be not a savage – be not an animal.
  2. Accept that nature is nonmoral, that it is what it is, that morality plays no role in the struggle for existence.

Darwin himself leaned toward the second option, although he struggled with letting go of the notion that the laws of nature might denote some higher purpose. In his essay, Gould (1982) suggested that:

Since ichneumons are a detail, and since natural selection is a law regulating details, the answer to the ancient dilemma of why such cruelty (in our terms) exists in nature can only be that there isn't any answer – and that framing the question "in our terms" is thoroughly inappropriate in a natural world neither made for us nor ruled by us. It just plain happens."

It is a strategy that works for ichneumons and that natural selection has programmed into their behavioral repertoire. Caterpillars are not suffering to teach us something; they have simply been outmaneuvered, for now, in the evolutionary game.”

 

I, too, am inclined toward the notion that nature, as it plays out evolution's dance, is entirely devoid of anything pertaining to morality or evil. We anthropomorphize when we apply these concepts. Even to suggest that nature is cruel is anthropomorphizing. Any true and deep look at the struggle for life that constantly dances in our midst can scarcely lead to any other conclusion but that nature is brutal, harsh, and nonmoral. Should I be wrong about this, I would be reluctant to meet its designer.

 

Reference:

 

Gould, S. J. 1982. 'Nonmoral Nature.' Natural History, 91, 19-26.


Essentialism

12 March 2010

Essentialism, within the purview of psychology, is a cognitive bias whose roots form in early childhood (Gelman, 2004). The concept pertains to the notion that every discernible object harbors an underlying reality that, although intangible, gives that object its true identity – its essence (Dawkins, 2009; Hood, 2008). To put it another way:

people believe that natural category members share some hidden, unobservable, empirically discoverable deep structure or essence, whose possession is necessary and sufficient for category membership" (Jylkkä, Railo, & Haukioja, 2008).

In our early childhood, as we were developing language, essentialism played a crucial role in the expansion of our vocabulary, in the generalization of our knowledge, in discriminating among objects, and in our ability to construct causal explanations (Gelman, 2004). In our struggle to understand the vast and complicated world, our brain forced us to partition things into categories, so we chopped and divided what we surveyed into distinct groupings based on defining characteristics, driven by our internalized understanding of the essence of those groupings. This was initially a very simplistic process (dog, cat, cow), then more complex (mammal, reptile, insect), and then even more sophisticated for those who progressed in the biological sciences (kingdom, phylum, class, order, family, genus, species). This is necessarily a dynamic process, because as we mature and take in increasing complexity we need increased specificity when parsing the world into discrete categories.

 

This pattern of thinking/learning transcends all cultures and is central to our language development (Hood, 2008). Given this central role, it forms the foundation of our thought processes (Hood, 2008; Dawkins, 2009). The overgeneralization of this process is what gets us into difficulty. Bruce Hood, author of Supersense (2008), convincingly argues that this innate tendency forms the core of our superstitious and supernatural thinking. Richard Dawkins (2009), an evolutionary biologist, suggests that such an inclination explains why people have such great difficulty grasping and accepting the concept of evolution by means of natural selection. I suggest that, like evolution (which necessitates quintessentially anti-essentialist thinking), the concepts of plate tectonics, deep geological time, and deep space-time are also very hard to grasp for the same reasons. We are inclined to think that what we see are constants – that the world as we see it has been eternally so, and so shall it always remain.

 

In biology, essentialism sustains the notion that all animals are clear and distinct, belonging to a specific species. In fact, as Dawkins  suggests: “On the ‘population-thinking’ evolution view, every animal [living form] is linked to every other animal [living form], say rabbit to leopard, by a chain of intermediates, each so similar to the next that every link could in principle mate with its neighbors in the chain and produce fertile offspring” (2009, p. 24).  This is true for all conceivable pairings including bacteria and viruses, giant sequoias and lichen, spiders and flies, cats and dogs, birds and snakes, foxes and chickens, and even humans and turnips.

 

Plato demonstrated essentialist thinking in The Republic in his cave allegory, where he suggested that the world as we experience it is only a composite of mere shadows tethered to their true and perfect forms (essences) floating about somewhere in the heavens (Dawkins, 2009; Hood, 2008). Many people still believe that there is something more to the physical world than what we see. As Hood (2008) put it, “Humans like to think that special things are unique by virtue of something deep and irreplaceable.” This thinking, and other intuitive errors such as vitalism (that vital life energies cause things to be alive) and holism (that everything is connected by forces) are likely artifacts of our natural involuntary inclinations (Hood, 2008).

 

Essentialism is more than a heuristic, and it has ramifications beyond making us less inclined to believe in evolution or more inclined toward superstition. It is what makes rape more than a physical crime: the defilement and contamination the victim feels is a psychological violation of one's essential integrity. Genocide is perpetrated by individuals who dehumanize or define the victims as essentially different and/or contaminated. Essentialism is what makes original works of art more valuable than exact duplicates (Hood, 2008). It also drives the belief systems that sustain homeopathy.

 

It is interesting that this intuitive process plays such an important and fundamental role in our development and exerts both powerfully positive and hugely negative influences on us as adults. When you get right down to the essence of this concept, you must accept that these inclinations have their roots in the same thinking that makes a preschool child believe that a Mommy can't be a firefighter (Gelman, 2004).

 

References:

 

Dawkins, R. 2009. The Greatest Show on Earth: The Evidence for Evolution. New York: Free Press.

 

Gelman, S. A. 2004. ‘Psychological Essentialism in Children’, TRENDS in Cognitive Sciences, 8, 404-409.

 

Hood, B. 2008. Supersense: Why We Believe in the Unbelievable. New York: HarperCollins Publishers.

 

Jylkkä, J., Railo, H., & Haukioja, J. 2008. 'Psychological Essentialism and Semantic Externalism: Evidence for Externalism in Lay Speakers' Language Use.' Philosophical Psychology.


Pareidolia

26 February 2010

Have you ever seen familiar and improbable shapes in those puffy white cumulus clouds as they pass overhead? Notice the squirrel or dinosaur in the image to the right. Some of you may have seen the recent American Express commercial that portrays items positioned in such a way that we perceive them as sad or happy faces (much like the bathtub fixture below). Now notice the "Hand of God" in the NASA image below and to the right, taken by the Chandra X-ray Observatory. This picture shows energized particles streaming from a pulsar in a field of debris from a massive supernova. Many of us instinctively see in this image what looks like the wrist and hand of a person (or God, as the name of this nebula implies). Speaking of God, on the internet there are many more explicit examples of religious imagery in much more benign items such as tree trunks, clouds, pancakes, or tortillas. This tendency is not limited to the visual sense. We make the same type of errors with auditory information (as is evident in backmasking in popular music). These tendencies, which are in fact illusory, are a consequence of our neural circuitry.

 

Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such ambiguous stimuli. This tendency is called pareidolia. It is also referred to as patternicity. The tendency is so ubiquitous that a projective personality test (the Rorschach Inkblot Test) relies on and "interprets" this very inclination.*

 

It has been suggested that our ancestors, the ones who assigned agency to things that went bump in the night (perceiving vague data as a threat), responded in a way that facilitated survival. Those who ignored the stimuli were more likely to be preyed upon and thus not pass on their genes. Carl Sagan noted in his classic book, The Demon-Haunted World, that this tendency is likely linked to other aspects of individual survival. He wrote:

“As soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper. These days, nearly every infant is quick to identify a human face, and to respond with a goony grin.

 

As an inadvertent side effect, the pattern recognition machinery in our brains is so efficient in extracting a face from a clutter of other detail that we sometimes see faces where there are none. We assemble disconnected patches of light and dark and unconsciously see a face. The Man in the Moon is one result”(Sagan 1995: 45).

Michael Shermer wrote of patternicity in the December 2008 issue of Scientific American. In that article Shermer noted that scientists have historically treated patternicity as an error in cognition. More specifically, he noted that this tendency is a type I error, or a false positive. A false positive, in this context, is believing that something is real when, in fact, it is not. Shermer discussed a paper in the Proceedings of the Royal Society entitled "The Evolution of Superstitious and Superstition-like Behaviour" by biologists Kevin R. Foster (Harvard University) and Hanna Kokko (University of Helsinki). These scientists tested the hypothesis that patternicity will enhance survivability using evolutionary modeling. Shermer wrote: "They demonstrated that whenever the cost of believing a false pattern is real is less than the cost of not believing a real pattern, natural selection will favor patternicity." The implications, Shermer wrote: "…believing that the rustle in the grass is a dangerous predator when it is only the wind does not cost much, but believing that a dangerous predator is the wind may cost an animal its life."
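The logic of that cost comparison can be captured in a toy calculation. The sketch below is my own simplified illustration of the asymmetry, not Foster and Kokko's actual model, and the probabilities and costs are made-up numbers.

```python
# Toy illustration of the cost asymmetry behind patternicity.
# Compare two fixed strategies for responding to a rustle in the grass.
p_predator = 0.01             # chance the rustle really is a predator
cost_false_alarm = 1          # small cost of fleeing from mere wind
cost_missed_predator = 1000   # huge cost of ignoring a real predator

always_believe = cost_false_alarm                   # flee every time, never get eaten
never_believe = p_predator * cost_missed_predator   # occasionally get eaten

print(f"Always believe the pattern: expected cost {always_believe:.2f}")  # 1.00
print(f"Never believe the pattern:  expected cost {never_believe:.2f}")   # 10.00
```

As long as false alarms are cheap relative to a missed predator, the pattern-believing strategy wins, even though most of its "detections" are wrong.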

 

It is a double-edged sword, it seems. Not only has this tendency entertained us and likely facilitated our very survival as a species, but it may in fact serve as the basis of our individual inclinations toward superstitious thinking. Shermer wrote:

“Through a series of complex formulas that include additional stimuli (wind in the trees) and prior events (past experience with predators and wind), the authors conclude that “the inability of individuals—human or otherwise—to assign causal probabilities to all sets of events that occur around them will often force them to lump causal associations with non-causal ones. From here, the evolutionary rationale for superstition is clear: natural selection will favour strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.”

Yet again this is an example of how our intuitive brain can lead us astray!

 

* The Rorschach inkblot test, along with most projective measures in the field of psychology, has fallen out of favor due to poor reliability and validity.


Moral Instinct

19 February 2010

Two years ago Steven Pinker wrote an intriguing piece in the New York Times entitled The Moral Instinct. Dr. Pinker is a Harvard College Professor and Johnstone Family Professor in the Department of Psychology at Harvard University who conducts research on language and cognition. This article in many ways stirred me and led to a paradigm shift in my thinking about morality. I am a cognitive behavioral psychologist, and my training regarding moral development looked at morality as a rationally driven developmental process (Piaget & Kohlberg). In other words, it was believed that morality developed as one's cognitive capacity to think advanced. The article also helped me get more comfortable with letting go of the notion that religion is the sole driver of morality in society.

 

Pinker’s article is a long one and I cannot do it justice here, but I want to share some of his major arguments.

 

Morality is a complex concept shaped by evolution, neurobiology, and culture. Pinker states that “Moral goodness is what gives each of us the sense that we are worthy human beings. We seek it in our friends and mates, nurture it in our children, advance it in our politics and justify it with our religions. A disrespect for morality is blamed for everyday sins and history’s worst atrocities. To carry this weight, the concept of morality would have to be bigger than any of us and outside all of us.” Looking at morality from a scientific perspective causes concern in those who hold the view that it is sacred and the unique domain of religion. Regardless, Pinker urges us to step back and look at it in a systematic way. Much research has been conducted on the concept and he touches on the most important findings that have shaped the modern understanding of this topic.

 

Moral judgment, it seems, is a "switch" on a continuum of valuations we make about others' or our own behavior. We may judge a behavior as imprudent, unfashionable, disagreeable, or perhaps immoral. The switching point on that continuum, where judgments are made that deem a behavior immoral, is in some cases universal (e.g., rape and murder); however, the line is not so clear for other acts. For example, there are individuals today who may flip the switch of immoral judgment when looking at someone eating meat (e.g., an ethical vegetarian), using paper towels, shopping at Walmart, or even smoking. The zeitgeist (the accepted standard of conduct and morality) certainly does shift over time. Pinker notes: "…. many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality. Many afflictions have been reassigned from payback for bad choices to unlucky misfortunes." And he adds: "This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it. Dozens of things that past generations treated as practical matters are now ethical battlegrounds, including disposable diapers, I.Q. tests, poultry farms, Barbie dolls….. Food alone has become a minefield, with critics sermonizing about the size of sodas, the chemistry of fat, the freedom of chickens, the price of coffee beans, the species of fish and now the distance the food has traveled from farm to plate."

 

The roots of these moralizations are not rational, he argues. When people are pressed for the reasons why they find a particular behavior morally repugnant, they struggle. Pinker discusses Jonathan Haidt's research, which suggests that people do not engage in moral reasoning; rather, they engage in moral rationalization. According to Pinker, Haidt contends that "they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification." Again, when pressed to justify their judgment of certain behaviors as immoral, "many people admit, "I don't know, I can't explain it, I just know it's wrong."

 

So, morality may not be a cognitive developmental progression. Well, alright then, but where does it come from? Research is building toward substantiating that there are genetic underpinnings – suggesting that it may very well be instinctual. Pinker contends, "According to Noam Chomsky, we are born with a "universal grammar" that forces us to analyze speech in terms of its grammatical structure, with no conscious awareness of the rules in play. By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness." If this is the case, then a moral sense should be universal, and in fact there appear to be five universal morals that transcend all cultures. Again reflecting Haidt's research, Pinker lists "… harm, fairness, community (or group loyalty), authority and purity — and suggests that they are the primary colors of our moral sense. Not only do they keep reappearing in cross-cultural surveys, but each one tugs on the moral intuitions of people in our own culture."

 

If we accept that morals are in fact universal and instinctual, then how do we come to terms with the blatant discrepancies seen across cultures? Pinker contends that culture itself is the culprit. How the five spheres are ranked in terms of importance, within and across cultures, accounts for these differences. Pinker notes:

Many of the flabbergasting practices in faraway places become more intelligible when you recognize that the same moralizing impulse that Western elites channel toward violations of harm and fairness (our moral obsessions) is channeled elsewhere to violations in the other spheres. Think of the Japanese fear of nonconformity (community), the holy ablutions and dietary restrictions of Hindus and Orthodox Jews (purity), the outrage at insulting the Prophet among Muslims (authority). In the West, we believe that in business and government, fairness should trump community and try to root out nepotism and cronyism. In other parts of the world this is incomprehensible — what heartless creep would favor a perfect stranger over his own brother?

 

The cultural divide that exists today in the United States makes sense when we look at it from this perspective. Pinker writes:

“The ranking and placement of moral spheres also divides the cultures of liberals and conservatives in the United States. Many bones of contention, like homosexuality, atheism and one-parent families from the right, or racial imbalances, sweatshops and executive pay from the left, reflect different weightings of the spheres. In a large Web survey, Haidt found that liberals put a lopsided moral weight on harm and fairness while playing down group loyalty, authority and purity. Conservatives instead place a moderately high weight on all five. It’s not surprising that each side thinks it is driven by lofty ethical values and that the other side is base and unprincipled.”

 

 

When you compound these moralistically different vantage points with other common errors of thought (e.g., confirmation bias, fundamental attribution error), and a lack of rules of engagement, it is no wonder that our (US) political system is so paralyzed.

 

Pinker delves into the neurological factors associated with morality and the evolutionary evidence and arguments for an instinctual morality. He reviews several important studies that provide evidence for these hypotheses. But he argues that morality is more than an inheritance – it is larger than that. It is contextually driven. He notes: "At the very least, the science tells us that even when our adversaries' agenda is most baffling, they may not be amoral psychopaths but in the throes of a moral mind-set that appears to them to be every bit as mandatory and universal as ours does to us. Of course, some adversaries really are psychopaths, and others are so poisoned by a punitive moralization that they are beyond the pale of reason." He further contends: "But in any conflict in which a meeting of the minds is not completely hopeless, a recognition that the other guy is acting from moral rather than venal reasons can be a first patch of common ground. One side can acknowledge the other's concern for community or stability or fairness or dignity, even while arguing that some other value should trump it in that instance."

 

Pinker closes with:

Our habit of moralizing problems, merging them with intuitions of purity and contamination, and resting content when we feel the right feelings, can get in the way of doing the right thing. Far from debunking morality, then, the science of the moral sense can advance it, by allowing us to see through the illusions that evolution and culture have saddled us with and to focus on goals we can share and defend.

 

Again, this comes down to getting away from intuitive thinking when it comes to important and complex issues. This not-so-simple, but very doable, step continues to stymie the best among us.


Rules of Thought

12 February 2010

We are innately intuitive thinkers, inclined toward making all sorts of cognitive errors as we muddle through our lives. In many cases the consequences are benign enough, though I dare say that many an interpersonal conflict stems from such thinking. In some circumstances, however, the consequences can be huge. For example, when these biases are indulged by those who, from a position of power (or vulnerability), deny anthropogenic climate change, we all suffer. Other deleterious errors play out in political debates over such issues as health care reform and the privatization of Social Security, as well as in the struggles between creationists and science-minded folk over whether to teach intelligent design as part of the science curriculum.

 

It really doesn’t matter on which side of the issue you stand – we are all subject to errors and biases that ultimately widen the gap between the antagonists rather than bring them closer to resolution. There is little debate about the relative impact of these biases and errors as they play out in the conversations about such complicated and contentious issues. All you have to do is listen to the soundbites and spin – regardless of the side you are on, it is plainly evident that the opposing pundits and/or “experts” come from completely different realities. Sometimes it is evident that there can be no resolution because of the lack of a foundational agreement as to the terms or rules of the discussion.

 

My quest for some rules of thought to serve as an inoculation, of sorts, against these pervasive and seemingly instinctual erroneous inclinations has proven difficult. Instincts, it seems, are hard to resist. Virtually all of the errors I have discussed have their origins in the intuitive brain, away from the higher-order thinking areas of the cerebral cortex. Millions of years of evolution have honed these processes, conferring a survival advantage on those who attend closely to things that go bump in the night. In the arms race for survival faced by our ancestors, quick decisions were absolutely essential. Arduous skepticism was likely lethal, if not by means of predation then certainly by means of ostracism. It takes an additional cognitive step – involving higher-order thinking – to bypass these inclinations. And as Spinoza suggested, we as a species are not inclined to take this additional step. Skepticism is difficult and perhaps even viscerally unpalatable. We must make the extra effort to employ critical thinking – particularly when the stakes are high!

 

It is crucially important to note that the following guidelines will only be fruitful if both sides agree to them. If not, the parties will go round and round – never really accomplishing anything.

 

First, we have to acknowledge the following:

A. Our default thoughts are likely intuitive thoughts and they are thus likely biased by cognitive errors. Gut-level thinking just doesn’t cut it for complex issues.

B. Things that make immediate intuitive sense are likely to be uncritically accepted. Agreeable data should not escape scrutiny.

C. Jumping to conclusions about the internal attributes of others (individuals or groups) as an explanation of behavior or circumstances is likely short-sighted. We should always seek a greater understanding of the true circumstances.

 

As such, we must:
1. Give equal time and scrutiny to the pursuit of disconfirming information, particularly regarding agreeable facts, because we are inclined toward finding data to support our preconceptions.

2. No matter how much you like your hypothesis, always be willing to abandon it.

3. Use substantive, observable, measurable data, always being wary of expectancy and placebo effects. For evaluating treatment efficacy, double-blind, randomized, placebo-controlled studies are the gold standard. And one study is rarely conclusive; multiple confirming replications are necessary (see the sketch after this list).

4. Universal application of these rules is absolutely essential. It is imprudent to apply these guidelines only as they serve your purpose(s).

5. In order to use scientific methods to investigate any concept, the concept itself must be falsifiable.

6. Be parsimonious. The simplest among equally plausible explanations is usually the best explanation.
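To illustrate guideline 3, here is a minimal sketch of what a randomized, placebo-controlled comparison with replication might look like. The effect size, noise level, and sample size are arbitrary numbers chosen for the illustration, not a claim about any real treatment.

```python
# Minimal sketch of a randomized, placebo-controlled comparison, replicated
# several times. All numbers below are arbitrary illustration values.
import random
import statistics

def run_trial(n_per_group=50, true_effect=0.3, seed=None):
    """Simulate one trial; return the observed treatment-minus-placebo difference."""
    rng = random.Random(seed)
    placebo = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
    treatment = [rng.gauss(true_effect, 1.0) for _ in range(n_per_group)]
    return statistics.mean(treatment) - statistics.mean(placebo)

# A single trial can mislead in either direction; the pattern across
# replications is what should (or should not) convince us.
results = [run_trial(seed=s) for s in range(10)]
for i, diff in enumerate(results, 1):
    print(f"Trial {i:2d}: observed difference = {diff:+.2f}")
print(f"Mean across replications: {statistics.mean(results):+.2f}")
```

Any one of the ten simulated trials can overshoot, undershoot, or even reverse the true effect; only the accumulation of replications approaches it.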

 

Some issues cannot be rationally discussed, particularly due to guidelines 2, 4, and 5. Issues that necessitate violation of these tenets are often ideologically driven and thus preclude rational or scientific debate. Some really big issues, such as the existence of God or the merit of creationism, most often cannot be reasonably debated following these guidelines, again because it is unlikely that both parties will agree to them. A big sticking point is that God's existence, in particular, is not falsifiable. It is therefore not the domain of science to either prove or disprove God's existence. But other big issues, such as anthropogenic global climate change or the merits of health care reform, can, and should be, subjected to these guidelines.

 

In a recent article at dbskeptic.com titled Five Habits of the Skeptical Mind, Nicholas Covington wisely detailed his suggestions for good skeptical hygiene. He included: (1) Your belief will not change reality; (2) Look for the best overall explanation of the facts; (3) Use authorities carefully; (4) Don't confuse a possibility with a probability; and (5) Dissect your thoughts. R. C. Moore, in a comment on Covington's article, added some additional strong points, including: (1) objective evidence results when all observers who follow the same protocol achieve the same results, regardless of their personal beliefs; (2) statistical error never improves with the repetition of independent samples; (3) uncalibrated experimentation is useless; and (4) while logic is very useful for modeling the behaviour of the universe, in no way does it control its behaviour. Both of these lists are helpful and wise (although I have not done them justice here). Carl Sagan's Baloney Detection Kit is another great list.

 

I ask you, my readers, to add to this list. What are your rules of thought?


My previous posts addressed several common cognitive biases while briefly touching on their consequences. In review, the Fundamental Attribution Error leads us to draw hasty and often erroneous conclusions about others' personal attributes based on our superficial observations. Generally such conclusions are erroneous because we lack a sufficient understanding of the situational or external circumstances associated with the behavior in question. One particularly counterproductive manifestation of this tendency is the prejudice many individuals hold regarding the plight of the poor. The commonly held misbelief is that the poor are so because they are lazy or stupid or otherwise worthy of their circumstance. Further, the Self-Serving Bias is manifested in the degree to which the more fortunate overvalue internal attributions for their own social and economic position. The reality is that our socioeconomic status has more to do with heritage than with personal attributes such as hard work and discipline.

 

Confirmation Bias, like Spinoza's Conjecture, facilitates the internalization of information that fits our beliefs and leads us to miss, ignore, or dismiss information that challenges deeply held beliefs. We are thus likely to dismiss pertinent and valid information that might move us away from those beliefs. And, perhaps most importantly, these tendencies disincline us from taking the additional steps necessary to critically scrutinize intuitively logical information. Thus we filter and screen information in a way that sustains our preconceptions – rarely truly opening our minds to alternative notions.

 

These biases are evident throughout society but are plain to see in those who hold strong attitudes about issues such as religion and politics. The overarching implication is that we tend to cherry-pick and integrate information in order to stay within our comfortable belief paradigms. For example, some conservatives are reassured by watching Fox News because the information aired is presorted to fit the core ideology of political conservatism. Its viewers are presented with information that avoids the unpleasantness of having to legitimately deal with divergent perspectives. Similarly, creationists ignore or negate the overwhelming evidence that substantiates the theory of evolution.

 

It is interesting to me that the positions held by divergent individuals – liberals or conservatives, skeptics or believers – are often quite emotionally based and staunchly guarded. And rarely are "facts" universally regarded as such. We are even more likely to cling to these attitudes and values, and thus be more prone to such errors, in times of distress or threat. It takes careful rational discipline on both sides to constructively debate these issues.

 

The tendency to firmly hold onto one's beliefs, be they religious, political, or intellectual, even in the face of compellingly disconfirming evidence, is referred to as "cognitive conservatism" (Herrnstein Smith, 2010). Between groups or individuals with divergent "belief" systems, the entrenched rarely concede points, and even less frequently do they change perspectives. The polar opposites jab and attack, looking for the weakest point in the argument of their nemesis. These generally fruitless exchanges include ad hominem attacks and the copious use of logical fallacies.

 

This is clearly evident today in debates between Republicans and Democrats as they battle over public policy. The case is the same between skeptics and believers as they pointlessly battle over the existence of God (as if existence were a provable or disprovable fact). And it is interesting that some individuals and groups selectively employ skepticism only when it serves their particular interests. This is especially evident in those who make desperate attempts to discredit the evidence for evolution while demanding that different standards be employed with regard to the question of God's existence.

 

Because we as humans seem to be hard-wired with a default for intuitive thinking, we are particularly susceptible to magical, supernatural, and superstitious thinking. Compound that default with a tendency to make the cognitive errors discussed above and it is no wonder that we have pervasive and intractable political partisanship and deadly religious conflicts. Further ramifications include the widespread use of homeopathic and "alternative" medicine, the anti-vaccine movement, racism, sexism, classism, and, as mentioned previously, ideologically driven denial of both evolution and anthropogenic global climate change.

 

It is fascinating to me that how people think and at what level they think (intuitive versus rational) plays out in such globally destructive ways. How do you think?
