Pareidolia

26 February 2010

Have you ever seen familiar and improbable shapes in puffy white cumulus clouds as they pass overhead? Notice the squirrel or dinosaur in the image to the right. Some of you may have seen the recent American Express commercial that portrays items positioned in such a way that we perceive them as sad or happy faces (much like the bathtub fixture below). Now notice the “Hand of God” in the NASA image below and to the right, taken by the Chandra X-ray Observatory. This picture shows energized particles streaming from a pulsar in a field of debris from a massive supernova. Many of us instinctively see in this image what looks like the wrist and hand of a person (or of God, as the name of this nebula implies). Speaking of God, the internet offers many more explicit examples of religious imagery perceived in mundane items such as tree trunks, clouds, pancakes, and tortillas. This tendency is not limited to the visual sense. We make the same type of errors with auditory information (as is evident in backmasking in popular music). These tendencies, which are in fact illusory, are a consequence of our neural circuitry.

 

Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such ambiguous stimuli. This tendency is called pareidolia; it is also referred to as patternicity. It is so ubiquitous that a projective personality test (the Rorschach Inkblot Test) relies on and “interprets” this inclination.*

 

It has been suggested that those of our ancestors who assigned agency to things that went bump in the night (perceiving vague data as a threat) responded in a way that facilitated survival. Those who ignored the stimuli were more likely to be predated and thus not pass on their genes. Carl Sagan noted in his classic book, The Demon-Haunted World, that this tendency is likely linked to other aspects of individual survival. He wrote:

“As soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper. These days, nearly every infant is quick to identify a human face, and to respond with a goony grin.

 

“As an inadvertent side effect, the pattern recognition machinery in our brains is so efficient in extracting a face from a clutter of other detail that we sometimes see faces where there are none. We assemble disconnected patches of light and dark and unconsciously see a face. The Man in the Moon is one result” (Sagan 1995: 45).

Michael Shermer wrote of patternicity in the December 2008 issue of Scientific American. In that article Shermer noted that scientists have historically treated patternicity as an error in cognition; more specifically, this tendency is a type I error, or a false positive. A false positive in this context is believing that something is real when, in fact, it is not. Shermer discussed a paper in the Proceedings of the Royal Society entitled “The Evolution of Superstitious and Superstition-like Behaviour” by biologists Kevin R. Foster (Harvard University) and Hanna Kokko (University of Helsinki). These scientists used evolutionary modeling to test the hypothesis that patternicity enhances survivability. Shermer wrote, “They demonstrated that whenever the cost of believing a false pattern is real is less than the cost of not believing a real pattern, natural selection will favor patternicity.” As for the implications, Shermer wrote: “…believing that the rustle in the grass is a dangerous predator when it is only the wind does not cost much, but believing that a dangerous predator is the wind may cost an animal its life.”
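Foster and Kokko's condition can be illustrated with a toy calculation. Here is a minimal sketch, assuming a simple expected-cost comparison (the function name and the probability and cost figures are mine, purely for illustration, not from their model):

```python
def should_believe(p_real, cost_false_alarm, cost_miss):
    """Toy decision rule: believe the pattern (e.g., 'that rustle is a
    predator') whenever the expected cost of disbelieving a possibly real
    pattern exceeds the fixed cost of a false alarm."""
    expected_cost_of_disbelief = p_real * cost_miss
    return expected_cost_of_disbelief > cost_false_alarm

# A predator in the grass is unlikely (1% chance), but missing one is
# catastrophic, so belief pays even at long odds.
print(should_believe(p_real=0.01, cost_false_alarm=1, cost_miss=1000))  # True

# When a miss is only mildly costly, the same long odds favor skepticism.
print(should_believe(p_real=0.01, cost_false_alarm=1, cost_miss=50))    # False
```

The asymmetry is the whole point: because cost_miss can be orders of magnitude larger than cost_false_alarm, selection tolerates a great many false positives, and patternicity comes along for the ride.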

 

It is a double-edged sword, it seems. This tendency has entertained us and likely facilitated our very survival as a species, but it may also serve as the basis of our individual inclinations toward superstitious thinking. Shermer wrote:

“Through a series of complex formulas that include additional stimuli (wind in the trees) and prior events (past experience with predators and wind), the authors conclude that ‘the inability of individuals—human or otherwise—to assign causal probabilities to all sets of events that occur around them will often force them to lump causal associations with non-causal ones. From here, the evolutionary rationale for superstition is clear: natural selection will favour strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.’”

Yet again this is an example of how our intuitive brain can lead us astray!

 

* The Rorschach inkblot test, along with most projective measures in the field of psychology, has fallen out of favor due to poor reliability and validity.


Moral Instinct

19 February 2010

Two years ago Steven Pinker wrote an intriguing piece in the New York Times entitled The Moral Instinct. Dr. Pinker is a Harvard College Professor and Johnstone Family Professor in the Department of Psychology at Harvard University who conducts research on language and cognition. This article in many ways stirred me and led to a paradigm shift in my thinking about morality. I am a cognitive behavioral psychologist, and my training regarding moral development treated morality as a rationally driven developmental process (Piaget & Kohlberg). In other words, it was believed that morality developed as one’s cognitive capacity to think advanced. The article also helped me get more comfortable with letting go of the notion that religion is the sole driver of morality in society.

 

Pinker’s article is a long one and I cannot do it justice here, but I want to share some of his major arguments.

 

Morality is a complex concept shaped by evolution, neurobiology, and culture. Pinker states that “Moral goodness is what gives each of us the sense that we are worthy human beings. We seek it in our friends and mates, nurture it in our children, advance it in our politics and justify it with our religions. A disrespect for morality is blamed for everyday sins and history’s worst atrocities. To carry this weight, the concept of morality would have to be bigger than any of us and outside all of us.” Looking at morality from a scientific perspective causes concern in those who hold the view that it is sacred and the unique domain of religion. Regardless, Pinker urges us to step back and look at it in a systematic way. Much research has been conducted on the concept and he touches on the most important findings that have shaped the modern understanding of this topic.

 

Moral judgment, it seems, is a “switch” on a continuum of valuations we make about others’ or our own behavior. We may judge a behavior as imprudent, unfashionable, disagreeable, or perhaps immoral. The switching point on that continuum, where judgments are made that deem a behavior immoral, is in some cases universal (e.g., rape and murder); however, the line is not so clear for other acts. For example, there are individuals today who may flip the switch of immoral judgment when looking at someone eating meat (e.g., an ethical vegetarian), using paper towels, shopping at Walmart, or even smoking. The zeitgeist (the accepted standard of conduct and morality) certainly does shift over time. Pinker notes, “…many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality. Many afflictions have been reassigned from payback for bad choices to unlucky misfortunes.” And he adds, “This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it. Dozens of things that past generations treated as practical matters are now ethical battlegrounds, including disposable diapers, I.Q. tests, poultry farms, Barbie dolls….. Food alone has become a minefield, with critics sermonizing about the size of sodas, the chemistry of fat, the freedom of chickens, the price of coffee beans, the species of fish and now the distance the food has traveled from farm to plate.”

 

The roots of these moralizations are not rational, he argues. When people are pressed for the reasons why they find a particular behavior morally repugnant, they struggle. Pinker discusses Jonathan Haidt’s research suggesting that people do not engage in moral reasoning; rather, they engage in moral rationalization. According to Pinker, Haidt contends that “they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.” Again, when pressed to justify their judgment of certain behaviors as immoral, “many people admit, ‘I don’t know, I can’t explain it, I just know it’s wrong.’”

 

So, morality may not be a cognitive developmental progression. Well, alright then, but where does it come from? Research is building toward substantiating a genetic basis, suggesting that it may very well be instinctual. Pinker contends, “According to Noam Chomsky, we are born with a ‘universal grammar’ that forces us to analyze speech in terms of its grammatical structure, with no conscious awareness of the rules in play. By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness.” If this is the case, then a moral sense should be universal, and in fact there appear to be five universal moral spheres that transcend all cultures. Again reflecting Haidt’s research, Pinker lists “… harm, fairness, community (or group loyalty), authority and purity — and suggests that they are the primary colors of our moral sense. Not only do they keep reappearing in cross-cultural surveys, but each one tugs on the moral intuitions of people in our own culture.”

 

If we accept that morals are in fact universal and instinctual, then how do we come to terms with the blatant discrepancies seen across cultures? Pinker contends that culture itself is the culprit. How the five spheres are ranked in terms of importance, in and across cultures, accounts for these differences. Pinker notes:

Many of the flabbergasting practices in faraway places become more intelligible when you recognize that the same moralizing impulse that Western elites channel toward violations of harm and fairness (our moral obsessions) is channeled elsewhere to violations in the other spheres. Think of the Japanese fear of nonconformity (community), the holy ablutions and dietary restrictions of Hindus and Orthodox Jews (purity), the outrage at insulting the Prophet among Muslims (authority). In the West, we believe that in business and government, fairness should trump community and try to root out nepotism and cronyism. In other parts of the world this is incomprehensible — what heartless creep would favor a perfect stranger over his own brother?

 

The cultural divide that exists today in the United States makes sense when we look at it from this perspective. Pinker writes:

“The ranking and placement of moral spheres also divides the cultures of liberals and conservatives in the United States. Many bones of contention, like homosexuality, atheism and one-parent families from the right, or racial imbalances, sweatshops and executive pay from the left, reflect different weightings of the spheres. In a large Web survey, Haidt found that liberals put a lopsided moral weight on harm and fairness while playing down group loyalty, authority and purity. Conservatives instead place a moderately high weight on all five. It’s not surprising that each side thinks it is driven by lofty ethical values and that the other side is base and unprincipled.”

 

When you compound these moralistically different vantage points with other common errors of thought (e.g., confirmation bias, fundamental attribution error), and a lack of rules of engagement, it is no wonder that our (US) political system is so paralyzed.

 

Pinker delves into the neurological factors associated with morality and the evolutionary evidence and arguments for an instinctual morality. He reviews several important studies that provide evidence for these hypotheses. But he argues that morality is more than an inheritance – it is larger than that. It is contextually driven. He notes: “At the very least, the science tells us that even when our adversaries’ agenda is most baffling, they may not be amoral psychopaths but in the throes of a moral mind-set that appears to them to be every bit as mandatory and universal as ours does to us. Of course, some adversaries really are psychopaths, and others are so poisoned by a punitive moralization that they are beyond the pale of reason.” He further contends, “But in any conflict in which a meeting of the minds is not completely hopeless, a recognition that the other guy is acting from moral rather than venal reasons can be a first patch of common ground. One side can acknowledge the other’s concern for community or stability or fairness or dignity, even while arguing that some other value should trump it in that instance.”

 

Pinker closes with:

Our habit of moralizing problems, merging them with intuitions of purity and contamination, and resting content when we feel the right feelings, can get in the way of doing the right thing. Far from debunking morality, then, the science of the moral sense can advance it, by allowing us to see through the illusions that evolution and culture have saddled us with and to focus on goals we can share and defend.

 

Again, this comes down to getting away from intuitive thinking when it comes to important and complex issues. This not-so-simple but very doable step continues to stymie the best among us.


Rules of Thought

12 February 2010

We are innately intuitive thinkers, inclined toward making all sorts of cognitive errors as we muddle through our lives. The consequences in many cases are benign enough; however, I dare say that many an interpersonal conflict stems from such thinking. And in some circumstances the consequences can be huge. For example, when these biases are acted on by those who, from a position of power (or vulnerability), deny anthropogenic climate change, we all suffer. Other deleterious errors play out in political debates over such issues as health care reform and the privatization of Social Security, as well as in the struggles between creationists and science-minded folk over whether to teach intelligent design as part of the science curriculum.

 

It really doesn’t matter on which side of the issue you stand – we are all subject to errors and biases that ultimately widen the gap between the antagonists rather than bring them closer to resolution. There is little debate about the relative impact of these biases and errors as they play out in the conversations about such complicated and contentious issues. All you have to do is listen to the soundbites and spin – regardless of the side you are on, it is plainly evident that the opposing pundits and/or “experts” come from completely different realities. Sometimes it is evident that there can be no resolution because of the lack of a foundational agreement as to the terms or rules of the discussion.

 

My quest for some rules of thought to serve as an inoculation, of sorts, against these pervasive and seemingly instinctual erroneous inclinations has proven difficult. Instincts, it seems, are hard to resist. Virtually all of the errors I have discussed have their origins in the intuitive brain, away from the higher-order thinking areas of the cerebral cortex. Millions of years of evolution have honed these processes, conferring a survival advantage on those who attend closely to things that go bump in the night. In the arms race for survival faced by our ancestors, quick decisions were absolutely essential. Arduous skepticism was likely lethal, if not by predation then certainly by ostracism. It takes an additional cognitive step, involving higher-order thinking, to bypass these inclinations. And as Spinoza suggested, we as a species are not inclined to take this additional step. Skepticism is difficult and perhaps even viscerally unpalatable. We must make the extra effort to employ critical thinking – particularly when the stakes are high!

 

It is crucially important to note that the following guidelines will only be fruitful if both sides agree to them. If not, the parties will go round and round – never really accomplishing anything.

 

First, we have to acknowledge the following:

A. Our default thoughts are likely intuitive thoughts and they are thus likely biased by cognitive errors. Gut-level thinking just doesn’t cut it for complex issues.

B. Things that make immediate intuitive sense are likely to be uncritically accepted. Agreeable data should not escape scrutiny.

C. Jumping to conclusions about the internal attributes of others (individuals or groups) as an explanation of behavior or circumstances is likely shortsighted. We should always seek a greater understanding of the true circumstances.

 

As such, we must:
1. Give equal time and scrutiny to the pursuit of disconfirming information, particularly regarding agreeable facts, because we are inclined toward finding data to support preconceptions.

2. No matter how much you like your hypothesis, always be willing to abandon it.

3. Use substantive, observable, measurable data, always being wary of expectancy and placebo effects. For evaluating treatment efficacy, double-blind, randomized, placebo-controlled studies are the gold standard. And one study is rarely conclusive; multiple confirming replications are necessary.

4. Universal application of these rules is absolutely essential. It is imprudent to apply these guidelines only as they serve your purpose(s).

5. In order to use scientific methods to investigate any concept, the concept itself must be falsifiable.

6. Be parsimonious. The simplest among equally plausible explanations is usually the best explanation.

 

Some issues cannot be rationally discussed, particularly because of guidelines 2, 4, and 5. Issues that necessitate violation of these tenets are often ideologically driven and thus preclude rational or scientific debate. Some really big issues, such as the existence of God or the merits of creationism, most often cannot be reasonably debated following these guidelines, again because it is unlikely that both parties will agree to them. A big sticking point is that God’s existence, in particular, is not falsifiable; it is therefore not the domain of science either to prove or to disprove God’s existence. But other big issues, such as anthropogenic global climate change or the merits of health care reform, can and should be subjected to these guidelines.

 

In a recent article at dbskeptic.com titled Five Habits of the Skeptical Mind, Nicholas Covington wisely detailed his suggestions for good skeptical hygiene. He included: (1) your belief will not change reality; (2) look for the best overall explanation of the facts; (3) use authorities carefully; (4) don’t confuse a possibility with a probability; and (5) dissect your thoughts. In a comment on Covington’s article, R. C. Moore added some additional strong points, including: (1) objective evidence results when all observers who follow the same protocol achieve the same results, regardless of their personal beliefs; (2) statistical error never improves with the repetition of independent samples; (3) uncalibrated experimentation is useless; and (4) while logic is very useful for modeling the behaviour of the universe, in no way does it control its behaviour. Both of these lists are helpful and wise (although I have not done them justice here). Carl Sagan’s Baloney Detection Kit is another great list.

 

I ask you, my readers, to add to this list. What are your rules of thought?


My previous posts addressed several common cognitive biases while briefly touching on their consequences. In review, the Fundamental Attribution Error leads us to make hasty and often erroneous conclusions about others’ personal attributes based on superficial observations. Generally such conclusions are erroneous because we lack a sufficient understanding of the situational or external circumstances associated with the behavior in question. One particularly counterproductive manifestation of this tendency is the prejudice many individuals hold regarding the plight of the poor: the commonly held misbelief that the poor are so because they are lazy or stupid or otherwise worthy of their circumstance. Further, the Self-Serving Bias is manifested as an overvaluation of internal attributes when the more fortunate explain their own social and economic position. The reality is that our socioeconomic status has more to do with heritage than with personal attributes such as hard work and discipline.

 

Confirmation Bias, like Spinoza’s Conjecture, facilitates the internalization of information that fits our beliefs and leads us to miss, ignore, or dismiss information that challenges deeply held beliefs. We are thus likely to dismiss pertinent and valid information that might move us from those beliefs. And, perhaps most importantly, these tendencies disincline us from taking the additional steps necessary to critically scrutinize intuitively logical information. Thus we filter and screen information in a way that sustains our preconceptions, rarely truly opening our minds to alternative notions.

 

These biases are evident throughout society but are plain to see in those who hold strong attitudes about issues such as religion and politics.  The overarching implications are that we tend to cherry pick and integrate information in order to stay in our comfortable belief paradigms. For example, some Conservatives are reassured by watching Fox News because the information aired is presorted based on the core political ideology of political conservatism. Its viewers are presented with information that avoids the unpleasantness of having to legitimately deal with divergent perspectives. Similarly, creationists ignore or negate the overwhelming evidence that substantiates the theory of evolution.

 

It is interesting to me that the positions held by divergent individuals, liberals or conservatives and skeptics or believers are often quite emotionally based and staunchly guarded.  And rarely are “facts” universally regarded as such.  We are even more likely to cling to these attitudes and values and thus be more prone to such errors in times of distress or threat.  It takes careful rational discipline on both sides to constructively debate these issues.

 

The tendency to firmly hold onto one’s beliefs, be they religious, political, or intellectual, even in the face of compellingly disconfirming evidence, is referred to as “cognitive conservatism” (Herrnstein Smith, 2010).  Between groups or individuals with divergent “belief” systems, the entrenched rarely concede points and even less frequently do they change perspectives. The polar opposites jab and attack looking for the weakest point in the argument of their nemesis.  These generally fruitless exchanges include ad hominem attacks and the copious use of logical fallacies.

 

This is clearly evident today in debates between Republicans and Democrats as they battle over public policy. The case is the same between skeptics and believers as they pointlessly battle over the existence of God (as if existence were a provable or disprovable fact). And it is interesting that some individuals and groups selectively employ skepticism only when it serves their particular interests. This is especially evident in those who make desperate attempts to discredit the evidence for evolution while demanding that different standards be employed with regard to the question of God’s existence.

 

Because we as humans seem to be hard-wired with a default for intuitive thinking, we are particularly susceptible to magical, supernatural, and superstitious thinking. Compound that default with a tendency to make the cognitive errors discussed above, and it is no wonder that we have pervasive and intractable political partisanship and deadly religious conflicts. Further ramifications include the widespread use of homeopathic and “alternative” medicine, the anti-vaccine movement, racism, sexism, classism, and, as mentioned previously, ideologically driven denial of both evolution and anthropogenic global climate change.

 

It is fascinating to me that how people think and at what level they think (intuitive versus rational) plays out in such globally destructive ways. How do you think?


Spinoza’s Conjecture

22 January 2010

Last week I discussed the fundamental attribution error, leaving confirmation bias and Spinoza’s Conjecture to explore. Today I’m going to delve into the latter. Benedict Spinoza, a 17th-century Dutch philosopher, wrote with great insight that “mere comprehension of a statement entails the tacit acceptance of its being true, whereas disbelief requires a subsequent process of rejection.” What this suggests is that we are likely to accept as true a statement that makes immediate sense to us. We can also infer that we are, in general, unlikely to critically scrutinize such statements. A further implication is that we are likely to reject statements that don’t make immediate sense to us.

 

Sam Harris, a noted neuroscientist and author, and several colleagues at the University of California recently published the results of a study showing that we tend to process statements we believe very quickly, while we process false or uncertain statements more slowly. What is even more interesting is that we process ambiguous or uncertain statements in regions of the brain (specifically, the left inferior frontal gyrus, anterior insula, and dorsal anterior cingulate) that are associated with processing pain and disgust. Hmmm, critical thinking hurts! This is just one example of the considerable evidence that our brains work this way.

 

We all look at the world through personal lenses of experience. Our experiences shape our understanding of the world, and that understanding then filters what we take in. The end result is that we may reject or ignore new and important information simply because it does not conform to previously held beliefs. Subsequently, we may not grow or expand our understanding of the world, and we may become intellectually or professionally stagnant.

 

It is important to remember this tendency when we are taking in novel information. New ideas that run contrary to long-held beliefs are hard to embrace, regardless of their merit. And we are disinclined to question the legitimacy of new information, particularly if it fits our preconceptions. Challenging and/or ambiguous information, like quantum mechanics, may in some people elicit feelings similar to pain or even disgust. Perhaps this also explains the uneasy feeling many people experience when they contemplate such mind-blowing concepts as our size and importance relative to the vastness of time and space. The slowed, arduous, and perhaps even painful process of thinking about ambiguous or incongruous information may certainly discourage the endeavor. Perhaps the cliché “no pain, no gain” reasonably applies.
