The year 2011 proved to be a challenging one. A number of serious health issues in close family members took center stage. The frequency of my posts declined in part due to these important distractions, but other factors also played a major role. Although I published fewer articles, the number of visits to my blog increased substantially.
Over the course of the year, my website received 18,305 visits from 15,167 unique visitors, accounting for over 25,000 page views. I had visitors from every state in the Union and from 140 nations around the world. Visitors from the United States accounted for the vast majority of those visits, but the UK, Canada, and Australia also brought in a large contingent.
One article in particular far outpaced all other posts. My post on Brain Waves and Other Brain Measures accounted for as many visits as the next three most popular posts combined. Of my posts published in 2011, only four made it to this year’s top ten list. The other six were published in 2010. Of those six from 2010, four were also on the top ten list last year.
Great interest persisted in my post entitled Nonmoral Nature: It is what it is. This review of Stephen Jay Gould’s most famous article held the number two ranking for a second straight year. In 2010 I had also reviewed a very popular New York Times article by Steven Pinker entitled The Moral Instinct. That review moved up a notch this year, ultimately ranking number three. My critical article on the Implicit Associations Test ranked number four this year, up from number six last year. And my Hedgehog versus the Fox mindset piece ranked number ten this year, down from number seven last year.
It’s interesting to me that this list includes the very foundational issues that have driven me in my quest, and each was posted with great personal satisfaction. This encompassing cross section of my work is, in fact, a good starting point for those who are new to my blog. Several popular 2011 posts ranked outside the overall top ten but fared well relative to other posts published in 2011. These include:
One article I published late in 2011 has attracted significant attention. I believe that it is perhaps one of the most important posts I’ve written. As I was writing this retrospective, Conspicuous Consumption and the Peacock’s Tail was far outpacing all other posts.
Another very important issue I wrote a fair amount about is the pernicious effect of poverty on child development. Clicking here takes you to a page listing all of the articles on this topic. Knowing the information in this series should motivate us, as a society, to truly evaluate our current political and economic policies.
One of my favorite articles tackled my long-standing curiosity about the geology of the place I live. The article itself did not get a lot of attention, but I sure loved writing it.
This two-year journey has thus far resulted in remarkable personal and intellectual growth. It has changed the way I look at life, the world around me, and my fellow human beings. It is my sincerest hope that those who have seen fit to read some of my material have experienced shifts of perception, or at least a modicum of enlightenment.
The bottom line:
The human brain, no matter how remarkable, is flawed in two fundamental ways. First, the proclivities toward patternicity (pareidolia), hyperactive agency detection, and superstition, although once adaptive mechanisms, now lead to many errors of thought. Since the Age of Enlightenment, when humankind developed the scientific method, we have exponentially expanded our knowledge of the workings of the world and the universe. These leaps of knowledge have rendered those error-prone proclivities inessential for survival. Regardless, they remain a dominant cognitive force. Although our intuitions and rapid cognitions have sustained us, and in some ways still do, the everyday illusions they produce impede us in important ways.
Secondly, we are prone to a multitude of cognitive biases that diminish and narrow our capacity to truly understand the world. Time after time I have written of the dangers of ideology with regard to its capacity to blindfold its disciples. Often those blindfolds are absolutely essential to sustain the ideology. And this is dangerous when truths and facts are denied or innocents are subjugated or brutalized. As I discussed in Spinoza’s Conjecture:
“We all look at the world through our personal lenses of experience. Our experiences shape our understanding of the world, and ultimately our understanding of [it], then filters what we take in. The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs.”
Because of these innate tendencies, we must make additional effort in order to discover the truth.
Evolution has conferred upon us a brain capable of truly amazing things. We have, for thousands of years, been capable of creating incredibly beautiful art, telling compelling tales, and building magnificent structures. We have risen from small and dispersed tribal bands to perhaps the dominant life form on the planet. Our feats have been wondrous. We have put men on the moon, our space probes have reached the outer limits of our solar system, and we have people living and working in space. We have literally doubled the life expectancy of human beings, figured out how to feed billions of people, and eradicated some of the most dreadful diseases known to humankind. We can join together in virtual social communities from remote corners of the world, and even change nations using Facebook and Twitter. The list could go on and on. We are very capable and very smart beings.
Our mark on this planet, for the moment, is indelible. Yet, despite our great powers of intellect and creativity, we are incredibly vulnerable. I am not referring to our susceptibility to the great powers of nature as evidenced in Japan this last week. I am referring to an inherent mode of thinking that is core to our human nature.
It is pretty certain that nature-nature will destroy our species at some point in the future, be it via asteroid impact, super-volcanoes, climate change, microbial evolution, or the encroachment of the sun’s surface as it goes red giant in five billion years. Of all the species that have ever lived on this planet, over 99% have gone extinct. What’s living today will someday be gone – there really is no question about it. But the question that remains is: “Will nature-nature do us in – or will human-nature do it first?”
We have evolved over billions of years to our current Homo sapiens (“wise man”) form, and for the vast majority of that evolutionary period we have had very limited technology. Sophisticated composite tools date back only tens of thousands of years, and reading and writing only several thousand. What we do and take for granted every day has been around for a minuscule sliver of incomprehensibly vast evolutionary and geological time. These facts are relevant because our brains, for the most part, developed under selective pressures vastly different from those we live under today.
Much as our appendix and coccyx are remnants of our evolutionary past, so too are some of our core thought processes. These vestigial cognitions play out both as adaptive intuitions and as potentially destructive errors of judgment. We would like to think that, as an advanced thinking species, our ability to reason is our dominant mental force. Unfortunately, this most recent evolutionary development takes a back seat to older and more powerful brain functions that have sustained us for millions of years. I have previously written about this reason versus intuition/emotion paradigm, so I won’t go into detail here; suffice it to say, much of what we do is guided by unconscious processes outside our awareness and direct control. And again, these life-guiding processes are remnants of what it took to survive as roaming bands of hunter-gatherers.
Our brains came to their current form when we did not possess the tools and technologies that help us truly understand the world today. Early survival depended on our ability to see patterns in randomness (pareidolia or patternicity) and to make snap judgments. Rational thought, which is slow and arduous, never became dominant because it failed to provide our ancestors with the survival advantages that emotional and rapid cognitions did. As such, our brains have been programmed by evolution to make all kinds of rapid cognitions that, in modern times, are simply prone to error.
We are uncomfortable with randomness and chaos and are driven to pull together causal stories that help us make sense of the world. Our brains are correlation calculators, belief engines, and hyperactive agency detection devices – inclinations that led us to develop polytheism to explain the whims of “mother nature.” All cultures, for example, have developed creation myths to explain how we came to be. We are a superstitious lot, driven by these vestigial remnants.
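How readily does a “correlation calculator” find patterns in pure noise? A small Python sketch can illustrate (the numbers are mine, chosen for illustration, not drawn from the post): generate a few dozen unrelated streams of coin flips and count how many pairs happen to agree most of the time – the kind of coincidence a pattern-hungry mind reads as a causal link.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# 40 unrelated "event records": each is 30 fair coin flips.
n_series, n_obs = 40, 30
series = [[random.randint(0, 1) for _ in range(n_obs)]
          for _ in range(n_series)]

# Count pairs that agree on at least 70% of observations -- an
# "association" that exists only by chance, since every series
# was generated independently.
spurious = 0
for i in range(n_series):
    for j in range(i + 1, n_series):
        agreement = sum(a == b for a, b in zip(series[i], series[j])) / n_obs
        if agreement >= 0.7:
            spurious += 1

pairs = n_series * (n_series - 1) // 2  # 780 pairs examined
print(f"{spurious} of {pairs} pairs look 'linked' by chance alone")
```

With 780 pairs to inspect, a handful of strong-looking agreements almost always turn up, even though nothing connects any of the series. The point is not the exact count but that a diligent pattern-seeker is guaranteed raw material.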
It is easy to see how powerful this inclination is. Look at the prevalence of beliefs linking full moons and bad behavior. And how about bad behavior and acts of nature? Pat Robertson blamed Katrina on homosexuality and hedonism. One wonders what the Japanese supposedly did to deserve their most recent tragedy; I’ve already heard talk of the attack on Pearl Harbor as an antecedent. As if mother nature would align with the United States to punish long-past deeds against us! If mother nature cares at all about herself, I wonder what we have coming for Nagasaki and Hiroshima. Likewise, people blame vaccines for autism and credit homeopathy for their wellness. I could go on and on about our silly inclinations. We are prone to Confirmation Bias, Spinoza’s Conjecture, Attribution Error, Illusions of Attention, and the Illusions of Knowledge and Confidence. In the same vein, we are manipulated by the Illusion of Narrative, also known as the Narrative Fallacy.
Nassim Nicholas Taleb (a philosopher, author, and statistician) coined the phrase “Narrative Fallacy,” an encapsulation of this very discussion. We have a deep need to construct a narrative that makes sense of a series of connected or disconnected facts. Our correlation calculators pull together these cause-and-effect stories to help us understand the world around us, even when chance has dictated our circumstances. We fit these stories around the observable facts and sometimes bend the facts to fit the story. This is particularly true, for example, in the case of Intelligent Design.
Now that I am aware of this innate proclivity, I enjoy watching it play out in my own mind. For example, several weekends ago I went cross-country skiing with my wife, Kimberly. We were at Allegany State Park, in Western New York, where there are nearly 20 miles of incredibly beautiful and nicely groomed Nordic ski trails. Kimberly and I took a slightly different route than usual, and at a junction of two trails we serendipitously ran into a friend we hadn’t seen in quite some time. It was an incredible and highly improbable meeting; any number of different events or decisions would have forestalled it. Such events compel us to string together a narrative to make sense of sheer randomness. Was it fate, divine intervention, or just coincidence? I am certain it was the latter – but it sure was fun watching the cognitions pour forth to explain it.
I would really like to hear about your dealings with this inclination. Please post comments detailing events that have happened to you and the narratives you formulated to make sense of them. This is a great exercise in understanding our pattern detection mechanism, so have some fun with it and share your stories. At the very least, pay attention to how this tendency plays out in your life, and think about how it plays out in your belief systems (and ideological paradigms). I’m guessing it will be informative.
Halloween seems like an appropriate time to discuss superstition – what with ghosts and goblins and black cats and witches and all. But wouldn’t Easter or Christmas, or any evening a five-year-old loses a tooth, be equally appropriate? In actuality, we nurture magical thinking in our children with notions of Santa Claus, the Easter Bunny, and the Tooth Fairy. And recall, if you will, some of your favorite children’s books and the supernatural forces employed to delight your youthful whimsies. Magic, along with the thinking employed to delight in it, is seemingly a rite of childhood, and in some ways the essence of what it is to be a child.
Much as magical thinking has its roots in childhood fantasies, superstition too has its roots in our species’ youth. In that nascent time we lacked the capacity to understand the forces and whims of the natural world around us. Our ancestors struggled to survive, and living another day in part depended on their ability to make sense of the forces that aided or impinged upon them. We must not forget that our forefathers lived much like the non-domesticated animals around us today. Survival was a day to day reality dependent upon the availability of life sustaining resources like food, water and shelter, and was often threatened by predation or the forces of nature. Death was a real possibility and survival a real struggle. The stakes were high and the hazards were plentiful. As it turns out, these are the very conditions under which superstition is likely to thrive.
So what is superstition? Bruce Hood, author of The Science of Superstition, notes that superstition is a belief “that there are patterns, forces, energies, and entities operating in the world that are denied by science…” He adds that “the inclination or sense that they may be real is our supersense.” It involves an inclination to attempt to “control outcomes through supernatural influence.” It is the belief that if you knock on wood or cross your fingers you can influence outcomes in your favor. It is the belief that faithfully carrying out rituals as part of a wedding ceremony (e.g., wearing something blue, something new, something borrowed) or before going to bat or before giving a big speech will improve outcomes. It is also the belief that negative outcomes can result from stepping on a crack, breaking a mirror, or spilling salt. Hood argues that supersense goes beyond these obvious notions and surfaces in subtler ways associated with touching an object, or entering a place, that we feel has a connection with somebody bad or evil. For example, how would you feel if you were told you had to wear Jeffrey Dahmer’s T-shirt, or that you were living in a house where ritualistic torture and multiple murders took place? Most of us would recoil at the thought. Most of us also believe (erroneously) that we can sense when someone is looking at us, even when we cannot see them doing so. These beliefs, and much of the value we place on sentimental objects, stem from this style of thinking.
Michael Shermer (2000), in his book, How We Believe, eloquently describes our brains as a Belief Engine. Underlying this apt metaphor is the notion that “Humans evolved to be skilled pattern seeking creatures. Those who were best at finding patterns (standing upwind of game animals is bad for the hunt, cow manure is good for the crops) left behind the most offspring. We are their descendants.” (Shermer, p. 38). Chabris and Simons (2009) note that this refined ability “serves us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.” (p. 154). However, it is important to understand that we are all prone to drawing erroneous connections between stimuli in the environment and notable outcomes. Shermer further contends that “The problem in seeking and finding patterns is knowing which ones are meaningful and which ones are not.”
From an evolutionary perspective, we have thrived, in part, as a result of our tendency to infer cause or agency regardless of the reality of a threat. For example, those who assumed that rustling in the bushes was a tiger (when it was just wind) were more likely to take precautions, and thus less likely, in general, to succumb to predation. Those inclined to ignore such stimuli were more likely to be eaten when the rustling was in fact a hungry predator. Clearly, from a survival perspective, it is best to infer agency and run away rather than become lunch meat. The problem Shermer refers to is that this system inclines us toward mystical and superstitious beliefs: assigning agency to unworthy stimuli or drawing causal connections that do not exist. Dr. Steven Novella, a neurologist, notes in his blog post entitled Hyperactive Agency Detection that humans vary in the degree to which they assign agency. Some of us have Hyperactive Agency Detection Devices (HADD) and, as such, are more prone to superstitious, conspiratorial, and mystical thinking. It is important to understand, as Shermer (2000) makes clear:
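The tiger-in-the-bushes logic is really an expected-cost calculation, and it can be made concrete with a tiny sketch. The payoffs below are illustrative numbers of my own choosing, not figures from Shermer or Novella: even when a rustle is almost never a predator, the asymmetry between a wasted sprint and being eaten makes “always flee” the cheaper policy.

```python
# Illustrative (hypothetical) payoffs for the rustle-in-the-bushes bet.
p_tiger = 0.02         # chance the rustle really is a predator
cost_flee = 1.0        # small cost: energy wasted fleeing a gust of wind
cost_eaten = 10_000.0  # catastrophic cost: ignoring a real tiger

# Policy 1: always flee -- you always pay the sprint.
expected_cost_flee = cost_flee

# Policy 2: always ignore -- you pay the big cost only when it's a tiger.
expected_cost_ignore = p_tiger * cost_eaten

print(f"flee: {expected_cost_flee}, ignore: {expected_cost_ignore}")
```

Here ignoring costs 200 in expectation versus 1 for fleeing, so selection favors a hair-trigger agency detector even though it is wrong 98% of the time. The false positives are the price of avoiding the rare, fatal false negative.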
“The Belief Engine is real. It is normal. It is in all of us. Stuart Vyse [a research psychologist] shows for example, that superstition is not a form of psychopathology or abnormal behavior; it is not limited to traditional cultures; it is not restricted to race, religion, or nationality; nor is it only a product of people of low intelligence or lacking education. …all humans possess it because it is part of our nature, built into our neuronal mainframe.” (p. 47).
Bruce Hood takes this notion further, adding that the cultural factors discussed at the opening of this piece, along with other intuitive inclinations – dualism (a belief in the separation of mind and body), essentialism (the notion that every discernible object harbors an underlying reality that, although intangible, gives it its true identity), vitalism (the insistence that there is some mysterious extra ingredient in all living things), holism (the belief that everything is connected by forces), and animism (the belief that the inanimate world is alive) – shape adult superstition. These latter belief mechanisms are developmental and naturally occurring in children: they are the tendencies that make magic and fantasy so compelling for them. It is when they lurk in our intuition, or are sustained in our rational thought, that we as adults fall victim to this type of illusion.
It is interesting to note that, much like our ancestors, we are more prone to this type of thinking when faced with high stakes, a low probability of success, and incomprehensible controlling circumstances. Think about it. In baseball, batters often have complex superstitious rituals associated with batting. The best hitters experience success only one in three times at bat, and the speed at which they must decide whether to swing, and where to position the swing, defies the rational decision-making capacity of humans. On the other hand, these very same athletes have no rituals when it comes to fielding a ball (a high-probability event for the proficient).
Superstition is a natural inclination with evolutionary and psychological roots embedded deeply in child development. These tendencies are nurtured and socialized as part of child rearing and spill over into adult rituals in predictable circumstances (particularly when there is a low degree of personal control). When one deconstructs this form of thinking, it makes complete sense. This is not to suggest that reliance on superstition is sensible. Often the costs are low and the rituals can be fun. But there are real potential costs to such thinking: some dangers materialize in notions that vaccines cause autism, or that homeopathy will cure what ails you in lieu of scientific medicine. Resigning personal power in deference to supernatural forces is a depressive response pattern. Reliance on supernatural forces is essentially reliance on chance, and in some cases it actually stacks the deck against you. So be careful when employing such tactics. But if you’re in the neighborhood, NEVER EVER walk under my ladder. I’ve been known to drop my hammer.
It is hard to imagine anything more precious than one’s newborn child. Part of the joy of raising a child is the corresponding hope one has for the future. Don’t we all wish for our children a life less fraught with the angst and struggles we ourselves endured? One of the less pleasant aspects of my job has the effect, at least temporarily, of robbing parents of that hope. This erosion occurs in a parent’s mind and heart as a consequence of a diagnosis I often have to deliver. I am a psychologist employed, in part, to provide diagnostic evaluations of preschool-age children suspected of having Autism. My intention is never to crush hope; instead, it is to get the child on the right therapeutic path as early as possible, in order to sustain as much hope as possible. However, uttering the word AUTISM in reference to one’s child constitutes a serious and devastating emotional blow.
Many parents come to my office very aware of their child’s challenges and their implications. They love their child, accept him as he is, and just want to do whatever they can to make his life better. Others come still steeped in hope, believing that their child’s challenges are just a phase or that she is just fine. Regardless, most report having suspected difficulties very early in the child’s development. For example, many note a lack of smiles, chronic agitation, and difficulty soothing their child. Some children were not calmed by being held, or may even have resisted it. Other children I see develop quite typically: they smile, giggle, rejoice at being held, coo and babble, and ultimately start to use a few words with communicative intent. The parents of this latter and rather rare subset then watch in dismay as their child withdraws, often losing both functional communication and interest in other children.
The timing of this developmental backslide most often occurs at around 18 months of age. This regression happens to coincide with the recommended timing of the Measles-Mumps-Rubella (MMR) vaccine. This temporal chronology is important because it has led, in part, to a belief that the vaccine itself is responsible for the development of Autism. What these parents must experience at this time, I can only imagine, is a horrible combination of confusion and grief. They have had their hopes encouraged and reinforced, only to see them vanish. And it is human nature, under such circumstances, to look for a direct cause. It makes perfect sense that parents would, given the chronology of events in some cases, suspect the MMR vaccine as the cause of their child’s regression.
During my occasional community talks on Autism, I often am asked about the alleged connection between vaccines and Autism. The coincidental temporal relationship between the provision of the MMR vaccine and this developmental decay leads to what Chabris and Simons in The Invisible Gorilla refer to as the Illusion of Cause. Chabris and Simons discuss how “chronologies or mere sequences of happenings” lead to the inference “that earlier events must have caused the later ones.” (2010, p. 165). By default, as a result of evolution, our brains automatically infer causal explanations based on temporal associations (Chabris & Simons, 2010).
At nearly every talk I give, there is someone in the audience who is convinced that their child (or a relative) is a victim of the MMR vaccine. Their compelling anecdotes are very difficult to refute or discuss. I find that the application of reason, or data, or both, misses the mark and comes off as being cold and insensitive.
For such causal beliefs to endure and spread, they often need confirmation of the effect by an “expert.” This is where the story of Dr. Andrew Wakefield comes into play. Wakefield, a GI surgeon from the UK, published a paper in the prestigious medical journal The Lancet alleging a relationship between the MMR vaccine and the development of Autism. His “expert” opinion lent legitimacy to already brewing suspicions, backed by the perceived correlation between rising vaccination and Autism rates, as well as the apparent chronology between the timing of the vaccine and the onset of Autism. Wakefield provided credibility and sufficient plausibility, and as a result, news of the alleged relationship gained traction.
But hold on! There were major flaws with Wakefield’s study that were not initially detected by The Lancet’s peer review panel. First of all, Wakefield was hired and funded by a personal injury attorney who commissioned him to prove that the MMR vaccine had harmed his clients (caused Autism). His study was not designed to test a hypothesis: it was carried out with the specific objective of positively establishing a link between Autism and provision of the MMR vaccine. From the outset the study was a ruse, disguised as science.
Just this year (2010), 12 years after the initial publication of Wakefield’s infamous study, The Lancet retracted it, and Dr. Wakefield has been stripped of his privilege to practice medicine in the UK. Problems, however, surfaced years earlier: as early as 2004, 10 of 13 co-authors retracted their support for a causal link. In 2005 it was alleged that Wakefield had fabricated data – in fact, some of the afflicted children used to establish the causal link had never actually received the MMR vaccine!
Since the initial publication of this study, hundreds of millions of dollars have been spent investigating the purported relationship between vaccines and Autism. Despite extensive large scale epidemiological studies, there have been no replications of Wakefield’s findings. Children who had not been vaccinated developed Autism at the same rate as those who had received the MMR. There is no relationship between the MMR vaccine and the development of Autism. As a result of Wakefield’s greed, hundreds of millions of dollars have been wasted. Those dollars could have been devoted to more legitimate pursuits, and that is not the worst of it. I will get to the real costs in a bit.
Another aspect of the history of this controversy is the use of thimerosal as a preservative in vaccines. This notion, which has also been debunked, gained plausibility because thimerosal contains mercury, a known neurotoxin. You may ask: “Why on earth would a neurotoxin be used in vaccines?” Researchers have clearly established that thimerosal poses no credible threat to humans at the dosage levels used in vaccines. Even so, given the perceived threat, thimerosal is no longer used as a preservative in routine childhood vaccinations; the last doses using this preservative were produced in 1999 and expired in 2001. Regardless, the prevalence of Autism has continued to rise – further evidence against thimerosal as a cause.
It is important to understand that mercury can and does adversely affect neurological development and functioning. However, long-term exposure at doses substantially higher than those present in thimerosal is necessary for such impact. The mercury in thimerosal is ethyl-mercury, which is not fat-soluble. Unlike the fat-soluble methyl-mercury (industrial mercury), ethyl-mercury is flushed from the body very quickly. Methyl-mercury can be readily absorbed into fatty brain tissue and do its damage through protracted contact. It works its way into the food chain and poses a hazard if we eat too much fish (particularly species at the high end of the food chain). In reality, one is at greater risk from eating too much seafood (shark and tuna) than from an injection of a vaccine preserved with thimerosal. Yet there does not seem to be a movement to implicate seafood as the cause of Autism.
Even though the relationship between vaccines and Autism has been thoroughly debunked, there is a movement afoot, steeped in conspiratorial thinking, alleging that “Big Pharma” and the “Government” are colluding to deceive the people and that elaborately fabricated data is used to cover up the relationship. This belief lives on. How can this be so? Even intelligent and well-educated people I know are avoiding important childhood immunizations based on the fear and misinformation spread by these well-intentioned people.
In 2003, in the UK, the MMR vaccination rate had fallen below 79%, whereas a 95% rate is necessary to maintain herd immunity. Currently, vaccination rates are dropping in the US due to the efforts of celebrities like Jenny McCarthy, who purports that her son’s Autism was caused by vaccines. McCarthy campaigns fiercely against childhood immunizations, spurred on by the likes of Oprah Winfrey. Even figures like John McCain, Joe Lieberman, and Robert F. Kennedy, Jr. have spread such misinformation. Continuing to contend that the MMR vaccine is the culprit, Wakefield has moved to the US and risen to martyr status among the anti-vaccine folk. You need to know that just months before he published his seminal paper, Wakefield received a patent on a measles vaccine that, he alleges, “cures” Autism. He has much to gain financially in his attempt to scare people away from the current safe and effective MMR vaccine.
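Why is the bar set so high for measles in particular? The standard epidemiological rule of thumb is that the immune fraction needed for herd immunity is 1 − 1/R0, where R0 is the number of people one case infects in a fully susceptible population. A quick sketch (the R0 values of 12–18 are commonly cited estimates for measles, used here for illustration) shows why coverage in the low-to-mid 90s is required, and why 79% falls dangerously short:

```python
# Herd-immunity threshold: the immune fraction needed so that each
# case infects, on average, fewer than one new person: 1 - 1/R0.
def herd_immunity_threshold(r0: float) -> float:
    return 1.0 - 1.0 / r0

# Measles is among the most contagious diseases known; its R0 is
# commonly estimated in the 12-18 range (illustrative values here).
for r0 in (12, 15, 18):
    pct = herd_immunity_threshold(r0)
    print(f"R0 = {r0}: need about {pct:.0%} of the population immune")
```

For comparison, a disease with an R0 of 2 would need only 50% coverage; measles’ extreme contagiousness is what makes even modest drops in vaccination rates so consequential.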
It amazes me that people do not automatically dismiss this alleged vaccine-Autism link. Wakefield’s conflicts of interest and discredited research practices alone draw into question anything he has to say. The mountains of epidemiological evidence also favor rejecting a causal relationship between the MMR vaccine and Autism. However, the power of anecdotes and misguided beliefs places millions of children in harm’s way.
Imagine yourself as the parent of a child who cannot get the MMR vaccine because of a serious medical condition (e.g., cancer). Such vulnerable children, of whom there are millions worldwide, depend on herd immunity for their very survival. Now imagine that your child is inadvertently exposed to measles by coming into contact with a child who wasn’t vaccinated (because of misguided parental fear). Because of your child’s compromised immunity, she develops measles and becomes seriously ill or dies. Such a scenario, although improbable, is not impossible, and it is more likely today largely because of the diminished herd immunity caused by misinformation. Whooping Cough (Pertussis) likewise poses serious concerns (and one documented death) in unvaccinated clusters because of the anti-vaccine movement. This myth persists, in part, because of the Illusion of Cause, and the consequences have become deadly. Next week I will delve into the Illusion that sustains this erroneous and dangerous belief system.
“I saw it with my own two eyes!” Does this argument suffice? As it turns out – “NO!” – that’s not quite good enough. Seeing should not necessarily lead to believing. Need proof? Play the video below.
As should be evident from this video, what we perceive can’t necessarily be fully trusted. Our brains complete patterns, fill in missing data, interpret, and make sense of chaos in ways that do not necessarily coincide with reality. Need more proof? Check these out.
Visual Illusion – A & B are the same shade of gray
Illusion – Notice the perceived motion around the green circles.
Convinced? The software in our brains is responsible for these phenomena. And this software was coded through progressive evolutionary steps that conferred survival benefits on those with such capabilities. Just as pareidolia confers a survival advantage on those who assign agency to things that go bump in the night, there are survival advantages for those endowed with the adaptations responsible for these errors.
So really, you can’t trust what you see. Check out the following video for further implications.
Many of you are likely surprised by what you missed. We tend to see what we are looking for, and we may miss other important pieces of information. The implications of this video seriously challenge the value of eyewitness testimony.
To add insult to injury, you have to know that even our memory is vulnerable. Memory is a reconstructive process, not a reproductive one.2 During memory retrieval we piece together fragments of information; however, due to our own biases and expectations, errors creep in.2 Most often these errors are minimal, so despite these small deviations from reality, our memories are usually pretty reliable. Sometimes, however, too many errors are inserted and our memory becomes unreliable.2 In extreme cases, our memories can be completely false2 (even though we are convinced of their accuracy). This confabulation, as it is called, is most often unintentional and can occur spontaneously as a result of the power of suggestion (e.g., leading questions or exposure to a manipulated photograph).2 Frontal lobe damage (due to a tumor or traumatic brain injury) is known to make one more vulnerable to such errors.2
Even when our brain is functioning properly, we are susceptible to such departures from reality. We are more vulnerable to illusions and hallucinations, be they hypnagogic or otherwise, when we are ill (e.g., have a high fever, are sleep deprived, oxygen deprived, or have neurotransmitter imbalances). All of us are likely to experience at least one, if not many, illusions or hallucinations throughout our lifetime. In most cases the occurrence is perfectly normal, simply an acute neurological misfiring. Regardless, many individuals experience religious conversions or become convinced of personal alien abductions as a result of these aberrant neurological phenomena.
We are most susceptible to these particular inaccuracies when we are ignorant of them. On the other hand, improved decisions are likely if we understand these mechanisms, as well as the limitations of the brain’s capacity to process incoming sensory information. Bottom line – you can’t necessarily believe what you see. The same is true for your other senses as well – and these sensory experiences are tightly associated and integrated into long-term memory storage. When you consider the vulnerabilities of our memory, it leaves one wondering to what degree we reside within reality.
For the most part, our perceptions of the world are real. If you think about it, were it otherwise, we would be at a survival disadvantage. The errors in perception we experience are in part a result of the rapid cognitions we make in our adaptive unconscious (intuitive brain) so that we can quickly process and successfully react to our environment. For the most part it works very well. But sometimes we experience aberrations, and it is important that we understand the workings of these cognitive missteps. This awareness absolutely necessitates skepticism. Be careful what you believe!
Have you ever seen familiar and improbable shapes in those puffy white cumulus clouds as they pass overhead? Notice the squirrel or dinosaur in the image to the right. Some of you may have seen the recent American Express commercial that portrays items positioned in such a way that we perceive them as sad or happy faces (much like the bathtub fixture below). Now notice the “Hand of God” in the NASA image below and to the right, taken by the Chandra X-ray Observatory. This picture shows energized particles streaming from a pulsar, in a field of debris from a massive supernova. Many of us instinctively see in this image what looks like the wrist and hand of a person (or of God, as the name of this nebula implies). Speaking of God, on the internet there are many more explicit examples of religious imagery in much more benign items such as tree trunks, clouds, pancakes, or tortillas. This tendency is not limited to the visual sense. We make the same type of errors with auditory information (as is evident in backmasking in popular music). These tendencies, which are in fact illusory, are a consequence of our neural circuitry.
Our brains do not tolerate vague or obscure stimuli very well. We have an innate tendency to perceive clear and distinct images within such ambiguous stimuli. This tendency is called pareidolia. It is also referred to as patternicity. This tendency is so ubiquitous that a projective personality test (the Rorschach Inkblot Test) relies on and “interprets” this inclination.*
It has been suggested that those of our ancestors who assigned agency to things that went bump in the night (perceiving vague data as a threat) responded in a way that facilitated survival. Those who ignored the stimuli were more likely to be predated and thus not pass on their genes. Carl Sagan noted in his classic book, The Demon-Haunted World, that this tendency is likely linked to other aspects of individual survival. He wrote:
“As soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper. These days, nearly every infant is quick to identify a human face, and to respond with a goony grin.
As an inadvertent side effect, the pattern recognition machinery in our brains is so efficient in extracting a face from a clutter of other detail that we sometimes see faces where there are none. We assemble disconnected patches of light and dark and unconsciously see a face. The Man in the Moon is one result” (Sagan 1995: 45).
Michael Shermer wrote of patternicity in the December 2008 issue of Scientific American. In that article, Shermer noted that scientists have historically treated patternicity as an error in cognition – more specifically, a type I error, or false positive. A false positive, in this context, is believing that something is real when, in fact, it is not. Shermer discussed a paper in the Proceedings of the Royal Society entitled “The Evolution of Superstitious and Superstition-like Behaviour” by biologists Kevin R. Foster (Harvard University) and Hanna Kokko (University of Helsinki), who used evolutionary modeling to test the hypothesis that patternicity enhances survivability. Shermer wrote, “They demonstrated that whenever the cost of believing a false pattern is real is less than the cost of not believing a real pattern, natural selection will favor patternicity.” As for the implications, Shermer wrote: “…believing that the rustle in the grass is a dangerous predator when it is only the wind does not cost much, but believing that a dangerous predator is the wind may cost an animal its life.”
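The cost asymmetry behind Foster and Kokko’s result can be sketched as a simple expected-cost comparison. The sketch below is only illustrative – the probabilities and costs are invented numbers, not values from their model – but it shows why a cheap false alarm paired with a catastrophic miss tilts selection toward habitual “believing”:

```python
# Toy expected-cost comparison of two fixed responses to an ambiguous rustle.
# All numbers are hypothetical, chosen purely to illustrate the asymmetry.

def expected_cost(p_predator, cost_false_alarm, cost_missed_predator, believe):
    """Average cost of always believing (fleeing) vs. always ignoring a rustle.

    believe=True  -> treat every rustle as a predator (risk false alarms)
    believe=False -> ignore every rustle (risk being eaten)
    """
    if believe:
        # The believer only pays when the rustle was just the wind.
        return (1 - p_predator) * cost_false_alarm
    # The ignorer pays the full price whenever a predator is really there.
    return p_predator * cost_missed_predator

# Predators are rare (1%), fleeing is cheap (1 unit), being eaten is ruinous (1000).
p, c_fa, c_miss = 0.01, 1.0, 1000.0

cost_believer = expected_cost(p, c_fa, c_miss, believe=True)   # 0.99
cost_ignorer  = expected_cost(p, c_fa, c_miss, believe=False)  # 10.0

# Even at 1% predator odds, the believer fares better on average - the
# selection pressure toward patternicity that Shermer describes.
assert cost_believer < cost_ignorer
```

With these toy numbers, believing has to generate a hundred false alarms before it costs as much as a single missed predator, which is the inequality Shermer summarizes.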
It is a double-edged sword, it seems. Not only has this tendency entertained us and likely facilitated our very survival as a species, but it may in fact serve as the basis of our individual inclinations toward superstitious thinking. Shermer wrote:
“Through a series of complex formulas that include additional stimuli (wind in the trees) and prior events (past experience with predators and wind), the authors conclude that “the inability of individuals—human or otherwise—to assign causal probabilities to all sets of events that occur around them will often force them to lump causal associations with non-causal ones. From here, the evolutionary rationale for superstition is clear: natural selection will favour strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.”
Yet again this is an example of how our intuitive brain can lead us astray!
* The Rorschach inkblot test, along with most projective measures in the field of psychology, has fallen out of favor due to poor reliability and validity.