“I saw it with my own two eyes!” Does this argument suffice? As it turns out – no – that’s not quite good enough. Seeing should not necessarily lead to believing. Need proof? Play the video below.

 

 

As this video should make evident, what we perceive can’t necessarily be fully trusted. Our brains complete patterns, fill in missing data, interpret, and make sense of chaos in ways that do not necessarily coincide with reality. Need more proof? Check these out.

 

Visual Illusion – A & B are the same shade of gray

Illusion – Notice the perceived motion around the green circles.

 

Convinced? The software in our brains is responsible for these phenomena. And this software was coded through progressive evolutionary steps that conferred survival benefits on those with such capabilities. Just as pareidolia confers a survival advantage on those who assign agency to things that go bump in the night, there are survival advantages for those who evidence the adaptations responsible for these errors.

 

So really, you can’t trust what you see. Check out the following video for further implications.

 

 

Many of you are likely surprised by what you missed. We tend to see what we are looking for, and we may miss other important pieces of information. The implications of this video seriously challenge the value of eyewitness testimony.

 

To add insult to injury, you have to know that even our memory is vulnerable. Memory is a reconstructive process, not a reproductive one [2]. During memory retrieval we piece together fragments of information; however, due to our own biases and expectations, errors creep in [2]. Most often these errors are minimal, so despite these small deviations from reality, our memories are usually pretty reliable. Sometimes, however, too many errors are inserted and our memory becomes unreliable [2]. In extreme cases, our memories can be completely false [2] (even though we are convinced of their accuracy). This confabulation, as it is called, is most often unintentional and can occur spontaneously as a result of the power of suggestion (e.g., leading questions or exposure to a manipulated photograph) [2]. Frontal lobe damage (due to a tumor or traumatic brain injury) is known to make one more vulnerable to such errors [2].

 

Even when our brain is functioning properly, we are susceptible to such departures from reality. We are more vulnerable to illusions and hallucinations, be they hypnagogic or otherwise, when we are ill (e.g., when we have a high fever, are sleep deprived or oxygen deprived, or have neurotransmitter imbalances). All of us are likely to experience at least one, if not many, illusions or hallucinations throughout our lifetime. In most cases the occurrence is perfectly normal, simply an acute neurological misfiring. Regardless, many individuals experience religious conversions or become convinced of personal alien abductions as a result of these aberrant neurological phenomena.

 

We are most susceptible to these particular inaccuracies when we are ignorant of them. On the other hand, improved decisions are likely if we understand these mechanisms, as well as the limitations of the brain’s capacity to process incoming sensory information. Bottom line – you can’t necessarily believe what you see. The same is true for your other senses as well – and these sensory experiences are tightly associated and integrated into long-term memory storage. When you consider the vulnerabilities of our memory, it leaves one wondering to what degree we reside within reality.

 

For the most part, our perceptions of the world are real. If you think about it, were it otherwise we would be at a survival disadvantage. The errors in perception we experience are in part a result of the rapid cognitions we make in our adaptive unconscious (intuitive brain) so that we can quickly process and successfully react to our environment. For the most part it works very well. But sometimes we experience aberrations, and it is important that we understand the workings of these cognitive missteps. This awareness absolutely necessitates skepticism. Be careful what you believe!

 

References:

 

1. ‘169 Best Illusions – A Sampling.’ Scientific American: Mind & Brain. May 10, 2010. http://www.scientificamerican.com/slideshow.cfm?id=169-best-illusions&photo_id=82E73209-C951-CBB7-7CD7B53D7346132B

 

2. Mo. ‘Anatomy of a False Memory.’ Neurophilosophy, ScienceBlogs. June 13, 2008. http://scienceblogs.com/neurophilosophy/2008/06/anatomy_of_a_false_memory.php

 

3. Simons, Daniel J. 1999. ‘Selective Attention Test.’ Visual Cognition Lab, University of Illinois. http://viscog.beckman.illinois.edu/flashmovie/15.php

 

4. Sugihara, Koukichi. 2010. ‘Impossible Motion: Magnet-like Slopes.’ Meiji Institute for Advanced Study of Mathematical Sciences, Japan. http://illusioncontest.neuralcorrelate.com/2010/impossible-motion-magnet-like-slopes/


In my post entitled Intuitive Thought I mentioned that rational thought is slow and arduous. I also mentioned in Spinoza’s Conjecture that vague or confusing information is processed in a portion of the brain that also processes pain and disgust. Rational thought, it seems, is not all it’s cracked up to be. In fact, we are not very good at it. According to Daniel Willingham, professor of cognitive psychology at the University of Virginia, “People are naturally curious but they are not naturally good thinkers; unless the cognitive conditions are right, people will avoid thinking.”

 

Willingham delved into this reality in his intriguing article “Why Don’t Students Like School? Because the Mind is Not Designed for Thinking.” He suggests that we are good at certain types of reasoning (relative to other animals), but that we are much better at other brain functions like seeing and moving. Both of these capabilities are highly complex; however, they are relatively automatic and we tend to take them for granted. We don’t have to think to see or, generally, to ambulate. A massive portion of our neurological terrain is dedicated to these activities because seeing and moving are actually much more complicated than, for example, working out a complex physics problem.

 

Working out novel and/or complicated problems requires concentration and the complete dedication of one’s attention. Think about a personal situation that necessitated solving an important and novel problem involving numerous and complicated variables. Recall how difficult and slow the process was – and how it demanded single-minded dedication and supreme concentration. Do you remember how disruptive distracting stimuli became? Personally, I recall a complicated woodworking project where for the life of me I could not work out the solution. Another situation that stumped me recently was trying to navigate the streets of Paris. Oy vey! Between the language difference that made reading street signs difficult, the meandering streets, and the uniform buildings, I had a difficult time.

 

When the solution is not quickly evident or one lacks experience in solving similar problems, novel and complicated scenarios can become frustrating and downright unpleasant. Extraneous distractions can completely short-circuit problem solving.

 

Another complication associated with thinking is that it is uncertain and even prone to error. So let’s see: slow, arduous, and error-prone – not much of a selling point for rational thought.

 

These realities explain why we tend to avoid thinking when we can get away with it. So if this is true, how do we get through life? The answer is that we rely to a significant extent on memory, including the adaptive unconscious. Willingham states:

“For the vast majority of decisions you make, you don’t stop to consider what you might do, reason about it, anticipate possible consequences, and so on. You do take such steps when faced with a new problem, but not when faced with a problem you’ve already encountered many times. That’s because one way that your brain saves you from having to think is by changing. If you repeat the same task again and again, it will eventually become automatic; your brain will change so that you can complete the task without thinking about it.”

In other words, exposure and repetition brings the task into the realm of the intuitive brain – the fast and frugal mind behind the locked door. And this adaptive unconscious is entirely dependent on memory.
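For readers who think in code, this repetition-to-automaticity idea maps loosely onto memoization: compute once, slowly and deliberately; retrieve thereafter, instantly and without “thinking.” A minimal, purely illustrative Python sketch (the function and timings are hypothetical stand-ins, not anything from Willingham):

```python
from functools import lru_cache
import time

@lru_cache(maxsize=None)   # cache results, like skills folded into long-term memory
def solve(problem: int) -> int:
    time.sleep(0.5)        # stand-in for slow, effortful rational thought
    return problem * 2     # a trivially simple "solution"

t0 = time.perf_counter()
solve(21)                  # novel problem: slow, deliberate work
print(f"first encounter:  {time.perf_counter() - t0:.2f}s")

t0 = time.perf_counter()
solve(21)                  # familiar problem: answered from memory, no "thinking"
print(f"repeat encounter: {time.perf_counter() - t0:.6f}s")
```

The analogy is loose – the brain rewires rather than storing lookup tables – but the payoff is the same: fast, effortless retrieval in place of slow computation.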

 

Memory, it is important to note, is a multidimensional function involving working memory and long-term memory. This is a simplification of the process, mind you, but it will do for this discussion. Working memory houses the information that is the focus of your active attention, regardless of the source of that information (e.g., drawn from the immediate environment or recollections of past events). These memories are within your awareness and are the focus of your attention. Long-term memory, which lies outside your awareness, functions as a vast warehouse of your factual and procedural knowledge.

 

When you need the details contained in long-term memory, you pull them from this passive warehouse into your active working memory. Thinking, as we know it, occurs when there is a collaborative effort combining input from the environment with knowledge stores from your long-term memory. Successful thinking requires effective strategies for combining these two sources of input. In other words, you have to know how to think – how to problem solve. This procedural knowledge is essentially a recipe that is itself stored within long-term memory.

 

It is complicated, but this is why we do better when tasked with familiar problems – because we have experiential knowledge and procedural knowledge that we can readily employ. Novel problems tend to be more challenging because we lack a recipe and/or pertinent memories to assist us in problem resolution.

 

This is why practice is so important when one wants to become proficient at something – repetition increases efficacy and ease, and ultimately determines how automatic subconscious responding can become. Learning to ski, golf, or drive a car is initially quite difficult (a lot of novel thinking is mandated), but as one gets more and more practice these skills become more refined and fluid.

 

It is important to note that effective thinking depends not just on having procedural knowledge (a recipe) but also on having factual knowledge. So although critical thinking is important, it is equally critical to be knowledgeable. Being widely read and having a vast storehouse of knowledge is crucial to effective thinking and is likely to make snap judgments more accurate and rational thought faster, more precise, and less arduous.

 

The finite capacity of our working memory is yet another factor that impinges on one’s ability to think. Most of us are familiar with the experience of information overload, which is indicative of an overloaded working memory. I experience this phenomenon after two full days at a professional conference, or even when trying to solve multiple-digit long division in my head. For example, try dividing 753 by 13 in your head (the answer, 57 with a remainder of 12, requires holding several intermediate products and remainders at once). This is difficult because we are not adept at storing much organized data in working memory. Anyway, if working memory is overloaded, one’s ability to think is compromised.

 

So why is it that, despite the arduous nature of rational thought, we are drawn to mental challenges? Why do people enjoy games like chess, sudoku, crossword puzzles, frame games, and so on? Why, if we’re not good at thinking, do we seek out mental challenges that actively engage thinking? Willingham suggests that “mental work appeals to us because it offers the opportunity for that pleasant feeling when it succeeds.”

 

Success is the crucial variable within the pursuit of mental challenges. Problems that are either too easy or too difficult are unlikely to be attended to. Puzzles that challenge, but do not overtax one’s capacity, are likely to offer that reinforcing “pleasant feeling” associated with success. In reality, this is a basic behavioral reinforcement paradigm. Again, Willingham notes:

“…when you size up a problem as very difficult, you are judging that you’re unlikely able to solve it, and therefore unlikely able to get the satisfaction that would come with the solution. So there is inconsistency in claiming that people avoid thought and in claiming that people are naturally curious – curiosity prompts people to explore new ideas and problems but when they do, they quickly evaluate how much mental work it will take to solve the problem. If it’s too much or too little, people stop working on the problem if they can.”

Willingham’s intent in his article was to help teachers understand that learning is hard and frankly aversive for many students because of how they are taught. He provided specific strategies for helping teachers plan appropriately and teach in a way that fosters a love of learning. The key is finding that magical Goldilocks zone – where the work is neither too easy nor too hard – it needs to be just right.

 

Developing an understanding of these concepts is crucial for anyone interested in learning, teaching, or becoming a more effective thinker or doer. Here are just a few things to “keep in mind”:

  • Read a lot – build the stores in your long-term memory
  • Experience a lot – build those same stores.
  • Diversify your exposure. Expand your stores.
  • Practice a lot – if you want to get better at a particular skill – practice, practice, practice!
  • If something is too hard – don’t give up. Instead, back up a bit – work on fundamental skills – refine your procedural skills. Find your level of competence and slowly raise the bar.

 

We need not be victims of evolution and the subsequent configuration of our brains. Just as we can proactively upgrade our adaptive unconscious, so too can we adapt our rational thought.

 

Willingham, D. (2009). ‘Why Don’t Students Like School? Because the Mind is Not Designed for Thinking.’ American Educator, Spring 2009. http://archive.aft.org/pubs-reports/american_educator/issues/spring2009/WILLINGHAM%282%29.pdf


I just spent two weeks in Europe with my fellow adventurer and wife, visiting the relics of times gone by. In the Louvre we gazed upon works laid down well over two thousand years ago by Greek sculptors, as well as works by Roman, medieval, Byzantine, Gothic, Renaissance, and Baroque artists. We admired the Impressionists at the Musée d’Orsay.

 

We then traveled to Venice, a city that blends Byzantine, International Gothic, Renaissance, and Baroque art and architecture in a way that is unique to this breathtaking place. Its Eastern influences are palpable. Then on to Florence, the home of the Renaissance, which proved to be a showcase for the works of da Vinci, Botticelli, Titian, Michelangelo, and many others.

 

When in Rome, we focused on the age of the Empire, devoting our attention to the Colosseum, Palatine Hill, the Roman Forum, the Pantheon, and a day trip to Pompeii. We didn’t prioritize the treasures at the Vatican or the many other indoor sites. Between the Louvre, the Orsay, the Uffizi, and the many works within the countless basilicas and churches we had previously visited, we had had our fill of crowded indoor shrines. Here we largely delved into the out of doors. The Pantheon was far more striking than I had imagined. And Pompeii, wow! It has to be seen to be appreciated.

 

All this is relevant because although you can see it all at home, it is just not the same. Go to Google Maps and search for Pompeii. You can tour the site using Street View. Or get a book, or watch Travel Channel or History Channel episodes on these great destinations. I guarantee it won’t be the same as seeing it in person, touching it, feeling it, breathing it in, in vivo. No duh, right?

 

Well, what is it about seeing the “real thing?” Why was I moved to tears to see a statue of Galileo in Florence? Why was it exciting to walk the same basalt cobbles in the Roman Forum as historical figures such as Julius Caesar, Brutus, Marc Antony, and Augustus? Why were there throngs of people gathered around da Vinci’s Mona Lisa? All over Paris, Venice, and Florence you could find “decent” replicas (prints and even posters) – yet these images gathered no lines.

 

The answer is essentialism. There is nothing on the streets left by these famous people that magically imbues the stones with a quality that makes them somehow special. They don’t contain anything truly special at all. I absorbed nothing by touching them or by looking at da Vinci’s or Michelangelo’s original works. And my personal telescope is far more capable than any Galileo original. But it was very exciting to see two of the scopes that he himself had made.

 

I knew that there was an irrational magical quality to these experiences. I knew I was cognitively embellishing all the aforementioned relics; however, I was able to let go, and enjoy the emotional implications. I did, however, find myself less inclined to part with my few and precious Euros for sentimental mementos (made in China) to remember this trip by.


According to the website of the Museo di Storia della Scienza, Galileo’s finger was detached by Anton Francesco Gori on March 12, 1737, when Galileo’s remains were moved from the original grave to the monumental tomb at the Basilica di Santa Croce in Florence. The finger became the property of Angelo Maria Bandini and was long exhibited at the Biblioteca Laurenziana. In 1841, the relic was transferred to the just-opened Tribuna di Galileo in the Museo di Fisica e Storia Naturale. Together with the Medici-Lorraine instruments, it was eventually moved to the Museo di Storia della Scienza in 1927. On the marble base is carved a commemorative inscription by Tommaso Perelli.


There is a learning curve to the application of skepticism. Raw, unchecked challenges to others’ beliefs, in a social context, are not well tolerated. People tend to find such notions rather off-putting. In fact, as I have certainly encountered, it elicits defensiveness and sometimes hurt feelings. People often own their ideas and beliefs in a way that is essentially linked to their identity. As Carl Sagan wrote in ‘The Demon-Haunted World’: “All of us cherish our beliefs. They are, to a degree, self-defining. When someone comes along who challenges our belief system as insufficiently well-based — or who, like Socrates, merely asks embarrassing questions that we haven’t thought of, or demonstrates that we’ve swept key underlying assumptions under the rug — it becomes much more than a search for knowledge. It feels like a personal assault.”

 

These assaults repel people and in effect insulate them from the rational inquiry you may wish to advance. People are inclined to respond to uninvited or poorly crafted skepticism much as one would respond to contemptuous arrogance.

 

Throughout most of human history, the social consequences of skeptical inquiry were likely quite costly. This was most certainly true in the pre-agrarian stages of our evolution, when individual survival is believed to have been linked to social cohesion. Although this is not as true today, in prehistory skepticism likely hindered, rather than promoted, survival. With this in mind, it makes sense that we as a species are inclined toward unquestioning belief rather than skepticism. This inclination also makes us vulnerable to mysticism and superstition. Natural selection, it seems, has selected for gullibility.

 

Sensitive, judicious, and sparing use of skepticism, in social contexts, is prudent. This is true unless you just don’t care how others feel about you, how they feel about interacting with you, or even how they feel about themselves. There is a time and place for everything. Choosing those times carefully, and selecting one’s words even more cautiously, is likely to get better results.

 

I admire great thinkers like Bruno, Copernicus, and Galileo, who faced more than mere social consequences for putting forward their theories. Bruno, in fact, paid with his life. Darwin too faced significant costs. However, their rejection of accepted explanations (stemming from skeptical inquiry) moved us forward. We owe much to these men for their courage and steadfast dedication to the truth. We move forward when we step away from blind acceptance; but let’s not turn a blind eye toward the social consequences of our own personal skepticism.


How one chooses to live one’s life is complicated by the uncertainties of tomorrow. Often there is an internal tug of war between the interests du jour and those that will be realized tomorrow. Due to the wonders of compound interest, it is wise to save as much as you can – as early as you can. However, another powerful reality is that there may be no tomorrow – or that tomorrow may manifest itself in unimaginable ways.
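Those “wonders of compound interest” are easy to quantify. Here is a minimal Python sketch, assuming (purely for illustration) a hypothetical 7% average annual return:

```python
def future_value(principal: float, rate: float, years: int) -> float:
    """Future value under annual compounding: FV = P * (1 + r)^n."""
    return principal * (1 + rate) ** years

# The same $10,000 at an assumed 7% per year, compounded annually:
print(f"40 years: ${future_value(10_000, 0.07, 40):,.0f}")  # ~$149,745
print(f"20 years: ${future_value(10_000, 0.07, 20):,.0f}")  # ~$38,697
```

Starting twenty years earlier roughly quadruples the payoff – that is the pull of “tomorrow” in the tug of war.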

 

I am surrounded by reminders that saving your better days for tomorrow is unwise. Over the last decade, I have witnessed numerous loved ones and colleagues ravaged by disease. Most of them died, and those who survived are essentially incapacitated. They live on, but are unable to experience life as they would prefer. Of those who are no longer with us, some were quite young and some were reaching or had just reached retirement age. Most lived their lives well, some did not; regardless, their peril certainly raised the value of their time, and they certainly had much left to live for.

 

Then there are the statistical realities of the threats that my loved ones and I face. These threats include cancer and car accidents, and even the more improbable, but not impossible, threats associated with catastrophic volcanism and asteroid strikes. The latter two may seem to be ridiculous considerations, but the fact of the matter is that both are likely in near geological time. Some facts to contemplate:

 

Volcanoes – In a Discovery Channel piece on the supervolcano at Yellowstone, it was indicated that “A modern full-force Yellowstone eruption could kill millions, directly and indirectly, and would make every volcano in recorded human history look minor by comparison. Fortunately, ‘super-eruptions’ from supervolcanoes have occurred on a geologic time scale so vast that a study by the Geological Society of London declared an eruption on the magnitude of Yellowstone’s biggest (the Huckleberry Ridge eruption 2.1 million years ago) occurs somewhere on the planet only about once every million years.” It was also reported that “at this hot spot’s current position under Yellowstone there have been three massive eruptions: 2.1 million, 1.3 million and 640,000 years ago. While those eruptions have been spaced roughly 800,000 and 660,000 years apart, the three events are not enough statistically to declare this an eruption pattern…” The risk is low, but the threat is very real.

 

Asteroids – Although small (relatively harmless) bodies frequently enter the Earth’s atmosphere, it is estimated that asteroids 1 km (0.62 mi) in diameter hit our planet on average every 500,000 years. Larger asteroids (5 km or 3 mi) strike Earth approximately once every ten million years. Even more rare are the large-body impacts (10 km or 6.2 mi). The last known major impact was the dinosaur-killing K–T extinction event 65 million years ago. Although it is unlikely that an Earth-shattering asteroid will end or drastically alter my life, were it to happen, life as we know it would end. And we are past due. According to NASA: “Statistically, the greatest danger is from an NEO [Near Earth Object] with about 1 million megatons energy (roughly 2 km in diameter). On average, one of these collides with the Earth once or twice per million years, producing a global catastrophe that would kill a substantial (but unknown) fraction of the Earth’s human population. Reduced to personal terms, this means that you have about one chance in 40,000 of dying as a result of a collision.”
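NASA’s 1-in-40,000 figure can be sanity-checked with a back-of-envelope calculation. A sketch with assumed inputs (mine, not NASA’s): one such impact per 700,000 years, a quarter of humanity killed, and a 75-year life.

```python
# Back-of-envelope reconstruction of the personal-risk estimate.
# All three inputs are illustrative assumptions, not published NASA values.
impacts_per_year = 1 / 700_000   # "once or twice per million years"
fraction_killed = 0.25           # a "substantial (but unknown)" fraction
lifespan_years = 75              # assumed human lifespan

p_death = impacts_per_year * fraction_killed * lifespan_years
print(f"Lifetime odds: about 1 in {1 / p_death:,.0f}")  # ~1 in 37,000
```

Which lands within shouting distance of NASA’s quoted 1 in 40,000.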

 

I am careful not to “blow” these threats out of proportion, but they have figured into my thinking. Taking all this into consideration, I find it prudent to plan for tomorrow (by saving for retirement), but I find it equally important to live for today. Thus tomorrow, my wife and I jet off to Europe for a two week exploration of Paris, Venice, Florence and Rome. This is something that my wife has dreamed of her entire life. We are relatively young and able-bodied and can afford it (kind of): putting it off any longer seems unwise.  Next Friday we will be in Venice, but I think I’ll wait to make my next post until the weekend when I’m in Florence where I’ll post a picture of Galileo’s middle finger. 😉


Intuitive Thought

2 April 2010

What is Intuitive Thought?

 

I have devoted numerous posts to a general category of cognitive errors and biases that are broadly lumped into errors associated with the intuitive mind. Lay notions of intuition often refer to gut instincts, which are generally considered emotional and irrational responses. It is in this context that intuition is vilified. Such impulsive reactions are countered with teachings typified by adages such as “Look before you leap,” “Don’t judge a book by its cover,” “Haste makes waste,” and “The hurrier you go, the behinder you get.” Although this narrow understanding of intuition is in part correct, it largely misses the mark regarding this very complicated and sophisticated neural system. Intuition is largely misunderstood, and frankly it has never been well understood to begin with. Herein I hope to offer a cursory explanation of intuition and broadly differentiate it from rational thought. The vast majority of the following content is drawn from Malcolm Gladwell’s intriguing 2005 book ‘Blink: The Power of Thinking Without Thinking.’ Gladwell draws together a vast array of research from cognitive and social psychology and a number of other sciences in an attempt to elucidate this ambiguous concept.

 

Rational thought serves as a good starting place because it offers a point of comparison that helps bring intuition into better focus. Reason is the hallmark of rational thought. It involves an active application of the cerebral cortex, whereby personal history, knowledge, and active cognitions are employed in a conscious manner to solve problems. The key words here are active and conscious. When we engage in reasoning we are generally aware of the cognitive effort directed toward this process. Another relevant aspect of this process is the passage of time. Reason-based thought is not generally instantaneous. Although solutions may seem to pop into awareness out of the blue, generally some measure of time passes as we strive for enlightenment. Think of an occasion where you had word-finding difficulties. You probably actively thought about the word, the context of the word, and so on. If you failed to recall the word, you may have cognitively moved on to something else, only to have the word come to you later. The former was rational thought; the latter, the result of intuitive thought.

 

Intuition differs from rational thought on those key variables. First, this instantaneous process is seemingly unconscious. Second, it is automatic (or at least seemingly so), consuming no apparent effort or time. The popular and scientific literature is replete with descriptive names for this seemingly mystical capacity, and Gladwell sprinkles a full complement of these terms throughout his text. Terms that emanate from the sciences include the adaptive unconscious, unconscious reasoning, rapid cognition, and thin slicing. Other descriptive terms include snap judgments, fast and frugal thinking, and, eloquently, the “mind behind the locked door.” Regardless of what we call it, intuition is constantly at work, drawing instantaneous conclusions outside of our awareness.

 

Because of the nature of this process, Gladwell notes that people are often ignorant of the secret decisions that affect their behavior, yet they do not feel ignorant. We often behave in ways driven by the adaptive unconscious and later try to justify those behaviors, invoking the rational brain to do so. This fact is what calls into question the reality of free will. Intriguing, isn’t it? It is as though there is a covert, super-powerful, super-fast computer running in tandem with our overt reasoning computer: outside our awareness, this covert computer remains ever vigilant, soaking in the world through our senses and actively directing our behavior.

 

Although the adaptive unconscious lies outside our direct control, life experiences, practice, and our intellectual pursuits contribute to the data set that is used when snap judgments are made. The more informed, erudite, and experienced one is, the more accurate one’s rapid cognitions become. Just think about driving. When learning to drive there are an overwhelming number of things to think about – so many, in fact, that mistakes are likely due to “analysis paralysis.” Too much to compute! Through practice and repetition, all those things we previously had to actively think about become more automatic. We don’t think about the countless micro-adjustments we make on the steering wheel as we drive down the highway. Novice drivers must think about these adjustments, along with attending to their speed (generally with gross applications of the accelerator and brakes) and myriad other factors that seasoned drivers do not overtly contemplate. The novice’s driving is chunky – experienced drivers, with the benefit of many miles in the driver’s seat, are generally smoother and more refined in their driving.

 

Experts in their given fields become more intuitive or automatic with regard to their area of expertise over time as a result of exposure, learning, and practice.   Their thoughts become seemingly automatic, their judgments and reactions more spontaneous – all of this in many situations without the expert even having to actively think.  In these cases (where there is sufficient expertise) snap judgments can be even more accurate than the arduous process of working through problems rationally.   On the other hand, this intuitive process can lead to problems because it is remarkably susceptible to prejudices and errors.  This is particularly true, as you might surmise, in areas where the individual lacks experience or knowledge.

 

Under certain circumstances the adaptive unconscious serves our purposes very well. In addition to those situations where one’s expertise applies, we tend to use snap judgments effectively in social situations, in complicated situations, and in life-or-death situations that necessitate quick decisions. This is where evolution has played a role in shaping this capacity, contributing to the survival of our species. He who can make effective snap judgments in life-or-death situations is more likely to pass on that very capacity, and tens of thousands of years of such natural selection have refined it.

 

The catch is that there are erroneous thought processes that are artifacts, residuals, or direct consequences of the adaptive unconscious. Issues such as essentialism, pareidolia, and superstition fall into this category, as they have been ushered along with the survival advantage that the adaptive unconscious has conferred. Cognitive errors and biases hamper the effectiveness of the adaptive unconscious because of its inclination toward implicit associations and other accidental error-imposing tendencies. Implicit associations are automatic, non-deliberate pairings we make between concepts, people, things, etc. (e.g., African Americans are athletic, blonds are scatterbrained, gay men are effeminate) as they are folded into memory. This is an intriguing concept, one deserving its own post, but you have to take the Implicit Associations Test, particularly the race test, to get a true sense of this powerful bias. Confirmation bias, self-serving bias, and the numerous other cognitive biases are likewise linked to this influential super-computer.

However, just because we cannot directly and purposefully access this incredible system does not mean we have to bow entirely to its influence. In fact, we can proactively prime this system through active learning. We can be aware of this powerful system and the advantages and disadvantages it confers. We can learn of the errors it inclines us toward and monitor ourselves when it comes to our biases and prejudices. We can impose certain rules of thought when it comes to important issues. I believe that we all should take these very important steps, both to make our intuitive brain more accurate and to buffer its influences in those situations where it is likely to lead us astray.

 

References:

 

Gladwell, M. (2005). ‘Blink: The Power of Thinking Without Thinking.’ New York: Little, Brown and Company.


Historically, morality has not been considered a topic of discussion within the domain of science. Instead, this issue has almost exclusively been within the purview of religion. Increasingly, however, concepts such as moral instinct have gained legitimacy, as discussed by scientists such as Steven Pinker and Jonathan Haidt, who argue that there are neurological factors associated with morality and that natural selection has played a fundamental role in shaping universal instinctual moral truths. The evidence for this position is compelling. The question remains: “Can science offer moral guidance?” In other words, should science play a role in helping us discern what is right or wrong? Or does science have to relinquish issues of morality to other social systems based solely on historical precedent?

 

First of all, a definition of morality has to be accepted. Dictionary.com defines morality as “conformity to the rules of right conduct; moral or virtuous conduct.” The Stanford Encyclopedia of Philosophy says the term is used “descriptively to refer to a code of conduct put forward by a society or some other group, such as a religion, or accepted by an individual for her own behavior; or normatively to refer to a code of conduct that, given specified conditions, would be put forward by all rational persons.” These definitions are devoid of the de facto notion that this concept is values-based. Sam Harris argues, and I believe most people would agree, that human values pertain to circumstances that have the positive effect of enhancing the well-being of conscious beings. As such, it does not seem like a reach to suggest that science can play a role in setting the parameters of morality.

 

Quite simply, it can be suggested that there are certain conditions under which humans are more likely to prosper and other conditions under which they are more likely to falter. For instance, it is known that children raised in a loving environment, where life’s basic needs are provided for, are more likely to grow into happy and productive adults than those raised in hostile and deprived environments. We may intuitively know this, but it is science that provides the evidence for such claims. The profession of psychology devotes considerable resources to this pursuit. As a psychologist myself, I employ evidence-based practices as I endeavor to facilitate the betterment of my clients’ lives. Why is it, then, that we dismiss the influences of science when we discuss morals? At a recent TED conference Sam Harris posed this very question.

 

I suggest, as did Harris, that science is very capable of pointing us, as a society, in the right direction when it comes to morals and values. Russell Blackford wrote in his post on Harris’ speech that “…science can give us information about what individual conduct, moral systems, laws, and so on are likely to lead to such plausible goals for … individual and collective human flourishing, social survival, and reduction of suffering. Any source of information about what will lead to goals such as these has some moral authority.”

 

Harris argues that it boils down to understanding the conditions that lead to human flourishing – and accepting that these conditions are fundamental facts that should serve as the basis of universal morals. He further contends that there are distinctly problematic values within our current human systems that run counter to human flourishing. For example, he discusses the costs of the extremist cultural expectation that women of Islam wear burkas (and the brutal costs of non-compliance). He contrasts this with the unrealistically perfect portrayal of the female body in modern Western cultures. Neither of these circumstances promotes healthy development for young women.

 

He also argues that religions should not be given a pass on the values they promote just because of their religious status. The natural deference given to religion in our “pluralistic” society in fact promotes many clearly harmful practices (including the prohibition of birth control, the denial of civil liberties for homosexual couples, the sanctioned murder of rape victims to preserve family honor, male circumcision and, in some cultures, clitoral circumcision, and the application of prayer in lieu of modern medical services, particularly for ill children). Values rendered in distant Bronze Age cultures and sustained by ideology are far out of touch with the values likely to promote healthy human development today.

 

Individuals suffer, and indeed society as a whole suffers, when these or similar prohibitions and expectations thrive. Science, it seems to me, is far more capable of really looking at the human and societal costs of such “values.” Harris suggests that “Morality is certainly a domain where knowledge and expertise applies.” We need to “bring into our dialogue the issues of these truths of right and wrong.” If we accept that values are drawn from quality-of-life concerns pertaining to the greater good of all, and that there are certain truths about life experiences that either enhance or impinge upon the well-being of conscious beings, then isn’t it the domain of science to draw out these truths?

 

References:

 

Blackford, Russell. 2010. Sam Harris on Science and Morality. Metamagician and the Hellfire Club. http://metamagician3000.blogspot.com/2010/03/sam-harris-on-science-and-morality.html

 

Harris, Sam. 2010. Science can answer moral questions. TED Conference. http://www.ted.com/talks/sam_harris_science_can_show_what_s_right.html


Nature is harsh. This reality is evident, if uncomfortable, to anyone who cares to open their eyes to what goes on around us. Most living creatures struggle to survive, facing either limited resources or predation on a continual basis. In most developed nations many humans escape this reality, but not too long ago even we had to struggle to survive.

 

I remember the reality of this struggle burning into my memory cells as a child while watching nature shows like The Underwater Odyssey of Commander Cousteau and Wild Kingdom. I vividly recall the horror and intrigue I experienced watching cheetahs and lions chasing down and killing antelope or gazelles. To this day I experience a visceral response when I witness this predation carried to its conclusion, with the blood-soaked carnivore licking its chops. Harsh indeed!

 

The moral implications of nature’s harshness have stirred our intellect for quite some time. The issue certainly weighed heavily on Darwin as he developed his theory of evolution by means of natural selection. A pressing question in natural theology asked how a benevolent and loving God could create such a system of pervasive suffering. Stephen Jay Gould, in perhaps his most famous essay, titled ‘Nonmoral Nature,’ addressed this very issue.

 

Gould (1982) provides a historical review of this controversy dating back to the mid-nineteenth century. One particular scholar from that era, William Buckland, gained comfort from the notion that predation is moral because carnivores increase “the aggregate of animal enjoyment” and “diminish that of pain”:

“Death after all, is swift and relatively painless, victims are spared the ravages of decrepitude and senility, and populations do not outrun their food supply to the greater sorrow of all.”

Buckland concluded that predation on a grand scale is moral. But to some, the real challenge to the morality of nature lies outside run-of-the-mill predation. The reproductive cycle of the ichneumon fly epitomizes this challenge.

 

The ichneumon fly is actually a wasp belonging to the superfamily Ichneumonoidea. This diverse group of insects lays its eggs on or in other insects, setting into motion a synchronized chain of events that defies any sense of morality. The endoparasitic ichneumon wasps insert their eggs into the body of their host (e.g., caterpillars, aphids, or spiders). The larvae, upon hatching, carefully ingest their host’s internal organs – first devouring the non-essential tissues and saving the vital organs for last so as to prolong the life of their meal. The ectoparasitic ichneumons sting and paralyze the host before laying eggs on the exterior of the host’s body. The paralysis is permanent, but the host remains alive. Once the eggs hatch, the larvae penetrate the host’s body and again selectively devour the incapacitated but fully alive host little by little, sustaining the live, fresh meal as long as possible.

 

This process is, to say the least, horrifying to contemplate. We humans do not cope well with the notion of parasites on or in our bodies. Think of the circus that ensues when a child comes home from school with head lice. Think of the horror and shame associated with pubic lice. How about scabies or tapeworms? People don’t even like to hear that approximately 10% of our body mass is that of our essential parasitic partners (bacteria). One does not have to use much imagination to shudder at the notion of being slowly devoured from within. ‘Alien’ – need I say more?

 

The ichneumon reproductive contrivance became the supreme challenge to the morality of the designer. Gould wrote of the 19th-century theologians who attempted to resolve this dilemma by anthropomorphizing the mother’s love for her progeny and by downplaying the implications of the plight of the host. They also suggested that this arrangement may be adaptive for humans, as the predation has the effect of minimizing crop loss due to the ravenous appetites of living caterpillars. Finally, they argued that animals are not moral agents and thus must feel little, if any, pain. They suggested that lower life forms and even “primitive” people suffer less than advanced and cultured folk. It was also believed during this Victorian era that consciousness was only within the realm of man. Needless to say, these arguments fail to resolve the dilemma if one contends that there is a “lurking goodness behind everything.” Darwin wrote in an 1856 note to Joseph Hooker:

“What a book a devil’s chaplain might write on the clumsy, wasteful, blundering, low, and horribly cruel works of nature!”

Gould wrote that in the face of this conundrum intellectuals had two options:

  1. Retain the notion “that nature holds moral messages” and that morality involves knowing the ways of nature and doing the opposite. Be not a savage – be not an animal.
  2. Accept that nature is nonmoral, that it is what it is, that morality plays no role in the struggle for existence.

Darwin himself leaned toward the second option, although he struggled with letting go of the notion that the laws of nature might denote some higher purpose.  In his essay, Gould (1982) suggested that:

“Since ichneumons are a detail, and since natural selection is a law regulating details, the answer to the ancient dilemma of why such cruelty (in our terms) exists in nature can only be that there isn’t any answer – and that framing the question ‘in our terms’ is thoroughly inappropriate in a natural world neither made for us nor ruled by us. It just plain happens.”

“It is a strategy that works for ichneumons and that natural selection has programmed into their behavioral repertoire. Caterpillars are not suffering to teach us something; they have simply been outmaneuvered, for now, in the evolutionary game.”

 

I, too, am inclined toward the notion that nature, as it plays out evolution’s dance, is entirely devoid of anything pertaining to morality or evil. We anthropomorphize when we apply these concepts. Even to suggest that nature is cruel is anthropomorphizing. Any true and deep look at the struggle for life that constantly dances in our midst can scarcely lead to any conclusion other than that nature is brutal, harsh, and nonmoral. Should I be wrong about this, I would be reluctant to meet its designer.

 

Reference:

 

Gould, S. J. 1982. ‘Nonmoral Nature.’ Natural History, 91, pp. 19–26.


Essentialism

12 March 2010

Essentialism, within the purview of psychology, is a cognitive bias whose roots form in early childhood (Gelman, 2004). The concept pertains to the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity – its essence (Dawkins, 2009; Hood, 2008). To put it another way:

“people believe that natural category members share some hidden, unobservable, empirically discoverable deep structure or essence, whose possession is necessary and sufficient for category membership” (Jylkkä, Railo, & Haukioja, 2008).

In our early childhood, as we were developing language, essentialism played a crucial role in the expansion of our vocabulary, in the generalization of our knowledge, in discriminating among objects, and in our ability to construct causal explanations (Gelman, 2004). In our struggle to understand the vast and complicated world, our brain forced us to partition things into categories, so we chopped and divided what we surveyed into distinct groupings based on defining characteristics, driven by our internalized understanding of the essence of those groupings. This was initially a very simplistic process (dog, cat, cow), then more complex (mammal, reptile, insect), and then even more sophisticated for those who progressed in the biological sciences (kingdom, phylum, class, order, family, genus, species). This is necessarily a dynamic process, because as we mature and take in increasing complexity we need increased specificity when parsing the world into discrete categories.

 

This pattern of thinking/learning transcends all cultures and is central to our language development (Hood, 2008). Given this central role, it forms the foundation of our thought processes (Hood, 2008; Dawkins, 2009). It is the overgeneralization of this process that gets us into difficulty. Bruce Hood, author of Supersense (2008), convincingly argues that this innate tendency forms the core of our superstitious and supernatural thinking. Richard Dawkins (2009), an evolutionary biologist, suggests that such an inclination explains why people have such great difficulty grasping and accepting the concept of evolution by means of natural selection. I suggest that, like evolution (which necessitates quintessentially anti-essentialist thinking), the concepts of plate tectonics, deep geological time, and deep space-time are also very hard to grasp for the same reasons. We are inclined to think that what we see are constants – that the world as we see it has been eternally so, and so shall it always remain.

 

In biology, essentialism sustains the notion that all animals are clear and distinct, each belonging to a specific species. In fact, as Dawkins suggests: “On the ‘population-thinking’ evolution view, every animal [living form] is linked to every other animal [living form], say rabbit to leopard, by a chain of intermediates, each so similar to the next that every link could in principle mate with its neighbors in the chain and produce fertile offspring” (2009, p. 24). This is true for all conceivable pairings, including bacteria and viruses, giant sequoias and lichen, spiders and flies, cats and dogs, birds and snakes, foxes and chickens, and even humans and turnips.

 

Plato demonstrated essentialist thinking in The Republic with his cave allegory, in which he suggested that the world as we experience it is only a composite of mere shadows tethered to their true and perfect forms (essences) floating about somewhere in the heavens (Dawkins, 2009; Hood, 2008). Many people still believe that there is something more to the physical world than what we see. As Hood (2008) put it, “Humans like to think that special things are unique by virtue of something deep and irreplaceable.” This thinking, and other intuitive errors such as vitalism (that vital life energies cause things to be alive) and holism (that everything is connected by forces), are likely artifacts of our natural involuntary inclinations (Hood, 2008).

 

Essentialism is more than a heuristic, and it has ramifications beyond making us less inclined to believe in evolution or more inclined toward superstition. It is what makes rape more than a physical crime: the defilement and contamination the victim feels is a psychological violation of one’s essential integrity. Genocide is perpetrated by individuals who dehumanize their victims or define them as essentially different and/or contaminated. Essentialism is what makes original works of art more valuable than exact duplicates (Hood, 2008). It also drives the belief systems that sustain homeopathy.

 

It is interesting that this intuitive process plays such an important and fundamental role in our development and sustains both powerfully positive and hugely negative influences on us as adults.  When you get right down to the essence of this concept, you must accept that these inclinations have their roots in the same thinking that makes a preschool child believe that a Mommy can’t be a firefighter (Gelman, 2004).

 

References:

 

Dawkins, R. 2009. The Greatest Show on Earth: The Evidence for Evolution. New York: Free Press.

 

Gelman, S. A. 2004. ‘Psychological Essentialism in Children’, TRENDS in Cognitive Sciences, 8, 404-409.

 

Hood, B. 2008. Supersense: Why We Believe in the Unbelievable. New York: HarperCollins Publishers.

 

Jylkkä, J., Railo, H., & Haukioja, J. 2008. ‘Psychological Essentialism and Semantic Externalism: Evidence for Externalism in Lay Speakers’ Language Use.’ Philosophical Psychology.
