Michael Shermer. Shermer is precisely the sort of atheist I aspire to be: calm, level-headed, friendly, patient, and skeptical without being cynical. (To see just how incredibly patient Shermer is, watch this.) I honestly can't remember how I first encountered his work. It may have been that I simply looked him up after hearing that he was a famous skeptic who used to be a Christian. I have tended to feel a strong bond with former Christians, since only they truly "get" what it is like to embrace faith and then to change your mind completely.
Anyway, it's neither here nor there. I will seek to remedy the situation over these next few posts by discussing an admittedly tiny fragment of Shermer's work. More specifically, I would like to examine the concepts of "patternicity" and "agenticity". When I am done I will do my best to tie it all together, in terms of how these two ideas have influenced my own thinking on spiritual matters. For simplicity's sake, all of my quotes will come from Shermer's excellent book "The Believing Brain" (where he covers patternicity and agenticity in back-to-back chapters).
Let's begin with a definition: patternicity="the tendency to find meaningful patterns in both meaningful and meaningless noise".
We are pattern-seeking primates; descendants of those who were most successful at what's sometimes called "association learning". In other words, we see patterns. Lots and lots and lots of patterns. And we see them everywhere. It's just part of how we learn. As Shermer says, "we can no more eliminate superstitious learning than we can eliminate all learning". Not all of these perceived patterns are real, however, and there are two related types of errors:
Type I Error (false positive)--believing a pattern is real when it is not (finding a nonexistent pattern)
Type II Error (false negative)--not believing a pattern is real when it is (not recognizing a real pattern)
We tend to make a large number of type I errors. Why? The answer lies in evolution. Let's suppose that you are a hominid, taking a walk, a few million years ago. You hear a rustle in the grass. Is it a dangerous predator? It could be. Then again, maybe it's just the wind. If you determine it's a dangerous predator, but it's not, no real harm is done (a type I error). However, if you assume it to be the wind, and it turns out to be a dangerous predator, you're lunch (a type II error). Natural selection has favored strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction. To put it a different way, people believe weird things because of our evolved need to believe non-weird things.
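The asymmetry Shermer describes can be made concrete with a little arithmetic. Here is a minimal sketch of the expected-cost comparison; the specific payoff numbers and the 1% predator rate are my own illustrative assumptions, not figures from the book:

```python
# Toy expected-cost comparison for the "rustle in the grass" decision.
# All numbers below are illustrative assumptions, not data from Shermer.

p_predator = 0.01            # fraction of rustles that really are predators (rare)
cost_false_alarm = 1         # fleeing from wind: a little wasted energy (Type I error)
cost_missed_predator = 1000  # ignoring a real predator: possibly fatal (Type II error)

# Strategy A: always assume predator (flee every rustle)
# You only ever pay the false-alarm cost, on the rustles that were just wind.
cost_assume_predator = (1 - p_predator) * cost_false_alarm

# Strategy B: always assume wind (never flee)
# You only ever pay the missed-predator cost, on the rustles that were real.
cost_assume_wind = p_predator * cost_missed_predator

print(f"always flee: expected cost {cost_assume_predator:.2f} per rustle")
print(f"never flee:  expected cost {cost_assume_wind:.2f} per rustle")
```

Even with predators behind only 1% of rustles, fleeing everything is far cheaper on average than ignoring everything, which is exactly the selection pressure Shermer is pointing at.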
With a sufficient amount of information (or time) we could surely determine more accurately the true cause of that rustle in the grass. Right? True enough. "The problem is that assessing the difference between a Type I and Type II error is highly problematic--especially in the split second timing that often determined the difference between life and death in our ancestral environments--so the default position is to assume that all patterns are real; that is, assume that all rustles in the grass are dangerous predators and not the wind." (bolding mine)
Have you ever noticed that your first gut reaction to an unexpected sound in the middle of the night is to think that an intruder has entered your home? It is usually only after you take a moment, allowing your rational brain to kick in, that you realize the chances of it being something else (probably much less scary) are significantly higher. How many people hear weird sounds in the middle of the night, at least from time to time? (Millions.) And how many of those weird sounds actually turn out to be dangerous intruders? (By comparison, only a tiny fraction.) Because of evolution, our default setting is to make a type I error (false positive). This is the same reason that children so often see "faces" in their closet (when it's really just the clothes or a shadow). We have evolved to see patterns, many of which are not actually there.
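The same point can be framed as a base-rate problem: even after you hear the noise, an intruder remains very unlikely. A quick sketch using Bayes' theorem; every probability here is a made-up illustration, not a real statistic:

```python
# Base-rate arithmetic for the "noise in the night" scenario.
# All probabilities are made-up illustrations, not real statistics.

p_intruder = 0.0001            # prior: chance any given night involves an intruder
p_noise_given_intruder = 0.9   # intruders usually make some noise
p_noise_given_benign = 0.3     # houses creak on their own fairly often

# Bayes' theorem: P(intruder | noise)
p_noise = (p_intruder * p_noise_given_intruder
           + (1 - p_intruder) * p_noise_given_benign)
p_intruder_given_noise = p_intruder * p_noise_given_intruder / p_noise

print(f"chance the noise is an intruder: {p_intruder_given_noise:.4%}")
```

With these numbers the posterior works out to a small fraction of one percent; yet our gut still jumps straight to the type I (false positive) reaction, just as the evolutionary account predicts.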
This tendency to assume initially that all perceived patterns are real has been scientifically demonstrated many times over (in both human and non-human subjects). For example, in one experiment, pigeons were taught to peck at two keys to receive grain through a food hopper. The researchers discovered that if they delivered the food reinforcement randomly, whatever the pigeon happened to be doing just before the delivery of the food would be repeated the next time. The results, as you can imagine, were rather humorous. Before pecking the key some pigeons would spin around once to the left, others would turn counter-clockwise or hop side to side--whatever they thought would bring the food. None of this had the slightest impact on the food delivery schedule, of course, which was entirely random. These odd behaviors were almost always repeated in the same part of the cage, and they were generally oriented toward some feature of the cage. The birds in this experiment were developing a superstition; or, as Shermer calls it, pigeon patternicity.
Or how about (human) babies? "When an infant observes the cooing happy face of its mother or father, the face acts as a sign stimulus that initiates the innate releasing mechanism in its brain to trigger the fixed action pattern of smiling back, thereby setting up a symphony of parent-child staring and cooing and smiling--and bonding attachment. It need not even be a real face. Two black dots on a cardboard cutout elicit a smile in infants, although one dot does not, indicating that the newborn brain is preconditioned by evolution to look for and find the simplest pattern of a face by two to four data points: two eyes, a nose, and a mouth, which may even be represented as two dots, a vertical line, and a horizontal line. Facial-recognition software was built into our brains by evolution because of the importance of the face in establishing and maintaining relationships, reading emotions, and determining trust in social interactions." (bolding mine)
There are dozens of other examples, such as "...the human enjoyment of artificial sweeteners as well as with our modern problem of obesity. In the natural environment, (A) sweet and rich foods are strongly associated with (B) nutritious and rare. Therefore, we gravitate to any and all foods that are sweet and rich, and because they were once rare we have no satiation network in the brain that tells us to shut off the hunger mechanism, so we eat as much as we can of them. On the other end of the taste spectrum, there is the well-known taste aversion effect--one-trial learning--where the pairing of a food or drink with severe nausea and vomiting often results in a long-term aversion for that food or drink."
I think you get the idea.
It's also worth noting that research demonstrates the propensity to find patterns goes up when people feel a lack of control. Shermer uses a sports analogy to illustrate the point. Why is it that, in baseball, superstitious behaviors and beliefs always seem to show up in batters, but never (or rarely) in fielders? Perhaps it is because fielders are successful 90% to 95% of the time, while even the best batters fail 7 out of 10 times. The patternicity ("if I wear the same pair of socks every time I go up to bat I will be more successful") is the batter's way of trying to regain control over the situation. "Lacking control is highly aversive, and one fundamental way we can bolster our sense of control is to understand what's going on. So we instinctively seek out patterns to regain control--even if those patterns are illusory."
At this point you might be asking yourself, "What's the point? Where is this Respectful Atheist guy going with all of this?" Somewhere, I promise. For now, though, I simply want to establish that patternicity is real. It ties in directly with what I will cover next time: "agenticity". As promised, when I am done outlining both concepts, I will connect all of the dots and offer some additional personal reflections.