
The information age has brought with it the perils of having a fallible mind. Pseudosciences, scams, hoaxes, frauds, and misinformation form the rather sinister undercurrent to this new information tide. To cope, a loosely concerted effort of scientists, critical thinkers, and skeptics has taken up the charge of promoting good science in society. New findings in psychology and neuroscience are now illuminating our poor grasp of probability and risk, our fragile memories, and the factors that allow weird beliefs to endure. This “Second Enlightenment” brings with it a new wave of reason, a new beginning for scientific advocacy.


Astrology, Bigfoot, and “faith healers” come to mind when thinking about pseudoscience. These oddities pervade popular culture, influence television shows and movies, and speak to the human fascination with the weird. For almost as long as pseudoscience has been around, there has been a loosely organized effort to debunk it. Sometimes it came from magicians and escape artists like Harry Houdini or James “The Amazing” Randi. Other times scientists themselves used their expertise to separate bunk from truth. And in the digital age, bloggers and the lay public have taken up this turf war, battling for sway over our believing brains.

Like any movement, the skeptical movement started out small, composed of a few prominent authors, scientists, and magicians. Classic pseudosciences were easily dispelled. Astrology fell to the “Forer effect,” the finding that people rate vague, general statements that could apply to almost anyone as highly accurate descriptions of themselves. Bigfoot remained elusive as experts asked for evidence, pointing out that the current state of knowledge about the creature was indistinguishable from its nonexistence. “Faith healers” met their match when magicians who knew how the tricks were done exposed them on national television.

As the skeptical community grew, so did the scope of the skeptical eye. No longer did skepticism deal exclusively with the weird; it also took on medical claims, like whether homeopathy works, and biological claims, like whether the cell is too complicated to have arisen by natural selection. But with this growth came a backlash from the purveyors of pseudoscience, who began targeting new, untapped fields and gained significant influence over emerging areas of medicine and biology. This diversification increased pseudoscience’s reach, and skeptics could no longer advance the scientific position with evidence alone. The strategy had to change to get to the heart of pseudoscientific, irrational, and just plain weird beliefs. It was time to deal with human cognition.


The Enlightenment was the birth of modern scientific values in the West. It represented rationality, open inquiry, and the scientific method. It was critical of the unsupported ideas advanced by tradition and authority, and it undermined their power. The Enlightenment gave us a new way to think about the world, and it was quite obviously a successful endeavor. However, as pseudoscience progressed beyond the reach of a naturally cautious body of scientists, rationalists realized that we needed not only to think properly, but to think about thinking properly. The “Second Enlightenment,” as it has been called, came with advances in psychology and neuroscience, and gave skeptics new tools to refuel the candle in the dark.

Humans are naturally curious, social primates who use their adapted intellects to make useful judgments about the world. In our ancestral environment, these biased judgments conferred a significant survival advantage. The classic thought experiment highlighting these rudimentary judgments involves an ancestral human foraging on the plains of Africa. If she were to hear a rustling in the tall grass, would it be more beneficial to assume that a lion was approaching, or that the wind had simply picked up? Even if nothing is there, it is still better for her to assume that a lion may be creeping up on her, committing a Type I error, or false positive. The cost of fleeing from an imaginary lion is far less than that of a Type II error, or false negative: dismissing a possible threat while the lion readies its claws. Because of these ancient selection pressures, humans today are equipped with highly evolved pattern-seeking brains. We see patterns everywhere: the conglomeration of shadows in the corner becomes a monster at the foot of the bed; the robbery during a full moon becomes evidence of lunar influence. Clearly, we have to think about our own thinking to overcome these fundamental biases and properly investigate the world. But just how bad are we?
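The asymmetry in the thought experiment above can be sketched as a simple expected-cost comparison. The numbers below are purely illustrative, chosen only to show why "always assume a lion" can be the rational default:

```python
# Illustrative expected-cost comparison for the "rustle in the grass" example.
# All numbers are hypothetical; only their relative sizes matter.

P_LION = 0.01               # a rustle is rarely an actual lion
COST_FALSE_POSITIVE = 1     # fleeing from wind: a little wasted energy
COST_FALSE_NEGATIVE = 1000  # ignoring a real lion: potentially fatal

# Expected cost of always assuming "lion": we pay the small false-positive
# cost whenever the rustle turns out to be nothing.
assume_lion = (1 - P_LION) * COST_FALSE_POSITIVE

# Expected cost of always assuming "wind": we risk the huge false-negative
# cost whenever the rustle really is a lion.
assume_wind = P_LION * COST_FALSE_NEGATIVE

print(f"assume lion: {assume_lion:.2f}")  # 0.99
print(f"assume wind: {assume_wind:.2f}")  # 10.00
```

Even with lions being rare, the lopsided costs make the jumpy, pattern-seeking strategy the cheaper one on average, which is the selection pressure the paragraph describes.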

The Second Enlightenment has brought with it a number of findings about human cognition that have helped scientists, skeptics, and rationalists better counter the tide of unreason. We have quite clearly discovered that we are susceptible to all manner of cognitive foibles, whether motivated reasoning, a misunderstanding of statistics and probability, an overreliance on fragile memory, or a failure to effectively estimate risk.

One of the most robust findings in psychology and communication research is that our default mode of information processing is biased toward our own ends. This confirmation bias has us attending to, selecting, or valuing information that is congruent with our preconceptions, beliefs, values, and worldviews, while ignoring, dismissing, or outright discrediting information that contradicts them. Our brains actively filter out disconfirming information, creating a potentially dangerous echo chamber where weird beliefs can run wild.

Not only do we pick and choose information that best suits us, we are not even privy to all the information available to us. Psychologists have found that our attention is selective: our brains simply cannot process the flood of information hitting our senses at any one moment, so they focus on some of it and blur out the rest, as the famous “invisible gorilla” experiment by Daniel Simons and Christopher Chabris demonstrates. As a consequence of this spotlight of attention, clever psychologists have also shown that you can even swap out the person someone is talking to without them noticing. This “change blindness” again illustrates that we do not notice all we think we do. A large portion of possible everyday experience is unavailable to us. This inspection of our cognition should give us pause when hearing eyewitness testimony about any weird belief; extraordinary claims require extraordinary evidence.

But it gets worse. We fail to understand risk. This influences where we place our fears and how irrational beliefs develop. Though driving to work is likely the most dangerous thing we do every day, we are more afraid of flying. Though coal-fired power plants contribute to preventable deaths to an extent that should appall us, we argue over the utility of (far safer) nuclear power. A misunderstanding of risk undermines our ability to have sensible discussions about vaccine safety, solutions to climate change, and other issues on which the skeptical and scientific communities need to stand firm against the tide of pseudoscience.
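The driving-versus-flying comparison can be made concrete with a back-of-the-envelope expected-risk calculation. The per-mile fatality rates below are rough placeholders of approximately the right order of magnitude, not authoritative statistics:

```python
# Back-of-the-envelope risk comparison for a 1,000-mile trip.
# The per-mile fatality rates are hypothetical placeholders of roughly
# the right order of magnitude, not authoritative figures.

DRIVE_DEATHS_PER_MILE = 1.3e-8  # on the order of 1 death per 100 million vehicle-miles
FLY_DEATHS_PER_MILE = 5e-11     # commercial aviation is far safer per mile

trip_miles = 1_000
drive_risk = trip_miles * DRIVE_DEATHS_PER_MILE
fly_risk = trip_miles * FLY_DEATHS_PER_MILE

print(f"driving risk is ~{drive_risk / fly_risk:.0f}x the flying risk")
```

However the exact rates are pinned down, the per-mile gap spans orders of magnitude, yet our felt fear runs in the opposite direction, which is precisely the mismatch between perceived and actual risk described above.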

Even more fundamentally, what we think we have seen, heard, said, or experienced is based on fallible memory. Memory expert Elizabeth Loftus has found that simply changing one word in a question can change a memory. In one experiment she had participants watch a film of a car crash and then asked them about what they saw. They were asked either “how fast were the cars going when they hit each other?” or “how fast were the cars going when they smashed into each other?” One week later the participants returned and were asked some memory questions, including whether there was broken glass at the scene of the crash. Those who had heard the word “smashed” were more than twice as likely to recall seeing broken glass as those who had heard “hit.” (There was no broken glass at the scene.) Every time we recall something, our memory changes, contorting to fit the narrative our brains construct to make sense of the world. Even the most vivid “flashbulb” and “I remember exactly where I was when…” memories evolve, twist, and eventually disintegrate into hopeless inaccuracy. But because we do not subjectively feel this happening, it is again wise not to trust everything we think.

Lastly, and at the heart of what the Second Enlightenment has provided scientists and skeptics, is the finding that reality, as we subjectively know it, is a construction of the brain. Many a pseudoscientific or weird belief rests on the idea that our senses, and their subsequent signals in the brain, are faithful reporters of reality. But neuroscience has shown that this cannot be the case. For example, try touching your finger to the tip of your nose. It feels simultaneous, but we know that it is not: the signal from your nose reaches your brain before the signal from your finger. It feels simultaneous because the brain is effectively “buffering” reality, allowing for a coherent and presentable world. In this case, the feeling of simultaneity is a construction of the brain, not objective reality. As another example, recall when someone last spoke to you. As we hear someone speak, there appear to be pauses in the sentences that allow us to distinguish individual words. However, record any sentence of normal structure and speed and you will find that it is one continuous sound. The brain inserts the pauses between words so that we can make sense of this singular stream. Again, the brain is constructing reality before it becomes conscious experience. Weird beliefs no longer have shelter in obscure “I saw it with my own eyes!” or “It worked for me!” arguments. From what we now know about the brain and human cognition, far more likely natural explanations present themselves and call into question our ability to separate science from pseudoscience.


The new skeptical paradigm of thinking about thinking now colors the steps that skeptics and scientists take to combat bad science, pseudoscience, and the weird. Gone are the days when UFO enthusiasts were dismissed as “wing nuts”; confirmation bias, inaccurate perception, and other foibles point toward a more understanding position. Skeptics recognize the humanity of weird beliefs, and communicate about them more effectively as a result. Further findings of the Second Enlightenment have shown that even “debunking” has a psychology behind it. “Backfire effects,” once the bane of a skeptic’s existence, are now understood as landmines to avoid. For example, acknowledging that something is a myth and then arguing against it leads people to remember the myth better than the correction. As another example, psychologists have found that there is a threshold to how much information people can handle: to use arbitrary numbers, six reasons why evolution is true can be more convincing than twelve. And once people’s heels are dug into the ground of some belief, contradictory evidence can actually increase their confidence in the critiqued belief, no matter how well the evidence discredits it. Thinking about thinking has put these nuances on the table and has humanized the intricacies of belief. Skeptics, freethinkers, and rationalists now use them to advance science in society.

Pseudoscience is a morphing monster of undue credulity, an “unsinkable rubber duck,” as some skeptics have called it. The reality is that we will always be burdened with the irrational and the unscientific. Believing in weird things isn’t unnatural; rather, it is an extension of a highly adapted mind. But to move accurately through today’s world, a healthy scientific skepticism is warranted. With New Age beliefs making a resurgence, the anti-vaccination movement gaining strength, creationist bills passing US state legislatures, promises of personal genomics spawning new and dubious treatments, and health gurus sprinkling the word “quantum” on everything like an over-used spice, skepticism should be, now more than ever, a liberally applied tool. For the critical thinker, discovering and understanding our cognitive foundations amounts to a new beginning, a fresh way to look at the world. Learning how to think about thinking, learning how to navigate the perils of human cognition, is the way through.

Originally posted on Nature Education’s Student Voices Blog