Don’t Believe Everything Your Mind Says

For some reason, my publisher cut the best chapters from The User’s Guide to the Human Mind. What happened to my haggis recipes and the chapter on Bigfoot?

Actually, this essay didn’t get cut; it’s an afterthought. Think of it as a bonus chapter to an odd little book that can use all the help it can get.

In The User’s Guide, I wrote about some of the mind’s most inexplicable and troubling tendencies, and what to do about them. Today I’ll discuss one of the most important things we should know about our minds before attempting to use them in public: we perceive only bits and pieces of the world around us, but our minds fill in the blanks to create the illusion of a seamless experience.

Our perception of the world is like a telegram with every other word missing. We get the gist of things, while our minds fill in the missing pieces. Sometimes our minds get it right; sometimes they are spectacularly wrong.

The following is a little example of the latter, using a quirk of our visual system.

The Disappearing O

In the image below, you’ll notice an X on the left and an O on the right. If you cover your left eye and slowly move your head toward the computer monitor while staring directly at the X, you will notice that at some point the O will disappear. It will then reappear if you shift the position of your head or eyes. (For me, the O disappears when my right eye is level with the X, directly in front of it, and about 8 inches away.)

[Image: optic disc diagram]

You should have noticed that the O was replaced by white space when it vanished. Now, repeat the same process with this image:

[Image: optic disc illusion]

This time, you probably noticed that the O disappeared but the horizontal and vertical lines did not. They may have been squiggly or imperfect, but for the most part the lines should have remained intact while the O vanished. What gives?

In each eye there is a blind spot where the optic nerve joins the retina. Photoreceptors are absent at this junction, called the optic disc, but the mind manages to fill in that blind spot using visual information from the general vicinity.
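As a rough check on that 8-inch figure, here is a bit of back-of-the-envelope geometry (my own sketch, not something from the book): the optic disc sits roughly 15 degrees to the temporal side of the line of sight, so the O lands on the blind spot when the X-to-O separation divided by the viewing distance is about tan(15°). The little Python snippet below does the arithmetic; the 2.2-inch separation is an assumed value for the image, not a measurement.

    import math

    # Back-of-the-envelope geometry for the blind-spot demo.
    # Assumptions (not from the book): the optic disc lies roughly
    # 15 degrees to the temporal side of the line of sight, and the
    # X and O are about 2.2 inches apart on screen.
    blind_spot_angle_deg = 15.0
    x_to_o_inches = 2.2

    # The O falls on the blind spot when the angle it subtends from the
    # fixated X equals the blind-spot angle: separation / distance = tan(angle).
    viewing_distance = x_to_o_inches / math.tan(math.radians(blind_spot_angle_deg))
    print(f"The O should vanish at roughly {viewing_distance:.0f} inches")  # ~8 inches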

In the second half of the demonstration, the mind assumes that straight lines remain straight, and so it gives us the experience of perceiving straight lines when we are not actually seeing straight lines. It’s a pretty impressive computational feat. (It’s also prone to mistakes. If we didn’t know the O was supposed to be there, we would be none the wiser and that O could have snuck up on us.)

The visual system does this sort of thing routinely, and in different ways (Weil & Rees, 2011). In the image below, we can see seven segments of violin string (the upper left segment disappears into the shadow of the fingerboard), but we perceive four continuous strings. Our brains routinely feed us contrived bits of information, making the world seem smooth and predictable.

[Image: visual misinformation]

In truth, the brain relies on precious few bits of information as we navigate the world. We simply cannot perceive all that is around us, and luckily we don’t need to. In the above image, we see only a few features, but we know it is a violin. Or an evil robot. But probably a violin.

Even if we could absorb more information, we couldn’t process it. Our brains routinely encounter information bottlenecks as we’re seeing, hearing, and interacting with the world. They make snap decisions about which pieces of information to process and which to discard (Tombu et al., 2011). The copious gaps get filled in with good guesses.

Relating On Autopilot

It’s an imperfect analogy, but a similar process takes place in relationships. We form views of others that seem continuous, predictable, and reliable. That makes us prone to certain errors, just as our vision is prone to errors.

For example, have you ever assumed that someone was upset with you because their behavior seemed cold or distant, only to discover later that their behavior had nothing to do with you?

We assign personalities to people that help us understand and predict their behavior. Cognitive psychologists call these mental representations schemas. When a person’s behavior violates our schema, the mind is always at the ready with a quick and easy explanation: Uncle Marty forgot my birthday. Clearly he is angry with me.

From a mind’s point of view, the most sensible explanation is the one that ensures our safety. If a mind assumes that Uncle Marty is angry, then we will feel compelled to respond, perhaps by repairing the relationship or distancing ourselves from it. …Gosh, I better figure out why Uncle Marty is angry. Or, …Screw Uncle Marty. I never liked him anyway.

Even though automatic interpretations can be inaccurate, like the missing O, there is something beautiful about them. They are designed to minimize our exposure to rejection or emotional injury by compelling us to respond to problems. Whichever direction the mind pushes us, we end up less vulnerable, in theory. (Of course we always have the option of ignoring what our minds tell us.)

No matter how counterproductive, annoying, or just plain wrong the mind may be, it is almost always watching out for our safety. That is the beauty of minds – and the trouble with them. They are often trying to save our lives, even when our lives aren’t in danger.

Filling in the blanks is just one of the mind’s survival-driven, error-prone functions. In The User’s Guide to the Human Mind, I discuss other ingrained mental functions that serve to keep us safe:

  • Confusing the present for the past. When we become physically or emotionally injured, the mind will forever be on the lookout for it to happen again. It will even find ways to recreate the injury. There’s a clear survival value in this. When the mind errs on the side of over-preparation, we’re less likely to be caught with our pants down.
  • Double standards. The mind often has higher standards for ourselves than for others, even with things like emotions: it’s OK if others feel this way, but I shouldn’t. This particular double standard makes us forgiving of other people while keeping our own behavior in check so that we don’t become too annoying and risk being ostracized. Other kinds of double standards function similarly to keep us safe.
  • Pessimistic thinking. We tend to think of pessimism as a bad quality, but in fact pessimistic thinking is one of the mind’s most powerful armaments against danger. Think of it as one of the mind’s error-management systems. It helps us err on the side of caution so that we don’t get hurt. The trick is in knowing how to use pessimism correctly.

Most of what our minds do behind the scenes is geared toward survival. Even things like anxiety, depression, and addiction stem from mental adaptations to a dangerous world. When these minds of ours seem to be beating up on us, we are usually better served by exercising compassion for them rather than cursing them and trying to change their nature.

How to Live With a Human Mind

You may be saying, “Fine, Dr. Smartypants, my mind is watching out for me. What the heck am I supposed to do about it? Because it sure doesn’t feel like my mind is on my side.” The answer is fairly simple on paper, but it requires a bit of practice.

The first step is simply to build awareness of what the mind is doing, and why. The second step is to recognize the choices we have in responding to our minds. We don’t always have to do what they say.

For example, most minds have a lot to say about public speaking. Everyone will judge you. Your whole career is riding on this speech. They will all notice your [fill in your greatest insecurity]. Rarely is any of it true, but the mind can certainly convince us that it is. We’re at a disadvantage when we get sucked into powerful thoughts and feelings without realizing it.

Noticing the mind’s activities is not merely an academic exercise. Observing our internal processes with distance and dispassion is a powerful step toward managing anxiety, depression, destructive relationship patterns, and other problems.

Maybe the mind is a bit like an anchor being dragged around behind our boat. If we don’t know why it’s there or what it is for, we will probably come to resent it. We’ll see that anchor as dead weight, and we’ll do our best to navigate around it.

But that anchor can become something cherished when we know why it exists and how to use it. It can keep us steady and safe when necessary, or we can carry it along without impediment when the seas are calm.

Well, that is a little taste of The User’s Guide to the Human Mind. Like any good book, mine can be summed up in one sentence: the government is engaged in a massive Bigfoot coverup.

Wait, that’s the wrong thesis. Here’s a better one: the mind is almost always looking out for us, and we don’t have to believe everything it says.

Hungry for more? You can read the book’s introduction here, and drop me a line if you would like my haggis recipe.

– IS

References

Tombu, M. N., Asplund, C. L., Dux, P. E., Godwin, D., Martin, J. W., & Marois, R. (2011). A unified attentional bottleneck in the human brain. Proceedings of the National Academy of Sciences of the United States of America, 108(33), 13426-13431.

Weil, R. S., & Rees, G. (2011). A new taxonomy for perceptual filling-in. Brain Research Reviews, 67, 40-45.
