Why is observing yourself so important to the exercise of your intelligence? Because much of what gets in the way of thinking effectively and powerfully is not a lack of ability or brainpower, but the interference of one's own reactive mind. Let's look at some examples.

John opens a book about moral philosophy, excited to read it. But what has him excited is not exactly the prospect of discovering new ideas. What he's really looking forward to is confirmation of his own beliefs, and the discovery of new arguments to defend them and push them onto others.

This is common, of course. After all, many of us buy books based on ideas we already agree with, don't we? Capitalists buy books about the virtues of free markets, creationists buy books about the flaws in evolutionary theory, and environmentalists buy books about the damage we're causing to the planet. By itself, this tendency is not harmful, and certainly not surprising. It limits our thinking, though, when we do not recognize it in ourselves and therefore make no allowance for the bias it creates.

This isn't just about books, of course. In fact, we "buy" ideas all the time from the intellectual environment around us. We "pay" for them by investing our time, thought, and ego in them. But we don't see how often we are interested only in those ideas that fit our existing way of thinking. Because we lack that awareness, which comes from self-observation, we pass over facts and ideas that might lead to a better understanding.

Seeing our own biases

Suppose a man has a strong belief that "a person is responsible for his or her actions." As a result of this belief and the philosophy that accompanies it, he not only dismisses certain ideas but finds them offensive. For example, when he hears about a study showing that most criminals have a deficiency of copper in their bodies, he is annoyed and assumes it is an attack on the idea of personal responsibility. "They're just helping people excuse their bad behavior," he says.

Now, if he's not in the habit of self-observation, he won't notice that this isn't reasoning but a reaction. It may even seem perfectly clear to him that such science is dangerous and ill-intentioned. On the other hand, what if he does watch himself and catches the reactive nature of his thinking? Then he can question what he believes, or find a way to fit the new facts into his thinking.

With this and other scientific information about the physical and psychological "causes" of behavior, he might come to a better understanding. He might even decide that people generally aren't responsible for their actions, but that they can be if they so choose. Upon having this thought, he might notice that his reactive mind is saying, "But we have to hold them responsible, or people will all become criminals." This, he sees, is the fear that supports his prior belief. Having seen that, he can think, "No, they just have to be locked up if they are dangerous to others. That doesn't require a belief in personal guilt, nor does it mean that others will become criminals if we don't call them 'sinners.'"

That is one possibility. The other is that, upon seeing that there are things which encourage people to commit crimes, he still believes that people are responsible for their actions, but now recognizes that context is not irrelevant. We are all weaker at some moments than others, after all, for all sorts of reasons, and recognizing this is not a denial of responsibility for our actions. Perhaps correcting nutritional deficiencies, treating psychological problems, and providing a better environment will lead to far fewer people choosing to do bad things.

However he changes his thinking or broadens his understanding, it happens because of self-observation. The resulting self-awareness lets him see his biases and work past them.

This isn't just about philosophical examples like those given, either. Simple pride in one's theory about biology, economics, or family life can blind one to better ideas if it is not recognized as a limiting force. Being afraid to admit ignorance is another mind-killer. There are all sorts of other things going on inside us, too. One's own unconscious mind throws many obstacles in the path of clear thought, and self-observation is what allows us to clear the way for better brainpower.

At the highest levels, better brainpower cannot be separated from greater self-awareness. How do you become more self-aware? Some people turn to meditation, and this is a good start. It helps one deal with the "monkey mind" - the way the mind jumps from thought to thought like a restless monkey leaping from branch to branch. The idea is to "tame" that busy barrage of thoughts.

Meditative practices help you observe things more clearly, concentrate better, and perhaps think more efficiently. Efficient doesn't necessarily mean effective, however. A perfectly tuned car can still take you to the wrong destination, right? Self-awareness, then, starts with this meditative observation of the "chatter" in your mind, but for more powerful thinking you have to look deeper, to see the content of those thoughts and to identify the patterns and biases at work there, often unnoticed by you.

With that in mind, here are some more of the common biases and other patterns of thought that can get in the way of better brainpower.

Effective thinking - Three stumbling blocks

Source biases

We're not always aware that we're under the influence of a bias against the source of an idea. For example, even a very rational scientist may discount another's theories without realizing that his judgment has as much to do with his dislike of the person as with the merits of the ideas. If you doubt this, you can demonstrate it with an experiment. Tell 100 people, "John Wayne said that citizens have a duty to fight for their country when their government asks them to. Do you agree?" Then make the same statement to another 100 people, but start with "Adolf Hitler said..." See how many agree with each version.

You can guess the results without doing the work. We know from experience that where information, ideas, or even evidence comes from helps determine how people perceive it - even when there is no rational reason to differentiate. (It is rational, of course, to be more skeptical of information from a source with a justified reputation for unreliability.) It is common to note this bias, but it is also common to assume "I'm not like that." Of course, we are all subject to this ordinary pattern of thought.

To get past this bias, then, we have to become aware of it in ourselves. We can start by asking questions like, "How do I feel about this source, and could that be affecting my thinking?" You might hate the slant of a particular news channel, for example, and so discount the importance of something it reports. Upon reflection, you might realize that despite the political slant of its reporting, it never invents facts, and that if you had seen the same story on a different station you would have thought about it differently.

Here's another approach: when you feel a strong negative or positive reaction to some idea, evidence, or information, imagine for a moment what your reaction would have been had you heard the same thing from a different source. In fact, imagine several sources and pay attention to what your mind does. We've probably all heard a friend defend the ideas of a favorite politician, even though we know he would denounce the same ideas if they came from someone he didn't like. To see if, and to what extent, this happens in you, do this little mental exercise as honestly as possible.

Philosophical biases

We all have some fundamental ideas about various aspects of life. I hesitate to call these a "philosophy," because they are not always consistent or consciously formulated. In fact, many people's unconscious philosophical perspectives contradict their conscious beliefs. For example, a man might openly express capitalistic beliefs and yet still feel that business is somehow "dirty," perhaps because of childhood experiences.

Whether conscious or not, our philosophical "leanings" can affect our ability to think clearly and rationally about things. For example, suppose a woman has a basic feeling or philosophy that hard work toward goals is what makes us happy. Then she reads about a study that found that those who could not quickly name three specific personal goals were actually happier than those who could. What might her response be?

I made that study up, by the way. If it were true, though, it would be fascinating to take it further and see why people without definite goals were happier. Given this woman's basic philosophical bias, however, it seems likely that she would start asking questions like, "How did they measure happiness?" and "Why did they have to name their goals quickly?" These are valid questions, but here they are probably a reactive challenge to the validity of the study rather than an attempt to get at the truth. To put it another way, the "truth" she wants to get at is getting in the way of honestly looking at the evidence and learning something new.

Before we consciously see the logical implications of an idea, our unconscious mind has already worked them out, and it causes an uneasy feeling if they contradict other important beliefs. We then react according to this "processing," and we may even feel obligated to defend our response - that's where rationalization enters. If we asked this woman why she so quickly attacked the study rather than exploring the fascinating implications of its findings, she might say, "Because there is so much bad science out there." True, perhaps, but we would have to wonder whether she does the same with studies that confirm her philosophical leanings.

With self-observation we develop more self-awareness. How do you apply that here? When you react quickly to something, don't allow yourself to invent "reasons" to defend your reaction. Instead, look for causes that may be hiding just below the surface. What important beliefs do you have that might be challenged or confirmed by this new information or idea?

Social biases

If we think a certain way because all of our associates and friends do, the real cause may be fear of being ostracized from the group. Imagine a scientist who finds evidence that thoughts have a physical aspect. Perhaps his mind races ahead to the ridicule he'll face from his peers if he mentions such a radical hypothesis, so he ignores what he found and stops thinking about it.

This bias is commonly used against us. A person starts a statement with "We all know that...," and whether what "we all know" is true or not, we have been warned that we will be seen as an "outsider" if we disagree. That is less a statement of the obvious than an argument from intimidation, though the bias is often more subtle than this. For example, there is an unspoken agreement among many people that they should never point out that affirmative action (purposely hiring minorities) fits the exact definition of discriminatory hiring. I propose that such social "correctness" not only intimidates people into silence, but also prevents clear thinking (and certainly rational discussion) on some topics.

There may be good reasons to limit what you say, but why limit what you think about? Ask yourself what uncomfortable thoughts you've entertained at times. Then try a simple experiment: imagine that you lived in a place where everyone agreed with those ideas. Would you feel more comfortable exploring them? If so, you may be biased and limited by the beliefs of those around you.