My favorite example is the story of why human-resource people decided to discourage people from “using knowledge in the workplace.” Huh? I really did not understand this idea at first.

Then I understood: they wanted their company’s employees to use “skills” instead of “knowledge.” That was because research had shown that, in high-risk industries, people had to rely on trained skills to make rapid decisions. They couldn’t simply rely on knowledge, because applying it would take too long (Rasmussen, 1983). But, somehow, that finding turned into a blanket, company-wide, check-your-brains-at-the-door mentality in a lot of places.

This is a clear example of how a data set gathered at companies in high-risk industries such as aviation, nuclear-power generation, and chemical processing ended up being thoughtlessly applied to companies in ordinary manufacturing operations (and probably to restaurants, banks, and other consumer-focused businesses as well). This was another discussion from the Yahoo Root Cause Forum. As soon as someone posted that applying knowledge was bad, I was suspicious. One of the participants gave me the reference to the Rasmussen article, but it still took me a while to put my finger on the critical-thinking error, even though I had been studying critical thinking for over ten years at the time. My sudden insight? NOT all industries are high-risk.

Humans are pattern-seeking beings; we are known to see patterns where none exist. My office used to be on a major downtown street in Grand Rapids, and people would stop in for various reasons, including curiosity. One day, a middle-aged gentleman stopped in at my engineering test lab with a giant enlargement of a high-resolution photo he had taken from an airplane, which he said proved that Satan was coming soon. I tried to explain that his interpretation of the patterns he saw in the winter view of harvested cornfields was not the only possible one, but I might as well have been talking to a wall.

Seeing patterns in events is generally more difficult than seeing patterns in visual data, although jealous people seem to see objectionable patterns in their partners’ behavior quite readily. The emotional content draws our attention, but that doesn’t mean the data are being properly interpreted in those cases either.

Historians help us see patterns in the frame of large-scale human affairs. A non-profit organization, commenting on the history of the civil-rights movement, recently noted in its newsletter that multiple politicians have held rallies in cities where minorities suffered significant and newsworthy violence. The author implied that choosing such a location is a symbolic way for a politician to signal approval of the violence without ever actually stating it.

Of course, one of the axioms of critical thinking is that correlation does not prove causation, and silent messages can be misinterpreted just as verbal messages can. Going back to the historical facts of the politicians’ choice of rally locations, however, I certainly would not have made that connection on my own, because I did not have the historical knowledge. Furthermore, I try to think well of people even when I don’t like their views. We’re all human, and I don’t like being the suspicious type. But that is my frame of reference, and sometimes people do mean harm to others.

Critical thinking relies on all the other types of thinking we do, from the most basic information gathering to meditation on the most complex situation we can imagine. The entire court-of-appeals system, for example, exists to review whether the lower courts properly applied legal precedent. In other words, did they choose the right frame of reference to judge the case?