The Knowledge Illusion – Philip Fernbach and Steven Sloman on ignorance and irrationality

Knowing less than we think we do

While people are capable of amazing feats, we’re also prone to error, hubris, irrationality and ignorance. Authors like Annie Duke (Thinking in Bets), Daniel Kahneman (Thinking, Fast and Slow) and Robert Cialdini (Influence) have discussed this at length. Cognitive scientists Philip Fernbach and Steven Sloman offer an interesting extension and complement to the topic in their book The Knowledge Illusion. Where others focus more on the thinking errors of the individual, Fernbach and Sloman look at the nature and community of knowledge. This offers another framework for thinking about our ignorance, irrationality and decision making. In this post, we’ll look at a few lessons I took from their book.

The community of knowledge

Fernbach and Sloman suggest that we’re successful as individuals and communities for several reasons. Firstly, we live with knowledge around us. This includes our bodies, the environment and other people. Secondly, individuals within a community can divide cognitive labour and specialise. Thirdly, individuals are able to share skills and knowledge. These factors enable more effective coordination and collaboration within and between communities.

Thinking as collective action

Unlike computers, we are unable to hold vast amounts of information. We select only the most useful or available information to guide our decisions. Our limited capacity means we rely on a community of knowledge within our minds, the environment and other people to function. To Fernbach and Sloman, human thought is a product of the community, where people are like bees and society our beehive. Our intelligence is contained in the collective mind and not the individual brain.

To emphasise the point, our knowledge as individuals is intertwined with the knowledge of others. Since our beliefs and attitudes are shaped by our communities, we tend to align ourselves with one prescription or another. We often let the group think for us because it is difficult to reject the opinions of our peers. As such, we don’t always evaluate the merit of every belief or attitude we share.

It takes a lot of expertise to understand the complexities of an argument. Society would be less polarised if we understood the communal nature of knowledge better. It helps to know where the limitations of ourselves and others begin and end. This can help us to determine our beliefs, values and biases with more perspective.

The knowledge illusion

Individuals cannot know or master everything. Instead, we rely on abstract knowledge to make decisions. Sometimes this includes drawing upon vague and under-analysed associations to provide high-level links between objects.

Illusion of understanding

Since our brains are not equipped to understand every event or discipline, we apply prior generalisations to new situations to act and make decisions. Our capacity to function in new environments depends on the regularities of the world and our understanding of these regularities.

People are typically more ignorant than they think they are (much like the Dunning-Kruger effect). This illusion of understanding exists because we often fail to draw a distinction between the knowledge of the individual and the community. This can lead to overconfidence and not knowing that we do not know. 

Illusion of explanatory depth

Our illusion of understanding often hides how shallow our understanding of causal mechanisms is across subjects. The authors give a striking example in their three-part questionnaire. To paraphrase them – for any everyday object (e.g. a zipper, bicycle, kettle or toilet), ask yourself or someone these three questions in order:

  1. On a scale of 1-10, how well do you understand how it works?
  2. Can you explain to us in as much detail as possible about how it works?
  3. Now, on a scale of 1-10, how well do you understand how it works? 

This assessment can reveal our illusion of explanatory depth. Our self-assessment score in the third question is often lower than our score in the first. Like the illusion of knowledge, we can also suffer from an illusion of comprehension. We sometimes confuse understanding with recognition or familiarity.

Evolution’s endowment

The human mind evolved abilities to take actions that better enable our survival. Remembering vast amounts of information was not helpful to our evolution. As our brains developed in complexity, we got better at responding to more abstract environmental cues and new situations. Furthermore, unnecessary details are counterproductive to effective and efficient action if a broad understanding suffices. This is why our attention and memory systems are sometimes limited and fallible. We may have evolved a different set of logic had it been conducive to our evolution and survival as a species.

First we overestimate, next we ignore

Humans use causal logic to reach conclusions and project into the future. We share similarities with computers in that we undertake complex tasks by combining simpler sets of skills and knowledge together. However, the illusion of explanatory depth means we often overestimate the extent and quality of our causal reasoning.

We acquire information slowly and spend only a small fraction of our lives deliberating. Doing otherwise would leave us unable to handle systems that are too complex, chaotic or fractal in nature. The illusion of understanding means we tend to manage complexity by ignoring it. Likewise, we ignore alternative explanations when we’re unable to retrofit them into our pre-existing model of understanding.

Hierarchies of thought and processes

Fernbach and Sloman highlight how we construct our models of the world from a tiny set of observations. This is possible because our world usually conforms to generally consistent principles. We can also think of the brain as one part of a larger processing system. It combines with the body and the external environment (an outside memory store) to remember, reason and interact.

Complex behaviours emerge when individual systems can interact or coordinate with each other. Multiple cognitive systems that work together may give rise to a group intelligence that exceeds the capability of any one individual. This is the case with bees, ants and humans. The growth in our collective intelligence for example can be attributed to the growing size and complexity of our social groups.

Knowledge and intentionality

Our ability to distribute and share knowledge and intentionality allows us to pursue common goals collaboratively. This ability, combined with our capacity to communicate arbitrarily complex ideas through language, helps us to store and transfer knowledge between generations. Fernbach suggests that this is what separates us from artificial intelligence and most other biological species.

Cumulative cultures

A cumulative culture can form with the transmission of knowledge, the division of labour and a model for cooperation. The division of cognitive labour (through language, memory, attention, etc.) enables specialisation and greater individual contributions. But individuals tend to overestimate their personal contributions because individual and group thinking are interconnected and difficult to delineate. People sometimes feel knowledgeable simply because the knowledge exists within their community.

Superficiality

Very little of what we understand occurs through direct observation or sensory experience. Most of what we know comes from what we are told and understand superficially. This usually suffices because communities can divide cognitive labour and responsibilities. A corollary of all this is the automation paradox. As we become more dependent on technology and automated systems, our ability to navigate new situations diminishes if or when such systems fail.

Redefining intelligence

Intelligence is not necessarily a personal attribute, or one marked by academic success. In a community of knowledge, it is how much an individual contributes to a group’s success. Teams with complementary skills are more likely to divide and use cognitive labour successfully than teams with homogeneous skills. For this reason, education is not purely about the development of intellectual independence. It is also learning about what we don’t know, where we fit within our community of knowledge, and how to separate accurate statements from garbage and noise.

Knowledge, science and faith

Science involves the development of justifiable conclusions, whether through direct observation or inference. However, scientific progress is often built upon prior conclusions held by authority inside the community of knowledge. The authors cite empirical studies to describe how scientific attitudes are often shaped by a combination of cultural and contextual factors that are difficult to change. This is easy to see as well in scientific history (think Galileo).

Again, the authors highlight how personal beliefs and reasoning are often interconnected with the beliefs, cultural values and identities of our communities. As social creatures, it is difficult to be perfectly rational or scientific in every mode of process and reasoning.

Furthermore, since we often don’t know enough to analyse the consequences of new policy or science, we have little choice but to depend on the views of people we trust. This relies on faith that others are telling the truth. However, this faith is different from religious faith because of the power and threat of verification.

Fernbach and Sloman suggest that it is this distinction or social pressure that incentivises scientists to pursue truth-seeking within their community of knowledge. This ‘incentive’, in combination with the process of falsification, allows us to advance the frontiers of knowledge in an adaptive process, even if we stumble with errors and mistaken beliefs along the way.

Reinforcing illusion

Bertrand Russell once said that “the opinions that are held with passion are always those for which no good ground exists”. Surrounding ourselves with like-minded people can reinforce our opinions and beliefs. They can accentuate our illusion of knowledge and comprehension. It’s why we sometimes hold strong opinions without a detailed understanding.

Since we rely on simple causal models and our community of knowledge, our beliefs and confidence often go untested. An illusion of knowledge means we’re less likely to check and update our understanding. The power of culture and social proof can make beliefs difficult to change, even in the face of new evidence.

The illusion of knowledge implies that a swath of knowledge can make us feel like experts. When we feel and talk like an expert, and join a community of knowledge that agrees, the herd mentality that emerges can be dangerous. Such attitudes are often mutually reinforcing.

Focus on the how

To correct this, people should be open to the idea that their community of knowledge can be incorrect. This is difficult since it can challenge our identity and values. Sloman cites empirical studies which found that asking people to explain their positions causally can encourage more moderate positions upon reflection. This is because causal explanations require the person to explain the link between their position and the subject or outcome. In other words, focusing on the ‘how’ requires us to confront gaps in our knowledge, and potentially shatter the illusion of knowledge.

Explanation foes and fiends

Policies are often supported or rejected on moral conclusions with little causal analysis. We don’t consider the nuances of policy, only whether our values are reflected. We do this because it simplifies the decision-making process. This behaviour helps to explain why political discourse and media are often shallow.

We substitute analysis with generalisations and values because our memory and capacity for reasoning are grossly finite. This explains why we lionise or demonise individuals (fairly or unfairly). It is evident in the way we think about political groups, large companies and scientific achievements. We also find it easier to simplify history than to understand the complex role of community and contingency.

Fernbach and Sloman suggest that most people are ‘explanation foes’. They don’t like too much or too little explanatory detail. By contrast, very few people are ‘explanation fiends’. These people prefer to understand as much relevant information as possible before deciding. Both explanation foes and fiends have their strengths and weaknesses.

Making smarter decisions

Given our limitations as individual learners and analysts, we should structure our environments and processes in ways that are more conducive to effective decision making. The distinction here is the focus on ‘environment’ and ‘process’ rather than on the person. Fernbach and Sloman offer four recommendations:

  1. Reduce complexity: We should understand how much complexity we can tolerate and scale our information accordingly.
  2. Simple decision rules: It is often difficult to educate ourselves on complex matters in a timely, consistent and considered manner. We should develop decision rules with short and clear rationales to navigate and check our processes.
  3. Just-in-time education: The authors argue that we can improve our retention and application of information by learning (or revising) it just as we need it.
  4. Check our understanding: We should be aware of our tendency to be explanation foes. While we cannot consume every possible detail for decision making, we should endeavour to know enough. Knowing what we don’t know can help us to fill in our gaps, minimise intellectual arrogance and make better decisions.

Inevitability of ignorance

To end on a quasi-hopeful note, Fernbach and Sloman suggest that ignorance is perhaps inevitable. It is the natural state of humanity. There is simply too much complexity in this universe for any one individual to fully understand. As such, we do what we know and ignore what we have little conception of. Ignorance shapes our lives because we have little awareness of what is truly possible. At the same time, it is the illusion of knowledge that gives some people the self-confidence to pursue new and risky frontiers. Perhaps some illusions are necessary for the development and progress of civilisation itself.
