Familiarity and the Illusion of Knowledge

The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge.

Daniel J. Boorstin


Estimates of how much information the average person is exposed to on a given day vary, but the figure is always large, and there's little doubt that the amount of information we 'consume' daily has risen over the past few decades, as technology has created more platforms and channels for consuming it.

One of the biggest problems with being exposed to lots of information – whether through the media, social media, or simply reading a lot – is that we get used to hearing names and labels: of people, places, policies, and so on. To put it another way, we develop familiarity with these names and labels.

Quite a lot of the time this familiarity is skin-deep. We can recite names and labels easily, and we might know whether something is ‘broadly good’ or ‘broadly bad’, but we would struggle to explain what a thing is in any detail, why it’s important (or unimportant) in objective terms, and how the current situation has come to be.

Of course, nobody can know everything. It’s unrealistic to expect everybody to have a detailed understanding of politics, economics, business trends, the physical sciences, arts and culture, etc.

However, the problem with familiarity is that we so often confuse it with actual knowledge. This happens simply and naturally when we are not alert and questioning. We read or hear something, make a guess about what it means based on the context or on a superficial explanation, and then feel as though we know enough about it to make sense of the rest of the information we are presented with. Later on, we hear the same term again, and our illusion of knowledge is reinforced. And so on.

Part of the wisdom of Boorstin's quote above is that, in a sense, ignorance – when it isn't accompanied by a false sense of knowledge – can be 'fixed' through good teaching. But the illusion of knowledge – familiarity – can be much more pernicious. The impacts of this play out both at a societal level and at an individual one.

This article by Elizabeth Kolbert in the New Yorker (which cites, amongst others, The Knowledge Illusion: Why We Never Think Alone by Steven Sloman and Philip Fernbach) puts it well:

In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zips, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)

… It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favour (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favour military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

How do we address this? First, it's important to be prepared to genuinely re-evaluate what you actually know. Humility – at the very least in the privacy of our own minds – is essential to learning.

Second, we have to learn to spot the things we don't really know. Ask yourself: 'What does this mean?' If you can't answer the question without using the term itself, then you almost certainly don't know enough.

Once you've learned the 'what?', you need to start thinking about the 'why?' – as in 'why is this important?' or 'why did this happen?'. 'Why' questions are tougher, but even knowing how much you don't understand is instructive, and it will help prevent you from making poor judgement calls based on familiarity with, rather than understanding of, a concept or situation.

In the example given by Kolbert, simply attempting to write down how a toilet or a zip works was enough to help people re-evaluate how much they knew. Kolbert also gives an example of how this works on more far-reaching questions:

In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.

So we must be prepared to question the labels we are using and ask ourselves how much we really understand them. Doing so will make us more aware of how little we actually know and understand, encourage us to fill in the most important gaps, and make it less likely that we will make important mistakes due to false perceptions about our own knowledge.