A few months ago I attended a talk by Dr. Steven Sloman at the Rotman School of Management. Sloman was promoting his new book, The Knowledge Illusion: Why We Never Think Alone, which explores the many ways in which we know much less than we think we do. I already thought I knew nothing, but Sloman’s talk convinced me I know even less than that.
The Dunning-Kruger effect is an oft-cited piece of research about how the least skilled among us tend to most overestimate our own competence. “85% of people think they’re better than average drivers” might be one of my favourite statistics of all time. The Knowledge Illusion ably illustrates the many ways in which we convince ourselves that we know things, mostly because we are around things that are known. We call this a community of knowledge. I don’t know how to fix my vacuum, but I know that someone in my community does, and I could avail myself of said knowledge if I cared about having a working vacuum.
More pointedly, The Knowledge Illusion explores what our brains are for. They’re not computers; they’re devices that enable us to sift through complexity, helping us focus on the things we need to know in order to make decisions and survive.
Identifying the gaps in our knowledge
One of the ways Sloman illustrates our knowledge gap — how we focus and what it leaves out — references Rebecca Lawson’s cycle study. Essentially, Lawson asks people to draw a bicycle from memory. For the most part, people can’t.
At Pilot PMR, at least 50% of us bike to work — and a good number are certifiable bike nerds. So, naturally, I wondered if we’d be any better at the task: can a bunch of designers and creative types with greater-than-average bike knowledge draw a bike from memory?
For the most part, no.
The exercise was a great way of illustrating how little we know. But does understanding this lack of understanding help us? Yes, insofar as it forces us to check ourselves.
In many ways, our Pilot design process is one built, perhaps unintentionally, with an eye toward removing bias. With every new project, we embark on a period of discovery. We try to ignore preconceived ideas about a problem, client, or industry. Fresh eyes, along with an unfamiliarity with new terrain, have a way of disabling the knowledge illusion. Clients school us on the inner workings of their operation, and though the knowledge they share comes with their own baked-in bias, it’s being given to us for the first time. We don’t expect ourselves to be able to draw it from memory.
Checking the bias in what you believe
In his book, Sloman discusses another phenomenon that he uses to test people’s tenacity of belief in their own (often incorrect) ideas. It’s called the illusion of explanatory depth. Sloman finds representative groups of people and asks them to rate their understanding of a policy or idea. He then asks them to explain said policy, before again rating their understanding of it. Unsurprisingly, understanding how little they understand often moderates the extremity of their position.
At Pilot, we try to verbalize our understanding of what our clients relay to us in our discovery sessions. In so doing, we very quickly realize whether what we’re saying makes good sense. And whether it conforms to our biases, or upends them. We like to think we get darn close to the truth. But we also thought we could draw a bike from memory.