Metacognition: Part 2

22.08.2018

Episode #3 of the course Integrative thinking: A practical guide for leaders by Jennifer Riel


Welcome to Lesson 3!

Last time, we explored the notion of metacognition. We talked about the fact that we see the world through models and that those models are at least a little bit wrong. Today, we’re going to dive into the ways in which our models are wrong and the implications for decision making.


Cognitive Biases

Over the past several decades, the field of behavioral economics, built on the pioneering work of Nobel laureate Daniel Kahneman and his longtime collaborator Amos Tversky, has helped demonstrate just how easily our mental models (the models we build in our minds to make sense of the world) can harden into deeply held cognitive biases: systematic errors in the way we think.

For instance, once we see the world a certain way, it is hard to see it with fresh eyes. This is confirmation bias at work: we look for, and interpret, information that fits how we already see the world. If you think someone is a slacker, you will look for proof that this is the case. But you will also treat that person as a slacker, giving them less to do and setting lower expectations, which helps produce the very behavior you expected to see, a self-fulfilling prophecy.

Our mental models are also implicit. We rarely know, in any real sense, how we arrived at our models or how those models drive our behavior. For instance, we may be aware that our team is not as diverse as we'd like, but not conscious of the subtle ways our cognitive biases produced that outcome. Evidence suggests that, in interviews and promotion decisions, we unconsciously tend to favor people who are physically and professionally similar to us. So unless we take this bias into account, our teams wind up looking and sounding a lot like us.

These are just two of the ways in which cognitive biases can lead to unhelpful actions. There are many, many more examples of what can go wrong when we believe we see the world as it really is, rather than recognizing that what we see is filtered through the lens of our own experiences.


A Tool for Understanding Thinking

To guard against the biases in our models, we need tools that help us understand our thinking more deeply. Chris Argyris designed one such tool, a concept he called the ladder of inference. Argyris explained that the world is full of data: testable facts that can be directly observed and experienced. But that pool of data is so vast that we have no choice but to select and pay attention to just a portion of it. Each of us selects our data based on our own experiences, needs, and biases, and we do so unconsciously, without being aware of the choices we are making.

From that data, we create meaning. We interpret the information, make sense of it, use logical inferences to reach conclusions, and build our models of the world. But this process happens quickly and implicitly. We are aware of our conclusions, but we rarely ask why or how we came to believe them, question our inferences, or notice the gaps in our logic.


Building Your Own Ladder

To gain a better understanding of how you think, consider a belief you hold (for example: Hard work is how people succeed). Ask: How did I reach this conclusion? What data did I select? How did I make sense of it? Where did I leap from concrete data to abstract inference? How might I make my logic clearer and my conclusions richer? Did your conclusion about hard work come from lessons from your parents? Personal experiences in school? Observing your peers and leaders?

Then, consider what the ladder would look like for someone who holds the opposite view (for example: Success is all about who you know, not how hard you work). Reflect on how engaging with that opposing ladder might improve your own.

The ladder of inference is a useful conceptual tool for thinking about thinking, whatever the context. Working with it can give you clearer access to how you think, setting you up for the kind of metacognitive tasks embedded in the integrative thinking process.

Tomorrow, we’ll talk about another foundational concept for integrative thinking: empathy.


Recommended Book

Thinking, Fast and Slow by Daniel Kahneman

