Steve Fleming’s research is definitely “meta,” a Greek prefix indicating self-reference. He is a cognitive neuroscientist at University College London who studies metacognition: what we know about what we know, what we think about what we think, what we believe about what we believe. Although this may seem quite philosophical and almost impossible to study in the laboratory, he has made it his mission to measure it, model it, and understand where in the brain it manifests.
Fleming explored these issues in his 2021 book, Know Thyself: The Science of Self-Awareness. In a 2024 Annual Review of Psychology article, he probed further into the link between metacognition and confidence: our sense of whether we have made the right decision, whether we are succeeding at the tasks in front of us, and whether our worldview is likely to be correct.
Fleming’s work sheds new light on why some people seem chronically insecure even when they’re doing just fine, and why others are completely convinced they’re right about everything, even when there’s overwhelming evidence to the contrary. In the following discussion, which has been edited for length and clarity, Fleming shared his thoughts on some of the questions that inevitably arise when our brains assess their own activity.
Metacognition is a rather unusual research topic. How did you end up studying this?
I studied experimental psychology at Oxford, where I had the opportunity to work with psychologist Paul Azzopardi. He studies blindsight, a condition in which people, due to certain types of brain damage, are subjectively blind, but still able to perform various tasks with the help of visual information. This presents a fascinating dissociation between conscious experience and actual functionality.
At the time, I hadn’t figured out how to connect the more philosophical ideas of conscious experience to something we can actually measure and study in the lab. But ever since then, my career has moved towards achieving the original goal of using mathematical models from psychology to explain aspects of self-awareness. These are things that psychologists and philosophers have always been interested in, but which are quite difficult to pin down in practice.
How do you measure something like metacognition in the lab?
The standard approach is to measure people’s objective performance on a task alongside their subjective assessment of that performance, usually in the form of confidence ratings. For example, we might ask whether a visual stimulus known as a grating is tilted left or right, or ask participants to compare the brightness of two gratings displayed one after the other. That is a judgment about the outside world. We can then also ask a metacognitive question: how confident are they in their decision about the world?
When we have many of these assessments over time, we can observe the degree to which confidence tracks performance on a trial-by-trial basis. If someone is highly confident when they are right and less confident when they are wrong, we say they have a high degree of what we call metacognitive sensitivity. We can use this as a way to quantify differences in metacognition between individuals or groups.
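The trial-by-trial logic Fleming describes can be sketched in a few lines of code. This is a minimal illustration, not his lab’s actual analysis pipeline: it computes the type-2 AUROC, one standard summary of how well confidence ratings discriminate correct from incorrect trials (model-based measures such as meta-d′ are used in the published work). The data here are invented for the example.

```python
# Minimal sketch: quantify metacognitive sensitivity from trial-by-trial
# data, assuming binary correctness (1/0) and numeric confidence ratings.

def type2_auroc(correct, confidence):
    """Probability that a randomly chosen correct trial received higher
    confidence than a randomly chosen incorrect trial (ties count 0.5)."""
    hits = [c for ok, c in zip(correct, confidence) if ok]
    misses = [c for ok, c in zip(correct, confidence) if not ok]
    if not hits or not misses:
        raise ValueError("need both correct and incorrect trials")
    wins = 0.0
    for h in hits:
        for m in misses:
            if h > m:
                wins += 1.0
            elif h == m:
                wins += 0.5
    return wins / (len(hits) * len(misses))

# Toy observer whose confidence tracks accuracy perfectly -> AUROC of 1.0;
# 0.5 would mean confidence carries no information about performance.
correct    = [1, 1, 0, 1, 0, 0, 1, 0]
confidence = [4, 3, 2, 4, 1, 2, 3, 1]
print(type2_auroc(correct, confidence))  # -> 1.0
```

A score near 1.0 corresponds to the "high metacognitive sensitivity" described above; a score near 0.5 means the person’s confidence does not track whether they were right.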

Can you link these differences to what happens in people’s brains?
A popular way of doing this has been to look at differences in brain activity and structure between people, using brain imaging techniques such as fMRI and magnetoencephalography to try to determine which aspects of brain function give some people better metacognition than others. But we have realized that the approach is limited.
So the field has shifted. More recently, we look instead at the relationship between patterns of brain activity and trial-to-trial variation in how confident individual people feel about decisions we ask them to make in experiments.
Essentially, what has been found is that there are different stages of tracking uncertainty about our own performance when performing a particular task.
For example, if you’re trying to discriminate the orientation of a line, neurons in the parts of the brain that are sensitive to different possible orientations will fire to different degrees, reflecting any uncertainty in what you’re seeing. Studies show that if there is conflicting information at that level, it affects people’s confidence ratings on the task.
There is also data suggesting another, higher-level stage of assessment: brain areas in the prefrontal cortex signal confidence in a more general way, one that is not linked to the specific input we receive when performing a particular task. This process continues after you have made a decision, and the brain also considers information that was not initially available. It’s as if it’s still trying to figure out whether it got things right or wrong.
This seems to happen quite automatically, requiring no external instruction or conscious effort. When we ask people to consciously engage in metacognition and report how they feel about their performance, they appear to engage yet another stage of processing, involving the frontopolar regions of the brain: regions right at the front of the cortex that are particularly well developed in humans compared with other primates. These areas are activated when metacognitive estimates are used to communicate with others or to consciously control behavior, as we asked participants to do in these experiments.
What happens if metacognition does not work as it should?
A pervasive sense of insecurity has repeatedly been linked to symptoms of anxiety and depression. We know that individuals who suffer from this general sense of insecurity do not necessarily perform tasks worse than the next person. So one of the puzzles we’re interested in solving is why some people don’t learn from their own performance. Why are they unable to recognize that they are actually doing quite well, and then update their beliefs about their skills and abilities accordingly?
What we have found is that, on a trial-by-trial level, people with anxiety and depression are just as likely as others to show instances of high confidence. But there is an asymmetry in how they learn from them. Sometimes they are very confident that they are doing well, yet they do not incorporate those cues into their more global estimates of how well they are doing in these experiments, and presumably in daily life as well. At the same time, they are perfectly capable of incorporating evidence from trials in which they were not very confident of performing well.
Interestingly, this is not the case when we give them explicit feedback about their performance. When we tell them they are right, they realize that they are actually performing quite well.
How can this be used to help people who struggle with low confidence?
In a recent study, we showed that underconfidence in people with greater anxiety symptoms worsens over time. If we probe their confidence immediately after they have made a decision, they are slightly underconfident. But if we wait a few seconds, they are even more underconfident about that previous decision, all else being equal. And it only gets worse from there.
What we think happens is that they engage all those brain mechanisms I talked about earlier to reflect on their own decisions and actions. As time goes on, if you tend to be a more anxious person, these processes lead you to become even less confident than you would otherwise be. You spend too much time ruminating on your performance.
So, a concrete piece of advice that we can draw from these findings is that if you know you’re prone to that kind of bias, it’s better not to think too much after you’ve made a choice. If you immediately think, “Okay, yeah, that was a reasonable thing to do,” let it be.

What about people who are perhaps a little more confident than they should be? It seems that overconfidence can be quite useful in today’s society.
It is very interesting to think about what is adaptive, at a societal level, for future success. One hypothesis I put forward in the book is that if you have a slightly overconfident worldview as well as good metacognitive sensitivity that helps you realize when you’re really wrong, it can be quite a powerful mix. Because as you say, there is a lot of research that suggests that people who may be a little overconfident do well socially. People tend to like them and want them in positions of power because they seem decisive.
At the same time, you don’t want someone without proper self-awareness to be able to bluff their way to the top and reach a position of power.
So I think there’s a sweet spot where you need to project a little bit of confidence to be perceived as competent, but you also want to make sure you’re not too seduced by confidence, whether it’s your own or someone else’s.
We have found that people with a more open-minded worldview, who are willing to acknowledge that their view may not be the only valid one and who believe it is important to listen to people who disagree with them, also tend to have more accurate metacognition on the kinds of tasks we can study in the lab. Accurate metacognition prompts them to seek out new information and to update their beliefs when those beliefs may be inaccurate. There is solid evidence suggesting that, over time, these metacognitive signals can help us develop a more accurate worldview.
Could it be possible to train metacognition using this type of task, and do you think it could help us reduce the social tensions we experience today?
I think lack of metacognition is far from the only reason we see polarization in today’s society. But our research offers some tools that we can use to try to cultivate people’s ability to think critically about their own thinking, knowledge and decisions, without getting into politics.
The obvious place to do this would be in education, which I think has a lot of potential. Parents and teachers implicitly encourage children to be more self-aware, but they rarely do so explicitly.
We don’t teach metacognition the same way we teach math or history or physics. I think it can be a very powerful way to develop more open ways of thinking.
This article originally appeared in Knowable Magazine, a non-profit publication dedicated to making scientific knowledge accessible to all. Sign up for Knowable Magazine’s newsletter.