How to Change Your Mind: A Possible Way Toward Saving the Human Race

Peter Breslin
6 min read · May 27, 2020

Reflect for a few seconds on your top three most passionately held beliefs or values. Pick one. Really lean into how strong that belief is for you. Feel that sense of conviction, of rightness, of a territory that you fiercely defend.

Now take a few moments and investigate how it feels to consider that you could be wrong.

What is that like? For me, a lot of emotions come up when I imagine one of my most cherished values being challenged. I get fairly defensive right away. I want to find all the evidence that led me to the belief in the first place. I want to hold that territory, and I want to close ranks on anything that would undermine that glorious, solid feeling of not only being right, but knowing that I am right. Standing strong in that certainty, it’s important for me to imagine that people who disagree with my belief or who hold different values are, at best, ignorant and mistaken, at worst, contemptible and probably malicious fools.

The experience of discomfort, anger, frustration, and contempt that can arise when a core belief or value is challenged is probably a lot more common now than it used to be. Certainly, we keep our bubbles fairly safe to the degree that we are able, perhaps blocking or at least unfriending people on social media whose beliefs or values run counter to our most cherished ones. But social media and other connectivity probably expose us to views contrary to our own far more often than we used to have to deal with.

It often seems to me that we are in the early phases of negotiating this increased frequency and intensity of perceived threats to our world view. I’ve been active on the internet since 1992, and it does seem like we might have made at least some progress, with perhaps fewer “flame wars” and a somewhat reduced degree of violent or contemptuous interaction. Yet it’s easy to stand back even a little bit and get an ominous sense of things. It feels as if political and ideological lines have been drawn very boldly. For example, the politicization of the current global pandemic plays out repeatedly and constantly on Facebook. Nefarious actors know how powerful manipulative rhetoric can be, with a recent study (supported by strong evidence!) suggesting that nearly half of all Twitter accounts promoting “re-opening the economy,” for example, are bots or bot-assisted humans.

I’ll probably write more about various ways to navigate “hot button” encounters that almost never end well, driven by our sense of threat to territory we feel we have to defend. This particular piece focuses on one area that has been bewildering for non-scientists but is standard fare for those of us with even a little bit of science background: the sometimes fascinating, sometimes annoying or frustrating phenomenon of scientists changing their minds.

It’s a simplified form of the rather fancy-sounding framework called “Bayesian epistemology.” The link provided here gives a more in-depth look. The framework is named after the Reverend Thomas Bayes (c. 1701–61), whose work on conditional probability gave us what is now known as Bayes’ theorem. For my purposes, Bayesian epistemology simply means: changing one’s mind when presented with new evidence. Doesn’t that sound easy? It may also sound trivial in a lot of ways.
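To make that idea concrete, here is a minimal sketch of a Bayesian update in Python. The function and every probability in it are my own invented illustration, not numbers from any actual study or from this article’s sources: a prior belief is combined, via Bayes’ theorem, with how likely the new evidence would be under each hypothesis.

```python
# Toy illustration of Bayesian updating. All numbers are made up for illustration only.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a claim via Bayes' theorem.

    prior: how strongly I believe the claim before seeing the new evidence
    p_evidence_if_true: how likely this evidence is if the claim is true
    p_evidence_if_false: how likely this evidence is if the claim is false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Suppose I start out 80% sure that "cloth masks do nothing" (prior = 0.8),
# then I read a study whose results would be unlikely if masks were useless
# and fairly likely if masks actually help.
prior_masks_useless = 0.8
posterior = update(
    prior_masks_useless,
    p_evidence_if_true=0.1,   # chance of seeing this evidence if masks really are useless
    p_evidence_if_false=0.6,  # chance of seeing this evidence if masks actually help
)
print(f"Belief that masks are useless, after the evidence: {posterior:.2f}")  # ~0.40
```

The particular numbers don’t matter; the point is that the evidence moves the belief from “pretty sure” to “genuinely unsure,” which is exactly the kind of loosening the thought experiment above asks for.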

But go back to the thought/feeling experiment at the beginning of this piece. Get into that cognitively miserly, epistemically foreclosed space of not only “feeling right” but knowing you are right. Get into that tangled place of being confronted, of having that rightness and solidity of knowing you are right face a challenge. Now imagine being presented with evidence that is reliable, supported, and convincing, which “ought” to lead to, at least, a relaxation of our defenses, if not an important change in one of our core beliefs.

I think following the thread of that thought experiment through to the place of being open to considering new evidence shows that my overly simplified description of Bayesian epistemology is far from easy, and anything but trivial. In fact, the more I think about it, the more that moment of openness to evidence that suggests loosening or abandoning a core belief looks like one of the most profound abilities of, and challenges to, our species. I could be wrong, of course, and I’m willing to consider contrary evidence….

An example from my own recent experience, echoed also in the community of scientific experts: wearing masks in an effort to stop the spread of SARS-CoV-2. Back in March, when the grassroots movement of people sewing homemade cloth masks for distribution to healthcare workers started, I was skeptical and dismissive. I had found a seemingly well-designed study showing that cloth masks were ineffective at stopping viral spread and might even increase the risk of infection. I was expressing my skepticism frequently on various Facebook comment threads. My opinion was not particularly popular, since there were a lot of different dimensions and sociopolitical aspects to what was going on.

Flash forward to now, and, lo and behold, I am a strong supporter of wearing masks. Admittedly, I’m still not strongly in favor of ordinary cloth face coverings such as bandannas, though I still feel they are better than nothing. I wear a surgical face mask now whenever I am in public. I am appalled to see so many people not wearing masks at all, at places such as the grocery store.

Why did I change my mind? Because I try to be a good Bayesian epistemologist. I was presented with new evidence that masks, although not ideal, and meant less to protect the wearer than to help prevent possible spread of SARS-CoV-2 to others, absolutely play a role in reducing spread. (Ironically, the link is to a Mayo Clinic article, a health care institution to which a certain high-ranking politician paid a visit and infamously did not wear a mask.)

Admittedly, my belief that masks were ineffective and that people were “wasting their time” making homemade ones was not a passionately held one. It was relatively easy for me to, at first, make a simple social concession to participate in a public health effort. Then, when I saw more evidence, it was very easy for me to buy some damned surgical masks and start wearing one. Not that big of a deal. The surreal politicization of mask wearing in public has been exploited specifically to draw stark boundaries around deeply held values that are vastly different from my own. I have generally tried to stay out of that, as well as stay away from people and businesses where masks are absent.

The larger point is that being able to change one’s mind when presented with new evidence could save the human race. At the very least, this simple practice of Bayesian epistemology could reduce cognitive miserliness, the cherry-picking of data that supports our existing beliefs, and the hardening of fronts between seemingly irreconcilable ideological camps. It’s not easy in the least, especially when it comes to one of my top values. But if I am willing to at least investigate the possibility that I could be wrong, or that there is room to loosen up my belief, or that I could explain my belief more effectively if I were less defensive and aggressive, that’s a start. And it often feels now like we need as simple a starting place as possible.

I could be wrong. If I’m presented with new evidence, I’m willing to consider it.

--

Peter Breslin

Conservation biologist, botanist, Ph.D. in Environmental Life Sciences from Arizona State, ancient Gen X SJW accomplice and culture critic.