CLEARER THINKING

with Spencer Greenberg
the podcast about ideas that matter

Episode 190: Bringing conspiracy theorists back from the brink (with Jesse Richardson)


December 29, 2023

Have conspiracy theories been more prevalent, more persuasive, or more convoluted in the last few decades than at other points in human history? Is the presence of conspiracy theorists a feature of every society? The phrase "conspiracy theory" usually implies a false theory, even though some are eventually proven to be true; so how can we update our language to better differentiate between disconfirmed and not-yet-confirmed conspiracy theories? How can people who've really gone down a conspiracy theory rabbit hole come back from the brink? More generally, what conditions need to be met for a person to change their mind about anything? What are the key motivators of conspiratorial thinking? Why do so many conspiracy theories incorporate strong antisemitic elements? To what degree are conspiracy theorists swayed by arguments about the number of co-conspirators a conspiracy would require? How should people research a conspiracy theory? Which personality traits are correlated with conspiratorial thinking? What's a good definition of wisdom? And how could wisdom help us combat the epistemic crisis through which we seem to be living right now? When, if ever, is it useful to approach a topic adversarially? Which would better mitigate the epistemic crisis: education reform or cultural change?

Jesse Richardson is an internationally award-winning creative director and the founder of the nonprofit The School of Thought, which is dedicated to promoting critical thinking, reason, and understanding. The Creative Commons resources The School of Thought has produced have so far reached over 30 million people and are being used in thousands of schools, universities, and companies worldwide. Their latest project is The Conspiracy Test, a gamified way to help increase healthy skepticism about conspiracy theories. It can be accessed for free at theconspiracytest.org. Learn more about Jesse and The School of Thought at schoolofthought.org.


JOSH: Hello, and welcome to Clearer Thinking with Spencer Greenberg, the podcast about ideas that matter. I'm Josh Castle, the producer of the podcast, and I'm so glad you joined us today. In this episode, Spencer speaks with Jesse Richardson about conspiratorial thinking and intellectual humility.

SPENCER: Jesse, welcome.

JESSE: Thanks so much, Spencer.

SPENCER: I think a lot of people have a sense that conspiracy theories are on the rise. We saw this after 9/11, when all these conspiracy theories came up: "Maybe the attacks were planned by the US government." But then we especially saw it around QAnon in the previous election cycle. Do you think that conspiracy theories are actually more common now than they were in the past?

JESSE: There's been some interesting research on this topic, actually. And the reality of it is kind of counterintuitive, because when you look at the historical record, the number of conspiracy theories and the number of people who believe in conspiracy theories doesn't appear to have shifted too significantly. Now, it's a little bit interesting because some conspiracy theories and conspiratorial conjectures have gained a lot more traction recently, fueled by social media. But it doesn't seem to be the case that strong belief in conspiracy theories is on the rise, or that it's as prolific and widespread an issue as people assume. However, there is an interesting wrinkle in that, which is that just kind of muddying the epistemic waters can have its own effects on politics and other things and create a lot of doubt, even if people aren't hardcore conspiracy theorists, as it were.

SPENCER: So should we think about conspiracy theories as something that's always happening in the background, that they're just kind of a feature of society?

JESSE: Yeah. It seems to be the case that there is a percentage of the population who are very vociferously inclined to want to question the powers that be. And there's actually quite a lot of benefit to having that as a polity: having people who are willing to stand up and whistleblow and question people in power is a really important part of the democratic process. And so there's this kind of attitude towards conspiracy theories and conspiracy theorists, which is very dismissive and contemptuous, that maybe doesn't recognize that there is some value in holding, say, pharmaceutical companies or governments to account. We ought to hold our governments to account. But that doesn't necessarily mean we should believe every conspiracy theory, or every aspect of every conspiracy theory, whole cloth either.

SPENCER: Yeah, I kind of don't like the phrase conspiracy theory because it makes it seem like conspiracies are never true, when we know that some crazy conspiracies were actually true. There's the MKUltra project, which (if I understand correctly) involved doing experiments with LSD on people. There was some kind of crazy project where they dropped some kind of, I think, bacteria over San Francisco as an experiment on what bioweapons would look like, and then some people got sick. And then they thought, "Maybe it was actually because of this hidden government program." So these things do sometimes happen. How would you differentiate between conspiracy theories, as in completely inaccurate theories of the way society works that involve massive conspiracies, versus the fact that sometimes powerful groups actually do have conspiracies?

JESSE: This is a really interesting kind of meta-consideration because the truth of it is, of course, that it's a scale, like most things. We tend to think in these binaries: something is either true or false, conspiracy theories are all crazy, or, on the other side of that, conspiracy theories are all true. And of course, the reality is much more nuanced when you look at it. There's this fantastic website and infographic that was put together at conspiracychart.com by Abbie Richards. What she did is put together this inverted pyramid, starting with things that actually happened — like COINTELPRO, Watergate, and the Tuskegee experiment, all things that we have very reliable evidence for, actual conspiracies that occurred — and then there's a "we have questions" tier, like "Epstein didn't kill himself." Then it moves to "unequivocally false but mostly harmless," then "dangerous to yourself and others," and then the antisemitic point of no return, right at the top of the pyramid with QAnon and all those sorts of things. So thinking on a scale is just a much better way to think about things. And that's a lot of what we're trying to promote as well.

SPENCER: So tell us about this conspiracy test that you've developed.

JESSE: The way that it originated was that I was talking to a friend of mine (who's a lovely guy) who had gone right down the conspiracy rabbit hole and believed in the QAnon conspiracy theory, which, as you may well know, is the idea that there is a secret society of Hollywood elites and other Satanists who are running a child sex ring and injecting themselves with chemicals to keep themselves young forever. It's based on an older conspiracy theory with very antisemitic roots, and it has proliferated through social media. So he'd gone down this rabbit hole, and he started just throwing all of these things at me. And it was an experience for us. As a rational skeptic and someone who's developed things to try and help people think more critically, I realized that if I attempted to link him to some of the resources we've developed, it would actually be counterproductive. He'd feel like he was being attacked, he'd just dig his heels in more, and it wasn't going to be effective. And I'd been thinking about how minds change. One of our directors, David McRaney, wrote a book called "How Minds Change," and what we know is that there are certain things that can help people change their minds in a way that is non-adversarial, that doesn't make them defensive, that allows a good faith engagement. And so we tried to bake that into this website and gamified experience called The Conspiracy Test at theconspiracytest.org.

SPENCER: It's really interesting that you're designing a test that's trying to meet people with conspiracy theories where they are. So how does it work? What is the actual process that someone goes through?

JESSE: Essentially, what happens if you go to theconspiracytest.org is that you can choose a conspiracy theory that you think might be true. And the operative word there is 'might' because a lot of the time we have this binary way of thinking, that things are either 100% true or 100% false. What we're trying to facilitate is a more probabilistic way of thinking. So if you choose a conspiracy theory that you think might be true, you're then taken through an immersive experience by a deep state Illuminati alien named Captain Zardulu. It's an experience where we try to help people think in a metacognitive way. Rather than saying, "Hey, you're stupid for thinking this conspiracy theory is true, when it's actually false," we don't really make any truth claims or say anything about it. We just ask people to, and help people to, think about a conspiracy theory critically. And what that does is change it from a situation where someone feels they have to defend their point of view to one where they are actually the person in control. They have the agency to conduct their own critical thinking investigation, and that allows them to think about things in a more reflective and evaluative way, because the person they're arguing against is, in a way, themselves. And we gamify the experience so that you set a baseline of skepticism that you might have for a particular conspiracy theory. Then you subject yourself to a series of steps where you think about: "How many people would actually need to be in on this conspiracy for it to be true?" Or you might look at things in terms of probabilistic thinking. So you have this scale where you update your skepticism as you go along. And the really important part is that instead of thinking in binaries, as soon as you shift people into a probabilistic mindset, it tends to really quite profoundly change the way they think.
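
[To make that baseline-and-update mechanic concrete, here is a minimal sketch in Python. It is purely illustrative: the prompts and the 0-100 scale below are assumptions in the spirit of what Jesse describes, not the site's actual implementation.]

```python
# Illustrative sketch (not the site's code): record a baseline skepticism
# score for a chosen theory, re-rate after each critical-thinking prompt,
# and report how the score moved.

PROMPTS = [  # hypothetical prompts in the spirit of the ones described above
    "How many people would need to be in on this for it to be true?",
    "What evidence would you expect to see if it were true? Do you see it?",
    "How likely is it that a leak would have surfaced by now?",
]

def ask_score(question: str) -> int:
    """Ask for a skepticism score from 0 (certain it's true) to 100 (certain it's false)."""
    while True:
        answer = input(f"{question}\nSkepticism (0-100): ")
        if answer.isdigit() and 0 <= int(answer) <= 100:
            return int(answer)
        print("Please enter a whole number from 0 to 100.")

baseline = ask_score("Pick a conspiracy theory you think MIGHT be true.")
score = baseline
for prompt in PROMPTS:
    score = ask_score(prompt)  # re-rate after reflecting on each prompt

print(f"Baseline skepticism: {baseline}% -> after reflection: {score}%")
```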

SPENCER: From my perspective, switching from binary thinking to probabilistic thinking is just one of the most powerful core thinking tools to put in the tool belt. So I'm really glad that you're teaching that. One thing I wonder about: is there a certain body of research literature you're drawing on when you think about how to get people to think critically in this way?

JESSE: Yeah. I highly recommend, if anyone's interested in the subject, David McRaney's book, "How Minds Change," which pulls together-

SPENCER: David actually came on and we did an episode with him.

JESSE: [laughs] He's fantastic. He's a wonderful human being. And what I love about how David presents things is that he takes this kind of holistic approach (and that word can be loaded; it's not holistic in a new age-y kind of way, but holistic in looking at things from a neuroscience perspective, a psychology perspective, a philosophical perspective), pulling together all of this research into a narrative form that can help explain and understand things in a way that is, I think, a lot deeper and more reflective. One of the key takeaways of what David talks about in "How Minds Change" is the research around the deep canvassing work that was done in California, and related research that has verified that process of engaging with people to help them soften and reflect and become more receptive to changing their minds about things you would expect to be a lot more entrenched than they actually are. And so the idea that people can't change their minds, won't change their minds, is this really unexamined, pervasive idea that I think is really popular and really wrong, because the research shows that people actually are quite receptive to changing their minds, given the right conditions. And one of those conditions, as you say, is switching from a binary way of thinking — true-false, right-wrong — to a probabilistic way of thinking, as soon as you introduce a scale. And what's really interesting is that you can see that actually happen in real time. The deep canvassing research exemplifies this. If you're talking to someone who perhaps has a very strong moral position on marriage equality (for example), and you ask them at the outset, "Well, on a scale of one to ten, how much do you support marriage equality?", they might be a two. They might have the idea that they really don't support marriage equality. But if you build rapport with that person, you listen to their concerns, and then you speak your own truth (you share an experience or put your own opinion forward), and you develop an understanding that is collaborative, then you see that people are actually quite receptive to going, "Well, maybe I can still have these views about things, but maybe it doesn't affect me. I don't know why I would actually be opposed to it." There's a softening of people's beliefs as soon as you develop rapport and introduce probabilistic thinking, because what you've done there is shift from an absolutist mindset to a reflective one, one that has liquidity, that has the capacity to be mobile.

SPENCER: One time, an acquaintance of mine reached out about QAnon. I hadn't even heard of QAnon at that point. He reached out and he said, "Oh, I've been following something and I think you'd be really interested in it." And I said, "What is it?" And he said, "It's this thing called QAnon." And he sent me a link to the Q drops or whatever. And when I looked at it, I was just like, "This looks kind of like nonsense." So I said to him, "Okay, well, what makes you think that this stuff is true? What do you find most convincing here?" He said, "Oh, just read through it. And you'll see." I said, "Okay, I'm just gonna look through it until I find something that I can check, and I'm gonna go check that thing." So I looked at some of the Q drops and found a claim; I think there was a bit about Hillary Clinton going to jail at a certain time or whatever. So I checked her speaker schedule during that time, and I was able to find video footage of her being somewhere at the time it claimed she would be in jail. And so I got back to him and I was like, "Okay, I checked one that seemed verifiable, and it just seems to be false. So I don't know why I would believe the rest of it." But of course, predictably, it didn't change his mind at all. He was like, "Maybe the video was backdated, or maybe she was in jail and then brought out of jail, and then brought back to jail." It did literally nothing to his belief system, but it was enough to make me think, "This seems like BS. Why should I believe this?" But I'm curious to hear your thoughts on this: I wasn't trying to really convince him, I was just kind of trying to engage with him. What do you think about what I did, from the point of view of talking to someone who believes in conspiracy theories? Because I'm gonna guess you're saying it's not the most effective thing to do.

JESSE: 100%. What we know from the research is that trying to debunk things with facts and logic just simply doesn't work. The capacity for human delusion, motivated reasoning, and justification ad infinitum is limitless. And so, what's much more important — and I guess this speaks to something that's been a real shift in my mindset over the last 10 years — is that when we first started The School of Thought, I had the belief that if we just taught people critical thinking, that would solve all the world's problems. And I think that is still really important. But what's really changed for me is that I'm much more appreciative now that we are emotional and social creatures first, and rational creatures very much second in that sequence. And unless we recognize that reality (it's ironically irrational for us not to), we're not going to get very far. So the takeaway is that if you want to connect with someone and help change their mind, it's much more important that you build rapport and listen to them, and try to understand where they're coming from. And rather than trying to convince them that they're wrong, instead maybe plant some seeds of doubt, and water them with some kindness and some connection. If you can do that, that's going to be a hell of a lot more effective than if you attack them or make them feel like they have to take a defensive pose. Because the other thing is this idea that we have to change people's minds in a single sitting. And of course, that's just not how minds change; that's not how anyone's mind changes. Think about our own minds: 10 years ago, we had certain beliefs, and somehow we changed those beliefs. But we almost never make a conscious decision to change them. They just soften over time, and a latticework of different perspectives and inputs comes into the frame as we mature and as we think about things more, in an iterative and slow process that isn't necessarily conscious a lot of the time. So bearing those two things in mind, if you're worried about a family member who has maybe gone down the conspiracy rabbit hole, the best idea is to make sure that you maintain a connection with them, let them know that you are still on their side, shoulder-to-shoulder rather than nose-to-nose, and be a voice of reason that's planting seeds of doubt without attacking their beliefs.

SPENCER: So to get concrete, in this example where this acquaintance reaches out and said, "Hey, you might be interested in this thing. Why don't you just read these?" If my goal was to change their mind, where should I begin with that process? What do you think a good response would be?

JESSE: That's a really good question. Essentially, it's just listening to someone and letting them know that you're still on their side, but that you don't buy that narrative. If it's just an acquaintance, you may not want to engage; it's not incumbent upon everyone to engage with people. But if it's someone you care about, say a family member, and you really care about your relationship with them, then you might be inclined to invest the time to say, "Look, I fact-checked this and it doesn't make sense to me. I can see where you're coming from, and why you might think this, but these are the reasons that I don't. It doesn't hold water for me." What you do there is, you're not trying to attack their beliefs, but you're letting them know that there's still a social connection with you. Because for a lot of people, the impetus to be involved in conspiratorial thinking is socially motivated. A lot of the time, people who are deep into conspiracy theories, in particular, feel that they need a tribe, and they get some of that validation from the conspiracy community. And they, in turn, find themselves less and less connected to their friends and family, who, as they see it, don't know the truth about things. And so it becomes a self-reinforcing construct. So the best thing you can do is maintain social connection and help that person know that they're not by themselves, they're not alone, and they're not your enemy; plant subtle and kind seeds of doubt about the narrative that they're putting forward; and let them change their mind, and help facilitate them changing their mind, on their own terms, rather than you attacking them and causing a disconnection. And that's part of what we're trying to do with the Conspiracy Test as well: to provide people with a way to share something with someone they may be worried about, who has gone down a rabbit hole or is about to go down the rabbit hole of conspiracy thinking, and to facilitate a less adversarial way for them to think about things rationally and skeptically and critically. Because a lot of the time, it can be really difficult for people who have people in their life going down these rabbit holes, and they don't quite know what to do about it. So we're hoping to plug that gap to some extent. But importantly, we're not actually trying to target hardcore conspiracy theorists, for the most part, with the Conspiracy Test. There is this misconception in the attention economy that the people who believe whole cloth and absolutely in conspiracy theories are representative of the broader population. The truth of it is, it's much more of a bell curve, where there are only a few people who are really hardcore committed to conspiracy theories and conspiracy thinking. And then there are a lot of other people who are just kind of disengaged and don't really trust pharmaceutical companies or the government, maybe for some good reasons some of the time. And they are a little bit convinced: "I'm not sure about Hillary Clinton. I've heard some things about her. Maybe I don't want to vote for her." There's enough mud there to have stuck without them being fully committed to that reality. So the noise-to-signal disconnect there is a really important one for us to understand on a sociological and political and broader level: most people are not hardcore, committed conspiracy theorists, but they may be affected by conspiratorial thinking and narratives in popular media without being that far down. So our target with the Conspiracy Test is not hardcore conspiracy theorists, but rather what we call the conspiracy curious.

SPENCER: And the idea is that that group is not only bigger than the hardcore conspiracy theorists, but also much more amenable to using something like a test, because they actually want to know whether it's true. They're not going to see the test as, "Oh, you're trying to convince me because you're opposed to my belief system."

JESSE: Absolutely. So the thing is that a lot of people might have questions about things, but they're aware of the social stigma of thinking about things in a conspiratorial way. They might have read some things or watched a YouTube video that has really caused them to have some doubts without buying into it 100%. And they are the people that we are most interested in connecting with the Conspiracy Test, because they don't necessarily have a way to evaluate things rationally and objectively that doesn't carry that stigma with it. They have actual questions they want answered, like, "I've heard some things about the 2020 election, and it didn't seem to add up to me. But I don't really want to say so, because then I'm going to be accused of being on one side of politics, or of being a conspiracy theorist, or whatever else. I'm just going to quietly harbor those beliefs. Even if I'm asked in a political poll what I think about these things, I might still not venture into it because I don't feel comfortable. But my voting decisions might be affected by it, and how I react to other people that I perceive as maybe not being part of my tribe might have its own repercussions and ramifications as well." So we want to connect with those people, who are a much broader group and, as you say, much more amenable to wanting to think about things critically and evaluatively.

[promo]

SPENCER: One thing I find confusing about conspiracy theory beliefs is the underlying motivation. Because on the one hand, you might say, "Maybe you feel special being part of this group that knows the secret about the way the world works." But on the other hand, these beliefs often involve very scary worldviews. The QAnon idea that people are kidnapping children and using (I don't know what it is) their blood, some chemicals inside them, for longevity: it's very, very scary. Or a lot of these beliefs are that there's a secret cabal running the world, or a secret group that has all this power. And you might think that people would have an internal motivation not to believe these things, because it's such a scary view. Finally, you might think, "Maybe part of what's going on here is that people have pre-existing beliefs that the conspiracy theories fit into." So if they're really distrustful of certain groups, they might think, "Oh, yeah, this conspiracy theory helps explain my distrust," or something like that. So I'm wondering, what do you see as the key motivations that drive conspiracy theory beliefs, and why is the scariness of the beliefs not a significant demotivator to believing them?

JESSE: That's a really good question. And I think it's multifaceted and complicated, but there are also some really interesting and resonant insights. So the first thing is that we know dark triad personality traits are a strong predictor of conspiratorial beliefs and conspiracy thinking. The dark triad personality traits are Machiavellianism, narcissism, and psychopathy. People who are predisposed to essentially those three traits are also highly correlated with conspiracy thinking. So we know that you can sort of pathologize that aspect of things. But I think what you're driving at is another aspect of it, which is more about power, understanding, and control. My perspective on this is that a lot of people experience the world as disempowering: they feel hurt, ostracized, and all these other things. If there's no reason for that experience, or if the reason for it is some personal failing on their part, or that they haven't succeeded in a meritocracy, as it were, then that's a really hard pill to swallow. So it's much more convenient to believe, "Actually, the reason why I'm in this state of anxiety or depression or disconnection from society is because of these reasons. And I have that secret knowledge. And I have that understanding that actually there's something else going on." There's an empowering aspect to that. But it also helps to explain why: "Maybe I'm not in a position where I feel validated or successful, but now whatever is hurting me has an explanation in the external world that's not my fault." It's a way to exteriorize and assign blame to a malevolent force that is beyond their control.

SPENCER: That makes sense to me. But I wonder whether most conspiracy theories are really about that, because you take something like 9/11 conspiracy or QAnon, do these really explain people's personal struggles or give them an excuse?

JESSE: I don't think it necessarily has to comport directly with the personal experience. There are certain cases where we can see direct connections: if people have a negative view of immigration and are themselves struggling financially to find work, you can see a one-to-one relationship there, maybe, with a conspiracy theory that the government is trying to get rid of my race of people, and so on and so forth. There's an angle there that connects. But as you say, with QAnon or something that is way more out there, I don't think it's necessarily the case that there's a one-to-one relationship with someone's experience on a logical, deductive level. It's an emotional and social driver: wanting to believe in something, and connect with something, that explains something about the world outside the normal frame of reference, because the normal frame of reference isn't working for some reason. "I feel that I'm disconnected from my family. I'm not successful in work and life. I feel a sense of disconnection." And so a narrative of a malevolent force that explains the power dynamics in the world as nefarious is an attractive narrative proposition, regardless of whether it connects in a logical way to my personal experience of the world.

SPENCER: As far as I'm aware, the group that has had the most conspiracy theories thrown at us are Jewish people. At least I'm not myself familiar with any group that's had more conspiracy theories lobbed at it. And I'm wondering, why do you think that is?

JESSE: That's a really good question. Antisemitism is the easy answer to that, and there's a lot of historical pretext for why people have held antisemitic beliefs throughout history. The Jewish people, as you well know, have been largely dispersed for the last several thousand years. And so you get a lot of the xenophobic, outgroup kind of motivators of racism promoting and fueling particular ways of looking at things. And a really important frame for conspiracy thinking generally is that there are a lot of in-group/out-group motivation mechanisms going on. People who want to connect with a tribe find that tribe amongst other conspiracy theorists. They find a connection where maybe they don't have a connection in their own lives. People who maybe feel threatened on either an economic level or a social level have underlying in-group/out-group racist motivators fueling their belief systems. And so the Jewish people, I think, have for that reason been very much a target of this kind of conspiratorial thinking, which is promoted by in-group/out-group thinking, just as many immigrant groups in different countries have been subject to conspiracy theories. I think another aspect of it is that Jewish culture, generally, is very intellectual. And so it plays into the anti-elite mindset as well. One of the core tenets of conspiracy thinking is to be anti-elite: "These elite powers are controlling things, and I'm not getting access to that, and that is unfair." And so when there's an ethnic group who are very visibly intellectual, and maybe doing very well in society on some levels, it's very easy to play into a double whammy of xenophobia and anti-elitism. I think that drives a lot of the conjecture to some extent, but that's my perspective on it.

SPENCER: Yes, it's sort of similar to some thoughts that I had. If you look at the accomplishments of Jewish people, it's kind of incredible. My understanding is that, relative to the population of the world, they've had something like 100 times more Nobel Prizes than you'd expect. Obviously, they're way overrepresented compared to their percent of the population in things like media and finance and other influential areas. And so, if you look at the world and you say, "How is that possible? How could such a small group do so well? What are the possibilities?" One possibility is that they have a culture that helps them succeed. Another possibility is the genetic explanation. A third possibility is that somehow they're cheating or doing something Machiavellian or have a secret cabal. And I think a lot of people jump to that explanation. I don't know what other explanations there could be. But there aren't that many possible explanations. And I think if you reject the other ones, you kind of jump to the conspiracy one.
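
[Spencer's "100 times" figure is easy to sanity-check with rough, commonly cited approximations. These numbers are not from the episode: Jews make up roughly 0.2% of the world's population and have received roughly 20% of Nobel Prizes.]

```python
# Back-of-the-envelope check of the "~100x" overrepresentation claim,
# using rough, commonly cited approximations (not figures from the episode).
share_of_world_population = 0.002  # ~0.2% of the world's population
share_of_nobel_prizes = 0.20       # roughly a fifth of Nobel laureates

overrepresentation = share_of_nobel_prizes / share_of_world_population
print(f"Overrepresentation factor: ~{overrepresentation:.0f}x")  # prints ~100x
```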

JESSE: Yeah, absolutely. And the underlying motivators for conspiracy thinking there, I think, are largely power-based. A sense of disempowerment is a strong predictor for conspiracy thinking, and for extreme conspiracy thinking as well. There's a lot of work on status that's been coming out in the last few years, and one finding that I think really speaks to this (people don't like to acknowledge it because it seems egotistical or vain) is that, as social creatures, our status matters very much to us. And if our status is threatened, or we perceive it to be threatened, that's a really strong emotional and psychological driver of our behavior, often without much conscious reflection. So if there is a perception that a particular group in society is lording it over the rest of us, has unfair access, and is somehow an elite cabal doing something nefarious and unfair, that could explain why I have these experiences in my own life where I feel disempowered, where I feel like I'm not getting access to the same resources and opportunities that other people are. You don't often see people who are very successful and happy in their lives waxing lyrical about conspiracy theories. I think that's kind of a telling reality.

SPENCER: One thing you mentioned earlier in the conversation is this heuristic — and I've heard many people give this heuristic — that when analyzing a conspiracy theory, you can ask the question, "How many people would have to keep this secret?" And then you can say, "If the number of people that have to keep the secret is really, really large, it's probably not a real conspiracy. It's probably a fake conspiracy." And I'm wondering, how good a heuristic do you think that is?

JESSE: It depends. There are obviously real conspiracies, and people have conspired to keep things secret. So it's not an absolute. Rather, what we should do is think in this probabilistic way: "Okay, let's consider how many people would need to be in on that, how much motivation they would have to keep their mouths shut for a long period of time, and whether there would be evidence to the contrary." And for some of the conspiracies that have come to light, people did keep their mouths shut for some time. It was classified information under threat of national security violations; they were able to keep a lid on things, or something like that. So it's not an absolute quantity. But for a lot of the more extreme conspiracy theories, it is a real consideration to think probabilistically about: "It would actually be pretty difficult to keep that under wraps, for that many people, for that long a time, with no evidence ever coming out to suggest that it was actually the case."
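
[Jesse's point can be made concrete with a toy leak model, loosely in the spirit of David Grimes's 2016 paper on the viability of conspiratorial beliefs. This is my sketch, not something from the episode, and the per-person leak rate is a made-up illustration rather than an empirical estimate.]

```python
# Toy model (illustrative assumptions only): if each of n insiders
# independently has a small chance of leaking per year, the odds that
# a conspiracy stays secret collapse as n grows.

def p_no_leak(n_insiders: int, years: float, annual_leak_rate: float = 0.001) -> float:
    """Probability that nobody leaks over the whole period."""
    return (1 - annual_leak_rate) ** (n_insiders * years)

for n in (10, 100, 10_000):
    print(f"{n:>6} insiders, 20 years: {p_no_leak(n, 20):6.1%} chance of staying secret")
```

[With these assumed numbers, 10 insiders keep the secret about 82% of the time over 20 years, 100 insiders only about 14% of the time, and 10,000 essentially never. That's the intuition behind the heuristic, though, as Spencer notes next, tight secrecy regimes can push the per-person leak rate far lower.]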

SPENCER: It seems intuitively right that the larger the number of people that need to keep something secret, the harder that would be to do, on average, all else equal. But there seem to be interesting exceptions. You mentioned military exceptions. The military is really good at keeping secrets; it's one of the key things that it does. Tight-knit groups, in general, seem like they can keep secrets pretty well. Scientology literally infiltrated the US government, and presumably there were tons of people in Scientology who knew this was happening, but they managed to do it nonetheless. So if you have a sufficiently tight-knit group, maybe they can accomplish this.

JESSE: Absolutely. We say right at the outset of the Conspiracy Test that conspiracies are real. And the point of what we're doing is not to dismiss and be contemptuous towards anyone who wants to question the powers that be, but rather to level the playing field, because critical thinking works both ways. We should question pharmaceutical companies, we should question the government, and we should question the powers that be in all frames of life within a democratic open society. So the idea isn't to dismiss conspiratorial inquiry or to say that we shouldn't question things, but rather to say that we need to try and be even-handed about this. We create this polarization effect where most people, when you say conspiracy theory, go right to the extreme QAnon end of the spectrum, and maybe don't know about COINTELPRO or MKUltra or any of these other things. And then on the other side, you have people who do know about those things and then conflate them with all of the much more far-out-there conspiracy theories. So what we're trying to do is facilitate a more even-handed, probabilistic, critically evaluative way to look at things so that we get the best of both worlds.

SPENCER: Another thing I think about is, for some of these wacky conspiracy theories, if someone did come out and say, "Oh, no. I'm an insider, and it really is true," I think the reality is, a lot of people would call that person into question. They'd say, "Maybe that person's crazy, maybe that person's lying." So it's a little bit of a funny thing. You can argue, "If this were really true, then somebody would have come out." But if someone does come out, they can be dismissed easily. Still, I think there's a real phenomenon going on, where it's easy to miss the idea that, on any given topic, there are some people willing to lie for attention, and there are also some people who are mentally ill and might be schizophrenic or have other mental issues going on. And so, even if someone does come out as an insider, you can't necessarily take that at face value.

JESSE: Yeah, absolutely. And I think with the UAP-UFO stuff that's happened recently, the conspiracy community is in conniptions, because it would seem that they've had some validation that there were things going on all along. But what's often not apparent, and what we find if we think about things skeptically and rationally, is that things can appear a certain way, and it doesn't matter how good a critical thinker you are: the idea that you are going to have access to absolute truth and have insight beyond your own scope of reference is just wrong. Intellectual humility is the core of critical thinking, and it is a lesson that you learn over and over again in your life. So take the UFO-UAP things that have happened recently. It would appear that whistleblowers have come out and gone before Congress and said, "There's actually evidence of these UAPs, and we think something's going on here." The narrative that's not surfaced is maybe a wicked problem where, say for instance — pure conjecture here, and I'm not making any claims — were it the case that there was a covert arms race in aviation technology between superpowers in the world, there would be a national security reason not to be able to talk about that, and only people with certain clearance levels would be privy to that information. It would be something closely guarded by the militaries of those respective countries. And so you have this wicked problem where you can't really come out and say, "What's actually going on, guys, is this." You kind of have to keep your mouth shut, but then you also have to deal with the public and political ramifications of that being the case. You might have someone with a lower security clearance who isn't privy to that going out there and saying, "Hey, guys, I found something that's going on here." And then maybe that person, after they've done that and been briefed on it, might come out afterwards and start saying things that discredit what they were saying earlier, because they realized there might be a reason why that would put national security issues into the frame.

SPENCER: The UFO topic happens to be so interesting, because you really do have legitimate-seeming people saying, "I saw this thing and I couldn't explain it." And then you have people in the government saying, "None of this stuff is true." And then you have the official position, which is, "No, these things have never been found." And then you have lots of different things going around the internet. Some of them are real, some of them are fake. Some of the real ones actually come from the government. And then some people say, "No, this is just this thing. We can explain this as a known phenomenon." And other people say, "No, that doesn't explain it." This is just an incredibly complex landscape to try to make sense of. So it feels like you could easily spend hundreds of hours investigating it and still feel like you don't really fully get the whole picture. So I'm curious: for someone who's interested in, let's say, the UFO topic, how would you suggest they approach it?

JESSE: Yeah, that's a great question. I think the answer is that, for many topics, we just don't have access to the information that would really clarify things for us. So the epistemic doubt that we ought to have for particular subjects (such as, "What is happening on a national security level?" or, "Is there information that I maybe can't or don't have access to?") results in an uncomfortable conclusion, which is, "I just don't know." And people don't like to not know [laughs]. So it's much more attractive, within the attention economy, to have a narrative that plays to, "Actually, I do know. It's secret knowledge. And if you understand what's really going on here, it's actually quite exciting and salacious. And we can be the power that holds power to account, with our collective rising up against these elitists, and so on and so forth." So there are these bad incentives that can muddy the epistemic waters and make us a lot more prone to believing things that maybe aren't likely to hold up in the long run, and that ought to be subject to a little more skepticism and circumspect rationality. And it's hard to sell that, because "We're not sure. Maybe there's something there, but we don't have access to the evidence, and we're not really going to know, because it's really complex" is not as attractive a narrative.

SPENCER: You use the phrase intellectual humility earlier. How do you define that, and why do you talk about that, in particular?

JESSE: So intellectual humility is something that I preached when I was quite a bit younger. And there was an irony there, because I was preaching it without really practicing what I preached, to a large extent. I was part of the new atheist movement and was very combative in the way that I argued with people who were wrong on the internet. Since then, I think I've swallowed my own medicine a lot more (I hope I have, at least), because I realized that I was wrong about a lot of things. And when you keep realizing that you're wrong about things, it helps generate intellectual humility, which is just the baseline mindset of thinking, "I might be wrong." My favorite quote in the world is from Bertrand Russell: "The whole problem with the world is that fools and fanatics are so sure of themselves and wiser people so full of doubts." And the way that plays out in the marketplace of ideas and in politics and everything else is that people who are reflective, critical, circumspect, and rational tend to be a little less likely to speak up, and certainly less likely to speak up in a very aggressive, boastful, strong-narrative way. And that's a real pity, because it gives us this inverted reality where the people who are most confident (the Dunning-Kruger effect in play, writ large) tend to be the least reflective a lot of the time, and maybe overconfident, as it were. Whereas the people who are more reflective, more evaluative, and more circumspect tend not to speak up as much. And that creates a feedback loop within our culture that can have really dire consequences.

SPENCER: So then, would you say intellectual humility is just sort of a metacognition where you're aware that you're sometimes wrong, you observe that you've been wrong in the past, and you kind of apply that when you're thinking in the future? How would you kind of define that concept?

JESSE: I think intellectual humility is just, very simply, having the mindset that we might be wrong about things. And thinking probabilistically is the functional reality of intellectual humility: instead of thinking, "This is true; this is false," we come into things with what Julia Galef calls the scout mindset, which is just wanting to understand things impartially, being truth-seeking and curious about what's really going on, and not being too confident in our own priors and our own beliefs.

SPENCER: One thing you might wonder about with that is whether it pushes people to be not opinionated enough on topics they really know. Because one interpretation of the Bertrand Russell quote — which I'll just say again: "The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." — is that actually, when you know a topic, you should be more confident, that you should be going out there and saying, "Oh, no. I actually have a really high-probability belief on this topic."

JESSE: Yeah, absolutely. And that's the rub of it. It inverts what ought to happen in practice, because people who maybe don't know very much about a subject and have just a surface-level understanding feel very confident speaking about it, because they think they've connected the dots. Whereas once you've done a PhD thesis on something, you realize it's a very nuanced and complicated reality, and that actually there's a whole lot you don't know, even though you're an expert in the field. And so there's this disconnect, this epistemic dissonance, between the two people's points of view. And it's the worst of both worlds in some ways because, as you say, the people who know the most about things tend to be the most circumspect, because they've realized that even on subjects we have a lot of good evidence for (hard sciences notwithstanding), there are still some pretty big areas of doubt, fuzziness, unknown unknowns, and other things we just don't have access to yet. And so that position of epistemic doubt is actually the right one, rationally speaking, but how that plays out socially and politically is not how it should be or how we would want it to be.

[promo]

SPENCER: I think it's extremely valuable to keep in mind that you might be wrong, especially on complex topics. However, there's something about the phrase intellectual humility that rubs me the wrong way. I worry that it will make people more reluctant to give their opinions when they're not experts. And I actually think there's a lot of value in giving your opinion when you're not an expert, while still understanding that you may be wrong and going into it with that in mind. So I'm curious what you think about that. This is spoken as someone who gives their opinion on topics all the time, and I'm sometimes wrong. But I find that incredibly valuable, because then I find out I'm wrong. Whereas if I just didn't say my opinion, I would never find out.

JESSE: I 100% agree with you, actually. I think the implication that we should keep our mouths shut if we don't know is the wrong conclusion to draw. What we know from a lot of sociological research is that we are much better at thinking about things as a group activity than we are individually. We're much better at reasoning socially, because we're much better at seeing other people's biases than we are at seeing our own. There's a great Hume quote: "Truth springs from argument amongst friends." So if you have a good faith engagement with somebody where you put your ideas to the test, in a context where you're really trying to understand the truth of things and you're curious about it, and you're using the principle of charity and acting in good faith, that's by far the best scenario in which to venture your opinion, even if you're not sure about it. But what intellectual humility is getting at, I guess, is that we ought not to go into things thinking we know the truth, but rather, even if we have opinions (because of course we have opinions), to be curious about finding out: "Does this stand up to scrutiny? What do you think? Is there evidence to support this view? Are there counterfactuals that I haven't considered?" If we have that mindset of being curious and wanting to understand, that's the important part of intellectual humility. It's not about keeping your mouth shut or not engaging in debate or collaborative social reasoning or anything like that. It's about not being so cocky and confident in our own beliefs that we think we have access to the absolute truth.

SPENCER: One thing I find useful when I give my opinion is trying to flag my level of confidence. And I think that that helps people when they're talking to me, or when they're reading what I'm writing. So if I say something really confidently, they know, "Okay, Spencer doesn't always do that. So, that actually means he is more certain on this topic." Whereas if I say, "Here's what I think might be the case in this situation," then they can adjust on that. And so, I find that for people that maybe are reluctant to give their opinion, I would recommend using that to your advantage, where you kind of flag how confident you are or how unconfident you are. And that maybe makes it more okay to give your opinion on topics that you know you're not an expert on or you know you don't know that much about it, but it still can be valuable to say what you think.

JESSE: Absolutely. And I think even if you are an expert, still flagging that does two things. The first is that it says, "Well, I'm not 100% sure myself." And the second is that it opens up what we do with the probabilistic thinking within the Conspiracy Test. The idea is that as soon as you flip it from being a binary to a scale of confidence, you change the way in which we think. It's a really profound shift, and you can see it kind of physically happen. So I 100% agree with you that if we frame things in that way — with confidence levels and probabilities — it opens up a more good faith way to engage with ideas than if we're just trying to defend one particular view as an absolute truth, or attack other ideas as absolute falsity. That way of interrogating things is often very unhelpful. Having said that, truth springs from debate amongst good friends, because it's fine for us to take a position and defend it from a kind of devil's advocate point of view as well, just to stress test it. That's a very different mindset from the emotionally motivated one of, "These are my beliefs. And if you're attacking these ideas, then you're attacking me as a human being." The two are often conflated as being the same thing, but they're very much different, I think.

SPENCER: I think a related thing is that people sometimes don't give their opinion because they're afraid that being wrong is really bad. And as someone who has been wrong on the internet, I can say it's not that bad, really. I remember once I posted something about John Rawls, and I had actual John Rawls scholars beat the crap out of me, being like, "You're wrong," which is fine. That's how you learn. It's just not that big a deal to be wrong. Especially, I think, people take it really well if you're wrong graciously. If you're like, "Oh, wow, thank you for pointing that out. Awesome. I'm gonna update what I wrote. You're correct," they actually feel really good about it, and they don't think you're a jerk.

JESSE: Absolutely. Isn't it such a curious, counterintuitive thing that there is this strong human tendency to defend our ideas, as if we're going to lose social status, as if we're going to lose something, if we go, "Oh, I was wrong about that." And yet it is very obviously, universally the case that when we see somebody demonstrate that kind of intellectual humility, where they go, "Oh, I never thought of it that way," or, "Oh, I was wrong about that. I'll update my thoughts on it," we immediately respect that person so much more. We immediately go, "Oh, that person has demonstrated to me that they have integrity, that they care about the truth of things. Anything that they say from now on, I'm gonna put so much more faith in." And they've had their status elevated as a result of admitting they were wrong. So it's a really curious, counterintuitive reality that we defend the ideas we might have instinctually and emotionally, and it actually lowers our status rather than raising it.

SPENCER: Yeah, I wonder if there's a conflation of someone who overconfidently states a lot of things that turn out to be false, which most people, I think, would judge quite negatively, versus someone who states things in a balanced way based on their actual strength of belief and occasionally is wrong, but then graciously admits it and says, "Oh, you're totally right actually; thanks for correcting me." And I think the second thing is actually really well received, mostly.

JESSE: Absolutely. And that's the thing. I think that mindsets and attitudes are more important than the truth claims and specificity of ideas. So, it's not that certain people have these correct ideas or those incorrect ideas because, as history teaches us, we're often very wrong about things. Collectively, the norms of society change pretty profoundly over time. And so, it's not just about personal intellectual humility, but there's this kind of collective intellectual humility we ought to have as well. And so, the idea that if you are attitudinally receptive to the idea that you might be wrong, and you have confidence levels and you're thinking probabilistically in good faith, that becomes apparent over time. And we learn as a result of that. So, it's not the same as being really arrogantly confident about an idea and stating it without much evidence. You are going to lose status as a result of that. And you should lose your status as a result of that. The idea that there's no utility function to status, I think, is wrong as well. Because as social creatures, we hold each other to account. And there's real value in that, and we ought to be intellectually humble and have epistemic doubt about things generally. And having confidence levels is a reflection of that, rather than this kind of, "These are my beliefs and I believe them wholeheartedly and will defend them to the death." The difference in those mindsets is a really profound and distinct one, I think. They're not the same. And we often get into trouble when we think that they are the same, and we treat them as being equivalent.

SPENCER: You've said something that I found interesting, which is that all reasoning is motivated but we can change our motivation. So tell me about that. What does that mean?

JESSE: As Daniel Kahneman and Amos Tversky's research from the 70s onwards has shown pretty convincingly, we are all subject to biases. We're all subject to a level of intellectual and reason-based error that we can't ever really correct for entirely. The truth of how human beings work is that we work with what we've got. Our belief systems are a reflection of our circumstances, our genetics, whatever we were exposed to as children and grew up with, and we're just making our way through. So it is the case that when you have a certain set of beliefs, you must necessarily be motivated by those beliefs, by that worldview, by those biases that you carry around with you. But you can change on a systemic level if you aren't just operating on the level of ideas and beliefs themselves but on a mindset of wanting to get at the truth of things, which is fueled by curiosity and truth-seeking. Then you can have a self-regulating system where, over time, you can update your priors, become more aware of things that maybe you weren't aware of, learn what you were wrong about, and gain a deeper understanding of things. That, to me, is the really key difference: are your beliefs self-justifying, ideas that you're defending? Or are you motivated to want to understand the truth of things? And of course, everyone wants to think that they're motivated by the truth of things. But it's something that you actually need to do. It's a verb, not a description. So I think that if we understand that all of our reasoning is motivated by our own priors, whatever they might be, we can change those motivations to curiosity and a truth-seeking mindset. And that helps us become more aware, have deeper understanding, and become wise over time.

SPENCER: So Jesse, how about we end on a lightning round, where I'll ask you a bunch of really difficult questions and you have to give really short answers?

JESSE: Okay, yeah. [laughs]

SPENCER: All right. So you've done a ton of work on cognitive biases. Which cognitive bias do you think causes the most harm in society?

JESSE: I would say probably confirmation bias. And that's a bit of a cheat answer, because it is kind of the mother of all biases in a lot of ways. It speaks to motivated reasoning writ large.

SPENCER: What's a cognitive bias that you think is important that people may not have heard of before?

JESSE: Maybe the curse of knowledge bias, which is the idea that once you know something, it's very hard for you to understand that other people don't know it. It's very instructive, in a practical way, when you're teaching people something, because we can often forget that, before we understood a concept, it was actually quite difficult, and we presume and project our own understanding onto others. So being aware of the curse of knowledge bias can help us become better teachers.

SPENCER: When we look at society today, with so much tribalism and so much hatred towards the other side, one might think that maybe the two sides — the Left and Right in the US, for example — actually are just fundamentally incredibly different. Do you think that's true, or do you think that the two sides actually have a lot in common, maybe that they're missing?

JESSE: Definitely the latter. We're all human beings. And I think we can see the effects of the post-World War II era, where there was a galvanization against fascism and behind the idea of liberal democracy, which gave us a kind of cultural unity that lasted for a really long time. And Ezra Klein's work on polarization and tribalism is really instructive here: we have this identity-stacking kind of reality where we think we're a lot more divided than we actually are. But if you actually talk to someone with different political beliefs from you, and you foster a good faith relationship, you almost invariably find that there are mostly just misunderstandings, and maybe some ontological differences to some extent, but far less than I think sensational narratives would have us believe.

SPENCER: Some people attribute a lot of bad things in society to the small percent of really, really bad actors: people who are really malevolent and have really bad intentions. Other people attribute a lot of problems in society to a much larger group of people who are maybe pretty decent, but they're confused, or they're full of anger towards another group, or they feel wronged, and they do harmful things out of that. What do you attribute most of society's problems to? How would you break it down between those two explanations?

JESSE: Yeah, sure. So I think that there's a small number of bad faith, narcissistic sociopaths who have a pretty disproportionate influence on things. But for the most part, again, there's that kind of bell curve of people who maybe tilt one way or the other but are misrepresented, I think, in the narrative sphere of sensationalism and the attention economy. There are bad incentives, where the most extreme kind of strawman examples of things tend to be held up as tribalistic exemplars of a particular worldview. And that's often very unrepresentative of people's lived reality. So, I think that there's a really bad incentive set and a bad feedback loop effect that goes on. And when you actually talk to people, you find out that they have a lot more of a good faith perspective on things. And even if you disagree, maybe it's more of a misunderstanding than an intractable difference.

SPENCER: What do you think about the difference between knowledge and wisdom?

JESSE: So maybe a way of defining wisdom is to say that wisdom is knowing how and why something is the way it is, such that one can be discerning. As an analogy: knowledge is knowing that a tomato is a fruit. Wisdom is not putting it in your fruit salad. And philosophy is wondering whether ketchup is therefore technically a smoothie. So there's this idea that wisdom is this kind of navel-gazing, gurus-espousing-vague-aphorisms type of thing. What I'd propose instead is that it's more like context-independent knowledge. And what I mean by that is that you can know that 3 times 4 equals 12 — learning by rote, and that's knowledge — but if you understand why 3 times 4 equals 12, then you'll also know that 12 divided by 4 equals 3. So understanding the how and why of things is, I think, a really profound shift and demarcation from knowledge.

SPENCER: When you think about trying to fight a kind of epistemic crisis that some people say that we're having in society now, how does promoting wisdom fit into that?

JESSE: Yeah, so I think we should try to facilitate understanding, rather than just the correct knowledge that people might have about a particular subject, say, marriage equality or a woman's right to choose with regard to abortion, these kinds of contentious issues. If we attempt to smack people over the head with our belief set and try to force it on them, it can often be counterproductive. Whereas if we try to promulgate wisdom and understanding of why it is that we have these beliefs, and why we think this is the truth of things, that can be a lot more effective, and it becomes a self-regulating kind of system that helps democracy function much better. Because the more we promote understanding, versus a particular worldview or policy or anything else, the more we get flow-on effects that are more of the same. By promoting understanding, we beget more understanding.

SPENCER: What do you think the effects are when people take an adversarial approach towards those they disagree with?

JESSE: Oftentimes, it's counterproductive. It's interesting because I think most problems are conflation errors, basically, and when we take things apart, there are relevant differences. If you're arguing with someone in a public forum and they're putting forward a bad faith point of view that is based on factually inaccurate priors, and they're a charlatan or whatever, the idea that we ought to engage them in a good faith, understanding conversation, and give as much attention to their point of view as anybody else's, is, I think, wrongheaded. It's perfectly fine, and I think advisable, in a public forum to call people out on things and say, "Hang on, that doesn't stack up to the facts of the matter." But that's different from when we're having a conversation with someone and we're attempting to actually build a relationship and rapport and understanding together. These contexts matter, and they're often conflated as being the same. So if you're arguing with somebody on the internet, for example, or on a university campus, it's good to engage in good faith debate and be vociferous about things in the marketplace of ideas. That's a really healthy thing to do in a democracy. But if you're having a one-on-one conversation with somebody and you take that same adversarial approach, you're actually going to create more polarization rather than less, because the only audience is you and the other person. And you lose something even in the public debate, because an adversarial approach there will probably entrench the other person's beliefs even more. But for everybody else watching, if that person's arguments don't hold up to scrutiny, then it might be a net positive: you're actually putting forward a good counterargument.

SPENCER: Last question for you. When it comes to improving critical thinking, do we need more education reform or do we need more cultural change? And how do you think about that choice?

JESSE: Oh, I'll try to answer that quickly, but I might fail. I came into starting this nonprofit, The School of Thought, with the idea that if we taught the next generation how to think for themselves, then we would solve a lot of the world's problems. I still believe that to be true; I think educational reform is badly needed. But I've also come to a couple of other conclusions in the meantime. One is that it's really, really hard to create educational reform, especially in the United States. It's just a very intractable, bureaucratically dense, difficult situation, politically and otherwise, in which to make the changes that maybe would make a really profound difference, like teaching philosophy and critical thinking in middle school, for example. So those things are just practically very difficult. But also, I've come to the view that, because we are emotional and social creatures before we are rational creatures, the tools of critical thinking and rationality need to come a couple of stops down the line in how we think about things. Strategically, we need to focus on the social, emotional, and tribalistic aspects of our motivation sets before we get to those instrumental tools. And so, we're shifting our focus a little bit at The School of Thought: not abandoning critical thinking tools, which I still think are really important and have a lot of utility to teach everyone, and kids in particular with their neuroplasticity, but also looking at the cultural aspect of things. I think that's where the most good can be done, and where we need to make the shift more profoundly and more urgently than in educational reform, which is a good 10 to 20 years of work even in the best case scenarios. But I think culture can be shifted quite quickly and profoundly: if the ideas are resonant and powerful, they become self-reinforcing and can produce a really profound shift in a relatively short period of time.

SPENCER: Jesse, thanks for coming on. This was a really interesting conversation.

JESSE: Thanks so much for having me, Spencer. I really appreciate it.

[outro]

JOSH: A listener asks: "With so many different methods of meditation available, what's your advice on how to approach finding one that works best for someone interested in it?"

SPENCER: It's a good question. We're actually launching a website to help people explore different meditation activities, because what we realized is that there's actually a huge number of them available. There are so many different things that you might call meditation, and they can have quite different effects from each other. But in terms of finding one that works for you, I don't know of a better method than just experimenting. I do think that when you're experimenting, it's worth giving each one a real shot. So maybe you try it for an hour a day for a couple of weeks; that might be a good way to test out a method and see if you find it helpful. Some that I think tend to be more helpful to people are compassion-type meditations, where, for example, you imagine someone you feel a great deal of love or compassion for, and then you kind of channel that feeling: you try to apply it to yourself, then to acquaintances, then to strangers you've never met, and so on. So that's one type that I think people tend to get quite a bit of value out of. Another type is basically awareness of the present moment, where you practice being totally aware of what's happening around you. If thoughts pop into your head, you note those thoughts, let them go, and then go back to focusing on the present awareness. I think there's value in this type, especially in learning to be more aware of every passing moment as you live your life, which can make life more joyous and can make you notice things you wouldn't normally notice. It's something I aspire to get better at; I do it from time to time, but I could do it a lot more. Another type that is popular, which a lot of people say they've gotten benefits from, is the kind that requires deep concentration: focusing on your breath as it goes in and out of your nostrils, for example, and really learning to concentrate to the point where you can home in on that subtle feeling. And then as things come up, like an ache in your back or a stray thought, you always just keep returning to the object of meditation, and this can improve concentration. I did this kind of method for about 20 minutes a day for a year, and I got better and better at it. I wouldn't say I ever achieved an expert level, but I did find it had some benefits. So those are some to explore.
