Tyler Porter (University of Colorado at Boulder), "Manufacturing the Illusion of Epistemic Trustworthiness"
Forthcoming, Episteme
In 2015, conspiracy theorist Alex Jones claimed that the U.S. government had used bombs that turn people gay “on our troops, in Vietnam… and in Iraq.” Jones often makes claims like this. After this proclamation, however, Jones did something that caught my attention. He paused and said, “If you’re a new listener just type in ‘Pentagon tested gay bomb.’” Naturally, I was curious and did exactly that. As it turns out, mainstream sources confirmed that the U.S. military did attempt to develop a bomb that turned people gay. The research even won an Ig Nobel Prize. If you want to learn more, just type in ‘Pentagon tested gay bomb.’
At this point, I was intrigued – not by Jones’ claims or the articles I found, but by why Jones had directed audience research in this way. A few features of his directive stood out. First, his statement was shocking, intriguing, and unbelievable – a claim many would feel compelled to follow up on, and one they would remember if he turned out to be correct. Second, Jones must have known that Googling ‘Pentagon tested gay bomb’ would yield a range of mainstream results that seem to cohere with his claims. Of course, mainstream sources didn’t confirm Jones’ actual claim (that the U.S. government has used bombs that turn people gay), but they did confirm nearby and similarly outrageous claims (that the U.S. government attempted to develop bombs that turn people gay).
This observation kicked off the research project that eventually became Manufacturing the Illusion of Epistemic Trustworthiness. Initially, I thought Jones’ behavior could be captured by existing philosophical concepts. In particular, C. Thi Nguyen developed the now-famous distinction between epistemic bubbles and echo chambers (see Escape the Echo Chamber). In an epistemic bubble, group members believe things because they are not exposed to outside information or opposing arguments. If you only read the New York Times and no other sources (like Fox News), you are in an epistemic bubble: you are unaware of the positions being taken by outside groups. In echo chambers, on the other hand, group members are aware of outside information – they just don’t trust the bearers of that information. If you read the New York Times and other sources but were trained to distrust those other sources, then you are in an echo chamber: you understand the positions being taken by outside groups, but you don’t trust those groups. Jones’ research directives didn’t fit either of these concepts, however. He was explicitly pointing people toward outside sources and relying on the fact that people trusted those sources. This behavior just didn’t fit neatly with existing philosophical concepts.
I began trying to understand what Jones was up to. I settled on the notion that Jones was onboarding new listeners – trying to build trust with them. (Perhaps after this onboarding, Jones’ messaging would switch to that found in a typical echo chamber.) Following some helpful advice, I looked into empirical work on how people evaluate trustworthiness. One influential meta-analysis identified three factors that people look for: ability, benevolence, and integrity. For example, if you wanted to find a trustworthy babysitter, you would look for evidence that they could watch your children competently (ability), that they had goodwill towards you and your children (benevolence), and that they were a good person (moral integrity). I thought these were the factors that Jones would try to manipulate to gain people’s trust.
How would bad actors build the illusion that they possessed epistemic ability? They would make themselves appear to be experts. I explored four factors people should look for when attempting to identify experts, and how those factors can be manipulated: coherence with other trusted sources, direct verification of claims, signs of intelligence, and credentials from reputable institutions. The gay-bomb example demonstrates an attempt to make shocking claims appear to cohere with trusted sources. With coherence out of the way, I looked for examples of bad actors manipulating the other three factors. The Flat Earth community manipulates new audience members by seeming to help them directly verify shocking claims. The community runs forums that teach people how to “work through” mathematical and scientific problems for themselves. On one of these forums, people are trained to accept the following argument:
1. If the earth were round, then you couldn’t take a photograph of Chicago from across Lake Michigan.
2. You can take a photograph of Chicago from across Lake Michigan.
3. So the earth is not round.
New members are taught how to work through mathematical calculations that seem to confirm premise 1, and older members lead field trips to take the photograph that verifies premise 2. This process gets people to feel as if they have directly verified all the premises of the argument. The problem is that their confirmation of premise 1 does not include all the relevant information. Thus, the direct verification of this claim is merely an illusion.
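To make the omission concrete, here is a minimal sketch of my own – an illustration, not anything taken from the paper or the forums – of the curvature arithmetic at issue. The particular numbers (a roughly 55-mile sightline across Lake Michigan, a 6-foot eye height, a 200-foot dune) and the standard surveyor’s refraction correction are all my assumptions; the point is only that the naive calculation leaves out the observer’s elevation and atmospheric refraction, exactly the kind of relevant information the forum exercise omits.

```python
# A sketch of the "hidden height" calculation, assuming a spherical Earth,
# the standard small-angle approximation, and the common surveyor's
# refraction correction (effective radius ~ 7/6 of the true radius).
import math

R_EARTH_MI = 3959.0  # mean Earth radius, miles
FT_PER_MI = 5280.0

def hidden_height_ft(distance_mi: float, observer_height_ft: float,
                     refraction: bool = False) -> float:
    """Feet of a distant object hidden below the observer's horizon."""
    # Refraction bends light slightly toward the surface; surveyors model
    # this by inflating Earth's radius to roughly 7/6 of its true value.
    r = R_EARTH_MI * (7.0 / 6.0) if refraction else R_EARTH_MI
    # Distance from the observer's eye to their horizon.
    d_horizon = math.sqrt(2.0 * r * observer_height_ft / FT_PER_MI)
    d_beyond = max(0.0, distance_mi - d_horizon)
    # Standard approximation: drop below the horizon ~ d^2 / (2R).
    return d_beyond ** 2 / (2.0 * r) * FT_PER_MI

# The version newcomers are walked through: eye at beach level, no
# refraction. Nearly the entire skyline should be hidden.
print(round(hidden_height_ft(55, 6)))                     # ~1800 ft
# Add the omitted information: a ~200 ft dune and typical refraction.
print(round(hidden_height_ft(55, 200, refraction=True)))  # ~750 ft
```

With those corrections, only the lower portion of the skyline should be hidden, which is consistent with the actual photographs: they show the tops of buildings poking above the horizon, not the whole city.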
Next on the docket were signs of intelligence. Here I relied on the work of Neil Levy, who has written about how bad actors make themselves appear intelligent through intellectual virtue-signaling (see Intellectual Virtue Signaling). Levy argues that bad actors will not signal truly valuable epistemic qualities. Instead, they will signal qualities that other people are likely to value – such as quickness of mind (the ability to use language well, speak coherently about any topic, and so on), intellectual autonomy (the ability to think for oneself rather than rely on others), and intellectual courage (the willingness to offer contrarian views). This intellectual virtue-signaling leads new audience members to view bad actors as intelligent enough to be experts on any range of topics.
Finally, Robert Malone and Orson Scott Card provided examples of manipulating people’s expectations about credentials. Here I noticed two different strategies. The first is to say that credentials are important and that you have better credentials than everyone else. During the COVID-19 pandemic, Malone deployed this strategy. He inflated his credentials above and beyond those of other experts, suggesting that he invented mRNA technology and thus had special insight into its dangers. This claim appears not to be true, however. Rein Verbeke spoke to The Atlantic about these claims, stating that Malone and his co-authors “sparked for the first time the hope that mRNA could have potential as a new drug class” but that “the achievement of the mRNA vaccines of today is the accomplishment of a lot of collaborative efforts” (see The Vaccine Scientist Spreading Vaccine Misinformation). The second strategy is to say that credentials are not important at all. Orson Scott Card deployed this tactic on one of his websites (The Ornery American), stating, “We aren't impressed by your credentials, Dr. This or Senator That. We aren't going to take your word for it, we're going to think it through for ourselves.” In this way, Card sets the bar for expertise low so that he can clear it without getting properly credentialed.
So, bad epistemic actors can create the illusion that they possess epistemic ability through coherence, direct verification, intellectual virtue-signaling, and credential inflation or deflation. However, demonstrating epistemic ability is only one step toward building trust with new audience members. The bad actor must also display benevolence and moral integrity. The bad actor can build the illusion of benevolence through techniques like love bombing. Love bombing involves showering someone (in this case, new audience members) with compliments and affection, saying things like “You are the only people smart enough to see the truth.” Of course, love bombing is merely a manipulation tactic and will ultimately be replaced with exploitation and/or abuse.

Displays of moral integrity will also be essential for bad actors when building trust with new audience members. These displays will often be made through storytelling. According to empirical research, certain stories are perceived as more demonstrative of good moral character than others. Specifically, stories that show characters performing statistically rare or otherwise extreme behaviors that come at a personal cost and are accompanied by genuine emotions “are perceived as especially informative about [moral] character” (see A Person-Centered Approach to Moral Judgment). Jones is again a great example of this. He often tells stories about hunting down information to aid his listeners – usually at great personal expense – and he accompanies these stories with grand displays of emotion. These are exactly the sort of stories that effectively demonstrate moral integrity.
In the case of bad actors, the evidence of epistemic ability, benevolence, and moral integrity offered to new audience members is all misleading. It is meant to onboard people – to get them to view the bad actor as epistemically trustworthy. But these bad actors are not epistemically trustworthy. They position themselves in this way to gain money, admiration, or power. They solicit money from audience members and offer nothing worthwhile in return. They harm the healthy epistemic networks that audience members previously engaged with. They spread misinformation without a second thought. By this point, though, it is too late for the new audience member. They have already succumbed to the illusion of epistemic trustworthiness.