A Conversation with Renée DiResta, One of the World’s Foremost Experts in Online Manipulation, Disinformation, Conspiracy Theories & Scams.

Renée DiResta is a highly regarded expert and speaker on adversarial abuse online – in other words, on how bad actors manipulate the digital public square. From spammers and scammers to state-sponsored trolls, she investigates how social media platforms and technologies (such as generative AI) are used for evil. Although she studies abuse, she also uses and loves technology. Renée is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies.

Renée has spent a decade studying rumours, propaganda, and influence in the digital age, in contexts ranging from conspiracy theories, to terrorist activity, to state-sponsored information warfare. At the behest of the bipartisan leadership of the United States Senate Select Committee on Intelligence, she led one of the two research teams that produced comprehensive assessments of the Internet Research Agency’s and GRU’s influence operations targeting the U.S. from 2014–2018. She has advised Congress, the State Department, and civil society organizations on technology policy in areas ranging from transparency legislation to AI implications for women and children. She frequently speaks to business leaders about risks to brand and executive reputation. Renée regularly writes and speaks about the role that technology plays in society today, describing her own firsthand deep dives into fascinating and specific case studies. She was the keynote speaker at the Black Hat cybersecurity conference in 2020. Renée is the author of Invisible Rulers: The People Who Turn Lies into Reality, published by PublicAffairs at Hachette Book Group, and The Hardware Startup: Building Your Product, Business, and Brand, published by O’Reilly Media.

In this interview, I speak to Renée DiResta, one of the world’s foremost experts in online manipulation, disinformation, spam, scams & trolls. We discuss the real powerbrokers who shape public opinion, enabled by technology, influencing politics, economics, culture and society.

Q: What are rumours?

[Renee Diresta]: So, rumours are unofficial information that passes from person to person. There’s a really great book by Jean-Noël Kapferer that I love, which goes into this massive study of rumour and explains the idea that people share rumours because it makes them feel more a part of their community. It’s very pro-social. You hear something that you think sounds interesting—it piques your interest—you share it with your neighbour: “Hey, did you hear…?” It goes on to the next person, the next person, and the next person. And this is just a social behaviour that people have done in groups forever; it’s just part of human society.

And sometimes you can use that inclination, that desire to share, to seed a rumour deliberately. So this is where we get into the ideas of—in the book, he refers to this as “black rumours.” This is Britain in the ’80s; this is the term they were using then. He looks at how difficult it is to stamp out a rumour once it gets started because the rumour is often significantly more interesting than the facts. And the rumour is often very persuasive to the people who’ve heard it; they’ve heard it from somebody they trust. And when the target of the rumour says, “No, that’s not true,” you have this problem where the audience who has believed the rumour says, “Oh, of course he or she would deny it,” and so it really puts the company or the target—the individual sometimes—at a disadvantage.

So, it ties into a lot of elements of how we communicate, human psychology, ways that people form opinions and determine reality in social groups. And I think that’s really what happens a lot on social media today because now we are doing those community conversations online. And so we’ve just sort of moved this behaviour to a different set of communication structures, and instead of it being person to person in a village, it’s person to person on the internet.

Q: A lot of our group identity seems, therefore, to be tied to rumours?

[Renee Diresta]: …that dynamic of who you are in a group is something that I think the internet has really amplified. Whereas when the rumour is geographically bounded in a different communication environment where it’s truly just person to person—you’re at the cracker barrel, so to speak, or the bridge game—I remember my grandma coming home and filling me in on celebrity gossip that she would literally hear at her bridge game. And these were the sorts of things where your identity was more about being a member of that community in that place, as opposed to being a member of a community that is rooted entirely around your identity or your interests.

Social media really orients us around our identity and our interests because of the way those connections are formed. Often, you might come to social media with your real-world social graph—with your bridge club, so to speak—but then it’s going to suggest a whole bunch of different people for you to follow, a bunch of groups for you to join, and so it slots you according to what you’re most likely to be interested in based on your individual and group identity. And so, all of a sudden, the rumour mill goes from being something where you might have a limiting factor—where somebody in your geographical area who hears it says, “No, I actually know a lot about that from my professional sphere or from being a member of that community; that’s not true”—whereas on social media, it’s going to be reinforcing these very homogeneous networks that we wind up assembled into.

Q: Why do we get sucked-in to conspiracies, rumours and disinformation?

[Renee Diresta]: I think it’s that dynamic of information coming from your trusted peer. This has always been the source of the most trusted information. I talk in the book about communication theory a little bit, and the way that up until the 1940s, people understood broadcast media like radio—and then television came into play—as a sort of hypodermic model of information adoption. You saw something on television, you heard it on the radio, and then boom, you believed it. So there was an idea that persuasion happened through the mere act of communicating.

Then, what social scientists found in 1947 in a study looking at political opinions was that actually there were these people they called “opinion leaders.” They were women in the community in this particular study who happened to be very tuned into media but were also deeply connected in the community. So again, it was a geographically bounded community—it was Decatur, Illinois, I think. And what you see happen is that this information is mediated through that trusted figure. People aren’t all paying attention to everything that’s said on the news; they don’t believe everything that’s said on the news. But the opinion leader brings it into the group, they talk about it amongst themselves, and that’s how the opinion formation is actually happening.

And so when you’re on social media, you’re having that experience. Either it’s an influencer that you’ve come to follow and trust—somebody who seems maybe one degree removed from you, maybe a lot more popular, but still also at the same time just like you—or it’s the crowd, the homogeneous group that you’re part of that’s sort of following this topic, and your reinforcement is coming from people that you like, people that you trust. And you’re shaping and having that discussion amongst yourselves.

I think the biggest rumour that we’re seeing right now is this rather ludicrous story about Haitians eating cats in Springfield, Ohio. And again, this is one where there are some earlier ties to white supremacists in the community articulating this point of view, making these accusations months earlier. But where you really see it take off on social media is not just through that community conversation at the local city council meeting or whatever it was. It was actually a woman posting to a private Facebook group about crime in the community, saying “my friend’s daughter said,” which is a classic rumour construct—“a friend of a friend said,” “I heard”—and the next person says “I read in my Facebook group,” “I heard from a friend,” and that’s how it begins to travel.

And then where it intersects with the propaganda machine I describe in the book is that those rumours sometimes become very politically advantageous for a particular niche party or identity. And this is where you see a political figure actually pick it up at that point and begin to talk about the “cat-eating Haitians” of Springfield, Ohio.

Q: Can we ever retrieve people who are ‘down the rabbit hole’?

[Renee Diresta]: This is something in that Kapferer book I mentioned—I can track down the title. It’s literally on my shelf behind me somewhere. Oh, here it is. I just want to make sure I’ve got the title right for you so you can quote it accurately. Ah, here we go: Rumours: Uses, Interpretations, and Images. I didn’t remember the subtitle.

One of the things he does in the book that I thought was interesting is he actually traces efforts to refute rumours. There’s an entire chapter dedicated to that. One of the cases he talks about is a Satanic panic that catches Procter & Gamble, a massive consumer-goods corporation in the United States. People decide that their logo is satanic—this is during the days of a lot of the Satanic panic rumours in the United States—and the company is repeatedly explaining where their logo comes from, that there are no ties to Satanism. They have prominent religious leaders—I think Billy Graham, if I’m not mistaken; I wrote this in my book too just to get the facts 100%—but you see religious leaders trying to tamp down where the riled-up crowd is going.

People are leaving thousands of voicemails on Procter & Gamble’s phone lines alleging they are Satanists. This is a big deal for them. Then you see their competitors begin to use the rumour to their advantage. Procter & Gamble eventually ends up suing one of the competitors, in fact, for quite maliciously spreading it themselves by sending it out to their mailing list and trying to defame their competitor.

So you see there’s a very complicated dynamic, but ultimately, in the end, they do wind up changing the logo because the rumour just keeps recurring. Eventually, they decide it’s not worth it anymore and make the logo switch. It’s an interesting case study, an interesting example where the rumour’s more interesting than the facts, and the motivated reasoning is there—people want to believe it.

For some of these things where the rumours intersect with conspiracy theories, you’re also offering an easy-to-understand framework for the universe. If the Illuminati controls everything, then this explains why you are powerless, why your leaders are making decisions you don’t want them to make. There’s a lot of scapegoating that goes into who the villains are in some of these conspiracy theories. It gives people a simple, easy explanation for a world that seems very, very complicated.

Q: Is there a connection between misinformation, rumours, and the dopamine hit of, say, solving puzzles?

[Renee Diresta]: This is the sort of operating theory that psychologists have advanced for why QAnon became so popular. Some of these things, as you’re noting, involve individual component rumours or claims or narratives that weave into these big “single source of truth” type theories. QAnon is an example of this because the dynamic that happened there—and it very much started as an American conspiracy theory—involved what were called “Q Drops” being put out. These would be some cryptic poem, like “watch the water” kind of thing, or “On September 10th all will be revealed”—things that you can read a lot into. On September 10th, something is going to happen. Something big, somewhere in the world, will happen.

What you see is people taking these cryptic poems and talking amongst themselves in communities and groups that form around this, coming to some sense of what that poem could have meant, and that becomes a shared lore for the people in the community. Some people have likened it to almost like an augmented reality game that has that participation component. You’re solving a mystery. You go to an escape room, and you solve clue number one and get into the next part of the room, then clue number two, and you get to the next part, until you finally reach the point where all is revealed and you win in the end. So there is this sense of momentum, a sense that new information is emerging, and we can keep moving through this.

One thing that is very interesting to me is that although QAnon’s core mythology was about Donald Trump freeing children from a global paedophile cabal—that was sort of where it started—it wound up becoming quite big in Europe. I would see QAnon Germany groups forming, and I thought this is so interesting that such a foundationally American core theory nonetheless became so big in Europe, or big enough that it attracted hundreds of thousands of people at protests and things like this—hundreds of thousands of people in groups and then a few thousand people in real-world protests. You would see these dynamics happen where they’d say, “Okay, well, that happened in America, but if we read this through the lens of our politics, here’s how our leaders must be caught up in this global cabal. Because if it’s a global cabal, surely it’s over here too.” And you see the facts mutate to fit the baseline reality that is on the ground over there.

Q: Who profits from misinformation, disinformation, and online rumours?

[Renee Diresta]: So, whereas you would have your geographical, local conspiracy theories or rumours about some political leader or other in a local space, what you start to see—and again, some of them become national like Procter & Gamble in the U.S. in the ’80s—but what you start to see happen with the internet is that an entire ecosystem can spring up in which certain people can make themselves almost the translators of the rumour, the keepers of the lore. And so you see an entire community of QAnon influencers form. Again, they come out of the community; they’re just maybe a little bit better at communicating, they’re more willing to prioritize this thing that becomes their life. They’re going to create content about it, and then they have the capacity to monetize on different platforms.

It really goes back and forth. Sometimes, for a while there, platforms like Facebook took them down; YouTube took them down. You do see them on Rumble; you do see them now back on Twitter, where they can monetize this kind of content. On Twitter, on Substack, you can have a nice, neat way of getting subscribers to your theory. And so there is an opportunity to turn this into something that provides either the validation of clout—a psychological payoff, an ego payoff—or actual payoffs where you’re actually earning money from being the person who becomes the keeper of the conspiracy theory, the breaker of the news, the interpreter of real events in that context.

Q: It’s not just big scams and disinformation; there are a lot of day-to-day interactions with these phenomena?

[Renee Diresta]: It’s very much that. I get asked a lot, especially as generative AI has become popular—I cannot tell you how many media inquiries I get in a given week about something related to generative AI and an election. But that’s because elections are higher stakes; politics is higher stakes, and so it attracts attention from media at the national communication level.

But a lot of what we see with things like new technologies for shaping reality is spammers and scammers pick them up first. Very much, it’s an immediate adoption for them because the motive is profit. And so it’s not ideology; they don’t have to come up with very complicated mechanisms for manipulating groups of people with political content—they’re just in there: “Let me show you a picture of a beautiful sweater; I’ll tell you you can buy it. In reality, it can’t exist in the world; it defies the laws of physics.” These sorts of things.

And so I think the focus on the political is perhaps because it affects other people outside the group as well. And this is where you get questions like: which conspiracy theories, which rumours, have broader social harms? We’ve written a little bit about that—the vaccines which you mentioned. There’s a spectrum there too.

For example, rumours about cancer care or conspiracy theories about chemotherapy—they’re going to affect a limited population of people, and if they make a choice, it’s largely going to affect them and their families. Whereas on the vaccine conversation, for example, the impacts are much broader. The impacts affect the potential entirety of the public in the case of an outbreak or contagious disease.

So there is that question as we look at these things: they are not all equal in terms of their potential for impact, so it is an interesting challenge. Sometimes, though, they do shift in their impact. Meaning in the U.S., we have long had this conspiracy theory about chemtrails—I don’t know if this one’s alive in the UK too—and it was very niche for a long time.

But there was something about COVID—and the distrust in science and distrust in government that was exacerbated during COVID—that has given a much broader viewership, a much more receptive set of potential audience members, to theories that used to be largely confined to small groups of people. Now, all of a sudden, there was legislation introduced in one of the U.S. states along the lines of “We’re going to ban the government from spraying us,” and this is the sort of thing where you realize that people—politicians in particular—can actually appeal to these highly activated, very motivated communities of people for votes.

And so you do see the strange impact on politics as people who are highly motivated, highly likely to vote, will turn out when the politicians themselves begin to pander to that as actually a base—which it is in the U.S. now—and that’s an interesting shift that’s happened over the last few years.

Q: What can we do?

[Renee Diresta]: On the policy front, there are two different sets of potential policymakers. The first would be the government, and in the U.S. in particular, you would not want the government making determinations about which of these things are true or false. That would also have a significant backfire effect, since a lot of the conspiracy theories are about the government or involve the government in some way. So the government is not going to do much, in my opinion, to defuse a lot of these theories.

We’re, what, 23 years now after September 11th—we’ve just had that anniversary last week—and we still have very committed 9/11 truthers. Nothing that the government tells them is going to change their mind because, of course, the government would deny that it was an inside job.

So, let me keep going with that framework of two different actors. The other, though, is the tech platforms, and they play a very, very interesting role here. Because you want to be maximizing freedom of expression. There’s nothing that is, at least in American political culture, outside the bounds of normalcy about being a 9/11 truther, a chemtrailer, or even an anti-vaxxer. These are all political opinions; people have them, and yet at the same time, the mechanisms by which they spread have very significantly shifted.

And so saying, “Oh, counter speech is how we’re going to respond to this”—well, again, when you had your geographically bounded rumours, the counter speech would have been the person in your bridge club saying, “No, that’s crazy; here are the facts.” Somebody you trust saying that back to you.

What happens today is that social media platforms, through what they amplify and curate, actually become significant connectors that bring conspiracy theorists together. They actually helped people go down the rabbit hole for many, many years before they finally realized that maybe boosting—particularly proactively boosting—some of these theories to people was actually a very bad thing for them to be doing.

And that you could let an anti-vaccine group be on your platform without serving the anti-vaccine group up proactively to a pregnant woman—which was something that happened to me that I talk about in the book. When I had just had my first baby and I joined some mom groups, Facebook started pushing the anti-vaccine mom groups. So there’s a difference between me going looking for it and it being served to me, and that is the area where platforms do have some power because they are curating.

And no curating is neutral, and so determining what you weight and how you curate things—if you’re going for “we’re going to serve you the most active parenting groups”—well, chances are you’re going to get some pretty weird conspiratorial things, because people who are actively out there investigating are very reactive. The QAnon groups were so high-volume—just hundreds and thousands of posts per day—because they were all in there investigating, and it became a hobby, as opposed to a group that did not have that sort of energy.

So if you’re making recommendations to people based on some sense of correlation between a belief they hold or a facet of their identity and you’re driving people deeper, further into that, that’s where I think there’s a very big difference. Where it’s not an issue of speech at that point; it’s an issue of why there is a proactive curation and amplification of something that does have a demonstrable harm. And that’s where I think that other question of shifting curation to be more attuned to topics that actually do have a societal harm is actually a very, very important thing that platforms need to be doing.


Thought Economics

About the Author

Vikas Shah MBE DL is an entrepreneur, investor & philanthropist. He is CEO of Swiscot Group alongside being a venture investor in a number of businesses internationally. He is a Non-Executive Board Member of the UK Government’s Department for Business, Energy & Industrial Strategy and a Non-Executive Director of the Solicitors Regulation Authority. Vikas was awarded an MBE for Services to Business and the Economy in Her Majesty the Queen’s 2018 New Year’s Honours List and in 2021 became a Deputy Lieutenant of the Greater Manchester Lieutenancy. He is an Honorary Professor of Business at The Alliance Business School, University of Manchester, and a Visiting Professor at the MIT Sloan Lisbon MBA.