
A Guide to Resisting Violent Extremism

Top researcher explains our subconscious intolerance of hate

Love it or hate it, much of our lives are now enmeshed with the online ecosystem. We are constantly exposed to the opinions of other people and left to discern what is true and what is misinformation. While usually innocuous, these unsolicited opinions can sometimes be dangerous.

Dr. Mike Jensen, Director of Research at the Polarization and Extremism Research and Innovation Lab (PERIL) at American University, joins Jen to explain how rapidly harmful extremist content can exit fringe spaces and infiltrate the mainstream. Luckily for us, our colleagues at PERIL are researching effective ways to “pre-bunk” individuals against harmful content. Tune in to learn how!

If you’re interested in learning more about PERIL, click here.

Dr. Michael Jensen is the Director of Research at the Polarization and Extremism Research and Innovation Lab (PERIL) at American University. Prior to joining PERIL, Dr. Jensen was the research director at the National Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland, where he led the center’s research on terrorism and targeted violence in the United States.


The following transcript has been edited for formatting.

Jen Rubin

Hi, this is Jen Rubin, Editor-in-Chief of The Contrarian. We have another session with one of our friends at PERIL: Michael Jensen, the Director of Research. Welcome, Michael.

Michael Jensen

Thank you for having me.

Jen Rubin

Absolutely. We all feel the burden and the stress of constant events and the constant parade of horrors. PERIL has actually done research on building emotional reserves, resilience. Talk to us about, first of all, why that’s important, and then second of all, what you found.

Michael Jensen

Yeah, so PERIL’s research on psychological resilience is trying to disrupt the role that harmful and manipulative narratives play in radicalizing and mobilizing people to violence. And to be clear, the existence of harmful narratives is not what is new or unique about the moment that we’re living in right now. Those narratives have been around forever. What is unique about the environment that we all operate in right now is the role that digital communication technology has played in radically transforming our information environments. We all now live and operate in an online, digital ecosystem in which a harmful, dehumanizing message or conspiracy theory that justifies the use of violence against a particular person or group of people can emerge in a fringe space online and very rapidly move into the mainstream, where it gets amplified by incredibly powerful voices that have very, very large audiences.

So these messages are reaching tens of millions of people in an instant. And what research has shown is that once somebody has been exposed to and has adopted a belief, trying to debunk it after the fact just really doesn’t work very well. People tend to push back on those challenges, they tend to dig in, and in many cases, they might actually become more committed to the beliefs. So the challenge that we face right now, and what PERIL’s research is trying to address, is how do we get out in front of these messages and give people the tools to respond to them appropriately, right? How do we stop the harm that these messages create before they end up at the top of millions of people’s social media feeds and people come to embrace them?

Jen Rubin

And what have you found? What works?

Michael Jensen

What works is what we call pre-bunking, right? It’s the idea that, before anyone sees the message or gets exposed to it, you present a weakened form of the message and identify how it’s trying to manipulate the person who is seeing it into believing or behaving a certain way. And it’s not unlike research in medicine on vaccines. The idea is that if you expose people to a weakened pathogen, in this case a harmful narrative, their body will respond, right? People’s cognitive and emotional systems will build antibodies to that message when they are told how it is trying to manipulate them. It will generate emotions of disgust and anger and annoyance. So later, when they go out into their online worlds and they encounter that message on their own, those emotional responses will return, and they’ll reject the message.

Jen Rubin

In some ways, this is the emotional counterpart of what we used to call digital literacy, a kind of above-the-neck approach of talking to people in a very rational way about what techniques they should look for. How does this differ, and how do you do that pre-bunking at a scale that will help people focus on the problems?

Michael Jensen

Yeah, so psychological resilience really does rely on those cognitive-emotional responses that humans naturally have when they know that there’s a harm in front of them. So, this is not a counter-narrative. We are not going in and, point by point, trying to debunk a position, right? Which doesn’t work all that well. What we’re saying is, you know, we’re not going to tell you how to think or feel about this, but we want you to be aware that this message is trying to manipulate your emotions and manipulate your actions. And here’s the particular manipulation technique that is being used and how it works, right? So that way, when they go out into the world and they see that technique again, they go, I’ve seen that before, and I know what that’s trying to do to me, and I don’t like it. And so I’m gonna close out of that. I’m gonna ignore it. I’m gonna push back against it. And so that’s really the key to psychological resilience: equipping people to do this on their own, rather than being told, believe this, or think this, or don’t trust this, right? It’s producing that response.

In terms of doing it at scale, that is very, very challenging, of course, because that digital ecosystem that I mentioned has these very, very powerful voices that amplify harmful rhetoric, and you’re competing against those voices in many ways to try to get your pre-bunking message out to the masses. So, at PERIL, we’ve taken a couple of different approaches. Obviously, the key place where you want to distribute this type of messaging is online, and especially on mainstream platforms that have the largest audiences, but the question is, how do you do that, right? One approach is you buy ads that deliver your message on YouTube, or any other streaming platform, or Facebook, or Instagram. That’s one approach, but we know that anybody who’s used those services clicks out of the ad as quickly as they possibly can. Another approach that we’ve used, and we continue to use, is to find influential voices on those platforms that can share the message and the narrative. So, using influencers in these spaces to distribute this to their audiences, and being very targeted in the types of influencers we want. For example, let’s say there’s a harmful narrative that is spreading, or has the potential to spread, in faith communities. Can we find influential leaders in those faith communities to go out and do the pre-bunking messaging for us? Their voices carry far more legitimacy and weight than mine does, right? And so using those influential, credible narrators in online spaces is key to doing this at scale.

Jen Rubin

So, when you do your research, you obviously test this. You take a group of people, you expose them to it, and then you see how they’re going to react. How effective is the pre-bunking? In other words, we don’t expect a 100% success rate, but what’s the ability to either diminish or completely dissociate people from the messages that they may be getting that may be harmful?

Michael Jensen

Yeah, you know, while this approach is somewhat novel in the space of extremism research, psychological inoculation has been around since the 1960s, and there’s a large body of scientific evidence that shows that it’s quite effective. At PERIL, we use a three-stage process to do this. The first stage is identifying the threat or the harm, and so our team will go into various online spaces, both fringe and mainstream, and ask: what ideas are out there? What are people talking about? What seems to be resonating? What has the potential to manipulate people to engage in harmful activities? Once we’ve done that, we will develop a script to produce a short video. Our research has shown, and others’ as well, that you usually want these messages to be under two minutes long. People don’t have very long attention spans, so getting to the point quickly is important. And we work with professional video development companies and social media influencers to develop videos based on these scripts that highlight these manipulation techniques. And then we will typically recruit around 5,000 individuals to be in test and control groups.

So, some will not see the inoculation video, they will only be exposed to the extremist narrative; others will be shown the inoculation video and then the extremist narrative, and we want to know what the difference in effect is. And what we have found is that people who view the inoculation video, in comparison to the control group, report higher levels of anger after seeing the extremist content, and higher levels of being annoyed by it. They report that they are more likely to speak out against that narrative. And these numbers, while they might seem small in percentage terms, so for example, anywhere from 10% to 20% of participants expressing that they’re willing to speak out against the narrative.

Imagine doing that at scale, with tens of millions of people viewing it. Twenty percent of tens of millions is a lot of people joining this movement to push back against these harmful narratives. So, the other thing that we think is really, really crucially important in this process, and that we take very, very seriously, is that we measure for what we call blowback effects. We want to make sure that the inoculation messaging that we’re putting out there is not making the problem worse, that we’re not actually driving people into the conspiracy.

Jen Rubin

Yes, oh, that looks good, I’m gonna look for that now.

Michael Jensen

Exactly. So one of the things we do, for example, in our pre- and post-survey tests, is try to measure the extent to which the participants are maybe predisposed to believe conspiratorial ideas, that they naturally engage in conspiratorial thinking. And what we have found in those cases is that you do have to pair the inoculation message with some factual information that counters the extremist narrative. When you present that kind of factual information alongside identifying the manipulation technique, even people who are predisposed to be very conspiratorial in their thinking tend to reject the harmful narrative.

Jen Rubin

Interesting. Is this translatable to other arenas? In other words, we talk about people getting messages about violent extremism, but it can also be the case that people are tempted by drugs, by other harmful messages that are, you know, deleterious to them. Does it work in other contexts, aside from violent extremism?

Michael Jensen

Absolutely. I mean, you can think of this approach in addressing vaccine hesitancy, for example, in the health field, or conspiracy theories around election integrity, right? As I mentioned, there’s a large body of research, primarily not on extremism or political violence, in which this approach has been tested. And it’s around those more common issues that are prone to being weaponized by certain actors and prone to conspiracy theories. So, the integrity of our political institutions, the safety of our health sciences, those types of things.

Jen Rubin

It is certainly the case that we have no shortage of things that we need to address and pre-bunk, and I think just intuitively, we understand that you never argue someone out of a position; anyone who’s had a Thanksgiving with that uncle knows that it often does not work. So when people want to get more information, whether educators, parents, or individuals themselves, where do they go? How do they find it?

Michael Jensen

Yeah, so they can come to PERIL, right? We have a repository of information on our website that details not only our psychological resilience work, but also some of the community-based programs and efforts that we’re engaged in, including the K-12 digital literacy work that PERIL does to prepare kids for the information environment that, unfortunately, they will be living in for the rest of their lives. So you can come to the PERIL website and get all kinds of information about the work that we’re doing.

Jen Rubin

Thanks so much, Michael. We really appreciate it. Folks, this is why we love talking to PERIL: they don’t just identify the problem. They have actually done research (novel, I know, these days) and come up with solutions. So, thanks so much, Michael. We’ll look forward to seeing you and your other colleagues at PERIL very soon. Take care.

Michael Jensen

Thanks for having me.
