
America’s Slide Toward Simulated Democracy


Subscribe here: Apple Podcasts | Spotify | YouTube

In this episode of Galaxy Brain, Charlie Warzel sits down with Eliot Higgins, founder of the open-source investigative collective Bellingcat, to examine how our public sphere slid from healthy debate into what Higgins calls “disordered discourse.” Higgins is an early internet native who taught himself geolocation during the Arab Spring and later built Bellingcat’s global community. He has spent the past decade exposing war crimes and online manipulation with publicly available data. Higgins has recently come up with a framework to help understand our information crisis: Democracies function only when we can verify truth, deliberate over what matters, and hold power to account. All three are faltering, he argues.

In this conversation, Warzel and Higgins trace the incentives that broke the feed: how algorithms reward outrage, how “bespoke realities” form, why counterpublics can devolve into virtual cults, and what “simulated” accountability looks like in practice. They revisit Higgins’s path from early web forums to Bellingcat, look at the MAGA coalition as a patchwork of disordered counterpublics, and debate whether America is trapped in a simulated democracy. Higgins offers a clear diagnosis—and a plan for how we might begin to claw back a shared reality.

The following is a transcript of the episode:

Charlie Warzel: Hey, everybody. It’s Charlie. And before we get to today’s episode, I have a request for all of you listeners. We’re working on a story about screen time. And when we tend to talk about screen time, often the conversation will be focused on younger people. We’re worried that they’re getting too much screen time, or they’ve been radicalized by what they see on their devices, or that they don’t seem to understand how they’re being manipulated.

But I’ve gotten a lot of anecdotal reporting over the last few years that the problem is similar, if not worse, on the other side of the age spectrum. And so we wanna do a story about a different generation’s relationship to this technology. We’d really love to hear from you. Whether you are somebody who is having some of these problems, or you feel your relationship to your device has become a bit problematic or lopsided, we’d really like to hear from you. Tell us your age and why you feel you have an unhealthy relationship with your device. If you’ve noticed this with a family member, we also want to hear from you. So if you could send us a brief voice memo—about a minute, no longer—we’d absolutely love that. More than anything, we want you to emphasize and describe what you’re seeing and what you’re feeling about your loved one’s screen time or your own, and we want you to express whatever honest amount of concern you have.

Please send that voice memo to cwarzel@theatlantic.com. It’s cwarzel@theatlantic.com. Thank you so much, and here’s today’s episode.

Eliot Higgins: In terms of deliberation, you have these, you know, “20 X versus one Y” videos, which kind of perform this.

Warzel: Oh, you mean the Jubilee videos?

Higgins: Yeah, those. I despise those. I think they’re just a strong example of the kind of hollow performance of democracy—that no one’s there to learn from each other, to come to a shared understanding.

They’re there for clips to get attention on social media, and that’s, for me, the bottom line of those videos. No one’s going there to have their minds changed. It’s not designed around that. It’s designed around capturing the algorithm, and I think it’s, you know, bad for democracy and pathetic as well.

[Music]

Warzel: I’m Charlie Warzel, and welcome to Galaxy Brain. Today we’re gonna talk about discourse, and not the internet shorthand for viral outrage, right? When we talk about discourse, we’re talking about, like, Cracker Barrel’s logo changing and people being up in arms about it, or the latest political infighting. This version of discourse that we’re gonna talk about in this episode is much more substantial, and I think that the stakes are far higher. Discourse—as we define it here—is, essentially, just our ability to talk to each other, to find things out about the world, to debate those ideas and establish ground truths and reject the things that we don’t like.

It is our collective sense-making process. It’s how we do science; it’s how we develop laws. It’s the backbone of a functional and healthy society, and nowadays when we talk about discourse, we often just refer to it as “The discourse is bad.” But there are all kinds of discourse, right? There’s a healthy, functional discourse where there are elites that are held to account, where we can debate things, where institutions and people act in good faith and are benevolent. And there’s a version of discourse that is kind of hollow, right? Where there are good actors and bad actors, and we kind of just limp along there, even though there are a lot of inequities in the system. And then there’s what my guest calls a “disordered discourse,” where democracy is almost simulated. You have people who get into power, and they wield it by imposing their views on the world and making it so they can never really be held to account, right? This is something that I think a lot about today. And now with the Jeffrey Epstein investigation, right? You have this trove of emails that people are seeing, where you have elites talking behind closed doors and operating with impunity—because they don’t feel like they’re ever gonna be held to account for the amoral or immoral things that they have done.

And when that gets found out in the world, people get really angry, right? But in this disordered discourse, the reason why my guest says it’s simulated is because when people try to push back against that—when people do try to hold leaders to account—nothing functionally happens, or not enough happens. And so you get this incredible frustration. And when people feel like their democratic participation isn’t rewarded, they start to tune out or drop out of the system altogether. And it’s very, very dangerous. My guest today is Eliot Higgins. He’s an investigative journalist who founded the open-source company called Bellingcat.

They produce journalistic investigations using all kinds of publicly available data online. And Eliot is somebody who is a true internet native and really understands—and has gotten into the weeds of—all the different online manipulators, nefarious bad actors. And knows these platforms and systems inside and out. And so he’s the perfect person to talk about this. Not just because he has the experience, but also because he’s developed a framework around disordered discourse. And in it, he has this idea that there’s basically three conditions that allow societies to function, right. You have to be able to establish truth, debate what matters, and hold the powerful to account.

And if you think about those three pillars right now … it doesn’t really seem like we’re doing a great job on a lot of them. You know, it really is harder than ever to establish truth these days. Debating what matters is happening all the time, but it’s happening in a very chaotic way, right? We’ve outsourced a lot of these conversations to these tech platforms that are not neutral—that constantly manipulate us, that drive us to be the worst versions of ourselves, that amplify outrage.

We are operating in what the researcher Renée DiResta calls these “bespoke realities.” And so it leads to a discourse that is so disordered that it really threatens democratic collapse. And so, Eliot Higgins is going to walk us through this framework and try to ground us a little bit, describe the temperature of the water that we are all swimming in all the time, and help us try to figure out, if at all, how we can claw it back. So here’s Eliot Higgins.

Eliot, welcome to Galaxy Brain.

Higgins: Thanks for having me on.

Warzel: Yeah, absolutely. I wanted to start with your background, and specifically something that you posted on Twitter back when it was Twitter, where you talked about how the research work that you did that turned into Bellingcat started with, and I’m gonna quote you here, “me arguing with people on the Guardian Middle East Live blog comments; posting way too much on the Something Awful forums. During those arguments in 2011, there were videos shared from Libya, and arguments about their authenticity. That’s when I figured out you could use satellite imagery to figure out where these videos were filmed, stumbling into geolocation.”

I want to talk a little bit about how you fell into this work. And, a little bit, you know, to sort of set the stage here, like what it was like to realize sort of the depth and the breadth of all this publicly available information on the internet.

Higgins: Yeah. Well. I mean, when I started, I was really just an ordinary internet user in the early 2000s, and then I kind of became part of these online communities, like the Something Awful forum, which, if your listeners don’t know, is a very old internet comedy forum that’s been around now for, I think, about 25 years, maybe even longer.

But it’s where a lot of this kind of meme culture originated, along with a number of other websites in the early internet. But it was also a place where they had a really actually quite good community of people who were taking part in discussions about—in 2011 and 2010—what was happening in the Arab Spring.

And I was involved with those discussions, but I just was frustrated with what I was seeing in the reporting in the media, that you had all this kind of video footage being shared online. You know, people were getting smartphones and sharing stuff on social media, and it was being ignored by the mainstream media for the most part.

And there were good reasons for that. I mean, questions of verification, for example. There were some quite infamous stories early on about how the media were tricked by accounts, like a blogger called Gay Girl in Damascus, who turned out to be a white guy. And they got very cautious about this online information coming from these Arab Spring countries.

But I felt like there was something there that was useful. And I also really wanted to beat people in internet arguments, because I was—I’m quite petty like that. So, like, every day you’d see the factions appear on these internet forums. So there were people saying things like, Oh, [Muammar] Gaddafi’s actually an okay guy, and It’s, you know, NATO interference, and other people saying, Oh, it’s terrible. But I just wanted to know what was going on. So this is where I came up with these ideas of using satellite imagery to compare it to video footage coming from these conflict zones to confirm exactly where they were filmed. And that’s something we call geolocation now, but at the time we didn’t really have a name for it. I mean, I didn’t have a name for it.

It was like an adults’ “spot the difference.” So really that’s where I started kind of using that to win arguments on the internet. But I just found more and more interesting things in these videos, and that then turned to me starting, in early 2012, a blog called the Brown Moses Blog, named after a Frank Zappa song.

It was more a hobby for my own interest. But, you know, serious people took interest. Journalists contacted me about videos I was writing about, and it kind of grew from there. In 2014, I launched Bellingcat. But that was really founded on the idea that people, you know, from the public can do investigations of open-source evidence. That’s what’s so powerful about it. But also, it’s therefore valuable to teach people how to do it.

So it was both the investigations and guides and case studies for anyone who wanted to do it themselves. And I think through, you know, now 11 years of Bellingcat, it’s always been about not just the investigations—but how you spread those ideas and techniques to the public, to traditional institutions, and into new media.

Warzel: Have you always been someone who has been—I mean, there’s sort of an experimental lens to this, right? Have you always been a person who has poked around on the internet, who’s always been, you know, trying to see around those different corners?

Higgins: I grew up in a really interesting time in history, here in the U.K.

I think in terms of technology, we had, you know—my first computer, when I was probably three or four years old, was the Spectrum 48K, which was an old tape-based computer system that became, really, the whole foundation for the entire U.K. games industry. You know, it kind of put the U.K. on the map in terms of technology, and I love technology.

My favorite TV program when I was like seven or eight was a BBC One program called Tomorrow’s World about the technology of the future. And the internet, for me, when I heard about it, was the most exciting thing you could possibly imagine. So I was a very early internet user. I mean, I was probably on like CompuServe when that was being distributed by magazines.

I’m aware, by saying “CompuServe,” that a large amount of the audience probably has no idea what that is. It was an old—I feel like, yeah—online service providers are a whole thing, basically. Very early internet. And then, that’s how—

Warzel: Old-school bona fides. Yeah.

Higgins: Yeah. So I was always, you know, looking for interesting stuff on the internet. And, you know, forums—like the Something Awful forums—played a part in that, because that was full of people digging out the weirdest stuff on the internet, to show each other and laugh about. And, you know, then, I think, for my political journey, I was always quite interested in kind of alternative stuff, music, politics. But then with the invasion of Iraq in 2003, I saw that there was, like, you know, so much protest against it—you know, clear lies being told—and it still happened for several years.

I actually really switched off from politics and kind of really disengaged from it. Whereas I used to be someone who read a lot about politics; I read loads. Things like Noam Chomsky and Seymour Hersh and a lot of writers who now seem to hate me. So it’s kind of a bit weird. But—

Warzel: You made it.

Higgins: Yeah; I made it. And so yeah, that was kind of my whole journey with this. And then the Arab Spring happened. So I was very online, seeing these videos. And also it became almost like a puzzle for me to solve, so I could understand what was going on in these conflicts.

Warzel: So, going back really quickly—I think it’s important, because we’re essentially here to talk about disordered discourse. And Something Awful—because it’s been mentioned a few times—is a comedy website that had these forums. Very popular from the mid-to-late 2000s into the 2010s.

And, as people have argued, the site was, like, genuinely, remarkably influential in birthing a lot of the dominant internet culture of the era. Like—I think I’ve seen lists of things that say, you know, everything from lolcats to sort of the standard meme templates to, like, 4chan.

So I think there’s something foundational—like, does having spent time in that part of the internet, and especially that forum, inform this work that we’re about to talk about, in some way? Like, do you feel like it’s foundational to have come from the places that a lot of this dominant internet culture sprouted out of?

Higgins: Yeah. I think so. I mean, as I started my work with Bellingcat and my earlier blog, I could already see these kinds of communities forming around the kind of stuff I was looking into. But I kind of understood weird online communities from the Something Awful forums—’cause there was always some drama between Something Awful and some other forum or online community. And there were always quite, like, strange-seeming—you know, at the time—communities in those online cultures. Now they’re all over the place. But at the time, ’cause the internet was new, all these things were strange and new. And I think that was part of what helped me.

From 2015 to about 2017, I used to do work with the Atlantic Council, which is a Washington-based think tank, and I was meeting a lot of people there working on issues around disinformation and, you know, what Russia’s up to. And I always thought—it came to me: These are really well-educated, intelligent people. But the thing is, while they were getting educated about all this really important stuff, there were a bunch of us who spent far too much time on the internet learning about that culture. And so you have these very intelligent people applying kind of the logic and, you know, what they’ve learned to this new environment.

And there was just this dissonance they couldn’t, kind of, bridge. Because it’s like they were making assumptions that always seemed flawed to me. And the work I did on disordered discourse was really part of that: There was a lot of focus on Russian disinformation. But I think there was too much focus on Russian disinformation in terms of the totality of what’s going on.

So we’ve kind of missed the forest for the trees. And that’s why I wanted to kind of step back and say, Okay, actually break this whole thing down. Why is there so much disinformation? Why do people seem not to exist in the same reality as each other anymore? And that started, first of all, by creating this “disordered discourse” framework, which is really about how communities who believe in these fictions start forming in the first place.

Warzel: And so, I mean, you set it up perfectly. I want to talk about this framework and what you’ve called this, like, epistemic and democratic collapse. Or at least this trajectory of how it happens: how we assess the health of a given democracy, etcetera.

To begin with, you lay out that democracy basically rests on three functional minimums. Can you say what those are?

Higgins: Sure. So if you look back through political philosophy and academia for the last century and even beyond, it always comes back to kind of three core ideas, in my opinion—ideas without which you can’t really have a functioning democracy.

The first is the idea of verification. That as a democracy, we try to figure out what the truth is—because then we can deliberate on that truth. That’s the second part, deliberation—where we create spaces where we, in a pluralistic manner, can bring people together to deliberate over the facts that we can establish through that process of verification.

And finally, there is accountability. So action is taken on those things, and those who hold power can be held to account. Now I will stress: These are kind of functional ideals. No democracy can ever truly achieve them, because by the very nature of democracy, you can never have 100 percent of people happy on any topic at any one time.

So it kind of starts with the idea that verification, deliberation, and accountability have to exist. Because without verification, how can you deliberate on reality? And if you can’t do that, you can’t really have true accountability. But I expand on that, saying that you need to start thinking about three types of verification, deliberation, and accountability in a democracy. There’s substantial, which means you have a functional democracy and functional systems. There’s hollow, or performative, which creates these hollow systems. And then there’s simulated verification, deliberation, and accountability, which occurs in disordered discourse.

And that’s kind of basically the initial start of the framework.

Warzel: I love some of these terms, because they’re very immediately evocative for me. Like, when I hear about—I mean, I don’t know if I have a ton of lived experience in the “substantial” part of this, which I think is part of the problem that we’re gonna get to.

But this idea of a hollowed-out, or sort of, you know, performative-type feeling. Like, when I think of also just like discourse on the internet, I think of both those words that you use—like performative and simulated, right? These ideas that it sort of doesn’t matter, in a sense, what anyone does in the system.

It’s going to sort of churn out things the exact same way. And so my question here is: As you were coming up with this—’cause you call this the arc of democracy in some way, right?—where do you think, maybe using America as an example.

Like, where do you feel we are right now on that arc?

Higgins: In terms of the U.S.: very far into disordered. And it’s a mix of these things. I would say, if you imagine it as the arc, you are moving from performative, well into disordered. And when you head toward disordered democracies, that’s when you basically start seeing democracies turn into these competitive authoritarian governments—like you have in Hungary, like you have in Turkey. Where you still have the kind of rituals of democracy. You still get to vote, for example. But the system is so corrupted and skewed toward the autocrats that it doesn’t really count. And you see that happening in Turkey, for example. You see that happening in Hungary. And that is where I think America’s heading. I think the recent election results might seem like a moment of relief. But my problem is: It’s one thing to, you know, get another president in. You get rid of [Donald] Trump; you know, win the midterms. But then you have to create those substantial verification, deliberation, and accountability functions in that democracy.

And that’s a much, much bigger task. And my fear is that what keeps happening is: We just have this pendulum swinging back and forth, where people are just losing trust in one side. So they go to the other side; they lose trust in that. They swing back in the other direction. But some of them also swing in the direction of more extreme ideologies they’re encountering online.

So, yeah; it’s not a great situation to be in. That’s really the frank answer. And what I find most worrying is that in this model, that’s almost the default state, for a number of reasons.

Warzel: What do you mean by that?

Higgins: So look at what the incentives are for the creation of these discourses—and let’s just take it back a bit.

These verification, deliberation, and accountability functions were done by institutions in the 20th century. So you had—for example, verification was done through newspapers, editorial processes. Deliberation, in parliament. And accountability happened in courts, for example. And that obviously had mixed success.

One thing that’s very important there is the idea of these things called “counterpublics,” which are public movements that form around perceived injustice. Like the civil-rights movement, the feminist movement. And they change democracy through these same functions. They influence them. And it can be a battle sometimes. But it’s kind of like those things have to exist to allow democracy to evolve.

And the challenge that we’re facing at the moment is that we’ve lost a lot of that, and it’s been replaced by these online spaces where it’s no longer about the truth getting through to people through institutions. It’s about a kind of free-for-all—for the most valuable thing to the algorithm, which is attention.

So everyone is shaping their behavior around getting attention online, because that’s the only way you can be visible. In the information system, the whole media infrastructure is kind of shifting toward this attention-based system, rather than one that works through these institutions. And the problem is: That means the functions of verification, for example, are done by you and me.

We see a post on social media. We are now the distribution of the information, and that changes everything. Because it’s no longer about a kind of scarcity of information and an abundance of attention, as we had in the 20th century. Now there’s so much information—but there’s a scarcity of attention because of that. And that dynamic is poisonous.

Warzel: This is something—in a previous episode, I talked with Hank Green, a popular YouTuber. And we talked a little bit about trust. And about the shift, as you put it, basically from a gatekeeping model to a platform-based model. This idea of sort of a democratic system of information that ends up being very disordered.

And one thing that we were trying to tease out in that conversation, a little, was—you know, I think when you think about the trust in institutions. So many people who are competing for this now—competing for that attention—malign these institutions. It’s very helpful for them to say, “Well, you know, the real reason there’s a lack of trust in the media, or a lack of trust in government, or, you know, public health, or you name it, is that these things have failed in some way.” And what’s sort of interesting about the framework—and I’m not asking you to absolve institutions or, you know, render a judgment—but it seems a little bit like the shift in the information system, in the way that it is distributed, in the way that we’re all vying for attention.

It seems like that is a bigger force and factor in this than the institutions simply tripping and falling and now, you know, having to deal with that. Like, how do you think about that? Do you feel like, for the institutions, it’s basically just not a fair fight in this new environment? Or do you feel like this is actually also a reaction to frustrations against elites, frustrations against institutions, that are legitimate?

Higgins: I would see it as a convergence of both of those factors, and other factors as well. They aren’t competing; they’re part of the same problem. And I think this is often where we kind of fall down when we’re thinking about, for example, disinformation. We see it as a separate problem from other issues. So it’s kind of like, we see it as a problem of information—but this is kind of where I started with my framework.

It was a problem of discourse. Is the information actually able to reach people in a way that’s functional? Rather than what starts happening in these disordered systems, where they don’t reject the idea of verifying information, deliberating, and accountability. They kind of pervert it through simulation.

And then the internet is also brilliant at filtering you to the most extreme version of the beliefs you can hold, while also reinforcing them. Because, as it’s showing you that material, it will always include a subset that’s more extreme—and eventually, once you’ve seen enough of it, you might click on it. And then that serves you even more extreme content, and it filters you toward those communities who hold those views. Who then provide you with all the information you need and the community to reinforce that.

So, you know, it’s a convergence of that. But the distrust is a big part of this as well. I think also we are living through a period where the growth of the ’80s and ’90s—which a lot of people saw as reflecting a kind of meritocracy—has just fallen away. Because everyone’s working really hard, and, you know, my generation, and in particular younger generations, don’t seem to be, you know, having the successes of their parents in many cases.

And people recognize that. And that’s, you know, just part of the system. Because neoliberalism has been such a priority over the last 40 years. You’ve had places, like union halls, public spaces—you know, places where people could meet and actually deliberate and actually form these counterpublics—disappear.

And instead, that’s in online spaces where everyone’s just mad at each other all the time, because of the algorithmic recommendations they’re getting. And that does not help democracy at all. It kind of undermines the entire thing, really.

Warzel: Well, and yeah—I mean, what you’re describing is this perfect storm, right?

And I think what is really important in what you said: There is this idea of community, right? I think about this all the time; I mean, I try to assess this in my own mind when I’m consuming information and scrolling and thinking and trying to, you know, express my opinions or figure out what I’m gonna write about or do. And I think, like, it becomes this hard thing for people to see. But, like, the ways in which humans are wired to want to be in community with other people—to want that acceptance, to want, you know, those bonds and then the norms that those communities create. Which are enforced by all this discourse and all these information systems that we’re all plugged into. And the fundamental, like, emotional and psychological pain of breaking from those things, right? Of saying something and being ostracized for that. I mean, it feels to me like, you know, it’s so easy to talk about this stuff in a really simple way—or it sounds very simple when you just say, Oh, if you say the wrong thing, you know, you’ll be canceled or ostracized or whatever.

And we make a lot of hay outta that. But like, psychologically, when you get yourself into this position where these are your people, right? Like, when you get into these groups, you’re obviously facing a lot of pressure from outside people.

You kind of … you hunker down. But if you are trying, you know—if you do try to stray from that—there is this, like, psychological pain tax that, you know, these communities will exert on you if you break. And it feels like it’s just—that’s one of the things that just, like, hardens this, right?

That creates a more durable ideology.

Higgins: Yeah, because—whether they admit it to themselves or not—they deeply understand, you know, what turning away from that community means for them. Because if you hold extreme beliefs on pretty much anything, you aren’t gonna have many people in the real world who agree with you. And often your family members will start having, you know, bad relationships with you.

I mean, if you have really extreme beliefs around things like anti-vaxxing—an example where people on the other side of the argument do have a strong opinion—you could find yourself with your only friends really being that online community who agrees with you about all the things that you think, and where you actually get recognition for that.

And that’s something that’s really important, because people don’t feel recognized in the real-world spaces they’re in. Because their family doesn’t talk to them, and all their work colleagues think they’re the weird one in the office. They go online, and they’re a hero. So that is very, very appealing to people on that level, but it’s also the fact that they kind of define themselves—not just in terms of the group they belong to, but the people outside that group. Those people aren’t just any other people. They are the enemy or idiots. And it would be kind of admitting, if those people are right, that maybe you are the idiot, or maybe they weren’t the idiots all along.

And that creates that dynamic as well. So it’s really—the claws can get really deeply into you. And it’s a lot, actually, like how cults are formed and controlled. We’ve managed to build a system that kind of creates virtual cults automatically, without realizing. So yeah: That’s basically what the algorithms have done to us.

Warzel: Hey, that’s such a striking takeaway: that we’ve just democratized the cult leader, or the cult dynamic.

Higgins: For every single person in the world, you just need an algorithm. It’s as simple as that. Now, you don’t need some strong leader, because you don’t need someone telling you what you are, right?

And how special you are. You’ve got an algorithm just serving you content that makes you think that.

Warzel: So I’m curious. What can be difficult in talking about this is, essentially, as you know, your framework goes: This is basically the air we breathe, right? Like, this is embedded in everything and every conversation that we have, and every debate that we have, and the way that every institution is trying to, you know, like claw onto or—

Higgins: Even this podcast. Because you’re gonna be thinking, after you’ve done this, which clips will go the most viral, and what is gonna get this attention?

Because you are as much a part of this economy, and you can’t escape it. It’s not something you can opt out of, because you kind of then cease to exist. And you know, if you work in the media, that’s a big, big problem. So everyone is—even me. I’m on Bluesky writing these threads, thinking about what’s gonna get the most engagement on these threads.

But underneath that—because I have this understanding of this model—I want to do it in a way that’s functional. That’s part of, you know, creating good information. And that’s always, really, been what Bellingcat has been about. Our investigations add information that allows people to understand. You know, deliberate, and, you know, hopefully bring about accountability on a range of topics.

And I realized, doing this work, that Bellingcat was a functional counterpublic, because it had formed in reaction to—you know, not the biggest injustice in the world, but the way in which the media was ignoring this content from these conflict zones. And it was designed around the idea of verifying that, and we create spaces for deliberation.

So, you know, X—or, well, Twitter—was really a big part of the open-source community. But as things have changed, we’ve created a Discord server that has about 35,000 members. Investigations come from that. People learn how to do investigations and, you know, learn from each other in that space. So creating those functional spaces is also something that we do with Bellingcat.

And then finally, you know, we do accountability as well on the work we’re doing. On a range of topics.

Warzel: Well, and two: You know, I also feel like with Bellingcat there’s this idea, right, of “do your own research.” And that has become this, obviously a shorthand for, you know, “Take your sort of amateur lens with—maybe you don’t have all the right information to process what you’re seeing, but go do it anyway. Form your own conclusions.”

And in this way—I mean, in a very real way—Bellingcat is a way of doing your own research through, you know, a system that has more guardrails. You are basically, at least in my mind, the good-guy version of “do your own research.”

Higgins: Well, you can see it in terms of the framework.

So, you know, we can see ourselves as doing kind of a functional version of that. But there’s a disordered form of that, where people are doing their own research—but they’re doing it through this moral epistemic lens. That means they’re eliminating certain sources as being trustworthy, because those sources disagree with the worldview of the system they’re already part of.

Then we also have the hollow stuff like that, as well. I mean, you know, it’s like—in terms of deliberation, you have these, you know, “20 X versus one Y” videos, which kind of perform this.

Warzel: Oh, you mean the Jubilee videos?

Higgins: Yeah, those. I despise those. I think they’re just a strong example of the kind of hollow performance of democracy—that no one’s there to learn from each other, to come to a shared understanding.

They’re there for clips to get attention on social media. And that’s, for me, the bottom line of those videos. No one’s going there to have their minds changed. It’s not designed around that. It’s designed around capturing the algorithm, and I think it’s, you know, bad for democracy and pathetic as well.

Warzel: And what would a healthy, functional Jubilee video look like to you?

Higgins: I mean, the concept of them, and how they’re set up. I mean, that’s just—unfortunately, it would look boring to most people, I think. This is the, probably—

Warzel: Right? That’s the point too, right? Is that it would be so substantial that it would appear boring next to the rest of the content?

Higgins: And this is the challenge. You’ve got two or three seconds to grab someone’s attention. And if you’ve got, you know, some idiot scowling at some other idiot on social media, that will catch someone’s attention most of the time—even if they just drop out partway through the video. So it catches your attention. Or whatever new ideas someone comes up with to irritate you enough to stop scrolling.

But it’s not about building a better democracy; it’s just about content that gets attention on the algorithm.

Warzel: Right. So how big—and this is maybe a silly question given what we’ve just talked about. How big a crisis is this disordered discourse? Like, how do you capture the scale of it? How do you conceptualize the size and the stakes of all of this?

Higgins: It’s, I think, something that is best measured by the effects it has on society. And I would say what’s happened in the last nine months in the U.S. is a really strong warning to everyone else of where it can go. I look at the MAGA movement as a coalition of disordered counterpublics.

So if you look at, for example, the alternative-health counterpublic—which is, you know, what RFK Jr. represents. You have the kind of anti-NATO, chemical-weapon-denialist community, which Tulsi Gabbard represents. You have the Pizzagate community, which Kash Patel represents. It’s not one kind of unified group with the same shared ideas.

It’s just that they have alignment around, you know, the concept that Trump’s gonna give them what they want, and that they all distrust institutions, and they’re gonna change those institutions. But that’s a really big problem—because it creates a cycle where these people, they look for problems that don’t exist. That they truly believe exist.

Maybe some of ’em are grifters. Some are true believers, and they go after that problem, and they can’t really solve it. So when they fail to solve it, they blame the outsiders—you know, they say, “Oh, it’s the woke media,” or whatever it is. “It’s the liberals.” And then they further kind of radicalize their own viewpoints against the outgroup.

And I think we’re still very early on in that process. But, you know, when that starts happening in the U.S., I think that will be a real moment of seeing how bad things have got for democracy in the U.S. I think it’s nice everyone can pat themselves on the back on the good election results recently. But unless they build something beyond that, we’re just gonna fall back into this kind of swamp of algorithmically mediated information that, I feel, is really difficult to escape from.

Warzel: Well, and there’s this way to—you know, looking at election results, I mean, there’s unending conclusions one can draw. Which is why it’s sometimes difficult to parse. But I think one thing that you’re also seeing that applies to this framework that you have is, especially in America, a kind of a ping-ponging back and forth, right?

A rejection of whoever seems to be holding power at this current moment, right? You have Trump’s handling of the pandemic. Certain things. Okay. [Joe] Biden; we need someone. Okay. You know, all the reaction to that. Trump back in, you know—you have this sort of wave of rejection, of honestly, like an institution, right?

As soon as you can kind of gain control of the institution, there seems to be sort of this rejection of it. And that doesn’t seem to me—you know, again, that may be too tidy of a bow to place on it, given what is also happening concurrently with the MAGA coalition in America. But I also think that it sort of speaks to this dynamic. Right. That it is extremely hard once you grab a little bit of authority now to face this information system that is essentially geared toward, you know, undermining that authority. I mean, does that feel right to you?

Higgins: Yeah. So you just end up—I think the U.S. is fairly unique in terms of having the two-party system. So you have that ping-pong effect.

It’s like in the U.K., for example: Reform U.K.—which is a very minor party, despite the coverage it gets because of Nigel Farage—is now leading in the polls. Which seems like really bad news, but that seems more like a rejection of both the Conservative Party, the previous government, and the new Labour government, which has been doing pretty badly to begin with.

So it is a reaction against something. But that’s the problem—our politics are becoming: I’m reacting against something because they’ve let me down. And we’re surrounded by a system that will continue to remind us how rubbish things are, because saying “Everything’s great” does not get you on the algorithm.

Saying, “Everything’s rubbish; here’s some violence; here’s some sex; here’s some just really bad news”—that’s what gets people looking for stuff on social media. Not “Everything’s sunshine and rainbows,” unfortunately. And again, it comes back to these algorithmic incentives that are being created for people. And it’s just like—this is the kind of “dragging down in the swamp” effect.

You know, once you’re in there—once the claws are in you through social media—it starts dragging down the whole democracy with it.

Warzel: We have this diagnostic framework here that you’ve come up with. And then we have this idea, especially with democracy and also with institutions, that there is this old system trying to cling on. Trying to sort of work in the, you know, pre-algorithmic world, right? Still trapped in that zone. And now we need democracy to adapt to this system. What do we do? Help us out here.

Higgins: Yeah.

Warzel: What do we do?

Higgins: It’s a big job. First of all, there needs to be a recognition that the fundamental information system has changed dramatically.

And it’s really important to understand that institutions no longer dominate that verification, deliberation, accountability process. They also dominated what voices were allowed to be heard—but now any voice can be heard, and they get heard if they do something to engage the algorithm. Which usually is not something that’s truthful; it’s something that is engaging.

So we have to understand: That’s the system that we live in. Now the question we have then is, you know: Do our legislators really have any interest in stopping social-media algorithms? And I think in this particular nine-month period, probably—or 11 months now—probably not. And this means: Okay, so we have to look at kind of alternative action.

I think it has to come from the grassroots. It has to come from, you know, the public. Because, again, the institutions ain’t what they used to be. And we need to look at a different way of doing things. That’s not to say, you know, “full communism now,” or anything like that. But it’s more saying—how do you engage the public with the democratic process in a functional way?

And there’s a whole variety of ways of doing that. So I describe something called the art framework, which describes eight tracks of activity. And these tracks include things like education. For example, at Bellingcat, we’ve been working with schools on a pilot program to teach them critical-thinking skills, just to kind of see how that works in school.

And there’s a huge amount of interest in it. We work with universities to create open-source investigation courses, where students learn how to do the investigations. But also, at those universities we have investigative hubs, so those students can do their own investigations without the direct intervention of Bellingcat—but also connect to their local communities, because I think local media has to play a really big role in this as well. So we have the education aspect of it that can connect to the kind of media side of it. How do we actually get this stuff out there?

Warzel: I want to pause on that for a second. Because that, to me—I’ve talked with academics and researchers around this idea of education. And it’s obviously, like, it’s very important from the media-literacy portion of it, all the way up to what you’re talking about in terms of research methods. Giving people the ability to actually go and do that work, and understand that. Every time I write about that, or report on that, it is met with such harassment and fervor from people on the right.

It very clearly shakes propagandists to their core. They get extremely reactive about this. And I wonder: Do you get a lot of pushback there? The educational part of it feels like it is almost, like, the most crucial building block to some path that is, you know, a little more healthy from an information perspective. It also seems like the most fraught in terms of people all of a sudden getting very suspicious of “What are you teaching?”

Okay, you’re teaching this critical thinking. Is this critical thinking, you know, to blindly trust institutions? Or what have you. Are you seeing that kind of pushback when you try to implement some of these programs or ways of thinking?

Higgins: Not when we implement them. But when certain people hear about them, their framing is more that, Oh, they’re gonna go into schools and teach young people propaganda against the right or whatever. Or on behalf of the government. They don’t see it as teaching skills. They see it as teaching information, propaganda. Which is absolutely not what we’re about. I mean, the whole point of what we’re trying to do is, you know, recognize that we no longer have this relationship with institutions where we rely on them to verify the information that comes to us. And that really was the reality of the 20th century through the media. I mean, you could buy a book; but who decided to write the book? You know, who published the book?

So now we have this freefall of information. We need to recognize that we live in that environment—and that to navigate that environment, media studies, critical-thinking skills are not optional anymore.

Warzel: When I think about the real demise of local news, I mean so much here in America. Obviously it’s an issue globally. And then the nationalization of news. I think a lot about—I’ve lived recently in a couple of small towns, and you know, you can really see how that local model is just a virtuous cycle of trust-building, right?

This idea of, you know, the people who do the reporting and the writing and the verification and the holding people to account—there is also this way in which the community does that with them, right? Like, they are the—let’s say, your local columnist or your local investigative consumer-reporter kind of person is also, you know, a parent in the schools.

You see them on the sidelines, in the sporting events and things like that. There’s this way in which they’re all intertwined. The community can, you know, verify, hold them to account; well, they hold the community to account. And there’s this real notion of, you know, tangibility, I think with all of it, that builds trust, right? This is not some abstract person. When you get rid of that, and you nationalize it—and that’s part of what these information systems have done, too, right? They’ve nationalized our conversation so much. Like I know more about the mayor-elect of New York City than I know about any politician truly within 500 miles of my home.

There’s something that’s a little mind-boggling about that, too. But I feel like the nationalization of that, what it incentivizes is people dropping into your community, right? Someone from, you know, a magazine or whatever organization comes in when something happens, right?

I think about it as a reporter, a national reporter, myself. I’ve had to go into communities after, say, school shootings, and I don’t know a single person in this community. I don’t know the norms of this place. I’m trying to do my best, but it’s ultimately an experience that alienates almost in both directions, and can cause that distrust. And I think of all of this, of how it creates that cycle, how it adds to the disordered discourse, right? Because we aren’t speaking from that position. And I think this—like, the localization of all of this, this community element of this—it sounds, I think, to a lot of people very pie in the sky, right?

But reestablishing those networks—I feel like it is like the crucial first-step node, right? In restoring trust and saying Okay, maybe I don’t trust all journalists, but I definitely trust the ones in my community. Because I’m watching them work. I’m seeing the impact of that.

Higgins: I think some people look at that, and yeah, they do say “How is that possible?” But one of the things that I found really interesting working on the art framework—we have these eight directions of activity, and you’ll find there’s people in different kind of silos.

You know, there’ll be journalists; there’ll be people working in education, some civil-society organization, who never talk to each other, never know each other exists. Yet they’re actually working on something in that same direction, that overlaps really nicely. So by having that, rather than thinking about, you know, “I’m an NGO”; “I’m a journalist”—

We can start saying: Okay, I’m interested in this stuff; I’m doing stuff here. And we can look at ways to collaborate on that, because finding those collaborations isn’t just about a kind of nice way to do a bit of extra work. It’s about also building relationships between people in those collaborations. And it’s almost as if we have these online spaces that allow us to connect internationally and nationally. But we’ve gotta connect those to the real-world spaces that, you know, are in these local areas. Like through these university hubs—almost as like they’re a node in a network that we’re trying to build.

And it’s not a network built around one center. It’s very decentralized. And that’s, you know, I think really core to this as well. Because as soon as it starts to feel like that network is owned by someone, that’s when people will start losing trust in it.

Warzel: Yeah; I think that’s right. One thing that I’m getting from all of this, and reading the work and talking to you, is that you might think, right, with this idea of trying to reorder or restore some order to our discourse, that would be a call for the removal of friction. The removal of disagreement. The removal of, right. ’Cause right now, we exist in a system that when you’re experiencing it—when you are participating on Twitter, X, whatever, Bluesky, name your social network, right? Or reading the news, or watching politics in action, there’s so much of that friction and disagreement and pain and psychological pain and dunking, whatever.

But what I’m getting from you actually is an insertion of friction, of disagreement. Like this whole idea of functional democracies needing these counterpublics, right? These are, like, holding people to account, putting friction in the system. Do you think of it that way? Of this is kind of—actually adding friction is the way that we sort of get a more ordered, a little more seamless, discourse?

Higgins: Yeah. I mean, it’s as simple as saying that, you know—if you want a functional democracy, you need functional counterpublics. And the problem is: We aren’t seeing those forming in this current information system, because of the incentives that these platforms create.

So the question is how you start creating those healthy counterpublics before it’s impossible to even do so. Because institutions become captured by that disordered discourse. And, for me, I feel like we’re staring that in the face in the U.S. at the moment. We’re so close to a tipping point, and I think Americans really need to realize that, you know, they are a very short way away from becoming just as bad as Turkey or Hungary, and just as hopeless when it comes to restoring democracy in those countries.

Warzel: I feel like listening to all this, what feels so insidious is that these tools that have really helped create this disordered discourse that you speak about—this democratic collapse of sorts—have been sold as these ultimately democratic tools. These tools that give voices to so many people, that get rid of these gatekeepers, that allow for what you would think is this democratic flourishing. And it’s so difficult for me to grapple with that. I think, because it puts people in a difficult position—from any kind of institution, from any kind of former authority position—to say This explosion of voices and perspectives and things like that, you know, is actually leading to a lot of chaos, and we do need some order in the system. How do you think about messaging that to people? Like, you know, talking about this in a way that, you know, doesn’t erode the trust, doesn’t make people roll their eyes, and talks about this idea of trying to balance this, you know, democratizing of opinion. But also to do it in a way where there is some sort of order and structure, and that isn’t, you know, eroding the good parts of the systems that we have.

Higgins: And I think, unfortunately, we live with a lot of good examples of what happens when this runs out of control. And those are becoming more and more apparent to everyone. I think, you know—again, the U.S. is unfortunately a really good example of this at the moment—but one thing I found presenting this work is a lot of people have kind of felt this problem. But they’ve not been able to conceptualize it in a way that they can articulate. When I explain this to them using this framework, it kind of all comes together in a way that they can understand. And that’s where, I think, we need to start with a story. We can’t just say, “You know, there’s too many people believing the wrong things; this is how we tell people how to believe the right things.” Because that’s not how it’s gonna work. It’s gonna be about: How do we actually improve, you know, the whole system so that people can feel empowered to actually do this stuff? That they see there’s value in doing this stuff. Because we can, you know, have the most functional verification and deliberation in the world—but if there’s no real accountability, it’s still gonna start moving toward those performative and disordered forms.

We need there to be not just, you know, the stuff that the public’s doing—but for real accountability to happen. And I think we’re getting very, very close to a point where if we don’t get a handle on this, it’s gonna start dominating our politics, our democracies, until we don’t really have democracy anymore. Both in the U.S. and the rest of Europe.

Warzel: I think that is a sobering place to leave it. I thank you so much for your work. I do have one question. We asked Hank Green this, because he’s someone who thinks about attention and understands the media, as we do. And we joked about it earlier. Give me a potential headline for this YouTube—

Higgins: Oh my god.

Warzel: What are we gonna do to attract maximum people?

Higgins: The most engaging thing.

Warzel: The most engaging thing that we can do. We’re gonna be transparent about our process here at Galaxy Brain, right?

Higgins: You have to say something’s definitely been completely destroyed, in a hair-on-fire way. Destruction of democracy. Good. If you use the same D, you know—“The Desperate Destruction of Democracy” or something like that. That actually sounds like a quest from Outer Worlds 2. No, uh, something less cheesy. But, you know, you need something that’s really punchy. It’s definitely the end of the world. You’ve gotta get people listening, ’cause unfortunately that’s the attention-driven economy we live in.

Warzel: All right. We will recklessly ramp up the stakes of this. No, I think the stakes couldn’t be higher. I think your work truly couldn’t be more important to getting us all to understand—and better describe—the temperature of the water in the aquarium that we’re all swimming in. And it’s getting hot. So, Eliot Higgins, thank you for all your work and for coming on Galaxy Brain.

Higgins: That’s great. Thanks for having me.

Warzel: That’s it for us here. Thank you again to my guest, Eliot Higgins. If you like what you saw or heard here, new episodes of Galaxy Brain drop every Friday. You can subscribe to the Atlantic YouTube channel, or on Apple or Spotify or wherever you get your podcasts. And if you’d like to support my work or the work of the rest of the journalists at the publication, you can subscribe to The Atlantic at TheAtlantic.com/Listener.

That’s TheAtlantic.com/Listener. Thanks so much, and I’ll see you on the internet.
