Filmed February 13, 2025
BILL KRISTOL:
Hi, welcome back to Conversations. I’m Bill Kristol. Very pleased to be joined today by Renée DiResta. A first-time Conversations guest, but an excellent one… I don’t know why I was so slow in having you on, but she’s an associate research professor at Georgetown, and author of a very good book that I really do recommend, which came out, what, a few months ago I think, Invisible Rulers: The People Who Turn Lies into Reality. Renée is one of, really, our leading students of, I guess, what? Influence and propaganda online maybe? I don’t know. And has made many… Amassed an impressive list of horrible people who have been your enemies for exposing what they have been doing online. So as someone who’s cultivated a certain number of enemies over the years, or at least critics is the nice way to say it, and has taken a certain pleasure in that, I want to congratulate you for doing this at your young age.
RENÉE DIRESTA:
Thank you so much.
BILL KRISTOL:
Anyway, great to have you with me. And you’re going to explain influence and propaganda online and what could possibly be done about it. And I guess let me begin with just the obvious sort of question, and maybe put it in a contrarian way. I mean, how powerful is this phenomenon really? You say invisible rulers. Isn’t it just another… We have a new technological medium for communications, just like we’ve had over… There’ve been plenty of breakthroughs in the past, radio, television, et cetera, and are we being… Is it as powerful as some people think, and what’s particularly striking about it?
RENÉE DIRESTA:
Yeah. Well, I think it is the next iteration. I mean, what you’re describing is basically different technological epochs for how opinion was shaped, right? One thing that’s really interesting about social media though is that people are participating. In the other kinds of media environments that you just described, people were the audience and the audience was passive, and that, I think, is the biggest shift. So one of the ways that people form opinions is by actively participating with somebody that they trust. They kind of decide what’s true, what’s real, what they’re going to believe, who they’re going to vote for, all that kind of stuff.
And one of the things that was wrong in our understanding of propaganda in kind of the olden days: around 1947, people started realizing that a message was not communicated from on high. People didn’t hear something on the television or on the radio and say like, “Oh.” Their minds didn’t magically change. That used to be called the hypodermic needle model of forming opinions. You heard something and boom, you changed your mind. We all know that’s not true. Hey, I spend my days reading bullshit on the internet all day long and I don’t believe it. But people tend to assume that, well, those other people must. No, that’s not true either.
What happens a lot of the time though is that it’s what you see in your social circles. And what’s really interesting about the internet is that, as more public opinion research was done in 1947, what people found was that some groups of people would follow the media and then would discuss it with their friends. And those people came to be called opinion leaders in this very kind of canonical work that studied how this process happened. And the opinion leaders would talk to their friends, and when those friends were sort of surveyed by social scientists, they would report that they formed their opinions about who to vote for in that election in large part based on what the women in their community were talking about with them.
So these opinion leaders kind of became the mediators who took stories from the media and talked about it with their friends, and that was where the influence channels were happening. And this came to be called two-step flow. Two-step flow of information and opinion shaping. And what happens on social media is that the people that you talk to are the people who are also kind of the media. So you’re talking to an influencer who is also the person that you sort of see as being this media-like figure, but they’re talking to you at the same time. You’re having that opinion shaping process happen with the other people who are in the comments. If you go watch a stream on Twitch or something like that, the chat is going at all times, and the person who’s speaking to the tens of thousands of people who are watching is picking comments out of the chat and responding in real time. They’re having a conversation.
The same thing is happening on Instagram. The same thing is happening on TikTok. So you have some models of media that may be on the internet, you can still read news on the internet, you can still read an article, but that’s still a writer who is talking to you, when what the internet is much more effective at doing now is having a person have a conversation with you. And so that opinion shaping is happening. And then the other thing I’ll just add really quickly is that algorithms are also pushing you into communities that are just like you. So those opinions are constantly reinforced, and that’s the other thing that’s really significantly different.
BILL KRISTOL:
Yeah. So let’s go through each of those sides of it in a way, but I think you explained the first part very well. Does that make this… How should I put it? Does that make propaganda, or let’s be neutral and just say “information,” somehow more powerful? It’s quicker, it’s faster than, I suppose, in the old days. And is it more powerful? Is it something almost qualitatively different from the process you just described from 1947? I mean, how different, how new is this world we’re in? Or how much is it just, “Hey, it’s radio and now it’s this,” and what’s all the… We can think of it in the same way. It feels different. It feels… The instantaneous character…
RENÉE DIRESTA:
It feels different.
BILL KRISTOL:
… of it, the mob character of it, and so forth. But, discuss.
RENÉE DIRESTA:
A lot of it is very similar. The way that propaganda works, a lot of the time it’s tapping into identity-based needs. One of the things that you really see on the internet is people assembling into very identity-based groups. And this happens partially because of culture, a particularly polarized culture, and the internet is reinforcing that. Algorithms though do key off of things that you like, things that people who are like you like. And then when that happens, you are put into these buckets, if you will, where you’re going to see more of a certain type of thing, so those identities are reinforced.
And what you start to see is people will also really become very passionate defenders of some of those political identities in particular. They’ll put an emoji in their bio so you know exactly what they are the minute that you see them, and then they’ll fight with other people who are on the other side, in a different tribe. So that dynamic is very real. But I kind of got into studying this in part as an activist, not as an academic, because I wanted to pass a pro-vaccine bill in California as a pro-vaccine mom. And I think every time people start looking at something, they think like, “Oh, this just happened, this is the very first time this happened, this is completely new.” But I started reading about anti-vaccine narratives and realized that you could actually go back and pull up the British Medical Journal archives from the 1800s, and it is the same stuff. The vaccine is going to kill you, it’s anti-God, it comes from cows and it’s going to turn you into a human-cow hybrid. Right? They’re talking about smallpox—
BILL KRISTOL:
But if it’s the same stuff, doesn’t the algorithm change everything though?
RENÉE DIRESTA:
Well, that’s how you get at that. It’s the targeting. The algorithm changes the targeting. So content and distribution are two separate things. Content is what you’re going to read, what you’re going to pay attention to. Sorry, let me actually differentiate that. Content is what you’re going to read. It’s sort of the substance of the stories. It’s the narratives themselves. But what you’re going to see is a little bit more targeted for you. So there’s a whole bunch of different stories out there in the world that could describe a particular phenomenon and they can all be inflected for your particular identity. And so as a pro-vaccine mom in California, I can read… I was reading a story in the New York Post this morning about… It was sort of going viral on X. There’s a relative of JD Vance’s who is trying to get a heart transplant and her parents don’t want to have her vaccinated, so she’s not eligible. So this is going to outrage everybody for entirely different reasons. And so it’s going to be targeted to people using very different framings, even though the facts of the story are the same.
BILL KRISTOL:
It just feels… Yeah, I mean, there’s always been obviously self-segregation of groups, and not just self-segregation, but other forms of actual segregation of groups, either by income or by government policy, obviously, in terms of race and so forth, and just residential segregation and ethnic groups and so forth. Having said all that, it does feel like it’s one thing if you as a pro-vaccine mom had wandered around to all your neighbors’ houses with a petition, 60% of them might’ve agreed with you and 40% might’ve been doubtful and you would’ve had conversations.
I do feel it doesn’t… Correct me if I’m wrong, but it feels like the algorithm both intensifies the uniformity of belief that one encounters and just the power of it, because it’s obviously national and international and instantaneous. So suddenly you feel like, “Well, there are thousands of people who agree with me.” Whereas if you walked around your neighborhood and one person did agree, but then two people didn’t, and the one who didn’t seemed like an intelligent person and had some experience, you might change your mind. I guess how powerful is that both in, I don’t know, magnification and intensification of belief?
RENÉE DIRESTA:
Yeah, it’s a great question. It’s called “majority illusion.” And so majority illusion is when the people who are around you shape what you think the majority opinion is, right? You see these stories where Trump wins and people are like, “Whoa, how could that possibly have happened? Nobody I know voted for him.” In the case of the vaccine thing though, it’s a great example because the only people who were posting about vaccines at the time were the anti-vaxxers. This was in around the 2013, 2014 timeframe; 2014, 2015 was when we were trying to get this bill passed. And that’s because most people go and they get their kid vaccinated and nothing happens and they don’t post about it. This is the same phenomenon with the flat earthers. You can find flat earth content, but not round-earth content, because nobody is creating the round-earth Facebook group. That’s the norm.
And so what you see happen is that the majority opinion seems like… The anti-vaccine opinion seems like the majority opinion. Post-COVID, it’s huge. Around 2015, it wasn’t. And so we had this bill that we were talking about where the only people who were posting about it were the vehement anti-vaxxers who didn’t want it to pass. And then we were trying to say like, “Oh, boy.” The legislators are saying, “Well, all the people… When I poll my constituents, they’re all in favor of it. When I go to my town halls, they’re in favor of it. But then when I look at the internet, everybody’s against it. What is happening here?” And you start to see that dynamic of who are the communities that are active, and then beyond that, what is curated for you. And so the internet can really twist and shift your opinions depending on what bubble you wind up in.
BILL KRISTOL:
Yeah. I remember when I was a young conservative—I don’t want to shock you that I was once that—there was a book that was very popular, I remember getting it from the Conservative Book Club for $2.99 or something. It was written in the mid, early 19th century, I vaguely remember. Something about the madness of crowds. I can’t remember the first—
RENÉE DIRESTA:
Oh, yeah, yeah, yeah. Extraordinary Popular Delusions. I have it behind me.
BILL KRISTOL:
Yes. Extraordinary Popular Delusions.
RENÉE DIRESTA:
Yeah, yeah, yeah, Charles Mackay. It’s a great book.
BILL KRISTOL:
And so, conservatives loved it because it was vaguely anti-democratic, let’s say, with a small D. It was like, the people are often wrong, they get carried away, tulip crazes, this, that, conspiracy theories. And of course, all those things have existed. But again, I guess I come back to the question in terms of politics, it does feel like those things might’ve burned themselves out faster or might’ve been more subject to correction by facts or by alternate groups, by other people showing up and saying, “Wait a second,” or other people they respected. Anyway, but go ahead.
RENÉE DIRESTA:
I think it varies. I struggle with that because I think it really varies over time. What you’re talking about, tulip mania, Mackay kind of opens with that, is a big one of these sort of cautionary tales. I was at Jane Street for a while. It was one of the books that they give junior traders when you’re—
BILL KRISTOL:
Is that right?
RENÉE DIRESTA:
Yeah, yeah, yeah. When you’re brand new. So yes, I read that in 2004 or something. But no, it is this… And I think about it as this cautionary tale, that momentum might be taking us in one direction, but that doesn’t mean that it’s correct. The market can remain irrational longer than you can remain solvent is the sort of Wall Street take on that. But the question about being resistant to correction is an interesting one. And again, as I mentioned, the anti-vaccine letters to the editor in the medical journals in the 1860s, they’re there. And the letters back from the doctors, they’re incredibly blunt. They’re like, “Why are we publishing these idiots,” is basically the response. So you do see that frustration. Like, “Why are these people not deferring to my authority,” is sort of the response even back then.
But I think the challenge that we face now is that… So I was talking about creation and distribution as these two separate things. Anyone can be a creator now. Anyone can be influential. Anyone can develop relationships. Anyone can say anything they want. And this is not inherently a bad thing, right? Because propaganda, for all of the writing on it from the ’20s through Chomsky in the ’80s, writing about the propaganda of hegemonic media, like selling wars to the public and stuff. This is where you start to see that challenge of institutional media getting it wrong or institutional media carrying water for the government, and this was what the internet was supposed to not do. That was the vision, that we would finally have this media that was independent, this way to get better information to the people.
And instead, what happened, unfortunately, is that… Speaking of Wall Street sayings, there’s that Charlie Munger one, “Show me the incentives and I’ll show you the outcomes.” And that’s where you see, again, the incentives of social media unfortunately start to turn. All of these influencers who maybe start out as just sort of plucky people wanting to tell their vision of the truth to the world eventually begin to… A lot of them begin to monetize, to sell a particular vision of reality to an audience. They wind up in this audience capture kind of process, and they begin to really reinforce selling a particular message to a particular niche.
And through the sort of process of curation, where an algorithm filters you in particular directions, you do start to see this reinforcement. And so the way I describe it in the book is, instead of this hegemonic propaganda that sells a vision of America to the entirety of America, you wind up with these people who are sort of selling a vision of a particular identity to a particular niche. And so you have these fragments that fight with each other instead of a more unified form of propaganda. I don’t know which is worse. I kind of go back and forth on that, but that’s the… The great vision of… I think it ultimately is better that we don’t have this one controlled narrative, but we’re facing a different fight now.
BILL KRISTOL:
No, that’s right, I think. It’s kind of an old-fashioned conservative view that you fix one problem and you create new ones, and you need to be sober about both, that it’s good to fix the old problem to the degree we have. I mean, the question is whether, in some cases, of course, the solution is even worse. And I guess that gets to… So let’s talk about the algorithms for a minute. I want to come back a little later to the power of this whole network and how important it is in the bigger picture of our politics and our society. But just, do the algorithms necessarily reward intensity, extremism, negativism, what one always hears they reward? Or could we be having a different social media right now?
RENÉE DIRESTA:
We could and we can. So yes, to some extent they do. But it’s not Mark Zuckerberg in a room cackling with his finger on the “I’m going to make it negative” button in the way that it’s sometimes kind of cartoonishly described. It’s more—
BILL KRISTOL:
Sometimes cartoons are somewhat accurate, though. Sorry…
RENÉE DIRESTA:
I think… well, I don’t know. I imagine you remember… Remember the ISIS days on Twitter? I feel like we kind of forget 10 years ago, but in the 2014 timeframe, ISIS got very big on Twitter, speaking of propaganda. It was one of the first terrorist organizations that realized that this gave it a way to recruit, that it could reach disaffected people. And it was very, very, very effective at propagandizing. It had an iconography. It had the black flag. It was very… Transmitting this powerful sort of brand that they built. I don’t mean the beheading videos, I mean the sort of video game-like ads that they would put out.
And Twitter really struggled with it, with what to do about it. And I didn’t struggle with it. I was like, “Why aren’t they taking it down?” I felt like it was morally clear. But they saw it as like, “Well, it’s a global platform. We’re the free speech wing of the free speech party,” was what one of the… a friend of mine who was there at the time was saying. And this was the “one man’s terrorist is another man’s freedom fighter. We don’t want to look like we’re taking down content because the US government wants us to,” et cetera, et cetera. And then you had this sort of series of escalating attacks where propaganda… People would see ISIS on Twitter and then would go and say, “I pledge allegiance to ISIS,” and then they would go and… There were actually a lot more people driving trucks into crowds, if you recall… Or the Bataclan Massacre, the nightclub in Paris. And Twitter eventually began to act and to take them down.
And that sort of moment of the realization that all of the sudden, private companies have an incredible amount of power over geopolitical events, over what is happening in the public square. It really became, I think, this wake-up call as people began to realize that what an algorithm curated… Because what was happening with the Twitter algorithm, just for people who don’t know or weren’t aware, is that if you followed one terrorist account, it would show you more. And so you’d follow one, it would say, “You might like…” and then it would show you three or four more. “You might like. You might like. You might like.”
BILL KRISTOL:
And that wasn’t just… It would also be true if you followed an account about Mozart, it would show you other classical music accounts.
RENÉE DIRESTA:
Exactly.
BILL KRISTOL:
It wasn’t being pro-terrorist, it was just…
RENÉE DIRESTA:
No, sorry. I feel like I’m so immersed in the weeds of it.
BILL KRISTOL:
No, no, no. You said it correctly.
RENÉE DIRESTA:
If you follow Elon Musk now, it gives you Marjorie Taylor Greene and Jim Jordan.
BILL KRISTOL:
No, but that’s why I’m about to contrast it perhaps, and I will ask this leading question ahead of time, which is, it feels like Twitter was neutral, and you should go on and explain the problem even with, let’s call it being neutral, about steering people to what they already have shown an interest in. But that’s one level, and another level would be not being neutral and purposely trying to steer people to one, well, to a certain point of view or to a certain conspiracy theory, or whatever.
RENÉE DIRESTA:
I think the question of neutrality is complicated. So Twitter has that phenomenon happening with ISIS, right? That’s called content-based recommendations. It’s showing you more of something you’ve engaged with. So yes, you follow a particular topic and it gives you more content related to that particular topic.
The other way that platforms show you things is what’s known as collaborative filtering, which is, I am a forty-something person with three kids living in DC, right? Well, there are a lot of forty-something people with three kids living in DC. I also cook. I occasionally post about cooking. Well, people who post about cooking also like to garden. People who like to garden like to jog. Maybe I would like to jog. So then it’ll show me jogging-cooking content. And that’s because what it’s doing is it’s saying that these other people who are kind of like you, who have this overlap with you, also do this thing. And there’s an immense amount of data that platforms have about you and about all the other people too. And so it’s just doing machine learning; it’s sort of finding patterns and using that to show you recommendations, content or people, that you are statistically more likely to be interested in.
So this is when you get to the question of “neutral”. So what started to happen on Facebook at around the same time… Now Facebook, by the way, acted very directly with ISIS pretty early on; they were not having this one man’s terrorist, freedom fighter debate. But what was happening on Facebook was interesting, because you did start to see the anti-vaccine… I had an anti-vaccine account so I could follow anti-vaccine news without it polluting my main feed. And that account would get recommended flat earth content. It would get recommended chemtrails content. And then I would join the flat earth and the chemtrails groups. And this account never posted, it just joined.
And then after the chemtrails, I started getting Pizzagate, right? This was around the election, around 2015, 2016 timeframe, so the Hillary Clinton emails and all that stuff. And so I wound up with Pizzagate. And then I logged back in for something. I wanted to get some screenshots related to some of the chemtrail stuff that I was paying attention to. And I wound up seeing QAnon content. And this was before QAnon was really in the news. So again, it’s this progression where—
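A minimal sketch of the two recommendation styles described above: content-based filtering (“you engaged with this topic, here is more of it”) and collaborative filtering (“people who overlap with you also like this”). The users, topics, and similarity measure are illustrative assumptions, not any platform’s actual system.

```python
# Toy illustration of content-based vs. collaborative-filtering recommendations.
# Users, topics, and engagement counts are made up for the example.

from collections import Counter

# What the platform observes: each user's engagement, counted by topic.
engagement = {
    "user_a": Counter({"cooking": 12, "gardening": 5, "parenting": 3}),
    "user_b": Counter({"cooking": 8, "gardening": 9, "jogging": 7}),
    "user_c": Counter({"jogging": 10, "gardening": 4}),
}

def content_based(user, k=2):
    """Recommend more of what this user already engages with."""
    return [topic for topic, _ in engagement[user].most_common(k)]

def overlap(u, v):
    """Crude similarity: how many topics two users share."""
    return len(set(engagement[u]) & set(engagement[v]))

def collaborative(user, k=2):
    """Recommend topics favored by the most similar other users."""
    others = sorted((v for v in engagement if v != user),
                    key=lambda v: overlap(user, v), reverse=True)
    suggestions = Counter()
    for v in others[:k]:
        for topic, score in engagement[v].items():
            if topic not in engagement[user]:  # only things this user hasn't touched
                suggestions[topic] += score
    return [topic for topic, _ in suggestions.most_common(k)]

print(content_based("user_a"))  # ['cooking', 'gardening'] -- more of the same
print(collaborative("user_a"))  # ['jogging'] -- "people like you also jog"
```

The second function is the dynamic being described: the user never searched for jogging; the overlap with similar users is what surfaces it.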
BILL KRISTOL:
So if you like one conspiracy, the algorithm thinks you’ll like other conspiracies.
RENÉE DIRESTA:
Correct. Exactly. QAnon wouldn’t have existed… Like, this is a hill I’ll absolutely die on. QAnon wouldn’t have existed to the extent that it does today without the recommendation engines pushing all of those people into that group, into those groups. There were many of them by that point. There were hundreds of thousands of people in these groups because the algorithms were bringing them together. Because as far as the platforms were concerned, the highest good, if you will, was an active group, because that was what they were optimizing for. Show me the incentive, I’ll show you the outcome: the incentive, the optimization, is getting people talking.
Well, conspiracy theorists will talk all day long, because they’re in there digesting the “Q drop.” And so they’re in there trying to figure out where the pedophiles are, where the children are, or whatever. And these are incredibly active groups, so there’s no ethical question being asked there. Is it good to recruit people into digital cults? That’s not the question they’re asking at the time. ISIS is kind of clearly a no-no, but some of these other things were not. And you start to see the ethical questions, they get thornier, right? I feel like QAnon is the easy one.
BILL KRISTOL:
Before we get to the ethical question, I mean, just on the empirical side, so to speak, this does seem to be very important. It’s very interesting. I mean, if the algorithm… I’m going to say this in a stupid way, but you’ll understand. It’s smart enough to understand that if you like this conspiracy in this area, you might like other conspiracies in another area. That has always been true presumably psychologically. I’ve got to stipulate for human beings.
RENÉE DIRESTA:
Yes.
BILL KRISTOL:
But in the old days, if you were a conspiracy theorist about X, you wouldn’t necessarily have entree, so to speak, or be forced into knowledge, even, of a conspiracy theory Y, right? So the scientific kooks would be in a different world; the vaccine people would be in a different world, perhaps, from the, I don’t know, QAnon types and from other kinds of conspiracies. So that has to be one case in which the new mode of just the fact of the algorithm, I guess you’d say, the new mode of transmission of these things, both rewards conspiracies, because they stay online longer, as you were saying, and they are more passionate and intense and they do what the algorithm wants to optimize for. But also then, more of them would get sucked in than would have been the case once before, maybe. So it does seem like it’s a case where it’s…
RENÉE DIRESTA:
Yeah.
BILL KRISTOL:
Yeah.
RENÉE DIRESTA:
And again, what this was good for was actually bringing people together. We talk a lot about loneliness epidemics and things like this. So the positive vision for what this was supposed to do was to help people find groups. I had a baby in 2013, and it started showing me mom groups. And I actually didn’t have very many friends with kids, almost none, in fact, and so it was nice to have that suggestion. A little creepy that it realizes it. You post your first picture with an infant and boom, the whole feed changes. It knows immediately everything. But that was what it was for.
And the other thing I’ll say, for those who aren’t immersed in the social media weeds, is that Facebook has, and all of these platforms have… Well, maybe not X anymore, but they have incredible teams of research scientists who work at the company, and they are actually doing research. And you did see it leak: some of the Facebook papers that leaked a couple of years later did show that they began to realize that pathways to radicalization were happening as a result of their recommendation engine, that that was what was taking people into these groups, that it was actually the power of suggestion. Because this anti-vaccine group account that I had, as I mentioned, it did not type in the word Pizzagate. It did not type in the word QAnon. It did not type in the word chemtrails. So everything was being pushed to it.
And that’s the part that I think is really interesting. It’s the signal that you get from piquing somebody’s curiosity with something that they’re likely to be interested in, and that’s where you get at the question of what is the ethical way for a platform to weight something when it’s making a proactive suggestion to you?
BILL KRISTOL:
Yeah, that’s so interesting. I hadn’t really understood the proactivity, if that’s a word, of what is even, let’s call it, “neutral.” I mean, I don’t mean that in a technical sense, but it’s not trying to promote QAnon, as opposed to, I don’t know, liberal Christianity.
RENÉE DIRESTA:
But this is what people…
BILL KRISTOL:
It does it because the algorithm knows that it will pay off.
RENÉE DIRESTA:
It makes a bunch of suggestions. It’s like it offers you a platter of food, and there’s salad maybe, but there’s donuts too, right? This is why…
BILL KRISTOL:
But it doesn’t even, it doesn’t offer you, I mean, just to take that…
RENÉE DIRESTA:
There’s not very much salad. There’s a lot of donuts.
BILL KRISTOL:
Right, and once you don’t take salad the first two times, it sort of stops offering you salad.
RENÉE DIRESTA:
You get more donuts.
BILL KRISTOL:
And then it, just to go on this somewhat amusing metaphor, and then once you’ve taken a couple of donuts, it starts offering you cakes and everything else. And then it also has studies that show that if you like sugar, and I’m not one to be critical of that, that you might also like other forms of other things that…
RENÉE DIRESTA:
Substances.
BILL KRISTOL:
What’s that?
RENÉE DIRESTA:
Substances.
BILL KRISTOL:
Yeah, other substances. I mean, it does feel like it’s a very different thing than just normal advertising, which everyone was so concerned was brainwashing people in their 20s and 30s and 40s, whereas, hey, these donuts are great. They’re the best ever. Buy them. I mean, it’s a slightly different phenomenon now.
RENÉE DIRESTA:
Well, this is where the TikTok conversation, which is a huge, whole other topic of conversation, but this is where the worries come about China controlling the algorithm. What could you do if you did have the capacity to manipulate a feed for explicit political ends? And TikTok’s algorithm is very effective. People really like it. It’s one of the few things that really blew up, even as there were other platforms where people spent a ton of their time. TikTok doesn’t actually really care who you follow, that’s not really the most important weight in the feed. Whereas most of the other platforms rely on what’s called the social graph, TikTok cares a whole lot more about just showing you content topics that are interesting to you in each individual video, as opposed to stuff from your friends. And so that’s where you start to see that question of who controls your feed.
And right now, just to bring things back to the other question, the other half of your question, could we do better? I would argue that that same concern about what happens when China controls your feed, right? That was the topic around TikTok, is what happens when a CEO you don’t trust controls your feed? What happens when Elon Musk controls your feed, right? Because Elon Musk owns Twitter. And right now you can… And I am not saying that Elon Musk is doing anything nefarious, but the point is…
BILL KRISTOL:
I’ll say that. You can not say that.
RENÉE DIRESTA:
But the point is, Elon Musk, when he bought Twitter, significantly changed how content was weighted in the feed, right? “Blue checks” all of a sudden were for sale, and their content would be ranked higher, their replies would be ranked higher, and people could monetize. And what I was referencing earlier, influencers who wanted to make money wanted to be more sensational, and then the algorithm would curate that, would curate that content, because people would engage with it. And so you would see a very different style. Like your feed, when people say, “My feed on X is totally different now,” that’s because the incentives are different and the way that it’s ranked is different. And the person who controls that is X, which is now owned by one person.
So what’s the alternative? It’s to move to platforms where you have better control over your own feed. Unfortunately, right now, Bluesky is really the only thing that’s doing that. It’s new. I think it’s still kind of read as like, “lib Twitter” because of the sort of early adopters. My hope is that that opens up and more people come because of that granular control, because you really can control your feeds. I think Threads is starting to get at this too: again, here are different ways that you can view your feed. Maybe other platforms will start to move into that mode.
But I think ultimately we do want to move into social media platforms where we have a lot more transparency and visibility, and that’s just not going to be something that the US government is going to demand, and the Europeans tried to do it, and I don’t know if you followed, but we’re currently reframing data transparency laws in the EU as some form of censorship. So right now, I think the alternative for people is really to move to other places.
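A minimal sketch of the kind of feed re-weighting described above: the same posts, scored under two different weightings (one of which boosts paid-verified accounts), come out in two different orders. The fields and weights are illustrative assumptions, not any platform’s actual ranking formula.

```python
# Toy illustration: the same posts ranked under two different weightings.
# Fields and weights are made up; this is not any platform's real formula.

posts = [
    {"id": 1, "likes": 40,  "replies": 5,  "paid_verified": False},
    {"id": 2, "likes": 25,  "replies": 30, "paid_verified": True},
    {"id": 3, "likes": 100, "replies": 2,  "paid_verified": False},
]

def score(post, like_w, reply_w, verified_boost):
    """Weighted engagement score, with a multiplier for paid-verified authors."""
    base = like_w * post["likes"] + reply_w * post["replies"]
    return base * (verified_boost if post["paid_verified"] else 1.0)

def rank(posts, **weights):
    """Return post ids ordered by score under the given weighting."""
    return [p["id"] for p in sorted(posts, key=lambda p: score(p, **weights), reverse=True)]

print(rank(posts, like_w=1.0, reply_w=1.0, verified_boost=1.0))  # [3, 2, 1]
print(rank(posts, like_w=0.5, reply_w=2.0, verified_boost=2.0))  # [2, 3, 1]
```

Whoever sets those weights decides what the top of the feed looks like, which is the point about who controls the ranking.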
BILL KRISTOL:
Well, that gets to the question of power, which you mentioned earlier, which I think is a word which is important. That is, I think a lot of the discussion in this area has been… And not yours, but others’ over the years has been interesting and thought-provoking, but it slightly misses sometimes just the pure power equation, and I think TikTok makes that so clear. I mean, people were really alarmed about what the Chinese government might be doing, or was doing, I think they were told in classified briefings, one has that impression on the Hill. They passed it by big majorities.
RENÉE DIRESTA:
I wanted them to say it. I really felt like, my objection to the TikTok ban was that they didn’t level with the public.
BILL KRISTOL:
Right.
RENÉE DIRESTA:
I feel like the authorities have to level with the public in cases like that.
BILL KRISTOL:
Well, they sort of did, right? That China was…
RENÉE DIRESTA:
No, they kind of hand waved about it. I really kind of wanted, I wanted Senator Warner in particular, somebody who’s been fairly blunt about this stuff, to come out and say it.
BILL KRISTOL:
To give an example, to walk you through it.
RENÉE DIRESTA:
To walk you through it. An example, yeah.
BILL KRISTOL:
I agree. And maybe for that reason, partly for that reason, even though it passed by big majorities, it hadn’t sunk into the public’s mind that this was really important to do. Once there was a pushback, all that collapsed in like 10 seconds, basically. And Trump has decided to do the opposite. Maybe that’s because of personal financial incentives for him and so forth. But anyway, there he is, and there’s no real pushback to the pushback, it doesn’t seem to me.
So it gets to the question of the power of these platforms, and I think it is, TikTok would be just one example. TikTok, which you’d think would be not very powerful because it is not owned by Americans, and it kind of explicitly would be the easiest to criticize, turns out to be quite powerful. And they all turned out to be very powerful. And I think that’s such an important…
Well, let’s just talk about that for a minute. I mean, a friend of mine made this point, maybe you’ve said versions of this in the past too: if someone had come to me 30 years ago and said, “Hey, one very rich guy has bought the Washington Post. Another very rich guy has bought this sort of weird Twitter thing.” I mean, which is going to have more influence over America in the next decade? Which is going to be the more major figure in American life and in American politics? Which is going to be sitting in the Oval Office running half the government right now? That’s a slightly flukish thing, you might say, but Trump’s not an idiot. I mean, he knows how powerful Musk is. And we would’ve all said, “Oh, owning the Washington Post, that’s so important. This Twitter stuff is sort of owning…” I don’t know what it would be. Something on the side, kind of…
RENÉE DIRESTA:
A tabloid?
BILL KRISTOL:
Yeah, yeah. Maybe that’s a good example. Owning a tabloid, owning the Star, whatever that thing in the supermarkets was, as opposed to the New York Times. It gets a lot of readers. It can sort of persuade people for two days that some story about an actor or an actress is true, but it doesn’t change our country much, whereas the New York Times… So it’s the opposite, right? I mean, that is really striking to me, if you step back and sort of try to think about this phenomenon, the power of these platforms and these companies.
RENÉE DIRESTA:
Well, I think some people, some governments began to realize that in 2012 during the Arab Spring, when governments toppled as a result of social movement momentum that began on Facebook and Twitter, and they were tools for galvanizing people.
One of the interesting things… I think people, when I say the word propaganda, it’s so linked to Chomsky in America, this hegemonic persuasion. But so much of it is related to activation, just getting people who already believe a thing to act on it. And when you have the capacity to sort people algorithmically and to show them content that is tailored to them and get them to… And when they begin to…
I want to make really clear here also that the public is not passive. People who are on social media have a lot of agency. I think everybody thinks like, “Well, I have agency when I’m on social media, but those other people over there don’t.” No, that’s not true. Everybody’s experience is just like yours. And so when we say, I always think about the phrase, “it went viral,” like it just magically happened. No, people hit buttons, they did things. And I think we don’t think in terms of collective behavior, but that is what social media is for. It’s like assembling a murmuration of starlings or a school of fish, where all of the sudden, everybody spontaneously assembles in a particular direction. The cues, the social cues that bounce from person to person, that’s what social media is for.
And so, just like I said, so 2012, some governments realize it, and ISIS realizes it, to draw that example, and then Russia realizes it. Russia invests very heavily in online propaganda to justify its invasion of Crimea; to support the little green men, you have social media flooding the zone. The Saudis do it. This is the sort of flood-the-zone, make-it-too-hard-to-know-what’s-true approach. And that’s where you start to see the realization that these are tools of power.
My experience with the anti-vaccine thing in 2015, again, as an activist, was like, okay, guys, we’re about six years behind building up a counter movement, an ecosystem here. The algorithm is already keying off of when you type in “vaccine,” it assumes you mean “anti-,” because no pro-vaccine person is searching for vaccine groups on the internet. And we’ve got a lot of work to do. And that investment doesn’t come from people who see themselves as the established majority, because they already have the mainstream media, so to speak. And that phrase that Elon uses, “We are the media now,” that came out of QAnon, just to be clear. And I’m not saying that it’s a conspiratorial… I’m not saying that to connect him to QAnon in some sort of woo, conspiracy theory. No, I’m saying it because it comes out of this sense that we can collectively assemble and have power and change the world. And that is the attitude of people who get deeply into factional, social media politics. And I say it as somebody who really, deeply understands it, because I did it too.
And it’s because it’s fun, and it’s engaging, and it gives you a sense of meaning and a sense of belonging, and it gives you a sense of mission. And it is how every single important political battle is going to be fought from here on out, because it leverages an entire organic army of people. The same time I was doing the vaccine thing, Gamergate was happening. And then the Trump, MAGA3X, Pepe crowd was doing their thing. And it was just watching these… It’s not like a top-down, coordinated thing. It’s a bottom-up, organic, improvisational process.
And I think that people who are institutionalists and are inherently… And I am one, we need institutions for society. But you also have to understand that this dynamic, it’s much more of an insurgent type of dynamic than a controlled brigade force, if you will. And harnessing that force in support of institutions is maybe a little bit more challenging because so much of it is priding itself on this idea that, “we are the media, we’re in control, we’re leading.” And that’s why it does have this sort of inherently anti-authority, anti-expertise sentiment that lends itself particularly to sort of populist messaging and values.
BILL KRISTOL:
I think that last point’s so important. Yeah, that was really my next question, but I’ll ask you to follow up a little more on that even. Because it does seem, okay, you can get this, so let’s call it organic conspiracy theorizing. Certainly that has existed over the centuries. And in the age of the internet, they can find each other more. It can be intensified by the algorithm. It can find villains to attack. You add to that, though, powerful people or powerful governments purposely using that for political purposes, political in a broad sense as well as actual elections. But I mean broader social and political purposes. That becomes very powerful. And it isn’t quite even. I mean, I’m not saying people on the other side shouldn’t therefore study this and think about, “Well, how do I counter that?”
But just as you were saying exactly, I mean by nature of the algorithm and of what we’re talking about I guess, the extremists, the conspiracy theorists, the alarmists, are going to have an advantage over the sober, “Well, actually, it’s complicated. And yeah, of course everything has a tiny risk, but it’s a hugely positive thing on net.” Whatever the… It’s just, Trump and Elon Musk and Roger Stone are going to have an advantage over Kamala Harris and some earnest liberal who believes that we should expand healthcare some. I mean, right? And it’ll be true with the left too. I mean, you could have Islamists—
RENÉE DIRESTA:
Oh, the left is so bad at this. I can go on about that forever.
BILL KRISTOL:
But the far left you’d think would be… In theory, shouldn’t be worse than the far right, but maybe it should be.
RENÉE DIRESTA:
No. The far left, the “rose in bio” brigades, is how I tend to think about the… The Bernie Sanders people were pretty good at it actually. And I tell the story in the book of some of the ways in which they galvanize activism. But there’s that meme, “We should improve society somewhat.” I don’t know if that resonates with you… You can’t see the picture in your head? Okay. I can.
BILL KRISTOL:
It’s funny though. It’s a funny meme. I hadn’t heard that before.
RENÉE DIRESTA:
It’s a peasant holding a bag of sticks, really downtrodden, saying, “We should improve society somewhat.” And then there’s… Anyways, it’s sort of a… I think there’s this point where you become too online. And I’m definitely there. No, but it gets at this thing where if you are used to having a thousand words to lay out an argument, where you will have your caveats and you will do your throat clearing, and you will get all of your points out there, you’ll address the other side’s argument somewhere in some paragraph, three down or whatever, and that’s how you make your case to the world, that’s a very different approach than communicating on social media. Because on social media… So in order to get that piece out there, you have to appeal to one person basically, the editor, who is the gatekeeper who’s going to get your thing out there. And then there it is.
In order to get it on social media, there is no gatekeeper. You’re just going to type it, put it out there with your thumbs. But the gatekeeper, if you will, is actually… It’s that the fellow members of the crowd are going to collectively decide that your thing is good. Because the algorithm is going to push it out to some number of people, not all. And people often will think like, “Oh, I’m being censored. My friends don’t see all my posts.” No, it’s that it’s going to push it out to some number of people to gauge if it’s good, and if they like it, it’ll push it out to more. But what’s happening in that process is you’re competing for attention. And if you’re throat clearing and caveating and saying something nuanced, it’s not going to go anywhere. If you’re saying something strident, if you’re saying something morally righteous, if you have a particular tone, if you’re sarcastic, if you’re shit-posting, right?
That’s what people want to see because they’re there… They want to get their information through this entertainment also. And a lot of times, unfortunately, people who are much more accustomed to giving a press conference or having their credentials sort of speak for themselves, the ways that people with expertise communicate are not necessarily cut out for being in that attention game. And you do see more and more doctors who are now… There’s a dermatologist that I see on Threads all the time, and he makes funny reaction videos to people on TikTok, just kind of mocking the wellness, woo-woo people who are putting weird shit on their face. And I love it actually. I think it’s really funny. I always watch his videos because he’s just charismatic, he’s interesting, and he uses his expertise in a funny way, by reacting to this kind of BS, in the same format that they’re doing. And so he’s not writing a whole blog post about it. He’s putting the content right there.
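A minimal sketch of the engagement-gated distribution described above: a post goes to a small sample first and only expands to a wider audience if early engagement clears a bar, which is part of why strident content tends to travel further than caveated content. The audience sizes, threshold, and engagement rates are illustrative assumptions.

```python
# Toy illustration of engagement-gated distribution: show a post to a small
# sample, expand only if early engagement clears a threshold. Numbers are made up.

import random

def simulate_spread(engagement_rate, audience=100_000, sample=500,
                    threshold=0.05, rounds=4):
    """Return how many people ultimately see a post with a given appeal."""
    reached = 0
    batch = sample
    for _ in range(rounds):
        reached += batch
        engaged = sum(random.random() < engagement_rate for _ in range(batch))
        if engaged / batch < threshold:   # not compelling enough: stop expanding
            break
        batch = min(batch * 10, audience - reached)
        if batch <= 0:
            break
    return reached

random.seed(0)
print(simulate_spread(engagement_rate=0.02))  # nuanced, caveated post: stays small
print(simulate_spread(engagement_rate=0.15))  # strident, outrage-bait post: keeps expanding
```

The mechanism is content-neutral on its face; the asymmetry comes from which posts reliably clear the engagement bar.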
BILL KRISTOL:
But even if… I mean A, that could be… Yes, it’s good to have, I would say, in my opinion, to have people who are good at the medium—
RENÉE DIRESTA:
Both. You want both, yes.
BILL KRISTOL:
But at the end of the day, it’s still a disadvantage. Because in a way, he still can’t promise you that this will magically change your life or change your dermatological condition—
RENÉE DIRESTA:
That’s absolutely true.
BILL KRISTOL:
And in that respect, I mean, look, from Thucydides to the Federalist Papers, and to the 20th century and Walter Lippmann, everyone’s understood that, in democracies, or just in life maybe, demagogues have an advantage over sober people. And am I wrong to think that the internet, just by its nature, the algorithms very much by their nature, the self-reinforcing character of it, strengthens the demagogues vis-a-vis the sober people more than was once the case? Or more than was… I mean, of course, God knows with the thirties and stuff, you can’t overstate this and generalize, and say that it couldn’t have happened in the past, that it didn’t happen in terrible ways in the past. And so that’s a good check on the kind of glorification of, the nostalgia for, the past and thinking everything is totally new. But it does feel like… And the final point, you can talk about any of this you want, I mean, the conspiracy stuff strikes me as particularly… I don’t know, it seems to fit nicely with the algorithmic way of being, that I mean…
And that strikes me as something in the past. Again, people could believe all kinds of wacky things, but again, they got punctured, it seems to me, at some point. Maybe they didn’t. I don’t know. So talk about any aspect of that you want. I think it’s interesting.
RENÉE DIRESTA:
I want to touch the conspiracy thing because that relates to trust. And I want to talk about two sides to that. But let me answer the part… So hold that in your heads, in case I forget it. But the thirties, the demagogue thing, I was thinking about Father Coughlin, who I have in the book as this interesting case study, because he is the man for the medium. He was this radio priest. And I pulled all the old New York Times archives from when Coughlin was rising to power, and I pulled some of the books about him. Huey Long, these sort of firebrand, demagogue kind of figures coming about. And for those who are not familiar, Coughlin is a Catholic priest who has a radio address, weekly radio address. He starts out as very pro Roosevelt, moves to being very disillusioned. The Great Depression is happening.
He’s tapping into the sentiment of people who are destitute. But then he also moves into eventually becoming a Nazi sympathizer. He’s writing letters to Mussolini, and praising Hitler and saying, “The Jews brought it on themselves.” And these sorts of things. And eventually what you see happen is, people have to try to figure out what to do about that. And the interesting thing about Coughlin though is that, normal people can’t be counter speakers in that moment. Normal people aren’t on the radio. So, the one thing social media does offer is the opportunity for instantaneous counter speech.
Whereas what happens with Coughlin is, they try to make him pre-clear his broadcast. So they essentially appoint an official censor to go there and see if he’s cleared to speak, which kind of has some parallels to content moderation. They have a person, essentially a fact-checker, come on right after him to say, “Father Coughlin lied to you about the following things.” And then eventually, the Catholic Church kind of yanks him and is like, “Okay, you’re done.” And I tell the story in the book in kind of greater detail because it really is an incredible parallel to the modern moment. On social media, the theory is that counter—
BILL KRISTOL:
Just on the thirties for a second. Of course, the counter, that worked out okay in the US ultimately, sort of. Mostly—
RENÉE DIRESTA:
Well, I mean there was a war.
BILL KRISTOL:
Yeah. Well, no. And the reason there was war is because of Hitler mostly. And the reason Hitler succeeds in large part, people thought at the time, and I have no reason to doubt it, is because he’s the first, in Germany at least, maybe in continental Europe in general, to understand the potential of radio. And he goes above the institutions and gatekeepers’ heads, and speaks directly to the German people. And so, as you say, the good news is we have more ability to answer such things instantaneously, in a way. Whereas once you seized control of the institutions of government and of power, and combined that with the populist demagoguery, you were apparently untouchable. Now, there are some unfortunate parallels to that today, in the sense of people are combining the demagoguery with the algorithms, with the institutions of power. But whatever.
RENÉE DIRESTA:
There was a representative who… There was a congressional hearing I watched yesterday in House Judiciary, of Jim Jordan doing his circus again about censorship-industrial complexes and whatnot. And one of the minority members made the point that the social media state had merged with the actual state.
BILL KRISTOL:
Which is more dangerous.
RENÉE DIRESTA:
Which is more dangerous. And that’s where—
BILL KRISTOL:
They have power.
RENÉE DIRESTA:
Yes. And now the man who is giving addresses from next to the Resolute Desk is also the man moderating what voices are heard on social media. And so, yes, we have had this bizarre and alarming thing that, had it happened on the other side, Jim Jordan would’ve convened a hearing about. But that’s not what was actually happening in the hearing I was listening to. But no, the point has always been, bad speech is refuted through counter speech. And one of the challenging things with social media is just, structurally, how does counter speech work? It is much more possible. People can absolutely do it. They totally should. This was why I really wanted the institutions out there on social media, particularly during COVID… And this will connect to the conspiracy thing. Here’s how I’ll connect it for you, right?
During COVID, very often the institutions would wait to speak until they were sure. You saw this in the earliest days of COVID around, how dangerous was it? You saw it with the conversation around masks. Did we need them? Was it airborne? All of these things where people who were paying attention to Chinese social media, were realizing that doctors were dying. There were massive morgues being set up. It was pretty clear that something was very, very dangerous. People were studying past outbreaks and were saying… Influencers on social media were saying, “We should be wearing masks.” And the CDC was kind of not saying anything. And so there was this sense of them almost… And then they would come out a couple days later when they did feel like they had data and evidence, and they would come out and say something. But this gave the sense of almost leading from behind.
And that’s because science doesn’t happen on the timeline that social media virality does. And this is just the reality of the world. But you can do a little bit more to be out there and say, “This is what we know right now. Our information can change tomorrow. Here is how we are thinking about this particular crisis in this particular moment.” The other thing though is that, per your point a little bit earlier, the doctors have to get it right. And the influencers, if they don’t get it right, nobody actually ever holds them to account. Occasionally, someone will make a screenshot and be like, “Is this you with your stupid take that was completely wrong?” But that doesn’t actually often happen in any material way, because it’s very tribal at this point. And with the conspiracy theories in particular, one of the reasons that… So institutions are both not particularly well adapted to the moment, and not particularly… They’re trying now, they’re starting to get a little bit better at it.
You’re seeing this in some of the emergency responses to the fires and things like this. You’re seeing them start to come out and push back a lot more directly, realizing that it doesn’t do any good if social media opinion is formed and solidified, and then you give your fact check two days later. So you’re seeing them start to shift. But what was happening there was, the conspiracy theory would start, and people who trust the influencer… The influencers in that particular conspiratorial universe are constantly emphasizing that the institutions are corrupt, wrong, and just full of it. Listen to how Elon describes USAID. I mean, it’s insane. Regardless of what you actually think about USAID, the man standing next to the Resolute Desk should be giving out accurate facts. And instead, any random anon who misreads a spreadsheet in a way that appeals to Elon’s sensibilities gets retweeted.
It’s seen 30 million times. And that becomes… Because these people are so distrustful of institutions, the conspiratorial view of the universe holds. And unless they hear it from Elon as a correction, or maybe in the form of a community note as a correction, they don’t actually trust the correction. They don’t believe the media fact check. It never breaks through. Some of the people who are out there making the boldest statements… “USAID is behind the Trump impeachment,” is one of the stupidest stories out there. But Mike Shellenberger wrote it in his newsletter, and then Mike Benz said it on Joe Rogan. And this is audiences of millions of people, and they write very carefully, but they say the thing, and know their audiences trust them and are not going to believe the correction. And they’re not going to believe… They’re never going to walk it back. But that’s the sort of divide that we live in at this point. There’s no evidence offered beyond the theory, but that’s where we’ve kind of gotten to.
BILL KRISTOL:
Well, or some sort of fake evidence of some, as you say, misread spreadsheet. Or they actually like evidence, in the wrong way. Right? Conspiracy theories always like… Joe McCarthy waved around the lists. I mean, they like to have a datum, I guess, that’s not real, or that’s an exaggerated or totally made-up point. But they always want to have… It’s funny how they—
RENÉE DIRESTA:
That is true. Usually, there’s a screenshot. There is a—
BILL KRISTOL:
Yes, there’s a thing.
RENÉE DIRESTA:
This is what happened… In the conspiracy theories about us, they actually… So when I was at Stanford Internet Observatory, the same people I just mentioned, Shellenberger and Benz, came up with this theory that we had censored 22 million Tweets. And the evidence of that was in a report that we wrote. We added up the number of Tweets in the most viral stories alleging fraud in the 2020 election. And the number at the bottom of the table was 22 million. And so they highlighted it, circled it in red, screenshotted it, decontextualized it, and that was where, “They censored 22 million Tweets,” came from. So there is a decontextualized fact that serves as the evidence to backstop the conspiracy theory, basically.
BILL KRISTOL:
Yeah. No, that’s very interesting. Conspiracy and trust, were you going to say any more on that?
RENÉE DIRESTA:
Oh, yeah.
BILL KRISTOL:
I was supposed to remind you. I’m dutifully doing my job here. I’m very… Yes.
RENÉE DIRESTA:
No, the trust point is exactly that, right? It is that we live in very different universes at this point, where if you… Because eroding confidence in institutions and repositioning yourself as the source of authority and trust is good business for somebody who’s trying to sell Twitter or Substack subscriptions or whatever. That’s where you start to see that. This is much more a problem of trust than anything else. It’s very divided, and that’s why that sense of how do we put reality back together? Well, people who believe Elon, when he says that there was, what was it, $50 million in condoms to Gaza or something, these are just wild theories that come out of nowhere. And the fact check doesn’t reach people.
BILL KRISTOL:
It takes it to another level, as you say, when he’s standing in the Oval Office, and then when a whole bunch of government organizations, and huge numbers of people in government who want to keep him happy, have a big incentive to go along with these things.
RENÉE DIRESTA:
Well, it’s actually that—
BILL KRISTOL:
And so it’s not just Elon, it’s the Cabinet Secretary, the Secretary of HHS, whoever.
RENÉE DIRESTA:
Well, it’s that they can act on them now. So 10:30 this morning was the RFK Jr. confirmation hearing. I assume he sailed through. I assume it went through on party line. Maybe we’ll be pleasantly surprised when we’re done. I doubt it. But now we have a person who has been propagating the vaccines cause autism conspiracy theory leading Health and Human Services. And that’s because, again, in 2015 when he showed up in California to fight against our bill, he was still seen as kind of a crackpot. And then during COVID, he became the anti-Fauci, if you will, the alternate equivalent, the person who was held up as this is the person who’s really telling the truth, if you believed that the vaccines were going to hurt you.
BILL KRISTOL:
It is really startling, isn’t it? And I guess what happens is two things at once. That is the fact that he’s going to be confirmed as HHS Secretary today, assuming that’s happening as we speak here on February 13th. If you already were inclined in that direction, this is vindication. The United States Senate said it. You may not have a very high opinion of our institutions, but still you probably have some residual sense that they’re important. And the majority of senators voted to confirm him. How can you say that he’s just a crackpot? And then if you’re in the middle and you’re not sure, maybe there is something messed up with our food and too much sugar and whatever artificial ingredients. But on the other hand, some of the stuff sounds a little wacky. But I guess the Senate confirmed him, so maybe there’s more to what he’s been saying than I thought. I do think it affects both the true believers, as it were, and the in-between people. And this is where the political power really magnifies the effect of the—
RENÉE DIRESTA:
Exactly. So that’s where we are now. So that acceleration, now there’s the capacity to act, to drive policy and to make decisions that impact families and children based on those theories.
BILL KRISTOL:
That’s not great. Let me ask you this, just one final maybe question, not final question, but the final theme of this. So internationally, the EU has taken an attitude of greater regulation, I guess is the way to say it. You should maybe explain that a little bit. Has it mattered much, or has it worked, or is it a model for us, or do we have to rethink things from the ground up? Where are we in terms of doing something about this?
RENÉE DIRESTA:
So in the realm of social media, a lot of people ask, “Well, why don’t we regulate it?” And it’s very challenging. Most of what I’ve described is just speech. You can be a conspiracy theorist. The First Amendment protects you, so it is what it is. It’s the downside that comes with the upside of freedom. And again, the theory is supposed to be that counter-speech works, and we’ve seen that counter-speech is struggling right now, but maybe we’ll improve on that. But Europe doesn’t have the same … There are certain things that are a no-go in Europe around hate speech, different types of hate speech laws. Certain types of content are illegal in Germany and France, for example, that are legal here in the US. And so platforms have always had to adjust their moderation policy and their content curation regionally to serve particular markets.
So Europe is the second largest market. And the European Union passed this thing called the Digital Services Act, and this is a regulation … You know, Europe’s kind of interesting. They pass the theme and then get to the details a little bit later. And so now they have hit this point where there are two big provisions in this law that are going to become controversial here in the US. So one is what’s called Article 40, and it’s the right of researchers to have access to data. If you’re a European researcher, and they have some sort of vetting process, and then those researchers are listed—I think journalists and other people can apply also—you have the right to request data access to study content related to systemic risk to the European Union.
Nobody really knows what systemic risk means. This is something that we’re very uncomfortable with here in the US. They’re much more comfortable with that ambiguity in Europe. Anyway, somebody actually requested this. These German researchers requested election data from X related to the upcoming German election. You’ll recall Elon hosted the AfD candidate, a woman, on X. And so these researchers are requesting some engagement data from X related to the German election. And X didn’t turn it over, which it is obligated to under this law. And they sued, and they just won, and so now we wait to see what the enforcement looks like. Is Elon just going to be like, “No?” And if Elon says no, then the Europeans have to enforce, because regulation is only useful so long as it is enforced, and the enforcement is penalties in the form of, I think it’s, gosh, 6% of some form of … I don’t recall the specifics of the revenue. I don’t remember the details, but they’re trying to make it enough to be painful to ensure compliance.
There’s another aspect that is not under the DSA itself, but there is also a disinformation code of practice that is voluntary. The platforms can choose which aspects of it they participate in. If I’m not mistaken, X participates in none, but Meta does participate. Anyway, a lot of the American tech companies, well, not a lot, Meta and X in particular… You heard Mark Zuckerberg reference the egregious European regulation, the onerous regulation, in his recent speech talking about how he wanted President Trump to help him take on unfair regulation. And he name-checked Brazil, which went to war with Twitter and briefly shut it down. And then Europe, which is referencing these compliance regimes related to data access and occasional take-down requests.
They’re also supposed to produce transparency reports, risk assessments, and other things to remain in compliance in their European markets. Anyway, it’s very boring. It’s extremely in the weeds. Those of us who do tech policy find it fascinating, and I think I’m going to do a podcast on it on Lawfare to get into the nuances. But JD Vance was just over in Paris for the AI Summit, and he referenced this in his speech, because what you’re starting to see in the US—
BILL KRISTOL:
He referenced it negatively.
RENÉE DIRESTA:
Negatively. And Jim Jordan—
BILL KRISTOL:
We should not go down this ghastly European road of trying to knock down conspiracy theories.
RENÉE DIRESTA:
Oh, no. It’s actually a little bit more than that. It’s not that America shouldn’t copy it. It’s that American companies shouldn’t comply. It’s that it’s unreasonable for Europe to make these requests of American companies. And so there’s a reframe of this as the Digital Censorship Act. And in Jim Jordan’s hearing, some of the questions from the Republican caucus were, “Why are our allies doing this to our companies?” You had some of the witnesses talking about are they really our allies? They’re tossing around NATO. They’re tossing around tariffs. They’re alleging that if Europe asks American companies to moderate content related to Europe, then that impedes American freedom of speech. That, I think, is an incredible stretch because you can gate content where it’s invisible in Europe and still visible in the US.
So this is a weak argument, but the problem is that’s going to be the argument that goes viral, these rumors that are not true, but nobody’s going to care about the actual nuance of the law. They’re just going to see this narrative. You’ll see them start to beat the drum that Europe has a vast censorship regime, and this tyrannical overreach is going to silence American speech. And they’re going to just use it to, I imagine, try to get other concessions in some negotiation. But that’s where that’s going now. So no, Europe has a regulatory regime. I have mixed feelings about it. Some of it’s good, some of it’s vague, not good. But that’s going to be the next battle that I think we’re going to see.
BILL KRISTOL:
I don’t know that we’ll see much in the next three or four years, you think? With Trump in the White House and Republicans, for now, controlling Congress, I suspect the—
RENÉE DIRESTA:
No. Well, the Republicans want to regulate social media, just to be clear. Not the way that Europe does. Europe wants disinformation and conspiracy theories and misinformation and all that stuff to come down. That is not what the Republican Party in the United States wants. Quite the opposite. But they have other things that they’re going to be looking for. You’re going to see some child safety legislation go through in particular ways. I think you’re going to see different types of regulations related to… What were some of the other ones? Gosh, they’ll bring back KOSA, they’ll bring back SHIELD. There’ll be a few acts that will come back up, but nothing related to transparency and nothing related to any of the stuff about good information.
BILL KRISTOL:
It is funny how much it was a conservative thing to hate big tech a few years ago, and Zuckerberg was roasted by the Republicans, and it’s funny how that changes so quickly.
RENÉE DIRESTA:
Well, I think Brendan Carr tosses around these ideas.
BILL KRISTOL:
He’s the new chairman of the FCC, huh?
RENÉE DIRESTA:
Yeah, I’m not sure what he’ll do, but he’s written some letters. Again, the weird weeds of tech policy, but Zuckerberg settled a lawsuit with Trump. Trump sued him after he was kicked off the platform after January 6th. There is no universe in which this company would have settled that lawsuit for anybody else. It was unambiguously, absolutely the platform’s First Amendment right to kick the President off. That is how the law is structured. But instead, they settled and they gave him a $25 million donation to his library to get in the tent, is how Trump put it. We call that a bribe.
BILL KRISTOL:
So that’s politics. We won’t get into the politics here. So final thoughts? So it’s not entirely… I don’t know. On the one hand, you’re impressively cheerful about the topic in general. Not about the topic, but in your manner of discussing it. And I guess as you said, it’s very important to remind people things weren’t great before. We had horrible demagogues succeeding. We had conspiracy theories. We had craziness. And there were good sides. I was very much on the good side of the internet in the ’90s and early 2000s, as many, I think, conservatives were. We weren’t that friendly to the establishment media. We thought the more people the better. And I just remember going to speak at college campuses and being so pleased that students, this is much more simple-minded, but could have access to my friend Charles Krauthammer’s columns because of the internet, even though the columns were in The Washington Post and the students were in college in Ohio somewhere.
And so that’s the simple, hey, the more information, the easier it is to access, the easier it is to check things. Wikipedia, which some of my friends were very worried about, you remember the old days, you can’t cite Wikipedia. It’s going to degrade the whole profession of encyclopedias, if that’s a profession. And I remember always thinking, I don’t know, maybe it’ll be okay. And it turns out this is a typical case of everyone being wrong about everything. Everyone who was so worried about Wikipedia totally missed what was really happening in social media. Wikipedia is fine. Wikipedia is not destroying America. It’s not fine in every instance. God knows what your page or my page looks like. I never look at it. And there are issues with it, but it hasn’t done the damage that the stuff that everyone liked, Facebook and stuff, has done, I think.
So it’s funny how these things don’t work out the way one expects, which is a good lesson for us thinking about the future. But I don’t know, give me a brief, in closing if you want, I don’t know, where might we be going on this.
RENÉE DIRESTA:
Optimism?
BILL KRISTOL:
Oh, let me put it this way. Do we understand the problem and now it’s a question of whether we have the will to do something about it or understand how to do something about it?
RENÉE DIRESTA:
Yeah, I think we do understand it.
BILL KRISTOL:
Or is there another wave of developments coming, AI and deep fakes? Where are we in this technological revolution? Maybe that’s a good way to put it. Or in this moment, in this era.
RENÉE DIRESTA:
So the tech never stops. The same way recommender systems look a little bit different now than they did in 2015, things are always changing. There are new features, new platforms. The stuff I was talking about earlier with regard to Bluesky, these new protocol-based platforms, is a whole different type of architecture that’s really interesting. AI changes constantly. That’s where, I think, some of the trust issues get really serious. It’s very hard to know what to trust. There was this Kanye West Super Bowl ad that led to this terrible site with a swastika T-shirt. And then there was this video made in response, a highly, highly plausible, incredibly realistic-looking video of celebrities wearing shirts with the middle finger up, alluding to it. And they really looked like the celebrities that we all know. And it turned out to be a deep fake. And so the law hasn’t quite caught up in some ways, with the ways that people’s images are misused. But I think, again, it comes down to trust and adaptation, as far as the social media ecosystem. And participation. I really think—
BILL KRISTOL:
But you think this is what we see— Of course, it could go further in terms of the technology and deep fakes.
RENÉE DIRESTA:
Well, the technology is just going to keep changing.
BILL KRISTOL:
But what we see, you don’t think we’re on the verge of another fundamental change like we had in the ’90s and 2000s?
RENÉE DIRESTA:
Oh, I mean— Every now and then you get the, well, then we’ll have augmented reality, and Zuckerberg’s investing in the Ray-Bans and things. AR, or VR, has been the next big thing for 30 years now. Who knows if it’ll actually get anywhere? But no, I think this information environment is really important. Getting people participating and thinking in terms of how do you get messages out and capture attention? And the most important part is that networked process, where people share each other’s content and move things along to make sure that messages reach people. That’s the part where I think institutions haven’t really invested in being a part of that process, and that’s what needs to happen.
BILL KRISTOL:
No, that’s really—
RENÉE DIRESTA:
Credible information out there.
BILL KRISTOL:
People do still wish away the problem, or assume it’ll take care of itself. And I think what you’re saying is—
RENÉE DIRESTA:
Or it’s some—
BILL KRISTOL:
It is what it is and it’s got, of course, pluses and minuses. But if we don’t get serious about some of the minuses or about at least understanding better what we’re seeing out there, we have a big problem.
RENÉE DIRESTA:
We’re not going to get regulation here in the US. People just need to stop thinking that there’s going to be some magical regulatory fairy that’s going to wave their hands and it’s going to get better. It’s not. That participation and that counter-speech piece is really, really critical.
BILL KRISTOL:
So, it’s up to us. Any last words, Renée? This has really been fascinating.
RENÉE DIRESTA:
Thanks for having me.
BILL KRISTOL:
No, thank you for joining us on Conversations, Renée DiResta, and thank you all for joining us on Conversations.