Social Media’s Big Tobacco Moment

Speaker A: These are live images outside the courthouse in Los Angeles, where a verdict has been reached in a potentially landmark social media addiction trial. Meta and YouTube, essentially on trial and, based on a jury's finding on Wednesday, liable for damages. The jury in Los Angeles Superior Court said that Meta and YouTube's negligence was a substantial factor in causing harm to the plaintiff, who went by her initials, K.G.M. The jury also said the companies failed to adequately warn users of the dangers of Instagram and YouTube. New York Times tech reporter Ryan Mac was at the courthouse. I asked him to describe the scene for me.

Speaker B: When the verdict came down after nine days of deliberation, there were camera crews outside who were also very tired of waiting for this verdict. There were three sets of lawyers, and also these families of social media users who had died, or died by suicide, after having bad experiences on social media. Those families were also around the courthouse, but everyone was just kind of waiting for this thing to come in.

Speaker A: Nine days of deliberations is a pretty long time. Were you expecting this verdict?

Speaker B: You know, after a certain point, I think I was expecting something to happen, at least a verdict against these companies. I think it would have been easier for the jury to go into deliberation and say, oh, there's no big deal here, let's get out of here, and no one's liable. But they really took their time going through the evidence and everything that happened in the trial. I think they understood the gravity of this case, which is a bellwether case, one of nine to be heard in LA regarding social media, and they took that responsibility pretty seriously.

Speaker A: What was the reaction like, certainly among the families when the jury returned its verdict?

Speaker B: The judge had issued some warnings about not displaying too much emotion, or you'd get kicked out of the courtroom. But there were smiles, there were kind of stifled cheers, stifled outbursts. These people had been waiting for a long time. And while these families weren't the family members of the plaintiff per se, they viewed this as a kind of proxy justice for their own loved ones. Those loved ones had used social media, had negative experiences, and, according to them, maybe died after using social media. So they saw this as justice against these massive companies that they had no other way of holding to account.

Speaker A: This is actually the second court case that Meta lost this week. On Tuesday, a New Mexico jury found that Meta had failed to protect kids on its apps from child sexual exploitation. So Meta has lost in court twice this week, and not even a year ago, the idea of losing big cases, let alone this many this quickly, might have seemed unfathomable. The same goes for YouTube. So I wonder what that tells you about how we think about online platforms, or maybe how juries think about online platforms, in 2026.

Speaker B: It's strange. I've reported on Meta for almost a decade now, and the thought of this happening seven or eight years ago, when I was really covering the company seriously, is unfathomable. And it's sort of like London buses, these two things coming at the same time. The New Mexico verdict that you're talking about was $375 million, in a state case about child exploitation. It's a bit apples and oranges compared to the verdict that we got in LA, which was about social media addiction and whether these companies designed their platforms to be addictive. But it does represent something of a shift, a ground shift, in holding these companies to account.

Speaker A: Today on the show, Big Tech loses in court. Are juries ready to hold companies accountable when Congress won't? I'm Lizzie O'Leary, and you're listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick around. Let's step back a bit. For people who haven't been following this as closely as you have, can you tell me the story of this bellwether case in LA and the story of this plaintiff?

Speaker B: So we'll start with the plaintiff. The plaintiff is a 20-year-old woman who goes by the initials K.G.M., and her case was filed, I believe, about four years ago, when she was still a minor. So throughout the proceedings of the case, and even after it, she has remained anonymous. She also goes by Kaylee. What we know about her is that she was in court, she testified, and she's from Northern California. She has used YouTube since she was about 8 years old, I believe, and Instagram since she was 9, and she has a lot of mental health issues she says are attributable to the use of those social media platforms, things like body dysmorphia, for example, and anxiety that she says would not have been as bad, or even possible, without social media. In terms of the case itself, it's called the bellwether case because it's kind of the first of many. It sets the tone. It's one of many, many cases that have been filed against these companies, and it just happened to be the first selected to go to trial in Los Angeles. There are eight other cases set to go to trial, and there are other cases in the Northern District of California as well. So we're about to see a period where these companies, Meta and YouTube, or Alphabet, its parent company, are constantly on trial. This lawsuit also originally included TikTok and Snap, although those two companies settled before the trial began. But it could be this kind of golden age of litigation against these companies.

Speaker A: What is so key about this case is that it was about addiction and design of the platform, not the content itself. And I think that is an incredibly important distinction. I wonder if you could explain why that’s so key and why that legal strategy was so important.

Speaker B: So that legal strategy is very important because it completely avoids Section 230.

Speaker A: Section 230 of the Communications Decency Act.

Speaker B: Correct. And Section 230, which is a decades-old law, says that companies are not liable for the content that their users post. So if a user, for example, posts a death threat against another user, Instagram or Reddit or wherever is not responsible for that piece of content, even if it violates other laws. So the plaintiff and her attorneys took a look at this and said, we don't want to go up against that, because the precedent there is so strong that the case would just get thrown out on those grounds. What they went after instead was the design of these platforms, and whether the executives of these companies were aware that the design of those platforms could lead to negative outcomes or social media addiction.

Speaker A: This is the big tobacco philosophy, basically.

Speaker B: Yeah. And it was, based on the verdict, a winning strategy.

Speaker A: Were there any moments in court that really stood out to you? Particularly around the sort of internal communications they had?

Speaker B: Mark Zuckerberg and Adam Mosseri.

Speaker A: Head of Instagram.

Speaker B: Yeah, head of Instagram. And it's interesting, because you watch and read this testimony, and they're trying to thread the needle: they feel that they're doing the best they can for kids, but they also don't want to admit wrong or fault. It's this kind of weird dance that you get with these executives, Mark Zuckerberg especially. He doesn't testify very often, so it was always a spectacle when that happened. I was there for closing arguments, and something that really intrigued me is this larger idea of how you hold these companies accountable. There were two phases of this trial: the first was to determine liability, and the second, much quicker phase was punitive damages. Once the jury established liability, what was going to be punitive to these companies? What could actually punish them? As someone who's reported on these companies for years now, I understand that these are the wealthiest, most powerful companies in the world. How do you punish a company with a $1.5 trillion market cap, in Meta's case, or a $3.5 trillion market cap, in Google's? Ultimately, the jury didn't really grapple with that, I think, but it's something that will come up in later cases as well: what will prevent these companies from doing this again, and what is the appropriate financial penalty to assess against them?

Speaker A: Both companies say they will appeal. Meta sent us a statement saying teen mental health is profoundly complex and cannot be linked to a single app: "We will continue to defend ourselves vigorously, as every case is different, and we remain confident in our record of protecting teens online." In terms of damages, the jury awarded the plaintiff $3 million in compensatory damages and another $3 million in punitive damages. The jury said Meta should pay 70% of that and YouTube 30%.

Speaker B: But for companies with market caps north of a trillion dollars, well, it's a drop of a drop in the bucket. It's not even a slap on the wrist. It's a tap on the wrist. $3 million, or $6 million, or however many millions you want to call it. I feel like they've probably spent more on lawyers and legal fees in this case than they will have to pay the plaintiff. And it was very interesting watching this trial, in that when the verdict came down initially against Meta and YouTube, you could kind of see the faces of the lawyers drop, especially with YouTube. I don't think they were necessarily expecting this. With Meta, I think there was a stronger case against them, and there was a lot more focus on Meta. But after that first phase of the trial on Wednesday, I think the companies thought, dang, we're going to be in for one here. Then you get to the second phase, where the jury determines punitive damages, what's punitive to these companies, and they come out and say: just another $3 million. I think that's in some ways a win for these companies. That's such a small amount. The plaintiff's attorney, Mark Lanier, was asking for potentially hundreds of millions, if not billions, of dollars in punitive damages. He had this very interesting prop where he brought out a jar of M&Ms, and each M&M represented a billion dollars in net worth to these companies. And he said, what is it if I take out one M&M, or two M&Ms, or a handful of M&Ms? It's not going to matter. These companies will not notice. That's why you as a jury need to consider that when you punish them.

Speaker A: That's kind of what takes me to my next point, which is this question about reputation and public opinion. Because obviously, from an economic standpoint, as you said, this is not even a tap on the wrist. And there is this large number of other, similar cases. It does make me wonder: are we now in this new era of juries and plaintiff's attorneys and state AGs, just people suing social media companies in a way that we've never seen before?

Speaker B: And I think that's where the lasting impact of this trial will take us. If there had been a verdict where both companies were found not liable, that would have shut the door for many of these other cases. But now we have this precedent where a plaintiff came in, sued these companies based on the design of their apps, and came away with a win. Maybe not, financially, the win that some would have expected, but a win nonetheless. And now you apply that down the road: that was just one person. What happens when you have eight other cases? What if everyone in those cases is awarded $6 million, and then other people see that and file their own cases? There could be a kind of domino effect here. And if you're a plaintiff in those cases waiting in the wings, I think you're a little heartened by that: there is a path to victory, and there is a path to holding these companies accountable.

Speaker A: And I think that's what we'll see as the lasting legacy of this. After the break: is Section 230 really going to change? Well, there's this other verdict. A jury in New Mexico found that Meta didn't protect kids from online harm. It is very different from the case where you were in the courtroom, but it is similar in that the platforms are being held liable, even though in both cases the platforms are going to appeal. It just seems like this kind of multidimensional legal attack, where juries, regular people, are doing a thing that a lot of lawmakers have not really done over the past 10 or 15 years as we've been talking about all this stuff.

Speaker B: Right. And going back to that New Mexico case real quick, I think that one will have a much tougher time holding up, given that there is content involved there. So Section 230 could kick in on appeal and maybe erase some, if not all, of that $375 million verdict. But to your point about the lawmakers and regulators here: we've had so little regulation of these companies in the U.S. Around the world, we've seen other countries try to enact laws, whether those are age restrictions or identity verification or that kind of thing. We can argue the merits of those laws, or the intent of them, but at least some of those countries have seen that they need to do something. And in the U.S., despite changing Democratic and Republican leadership over the years, there has been no really meaningful federal regulation when it comes to social media. Of course, plenty has been proposed. So I think some people have simply had it, and they've gone this legal route, this very novel legal route, as we've now seen. And in their view, this potentially is the path to holding these companies accountable.

Speaker A: A thing that I find interesting: I talked to the Nebraska attorney general a couple of weeks ago about his suit against Roblox. It's a different suit, a bit more like the New Mexico one, in that it's about predators and child sexual abuse material. But it is striking to me that he is a Republican, a Federalist Society guy, and that we're now seeing bipartisan legal action against Big Tech in a way that feels, I guess, a little different than what we have seen before. Potentially. But I don't want to oversell it. Am I overselling it?

Speaker B: Yeah, I mean...

Speaker A: Right. Like, we've also seen lawsuits against Big Tech in the past, and there was this tech lash, and then nothing happened.

Speaker B: I don't know. I feel like I've seen so many of these waves come and go. I mean, there is this, I guess, greater focus on child safety that has played out, and I think that is where a lot of the energy is. Of course, this was the case of a 20-year-old woman, but it was filed when she was still underage. And I think that is kind of uniting, in a way, for Democrats and Republicans. We've seen senators from both parties push on that. The outcomes, and maybe the intent, are a little different, but there's this idea of rallying around child safety. Still, if you're asking whether this is finally the moment where people hold hands and come together on social media regulation, I guess I'll believe it when I see it.

Speaker A: Okay, so let's talk about Section 230. We have just had hearings in Congress about Section 230 of the CDA, and what could happen, what should happen. It is a very thorny issue, and you have people from all sides of the political spectrum taking a whole variety of positions right now. CDA 230 protects platforms from liability for what third parties post on their platforms. Do you think there is real momentum to repeal or change 230?

Speaker B: Oh, man. However I answer this, people are just going to yell at me. I do believe there are people out there who want Section 230 to be reformed, if not ended. I think Donald Trump has said that at some point; who knows what he believes. But yeah, it's hard to gauge. There's always been a very vocal contingent against 230. And I'd be remiss if I didn't say that Section 230 is largely why we have the modern Internet. It's why the Internet has flourished the way it has. But at the same time, it's now become this shield for these multi-trillion-dollar companies. And I think that's why people are concerned, those that have concerns: this law, which has existed for a few decades now, is protecting the most powerful companies in the world.

Speaker A: Yeah. The original CDA is from 1996.

Speaker B: So, like, we've been doing this for a while, a few decades. Yeah. And back then there wasn't Facebook. I mean, the thought of these companies being worth trillions of dollars was not on anyone's radar. Now that they are, and they're protected from a lot of litigation around their content because of this law, I think that concerns people. A lot of people are wondering whether the spirit of that law was truly to protect the Mark Zuckerbergs of the world, or the Googles and the YouTubes and the TikToks.

Speaker A: Where does this put Meta and Alphabet today? Because on the one hand, they are facing legal consequences, and consequences in public opinion. On the other, they're ubiquitous. I mean, this conversation is going on YouTube. They are knit into so many aspects of our society, and the CEOs are closely allied with the Trump administration. Does this mean anything from an accountability perspective?

Speaker B: I mean, you're throwing me every big question here. Yeah, I think so. In their world, they would have loved to have just won that verdict and not have to deal with this. But now there is this, I guess, strategy for how the little guy, the user, can go after a Meta, for example. The irony is that this happened on a day when, I think, Zuckerberg was added to a White House Technology Advisory Council. So I'm sure there is some discussion there as to what can be done. The lobbying effort of these companies with the Trump administration is no secret. But the thing is, that can only do so much. This is the legal system, and I think these cases will wind their way through the courts. And as you pointed out, this is just a different form of accountability. This isn't regulation, this isn't lawmakers, this isn't senators or congresspeople and two parties deciding. It's regular juries. And I stress that because I spent time in the same courtroom as these folks and talked to two of them yesterday. These are the people who are going to be deciding the fates, in some ways, of these massive companies. Did they feel satisfied, the jury members that we spoke to? Yeah, I think they were satisfied with their work, certainly. I think the length of the deliberations spoke to how seriously they took it. And I thought it was interesting, because in asking them questions, they played down how significant this case was. They weren't taking into account the eight other cases that would follow, and they weren't told to consider those, I think.

Speaker A: Right. I mean, I guess they were doing their job as jurors on a case.

Speaker B: A case, yeah. And I think it would have been wrong for them to think, oh, we're this group of 12, and we're about to set the precedent for years to come. That was our job as reporters, to link that significance to this. So I think it was interesting. There are two different perspectives: these jurors were so tunnel-visioned on this one verdict, but in doing so, they have, in some ways, pushed the first domino over.

Speaker A: Ryan Mac, thank you so much for talking with me.

Speaker B: Thanks again.

Speaker A: Ryan Mac is a reporter for the New York Times. All right, that is it for our show today. What Next: TBD is produced by Evan Campbell and Patrick Ford. Our show is edited by Paige Osborne, who is the senior supervising producer for What Next and What Next: TBD. Mia Lobel is the executive producer here at Slate, and TBD is part of the larger What Next family. We will be back on Sunday. I'm Lizzie O'Leary. Thanks so much for listening.