Meta and YouTube Found Negligent in Landmark Social Media Addiction Case

nytimes.com

173 points by mrjaeger 2 hours ago


krunck - 2 hours ago

https://archive.is/07nv5

strongpigeon - an hour ago

There are fairly few details about the case in the article. This NPR article [0] has a bit more, but it's still fairly sparse. Though it's interesting how Zuckerberg thought it was a good idea to say: "If people feel like they're not having a good experience, why would they keep using the product?".

Given that this is a case about addiction, that feels like a shockingly bad thing to say in defense of your product. Can you imagine saying the same thing about oxycodone or cigarettes?

[0] https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-so...

hash872 - an hour ago

At least even money that an appellate court throws this verdict out entirely. Reminder that the US is the only developed country that uses juries for civil trials; everywhere else, complex issues of business litigation are generally left to a panel of judges. It's not that hard to rile up a bunch of randomly impaneled jurors against Big Bad Corporation. The US is kind of infamous for its very large, very unpredictable civil verdicts. There's an incredibly long history of juries racking up shockingly large verdicts against companies, only for an appellate court to throw the whole case out as unreasonable. Not even close to the final word in the American judicial system.

Edit to include: I mean, this comes the same day as the Supreme Court throwing out the piracy case against Cox Communications 9-0. Remember that that case originated with a $1 billion jury verdict against them! It was reversed by an appeals court 5 years later and completely invalidated today. Juries should not handle complex civil litigation, I'm sorry

fraywing - an hour ago

I'd hope the next iteration of social media tools humanity builds are less about reinforcing the individual ego and more about collective improvement, learning, and supporting the health of our species.

Anecdote, but it does seem like a lot of younger folks I speak with are exhausted by the dark patterns and dopamine extraction that top-k social media platforms create.

If agents/AI/bots inadvertently destroy the current incarnation of social media through noise, I think we'll be better for it.

strongpigeon - an hour ago

Gift link: https://www.nytimes.com/2026/03/25/technology/social-media-t...

dzink - 30 minutes ago

Read the book “Careless People” if you have a chance - according to the book, social media companies figured out they have real leverage with politicians since they can influence elections. As a result they are actively pushing for far right candidates to reduce their own taxation and regulation.

dlcarrier - 30 minutes ago

This is the kind of stuff that is causing them to push for mandatory identity verification laws. If they are being held liable for the desires of their users, they're being forced to micromanage the affairs of their customers, which precludes anonymous usage.

mikece - an hour ago

A good time to (re-)recommend the movie "The Social Dilemma".

- an hour ago
[deleted]
ChrisArchitect - 2 hours ago

Notably a different case from the other one in New Mexico:

Jury finds Meta liable in case over child sexual exploitation on its platforms

https://news.ycombinator.com/item?id=47509984

jmyeet - an hour ago

I believe social media is on a collision course with an iceberg called Section 230.

Broadly speaking, Section 230 differentiates between publishers and platforms. A platform is like Geocities (back in the day), where the platform provider isn't liable for the content as long as they satisfy certain requirements about having processes for taking down content when required. A bit like the Cox decision today, you're broadly not responsible for the actions of people using your service unless your service is explicitly designed for such things.

A publisher (in the Section 230 sense) is like any media outlet. The publisher is liable for their content but they can say what they want, basically. It's why publishers tend to have strict processes around not making defamatory or false statements, etc.

I believe that any site that uses an algorithmic news feed is, legally speaking, a publisher acting like a platform.

Example: let's just say that you, as Twitter, FB, IG or Youtube were suddenly pro-Russian in the Ukraine conflict. You change your algorithm to surface and distribute pro-Russian content and suppress pro-Ukraine content. Or you're pro-Ukrainian and you do the reverse.

How is this different from being a publisher? IMHO it isn't. You've designed your algorithm knowingly to produce a certain result.

I believe that all these platforms will end up being treated like publishers for this reason.

So, with today's ruling about platforms creating addiction, (IMHO) it's no different from surfacing content. You are choosing content to produce a certain outcome. Intentionally getting someone addicted is functionally no different from changing their views on something.

I actually blame Google for all this because they very successfully sold the idea that "the algorithm" ranks search results like it's some neutral black box but every behavior by an algorithm represents a choice made by humans who created that algorithm.

- an hour ago
[deleted]
2OEH8eoCRo0 - an hour ago

Huge if upheld. This was the bellwether case for thousands of other similar cases.

apopapo - an hour ago

Will they also find liable all the companies that produce addictive food by injecting sugar into everything?

What about the "infinite" broadcasts found on all television channels?

This is ridiculous and pathetic.

aprilthird2021 - an hour ago

I can't help but feel these are "revenge" verdicts. Public perception of these companies is dirt low, and there are so few levers the average person has to change what they feel is an increase in atomization, loneliness, breakdown of civic discourse, Cambridge Analytica level political targeting, misinformation, etc.

Maybe the social media companies could do more to combat all these. They certainly have a level of profit compared to what they provide to the average person that makes people squirm.

But does anyone believe for a second that YouTube is responsible for a person's internet / video watching addiction? It's like saying cable television is responsible for people who binge watch TV.

It's hard to square this circle while sports gambling apps and Polymarket / Kalshi are tearing through the landscape right now with no real pushback.