If a company has a product that relies on addiction mechanisms to succeed, that is a different situation: that is a corporate entity exploiting citizens for profit.
Cigarettes are a great example of where we can draw lines in the sand. If you want to smoke them, go ahead, you have that freedom, but I think companies should be banned from putting nicotine in them. Simple and obvious lines in the sand.
Vapes, whatever, smoke your bubblegum water. Vapes with nicotine? Clearly exploitative behaviour. Yes, they can help you quit, but quit what? Nicotine addiction! If it weren't in cigarettes already you wouldn't need to quit it.
Social media is harder to draw lines in the sand for, but I think algorithmic feeds may be one place to target regulation.
This ruling was about liability, in that an entity created a product with risks without disclosing them. It's actually worse: they purposefully engineered the product to be harmful. Thus they are liable for that harm. This is subtly different from banning these products. Arguably many products that are sold are harmful; the difference is that they either are not acutely harmful (junk food), or the acute harm is well known (alcohol, cigarettes). Some countries mandate disclosure at sale or on the packaging as well.
And society as a whole. Even if you don't participate you don't escape the blast radius of the harm they've caused over the past 10-15 years.
I was astounded hanging out with my friends in person last weekend how every one of them at some point pulled out their phone mid conversation to watch TikTok, or Wordle, or whatever. They thought I was the weird one when I mentioned all social media sites and apps are blocked on my phone. We had an overall good time but these moments stuck out.
The way we do this is simple: we each set a passcode for the other's phone, but I configure my own settings and she hers. This has been available and has worked for us for nearly a decade.
To kill time, sometimes I watch those random "America's Funniest Videos" type videos where it's some random family at home and something funny/weird/etc. happens. I've started noticing that in almost all of them now, everyone is just sitting around staring at a phone. Sometimes an entire family will be in the living room, three on a couch, each in their own little world.
Even my family does the same. It's a very very hard habit to break. Like smoking, except anti-social where smoking was at least social.
50 years ago they'd be reading their own newspapers and magazines.
The name changes but the song remains the same; people have their own interests, even within a family, that aren't shared with others. I wouldn't bore my partner by monologuing about my hobbies, and she likewise. At least we're in the same room together.
At one point I also had a few of them filtered at the DNS level at home, not to restrict my access but rather to defeat any embedded third party requests that might escape my browser filtering.
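For anyone curious how that works in practice, one common approach is null-routing the domains in a hosts file or local resolver. A minimal sketch (the domains listed are just illustrative examples of embedded third-party endpoints):

```
# /etc/hosts — point tracker domains at a non-routable address so
# embedded third-party requests fail before they leave the machine
0.0.0.0 graph.facebook.com
0.0.0.0 connect.facebook.net
0.0.0.0 analytics.tiktok.com
```

A local resolver such as Pi-hole or dnsmasq does the same thing network-wide, which also covers devices where you can't edit a hosts file.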
I remember going on dates a few years later, 2014/15, and the phone usage during the dates seemed rude and slightly offended me. Now it's so common it's not even really noteworthy.
When social media is a tool of regular people, it's an awesome, awesome tool. But when the companies and people that own the platforms start to see users as tools themselves, for their own sociopolitical ends, that's when they become destructive forces. And there was a clear enshittification line drawn about this time 10 years ago, when the transition from one state to the other got underway.
I fear that we're looking at an attempt to manufacture consent to destroy the tool and not just the malicious function.
Screens are not drugs. They are not somehow uniquely and magically addictive (like drugs actually are). The multi-media is not the problem and not the device to be regulated. The corporate structure and motivations are the problem. This issue literally applies to any possible human perception, even outside of screens. Sport fishing itself is random-interval operant conditioning, the same mechanism corporations use. And frankly, with a boat, it's just as big of a money and time sink.
We should not be passing judgements or making laws regulating screens themselves because we think screens are more addictive than, say, an enjoyable day out on the lake. They're not. You could condition a blind person over the radio with just audio. The radio is not the problem and radios are not uniquely addictive like drugs.
We can't treat screens like drugs. It's a dangerous metaphor because governments kill people over drugs.
Without this distinction, the leverage this "screens are drugs" perception gives governments will be incredibly dangerous as these cases proceed. If we instead acknowledge that it's corporations that are the problem, and not something magical about screens, then there's a big difference in terms of the legislation used to mitigate the problem and the people to which it will apply. The Digital Markets Act in the EU is a good template to follow, since it only applies to large entities acting as gatekeepers.
Try to take away a kid's tablet, a teen's phone, or an adult's phone. They will fight just like an addict.
The issue isn’t with reading or consuming content, as was set up in the challenge above.
The issue is with designing feeds and surfacing content in ways that take advantage of our brains.
As an analogy, loot boxes in video games, and slot machines come to mind. Both are designed to leverage behavioral psychology, and this design choice directly results in compulsive behavior amongst users.
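The shared mechanism is the variable-ratio reinforcement schedule: rewards arrive unpredictably, which conditions far more persistent behavior than any fixed schedule. A minimal sketch in Python (the reward probability is invented for illustration):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def pull(p_reward=0.1):
    """One loot box open / slot pull: the reward arrives unpredictably."""
    return random.random() < p_reward

# Variable-ratio schedules are the most extinction-resistant form of
# operant conditioning: any pull *might* pay off, so stopping always
# feels like leaving a possible reward on the table.
rewards = sum(pull() for _ in range(10_000))
print(rewards)  # roughly 1,000 rewards, but spaced unpredictably
```

That unpredictability is the property loot boxes and slot machines both exploit, and it is exactly what compulsive checking of a feed looks like from the outside.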
“things which are harmful and which people have trouble not doing”
I don't impulsively drive to the store to purchase another bag immediately after finishing the one I have whereas (for example) many people exhibit such behavior when it comes to tobacco.
In the case of social media the feed is intentionally designed to be difficult to walk away from and it is endless (or close enough as makes no practical difference). Even if it weren't endless, refreshing an ever changing page is trivial in comparison to driving to the store and spending money.
Maybe autoplay and immediately popping up a grid of recommendations should both be legally forbidden as tactics that blatantly prey on a well established psychological vulnerability. I'd likely support such legislation provided that it could be structured in such a way as to avoid scope creep and thus erosion of personal liberties.
In short I think Netflix is closer to a bag of Lays and modern social media closer to the cigarette industry of yore.
Some middle ground might be there somewhere. But if forced to choose… behavioral engineering funded by billions of dollars in research over more than a decade, plus data harvesting on an unprecedented scale, all for the purpose of manipulating users, doesn't sound a lot like fishing to me.
I've lived through this entire story before in the video game wars. People said exactly the same things with exactly the same urgency about Mortal Kombat - what kind of sick society do we live in, where greedy corporations sell you the experience of shooting people and ripping their heads off? Perhaps we have to let adults buy these "murder simulators", but only a disturbed, evil person could possibly argue for letting kids do it.
If that sounds crazy to you, the moral panic over social media will sound just as crazy in a decade or two.
From my perspective, this will sound crazy in a decade or two, but more in the way smoking does: how harmful it was, and how ridiculous it is that we didn't see it sooner.
Anyway, sometimes 'panic' is justified. Sports betting has been a total disaster, for example.
That's why we call it addiction when folks struggle with stopping even though they can see the harm in their own actions.
This isn’t a moral panic.
Mortal Kombat did not result in changed behavior in its users. As I recall, the best study on video games only showed some change in behavior for a short time after playing a game, after which children reverted to their baseline.
On the other hand, social media has not survived that scrutiny, with multiple studies showing a causal link between its addictive design and anorexia, depression, and anxiety.
People defended cigarettes too back in the day, and it took years for people to stop smoking cigarettes in public.
Tobacco was not a moral panic.
I just worry we left no levers for the public to regulate these entities and this is the worst option of very few options. Who isn't liable under this kind of logic?
Imagine a feed that actually just ends when you run out of posts from people you follow, instead of endlessly trying to keep your attention by pushing stuff it thinks you might like.
If I've read all of the posts from my friends I would prefer to not see anything else, but that doesn't maximize engagement for ad platforms, so…
Suggest Eric Schlosser's Fast Food Nation. It'll open your eyes.
If it were one building in one state doing this shit, no one would care, and we'd just block it or tell people not to go in the building. That doesn't work with digital products that started benign, then had the addictive qualities turned up to 11. That's malice, at scale. If every ice cream parlor, or every link in the ice cream supply chain, started adulterating ice cream with drugs, regulators would have dropped the hammer at the site of adulteration. Meta et al. have had no such presence forced upon them, due to a lack of regulation in some jurisdictions, or being left to self-implement the regulation, thereby largely neutering the effort.
That doesn't mean they are equivalent and must be regulated the same way. Scale matters.
If cable television or restaurants or ice cream start causing harm that we want to deal with, we can vote on that when the time comes.
People used to spend an awful lot of mindless time watching TV. They weren’t “addicted” in a clinically meaningful sense.
Also fwiw I'm not in favour of regulating social media, but I am in favour of bringing lawsuits to companies who engage in societally harmful behaviour, and punishing them financially.
“I’m so addicted to Firefly!”
That kind of thing?
If you're interested in the topic further, you could consider reading 'Toward the classification of social media use disorder: Clinical characterization and proposed diagnostic criteria', which should shine some more light on what people are referring to as "addiction" in this circumstance :)
If you're interested in the neuroscience, consider reading "Neurobiological risk factors for problematic social media use as a specific form of Internet addiction: A narrative review".
Like, I dunno, really getting into running or yoga or fantasy football?
Where is the line, according to experts in addiction-like behavior?
https://www.sciencedirect.com/science/article/pii/S235285322...
24-hour commercial cable news (in the US) is the original sin of addictive media.
Letting juries rob them just because the jury doesn't like it is nothing more than fascism.
If you're going to pick a law from one of the smallest states in the union, the least you could do is quote the relevant excerpts.
This is a pathetic rebuttal.
New Mexico prosecutors convinced a jury the company enabled child exploitation on its platforms.
The outcome followed laws that enable the jury to conclude as they did! So there you go, laws passed.
Is this Zuckerberg's burner account?
There should be a law banning the addictive practices of these apps. Until there is, fining the companies that make these apps is unjust.
There are laws enabling the judiciary to operate as it has, giving plaintiffs a platform in the first place in the absence of specific laws, because legislative bodies are slow to adopt new laws for various reasons.
For example: it's not hard to pay off a handful of legislators to vote no. Then what? Do people just suck it up and live at the mercy of the rich?
The judiciary has leeway to allow such cases, and their outcomes bubble up useful context for changes to the law. This is longstanding precedent and in some cases is codified in law itself.
The lack of specific legal language banning these social media practices is also irrelevant, because of similarities to other situations that are already enshrined in law. That human biology is susceptible to psychological manipulation is well understood. A tiny difference in legal context does not invalidate a known truth of biology.
Society doesn't exist in your head alone and has existed for some time. Much of this is not truly new territory.
Stop embarrassing yourself.
Everything else outside of reels is the usual social media fake-life facade, with everything amplified to the max for engagement so it gets pushed to feeds via "the algorithm". (Note: interactions don't need to be positive to promote a post to feeds.)
Rewind 30 years or so, how long did the typical New York Times subscriber spend with their paper every day?
Was the Times addictive?
And I won’t even get started on network television for half a century.
Lots of people can get drunk once a month and suffer or cause no real harm. Some people get drunk everyday which is slightly more harmful.
There are many physical products that are today designed to minimize harm and misuse after facing liability historically. So I suppose the direct answer to your question would be "yes, absolutely, and there's a figurative mountain of precedent for it".
There’s somebody out there who’s harmfully addicted to just about anything, from ultramarathons to World of Warcraft.
What’s the limiting principle on liability?
The limiting principle on liability is quite complicated. You'd have to go ask a lawyer. At least in the US (and I believe most of the western world) it has to do with manufacturer intent, manufacturer awareness, viable alternatives, and material harm among other things.
Doritos are designed to taste good and encourage you to eat them to your satisfaction.
The latest Mario game is designed to be playable & fun for as long as you have time and energy to play it.
My Instagram feed is designed to engage me with interesting and relevant content for as long as I have time to scroll.
All three can be used in unhealthy ways, and would be less likely to be so used if they were designed less well to their goals.
Which is “designed to be addictive”?
It doesn't matter if the outcome is the same here; what matters is the intent behind the design, considered in the context of the intended use case. That's in addition to lots of other factors (some of which I listed), plus any relevant legislation, plus any relevant case law, and all of that will be examined in great detail by a court. At the end of the day, what is legal and what is not is decided by that process. A large part of the point of employing corporate lawyers is to prevent a situation where your past behavior is examined from ever arising in the first place.
I'd suggest the essay "What Colour Are Your Bits?" if you're genuinely struggling to understand this concept.
Also, what you're describing sounds like you haven't spent enough time on shorts for the content recommendation algorithm to learn your preferences. Which, I agree, is unwatchable. I saw it recently when my friend put on YouTube Shorts on a guest account (on an Airbnb smart TV). It was bad. But spend enough time and that will change. But best you don't!
Then again, I hardly use YouTube, so I don’t think I’m the target audience for this.
I think that's all related, is at least partially a matter of what I'm accustomed to, but is largely just an inherent part of how I am.
"is it a young people thing": no, obviously not because nothing is.
You're just as prone to addictive behaviours at 20 as at 40 or 80.
There might be some differences as to how you happen to be exposed, perhaps because of how your literal social network is behaving, but that's obviously not intrinsic.
I mean, yes, perhaps "young people" are slightly more likely to be exposed to it via advertising/peers/etc, but anyone with a similar exposure can be a victim.
Maybe a generational thing, but for most of the latter half of the 20th Century most folks had to “exert special effort to regulate their consumption” of network television. Should there have been lawsuits and regulation of couch potatoes?
Anyway, the way you talk about shorts reminds me of drug addicts who talk about how they can control their consumption. Some can. Many cannot but delude themselves. The way I see people interact with shorts/TikTok/reels is very much not restrained. They're optimised for addictive scrolling in the same way a slot machine is; the fact that some people can use a slot machine without becoming addicted is beside the point.
Good luck with that one. Somebody probably used 18th Century behavioral psychology to try to sell George Washington a horse!
Someone saying that someone shouldn't be able to promote specific harm x is not saying that the idea of 'promotion' of anything in general is necessarily bad, exactly in the same way that we restrict certain harmful things from being sold without being against the idea of selling things in general.
This is the Netflix business model, right now.
I'm not sure this rings true to me. Meta has to know that millennials and younger are giving up on their platforms, they have endless internal data showing it, right? If anything they are just afraid of endless litigation while they are struggling to gain an AI foothold.
Do you have a source for that? I don't think it's true when looking at global Meta numbers across _all_ Meta social platforms (FB+Instagram+Threads) combined.
Facebook is dwindling, but Instagram is still thriving.
Instagram doesn't make Zuckerberg "successful". He's a black hat who deserves jail.
If that were true, they would be going somewhere and that somewhere would be visible. The last "new" thing that got any traction was TikTok and that is almost 10 years old at this point.
For a while, the Fediverse stuff (specifically Bluesky) seemed to be getting some traction, but apparently the Fediverse wasn't ready for the influx and people have started drifting back.
The social media sites have things pretty well carved up between them. If you want competition that doesn't suck as bad, you have to break them up.
I am sympathetic, because there is actual good that Meta does and teams that still try to fight the good fight.
But this is different from the way they are perceived in the broader public, which includes more than teens.
This is ignoring the strong America centricism that permeates decision making at an American firm. The emotions in the rest of the world are not even given the same degree of consideration.
What if, instead of banning these addictive services, we required companies to charge for them and disallowed advertising revenue? That changes the entire business model, and there is no longer a strong incentive to have users spend as much time on the platform as possible. In fact, the best customer would be one that subscribes but barely uses the platform.
For me this all comes back to the perverse incentives that arrive when advertising is the primary source of revenue for the largest companies in the world. Social media allows advertising at scale never seen before and it’s no surprise that it’s been weaponized in ways that are actively harming people.
Heck, I am constantly looking at hacker news on my phone.
People want community but at a distance and only when they want it at a specific time. Everyone talks about how great it is to have community/village for raising kids but then they deal with their family teaching their kids bad habits, others being slightly neglectful compared to them, and having to put up with giving back to others to make it more fairly compensatory.
Shocker. Many people didn't like that shit and decided it was better to do it all alone than deal with any inconveniences from others. You want your parents to help raise your kids? Nope. They often did bad things to you that you didn't like. Also it means someone's family has to move or live with the other. Another dealbreaker.
We just live in expensive times and these things are harder to do in a more globally competitive economy. People have lower tolerance.
On another note, personally I'm not sure I buy the "addictive" argument with social media. Maybe it's just me, but I find social media pretty boring. I think for a lot of younger people, though, it fills a need for meaning and connection to the world that has been diminished by a loss of community in our society (which predates social media).
I agree with this whole heartedly, but the government works on mass-scale patterns. It's essentially their entire job to regulate such things. Wind the clock back 20 years and the regulation seems insane. With how prolific computers have become? How they've been shoehorned into everyone's lives, whether or not they have any business or interest in actually interacting with one? It's a logical necessity.
I don't like it and I don't think this is a real solution. We should instead be looking to wind things down: fewer people using fewer computers for a smaller fraction of their day. That's it. Unfortunately it looks like instead we're just going to be losing all of our computing freedoms while doubling down on the bullshit because Grandma needs email or something.
Today's media circus is about addictive social media. Before that it was video games and rock music and D&D clubs. Before that it was the Satanic panic of the 80s, gay 'recruitment', Soviet spies. Much before that it was witches and heretics. And so on and so on, forever.
If you have a choice, maybe don't be part of the pitchfork wielding mob? The people with the pitchforks always think they're warriors of justice. They generally aren't. They just tend to make everything worse.
(Plus the economic motivations are so clear here - traditional media hate social media because social media ate the traditional media's cosy entrenched profits, so now social media are to blame for Russia, for Trump, for anxious teenagers... and must immediately be regulated out of existence)
How many people who played DND or video games or music or any of the other things you listed regretted it afterwards? How many people playing DND would say “I wish I was out with my friends because this game is too addictive”. None, because they were with their friends!!
The closest thing would be cigarettes. And while I think cigarettes should be legal, normalized, and plentiful, I'm aware enough not to attack the movement that marginalized them.
No one is talking about content here, and to emphasize the point, I think no one is really defending social media. For all the examples you gave, it was an activity no one understood except the small group of people to whom it gave meaning. Everyone understands social media and most people hate it.
And in fact, I might go so far as to say you're directionally incorrect. Social media is the force that killed speech, that killed the things that made D&D and punk music and transgressive video games possible. Social media is the victory of those people who wanted to normalize the abnormal.
Lol. Tell me you weren't around for the D&D panic without saying you weren't around for the D&D panic.
This was precisely the argument used. "These kids should be out, running around, climbing trees! They're missing their childhoods! Here's Becky, age 15, to tell us how much happier she is now that she's hanging out with her girlfriends at the park, instead of summoning demons in her parents' basement."
And everyone bought it in exactly the same way that they buy the social media teen panic now. There were developmental psychologists on TV to explain how harmful D&D was to the kids' sensitive developing brains, how it was a gateway drug to all sorts of destructive self-behaviours, how parents were just so gosh dang powerless to do anything about it (all their friends are doing it!), and how the state needed to step in NOW! Sound familiar?
Honestly, you've seen it once, you've seen all there is to see. The social media panic has all the characteristics of any other moral panic. Some unpopular thing is alleged to be hurting children, and if you support it, then you're probably some kind of child abuser. Because we're all so perfectly rational, we all know our suspicions are 'directionally correct', to borrow your beautifully Orwellian turn of phrase. Certainly nothing to do with the ceaseless drum of narratives directed against social media that we imbibe from every external conduit - films, TV, newspapers - and live and breathe and occupy as though it were reality. Hey did you see that Netflix show Adolescence, about the harms of social media? It's fiction, but it really <strike>creates</strike>captures the moment. It's just so directionally correct, you know?
Not like our prejudices can ever be echoed back to us through our own media, in an ever shriller feedback loop. No need to build up any defenses against that sort of thing. Grab those pitchforks.
There are people who just can't admit to themselves they actually hate free speech. Because they're people who've never needed it. They've never been abolitionists speaking against slavery, or civil rights leaders speaking against apartheid - whether in South Africa or the American South. They've never been gay people fighting for equality, or trans people fighting to survive. They've never been an unfavoured minority - ethnic, religious, sexual, linguistic, what have you. They don't need free speech, so why should you? Everyone else already has all the rights that they could possibly want or need, so as far as they're concerned, all these people are needlessly disruptive to the public order. So they maintain a fiction of collectivism, in reality a majoritarian hegemony, while silencing anyone who'd speak out against it. They can't quite bring themselves to say they oppose free speech, but they act in practice to undermine it.
It is a contemptible stance.
Somewhere out there is a young lesbian in Russia finding her people on social media, a young atheist in Saudi Arabia making friends online. And the majority is as ever ready to throw the most vulnerable under the bus, so that they, the majority, don't need to take a modicum of responsibility for their own idle doomscrolling. And if they need to whip up a moral panic to do so, fine. More efficient that way, helps override people's rationality.
I suspect most people don’t remember WHY free speech itself is valued. It’s often treated in a talismanic sense.
At least in America, a good part of the value of Free speech comes because it is a fundamental building block to having a vibrant market place of ideas.
Since no one has a monopoly on truth, our best model is to have a fair competitive market place that allows good ideas to thrive, even if they are uncomfortable.
The traditional risk to the free exchange of ideas was government control; the suppression of trade.
However, in the era we live in, we have evolved to find ways to shape the market through market capture. Through overwhelming the average user, instead of controlling speech. Bannon called this “flooding the zone”.
The traditional solution ensured a working and vibrant marketplace for its era. I don’t know what tools we will develop for the modern era.
Do note, we depend on content moderation to keep forums like HN running. The fundamental power of content moderation is censorship. Without the exercise of these censorial powers, we would not be able to have this discussion.
Bad take. Civil liberties matter.
They need to play fair or GTFO
Duolingo's notifications are borderline emotional blackmail ("don't make the owl sad!"), and Duolingo is a vastly profitable company that expressly targets school-aged children. But because it's not social media, it's... fine?
What does playing fair even mean in this context?
Heck no. Year after year after year these issues have been brought up and ignored.
I worked in this damn domain, and have seen better people than me try their best to avoid exactly this outcome, for these exact same firms.
I can give credit to the people at these firms who try to do the right thing, but the firm itself needs to answer in terms of revenue and growth figures.
The fact is that your policy and T&S teams are cost centers, while the quarterly shareholder report is God. There is only one way these incentives line up.
It’s been YEARS of teams within these firms raising the issues of user harms and getting nowhere.
I remember having T&S folks cry on MY shoulder about how they couldn’t get engineering resources even while working at a FAANG company.
Others talked about how, out of sheer repetition, they developed a protocol for the times an engineering team would inevitably come in to “fix” T&S issues. They knew they would get sidelined, till eventually the PM/engineers/Savior would run into the same problems they had been dealing with forever, and then ask for help.
Public research also has issues - If you want to do actual research on tech, you can’t even get the data.
If you get the data, you also get the NDA, which means your results need to make tech look good, or the report becomes an internal report that will never see the light of day.
At any given time, it seems like whatever is defined as the most addictive is just the one with the most market share? For me personally, the most addictive is actually Hacker News (god bless you all).
The solution is education. The government should be educating society and especially parents on how to protect their children.
Education worked to cut cigarette use, and is starting to lower alcohol consumption as well. It can work for social media without all the negative impacts on civil liberties that come with regulations.
I mean, they banned it from most public locations first.
I know lootboxes in video games are regulated in some countries. Not sure if they are banned in some places, but I do know that they have to show the odds in some places, and in others they have to be deterministic.
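As a sketch of what "deterministic" can mean in practice: the disclosed-odds rules plus a "pity timer" that caps the worst-case streak. The drop rate and threshold below are invented for illustration:

```python
import random

def open_box(misses, p_drop=0.05, pity_at=50, rng=random.random):
    """Open one loot box: returns (got_drop, new_miss_count).

    Disclosing p_drop is the kind of odds-disclosure some countries
    require; the pity counter guarantees a drop within pity_at opens,
    which makes the worst case deterministic.
    """
    if misses + 1 >= pity_at or rng() < p_drop:
        return True, 0
    return False, misses + 1

# Worst case: force every random roll to miss; the pity timer
# still guarantees the 50th open drops.
misses, opens, hit = 0, 0, False
while not hit:
    opens += 1
    hit, misses = open_box(misses, rng=lambda: 1.0)
print(opens)  # prints 50
```

The point of the cap is that the player's spend is bounded even in the unluckiest run, which removes the open-ended gambling dynamic regulators object to.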
The crux of the issue is personalization and behavior psychology. If you move to a boring feed design, you end up addressing most of the current issue.
Another option is to allow for interoperability between social media platforms, which is a competition respecting way of giving people the ability to move to platforms that “work” for them better.
I’d hazard that civil liberties are not really at risk here, only the bottom line of social media platforms. However, there's enough money to protect the bottom line even if it costs civil liberties.
Yes. I know corps do what they can to keep us engaged. I read HN too. I didn't say it was a big part.
People will give excuses for this. Guess what, meta and Google have their own too.
Feeds without options should be illegal.
Not every interaction needs to be your self control vs 30 years of professional marketing psychology doing A/B tests. It’s not a fair fight.
Pokemon cards are the same too.
> "The verdict has forced those inside the companies to grapple with the fact that many outsiders do not view them as favourably as they have come to view themselves."
They quote one unnamed insider for this characterization. I recall from my Stats 101 class that n=1 is not a strong basis from which to make broad claims about a population of tens of thousands.
Social media is one of the few good paymasters left.
I mean, it can't be that hard to imagine them, with their never-before-seen fortunes, extensive real estate portfolios and their extravagant lifestyles, in the roles of modern day Pablo Escobars and the like. Addiction is extremely profitable.
I mean, if that's where your confidence comes from...
Let me take half a step backward from that provocative stance. Of course we don't need to outlaw all fun, but perhaps we really do need to outlaw some fun, to prevent people from overindulging. Maybe a sin tax could be the way to go.
Core to all of this is what's colloquially become known as The Algorithm. Google in particular has successfully propagandized this idea that The Algorithm is a neutral black box over which we have no influence (for search). But every feature and behavior of any kind of recommendation or ranking or news feed algorithm is the result of a human intentionally or negligently creating that behavior.
So one thing most of us here should be aware of: the way to get more distribution for a post or a video or whatever is through engagement. That is, likes, comments, shares, reposts, quotes and so on. All these companies measure those and optimize for engagement.
That sounds neutral and possibly harmless, but it's not. I think it's foreseeably not harmless, and no doubt there's evidence along the way to demonstrate that harm.
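As a toy illustration of why engagement optimization isn't neutral: an engagement score is just a weighted count of interactions, so it can't distinguish delight from outrage. The weights and posts below are invented for illustration:

```python
def engagement_score(post):
    # Invented weights; real platforms tune these constantly.
    weights = {"likes": 1, "comments": 4, "shares": 8}
    return sum(w * post.get(k, 0) for k, w in weights.items())

posts = [
    {"id": "cat-video",    "likes": 900, "comments": 20,  "shares": 10},
    {"id": "outrage-bait", "likes": 300, "comments": 400, "shares": 90},
]

# Rank the feed purely by engagement, highest score first.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])  # ['outrage-bait', 'cat-video']
```

The post with fewer likes but far more arguing wins the ranking: a ranker that only counts interactions rewards whatever generates the most of them, positive or not.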
We've seen this with some very harmful ideas that get a lot of traction online. Conspiracy theories, antivaxxer nonsense, doxxing queer people, swatting, the manosphere and of course eating disorders. ED content has a long history on the Internet and you'll find pro-ana or "thinspiration" sites and forums going back to the 1990s.
So I think social media sites are going to have three huge problems going forward:
1. That they knowingly had minors (and children under 13, which matters for COPPA) on their platforms and they profited from that by knowingly or negligently selling those audiences to advertisers;
2. They knew they had harmful content on their platforms but hid behind Section 230 in particular, claiming to be simply the host for third-party content. I believe that shield is going to fail; and
3. They knowingly or negligently pushed that content to children to increase overall engagement.
One clue to all this is you see Mark Zuckerberg who wants to push age verification into the OS. Isn't that weird? The one company that doesn't have an OS thinks the OS should handle that or, more specifically, should be liable for age verification? That's so strange.
In an era where we have LLMs (and the systems that came before) that can analyze posted content (including video) and derive features about that content you don't get to plead ignorance or even user preference. These companies will be held liable for the harm caused by content they distribute.