TikTok's recommender is partly built on European technology (Apache Flink for real-time feature computation), along with Kafka and distributed model training infrastructure. The Monolith paper is misleading in suggesting that 'online training' is the key. It is not. The key is that your clicks are made available as features for predictions in less than 1 second. You need a per-event stream processing architecture for this (like Flink; Feldera would be my modern choice as an incremental streaming engine).
* https://www.youtube.com/watch?v=skZ1HcF7AsM
* Monolith paper - https://arxiv.org/pdf/2209.07663
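To make "per-event" concrete, here is a minimal sketch of click events updating a feature store immediately, so the very next prediction request sees them. All names and structures are illustrative stand-ins, not TikTok's actual code; in production this logic would live inside a stream processor (Flink, Feldera) consuming a Kafka topic.

```python
import time
from collections import defaultdict

# Hypothetical in-memory feature store: user_id -> feature dict.
# A real system would keep this in the stream processor's managed state.
feature_store = defaultdict(lambda: {"clicks": 0, "last_item": None})

def on_click_event(event):
    # Update per-user features as each click arrives (per-event, not batched),
    # so the next ranking request already reflects this click.
    feats = feature_store[event["user_id"]]
    feats["clicks"] += 1
    feats["last_item"] = event["item_id"]
    feats["updated_at"] = event["ts"]

def features_for_prediction(user_id):
    # The ranking model reads the freshest features on the next request.
    return dict(feature_store[user_id])

on_click_event({"user_id": "u1", "item_id": "v42", "ts": time.time()})
print(features_for_prediction("u1")["clicks"])  # → 1
```

The contrast with a batch pipeline is the point: there is no hourly job between the click and the feature read, which is what keeps the feedback loop under a second.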
https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...
Distancing yourself from temptations is an effective and proven way to get rid of addictions, so programs constantly trying to get you to relapse are not a good feature. Imagine a fridge that constantly restocks itself with beer: that would be very bad for alcoholics, and people would just say "just don't drink the beer?" even though this is a real problem with an easy fix.
If by features you mean tracking state per user, that stuff can be tracked insanely fast with Redis as well, without Flink.
If you're saying they don't have to load data to update the state, I don't see how massive these states would have to be to require in-memory updates, and if so, you could just do in-memory updates without Flink.
Similarly, any consumer will have to deal with batches of users and pipelining.
Flink is just a bottleneck.
If they actually use Flink for this, it's not the moat.
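The Redis version of per-user state is essentially a hash per user updated on every event. As a sketch (key and field names are made up; the dict-backed class below mimics the HINCRBY/HSET/HGETALL semantics so the pattern is clear without a running server — with redis-py you'd call the same methods on a `redis.Redis()` client):

```python
# Stand-in for a Redis client, mimicking the hash commands used for
# per-user state. Keys like "user:<id>:features" are illustrative only.
class FakeRedis:
    def __init__(self):
        self._store = {}

    def hincrby(self, key, field, amount=1):
        # Redis HINCRBY: atomic counter increment inside a hash.
        h = self._store.setdefault(key, {})
        h[field] = h.get(field, 0) + amount
        return h[field]

    def hset(self, key, field, value):
        # Redis HSET: set one field of a hash.
        self._store.setdefault(key, {})[field] = value

    def hgetall(self, key):
        # Redis HGETALL: read the whole hash back.
        return dict(self._store.get(key, {}))

r = FakeRedis()
r.hincrby("user:u1:features", "clicks")          # one round trip per event
r.hset("user:u1:features", "last_item", "v42")
print(r.hgetall("user:u1:features"))  # {'clicks': 1, 'last_item': 'v42'}
```

Whether this beats a stream processor depends on what the features are: plain counters and last-seen values fit Redis naturally, while windowed aggregates and joins are where engines like Flink earn their keep.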
On topic, I also think we won’t see much of a consequence from the new classification.
"No timeline was given on when authorities will make a final decision in the case."
Typical EU Commission powerbottom move. Those benchwarmers will not do a damn thing before the next round of elections.
All politicians and world leaders decided to stay on a platform that enables generating child pornography. Why would they leave TikTok? Across all political camps, they want people to be addicted to their own and their parties' content on there.
You can't legislate intelligence...
The average person has zero chance against all-pervasive, ultra-manipulative, highly-engineered systems like that.
It is, quite simply, not a fair fight.
It's not just social media. It's gaming, ad tech, marketing, PR, religion, entertainment, the physical design of malls and stores... And many many more.
The difference with social media is that the sharp end is automated and personalised, instead of being analysed by spreadsheet and stats package and broken out by demographics.
But it's just the most obvious poison in a toxic ecosystem.
It's the "lean startup" culture, as well as books like "Hooked: How to Build Habit-Forming Products" by Nir Eyal.
The dark lean startup pattern is where you break down the big picture rationale for the company. You extract metrics that contribute to the company's success (i.e. engagement) and you build a machine that rewards changes to the underlying system that improves those metrics.
If done successfully, you create an unwitting sociopathy, a process that demands the product be as addictive as possible and a culture that is in thrall to the machine that rewards its employees by increasing those metrics. You're no longer thinking about purpose or wondering about what you're doing to your users. You simply realise that if you send this notification at this time, with this colour button, in this place, with this tagline then the machine likes it. Multiple people might contribute a tiny piece of a horrifying and manipulative whole and may never quite realise the true horror of the monster they've helped build, because they're insulated by being behind the A/B test.
No, that's exactly capitalism. Capitalism ensures processes get more and more efficient over time; as you say, previous versions were less efficient at inducing addictive behaviors, but capitalism ensured we progressed toward more and more addictive apps and patterns.
Capitalism doesn't mean we start out with the most efficient money extractor; it just moves toward the most efficient money extractor over time unless regulated.
This is well known and a feature: capitalism moves toward efficiency, and regulation helps direct that movement so that it helps humanity rather than hurts us. Capitalism would gladly serve you toxic food, but regulation ensures companies earn more money by giving you nutritious food. Regulation is lagging a bit there, so there is still plenty of toxic food around, but it used to be much worse than now; the main problem with modern food is that people eat too many directly toxic compounds.
So you are saying I am not an average person because I have the willpower to simply not install the TikTok app or watch short form video on any platform?
Has the bar for the average person really sunk this low?
Media can enrich people - expose them to new ideas, new stories, different views and opinions. This expands worldview and generally trends in the same direction as education.
Media can also be engaging, using tools that make it compelling to continue viewing even when other things might be preferable. On the low end: cliffhangers and suspenseful stories. On the high end: repetitive, gambling-like tendencies.
I'd argue that if we view TikTok through this lens, banning it seems to make sense. Honestly, most short-form social media should be closely reviewed for being low-value content that is intentionally made addictive.
---
It's not society's job to cater to the whims of fucking for-profit, abusive, media companies. It's society's job to enrich and improve the lives of their members. Get the fuck outta here with the lame duck argument that I need to give a shit about some company's unethical profit motives.
I also don't care if meth dealers go bankrupt - who knew!
PS if we apply your analysis to video games they surely would have been banned too.
Edit: by the way I remember back in the day we searched for "addicting flash games" and it was seen as a positive ;p
I’m quite glad that there is a form of control preventing a company from a different part of the world, one that doesn’t really care about the mental health or wellbeing of my kids, from creeping into their lives like that…
As a parent, it’s not a fair fight and I should not have to delegate that to another private company
Fixed that for you.
Your argument is basically the same as saying that Banana Ball should be banned because they are intentionally making the experience as fun as possible, because that's how they make money.
I'm not some sort of prodigy or anything, just a random schmuck. If I can do it, anyone can. People just really like blaming others for their own vices instead of owning up to having a vice.
HN is a vice too. One of many that I have. And they're all mine. I've chosen them all. In most cases knowing full well that I probably shouldn't have.
You don't say to a heroin addict that they wouldn't have any problems if those pesky heroin dealers didn't make heroin so damn addictive. You realize that it's gonna take internal change (mental/cultural/social overrides to the biological weaknesses) in that person to reliably fix it (and ensure they don't shift to some other addiction).
I'm not saying "let the producers run free". Intervening there is fine as long as we keep front of mind and mouth that people need to take their responsibility and that we need to do everything to help them to do so.
Right, but they don't. Not to mention a significant portion of the target market are children whose brains are still developing.
Smoking is a vice. Anyone can stop smoking any time they want. But it was still incredibly popular. Government regulation put warning labels everywhere, tightened regulation to ensure no sales to children, provided support to quit. And then the number of people smoking plummeted. Society is better off for it.
"Anyone can do it" is an ideological perspective divorced from lived reality.
If you can't stop cold at any time if/when you decide to, you don't have the agency to make a free choice.
This is such a normie perspective and shows just how unfamiliar you are with addiction. Yes, some people can avoid becoming addicted. Yes, some addicts can break the habit, detox, and stay clean. At the same time, a larger number of addicts can detox but relapse in a relatively short time. There are also addicts who have not yet admitted they have a problem, and addicts who are okay with being addicts. Just because you have an emergency stop button you can hit does not mean everyone else is the same way. Your lack of empathy is just gross.
If something's harmful it should be controlled.
That does not mean it is the province of the state to ban them.
You are free to not use TikTok yourself, no one is stopping you.
Also, drug decriminalisation is very nuanced. I’m not 100% against it; I’m just pointing out that open drug use spiked afterward.
Was that spike a true spike in new users, or existing users just coming out of the shadows?
We are primates dominated by our primitive urges.
I’d love to think of myself as an exceptional individual because I don’t use Facebook or TikTok, but most likely I’m not exceptional at all, and other people could also just not use TikTok.
I think that algorithmic social media should be likewise regulated, with at the very minimum ban for minors.
Note that my focus here is on the "algorithmic" part. I'm fine with little or no regulation for social media where your feed is just events in chronological order from contacts you are subscribed to, like an old bulletin board, or the original Facebook.
Also, I think we should consider companies that provide algorithmic social media responsible for what they publish in your feed. The content may be user generated, but what is pushed to the masses is decided by them.
Pitting the average person up against that, then blaming them for having "no self control" once they inevitably get sucked in is not a remotely fair conclusion.
Like gambling?
Europe wants to ban algorithmic recommendation. You attack a straw man: banning all content from creators. If you have any valid arguments, you should bring them to the discussion instead of creating imaginary enemies.
Banning harmful design patterns is a must to protect citizens even if it ruffles the feathers of those profiting from their addiction.
please fix this to
A subset of the population who has not yet reached the age of consent
I think society broadly accepts that there are different expectations for children and adults; the line is currently officially drawn somewhere around 18-21 years old.
1. The reactions to banning drunk driving: "It's kind of getting communist when a fella can't put in a hard day's work, put in 11 to 12 hours a day, and then get in your truck and at least drink one or two beers."
2. Mandatory seatbelts: "This is Fascism"
You're going to balk at just about anything that comes down the line - I guarantee it.
[https://www.unilad.com/news/us-news/americans-react-drink-dr...] [https://www.history.com/articles/seat-belt-laws-resistance]
Additionally, this is not about self control. The claim is that the algorithm is designed to exploit users. Insiders (including a designer of infinite scroll!) have admitted as much going back years: https://www.bbc.com/news/technology-44640959
We should be uncomfortable with companies spending huge amounts of money to research and implement exploitative algorithms. We did something about cigarette companies advertising to kids. This action is along those lines.
> it's a wildly popular form of entertainment with millions of creators sharing their lives
I don't think we should be rewarding those who make a living by creating "content" that serves nothing but a dopamine rush, and you can bet that those who put in the effort to create valuable content would prefer to have one less channel where they are forced to put out content just to satisfy the algorithm overlords.
If you want to distribute short videos on a website that lets you choose what to watch, after searching and deliberately clicking a button to play it, by all means feel free to do it. But the current TikTok mechanism removes all agency and is an extreme version of mind pollution.
you can't even be aware of what they're doing, because the algorithms they're using to do it are black boxes
YouTube's algorithms have shown evidence that they've led to radicalization
would you not draw a line on any of this?
Spoiler: There is no line. Societies (or more accurately, communities) attempt to self-regulate behaviors that have perceived net-negative effects. These perceptions change over time. There is no optimal set of standards. Historically, this has no consideration for intelligence or biology or physics (close-enough-rituals tended to replace impractical mandates).
Except, I'll never be given that choice.
I think there's a wide regulatory spectrum between those extremes--one that all sorts of governments already use to regulate everything from weapons to software to antibiotics.
It's easy to cherry-pick examples where regulation failed or produced unexpected bad results. However, doing that misses the huge majority of cases where regulation succeeds at preventing harms without imposing problematic burdens on people. Those successes are hard to see because they're evidenced by bad outcomes failing to happen, things working much as they did before (or getting worse at a slower rate than otherwise might happen).
It's harder to point to "nothing changed" as a win than it is to find the pissed-off minority who got denied building permits for reasons they disagree with, or the whataboutists who take bad actions by governments as evidence that regulation in unrelated areas is doomed to failure.
We do it for alcohol and cigarettes already: taxes, ads & marketing restrictions, health warning mandated communication.
I used to be opposed; now I'm not. I strongly believe specialization is the important niche humans have adapted to, and that it should be encouraged. Another equally significant part of human nature is trust and gullibility. People will abuse these aspects of human nature to give themselves an unfair advantage. If you believe lying is bad and that laws should exist to punish those who lie to gain an advantage, or if you believe that selling an endless and addictive substance should be restricted, you already agree.
There are two bars in your town, and shady forms of alcohol abound. One bar is run by someone who will always cut a patron off after they've had too many, and who goes to extreme lengths to ensure that the only alcohol they sell is ethanol. The other is run by someone who doesn't appear to give a fuck and is constantly suggesting that you should have another; some people have even gone blind.
I think a just society would allow people to specialize in their domain without also needing a PhD in the effects of alcohol poisoning, which alcohols are safe to consume, and how much.
> Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention?
Yes, the dopamine feedback loop of endless short-form scrolling has a significantly different effect on the brain's reward system. And in line with the idea that not everyone should need a PhD, you also need people to be able to trust the conclusions of experts.
> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?
It's not as linear a distinction. We don't have to draw the line where we stop today; it's perfectly fine to iterate and reevaluate. Endless-scroll algorithms backed by huge content pools are, without a doubt, addictive. Where's the line on cigarettes, or now vapes? Surely they should be endlessly available to children, because where do you draw the line?
(It's mental health; cigarettes and alcohol are bad for physical health, but no one (rhetorically speaking) gives a shit about mental health.)
> If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming,
I'd love to ban micro transactions and loot boxes (gambling games) for children.
> even email notifications.
That's reductio ad absurdum, or perhaps you meant to make a whataboutism argument?
> My concern isn't whether TikTok's format is uniquely dangerous.
Camels and Lucky Strike are both illegal for children to buy.
> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app.
We clearly do need them. Companies are taking advantage of the brain's natural dopamine system for their own advantage, at the expense of the people using their applications. Mental health deserves the same prioritization and protection as physical health. I actually agree with you that banning an activity that doesn't harm others, and is only a risk to yourself, among reasonably educated adults is insanely stupid. But that's not what's happening.
> I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.
I'd rather see companies that use an unfair disparity of power, control, knowledge and data, be punished when they use it to gain an advantage over their consumers. I think dark patterns should be illegal and come with apocalyptic fines. I think tuning your algorithm's recommendation so that you can sell more ads, or one that recommends divisive content because it drives engagement, (again, because ads) should be heavily taxed, or fined so that the government has the funding to provide an equally effective source of information or transparency.
> You can't legislate intelligence...
You equally can't demand that everyone know exactly why every flavor of snake oil is dangerous, and you should punish those who try to pretend it's safe.
Especially when there's an executive in some part of the building trying to figure out how to get more children using it.
The distinction requiring intervention isn't that these companies exist. Intervention is required because a company has hired someone whose job is to convince children to use something the company knows is addictive.
Apples to oranges.
I can’t make meth in my basement as a precursor to some other drug then complain that my target product had a shitty design.
Real life experience shows that TikTok is harmfully addictive and therefore it must be controlled to prevent negative social outcomes. It’s not rocket science, we have to be pragmatic based on real life experience, not theory.
In short, banning hard drugs is very very obviously a losing policy that serves only to enrich the world's worst people at the expense of everyone else.
Is this a serious question? Have you been asleep since 70s and are not aware on how the War on Drugs has been going?
The science tends to back these ideas up. Banning does not stop people from doing what they want.
Education and guard rails are always better than hard control.
They haven’t concluded anything yet. It’s early, and they’re opening the process of having TikTok engage and respond.
The article starts with a headline that makes it sound like the conclusion has already been made; the more you read, the clearer it becomes that this is the early part of an investigation, not an actual decision.
> Now European Union regulators say those same features that made TikTok so successful are likely illegal.
> No timeline was given on when authorities will make a final decision in the case.
> At this stage, the Commission considers that TikTok needs to change the basic design of its service. For instance, by disabling key addictive features such as ‘infinite scroll' over time, implementing effective ‘screen time breaks', including during the night, and adapting its recommender system.
Most of these seem concretely doable, and maybe effective. But the core of the addictiveness comes from the "recommender system", and what are they supposed to do there? Start recommending worse content? How much worse do the recommendations have to be before the EC is satisfied?
I agree with you, this is rather odd. And sort of missing the problem.
All apps are about attention. The percentage of time spent in the app when it shows you your good content (whatever it is you're interested in) determines how addictive it is. And the percentage of time it shows you bad content (ads, 'screen time breaks', manual scroll time, more ads, loading screens, sponsor ads, filler content (YouTube, for instance, is full of this), etc.) counteracts the addictive properties, because nobody likes it.
What's the end goal here? Right now TikTok is winning the attention-economy race against the other apps because it's more focused on the user's preferred content. Is that what we want to reduce? To show more uninteresting stuff on the screen? Like blank 'wait 5 minutes' screens? Or just more ads?
I get that we don't want a generation of socially inept phone addicts, but this won't solve anything I fear. People will still want the good content, forcing the most customer friendly (it feels wrong to say that about TikTok) app to become more enshittified is a bewildering solution.
I think it is, but it's hard for me to articulate without getting into teleological judgments.
Or in the gym, where they block machines for many minutes, far more than the one or two minutes of rest between sets, while paging through social media. Some are only reachable if you stand in front of them and wave your hands.
And walking the dog, or strolling with kids, while on "social" media. I often observe that they don't even notice when the dog or the kid tries to show them something. I sometimes wonder (aloud and near them ;-) whether they're at least on the phone with their companions.
Oldie but goldie: Charlene deGuzman's video "I Forgot My Phone" from 12 years ago:
https://youtube.com/watch?v=OINa46HeWg8
I like music and I like videos, but I also learned to concentrate on the task at hand and/or the people beside me.
Disclaimer: listening to music while doing chores like washing dishes is OK. But I prefer a dishwasher, and I connect with people while the dishwasher is running.
That's what I was thinking about when I mentioned teleological arguments. A stream is programmed by somebody else and who knows if they are trying to please me or their partners. I do use music streaming services, but these days I try to listen to entire albums.
I get what you are saying about wearing headphones in public places. I have ear buds that have a fantastic transparent mode where I get a mix of music and outside noises sent to my ears. As soon as I start talking, it pauses the music. In theory, you would be able to ask me to press the elevator button for you but having ear buds in usually communicates do-not-disturb.
That video is great and I hadn't seen it before. Thanks for linking it.
That's one problem, yes. The other, more subtle, might be that one cannot really develop a personal taste. If you have a CD (or nowadays Vinyl ;-) you can listen to it even when the artist isn't in the stream any more.
I'm a fan of J. J. Cale's music, for example, and have a number of his CDs (ripped for convenient mobile handling, of course), so I can mix my own "stream" to take with me and listen to when I'm in the mood. I'm a fan of Bach, Händel, and Telemann too, own a number of their CDs, and when I'm in the mood for a relaxing bit of classical music I can "stream" my own selection. So I decide what to listen to, and when, depending on my mood.
Just some days ago I learned that many people sell their CD collections, and you can find them in cheap batches on eBay. When I suddenly remember a long-forgotten artist (forgotten by me as time goes by), I'll be able to grab a CD, rip it, and listen to things I remember. Doing that with a streaming service? A tough thing, I suppose.
I do listen to music that's new to me (mainly on YouTube) every now and then, and learn about artists I didn't know, but if I really like enough of their work, I'll get a CD. Which, BTW, is not always easy for certain niche artists, who nowadays may publish only a limited vinyl run and/or downloadable collections.
I feel like with TikTok etc. it's really just that your entire attention, both audio and visual, is stuck in that thing; it's not an auxiliary activity.
If you think about cognitive load, then I would say yes.
Listening to music or even talking on headphones does consume some of your brain power, but you are still able to execute physical tasks reasonably well. For example, I am able to do DIY (apart from measuring) while listening to audiobooks. I can do all the household chores too (washing up, laundry, tidying, vacuuming, etc.).
I cannot do that with short videos playing: firstly, I have to hold the device; secondly, I'm not looking at what I am doing; thirdly, moving pictures attract my attention.
In the same way, most people are utterly unable to do "thinking work" (i.e. stuff that requires inner monologue and visualisation [sorry, aphantasia people]) with a TV within visual range. I know some people can do ironing in front of the TV, but I'd struggle to do a good job of that.
This caused me to disable the YouTube app (you literally can't uninstall it on stock Pixel OS), and if I ever use YouTube on my phone it's through Firefox instead.
I also got the Unhook extension on my desktop/laptop, and now my YouTube experience is more reminiscent of the early 2010s, where I would just use it to look up sports highlights or music videos; if I don't have a video or subject in mind, I'm not force-fed one.
This also just kind of shows me how terrible the search experience is on YouTube. It feels like all of their effort goes into doomscroll/suggested content rather than search results.
However, I must say that YouTube Shorts is the worst of the bunch. Even when I'm trying to be entertained, it's full of slop spam and "top 5" stuff I'm not interested in, while Reels are actually funny.
I remember I'd sometimes try to get into it, scrolling just to see if I could find one thing that's actually good, and quitting because I got frustrated.
It's truly the worst of the bunch, in my opinion.
And they've definitely made the overall YouTube experience worse while focusing all their efforts on Shorts and funneling you to it.
Also, Unhook for removing suggestions/comments/etc. from YouTube: you can basically turn everything off until it becomes a search bar and your subscriptions.
Get a good website blocking browser extension. Remove anything that resembles a "recommendation" or avoid it like the plague.
Maybe it just has to run its course.
But beyond that, the most compelling content was probably the best all time videos which I’ve exhausted. Plus half the videos now seem to cut off before they answer whatever question they posed. Very frustrating.
I'm only back on social media because it actually made my life worse being off it.
I landed on YouTube shorts once and started scrolling. Hours later I genuinely felt like I’d been drugged. It was shocking and surreal how powerful the effect was. Made it a point since then to never go there. I’ve never touched TikTok but I’ve heard stories of people spending every waking second on that thing.
Obviously some people are going to be more prone to it than others.
I may be similarly wired, and I've found abandoning Duolingo streaks on my own terms to be very rewarding.
Their lessons aren't bad because they want to stop you from being proficient in the language; they're just uninspired and unchallenging. Their gamification is nonsense and totally non-addictive. No one is addicted to Duolingo, otherwise they'd be doing hours of lessons every day.
People just don't want to break their streak - that's the reason they continue to use it. It's an obligatory thing you do once a day, it takes 2 minutes, and they get to show you an ad.
I've used it for a couple years learning Spanish, essentially because it introduces me to new words I'm otherwise not encountering in my regular Spanish usage, and that's all I need it for. Duolingo actually used to be better, and I was paying for it for a couple years. But they did a giant AI overhaul last year that made the content worse overnight. The stories are regularly nonsense because they're LLM-generated and seemingly not vetted properly. And they somehow even broke the TTS which hasn't been able to say certain consonant sounds for months now. But I digress.
> They reset your cleared lessons and require you to redo them if they add new vocab to them
The same would be true if that case was never considered, or postponed, during development.
I tinkered with my own toy learning platform; I too found the question of how to deal with added content to an already-completed lesson, and the answer is that there is no easy answer. Every solution sucks in a way.
> as well as randomly clearing them in the name of making you practice them again
Anki does the same, calls it "spaced repetition" and says it's a feature. Should we ban Anki now?
I switched my launcher so I could customize the icon, but Duolingo overwrites it.
This is not a toggle feature.
Damn them, so it's gone now.
They don't need to design for that. If you want to become proficient in the language, you'll have to use the language for something. Whatever lessons Duolingo provides, they won't get you to become proficient in a language.
That bit isn't that difficult or new. The special sauce is the editorialising and content categorisation: being able to accurately categorise videos into genres, subjects, and sub-subjects (e.g. makeup video, 25-year-olds, woman, straight, New York, eye glitter) and then creating a graph of which persona likes what.
The second secret sauce is people going through, finding stuff, and promoting it. TikTok used to editorialise and pay highly for content.
How is that any different to Facebook?
In addition to TikTok, the social media company Meta, Facebook's parent company, is also under the investigation.
https://ec.europa.eu/commission/presscorner/detail/en/ip_24_...
Quoting: >The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'. In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.
And before someone mentions the other? X, the everything app formerly known as Twitter, is also under the Commission's scrutiny. It was fined approximately 120 million euro at the end of last year.
Once the website is designated as such, you're looked at with more scrutiny, have to comply to higher standards, and the exact remediation steps are decided on a case-by-case basis. All of the cases are chugging along, but not all of them are on the same stage.
If your website is not popular enough to be designated as VLOP, this law basically doesn't exist. It's not like GDPR in a sense that it defines some things everyone has to follow, regardless of your audience size.
EU laws are slow, sometimes stupid, but consistent.
The EU isn't a federal government. The UK, when it was in the EU, did a full indoor smoking ban, and tightened it after leaving.
It did, however, have a massive problem with binge drinking and didn't do much to stop that, apart from making it more expensive.
The Netherlands has a smoking ban, but it was brought in later (I think). They had a different drinking culture, so they didn't have the same issues as the UK with drink.
That kind of issue is usually left to member states.
Packaging however is more the EU's purview
I hope I don’t have to go out of my way to explain the analogy.
Like, where were they years ago saying “hey TikTok, we think your design is addictive and probably illegal, you need to change or face penalties.” If TikTok continues to operate in the same manner despite a warning, sure, throw the book at them. Otherwise it just seems like the EU waits for years and years until a company is a big enough player and then retroactively decides they’ve been breaking the law for years. Doesn’t help the impression that they’re running a non-EU tech company shakedown campaign.
TikTok spends a lot of money talking to EU regulators. They know what's coming down the track, because these directives have to be put into law by EU members; that takes time.
> Doesn’t help the impression that they’re running a non-EU tech company shakedown campaign.
But that's not the point; companies shouldn't be doing stuff they know is harmful. That's literally the point of regulation.
That is basically what happened today. No penalties have been issued at this point.
Also, the Commission sent various requests for information to TikTok in 2023 before opening these proceedings in early 2024 (https://ec.europa.eu/commission/presscorner/detail/en/ip_24_...) - this didn't come out of the blue.
Lol. It's never like this.
These companies are given plenty of warnings and deadlines. After years and years of ignoring them these companies get slapped with a fine and start playing the victim.
BTW, at this point the DSA has been in effect for three years.
The answer is "Yes".
If you now think "they have to start somewhere in prosecuting these violations", you're partly correct but also partly mistaken. Sure, they have to start somewhere, but they could - and if they are really serious about their claims, should - have started prosecuting all those other companies which did this way before TikTok or even its predecessor Musical.ly was a thing. Algorithm-driven endless-scroll designs to keep users' eyes glued to the screen have been a thing from very early on in nearly all 'social' app-site-things, and the warning signs about addictive behaviour in users have been out for many years without the law being thrown at the proprietors of those entities. As to why this has not happened, I'll leave for the reader to decide. There are plenty of other examples to be found in this regard, ranging from the apprehension of the Telegram CEO to the sudden fervour in going after X-formerly-known-as-Twitter, which seem to point at politics being at play in deciding whether a company gets to violate laws without being prosecuted or not.
So what's the solution, you ask? As far as I can see, it is to keep these companies from violating users' rights by keeping them in line regardless of who owns or runs the company, and regardless of whether those owners or proprietors are cooperative on other fronts. Assuming that these laws were written to stem the negative influence these app-things have on their users, they should have gone after many other companies much earlier. Had they done so, it might even have led to TikTok realising that their scheme would not work in the EU. They might not have launched here, or they might have detuned their algorithmic user trap; they might have done many things to negate the negative effects of their product. They might just have decided to skip the whole EU market altogether, like many other companies have done and do. I'd have thought 'good riddance', what about you?
I find Twitter more addictive than TikTok. Should it be forced to make me click "next" before seeing another tweet?
Banning recommendation engines is also incredible. Is it really the EU's case that they're all illegal, from the YouTube recommendation engine to Amazon's "people who bought this also bought" to Twitter's "who to follow"? Is TikTok's just too good?
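For what it's worth, the "people who bought this also bought" style of engine is about the simplest recommender there is - just co-occurrence counts over purchase baskets. A toy sketch (items and baskets entirely made up):

```python
from collections import defaultdict
from itertools import permutations

# Hypothetical purchase baskets
baskets = [
    {"kettle", "teapot"},
    {"kettle", "mug"},
    {"kettle", "teapot", "tea"},
]

# cooccur[a][b] = how often b was bought alongside a
cooccur = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in permutations(basket, 2):
        cooccur[a][b] += 1

def also_bought(item, n=2):
    """Items most frequently bought alongside the given item."""
    return sorted(cooccur[item], key=cooccur[item].get, reverse=True)[:n]

print(also_bought("kettle"))  # "teapot" co-occurs twice, tops the list
```

If this counts as an illegal recommendation engine, the bar is low indeed; presumably the concern is engines optimised for time-on-app, not relevance.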
If infinite scroll is good design.
> we might as well throw every designer in prison
No, we might as well convict every manager/boss who assigns those goals to the designers.
Designers don't dream these patterns up out of thin air; they have incentives to.
It's a good thing, but it's not what the title says it is.
I think short form content especially is basically brain rot, but I also don't know how you ban something simply because it's too good at providing content people enjoy. The result would just be a worse experience across the board, is that a win?
I guess a forced 5s video saying "take a break" after 20 minutes of doomscrolling wouldn't be the end of the world, but truly making it illegal doesn't make sense.
I've never gambled, let alone used a gambling app.
Frank Possemato: How to Live an Analog Life in a Digital World: A Workbook for Living Soulfully in an Age of Overload
How to live an analog life in a digital world | Frank Possemato | TEDxBU https://www.youtube.com/watch?v=WEMffdUgWCk
He does not say stop everything, but instead offers realistic tips to reduce one's dependency; e.g. he suggests taking breaks and training oneself to stay offline for certain intervals (e.g. half an hour, or an hour).
LinkedIn has become such a pit of force-fed self-help vitriol it’s completely lost its purpose.
How is this any different from Reddit? From Instagram? Why single out TikTok?
Applying laws unevenly is a form of discrimination.
I can think of tobacco and other drugs, but that's not really the same. Monopolistic behavior doesn't really fit either. Maybe Kleenex marketing doing so well their name became interchangeable with the word "tissue"?
At the top of the mobile app there’s a “For You” tab and a “Following” tab. You must have been on the “For You” tab.
Switch to the “Following” tab.
If you start scrolling the "For You" tab and do it for half an hour straight, you're basically signaling that this is the content you wanted to see, and you will continue getting more of it.
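Mechanically, that signal is just implicit-feedback weighting. A toy sketch of the loop (categories, thresholds and multipliers are all invented for illustration): watch time bumps a category's score, quick skips decay it, and the feed serves whatever scores highest.

```python
# Hypothetical per-user category scores
scores = {"cooking": 1.0, "dance": 1.0, "politics": 1.0}

def record_view(category, watch_seconds, video_length):
    """Update scores from one view: completion ratio is the implicit signal."""
    completion = watch_seconds / video_length
    if completion > 0.8:    # watched most of it: strong positive signal
        scores[category] *= 1.5
    elif completion < 0.2:  # skipped quickly: negative signal
        scores[category] *= 0.7

# Half an hour of watching dance videos nearly to the end...
for _ in range(30):
    record_view("dance", watch_seconds=28, video_length=30)

# ...and "dance" now dominates what the feed serves next.
print(max(scores, key=scores.get))  # dance
```

Every view feeds the next recommendation, which is why a single half-hour session can reshape the whole feed.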
Which country, or countries are you talking about? Are you including the UK?
Unlike the States, with one language, we have many.
But we still don't let liquor stores sell to kids. We still criminalize a lot of drug use. And while there are tons of different opinions about whether specific instances of those restrictions are appropriate, pretty much everyone agrees that there are qualitative differences between predatory behavior-influencing and bad choices.
It's a question about where to move lines that society already broadly agreed to put in place, not about whether to have lines at all a la "well you might as well just make bad choices illegal then". We already do that, and it succeeds at mitigating harm in many (not all) cases.
It used to be understood within hacker culture that government influence over speech is never good. For some reason when it comes to social media we're suddenly willing to throw the baby out with the bathwater.
Even banning algorithmic feeds is a problem. Do those feeds push harmful and extremist content? Yes. But they push everything else as well. Making it more difficult to find related content of any kind is a kind of censorship.
"Government" isn't an external actor; most governments exist on the spectrum between "a big chunk of the public is OK with or supports what they're doing" and "directly influenced by public opinion" (democratic states).
And yeah, they abuse power a lot! So do corporations, tribes, religions, etc.--governments are just the "big group with violence capability and power" we put in that role during this chapter in history. There's no magical "autonomy is theoretically possible and therefore it's OK" line between letting a corporation that everyone is hooked on control what content people see, and letting a state restrict what content can be shown. The technicality that "people could choose to watch something else" for the corporation is just that--a technicality, and just as specious as the "if you don't like it here, then you can move" argument against participation in a state.
> This is a fundamentally different argument than with liquor, or cigarettes, because those don't intersect with fundamental human rights.
Well, liquor quite famously is considered as something to be protected in the U.S.; we had widespread civil unrest about removing legal restrictions on it! As for cigarettes: what's different about my right to express myself in speech and my right to put what I want in my body? Aren't both protections trying to draw a line between preserving autonomy and preventing harm? That's not whataboutism--I see that as a very similar regulatory space: personal choice and trust versus behavioral likelihood/distrust + negative externalities.
> It used to be understood within hacker culture that government influence over speech is never good.
It used to be felt within hacker counterculture that some government influence over speech was bad. Then counterculture expanded into a regular subculture (whether you call it eternal september or just popularity). Some popular opinions about speech restrictions changed (whether you call it orchestrated frog-boiling or just shifting opinions). And even in the '70s-'80s, few hackers believed that a free-speech-absolutist position would scale.
We also heavily restricted speech then; take off the rose-tinted history glasses. Broadcast media restrictions were insane. Things like the MPAA had widespread public support. Pearl-clutching was at a high level. Hell, the CDA in many ways had more teeth to restrict outlets back then, and a centralized social network sure looks a lot more like an outlet than a Usenet server does. I'm as thankful for section 230 as the next person, but even I have to admit that it is looking more and more like a technicality-shield every day.
Like, you and I probably agree extensively on the specifics of what counts as government overreach and the strong need to protect against that. I just don't think the way to do that is to stake out deliberately absolutist positions--either because you think an absolutist outcome is good or out of a flawed belief that an extreme position will somehow help move the consensus position in the direction of the extreme.
And yet there's no Constitutional right to liquor, nor is access to liquor generally recognized as a fundamental human right. The civil unrest was due to the obvious result of banning liquor being the creation of mafia-run black markets. Same as the "war on drugs." Banning drugs only makes black markets and cartels more effective.
>As for cigarettes: what's different about my right to express myself in speech and my right to put what I want in my body?
Cigarettes aren't speech. Speech has value, even if it can do harm. Social media, being a means of effecting speech, has value even if it can do harm. Cigarettes have no value and can only do harm.
>We also heavily restricted speech then; take off the rose-tinted history glasses. Broadcast media restrictions were insane.
Sure. The argument for regulating broadcast tv and radio was the spectrum being a limited resource - but the web is not a limited resource. No matter how big Facebook or Twitter get, we're not going to run out of internets for new platforms.
>I just don't think the way to do that is to stake out deliberately absolutist positions--either because you think an absolutist outcome is good or out of a flawed belief that an extreme position will somehow help move the consensus position in the direction of the extreme.
I don't believe my position is necessarily absolutist - although it gets interpreted as such - I believe that social media platforms have the right and the moral duty to police themselves and deplatform dangerous and extremist content. Free speech doesn't mean freedom from consequence nor does it oblige you a platform. I just don't believe that having governments step into that role is a good idea, and I think recent history in the US and UK back me up in that regard.
And yes, social media platforms may (and will) get things wrong, but they can't send men with guns to shoot me in the head.
But staking out extreme positions to the contrary is sometimes necessary when confronted with extreme positions. I consider the position that social media is more addictive and dangerous than heroin to be extremist, that algorithms need to be banned, that social media platforms need to be nationalized, all of that hyperbole is getting ridiculous to me, and it smacks of a moral panic. But almost no one seems willing to push against it or even question it.
But this really just stinks of regulatory capture to me. Their main argument is that consumers like to use the app too much?
Why? Because it's smarter and not as enshittified as the competitors?
I'm sure if YouTube, Facebook, Reddit, etc. reduced the number of ads and started showing more relevant content that people actually cared about, they too would start being "more addictive". Do we really want to punish that?
What's the end goal here?
What makes TikTok different?
I can watch a 9 hour video on GTA games without problems (not in one sitting, but in parts), but 3 'shorts' in a row with not enough info and explanation to be interesting makes me close any of the 'shorts' apps (tiktok, youtube shorts, instagram....).
(eg, the 9 hour video: https://www.youtube.com/watch?v=Faxpr_3EBDk )
There’s clear scientific evidence that these shorts trigger addiction-like behavior[1]. The detrimental effects on a kid’s brain development can be inferred[2]. A reasonable argument could made that it’s not so different from things like nicotine, alcohol or other drugs when it comes to child brain development. I believe these companies know this and willfully push it on kids anyway.
Edit: And I think it’s really telling that China has some of the strictest state-led anti-addiction and youth protection policies globally[3].
[1]https://www.sciencedirect.com/science/article/pii/S105381192...
[2]https://www.sciencedirect.com/science/article/pii/S105381192...
[3] https://cjil.uchicago.edu/print-archive/kids-no-phones-dinne...
Of course, there are other issues instead.
At that point it was a game of "I'm not slandering you" to chip away at every other valuation, that could have easily have just been called antitrust because they didn't build it. That was 1996-2005 and went completely unchecked.
This is similar but the stack was even cheaper, and closer to more people's faces.
Even if governments take no recourse, I don't see an issue with government using its position to put a food pyramid in citizens' faces to say, like, "this can be harmful." The church probably would have, if this were long ago - except, instead of fire and brimstone, some sort of epic story of social isolation, permanent dissatisfaction and self-imposed constraints, alien abduction, transformation into a pig by a wizard?
There's probably a lot of visceral fears that would be worthy analogs to the harms of the feed.
I don't think that this narrative has been explored enough, honestly. Corps keep building crap like this, even amazon has (had?) an influencer feed.
People who are in play/leisure should probably practice tolerating more choices than "express mild, momentary dissatisfaction and receive an instantaneous reward"... that's probably not a life everyone should be trained to live
Unregulated social media is digital heroin. And allowing it to be run by billionaires with thinly-veiled agendas is like cutting the heroin with rat poison.
It's fine to disagree with the EU's stance (I probably do. I'm not sure yet) but it's not a good look to dismiss it without some recognition that a reasonable person might think this is a worthwhile position to take given the known harms of social media.