216 points by andsoitis 8 hours ago | 26 comments
  • beloch an hour ago
    How much time you spend on something has become a metric of success in both gaming and social media.

    I occasionally play a perpetually-in-alpha AAA+ game (I won't name it to avoid the flames) that recently asked users to fill out a questionnaire. At no point did it ask how they could make my time spent in the game more fun or awesome. They did explicitly ask, "What can we do to make you spend more time in game?". The focus was clearly on quantity, not quality. This made me realize that, perhaps, I should stop playing this game.

    Social media and games use all sorts of dark patterns and engagement bait to keep you clicking, but no concern is given to giving back. There is a complete absence of awareness that the best forms of entertainment enrich and then end. If they were to provide an amazing but brief experience that changes regularly, people would come back again and again. They don't need to spend hours on it every single day to feel they're getting value and justify opening their wallets. Doom-scrolling and spending excessive time grinding in games will only make you feel stressed out and unfulfilled. Customers need to realize this and start voting with their wallets for experiences that end.

    We need to turn things around and say, "The light that burns half as long burns twice as bright!"

  • noosphr 3 hours ago
    I'm so happy The Economist is covering this important topic.

    Now if only the dick heads running this complete rag could listen to the wonderful people who wrote that enlightened piece and let users unsubscribe: https://www.reddit.com/r/assholedesign/comments/rli0u9/how_t...

    • bloqs 36 minutes ago
      Jeez. I actually stopped short of reading the article because of this screenie
  • djoldman 22 minutes ago
    From the corporate POV, the lesson remains: never ever ever conduct research that could lead to a conclusion that your product/service can harm in any way, unless you know how and intend to fix/change it.

    The most important evidence was just internal research saying exactly what the plaintiffs wanted.

  • ngriffiths 7 hours ago
    Sure, making Instagram as addictive as possible seems bad, but I disagree with the framing a bit. Dark patterns get users to do things they don't want; that's why they get super annoyed at the design or the process or the outcome. Addictive apps are a different thing to me.

    I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do! The question is can you make a compelling case that spending time on it is harmful.

    • dd8601fn 7 hours ago
      This reminds me of the TikTok ban that lasted all of twelve seconds.

      I’ve been using the internet for longer than I care to admit, and I’ve never seen anything like it.

      It was like 300 million junkies all lost their drug supplier at the same time.

      • advisedwang 3 hours ago
        > the TikTok ban that lasted all of twelve seconds

        The TikTok ban successfully forced the sale of the US TikTok operations. I wouldn't be so dismissive of it.

      • Terr_ 5 hours ago
        > This reminds me of the TikTok ban that lasted all of twelve seconds.

        That timeline has way more to do with the corrupt politicians than consumer behavior.

        _______________

        Both in the sense that the original semi-bipartisan law should've been ruled unconstitutional [0], and also in how the Republican party turned around and broke portions of that law for months until Trump could ensure the assets were handed to his major donor buddy--and fixing none of the original PRC influence issues. [1]

        [0] https://www.aclu.org/news/national-security/banning-tiktok-i...

        [1] https://www.techdirt.com/2025/12/19/tiktok-deal-done-and-its...

        • xg15 4 hours ago
          I found it interesting that Congress never took issue with any other social media platform, and was fine with TikTok once again after it was sold to an American owner.

          So it looks like politicians never had a problem with the addictiveness of social media; they only had a problem when it was used by foreign adversaries rather than by domestic companies...

          • XorNot 2 hours ago
            Absolutely no one was running it on "social media is harmful". The policy was overtly that it had to be American owned.
        • NickC25 4 hours ago
          >and also in how the Republican party turned around and broke portions of that law for months until Trump could ensure the assets were handed to his major donor buddy--and fixing none of the original PRC influence issues. [1]

          Are you even remotely surprised by that? Honestly.

      • thinkingtoilet 6 hours ago
        That's literally what it was. These technologies are addictive. Is it as bad or the same as heroin? No. However, they are designed to be addictive.
        • cmoski 6 hours ago
          Not as good as heroin either.
        • shimman 6 hours ago
          Well, seeing as we are all granted one single life, maybe we should be more upset at things that take away our valuable time and replace it with things that make us angry? Who's to say that these things aren't worse than heroin? Lots of people would argue otherwise; I'm becoming one of them myself. Heroin only impacts one individual; social media impacts every connected person on the planet.

          Mass misery is still misery.

          • bheadmaster 5 hours ago
            We should, but we also shouldn't decide what other people consider a proper use of their time.
            • nathan_compton 5 hours ago
              I don't think this is obvious at all. I think it's a reasonable function of the state to pursue policies that improve the mental and physical health of its citizens, partly because the negative effects of an unhealthy population are not limited to the individuals who are unhealthy. Liberty is great. I wouldn't want to live anywhere where it wasn't one of the primary goals of a society, but there is no stone tablet from God saying it needs to be the only goal a society can set.
              • bheadmaster 4 hours ago
                When you say "a society" sets a goal, it always implies a ruling group of people imposing their view of the common good on everyone.

                How do you make sure that whoever makes that choice makes it in a way you yourself will agree with?

                • fireflash38 2 hours ago
                  Do you seriously believe that is not happening now? Or that even a libertarian utopia could manage to achieve agreement?

                  If you're going to get philosophical, go all the way. Why have society at all, if it's just people imposing their will on others? Or do you at least agree that there exists a line?

                  • sokoloff an hour ago
                    Even though there clearly must be a line on some topics, many people think those lines should be placed to minimize the number of times people are forced to do something (or prevented from doing something) against their will.

                    It’s not at all obvious that “adults can’t have TikTok” is anywhere near the correct side of that line.

                • nathan_compton 2 hours ago
                  I think a mature person accepts some compromise with society at large. How do you make sure your wife always wants to do what you want? You don't. You live with other people, depend on them, pay for them when they are sick or poor (one way or another). You can't escape society. All that the libertarian view appears to do is make everyone miserable with externalities that a properly functioning state would regulate out of existence.

                  People's lives are ruined by gambling all the time, for instance. It is dumb to pretend like the pleasure a few people get out of it is worth someone betting away his family's welfare. It is ok to just decide "this needs to be regulated." Not everything is some intractable philosophical mystery that no consensus will ever coalesce around. Not every single thing every single person wants needs to be taken seriously.

      • pembrook 5 hours ago
        > It was like 300 million junkies all lost their drug supplier at the same time.

        No, it was not. It was actually nothing like that.

        No babies were left to die because their parents were out searching for TikTok clips. I saw no people whoring themselves on the street just to see a few TikTok clips. I heard no stories of children stealing from their own family to get a few scrolls of TikTok. There were no people killing each other just to get a hit of TikTok.

        Let's not trivialize something like drug addiction by comparing it to kids procrastinating by watching their TV phone app.

        • rjbwork 3 hours ago
          I think you're probably vastly discounting the amount of childhood neglect wrought by social media addiction on both the parent and child's parts.
          • pembrook 18 minutes ago
            No, I'm not.

            The median child of a social media user (so basically, the median child) is vastly more well off than the median child of a heroin/crack cocaine user, and it's not even close.

            The fact you're suggesting otherwise is quite frankly hilarious.

            Glad I could draw attention to the irrational logic of the current "social media is evil" moral panic.

        • ori_b 3 hours ago
          True facts. My friend spent an entire day without weed once, and he killed 3 people, and abandoned seven babies. Four of which weren't even his!
          • pembrook 22 minutes ago
            Not sure if you're aware opioids and crack exist.
    • funimpoded 2 hours ago
      > I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do! The question is can you make a compelling case that spending time on it is harmful.

      I want to follow news and deals from a handful of vendors and local businesses I like a lot. The best way to do that is following them on instagram. It’s the only reason I signed up and installed the app. If it’d been one or two, I’d not have bothered, but it’s that way for lots of them.

      I never want to see the “feed”. I would disable it if I could. I would make it default to my “following” view if I could. Instagram so very much wants me not to do that that they went out of their way to make it impossible, even with iOS's built-in shortcut-like system (you used to be able to).

      As a result, sometimes I get distracted by one or two of the top items on the feed. That doesn’t mean I actually want to see them. That I open the app once every couple days doesn’t mean I like the app. I think it’s terrible.

      People taking what folks do with a sharply constrained set of options as an expression of “what they want” or revealed preference or whatever is frustratingly wrong.

    • throwaway27448 6 hours ago
      > I don't think it's that compelling to say "obviously no one wants to be on Instagram and they're getting manipulated into it." ...yeah they do!

      I can't say I know anyone who defends extended social media usage. Do you?

    • InsideOutSanta 5 hours ago
      I absolutely did not want to go to these websites and did it anyway. I ended up blocking them in my hosts file to get me to stop.
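The hosts-file trick described above can be sketched roughly as follows. This is a minimal illustration, not a complete blocklist: `hosts.demo` is a hypothetical stand-in path so the example is safe to run, and the domain list is just an example. On a real machine the target would be the system hosts file (e.g. /etc/hosts on Linux/macOS, which requires admin rights), and each platform uses many more domains than shown here.

```python
# Minimal sketch of blocking sites via the hosts file.
# "hosts.demo" is a stand-in for the real hosts file (e.g. /etc/hosts),
# used here so the example is safe to run anywhere.
BLOCKED = ["instagram.com", "www.instagram.com",
           "facebook.com", "www.facebook.com"]
HOSTS_PATH = "hosts.demo"  # swap for "/etc/hosts" on a real machine

with open(HOSTS_PATH, "a") as hosts:
    for domain in BLOCKED:
        # Point each domain at the loopback address so it never
        # resolves to the real server.
        hosts.write(f"127.0.0.1 {domain}\n")
```

Once the entries are in the real hosts file (and the DNS cache is flushed on some systems), requests to those domains resolve to 127.0.0.1 and fail, which adds just enough friction to break the reflexive visit.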
      • thedougd 5 hours ago
        It's wild. I reinstalled Facebook to sell some things on Marketplace. Thirty minutes later I'm doom scrolling through shit I wouldn't have sought out. I uninstall the app after I no longer have items to sell.
        • ngriffiths 4 hours ago
          Good point. You sort of have a purpose for opening it up, then you get distracted, or fired up or whatever, because the app just unloads tons of information at you.

          I sort of claimed that everyone enjoys it when they use these apps, maybe it's better to say they are likely getting something out of it in that moment. This could be kind of a bad deal - people make bad deals, and repeat old ones all the time. Other times they delete the app once they realize it.

    • rconti 5 hours ago
      What we've lost in social media just makes me so sad. I hate that reels/stories have become the "new" way of sharing things (over the past 10 years).

      I took a trip to Yosemite last weekend and took the (rare) opportunity to post a reel. All of the comments and reactions are DMs. It feels so lonely and weird and isolating. Who asked for this?

      I miss the days where you shared things, and people actually commented on them and interacted with each other as well as the poster. And where it wasn't ephemeral.

    • wormy745 6 hours ago
      Do you think Instagram/Facebook is a wolf in sheep’s clothing, or a sheep with fangs?

      By that I mean: is the product addiction, with a shroud of media, or is it media which just happens to be addictive?

      • thewebguyd 6 hours ago
        They are the wolf. The product is the user's attention, they are ad delivery networks disguised as "social media."

        The entire revenue model is based on engagement and clicks, so the product is incentivized to maximize time spent on the service at any cost. Addiction is a core engineering requirement.

      • micromacrofoot 6 hours ago
        they know what they're doing, they've tried to bury the evidence but their own internal studies have shown addiction and harmful psychological effects in children

        facebook in the past has done tests of emotional manipulation on their users without informing them

        they're rotten from the head down

      • fsflover 6 hours ago
        > is the product addiction, with a shroud of media, or is it media which just happens to be addictive.

        It's the former, by design:

        https://news.ycombinator.com/item?id=24579498

        https://news.ycombinator.com/item?id=26846784

      • altmanaltman 3 hours ago
        They are the shepherd, we are the sheep, and ads/media are the wolves that also have a deal with the shepherd.
  • sunandsurf 4 hours ago
    IMO recommender algorithms and other dark patterns like infinite scroll should be turned off BY DEFAULT on these apps. That way those people who want a dose of brainrot still have the option to do so but most of them get a little help to turn away from screens (I never heard anybody say they want to spend more time on social media).

    I've written more about this here: https://klemenvodopivec.substack.com/p/recommender-systems-n...

  • Animats 5 hours ago
    There are two separate issues: addictive technologies and mandated technologies. Instagram and TikTok are examples of the first. The Google Play Store and Microsoft 365 are examples of the second. The second is more of a concern than the first.

    Google Chrome is trying hard to become a mandated technology, but hasn't quite succeeded yet.

  • Seattle3503 4 hours ago
    > The burden of proof should fall on the platform, not the victim. The question is not whether a harmed user can show specific damage. The question is whether the company can show, before rolling a product out to billions of people, that it is not predatory by design.

    That's asking every company to prove a negative before rolling out new features.

    Could we instead have a regulatory agency that keeps an eye on dark patterns and deals with them as evidence emerges that something is harmful?

    • Pet_Ant 3 hours ago
      > That's asking every company to prove a negative before rolling out new features.

      That's not as ridiculous as it seems. It's sort of the model that drug manufacturers follow. It would also mean that if they see troubling behaviour internally, they know they have to stop.

      Practically, it would invite corporate cover-ups. And applied earnestly it would make these businesses unviable.

  • ajb 3 hours ago
    What's ironic is that originally one of the advantages of automation was that it was more impartial than human-delivered services. Strowger, the inventor of the automated telephone exchange, designed it because he was concerned that the local telephone operators were directing his calls to a competitor. We had several decades during which machines had only very limited decision-making ability, and so their ability to manipulate or discriminate was minimal. That's gone. It went years ago, but it's taken a while for the public's intuition to catch up. People are starting to get angry, but are still somewhat baffled. Industry believes it can continue to get away with it since it has done so for 10-20 years, but I think this underestimates how strong the backlash can get.
  • securicat 6 hours ago
    It takes five minutes to delete your TikTok, Meta, and Instagram accounts. Setting up forwarding rules from Gmail to Fastmail or another provider takes maybe a little longer; after three months of updating your accounts, hopefully all your email is going to the new address. These companies can't manipulate you if you don't use their products.

    Edit: I know what network effects are; I was talking about steps individual users can (and should, IMO) take. We should be helping our friends, family and neighbors find safe and healthy alternatives like Signal for comms. Build different networks that are actually social and not doomscrolling.

    • dakiol 6 hours ago
      The same can be said about Claude, Codex, etc. These tools are amazing (technically speaking), but they don't play in our favor (most of us are regular, replaceable employees). Only the usual suspects benefit from AI (the executive layer, investors, etc.).

      It still amazes me how engineers on HN are in awe of AI and LLMs knowing that 90% of us will be affected (we won't be able to bring money to the table) once the higher-ups further normalize the use of AI to reduce headcount. Not everything is about the technical details, people. Grow up.

      • securicat 6 hours ago
        As if Claude and Instagram are remotely similar products. But again, these products make it incredibly easy to cancel. If work requires that you use it, make the next job you get not require it or just use it on the job.
        • dakiol 6 hours ago
          I see engineers addicted to Claude the same way non-tech people (friends of mine) are addicted to Instagram. In the end it's all the same: making multibillion-dollar companies richer every day.
        • AvAn12 5 hours ago
          Both try to maximize engagement. Both are (soon to be) ad-supported. Both are driven by algorithms that show the user what they want to see.
      • fragmede 6 hours ago
        It's an iterated prisoner's dilemma with all the other developers in the world, and some are vocally choosing to defect. The only rational strategy then is to also defect.
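The payoff structure being invoked here can be shown with the standard textbook prisoner's-dilemma numbers (these values are the usual convention, not anything measured about AI adoption):

```python
# Classic prisoner's-dilemma payoffs: (my_move, their_move) -> my payoff.
PAYOFF = {
    ("cooperate", "cooperate"): 3,
    ("cooperate", "defect"):    0,
    ("defect",    "cooperate"): 5,
    ("defect",    "defect"):    1,
}

# Whatever the other side does, check which reply pays more for me.
for their_move in ("cooperate", "defect"):
    best = max(("cooperate", "defect"),
               key=lambda me: PAYOFF[(me, their_move)])
    print(f"If they {their_move}, my best reply is to {best}")
# Defection dominates in the one-shot game, which is the intuition
# behind "the only rational strategy is to also defect".
```

In the iterated version, strategies like tit-for-tat can sustain cooperation, which is why visible defection by some players is what tips the rest.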
        • dakiol 6 hours ago
          Right. It seems then that all these "elite" engineers on HN aren't as smart as we thought (and yeah, I include myself in that bag).

          It's deeply sad to see how our most beloved work (those side projects we pour ourselves into purely for the joy of it) will, in the end, be the very reason most of us lose our jobs (not all of us, but the majority). OpenAI/Anthropic/etc. and others simply took all of that and turned it to their advantage. It's capitalism, sure, but it's heartbreaking... I wouldn't mind being out of a job for another reason, but not for that one, please.

          • apsurd 6 hours ago
            All is not lost though, is it? We can invest our efforts into local models and frontier competitors.

            I'm not blind, I have Claude pro (not max) and Cursor subscription. But I'm really hesitant to go balls to the wall on the most powerful models because it isn't sustainable; I don't want it to be. So how much can I get from the older models, the smaller, cheaper ones that will hopefully inevitably be commoditized. I think the harness improvements are making headway. I continue to think Cursor Composer 2 is more than adequate.

            Then again if one believes it's a race to the singularity, then that's another story. I don't.

            • fragmede 5 hours ago
              Why not?
              • apsurd 5 hours ago
                The most concise answer as of now is because AI has no "will".

                LLMs are objectively smarter than any one person, so by some definition we've already created super-intelligence. The problem is they just sit there. They have all the answers already, if you think about it. Whenever we ask one something it gives us the answer; it's amazing, and we can even say it can synthesize new information. We can agree with all the claims.

                But what does it do with that super-intelligence? Nothing. It can't. It doesn't have will. Or interest. Curiosity? Biological imperative. Who knows.

                So we create loops and introspection and set them free. Does giving AI a goal make the AI conscious? That's easily silly if you ask me.

                (I'm trying really hard not to make this philosophy. I really like the philosophy aspect, but this is my 30 second answer to the question)

                • fragmede 3 hours ago
                  The singularity won't happen because sticking a cron job in front of an LLM and telling it to do something (make money) is "silly"?

                  I am no philosopher but https://poc.bcachefs.org/ seems conscious.

                  • apsurd 2 hours ago
                    It's not.

                    It's no more conscious than running that cron job to send you today's weather. That's as far as I understand what this link is. The agent is posting blog updates and such. Because it was told to. It has no will. LLM generative output is incredible. It's also not conscious.

      • joe_mamba 6 hours ago
        >These tools are amazing (technically speaking) but they don't play in our favor (most of us are regular, replaceable employees).

        I'm a mid programmer at best compared to the top guys in the industry who built stuff like OpenClaw, or those prodigy 16-year-old coders who became millionaires, and yet I don't fear the LLM-assisted coding future. I'm at peace knowing that I will adapt to the LLM programming world using my knowledge in my favor, or adapt to a world where I will no longer be a SW engineer, but something else.

        Also I find it ironic and poetic how some SW devs here want us to rise up and fight LLMs and the companies making them for disrupting this profession, when the SW dev profession was so well paid precisely because the SW products they wrote disrupted other people's professions, moving the savings from labor costs into the pockets of employers, who used SW to optimize processes and repetitive labor and not have to hire as many people, yet they never saw an issue with other people losing their jobs. "Learn to code," eh?

        Oh how the turntables.

        • LPisGood 6 hours ago
          I haven’t looked at OpenClaw but I get the impression anyone could build it. It doesn’t do anything technically impressive, does it?
          • joe_mamba 6 hours ago
            >anyone could build it

            Then why hasn't anyone else done it before?

            With hindsight, it's always easy to say anyone could have done it too, but there's more to product success than just coding and shipping an app out the door.

            The first iPhone was built using COTS(commercial off the shelf) parts that Nokia, Ericsson and Motorola also had access to, and SW tools they also had access to, yet Apple won and buried the other companies because their end-product was way more popular with the customer base. I'm sure engineers from Nokia, Ericsson and Motorola also said "we could have done exactly the same thing with the right leadership" when they saw that.

            I also say "I could have done that" when I see how the maker of Flappy Bird became a multi millionaire, or how any other top 100 AppStore slop app has 100+ million downloads.

            Coding skills are a dime a dozen these days. A lot of people can do 95% of these things now. The differentiator between failure and success comes with the remaining 5%: network effects, market know-how, promotion, timing, outreach, UI, UX, luck, etc.

            • LPisGood 2 hours ago
              I agree it was a good idea and there’s more to product success, but you were specifically talking about coding skill level.

              There are some things I could easily say I (and many others) could not build even in retrospect. SolidWorks, for example, is beyond a lot of people's skill level and very difficult to build.

              Flappy bird and open claw, not so much.

            • gavmor 2 hours ago
              Many people have! Nanoclaw, LocalGPT, Moltis, Thoth, Q-Claw... the list goes on.
            • Dylan16807 4 hours ago
              Well your previous comment sure made it sound like you were talking about level of coding skill.
    • afavour 6 hours ago
      It's frustrating to see this response so often, as if it weren't blindingly obvious.

      After years of near monopoly status these companies have a lock on many people's social lives. To give up Instagram is akin to giving up text messaging. "Just stop using it" isn't helpful advice to those people.

      If Instagram disappeared tomorrow it would be different, because everyone would be in the same position. But preaching personal responsibility in an area subject to network effects doesn't work.

      • securicat 6 hours ago
        Give me a break. No one says “I can’t live without Instagram” literally. There are even studies that show that it makes their users depressed. From inside the company that _makes the product_.

        Now, would it be inconvenient to stop, sure, but people need better self control. Put that cookie down!

        • afavour 6 hours ago
          > No one says “I can’t live without Instagram” literally.

          That's a straw man argument. I never said they were.

          > There are even studies that show that it makes their users depressed.

          What percentage of the population do you think are in the habit of reading academic studies about the effects of the products they use?

          It all feels reminiscent of cigarette smoking. The damage was very well known yet people continued to do it. It took extensive government regulation to wean people off their addiction, not a "buck up, chump" motivational message.

          • pixl97 4 hours ago
            I don't believe the above user is here to have a well thought out discussion, they just want to tell the world how much better they are than the social media addicts.
          • securicat 6 hours ago
            I never said you did say that.
    • toasty228 6 hours ago
      It takes five minutes to just stop being depressed; it takes five minutes to just stop being addicted.

      What works for you, and me actually, doesn't work for most people, humans are complex things

      • dinfinity 5 hours ago
        > It takes five minutes to just stop being depressed, it takes 5 minutes to just stop being addicted

        Would you place all the responsibility of drug addiction on drug dealers?

        Yes, their practices are predatory, but it is essential to remind the addicts that ultimately change comes from within themselves. They need to change something.

      • securicat 6 hours ago
        That's pretty insensitive to people suffering from mental illness. To compare sitting and doomscrolling on social media with something that's chemically out of balance in a person's body is... a choice.
        • peanut_merchant 5 hours ago
          The parent was obviously being sarcastic to prove the point through comparison.
    • idle_zealot 6 hours ago
      You can and should do that, but it's not sufficient to individually avoid harm. You still have to live in a world where most people have their behavior manipulated, and that will impact you. Even from a purely selfish perspective you should support efforts to stop this sort of control broadly with legal action.
      • securicat 6 hours ago
        Fair point, and nothing would make me happier than TikTok and Instagram being shut down, at least for minors.
      • bdangubic 6 hours ago
        Exactly. I did all of OP's suggestions a decade ago (never had TT to begin with) and still live in a sick society surrounded by the influence of these platforms.
    • macintux 6 hours ago
      They're still distorting our political and social worlds, whether we're participants or not.
    • sonicvroooom 6 hours ago
      Yeah, but that's a way they want you to behave in order to set up a control group within the target group that continues to behave as expected. The questions to be answered are not which parts of that control group, and how, nudge which parts of the target group slightly off the predicted and/or confirmed results; they answered that way back when. The question is: how can we react to the unexpected results that we ourselves forced? They can't just go on doing the opposite of what's good for them and bad for the users, or vice versa; they have 50 years of data on that, some of which, it should be noted, was accidentally burned or bombed along with a bunch of incriminating evidence shortly before investigators arrived... which should make even the last sus person understand, it wasn't on purpose.
    • logickkk1 4 hours ago
      "Just delete your account" assumes the exit was designed to work. This is the company that called its own Prime cancellation flow "the Iliad."
  • cortesoft 7 hours ago
    While I agree with the premise, I do wonder how you can write a law that would stop the behavior we want to stop without hurting beneficial features or allowing the law to be too easily bypassed.

    How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

    • ryandrake 6 hours ago
      > How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

      I don't know how you'd write it in a law either, but if you're in a meeting at your tech company, and the product owner or tech lead uses language like "We need to get users to do..." and "We need to incentivize..." and "It should be easy to do X and hard to do Y..." then do whatever is in your power to steer/stop. You're not really building a product users want, you're pushing a behavior-modification scheme onto users.

      • pbasista 5 hours ago
        > It should be easy to do X and hard to do Y

        > you're pushing a behavior-modification scheme onto users

        In general I think that your comment is reasonable. I just would like to point out that such "behavior-modification" schemes are sometimes introduced for genuinely good and ethical reasons.

        For instance, it is in my opinion desirable to make it more difficult for users to delete all their photos by e.g. having to confirm their decision in a dialog first. Because it prevents them from accidentally doing something they might not want to do and which is potentially impossible to revert.

      • cortesoft 5 hours ago
        I feel like they will just frame it differently: “Users aren’t getting the full value from product x, so let’s change the workflow to help enable them to get more value with no additional effort” or “Users are losing out on a ton of value by cancelling their subscriptions without realizing what they are losing out on, so let’s implement feature x to make them less likely to mistakenly cancel”
    • akersten6 hours ago
      > How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

      For laws like this it always boils down to "I'll know it when I see it" which is such a shockingly poor way to write legislation that I'm flabbergasted it doesn't immediately fail any amount of rudimentary scrutiny. Not to mention the latitude it grants for selective enforcement. It's basically Washington asking (through the Economist) for a leash on platforms that host their critics that they can yank at any time the population gets too rowdy, with the convenient justification that the algorithm is too good and our attention spans are in danger or whatever.

    • conductr7 hours ago
      Agree. My first thought is that most people in the early days didn’t even want to start using PCs for work to begin with. The businesses generally had to mandate it. I imagine many people are facing this today with AI.
    • traderj0e6 hours ago
      One way is intent. If a company's internal communications show that they're intentionally making it addictive, or worse they know it causes harm, you have the smoking gun. This of course doesn't catch all the abuse, but at least it makes it much harder to do this down an entire reporting chain. They have to get really good at winking.

      One famous case was Apple suing Samsung over patents. Hard to prove until internal comms surfaced showing intent to copy the iPhone.

      • cortesoft5 hours ago
        Companies are onto this, though, and do training with their staff about how to phrase things in emails to make it look better.
        • traderj0e3 hours ago
          Yeah I've done those trainings. That's expected. Even if people learn to say things without saying them, it's a lot harder to communicate across multiple people. And some people are still loudmouths, like at Samsung evidently.
    • Seattle35032 hours ago
      You create an agency and give it a mandate that requires it to balance concerns.
      • octoberfranklin2 hours ago
        This answer can be applied to pretty much any social question.

        If it were so easy, we'd do this all the time. We already do it a lot, and there are heaps of examples where it goes wrong.

    • general14656 hours ago
      Very simple - force companies into data interoperability. That will allow users to move to competition without any data loss. I.e. nobody actually cares that GitHub is constantly down because you can move your repos to a different git provider or to your own server.
      • Aurornis5 hours ago
        > I.e. nobody actually cares that GitHub is constantly down because you can move your repos to a different git provider or to your own server.

        I honestly can't tell if this is serious or satire, so apologies if I missed the joke.

        Pushing a git repo to a new server is built into git itself.

        Github project data is easy to export: https://docs.github.com/en/issues/planning-and-tracking-with...

        There are import tools for many competing projects that will transfer it over in various ways.
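        For anyone who hasn't done the move: the git half really is just a couple of commands. A minimal local sketch, where bare directories stand in for the hosting providers (with a real host you'd substitute its URL; git@newhost.example:project.git below is a placeholder):

        ```shell
        # A "source" repo with one commit, standing in for the old host.
        git init -q source-repo
        git -C source-repo -c user.email=you@example.com -c user.name=you \
            commit -q --allow-empty -m "first commit"

        # A bare repo standing in for the new hosting provider
        # (for a real host this would be e.g. git@newhost.example:project.git).
        git init -q --bare new-host.git

        # --mirror copies every branch, tag, and ref, not just the current branch.
        git clone -q --mirror source-repo mirror.git
        git -C mirror.git remote set-url origin "$PWD/new-host.git"
        git -C mirror.git push -q --mirror
        ```

        After the push, new-host.git holds the full history, so "moving providers" for the repo itself is just re-pointing origin and pushing; it's the issues/project data that needs the export tools above.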

    • y0eswddl6 hours ago
      dark patterns are pretty well documented and understood at this point. I don't think identifying them is all that hard.

      Infinite scroll is one obvious one. As well as forcing algorithmic feeds of accounts we don't follow.

    • thaumasiotes7 hours ago
      Well, you could look to the gambling market for inspiration and let people voluntarily sign up for a blacklist on that feature.

      That would be a lot of extra work for the platforms, but I think the results would be interesting. It amounts to legislating that certain features have to be optional and configurable.

  • smarm525 hours ago
    The author is a "Master of Laws" (lawyer) writing about technology and psychology. Read with some skepticism.
  • thebeardisred4 hours ago
    > Dopamine neurons respond not to rewards received but to the uncertainty of whether a reward will arrive: the more unpredictable the outcome, the stronger the signal.

    This leads me to think about the idea of procrastination as a mechanism of gambling by the sub-conscious. A subversive way of "raising the stakes on the game" in an attempt to "make things a little bit more interesting."

  • andai5 hours ago
    > SOMEWHERE IN META’S servers sat a slide deck marked “Confidential”. Written in 2019, its conclusion was blunt: “Teens can’t switch off from Instagram even if they want to.”

    Found this document:

    https://www.economist.com/by-invitation/2026/04/29/stop-big-...

    Headlines (quote):

    Instagram is an inevitable and unavoidable component of teens lives. Teens can’t switch off from Instagram even if they want to.

    Instagram has become the ID card of this generation. It is the go-to tool for both measuring and gathering social prestige.

    Instagram sets the standards not only for how teens should look and act but also for how they should think and feel.

    Teens feel themselves to be at the forefront of new social behaviours to which there is no consensus on how to behave or cope. They sorely lack empathetic voices to whom they can turn for support.

    Teens talk of Instagram in terms of an ‘addicts narrative’ spending too much time indulging in a compulsive behaviour that they know is negative but feel powerless to resist.

    The pressure to ‘be present and perfect’ is a defining characteristic of the anxiety teens face around Instagram. This restricts both their ability to be emotionally honest and also to create space for themselves to switch off.

    Anxiety around what to post and the potential cost involved in posting the wrong thing means teens are switching from proactive to passive engagement with the platform.

    • andai2 hours ago
      Kinda sounds like the older generations have abandoned them, and now they settle for IG.
  • mactavish883 hours ago
    Isn't one of the core problems here a lack of "healthier" alternatives?

    (Not only in terms of tech, but also in terms of ways of living popularized by celebrities, thought leaders, etc.)

  • JohnMakin6 hours ago
    There are still supposedly serious people, who should know better, who insist "dark patterns" are not real and are just a mechanism to attack tech companies. I don't know how anyone these days can honestly reach that conclusion. Some of these sites use strategies similar to the ones the old tobacco companies used; all of this stuff is already known to marketers.
    • jp576 hours ago
      But are they actually serious people? I had corporate astroturf accounts arguing with me on my otherwise-ignored blog as early as 2004. All this time later, I just assume that every serious corporation employs PR firms using sock-puppet accounts to shill in favor of whatever dark shit they're doing, acting like it's all just really great and good for us.
      • ddtaylor4 hours ago
        We've seen this on HN before as well. Companies target blogs and Reddit with LLM-generated content that "subtly" name-drops products or services, posts fake praise, and even files meaningless "support" requests on discussion boards.
  • nalekberov7 hours ago
    The irony is that in order to read this entry I had to pass a cookie wall, which gave me only ‘Accept all’ and ‘Manage’. Then I couldn’t read it anyway, because I had no subscription.
    • y0eswddl6 hours ago
      the author has no control over that
      • Aurornis5 hours ago
        The publication did not force the author to publish their works on their site.

        The author made a choice to publish there. They want the paywall, because that's how they get paid for this writing.

      • nalekberov4 hours ago
        She had, apparently.
  • gnarlouse2 hours ago
    nauseating to think that Amazon was only fined $2.5B when they made somewhere around $25-44B off the Iliad cancellation thing.
    • mayhemducksan hour ago
      Yes it is nauseating. And it's the norm. Whenever some company finds a way to make a boat-load of money by exploiting a weakness of human nature, the government will demand a portion of the proceeds. It's business as usual, in the USA at least. Even more nauseating is what they could do with that $2.5B...
  • 2OEH8eoCRo08 hours ago
    > An internal memo found that 12-year-olds were three times as likely as 32-year-olds to stay on Facebook for the long term, despite the platform nominally requiring users to be at least 13; the memo concluded that Facebook “should consider investing more heavily in bringing in larger volumes of tweens”.
    • ViktorRay7 hours ago
      100 years from now, the engineers who work at Big Tech will be looked upon by their descendants with the same shame with which people nowadays look at ancestors who were involved in tobacco.
      • 2OEH8eoCRo07 hours ago
        I don't think it will take 100 years, the world is already souring on big tech.
      • colechristensen7 hours ago
        >people nowadays look at ancestors who were involved in tobacco

        Huh? Does anyone actually care any more? The kind of moralizing busybodies that spend their time shaming the tobacco industry are few and far between.

      • selectively7 hours ago
        This is an outrageously dumb thing to say. Big Tobacco knowingly sold a product that physically addicted (the only real form of addiction) its users and killed them.

        Facebook is not that.

        • treyd7 hours ago
          Facebook ran experiments on unknowing teenage girls to study how being shown negative content leads to negative mental-health outcomes, which has led to suicide.
        • dijksterhuis7 hours ago
          > physically addicted (the only real form of addiction)

          https://journals.sagepub.com/doi/10.1177/26318318221116042

          snippet from the abstract

          > Contrary to the earlier notion that addiction is predominantly a substance dependency, research now suggests that any source or experience capable of stimulating an individual has addictive potential. This has led to a paradigm shift in the psychiatric understanding of behavioural addictions.

          dopamine, the little “hit” you get on social media sites or when you get a “ping”, has a massive role to play in behavioural addictions. and with behavioural addiction it basically causes the same stuff in the brain that cocaine etc does (very simplified explanation).

          also, i’m a recovering drug addict. and i can tell you for sure from my lived experience that addiction is definitely not limited to physical stuff like drugs. xD

        • Nevermark7 hours ago
          > Problem gambling (PG), also known as pathological gambling, gambling disorder, gambling addiction or ludomania, is repetitive gambling behavior despite harm and negative consequences. [0]

          Addiction isn't just [chemical in blood stream] -> [addiction]. Addiction involves many steps, many of them in the brain, and many of those reactive to non-physical events.

          [0] https://en.wikipedia.org/wiki/Problem_gambling

        • b00ty4breakfast7 hours ago
          >the only real form of addiction

          gonna need a citation on that one, dawg

        • ambicapter7 hours ago
          Depression is not death, but it is still a loss of life.
        • sandy_coyote7 hours ago
          Gambling is conventionally considered addictive, but the user isn't ingesting chemicals. I don't think a physical/non-physical binary really stands up under scrutiny. I mean, aren't all addictions physical insofar as they stimulate the body to produce neurotransmitters?

          Plus, smoking doesn't kill people; its pathological outcomes do. Similarly, looking at a phone screen might hurt a user's eyes, but it won't kill them; however, the decisions that user makes over time due to the effects of the subject matter they interact with might definitely put them at risk. And if aspects of that subject matter are deliberately amplified for their addictive properties, should platforms be regulated to control this?

  • metalman7 hours ago
    Step by step I am slowly backing away from any technology that I don't like, sometimes going to ridiculous lengths to bypass certain imposed asymmetric requirements, up to and including abandonment. Nothing in my house beeps. My only online subscription is for web space. At this point it has become fun, as I have stopped reacting, and am experimenting and planning ahead, while figuring out ways to increase my income and reduce my personal spend.
  • jmyeet5 hours ago
    This touches on many issues, and it's kind of a confused narrative: predatory practices against minors (in particular), sign-up dark patterns, addictive design (e.g. infinite scroll). I don't think you should bundle all of these together like this.

    For example, infinite scroll is a product of a news feed, and a news feed is algorithmic. What this produces and reinforces in the user is one issue, but it's not really related to some small grey text in an Amazon Prime sign-up flow.

    So let's break it down. Some of the issues are:

    1. Intent to sign up.

    2. Difficulty in cancelling a service. This is what I call the "gym model": easy to sign up, hard to cancel. This can be handled. California, for example, requires companies to offer online cancellation. Most other states don't. It's so much of an issue that you'll regularly find people advising others to change their address to California just to get that option. There's no reason why every state or the federal government couldn't do the same.

    3. Selling of your data. Not really touched here, but it's going to be a big issue going forward.

    4. Addictive behavior to maximize time spent on the platform.

    5. What should we allow or disallow for minors? This is going to be a big issue. We're only at the start of the Age Verification Era (like it or not). But IMHO no company should be talking about how to maximize time spent for 13-year-olds. And no advertiser should be able to advertise to minors.

    6. Not really touched here but I'm going to add it anyway. IMHO we give tech companies a free pass by treating algorithms as some kind of mystical, neutral black box. But everything an "algorithm" does represents decisions humans made to get a certain behavior: what training data is used, what they're optimizing for (e.g. interactions or time spent) and what features they create.

    Platforms now essentially get liability protection for publishing content even though they elevate or suppress content based on what it contains. IMHO this is no different than someone deciding what to publish and being liable for it.

  • throwaway274486 hours ago
    Look, it's either this or we adopt an economic strategy that isn't basically "assume the market magically knows what is best"—i.e., communism, as I understand Americans to know the term.
  • tolerance5 hours ago
    Do you want your fascist/authoritarian government to arrive via buxom CyberTruck or svelte fixie bike?
  • andy997 hours ago
    I assume this is about dark patterns but can’t confirm as I’m faced with a cookie wall where I can select from “Manage” and “Accept All”.
    • Unai7 hours ago
      I got a big "reject all" button just next to the "accept all" one, on mobile.
      • californical6 hours ago
        I just got a big

        “We respect your privacy” banner, with a big green ok button and a “manage data collection” tiny print text that had consent for everything automatically approved

      • gavinsyancey6 hours ago
        I wonder if you're in a region that requires that, while the original commenter isn't?
    • dangus7 hours ago
      [flagged]
      • birdsongs7 hours ago
        The point isn't us. You should know that. The point is the 99.8% who don't have our skills and are forced into these dark patterns by deception or psychological manipulation.
        • dangus7 hours ago
          Cookie dialogs are the opposite, they are asking for consent up front.

          Before they existed websites would just put stuff on your computer without asking. They’re literally a consumer protection.

          Direct your outrage elsewhere.

          • traderj0e6 hours ago
            The site doesn't put cookies on my browser, my browser lets the site set/get cookies. If I let it.
          • mghackerlady7 hours ago
            They're better, but most of them use dark patterns to get you to accept all of them
          • birdsongs7 hours ago
            I have no outrage, and for what it's worth, I upvoted you so your comment wasn't killed.

            I think you're being condescending though, and missing the point.

            • dangus4 hours ago
              The point came across to me as a pretty unproductive comment on the design of the website hosting the article rather than its contents, which is why I responded the way I did.

              Just like people who will complain about a news site with ads or some other unrelated design feature of the site they don’t like.

              Again, if you’re on here you presumably know how to block ads, or cookie dialogs.

          • EGreg7 hours ago
            Direct yours
  • xg156 hours ago
    ...but but but Innovation!
    • DangitBobby5 hours ago
      Can we not do this type of mocking comment here please?

      > But but but <argument I am mocking>

      > Shhh! <People I don't agree with> will hear you!

      > It's almost as if <sarcastic oversimplification>.

      > Tell me you <don't understand topic> without telling me you <don't understand topic>.

      • xg154 hours ago
        I'm sorry. Yeah, agreed that this was too much. I was angry because I have often seen pushback against regulation that tries to elevate "innovation" as a value in itself, one that even trumps other considerations such as safety or whether the innovation in question actually improves things meaningfully. What Meta did felt to me like the ultimate outcome of that thinking.
  • pembrook7 hours ago
    I'm no defender of engagement algorithms and social media (including upvote-based algos, and this site too)... but this is a ridiculous argument.

    Social media is not making you behave in ways you don't want. On the contrary, it's giving you EXACTLY what you want. People want to doomscroll social media instead of engaging with reality, because the real world requires action, effort and social risk... doomscrolling is pure passive consumption.

    If we're going to give people autonomy and freedom to choose how they spend their time, at some point we have to draw the line and hold people accountable for their own actions. Or we have to acknowledge we'd rather stay in a permanent state of adolescence and give full control of our lives to big brother.

    This constant push by the urban monoculture to turn everything into an "addiction" and everyone into a "victim" is a terrible set of ideas to put in people's heads, and is just as toxic as anything they claim smartphone apps are trivially doing with UI design.

    Apps are not physically addictive like cigarettes or alcohol and never have been.

    And if you're going to argue social media preys on reward systems in the brain, the same is true of everything humans do. Reward systems in the brain govern every single action we take, so everything we do can be turned into victimization by some addictive outside force.

    • grocery_stores5 hours ago
      I'll agree and add on: doesn't every for-profit enterprise want its products to be as addictive as possible (retrofitting that word in various ways to suit its particular context)?

      If I'm making bicycle wheels, I want all my customers thinking "these are the best bicycle wheels. I don't want them from any other supplier ever again and actually I want some that I don't even need just in case." I want them up at night thinking about how great my bicycle wheels are, looking at pictures of them on their phones.

      I'm not sure how people are squaring the circle where companies are supposed to meet market demand by giving people what they want, but, uh, "not like that." If a product people want is really that bad for them, vote for the government to regulate it. We've read this story before.

    • tardedmeme6 hours ago
      What is addiction? Can you explain to us how you think about addiction?
      • pembrook6 hours ago
        Everything is an addiction. Nothing is an addiction. Instead of jumping down a semantic rabbit hole, it might be more useful to look at specifics, since obviously it's a spectrum.

        I can say with certainty that opioids are addictive. I can also say with certainty that doomscrolling is pretty far on the opposite end of that spectrum. I have yet to meet someone who would steal copper pipes off of an abandoned building or sell their body on the street for a few scrolls of tiktok.

        But why do you get out of bed at all in the morning? What drives you to exist...are those reward systems in the brain addictive? Why are you sitting at your keyboard right now arguing with a random stranger on the internet?

        Are you procrastinating something else you should be doing instead...and is that Hackernews' fault or yours?

        • tardedmeme6 hours ago
          So, buried within this extended mostly-non-answer, it seems your definition of an addiction is something that drives someone to steal copper pipes.
          • pembrook6 hours ago
            Litigating the semantics of a word doesn't get us anywhere closer to defining the limits of personal responsibility.

            You'd like the goalposts to sit closer so it's easier to offload responsibility onto abstract external entities.

            I'm arguing this doesn't change who has to be the one to close the app, shut off the TV, turn off the video game, close the bag of candy and take risks in the real world.

            • tardedmeme5 hours ago
              And I'm not understanding what point you're trying to make, except that you don't think social media is addictive because people don't steal copper pipes to get more of it.
              • recursive-call2 hours ago
                I believe the argument is that for something to be addictive, the user has to feel so compelled to keep using it that they would take some outsized or extreme risk or action in order to keep using it. It doesn't have to literally be about stealing copper pipes; just any action an ordinary person wouldn't take, justified because it lets them keep using. E.g. drug addicts will steal, even from their own families, or lose their homes because they spent the rent money on drugs. So the question becomes: if you had to pay to use social media, are there people so addicted to it that they would steal, or choose social media time over their bills? If you can actually imagine someone being that addicted, then you can say social media is addictive.