696 points by oldnetguy 4 hours ago | 84 comments
  • JohnMakin 2 hours ago
    We'll try everything, it seems, other than holding parents accountable for what their children consume.

    In the United States, you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using, yet somehow, the same social responsibility seems thrown out the window for parents and the web.

    Yes, children are clever - I was one once. If you want to actually protect children and not create the surveillance state nightmare scenario we all know is going to happen (using protecting children as the guise, which is ironic, because often these systems are completely ineffective at doing so anyway) - then give parents strong monitoring and restriction tools and empower them to protect their children. They are in a much better and informed position to do so than a creepy surveillance nanny state.

    That is, after all, the primary responsibility of a parent to begin with.

    • techblueberry 39 minutes ago
      I know this is weird, but I'm in some ways not really sure who is on the side of freedom here. I get your position, but the whole promise of the internet has been destroyed by newsfeeds and mega-corps.

      There are almost literally documented examples of Facebook executives twirling their mustaches, wondering how they can get kids more addicted. This isn't a few bands with swear words; in fact, I think the damage these social media companies are doing is reducing the very independence of teens and kids that parents' original fears were about.

      I dunno, are you uncertain about your case at all? I just can't help but start with: fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.

      • netcan 2 minutes ago
        >Better the nanny state than Nanny Zuck.

        For me this is a crux, at least in principle. Once online media is so centralized... the argument from freedom is diminished.

        There are differences between national government power and international oligopoly but... even that is starting to get complicated.

        That said... This still leaves the problem in practice. We get decrees that age-restriction is mandatory. There will be bad compliance implementations. Privacy implications.

        Meanwhile... how much will we actually gain when it comes to child protection?

        You can come up with all sorts of examples proving "Facebook bad", but that doesn't mean these things are fixed when/if regulation actually comes into play.

      • pokstad 37 minutes ago
        Social media is like tobacco. We went after tobacco for targeting kids, we should do the same to social media. Highly engineered addictive content is not unlike what was done to cigarettes.
      • Terretta 35 minutes ago
        > ... start with fuck these companies. Better the nanny state than Nanny Zuck.

        I'm not sure how those two positions connect.

        Execs bad, so laws requiring giving those execs everyone's IDs, instead of laws against twirled mustaches?

      • butvacuum 35 minutes ago
        Screw over Meta then. Not everybody else.

        Meta is the bozo in a panel van with no windows. All the legit porn sites put up Big Blinking Neon Signs.

      • nefarious_ends 14 minutes ago
        > There is almost literally documented examples of…

        lol

    • belorn 2 hours ago
      I am a bit confused by that comment. Are parents socially responsible for preventing companies from selling alcohol/guns/cigarettes to minors? If a company sets up shop in a school and sells those things to minors during school breaks, who has the social responsibility to stop that?
      • briffle an hour ago
        When I was a kid in the early '90s, my state (and many others) banned cigarette vending machines, since there was no way to prevent them being used by minors unless they were inside a bar, where minors were already not allowed.
      • Quarrelsome 42 minutes ago
        I think the argument is more that making it illegal spares parents from being forced to play "the bad guy". It's hard to prevent a level of entitlement and resentment if the less-well-parented kids have full access. If nobody is allowed, then there's no parental friction at all.

        It's unfortunate that the application of this rule is being performed at the software level via ad-hoc age verification, as opposed to the device level (e.g. smartphones themselves). However, that might require the rigmarole of the state forcibly confiscating smartphones from minors, or risk worrying Nepal-style outcomes.

        • JohnMakin 29 minutes ago
          I'm saying hold parents accountable for their children's online behavior and for their protection online, not companies (who want to profit off the kids, a perverse incentive) or governments (who can barely be trusted to do this even if it were their only goal). For example, if your kid starts making revenge CP of their classmates, and the parent could have reasonably mitigated or known about it, I think the parent absolutely should be held responsible.

          Don't punish the rest of the web for crappy parenting and crappy incentives by companies/govts.

          • Quarrelsome 17 minutes ago
            > I'm saying hold parent's accountable for their children's online behavior and for their protection online

            You're describing the status quo, and I think it's fair to say you wouldn't intentionally design the status quo. Unless we have some wizard wheeze where we can easily arrest, detain, or otherwise effectively punish parents without further reducing the quality of life for their children.

        • philipallstar 35 minutes ago
          But it's not playing the bad guy. It's playing the good guy.
          • Quarrelsome 19 minutes ago
            In the abstract, but in the social context of the home you have to be the bad guy. While good parents manage that, the bar is too high for society in general.
      • honkycat an hour ago
        ISPs and OSs should be the ones providing these tools, and they should make it stupid easy to set up a child's account and have a walled garden for kids to use.
        • maccard an hour ago
          I live in the UK. By default your ISP will block "mature" content and you have to contact them to opt out. iOS, Android, Playstation, Xbox, Switch all have parental controls that are enforced at an account level.

          A child with an iPhone, Xbox, and a Windows Laptop won't be able to install discord unless the parent explicitly lets them, or opts out of all the parental controls those platforms have to offer.

          The tech is here already, this is not about keeping children safe.

          • drnick1 an hour ago
            No, it's about corporate and government control. Thankfully, the UK government is clueless about tech, which means these controls can be bypassed relatively easily by using your own DNS or a public DNS server like Quad9.
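            To illustrate how low the bar is: a minimal sketch, assuming a Linux box and Quad9's published resolver addresses (9.9.9.9 and 149.112.112.112). The file is generated locally here; copying it over /etc/resolv.conf (root required) would be the actual switch:

```shell
# Build a resolv.conf that points at Quad9 instead of the ISP's
# filtering resolver (addresses are Quad9's published ones).
conf='nameserver 9.9.9.9
nameserver 149.112.112.112'
printf '%s\n' "$conf" > resolv.conf.quad9

# sudo cp resolv.conf.quad9 /etc/resolv.conf   # the actual switch (root required)
```

            Many systems manage /etc/resolv.conf via systemd-resolved or NetworkManager, so the exact mechanism varies, but the point stands: it is roughly a one-file change.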
          • whywhywhywhy 43 minutes ago
            You have to be very tech savvy to know that your kid asking to install Discord to talk to/play games with their friend group is as dangerous as it is.
            • maccard 20 minutes ago
              A single Google search will tell you pretty unanimously that Discord isn't for kids: it's rated 13+ and carries risks of talking to strangers.
        • techblueberry an hour ago
          Mark Zuckerberg advocates for this; most people entrenched in this argument think it's worse. But I'm all for burning it to the ground, so.
        • blindriver an hour ago
          You must not have kids if you think it's easy to keep children off things that are bad for them.
          • friendzis an hour ago
            [Any] task is much easier if you have the tools. Do/did you have a baby monitor? A technological tool that allows you to "monitor" the baby while not being within arm's reach.

            Do you have an A+++++ oven with three panes of glass? It's [relatively] safe to touch, so instead of monitoring whether a child is anywhere near the oven you only have to monitor whether the child actively opens it. That's much easier.

          • squigz an hour ago
            It's really not some Herculean task to do so either, though.
            • metadata an hour ago
              Maybe you don't have kids of your own. Once you have 2 or 3, it is quite challenging to manage everything, especially over time.
              • blindriver an hour ago
                Especially if they are older, like 8+ years old. They are resourceful, sneaky and relentless.
                • cgriswald an hour ago
                  Which is exactly why all people everywhere giving up their privacy will also be ineffective.

                  Drugs, alcohol, cigarettes, pornography were all illegal for me to access as a kid but I wouldn’t have had any trouble getting any of it.

                  • sarchertech 36 minutes ago
                    Maybe at 16, not at 8.
                    • cgriswald 21 minutes ago
                      The older kids are often the easy source for the younger kids. At 8 I had already seen a Playboy and knew kids who had seen harder stuff. I could have easily gotten a teenager to get me cigarettes (and drugs, but I didn’t know what those were really). I had also already tasted alcohol. Any of this I could have stolen from any number of places.

                      At 16 it was easier, but at 8 it wasn’t hard.

                • aleph_minus_one an hour ago
                  > They are resourceful, sneaky and relentless.

                  ... and honest:

                  - they will honestly tell you that they'd be very happy to see you dead when you impose restrictions upon them (people who are older will of course possibly get into legal trouble for such a statement)

                  - they will tell you they wish you'd never given birth to them (or had them aborted)

                  - they will tell you that since they never wanted to be born, they owe you nothing

                  - ...

                  • squigz an hour ago
                    Sounds like a kid in need of psychiatric help.
            • scottLobster 41 minutes ago
              As a father of 3, one thing the wife and I had to learn over the course of the first two is that the modern world holds parents to impossible standards and a "fuck off" attitude is required for much of it.

              We've had pediatricians shame us for feeding our kids what they're willing to eat and not magically forcing "a more varied diet" down their throats at every meal, despite them being perfectly healthy by every objective metric. There are laws making it technically illegal for us to leave our kids unsupervised at home for any period of time in any condition, even a few minutes if one of us is running slightly late from work/appointments.

              Your not-quite-2-year-old is too tall for a rear-facing car-seat? You're a bad parent, possibly a criminal and putting them at risk by flipping the seat to face forward, a responsible parent spends hundreds of dollars they don't have on several different seats to maybe find one that fits better or have their kid ride uncomfortably and arguably unsafely with their legs hyper-extended up the seatback.

              Miss a flu shot because you were busy? Careful you don't come off as an antivaxxer.

              And all of this and more on top of changing diapers, doctors' appointments, daycare, preschool, school, family activities and full time jobs?

              Yeah, when my kids are old enough to engage with social media I will teach them how to use it responsibly, warn them about the dangers, make myself available to them if they have any problems, enforce putting the phones down at dinner, and keep a loose eye on their usage. Fortunately/unfortunately for them, they have a technically sophisticated father who knows how to log web activity on the family router without their knowledge. So if anything goes sideways I'll have some hard information to look at. Most families don't have that level of technical skill.
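              For anyone curious what "log web activity on the family router" can look like in practice: a hedged sketch that just summarizes a dnsmasq-style DNS query log by domain. The log lines below are invented sample data; real router log formats vary.

```shell
# Count which domains were queried, most frequent first
# (sample dnsmasq-style lines; a real router's format may differ).
log='Jan 10 20:01:02 dnsmasq[123]: query[A] discord.com from 192.168.1.20
Jan 10 20:01:09 dnsmasq[123]: query[A] discord.com from 192.168.1.20
Jan 10 20:02:15 dnsmasq[123]: query[A] example.com from 192.168.1.20'

printf '%s\n' "$log" | awk '$5 ~ /^query/ {print $6}' | sort | uniq -c | sort -rn
```

              Note that DNS logs only show domains, not page content, which is roughly the "loose eye" level of monitoring described above.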

              • Wobbles42 18 minutes ago
                I was almost certainly never going to be a parent for other unrelated reasons, but you have just given me a whole other list of confirmations for that decision that I hadn't thought of before.

                Thank you for that.

            • Kaliboy an hour ago
              I remember how my sister and I set up Google Family and fully locked down my niece's phone with app restrictions, screen-time restrictions, and a policy of accountability when the screen time needs extending.

              It worked really well up until she got a school managed chromebook for homework with no access controls.

      • regularfry 2 hours ago
        The school, in loco parentis.
      • estimator7292 2 hours ago
        Companies are legally prohibited from marketing and selling certain products like tobacco and alcohol because they historically tried to.

        Parents are legally and socially expected to keep their kids away from tobacco and alcohol. You're breaking legal and social convention if you allow your kids to access dangerous drugs.

        Capitalist social media is exactly as dangerous as alcohol and tobacco. Somebody should be held responsible for that, and the legal and social framework we already have for dealing with people who want to get kids addicted to shit works fairly well.

        • trueismywork 2 hours ago
          So what you're saying is that we should ban social media, which is not what the OC is saying.
        • pembrook an hour ago
          > Capitalist social media is exactly as dangerous as alcohol and tobacco.

          Most actual studies done on this topic find very little evidence this is true.

          It's a run-of-the-mill moral panic. People breathlessly repeating memes about whatever "kids these days" are up to and how horrible it is, as adults have done for thousands of years.

          I expect some emotional attacks in response for questioning the big panic of the day, but before you do so please explore:

          [1] Effects of reducing social media use are small and inconsistent: https://www.sciencedirect.com/science/article/pii/S266656032...

          [2] Belief in "Social media addiction" is wholly explained by media framing and not an actual addiction: https://www.nature.com/articles/s41598-025-27053-2

          [3] No causal link between time spent on social media and mental health harm: https://www.theguardian.com/media/2026/jan/14/social-media-t...

          [4] The Flawed Evidence Behind Jonathan Haidt's Panic Farming: https://reason.com/2023/03/29/the-statistically-flawed-evide...

          • logicchains an hour ago
            There's far more evidence of leftist political views leading to mental illness than of social media leading to mental illness, but imagine the uproar if it were suggested that children should be banned from exposure to leftist political views in school.
      • gleenn 2 hours ago
        Not responsible for selling to all minors, just theirs.
      • moffkalast an hour ago
        Well the parents entrust their kids to the school, so they would be the ones responsible for what goes on on their premises. In turn, school computers are famously locked down to the point of being absolutely useless.
        • HeWhoLurksLate an hour ago
          That's really a district-by-district / school-by-school thing; some are significantly more locked down than others.
    • closeparen 5 minutes ago
      A physical realm that is safe for children to explore on their own is clearly preferable to one where it's transgressive to let a child go outside without an escort.

      It is plausible that the same applies to the digital realm.

    • awakeasleep 2 hours ago
      Individual Parents vs Meta Inc (1.66T mkt cap)

      May the best legal person win!

    • b00ty4breakfast 10 minutes ago
      Parents can be held liable for buying their kids cigarettes but, similarly, tobacco companies are (at least nominally) not supposed to target children in their advertising campaigns and in the design of their products.

      It's obviously not a 1:1 comparison here, because providing ID to access the internet is not analogous to providing ID to purchase a pack of Cowboy Killers, but we can extrapolate to a certain extent.

      (inb4 DAE REGULATING FOR-PROFIT CORPORATIONS == NANNY STATE?!?!?!?!?)

    • 2duct an hour ago
      A monitoring solution might have worked in my case if my parents had monitored my Internet history, always made sure to check in on what I thought and felt about what I watched, and made sure I felt secure relying on them to back me up in the worst cases.

      But I didn't have emotionally mature parents, and I'm sure so many children growing up now don't either. They're going to read arguments like these and say they're already doing enough. Maybe they truly believe they are, even if they're mistaken. Or maybe they won't read arguments like these at all. Parenting methods are diverse but smartphones are ubiquitous.

      So yes, I agree that parents need to be held accountable, but I'm torn on if the legal avenue is feasible compared to the cultural one. Children also need more social support if they can't rely on their parents like in my case, or tech is going to eat them alive. Social solutions/public works are kind of boring compared to technology solutions, but society has been around longer than smartphones.

      • ipaddr an hour ago
        Should the state have forced your parents to give you up for adoption? That's the social support the state can offer.
        • Wobbles42 13 minutes ago
          This is the real point that needs to be made.

          You can argue that many parents are less than ideal parents, but that is not sufficient to justify having the state step in. You also have to show that the state is less bad.

          Decades of data on the foster system strongly suggests otherwise. The state, by any objective measure, is terrible at raising children.

        • 2duct an hour ago
          I don't think it would have helped, given the outcomes for foster children are near universally worse except in the most extreme cases of abuse. I did threaten to call CPS but I was, of course, berated for it and threatened that I would be taken away, so that shut me up. Since I was never assaulted I doubt it would have reached the standard for foster care anyway, yet the consequences still endure to this day.

          I was told over and over, by people who in hindsight were unqualified, that emotional abuse wasn't real abuse, so after a few years I was disinclined to seek help.

          If I had had even one person that supported me unconditionally instead of none at all, even if that person wasn't a parent, I'm fairly certain I would have turned out differently. That was just a matter of luck, and I came out empty-handed. I never felt comfortable talking about what I was exposed to online with anyone, and that only hurt me further, but I was a child and couldn't see another option.

        • jama211 39 minutes ago
          So the only options are no support or give you up for adoption? No middle ground is possible?
    • Salgat 2 hours ago
      This only works if I ban my child from having any friends since they all have unlimited mobile access to the internet.
      • 1shooner an hour ago
        Could your child not just call or text their friends? Or is the real expectation to not have to intervene at all about their preferred platform?
        • jlokier 36 minutes ago
          I think the idea is for the child to see their friends in person... not call, text, or internet.

          So even if their own child has no phone at all, they have access to the internet through other children's unlimited mobile access.

      • ipaddr an hour ago
        Yes, if they do bad things like getting drunk, having sex, and doing drugs.

        I would start with banning cellphones.

        • techblueberry an hour ago
          My greatest fear for my future young-adult children is that they'll be on their cell phones all day and never have time to get in trouble with their friends, so there's that. Yes, let's start with banning the cell phones.
      • jmye an hour ago
        Sorry, I know it's a hard line for parents to tread and it's really easy to criticize parenting decisions other people are making, but the "everyone else is doing it so I have to" always seems as lazy to me today as it probably did to my parents when I said it to them as a teenager.

        Is it more important to prevent your son from being weaponized and turned into a little ball of hate and anger, and your daughter from spending her teen years depressed and encouraged to develop eating disorders, or to make sure they can binge the same influencers as their "friends"?

        • whaleidk 8 minutes ago
          We used to teach kids to be themselves and stand up for what they believe in and their own authenticity and uniqueness even in the face of bullying. That having less or other doesn’t mean your value is lesser or that you should be left out. Now we teach them… conform at all costs so you never have to risk being bullied or lonely?
        • cgriswald an hour ago
          The number of times I objected to my parents' rules because my friends didn't have those rules, and the response was: "I'm not their parent."
        • friendzis an hour ago
          Is it more important to prevent your child from <...>, or to not be seen as an adversarial monster?
      • luxuryballs an hour ago
        This is the biggest problem: so many parents are head-in-the-sand when it comes to things that can damage a child's mind, like screen time. Yet no matter how much you protect them, if it's not a shared effort it all goes out the window. The kid becomes incentivized to spend more time with friends just for the access, and can develop a sense that maybe mom and dad are just wrong, because why aren't so-and-so's parents so strict?

        Because their parents didn't read the research, or don't care about the opportunity cost, because it can't be that big of a deal or it wouldn't be allowed or legal, right? At least not until their kid gets into a jam or shows behavioral issues. But even then they don't evaluate; they often just fall prey to the next monthly subscription to cancel out the effects of the first: medication.

    • axus an hour ago
      It's very easy to lock up alcohol/cigarettes; a child should never have access. Internet usage is more like broadcast media: a child should have regular access.

      The positives and negatives of Internet usage are more extreme than broadcast media's, but less than alcohol/guns. The majority of people lack the skills to properly censor the Internet without hovering over the child's shoulder full-time, as you would with a gun. The best you can do is keep their PC near you, but it's not enough.

      We agree that a creepy surveillance nanny state is not the solution, but training parents to do the censorship seems unattainable. As we do for guns/alcohol/cigarettes, mass education about the dangers is a good baseline.

      EDIT: And some might disagree about never having access to alcohol!

      • KoolKat23 an hour ago
        Devices such as phones come with an option when you first start them, asking simply whether the device is for a child or an adult. Your router generally comes with a parental-filter option on setup these days too. Heck, we have ChatGPT, which can guide a parent through setting up a system if they want something more custom.

        If people want to push, they should just push to make these setup options more ubiquitous, obvious, and standardized. And perhaps fund some advertising for these features.

      • fartfeatures 44 minutes ago
        This is where Apple, Microsoft, and Android need to step up. Indeed, they already have in many ways; things are better than they used to be.

        There needs to be a strict (as in MDM level) parental control system.

        Furthermore there needs to be a "School Mode" which allows the devices to be used educationally but not as a distraction. This would work far better than a ban.

        • axus a few seconds ago
          Microsoft has done a good job with Microsoft accounts and Microsoft Family Safety. It's about as user-friendly as you'll get outside of Apple, though the speed could be improved.

          Even with this, the problem requires more than pushing a button. Time, thought, and adjustment are needed. Like home maintenance, it's necessary, but not everyone can do it without help.

          Getting AI assistance is good advice.

        • jama211 40 minutes ago
          They could provide all the tools in the world. Unless there’s legislation change to what children are allowed to consume legally, everyone will largely ignore it.
    • Fervicus 39 minutes ago
      > We'll try everything, it seems, other than holding parents accountable for what their children consume.

      The mistake in this reasoning is assuming that they are actually interested in protecting the children.

      • Wobbles42 8 minutes ago
        This.

        The world is becoming increasingly more uncertain geopolitically. We have incipient (and actual) wars coming, and near term potential for societal disruption from technological unemployment. Meanwhile social media has all but completely undermined broadcast media as a means of social control.

        This isn't about protecting children. It's about preventing a repeat of the Arab Spring in Western countries later this decade.

        "Think of the children" is the oldest trick in the book, and should always be met with skepticism.

    • ragall 18 minutes ago
      The vast majority of parents aren't tech-savvy enough to be able to operate IT parental controls.
    • SunshineTheCat an hour ago
      The greatest of uphill battles in today's climate is trying to push anything in the realm of personal responsibility.

      Politicians' whole basis for nearly every campaign is "you're helpless, let us fix it for you."

      For the vast majority of problems plaguing society, the answer isn't government; it's people changing their behavior. The same goes for parenting.

      But unfortunately, "you're an adult, figure it out" isn't the greatest campaign slogan (if you want to win).

      • skipants an hour ago
        It wasn’t always this way: “Ask not what your country can do for you — ask what you can do for your country”
      • cindyllm an hour ago
        [dead]
    • onion2k 2 hours ago
      > Yes, children are clever - I was one once.

      A counterargument to your point that children are clever - I was also one once.

    • watwut an hour ago
      The expectations on parents in the USA are at their historical high. What are you talking about here? The expectation that parents will perfectly supervise their kids at every moment of their lives until adulthood is (a) new and (b) at its historical max.
    • SecretDreams 38 minutes ago
      ITT are a lot of tech workers who made their money as cogs in the system poisoning the internet that future generations would have to swim in. I wonder if toxic-waste companies also tell parents it's strictly on them to keep their kids out of the lakes that are poisoned, but once flowed cleanly?

      We live in a shared world with shared responsibilities. If you are working on a product, or ever did work on a product, that made the internet worse rather than better, you have a shared responsibility to right that wrong. And parents do have to protect their kids, but they can't do it alone with how systematically children are targeted today by predatory tech companies.

    • numbsafari 2 hours ago
      Except companies provide wholly inadequate safeguards and tools. They are buggy, inconsistent, easily circumvented, and at times even malicious. Consumers should be better able to hold providers accountable before we start going after parents.

      The only real solution is to keep children off of the internet and any internet connected device until they are older. The problem there is that everything is done on-line now and it is practically impossible to avoid it without penalizing your child.

      If social media and its astroturfers want to avoid outright age bans, they need to stop actively exploiting children and accept other forms of regulation, and it needs to come with teeth.

      • raw_anon_1111 2 hours ago
        How easy is it for kids to bypass Parental Controls on iOS devices?
        • rootusrootus 2 hours ago
          Social engineering is the most effective strategy, because iOS screen time controls are so buggy that eventually parents throw up their hands in exasperation and enable broader access than they would otherwise choose.
          • raw_anon_1111 an hour ago
            It's one setting to allow only a whitelist and to not allow apps to be downloaded. Yes, parents might actually need to learn technology.
        • adolph an hour ago
          When everything is turned off by default, iOS Screen Time is very effective. It also has effective tools to grant certain exceptions, facilitated by Messages. It also distinguishes between "daytime" and "downtime" for the purposes of certain apps and app attributes, like the contact list. For example, we have ourselves, grandparents, and the neighbors as "all the time" contacts, but their friends as daytime only. They don't retain their devices at night, but it is possible for them to pull them from the charging cabinet.
      • jmholla 2 hours ago
        > Except companies provide wholly inadequate safeguards and tools. They are buggy, inconsistent, easily circumvented, and at times even malicious. Consumers should be better able to hold providers accountable before we start going after parents.

        We could mandate that companies that market the products actually have to deliver effective solutions.

        • numbsafari 2 hours ago
          Cue blog posts about section 230 and how it’s impossible to do hard things and parents should be held accountable not companies, meager fines, captured bureaucrats, libertarians, and on and on…
      • duped 2 hours ago
        Step 0 is physical device access. Kids shouldn't have tablets or smartphones or personal laptops before age 16.
        • aleph_minus_one an hour ago
          > Kids shouldn't have tablets or smartphones or personal laptops before age 16.

          If you make such a restriction, they'll secretly buy some cheap "unrestricted" device like some Raspberry Pi (just like earlier generations bought their secret "boob magazines").

          • hellojesus 33 minutes ago
            Parents should have an allowlist of devices that are able to join their network, and then they can require root certs or something for access outside a narrow allowlist. There's a host of ways to solve both problems. Just remember to check for hardware keyloggers on your (the parents') devices, as kids could use them or try evil-maid attacks, etc., if they feel totally encaged.
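            As a toy illustration of the allowlist idea (the MAC addresses are invented), here is the membership check a router would conceptually run before forwarding a device's traffic. Real enforcement would sit in the router's firewall, and MAC addresses are trivially spoofed, which is exactly the cat-and-mouse problem this subthread describes:

```shell
# Toy MAC allowlist check (addresses are invented examples).
allowlist='00:11:22:33:44:55
66:77:88:99:aa:bb'

# Succeeds only if $1 exactly matches a line of the allowlist
# (-x: whole line, -i: case-insensitive, -q: quiet).
allowed() {
  printf '%s\n' "$allowlist" | grep -qix "$1"
}

allowed '00:11:22:33:44:55' && echo 'forward'
allowed 'de:ad:be:ef:00:01' || echo 'drop'
```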
        • mghackerlady 2 hours ago
          16 is a bit steep, but I do generally agree with your sentiment. I wish there were more educational home computers like there were back in the day, like the BBC Micro. I have a startup idea to make something like that (mostly as a dumping ground for my plethora of OS software and computer-education ideas), but I don't currently have the resources, and I have doubts about how successful something like that would even be in this day and age. I'm only 18ish (not giving my actual age for privacy reasons, but it's within a 5-year margin) and feel like my peers would rather be locked into platforms and consume than learn to create and actually use computers, despite there being a very obvious need (I once had a 20-year-old look at me like I had 2 heads for asking them to move something into a folder).
        • raw_anon_11112 hours ago
          This is the craziest thing I’ve heard in a while. They shouldn’t have connected game systems either?
          • mghackerladyan hour ago
I think they should. There's a fine line between beneficial and detrimental. I had a 3DS growing up and could browse the web with its very gimped browser, and I think something like that is actually very good for a child (able to access the internet and view simple and informative sites while being too limited to access social media and the like)
          • numbsafari2 hours ago
            No, because those devices have little or no controls and those controls are easily bypassed and/or not honored by the platform.
          • trashban hour ago
            Have you ever visited any game store and turned off nsfw protection?

I love gaming, but I hate all the smut games. It discredits the medium, essentially what has also happened to anime.

            • rkomornan hour ago
              I'm kinda baffled about the Switch store's quantity of dating/whatever adult-ish games.

              I don't really want to turn on age-based filters (to the point that I've never investigated if they even exist) but at this rate, there's hardly anything worth looking at in the recent feed.

        • logicchainsan hour ago
I hope they do pass a law like that, because it'd give my kids a gigantic advantage over the kids who had no access to modern technology and the free flow of information until the age of 16. If you want to leave your kids completely unable to find any kind of gainful employment in the AI era, be my guest.
    • basiswordan hour ago
      >> In the United States, you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using, yet somehow, the same social responsibility seems thrown out the window for parents and the web.

      So anyone can walk into a shop and purchase these things unrestricted? It's not the responsibility of the seller too?

      • mothballed4 minutes ago
        Tobacco, yes you can order pipe tobacco and cigars online sent to your house without ID.

        Guns yes, you can buy a schmidt-rubin cartridge rifle or black powder revolver sent straight to your home from an online vendor no ID or background check, perfectly legal.

        Alcohol yes, you can order wine straight to your house without ID.

        These are all somewhat less known "loopholes" but not really turned out to be a problem despite no meaningful controls on the seller.

    • tinfoilhatter7 minutes ago
      Even if the world was full of responsible parents, there are still people and groups that want to establish a surveillance state. These moves around age-restriction are never really about protecting children. They are focused on systems that are able to monitor and track online activity / limit access to those who are willing to sacrifice their own personal sovereignty for access to services.

      There is most definitely a cult that is obsessed with the book of revelation and seeing Biblical prophecy fulfilled, and if that isn't readily obvious to folks at this juncture in time, I'm not sure what it will take. I guess they'll have to roll out the mark of the beast before people will be willing to admit it.

    • NoMoreNicksLeftan hour ago
      >We'll try everything, it seems, other than holding parents accountable

      The government took over most parenting functions, one at a time, until the actual parent does or is capable of doing very little parenting at all. If the government doesn't like the fact that it has become the parent of these children, perhaps it shouldn't have undermined the actual parents these last 80 years. At the very least, it should refrain from usurping ever more of the parental role (not that there is much left to take).

You yourself seem to be insulated from this phenomenon; maybe you're unaware that it is occurring. Maybe it wouldn't change your opinions even if you were aware.

      >If you want to actually protect children

      What if I don't want to protect children (other than my own) at all? Why would you want to be these children's parents (you suggest you or at least others want to "protect" them), which strongly implies that you will act in your capacity as government, but then get all grumpy that other people are wanting to protect children by acting in their capacity of government?

    • croesan hour ago
      > then give parents strong monitoring and restriction tools and empower them to protect their children.

      Because parents don’t abuse massive surveillance tools.

      Given that most abuse happens in the family and by parents maybe it’s a bad idea to give them so much power

    • amriksohataan hour ago
It's like the food industry blaming parents. Sugar-like apps/games are designed to be addictive to the point that they act like a drug. Stop the drug dealer, not the consumer.
    • eesmith2 hours ago
      > We'll try everything, it seems, other than holding parents accountable for what their children consume.

      The way to keep kids from eating (yummy) lead-based paint chips was not holding parents accountable to what their kids ate, but banning lead-based paint.

    • dyauspitr2 hours ago
      This tired argument again. It doesn’t work. It’s like keeping your kid from buying alcohol but all their friends are allowed to buy it. The whole age demographic has to be locked out of the ecosystem.
      • wizzwizz42 hours ago
        Well, yes. If your friends can all go 'round to David's house, where David's parents hand each child a case of beer and send them on their way, any attempt by the other parents to prohibit underage drinking is going to be ineffective. But most parents don't do that. (I've actually never heard of it.) So social solutions involving parent consensus clearly do work here.

        "But it's behavioural!" I hear you cry. "What's stopping children from going out, buying a cheap unlocked smartphone / visiting their public library / hacking the parental control system, and going on the internet anyway?" And that's an excellent objection! But, what's stopping children from playing in traffic?

        • basisword42 minutes ago
Just because you haven't heard of it doesn't mean it isn't common. Parents take different approaches. Some of my friends' parents preferred we drank in their house, where they could maintain some level of safety, rather than recklessly in a field. Others thought providing some beers was better than us buying the cheapest vodka available. And I'm sure other parents wouldn't have liked this approach if they knew about it.
        • dyauspitr2 hours ago
          Yeah but it’s illegal for the parents to give the other kids beer with serious criminal repercussions. That’s why most people make sure it doesn’t happen, not just some social sense of reponsibility. You would need something similar for smartphones/social media.
          • b40d-48b2-979e2 hours ago

                That’s why most people make sure it doesn’t happen
            
            Were you not invited to parties in high school? My experience growing up (and my experience being a neighbor to people with teenage children even now) says otherwise.
            • Sohcahtoa82an hour ago
              > Were you not invited to parties in high school?

              Did you forget what web site you're on?

            • dyauspitr2 hours ago
              Every high school and college freshman party I’ve been to involves some serious planning to find alcohol. It’s always hit or miss and not easy.
          • wizzwizz4an hour ago
            The US generally has strict anti-alcohol laws, with exceptions for legally-recognised familial relationships (e.g. children, spouses). The UK doesn't: its laws are restricted to "the relevant premises" (https://www.legislation.gov.uk/ukpga/2003/17/part/7/crosshea...) and "in public" (https://www.gov.uk/alcohol-young-people-law – can't find the actual law right now); but still, the behaviour I described does not occur in the UK often enough for me to have heard of it. I have, however, heard about similar behaviour from the US, where "we all go out late at night and become alcoholics" seems to be a culturally-acceptable form of teenage rebellion.

            People, for the most part, have no respect for the law. They usually haven't even read the law. They have respect for what they consider appropriate or inappropriate behaviour. (Knowingly breaking the law is, in most instances, considered an inappropriate behaviour – except copyright law, which people only care about if there are immediately-visible enforcement mechanisms. Basically everyone is fine with copying things from Google Images into their PowerPoint presentations… but I digress.) Most people would object to murder, even if the law didn't forbid it. This distinction is important.

            Is there a law that says "children must not play in traffic"? Probably! Haven't the foggiest idea which it would be, though. That law (if it exists) is not why children don't play in traffic. The law against giving alcohol to children (if it exists) is not why we don't give alcohol to children. We can establish similar social norms for deliberately-addictive, deceptive, dangerous computer systems, such as modern corporate social media.

            • ndriscoll41 minutes ago
              We can establish social norms, but companies have a tendency to ignore those norms if it makes them money and it isn't illegal (maybe not all or even most companies, but if it's profitable, some company will do it and expand into that niche). So it makes sense to make it illegal for those companies to provide services to children, and then establish a social norm that parents won't create an account for their children/bypass the checks that companies need to do. Just like with alcohol: it is illegal for stores to sell it to minors, and they must check ID; we don't just let them shrug and say a 14 year old looked 21, and at least in the US, that would be a criminal offense. It's then socially unacceptable (and maybe also illegal) for a parent to buy a ton of alcohol so their kid can host a rager for all of their friends.

              Drawing out the alcohol analogy further, you can actually buy alcohol on Amazon, subject to an ID check. I'm not sure why no one bats an eye at this, but somehow e.g. porn or other adult-only services are different.

              It's long been an established, reasonable stance that it is both the parent's responsibility and decision to allow or deny certain things, and it's also illegal for businesses to completely undermine the parent's ability to act as that gatekeeper for their kids.

    • jama21141 minutes ago
      Ah, the abstinence theory of protection. How it continues to rear its ugly head.

      Why this utter drivel is the top comment is beyond me, unbelievable.

      • JohnMakin37 minutes ago
        That is not what the post you are replying to is advocating for at all - try reading it one more time without so much hostility
      • marliechiller38 minutes ago
        Can you offer some rebuttal to give some credence to your point?
    • lenerdenator2 hours ago
      The thing is, what are the parents to do beyond restricting things? You find out some creep has been talking to Junior; do you talk to your local police department, state agency, or to the feds?

      We've never properly acted upon reports of predators grooming children by investigating them, charging them, holding trials, and handing down sentences on any sort of large scale. There's a patchwork of LEOs that have to handle things and they have to do it right. Once the packets are sent over state lines, we have to involve the feds, and that's another layer.

      Previously, I would have said it's up to platforms like Discord to organize internal resources to make sure that the proper authorities received reports, because it felt like there were instances of people being reported and nothing happening on the platform's side. Now, given recent developments, I'm not sure we can count upon authorities to actually do the job.

      • ipaddran hour ago
        Back in the day you would beat up that person.
      • robomartinan hour ago
        > The thing is, what are the parents to do beyond restricting things?

        Well, I can't speak for parents (as in all parents). I can, however, tell you what we did.

        When two of my kids were young we gave them iPods. The idea was to load a few fun educational applications (I had written and published around 10 at the time). Very soon they asked for Clash of Clans to play for a couple of hours on Saturdays. We said that was OK provided they stuck to that rule.

        Fast forward to maybe a couple of months later. After repeated warnings that they were not sticking to the plan and promises to do so, I found them playing CoC under the blankets at 11 PM, when they were supposed to be sleeping and had school the next day.

        I did not react and gave no indication of having witnessed that.

A couple of days later I called each of them into their room and asked them to place their top ten favorite toys on the floor.

        I then produced a pair of huge garbage bags and we put the toys in them, one bag for each of the kids.

        I also asked for their iPods.

        No anger, no scolding, just a conversation at a normal tone.

        I asked them to grab the bags and follow me.

        We went outside, I opened the garbage bin and told them to throw away their toys. It got emotional very quickly. I also gave them the iPods and told them to toss them into the bin.

        After the crying subsided I explained that trust is one of the most delicate things in the world and that this was a consequence of them attempting to deceive us by secretly playing CoC when they knew the rules. This was followed by daily talks around the dinner table to explain just how harmful and addictive this stuff could be, how it made them behave and how important it was to honor promises.

        Another week later I asked them to come into the garage with me and showed them that I had rescued their favorite toys from the garbage bin. The iPods were gone forever. And now there was a new rule: They could earn one toy per month by bringing top grades from school, helping around the house, keeping their rooms clean and organized and, in general, being well behaved.

        That was followed by ten months of absolutely perfect kids learning about earning something they cherished every month. Of course, the behavior and dedication to their school work persisted well beyond having earned their last toy. Lots of talks, going out to do things and positive feedback of course.

        They never got the iPods back. They never got social media accounts. They did not get smart phones until much older.

        To this day, now well into university, they thank me for having taken away their iPods.

        So, again, I don't know about parents in the aggregate, but I don't think being a good parent is difficult.

        You are not there to be an all-enabling friend, you are there to guide a new human through life and into adulthood. You are there to teach them everything and, as I still tell them all the time, aim for them to be better than you.

        https://www.youtube.com/watch?v=99j0zLuNhi8

        • Sohcahtoa82an hour ago
          This reads like something I'd find on /r/LinkedInLunatics, all the way down to the one-sentence/thought-per-line formatting.
        • robofartinan hour ago
          > I explained that trust is one of the most delicate things in the world

          > lies to own children about throwing their toys away

          • Terr_10 minutes ago
            It's not stated explicitly, but I'm getting a sense of a certain strategy that... Well, consider the difference between:

            1. Teach children about "consequences" by using clear expectations, timely feedback, and proportional responses.

            2. Teach children about "consequences" by deliberately permitting a festering mess to form until it "justifies" inflicting emotional trauma.

            We might argue which one is more "effective" from the perspective of conditioning a behavior, but I think it's also worth pointing out that parents are also role models (which is why the job is so hard) so you're also teaching them something about how they should set "consequences" on others.

    • adolph2 hours ago
      > holding parents accountable for what their children consume

      There is a local dive bar down the street. I haven't expressly told my kids that entering and ordering an alcoholic drink is forbidden. In fact, that place has a hamburger stand out front on weekends and I wouldn't discourage my kids from trying it out if they were out exploring. I still expect that the bartender would check their ID before pulling a pint for them.

It takes a village to raise a child. There are no panopticons for sale in the next aisle over from the car seats. We are doing our best with very limited tooling, from the client all the way across the network (of which the tremendously incompetent schools make a mockery with an endless parade of new services and cross-dependencies). It will take a whole-of-society effort to lower risks.

      • HeWhoLurksLate19 minutes ago
        also there's a huge argument to be made that surveilling your kids is really really bad for their development
    • vinyl72 hours ago
      We live in a technofeudalist society now, we're all at the whims of the tech corps
    • runarberg2 hours ago
Blaming parents is a bit unwarranted when, on the other end, we have business interests driven by perverse incentives to prey on children's gullibility for their own profit.

      When you say “We‘ll try everything” that is simply not true, in particular what we do not try is strict consumer protection laws which prohibits targeting children. Europe used to have such laws in the 1980s and the 1990s, but by the mid-1990s authorities had all but stopped enforcing them.

      We have tried consumer protection, and we know it works, but we are not trying it now. And I think there is exactly one reason for that, the tech lobby has an outsized influence on western legislators and regulators, and the tech industry does not want to be regulated.

      • ipaddran hour ago
It is literally the parents' responsibility. You want to blame someone else. Raising a kid doesn't mean letting society raise them; you have to make tough choices.

        If parents can't handle that they can give them up to the state.

        • taeric39 minutes ago
          It is literally a platform's responsibility to make sure they are being used responsibly, as well?

          Imagine a gun range that was well aware that their grounds were being used in nefarious ways. We'd shut it down. A hospital that just blindly gave out pain killers to anyone that asked. We'd shut it down.

          Does this mean that a zero tolerance policy is what should be used to shut things down? I don't think so. We have some agency to control things, though.

        • runarbergan hour ago
          I am not gonna blame parents while businesses are allowed to target children with ads about the newest mobile game. Children are very easy to influence, and this is exploited heavily by the tech industry, who shower children with advertising. This is predatory behavior, which the legislator and the regulator of western governments (including Europe) has allowed to proliferate.

We cannot expect every parent to be able to protect their children when they are being preyed on by dozens of multi-million-dollar companies, and the state is on the side of the companies.

          • logicchainsan hour ago
            >Children are very easy to influence, and this is exploited heavily by the tech industry, who shower children with advertising

            The parents' job is to say no. If they're letting themselves be influenced too, that's bad parenting.

    • verisimian hour ago
      > We'll try everything, it seems, other than holding parents accountable for what their children consume.

      You've missed the point. No legislator or politician cares about what the parents are doing.

What they care about is gaining greater control of people's data to then coerce them endlessly (with the assistance of technology) into acting as they would like. To do that, they need all that info.

      "The children" is the sugar on the pill of de-anonymised internet.

  • antitoxican hour ago
I work at a European identity wallet system that uses a zero-knowledge-proof age identification system. It derives an age attribute such as "over 18" from a passport or ID, without disclosing any other information such as the date of birth. As long as you trust the government that issued the ID, you can trust the attribute, and anonymously verify somebody's age.

I think there are many pros and cons to age verification, but this method solves most of the problems this article supposes, especially if it is combined with other common practices in the EU such as deleting inactive accounts. These limitations are real, but tractable. IDs can be issued to younger teenagers, wallet infrastructure matures over time, and countries without strong identity systems primarily undermine their own age bans. Jurisdictions that accept facial estimation as sufficient verification are not taking enforcement seriously in the first place. The trap described in this article is a product of the current paradigm, not an inevitability.
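The selective-disclosure idea described here — revealing only a derived "over 18" attribute, never the date of birth — can be sketched with salted attribute hashes, the same basic mechanism mobile-ID-style credentials use. This is a stdlib-only simplification, not the poster's actual system: a real wallet would use an asymmetric issuer signature (e.g. Ed25519) where this sketch substitutes an HMAC with a shared key, and a true zero-knowledge predicate (proving "over 18" directly from a birth date) needs considerably more machinery.

```python
import hashlib
import hmac
import json
import secrets

def issue_credential(issuer_key: bytes, attributes: dict) -> dict:
    """Issuer salts and hashes each attribute, then authenticates the
    digest list. (A real issuer signs; HMAC is a stdlib stand-in.)"""
    salted = {name: (secrets.token_hex(16), value)
              for name, value in attributes.items()}
    digests = sorted(
        hashlib.sha256(f"{name}:{salt}:{value}".encode()).hexdigest()
        for name, (salt, value) in salted.items())
    tag = hmac.new(issuer_key, json.dumps(digests).encode(), "sha256").hexdigest()
    return {"salted": salted, "digests": digests, "tag": tag}

def present(credential: dict, name: str) -> dict:
    """Holder discloses one attribute plus its salt — nothing else."""
    salt, value = credential["salted"][name]
    return {"name": name, "salt": salt, "value": value,
            "digests": credential["digests"], "tag": credential["tag"]}

def verify(issuer_key: bytes, presentation: dict) -> bool:
    """Verifier checks the digest list is authentic, then that the
    disclosed attribute hashes into it. The birth date never appears."""
    expected = hmac.new(issuer_key, json.dumps(presentation["digests"]).encode(),
                        "sha256").hexdigest()
    if not hmac.compare_digest(expected, presentation["tag"]):
        return False
    disclosed = hashlib.sha256(
        f"{presentation['name']}:{presentation['salt']}:{presentation['value']}"
        .encode()).hexdigest()
    return disclosed in presentation["digests"]

key = secrets.token_bytes(32)
cred = issue_credential(key, {"age_over_18": "true",
                              "date_of_birth": "1990-01-01"})
proof = present(cred, "age_over_18")
print(verify(key, proof))                    # True: the attribute checks out
print("1990-01-01" in json.dumps(proof))     # False: the DOB never leaves the wallet
```

Note this also illustrates the offline-verification question raised below in the thread: the check needs only the issuer's verification key, with no callback to the government at presentation time.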

    • dom962 minutes ago
      That's really awesome. I hope that soon we will also have humanity verification without sacrificing our anonymity.

With LLMs and paid actors wreaking havoc on social media, I do think that social media needs to pivot towards allowing only human users. I wrote about this here: https://blog.picheta.me/post/the-future-of-social-media-is-h...

    • uniq743 minutes ago
      In your system, can companies verify age offline, or do they need to send a token to the Government's authority to verify it (letting the Government identify and track users)?

      Switzerland is working on a system that does the former, but if Government really wants to identify users, they can still ask the company to provide the age verification tokens they collected, since the Government hosts a centralized database that associates people with their issued tokens.

    • nemomarxan hour ago
      This is true, but I think it's more that those jurisdictions don't actually care about something solving this securely so much as they want face scans for other purposes?
    • Terretta33 minutes ago
      Not only EU -- Digital ID on iPhone does this today, and is accepted by many USA airports for travel, etc., with rollout for DLs.
    • gigel823 minutes ago
      Where can we learn more about your architecture?

      Someone brought up the need for device attestation for trust purposes (to avoid token smuggling for example). That would surely defeat the purpose (and make things much much worse for freedom overall). If you have a solution that doesn't require device attestation, how does that solve the smuggling issue (are tokens time-gated, is there a limit to token generation, other things)?

    • brunoborgesan hour ago
      Yeah, but how to convince investors that trusting the government-issued ID is good enough? /s
  • armchairhacker3 hours ago
    Age verification is very hard, because parents will give their children their unlocked account, and children will steal their parents' unlocked account. If that's criminalized (like alcohol), it will happen too often to prosecute (much more frequently than alcohol, which is rarely prosecuted anyways). I don't see a solution that isn't a fundamental culture shift.

    If there's a fundamental culture shift, there's an easy way to prevent children from using the internet:

    - Don't give them an unlocked device until they're adults

    - "Locked" devices and accounts have a whitelist of data and websites verified by some organization to be age-appropriate (this may include sites that allow uploads and even subdomains, as long as they're checked on upload)

    The only legal change necessary is to prevent selling unlocked devices without ID. Parents would take their devices from children and form locked software and whitelisting organizations.

    • horsawlarway2 hours ago
      I don't understand how this is any better.

      It's my job as a parent (and I have several kids...) to monitor the things they consume and talk with them about it.

I don't want some blanket ban on content unless it's "age appropriate", because I don't approve of that content being banned. (Honestly, the idea of "age appropriate" is insulting in the first place.)

      Fuck man, I can even legally give my kids alcohol - I don't see why it's appropriate to enforce what content I allow them to see.

      And I have absolutely all of the same tools you just discussed today. I can lock devices down just fine.

      Age verification is a scam to increase corporate/governmental control. Period.

      • armchairhacker2 hours ago
        You should be able to choose what's age-appropriate for your kids. Giving them access to e.g. "PG-13" media when they're 9 isn't the problem. Giving mature kids unrestricted access isn't a problem. The problem is culture:

        - Many parents don't think about restricting their kids' online exposure at all. And I think a larger issue than NSFW is the amount of time kids are spending: 5 hours according to this survey from 2 years ago https://www.apa.org/monitor/2024/04/teen-social-use-mental-h.... Educating parents may be all that is needed to fix this, since most parents care about their kids and restrict them in other ways like junk food

        - Parents that want to restrict their kids struggle with ineffective parental controls: https://beasthacker.com/til/parental-controls-arent-for-pare.... Optional parental controls would fix this

      • aidenn02 hours ago
        > Fuck man, I can even legally give my kids alcohol - I don't see why it's appropriate to enforce what content I allow them to see.

In the USA it depends on the state. Federal guidelines for alcohol law do suggest exemptions for children drinking under the supervision of their parents, but that's not uniformly adopted. 19 states have no such exceptions, and in many of the remaining 31, restaurants may be banned from allowing alcohol consumption by minors even when their parents are present.

        • gmueckl2 hours ago
          You're assuming that this person is in the US. Alcohol is treated far more liberally in other places. For example, in some places it is legal for restaurants to serve alcohol to minors who are accompanied by a parent...

Another thing: I fundamentally disagree with certain age ratings for kids' content. Some explicit violence is rated OK for young audiences, but insert a swear word or some skin and the age rating is bumped up? This rating system is no help at all. I have to review each bit of content anyway before I can be certain.

          • aidenn0an hour ago
            Starting a comment with "In the USA..." is the exact opposite of assuming a person is in the US.
      • rolph2 hours ago
        this seems to be an issue of being able to be a parent, period.

        yup we should all be able, to talk to our kids instead of screaming at them.

    • Aurornis2 hours ago
      > Age verification is very hard, because parents will give their children their unlocked account, and children will steal their parents' unlocked account

      More simply: If ID checks are fully anonymous (as many here propose when the topic comes up) then every kid will just have their friends’ older sibling ID verify their account one afternoon. Or they’ll steal their parents’ ID when they’re not looking.

      Discussions about kids and technology on HN are very weird to me these days because so many commenters have seemingly forgotten what it’s like to be a kid with technology. Before this current wave of ID check discussions it was common to proudly share stories of evading content controls or restrictions as a kid. Yet once the ID check topic comes up we’re supposed to imagine kids will just give up and go with the law? Yeah right.

      • armchairhacker2 hours ago
        The older sibling should be old enough to know better. Or if they're still a kid, they can have their privileges temporarily revoked.

        This problem probably can't be solved entirely technologically, but technology can definitely be a part of solving it. I'm sure it's possible to make parental controls that most kids can't bypass, because companies can make DRM that most adults can't bypass.

        • Aurornis2 hours ago
          > The older sibling should be old enough to know better.

          This is exactly what I meant by my above comment: It’s like the pro-ID check commenters have become completely disconnected from how young people work.

Someone’s 18-year-old sibling isn’t going to be stopped by “should know better”. They probably disagree with the law on principle and think it’s dumb, so they’re just helping out.

          • armchairhacker42 minutes ago
            True, hence the culture shift is necessary.

            But imagine if a locked device was treated like alcohol. Most kids get access to alcohol at some point despite it being illegal, often from older siblings, and rarely with legal consequences for the adult. But it's much less of an issue, because most kids don't get it consistently. Furthermore, "good" kids understand that it's bad, and even some "bad" kids understand that they must limit themselves.

        • kmijyiyxfbklaoan hour ago
          >Or if they're still a kid, they can have their privileges temporarily revoked.

Since people are already talking about using the law instead of parenting, this needs clarification. Are the parents the ones who would revoke their privileges, or the government?

          • armchairhackeran hour ago
            The parents. They're the ones who configure the parental controls. e.g. if their 15-year old gets caught sharing his device with their 7-year old, they can temporarily give him 7-year old permissions as punishment.
      • Terretta29 minutes ago
        More simply: If ID checks are fully anonymous (as many here propose when the topic comes up) then every kid will just have their friends’ older sibling ID verify their account one afternoon. Or they’ll steal their parents’ ID when they’re not looking.

        Digital ID with binary assertion in the device is an API call that Apple's app store curation can ensure is called on app launch or switch. Just checking on launch or focus resolves that problem. It's no longer the account being verified per se, it's the account and the use.

      • aleph_minus_one2 hours ago
        > If ID checks are fully anonymous (as many here propose when the topic comes up) then every kid will just have their friends’ older sibling ID verify their account one afternoon.

        Exactly the same way that kids used in former days to get cigarettes or alcohol: simply ask a friend or a sibling.

        By the way: the owners of the "well-known" beverage shops made their own rules, which were in some sense more strict, but in other ways less strict than the laws:

        For example some small shop in Germany sold beverages with little alcohol to basically everybody who did not look suspicious, but was insanely strict on selling cigarettes: even if the buyer was sufficiently old (which was in doubt strictly checked), the owner made serious attempts to refuse selling cigarettes if he had the slightest suspicion that the cigarettes were actually bought for some younger person. In other words: if you attempted to buy cigarettes, you were treated like a suspect if the owner knew that you had younger friends (and the owner knew this very well).

      • __MatrixMan__2 hours ago
        Circumventing controls as a kid is what taught me enough about computers to get the job that made college affordable (in those days you could just boot windows to a livecd Linux distro and have your way with the filesystem, first you feel like a hacker, later the adults are paying you to recover data).

        If we must have controls, I hope the process of circumventing them continues to teach skills that are useful for other things.

      • skeptic_ai2 hours ago
        They'll probably limit it to one device per person, to save the children, so we won't share with others.

        (So you need to keep all your stuff on one device to be fully tracked easily. And have no control over your device, share your location… )

    • everdrive3 hours ago
      Completely agree. The internet works differently than how people want it to, and filtering services are notoriously easy to bypass. Even if these age-verification laws passed with resounding scope and support, what would stop anyone from merely hosting porn in Romania or some other country that didn't care about US age-verification laws? The leads to run down would be legion. I think you could seriously degrade the porn industry (which I wouldn't necessarily mind), but it would be more or less impossible to prevent unauthorized internet users from accessing pornography. And of course that's to say nothing of the blast radius that would come with age verification becoming entrenched on the internet.
      • armchairhacker3 hours ago
        > what would stop anyone from merely hosting porn in Romania or some country that didn't care about US age-verification laws

        A government could implement the equivalent of China's great firewall. Even if it doesn't stop everyone, it would stop most people. The main problem I suspect is that it would be widely unpopular in the US or Europe, because (especially younger) people have become addicted to porn and brainrot, and these governments are still democracies.

        • 9dev2 hours ago
          That isn’t necessary because porn companies don’t exist to gift orgasms, but to make money. They need US citizens to pay them for premium content and subscriptions, and that dependency means they’ll have to comply with US laws.
          • everdrivean hour ago
            The words of someone who does not actually look at pornography. The vast majority of pornography-by-consumption is free / ad-supported. Customers are not "paying" and those ads are usually the bottom of the barrel with regard to sleaziness or legality.
            • 9dev25 minutes ago
              It’s still just a sales funnel for ads or subscriptions. Why do you think porn sites exist?
          • 2ductan hour ago
            Plenty of porn exists for free, posted online by models or digital artists. It's archived in places that circumvent copyright, don't require payment or accounts, and are easily accessible.
        • big-and-smallan hour ago
          > A government could implement the equivalent of China's great firewall. Even if it doesn't stop everyone, it would stop most people.

          Porn is not just political information about human right abuses, government overreach or heavily censored overview of concentration camps for "group X". People can live just fine with government censorship buying into any kind of propaganda.

          Kids would find a way to access porn though. Whether it's VPNs, Tor, or a USB-stick black market. The government can't even win the war on drugs, and you expect them to successfully ban porn? What a joke.

        • logicchains44 minutes ago
          Even China hasn't been remotely successful at banning porn, and it already has the great firewall and porn is illegal there.
        • a4564632 hours ago
          eh... they are more like `dumbocracies` with these measures. None of this is to protect children. Except to satisfy rabid parents who think the world needs to serve them.
    • brisky2 hours ago
      Just a personal anecdote from my life - I have set up a YouTube account for my kid with the correct age restrictions (he is 11). This account is also under a family plan, so there are no ads.

      My kid logs out of this account so he can watch restricted content. I wonder - what is the PG rating for the logged-out experience?

    • Buttons8403 hours ago
      And we need a standard where websites can self-rate their own content. Then locked devices can just block all content that isn't rated "G" or whatever.
      • 9dev2 hours ago
        Wrong incentive. If you don’t give a shit about exposing children to snuff or porn, but do give a shit about page views and ad revenue, you obviously don’t rate your content or rate it as G to increase that revenue.
      • armchairhacker3 hours ago
        I imagine there would be a set of filters, including some on by default that most adults keep for themselves. For example, most people don't want to see gore. More would be OK with sexual content, even more would be OK with swear words, ...
    • hawk_2 hours ago
      > If there's a fundamental culture shift,

      You mean this culture shift is needed for the masses, but I don't think that's the case. In my wider social circle I am not aware of anyone giving alcohol to young kids (by the time they are 16ish, yes, but even that's rare). Most guardians would willingly do similar with locked devices.

      The real problem is that the governments/companies won't get to spy on you if locked devices are given to children only. They want to spy on us all. That's the missing cultural shift.

      • aleph_minus_one2 hours ago
        > Most guardians would willingly do similar with locked devices.

        Considering the echo chamber in which I was at school, my friends would have simply used some Raspberry Pi (or a similar device) to circumvent any restriction the parents imposed on the "normal" devices.

        Oh yes: in my generation pupils

        - were very knowledgeable in technology (much more than their parents and teachers) - at least the nerds who were actually interested in computers (if they hadn't been knowledgeable, they wouldn't have been capable of running DOS games),

        - had a lot of time (no internet means lots of time and being very bored),

        - were willing to invest this time into finding ways to circumvent technological restrictions imposed upon them (e.g. in the school network).

      • armchairhacker2 hours ago
        The kids in your social circle are used to not having access to alcohol, but they're not used to not having access to social media.

        Hypothetically, if every kid in your social circle had their device "locked", the adults would probably have a very hard time keeping the kids away from their devices, or just relent, because the kids would be very unhappy. Although maybe with today's knowledge, most people will naturally restrict new kids who've never had unrestricted access, causing a slow culture shift.

    • raw_anon_11112 hours ago
      So kids can drive at 16, but can't get access to an unlocked phone until they're 18? Who gets to decide the whitelist? The government?
      • armchairhacker34 minutes ago
        I never specified age.

        The whitelist would be decided by the market: the parents have the unlocked device, there are multiple solutions to lock it and they choose one. Which means that in theory, the dominant whitelist would be one that most parents agree is effective and reasonable; but seeing today's dominant products and vendor lock-in...

    • kristopolous3 hours ago
      I mean look, there's a point where the manufacturers back off and entrust the parents.

      Any parent can be reckless and give their children all kinds of things - poison, weapons, pornographic magazines ... at some point the device has enough protective features and it is the parents responsibility.

      • 2duct2 hours ago
        Digital media use is easier to conceal than weapons. My parents did not protect me from it growing up because they were not responsible, and I was harmed as a result. To this day they still do not realize I was harmed, because I did not tell them and we are not on speaking terms. Trying to be honest would have resulted in further rejection from them. This was on a personality level and I had no way to deal with this as a developing human.

        I could not control how my parents were going to raise me, I was only able to play with the hand I was dealt. I hate the idea that parents are sacrosanct and do not share blame in these situations. At the same time, if this is just the family situation you're given and you're handed a device unaware of the implications, who is going to protect you from yourself and others online if your parents won't? Should anyone?

    • dyauspitr2 hours ago
      Yes, we need a fundamental shift where sharing of parent accounts is treated as at least some sort of infraction, or maybe even a misdemeanor.
      • armchairhacker2 hours ago
        This could help, but without the culture shift, way too many parents will intentionally and unintentionally break that law.
    • TZubiri3 hours ago
      >parents will give their children their unlocked account, and children will steal their parents' unlocked account.

      I think either is better than the staus quo. In the first case the parent is waiving away the protections, and in the second the kid is.

      Even if a kid buys alcohol, I think it's healthier that they do it by breaking rules and faking ids and knowing that they are doing something wrong, than just doing it and having no way to know it's wrong (except a popup that we have been trained by UX to close without reading (fuck cookie legislation))

      • armchairhacker2 hours ago
        That would be the status quo if we had better parental controls.

        Trying to enforce parental controls via regulation may only be as effective as Europe enforcing the DMA against Apple. But maybe not, because there's a huge market; if Apple XOR Android does it, they'll gain market share. Or governments can try incentive instead of regulation (or both) and fund a phone with better parental controls. Europe wants to launch their own phone; such a feature would make it stand out even among Americans.

    • scotty793 hours ago
      Proof of adulthood should be provided by the bank after logging into a bank account. I'm sure parents wouldn't just let their bank details be stolen and such.

      Of course no personal details should be provided to the site that requests age confirmation. Just "the bearer of this token is an adult".
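A minimal sketch of such a "bearer of this token is an adult" token, assuming a bank-issued signed claim. This is illustrative, not any bank's real API, and a real deployment would use public-key signatures so the verifying site never holds the signing secret; a shared HMAC key is used here only for brevity:

```python
# The bank checked the customer's age at account opening; the token carries
# only the claim and an expiry -- no name, no account number, no birthdate.
import hashlib
import hmac
import json
import time

BANK_SECRET = b"demo-only-shared-secret"  # real systems: public-key signatures

def issue_age_token() -> str:
    claim = json.dumps({"adult": True, "exp": int(time.time()) + 3600})
    sig = hmac.new(BANK_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return claim + "." + sig

def verify_age_token(token: str) -> bool:
    claim, _, sig = token.rpartition(".")
    expected = hmac.new(BANK_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    payload = json.loads(claim)
    return payload.get("adult") is True and payload["exp"] > time.time()
```

The site verifying the token learns one bit ("adult: yes") plus an expiry; everything identifying stays with the bank.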

      • m4rtinkan hour ago
        The "Bank identity" system in the Czech Republic (and likely other countries) can be used to log in to various government services. The idea is that you already authenticated to the bank when getting the account, so they can be sure it is really you when you log in - so why not make it possible for you to log in to other services as well, if you want to?
        • scotty79an hour ago
          > The "Bank identity" system in Czech Republic (and likely other countries) can be used to log into to various government services.

          In Poland we have the same setup.

      • 9dev2 hours ago
        So we trust a bank more than the government that they won’t extend this to earn more money by disclosing more information? Bad idea. You need a neutral broker.
        • armchairhacker31 minutes ago
          AFAIK today if you buy a device, the bank doesn't get the device-unique identifier, at best it sees the model number.
    • DeathArrow3 hours ago
      That is actually a very good solution that is respecting privacy. And is much more effective than asking everyone for ID when opening a website or app.
    • delusional3 hours ago
      How does this solve the problem at all? You're just making more problems. Now you have to deal with a black market of "unlocked" phones. You have to deal with kids sharing unlocked phones. Would police have to walk around trying to buy unlocked phones to catch people selling them to minors? What about selling phones on the internet - would they check ID now?

      SOME parents give their children access to their ID. That is NOT the same as ALL parents, and therefore is not a reason not to give those parents a helping hand.

      Even just informing children that they're entering an adult space has some value, and if they then have to go ask their parents to borrow their wallet, that's good enough for me.

      • armchairhacker2 hours ago
        It would not be solved without a culture shift. But with a culture shift, giving a kid an unlocked device would be as rare as giving them drugs.

        I'm sure it will occasionally happen. But kids are terrible at keeping secrets, so they will only have the unlocked device for temporary periods, and I believe infrequent use of the modern internet is much, much less damaging than the constant use we see problems from today. A rough analogy, comparing social media to alcohol: it's as if today kids are suffering from chronic alcoholism, and in the future, kids occasionally get ahold of a six pack.

    • noitpmeder3 hours ago
      I actually don't hate this??? As long as parents can set up their own whitelists and it's not up to the government to have the final say on any particular block.
      • akersten3 hours ago
        Parents can do this today if they wanted to

        The problem of "kids accessing the Internet" is a purposeful distraction from the intent of these laws, which is population-level surveillance and Verified Ad Impressions.

        • armchairhacker2 hours ago
          Today, in practice it's not a choice, because even the most attentive parents fail to block internet access. Parental controls are ineffective, and all the kid's friends have access so they become alienated. https://beasthacker.com/til/parental-controls-arent-for-pare...

          But laws alone won't fix this, and laws aren't necessary (except maybe a law that prevents kids from buying phones). In the article, the child's devices had parental controls, but they were ineffective. There's demand for a phone with better parental controls, so it will come, and more parents are denying access, so their kids will become less alienated.

    • skeptic_ai2 hours ago
      Definitively we should have constant verification of the current user with Face ID or similar tech. Every 5 minutes of usage, your camera is activated to check who’s using your phone and validates it. So much secure and safe. /s
    • cromka3 hours ago
      This is Nirvana/Perfect Solution fallacy. That's like saying limiting smoking to 18 y/o was futile because teenagers could always have some other adult buy them cigs, or use fake IDs.

      Ridiculous take.

      • akersten3 hours ago
        Well, age verification is the "we have to do something about this nebulous problem even if the best thing we can think of actually makes everything worse for everyone but it makes us feel better" fallacy, which is equally ridiculous.
        • cromka3 hours ago
          No, it's not the same. There are anonymous solutions that solve this problem that are perfectly acceptable. Not perfect for prevention, but a good compromise nonetheless. Like cig/alcohol underage consumption prevention.
          • akersten2 hours ago
            I think we totally disagree on the degree of how much this is actually a problem compared to how much we're willing to invest in it. Those anonymous solutions are fairly idealistic and Nirvana-esque themselves, I don't think they'd see wide adoption. Beyond that I'm firmly in the camp that age verification for the kids is a complete smokescreen for the actual intent of these efforts, which is more surveillance, so on principle I'm opposed to any movement in this direction and doubt we'll find common ground.
            • cromka2 hours ago
              Yeah, sure, no matter the studies, no matter the developmental indices, no matter the WHO, no matter the psychologists. Let's also talk about climate change and how it's up for debate?

              We don't disagree on whether it is actually a problem, you just have your opinion about facts.

              • akersten2 hours ago
                We are arguing different things. I have never stated "psychological effects of the Internet aren't real and therefore this discussion is moot." My argument is "psychological effects or not (and personally I think they are overplayed), the privacy tradeoff of trying to fix them is not worth it (and I doubt any vague gestures in the direction of age assurance would help)." You are focusing on the first parenthetical but the important part is outside it.

                We also have no way to actually measure this even if we wanted to do an experiment. So comparing this very soft science to climate change is a bit out of pocket.

                • cromka2 hours ago
                  > We also have no way to actually measure this even if we wanted to do an experiment.

                  Sorry, WHAT? No way to measure it? My god, are we talking about the same thing? Are you sure you haven't missed past 12-24 months of increased reporting on the matter from several different angles, from cognitive skills, anxiety, sexual drive, and so on?

                  EOT for me.

      • armchairhacker2 hours ago
        I'm saying that, in today's culture, age-gating the internet is likely to be much less effective than age-gating alcohol or tobacco. Most kids spend an appalling amount of time on social media (think, 5 hours/day*); most kids didn't spend this much time or invest this much of their lives into drugs.

        * according to this survey from over 2 years ago: https://www.apa.org/monitor/2024/04/teen-social-use-mental-h...

        • a4564632 hours ago
          Not saying that I support age verification in any form, but have you seen the vape sales?
          • armchairhackeran hour ago
            I like to believe that, even with the amount of kids vaping, there aren't nearly as many as kids on social media.

            To give perspective: in my high school, there were a few kids who vaped in bathrooms, but the majority (including me) did not; we were told many times that it was unhealthy, and anyone caught vaping would be suspended. Everyone I know (including me) had social media, we were not told it was unhealthy (only to not use it too much, not give out PII, avoid bullying, etc.), and it wasn't even policed in some classrooms.

      • hamdingers2 hours ago
        For the smoking analogy to fit, you'd have to have parents giving their children packs of cigarettes to play with and then being mad at Marlboro they figured out how to smoke them.
  • nye2k10 minutes ago
    I worked for a decade in what I would consider the highest level of our kids' privacy ever designed, at PBS KIDS. This was coming off a startup that attempted to do the same for grownups, but failed because of dirty money.

    Every security attempt becomes a facade or veil in time, unless it's nothing. Capture nothing, keep nothing, say nothing. Kids are smart AF and will outlearn you faster than you can think. Don't even try to capture PII ever. Watch the waves and follow their flow, make things for them to learn from but be extremely careful how you let the grownups in, and do it in pairs, never alone.

  • Wobbles4221 minutes ago
    The purpose of a system is what it does.

    Undermining data protection and privacy is clearly the point. The fact that it's happening everywhere at the same time makes it look to me like a bunch of leaders got together and decided that online anonymity is a problem.

    It's not like kids having access to adult content is a new problem after all. Every western government just decided that we should do something about it at roughly the same time after decades of indifference.

    The "age verification" story is casus belli. This is about ID, political dissent, and fears of people being exposed to the wrong brand of propaganda.

    • snerbles16 minutes ago
      Exactly. So many comments here about technical solutions are missing the underlying government/authority problem, or are actively a part of it.
  • aqme283 hours ago
    If we're going to do this at all, it should be on the device, not the website/app. Parents flag their child's device or browser as under 18, and websites/apps follow suit. Parents get the control they're looking for, while service providers don't have to verify or store IDs. I guess it's just more difficult to pressure big dogs like google/apple/mozilla for this than pornhub and discord.
  • agentultra3 hours ago
    There are alternatives to ID verification if the goal is protecting children.

    You could, for example, make it illegal to target children with targeted advertising campaigns and addictive content. Then throw the executives who authorized such programs in jail. Punish the people causing the harm.

    • varenc3 hours ago
      If targeting children with advertising got corporate execs thrown in jail, wouldn't the companies just roll out age verification for users like they do now? How would this rule change their behavior? They have to know who the children are to not target them.

      Stronger punishment creates more of an incentive to age verify. Which is basically why it's happening now.

      • cloverichan hour ago
        > They have to know who the children are to not target them.

        There is a difference between identifying specific children and running programs that target children more generally, and/or having research that shows how your product harms children and failing to do anything to stop it. We can tackle both of those issues without requiring age verification. We're headed down the path of age verification because we know now that not only is social media harmful, it's especially harmful to kids, and has been specifically targeted at them. Those are things that can be fixed, regardless of how you feel about age verification. It's no different than tobacco not being allowed to create advertisements for kids; it's the same type of people doing the same types of things in the end.

      • barbazoo2 hours ago
        At least then it wouldn't be the government requiring it, is what people may think I imagine.
        • b40d-48b2-979ean hour ago
          The problem is private companies being extensions of what the government wants to do, like all of the surveillance tech in the US right now basically eviscerating the fourth amendment since they willingly hand over their data to the government without even a court order in many cases.
    • cubefox3 hours ago
      To avoid your proposed punishment, they will implement things like ... ID verification.
      • Perz1valan hour ago
        Then the ones who won't will become the preferred choice.
    • NoMoreNicksLeftan hour ago
      >Then throw the executives who authorized such programs in jail.

      Gee, I wonder if the executives who are suspected of doing such things haven't spent the last 100 years building the infrastructure necessary to avoid charges, let alone jail time? Large corporate legal departments, wink-wink-nudge-nudge command and control hierarchies where nothing incriminating is ever put into writing, voluminous intra-office communications that bury even the circumstantial evidence so deeply no jury could understand it even if the plaintiffs/state could uncover it, etc.

      Anyone over the age of 12 who thinks corporate entities can be made accountable in a meaningful way is more than naive. They are cognitively defective. Or is it that you realize they can't be held accountable, but you'd rather maintain the status quo than contemplate a country which abolished them and enforced that all business was conducted by sole proprietorships and (small-n) partnerships?

      • agentultra6 minutes ago
        For a while it was thought that we could never bring back anti-trust.

        Sure, there's a lot of corruption right now. Doesn't have to stay that way.

    • scotty793 hours ago
      Facebook advertises outright scams and nobody manages to punish them for that.
  • julianozen2 hours ago
    There is a solution missing here.

    Give our personal devices the ability to verify our age and identity securely, and store it on device like they do our fingerprint or face data.

    Services that need it only verify it cryptographically. So my iPhone can confirm I'm over 21 for my DoorDash app, in the same way it verifies my biometric data.

    The challenge here is the adoption of these cryptographic services, and whether companies can rely on devices for compliance without having to cut off service for those who haven't set it up.
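The cryptographic verification step described above can be sketched as a challenge-response, which also keeps a copied or screenshotted token from being replayed. All names here are hypothetical, and HMAC stands in for the public-key signature a secure enclave would actually produce:

```python
# Sketch: the service sends a fresh challenge, and only the enrolled device
# (holding a key in its secure hardware) can answer it, binding the one-bit
# age claim to this specific request.
import hashlib
import hmac
import os

DEVICE_KEY = os.urandom(32)  # in reality, locked inside the secure enclave

def device_answer(challenge: bytes, is_over_21: bool) -> tuple:
    # The device reveals only the boolean, tied to this one challenge.
    msg = challenge + (b"over21" if is_over_21 else b"under21")
    return is_over_21, hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()

def service_verify(challenge: bytes, claim: bool, proof: str) -> bool:
    # With real public-key crypto the service would check a signature; here
    # the shared key simulates that verification step.
    msg = challenge + (b"over21" if claim else b"under21")
    expected = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    return claim and hmac.compare_digest(proof, expected)
```

Because each proof is bound to a single challenge, forwarding a captured proof to a different session fails verification.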

    • some_random2 hours ago
      The real problem with this is that the ultimate objective isn't age verification, it's complete de-anonymization. I think different groups want this for different reasons, but the simple reality is that minimizing the identity information transferred around is antithetical to their goals.
      • dom96a few seconds ago
        Is it though? Do you have any proof that is the case?
      • knallfroschan hour ago
        Google/Apple already know where you and your mistress live. In case you pay for any service, they've got your identity too. Ever had a single shipment confirmation to your address come to your mail? They know who you are.

        The hardware providers already have the information. You only need to make them reveal it to 3rd parties.

      • Seattle3503an hour ago
        If we create age verification tools with strong privacy protections that solve the problems they raise, we can call their bluff.

        If we fight every and any solution, we may end up with their solution, because they build it. We end up in the position of saying "don't use the thing they built" without offering alternatives. I'd rather be saying "use what we built, it is better."

    • EnderWT2 hours ago
      ISO/IEC 18013-5 (Personal identification — ISO-compliant driving licence — Part 5: Mobile driving licence (mDL) application) is a potential solution for this. https://www.iso.org/obp/ui/en/#iso:std:iso-iec:18013:-5:ed-1...

      It would allow someone with an mDL on their device to present only their age instead of other identifying information.
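As a rough illustration of that selective-disclosure idea (the structure is simplified and not a conformant ISO/IEC 18013-5 mdoc exchange), a verifier can request only an age_over_18 element and the wallet answers nothing else:

```python
# Simplified shape of an mDL age request: the verifier lists exactly which
# data elements it wants, so an age check never touches name or birthdate.

def build_age_request() -> dict:
    return {
        "docType": "org.iso.18013.5.1.mDL",
        "nameSpaces": {
            "org.iso.18013.5.1": {
                # element name -> "intent to retain" flag
                "age_over_18": False,  # verifier will not store the response
            }
        },
    }

def wallet_respond(request: dict, holder_age: int) -> dict:
    # The wallet answers only what was asked for.
    asked = request["nameSpaces"]["org.iso.18013.5.1"]
    return {name: (holder_age >= 18) for name in asked if name == "age_over_18"}
```

In the real protocol the response is signed by the issuing authority, so the verifier gets an authenticated boolean rather than trusting the device's word.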

    • luplex2 hours ago
      I think this is what my German electronic ID card does. The card connects to an app on my phone via NFC, a service can cryptographically verify a claim about my age, and no additional info is leaked to the service provider or the government.
      • encrux31 minutes ago
        I think this is actually the correct way to move forward.

        We should be able to verify facts about people on the internet without compromising personal data. Giving platforms the ability to select specific demographics will, in my view, make the web a better place. It doesn’t just let us age restrict certain platforms, but can also make them more authentic. I think it’s really important to be able to know some things to be true about users, simply to avoid foreign election interference via trolling, preventing scams and so much more.

        With this, enforcement would also be increasingly easy: Platforms just have to prove that they’re using this method, e.g. via audit.

      • hexyl_C_gutan hour ago
        But that doesn't verify that the person using the ID is the person that it was issued to.
        • Seattle3503an hour ago
          Its better than what we have now. Maybe using the ID could require a PIN code if we wanted to enhance security.
    • snvzz2 hours ago
      The solution has always been there: Assume everybody is an adult.

      The only reasonable way to deal with children on the Internet is to treat Internet access like access to alcohol/drugs. There is no need for children to access the Internet full stop.

      Internet is a network in which everything can connect to everything, and every connected machine can run clients, servers, p2p nodes and what not. Controlling every possible endpoint your child might connect to is not feasible. Shutting the entire network down because "won't somebody please think of the children" is not acceptable.

      And, don't let them trick you. This is the endgoal. An unprecedented level of control over the flow of information.

      • Squarexan hour ago
        So you would deny children the greatest source of knowledge in the history? I have learned math and programming thanks to unlimited access to the web and would not be where I am without it.
  • Cthulhu_3 hours ago
    > And the only way to prove that you checked is to keep the data indefinitely.

    This is a false premise already; the company can check the age (or have a third party like iDIN [0] do it), then set a marker "this person is 18+" and "we verified it using this method at this date". That should be enough.

    [0] https://www.idin.nl/en/
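A sketch of the minimal record described above: keep the attestation that a check happened, when, and by what method, and discard the evidence itself. Field names are illustrative:

```python
# Persist only the assertion "18+: yes, verified via method X on date Y" --
# no name, birthdate, document number, or document scan.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AgeCheckRecord:
    user_id: str    # the site's own account id, not a government id
    over_18: bool
    method: str     # e.g. "iDIN"
    checked_on: date
    # deliberately absent: name, birthdate, document number, document scan

def record_check(user_id: str, method: str) -> AgeCheckRecord:
    # The third-party verifier returned only "18+: yes"; we store that
    # assertion and when/how it was obtained, nothing more.
    return AgeCheckRecord(user_id, True, method, date.today())
```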

    • Attrecomet2 hours ago
      Doesn't matter, I've already had to provably identify myself, the information is a) out there b) will be used and stored, and c) will be abused

      and there is nothing I or the few (in terms of power) well-meaning government and corporate actors can do to change that.

    • reorder969544 minutes ago
      And how do they prove to me they (and no 3rd party providers) aren't actually storing the data? I simply don't trust companies telling me they won't store something, so to me the only acceptable option is the data to never leave my device.
    • enraged_camel2 hours ago
      Nope, as the article notes, it is actually almost never enough because it does not stand up to legal scrutiny. And for good reason: there's no way to conclusively prove that the platform actually verified the user's age, as opposed to simply saying they did, before letting them in.
  • enjoykaz3 hours ago
    Most of this debate makes more sense if the actual goal is liability reduction, not child safety. If it were genuinely about protecting kids, you'd regulate infinite scroll and algorithmic engagement optimization, not who can log in.
    • malfist3 hours ago
      If the US really cared about child safety they'd go after the people in the Epstein files.
    • cloverichan hour ago
        Most people have only a light grasp of what infinite scroll and algorithmic engagement optimization mean. They know they like the scrolling apps more, but it takes a bit of research and education to really understand the specific mechanics and alternatives. We understand this well because we're tech literate, but many people using these apps today are neither tech literate nor even remember a world before infinite-scrolling media was a thing. It seems an incredibly obvious mechanism, but I've explained it to people, and it takes a few times for it to really sink in and become a specific mental model for how they see the world.
    • stahorn2 hours ago
      I think this would be good for everybody, not just kids. It doesn't even have to be complicated: Just that after a certain amount of time scrolling/watching, put in a message asking if it's maybe time to stop with some information about how these algorithms try to keep you for as long as possible. Maybe a link to a government page with more information.

      It doesn't have to be perfect and there will of course be easy workarounds to hide the warnings for people who want them. The goal is to improve the situation, though, not solve it perfectly. Like putting information about the dangers of smoking on cigarette packages; it doesn't stop people from smoking but it does make the danger very easy to learn.

    • barbazoo2 hours ago
      I'm happy they don't because they don't know what they're doing. Hopefully countries prioritizing public health will implement a social media ban for the vulnerable population which gives them some time to grow up without all that garbage poisoning their brains. Then when they're 16 or whatever age, hopefully by that age we'll have realized that this is actually like cigarettes and everyone, all age groups treat it like that.

      Better than muddying the waters trying to make it less addictive but then letting them on there when their brains aren't ready.

    • kristopolous3 hours ago
      I think it's because there's always a group of nosy busybodies finger-wagging about protecting the children and we have to do decorative theatrics to satiate whatever narratives they've convinced themselves of
      • swiftcoder3 hours ago
        This is a group particularly beloved by politicians, because you can pretty much use them as a smokescreen whenever you want to pass authoritarian legislation...
    • kneel253 hours ago
      Pretty sure they’re doing both of those things but it takes a long time for the regulation to reach the final stage
    • dfxm122 hours ago
      Interestingly, regulating these would be good for adults as well. A lot of these very large online companies enjoy an asymmetric power advantage. We should aim to protect ourselves against them, in addition to our children.
  • arn3n29 minutes ago
    Parents are competing with multi-trillion-dollar companies that have invested untold amounts of cash and resources into making their content addictive. When parents try to help their children, it's an uphill battle -- every platform that has kids on it also tends to have porn, or violence, or other things, as these platforms generally have disappointingly ineffective moderation. Most parents turn to age verification because it's the only way they can think of to compete with the likes of Meta or ByteDance, but the issue is that these platforms shouldn't have this content to begin with.

    Platforms should be smaller -- the same site shouldn't be serving both pornography and my school district's announcement page and my friend's travel pictures. Large platforms are turning their unwillingness to moderate into legal and privacy issues, when in fact it should simply be a matter of "these platforms have adult content, and these ones don't". Then parents could much more easily ban specific platforms and topics.

    Right now there are no levers to pull or adjust, and parents have their hands tied. You can't take kids off Instagram or TikTok -- they will lose their friends. I hate the fact that the "keep up with my extended family" platform is the same as the "brainrot and addiction" one. The platforms need to be small enough that parents actually have choices about what to let in and what not to. Until either platforms are broken up via antitrust or the burden of moderation is put on the companies, we're going to keep getting privacy-infringing solutions.

    If you support privacy, you should support antitrust, else we're going to be seeing these same bills again and again and again until parents can effectively protect their children.

  • TimPC3 hours ago
    Big tech likes this because there are a lot more face recognition technologies in the wild in real life, and being able to connect all real-life data to online data is quite valuable. It's also quite possibly the largest training set ever for face recognition if IDs are stored, and given how IDs and images are sold across many companies, it seems highly probable that some company will retain the data rather than delete it after use.
    • iririririr3 hours ago
      China (and the US, via Latin American countries and its own poor people ...via benefit-program access via id.gov) is testing both biometrics and device IDs to evaluate the pros and cons, and to merge data, when it comes to autocratic control.

      In China there are places where you scan your device and get coupons, usually at elevators in residential buildings, so they can also easily track whether you're arriving or leaving.

      In the US every store tracks and reports your Bluetooth IDs to ad networks. And we know what happens to ad networks.

      US now requires cars to report data, which was optional before (e.g. onstar) and china joined on this since the ev boom.

      the public id space is booming.

      • drnick12 hours ago
        > US now requires cars to report data, which was optional before (e.g. onstar) and china joined on this since the ev boom.

        This isn't true, there is no federal requirement for a cellular modem in cars. Most modern cars have one, but nothing prevents you from disabling or removing it. I certainly would not tolerate such a "bug" in my car.

        > In the US every store tracks and reports your Bluetooth IDs to ad networks.

        This also isn't true, modern phones randomize Bluetooth identifiers. I personally disable Bluetooth completely.

    • Noaidi3 hours ago
      So don't use big tech. No one needs discord, or porn, or social media. But this is not the answer. The answer is fighting to change the laws. And we can start changing the laws by boycotting big tech. Laws are changed by money flows, not ideology.
  • flerchin27 minutes ago
    The thing that needs to be age banned, or really just banned, is algorithmic feeds with infinite scroll. Kids (and adults) need to just interact with their friends, and block all the bait.
    • brainwad22 minutes ago
      For adults, I think a legal opt-in-only policy would work well. And require reconsent with every major algorithm change.
  • trashban hour ago
    I would like to take the discussion in the other direction. How about we offer safe spaces instead of banning the unsafe spaces for kids.

    Similar to how there are specific channels for children on TV. Perhaps the government could even incentivize such channels. It would also make it easier for parents to monitor and set boundaries: parents would only need to check that the TV is still tuned to the Disney Channel or similar, instead of some adult channel.

    Similarly, this kind of method could be applied to online spaces. Of course some kids will find ways around it, but they will most likely be outliers.

    • NoMoreNicksLeftan hour ago
      >How about we offer safe spaces instead of banning the unsafe spaces for kids.

      Children shouldn't be associating with other children, except in small groups. Even the typical classroom count is far too large. They become the nastiest, most horrible versions of themselves when they congregate. A good 90% of the pathology of public schools can be blamed on the fact that, by definition, public schools require large numbers of children to congregate.

  • jama21136 minutes ago
    This thread is gonna be full of HN users blaming the parents for a systemic problem isn’t it?

    Yup.

  • Seattle3503an hour ago
    > Some observers present privacy-preserving age proofs involving a third party, such as the government, as a solution, but they inherit the same structural flaw: many users who are legally old enough to use a platform do not have government ID. In countries where the minimum age for social media is lower than the age at which ID is issued, platforms face a choice between excluding lawful users and monitoring everyone. Right now, companies are making that choice quietly, after building systems and normalizing behavior that protects them from the greater legal risks. Age-restriction laws are not just about kids and screens. They are reshaping how identity, privacy, and access work on the Internet for everyone.

    This rebuttal to privacy preserving approaches isn't compelling. Websites can split the difference and use privacy preserving techniques when available, and fall back to other methods when the user doesn't have an ID. I'd go further and say websites should be required to prioritize privacy preserving techniques where available.

    There is a separate issue of improving access to government ID. I think that is important for reasons outside of age verification. Increasingly voting, banking, etc... already relies on having an ID.

  • reorder969537 minutes ago
    What's always got me about this is that when I was in school, I had it absolutely drilled into me that I should never expose personal information online to anyone. I completely saw the logic in that, and so I heavily limit the personal data I give out. Now we're expected to completely go against that and give away the most personal details possible to companies who cannot prove what they are or are not doing with it, just because governments have decided that's best now?
    • co_king_527 minutes ago
      The people pushing for this resent schools because they instill a sense of dignity in the population.

      They are bothered that you were taught such things and have made sure that your children will never be exposed to such information.

  • notTooFarGone4 hours ago
    >Some observers present privacy-preserving age proofs involving a third party, such as the government, as a solution, but they inherit the same structural flaw: many users who are legally old enough to use a platform do not have government ID.

    So there is absolutely no way to change that and give out IDs from the age of 14? You can already get an ID for children in Germany https://www.germany.info/us-de/service/reisepass-und-persona...

    This is a problem that has to be solved by the government and not by private tech companies.

    This is a lazy cop out to say "we have tried nothing and we are all out of ideas"

    • fabian2k3 hours ago
      I'm not convinced age restrictions like this are a good idea. But yeah, the non-availability of IDs in the US is a self-inflicted problem.

      Another example where this plays a role are voter registration and ID requirements for voting in the US. It is entirely bizarre to me how these discussions just accept it as a law of nature that it's expensive and a lot of effort to get an ID. This is something that could be changed.

      • bdangubic2 hours ago
        When one of the only two political parties does not want everyone to vote (cause they’d lose every election) you get what we got…
      • pixl973 hours ago
        You may underestimate the levels of classism and racism in the US. Go on and bring up a conversation about it and you'll eventually get someone talking about how that would be socialism and we can't do that.
    • iamnothere3 hours ago
      The problem is not that we aren’t doing age verification, it’s that a group of authoritarians are trying to force mandatory implementation of age verification (and concomitant removal of anonymity). That’s the problem.
      • Seattle350340 minutes ago
        It seems like the solution is to provide an age verification mechanism with robust privacy protections. That way, when we offer a solution that addresses all of their stated concerns, if they still reject the privacy-preserving approach, we force them to say outright "I want to keep a record of every website you visit."
      • Noaidi3 hours ago
        Anonymity is a myth. I am sure by now an LLM can figure out who you are and where you live by your HN posts alone.
    • logifail4 hours ago
      > This is a problem [..]

      (This is a genuine question) please could you describe the underlying problem that age verification is attempting to solve?

      • notTooFarGone3 hours ago
        Not my point in the comment but my personal opinion:

        To regulate access to addicting material. This is done in the physical world - why should digital be lawless when it applies to the same human behaviors?

        I've been harmfully addicted to many forms of digital media, and I had the luck and support to grow out of most of it. A lot of people are not that lucky.

      • crazypyro3 hours ago
        I don't think that's what the original comment was discussing at all...

        If governments want to require private companies to verify ages, those same governments need to provide accessible ways for their citizens to get verification documents, starting from the same age that is required.

    • meowface4 hours ago
      What problem? I don't think internet websites and apps actually need to know the face, age, or name of their users if their users don't want to provide that information. With exceptions for things like gambling websites.
      • infotainment3 hours ago
        Why should gambling be the exception? One could argue other app-based vices are just as bad, if not worse.
        • meowface3 hours ago
          Crippling debt from unwise impulsive gambling by a teenager is probably worse than whatever occurs from a teenager scrolling Twitter all day.

          The latter may not be great, but eating potato chips all day also probably isn't, and I don't think the government should outlaw minors eating potato chips. Plus it's variable: some get positive, educational, pro-social, productive outcomes from social media and some don't. Gambling is always bad in the limit.

          A simple rule could probably be that if a website can make you lose over $200 of real money, it should probably require age verification. I don't see why other things should.

          • swiftcoder3 hours ago
            > Crippling debt from unwise impulsive gambling by a teenager is probably worse than whatever occurs from a teenager scrolling Twitter all day.

            The cynic in me says that's not why governments want identity confirmation for gambling websites. It's so you can't dodge the taxman

            • meowfacean hour ago
              That's true. It's probably at least 50% the latter. And I don't really blame them.
    • triceratops3 hours ago
      I strongly oppose any form of "age verification" involving uploading your ID. That's just asking for a data breach.

      There are options that don't involve any ID uploads whatsoever.

      • fabian2k3 hours ago
        That's not what this user was talking about.

        For example, with a German ID you can provide proof that you are older than 18 without giving up any identifying information. I mean, nobody uses this system at the moment, but it does exist and it works.

        • triceratops3 hours ago
          Does the German ID system know what you are trying to access? Based on the requestor.
    • Aerroon3 hours ago
      It costs money. Getting an ID here costs about 5% of minimum wage if you order it online + travel (you still have to travel there for the photos and pickup). It costs even more if you apply in person.

      You could buy about 21 gallons of milk for that money (80 liters).

      • notTooFarGone3 hours ago
        So do you buy an ID every month or can we depreciate that over 15 years?
        • account423 hours ago
          Not unless you are offering to front everyone that money for no interest.
          • Seattle350332 minutes ago
            Providing every citizen an ID every X years at no cost does seem like good policy.
      • pixl973 hours ago
        More so than the money, it's the amount of time. Sitting at the DMV for half a day, and that's with an appointment, really sucks.
    • co_king_53 hours ago
      > So there is absolutely no way to change that and give out IDs from the age of 14?

      If that happened in the US, Republicans would then:

      1. Insist that non-white children carry ID at all times

      2. Operationalize DHS and ICE to deport non-white children to foreign concentration camps.

    • zenbowman3 hours ago
      Exactly right. Also, better to be overly restrictive here given the well documented harms of social media on young minds. If the law stipulates that you must be 15 to obtain social media access, and most people don't get their IDs until 18, then most people will stay off social media for another three years: no big deal.
  • Galanwe32 minutes ago
    Well there are technical solutions for this: blind signatures.

    I could generate my own key, have the government blind-sign it upon verifying my identity, and then use my key to prove I'm an adult citizen, without anyone (even the signing government) knowing which key is mine.

    Any verifying entity just needs to know the government's public key and check that it signed my key.
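    A toy version of that flow with RSA blinding, using deliberately tiny made-up parameters; a real deployment would use RFC 9474-style blind RSA with large keys and proper padding:

```python
# Toy RSA blind-signature flow. Everything here is illustrative: tiny
# primes and raw exponentiation, no padding. Do not use as-is.
import hashlib

# Hypothetical government keypair (small primes for demonstration only).
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    """Hash a message into the RSA group."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# 1. The citizen blinds a hash of a key they generated themselves.
my_key = b"citizen-generated-public-key"
m = h(my_key)
r = 123457  # random blinding factor, coprime to n
blinded = (m * pow(r, e, n)) % n

# 2. The government verifies identity out of band, then signs the
#    blinded value; it never sees the underlying key.
blind_sig = pow(blinded, d, n)

# 3. The citizen unblinds: (m^d * r) * r^-1 = m^d, a valid signature on m.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Any verifier checks it with only the government's public key (n, e).
assert pow(sig, e, n) == m
```

    The signer only ever sees `blinded`, yet the unblinded `sig` verifies against `m` with the ordinary public key, so the government cannot later match the key to the identity it checked.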

    • halls-94027 minutes ago
      I was thinking the same thing. Why don't we just get a key from the government?
      • Galanwe17 minutes ago
        > Why don't we just get a key from the government?

        Because one could argue that the government could keep track of the keys they give away.

        That is where blind signing is interesting. The government can sign _your_ key without knowing it.

  • rafaelero2 hours ago
    I have no idea where this idea that the Internet is toxic to children is coming from. Is this some type of moral panic? Weren't most of you children/adolescents during the 2000s?
    • Salgat2 hours ago
      Are you saying that social media isn't harmful to children?
      • ok12345632 minutes ago
        This is like rhetorically asking, "Are you saying that Doom and Marilyn Manson aren't harmful to children?"

        The problem with social media isn't the inherent mixing of children and technology, as if web browsers and phones have some action-at-a-distance force that undermines society; it's the 20 years or so they spent weaponizing their products into an infinite Skinner box. Duck walk, Zuckerberg.

        This is all assuming good faith interest in "the children," which we cannot assume when what government will gain from this is a total, global surveillance state.

      • rafaeleroan hour ago
        Last time I checked, there's no scientific consensus on whether social media causes harm at all. The best studies found null or very small effects. So yeah, I am skeptical that it is harmful.
  • condiment3 hours ago
    We are missing accessible cryptographic infrastructure for human identity verification.

    For age verification specifically, the only thing services need proof of is that the user's age is above a certain threshold, e.g. that the user is 14 years or older. But to make this determination, we see services asking for government ID (which many 14-year-olds do not have) or for invasive face scans. These methods provide far more data than necessary.

    What the service needs to "prove" in this case is three things:

    1. that the user meets the age predicate

    2. that the identity used to meet the age predicate is validated by some authority

    3. that the identity is not being reused across many accounts

    All the technologies exist for this; we just haven't put them together usefully. Zero-knowledge proofs like Groth16 or STARKs allow statements about data to be validated externally without revealing the data itself. These are difficult for engineers to use, let alone consumers. Big opportunity for someone to build an authority here.
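    As a rough sketch of those three requirements, here is the non-zero-knowledge version: a token in which a hypothetical authority attests only to the age predicate plus a per-service pseudonym. (Assumptions: a real issuer would use a public-key signature rather than the shared-secret HMAC used here to stay stdlib-only, and the ZK proofs mentioned above would hide even more.)

```python
# Sketch of an age-predicate token (NOT zero-knowledge). All names and
# key handling are hypothetical; a real authority would sign with a
# public-key scheme so verifiers don't need the secret.
import hashlib
import hmac
import json

AUTHORITY_KEY = b"authority-secret"  # held by the issuer (assumption)

def issue_token(user_id: str, over_14: bool, service: str) -> dict:
    # Requirement 3: a pseudonym stable for one user on one service,
    # but unlinkable across services, to prevent account farming.
    pseudonym = hashlib.sha256(f"{user_id}|{service}".encode()).hexdigest()
    claim = {"over_14": over_14, "service": service, "pseudonym": pseudonym}
    payload = json.dumps(claim, sort_keys=True).encode()
    # Requirement 2: the claim is validated by an authority.
    mac = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "mac": mac}

def verify_token(token: dict, service: str) -> bool:
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["mac"])
            and token["claim"]["service"] == service
            and token["claim"]["over_14"])  # Requirement 1: age predicate

tok = issue_token("alice", over_14=True, service="example-site")
assert verify_token(tok, "example-site")
assert not verify_token(tok, "other-site")  # token is service-bound
```

    The service learns "over 14, pseudonym X" and nothing else; the ZK versions additionally remove the issuer's ability to link issuance to use.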

    • john_strinlai3 hours ago
      >We are missing accessible cryptographic infrastructure for human identity verification.

      like most proposed solutions, this just seems overcomplicated. we don't need "accessible cryptographic infrastructure for human identity". society has had age-restricted products forever. just piggy-back on that infrastructure.

      1) government makes a database of valid "over 18" unique identifiers (UUIDs)

      2) government provides tokens with a unique identifier on it to various stores that already sell age-restricted products (e.g. gas stations, liquor stores)

      3) people buy a token from the store, only having to show their ID to the store clerk that they already show their ID to for smokes (no peter thiel required)

      4) website accepts the token and queries the government database and sees "yep, over 18"

      easy. all the laws are in place already. all the infrastructure is in place. no need for fancy zero-knowledge proofs or on-device whatevers.
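      the moving parts are tiny. toy sketch (the government db and store issuance are made up; the point is the website only ever sees an opaque token):

```python
# Toy model of the token flow above: the store clerk checks a physical
# ID and hands out a token; the website only handles the opaque UUID.
import uuid

government_db = set()  # valid "over 18" tokens (hypothetical registry)

def mint_token_at_store() -> str:
    # Clerk glances at a physical ID, then hands over a token.
    token = str(uuid.uuid4())
    government_db.add(token)
    return token

def website_check(token: str) -> bool:
    # Website queries the government database: is this token valid?
    return token in government_db

t = mint_token_at_store()
assert website_check(t)
assert not website_check(str(uuid.uuid4()))  # random guesses fail
```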

      • IanCal3 hours ago
        What you’re describing is infrastructure that doesn’t necessarily exist right now for use online, and has all the privacy problems described. Why should I have to share more than required?
        • john_strinlai3 hours ago
          it has none of the privacy problems described, and 95% of the infrastructure exists right now (have you ever purchased smokes or alcohol?)

          to go on tiktok, you enter a UUID once onto your account, and thats it. the only person that sees your id card is the store clerk that glances at the birth date and says "yep, over 18" when you are buying the "age token" or whatever you want to call it. no copies of your id are made, it cant be hacked, theres no electronics involved at all. its just like buying smokes. theres no tie between your id and the "age token" UUID you received.

          theres no fanciness to it, either. itd be dead simple, low-tech, cheap to implement, quick to roll out. all of the enforcement laws already exist.

          >Why should I have to share more than required?

          you shouldnt. having to prove age to use the internet is super dumb. but thats the way the winds are blowing apparently. if im gonna have to prove my age to use the internet, id much rather show my id to the same guy i buy smokes from (and already show my id to) than upload my id to a bunch of random services.

          • IanCalan hour ago
            Sorry, I'd misunderstood; I thought you were describing infrastructure that already exists and making a comparison to just using your ID.
          • simoncion2 hours ago
            The problem with this scheme is that it's exactly as protective as requiring someone to tick a "I'm of legal age" tickbox in the software they wish to access. Anyone who is of legal age can buy UUIDs and pass them around to folks who are not.

            Having said that, I think having an "I'm of legal age" tickbox goes quite far enough.

            For the ultra-controlling, setting up a "kid's account" using the tools already provided in mainstream OS's [0][1] is a fine option.

            [0] <https://www.microsoft.com/en-us/microsoft-365/family-safety>

            [1] <https://support.apple.com/guide/mac-help/set-up-content-and-...>

            • john_strinlai2 hours ago
              >The problem with this scheme is that it's exactly as protective as requiring someone to tick a "I'm of legal age" tickbox in the software they wish to access.

              no, it is exactly as protective as the protections for purchasing alcohol or buying smokes or other controlled substances/products.

              buying smokes/alcohol when underage is obviously harder than "click this box". (did you ever try to buy smokes/alcohol when underage? you cant just go up to the clerk at the store when you are 14 and say "trust me bro, im 18/19/21".)

              >Anyone who is of legal age can buy UUIDs and pass them around to folks who are not.

              same for smoking and alcohol. i could go to the store right now and buy smokes, then hand them to my 10 year old.

              we have laws already in place to punish selling smokes/alcohol to underagers, and laws for consuming smokes/alcohol when underage. we can apply those laws to your internet-age-token.

              most people seem fine with the current trade-off for smokes/alcohol. i see no reason why tiktok needs to be treated as more dangerous than either.

              >Having said that, I think having an "I'm of legal age" tickbox goes quite far enough.

              i agree with this and everything you said afterwards. id rather not have any of it.

              • simoncionan hour ago
                > no, it is exactly as protective as the protections for purchasing alcohol or buying smokes...

                Right. That's exactly as protective as that tickbox. [0] As I mentioned, any of-age person can distribute those UUIDs to people who are not of-age. Unlike with the proposed ID-collection-and-retention schemes (that are authoritarian's wet dreams) the vendor of the UUID is not responsible for ensuring that that UUID is not later used by someone who is not of-age.

                If you were to -say- make alcohol vendors liable for the actions of of-age people who pass on alcohol to not-of-age people, then you'd see serious attempts to control distribution.

                [0] Don't forget the existence of preexisting parental controls in every major OS. IME, this is a hurdle that's at least as difficult to surmount as the ID check done in non-chain convenience stores.

                • john_strinlaian hour ago
                  >Right. That's exactly as protective as that tickbox. [0]

                  no, it isn't, for reasons already mentioned but i will say it again for clarity:

                  - a 14 year old can click "im of age" on a checkbox.

                  - a 14 year old cannot go into a gas station and buy smokes. they will be declined.

                  >As I mentioned, any of-age person can distribute those UUIDs to people who are not of-age.

                  again... same with smokes and alcohol! but we are okay with how smokes and alcohol are regulated right now.

                  tiktok is not worse than a bottle of vodka. we are okay with how vodka is regulated. tiktok does not need even more strict age-verification than vodka.

                  it is not perfect, but it is absolutely more stringent than a checkbox. if you still doubt me, please send one of your 12-14 year old family members to buy a pack of smokes or a bottle of vodka at the nearest store. i will wait for your report.

                  • SoftTalker32 minutes ago
                    I mostly agree but unless these UUID age tokens are of limited life, it's more like buying the kid an unlimited amount of vodka and cigarettes with one action. If the tokens were good for one use, or a short time period, it would be more workable.
                    • john_strinlai30 minutes ago
                      sure, make them good for like 1 year or 3 months or something.

                      or make them good for 1 month, but sold in 12-packs.

                  • simoncionan hour ago
                    Your hypothetical 14-year-old needs to first be able to bypass the parental controls that come with every modern OS. You keep ignoring that.

                    (Also, like, did you ever go to college? Live in a dorm or apartment with underage students? It was super common for of-age people to buy and distribute booze to substantially underage students. Everyone knew it was happening all the damn time.)

                    > they are obviously not liable if i buy something legitimately, go home, and feed it to my kid. in that case, i am liable...

                    And if you changed up the rules to make them liable, you'd see serious attempts at controlling distribution.

                    What has been the state of the art in parental controls for quite some time is like the current regulatory regime for booze and tobacco. The single thing that needs to change to make it exactly the same would be to make it substantially illegal for US-based publishers to not tag the porn/violence/etc that they publish with age-restriction tags. [0]

                    What's being proposed and is currently implemented by several big-name sites is even more invasive.

                    > we are okay with how smokes and alcohol works right now.

                    I'm not. Either booze and tobacco need to be made into Schedule I substances, or their regulation needs to become much more lax. But I recognize that my opinion on the topic is considered to be somewhat out-of-the-ordinary.

                    [0] This might already be the law of the land right now. I haven't bothered to check.

                    • john_strinlaian hour ago
                      >Your hypothetical 14-year-old needs to first be able to bypass the parental controls that come with every modern OS. You keep ignoring that.

                      because they dont matter. parental controls exist today but have been deemed ineffective for the age verification conversation, for whatever stupid reason. so we are stuck trying to figure something else out. do i wish we could just use the existing basic parental controls instead of whatever the hell we are going to end up with? obviously!

                      the easiest "something else" is to piggy-back on existing age-restriction regulations (i.e. smokes, alcohol, gambling) because they have broad (obviously not ubiquitous, but broad) support. we have decades of experience with them.

                      and, to that end, you create a little token and you show your id to the store clerk to buy it. the "protect the children" people are satisfied (its the same process everything else age-restricted!), and i dont need to send my id to a peter thiel company. it preserves privacy, it re-uses existing laws, it re-uses existing infrastructure, etc.

                      • simoncionan hour ago
                        > ...but have been deemed ineffective for the age verification conversation, for whatever stupid reason.

                        Consider that such arguments (just like the arguments of Prohibitionists that resulted in the rise to power of Organized Crime) are made in a varied combination of ignorance and bad faith, and that we should loudly reject them in the strongest possible terms.

                        To be clear, I'm asserting that the claim that preexisting parental controls are insufficient is an argument made in ignorance and bad faith, not your assertion that the argument is being made.

                        • john_strinlai39 minutes ago
                          >Consider that such arguments [...] we should loudly reject them in the strongest possible terms.

                          me and you can yell into the void all we want. and i will continue to do so!

                          but, age verification is already here. so while i continue to yell about how stupid it is, i am also going to propose options that i feel like are less bad than what is being actively rolled out right now.

                          • simoncion36 minutes ago
                            > ...i am also going to propose options that i feel like are less bad than what is being actively rolled out right now.

                            As I mentioned, what you propose is exactly as useful and protective as what we have now. What we have now has been roundly rejected by the authoritarians pushing this expansion of power and influence. Your time and energy are better spent resisting the expansion, rather than suggesting alternatives that those authoritarians will never accept (and tacitly accepting their premise in the process).

                            • john_strinlai26 minutes ago
                              >As I mentioned, what you propose is exactly as useful and protective as what we have now.

                              i disagree, for reasons i have already said and for other reasons i havent yet.

                              but it is clear that we wont end up agreeing, so no need for us to keep going.

                    • SoftTalker29 minutes ago
                      A 21-year-old in a dorm buying booze for a 19 year old dorm-mate is a bit different from doing the same for a 14 year old.
      • mothballed3 hours ago
        The government will want some way to uncover who bought the token. They'll probably require the store to record the ID and pretend that, since a private entity is doing it, it isn't a 4A violation. Then as soon as the token is used for something illegal, they'll follow the chain of custody of the token and find out who bought it.

        No matter what the actual mechanism is, I guarantee they will insist on something like that.

        • john_strinlai3 hours ago
          if the goal is to "protect children", or just generally make parts of the internet age-gated, my proposal is 100% fine.

          if the goal is "surveil everyone using the internet", yes, very obviously my proposal would not be selected, and you will have to upload your id to various 3rd-party id verifiers.

          • mothballed3 hours ago
            I think something like your proposal actually sounds the most logical. I just think they will bolt on chain of custody tracking to it, while promising it will only be used for finding "terrorists" or something.
            • Seattle3503a minute ago
              The nice thing about something bolted on like that is that it is not an essential feature of the core design and has no bearing on the original goal. It can be removed or reformed. The same isn't true of the approaches we are heading towards now.
            • procfloraan hour ago
              Yes, while I was reading the article I couldn't help but think about notaries public. Seems like something like that would be government's go-to for this if they weren't quite so overfed on tech industry contributions that lead them down the path of AI solutions.

              I'm not sure that's the right answer here, but I think it ticks a lot of boxes for the state.

    • JanisErdmanis3 hours ago
      A significant obstacle to adoption is that cryptographic research aims for a perfect system that overshadows simpler, less private approaches. For instance, it does not seem that one should really need unlinkability across sessions. If that's the case, a simple range proof for a commitment encoding the birth year is sufficient to prove eligibility for age, where the commitment is static and signed by a trusted third party to actually encode the correct year.
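      A minimal sketch of the signed-commitment idea, with a hash commitment and HMAC standing in for a real commitment scheme and issuer signature (a deployed system would use public-key signatures, and would replace the opening step below with an actual ZK range proof):

      ```python
      import hashlib
      import hmac
      import os

      # Held by the trusted third party. HMAC stands in for the issuer's
      # digital signature; a real deployment would use public-key signatures
      # so that verifiers only hold the public key.
      ISSUER_KEY = os.urandom(32)

      def commit(birth_year: int, nonce: bytes) -> bytes:
          # Hash commitment standing in for a Pedersen-style commitment.
          return hashlib.sha256(nonce + birth_year.to_bytes(2, "big")).digest()

      def issue(birth_year: int):
          # Issuer checks the holder's document out of band, then signs a
          # static commitment encoding the birth year.
          nonce = os.urandom(16)
          c = commit(birth_year, nonce)
          sig = hmac.new(ISSUER_KEY, c, hashlib.sha256).digest()
          return c, nonce, sig

      def verify_age(c: bytes, nonce: bytes, sig: bytes,
                     birth_year: int, cutoff_year: int) -> bool:
          # Naive verifier: the holder simply opens the commitment. A real
          # system would replace this opening with a ZK range proof over c,
          # so the verifier learns only the yes/no answer.
          expected = hmac.new(ISSUER_KEY, c, hashlib.sha256).digest()
          if not hmac.compare_digest(expected, sig):
              return False
          if commit(birth_year, nonce) != c:
              return False
          return birth_year <= cutoff_year

      c, nonce, sig = issue(2001)
      assert verify_age(c, nonce, sig, 2001, cutoff_year=2007)      # old enough
      assert not verify_age(c, nonce, sig, 1999, cutoff_year=2007)  # wrong opening
      ```

      The key property being illustrated: the commitment is static, so the issuer signs it once, and every later interaction only needs the range check against a cutoff year.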
      • condiment3 hours ago
        I agree. I've been researching a lot of this tech lately as a part of a C2PA / content authenticity project and it's clear that the math is outrunning practicality in a lot of cases.

        As it is we're seeing companies capture IDs and face scans and it's incredibly invasive relative to the need - "prove your birth year is in range". Getting hung up on unlinkable sessions is missing the forest for the trees.

        At this point I think the challenge has less to do with the crypto primitives and more to do with building infrastructure that hides 100% of the complexity of identity validation from users. My state already has a gov't ID that can be added to an apple wallet. Extending that to support proofs about identity without requiring users to unmask huge amounts of personal information would be valuable in its own right.

    • zarzavat3 hours ago
      https://xkcd.com/538/

      Your crypto nerd dream is vulnerable to the fact that someone under 18 can just ask someone over 18 to make an account for them. All age verification is broken in this way.

      There is a similar problem for people using apps like Ubereats to work illegally by buying an account from someone else. However much verification you put in, you don't know who is pressing the buttons on the screen unless you make the process very invasive.

      • condiment3 hours ago
        You seem to have missed requirement #3 -> tracking and identifying reuse.

        An 18-year-old creating an account for a 12-year-old is a legal issue, not a service provider issue. How does a gas station keep a 21-year-old from buying beer for a bunch of high school students? Generally they don't, because that's the cops' job. But if they have knowledge that the 21-yo is buying booze for children, they deny custom to the 21-yo. This is simple.

        • zarzavat2 hours ago
          > How does a gas station keep a 21-year-old from buying beer for a bunch of high school students?

          They don't? Teenagers can easily get their hands on alcohol... you just need to know the right person at school who has a cool older brother. If their older brother is really cool they can get weed too!

          The police absolutely do not have the time to investigate the crime of making a discord account for someone.

    • mothballed3 hours ago
      Even if the problem is perfectly solved and the ID is fully anonymized from the age check, you still have the issue that you need an ID to exercise your First Amendment rights. The 1A applies to all people, not just citizens, and in a large part of the US it's considered racist to force someone to possess an ID to prove they are a citizen (to vote), let alone a person (18 or older) with 1A rights.
    • egorfine3 hours ago
      You are missing the point.

      They don't care whether you are 14 or not. They want your biometrics and identification. "Think of the children" is just a pretense.

      • yladiz3 hours ago
        In general, any government already has your information, and it's naive to think that they don't; if you pay taxes, have ever had a passport, etc. they already have all identifying information that they could need. For services, or for the government knowing what you do (which services you visit), then a zero-knowledge proof would work in this case.
      • IanCal3 hours ago
        The companies don’t, and the government already has your government ID.
  • arrsingh2 hours ago
    Age verification is very hard to do without exposing personal information (ask me how I know). I feel it should be solved by a platform company, someone like Apple (assuming we trust Apple with our personal information, but it seems we already do), and the platform (iOS) should be able to simply provide a boolean response to "is this person over 18" without giving away all the personal information behind the age verification.

    Now the issue of which properties can "ask to verify your age" and "apple now knows what you're looking at" is still an unsolved problem, but maybe that solution can be delivered by something like a one time offline token etc.

    But again, this is a very hard problem to solve and I would personally like to not have companies verify age etc.

  • alt2272 hours ago
    I like the solution Tim Berners-Lee is working on. Let's hope he has some success.

    https://solidproject.org/

    https://www.theguardian.com/technology/2026/jan/29/internet-...

  • kuon2 hours ago
    Even if you design the perfect system, kids will just ask parents for an unlocked account, many parents will accept, myself included. My kids have full access to the internet and I never used parental control, I talk to them. Of course, I don't want to give parenting advice, that would be presumptuous. But, my point is that a motivated kid will find a way, you have to "work" on that motivation.

    Much of the worst content on the internet is not age-gated at all; there are millions of porn websites without even an "are you over 18" popup. There is a plethora of toxic forums...

    Of course it's a complex problem, but the current approach sacrifices a lot of what made the internet possible, and I don't like it.

  • RockRobotRock4 hours ago
    Here is an example of the problem with inference-based verification:

    https://streamable.com/3tgc14

  • kseniamorph3 hours ago
    > "Social media is going the way of alcohol, gambling, and other social sins: societies are deciding it’s no longer kids’ stuff."

    Oh, remember those good old times when alcohol was kids' stuff.......

    • Noaidi3 hours ago
      In Italy it is common for 13 and 14 year olds to have a glass of wine with dinner. The sin is not drinking, it is gluttony.
  • SoftTalker44 minutes ago
    A lot of talk and no solutions. Exactly the reason we are where we are.

    Liquor stores, bars, strip clubs, adult bookstores, or similar businesses don't let kids in. Movie theatres don't let a 10 year old in to an R-rated movie. The tech industry ignored their social responsibility to keep kids away from adult and age-inappropriate content. Now, they are facing legal requirements to do so. Tough for them, but they could have been more proactive.

  • fny3 hours ago
    Isn't it a simpler solution to create some protocol by which a browser or device announces that an age-restricted user is present, and then have parents lock down devices as they see fit?

    Aside from the privacy concerns, all this age verification tech seems incredibly complicated and expensive.

    • TZubiri3 hours ago
      I think this solution exists (e.g. Android parental lock, but also ISP routers). But parents and industry have failed to use it on a greater scale. So legislation is taking a more affirmative approach that doesn't require parental consent or collaboration.

      A service provider of adult content now cannot serve a child, regardless of the involvement or lack thereof of a parent.

  • bondarchuk3 hours ago
    It's kind of weird to me how every article on this topic here has people rushing to comment within a couple minutes with some generic "yes I too support ID checks for internet use!". Has the vibe really shifted so much among tech-literate people?
    • iamnothere3 hours ago
      Although there is some organic support, there is a lot of coordinated astroturfing. It’s apparent if you watch the discussions across platforms, there are obvious shared talking points that come in waves.

      Governments (and a few companies) really want this.

      • dang2 hours ago
        What are some links to HN comments that you (or anyone else) feel is "coordinated astroturfing"?

        The site guidelines ask users to send those to us at hn@ycombinator.com rather than post about it in the threads, but we always look into such cases when people send them.

        It almost invariably turns out to simply be that the community is divided on a topic, and this is usually demonstrable even from the public data (such as comment histories). However, we're not welded to that position—if the data change, we can too.

        • pessimizer14 minutes ago
          > What are some links to HN comments that you (or anyone else) feel is "coordinated astroturfing"?

          I don't think that there is any definitive way to prevent or detect this anymore. The number of personnel dedicated to online manipulation has grown too much, and the technology has advanced too far.

          These are now discussions that states and oligarchs have interests in, not Juicero or smart skillet astroturfing. And this remains a forum that people use to indicate elite support for their arguments.

      • paulryanrogers3 hours ago
        > Governments (and a few companies) really want this.

        The cynic in me fears they don't want a privacy-preserving solution, which blinds them to 'who'. Because that would satisfy parents worried about their kids and many privacy conscious folks.

        Rather, they want a blank check to blackmail or imprison only their opponents.

        • mcmcmc3 hours ago
          That’s not cynicism, it’s reality.
        • anjel2 hours ago
          Add to this that more and more sites and services are hostile to VPN connections and obfuscated email addresses for account registration. Worse still, for existing accounts that introduce ID requirements, your prior anonymous activity could easily become a retroactive liability.
        • phendrenad23 hours ago
          I think Larry (no, not that Larry, the other one) spilled the beans in 2024:

          "Citizens will be on their best behavior, because we’re constantly recording and reporting everything that is going on" - Larry Ellison

          (I seem to recall from the context of the quote, he isn't saying this is the future he wants, but it's a future he's not particularly opposed to)

          But the real threat is "accidental" database leaks from private websites. Let's say you live in a state where abortion isn't legal, and you sign up for a web forum where people discuss getting out-of-state abortions. As soon as that website is required to collect real names (which it will be), it becomes unusable, because nobody can risk getting doxxed.

          • hellojesus2 hours ago
            Maybe the US gov needs more tor users and is therefore doing this to drive more traffic to the onion network.
        • throw__away73913 hours ago
          This is not a cynical take, it is blindingly obvious. Right now, governments around the world are watching, salivating over what is effectively remote control over the thoughts of, and total surveillance over, their entire population. They are itching insatiably to get control over these systems.
          • klsdjfdlkfjsd3 hours ago
            In my state, I caught a circuit court judge shilling on a certain well known "social media" site for the establishment of a lottery in our state. He framed it as a "We the People vs the corrupt politicians" issue--with him being firmly on the side of We the People of course.

            When I challenged him on his rhetoric, my comment INSTANTLY disappeared. I thought maybe it was a fluke, so I tried again, and the next comment insta-disappeared also.

            Soon thereafter I was locked out of the account and asked to provide a "selfie" to confirm my identity. (I declined.)

      • nostrebored3 hours ago
        > It’s apparent if you watch the discussions across platforms, there are obvious shared talking points that come in waves

        This is true of basically any issue discussed on the internet. Saying it must be astroturfing is reductive

      • embedding-shape3 hours ago
        > It’s apparent if you watch the discussions across platforms, there are obvious shared talking points that come in waves.

        How do you know what is "shared talking points" vs "humans learning arguments from others" and simply echoing those? Unless you work at one of the social media platforms, isn't it all but impossible to know what exactly you're looking at?

      • reliabilityguy3 hours ago
        > there is a lot of coordinated astroturfing.

        Interesting. Are you saying all the concerns raised by the proponents of ID verification are invalid and meritless? For example,

        1. Foreign influence campaigns

        2. Domestic influence campaigns

        3. Filtering age-appropriate content

        I’m sure there are many other points with various degree of validity.

        • hellojesus2 hours ago
          If you drive out everyone with identity filters, those folks will naturally flock to sites run in nations without the same controls. I don't think you really solve anything except to push traffic elsewhere.

          Instead it would be more appropriate to let sites pass headers, such as "we have adult content", that you could filter on the network or client side. It's still voluntary, of course. Anyone will just visit sites that don't have the checks if necessary.
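          The self-labelling idea has real prior art in the RTA label. A client- or network-side filter over response headers could be as small as this sketch (the "Content-Rating: adult" header here is hypothetical, taken from the suggestion above; "RTA-5042-1996-1400-1577-RTA" is the actual label used by the Restricted To Adults scheme):

          ```python
          # Signals a site can voluntarily send to self-label as adult content.
          # The "content-rating" header is a hypothetical example; the RTA
          # string is the real Restricted To Adults label.
          ADULT_SIGNALS = {
              ("rating", "rta-5042-1996-1400-1577-rta"),
              ("content-rating", "adult"),
          }

          def should_block(headers: dict, minor_on_device: bool) -> bool:
              """Client- or network-side filter: block only when a minor is
              using the device and the site self-labels as adult content."""
              if not minor_on_device:
                  return False
              normalized = {(k.lower(), v.strip().lower()) for k, v in headers.items()}
              return bool(normalized & ADULT_SIGNALS)

          assert should_block({"Content-Rating": "adult"}, minor_on_device=True)
          assert not should_block({"Content-Rating": "adult"}, minor_on_device=False)
          assert not should_block({"Server": "nginx"}, minor_on_device=True)
          ```

          As the comment notes, this only works for sites that cooperate, but it keeps all identity information on the client side.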

        • simoncion2 hours ago
          > Interesting. Are you saying all the concerns raised by the proponents of ID verification are invalid and meritless?

          In the US, #1 and #2 are invalid and meritless. Wholly and without reservation. One of the huge reasons for the First Amendment is to ensure that people are able to counter lies uttered in the public sphere with truth.

          #3 is handled by parental controls that have existed in mainstream OSs for quite some time now. [0][1][2] However, those preexisting parental controls don't justify additional expansion of the power and influence of authoritarians, so here we are.

          [0] <https://www.microsoft.com/en-us/microsoft-365/family-safety>

          [1] <https://support.apple.com/guide/mac-help/set-up-content-and-...>

          [2] <https://support.google.com/android/answer/16766047?hl=en-rw>

          • reliabilityguy2 minutes ago
            > In the US, #1 and #2 are invalid and meritless. Wholly and without reservation. One of the huge reasons for the First Amendment is to ensure that people are able to counter lies uttered in the public sphere with truth.

            How does a digital ID prevent you from speaking out? For example, the 2nd Amendment requires a lot of hoops in some jurisdictions, which were deemed constitutional and not a violation of the 2nd Amendment. Same with the 1st Amendment. You can argue that with digital IDs there will be less privacy and anonymity than before, but that's a different story.

            Moreover, influence campaigns are not about truth or lies, but about making the public lose faith in institutions. A good example of it today is Russia, where the public does not believe that democratic elections are possible at all, in principle.

            > #3 is handled by parental controls that have existed in mainstream OSs for quite some time now.

            It is not handled perfectly at all, and easily bypassed. To pretend that information access on the internet can be regulated through parental controls is ridiculous.

        • inigyou2 hours ago
          [dead]
      • EarlKing3 hours ago
        More than a few companies. Nothing would allow advertisers to justify raising ad rates quite like being able to point out that their users are real rather than bots.
      • delusional3 hours ago
        > there is a lot of coordinated astroturfing. It’s apparent if you watch the discussions across platforms, there are obvious shared talking points that come in waves.

        Is that really evidence of astroturfing? If we're in the middle of an ongoing political debate, it doesn't seem that far fetched for me that people reach similar conclusions. What you're hearing then isn't "astro-turfing" but one coalition, of potentially many.

          I often hear people terrified that the government will have a say on what they view online, while being just fine with Google doing the same. You can agree or disagree with my assessment, but the point is that hearing that view a lot doesn't mean it's Google astroturfing. It just means there's an ideology out there that thinks it's different (and seemingly more oppressive) when governments do it. It means all those people have a similar opinion, probably from reading the same blogs.

        • zug_zug3 hours ago
          Well the hard thing about astroturfing is that only the people running the platform have the hard data to prove it beyond any reasonable doubt.

          But I don't think we need 99.99% confidence -- isn't it even acknowledged that 30% of Twitter is bots or something? I think it's safe to conclude there's astroturfing on any significant political issue.

          Also as far as documented cases, there were documented cases of astroturfing around fracking [1], or pesticides [2]

          1. https://journals.sagepub.com/doi/10.1177/2057047320969435

          2. https://www.corywatson.com/blog/monsanto-downplay-roundup-ri...

        • klsdjfdlkfjsd3 hours ago
          > it doesn't seem that far fetched for me that people reach similar conclusions.

          How do you suppose it is that millions of people, separated by vast geographic distances, somehow all reach similar conclusions all at once?

          Related: How do you suppose it is that out of 350-700+ million people (depending on whose numbers you believe), there's always only two "choices" and both of them suck?

          • fyredgean hour ago
            In the same way that they came up with the idea of divine being(s) in the image of man that rule nature.

            In the same way that patriarchy rose amongst them all.

            In the same way that a shared currency was deemed necessary.

            Especially in matters of governance, there is something to be said about how humans like to organise themselves. No country has truly escaped capitalism so far.

      • cyanydeez3 hours ago
        "A few"?

        "Real" user verification is a wet dream for Google, Meta, etc. It's both ad inflation and a competitive roadblock.

        The benefits are real: teens are being preyed upon and socially maligned. State actors and businesses alike are responsible.

        The technology is not there, nor are governments coordinating the appropriate digital concerns. Unsurprising, because no one trusts government, but then we implicitly trust business?

        Yeah, so obviously, it's the implementation that will just move harms around.

      • parineum2 hours ago
        > there are obvious shared talking points that come in waves.

        Groups of people who wake up at the same time of the day often have a tendency to be from a similar place, hold similar values and consume similar media.

        Just because a bunch of people came to the same conclusion and have had their opinions coalesce around some common ideas, doesn't mean it's astroturfing. There's a noticeable difference between the opinions of HN USA and HN EU as the timezones shift.

      • bilbo0s3 hours ago
        I think we should be careful of writing off this sea change as simple professional influence campaigns. That kind of thinking is just what got Trump to the White House, and is currently getting the immigrants rounded up.

        Things that didn't seem likely to have broad support previously, now are seen as acceptable. In the 90's no one could envision rounding up immigrants. No one could envision uploading an ID card to use ICQ. No one could envision the concept of DE-naturalization or getting rid of birthright citizenship.

        Today, in the US for instance, there are entire new generations of people alive. And many, many people who were alive in the 90's are gone. Well these new people very much can envision these things. And they seem to have stocked the Supreme Court to make all these kinds of things a reality.

        All because the rest of us keep dismissing all of this as just harmless extreme positions that no one in society really supports. We have to start fighting things like this with more than, "It's not real."

        • ReptileMan3 hours ago
          >In the 90's no one could envision rounding up immigrants.

          Both Clinton and Obama deported way more people than Trump.

          • bilbo0s2 hours ago
            Obama wasn't around in the 90's.

            And Clinton only deported 2 million across his entire 8 years in office. With a laser focus on convicted criminals as part of a war on drugs. (Now the efficacy of the old "War on Drugs" can be argued, but the numbers can't. We have the records.)

            I think you're conflating the number of "returns", defined in the 90's as people who were not allowed to enter at the border; and "deportations", defined in the 90's as people who were in the US, and then we put on a plane back out of the US. IE - "Returns" were people who showed up at the border, sea port, airport or border checkpoint; asked to get in, and we said no. Basically, the nice people.

            What you mean is that Clinton simply didn't let anyone into the country. This is true. (Again, we have the records. Clinton refused entry to the US more than any president in US history.) He didn't, however, round up immigrants living in the US on this scale and deport them like we're seeing today. People would never have allowed for that.

            To put numbers on it, Trump is on year 5, and has already processed more formal removal orders than Clinton did by year 8. Not only that, voluntary removals were near non-existent under Clinton in the 90's. Today, for just this year alone, they sit at around 1.5 million.

          • co_king_53 hours ago
            > Both Clinton and Obama deported way more people than Trump.

            You are correct. Further, I suggest that Democrats and Democrat-controlled media cultivate a delusional worldview which allows their supporters to ignore the right-wing brutality consistently and continually imposed by Democrat leaders.

            How do you feel about the second Trump admin's nationwide, made-for-TV DHS/ICE siege?

            • ReptileMan2 hours ago
              Ineffective. Too much noise, too little removals.
              • co_king_52 hours ago
                Do you think Trump's first term was a failure because he didn't deport as many people as Obama?
                • ReptileMan2 hours ago
                  Trump's first term was a total failure for many reasons. He implemented nothing of his agenda successfully.
            • klsdjfdlkfjsd2 hours ago
              If one feels anything about it at all, it's a sign they're taking the Made-for-TV movie seriously.

              Never take TV seriously.

              The key mistake is even watching it in the first place.

              "If you don't read the newspaper, you're uninformed. If you do read the newspaper, you're misinformed." - Mark Twain

              • co_king_52 hours ago
                > "If you don't read the newspaper, you're uninformed. If you do read the newspaper, you're misinformed." - Mark Twain

                I love the quote, thanks for sharing.

        • OGEnthusiast2 hours ago
          [dead]
    • tlogan3 hours ago
      The industry clearly prefers a system in which using the internet requires full identification. There are many powerful interests that support this model:

      - Governments benefit from easier monitoring and enforcement.

      - The advertising industry prefers verified identities for better targeting.

      - Social media companies gain more reliable data and engagement.

      - Online shopping companies can reduce fraud and increase tracking.

      - Many SaaS companies would also welcome stronger identity verification.

      In short, anonymity is not very profitable, and governments often favor identification because it increases oversight and control.

      Of course, this leads to political debate. Some point out that voting often does not require ID, while accessing online services does. The usual argument is that voting is a constitutional right. However, one could argue that access to the internet has become a fundamental part of modern life as well. It may not be explicitly written into the Constitution, but in practice it functions as an essential right in today’s society.

    • coffeefirst3 hours ago
      There’s some nuance here.

      Realizing that much of the internet is totally toxic to children now and should have a means of keeping them out is distinct from agreeing to upload ID to everything.

      A better implementation would be to have a device/login level parental control setting that passed age restriction signals via browsers and App Stores. This is both a simpler design and privacy friendly.

      • subscribed12 minutes ago
        This also means the only operating systems allowing access to the internet will be those with immense surveillance and ad infestation.
      • mcmcmc3 hours ago
        I like this take. Ultimately the only people responsible for what kids consume are the parents. It’s on them to control their kids’ internet access, the government has no place in it. If you want to punish someone for a child being exposed to inappropriate content, punish the negligence of the parents.
      • grvdrm2 hours ago
        I've thought the same.

        At least here in US: Google/Apple device controls allow app to request whether user meets age requirements. Not the actual age, just that the age is within the acceptable range. If so, let through, if not, can't proceed through door.

        I know I am oversimplifying.

        But I like this approach vs. uploading an ID to TikTok. Lesser of many evils?

      • fluoridation3 hours ago
        It doesn't sound simple. Now there needs to be some kind of pipeline that can route a new kind of information from the OS (perhaps from a physical device) to the process, through the network, to the remote process. Every part of the system needs to be updated in order to support this new functionality.
        • csunbird2 hours ago
          and is it easier to implement id checks for each online account that people have, had, and will ever have in the future?

          parents need to start parenting by taking responsibility for what their kids are doing, and government should start governing with regulations on ad tech and addictive social media platforms, instead of using easily hackable platforms for de-anonymization, which in turn enable mass identity theft.

          • fluoridation2 hours ago
            >and is it easier to implement id checks for each online account that people have, had, and will ever have in the future?

            No, I think both ideas are bad.

      • fruitworks3 hours ago
        now?
    • illumanaughty3 hours ago
      There's a high chance the government is attempting to influence public opinion by using botted comments, which is easier than ever to pull off.
      • swolios2 hours ago
        Seems like these articles and the subsequent top replies all say:

        "use a token from the device so the ID never leaves, this is way better, right!"

        This is the true objective. They actually want DEVICE-based ID.

        I want FEWER things tied to me financially and legally that can be stolen when (not if) these services and my device are compromised.

      • co_king_53 hours ago
        This is what the LLMs are actually good for.
        • muyuu3 hours ago
          it's a solid business model actually
          • co_king_53 hours ago
            I'm inclined to agree.

            I also think the FUD they've succeeded in creating around the use of LLMs for code generation (there's a portion of the management class that seems to genuinely believe that Claude Code is AGI) is the greatest marketing operation of our lifetimes.

        • ep1033 hours ago
          Yeah, I've been saying for years that LLMs are a technology that basically unlock three major new technologies:

          1. Automatic shaping of online community discussions (social media, bots, etc)

          2. Automatic datamining, manipulating and reacting to all digitally communicated conversations (think dropping calls or MITM manipulation of conversations between organizers of a rival political party in swing districts prior to an election, etc. COINTELPRO as a service)

          3. Giving users a new UI (speech) with which they can communicate with computer applications

      • delusional3 hours ago
        Unless you live in North Korea, no there is not. This is pure conspiracy theory.
        • jpfromlondon3 hours ago
          well at least your screen name is accurate, which is more than can be said of your comment.
    • blablabla1233 hours ago
      I think it's quite embarrassing that the WWW has existed for more than three decades and there's still no mechanism for privacy-friendly proof of adulthood apart from sending over the whole ID. Of course this is a huge failure of governments, but probably also of the W3C, which would rather suggest the 100,000th JavaScript API. Especially in times of ubiquitous SSO, passkeys, etc. The even bigger problem is that the average person needs accounts at dozens if not hundreds of services for "normal" Internet usage.

      That being said, this is 1 bit of information: adult under current legislation, yes/no.

      • akersten3 hours ago
        > and still there's no mechanism for privacy friendly approval for adults apart from sending over the whole ID. Of course this is a huge failure of governments but probably also of W3C

        I consider it a huge success of the Internet architects that we were able to create a protocol and online culture resilient for over 3 decades to this legacy meatspace nonsense.

        > That being said, this is a 1 bit information, adult in current legislation yes/no.

        If that's all it would take to satisfy legislatures forever, and the implementation was left up to the browser (`return 1`) I'd be all for it. Unfortunately the political interests here want way more than that.

      • beambot3 hours ago
        SSO and passkeys don't solve adult verification. I don't see how this problem is embarrassing for the www - it's a hard problem in a socially permissible way (eg privacy) that can successfully span cultures and governments. If you feel otherwise, then solutions welcome!
    • Aurornis2 hours ago
      When ID checks are rolled out there is immediate outrage. Discord announced ID checks for some features a couple weeks ago and it has been a non-stop topic here.

      From what I’ve seen, most of the pro-ID commenters are coming from positions where they assume ID checks will only apply to other people, not them. They want services they don’t use like TikTok and Facebook to become strict, but they have their own definitions of social media that exclude platforms they use like Discord and Hacker News. When the ID checks arrive and impact them they’re outraged.

      Regulation for thee, not for me.

    • MaKey3 hours ago
      This is what I was wondering too. It doesn't seem genuine. Most people in tech I know will strongly oppose ID checks for internet use, rightfully so.
      • worldsayshi3 hours ago
        I think that not doing partial-identity checks invites bot noise into conversations. We could have ID checks that verify only exactly what needs to be checked: Are you human? Are you an adult? And then nothing else is known.
    • worldsayshi3 hours ago
      There are better and there are really really bad ways to do ID checks. In a world that is increasingly overwhelmed by bots I don't see how we can avoid proof-of-humanity and proof-of-adulthood checks in a lot of contexts.

      So we should probably get ahead of this debate and push for good ways to do partial-identity checks, because I don't see any good way to avoid them.

      We could potentially do ID checks that only show exactly what the receiver needs to know and nothing else.

      • semi-extrinsic2 hours ago
        > We could potentially do ID checks that only show exactly what the receiver needs to know and nothing else.

        A stronger statement: we know how to build zero-knowledge proofs over government-issued identification, cf. https://zkpassport.id/

        The services that use these proofs then need to enforce that only one device can be logged in with a given identity at a time, plus some basic rate limiting on logins, and the problem is solved.

        • 9dev2 hours ago
          Thank you - this gets way too little attention, especially among tech folks. People act like uploading your government ID to random online services is the only solution to this problem, which is really just a red herring.
    • meowface3 hours ago
      It's very odd. I see it everywhere I go.

      I think a lot of the younger generation supports it, actually. They didn't really grow up with a culture of internet anonymity and some degree of privacy.

    • hibikir3 hours ago
      We have a Scylla vs Charybdis situation, where lack of ID leads to an internet of bots, while on the other end we get a dystopia where everything anyone has ever said about any topic is available to a not-so-liberal government. Back in the day, it was very clear that the second problem was far worse than the first. I still think it is, but I sure see arguments for how improved tooling, and more value in manipulating sentiment, makes the first one quite a bit worse than it was in, say, 1998.
      • co_king_53 hours ago
        > Back in the day, it was very clear that the second problem was far worse than the first.

        This is still the case. The difference now is that the astroturfed bot accounts are pushing for fascism (i.e., the second problem).

    • hhh3 hours ago
      A lot of people are unhappy with the state of the Internet and the safety of people of all ages on it. I believe we should be focusing on building a way to authenticate as a human of a given nation without providing any more information than that, and try to raise the bar so that astroturfing requires identity theft.
    • some_random2 hours ago
      There's absolutely some astroturfing happening, but I wouldn't discount that there is some organic support as well. Journalists have been pushing total de-anonymization of the internet for a while now, and there are plenty of people susceptible to listening to them.
    • embedding-shape3 hours ago
      > Has the vibe really shifted so much among tech-literate people?

      Actually, yes, it seems to have shifted quite a bit. As far as I can tell, it's correlated with the amount of mis/disinformation on the web and the acceptance of more fringe views, which seems to make one group more vocal about wanting to ensure only "real people" share what they think on the internet, with a sub-section of that group wanting to enforce a "real name" policy too.

      It used to be fringe itself, but it's really been catching on in mainstream circles, where people tend to ask themselves: "But I don't have anything to hide, and I already use my real name, so why can't everyone else do the same?"

    • yogurt-male3 hours ago
      Could be astroturfing
    • bacchusracine2 hours ago
      I think the word you're missing is fatigue.

      The average tech-literate person keeps seeing their data breached over and over again. Not because THEY did anything wrong, but because these Corpos can't help themselves. No matter how well the tech-literate person secures their privacy, it has become clear that some Corpo will eventually release everything in an "accident" that renders those efforts meaningless.

      After a while it's only human for fatigue to build up. You can't stop your information from getting out there. And once it's out there it's out there forever.

      Meanwhile every Corpo out there in tech is deliberately creating ways to track you and extract your personal information. Taking steps to secure your information ironically just makes you stand out more and narrows the pool you're in to make it easier to find you and your information. And again you're always just one "bug" from having it all be for nothing.

      I still take some steps to secure my privacy, I'm not out there shouting my social security information or real name. But that's habit. I no longer believe that privacy exists.

      To the extent we ever had it in the past, it was only because of the then-insurmountable cost of tracking people and pooling the information into any kind of organized, easily searchable form. Now that it is easier and easier to build profiles on mass numbers of people, organize them, and rank them, the illusion is gone. Privacy is dead. Murdered.

      And people are tired of pretending otherwise.

      • bondarchukan hour ago
        People have been saying privacy is dead for decades, yet privacy keeps declining further, and there's still quite a ways it can go from here. The defeatist attitude only helps further the erosion.
    • giancarlostoro3 hours ago
      I don't support ID for internet use, only for adult content specifically. There are things on Discord that would shock you to your core if you saw some of them, and I don't think children should be blindly exposed to any of it. Specifically porn. Tumblr almost got kicked out of the App Store over porn; they went the route of banning it, killing what already felt to me like a dying social media platform.

      Do you think strip clubs and bars should stop IDing people at the door? I don't. Why should porn sites be any different?

      • cjs_ac3 hours ago
        The difference is that at the strip club, you show your ID to the bouncer, who makes sure it's valid and that the photo matches your face, and then forgets all about it. Online, that data is stored forever.

        The principle of online ID checks is completely sound; the implementation is not.

        • ibejoeb2 hours ago
          That's pretty much over, too. PatronScan and others collect and share data as a first-class feature, e.g., to broadly 86 people.

          https://www.sacbee.com/food-drink/article231580393.html

        • delusional3 hours ago
          A sound implementation exists: instead of seeing your ID, the bouncer gets a serial number from you and calls his government contact, who tells him you are of age. The serial number is meaningless to him.

          This would be impractical in meatspace, but works perfectly fine on the internet.
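          A minimal sketch of that flow (everything here is illustrative: the function names, the single-use serials, and the in-memory store are assumptions, not any government's actual API; a real deployment would use signed, expiring tokens over TLS):

```python
import secrets

# --- Government side (hypothetical issuer/oracle) ---
# Hands verified citizens an opaque serial and answers only a
# yes/no "is the holder an adult?" query. The site never learns
# who the holder is.
_issued = {}  # serial -> is_adult (government-internal record)

def issue_serial(is_adult: bool) -> str:
    """Citizen authenticates to the government and receives an opaque serial."""
    serial = secrets.token_urlsafe(16)
    _issued[serial] = is_adult
    return serial

def check_adult(serial: str) -> bool:
    """The bouncer's 'call to the government contact': 1 bit comes back.
    Serials are single-use here to blunt sharing and replay."""
    return _issued.pop(serial, False)

# --- Website side ---
def admit(serial: str) -> str:
    return "welcome" if check_adult(serial) else "denied"

alice = issue_serial(is_adult=True)
print(admit(alice))   # "welcome" on first use
print(admit(alice))   # "denied": replaying the serial fails
```

          The trade-off in this online-lookup design is that the government observes which serials get checked, so it can infer which services a citizen visits; credential schemes that verify offline avoid that.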

          • debugnik2 hours ago
            Where in your metaphor is the club next door that uses Persona instead of that implementation, and the EU's reference implementation that requires a Google Play integrity check just to acquire a serial number in the first place?
          • fluoridation3 hours ago
            You're proposing that every porn site on the planet pings a user's government's API to see if they're adult or not? In other words, that any random site is able to contact hundreds of APIs.
            • 9dev2 hours ago
              Absolutely, yes. They don’t ping to see that you are of age, but that the random challenge generated by your ID checks out.
          • Gracana3 hours ago
            Where is it implemented that way?
            • delusional3 hours ago
              In the proposal from the European Union, and in the implementation in Denmark.
              • Gracana2 hours ago
                Huh, interesting. Do you know if the government sees the identity of the company and the person being verified?

                [edit] I did a little reading and it sounds like the company does not query the government with your ID. You get the cryptographic ID from the government, and present it to a company who is able to verify its validity directly. My source is mostly this: https://www.eff.org/deeplinks/2025/04/age-verification-europ...
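                A rough stdlib-only sketch of that model (all names are hypothetical; HMAC stands in for the public-key signature a real scheme would use, which means this toy verifier also holds the signing key, a simplification purely to keep the example short):

```python
import hashlib
import hmac
import json
import secrets

# Stand-in for the government's signing key. A real credential would be
# signed with a public-key scheme (e.g. Ed25519) so sites hold only the
# public key; HMAC keeps this sketch dependency-free.
GOV_KEY = secrets.token_bytes(32)

def gov_issue_credential() -> dict:
    """Government hands the citizen a signed one-bit attestation."""
    claim = json.dumps({"over_18": True, "nonce": secrets.token_hex(8)})
    sig = hmac.new(GOV_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def site_verify(cred: dict) -> bool:
    """Site checks validity locally and reads only the yes/no bit.
    It never contacts the government and never sees a name or ID number."""
    expected = hmac.new(GOV_KEY, cred["claim"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cred["sig"]):
        return False
    return json.loads(cred["claim"])["over_18"]

cred = gov_issue_credential()
print(site_verify(cred))   # True: valid attestation, 1 bit disclosed
cred["claim"] = cred["claim"].replace("true", "false")
print(site_verify(cred))   # False: tampering invalidates the signature
```

                The key property matches the EFF description: validity is checked by the relying party directly, and the government is never in the loop at verification time.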

    • pixl973 hours ago
      I mean, there has always been some portion of tech-literate people who were like that; they were just less likely to post about it on forums. Heck, after the Eternal September it wasn't uncommon to hear 'jokes' about requiring a license to use the internet.
    • heliumtera3 hours ago
      The audience shifted from tech-literate to the opposite.
      • badgersnake2 hours ago
        Too many in tech for the money, that's for sure. No fundamentals, just drilled leetcode.
    • observationist3 hours ago
      It's inauthentic at best. The four horsemen of the infocalypse are drugs, pedos, terrorists, and money laundering. They trot out the same tired "protect the children!" arguments every year, and every year it's never, ever about protecting children; it's about increasing control of speech and stamping out politics, ideology, and culture they disapprove of. For a recent example, check out the UK's once-thriving small-forum culture: the innumerable hobby websites, the collections of esoteric trivia, the small sites that simply could not bear the onerous requirements imposed by tinpot tyrants and bureaucrats under the OSA.

      It's never fucking safety, or protecting children, or preventing fraud, or preventing terrorism, or preventing drugs or money laundering or gang activities. It's always, 100% of the time, inevitably, without exception, a tool used by petty bureaucrats and power hungry politicians to exert power and control over the citizens they are supposed to represent.

      They might use it on a couple of token examples for propaganda purposes, but if you look throughout the world where laws like this are implemented, authoritarian countries and western "democracies" alike, these laws are used to control locals. It's almost refreshingly straightforward and honest when a country just does the authoritarian things, instead of doing all the weaselly mental gymnastics to justify their power grabs.

      People who support this are ignorant or ideologically aligned with authoritarianism. There's no middle ground; anonymity and privacy are liberty and freedom. If you can't have the former you won't have the latter.

    • dyauspitr2 hours ago
      I think so. A lot of people think the internet now is a somewhat negative construct and don’t feel so strongly about it somewhat dying away.
    • mulmen2 hours ago
      Beware the vocal minority. Internet comment sections only tell you the sentiments of people who make comments.

      HN comments sentiment seems to shift over the age of the thread and time of day.

      My suspicion is that the initial comments are from people in the immediate social circle of the poster. They share IRC or Slack or Discord or some other community which is likely to be engaged and to have already formed strong opinions. Then, if the story gains traction and reaches the front page, a more diverse and thoughtful group weighs in. Finally the story rolls into EU or US waking hours and gets a whole new fresh take.

      I’m not surprised that people who support something are the ones most tuned in to the discussion because for anyone opposed they also have their own unrelated thing they care about. So the supporters will be first since they’re the originators.

    • bakugo3 hours ago
      > Has the vibe really shifted so much among tech-literate people?

      HN has largely shifted away from tech literacy and towards business literacy in recent years.

      Needless to say that an internet where every user's real identity is easily verifiable at all times is very beneficial for most businesses, so it's natural to see that stance here.

    • stuffn3 hours ago
      The average tech “literate” person uses discord, social media, a GitHub with their real name, a verified LinkedIn, and Amazon Echo.

      These are not the same people from 30 years ago. The new generation has come to love big brother. All it took to sell their soul was karma points.

      • ghaffan hour ago
        Many of the things you mention are also tools that many people use in a professional context, which mostly doesn't work if you try to be anonymous. Yes, some people choose to be pseudonymous, but that mostly doesn't work if your real-life and virtual identities intersect, such as when attending conferences, or when company policy requires that things you write for company publications appear under your real name.
    • jMyles3 hours ago
      I highly doubt the sentiment is from real humans. If anything, it proves that a web-of-trust-based-attestation-of-humanity is the real protection the internet needs.
    • 2OEH8eoCRo03 hours ago
      It's weird how all these 1,000-IQ innovators suddenly can't figure it out.

      I don't think they want to figure it out. They think the internet should be stagnant, unchanging, and eternal as it currently exists, because it makes the most money. If you disagree, you're either a normie, a bot, or need to parent harder or something. There is nothing you can do; don't dare try to change it.

    • salawat2 hours ago
      This is a VC site, when the revenue generating model of the Internet has strongly shifted into surveillance capitalism overdrive.

      Cui bono?

    • intended3 hours ago
      The vibe has shifted quite a bit among the general populace, not just in tech.

      The short version is that voters want government to bring tech to heel.

      From what I see, people are tired of tech, social media, and enshittified apps. AI hype, talk of the singularity, and fears about job loss have pushed things well past grim.

      Recent social media bans indicate how far voter tolerance for control and regulation has shifted.

      This is problematic because government is also looking for reasons to do so. Partly because big tech is simply dominant, and partly because governments are trending toward authoritarianism.

      The solution would have been research that helped create targeted and effective policy. Unfortunately, tech (especially social media) is naturally hostile to research that may paint its work as unhealthy or harmful.

      Tech firms are burned by exposés, user apathy, and a desire to keep getting paid.

      The lack of open research and access to data blocks the creation of knowledge and empirical evidence, which are the cornerstones of nuanced, narrowly tailored policy.

      The only things left on the table are blunt instruments, such as age verification.

    • alephnerd3 hours ago
      > Has the vibe really shifted so much among tech-literate people?

      Yes.

      Or more honestly, there was always an undercurrent of paternalistic thought and tech regulation from the Columbine Massacre days [0] to today.

      Also for those of us who are younger (below 35) we grew up in an era where anonymized cyberbullying was normalized [1] and amongst whom support for regulating social media and the internet is stronger [2].

      The reality is, younger Americans on both sides of the aisle now support a more expansive government, but for their party.

      There is a second order impact of course, but most Americans (younger and older) don't realize that, and frankly, using the same kind of rhetoric from the Assange/Wikileaks, SOPA, and the GPG days just doesn't resonate and is out of touch.

      Gen X Techno-libertarianism isn't counterculture anymore, it's the status quo. And the modern "tech-literate" uses GitHub, LinkedIn, Venmo, Discord, TikTok, Netflix, and other services that are already attached to their identity.

      [0] - https://www.nytimes.com/1999/05/02/weekinreview/the-nation-a...

      [1] - https://www.nytimes.com/2013/09/14/us/suicide-of-girl-after-...

      [2] - https://www.washingtonpost.com/politics/2022/06/09/why-young...

    • deadbabe3 hours ago
      Better to let a hundred people’s “privacy” be violated than to let another child be radicalized or abused or misled by online predation.
      • co_king_53 hours ago
        > Better to let a hundred Black men be hanged than to let another White woman be radicalized or abused or misled by their predation.

        This is how you sound to me.

      • co_king_53 hours ago
        > “privacy”

        Why did you put "privacy" in scare quotes?

        • deadbabe27 minutes ago
          The information being collected isn't inherently private: usage patterns, facial analysis, stylometry, etc.

          There is a lot you can do to determine a person's age without ever having to see a formal ID.

      • baal80spam3 hours ago
        I don't subscribe to this. I value my privacy more.
    • bgro3 hours ago
      It’s bots pushing another false narrative. You’ll notice this in anything around politics or intelligence the past 10+ years, with big booms around 2016 and 2024 “for some reason”
      • luke7273 hours ago
        No. There are significant numbers of real people who genuinely support this type of thing. Dismissing it as "bots" or a "false narrative" leads to complacency that allows this stuff to pass unchallenged.
        • vaylian3 hours ago
          The problem is: the people who typically support this type of thing are either technically illiterate and support it because it sounds good, or they are promoting these laws because they actually want more surveillance and control. It's not about protecting children.

          I still haven't read any truly compelling argument, why this type of surveillance is actually effective and proportionate.

    • forgotaccount33 hours ago
      As a tech-literate person, I'm not 100% against the concept of ID, if only because I think people would be more reasonable if they weren't anonymous.

      This conflicts with my concerns about government crackdowns and the importance of anonymity when discussing topics that cover people who have a monopoly on violence and a tendency to use it.

      So it's not entirely a black/white discussion to me.

      • PaulKeeble3 hours ago
        Both Google and Facebook have enforced real identity and it's not improved the state of people's comments at all. I don't think anonymity particularly changes what many people are willing to say or how they say it; people are just the creature you see, and anonymity simply protects them, it doesn't change their behaviour all that much.
      • armchairhacker3 hours ago
        I think opt-in ID is great. Services like Discord can require ID because they are private services*. Furthermore, I think that in the future, a majority of people will stay on services with some form of verification, because the anonymous internet is noisy and scary.

        The underlying internet should remain anonymous. People should remain able to communicate anonymously with consenting parties, send private DMs and create private group chats, and create their own service with their own form of identity verification.

        * All big services are unlikely to require ID without laws, because any that does not will get refugees, or if all big services collaborate, a new service will get all refugees.

      • rogerrogerr3 hours ago
        The problem is this is only true for values of "reasonable" that are "unlikely to be viewed in a negative light by my government, job, or family; either now or at any time in the future". The chilling effect is insane. There was a time in living memory when saying "women should be able to vote" was not a popular thing.

        I mean, this is _literally the only thing needed_ for the Trump admin to tie real names to people criticizing $whatever. Does anyone want that? Replace "Trump" with "Biden", "AOC", "Newsom", etc. if they're the ones you disagree with.

        • co_king_53 hours ago
          > Replace "Trump" with "Biden", "AOC", "Newsom", etc. if they're the ones you disagree with.

          Stop trying to reason with fascists.

          Everyone in the world knows that the Democrats you named are too ideologically aligned with right-wing hatred to ever leverage the repressive power of the state apparatus in the same way Republicans do.

          • mikkupikku3 hours ago
            Obama carried on where Bush left off. I think Biden was at least marginally better; at the very least I admire him for ripping off the Afghanistan bandaid. But the amount of effort he put into rolling back executive overreach was minimal, if anything.
          • rogerrogerr3 hours ago
            You're saying that Biden, AOC, and Newsom are "ideologically aligned with right-wing hatred"? This is not something I've ever heard a human being say. Almost afraid to ask, but where's that coming from?
            • co_king_53 hours ago
              Why did AOC stop calling them "concentration camps" when Biden took office?
      • triceratops3 hours ago
        > I think people will be more reasonable if they weren't anonymous.

        I've seen people post appalling shit on fuckin LinkedIn under their own names.

        Strong moderation keeps Internet spaces from devolving into cesspools. People themselves have no shame.

        • subscribed21 minutes ago
          Same. Also on Facebook and Nextdoor (with real names and addresses).

          Real-name moderation is a fallacy.

      • 2OEH8eoCRo03 hours ago
        That's what I believe as well. Anons have turned the internet into an unsafe cesspit. It's the opposite of a "town square."
        • finghin3 hours ago
          Internet anonymity is FAR from something new.
    • jraby33 hours ago
      I used to be so against this, but after the never-ending cat-and-mouse game with my kids (especially my son), I don't think the tech crowd really appreciates how frustrating it is and how many different screens there are.

      Tons of data also show higher suicide rates, depression rates, eating disorders, etc., so it's not as if there is no good side to this.

      • cataphract3 hours ago
        If they are so intent on disobeying, what makes you think they won't just use a VPN or ask someone older to log in for them (or any other workaround, depending on the technology)?
      • voidUpdate3 hours ago
        I think the tech crowd appreciates how hard it is to lock down access to tech, since they were the kids bypassing the restrictions
      • modo_mario3 hours ago
        You are the one handing them those screens.
      • zer00eyz3 hours ago
        > Tons of data also showing higher suicide rates

        Here is the data:

        https://www.cdc.gov/mmwr/volumes/66/wr/mm6630a6.htm

        and the more recent data:

        https://afsp.org/suicide-statistics/

        I was a child of the 90's, where the numbers were higher, where we had peak PMRC.

        > depression rates

        Have these changed? Or have we changed the criteria for what qualifies as "depression"? We keep changing how we collect data, and then don't renormalize when that collection skews the stats. This is another case of it, honestly.

        > eating disorders

        Any sort of accurate data collection here is a recent development:

        https://pmc.ncbi.nlm.nih.gov/articles/PMC7575017/

        > never ending cat and mouse game with my kids (especially my son)

        Having lived this with my own, I get it. Kids are gonna be kids, and they are going to break the rules and push limits. When I think back to the things I did as a kid at their age, they are candidly doing MUCH better than I or my peer group did. Drug use, drinking ( https://usafacts.org/articles/is-teen-drug-and-alcohol-use-d... ), and teen pregnancy are all down ( https://www.congress.gov/crs-product/R45184 ).

    • otterley3 hours ago
      When you’re young, the overwhelming and irrepressible desire to overcome society's proscriptions to satisfy your intellectual and sexual curiosity is natural and understandable. The open Internet made that easier than ever, and I enjoyed that freedom when I was younger—though I can’t say it was totally harmless.

      When you’re older and have children—especially preteens and teenagers—you want those barriers up, because you’ve seen just how fucked up some children can get after overexposure to unhealthy materials and people who want to exploit or harm them.

      It’s a matter of perspective and experience. As adults age, their natural curiosity evolves into a desire to protect their children from harm.

      • subscribed7 minutes ago
        The only thing this is going to achieve is to bar unverified users from the vaguely reputable and mainstream places and push them into small, completely unregulated spaces, sites, and networks.

        I presume you prefer a hard requirement for IDs.

        I'm saying this will make kids go to i2p, Tor, and obscure fora in countries not giving a f* about western laws.

        As a parent of teens and preteens, THIS is what makes me concerned. The best VPNs are very hard to detect (I know, I try them myself).

      • miroljub3 hours ago
        So you basically want to prevent your children from doing what you did at their age?

        And you don't mind that freedoms of all of us would be restricted as a result?

        And then, we keep blaming boomers for those restrictions.

        • co_king_53 hours ago
          > And you don't mind that freedoms of all of us would be restricted as a result?

          Usually the people who say things like that really just want to restrict everyone's freedoms. Everything else is just bluster.

          • otterley3 hours ago
            Freedom to do what, exactly? You realize that the extreme opposite of laws and restrictions meant to maintain a working social order is anarchy, right?
            • co_king_53 hours ago
              > Freedom to do what, exactly?

              You may be failing to comprehend the concept of "freedom".

              • otterley3 hours ago
                Please, O wise one, explain "freedom" to the political scientist and lawyer you're talking to. Let me get my popcorn first.
                • miroljub2 hours ago
                  > Please, O wise one, explain "freedom" to the political scientist and lawyer you're talking to. Let me get my popcorn first.

                  If you think only "political scientists and lawyers" get to decide what freedom is, you have quite a totalitarian mindset.

                  If you have some arguments, pray tell. "I'm the smartest guy here" is not an argument. It's just something an NPC would say when they run out of arguments.

                  PS: This is not ad hominem. It's a dismissal of your claim of authority.

                  • otterley2 hours ago
                    I'm afraid you missed the point of my reply. You have to assume here that the people you're arguing with may, in fact, be as smart as, or even more knowledgeable than you regarding certain subjects; and that dismissive replies like "You may be failing to comprehend the concept of 'freedom'" put you way out of line and at risk of having your ass handed to you. Come armed with substance, not snipes.
                  • mothballed2 hours ago
                    There's 190,000 pages of CFR that are essentially bound as law, almost entirely written and maintained by unelected bureaucrats.

                    They've been deciding what "freedom" is for a long time (even deciding what constitutional rights are, on occasion, see ATF bureaucrats constantly publishing and changing rules re-deciding what constitutional restraints they think there are on the 2A).

                    Of course, these "scientists and lawyers" know they have this power, and are so steeped in it that they occasionally forget, when they step out of the ivory tower, that the plebs (and indeed the foundational ideals the USA was built on, written by those such as Locke) usually either disagree with it or aren't aware that much of the USA functions under "credentialism/technocrat makes right," with the scientist and the lawyer as the arbiters of freedom.

                    This feels like one of those moments when the technocrats forget that they've shed the thin façade they hide behind.

                    • otterley2 hours ago
                      No political thread would be complete without a Second Amendment absolutist joining the conversation in order to derail it. They're joining sooner than ever!
                • co_king_52 hours ago
                  I am so sorry. I didn't realize you had a *political science* degree.

                  I'll get my simpleminded ass out of here and leave this discussion to the scientists.

                  • otterley2 hours ago
                    Alternatively, you could provide a substantive and respectful argument instead of a snipe, as you should have done in the first place.
                    • co_king_5an hour ago
                      I'm sorry but I don't think I have the proper training to debate someone so far outside of my intellectual weight class.
            • mothballed3 hours ago
              The opposite of something like Bastiat's ideal of the law is something more like the law of tyranny or law of the plunderer. Anarchy I place somewhere closer to the middle -- better than the law of a tyrant because at least under anarchy the law of the tyrant isn't legitimized even if it still might be enforced by might.
        • otterley3 hours ago
          Yes, in exactly the same way that my dad would want me to only use SawStop table saws so that I don't lose a finger like he did.

          As for "freedoms," you're not free to vote or drink alcohol below a certain age. And before the internet, minors couldn't purchase pornography, either. Some people perceive this change as a return to normal, not an egregious destruction of freedom.

          • miroljub2 hours ago
            > As for "freedoms," you're not free to vote or drink alcohol below a certain age. And before the internet, minors couldn't purchase pornography, either. Some people perceive this change as a return to normal, not an egregious destruction of freedom.

            I am not talking about pornography or alcohol at all.

            I hope you are aware that requiring an ID to surf the internet leads to total censorship and self-censorship across the entire internet. There goes your privacy, anonymity, and right to free speech.

            If your country's regime really wanted to address pornography or alcohol, I'm pretty sure they would be able to shut it down without requiring everyone's identity. The issue is, they are just using these topics to manipulate people, and you are falling into that trap.

            • otterley2 hours ago
              > requiring an ID to surf the internet

              Who's proposing this? I don't want to argue over a straw man.

              • miroljub2 hours ago
                Age verification === require an ID
                • otterley2 hours ago
                  Right. I meant the "to surf the internet" part. Who's proposing this, exactly? No government is mentioned in the article that is doing or considering this.

                  They are talking about it in the context of "high risk" services and social media, but not the Internet as such.

          • mitthrowaway23 hours ago
            SawStop table saws still suffer from kickback like other table saws, which is arguably much more dangerous than losing a finger and can even cause lethal injury. The SawStop mechanism might provide an illusion of safety that results in users being less careful with their work.

            I think the solution we really need is age verification for table saws. Of course, it goes without saying that the saw should also monitor the user's cuts to make sure they're connected with the right national suppliers who can supply material to meet their needs, and to ensure that you aren't using the saw to cut any inappropriate materials from unregistered sources.

            • otterley2 hours ago
              Ah, yes, the old "safety mechanism doesn't protect against all dangers, therefore it has no value" argument. Right.

              The door is over there. Take the baby out with the bathwater as you leave. -->

      • phoronixrly3 hours ago
        > When you’re older and have children—especially preteens and teenagers—you want those barriers up, because you’ve seen just how fucked up some children can get after overexposure to unhealthy materials.

        You mean that you shirk your responsibility to teach your child how to protect themself on the Internet, and instead trust the faceless corp to limit their access at the cost of everyone's privacy? How does this make sense...

        • gertlex3 hours ago
          They may be looking at the societal level and saying: "I can attempt to teach my kids best practices, but I've learned I sure can't rely on my peers to do the same with their kids...", then feeling like the outcome of that, if left as-is, is societal decline... and then believing that something needs to be done beyond the individual level.
        • otterley3 hours ago
          If a business demands you reveal your identity as a condition of use, and you would rather maintain your anonymity, you can choose not to use that business. It's not like these companies are providing essential services necessary for life.

          Heck, you can't even obtain housing -- which is an essential service -- without having to provide identity in most cases.

          • 2duct3 hours ago
            Some people would argue though that if the friend group is on Facebook/Discord or whatever, and they aren't going to move off to cater to the person rejecting those services, then those services are at least essential to maintaining those social ties. They decided that giving up their data was a tradeoff worth it.

            What remains to be seen is if the outcome of teenagers becoming social pariahs is really worse than the alternatives.

            • otterley2 hours ago
              If not joining social media with friends has been seriously detrimental to teens by making them social pariahs, I'm sure we'd have heard plenty of horror stories by now, as these services have been around for over 20 years. Compare against the horror stories we have heard about those who have gone down the dark roads social media has opened to them that ended in tragedy.
  • jonstaab3 hours ago
    Why is no one talking about using zero knowledge proofs for solving this? Instead of every platform verifying all its users itself (and storing PII on its own servers), a small number of providers could expose an API which provides proof of verification. I'm not sure if some kind of machine vision algorithm could be used in combination with zero-knowledge technology to prevent even that party from storing original documents, but I don't see why not. The companies implementing these measures really seem to be just phoning it in from a privacy perspective.
    • thewebguyd3 hours ago
      People are talking about it, at least here anyway.

      The reason you don’t see it in policy discussion from the officials pushing these laws is because removal of anonymity is the point. It’s not about protecting kids; it never was. It’s about surveillance and a chilling effect on speech.

    • quotemstran hour ago
      Technologists engage in an understandable, but ultimately harmful behavior: when they don't want outcome X, they deny that the technology T(X) works. Consider key escrow, DRM, and durable watermarking alongside age verification. They've all been called cryptographically impossible, but they're not. It's just socially obligatory to pretend they can't be done. And what happens when you create an environment in which the best are under a social taboo against working on certain technologies? Do you think that these technologies stop existing?

      LOL.

      Of course these technologies keep existing, and you end up with the worst, most wretched people implementing them, and we're all worse off. Concretely, few people are working on ZKPs for age verification because the hive mind of "good people" who know what ZKPs are makes working on age verification socially anathema.

  • boerseth3 hours ago
    Does each service really need to collect this data from the user directly? They could instead have the user authorise them, e.g. via OAuth2, to access their age with one of the de-facto online identity providers. I would be surprised if those providers didn't implement an API for this sometime soon, because it would place them as the source of truth and give them unique access to that bit of user data. Seems like a chance and a position they wouldn't want to lose.
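    A minimal sketch of what such a flow could look like, assuming a hypothetical identity provider that signs a token carrying only an age claim. The claim name and HMAC signing are illustrative; a real OIDC provider would use asymmetric keys (e.g. RS256) published via JWKS:

```python
import hmac, hashlib, json, base64

# Shared secret stands in for the provider's signing key (illustrative only).
IDP_KEY = b"idp-demo-key"

def sign(payload: dict) -> str:
    """Identity-provider side: issue a token carrying ONLY the age claim."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    mac = hmac.new(IDP_KEY, body, hashlib.sha256).hexdigest()
    return f"{body.decode()}.{mac}"

def verify(token: str) -> dict:
    """Relying-party side: check the signature, then read the claim."""
    body, mac = token.rsplit(".", 1)
    expected = hmac.new(IDP_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(body))

token = sign({"age_over_18": True})   # user authenticates at the provider
claims = verify(token)                # service never sees name or birthdate
print(claims["age_over_18"])          # -> True
```

    The service only ever learns the boolean; the provider remains the source of truth.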
  • cromka3 hours ago
    Someone explain it to me like I'm 5: there are some solutions already in effect that are based on cryptographically generated, anonymous, one-time-use tokens that allow confirming an adult's age without being tied to your ID. Why on earth do even technically skilled people completely ignore those? Is this pure NIMBY ignorance, or am I missing something?
    • bakugo2 hours ago
      Because those solutions always have obvious flaws. If the cryptographic token is anonymous, how do you know the user verifying is the same one who generated the token? How do you know the same cryptographic key isn't verifying several accounts belonging to other people?
      • cromka2 hours ago
        They are one-time use by definition. You can't know they are used by the rightful owner, but the idea is you have to provide a new token every few weeks/months. Much like when using other services nowadays; I mean, even Gmail will have you reauthorize every few months even if you didn't log out. Plus you fine/prosecute those who sell/misuse theirs, just like you prosecute adults who buy kids alcohol or other substances.

        Obvious flaws are OK. I absolutely hate the Nirvana fallacy that you people think is acceptable here, while hundreds of millions of kids suffer from serious developmental issues, as reported left and right by all kinds of organizations and governments themselves.
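        A toy sketch of the one-time-token idea, assuming a trusted issuer that has already checked the holder's age out of band. Everything here is illustrative; a real deployment would use blind signatures or Privacy Pass-style tokens so the issuer can't link a token back to a user:

```python
import hmac, hashlib, secrets

ISSUER_KEY = b"issuer-demo-key"  # illustrative shared key
spent = set()                    # replay log: each token redeems once

def issue_token() -> str:
    """Issuer side: hand out a signed random nonce, no identity attached."""
    nonce = secrets.token_hex(16)
    mac = hmac.new(ISSUER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{mac}"

def redeem(token: str) -> bool:
    """Site side: accept the token once; replays and forgeries are rejected."""
    nonce, mac = token.split(".")
    expected = hmac.new(ISSUER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected) or nonce in spent:
        return False
    spent.add(nonce)
    return True

t = issue_token()
print(redeem(t))  # -> True  (first use)
print(redeem(t))  # -> False (one-time: replay rejected)
```

        The replay log is exactly what enforces "new token every few weeks": once spent, a token is dead, so resold tokens have a short shelf life.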

  • arbirkan hour ago
    I know many will disagree and that is OK. IMO we need a global ID based on nation-states' national IDs. I know that the US doesn't have one, but the rest of the developed world does. I don't want ID on porn sites because I don't think that is necessary, but I want bot-free social media, 13+ sharing forums like Reddit, and I want competitive games where, if you are banned, you need your brother's ID to try cheating again.
  • barfiure3 hours ago
    The internet isn’t the same as it was when we were growing up, unfortunately. I miss the days of cruising DynamicHTML while playing on GameSpy but… yeah. It became an absolute clusterfuck and I’m not surprised they now want to enforce age restrictions.

    Maybe TBL is right and we need a new internet? I don’t have the answer here, but this one is too commercialized and these companies are very hawkish.

  • ct02 hours ago
    I don't get the alcohol analogy, as in most places in the USA it's 100% legal for minors to consume alcohol in the home with parental permission. In public it's a different story.
  • akersten3 hours ago
    In my experience the people who want "privacy preserving age verification" are the same people who want "encryption backdoors but only for the good guys." Shockingly the technically minded among them do seem to recognize the impossibility of the latter, without applying the same chain of thought to the former.
  • robinwhg2 hours ago
    I'm not too knowledgeable about this, but couldn't you just provide a government-issued key to every citizen, give a service provider that key, and have it be valid only if you're above a certain age?
  • bronlund3 hours ago
    I would argue that this has nothing to do with age verification, but everything to do with getting identifiable data on all of us.
  • almosthere16 minutes ago
    I think this should work like OpenID Connect but with just a true/false.

    PS = pr0n site

    AV = age verification site (conforming to age-1 spec and certified)

      PS: Send user to AV with generated token
      AV: Browser arrives with POST data from PS with generated token
      AV: AV-specific flow to verify age - may capture images/tokens in a database. May be instant or take days
    
      AV: Confirms age, provides link back to original PS
      PS: Requests AV/status response payload:
    
      {
        "age": 21,
        "status": "final"
      }
    
    
    No other details need to be disclosed to PS.

    I don't know if this is already the flow, but I suspect AV is sending name, address, etc... All stuff that isn't needed if AV is a certified vendor.
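    A toy in-memory model of the handoff described above; the function names, token format, and status payload are my own illustration of the proposed flow, not any existing spec:

```python
import secrets

# Sessions held by the age-verification vendor (AV), keyed by opaque token.
av_sessions = {}

def ps_start_verification() -> str:
    """PS: generate an opaque token and send the user to AV with it."""
    token = secrets.token_urlsafe(16)
    av_sessions[token] = {"status": "pending"}
    return token

def av_complete(token: str, age: int) -> None:
    """AV: after its own flow (ID scan, etc.), record only the age."""
    av_sessions[token] = {"age": age, "status": "final"}

def ps_poll_status(token: str) -> dict:
    """PS: fetch the result; no name or address ever crosses this boundary."""
    return av_sessions[token]

tok = ps_start_verification()
av_complete(tok, 21)
print(ps_poll_status(tok))  # -> {'age': 21, 'status': 'final'}
```

    The key property is that PS only ever holds the opaque token and the age/status payload; any PII captured during verification stays on the AV side.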

  • haunter3 hours ago
    This is my problem with the Discord situation too:

    Big tech doesn't have to wait for an outright government ban when they can just declare themselves a teen-only site by default, so that everyone has to verify whether they are over 18 or not. This age verification will affect everyone no matter what.

  • Pxtl21 minutes ago
    All of my kids' devices are identified, at the device level, as children's devices. They could've trivially exposed this as metadata to allow sites to enforce "no under 18" use. That said, my bigger concern for my kids isn't that they'd see a boob or a penis, but that they'd see an influencer who'd try to radicalize them to some extremist cause, and that's usually not considered 18+ content.

    And either way, none of that requires de-anonymizing literally everyone on the internet. I'd be more than happy to see governments provide cryptographically secure digital ID so that sites can self-select to start requiring it to make moderation easier.

  • lightningspirit3 hours ago
    If there's only a centralized system that uses digital IDs to hand off providers only a "yay" or "nay"...
  • pessimizer4 minutes ago
    The point is to undermine data protection; this debate is useless. It's a question about power and control, not a technical one. The people lobbying for this don't care about children, and neither are they getting big support from a constituency clamoring for this. This is an intelligence initiative, and a donor initiative from people who are in a position to control the platform (all computing and communications) after it is locked down.

    It's not even worth talking about online. There's too much inorganic support for the objectives of nation-states and the corporations that own them.

    Legislation has been advanced in Colorado demanding that all OSes verify the user's age. It will fail, but it will be repeated 100 times, in different places, smuggled into different legislation, the process and PR strategies refined and experimented with, versions of it passed in Australia, South Korea, maybe the UK and Europe, and eventually passed here. That means that "general purpose" computing will eventually be lost to locked bootloaders.

    https://www.pcmag.com/news/colorado-lawmakers-push-for-age-v...

    And it will be an entirely engineered and conscious process by people who have names. And we will babble about it endlessly online, pretending that we have some control over it, pretending that this is a technical discussion or a moral discussion, on platforms that they control, that they allow us to babble on as an escape valve. Then, one day the switch will flip, and advocacy of open bootloaders, or trading in computers that can install unattested OSes, will be treated as organized crime.

    All I can beg you to do is imagine how ashamed you'll be in the future when you're lying about having supported this now, or complaining that you shouldn't have "trusted them to do it the right way." Don't let dumb fairytales about Russians, Chinese, Cambridge Analytica, and pedophile pornography epidemics have you fighting for your own domination. Maybe you'll be the piece of straw that slows things down just enough that current Western oligarchies collapse before they can finish. Maybe we'll get lucky.

    Polls and ballots show that none of this stuff has majority organic support. But polls can be manipulated, and good polls have to be publicized for people to know they're not alone, and not afraid they're misunderstanding something. If both candidates on the ballot are subverted, the question never ends up on the ballot.

    The article itself says nothing that hasn't been said before, and stays firmly under the premise that access to content online by under-18s is suddenly one of the most critical problems of our age, rather than a sad annoyance. What is gained by having this dumb discussion again?

  • kevincloudsec3 hours ago
    the companies pushing hardest for age verification are the same ones whose business model depends on knowing exactly who you are. the child safety framing is convenient cover for a data collection problem they were already trying to solve.
  • edgyquant4 hours ago
    Everything is a trade off in the world. I think that people who are anti-id ignore this but for me personally it’s harder and harder to accept the trade offs of an internet without id. AI has only accelerated this, I don’t want to live in a world where the average person unknowingly interacts with bots more than other individuals and where black market actors can sway public opinion with armies of bots.

    I think most people are aligned here, and that an internet with identification is inevitable whether we like it or not.

    • Levitz3 hours ago
      Astroturfing was already a thing.

      Identification fixes nothing here, you log with your account, plug in the AI.

      The problems with social media have nothing to do with ID and everything to do with godawful incentives, the argument seems to be that it's a large price to pay but that it's worth it. Worth it for what? The end result is absolutely terrible either way

      • iamnothere3 hours ago
        Astroturfing will still be a thing after ID. What, you think the government is going to go after their own bot armies?
        • edgyquant3 hours ago
          I think it would be a lot more difficult for anyone to do, and it isn't like people will be using government platforms, at least not in the West.
          • Levitz2 hours ago
            >I think it would be a lot more difficult for anyone to do

            Why? Like, what makes you think that?

            • fyredgean hour ago
              Because of ID tracking? Say you have to attach your government-approved ID to use social media. It is now trivial to check how many accounts you have made and how much you have posted. You certainly can't be posting faster than the fastest typist in the world. And if you're mostly just copy-pasting, is the quality of the posts actually worth engaging with?

              While I am not against internet ID, there is a case to be made against social media for the harms they are causing.

              • amiga38626 minutes ago
                Let's say the government issues hundreds of thousands of IDs to people who don't exist and uses them to verify bots (or room full of paid humans) that post pro-government messages all day, at "normal" rates that a human posts.

                It's amazing how there is a much larger crowd, of completely real people, who approve of the government, than those nasty dissenters. We know they're real people because we trust the government vouching for its own IDs.

                And because of the real ID policy, the government can also ask the social media company for the ID used by opposed posters, and find out where they live and "visit" them, maybe "warn" them.

                Hooray for democracy!

    • egorfine3 hours ago
      1) ID checks will not close the trade-off. Real IDs are easily available on the market, so criminals will use them, no problem. It's the law-abiding, privacy-minded people (like me) who would be hurt the most.

      2) Your point is valid. I too want to know whether I am engaging with a bot or a person. This is impossible now and it will be impossible once ID check becomes ubiquitous.

      3) I will be happy to see (or not) a blue checkmark by the profile name. Just like in Twitter. That's enough.

    • kristopolous3 hours ago
      No, the argument is that bad actors will reliably find a way to bypass these systems at an industrial scale, while you'll snag honest people instead.

      Look at the facebook real name policy.

      • fyredgean hour ago
        This sounds a lot like the pro-gun rhetoric of bogging down the "good" gun owners but not doing enough to the "bad" gun owners.
        • kristopolous34 minutes ago
          It's not really the same. The good guy/bad guy gun rhetoric has deeply racist roots.

          But beyond that we can look at places similar things have been rolled out.

          Facebook has a real name policy and is overflowing with fraudsters and ai slop

          Although I can't figure out how to sign up for a second telegram account with their phone number restriction that hasn't stopped multiple scammers hitting me up every day on the service.

          On YouTube, their demographics show ladies in their 30s watching nursery rhyme videos by the millions, because mothers give their phones to their children.

          On social media, scammers tend to take over the accounts of dead people because the deceased don't update their passwords after a data breach. Your ID card policy, however strict, isn't going to stop the most common attack vector

          So I don't know what you're trying to solve with ID checks: parents hand their logged-in devices to their children, scammers raid the accounts of the verified dead, and existing systems clearly aren't working. Strictly enforcing ineffective security theater isn't going to change this.

          Doing something that doesn't work isn't a solution

    • zenbowman4 hours ago
      100% correct. At this point the harms to children from social media use are very well documented.

      Like everything else in society, there are tradeoffs here, I'm much more concerned with the damage done to children's developing brains than I am to violations of data privacy, so I'm okay with age verification, however draconian it may be.

      • logifail3 hours ago
        > At this point the harms to children from social media use are very well documented

        Our middle child (aged 12) has an Android phone, but it has Family Link on it.

        Nominally he gets 60 mins of phone time per day, but he rarely even comes close to that, according to Family Link he used it for a total of 17 minutes yesterday. One comes to the conclusion that with no social media apps, the phone just isn't that attractive.

        He seems to spend most of his spare time reading or playing sports...

        • edgyquant3 hours ago
          I commend this but I always try to think about the arguments for something like cigarettes. People didn’t buy the argument that parents need to be preventing their kids from smoking
        • zobzu3 hours ago
          most kids don't have parents who care to that degree.
          • logifail3 hours ago
            As part of the unofficial bargain in which we limit screen time I get to spend a big chunk of my spare time driving him (and his siblings) to and from various sports fixtures.

            Just one of the many joys of parenting :)

      • meowface3 hours ago
        We need to destroy privacy and anonymity online for the noble goal of the government banning teenagers from looking at Twitter and Instagram?

        If it's a concern, parents can prevent or limit their children's use. If all this were being done to prevent consistent successful terrorist attacks in the US with tens of thousands of annual casualties, I'd say okay maybe there is an unavoidable trade-off that must be made here, but this is so absurd.

        • edgyquant3 hours ago
          It isn’t just about teenagers, though; I think I outlined that. We need to make sure people online are real people, and yes, we should prevent kids from being exposed to algorithms designed to addict them.
          • meowfacean hour ago
            Adults are nearly as susceptible to such addiction. If this is the goal then the actual legislation should be to prohibit social media companies from doing it to anyone. (I think this would be government overreach and a possible first amendment violation, though. I say this as a center-left person who deeply hates what Musk has done to Twitter. I would even describe myself as an anti-free speech person; I just respect the nation's laws and the principle that the state should not be able to imprison you just for speech.)
      • modo_mario3 hours ago
        Do you genuinely believe the major tech companies and gov reps actually want to close their addiction revenue taps?
    • modo_mario3 hours ago
      > I think that people who are anti-id ignore this

      No we do not.

      >I don’t want to live in a world where the average person unknowingly interacts with bots more than other individuals and where black market actors can sway public opinion with armies of bots.

      That is not the argument for identification on many places on the internet. It's not even the argument that the gov reps pushing it typically make. And why would it be. The companies that go along with all this don't want to get rid of all bots and public opinion campaigns. They make money off of many of those.

    • amiga3863 hours ago
      You're not thinking more than one step ahead. If you let a third party define who "has ID", "is human", etc. you give that third party control over you. You already gave control of your attention away to the sites who host the UGC, now you also give away control of your sense of reality.

      At any point they can tell a real human what they can and can't say, and if they go against their masters, their "real human" status is revoked, because you trust the platform and not the person.

      If we want to go full conspiritard, we could accuse those of wanting to control speech to be the financial backers of those flooding social media with AI slop: https://www.youtube.com/watch?v=-gGLvg0n-uY -- this fictional video thematically marries Metal Gear Solid 2's plot with current events: "perfect AI speech, audio and video synthesis will drown out reality [...] That is when we will present our solution: mandatory digital identity verification for all humans at all times"

      • edgyquant3 hours ago
        I am though. In the world I live in I already have to give power over myself to corporations and a government, I don’t buy this as an argument for continuing to let internet companies skirt existing laws.
        • amiga3862 hours ago
          I don't know what to say. You will live in a world "where the average person unknowingly interacts with bots more than other individuals and where black market actors can sway public opinion with armies of bots", even more so after you and I and everyone on the planet are compelled to provide our identity at all times.

          The various government actions trying to force "robust" age verification on the internet are being woefully naive in trusting other internet companies and letting them skirt existing laws on data protection.

          That's not even mentioning other factions whose real goal is in shutting down legal speech that doesn't meet their Christian agenda: https://theintercept.com/2024/08/16/project-2025-russ-vought...

          You are being a useful idiot, sorry. Your weakness is what politicians exploit when they say "think of the children", you fail to see the amoral power-grabs hiding beneath their professed sentiment.

          I don't want you encouraging people to demand my identity because you trust "authorities" taking yours

    • RockRobotRock3 hours ago
      (Disclaimer: American perspective)

      Why don't we have PKI built in to our birth certificates and drivers licenses? Why hasn't a group of engineers and experts formed a consortium to try and solve this problem in the least draconian and most privacy friendly way possible?

      • zobzu3 hours ago
        newer passports and driver licenses do.
    • spwa44 hours ago
      ID verification doesn't protect against that. Why not? Because there are a lot of people that will trade their ID for a small amount of money, or log someone/something in. IDs are for sale, like everyone who was ever a high school student knows for "some" reason.

      Plus what you're asking would require international id verification for everyone, which would first mostly make those IDs a lot cheaper. But there's a second negative effect. The organizations issuing those IDs, governments, are the ones making the bot armies. Just try to discuss anything about Russia, or how bad some specific decision of the Chinese CCP is. Or, if you're so inclined: think about how having this in the US would mean Trump would be authorizing bot armies.

      This exists within China, by the way, and I guarantee you: it did not result in honest online discussion about goods, services or politics. Anonymity is required.

  • knallfroschan hour ago
    All adults prove their identity multiple times per month: every time they access digital health records, or when they use any electronic payment.

    Just make Google/Apple reveal part of that data (age > x years) to websites and apps.

    Boom, done. Privacy guarded. Easy.

  • tolmasky3 hours ago
    I am so surprised by the comments on this thread. I was not expecting to see so many people on Hacker News in favor of this. As is typically the case with things like this, the reasoning stems from agreeing with the goal of age verification, with little regard to whether age verification could ever actually work. It reminds me in some sense of the situation with encryption, where politicians want encryption that blocks "the bad guys" while still allowing "the good guys" to sneak in if necessary. Sure, that sounds cool; it's not possible, though. I suppose DRM is a better analogue here: an increasingly convoluted system that slowly takes over your entire machine just so it can pretend that you can't view video while you're viewing it.

    To be clear, tackling the issue of child access to the internet is a valuable goal. Unfortunately, "well what if there was a magic amulet that held the truth of the user's age and we could talk to it" is not a worthwhile path to explore. Just off the top of my head:

    1. In an age of data leaks, identity theft, and phishing, we are training users to constantly present their ID, and critically for things as low stakes as facebook. It would be one thing if we were training people to show their ID JUST for filing taxes online or something (still not great, but at least conveys the sensitivity of the information they are releasing), but no, we are saying that the "correct future" is handing this information out for Farmville (and we can expect its requirement to expand over time of course). It doesn't matter if it happens at the OS level or the web page level -- they are identical as far as phishing is concerned. You spoof the UI that the OS would bring up to scan your face or ID or whatever, and everyone is trained to just grant the information, just like we're all used to just hitting "OK" and don't bother reading dialogs anymore.

    2. This is a mess for the ~1 billion people on earth who don't have a government ID. This is a huge setback for populations we should be trying to get online. Now all of a sudden your usage of the internet is dependent on your country having an advanced enough system of government ID? Seems like a great way for tech companies to gain leverage over smaller third-world countries by tying their access to the internet to implementing support for their government documents. It also seems like a great way to lock open source out of serious operating system development if it now requires relationships with all the countries in the world. If you think this is "just" a problem of getting IDs into everyone's hands, remember that it is a common practice to take foreign workers' passports and IDs away from them in order to hold them effectively hostage. The internet was previously a powerful outlet for working around this, and would now instead assist this practice.

    3. Short of implementing HDCP-style hardware attestation (which more or less locks in the current players indefinitely), this will be trivially circumvented by the parties you're attempting to help, much like DRM was.

    Again, the issues that these systems are attempting to address are valid; I am not saying otherwise. These issues are also hard. The temptation to just have an oracle gate-checker is strong, I know. But we've seen time and again that this just (at best) creates a lot of work and doesn't actually solve the problem. Look no further than cookie banners: nothing has changed from a data collection perspective; it's just created a "cookie banner expert" industry and possibly made users more indifferent to data collection as a knee-jerk reaction to the UX decay banners have created on the internet as a whole. Let's not, 10 years from now, laugh about how any sufficiently motivated teenager can scan their parent's phone while they're asleep, or pay some deadbeat 18-year-old to use their ID, and bypass any verification system, while simultaneously furthering the stranglehold large corporations have over the internet.

    • Noaidi3 hours ago
      Whatever happened to all the innovation the tech world was capable of? This is 100% a solvable problem. It only needs the will and good law.

      1) Person signs up with discord with fake name and fake email.

      2) Discord asks (state system) for an age validation.

      3) In a pop-up window, the state validates the person's age by matching their ID with face recognition.

      4) The state system sends a token to Discord with a yes or no, with zero data retention in the state's records.

      5) Discord takes action on the account.

      What is so hard about this?
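      A toy sketch of the state's yes/no token under those assumptions (the key handling, claim fields, and expiry are illustrative; a real deployment would use asymmetric signatures):

```python
import hmac, hashlib, json, time

STATE_KEY = b"state-demo-key"  # illustrative stand-in for the state's signing key

def state_validate(is_over_18: bool) -> str:
    """State side: after the ID + face check, emit only a boolean and a
    short expiry, retaining nothing in state records."""
    claim = json.dumps({"over_18": is_over_18, "exp": int(time.time()) + 300})
    mac = hmac.new(STATE_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}|{mac}"

def discord_check(token: str) -> bool:
    """Service side: verify the signature and expiry, then act on yes/no."""
    claim, mac = token.rsplit("|", 1)
    ok = hmac.compare_digest(
        mac, hmac.new(STATE_KEY, claim.encode(), hashlib.sha256).hexdigest())
    data = json.loads(claim)
    return bool(ok and data["over_18"] and data["exp"] > time.time())

print(discord_check(state_validate(True)))   # -> True
print(discord_check(state_validate(False)))  # -> False
```

      The hard parts are operational, not cryptographic: making the token unlinkable and unphishable, and getting the state to actually retain nothing.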

      • tolmasky2 hours ago
        I want to sincerely ask whether you read my post, because your response is so unrelated I believe you might accidentally be responding to another post? If so, please ignore the rest, which is only intended in the case where you are actually responding to what I wrote.

        Your system seems to address none of the issues I listed. For example, I argue that one difficulty is the fact that these systems would be highly phishable -- a property that is present in your described "easy" solution. Your system trains users to become accustomed to being pestered by pop-up windows that ask to see their ID and use their camera. Congrats: I can now trivially make a pop-up window that looks like this UI and use it to steal your info, as the user will just respond on autopilot, as we have repeatedly shown both in user studies and in our own lived experiences. I also explained how a system like this would assist the practice of trapping migrant workers by confiscating their government credentials [1]. This is a huge problem today in Asia, and one of the few outlets captive workers can use to escape this control is the internet -- a "loophole" your system would dutifully close for these corporations.

        I am happy to have a discussion about this -- it's how we come up with new solutions! But that requires reading and responding to the concerns I brought up, not assuming that my issue is that I can't imagine implementing a glorified OAuth login flow.

        1. There's tons of articles about this, here is one of the first ones that comes up on Google: https://www.amnesty.org/en/latest/news/2025/05/saudi-arabia-...

  • b82 hours ago
    Hence why Illinois has already made it illegal.
  • DeathArrow3 hours ago
    I wonder how much time we have before being asked to enter the government issued ID in a card reader so websites can read age and biometric data from the chip.
  • dark-staran hour ago
    I don't see why platforms would have to store the data indefinitely.

    Once you are verified, you just flip a bit "verified" in the database and delete all identification data.

    No reason to store the data indefinitely.

  • rglover2 hours ago
    An OIDC-type solution, but for parents, might work here.

    Basically, kids can sign up for an account, triggering a notification to parents. The parent either approves or rejects it, and can revoke on demand and see their kids' login usage across apps/services. This gets parental restrictions into the login flow without making it a PITA.
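
    A toy sketch of that flow; the ParentGate class and its method names are invented for illustration (a real version would be an OIDC extension, not an in-memory dict):

```python
class ParentGate:
    """Hypothetical parent-approval broker for kids' logins."""

    def __init__(self):
        self.pending = {}      # child -> service awaiting a decision
        self.approved = set()  # (child, service) pairs a parent allowed

    def child_signup(self, child: str, service: str) -> None:
        # signup triggers a notification to the parent
        self.pending[child] = service

    def parent_decide(self, child: str, approve: bool) -> None:
        service = self.pending.pop(child)
        if approve:
            self.approved.add((child, service))

    def revoke(self, child: str, service: str) -> None:
        # parents can revoke on demand
        self.approved.discard((child, service))

    def can_login(self, child: str, service: str) -> bool:
        return (child, service) in self.approved
```

    Login usage visibility would then just be an audit log on can_login calls.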

  • miss_haru3 hours ago
    parents: won't somebody else put some rules and safeguards in place to protect my children?
    • logicchains35 minutes ago
      Most of them probably don't even have kids of their own, they just hate social media for exposing children to conservative ideas and want to ban it to prevent that.
  • alvatar2 hours ago
    zero knowledge cryptography solves this
  • djohnston2 hours ago
    (thats the point)
  • publicdebates3 hours ago
    Isn't this the same debate as airports post 9/11, whether you can have both privacy and security? Seems conclusive, no.
  • cess114 hours ago
    My main takeaway from this is that politicians seem to have given up on making "social media" less harmful by regulating it, and instead focus on gatekeeping access, with the added perk of supplying security services and ad tyrants with yet another data pump.
  • ck24 hours ago
    if you are paying for internet access you have to be over 18, no?

    and if you have internet access without paying, that means someone else is legally responsible for your access

    "problem solved" ?

    • malfist4 hours ago
      Famously, children can only access the internet from wifi paid for by their parents.

      I'm not for these draconian age verification nonsense, but this isn't a valid argument.

      • bondarchuk4 hours ago
        It is a valid alternative avenue towards a legal implementation of "child safeguarding" IMO. Someone pays for the internet, that person is responsible for what minors do on their connection. If they have trouble doing that we can use normal societal mechanisms like idk social services, education, and government messaging.

        This is the way it works with e.g. alcohol and cigarettes, most places. Famously kids can just get a beer from a random fridge and chug it, but someone 16/18/21+ will be responsible and everyone seems mostly fine with this.

        • nazgulsenpai3 hours ago
          If protecting children were the actual intended outcome, this would have been the logical way to do it. Since it isn't what they're actually doing, instead using personally identifiable information to establish your age, we can only assume it's an attempt to deanonymize the internet.
        • alt2272 hours ago
          This will never work.

          I regularly talk to other parents at the school gates who have no idea that permissions on mobiles even exist, let alone that they can choose what they let each app have access to.

          The general public just doesn't care.

          • bondarchukan hour ago
            Yes, it's hard work to build a society where people behave responsibly and in their best interests. But I'd prefer we actually put in the effort rather than go for the easy authoritarian option out of basically laziness and contempt for your fellow man.

            (fwiw I regularly talk to parents who are quite aware of various parental controls and use them effectively, combined with talking to their kids and just general good parenting practices)

    • moritonal4 hours ago
      This is the answer. If you provide internet access to someone, you're responsible for it. It's generally established law from a torrenting PoV, so isn't it equally applicable to accessing content unsuitable for children? Sure, it'll destroy offering free wifi, but that was always tricky from a legal PoV around responsibilities.
    • gpderetta3 hours ago
      Ideally the law would require websites (and apps) to provide some signed age-requirement token to the client (plus possibly a classification), instead of the reverse. Similarly, OSes and web clients should be required to provide locked-down modes where the maximum age and/or classification can be selected. As a parent I would then be able to set up my child's device however I wish without loss of privacy.

      Is it bypassable by a sufficiently determined child? Yes, but so it is the current age verification nonsense.
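
      A sketch of that inverted flow (the field names and the demo HMAC key are assumptions; in practice a regulator-chartered authority would sign the labels):

```python
import hashlib
import hmac
import json

LABEL_KEY = b"demo-key"  # stand-in for a real signing authority

def site_age_label(min_age: int, classification: str) -> str:
    """The site declares, and signs, its own age requirement."""
    payload = json.dumps({"min_age": min_age, "class": classification})
    sig = hmac.new(LABEL_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def device_allows(label: str, child_age: int) -> bool:
    """Client-side check on a parent-configured device.
    No identity or age data ever leaves the device."""
    payload, sig = label.rsplit(".", 1)
    expected = hmac.new(LABEL_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # unlabeled or tampered content is blocked
    return json.loads(payload)["min_age"] <= child_age
```

      The privacy win is that the one piece of sensitive data (the child's age) stays in a local setting the parent controls.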

    • daveoc644 hours ago
      > if you are paying for internet access you have to be over 18, no?

      No, that's not the case.

      • john_strinlai3 hours ago
        every contract by every ISP i have ever signed has required me to be over the age of 18 to enter the contract.
        • daveoc643 hours ago
          In many countries, it's possible to get a prepaid SIM with data access - without any ID or age requirement whatsoever.
          • john_strinlai3 hours ago
            ah, fair, but with an easy enough fix. make data-enabled SIM cards be 18+ (or whatever age). show ID to the store clerk at purchase time, just like if you were buying smokes/alcohol.
            • bennyp1013 hours ago
              And then how does public wifi work? Stand outside a Wetherspoons, or just walk down a high street with free internet, and you're back to square one.
              • john_strinlai3 hours ago
                seems dead simple to me: if you host public wifi, you are responsible for the people that use it. easy!

                just like you already are responsible for what happens on your free public network (torrenting, hacking, CSAM, etc.) in most jurisdictions

                (for what it's worth, i think age verification is dumb. but it looks like we're getting it one way or the other)

    • sidewndr463 hours ago
      Unless your kid never goes to public school, that isn't true.
      • zobzu3 hours ago
        or goes outside at all. free wifi is everywhere
        • alt2272 hours ago
          Free wifi generally is everywhere; however, it is often heavily filtered and firewalled to stop people doing things the network owner wouldn't approve of.
  • CrzyLngPwd4 hours ago
    It's just another way to surveil the population and won't cause any real problems for anyone who can work around it.
  • simion3142 hours ago
    Big Tech refused to work together to implement an age flag that parents could set up on their children's devices, so now we get each European country and each US state with their own special rules.
  • matthewmorgan3 hours ago
    That was the goal.
  • Devasta3 hours ago
    I can understand the need to restrict some of the stuff kids can see. When I was a teen it took me hours and hours to download one 2-minute porn clip from Kazaa, but these days you could download a lifetime's worth in one weekend. That can't be healthy.

    That being said nothing about these laws is about protecting children; their primary purpose is to crack down on the next Just Stop Oil or Palestine Action so for that reason should be opposed.

    • simoncionan hour ago
      > ...but these days you could download a lifetime worth in one weekend.

      Uh. I could check in the back of my parents' closet (hidden under some fabric) for at least a decade's worth of dirty magazines. It's true that that's less than a lifetime's worth of pictures and articles, but I'd say that that's effectively equivalent.

      > That can't be healthy.

      The only thing that's unhealthy is not being able to talk frankly and honestly about sex and sexuality with your peers, parents, and other important adults in your life. Well, that and never being told that sex leads to pregnancy, or how to recognize common STDs... but you're likely to get that "for free" if you're able to talk frankly and honestly about sex and sexuality.

  • redog3 hours ago
    It's to continue the culture of bullying and lack-of-accountability by and for the perversely rich oligarchy.

    For you'll need to be accounted while they do the counting.

  • scotty793 hours ago
    If government is concerned shouldn't government just deliver auth based on birth certificate for everyone to use?
  • DeathArrow3 hours ago
    In most countries it is illegal for small children to drive or to use firearms. And it's their parents' job to not let them.

    Instead of requiring IDs, we should let parents manage what their children do online.

  • DeathArrow3 hours ago
    30 years of internet were possible with relative freedom, without spying and surveillance. All of a sudden it's not possible.

    Governments recycle the "Think of the children" mantra, and once again they are after terrorists and bad guys.

    • xinayder3 hours ago
      Mandatory age check is not going to reduce the number of criminals online. Period.

      We should focus on teaching parents how to educate their children properly, and teach children how to safely browse the internet and how to avoid common scams and pitfalls.

      I played Roblox when I was a teenager, and my aunt constantly told me to be careful of who I talked to online, as they could be a pedo. Even though there wasn't constant monitoring from my parents or family, her words were repeated so often that back then I actually thought five times before sharing any kind of personal information online.

    • Attrecomet2 hours ago
      >Governments recycle "Think of the children" mantra and they are again after terrorists and bad guys.

      nope, they are going after dissenters, not bad guys. It's how it always ends up.

  • Noaidi3 hours ago
    I have a problem with an open internet and allowing open access to everything the internet can offer to young children.

    It cannot be a friction-less experience. Allowing children to see gore and extreme porn at a young age is not healthy. And then we have all the "trading" platforms (gambling).

    Even though my brothers were able to get many hard drugs when I was young, around 1977, there was a lot of friction: finding a dealer, trusting them, etc. Some bars would not card us, but even then there was risk and sometimes they got caught. In NY we could buy cigarettes with no friction, and cigarettes were the one drug I took when I was young: addicted at 16, finally quitting for good at 20. I could have used some friction there.

    So how do we create friction? Maybe hold the parents liable? They are doing this with guns right now; a big trial is just finishing, and it looks like a father who gave his kid an AK-47 at 13 is about to go to jail.

    I would like to see a state ID program where the ID is verified only by the state ID system. This way nothing needs to be sent to any private party. Sites like Discord would just get an OK signal from the state system. They could use facial recognition on the phone to match the user with the ID.

    Something needs to be done however. I disagree that the internet needs to be open to all at any age. You do not need an ID to walk into a library, but you need one to get into a strip club. I do not see why that should not be the same on the internet.

  • scythean hour ago
    It's crazy to me that we want to force age verification on every service across the Internet before we ban phones in school. I could understand being in favor of both, or neither, but implementing the policy that impacts everybody's privacy before the one that specifically applies within government-run institutions is just so disappointingly backwards it's tempting to consider conspiracy-like explanations.

    The advantage, I think, of age verification by private companies over cellphone bans in public schools is that cellphone bans appear as a line-item on the government balance sheet, whereas the costs of age verification are diffuse and difficult to calculate. It's actually quite common for governments to prefer imposing costs in ways that make it easier for the legislators to throw up their hands and whistle innocently about why everything just got more expensive and difficult.

    And the argument over age verification for merely viewing websites, which is technically difficult and invasive, muddies the waters over the question of age verification for social media profiles, where underage users are more likely to get caught and banned by simple observation. The latter system has already existed for decades -- I remember kids getting banned for admitting they were under 13 on videogame forums in the '00s all the time. It seems like technology has caused people to believe that the law has to be perfectly enforceable in order to be any good, but that isn't historically how the law has worked -- it is possible for most crimes to go unsolved and yet most criminals get caught. If we are going to preserve individual privacy and due process, we need to be willing to design imperfect systems.

  • light_hue_12 hours ago
    As a parent, I'm happy that social bans are finally a thing.

    But I don't get the approach. It's not like social media starts being a positive in our lives at 20. The way these companies do social media is harmful to mental health at every age. This is solving the wrong problem.

    The solution is to take away their levers to make the system so addictive. A nice space to keep in touch with your friends. Nothing wrong with that.

  • 2duct2 hours ago
    I'm going to state that at one point I was one of the young people this kind of legislation is meaning to protect. I was exposed to pornography at too young an age and it became my only coping mechanism to the point where as an adult it cost me multiple jobs and at one point my love life.

    I don't think this legislation would have helped me. I found the material I did outside of social media and Facebook was not yet ubiquitous. I did not have a smartphone at the time, only a PC. I stayed off social media entirely in college. Even with nobody at all in my social sphere, it was still addicting. There are too many sites out there that won't comply and I was too technically savvy to not attempt to bypass any guardrails.

    The issue in my case was not one of "watching this material hurt me" in and of itself. It was having nobody to talk to about the issues causing my addiction. My parents were conservative and narcissistic and did not respect my privacy so I never talked about my addiction to them. They already punished me severely for mundane things and I did not want to be willingly subjected to more. To this day they don't realize what happened to me. The unending mental abuse caused me to turn back to pornography over and over. And I carried a level of shame and disgust so I never felt comfortable disclosing my addiction to any school counselors or therapists for decades. The stigma around sexual issues preventing people from talking about them has only grown worse in the ensuing years, unfortunately.

    At most this kind of policy will force teenagers off platforms like Discord which might help with being matched with strangers, but there are still other avenues for this. You cannot prevent children from viewing porn online. You cannot lock down the entire Internet. You can only be honest with your children and not blame or reproach them for the issues they have to deal with like mine did.

    In my opinion, because my parents were fundamentally unsafe people to talk to, which caused me to think that all people were unsafe, pornography exposure became a problem. In my case, I do not believe there was any hope that additional legislation or restrictions could provide, outside of waking up to my abuse and my sex addiction as an adult decades later. Simply put, I was put into an impossible situation, I didn't have any way to deal with it as a child, and I was ultimately forsaken. In life, things like that just happen sometimes. All I can say is that those who forsook me were not the platforms, not the politicians, but the people I needed to trust the most.

    I believe many parents who need to think about this issue simply won't. The debate we're having here on this tech-focused site is going to pass by them unnoticed. They're not going to seriously consider these issues and the status quo will continue. They won't talk with their children to see if everything's okay. I don't have many suggestions to offer except "find your best family," even if they aren't blood related.

  • john_strinlai4 hours ago
    [dead]
  • Tr3nton3 hours ago
    [dead]
  • yde_java2 hours ago
    [dead]
  • TZubiri3 hours ago
    >"None of this is an argument against protecting children online. It is an argument against pretending there is no tradeoff"

    Tradeoff acknowledged, and it cuts both ways; there are hundreds of risks that these policies are addressing.

    To mention a specific one: I was exposed to pornography online at age 9, which is obviously an issue; the incumbent system allowed this to happen and will continue to do so. So which policy tradeoffs do detractors of age verification think are so terrible that avoiding them matters more than preventing, for example, kids' first sexual experiences being pornography? Dystopian vibes? Is that equivalent?

    Or, what alternative solutions are counter-proposed to avoid these issues without age verification and vpn bans.

    Note 2 things before responding:

    1) Per the original quote, it is not valid to ignore the tradeoffs with arguments like "child abuse is an excuse to install civilian control by governments"

    2) This was not your initiative; another group is the one making huge efforts to intervene and change the status quo, so whatever solution is counter-proposed needs to be new. Otherwise, as an existing solution, it has already proven ineffective.

    If any of those is your argument, you are not part of the conversation, you have failed to act as wardens of the internet, and whatever systems you control will be slowly removed from you by authorities and technical professionals that follow the regulations. Whatever crumbs you are left with as an admin will be relegated to increasingly niche crypto communities where you will be pooled with dissidents and criminals of types you will need to either ignore or pretend are ok. You will create a new Tor, a Gab, a Conservapedia, a HackerForums, and you will be hunted by the obvious and unequivocal right side of the law. Your enemy list will grow bigger and bigger: the State? Money? The law? God? The notion of right and wrong, which is like totally subjective anyways?

    • alt2272 hours ago
      > I was exposed to pornography online at age 9....allowing kids first sexual experiences to be pornography

      I was initially exposed to pornography at 8 years old, by finding a discarded magazine in a hedge. However, this was pretty soft.

      I was exposed to serious pornography at 10 by finding a hidden VHS tape in the back of a drawer at a friend's house and getting curious. This was hardcore German stuff with explicit violence, and it has led to therapy later in my life.

      This was all in the 80s by the way.

      Therefore anything you are mentioning happened long before the internet, and is totally possible in a completely offline world as well. So how do these new digital laws 'protect children' again?

  • 2OEH8eoCRo04 hours ago
    Fuck data privacy, what privacy? Your ISP knows you, sites track you, cookies track you. It's a myth. But oh, we totally can't figure out age verification. Fuck off, I don't buy it.
  • callamdelaney3 hours ago
    We should just ban smartphones, it's where a great deal of the harm comes from and is harder for parents to manage. No need for children to have cameras connected to the internet whether via smartphones or computers.
  • infotainment3 hours ago
    Device-based attestation largely seems like the way to go; it doesn't solve the problem, but it's good enough that it would cover most cases.
    • alt2272 hours ago
      Not really. It just pushes the responsibility onto parents, who already have no idea how security works or what their kids are doing on their phones.
  • anon_shill3 hours ago
    From the second paragraph:

    > And the only way to prove that you checked is to keep the data indefinitely.

    This is not true and made me immediately stop reading. If a social media app uses a third-party vendor to do facial/ID age estimation, the vendor can (and in many cases does) send only an estimated age range back to the caller. Some of the more privacy-invasive KYC vendors like Persona persist and optionally pass back entire government IDs, but there are other age verifiers (k-ID, PRIVO, among others) who don't. Regulators are happy with apps using these less invasive ones and making a best effort based on an estimated age, and that doesn't require storing any additional PII. We really need to stop conflating age verification with KYC to have productive conversations about this stuff. You can do one thing without doing the other.
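
    As a sketch of that less invasive shape (the payload and handler are hypothetical, not the actual k-ID or PRIVO API):

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    """What a privacy-preserving vendor sends back: a band, nothing else."""
    band: str  # e.g. "13-15", "18+" -- no name, no birthdate, no ID image

def handle_vendor_callback(user_id: str, estimate: AgeEstimate, db: dict) -> None:
    # persist the verdict, not the evidence
    db[user_id] = {"age_checked": True, "adult": estimate.band == "18+"}

db = {}
handle_vendor_callback("u1", AgeEstimate(band="18+"), db)
print(db["u1"])  # {'age_checked': True, 'adult': True}
```

    The app's database ends up holding two booleans per user, which is all a regulator's "best effort" standard asks for.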

    • 0x000xca0xfe3 hours ago
      If you don't keep and cross-reference documents it is really easy to circumvent, e.g. by kids asking their older siblings to sign them up.

      I don't think a bulletproof age verification system can be implemented on the server side without serious privacy implications. It would be quite easy to build it on the client side (child mode) but the ones pushing for these systems (usually politicians) don't seem to care about that.

      • anon_shill2 hours ago
        Yep, it is easy to circumvent, and the silver lining of all of this is that regulators don't care. They care that these companies made an effort in guessing.