5 points by scrlk 6 hours ago | 2 comments
  • iamnothere 6 hours ago
    > Ofcom, the UK’s communications regulator, praised Apple for the decision, especially since it’s not required to implement age verification for the iOS or its App Store under the region’s Online Safety Act.

    Oh look, private companies can solve problems without legislation after all.

    Unfortunately, rather than allowing the market to respond to this, it will probably just be used as evidence that regulation is possible, and legislators will attempt to codify it as a requirement for all platforms.

  • miohtama 6 hours ago
    Here is the actual legal text:

    The Online Safety Act 2023

    The Online Safety Act 2023 (the Act) protects children and adults online. It puts a range of new duties on social media companies and search services, giving them legal duties to protect their users from illegal content and content harmful to children. The Act gives providers new duties to implement systems and processes to reduce risks their services are used for illegal activity, and to take down illegal content when it does appear.

    Illegal Content

    As of 17 March 2025, platforms have a legal duty to protect their users from illegal content online. Ofcom are actively enforcing these duties and have opened several enforcement programmes to monitor compliance.

    https://www.gov.uk/government/collections/online-safety-act

    Also in other news, the UK has found its non-crime hate incident system (still police records; think of it as a crime record under a different name) troublesome:

    https://x.com/Telegraph/status/2036526043893289279