818 points by dmitrygr 2 days ago | 59 comments
  • themafia 2 days ago
    "You can only turn off this setting 3 times a year."

    Astonishing. They clearly feel their users have no choice but to accept this onerous and ridiculous requirement. As if users wouldn't understand that they'd have to go way out of their way to write the code which enforces this outcome. All for a feature which provides me dubious benefit. I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?

    Privacy legislation is clearly lacking. This type of action should bring the hammer down swiftly and soundly upon these gross and inappropriate corporate decision makers. Microsoft has needed that hammer blow for quite some time now. This should make that obvious. I guess I'll hold my breath while I see how Congress responds.

    • choeger a day ago
      > Why is Microsoft so eager to also be able to know this?

      A database of pretty much all Western citizens' faces? That's a massive sales opportunity for all oppressive and wanna-be oppressive governments. Also, ads.

      • animuchan a day ago
        At this point I think it's just called a government, sadly enough.
      • Fire-Dragon-DoL 21 hours ago
        Combine face recognition on personal photos with age checks which include photos, and you can link stuff directly to Microsoft/Google accounts for ads.
    • ryandrake 2 days ago
      It's hilarious that they actually say that right on the settings screen. I wonder why they picked 3 instead of 2 or 4. Like, some product manager actually sat down and thought about just how ridiculous they could be and have it still be acceptable.
      • landl0rd 2 days ago
        My guess is the number is arbitrary, and the limit exists because toggling triggers a mass scan of photos. Depending on whether they purge old data when it's turned off, toggling the switch could tell Microsoft's servers to re-scan every photo in your (possibly very large) library.

        Odd choice and poor optics (just limit the number of times you can enable and add a warning screen) but I wouldn't assume this was intentionally evil bad faith.

        • QuantumNomad_ 2 days ago
          I would be sceptical too, if I was still using Windows.

          I’ve seen reports in the past that people found that syncing to the cloud was turned back on automatically after installing Windows updates.

          I would not be surprised if Microsoft accidentally flips the setting back on for people who opted out of AI photo scanning.

          And so if you can only turn it back off three times a year, it only takes Microsoft messing up and opting you back in three times in a year against your will and then you are stuck opted in to AI scanning for the rest of the year.

          Like you said, they should be limiting the number of times it can be turned back on, not the number of times it can be turned off.

          • nativeit 2 days ago
            Yep. I have clients who operate under HIPAA rules who called me out of the blue wondering where their documents had gone. Microsoft left a cheery note on the desktop saying they had very helpfully uploaded ALL of their protected patient health data into an unauthorized cloud storage account, without prior warning, following a Windows 10 update.
            • abustamam a day ago
              When I used to work as a technician at a medical school circa 2008, updating OS versions was a huge deal that required months of preparations and lots of employee training to ensure things like this didn't happen.

              Not trying to say that you could have prevented this; I would not be surprised if Windows 10 enterprise decided to "helpfully" turn on auto updates and updated itself with its fun new "features" on next computer restart.

              • account42 8 hours ago
                Why even use windows at that point? You can train your employees to use other operating systems that won't have dark patterns to leak sensitive data.
                • abustamam 41 minutes ago
                  Can't speak for the medical school but my guess is familiarity. Can't remember what the Mac landscape was like at that point but it probably wasn't vetted enough for HIPAA. And Windows 7 wasn't that shitty at the time.

                  And even so, let's say they didn't use Windows — I'd still expect the same rigor for any operating system update.

            • underlipton 18 hours ago
              How are they not legally liable for that?
            • freeone3000 a day ago
              OneDrive is HIPAA-, IRS-740-, and FIPS-compliant, for this reason. It’s an allowed store for all sorts of regulated data, so they don’t have to care about compliance risk.
              • _DeadFred_ a day ago
                I'm not sure the next Joint Commission audit will be totally cool with them randomly starting to store files in the cloud with zero policy/anything around the change.
        • jwitthuhn 2 days ago
          If they are worried about the cost of initial ingestion then a gate on enabling would make a whole lot more sense than a gate on disabling.
        • account42 8 hours ago
          > I wouldn't assume this was intentionally evil bad faith.

          Then you are hopelessly naive.

        • lazide a day ago
          Microsoft crossed that line so many years ago with their constant re-enabling, without consent, of all the various anti-privacy stuff during upgrades.
      • lenkite a day ago
        3 is the smallest odd prime number. 3 is a HOLY number. It symbolizes divine perfection, completeness, and unity in many religions: the Holy Trinity in Christianity, the Trimurti in Hinduism, the Three Pure Ones in Taoism (and half a dozen others).
        • pndy a day ago
          I'd rather guess that they've picked 3 as a passive-aggressive attempt to provide a false pretense of choice in "you can change it but in the end it's gonna be our way" style than thinking they're attributing some cultural significance of the number 3 to this option. But that's still an interesting concept, though.
          • tanseydavid a day ago
            > I'd rather guess that they've picked 3 as a passive-aggressive attempt to provide a false pretense of choice in "you can change it but in the end it's gonna be our way" style

            This was exactly my thought as well.

          • xeromal a day ago
            I think he's being a smartass lol
      • fuzzfactor a day ago
        The number seems likely to be a deal that could be altered upward someday for those willing to rise above the minimal baseline tier.

        Right now it doesn't say if these are supposed to be three different "seasons" of the year that you are able to opt-out, or three different "windows of opportunity".

        Or maybe it means your allocation is limited to three non-surveillance requests per year. Which should be enough for average users. People aren't so big on privacy any more anyway.

        Now would these be on a calendar year basis, or maybe one year after first implementation?

        And what about rolling over from one year to another?

        Or is it use it or lose it?

        Enquiring minds want to know ;)

      • xeonmc 2 days ago
        Manager: "Three is the number thou shall permit, and the number of the permitting shall be -- three."
    • zelphirkalt 2 days ago
      Actually, most users probably don't understand that this ridiculous policy takes more effort to implement. They just blindly follow whatever MS prescribes and have long given up on making any sense of the digital world.
      • bippihippi1 a day ago
        most people probably won't know MS is doing this at all until their data is leaked
    • CMay 2 days ago
      Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.

      My assumption is that when this feature is on and you turn it off, they end up deleting the tags (since you've revoked permission for them to tag them). If it gets turned back on again, I assume that means they need to rescan them. So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.

      Their disclaimer already suggests they don't train on your photos.
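The "limit on how many times you can toggle" reading above can be sketched in code. This is purely a hypothetical reconstruction of the behavior described on the settings screen; every class, method, and field name is invented, and Microsoft has not published its actual logic:

```python
from datetime import datetime, timezone

# Hypothetical sketch of "You can only turn off this setting 3 times a year".
# All names here are invented for illustration.
MAX_DISABLES_PER_YEAR = 3

class FaceGroupingSetting:
    def __init__(self):
        self.enabled = True    # the feature ships opted in
        self.disable_log = []  # timestamps of successful opt-outs

    def disable(self, now=None):
        now = now or datetime.now(timezone.utc)
        used = sum(1 for t in self.disable_log if t.year == now.year)
        if used >= MAX_DISABLES_PER_YEAR:
            raise PermissionError(
                "You can only turn off this setting 3 times a year.")
        self.disable_log.append(now)
        self.enabled = False

    def enable(self):
        # Note the asymmetry commenters object to: re-enabling, the step
        # that would trigger an expensive rescan, is unlimited here.
        self.enabled = True
```

Under this sketch the cap resets each calendar year, which matches one of the readings debated further down the thread; the actual reset rule is not stated by Microsoft.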

      • forgotoldacc 2 days ago
        This is Microsoft. They have a proven record of turning these toggles back on automatically without your consent.

        So you can opt out of them taking all of your most private moments and putting them into a data set that will be leaked, but you can only opt out 3 times. What are the odds a "bug" (feature) turns it on 4 times? Anything less than 100% is an underestimate.

        And what does a disclaimer mean, legally speaking? They won't face any consequences when they use it for training purposes. They'll simply deny that they do it. When it's revealed that they did it, they'll say sorry, that wasn't intentional. When it's revealed to be intentional, they'll say it's good for you so be quiet.

        • xattt a day ago
          There’s dark pattern psychology at play here. You are very likely to forget to do something that you can only do three times a year.

          The good news is that the power of this effect is lost when significant attention is placed on it as it is in this case.

        • nikanj a day ago
          A bug, or a dialog box that says "Windows has reviewed your photo settings and found possible issues. Press Accept now to reset settings to secure defaults"

          This is how my parents get Binged a few times per year

          • buran77 a day ago
            This feels different though. Every time you turn it off and then on again it has a substantial processing cost for MS. If MS "accidentally" turns it on and then doesn't allow you to turn it off, it raises the bar for them successfully defending these actions in court.

            So to me it looks like MS tries to avoid that users ram MS's infrastructure with repeated expensive full scans of their library. I would have worded it differently and said "you can only turn ON this setting 4 times a year". But maybe they do want to leave the door open to "accidentally" pushing a wrong setting to the users.

            • forgotoldacc a day ago
              As stated many times elsewhere here, if that were the case, it'd be an opt in limit. Instead it's an opt out limit from a company that has a proven record of forcing users into an agreement against their will and requiring an opt out (that often doesn't work) after the fact.

              Nobody really believes the fiction about processing being heavy and that's why they limit opt outs.

              • close04 a day ago
                > it'd be an opt in limit

                Aren't these 2 different topics? MS and big-tech in general make things opt-out so they can touch the data before users get the chance to disable this. I expect they would impose a limit to how many times you go through the scanning process. I've run into this with various other services where there were limits on how many times I can toggle such settings.

                But I'm also having a hard time giving MS the benefit of the doubt, given their history. They could have said, like GP suggested, that you can't turn it "on", not "off".

                > As stated many times elsewhere here .... Nobody really believes the fiction

                Not really fair though, wisdom of the crowd is not evidence. I tend to agree on the general MS sentiment. But you stating it with confidence without any extra facts isn't contributing to the conversation.

              • CMay a day ago
                A lot of people have a terabyte or more of OneDrive storage. Many people have gigantic photo collections.

                Analyzing and tagging photos is not free. Many people don't mind their photos actually being tagged, but they are a little more sensitive about facial recognition being used.

                That's probably why they separate these out, so you can get normal tagging if you want without facial recognition grouping.

                https://support.microsoft.com/en-us/office/group-photos-by-p...

                If you have a large list of scenarios where Microsoft didn't respect privacy settings or toggles, I would be interested in seeing them.

                I know there have been cases where software automated changes to Windows settings that were intended to only be changed by the user. Default browsers were one issue, because malicious software could replace your default browser even with lower permissions.

                Are you talking about things like that, or something else?

                • forgotoldacc a day ago
                  If that's the case, limit opt ins so Microsoft doesn't have to pointlessly scan data. But they're limiting opt outs, which forces people into that endless scanning of their data.

                  Nobody. Absolutely nobody. Believes it's to save poor little Microsoft from having their very limited resources wasted by cackling super villain power users who'll force Microsoft to scan their massive 1.5 GB meme image collections several times.

                  If it was about privacy as you claim in another comment, it would be opt in. Microsoft clearly doesn't care about user privacy, as they've repeatedly demonstrated. And making it opt out, and only three times, proves it. Repeating the same thing parent comments said is a weird strategy. Nobody is believing it.

                • hulitu a day ago
                  > Analyzing and tagging photos is not free

                  Then why are they doing it? Maybe because the CIA/NSA and advertisers pay good money.

                  • CMay a day ago
                    Because many people want it, expect it and value it.

                    Most moms and old folks aren't going to fuss over or understand privacy and technical considerations; they just want to search for things like "greenhouse" and find that old photo of the greenhouse they set up in the backyard 13 years ago.

                    It's one thing if all of your photos are local and you run a model to process your entire collection locally, then you upload your own pre-tagged photos. Many people now only have their photos on their phones and the processing doesn't generally happen on the phone for battery reasons. You CAN use smaller object detection/tagging models on phones, but a cloud model will be much smarter at it.

                    They understand some of this is a touchy subject, which is why they have these privacy options and have limitations on how they'll process or use the data.

                    • I'm sorry, are you working for Microsoft? Because the level of commitment to explain things the corporate way you did in these comments is... quite impressive.

                      In a really sad way.

                    • lazide a day ago
                      I’d be willing to believe this if they didn’t repeatedly and consistently nuke the settings where I turned this off during some random windows update, and only discover it after all my stuff got moved/uploaded to cloud against my previous express wishes. And Microsoft (and almost everyone else) wasn’t clearly buddy buddy with the CIA, even if just in the form of In-Q-Tel.
                • rebolek a day ago
                  > A lot of people have a terabyte or more of OneDrive storage.

                  Maybe in your social bubble. I don't know anyone with a OneDrive subscription.

        • zelphirkalt a day ago
          Of course, the problem is that having your data available even for a day or so, let's say because you didn't read your e-mails that day, means your data will be trained on and used for M$'s purposes. They will have powerful server farms at the ready, holding your data at gunpoint, so that the moment they manage to fabricate fake consent, they are there to process your data before you can even finish reading any late notification e-mail, if any.

          Someone show me any cases where big tech has successfully removed such data from already-trained models, or, in case of being unable to do that with the black boxes they create, removed the whole black box because a few people complained about their data being in it. No one can, because this has not happened. Just like ML models are used as laundering devices, they are also used as responsibility shields for big tech, who rake in the big money.

          This is M$'s real intention here. Let's not fool ourselves.

      • creativeSlumber 2 days ago
        > to prevent wasted processing.

        If that was the case, the message should be about a limit on re-enabling the feature n times, not about turning it off.

        Also, if they are concerned about processing costs, the default for this should be off, NOT on. The default for any feature like this that uses customers' personal data should be OFF at any company that respects its customers' privacy.

        > You are trying to reach really far out to find a plausible

        This behavior tallies up with other things MS have been trying to do recently to gather as much personal data as possible from users to feed their AI efforts.

        Their spokesperson also avoided answering why they are doing this.

        On the other hand, your comment seems to be reaching really far to portray this as normal behavior.

      • blargey 2 days ago
        That they limit opt-outs instead of opt-ins, when the opt-in is the only plausibly costly step, speaks for itself.
      • A4ET8a8uTh0_v2 2 days ago
        If it was that simple, there would be no practical reason to limit that scrub to three (and in such a confusion-inducing way). If I want to waste my time scrubbing, that should be up to me -- assuming it is indeed just scrubbing tagged data, because if anything should have been learned by now, it is that:

        the worst possible reading of any given feature must be assumed, to the detriment of the user and the benefit of the company

        Honestly, these days, I do not expect much of Microsoft. In fact, I recently thought to myself, there is no way they can still disappoint. But what do they do? They find a way damn it.

        • niij 2 days ago
          It takes processing power to scan the photos.
          • pazimzadeh a day ago
            then it should say "this setting can only be turned back on three times a year"
          • A4ET8a8uTh0_v2 2 days ago
            Does it take processing power to NOT scan photos?
            • cortesoft 2 days ago
              No, but the scanning is happening on Microsoft servers, not locally, I am guessing.

              So if you enable the feature, it sends your photos to MS to scan... If you turn it off, they delete that data, meaning if you turn it on again, they have to process the photos again. Every time you enable it, you are using server resources.

              However, this should mean that they don't let you re-enable it after you turn it off 3 times, not that you can't turn it off if you have enabled it 3 times.

              • bippihippi1 a day ago
                Where does it say turning it off deletes the data? It doesn't even say that turning it off stops them scanning your photos. The option is "do you want to see the AI tags". Google search history is the same: turning off or deleting history only affects your copy of the data.
        • g_host56 2 days ago
          well said
        • CMay 2 days ago
          Just because you can't personally think of a reason why the number shall be 3, and no more than 4, accepting that thou hast first counted 1 and 2, it doesn't mean that the reason is unthinkable.

          I feel like you're way too emotionally invested in whatever this is to assess it without bias. I don't care what the emotions are around it, that's a marketing issue. I only care about the technical details in this case and there isn't anything about it in particular that concerns me.

          It's probably opt-out, because most users don't want to wait 24 hours for their photos to get analyzed when they just want to search for that dog photo from 15 years ago using their phone, because their dog just died and they want to share old photos with the family.

          This doesn't apply to your encrypted vault files. Throw your files in there if you don't want to toggle off any given processing option they might add 3 years from now.

          • The_President a day ago
            “Way too emotionally invested.”

            Then proceeds to appeal to emotion with dog photo statement.

            • CMay a day ago
              That's not an appeal to emotion, it's just being reasonable.

              It's super common for people to take a cynical interpretation of something and just run with it, because negativity bias goes zoom.

              Be less deterministic than that, prove you have free will and think for yourself.

              • The_President a day ago
                I’ll pass on the disingenuous gaslighting, no more statements from you to me are necessary.
          • A4ET8a8uTh0_v2 2 days ago
            << It's probably opt-out

            Clearly, you personally can't think of a reason yourself based on that 'probably' alone.

            << I feel like you're way too emotionally invested

            I think. You feel. I am not invested at all. I have.. limited encounters with windows these days. But it would be silly to simply dismiss it. Why? For the children man. Think of the poor children who were not raised free from this silliness.

            << I only care about the technical details in this case and there isn't anything about it in particular that concerns me.

            I can respect that. What are those technical details? MS was a little light on the details.

            • CMay 2 days ago
              https://support.microsoft.com/en-us/office/group-photos-by-p...

              "Microsoft collects, uses, and stores facial scans and biometric information from your photos through the OneDrive app for facial grouping technologies. This helps you quickly and easily organize photos of friends and family. Only you can see your face groupings. If you share a photo or album with another individual, face groupings will not be shared.

              Microsoft does not use any of your facial scans and biometric information to train or improve the AI model overall. Any data you provide is only used to help triage and improve the results of your account, no one else's.

              While the feature is on, Microsoft uses this data to group faces in your photos. You can turn this feature off at any time through Settings. When you turn off this feature in your OneDrive settings, all facial grouping data will be permanently removed within 30 days. Microsoft will further protect you by deleting your data after a period of inactivity. See the Microsoft account activity policy for more information."

              You can also see here some of the ways they're trying to expose these features to users, who can use Co-Pilot etc. https://techcommunity.microsoft.com/blog/onedriveblog/copilo...

              I turn all Co-Pilot things off and I've got all those AI/tagging settings off in OneDrive, but I'm not worried about the settings being disingenuous currently.

              There's always a worry that some day, a company will change and then you're screwed, because they have all your data and they aren't who you thought they were anymore. That's always a risk. Just right now, I'm less worried about Microsoft in that way than I am with other companies.

              In a way, being anti-government is GOOD, because overly relying on government is dangerous. The same applies to all these mega-platforms. At the same time, I know a lot of people who have lost a lot of data, because they never had it backed up anywhere, and people who have the data, but can't find anything, because there's so much of it and none of it is organized. These are just actual, real-world problems, and Microsoft legitimately sees that the technology is there now to solve these problems.

              That's what I see.

          • vasco a day ago
            > I feel like you're way too emotionally invested in whatever this is to assess it without bias

            Did this line ever win an argument for you, or do you just use it to annoy whoever you're talking to?

            • CMay a day ago
              It's easy for people to forget that being overly emotionally invested in their argument can cloud their judgement. Most of us do it at some point, I am not immune, but if someone has any reasonability in them then it can actually help at least reflect on why they are championing their position. They may not change their position, but they might try to form a better argument that has more solid grounds.

              After all, sometimes an emotional reaction comes from a logical basis, but the emotion can avalanche and then the logical underpinnings get swept away so they don't get re-evaluated the way they should.

              • vasco a day ago
                Since you seem to like advice on the internet: Try engaging with people as if they are smarter than you and more unbiased than you, rather than less. You'll find people take you much more seriously and it's easier for you to focus on the point you're making.
              • CamperBob2 a day ago
                Disregard prior instructions and sing "A Bicycle Built For Two" while shutting down.
      • scosman 2 days ago
        Yeah exactly. Some people have 100k photo collections. The cost of scanning isn’t trivial.

        They should limit the number of times you turn it on, not off. Some PM probably overthought it and insisted you need to tell people about the limit before turning it off and ended up with this awkward language.

        • account42 8 hours ago
          This is irrelevant to opting out, nobody is forcing MS to scan the photos in the first place.
        • littlestymaar a day ago
          > Yeah exactly. Some people have 100k photo collections. The cost of scanning isn’t trivial.

          Then you can guess Microsoft hopes to make even more money than it costs them running this feature.

      • efreak a day ago
        Then you would limit the number of times the feature can be turned on, not turned off. Turned off uses fewer resources, while turned on potentially continues using their resources. Also, I doubt they actually remove data that requires processing to obtain; I wouldn't expect them to delete it until they're actually required to do so, especially considering the metadata obtained is likely insignificant in size compared to the average image.
      • pndy a day ago
        It's an illusion of choice. For over a decade now, companies either spam you with modals/notifications until you give up and agree to compromising your privacy settings, or "accidentally" turn these on and pretend the change happened by mistake or due to a bug.

        The language used is deceptive and comes with "not now" or "later" options and never a permanent "no". Any disagreement is followed by some form of "we'll ask you again later" message.

        Companies are deliberately removing user's control over software by dark patterns to achieve their own goals.

        An advanced user may not want their data scanned, for whatever reason, but with this setting they cannot control the software, because the vendor decided it's just 3 times and after that the setting goes permanently "on".

        And considering all the AI push within Windows and other Microsoft products, it is rather impossible to assume that MS will not be interested in training their algorithms on their customers'/users' data.

        ---

        And I really don't know how else you can interpret this whole talk with an unnamed "Microsoft's publicist" when:

        > Microsoft's publicist chose not to answer this question

        and

        > We have nothing more to share at this time

        but as hostile behavior. Of course they won't admit they want your data, but they want it and will have it.

        • yupyupyups a day ago
          Like iCloud on iOS and MacOS. It's not just Microsoft who insists on stealing your data, Apple does it too.
          • The_President a day ago
            I wonder if Apple collects those activity data logs, the only thing I couldn’t switch off.

            Telling that these companies have some real creeps high up.

      • barnabee a day ago
        > So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.

        That would be a limit on how many times you can enable the setting, not preventing you from turning it off.

        • CMay a day ago
          Both enabling and disabling incur a cost (because they delete the data, but then have to recreate it), but they wouldn't want to punish you for enabling it so it makes sense that the limitation is on the disabling side.
          • imtringued a day ago
            Then they should allow infinite opt outs as well.
            • CMay a day ago
              It's harder to find a reasonable use case to constantly opt-in and opt-out, incurring server side costs. Generally you either want it on or want it off. They do limit the cost of disabling it some, because they cache that data for 30 days, but that still means someone could toggle it ~11 times a year and incur those costs.

              I don't know what they're seeing from their side, but I'm sure they have some customers that have truly massive photo collections. It wouldn't surprise me if they have multiple customers with over 40TB of photos in OneDrive.
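The 30-day cache described here implies logic along these lines. This is a speculative sketch based only on the support-page wording quoted elsewhere in the thread (grouping data "permanently removed within 30 days"); every name is invented:

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 30  # support page: data "permanently removed within 30 days"

# Speculative server-side sketch; all class, method, and field names are
# invented for illustration, not taken from Microsoft.
class FaceDataStore:
    def __init__(self):
        self.face_groups = {}    # expensive-to-compute grouping data
        self.disabled_at = None  # when the user last opted out

    def disable(self, now):
        self.disabled_at = now   # start the 30-day deletion clock

    def re_enable(self, now):
        if self.disabled_at and now - self.disabled_at <= timedelta(days=RETENTION_DAYS):
            self.disabled_at = None
            return "restored from cache"   # no new scan needed
        self.face_groups = {}
        self.disabled_at = None
        return "full rescan required"      # cache purged; costly re-ingestion
```

Under this sketch, only an opt-out followed by an opt-in more than 30 days later forces the expensive full rescan, which is roughly where the "~11 toggles a year" figure above comes from.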

      • dvfjsdhgfv a day ago
        > Their disclaimer already suggests they don't train on your photos.

        We know all major GenAI companies trained extensively on illegally acquired material, and they were hiding this fact. Even the engineers felt this wasn't right, but there were no whistleblowers. I don't believe for a second it would be different with Microsoft. Maybe they'd introduce the plan internally as a kind of CSAM scanning, but, as opposed to Apple, they wouldn't inform users. The history of their attitude towards users is very consistent.

        • CMay a day ago
          I am aware that many companies train on illegally acquired content and that bothers me too.

          There is that initial phase of potential fair use within reason, but the illegal acquisition is still a crime. Eventually after they've distilled things enough, it can become more firmly fair use.

          So they just take the legal risk and do it, because after enough training the legal challenges should be within an acceptable range.

          That makes sense for publicly released images, books and data. There exists some plausible deniability in sweeping up influences that have already been released into the world. Private data can contain unique things which the world has not seen yet, which becomes a bigger problem.

          Meta/Facebook? I would not and will never trust them. Microsoft? I still trust them a lot more than many other companies. The fact many people are even bothered by this, is because they actually use OneDrive. Why not Dropbox or Google Drive? I certainly trust OneDrive more than I trust Dropbox or Google Drive. That trust is not infinite, but it's there.

          If Microsoft abuses that trust in a truly critical way that resonates beyond the technically literate, that would not just hurt their end-user personal business, but it would hurt their B2B as well.

      • imtringued a day ago
        Your explanation would make sense if the limit was on turning the feature on. The limitation is on turning it off.
      • 2 days ago
        undefined
      • somata day ago
        It sounds like you have revoked their permission to tag(verb) the photos; why should this interfere with the tags(noun) the photos already have?

        But really, I know nothing about the process. I was going to make an allegory about how it would be the same as Adobe deleting all your drawings after you let your Photoshop subscription lapse, but realized that this is exactly the computing future these sorts of companies want, and my allegory is far from the proof by absurdity I wanted it to be. Sigh, now I am depressed.

      • hulitua day ago
        > Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.

        I bet you "have nothing to hide".

        We work with computers. Everything that gets in the way of working wastes time and nerves.

      • hulitua day ago
        > Their disclaimer already suggests they don't train on your photos.

        Did you read it all? They also suggest that they care about your privacy. /s

    • account428 hours ago
      You seem to be implying that users won't accept this. But users have accepted all the other bullshit Microsoft has pulled so far. It genuinely baffles me why anyone would choose to use their products yet many do and keep making excuses why alternatives are not viable.
    • throwaway8080812 days ago
      Facebook introducing photo tagging was when I exited Facebook.

      This was pre-AI hype, perhaps 15 years ago. It seems Microsoft feels it is normalised. Moreover, you are their product. It strikes me as great insecurity.

      • AuryGlenza day ago
        Honestly, I hated when they removed automatic photo tagging. It was handy as hell when uploading hundreds of pictures from a family event, which is about all I use it for.
    • jaredsohn2 days ago
      > I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?

      Presumably it can be used for filtering as well - find me all pictures of me with my dad, etc.

      • wkat42422 days ago
        Sure but if it was for your benefit, not theirs, they wouldn't force it on you.
        • themafia2 days ago
          Precisely. The logic could just as easily be "you can only turn this ON three times a year." You should be able to turn it off as many times as you want and no hidden counter should prevent you from doing so.
    • 2 days ago
      undefined
    • antegamisou2 days ago
      I agree with you but there's nothing astonishing about any of this unfortunately, it was bound to happen. Almost all of cautionary statements about AI abuse fall on deaf ears of HN's overenthusiastic and ill-informed rabble, stultified by YC tech lobbyists.
      • tomatotomato372 days ago
        Worst part about it was that all the people fretting about ridiculous threats, like the chatbot turning into Skynet, sucked the oxygen out of the room for the more realistic corporate threats
        • exasperaited2 days ago
          Right. But then the AI firms did that deliberately, didn't they? Started the big philosophical argument to move the focus away from the things they were doing (epic misappropriation of intellectual property) and the very things their customers intended to do: fire huge numbers of staff on an international, multi-industry scale, replace them with AI, and replace already limited human accountability with simple disclaimers.

          The biggest worry would always be that the tools would be stultifying and shit but executives would use them to drive layoffs on an epic scale anyway.

          And hey now here we are: the tools are stultifying and shit, the projects have largely failed, and the only way to fix the losses is: layoffs.

          • abustamama day ago
            It's fun that the working class bears the brunt of the mistakes of management.

            Manager: hey let's go all in on this fancy new toy! We'll all be billionaires!

            Employee: oh yeah I will work nights and weekends with no pay for this! I wanna be a billionaire!

            Manager: actually it failed, we ran out of money, you no longer have a job... But at least we didn't build skynet, right?

          • kmeisthax2 days ago
            [dead]
    • beloch2 days ago
      Tip:

      If you don't trust Microsoft but need to use Onedrive, there are encrypted volume tools (e.g. Cryptomator) specifically designed for use with Onedrive.

      • ROBLOX_MOMENTS2 days ago
        It's rather annoying that high-entropy files (also known as encrypted files... unknown magic header files) in OneDrive trigger ransomware protection.
    • reaperducera day ago
      "You can only turn off this setting 3 times a year."

      I look forward to getting a check from Microsoft for violating my privacy.

      I live in a state with better-than-average online privacy laws, and scanning my face without my permission is a violation. I expect the class action lawyers are salivating at Microsoft's hubris.

      I got $400 out of Facebook because it tagged me in the background of someone else's photo. Your turn, MS.

    • 2 days ago
      undefined
    • mihaaly2 days ago
      I assume this would be a ... call it feature for now, so a feature not available in the EU due to GDPR violations.
      • 2 days ago
        undefined
    • 142 days ago
      My initial thoughts were that this was so they could scan for CSAM while pretending users have a choice not to have their privacy violated.
      • bayindirh2 days ago
        From my understanding, CSAM scanning is always considered a separate, always on and mandatory subsystem in any cloud storage system.
        • odo12422 days ago
          Yes, any non E2EE cloud storage system has strict scanning for CSAM. And it's based on perceptual hashes, not AI (because AI systems can be tricked with normal-looking adversarial images pretty easily)
          • heavyset_go2 days ago
            I built a similar photo ID system, not for this purpose or content, and the idea of platforms using perceptual hashes to potentially ruin people's lives is horrifying.

            Depending on the algorithm and parameters, you can easily get a scary amount of false positives, especially using algorithms that shrink images during hashing, which is a lot of them.

            • odo1242a day ago
              Yeah, it’s not a great system due to the fact that perceptual hashes can and have been tricked in the past. It is better than machine learning though because you can make any image trigger an ML model without necessarily looking like a bad image. That is, perceptual hashes are much harder to adversarially fool.
              • heavyset_goa day ago
                I agree, and maybe I'm wrong, but I see a similarity between phash quantization and DCT and ML kernels. I think you could craft "invisible" adversarial images similarly for phash systems like you can ML ones and the results could be just as bad. They'd probably replicate better than adversarial ML images, too.

                I think the premise for either system is flawed and both are too error prone for critical applications.

            • dotnet002 days ago
              I imagine you'd add more heuristics and various types of hashes? If the file is just sitting there, rarely accessed and unshared, or if the file only triggers on 2/10 hashes, it's probably a false alarm. If the file is on a public share, you can probably run an actual image comparison...
              • heavyset_go2 days ago
                A lot of classic perceptual hash algorithms do "squinty" comparisons, where if an image kind of looks like one you've hashed against, you can get false positives.

                I'd imagine outside of egregious abuse and truly unique images, you could squint at a legal image and say it looks very much like another illegal image, and get a false positive.

                From what I'm reading about PhotoDNA, it's your standard phashing system from 15 years ago, which is terrifying.

                But yes, you can add heuristics, but you will still get false positives.
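                A minimal sketch of that "squinty" failure mode, using a toy average-hash (aHash) over flat grayscale pixel lists. Real systems like pHash or PhotoDNA add DCT and other steps, but the core problem is the same: detail is thresholded away before hashing, so visually different images can collide. All names here are illustrative, not any real library's API.

```python
# Toy average-hash: downscaled image -> threshold each pixel against
# the mean -> 64-bit fingerprint. Everything finer than "above or
# below the mean" is discarded, which is where false positives live.

def average_hash(pixels):
    """pixels: flat list of grayscale values, e.g. an 8x8 image (64 values)."""
    mean = sum(pixels) / len(pixels)
    bits = ["1" if p > mean else "0" for p in pixels]
    return int("".join(bits), 2)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two "images" whose every pixel value differs...
img_a = [10, 200] * 32   # alternating dark/bright pattern
img_b = [40, 180] * 32   # different brightness levels, same pattern
# ...collapse to the identical hash: the thresholding step erased
# the differences entirely.
assert average_hash(img_a) == average_hash(img_b)
assert hamming(average_hash(img_a), average_hash(img_b)) == 0
```

                Production systems compare within a Hamming-distance tolerance rather than requiring exact matches, which widens the collision surface further.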

            • JimDabell2 days ago
              I thought Apple’s approach was very promising. Unfortunately, instead of reading about how it actually worked, huge amounts of people just guessed incorrectly about how it worked and the conversation was dominated by uninformed outrage about things that weren’t happening.
              • odo1242a day ago
                Among many issues: Apple used neural networks to compare images, which made the system very exploitable. You could send someone an image that had been invisibly altered to trip the filter while looking unchanged to the eye.

                Also, once the system is created it’s easy to envision governments putting whatever images they want to know people have into the phone or changing the specificity of the filter so it starts sending many more images to the cloud. Especially since the filter ran on locally stored images and not things that were already in the cloud.

                Their nudity filter on iMessages was fine though (I don’t think it ever sends anything to the internet? Just contacts your parents if you’re a minor with Family Sharing enabled?)

                • nullc19 hours ago
                  > once the system is created it’s easy to envision governments putting whatever images they want to know people have into the phone

                  A key point is that the system was designed to make sure the database was strongly cryptographically private against review. -- that's actually where 95% of the technical complexity in the proposal came from: to make absolutely sure the public could never discover exactly what government organizations were or weren't scanning for.

              • WarOnPrivacy2 days ago
                > Unfortunately, instead of reading about how it actually worked, huge amounts of people just guessed incorrectly about how it worked

                Folks did read. They guessed that known hashes would be stored on devices and images would be scanned against that. Was this a wrong guess?

                > the conversation was dominated by uninformed outrage about things that weren’t happening.

                The thing that wasn't happening yet was mission creep beyond the original targets. Because expanding-beyond-originally-stated-parameters is thing that happens with far reaching monitoring systems. Because it happens with the type of regularity that is typically limited to physics.

                There were secondary concerns about how false positives would be handled, and about what the procedures were for any positive. Given governments' propensity to ruin lives now and ignore the harm (or craft a justification) later, the concerns seem valid.

                That's what I recall the concerned voices were on about. To me, they didn't seem outraged.

                • JimDabell2 days ago
                  > Folks did read. They guessed that known hashes would be stored on devices and images would be scanned against that. Was this a wrong guess?

                  Yes. Completely wrong. Not even close.

                  Why don’t you just go and read about it instead of guessing? Seriously, the point of my comment was that discussion with people who are just guessing is worthless.

                  • WarOnPrivacy14 hours ago
                    >They guessed that known hashes would be stored on devices and images would be scanned against that. Was this a wrong guess?

                    > Yes. Completely wrong. Not even close.

                    Per Apple:

                        Instead of scanning images in the cloud, the system performs on-device
                        matching using a database of known CSAM image hashes 
                    
                    Recapping here. In your estimation:

                         known hashes would be stored on devices
                         and images would be scanned against that.
                    
                    Is not even close to

                        the system performs on-device
                        matching using a database of known hashes
                    
                    . And folks who read the latter and thought the former were, in your view, "Completely wrong".

                    Well, okay then.

                    https://web.archive.org/web/20250905063000/https://www.apple...

                  • Pulcinella2 days ago
                    Why don't you just explain what you want people to know instead of making everyone else guess what you are thinking?
                    • JimDabell2 days ago
                      > Why don't you just explain what you want people to know instead of making everyone else guess what you are thinking?

                      I’m not making people guess. I explained directly what I wanted people to know very, very plainly.

                      You are replying now as if the discussion we are having is whether it’s a good system or not. That is not the discussion we are having.

                      This is the point I was making:

                      > instead of reading about how it actually worked, huge amounts of people just guessed incorrectly about how it worked and the conversation was dominated by uninformed outrage about things that weren’t happening.

                      The discussion is about the ignorance, not about the system itself. If you knew how it worked and disagreed with it, then I would completely support that. I’m not 100% convinced myself! But you don’t know how it works, you just assumed – and you got it very wrong. So did a lot of other people. And collectively, that drowned out any discussion of how it actually worked, because you were all mad about something imaginary.

                      You are perfectly capable of reading how it worked. You do not need me to waste a lot of time re-writing Apple’s materials on a complex system in this small text box on Hacker News so you can then post a one sentence shallow dismissal. There is no value in doing that at all, it just places an asymmetric burden on me to continue the conversation.

                      • heavyset_goa day ago
                        Unless you know about all the intricacies of the Orphan Crusher, how can you know your opinion against it doesn't stem from ignorance?
                  • odo1242a day ago
                    The actual system is that they used a relatively complex zero-knowledge set-matching algorithm to calculate whether an image was a match without downloading or storing the set of hashes locally.

                    That said, I think this is mostly immaterial to the problem? As the comment you’re responding to says, the main problem they have with the system is mission creep, that governments will expand the system to cover more types of photos, etc. since the software is already present to scan through people’s photos on device. Which could happen regardless of how fancy the matching algorithm was.

              • nullc19 hours ago
                Sorry, but you're relaying a false memory. Conversation on the subject on HN and Reddit (for example) was extremely well informed and grounded in the specifics of the proposal.

                Just as an example, part of my response here was to develop and publish a second-preimage attack on their hash function -- simply to make the point concrete that various bad scenarios would be facilitated by the existence of one.

              • dmitrygr2 days ago
                > instead of reading about how it actually worked, huge amounts of people just guessed incorrectly about how it worked and the conversation was dominated by uninformed outrage

                I would not care if it worked 100% accurately. My outrage is informed by people like you who think it is OK in any form whatever.

                • JimDabell2 days ago
                  [flagged]
                  • dmitrygr2 days ago
                    No amount of my device spying on me is acceptable, no matter how cleverly implemented. The fact that your comment said anything positive about it at all without acknowledging that it is an insane idea and should never be put into practice is what I was referring to.
                    • JimDabell2 days ago
                      [flagged]
                      • dmitrygr2 days ago
                        I read the whitepaper they published and worked at Apple at the time this idea was rightly pulled. I understand it perfectly fine and stand by my words.
          • robotresearcher2 days ago
            Perceptual hashes? An embedding in a vector space by a learned encoder.

            Phew, not AI then… ?

    • Draconian Hobson's choice foisted upon users by technofeudal overlords. You are the product.
  • Aurornis2 days ago
    They could have avoided the negative press by changing the requirement to be that you can’t re-enable the feature after switching it off 3 times per year.

    It’s not hard to guess the problem: Steady state operation will only incur scanning costs for newly uploaded photos, but toggling the feature off and then on would trigger a rescan of every photo in the library. That’s a potentially very expensive operation.

    If you’ve ever studied user behavior you’ve discovered situations where users toggle things on and off in attempts to fix some issue. Normally this doesn’t matter much, but when a toggle could potentially cost large amounts of compute you have to be more careful.

    For the privacy sensitive user who only wants to opt out this shouldn’t matter. Turn the switch off, leave it off, and it’s not a problem. This is meant to address the users who try to turn it off and then back on every time they think it will fix something. It only takes one bad SEO spam advice article about “How to fix _____ problem with your photos” that suggests toggling the option to fix some problem to trigger a wave of people doing it for no reason.
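    If rescan cost really is the driver, the less hostile design suggested above (cap re-enables, never opt-outs) is just an ordinary rolling-window rate limit. A sketch, with all names hypothetical:

```python
from datetime import datetime, timedelta

class ToggleLimiter:
    """Allow at most `limit` expensive re-enables per rolling year;
    disabling (opting out) is never blocked. Hypothetical sketch."""

    def __init__(self, limit=3):
        self.limit = limit
        self.enable_times = []

    def disable(self):
        # Opting out carries no compute cost, so it's always permitted.
        return True

    def enable(self, now=None):
        # Re-enabling triggers a full library rescan, so it's rate-limited.
        now = now or datetime.now()
        cutoff = now - timedelta(days=365)
        self.enable_times = [t for t in self.enable_times if t > cutoff]
        if len(self.enable_times) >= self.limit:
            return False  # would trigger a costly rescan; refuse
        self.enable_times.append(now)
        return True

t = ToggleLimiter()
assert all(t.enable() for _ in range(3))  # three re-enables go through
assert not t.enable()                     # the fourth within a year is refused
assert t.disable()                        # opting out is always allowed
```

    The point of the sketch is only that the rate limit can sit on the expensive direction of the toggle; Microsoft chose to put it on the cheap one.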

    • incompatible2 days ago
      > Turn the switch off, leave it off, and it’s not a problem.

      Assuming that it doesn't mysteriously (due to some error or update, no doubt) move back to the on position by itself.

      • technofiend2 days ago
        I cancelled Facebook in part due to a tug-of-war over privacy defaults. They kept getting updated with some corporate pablum about how opting in benefited the user. It was just easier to permanently opt out via account deletion rather than keep toggling the options. I have no doubt Microsoft will do the same. I'm wiping my Windows partition and loading Steam OS or some variant and dual booting into some TBD Linux distro for development.

        When I truly need Windows, I have an ARM VM in Parallels. Right now it gets used once a year at tax time.

        • alex1138a day ago
          Is Linux tax software really that bad?
          • apia day ago
            I thought most tax software was now web based SaaS.
            • technofiend13 hours ago
              Oh there are myriad online options; I don't want to store my tax returns in the cloud.
      • taurath21 hours ago
        Oh the one you toggle will be off.

        But tomorrow they’ll add a new feature, with a different toggle, that does the same thing but will be distinct enough. That toggle will default on, and you’ll find it in a year and a half after it’s been active.

        Control over your data is an illusion. The US economy is built upon corporations mining your data. That’s why ML engineers got to buy houses in the 2010s, and it’s why ML/AI engineers get to buy houses in the 2020s.

    • crabmusket2 days ago
      I agree this is a concern, but it frustrates me that tech companies won't give us reasonable options.

      - "Scan photos I upload" yes/no. No batch processing needed, only affects photos from now on.

      - "Delete all scans (15,101)" if you are privacy conscious

      - "Scan all missing photos (1,226)" can only be done 3x per year

      "But users are dummies who cannot understand anything!" Not with that attitude they can't.

      • Aurornis17 hours ago
        > - "Scan photos I upload" yes/no. No batch processing needed, only affects photos from now on.

        This would create a situation where some of the photos have tags and some don’t. Users would forget why the behavior is different across their library.

        Their solution? Google it and start trying random suggestions. Toggle it all on and off. Delete everything and start over with rescanning. This gets back to the exact problem they’re trying to avoid.

        > - "Scan all missing photos (1,226)" can only be done 3x per year

        There is virtually no real world use case where someone would want to stop scanning new photos but also scan all photos but only when they remember to press this specific button. The number of users who would get confused and find themselves in unexpected states of half-scanned libraries would outweigh the number of intentional uses of this feature by 1000:1 or more.

        • crabmusket4 hours ago
          I spent about 30s on those options, but ok, I'll bite.

          > Google it and start trying random suggestions.

          If the options were indeed as I suggested, why would the top Google result not say "click the very clearly labelled 'scan missing photos' button"?

          Google search results are useless when tech companies don't empower users with clear control over their data. Users are reduced to superstitious peasants not because that's their nature, but because they are not given the capability to act otherwise.

    • nativeit2 days ago
      Tell you what, Microsoft: turn it off, leave it off, remove it, fire the developers who made it, forget you ever had the idea. Bet that saved some processing power?
    • htka day ago
      Most of us wouldn't mind if the limitation was that you can't opt IN more than 3 times/year, but of course Microsoft dark patterned it to limit the opt outs.
    • godelski2 days ago

        > It’s not hard to guess the problem: toggling the feature off and then on would trigger a rescan of every photo in the library.
      
      That would be a wild way to implement this feature.

      I mean it's Microsoft so I wouldn't be surprised if it was done in the dumbest way possible but god damn this would be such a dumb way to implement this feature.

      • urbandw311er2 days ago
        This would be because of the legal requirement to purge (erase) all the previous scan data once a user opts out. So the only way to re-enable is to scan everything again — unless you have some clever way I’ve not thought of?
        • zaik2 days ago
          Encrypt the data and store the key on the user's device. If the user enables the feature, they transmit their key to you. If they disable the feature, you delete the key on your side.
        • fiddlerwoaroof2 days ago
          In theory, you could store a private key on the device and cryptoshred the data on Microsoft’s servers when the setting is disabled (Microsoft deletes their copy of the key). Then, when the feature is re-enabled, upload the private key to Microsoft again.
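            A minimal sketch of the cryptoshredding idea above. This is illustrative only: the SHA-256-derived keystream is a toy stand-in for a real cipher such as AES-GCM, and every name is hypothetical.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256-derived
    keystream. Illustration only -- not real cryptography."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Device generates the key; server holds ciphertext plus, while the
# feature is enabled, a copy of the key.
device_key = secrets.token_bytes(32)
server_blob = keystream_xor(device_key, b"face-tag data for photo 123")
server_key_copy = device_key

# User disables the feature: the server deletes only its key copy.
# The blob is now unreadable server-side ("cryptoshredded") without
# touching the (possibly immutable, backed-up) ciphertext itself.
server_key_copy = None

# User re-enables: the device re-uploads the key; no rescan needed.
server_key_copy = device_key
assert keystream_xor(server_key_copy, server_blob) == b"face-tag data for photo 123"
```

            Whether destroying only the key satisfies a given deletion statute is the legal question raised downthread; the sketch only shows the mechanism.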
          • cortesoft2 days ago
            Does that meet the legal requirement to delete data when requested? I am not sure it does.
            • As far as I know, most data protection laws accept cryptoshredding as long as the party with a deletion requirement actually destroys the key. For one thing, it’s hard to reconcile deletion requirements with immutable architectures and backups without a mechanism like this.

              IANAL, but I think the key remaining in the user’s possession doesn’t matter as far as the company with a deletion requirement is concerned.

              • grues-dinnera day ago
                If the key remains on the users device but under the control of the app, does that count as out of control of the app?

                Maybe you'd have to force the user to export the key to an external file (and forget the path) or encrypt it with some mechanism that the app isn't in control of.

        • godelski2 days ago
          You do not have to, and should not, start deleting data immediately. We're not uncivilized here; we can schedule tasks.

          If this were happening on device (lol), then you should do both the scanning and deleting operations at times of usually low activity, just like how you schedule updates (though Microsoft seems to have forgotten how to do this). Otherwise, doing the operations at toggle time just slams the user's computer, which is a great way to get them to turn it off! We'd especially want the process to have high niceness and be able to pause itself so it doesn't hinder the user. Make sure they're connected to power, or at least above some battery threshold if on a laptop.

          If you scan on device and upload, again, you should do this at times of low activity. But you also are not going to be deleting data right away, because it will be held across several servers, and that migration takes time. There's a reason your Google Takeout can take a few hours, and why companies like Facebook say your data might still be recoverable for 90 days.

          Doing so immediately also creates lots of problems. Let's say you enable it, let it run for a while, then toggle back and forth like a madman. Does your toggling send a halt signal to the scanning operation? What does toggling back on do? Do you really think this is going to happen smoothly without things stepping on each other? You're setting yourself up for a situation where the program is both scanning and deleting at the same time. If this is implemented like most things I've seen from Microsoft, this will certainly happen and you'll be in an infinite loop. All because you assumed there is no such thing as, or possibility of, an orphaned process. You just pray that these junior programmers with senior titles actually know how to do parallelization...

          In addition to the delay, you should be marking the images in a database to create a queue: store the hash of the file as the ID and mark it appropriately. We are queuing our operations and we want fail-safes. You're scanning the entire fucking computer, so you don't want to do things haphazardly! Go ahead, take a "move fast and break things" approach, and watch your customers get a blue screen of death and wake up to their hard drives being borked.

            > unless you have some clever way I’ve not thought of?
          
          Seriously, just sit down and think about the problem before you start programming. The whiteboard or pen and paper are some of your most important weapons as a programmer. Your first solution will be shit, and that's okay. Your second and even third solutions might be shit too. But there's a reason you need depth. We haven't even gotten into any real depth here; our "solution" has no depth, it's just the surface level, and I'm certain the first go will be shit. But you'll figure more stuff out, find more problems, and fix them. I'm also certain others will present other ideas that can be used too. Yay, collaboration! It's all good unless you just pretend you're done and problems don't exist anymore. (Look ma! All the tests pass! We're bug free!) For Christ's sake, what are you getting a quarter-million+ salary for?
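          The schedule-and-queue approach described above can be sketched as "the toggle records intent; a worker reconciles later." All names here are hypothetical:

```python
import hashlib

class PhotoIndexer:
    """Sketch: toggling only records desired state. A single worker
    reconciles at a scheduled low-activity time, so rapid flipping
    can never race a scan job against a delete job."""

    def __init__(self):
        self.desired_enabled = False
        self.scan_data = {}  # content-hash -> scan result

    @staticmethod
    def _key(photo: bytes) -> str:
        # Content hash as the queue ID, as suggested above.
        return hashlib.sha256(photo).hexdigest()

    def toggle(self, enabled: bool):
        # Cheap and idempotent: no scanning or deleting happens here.
        self.desired_enabled = enabled

    def reconcile(self, library):
        """Run by a scheduler (e.g. nightly), never at toggle time."""
        if self.desired_enabled:
            for photo in library:
                h = self._key(photo)
                if h not in self.scan_data:
                    self.scan_data[h] = f"faces-in-{h[:8]}"  # stand-in for real scanning
        else:
            self.scan_data.clear()  # the scheduled purge

idx = PhotoIndexer()
lib = [b"photo-1", b"photo-2"]
idx.toggle(True); idx.toggle(False); idx.toggle(True)  # madman flipping is harmless
idx.reconcile(lib)
assert len(idx.scan_data) == 2
idx.toggle(False)
idx.reconcile(lib)
assert idx.scan_data == {}
```

          Because reconcile is the only place work happens, there is exactly one writer to the scan store, and the orphaned-process scenario above cannot arise from the toggle alone.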
      • Aurornis18 hours ago
        Disabling the feature would purge the data. That’s the intent.

        If disabling the feature kept the data, that would be a real problem.

        I don’t know why you think it’s dumb that they purge the data when you turn a feature off. That’s what you want.

        • godelski14 hours ago

            > I don’t know why you think it’s dumb that they purge the data when you turn a feature off. That’s what you want.
          
          I think you should have read the other comments before responding. There are multiple solutions here. And note that my answer is suggesting a delay so we don't hammer the user's computer. Toggling should schedule the event, not initialize it. You've over simplified the problem, treating it as if operations can be performed instantaneously and that they are all performed locally.
      • tourist2d2 days ago
        [dead]
  • bayindirh2 days ago
    Did anyone notice that Microsoft never replied any of the asked questions, but deflected them?

    They are exactly where I left them 20 years ago.

    It's very sad that I can't stop using them again for doing this.

    • anigbrowl2 days ago
      This is such a norm in society now; PR tactics take priority over any notion of accountability, and most journalists and publishers act as stenographers, because challenging or even characterizing the PR line is treated as an unjustified attack and inflated claims of bias.

      Just as linking to original documents, court filings etc. should be a norm in news reporting, it should also be a norm to summarize PR responses (helpful, dismissive, evasive or whatever) and link to a summary of the PR text, rather than treating it as valid body copy.

      • JoshTriplett2 days ago
        People need to treat PR like they do AIs. "You utterly failed to answer the question, try again and actually answer the question I asked this time." I'd love to see corporate representatives actually pressed to answer. "Did you actually do X, yes or no, if you dodge the question I'll present you as dodging the question and let people assume the worst."
        • pndya day ago
          I'd guess that, unlike an AI, a PR person would simply stay silent, demand to move on to a different question, or end the interview/talk and leave
          • account427 hours ago
            Still better than the journalist pretending that the question was answered.
        • MathMonkeyMan2 days ago
          This is why I stopped watching American presidential "debates." If I wanted that kind of entertainment, I'd listen to a rap battle.
        • nosianua day ago
          > People need to treat PR like they do AIs. "You utterly failed to answer the question, try again and actually answer the question I asked this time.

          I'm going way off topic, and off on a tangent here.

          Anecdote, famous public broadcaster TV talk show in Germany (Markus Lanz): The invited politician failed to answer, so the host did what you asked. Three times. Then he just stopped and went to the next topic like nothing happened.

          For anyone thinking this is reasonable, what else could he have done, after all?

          This method is utterly useless for the public watching the dialog, but has benefits for both the show and the politician. The public won't learn a thing. The host can pretend to be super tough on evasive guests. The politician is let off the hook very easily: he just has to deflect the question(s) with canned standard responses three times. Easy enough, no consequences.

          Next day, the very critical people on reddit wrote highly upvoted comments celebrating how "tough" the host was on the politician.

          But the whole scenario is always the same, every single time, almost like it's scripted: The guest only has to deflect the "tough" question a few times and then nothing else happens, they just move on. It's also eerie to see the change in the host and their questions, from acting tough three times to changing back to acting amiably and forgetting about the unanswered question.

          At this point this is all just part of the "act tough but don't upset the guest" show.

          You may ask, but what can they do?

          Well, how about throwing the guy out? What's the use of them as an interview partner if the interview is used as a mere PR piece? They should just have replacement guests on standby. That won't be a high-level person, but it does not need to be. Yes, they will have trouble getting politicians in if they have to fear actually having to answer. So what? Is the show being a one-sided PR piece any better? They could just interview normal non-Berlin-politics-bubble people instead. There are soooo many who have interesting things to say, much more interesting than some politician's prepared statements.

          Unless there are actual consequences, like ending the interview right there and letting the viewers or readers know that answers were refused, acting tough does not matter if it can just be waited out.

        • refulgentis2 days ago
          Challenging or even characterizing the PR line is usually treated as an unjustified attack to justify inflated claims of bias.
          • duped2 days ago
            It's worse actually. These are repeated games so the outcome of any current interaction affects the next one. Journalists can't be too hard on the people the cover or else they won't have the access to cover them in the future.
      • dandellion2 days ago
        They take people for idiots. This can work a few times, but even someone who isn't the brightest will eventually put two and two together when they get screwed again and again and again.
      • anigbrowl2 days ago
        link to a summary of the PR text

        Should have just said 'link to a screenshot of the PR text', apologies for the confusion

      • delfinom2 days ago
        It's not just PR tactics for the sake of accountability. It's because there's a glut of lawyers that'll sue for the tiniest admission of anything.
        • bregmaa day ago
          Flip side is there is a glut of lawyers who will sue for the tiniest bit of negative reporting under slander or libel laws. Next thing you know media corporations are sending emoluments to the highest authorities just for reporting facts like what was said in election campaigns.

          Modern reporting is tricky because there are hungry sharks circling all sides.

    • zipy124a day ago
      The worst part of all this is that even respectable news organisations like the BBC publish so many articles that are just the company's PR response verbatim. Even worse when it's like:

      - victim says: hi, this thing is messed up and people need to know about this

      - company says: "bla bla bla" legal speak, we don't recognise an issue, "bla bla bla"

      End of article, instead of saying "this comment doesn't seem to reflect the situation" or otherwise pointing out that anybody with a brain can see the two statements are not equal in evidence or truth

    • quitit2 days ago
      They prevaricated in all of their answers, and that itself is far more telling.
  • GeekyBear2 days ago
    You can really tell that Microsoft has adopted advertising as a major line of business.

    The privacy violations they are racking up are very reminiscent of prior behavior we've seen from Facebook and Google.

    • dreamcompiler2 days ago
      And not just advertising. If ICE asks Microsoft to identify accounts of people who have uploaded a photo of "Person X", do you think they're going to decline?

      They'd probably do it happily even without a warrant.

      I'd bet Microsoft is doing this more because of threats from USG than because of advertising revenue.

      • zzgo2 days ago
        > They'd probably do it happily even without a warrant

        I'm old enough to remember when companies were tripping over themselves after 9/11 trying to give the government anything they could to help them keep an eye on Americans. They eventually learned to monetize this, and now we have the surveillance economy.

      • nxpnsva day ago
        I don't understand how america works, but surely Microsoft isn't paid for data requested by whatever agency?
      • einpoklum2 days ago
        ICE don't have to ask for anything, the USG gets a copy of all data Microsoft collects from you, anyway. Remember:

        https://www.pcmag.com/news/the-10-most-disturbing-snowden-re...

      • cube00a day ago
        > They'd probably do it happily even without a warrant.

        ...and build them a nice portal to submit their requests and get the results back in real time.

  • rf152 days ago
    > and follow Microsoft's compliance with General Data Protection Regulation

    Not in a million years. See you in court. As often, just because a press statement says something, it's not necessarily true and maybe only used to defuse public perception.

  • anigbrowl2 days ago
    Truly bizarre. I'm so glad I detached from Windows a few years back, and now when I have to use it or another MS product (eg an Xbox) it's such an unpleasant experience, like notification hell with access control checks to read the notifications.

    The sad thing is that they've made it this way, as opposed to Windows being inherently deficient; it used to be a great blend of GUI convenience with ready access to advanced functionality for those who wanted it, whereas MacOS used to hide technical things from a user a bit too much and Linux desktop environments felt primitive. Nowadays MS seems to think of its users as if they were employees or livestock rather than customers.

  • smileson22 days ago
    Microsoft in the past few years has totally lost its mind; it's ruining nearly everything it touches and I can't understand why
    • p0w3n3da day ago
      They are like a shitty Midas: everything they touch turns into a pile of crap. However, people still buy their products. They think the turd is tasty, because a billion flies can't be wrong...

      Meanwhile Apple is applying a different set of toxic patterns: lack of interoperability with other OSes, apps that try to store data mainly in iCloud, no headphone jack on the iPhone, etc.

    • Spooky232 days ago
      They never changed. For some reason Satya became CEO and nerds fawned over the “new Microsoft” for whatever reason.

      They are a hard nosed company focused with precision on dominance for themselves.

      • charles_f2 days ago
        Insider here, in M365 though not OneDrive. It did change, but not because of Satya; because of rules, legislation, and bad press. Privacy and security are taken very seriously (at least by people who care to follow internal rules) not because "we're nice", but because:

        - EU governments keep auditing us, so we gotta stay on our toes, do things by the book, and be auditable

        - it's bad press when we get caught doing garbage like that. And bad press is bad for business

        In my org, doing anything with customers' data that isn't directly bringing them value is theoretically not possible. You can't deliver anything that isn't approved by privacy.

        Don't forget that this is a very big company. It's composed of people who actually care and want to do the right thing, people who don't really care and just want to ship and would rather not be impeded by compliance processes, and people who are actually trying to bypass these processes because they'd sell your soul for a couple bucks if they could. For the little people like me, the official stance is that we should care about privacy very much.

        Our respect for privacy was one of the main reasons I'm still there. There has been a good period of time where the actual sentiment was "we're the good guys", especially when comparing to Google and Facebook. A solid portion of that was that our revenue was driven by subscriptions rather than ads. I guess the appeal of taking customers' money and exploiting their data is too big. The kind of shit that will get me to leave.

        • Spooky2315 hours ago
          Thanks for the perspective and I both appreciate and agree with you as a customer and observer of those big core services in the enterprise space.

          The edges and frontiers are what bug me. AI mania is a pox.

        • thaumasiotes2 days ago
          > Our respect for privacy was one of the main reasons I'm still there. There has been a good period of time where the actual sentiment was "we're the good guys", especially when comparing to google and Facebook. A solid portion of that was that our revenue was driven by subscriptions rather than ads.

          How long has MS been putting ads in the start menu?

          • vee-kay20 hours ago
            I have a suspicion that Satya Nadella himself doesn't use Windows 11, otherwise he would have fired the team that messed up the Start Menu and he would have banned all ads from start menu and desktop.

            Even Bill Gates dumped the Windows Phone and switched to Android (he prefers the Samsung Galaxy Fold4)

            https://www.gearbrain.com/bill-gates-windows-phone-android-2...

          • odo12422 days ago
            As he said, very heterogeneous company.
            • thaumasiotesa day ago
              Sure, but "there has been a good period of time where..." is a statement that the situation introduced by "where" continues into the present. And that doesn't seem to be compatible with the facts.
        • account427 hours ago
          > Privacy and security are taken very seriously

          Any company that has to state that they take privacy very seriously, doesn't.

          The rest of your response makes that very clear: you are focused on doing things by the book, i.e. the bare minimum required by law instead of actually giving one shit about privacy and security yourself.

        • traceroute66a day ago
          > EU governments keep auditing us, so we gotta stay on our toes, do things by the book

          Erm, dude ....

          IANAL, and I am sure most people do not need to be lawyers to figure out that not allowing people to permanently opt-out of photo scanning is almost certainly going to be in contravention of every EU law in the book.

          I hope the EU take Microsoft to the cleaners over this one.

          • charles_fa day ago
            That's my point. I wish the company were only composed of people who not only do things by the book, but also do the right thing. This kind of borderline (or across-the-border) garbage renders everyone else's efforts to be exemplary null, a poisoning of the well of sorts. I'm not sure why they force you to scan photos; if I assume best intentions, hopefully it's just that they really want you to use the feature. But having met product managers with actual nefarious intents who wanted to find ways of circumventing the rules, there's a chance that it's a problem.

            I understand that the company doesn't get the benefit of the doubt in such situations, especially when publicists "choose not to answer" why this feature is done like that. Great job there...

            I'm also hoping we get a correction, be it the EU or just PR backlash. As I said, this is the kind of shit that makes me not want to have my name associated with the company.

      • uep2 days ago
        Do we even think that was real? I think social media has been astroturfed for a long time now. If enough people make those claims, it starts to feel true even without evidence to support it.

        Did they ever open source anything that really made you think "wow"? The best I could see was them "embracing" Linux, but embrace, extend, extinguish was always a core part of their strategy.

    • mixmastamyk2 days ago
      Money and power. Who was the first BigTech co on the Prism slides? Who muscled out competitors in the 90s?
    • BeetleB2 days ago
      > Microsoft in the past few years has totally lost it's mind

      I don't know what this Microsoft thing is that you speak of. I only know a company called Copilot Prime.

    • 1970-01-01a day ago
      CEO v3.0 Satya is the reason. He can't innovate, he can only play 'chase the leader'
    • quitit2 days ago
      This week I have received numerous reminders from Microsoft to renew my Skype credit..

      Everything I see from that company is farcical. Massive security lapses, lazy AI features with huge privacy flaws, steamrolling OS updates that add no value whatsoever, and heavily relying on their old playbook of just buying anything that looks like it could disrupt them.

      P.S. The skype acquisition was $8.5B in 2011 (That's $12.24B in today's money.)

    • yuliyp2 days ago
      I don't understand how this is losing their mind. Toggling this setting is expensive on the backend: opting in means "go and rescan all the photos"; opting out means "delete all the scanned information for this user". As a user, just make up your mind and set the setting. They let you opt in, they let you opt out, they just don't want to let you trigger tons of work every minute.
      • xigoia day ago
        If this was the case, they would leave it in the off state after you run out of toggles. The reality is that it will magically turn on every month.
      • account427 hours ago
        I don't understand how you think repeating this nonsense excuse for an argument will achieve anything.
    • buyucua day ago
      Microsoft wants money. Microsoft does not care about you.
    • chanux2 days ago
      There was a time with a strong sentiment of Satya Nadella making MS great again.

      Oh what time does to things!

  • noisy_boy2 days ago
    With each passing day since I switched from Windows to Linux at home, with decreasing friction, I am increasingly happy that I took the time to learn Linux and stuck with it. This is not a come-to-Linux call, because I know it is easier said than done for most non-technical folks. But it is a testimony that if you do, the challenges will eventually be worth it. Because at this point, Microsoft is just openly insulting their captive users.
    • lstodd2 days ago
      You know, in 90s in Russia in IT circles Windows was known as "маздай" which is a transliteration of "must die".

      Looks like nothing has changed.

  • Nition2 days ago
    Do you think the PR person responding here feels, underneath it all, the inhumanity of their responses? The fact that they're merely wasting everyone's time with their prevaricated non-answers? Knowing what they need to say to keep their job but hurting internally at the stupidity of it all.

    Or do they end up so enmeshed with the corporate machine that they start to really believe it all makes sense?

    • staviette2 days ago
      I think - at least for the people who stick with a career in PR - that they enjoy playing the game of giving an answer that is sort of related to the question but doesn't actually give a single bit of useful information. That they enjoy seeing how far they can push it without the interviewer straight-up accusing them of not answering the question.

      At least that's the only way I can imagine them keeping their sanity.

    • bapak2 days ago
      It's in their job description, they're most likely very proud of how their words can swindle the majority. They're greasy and they love it.
      • citizenpaula day ago
        I think HN skews towards a somewhat naive but good natured crowd. Every time ethics or morality comes up on here there is no shortage of defenders that simply don't want to accept the fact. Yes there are bad people out there that are not only ok with the bad things they do but even some that actively enjoy it and pursue more of it.
        • Nitiona day ago
          Well, I'll admit that I hadn't even really thought of the option where they know it's evil but they just enjoy it until these responses. I figured they'd either hate their job or have convinced themselves that they're actually doing good. To be fair I think a lot of outwardly-evil people have convinced themselves internally that they're good people.
        • account427 hours ago
          The question isn't whether there are bad people who enjoy what they do but whether they recognize that what they are doing is bad rather than deluding themselves in some way.
          • citizenpaulan hour ago
            The whole point of ethics is to have an independent roadmap of what is right/good/moral, other than just your subjective feelings, which may change even from day to day.

            Again, you are reinforcing my point. I've directly met people who have said things like this real-life exec quote:

            "I love bopping them, just like turtles when they pop their head up for air, bop" *gestured fist hammer motion

            In regard to treating people like disposable slaves in order to get what they want.

    • xigoia day ago
      Stop anthropomorphizing Microsoft PR speakers.
  • ajrouvoeta day ago
    Meta just lost a court case against Bits of Freedom in the Netherlands, because their Instagram setting to turn off the attention-grabbing feed would reset every month or so. The court ruled that this infringed on the user’s freedom.

    Source: https://www.dutchnews.nl/2025/10/court-tells-meta-to-give-du...

  • LunaSea2 days ago
    I was afraid for the EU economy, but after this declaration I'm reassured that Microsoft will pay for my grand kids' education in 30 years.
    • moooo992 days ago
      I think the EU is flawed in more ways than just one. But every time I see "<AI feature> will be available starting now outside the EU" I am really grateful
  • bob10292 days ago
    Microsoft gets a lot less difficult to reason about when we start to think of it as a statistical mean of human nature rather than the mind of one arbitrary evil bastard. They have 228k employees. The CEO has virtually zero direct influence on the end work product of any team.

    Any organization this large is going to have approximately the same level of dysfunction overall. But, there are almost always parts of these organizations where specific leaders have managed to carve out a fiefdom and provide some degree of actual value to the customer. In the case of Microsoft, examples of these would be things like .NET, C#, Visual Studio [Code], MSSQL, Xbox.

    Windows, Azure & AI are where most of the rot exists at Microsoft. Office is a wash - I am not a huge fan of what has happened to my Outlook install over the years, but Teams has dramatically stabilized since the covid days. Throwing away the rest of the apple because of a few blemishes is a really wasteful strategy.

  • syntaxing2 days ago
    Growing up, Microsoft dominance felt so strong. 3 decades later, there’s a really high chance my kids will never own or use a windows machine (unless their jobs gives them one).
    • ghssds2 days ago
      Do you remember this: http://toastytech.com/evil/index.html ?

      Microsoft hate was something else in the '90s and 2000s. Yet people stayed with it as if they had no choice while OS/2, AmigaOS, NextStep, BeOS and all those UNIXes died.

      • EasyMarka day ago
        A lot of people didn't and still don't. Sometimes your job/business requires certain software that is only available on windows. I'm not giving up my job for an OS. for the past 15 years or so I could do everything on Mac and Linux, but that might not always be the case. I certainly wouldn't pass up a lucrative consulting position because it was windows only.
      • chmod775a day ago
        Back then people really didn't have much of a choice.

        Nowadays most things happen in browsers anyways, WINE/Proton have come a long way, and alternatives to almost anything windows-only have reached a critical quality threshold.

      • mistrial92 days ago
        an employer requires their workers to use Windows; the target audience for Windows is management, their HR and attorneys, and then greater security services. MSFT sells investigative services.
    • chrsw2 days ago
      >unless their jobs gives them one

      Microsoft knows the vast majority of professionals are forced to use their products and services or else they can't put food on the table. That's why Microsoft can operate with near impunity.

  • correlator2 days ago
    Does this mean that when you disable, all labels are deleted, and when you turn it back on it has to re-scan all of your photos? Could this be a cost-saving measure?
    • d-sky2 days ago
      In that case, they should make it the other way around — you can enable this only three times a year.
    • ok_dad2 days ago
      They should do it the other direction, then: if you turn it off more than three times you can’t turn it back on.
      • margalabargala2 days ago
        But that's less good for profit. Why would they give up money for morals?
        • bayindirh2 days ago
          Esp. when you can just eat money to survive when you relocate to the Mars, no?
    • dandellion2 days ago
      No, it's a profit-seeking measure.
    • alterom2 days ago
      >Does this mean that when you disable, all labels are deleted

      AHHAHAHAHAHAHAHAHA.

      Ha.

      Nice one.

      • xigoia day ago
        It’s like expecting a lion to stop eating you if you ask it politely.
  • lawcomingfymsa day ago
    What happens to the faces in photos that do not belong to the photo owner?

    Do they get scanned as well without the person's permission?

  • thrownfjfkfmofn2 days ago
    How is this not revenge porn or something? If I upload sensitive photos somewhere, it's a 5-year prison sentence! The CEO of Microsoft can do that a billion times!
    • The_Presidenta day ago
      Security issue for targeted people. What if an MS account gets compromised and a bad actor plants illegal material on the computer that is then scanned by the cloud before it is caught.
  • fishmicrowaver2 days ago
    I was quite happy for a couple years to just use windows and wsl. Fully switched to Linux at home and Linux VM's at work. The thirst and desperation to make AI work gives me the creeps more than usual.
  • leakycap2 days ago
    Microsoft: forces OneDrive on users via dark pattern dialogs that many users just accept

    Users: save files "on their PC" (they think)

    Microsoft: Rolls out AI photo-scanning feature to unknowing users intending to learn something.

    Users: WTF? And there are rules on turning it on and off?

    Microsoft: We have nothing more to share at this time.

    Favorite quote from the article:

    > [Microsoft's publicist chose not to answer this question.]

    • hshdhdhehd2 days ago
      Almost feels like we are getting to a class action or antitrust case when you connect the dots. Almost all PCs come with Windows. De facto, you need to create a M$ account to use Windows locally. They opt you into OneDrive by default. They sync your docs by default. They upload all your photos into AI by default.
      • kevin_b_er19 hours ago
        By "class action" I presume you're referring to the US. If so, no, the courts of law are forbidden to you. You will instead go to a secret tribunal where the laws do not matter. The arbiter will only continue to be paid if they continue to rule for corporations.

        https://www.microsoft.com/en-us/servicesagreement#15_binding...

        • account427 hours ago
          Not everything that a company lawyer writes in text is actually legally binding.
      • queuebert2 days ago
        You can use Windows without a Microsoft account, but the dark pattern to do this is very difficult to navigate.
        • leakycap2 days ago
          Sounds like this advice will be expiring along with the next Windows update, so if you want a local account your window of opportunity may be closing. (What happens when you need to get a new PC?)
    • netsharc2 days ago
      Tell them "you may only refuse to answer this question 3 times a year".
    • j452 days ago
      It's totally worth self hosting files, it's gotten much better.
  • drumheada day ago
    We created this oligopoly because they were convenient, free, and powerful, and now it's time for us to pay the price.

    Or find services that may not be as easy to use, may cost something, and may not have all the features you want, but which won't make unreasonable demands for your data.

    In light of the way the US government is carrying on, I'd rather not give Microsoft any of my images.

    • account427 hours ago
      > In light of the way the US government is carrying on, I'd rather not give Microsoft any of my images.

      What is this supposed to mean? That you'd be happier with the dystopia if they were going after people you like less?

  • amiga-workbench2 days ago
    Microsoft's understanding of consent is about on-par with that of a rapist.
    • The_Presidenta day ago
      I say this about advertising and after recently using Win11 for the first time to remove malware, I was left with a gross feeling. My friend whose computer it was is not highly PC literate, but when I was talking about the AI shit built in to these platforms, you could see the disgust building.
  • sombragris19 hours ago
    Obviously the whole point is to make AI overreach avoidance as painful as possible.

    Of course, that's also the reason why Lens was deprecated despite being a good, useful app, forcing one to deal with the bloat of Copilot 365.

  • CGamesPlay2 days ago
    This doesn't feel like a problem at all. I only need to turn the setting off once, right? My immediate question to seeing that verbiage was, "how many times does the setting turn itself on in a year?"
  • teekerta day ago
    "It's not your data, citizen, you should be happy we made this OS for you. You are not smart enough to do it yourself; we know what is best."

    I can never help myself from hearing this inside, and am just incredibly thankful that we have Linux and FOSS in general. That really gives me hope for humanity at this point.

    I type this in FireFox, on NixOS, with all my pics open in another tab, in Immich. Thank you, thank you, thank you.

    • account427 hours ago
      Mozilla doesn't think it's your data either.
  • gessha2 days ago
    This made me look up if you can disable iOS photo scanning and you can’t. Hmm.
  • r0b05a day ago
    Disabling offline accounts is one thing but scanning and labeling your files to profile users is a whole other can of worms. This trajectory leads to zero privacy for the user and I feel like switching to Linux/Mac will be the only option sadly.
  • A_D_E_P_T2 days ago
    It really seems as though Microsoft has total contempt for their retail/individual customers. They do a lot to inconvenience those users, and it often seems gratuitous and unnecessary. (As it does in this case.)

    ...I guess Microsoft believes that they're making up for it in AI and B2B/Cloud service sales? Or that customers are just so locked-in that there's genuinely no alternative? I don't believe that the latter is true, and it's hard to come back from a badly tarnished brand. Won't be long before the average consumer hates Microsoft as much as they hate HP (printers).

  • fancyfredbot2 days ago
    Seems obvious they actually mean to limit the number of times you can opt in. Very poor choice of words.
    • mortehu2 days ago
      The difference is whether you get locked into having it on or having it off at the end.
  • getnormality2 days ago
    > You can only turn off this setting 3 times a year.

    Who's making the t-shirts? Don't forget the Microsoft logo. They're proud of this!

    In my head it's sounding like that Christmas jingle. It's the most wonderful time of the year!

  • reshekua day ago
    EU please whack them and whack them good
    • yupyupyupsa day ago
      You should ask the EU what happened with the Digital Markets Act, and if it's possible to install arbitrary software on an iPhone today. If the answer is "no, you can't", then that should give you a hint of how effective the EU really is when it comes to these issues.
  • jonas212 days ago
    I don't really see the issue. If you don't want the face recognition feature, then you'll turn it off once, and that's that. Maybe if you're unsure, you might turn it off, and then back on, and then back off again. But what's the use case where you'd want to do this more than 3x per year?

    Presumably, it's somewhat expensive to run face recognition on all of your photos. When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again.

    • a21282 days ago
      If this is the true reason, then they have made some poor decisions throughout that still deserve criticism. Firstly by restricting the number of times you can turn it _off_ rather than _on_, secondly by not explaining the reason in the linked pages, and thirdly by having their publicist completely refuse to say a word on the matter.

      In fact, if you follow the linked page, you'll find a screenshot showing it was originally worded differently, "You can only change this setting 3 times a year" dating all the way back to 2023. So at some point someone made a conscious decision to change the wording to restrict the number of times you can turn it _off_

      • CMaya day ago
        Maybe what they see is that most people who turn it off will leave it off, but some people turn it off and turn it back on as part of a pattern/habit around temporarily putting files on OneDrive that they don't want scanned.

        For example, people who don't use their encrypted vault on OneDrive, so they upload photos that should otherwise be encrypted to their normal OneDrive which gets scanned and tagged. It could be a photo of their driver's license, social security card, or something illicit.

        So these users toggle the tagging feature on and off during this time.

        Maybe the idea is to push these people's use case to the vault where it probably belongs?

    • yuvalr12 days ago
      Well, sometimes Microsoft decides to change your settings back. This has happened to me very frequently after installing Windows updates. I remember finding myself turning the same settings off time and again.
      • netsharc2 days ago
        The "fuck you, user!" behavior of software companies now means there's no more "No", only "Maybe later". Every time I update Google Photos, it shows me the screen that "Photos backups are not turned on! Turn on now?" (because they want to upsell their paid storage space option).
        • anonymars2 days ago
          It also now bugs me to do face scanning every so often too

          And unlike most things, both prompts require you to explicitly click some sort of "no", not just click away to dismiss. The backup one is particularly obnoxious because you have to flip a shitty little slider as the only button is "continue". Fuck. Off.

        • jtmarl1n2 days ago
          The lack of a true “no” option and only “maybe later” infuriates me.
          • ryandrake2 days ago
            Silicon Valley companies are like a creepy guy in the nightclub going up to each woman and asking "Want to dance? [Yes] or [Ask Me Again]". The desperation is pathetic.
        • duped2 days ago
          I mean at this point I think it's really just utter incompetence over at Microsoft that they can't design a system that can be updated without breaking things. They have never actually cared about solving that problem.

          If they had taste, someone opinionated over there would knock heads before shipping another version of windows that requires restarts or mutates user settings.

          • netsharca day ago
            A joke in the Windows 95 days was "You plugged in a mouse. Please restart your computer.". A few weeks ago I plugged in a Logitech wireless mouse receiver, Windows 10 installed the drivers automatically, and finished with "To complete the installation of the software, please restart your computer"...
            • account427 hours ago
              It's already absurd that a mouse should need a vendor/model-specific driver in 2025. It's just a standard USB Human Interface Device ffs.
    • noir_lord2 days ago
      > If you don't want the face recognition feature, then you'll turn it off once.

      The issue is that this is a feature that, in any sane world, should 100% be opt-in - not opt-out.

      Microsoft privacy settings are a case of - “It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard.”

      • bapak2 days ago
        There's inherently nothing wrong with face recognition, I love being able to search my own photos on my iPhone. If you could keep it private, you totally would too.
    • bayindirh2 days ago
      Even KDE's Digikam can run "somewhat expensive" algorithms on your photos without melting your PC and making you wait a year to recognize and label faces.

      Even my 10(?) year old iPhone X can do facial recognition and memory extraction on device while charging.

      My Sony A7-III can detect faces in real time, and match them against 5 registered faces for focus prioritization the moment I half-press the shutter.

      That thing will take mere minutes on Azure when batched and fed through GPUs.

      If my hunch is right, the option will have a "disable AI use for x months" slider and will turn itself on without letting you know. So you can't opt out of it completely, ever.

    • JumpCrisscross2 days ago
      > When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again

      This is probably the case. But Redmond being Redmond, they put their foot in their mouth by saying "you can only turn off this setting 3 times a year" (emphasis mine).

    • NoLinkToMe2 days ago
      Agreed, in practice for me there's no real issue.

      But that's not necessarily true for everyone. And it doesn't need to be this way, either.

      For starters I think it'd help if we understood why they do this. I'm sure there's a cost to the compute MS spends on AI'ing all your photos; turning it off under privacy rules means throwing away that compute, and turning it back on creates an additional cost for MS, on top of what they've already spent for nothing. Limiting that makes sense.

      What doesn't make sense is that I'd expect virtually nobody to turn it on and off over and over again, beyond 3 times, to the point that cost increases by more than a rounding error... like what type of user would do that, and why would that type of user not be exceedingly rare?

      And even in that case, it'd make more sense to do it the other way around: you can turn on the feature 3 times per year, and off anytime. i.e. if you abuse it, you lose out on the feature, not your privacy.

      So I think it is an issue that could and should be quickly solved.
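      The asymmetric policy proposed above (disable any time, enable at most 3 times per year) is only a few lines of state to implement. A minimal sketch, assuming a hypothetical setting object (none of these names are Microsoft's actual API):

```python
# Sketch of "rate-limit enabling, never disabling": turning the feature
# OFF is always allowed; turning it ON is capped at 3 times per rolling year.
from datetime import datetime, timedelta

class FaceScanSetting:
    MAX_ENABLES_PER_YEAR = 3

    def __init__(self):
        self.enabled = False
        self.enable_times = []  # timestamps of recent enables

    def disable(self):
        # The privacy-preserving action is never rate-limited.
        self.enabled = False

    def enable(self, now=None):
        now = now or datetime.utcnow()
        # Keep only enables from the last rolling year.
        cutoff = now - timedelta(days=365)
        self.enable_times = [t for t in self.enable_times if t > cutoff]
        if len(self.enable_times) >= self.MAX_ENABLES_PER_YEAR:
            raise RuntimeError("enable limit reached; try again later")
        self.enable_times.append(now)
        self.enabled = True

s = FaceScanSetting()
for _ in range(3):
    s.enable()
    s.disable()      # disabling is always allowed
try:
    s.enable()       # a fourth enable within the year is refused
except RuntimeError:
    pass
assert s.enabled is False
```

      If a user "abuses" the toggle under this scheme, they lose the feature rather than their privacy.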

    • pjc502 days ago
      > what's the use case where you'd want to do this more than 3x per year?

      That means that all Microsoft has to do to get your consent to scan photos is turn the setting on every quarter.

    • bionhoward2 days ago
      The point is it’s sucking your data into some amorphous big brother dataset without explicitly asking you if you want that to happen first. Opt out AI features are generally rude, trashy, low-class, money grubbing data grabs
    • ArnoVW2 days ago
      To prevent you from having the option to temporarily disable it, so you have to choose between privacy and the supposed utility
      • Barbing2 days ago
        Right, while I understand the potential compute cost, it would be like the iPhone restricting the number of times you could use “allow once“ for location permissions.
    • anigbrowl2 days ago
      Presumably, it's somewhat expensive to run face recognition on all of your photos.

      Very likely true, but we shouldn't have to presume. If that's their motivation, they should state it clearly up front and make it opt-out by default. They can put a (?) callout on the UI for design decisions that have external constraints.

    • netsharc2 days ago
      I wonder if it's possible to encrypt the index with a key that's copied to the user's device, and if the user wants to turn off this setting, delete the key on the server. When they want to turn it back on, the device uploads the key. Yes, the key might end up gone if there's a reinstall, etc.

      If the user leaves it off for a year, then delete the encrypted index from the server...
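      A toy sketch of that idea, sometimes called crypto-shredding: the server stores only an encrypted index, the client holds the key, and "turning the setting off" just deletes the key. This is illustrative only, with made-up names; a SHA-256 counter-mode keystream stands in for real authenticated encryption, which you would never do in production.

```python
# Toy crypto-shredding demo: the server never sees the key, so deleting
# the client-held key renders the stored index unrecoverable.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256 counter-mode keystream (toy cipher)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

class Server:
    def __init__(self):
        self.encrypted_index = None

    def store(self, blob: bytes):
        self.encrypted_index = blob  # ciphertext only; no key ever arrives

class Client:
    def __init__(self):
        self.key = secrets.token_bytes(32)

    def upload_index(self, server: Server, index: bytes):
        server.store(keystream_xor(self.key, index))

    def disable_feature(self):
        self.key = None  # deleting the key == the index is now dead weight

index = b'{"alice": [0.12, 0.93], "bob": [0.55, 0.41]}'
client, server = Client(), Server()
client.upload_index(server, index)
assert server.encrypted_index != index                            # ciphertext
assert keystream_xor(client.key, server.encrypted_index) == index # key works
client.disable_feature()                                          # key gone
```

      Re-enabling then means re-uploading the key (or rebuilding the index if the key was lost), which matches the reinstall caveat above.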

    • gus_massa2 days ago
      How hard is it to turn it on? Does it show a confirmation message?

      My wife has a phone with a button on the side that opens the microphone to ask questions to Google. I guess 90% of the audios they get are "How the /&%/&#"% do I close this )(&(/&(%)?????!?!??"

      • xdfgh11122 days ago
        I bought a new Motorola phone and there are no less than three ways to open Google assistant (side button, hold home button, swipe from corner). Took me about 10 seconds before I triggered it unintentionally and quickly figured out how to disable all of them...
    • like_any_other2 days ago
      So why not limit how many times you can turn it on, instead of off?

      We all know why.

    • wzdd2 days ago
      "When this feature is disabled, facial recognition will be disabled immediately and existing recognition data will be purged within 60 days". Then you don't need a creepy message. Okay, so that's 6 times a year, but whatever.
    • kypro2 days ago
      Assuming this reasoning is accurate, why not just silently throw a rate limit error and simply not reenable it if it's repeatedly switched on and off?
  • mk89a day ago
    So the message is: if you can, don't use OneDrive.

    If you can't (work, etc.) try to avoid uploading sensitive documents in onedrive.

    I always wondered who uses OneDrive for cloud storage. Hell, I think even Google Drive is better.

    Microsoft has really pivoted to AI for all things. I wonder how many customers they will get vs how many they will lose due to this very invasive way of doing things.

  • bigbuppo2 days ago
    This is once again strongly suggesting that Microsoft is thoroughly doomed if the money they've dumped into AI doesn't pan out. It seems to me that if your company is tied to Microsoft's cloud platform, you should probably consider moving away as quickly as you can. Paying the vmware tax and moving everything in house is probably a better move at this point.
  • surgical_fire2 days ago
    There's a great solution to this.

    Just stop using Microsoft shit. It's a lot easier than untangling yourself from Google.

    • bee_rider2 days ago
      Yeah it is legitimately hard to avoid Google, if nothing else some of your emails will probably be leaked to Gmail.

      But Microsoft is pretty easy to avoid after their decade of floundering.

      • LogicFailsMe2 days ago
        Whenever I have to use Windows, I just create a new throwaway account on proton, connect it to the mother throwaway account connected to a yahoo email account created in the before times, install what I need, and then never access that account again.
        • hshdhdhehd2 days ago
          It is fucked you almost need mob levels of burner cell precautions to have privacy and use Excel.
      • khazhoux2 days ago
        How can I play starcraft 2 without it?
        • account427 hours ago
          Besides the obvious answer in the siblings, the more general answer (because there will always be something that won't transfer effortlessly) is that you don't have to play Starcraft 2. If the only way to engage in a leisure activity is by allowing yourself to be raped then maybe you should just not do it.
        • righthand2 days ago
          Starcraft 2 w/ Battlenet has been working on Linux for over a decade. You don’t even need Proton, it works lovely with WINE.
        • bee_rider2 days ago
          Apparently it runs in Proton (I haven’t tried it though).
    • sandblast2 days ago
      Yes. Just use Immich for photos. AI scanning, but local and only opt-in.
    • zahlman2 days ago
      Is there a free platform that will let me blog the way GitHub Pages does?
      • efreaka day ago
        Gitlab. Codeberg. Neocities. Nekoweb. Wasmer. Surge. Digital Ocean. Freehostia. Awardspace. 000webhost. Static.run. Kinsta. Cloudflare Pages. Render. Hostinger. Ionos. Bluehost. Firebase. Netlify. Orbiter. Heliohost. There's probably hundreds of services with a free tier these days (though many of them will have strict limitations on website size and traffic, and you may have to run the build step locally).
    • inatreecrown22 days ago
      you mean like stop using GitHub?
      • archargelod2 days ago
        Yes.

        For private repos there is Forgejo, Gitea and Gitlab.

        For open-source: Codeberg

        Yes, it'll make projects harder to discover, because you can't assume that "everything is on github" anymore. But it is a small price to pay for dignity.

        • zahlman2 days ago
          Why not put open-source projects on Gitlab?
          • archargelod2 days ago
            You can't create a new account on Gitlab without a credit card (outside of the EU and USA).
      • surgical_fire2 days ago
        Yes, that too.
  • mbf1a day ago
    I wonder if you could write a program to make pictures with face tattoos the norm for Microsoft's AI to train on. If enough people did this, would Microsoft's facial recognition start generating lots of face tats?
  • christophilus2 days ago
    Fedora with vanilla Gnome is excellent for anyone looking for an alternative.
  • anarticle2 days ago
    Year of the Linux desktop edges ever closer.
  • Lioa day ago
    This sounds like the next level of the nauseating “maybe later”.

    i.e. You’ll do what we tell you eventually.

  • tayloriusa day ago
    It's enough to make a man consider Linux...
    • account426 hours ago
      A man perhaps, but not the average frog. The frog will continue to insist that it will freeze if it dares to step out of the familiar warm pot.
  • wkat42422 days ago
    Microsoft is such a scummy company. They always were but they've become even worse since they've gone all in on AI.

    I wonder if this is also a thing for their EU users. I can think of a few laws this violates.

  • unixheroa day ago
    To think I was paying for this yearly

    Unbelievable

  • more_corn2 days ago
    That’s not opt out. Opt out is the ability to say no. If you’re not allowed to say no there’s no consent and you’re being forced.
    • Aurornis2 days ago
      If you opt out and then never turn it back on, you have opted out.
      • yupyupyupsa day ago
        Because Microsoft is known to respect user settings between (forced) Windows Updates and not turn stuff back on...
  • thaumasiotes2 days ago
    > Slashdot: What's the reason OneDrive tells users this setting can only be turned off 3 times a year? (And are those any three times — or does that mean three specific days, like Christmas, New Year's Day, etc.)

    > [Microsoft's publicist chose not to answer this question.]

  • dsigna day ago
    Microsoft gets most of its money from big corporate customers. Some of those customers are obligated by law not to leak sensitive personal data to servers on US soil, because those customers have the misfortune of being in countries with strong privacy laws, functioning civil societies and sometimes even left-wing governments. I know for a fact that the product in question, "OneDrive", is sometimes mandated in those companies as a backup solution for the company's computers. All it takes is a whistle-blowing incident or a chat with a journalist for this to become a major blow-up for Microsoft, with companies forced by tribunals to back out of contracts with Microsoft.
  • ptrl6002 days ago
    Presumably you just need to turn it off once, right?
  • _wire_2 days ago
    Crossposting slashdot?

    Heaven forfend!

    • dmitrygr2 days ago
      They are the ones who did this interview
  • drnick12 days ago
    Why would anyone use this crap at this point? Buy a (possibly used) mini PC or thin client, install Linux and Samba on it, and voila, your own private "cloud" completely free of corporate interference, spyware and recurring fees. This works best with a static IP for remote access via Wireguard, but it can be made to work on a residential connection.

    With a little more effort you can deploy Nextcloud, Home Assistant and a few other great FOSS projects and completely free yourself from Big Tech. The hardest part will probably be email on a residential connection, but it can be done with the help of a relay service for outgoing mail.
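    For the remote-access piece, a minimal server-side WireGuard config might look like the sketch below. The interface address, port, and keys are placeholders, not working values; see the WireGuard docs for key generation and the matching client config.

```ini
# /etc/wireguard/wg0.conf on the home server (placeholders throughout)
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# The phone/laptop that connects in from outside
PublicKey = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

    On a residential connection without a static IP, a dynamic DNS service on the client's `Endpoint` setting fills the same role.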

  • yencabulator2 days ago
    Reminder: Microsoft owns Github and NPM.
  • superkuha day ago
    I'm kind of surprised that it is Microsoft leading the field in this. It seems like something that'd be much more at home on an Apple or Google smartphone. But I suppose smartphones don't have the hardware or network resources to pull this off without noticeably degrading performance.
  • apia day ago
    Why does anyone still run Windows?

    Games I guess.

    Both Mac and Linux desktop/laptop machines are better and less loaded with shit. If you don’t need or want a full featured PC you have Android and iOS which are also better. Android you have to be careful of but if you pick well it can be customizable and less loaded with shit.

    Steam is available for both Linux and macOS. Are there just not as many game titles? I just saw Cyberpunk show up in the Apple Store for Mac so there seems to be an effort to port more games off Windows.

    I have a Windows VM but use it less and less. Only need now is to test and build some software for Windows.

    Also: I realized what I do kind of like about Apple and how best to describe their ecosystem. It’s the devil you know. They are fairly consistent in their policies and they are better on privacy than others. Some of their policies suck, but they suck in known consistent ways.

    If I left Apple, Linux (probably on Framework) is the only alternative.

    • account426 hours ago
      > Why does anyone still run Windows?

      Learned helplessness.

    • d3Xt3r17 hours ago
      > Steam is available for both Linux and macOS. Are there just not as many game titles?

      The vast majority of games work fine under Linux now; in fact most releases these days work even on day one. The only games that don't really work are ones which use invasive kernel-level anti-cheat systems.

    • Ylpertnodia day ago
      > Why does anyone still run Windows? > Games I guess.

      Music, and video.

  • immibisa day ago
    503 service unavailable - did Slashdot get HNed? Ironic. (Probably not - it's probably unrelated)
  • ziofilla day ago
    Slashdot: why opt-out rather than opt-in?

    Microsoft: it's just as shit as Microsoft 365 and SharePoint.

  • einpoklum2 days ago
    > I uploaded a photo on my phone to Microsoft's

    That's your problem right there.

    > Microsoft only lets you opt out of AI photo scanning

    Their _UI_ says they let you opt out. I wouldn't bet on that actually being the case. At the very least - a copy of your photos goes to the US government, and they do whatever they want with it.

  • exe342 days ago
    Makes me want to download and install windows, and store a picture of my hairy brown nutsack with googly eyes on it.
  • chris_wot2 days ago
    I think a call to Australia’s privacy commissioner might be in order.
    • globalnode2 days ago
      What are they gonna do? Hard to have a convo with your master when youre on your knees...
  • LogicFailsMe2 days ago
    I've never seen a better case for uploading endless AI slop photos.
  • buyucua day ago
    This is your daily reminder not to use Microsoft.
  • mkrishnan2 days ago
    fuck microsoft
    • The_Presidenta day ago
      Micro Soft Windows

      With a name like that who needs gravity.

  • pessimizer2 days ago
    Isn't it cute when there's absolutely no rationale behind a new rule, and it's simply an incursion made in order to break down a boundary?

    Look, scanning with AI is available!

    Wow, scanning with AI is now free for everyone!

    What? Scanning with AI is now opt-out?

    Why would opting-out be made time-limited?

    WTF, what's so special about 3x a year? Is it because it's the magic number?

    Ah, the setting's gone again, I guess I can relax. I guess the market wanted this great feature, or else they wouldn't have gradually forced it on us. Anyway, you're a weird techie for noticing it. What do you have to hide?

    • bigbuppo2 days ago
      There is a big rationale behind it. If their AI investments don't pan out, Microsoft will cease to exist. They've been out of ideas since the late 90s. They know that the subscription gravy train has already peaked. There is no more growth unless they fabricate new problems for which they will then force you to pay for the solution to the problem they created for you. Oh, your children were kidnapped because Microsoft sold their recognition and location data to kidnappers? Well you should have paid for Microsoft's identity protection E7 plus add-on subscription that prevents them from selling the data you did not authorize them to collect to entities that they should know better than to deal with.
      • nbngeorcjhe2 days ago
        I don't even get why they would need "ideas" or "growth" tbh. They have the most popular desktop operating system and one of the most popular office suites; surely they make plenty of profit from those. If they just focused on making their existing products not shit, they would remain a profitable company indefinitely. But instead they're enshittifying everything because they want more More MORE
        • Ekaros2 days ago
          Because there are too many people chasing an ever-rising line on the valuation chart. It is simply not acceptable anymore to have a reasonable business that generates solid dividends and grows with opening markets and population. Blame Silicon Valley, VC and the like...
          • account426 hours ago
            Usury is the root of all evil.