Microsoft only lets you opt out of AI photo scanning 3x a year
(hardware.slashdot.org)
798 points by dmitrygr a day ago
"You can only turn off this setting 3 times a year."
Astonishing. They clearly feel their users have no choice but to accept this onerous and ridiculous requirement. As if users wouldn't realize that Microsoft had to go well out of its way to write the code that enforces this limit. All for a feature that provides dubious benefit to me. I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?
Privacy legislation is clearly lacking. This type of action should bring the hammer down swiftly and soundly upon these gross and inappropriate corporate decision makers. Microsoft has needed that hammer blow for quite some time now. This should make that obvious. I guess I'll hold my breath while I see how Congress responds.
> Why is Microsoft so eager to also be able to know this?
A database of pretty much all Western citizens' faces? That's a massive sales opportunity for all oppressive and wanna-be oppressive governments. Also, ads.
Combine face recognition on personal photos with age checks that include photos, and you can link stuff directly to Microsoft/Google accounts for ads.
It's hilarious that they actually say that right on the settings screen. I wonder why they picked 3 instead of 2 or 4. Like, some product manager actually sat down and thought about just how ridiculous they could be and have it still be acceptable.
My guess is the number was arbitrary, and the limit exists because toggling triggers a mass scan of photos. Depending on whether they purge old data when the feature is turned off, flipping the switch could tell Microsoft's servers to re-scan every photo in your (possibly very large) library.
Odd choice and poor optics (just limit the number of times you can enable it and add a warning screen), but I wouldn't assume this was intentional bad faith.
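For what it's worth, the enforcement is trivial either way. Here's a minimal sketch in Python (all names are my own invention; nothing here reflects Microsoft's actual implementation) showing that gating disables rather than enables is a one-line choice:

    # Hypothetical per-year toggle limit. Which method the counter guards
    # is a one-line choice, which is why gating "off" reads as deliberate.
    from datetime import datetime, timezone

    MAX_OPT_OUTS_PER_YEAR = 3

    class ScanConsent:
        def __init__(self) -> None:
            self.enabled = True
            self._opt_out_years: list[int] = []  # years in which user opted out

        def disable_scanning(self) -> None:
            year = datetime.now(timezone.utc).year
            if self._opt_out_years.count(year) >= MAX_OPT_OUTS_PER_YEAR:
                raise PermissionError("You can only turn off this setting 3 times a year.")
            self._opt_out_years.append(year)
            self.enabled = False

        def enable_scanning(self) -> None:
            self.enabled = True  # re-enabling is never gated in this design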
I would be sceptical too, if I were still using Windows.
I’ve seen reports in the past that people found that syncing to the cloud was turned back on automatically after installing Windows updates.
I would not be surprised if Microsoft accidentally flipped the setting back on for people who opted out of AI photo scanning.
And if you can only turn it back off three times a year, it only takes Microsoft messing up and opting you back in three times against your will before you are stuck opted in to AI scanning for the rest of the year.
Like you said, they should be limiting the number of times it can be turned back on, not the number of times it can be turned off.
Yep. I have clients who operate under HIPAA rules who called me out of the blue wondering where their documents had gone. Microsoft left a cheery note on the desktop saying they had very helpfully uploaded ALL of their protected patient health data into an unauthorized cloud storage account, without prior warning, following a Windows 10 update.
When I used to work as a technician at a medical school circa 2008, updating OS versions was a huge deal that required months of preparation and lots of employee training to ensure things like this didn't happen.
Not trying to say that you could have prevented this; I would not be surprised if Windows 10 Enterprise decided to "helpfully" turn on auto-updates and updated itself with its fun new "features" on the next restart.
OneDrive is HIPAA-, IRS-740-, and FIPS-compliant, for this reason. It's an allowed store for all sorts of regulated data, so they don't have to care about compliance risk.
I'm not sure the next Joint Commission audit will be totally cool with them randomly starting to store files in the cloud with zero policy/anything around the change.
If they are worried about the cost of initial ingestion then a gate on enabling would make a whole lot more sense than a gate on disabling.
Microsoft crossed that line many years ago with their constant re-enabling, without consent, of all the various anti-privacy settings during upgrades.
3 is the smallest odd prime number. 3 is a HOLY number. It symbolizes divine perfection, completeness, and unity in many religions: the Holy Trinity in Christianity, the Trimurti in Hinduism, the Three Pure Ones in Taoism (and half a dozen others).
I'd rather guess they picked 3 as a passive-aggressive attempt to provide a false pretense of choice, in a "you can change it, but in the end it's gonna be our way" style, than think they attributed some cultural significance to the number 3. But that's still an interesting concept, though.
> I'd rather guess they picked 3 as a passive-aggressive attempt to provide a false pretense of choice, in a "you can change it, but in the end it's gonna be our way" style
This was exactly my thought as well.
The number seems likely to be a deal that could be altered upward someday for those willing to rise above the minimal baseline tier.
Right now it doesn't say if these are supposed to be three different "seasons" of the year that you are able to opt-out, or three different "windows of opportunity".
Or maybe it means your allocation is limited to three non-surveillance requests per year. Which should be enough for average users. People aren't so big on privacy any more anyway.
Now would these be on a calendar year basis, or maybe one year after first implementation?
And what about rolling over from one year to another?
Or is it use it or lose it?
Enquiring minds want to know ;)
Manager: "Three is the number thou shalt permit, and the number of the permitting shall be -- three."
Actually, most users probably don't understand that this ridiculous policy took extra effort to implement. They just blindly follow whatever MS prescribes and have long given up on making any sense of the digital world.
most people probably won't know MS is doing this at all until their data is leaked
Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.
My assumption is that when this feature is on and you turn it off, they end up deleting the tags (since you've revoked permission for them to tag them). If it gets turned back on again, I assume that means they need to rescan them. So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.
Their disclaimer already suggests they don't train on your photos.
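If that reading is right, the cost asymmetry might look roughly like this (pure speculation on my part; the purge-on-revoke behavior and every name here are assumptions, not documented behavior):

    # Hypothetical sketch of the rescan-cost theory: derived data is purged
    # when consent is revoked, so every re-enable is O(library size).
    from dataclasses import dataclass, field

    def extract_faces(data: bytes) -> list[str]:
        return ["face-embedding"]  # stand-in for expensive GPU inference

    @dataclass
    class PhotoLibrary:
        photos: dict[str, bytes]  # photo_id -> image data, possibly huge
        face_index: dict[str, list[str]] = field(default_factory=dict)

        def revoke_consent(self) -> None:
            self.face_index.clear()  # permission revoked: delete all tags

        def grant_consent(self) -> None:
            # Nothing survives the purge, so this rescans the whole library.
            for photo_id, data in self.photos.items():
                self.face_index[photo_id] = extract_faces(data)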
This is Microsoft. They have a proven record of turning these toggles back on automatically without your consent.
So you can opt out of them taking all of your most private moments and putting them into a data set that will be leaked, but you can only opt out 3 times. What are the odds a "bug" (feature) turns it on 4 times? Anything less than 100% is an underestimate.
And what does a disclaimer mean, legally speaking? They won't face any consequences when they use it for training purposes. They'll simply deny that they do it. When it's revealed that they did it, they'll say sorry, that wasn't intentional. When it's revealed to be intentional, they'll say it's good for you so be quiet.
There’s dark pattern psychology at play here. You are very likely to forget to do something that you can only do three times a year.
The good news is that the power of this effect is lost when significant attention is placed on it, as it is in this case.
A bug, or a dialog box that says "Windows has reviewed your photo settings and found possible issues. Press Accept now to reset settings to secure defaults"
This is how my parents get Binged a few times per year
This feels different though. Every time you turn it off and then on again, it has a substantial processing cost for MS. If MS "accidentally" turns it on and then doesn't allow you to turn it off, it raises the bar for them successfully defending these actions in court.
So to me it looks like MS is trying to prevent users from hammering MS's infrastructure with repeated, expensive full scans of their libraries. I would have worded it differently and said "you can only turn ON this setting 4 times a year". But maybe they do want to leave the door open to "accidentally" pushing a wrong setting to users.
As stated many times elsewhere here: if that were the case, it'd be an opt-in limit. Instead it's an opt-out limit from a company that has a proven record of forcing users into an agreement against their will and requiring an opt-out (that often doesn't work) after the fact.
Nobody really believes the fiction that heavy processing costs are why they limit opt-outs.
> it'd be an opt-in limit
Aren't these 2 different topics? MS and big tech in general make things opt-out so they can touch the data before users get the chance to disable it. I expect they would impose a limit on how many times you go through the scanning process. I've run into this with various other services where there were limits on how many times I could toggle such settings.
But I'm also having a hard time giving MS the benefit of the doubt, given their history. They could have said, as GP suggested, that you can't turn it "on", not "off".
> As stated many times elsewhere here .... Nobody really believes the fiction
Not really fair, though: wisdom of the crowd is not evidence. I tend to agree with the general MS sentiment, but stating it with confidence, without any extra facts, isn't contributing to the conversation.
A lot of people have a terabyte or more of OneDrive storage. Many people have gigantic photo collections.
Analyzing and tagging photos is not free. Many people don't mind their photos actually being tagged, but they are a little more sensitive about facial recognition being used.
That's probably why they separate these out, so you can get normal tagging if you want without facial recognition grouping.
https://support.microsoft.com/en-us/office/group-photos-by-p...
If you have a large list of scenarios where Microsoft didn't respect privacy settings or toggles, I would be interested in seeing them.
I know there have been cases where software made automated changes to Windows settings that were intended to be changed only by the user. Default browsers were one issue, because malicious software could replace your default browser even with lower permissions.
Are you talking about things like that, or something else?
If that's the case, limit opt ins so Microsoft doesn't have to pointlessly scan data. But they're limiting opt outs, which forces people into that endless scanning of their data.
Nobody. Absolutely nobody. Believes it's to save poor little Microsoft from having their very limited resources wasted by cackling super villain power users who'll force Microsoft to scan their massive 1.5 GB meme image collections several times.
If it was about privacy, as you claim in another comment, it would be opt-in. Microsoft clearly doesn't care about user privacy, as they've repeatedly demonstrated. And making it opt-out, and only three times, proves it. Repeating the same thing parent comments said is a weird strategy. Nobody believes it.
> Analyzing and tagging photos is not free
Then why are they doing it? Maybe because the CIA/NSA and advertisers pay good money.
Because many people want it, expect it and value it.
Most moms and old folks aren't going to fuss about or understand privacy and technical considerations; they just want to search for things like "greenhouse" and find that old photo of the greenhouse they set up in the backyard 13 years ago.
It's one thing if all of your photos are local and you run a model to process your entire collection locally, then you upload your own pre-tagged photos. Many people now only have their photos on their phones and the processing doesn't generally happen on the phone for battery reasons. You CAN use smaller object detection/tagging models on phones, but a cloud model will be much smarter at it.
They understand some of this is a touchy subject, which is why they have these privacy options and have limitations on how they'll process or use the data.
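The local-first flow mentioned a couple of sentences up is perfectly doable today. A rough sketch, where local_tag_model and upload are hypothetical stand-ins for an on-device classifier and whatever sync API you happen to use:

    # Hypothetical local-first flow: tag photos on your own hardware, then
    # upload only the already-tagged results. Both helpers are stand-ins.
    from pathlib import Path

    def local_tag_model(image_path: Path) -> list[str]:
        return ["greenhouse", "backyard"]  # stand-in for an on-device classifier

    def upload(image_path: Path, tags: list[str]) -> None:
        print(f"uploading {image_path.name} with tags {tags}")  # stand-in sync call

    for photo in Path("~/Pictures").expanduser().glob("*.jpg"):
        tags = local_tag_model(photo)  # inference never leaves the machine
        upload(photo, tags)            # only pre-tagged photos go to the cloud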