Meta’s AI smart glasses and data privacy concerns
svd.se
1362 points by sandbach a day ago
I do think it's completely unacceptable if Meta makes the glasses unable to be used for routine functions without (a) other humans reviewing your private content and (b) AI training on your content. There needs to be total transparency to people when this is happening - these are absolutes.
But I'm a bit confused by the article, because it describes things that seem really unlikely given how the glasses work. They shine a bright light whenever they're recording. Are people really going into bathrooms, having sex, or sharing rooms with people undressed while this light is on? Or is this deliberate tampering, a malfunction, or Meta capturing footage without activating the light? (Hard to believe even Meta would do that intentionally.)
Agreed. I'm confused trying to map what the article is saying to what's happening at a technical level. For example, obviously it's not doing on-device inference, so it's unsurprising that it won't work without a network connection, but this is totally distinct from your recordings ending up getting labeled. The article talks about being able to opt into that, which is one thing. But I don't understand whether, if you don't opt in, the data still gets sent out for labeling.
I feel like this article is either a bombshell, or totally confused.
My reading was that as soon as you enable the "AI" functionality you are opted into having your recordings labeled.
"But for the AI assistant to function, voice, text, image and sometimes video must be processed and may be shared onwards. This data processing is done automatically and cannot be turned off."
Right, that's the section I was confused by, because it was in the context of an experiment trying to use the AI features without an Internet connection, which obviously won't work. The article uses the "shared onwards" terminology to refer to at least inference. But the inference part is uninteresting to me; the data labeling is. The article doesn't really separate those out.
>> but this is totally distinct from your recordings ending up getting labeled
The distinction depends on where the data is processed, and it sounds as if the difference between using your video for labeling and privately processing it through an AI is deliberately obscured from the user by the way the terms of service are written. Once the video is uploaded, which is necessary for the basic function, it's unclear how or whether it can be separated from the streams that do go through labeling. This confusion also seems to be an intentional dark pattern.
I do believe people do all of that with the light on. And then there are also people who tamper with the device to deactivate the light. You can find guides for that online.
The funny thing about the light is that it doesn't even matter when surreptitious recording devices are trivial to make these days. You can never know when you're being recorded, even when no one is wearing glasses.
Also some people probably tape over the light for whatever reason.
my understanding is that the light is resistant to simply taping over it, and recording can't happen in this case. you have to intentionally modify the glasses to be able to surreptitiously record.
> my understanding is that the light is resistant to simply taping over it, and recording can't happen in this case.
I remember when the glasses came out and this was tested: if you tape it over before starting the recording, it refuses, but if you tape it over after starting, it will happily continue to record. I don't know if they changed it, but that is how it used to be.
Still works like that.
The glasses have, in the same hole, an LED and a small light sensor (similar to the ones monitors use for auto-brightness).
When recording starts, the glasses check whether the light sensor reads above a certain threshold; if it does, they start recording and turn on the LED.
So if you start recording and then cover the hole, it keeps recording, because the check only happens at the start. Even if they wanted to fix this by making the light sensor do a constant check, it wouldn't work, as the privacy LED indicator triggers the same sensor - a terrible design choice.
And disabling the light is as easy as taking a small drill bit to either the light sensor module or the LED. They can detect that it's been tampered with and put up a giant notice saying the privacy light is not working, but they still let you record anyway lol.
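If that description is accurate, the flaw is easy to sketch. This is purely hypothetical code to illustrate the logic; the function names and threshold are invented:

```python
AMBIENT_THRESHOLD = 0.2  # invented value: minimum light level to allow recording

def start_recording(read_sensor, led_on, record):
    """Start-only check, as described above: the sensor is read exactly once."""
    if read_sensor() < AMBIENT_THRESHOLD:
        return False   # hole covered at start -> refuse to record
    led_on()           # privacy LED turns on...
    record()           # ...and recording runs with no further sensor reads,
                       # so covering the hole now changes nothing
    return True
```

Taping the hole before pressing record makes the single read fail; taping it afterwards is never noticed, which matches the observed behavior.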
> Even if they wanted to fix this by making the light sensor do a constant check, it wouldn't work, as the privacy LED indicator triggers the same sensor
The privacy LED could just turn off for a couple of milliseconds (or less) while the light sensor performs its check.
Or just buy any of the many pages of hidden-cam devices on Amazon, which also aren't limited to 3-minute videos.
> The privacy LED could just turn off for a couple of milliseconds (or less) while the light sensor performs its check.
True, but then that would mean a blinking LED instead of a constantly-on one, which is a different product requirement from what it currently does.
Parent's point was that you could likely do it at a high enough frequency that the blinking would be imperceptible to the human eye.
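For what it's worth, that duty-cycled scheme might look something like this. Entirely hypothetical; the names, thresholds, and grace period are invented. The idea is to blank the LED only for the brief sensor read, so the LED itself can't trigger the sensor, and to tolerate momentary shadows:

```python
AMBIENT_THRESHOLD = 0.2   # invented: readings below this count as "covered"
GRACE_READS = 5           # invented: consecutive dark reads before aborting

def blanked_read(read_sensor, led_off, led_on):
    # Turn the LED off just long enough to sample ambient light, then restore it.
    led_off()
    level = read_sensor()
    led_on()
    return level

def monitor_recording(read_sensor, led_off, led_on, reads):
    """Poll the sensor with the LED blanked; abort only on a sustained dark run."""
    dark = 0
    for _ in range(reads):
        if blanked_read(read_sensor, led_off, led_on) < AMBIENT_THRESHOLD:
            dark += 1
            if dark >= GRACE_READS:
                return "abort"   # hole covered for a sustained period
        else:
            dark = 0             # a momentary shadow is forgiven
    return "ok"
```

The grace period also answers the "what if I cover it accidentally for an instant" objection: only a sustained run of dark readings stops the recording.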
I don't think the cheap light sensor would have a fast enough polling rate for that. And if you increase the polling rate, I will just put a phosphorescent sticker over the hole that absorbs the light coming out of the LED and re-emits it with enough afterglow that the photoresistor still picks up a value and allows recording.
Also, what is the implication here? If you cover the hole accidentally for one microsecond, do you invalidate the whole recording? Does it need to be covered for more than one second? Two? Ten?
All of that for what? So that in two years we can have Chinese off-brand clones for 50 dollars that offer no security mechanisms anyway?
We all need to understand that this is the new normal: you can be recorded anywhere, anytime, just like you can be punched in the street anywhere, anytime. We only act on things that can be proven in court to have caused you harm.