The Age Verification Trap: Verifying age undermines everyone's data protection
spectrum.ieee.org
1076 points by oldnetguy 9 hours ago
We'll try everything, it seems, other than holding parents accountable for what their children consume.
In the United States, you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using, yet somehow, the same social responsibility seems thrown out the window for parents and the web.
Yes, children are clever - I was one once. If you want to actually protect children and not create the surveillance state nightmare scenario we all know is going to happen (using protecting children as the guise, which is ironic, because often these systems are completely ineffective at doing so anyway) - then give parents strong monitoring and restriction tools and empower them to protect their children. They are in a much better and informed position to do so than a creepy surveillance nanny state.
That is, after all, the primary responsibility of a parent to begin with.
As a parent, I think you’re understating how difficult it is to provide a specific amount of internet access (and no more) to a motivated kid. Kids research and trade parental control exploits, and schools issue devices with weak controls whether parents like it or not. I’m way at the extreme end of trying to control access (other than parents who don’t allow any device usage at all) and it has been one loophole after another.
I know this is weird, but I'm in some ways not really sure who is on the side of freedom here. I get your position, but like. The whole idea of the promise of the internet has been destroyed by newsfeeds and mega-corps.
There are almost literally documented examples of Facebook executives twirling their mustaches, wondering how they can get kids more addicted. This isn't a few bands with swear words; in fact, I think the damage these social media companies are doing is reducing the very independence of teens and kids that parents originally feared losing.
I dunno, are you uncertain about your case at all? I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.
> I just like, can't help but start with fuck these companies. All other arguments are downstream of that.
The solution would then be to break them up or do things like require adversarial interoperability, rather than ineffective non-sequiturs like requiring them to ID everyone.
The perverse incentive comes from a single company sitting on a network effect. You have to use Facebook because other people use Facebook, so if the algorithm shows you trash and rage bait you can't unilaterally decide to leave without abandoning everyone still there, and the Facebook company gets to show ads to everyone who uses it and therefore wants to maximize everyone's time wasted on Facebook, so the algorithm shows you trash and rage bait.
Now suppose they're not allowed to restrict third party user agents. You get a messaging app and it can send messages to people on Facebook, Twitter, SMS, etc. all in the same interface. It can download the things in "your feed" and then put it in a different order, or filter things out, and again show content from multiple services in the same interface, including RSS. And then that user agent can do things like filter out adult content, if you want it to.
We need to fix the actual problem, which is that the hosting service shouldn't be in control of the user interface to the service.
> ineffective non-sequiturs like requiring them to ID everyone.
Is that really a non-sequitur though? Cigarettes are harmful and addictive so their sale is age gated. So too for alcohol. Gambling? Also yes. So wouldn't age gating social media be entirely consistent in that case?
Not that I'm necessarily in favor of it. I agree that various other regulations, particularly interoperability, would likely address at least some of the underlying concerns. But then I think it might not be such a bad idea to have all of the above rather than one or the other.
Indeed, "interoperability" is what would hurt social media giants the most. Cory Doctorow recently gave an excellent talk where he noted that back in the early 00s, Facebook (and others) used interoperability to offer services that let users interact with, push to, and pull from MySpace (the big dog back then), siphoning off their users and content. But once Facebook became the dominant player, they moved to make the exact tactics they had used (interoperability and automation) illegal. Talk about regulatory capture...
>There is almost literally documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted.
Then close their business. Age verification just makes their crimes even more annoying.
> I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.
Wild times when we're seeing highest voted Hacker News commenters call for the nanny state.
If you're thinking these regulations will be limited to singular companies or platforms you don't use, there is no reason to believe that's true.
There was already outrage on Hacker News when Discord voluntarily introduced limited ID checks for certain features. The invitations to bring on the nanny state reverse course very quickly when people realize those regulations might impact the sites they use, too.
A lot of the comments I'm seeing assume that only Facebook or other platforms will be impacted, but there's no way that would be the case.
OK, here's another one.
How about taking all these websites that require PII onto their own members-only domain?
This actually should have been in place and well fleshed-out before Google & Microsoft started pushing their "account" nonsense.
> ... start with fuck these companies. Better the nanny state than Nanny Zuck.
I'm not sure how those two positions connect.
Execs bad, so laws requiring giving those execs everyone's IDs, instead of laws against twirled mustaches?
These are just bad arguments all around, including the gov't with this upload-ID crap. Why aren't we making the internet 18+? The only irrefutable answers I get are just downvotes, which is OK I guess; it sort of validates my point, because there's no reason for kids to get unrestricted internet access, and downvotes are easy.
>Better the nanny state than Nanny Zuck.
For me this is a crux, at least in principle. Once online media is so centralized... the argument from freedom is diminished.
There are differences between national government power and international oligopoly but... even that is starting to get complicated.
That said... This still leaves the problem in practice. We get decrees that age-restriction is mandatory. There will be bad compliance implementations. Privacy implications.
Meanwhile... how much will we actually gain when it comes to child protection?
You can come up with all sorts of examples proving "Facebook bad", but that doesn't mean these things get fixed when/if regulation actually comes into play.
Social media is like tobacco. We went after tobacco for targeting kids, we should do the same to social media. Highly engineered addictive content is not unlike what was done to cigarettes.
Yes, go after Facebook and their kind only, avoid collateral damage to the remaining regular old internet.
No, it isn't. Tobacco is a physical substance that alters users' biochemistry and creates a physical dependence. Social media is information conveyed via a computing device. You can criticize social media for what it is in its own right, without having to engage in these kinds of disingenuous equivocations.
Sounds like you need to read up on dopamine and addictions a bit more.
Gambling isn't about introducing a substance into the user's system; it makes use of brain chemicals that are already there.
Social media companies engineered every addictive mechanism from gambling into their products to alter users' brain chemistry and reactions.
You're blurring the lines a bit. Gambling isn't inherently an addiction. Just like a good TV show isn't inherently addictive either. Social media trying to be more engaging shouldn't really be viewed as an evil action anymore than HBO trying to create compelling content is.
The problem with comparing social media use to tobacco is that they are completely different. It's like saying weed is just like heroin because they both make you feel good. It's reductive and not productive.
The completely anti-social-media stance ignores the good parts of social media. People can connect from across the planet and find others who share the same views or experiences. People who are marginalized can find community where none may exist in their local area. So we should approach this in a more careful and grounded way.
You can make the point that social media has real positive benefits as well as negatives without minimizing the well-proven fact that gambling creates a form of addiction in a significant proportion (though not all) of its users, one every bit as devastating as heroin or alcohol.
Nothing is inherently an addiction. You can smoke a cigarette without it being an addiction.
You’re right, it’s actually worse than tobacco. Tobacco simply makes your body sick, but social media attacks the most vital part of us. Even the CDC has studied this: https://www.cdc.gov/mmwr/volumes/73/su/su7304a3.htm
the mechanisms by which that information is being conveyed have been shown to be addictive as well, no?
Comparing tobacco to social media is like comparing me to LeBron James. I'd rather have my kid smoke a pack a day than have social media accounts.
Those execs were also using the tactics to addict adults, and while they may have targeted teens, the problem is, at its core: humans. So no amount of nannying by either the company nor the government will solve this issue.
Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?
Exactly. The same people that should be responsible for giving them unfettered access to an internet that is no longer safe. Even adults have to be wary of getting hooked on scrolling, and while I agree that the onus is on the companies, it has been demonstrated over and over again that they will not be held to account for their behavior.
So the only logical choice left that actually preserves freedom is for parents to get off their asses and keep their children safe. Parents who don't use filtering and monitoring software with their children should be charged with neglect. They would be for sending a kid into the cold without a coat, or letting them go hungry; why is it different when sending them onto the internet?
And to your last point: You are dead wrong. No government anywhere in the world has demonstrated that they have the resources, expertise, or technical knowledge to solve this problem. The most famously successful attempt is the Chinese Great Firewall, which is breached routinely by folks. As soon as a government controls what speech you are allowed to consume, the next logical step for them is to restrict what speech you can say, because waging war on what people access will always fail. I mean, Facebook alone already contains tons of content that's against its terms of service, and they have more money than God, so either they actually want that content there, or they are too understaffed to deal with the volume, and the volume problem only ever increases.
So in my view, you are the one against freedom by advocating for the government to control the speech adults can access for the sake of "protecting the children" when the actual people that are socially, morally, and legally culpable for that protection are derelict in their duties.
> Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?
The government literally actively prevents people selling all these things to children, rather than permit a free for all and then expect parents to take responsibility for steering their kids away from them.
Meta for one has proven terminally irresponsible at acceptable stewardship.
Maybe it's about time that the proven predatory companies be restricted to something like their own adults-only internet cafes where age can be checked at the door.
They had their chance with the open internet and they blew it.
> Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?
I mean, historically speaking, we blamed the tobacco companies.
Did we? I know they lost some court cases, had to adjust advertising and so on, but was any tobacco company actually held accountable for the harm they caused? The answer is no because they all still exist and are profitable entities. Corporations that cause the harm they did should be subject to dissolution.
Also, if they were genuinely responsible, why can a child's parents be held accountable for them developing an addiction? The company was responsible, not the parent... do you see how ignorant that sounds?
The de jure minimum age to purchase tobacco is 21 now in the US, so I guess anyone caught selling tobacco products to those under that age could be held responsible as well.
They are held responsible by paying a fine to the government or losing their tobacco license, which is better than nothing, but doesn't actually fix the harm they caused already for the kid that's now hooked.
Screw over Meta then. Not everybody else.
Meta is the bozo in a panel van with no windows. All the legit porn sites put up big blinking neon signs.
I actually run an adults-only community site and you are correct. I have it in a popup that appears on every "fresh" visit to the site, it's in the giant bold print you agree to when you register, and on the technical end, I send every possible header and other signal to let filtering software know it's an adults-only space. If a child is accessing that site, they are doing so because their parent didn't even attempt to prevent them. And now I'm having to look into ID verification services that are going to quintuple the costs of hosting this free community, at a time when community is more important than ever.
> Better the nanny state than Nanny Zuck.
This is a huge self own. I can't believe I'm reading this on a website called "hacker news".
while i'm sympathetic to your position, the truth is that /that/ is where this site is now.
> I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.
How about we reject all institutional nannies?
It is much easier to implement user-controlled on-device settings than any sort of over-the-Internet verification scheme. Parents purchase their children's devices and can adjust those settings before giving it to their kids. This is the crux of the problem, and all other arguments are downstream of this.
The problem is that the internet is nowadays used for democratic purposes. Once you introduce a globally unique personal ID, you will be monitored. And boy, will you be monitored thoroughly. If any democratic process ever needs to be undertaken against a government in the future, that very government will take the tools of identification and knock on the doors of the people trying to raise awareness, or maybe mutiny. And this is what Orwell wrote about.
> Better the nanny state than Nanny Zuck.
why-not-both.jpg
Maximizing corporate freedom leads inevitably to corporate capture of government.
Opposing either government concentration of power alone or corporate concentration of power alone is doomed to failure. Only by opposing both is there any hope of achieving either.
Applying that principle to age-verification, which I think is inevitable: Prefer privacy-preserving decoupled age-verification services, where the service validates minimum age and presents a cryptographic token to the entity requiring age validation. Ideally, discourage entities from collecting hard identification by holding them accountable for data breaches; or since that's politically infeasible, model the service on PCI with fines for poor security.
The motivation for this regime is to prevent distribution services from holding identification data, reducing the information held by any single entity.
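To make the token idea above concrete, here is a minimal sketch of a decoupled age token. It is a hypothetical illustration, not any real scheme: a production system would use public-key signatures (e.g. Ed25519) so sites only need the verifier's public key; HMAC with a shared secret stands in here purely to keep the sketch stdlib-only. Note the token carries only an age group and an expiry, never an identity.

```python
import hmac, hashlib, json, base64, time

# Assumption: this key is provisioned out of band between verifier and site.
# In a real deployment this would be a public/private signature key pair.
SECRET = b"demo-verifier-key"

def issue_token(over_18: bool, ttl_s: int = 3600) -> str:
    """Verifier side: attest the age *group* only -- no name, no ID number."""
    claims = {"over_18": over_18, "exp": int(time.time()) + ttl_s}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    tag = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{tag}"

def check_token(token: str) -> bool:
    """Site side: accept or reject without ever learning who the user is."""
    body, tag = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False  # forged or tampered token
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims["over_18"] and claims["exp"] > time.time()
```

The key property is in the claims dict: the site verifying the token learns one bit (over 18 or not) plus a freshness window, and nothing that links back to a government ID.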
> Prefer privacy-preserving decoupled age-verification services, where the service validates minimum age and presents a cryptographic token to the entity requiring age validation.
This is the wrong implementation.
You require sites hosting adult content to send a header indicating what kind of content it is. Then the device can do what it wants with that information. A parent can then configure their child's device not to display it, without needing anybody to have an ID or expecting every government and lowest bidder to be able to implement the associated security correctly.
It doesn't matter what kind of cryptography you invent. They either won't use it to begin with or will shamelessly and with no accountability violate the invariants taken as hard requirements in your theoretical proof. If you have to show your ID to the lowest bidder, you're pwned, so use the system that doesn't have that.
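The header-labeling approach described above can be sketched in a few lines. The `Rating` header carrying the long-standing RTA label is a real convention adult sites already use; the `should_block` policy function and its `child_mode` flag are illustrative assumptions about what a device-side filter would do.

```python
# The RTA (Restricted To Adults) label, sent by sites via an HTTP
# header or meta tag to self-identify adult content.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def should_block(response_headers: dict, child_mode: bool) -> bool:
    """Device-side policy: only a device configured for a child refuses
    labeled content. Adults' browsing is untouched, and no ID is ever
    presented to anyone."""
    labeled_adult = RTA_LABEL in response_headers.get("Rating", "")
    return child_mode and labeled_adult
```

The point of the design is where the decision lives: the site declares what it is, and the parent-configured device decides what to do about it, so neither the government nor the lowest-bidder verification vendor ever sees an ID.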
This solves some problems, such as children accessing porn sites (oh the horror). But it doesn't solve other problems, such as predators accessing children's spaces. YouTube Kids is purportedly a safe, limited place for kids, and yet there are numerous disturbing videos that get past the automated censors. Pedophiles stalk places like Roblox.
Your proposed architecture also achieves the goal of discouraging content-distributing entities from holding hard identification data, so it sounds good to me.
> Better the nanny state than Nanny Zuck.
The state can imprison you. Zuck can't.
> I know this is weird, but I'm in some ways not really sure who is on the side of freedom here. I get your position, but like.
No one. You’ll see a few politicians and more individuals stuck to their principles, but anyone with major clout sees the writing on the wall and is simply working to entrench their power.
> Better the nanny state than Nanny Zuck.
Indeed, what lolberts usually fail to understand is that it's not a choice between government and "freedom"; it's a choice between the current government and whoever fills the power vacuum the government leaves behind.
You sound like someone that would work for the CIA or FBI if they offered you a job. Those are the types of people that I cannot and will not ever trust. I do not respect your opinion.
Centralization and standardization are going to be the topic in the 21st century.
For all the complaining some U.S.-Americans seem to do about the EU approach to these issues, things like the Digital Markets Act aim to fix exactly these types of issues.
>I get your position ... There is almost literally documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted. This isn't a few bands
Their position was to compare it to alcohol, guns, and tobacco, not bands using naughty words. Alcohol and tobacco definitely enter mustache swirling territory, getting children addicted and funding misinformation on the harms of their product.
> I know this is weird, but I'm in some ways not really sure who is on the side of freedom here.
That’s because “freedom” is complicated and doesn’t precisely map to the interests of any of the major actors. Its largely a war between parties seeking control for different elites for different purposes.
Yes, seeking more control for themselves and completely at the expense of everybody else's loss.
> documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted
If you genuinely believe that this is about those moustache twirling executives, then I have a bridge to sell you.
Have you ever wondered why and how these systems are being implemented? Why Discord / Twitch / what have you, and why now? Have you ever thought that this might be happening because of Nepal and the fears of another Arab Spring?
https://www.aljazeera.com/news/2025/9/15/more-egalitarian-ho...
I think too many people on this platform don't understand what this is about. This is about power. It's not about what's good for you or the children. Or for the constituents. It's about power. Real power. Karp-ian "scare enemies and on occasion kill them" power.
There are many ways in which such a system could be implemented. They could have asked people to use a credit card. Adult entertainment services have been using this as a way to do tacit age verification for a very long time now. Or, they could have made a new zero-knowledge proof system. Or, ideally, they could have told the authorities to get bent. †
Tech is hardly the first industry to face significant (justifiable or unjustifiable) government backlash. I am hesitant to use them as examples as they're a net harm, whereas this is about preventing a societal net harm, but the fossil fuel and tobacco industries fought their governments for decades and straight up changed the political system to suit them. ††
FAANG are richer than they ever were. Even Discord can raise more and deploy more capital than most of the tobacco industry at the time. It's also a righteous cause. A cause most people can get behind (see: privacy as a selling point for Apple and the backlash to Ring). But they're not fighting this. They're leaning into it.
Let's take a look at what Discord asked people for a second, the face scan,
If you choose Facial Age Estimation, you’ll be prompted to record a short video selfie of your face. The Facial Age Estimation technology runs entirely on your device in real time when you are performing the verification. That means that facial scans never leave your device, and Discord and vendors never receive it. We only get your age group.
Their specific ask is to try and get depth data by moving the phone back and forth. This is not just "take a selfie" – they're getting the user to move the device laterally to extract facial structure. The "face scan" (how is that defined??) never leaves the device, but that doesn't mean the biometric data isn't extracted and sent to their third-party supplier, k-ID. There was an article that went viral for spoofing this, https://age-verifier.kibty.town/ // https://news.ycombinator.com/item?id=46982421 . In the article, the author found by examining the API response that the system was sending,
k-id, the age verification provider discord uses doesn't store or send your face to the server. instead, it sends a bunch of metadata about your face and general process details.
The author assumes that "this [approach] is good for your privacy." It's not. If you give me the depth data for a face, you've given me the fingerprint for that face. We're anthropomorphising machines. A machine doesn't need pictures; "a bunch of metadata" will do just fine.
We are assuming that the surveillance state will require humans sitting in a shadow-y room going over pictures and videos. It won't. You can just use a bunch of vectors and a large multi-modal model instead. Servers are cheap and never need to eat or sleep.
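A toy illustration of why "just metadata" is still a biometric: a short vector derived from a face (an embedding, depth measurements, geometry ratios) can be matched against stored vectors by nearest-neighbour search. All numbers and names below are made up for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical enrolled vectors, e.g. retained from earlier scans.
enrolled = {
    "user_123": [0.61, 0.12, 0.88, 0.05],
    "user_456": [0.10, 0.95, 0.20, 0.70],
}
fresh_scan = [0.60, 0.13, 0.87, 0.06]  # today's "just metadata"

def identify(scan, database, threshold=0.99):
    """Return the enrolled identity whose vector best matches the scan,
    or None if nothing matches closely enough."""
    best = max(database, key=lambda uid: cosine(scan, database[uid]))
    return best if cosine(scan, database[best]) >= threshold else None
```

No image ever crosses the wire in this sketch, yet the server re-identifies the person; that is exactly the gap between "we never receive your face" and "we never receive anything identifying."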
Certain firms are already doing this for the US Gov, https://x.com/vxunderground/status/2024188446214963351 / https://xcancel.com/vxunderground/status/2024188446214963351
We can assume de facto that Discord is also doing profiling along vectors (presumably behavioral and demographic features) which that author described as,
after some trial and error, we narrowed the checked part to the prediction arrays, which are outputs, primaryOutputs and raws.
turns out, both outputs and primaryOutputs are generated from raws. basically, the raw numbers are mapped to age outputs, and then the outliers get removed with z-score (once for primaryOutputs and twice for outputs).
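A minimal reconstruction of the pipeline the author describes (raw model numbers mapped to age estimates, then outliers dropped by z-score, once for primaryOutputs and twice for outputs). The sample values and the 2.5 threshold are assumptions for illustration; only the shape of the pipeline comes from the quote.

```python
import statistics

def trim_outliers(values, z_max=2.5):
    """Drop values more than z_max population standard deviations
    from the mean (one z-score pass)."""
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    if sd == 0:
        return list(values)
    return [v for v in values if abs(v - mean) / sd <= z_max]

# Hypothetical per-frame age estimates with one obvious outlier.
raws = [24.1, 25.3, 23.8, 24.6, 24.9, 25.1, 23.5, 24.4, 24.8, 60.0]
primary_outputs = trim_outliers(raws)           # one pass, per the quote
outputs = trim_outliers(trim_outliers(raws))    # two passes, per the quote
```

On this data the single pass already removes the 60.0 estimate, and the second pass leaves the remaining nine values untouched; the final age group would be read off the trimmed list.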
Discord plugs into games and allows people to share what they're doing with their friends. For example, Discord can automatically share which song a user is listening to on Spotify with their friends (who can join in), the game they're playing, whether they're streaming on Twitch, etc. In general, Discord seems to have fairly reliable data about the other applications the user is running. Discord also has data about your voice and now your face.
Is some or all of this data being turned into features that are being fed to this third-party k-ID? https://www.k-id.com/
https://www.forbes.com/sites/mattgardner1/2024/06/25/k-id-cl...
https://www.techinasia.com/a16z-lightspeed-bet-singapore-par...
k-ID is (at first glance) extracting fairly similar data from Snapchat, Twitch etc. With ID documents added into the mix, this certainly seems like a very interesting global profiling dataset backstopped with government documentation as ground truth.
I'm sure that's totally unrelated. :)
-
† like they already have for algorithmic social media and profiling, https://www.newyorker.com/magazine/2024/10/14/silicon-valley...
Somehow there's tens to hundreds of millions available for crypto causes and algorithmic social media crusades, but there's none for the "existential threat" of age verification.
†† Once again, this is old hat. See also: Turbotax, https://www.propublica.org/article/inside-turbotax-20-year-f...
I don't get your point, at least not in relation to the GP post. I agree with GP that parents need to be more accountable. We as parents (and we should all be concerned about future children/generations) should be demanding more regulation to help force the change we need on this topic. We as a society need to treat SM like those other addictive product classes. The fact that SM is addictive and execs try to juice it more is, frankly, to be expected.
Vilify them all you want, but the same has been done with nicotine products, alcohol products, etc., and to GP's point, we treat SM as a toy for our children to play with. We chose to change the rules (laws, regulations, etc.) because capitalists can never simply be trusted to do what's best for anything except their bottom line. That's a fundamental law no different than inertia or gravity in a capitalistic society. That's why regulators exist. Until you regulate it, they will wear their villain badge and rake in the billions. It's easy to be disliked when the topic of your disdain is what makes you filthy rich (in other words, they don't care what you or I think of what they're doing).
Nobody is putting a gun to your head and forcing you to use Facebook or whatever other site. I quit using most social media over a decade ago. If you don't want to use it, or you don't want your children to use it, then don't use it.
Oh yeah? Where's that guy who couldn't get a job 6 months ago because he refused to use LinkedIn?
> give parents strong monitoring and restriction tools
The problem is that it's bloody hard to actually do this. I'm in a war with my 7yo about youtube; the terms of engagement are, I can block it however I want from the network side, and if he can get around it, he can watch.
Well, after many successful months of DNS block, he discovered proxies. After blocking enough of those to dissuade him, he discovered Firefox DNS-over-HTTPS, making it basically impossible to block him without blocking every Cloudflare IP or something. Would love to be wrong about that, but it seems like even just blocking a site is basically impossible without putting nanny-ware right on his machine; and that's only a bootable Linux USB stick away from being removed unless I lock down the BIOS and all that, and at that point it's not his computer and the rules of engagement have been voided.
For now I'm just using "policy" to stop him, but IMO the tools that parents have are weak unless you just want your kid to be an iPad user and never learn how a computer works at all.
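For what it's worth, the DNS-over-HTTPS loophole is partly addressable at the resolver. Firefox's default-on DoH rollout honors a documented "canary" domain, and you can also break the bootstrap lookups for the well-known public resolvers. A sketch for unbound (assuming unbound is your LAN resolver; the hostnames listed are the common public DoH endpoints):

```
# unbound.conf (sketch)
server:
    # Mozilla's documented canary domain: returning NXDOMAIN here turns
    # off Firefox's *default-on* DoH. Note it does NOT override DoH that
    # was explicitly enabled in the browser settings.
    local-zone: "use-application-dns.net." always_nxdomain
    # Break the bootstrap lookups for well-known DoH resolvers, so even
    # explicitly-enabled DoH fails unless an IP address is hardcoded.
    local-zone: "mozilla.cloudflare-dns.com." always_nxdomain
    local-zone: "dns.google." always_nxdomain
    local-zone: "dns.quad9.net." always_nxdomain
```

Hardcoded resolver IPs still get through, which is where firewall rules or on-device controls come back into the picture, consistent with the point that network-side blocking alone is weak against a determined kid.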
As a parent of young children, this is your entire problem:
> the terms of engagement are, I can block it however I want from the network side, and if he can get around it, he can watch.
You're treating this as a technical problem, not a parental rules problem. Your own rules say he's allowed to watch!
You have to set the expectations and enforce it as a parent.
Sounds like a smart kid, is part of you secretly proud of him for his tenacity?
Is it impractical to keep an eye on what he's doing on his computer, i.e. physically checking in on him from time to time?
How about holding him responsible for his own behavior, to develop respect for the rules you impose? Is it just hopeless, and if so how come? Is it impossible for him to understand why you don't want him watching certain content or why he should care about being worthy of your trust?
I'm not judging here, I'm genuinely curious.
I remember when I was a kid that age: there were rules, and some were technically enforced. But if you found a way around the technical enforcement, you were in huge trouble. The equivalent here would have been: if you used a proxy to watch what you weren't meant to, then you lose all screen time indefinitely. Sneaking around parents' rules was absolutely not on.
Putting controls on the machine you want to restrict is pretty normal. While I agree with your first sentence that it's hard for parents to get proper monitoring tools, the rest of this sounds like a self-imposed problem. If you don't want to mess with the actual machine then run a proxy it has to use.
I am a bit confused by that comment. Are parents socially responsible for preventing companies from selling alcohol/guns/cigarettes to minors? If a company set up shop in a school and sold those things to minors during school breaks, who has the social responsibility to stop that?
when I was a kid in the early 90's, my state (and many others) banned cigarette vending machines since there was no way to prevent them being used by minors, unless they were inside a bar, where minors were already not allowed.
The problem is, doing the analogous action with the entire internet is a privacy nightmare. You didn't have to tell 7-11 every item you bought at every store in the past 2 years and opt-in to telling them what other stores you go to for the next 5.
There is no digital equivalent of "flash an ID card and be done with it" in the surveillance state era of the internet. Using a CC is the closest we have and even then you're giving data away.
The analogous action is to only require age-restricted sites (or parts of sites) to check ID, not the entire Internet. e.g. no one is calling for mathisfun.com to check ID. I'd expect most parts of the web are child-friendly and would not be affected. Just like how almost all locations in physical space don't need to check ID.
Additionally, the laws I've read mandate that no data be retained, so you have stronger legal protections than typical credit card use, or even giving your ID to a store clerk for age restricted purchases (many stores will scan it without asking, and in some states scanning is required).
This might have the benefit of reversing the trend where everything on the internet got rolled into social media. If social media is age restricted, news, announcements, etc. will have to break out to dedicated websites if they want to be accessible by all ages.
just ban kids from the internet already. if a parent allows the kid to have a full function smartphone and the kids get caught with it then throw the parents in jail and kids in an orphanage. people will catch on.
You do not need to control the entire internet. Put time limits on connected devices. Use parental controls. Talk to your kids about what they do online. Set clear boundaries. Reward good behaviour. Talk to other parents to align these limits to avoid social issues among the kids.
We may be agreeing. I'm saying there is no battle-tested, privacy-safe technical method of verifying age online, and thus the controls need to be in the physical environment and in setting social standards for social media and phone use.
Parents can't easily prevent their kids from going to those kinds of stores once they're at the age where the parent doesn't need to keep an eye on them all the time and they can travel about on their own.
The difference though is that parents are generally the ones to give their kids their phones and devices. These devices could send headers to websites saying "I'm a kid" -- but this system doesn't exist, and parents apparently don't use existing parental controls properly or at all.
> These devices could send headers to websites saying "I'm a kid" -- but this system doesn't exist
And there would be ways to work around it. If people find that privacy-preserving age verification is not good enough because "some kids will work around it", then nothing is good enough, period. Some will always work around anything.
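To make the "kid header" idea above concrete: a minimal sketch of what server-side gating on such a signal might look like. The header name `Sec-Child` is invented here for illustration; no such standard exists today, which is exactly the parent comment's point.

```python
def is_minor(headers: dict) -> bool:
    """Check a hypothetical child-signal header set by a parent-configured
    device. "Sec-Child" is a made-up name; no such standard exists, and a
    determined kid could strip or spoof the header entirely."""
    return headers.get("Sec-Child", "").strip() == "1"

def serve(headers: dict) -> str:
    # Server-side gating: age-restricted content is withheld when the
    # device declares a minor, without the site learning who the user is.
    if is_minor(headers):
        return "403: age-restricted content hidden"
    return "200: full content"
```

The privacy appeal is that enforcement lives on the device the parent configured, not in an ID-upload flow; the weakness, as noted above, is that anything client-declared can be worked around.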
if a parent gives a kid a full on smartphone, charge the parent with child abuse just like feeding the kid alcohol, cigarettes or having sex with them. people will catch on.
Or people who aren't parents are yet again sharing strong opinions that are not based in reality. Plenty of parental controls are deployed, how long they last against a determined child is the real question. Here's a concrete example for you. Spotify has a web browser built in so that you can watch music videos, kids have figured out a way to use that to watch any video on YouTube--a 12 year old told me this. If you search on this subject you'll quickly learn this is well known and is generally being ignored by Spotify. Why not allow parents to disable the in-app web browser / video function?
It's not as easy as you may believe to prevent that type of access.
So what’s the alternative? Pretend we don’t live in a digitally connected society and set our kids up for failure when they get one years after their peers?
Let's assume for the sake of argument that social media is extremely harmful to children. Which means the answer to your question is "yes, obviously". If people were running around giving their kids fentanyl, you wouldn't say "but my kid's friends all use fentanyl and he'll be an outcast if he can't". You would say "any friends that he loses over this are well worth avoiding the damage". Why would it be different just because it's social media?
Phones, I mean. Sorry for the confusion there. I’m for holding off on social media.
Keeping your kids off social media is setting them up for success.
I’m talking about phones specifically. Agree re: social media.
The problem isn't with phones. We should have robust parental controls and the responsibility of parenting should be left to, wait for it... the parents.
I think the argument is more around it being illegal so as to not be forced into playing "the bad guy". It's hard to prevent a level of entitlement and resentment if those less well parented have full access. If nobody is allowed then there's no parental friction at all.
It's unfortunate that the application of this rule is being performed at the software level via ad-hoc age verification as opposed to the device level (e.g. smartphones themselves). However, that might require the rigmarole of the state forcibly confiscating smartphones from minors, or risk worrying Nepal-style outcomes.
I'm saying hold parents accountable for their children's online behavior and for their protection online, not companies (who want to profit off the kids, a perverse incentive) or governments (who can barely be trusted to do this even if it were their only goal). For example, if your kid starts making revenge CP of their classmates, and the parent could reasonably have mitigated or known about it, I think the parent absolutely should be held responsible.
Don't punish the rest of the web for crappy parenting and crappy incentives by companies/govts.
If we want parents to be accountable, then these platforms need to provide better tools to enable parents to do so. It is impossible to monitor the entirety of your child's behavior online through any of these platforms today. They are their own person, they make their own choices, and those choices are heavily influenced by a world the parents have increasingly less influence over, especially as they grow older.
On the flip side, I do think we should also hold companies more accountable for this. We collectively prevented companies from advertising tobacco to minors through regulation with a pretty massive success rate. These companies know how harmful social media can be on youth, and there is little to no effective regulation around how children learn about these platforms and get enticed into them.
I do not disagree with any of this, I was hoping it was implied by my original comment that this would be necessary.
> I'm saying hold parent's accountable for their children's online behavior and for their protection online
You're saying the status quo, and I think it's fair to state you wouldn't intentionally design the status quo. Unless we have some wizard wheeze where we can easily arrest, detain, or otherwise effectively punish parents without further reducing the quality of life for their children.
But it's not playing the bad guy. It's playing the good guy.
In the abstract, yes, but in the social context of the home you have to be the bad guy. While good parents manage that, the bar is too high for society in general.
ISPs and OSes should be the ones providing these tools, and they should make it stupid easy to set up a child's account and have a walled garden for kids to use.
I live in the UK. By default your ISP will block "mature" content and you have to contact them to opt out. iOS, Android, Playstation, Xbox, Switch all have parental controls that are enforced at an account level.
A child with an iPhone, Xbox, and a Windows Laptop won't be able to install discord unless the parent explicitly lets them, or opts out of all the parental controls those platforms have to offer.
The tech is here already, this is not about keeping children safe.
You have to be very tech savvy to know that your kid asking to install Discord to talk to/play games with their friend group is as dangerous as it is.
A single google search will tell you pretty unanimously that discord isn’t for kids, is rated 13+ and has risks of talking to strangers.
Parts of discord are not safe at all for 13 year olds and currently there isn't a mechanism as far as I am aware to restrict a 13 year old from accessing them.
No, it's about corporate and government control. Thankfully, the UK government is clueless about tech, which means these controls can be bypassed relatively easily by using your own DNS or a public DNS server like Quad9.
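For context on why DNS-based blocking is so thin: the ISP's resolver simply refuses to answer for blocked domains, so pointing a machine at any other resolver defeats it. A sketch of a hand-built DNS query aimed at Quad9, following the RFC 1035 wire format (stdlib only):

```python
import struct

def build_dns_query(domain: str, txid: int = 0x1234) -> bytes:
    """Build a minimal RFC 1035 DNS query for an A record.

    Sending this over UDP to any public resolver (e.g. Quad9 at 9.9.9.9:53)
    sidesteps an ISP resolver's block list entirely -- the ISP's own
    resolver is never asked the question it was configured to refuse.
    """
    # Header: transaction id, flags (RD=1), 1 question, 0 answers/auth/extra.
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels, terminated by a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.split(".")
    ) + b"\x00"
    # QTYPE=A (1), QCLASS=IN (1).
    return header + qname + struct.pack(">HH", 1, 1)

if __name__ == "__main__":
    import socket
    query = build_dns_query("example.com")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3)
    sock.sendto(query, ("9.9.9.9", 53))  # Quad9, not the ISP resolver
    reply, _ = sock.recvfrom(512)
    print(f"got {len(reply)}-byte answer, txid match: {reply[:2] == query[:2]}")
```

In practice you'd just change the DNS server in the OS network settings, but the point stands: the filtering only exists as long as the blocked resolver is the one being asked.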
The corporations in this case are fighting against this. This is about your government and its desire to squash opinions they don't like. They are already going so far as to jail people for posting opinions they don't like. This has absolutely nothing to do with children, children are just the excuse.
There's a law going through in some state that wants to do this, but it also puts the onus on OS developers to detect age-aligned behavior. How do you do this with Linux? It would kill the open computer and kill ownership over computing.
Why would it be a problem to do this sort of thing with linux? Linux allows for oauth, proxied networking, what have you -- unless they're using some super-secret-unpublished-protocol, linux will be fine
I'm against these age-verification laws, but to say it's impossible to comply with open-source software isn't really true.
Mark Zuckerberg advocates for this, most people entrenched in this argument think it's worse. But I'm all for burning it to the ground so.
You must not have kids if you think it's easy to keep children off things that are bad for them.
[Any] task is much easier if you have the tools. Do/did you have a baby monitor? A technological tool, that allows you to "monitor" the baby while not being within an arms reach.
Do you have an A+++++ oven with three panes of glass? It's [relatively] safe to touch and instead of monitoring if a child is somewhere near the oven you have to monitor if the child does not actively open the oven. That's much easier.
It's really not some Herculean task to do so either, though.
Maybe you don't have kids of your own. Once you have 2 or 3, it is quite challenging to manage everything, especially over time.
Especially if they are older, like 8+ years old. They are resourceful, sneaky and relentless.
Which is exactly why all people everywhere giving up their privacy will also be ineffective.
Drugs, alcohol, cigarettes, pornography were all illegal for me to access as a kid but I wouldn’t have had any trouble getting any of it.