TikTok will not introduce end-to-end encryption, saying it makes users less safe

bbc.com

396 points by 1659447091 21 hours ago


Traster - 13 hours ago

I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app" - as long as there are relatively good alternative apps that do offer privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice.

Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.

In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource the policing of CSAM on their platform. If you're going to embrace 'privacy', I do think it's on you to also put additional resources into tackling the downsides of that.

xeckr - 18 hours ago

Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.

ThoAppelsin - 16 hours ago

DMs are akin to private conversations in real life. Thus, every DM feature should entail E2EE.

It’s ok for a platform to not feature private conversations. They should just have no DM feature at all, then; make all messages publicly visible.

Private conversations are indeed not for all ages. Parents should be able to grant access to that on an individual basis.

ranyume - 18 hours ago

This might be off-topic, but it's on-topic for child safety: I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that nowadays most widely used online services already ask for your age and act accordingly: Twitter, YouTube, Google in general, any online marketplace. They already have so much data on their users and optimize their algorithms for those groups in an opaque way.

So yeah, age verification should be taken down, as well as the datamining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but not about what companies serve them and what they do with their data.