Paris prosecutors raid France offices of Elon Musk's X
bbc.com
161 points by vikaveri 13 hours ago
Finally, someone is taking action against the CSAM machine operating seemingly without penalty.
I am not a fan of Grok, but there has been zero evidence of it creating CSAM. For why, see https://www.iwf.org.uk/about-us/
CSAM does not have a universal definition. In Sweden, for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to give her a bikini, or make her topless, then you are most definitely producing and possessing CSAM.
No abuse of a real minor is needed.
[flagged]
He made no judgement in his comment; he just observed that the term CSAM, in at least the specified jurisdiction, applies to generated pictures of teenagers, whether or not real people were subjected to harm.
I suspect none of us are lawyers with enough knowledge of French law to know the specifics of this case.
> CSAM does not have a universal definition.
Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning.
> In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response.
No corroboration found on the web. Quite the contrary, in fact:
"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"
https://rm.coe.int/factsheet-sweden-the-protection-of-childr...
> If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to give her a bikini, or make her topless, then you are most definitely producing and possessing CSAM.
> No abuse of a real minor is needed.
Even the Google "AI" knows better than that. CSAM "is considered a record of a crime, emphasizing that its existence represents the abuse of a child."
Putting a bikini on a photo of a child may be distasteful abuse of a photo, but it is not abuse of a child under any current law.
" Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning. "
Are you from Sweden? Why do you think the definition was clear across the world and unchanged "before AI"? Or is it some US-defaultism, where Americans assume their definition is universal?
> Are you from Sweden?
No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk.
> Why do you think the definition was clear across the world and not changed "before AI"?
I didn't say it was clear. I said there was no disagreement.
And I said that because I saw only agreement. CSAM == child sexual abuse material == a record of child sexual abuse.
"No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk."
So you can't speak Swedish, yet you think you grasped the Swedish legal definition?
" I didn't say it was clear. I said there was no disagreement. "
Sorry, there are lots of different judicial definitions of CSAM in different countries, each with different edge cases and different ways of handling them. I very much doubt there is no disagreement.
But my guess, from your post, is that an American is once again learning that there is a world outside the US with different rules and different languages.
> So you can't speak Swedish, yet you think you grasped the Swedish legal definition?
I guess you didn't read the doc. It is in English.
I still doubt there's material disagreement between judicial definitions. The dubious definitions I'm referring to are the non-judicial fabrications behind accusations such as the one at the root of this subthread.
" I too doubt there's material disagreement between judicial definitions. "
Sources? Sorry, your gut feeling does not matter, especially if you are not a lawyer.