

Child Sexual Abuse Material is abhorrent because children were literally abused to create it.
AI generated content, though disgusting, is not even remotely on the same level.
The moral panic around AI that leads people to imply the two are equivalent is absurd.
Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.
Don’t dilute the horror of the production of CSAM by equating it to fake pictures.
What’s the follow-on effect of making generated images illegal?
Do you want your freedom to hinge on a jury answering “How old is this image of a person (who doesn’t exist)?” or “Is this fake person TOO child-like?”
We can take it as a given that you won’t be able to tell.
So the real question is:
Who are you trying to arrest and imprison, and how are you going to write that distinction into law so that innocent people are not harmed by the justice system?
To me, the evil people are the ones harming actual children. Trying to blur the line between them and people who generate images is a morally confused position.
There’s a clear distinction between the two groups: one of them is harming actual people.