Study shows AI image-generators being trained on explicit photos of children
Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built.
It’s still abhorrent, but if AI-generated images prevent an actual child from being abused…
It’s a nuanced topic for sure.
We need to better understand what causes pedophilic tendencies, so that the environmental, social and genetic factors can someday be removed.
Otherwise, children will always be at risk from people with perverse intentions, whether or not those people are responsible for having them.
I’m not a psychologist, but maybe giving them child robots on which they can act out their fetishes would be better than having an actual child suffer. What is still unclear to me is whether providing that release might actually make some pedophiles more willing to seek out actual children for their fantasies.
IMO, answering such questions (should victimless alternatives be sought for people with these tendencies?) might make the world safer and more enjoyable for children, parents, and pedos alike.