Study shows AI image-generators being trained on explicit photos of children::Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built
All of our "protect the children" legislation is typically about inhibiting technology that might be used to cause harm, not about ensuring children have places of safety, adequate food and comfort, time with and access to their parents, and the freedom to live and play.
Y’know, all those things that help make kids resilient to bullies and to the challenges of growing up. Once again, we leave our kids cold and hungry in poverty while blaming the next new thing for their misery.
So I call shenanigans. Again.
It’s still abhorrent, but if AI generated images prevent an actual child from being abused…
It’s a nuanced topic for sure.
We need to better understand what causes pedophilic tendencies, so that the environmental, social and genetic factors can someday be removed.
Otherwise, children will always be at risk from people with perverse intentions, whether or not those people are responsible for having them.
I’m not a psychologist, but maybe having child robots on which they can act out their fetishes might also be better than having an actual child suffer. What’s still unclear to me is whether providing that release might actually make some pedophiles more willing to seek out actual children for their fantasies.
IMO, answering such questions (should victimless alternatives be provided for people with these tendencies?) might make the world more enjoyable and safe for children, parents, and pedos alike.
CC BY-NC-SA 4.0