A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.
Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.
The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.
Maybe it is just me, but it's why I think this is a bigger issue than just Hollywood.
The rights to famous people’s “images” are bought and sold all the time.
I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.
The fact that making a law like this isn't a priority means this will get worse, because we already have a society and laws that don't respect our right to control our own image.
A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.
There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”
Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed
Yeah, I'm not stipulating a law where you can't be held accountable for your actions. Any actions you take as an individual affect your image, which you control. Using photographic evidence to prove you took those actions is not a misuse of your image.
Making fake images whole cloth is.
The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don’t have enough time for right this moment.
That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?
In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn't feel dystopian at all, to be honest. I'd rather have it that way than end up on someone's stupid vlog or whatever.