With the rise of AI, deepfake porn has become a major issue in recent years. In fact, just last year, over 21,000 deepfake pornographic videos were posted online. That’s a 460% increase from the previous year.
Unfortunately, many people fail to realize that there are real victims behind those numbers. CBS News spoke to one of them, 15-year-old Elliston Berry. A classmate took a photo from Elliston’s Instagram account and ran it through an AI program to create a deepfake nude video of her.
Combating the Deepfake Porn Issue
“She came into our bedroom crying, just going, ‘Mom, you won’t believe what just happened,’” her mom, Anna McAdams, told CBS News.
“I had PSAT testing and I had volleyball games,” Elliston added. “And the last thing I need to focus and worry about is fake nudes of mine going around the school. Those images were up and floating around Snapchat for nine months.”
In 2020, deepfake expert Henry Ajder identified this growing threat when he discovered a bot on Telegram that could “undress” photos and generate explicit images. Elliston is just one of many victims on the receiving end of this brutal form of abuse.
“This case is not about tech,” Chief Deputy City Attorney Yvonne Mere told CBS News. “It’s not about AI. It’s sexual abuse.”
Thankfully, officials are taking necessary legal action against sites that generate this type of dangerous, exploitative content. According to CBS News, the San Francisco City Attorney’s office is suing the owners of 16 such websites.
Furthermore, Sen. Ted Cruz has recently promoted the Take It Down Act, which would require social media platforms to remove any nonconsensual, sexually explicit AI-generated content and deepfakes.
“It puts a legal obligation on any tech platform—you must take it down and take it down immediately,” Cruz said.
As they should.
“Combating the scourge of revenge and deepfake exploitative sexual material online is an issue that cuts across partisan lines. Today’s unanimous committee vote in support of the TAKE IT DOWN Act is a crucial step toward protecting and empowering victims of this heinous crime,” Cruz stated.
“As bad actors continue to exploit newer technologies like generative artificial intelligence to victimize women and girls across the country, it is vital that Congress provide victims with pathways for the removal of and prosecution of the publication of these abusive images. The full House and Senate should swiftly pass this legislation so that it reaches the President’s desk before more victims are left without recourse.”
The post Teen Victim of Deepfake Porn Pushes for ‘Take It Down Act’ appeared first on VICE.