One in eight American teenagers under the age of 18 personally know someone who’s had an AI-generated pornographic deepfake made of them, according to a new study from the nonprofit Thorn. And one in 17 teens has been directly victimized by AI deepfakes.
“How many kids are in the average high school classroom? That’s one [deepfake] in every classroom,” Melissa Stroebel, the head of research at Thorn, which focuses on fighting the spread of child sexual abuse material online, told Forbes. “It means every community is experiencing it at this point, and it’s time to show up and as a society, respond to that.”
Thorn surveyed 1,200 people between the ages of 13 and 20, including over 700 teens between the ages of 13 and 17, to better understand the threat of deepfake nudes.
As artificial intelligence tools have grown more sophisticated and simpler to use, so too has the prevalence of AI-generated nudes among teens. It is trivially easy to create and share such content, which disproportionately affects teen girls and women.
The survey was conducted online in late September and early October 2024, and was weighted by age, gender, race and geography. Of the respondents, 48% were male and 48% female, with the remainder identifying as a “gender minority.” About one fifth of the respondents identified as LGBTQ+.
The study builds on a Thorn survey released last year of 1,040 minors which found that about one in 10 kids between the ages of 9 and 17 had heard of cases where their peers used AI to generate nudes of other kids.
The last year has seen a surge in cases of deepfake pornographic images circulating in schools. In one case at Westfield High School in New Jersey in October 2023, Forbes reported that, even by March 2024, the boys who created and distributed the images had faced little to no consequences. In another instance, at Lancaster Country Day School in Pennsylvania in December 2024, two boys were accused of creating deepfake images of nearly half of the girls at the school, and now face criminal charges. Their parents did not respond to Forbes’ request for comment at the time.
Dorota Mani, a New Jersey woman whose daughter was a victim of deepfake porn in late 2023, told Forbes she’s not surprised that this scourge is so prevalent. (Mani’s daughter, Francesca Mani, was not part of the survey.)
“I wouldn’t expect anything else unless we do something about it,” Dorota Mani said. Mani added that she is planning to meet with officials on Monday in Washington, D.C., including Sen. Ted Cruz (R-TX), Rep. Maria Salazar (R-FL) and First Lady Melania Trump regarding the “Take It Down Act,” which passed the Senate earlier this month.
This bill, if enacted, would criminalize the publication of non-consensual intimate images, including deepfakes, and would require platforms to remove them within 48 hours. Platforms that fail to do so could face monetary fines brought by the Federal Trade Commission.
“Now more than ever, it’s crucial to educate [kids] about the consequences and the involvement of schools in the conversation,” Mani said.