A woman in eastern Pennsylvania allegedly created a series of deepfake videos in a harassment and bullying campaign meant to intimidate teenage girls in competition with her daughter and get them kicked off a local cheerleading team.
Hilltown Township police earlier this month charged Raffaela Spone with three counts of cyber harassment of a child after she allegedly began harassing the teenagers last July, according to Bucks County District Attorney Matthew Weintraub.
The girls received voice and text messages saying, “You should kill yourself,” followed by doctored videos taken from images on their social media profiles and altered to make them appear nude, vaping, or drinking. The altered images included captions reading, “toxic traits, revenge, dating boys, and smoking” and “was drinking at the shore, smokes pot, and uses ‘attentionwh0re69’ as a screen name.” The images and videos were also sent to coaches for the team, seemingly in an attempt to have the girls removed from the team.
On review, police found video “to be the work of a program that is or is similar to ‘Deep Fakes,’ where a still image can be mapped onto an existing video and alter the appearance of the person in the video to show the likeness of the victim’s image instead,” according to an affidavit filed in the case. Spone allegedly used several phone numbers to send the messages, but investigators determined all the numbers were obtained through the same wholesaler and executed a warrant on the wholesaler seeking IP address information about the originating calls. That information was linked to Spone’s residence, and after searching Spone’s devices, police determined her phone was “used to download, access, and or manipulate data” on the wholesaler’s app.
Weintraub said Spone faces misdemeanor charges and not higher-level charges because the “nude” images did not actually depict minors in sexual situations (real or simulated). Instead, the images were basically crude digital removals of an existing bikini to create a “Barbie doll-like” effect “with no obvious genitalia.” Had the images actually looked like nudes, Spone would have faced “a much more serious felony offense.”
Fake images, real crimes
In the way of many once-novel technologies, tools for creating deepfake video are rapidly becoming more accessible and more affordable. In 2019, for example, the maker of a program called DeepNude, which used neural networks to make images of clothed women appear to be realistic nudes, was forced to stop distributing the tool following strong public backlash. A few months later, Ars’ own Timothy Lee demonstrated how to turn Facebook CEO Mark Zuckerberg into the galaxy’s most famous android, Lt. Cmdr. Data from Star Trek: The Next Generation using neural networks.
In recent years, as the technology has become more accessible, several states have passed laws specifically addressing use of deepfakes. Virginia in 2019 became the first to adopt a law imposing criminal penalties on distribution of faked, nonconsensual, sexually themed material (i.e., revenge porn). In the same year, Texas banned deepfakes designed to influence an election. California then passed two laws, one to deal with revenge porn and one to deal with election manipulation, and several other states have continued to introduce legislation tackling one or the other.
The federal government, too, is trying to find its way on the issue. The 2020 National Defense Authorization Act (passed in December 2019) included several provisions related to studying the potential national security threat posed by deepfake content. The 2021 version included similar provisions.