Following a June 26 article published by VICE, an app that alters pictures of women to show them fully naked was taken down. Far from marking the end of tech-born threats to women’s bodily autonomy, rapidly advancing deepfake technology—in which AI is used to doctor and distort media—is targeting women like never before.

The app, DeepNude, used AI technology to synthesize thousands of images of naked women so as to produce an image of a nude, female body that could be realistically grafted onto any uploaded photo of a woman. DeepNude did not produce naked male images; when users uploaded photos of men, it still added breasts and a vulva.


The free version of the app pasted a "fake" sticker on the generated image to prevent distribution. The $50 premium version of the app allowed users to access their image with only a small watermark in the corner indicating its inauthentic nature. VICE criticized the ineffective anti-circulation measure, writing: “Cropping out the "fake" stamp or removing it with Photoshop would be very easy.”

So, a quick run-down. This app allowed anyone with $50 in their pocket and internet access to transform an image of a woman—whether she be in a bikini or full-on ski gear—into a nude photograph without her consent. The watermark designating the picture as fake could be easily removed. This app only worked on women, not men.

Gone are the days when women had to strip down and do the deed themselves to risk a nude photo leak; now any old picture will do. Opening the door to blackmail, defamation, and plain old violation, DeepNude sold ownership over the bodies of unsuspecting women for half the price of an American Girl Doll.

Following the publication of the VICE article, the app was overrun with traffic, leading its creators to discontinue it with a Twitter statement published last Thursday. "Despite the safety measures adopted (watermarks) if 500,000 people use it,” the statement reads, “the probability that people will misuse it is too high.” Though I can’t imagine a situation in which an app that divests a woman of control over her own body by creating nude pictures of her without consent could be used correctly, at least this app got taken down. That’s a happier ending than this type of story usually gets.

Recently, deepfakes have been discussed increasingly within the context of politics. A manipulated video showing House Speaker Nancy Pelosi apparently drunk, for instance, made the rounds online this past May with help from publicity by the Trump administration itself. Concern about the use of deepfakes to spread false information, particularly in light of the 2020 election, has seized the spotlight.

While these threats are only beginning to emerge, however, deepfakes have from their very inception been weaponized against women.

Deepfakes first cropped up in 2017 when a Reddit user by the same name began producing and posting fake porn videos that put the faces of celebrities onto the bodies of adult performers. According to an article reported by VICE, this popular account launched a dangerous era of media manipulation in which editing software is cheap, accessible, and easy for anyone to use—not just trained professionals.

An investigation published by HuffPost delves into the universalization of deepfake technology, exposing, amongst other things, the existence of “deepfake porn forums,” where men can pay to have any woman they choose become the star of a doctored porn video. A video might cost between $15 and $40. For reference, two ink cartridges for a printer cost about $45.


These videos are popular and spread rampantly across pornography websites—sometimes naming the seemingly featured women in the video description, according to HuffPost. And once they’re up, they’re almost impossible to get down. Given that those who commission, make, share, and post the videos generally remain anonymous as they violently breach the privacy of unsuspecting women, most victims have nowhere to turn when a deepfake video of themselves surfaces online. And when they do manage to get in touch with a distributor, not only are women’s requests for the removal of these videos met with total disregard, but as the video is shared across platforms, it becomes a hydra—growing three new heads every time one gets cut down.

Women, furthermore, have no legal recourse should this happen to them, were they even to know against whom they should be taking legal action. Revenge porn laws, according to VICE, do not usually apply to deepfakes, given that it is not the woman’s own body being shown. A bill titled the “DEEP FAKES Accountability Act” was introduced before the House of Representatives on June 12 by Congresswoman Yvette Clarke in the hopes of codifying stricter law relating to altered media. The bill includes provisions such as a requirement for an audio, visual, and written watermark on doctored media, as well as criminal penalties for creating deepfakes without clearly indicating the false status of the manipulated media.

DeepNude might be down, but women are having control of their bodies ripped away from them without their knowledge across innumerable platforms that not only continue to function, but thrive. As deepfake technology advances, the threat only grows larger, making it easier for more and more people to make better and better content, ruining lives and reputations in the process.

The DeepNude farewell message ended with the statement “The world is not yet ready for DeepNude.” It never fucking should be.

Top photo via Unsplash.


Noa Wollstein is an editorial intern at BUST. She is currently a student at Princeton University working towards a B.A. in English, Film, and Journalism. 
