(Zero Hedge) The creator of an app which used neural networks to algorithmically undress women has taken his project offline after the publication of a Motherboard article resulted in a massive feminist backlash.
Created by an anonymous programmer known as “Alberto,” the software, called DeepNude, uses generative adversarial networks (GANs) trained on a database of thousands of images of women to digitally strip the clothing from a photo of any woman.
“When Motherboard tried using an image of a man, it replaced his pants with a vulva.”
The software is based on pix2pix, an open-source algorithm created at UC Berkeley in 2017, which works by training a network on a dataset of paired images, similar to the methods used to create deepfake videos and to generate traffic scenarios for training self-driving cars.
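The core idea behind pix2pix is a conditional GAN: the generator is rewarded both for fooling a discriminator and for staying close to the paired target image via an L1 penalty. The following is a toy NumPy sketch of that generator objective, not DeepNude's or pix2pix's actual code; the function name, the dummy discriminator logits, and the default weight `lam=100.0` are illustrative assumptions.

```python
import numpy as np

def pix2pix_generator_loss(disc_fake_logits, fake_img, target_img, lam=100.0):
    """Toy version of the pix2pix generator objective: fool the
    discriminator (cross-entropy against the 'real' label) plus an
    L1 term pulling the output toward the paired target image."""
    # Binary cross-entropy with target label 1 ("real")
    probs = 1.0 / (1.0 + np.exp(-disc_fake_logits))
    gan_loss = -np.mean(np.log(probs + 1e-12))
    # L1 reconstruction term against the paired ground-truth image
    l1_loss = np.mean(np.abs(fake_img - target_img))
    return gan_loss + lam * l1_loss

# A perfect reconstruction with a fully fooled discriminator
# drives the loss close to zero.
fake = target = np.zeros((4, 4, 3))
loss = pix2pix_generator_loss(np.full(8, 10.0), fake, target)
```

In the real model both terms are backpropagated through a convolutional generator; the L1 weight is what keeps the output anchored to the input photo rather than drifting to an arbitrary plausible image.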
On June 23, DeepNude launched as a website, allowing people to download a Windows or Linux version of the application. In the free version of the app, the output images are covered in a large watermark, while users who pay $50 receive a version with no watermark over the body; instead, a stamp in the upper-left corner says “FAKE” in big red letters.
Motherboard tested it on more than a dozen images of women and men, in varying states of dress—fully clothed to string bikinis—and a variety of skin tones. The results vary dramatically, but when fed a well lit, high resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be, angles of the breasts beneath the clothing, nipples, and shadows. –Motherboard
“This is absolutely terrifying,” said Katelyn Bowden, CEO of revenge porn activism organization Badass. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
This is an “invasion of sexual privacy,” Danielle Citron, a professor of law at the University of Maryland Carey School of Law who recently testified to Congress about the deepfake threat, told Motherboard.
“Yes, it isn’t your actual vagina, but… others think that they are seeing you naked,” she said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.” –Motherboard
“The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it,” said Alberto. “All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future.”
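Alberto's description amounts to a staged pipeline: detect clothing, mask it out, infer what lies beneath, and composite a final image. The sketch below mimics that control flow with trivial NumPy stand-ins; none of this is DeepNude's actual code, and in the real app each stage would be a separately trained neural network rather than these toy array operations.

```python
import numpy as np

def locate_clothes(img):
    # Toy "detector": flag bright pixels as clothing
    return img > 0.5

def mask_clothes(img, mask):
    # Blank out the detected region
    out = img.copy()
    out[mask] = 0.0
    return out

def speculate_anatomy(masked, mask):
    # Naive stand-in for the inpainting network:
    # fill the masked region with the mean of the untouched pixels
    out = masked.copy()
    out[mask] = masked[~mask].mean()
    return out

def render(img):
    # Final compositing step: clamp to a valid intensity range
    return np.clip(img, 0.0, 1.0)

def pipeline(img):
    mask = locate_clothes(img)
    return render(speculate_anatomy(mask_clothes(img, mask), mask))

img = np.array([[0.2, 0.9],
                [0.8, 0.1]])
out = pipeline(img)
```

The point of splitting the work this way, as Alberto notes, is that each network has a narrow task; the cost is that the stages run sequentially, which is why a full pass took around 30 seconds on an ordinary computer.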
Alberto told Motherboard that he was inspired to create DeepNude by the advertisements for X-ray glasses found in the backs of vintage magazines. The logo for his project is a man wearing said glasses.
In a Thursday Twitter post, Alberto announced that due to the possibility of misuse, he would be pulling the software down, adding “The world is not yet ready for DeepNude.”
— deepnudeapp (@deepnudeapp) June 27, 2019
Surely this is a sign of things to come.