While the outrageous app that quite unapologetically put women’s modesty up for display has been shut down by its makers, there is no assurance that something similar won’t happen again.
It created a huge furore when the DeepNude app, which converted women’s pictures into nudes, was made available for the public to try and test. The app received severe backlash (for obvious reasons), and within hours of gaining popularity, its makers announced that they were killing it, citing server overload and potential harms (thankfully!).
The app used neural networks to turn images of clothed women into realistic-looking nudes. The software was based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017, which uses generative adversarial networks (GANs) trained on a huge dataset of images.
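To make the underlying idea concrete: in a GAN, a generator learns to produce fake samples while a discriminator learns to tell them apart from real data, and the two are trained against each other. The sketch below is a deliberately minimal toy illustration of that adversarial game on one-dimensional numbers, not the pix2pix architecture itself (which uses convolutional networks on paired images); all names and numbers here are illustrative assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Clamp to avoid math.exp overflow on extreme inputs.
    x = max(-60.0, min(60.0, x))
    return 1.0 / (1.0 + math.exp(-x))

# Toy "real" data: samples from a Gaussian with mean 4. In pix2pix the data
# would be paired images and the networks convolutional, but the adversarial
# objective being optimised is the same idea.
def real_batch(n):
    return [random.gauss(4.0, 1.0) for _ in range(n)]

# Generator: an affine map of noise z ~ N(0, 1). Discriminator: a logistic unit.
g_w, g_b = 1.0, 0.0
d_w, d_b = 0.1, 0.0
lr, n = 0.05, 64

for step in range(500):
    z = [random.gauss(0.0, 1.0) for _ in range(n)]
    fake = [g_w * zi + g_b for zi in z]
    real = real_batch(n)

    # Discriminator step: gradient ascent on log D(real) + log(1 - D(fake)).
    p_real = [sigmoid(d_w * x + d_b) for x in real]
    p_fake = [sigmoid(d_w * x + d_b) for x in fake]
    d_w += lr * (sum((1 - p) * x for p, x in zip(p_real, real))
                 - sum(p * x for p, x in zip(p_fake, fake))) / n
    d_b += lr * (sum(1 - p for p in p_real) - sum(p_fake)) / n

    # Generator step: gradient ascent on log D(fake), the non-saturating loss.
    fake = [g_w * zi + g_b for zi in z]
    p_fake = [sigmoid(d_w * x + d_b) for x in fake]
    g_w += lr * sum((1 - p) * d_w * zi for p, zi in zip(p_fake, z)) / n
    g_b += lr * sum((1 - p) * d_w for p in p_fake) / n

# After training, generated samples should cluster near the real data's mean.
gen_mean = sum(g_w * random.gauss(0, 1) + g_b for _ in range(2000)) / 2000
print(f"generator sample mean: {gen_mean:.2f}")
```

Over the training loop, the generator is pushed toward producing samples the discriminator scores as real, which is the same pressure that, at image scale with far larger networks, yields the "realistic-looking" outputs described above.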
The makers of DeepNude claimed they were fascinated by the idea of making x-ray glasses possible using GANs, and that the app was driven by fun and enthusiasm for that discovery.
The End Of A Vile Era
Announcing a complete shutdown of the DeepNude app, the makers’ page said the project had been created for users’ entertainment. “We thought we were selling a few sales every month in a controlled manner. Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request,” the page said.
They acknowledged that the probability of people misusing the app was too high and that they did not want to make money this way. They also conceded that, despite the shutdown, some copies of DeepNude would still be shared on the web, but they did not want to be the ones selling it. They permanently stopped issuing licences to activate the premium version.
“The world is not yet ready for DeepNude”, they said. And it will never be.
But Is It Gone Forever?
It is unfortunate, but what goes on the internet stays there forever. Despite the ban, numerous fake copies of the DeepNude app are available on the internet.
The Twitterati vented their anger and disgust at the very existence of such an app, with many fearing that this is just the beginning of AI misuse. Many suggested that the genie of AI is out of the bottle, and that we might need white-hat AI engineers, just as we need white-hat hackers, to deal with the situation at some level.
I’m glad DeepNude is dead. As a person and as a father, I thought this was one of the most disgusting applications of AI. To the AI Community: You have superpowers, and what you build matters. Please use your powers on worthy projects that move the world forward.
— Andrew Ng (@AndrewYNg) June 28, 2019
This is certainly only the beginning. Banning these apps from app stores is one thing, but unfortunately the technologies themselves cannot be un-invented. It is frightening how technologies such as DeepNude have brought out the worst in AI applications.
The Need For Stringent Regulation
With great power comes great responsibility, a responsibility that often goes unaddressed. There is a need for strong regulations on the ethical use of AI, and for clear guidelines on the extent of its applications. As sick as the idea is, without strong regulations and checks on its use, somebody will build something similar anyway.
With such misuse of technology, questions of morality and the ethical use of technology arise quite naturally. The power of technology should not be misused to create havoc; just as the use of nuclear weapons is stringently regulated, such applications of AI should be kept under strict watch.
If this laxity in regulation continues, it will be difficult to consider DeepNude truly dead.
A cracked version is already available, which raises alarm about where we are heading with the use of this technology.
Many others argue that ever since GANs were invented, it was clear that deep learning would enable fake imagery, and that we should have been preparing for the aftermath since then. We all need to take the initiative and be proactive in a changing world.
It is devastating to see how such apps worked only on images of women, non-consensually, and were used to attack their modesty and defame them. There is an obvious argument that such software should never be made available to the public, since anyone could find themselves a victim of revenge porn without ever having taken a nude photo.
While we cannot stop the development of certain technologies, the path to sane use of technology lies in having ethical bodies and institutions in place to keep a check on it. The DeepNude debacle has raised huge concerns and shown us how deepfake technology can be taken to a whole new level unless we have regulations to stop anyone from encouraging such vile practices.