Barely a day after being released, a deepfake app that harnessed AI to generate fake nude photos of women was shut down. Known as DeepNude, the app was taken down by its own dev team, whose leader said in a tweet that they “greatly underestimated” the interest in the project and concluded that “the probability that people will misuse it is too high”.

On the team’s Twitter account, @deepnudeapp shared a brief account of how DeepNude came to be. They claim to have created the app “for user’s entertainment” some months ago and “were selling a few sales [sic] every month in a controlled manner.” The team leader goes on to admit that “honestly, the app is not that great, it only works with particular photos.”

Disturbingly, the app only worked on women; if used on a man’s picture, it would place female genitalia on the subject. It also “worked best” on scantily clothed subjects, such as those wearing bikinis or swimsuits. The “free” version stamped large watermarks on its output to indicate it was a fake nude photo, and a trained eye could see that the flesh rendered by the AI was blurry and pixelated, further giving the fake away. The paid version, “DeepNude Premium,” used smaller watermarks and could export the fake photos at a higher resolution.

The team decided to remove the app, and those who upgraded to the Premium version will have their money refunded. DeepNude will no longer be available, and no updated versions will be offered; the team warned that anyone who shares the software will be violating its terms of service, and cautioned that some copies may still be circulating and distributed illegally.

Although the app was intended “for entertainment purposes” only, it carried risks its creators apparently failed to foresee. Because the “fake nude” watermark on DeepNude Premium output is smaller and easy to overlook, untrained viewers are more likely to be convinced that the nude photo is genuine. If shared and viewed even briefly, photos like these can cause irreparable damage to a person’s reputation or be used as a means to harass or blackmail.

The application of deepfake AI in this manner is not new: redditors building on AI research published by academics have already created their own fake celebrity pornography. And, unfortunately, some women in Australia have been struggling with a similar and more sinister application of deepfake nudes in the form of “revenge porn”, or as their legal system now categorizes it, “non-consensual pornography” or image-based sexual abuse (IBSA). Since last year, Australian women have been victimized by having photos stolen from their social media accounts, digitally superimposed onto pornographic images or videos, and then spread across multiple websites.

But are the perpetrators of this kind of abuse legally accountable? Unfortunately, Australia has yet to update its laws in this area, and protection for victims remains blurry and inconsistent. US federal law likewise still has loopholes that need to be addressed before citizens are protected from this form of abuse and privacy violation.

A screenshot of the once-available DeepNude app’s upgrade landing page.
Despite its steep price, interest in the app was massive.


Sample result from the DeepNude app. Censorship bars added by The Verge.
Note the blurry, low-resolution “flesh” on the fake nude photo and the large “Fake Nude” watermark labels.

The implications for marketers may not be obvious, but the malicious use of apps like DeepNude, and the apparent lack of ethical standards among the startups that create them, are a huge concern for ordinary people. Applications like these make less tech-savvy consumers even less trusting of AI, and more suspicious of any company that asks for their personal information or personal images, even for the most “harmless” of apps or purposes. Brands and marketers should be aware of the laws regarding privacy and how they handle customers’ data. Moreover, marketers are advised to steer clear of any effort that “nudifies” consumers, especially women, even if done for “entertainment purposes” or in a “tongue-in-cheek” manner. As DeepNude and Australia’s deepfake “revenge porn” problem prove, there will always be groups or individuals who will use deepfake AI to do serious harm. And you thought an AI that made terrible “marketing blog posts” was the worst people could come up with.

For more details on this story, read it here:

What do you think about apps like DeepNude? Should they be allowed in circulation, or should there be laws restricting them? As marketers, what is your opinion on deepfake AI? Let us know in the comments!
