Nudify / Undress AI
Part of a series on AI / Artificial Intelligence.
This entry contains content that may be considered sensitive to some viewers.
Overview
Nudify / Undress AI refers to several artificial intelligence programs (sometimes called "deepnudes" in reference to deepfakes) that started being advertised to internet users in 2023, predominantly on X / Twitter and Reddit. The apps claim to use AI technology to convert any photo, typically of clothed women, into AI-generated nude images. In December 2023, social media analysis company Graphika released a report about the rise in social media advertisements featuring such apps, leading to widespread discussions, viral debates and media coverage about the ethics of creating nonconsensual sexually explicit images.
Background
Evidence of such non-consensual AI apps being promoted can be traced to Reddit and 4chan posts, with some early examples appearing as far back as May 7th, 2023, when a post asking for AI "deepnude" suggestions gathered over 1,000 upvotes on /r/deeplearning[1] in seven months.
The trend continued on Reddit into August 2023, as seen in a request by Redditor /u/Ok_Umpire412 on the /r/ArtificialInteligence sub for "websites" that make "deepnude pics" with the "best quality of results." The post[2] (shown below) gathered over 40 upvotes in three months, along with replies that ranged from suggestions for such websites to admonishments.
Online Reactions
In late 2023, the growing prevalence of such AI-powered undressing apps, and awareness of them, led to increased public scrutiny and media coverage as they drew controversy from users across various platforms. For example, on November 19th, 2023, X[3] user @ThatSamWinkler quoted the CEO of X (formerly Twitter) to share an example of an ad he received for an "Undress" AI app, marketed as a way to create non-consensual nude images of women. The post gathered over 85,000 likes in three weeks (seen below, left).
On November 20th, X[6] user @JennyENicholson then quoted the post from @ThatSamWinkler to comment on what they perceived as absurd dialogue depicted in the advertisement, gathering over 19,000 likes in three weeks (seen below, right).
Developments
On December 8th, 2023, social media analysis company Graphika released a report titled "A Revealing Picture." Written by Santiago Lakatos, the report delves into the mass proliferation of ads promoting such AI programs, classifying their output as "synthetic non-consensual intimate imagery," or NCII. The report detailed how such programs have evolved from being discussed on niche internet forums to "an automated and scaled online business that leverages a myriad of resources to monetize and market its services."[4]
Later on December 8th, the Times reported on Graphika's paper, citing greatly increased usage of such "undressing" AI programs in 2023 and noting that there are presently no federal laws banning the creation of deepfake sexually explicit material. One notable prosecution of a deepfake creator involved a North Carolina child psychiatrist who was sentenced to 40 years in prison for morphing images of his minor patients.[5]
Search Interest
Unavailable.
External References
[1] Reddit – /r/deeplearning
[2] Reddit – /r/ArtificialInteligence
[3] X – ThatSamWinkler
[4] Graphika – A Revealing Picture
[5] Times – Nudify Apps Undress Women
[6] X – JennyENicholson