
DeepNude


Added 4 years ago by Philipp • Updated about a month ago by Don

This entry contains content that may be considered sensitive to some viewers.

Category: Site
Status: Confirmed
Year: 2019
Origin:
Region:
Type: Application
Tags: deep fakes, deepnude, neural network, app, x ray, alberto, pix2pix

About

DeepNude was a deepfake application that allowed users to alter photos of a person to make them appear nude, and was designed to work only on images of women. Several days after its June 23rd, 2019, release and rapid climb to popularity, the app was permanently shut down over concerns of potential misuse, although copies of the software that had already been downloaded continued to be shared.

History

On June 23rd, 2019, the Windows and Linux application DeepNude, developed by an anonymous person known only as Alberto, was made available for download online (the website is no longer available).[1][2]


deepnudeapp (@deepnudeapp), 1:48 PM, 23 Jun 2019: "DeepNude Desktop is ready. It requires Windows 10 or Linux. It works with both CPU and GPU. Visit our website (deepnude.com) and click on Download button." [Attached screenshot: "Artificial Intelligence X-Ray App: Download For Free / Try Online"]

The software, based on a neural network, automatically edited photographs of people to make them appear fully nude. The app only supported female bodies, although an update or a separate app for male bodies was also planned.[3]

On June 27th, 2019, Vice's Motherboard featured a story on the app titled "This Horrifying App Undresses a Photo of Any Woman With a Single Click," which also included an interview with the app's creator.[3] In the following days, multiple news outlets covered the story, including articles by The Verge,[4] Vox[5] and CNET.[6]

On June 27th, the official website experienced a series of crashes due to a high number of visitors.[7] On the same day, the creator of DeepNude announced that the service had been permanently shut down due to concerns of potential misuse of the application.[8]


deepnudeapp (@deepnudeapp), 7:10 AM, 27 Jun 2019: "Hi! DeepNude is offline. Why? Because we did not expect these visits and our servers need reinforcement. We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days."

Attached statement: "Here is the brief history, and the end of DeepNude. We created this project for user's entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner. Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request. Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use. Not even the licenses to activate the Premium version. People who have not yet upgraded will receive a refund. The world is not yet ready for DeepNude."

On the same day, an unknown Reddit user posted a link to a cracked version of the app.[9]

GitHub Removal

In early July 2019, GitHub removed repositories containing DeepNude's code from its platform. On July 9th, the technology news site Motherboard published an article about the removal, which included a statement from a GitHub spokesperson:

"We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy. We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines."

Features

DeepNude used the open-source pix2pix algorithm to analyze the input image, identify and remove clothing, and replace it with a generated image of a naked body. The algorithm was trained on a dataset of over 10,000 nude photos of women.[3]
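For context, pix2pix is a general-purpose conditional GAN for image-to-image translation: a generator network maps an input image to a translated output image, while a discriminator network scores (input, output) pairs as real or generated, and the two are trained against each other. The sketch below is a minimal, illustrative PyTorch rendition of that general architecture; the class names and layer sizes are assumptions chosen for brevity, not DeepNude's actual (unreleased) implementation.

# Minimal sketch of a pix2pix-style conditional GAN (illustrative only).
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Encoder-decoder that maps an input image to a translated image."""
    def __init__(self, channels=3, features=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, features, 4, stride=2, padding=1),      # downsample 256 -> 128
            nn.LeakyReLU(0.2),
            nn.Conv2d(features, features * 2, 4, stride=2, padding=1),  # downsample 128 -> 64
            nn.BatchNorm2d(features * 2),
            nn.LeakyReLU(0.2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(features * 2, features, 4, stride=2, padding=1),  # upsample 64 -> 128
            nn.BatchNorm2d(features),
            nn.ReLU(),
            nn.ConvTranspose2d(features, channels, 4, stride=2, padding=1),      # upsample 128 -> 256
            nn.Tanh(),  # output image scaled to [-1, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

class PatchDiscriminator(nn.Module):
    """Scores (input, output) pairs patch by patch, as pix2pix does."""
    def __init__(self, channels=3, features=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels * 2, features, 4, stride=2, padding=1),  # concatenated pair -> feature map
            nn.LeakyReLU(0.2),
            nn.Conv2d(features, 1, 4, stride=1, padding=1),             # per-patch real/fake logits
        )

    def forward(self, inp, out):
        return self.net(torch.cat([inp, out], dim=1))

# Shape check: a 256x256 RGB image in, a translated 256x256 RGB image out.
generator = TinyGenerator()
discriminator = PatchDiscriminator()
source = torch.randn(1, 3, 256, 256)
translated = generator(source)
patch_scores = discriminator(source, translated)
print(translated.shape, patch_scores.shape)  # -> (1, 3, 256, 256), (1, 1, 127, 127)

In the published pix2pix design the generator is a deeper U-Net with skip connections and the adversarial loss is combined with an L1 reconstruction loss; the snippet above only illustrates the overall generator/discriminator shape of the approach.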

In the free version of the app, output images were partially obscured by a large watermark, which was removed in the $50 paid version. Paying an additional $50 removed a smaller "FAKE" watermark (examples shown below, left and right).[10]


[Example output images showing the watermarks, left and right]

Traffic

While the exact number of app downloads is unknown, on June 27th, 2019, the official DeepNude website experienced a series of crashes due to the high load before being shut down.[7][11][12]

Search Interest

External References


Comments (70)

  • Kommando_Kaijin - 4 years ago

    I'm surprised somebody didn't try this sooner.

    0
    • ObadiahtheSlim - 4 years ago

      It's probably just some pattern recognition to match a body type and skin tone and then popping down a standard nude body over it. I bet it's just a slightly more automated version of people photoshopping celebrity faces on porn models.

      +50
      • Ganondox - 4 years ago

        It's probably a bit more sophisticated than that, it probably generates a nude body rather than just popping an existing one down.

        0
      • Vissia - 4 years ago

        it just fills the gaps of clothing with nudity, and on top of that it does it like shit.

        0
  • zeoxdragon - 4 years ago

    Taking it down will be useless, if someone was able to make an app like this one time, some other person will make a similar one again later on.

    +10
    • Crowley - 4 years ago

      Oh yeah, I'm quite sure the creator already knew this. He just needed to open the Pandora's Box, even if it was just for one day. He must've been expecting the shutdown since he first started making it.

      +1
  • This Guy - 4 years ago

    "concerns of potential misuse"

    …and you didn't think of this before release why? After we've already had the celebrity porn Deepfake controversy?

    +105
    • Chaotic-Entropy - 4 years ago

      I… don't follow… it has one use…? Surely if it weren't being used for illicit, dodgy purposes then THAT would be misuse.

      +1
  • winton overwat - 4 years ago

    who even thought it was a good idea to release this? lmao

    +6
    • Giygas-DCU - 4 years ago

      It was Alberto's fault.

      +1
  • Vulture051 - 4 years ago

    Eh, it's effectively just a souped-up imagination. I think shutting it down is a bad precedent.

    +2
  • Shrikey - 4 years ago

    It's just (neural network) automated Photoshop.

    0
  • Evilthing - 4 years ago

    It's too late to stop it. I bet software like this has existed for years.

    0
    • Nico - 4 years ago

      I am pretty sure it has existed for a long time, but there is a difference between sharing it within extremely tight-knit communities with very few members and it being available to a large portion of the population

      +3
  • HJSDGCE - 4 years ago

    The Deepnude subreddit got banned like 6 minutes ago (from this post). Fuck, I was too late.

    +4
    • Shrikey - 4 years ago

      I took a look.

      They just looked like somewhat messed up photoshops. Sometimes anatomically incorrect.

      As I said, all this does (even if it were perfected) is do a fake nude Photoshop for you. Those dudes who edit photos for magazines could make any of these photos, but better.

      It's not actually anybody's breasts. It's not undressing anyone. It's just removing skill from faking nudes. Just like deepfakes already did.

      +9
      • Shrikey - 4 years ago

        Just as a note, I'm not interested in fake nudes in general, nor do I know or care how much actual skill goes into them.

        I do enjoy celebrity nudes… Just aesthetically, I use porn if I wanna whack it.

        But the only thing that makes a celebrity nude more pleasing for me than an equivalently attractive non-celebrity is knowing those tits are famous.

        +1
      • A1rN1nja - 4 years ago

        Most fake nude Photoshops are really bad… And if you actually wanted a well-done fake, chances are you wouldn't get it because the actual professionals wouldn't care(and most probably don't actually make literal fake nudes, both professionally or informally).

        Like seriously why would they? It would need to be more than a side-gig, making more than pocket change off it.

        +2
  • walperinus - 4 years ago

    nobody else is having a flashback to those xray thing for phones? those ads were showing the same idea

    0
  • hexadeca_guncross - 4 years ago

    Paint me surprised.

    0
  • HelifIgno - 4 years ago

    Taken down for concerns of misuse… What the hell would be misuse if not its desired purpose?

    +11
  • KRISZ46 - 4 years ago

    I need this

    +2
  • The Cooler Sub - 4 years ago

    TBH I’m surprised this didn’t happen sooner

    0
  • Tentacles - 4 years ago

    +4
  • Xyz_39808 - 4 years ago


    +6
  • ZuperNEZ - 4 years ago

    When I read the word "misuse", the first thing to come to mind is taking photos of underage girls and putting them through the nude app.

    -1
    • Evilthing - 4 years ago

      It gets worse: Imagine if there are people who look at children and then imagine them the way they want without the use of any app.

      +3
      • krashlia - 4 years ago

        …You mean like they do now?

        Okay, jokes aside, I think that the deep fakes, once they go there, are engaging in something that's rather distinct from Hentai.

        0
        • Evilthing - 4 years ago

          Well, for starters, real people are used, unless they use deepfakes in conjunction with random person generators (which exist).

          +1
  • Nedhitis - 4 years ago

    I may be playing a foolish game of Devil's Advocate here and I am 100% willing to be wrong about it, but maybe the "misuse" they are talking about is to use photos of real, underaged girls on the app, which could attract the wrong kind of people and get the developer into some serious trouble, and which they did not mention directly for obvious reasons. This makes sense if that was part of the removal's reasoning.

    Just saying, you know.

    +76
    • FByte - 4 years ago

      That shit didn't even cross my mind, you might be on to something.

      +13
    • Jerach - 4 years ago

      There's already the misuse of creating phony nudes of somebody and leaking them to try and discredit them. Imagine this happening to a woman who works as a teacher. Even if the images were clearly fake on any sort of closer inspection, the initial backlash would be so hard to deal with.

      +11
      • Ganondox - 4 years ago

        This technology isn't going to go away, before long common people are going to be able to train such networks themselves and generate these images without an app specifically for the purpose. People are going to get used to it and realize fakes mean nothing.

        +8
    • Ketbra - 4 years ago

      I kinda doubt that, considering that making this app likely requires one to feed a ton of images of naked bodies into a neural network or something so that the computer learns what they look like. But the bodies of grown women are different from those of kids/teenagers, and I'd like to believe the algorithm was trained with images of grown women (which would be easier to find, anyways).

      Now, I don't actually know what the results of trying to nudify a kid with a program that only knows about adults would be, but I'd like to think you'd get some pretty silly images at best.

      +2
      • Nedhitis - 4 years ago

        Oh, the results would definitely look ridiculous on most kids. Emphasis on "kids", however, which are not all underaged people. A 16 or 15-year-old girl's face would still look convincing enough on an 18-year-old body generated by a program, and that alone is asking for trouble from the program creator's side.

        +2
    • Xyz_39808 - 4 years ago

      on the other hand, it'd be pretty funny if the app just adds giant boobs to every boy

      +5
  • digital_m3m3 - 4 years ago

    Hentai has been doing this for years, if not decades, already.

    +12
  • blanisquid - 4 years ago

    fuck did you think was gonna happen

    +14
  • 60OnSchoolArea - 4 years ago

    no need to thank
    DeepNude Torrent File Download: http://evassmat.com/apeP
    qBittorent Download: http://evassmat.com/aphM

    -14
  • Evilthing - 4 years ago

    If nudity was normal, this app would be less controversial.

    +16
    • Superfield - 4 years ago

      If murder was more socially accepted, then nobody would've cared about the OJ thing. What's your point?

      -7
      • spindash64 - 4 years ago

        a BIT of a jump here.
        Let's try a different example:
        "If not wearing neckties to fancy gatherings were acceptable, maybe i wouldn't be so against dressing up"

        +6
      • Rokonuxa - 4 years ago

        There are nudist resorts.
        I have yet to hear about a legal murder camp.

        +16
  • Ganondox - 4 years ago

    This is how I'm guessing this works.

    First, there is two fundamental components to this program. The first is called segmentation, where they separate and label relevant parts of the image, and the second is generation, where they create the nude from the labeled data. With segmentation, they probably are isolating the body region, particularly clothed areas, and also want to isolate areas that show skin. The body region defines where they are going to generate, while the skin defines the style it generates under. Once that is defined, generation is just a matter of applying something like this:

    https://www.youtube.com/watch?v=p5U4NgVGAwg

    Just imagine there is nude body paint brush, and skin styles on the side, but of course everything is automatically generated, there is no paint-like user interface. This is also where the neural network comes in, which is probably a GAN, which basically operates training two networks, one that generates fakes, and one that recognizes fakes, so that it gets good enough at generating that the recognizer is fooled.

    +3
    • Ganondox - 4 years ago

      Regarding the ethical implications of such a program, I think it's mostly harmless. People have been creating fakes since forever, and these are still fakes, just done faster and looking more realistic. The images it creates contain no information whatsoever about what the person's nude body actually looks like, it's just 10,000 other people's nude bodies in a blender. Women could use this to their own advantage by creating fake nudes of themselves rather than authentic ones, keeping themselves in control. And while the fact it's currently female only may sound a bit misogynistic, it was also probably done for technical reasons, as a version that supports both would need to be able to recognize the styles from both. The issue there is that there is less of a difference in a man and woman's face than in their genitals, so if trained together it might make everyone look intersex without proper controls. Just getting one sex done shows the model works. According to the creator's Twitter, it appears the reason the app was taken down was not fear of misuse, but because their site was overloaded.

      +6
  • Rainbow Crash - 4 years ago

    Well the outrage seems to ignore the fact that most people will just do that with their 'deepnude' brains which can already 'blank the fills'. I'd be very skeptical if someone said they never imagined someone else nude and even more.

    +26
    • Shade.Usher - 4 years ago

      Now that I think about it, I really haven't… In regards to a real person, I think it just would feel really weird and completely made up (and thus pointless), particularly if I actually knew them.

      +7
  • krashlia - 4 years ago

    >Several days after its June 23rd, 2019, release and rapid climb to popularity, the app was permanently shut down over the concerns of potential misuse, although already downloaded copies of the software were continued to be shared.

    The Fappening 2: Fapping Forever.

    +18
    • gentleben - 4 years ago

      Yup, still being shared: https ://discord .gg/YDFt2Kq

      +2
    • aceofscarabs - 4 years ago

      (Discreetly points out a thread in /t/)

      +2
  • Peanut970 - 4 years ago

    Really, the only surprising thing here is that nobody has developed something like it before. Judging from Nedhitis comment, it might be best if it stays down.

    0
  • AtlasJan - 4 years ago

    With the male libido being described as like "being chained to a madman", and the lengths some people go to for booby, did they honestly think that this was a good idea at all?

    +11
    • spindash64 - 4 years ago

      >booby

      _*F O R B O D E N * _

      maybe even FIVEboden

      0
  • Nedhitis - 4 years ago

    +18
  • TheDoctor64 - 4 years ago

    How about almost a century?

    0
    • digital_m3m3 - 4 years ago

      No, I mean nude filter apps/glasses have been appearing in hentai long before this.

      0
  • ManBehindTheCurtain - 4 years ago

    tfw when mormon porn suddenly becomes a lot more wholesome.

    +3
  • bozdo - 4 years ago

    Hello people, I bring you equality, you can now undress men too!
    Don't rely on AI for a bad quality fake, let an artist do it for you properly!
    https://fakenudes.media/

    0
  • Hillperic - 4 years ago

    Even if it's used on lolis, then what? It's not like it's an x-ray showing the actual real body, it's entirely computer generated. You can just shop a loli's head on a random naked body with the same result. Not to mention it's not trained on real lolis, so results would be shit

    0
  • I Write Comments - 4 years ago

    We could use this on Don…

    0
  • AtlasJan - 4 years ago

    Why the fuck is the top image gallery still trending?

    +31
    • Derp Derpington - 4 years ago

      Yeah exactly, is the gallery showcase broken or something?

      +1
    • Philipp - 4 years ago

      Two images in this gallery are getting a lot of views, that is why.

      0
  • Zhin4lyfe - 4 years ago

    why is this trending?

    +1
  • y1ff - 4 years ago

    The fact that people are freaking out about this is stupid. We've already been able to take a picture of someone who's fully clothed, enhance it with Adobe® Photoshop™ Software using other pictures of naked people as reference, and get a picture where they look naked. Fake nudes have been around for over a decade, this is just automating the process.

    It's easy to disprove, too. Just point out a single birthmark, scar, stretch mark or bit of cellulite that the fake nude has and you don't, or vice versa. Congrats! You have now proven it must be edited. You can also just show the source photo as well. Simple as that.

    The real scary thing is that neural network that generates random fake anime girls. really sends chills down my spine

    +17
    • Shrikey - 4 years ago

      Yah. I said the same thing like 2 pages ago.

      It isn't actually showing anyone nude, and not doing anything that couldn't be done with Photoshop.

      It's not an x-ray app. It's a Photoshop app. It's a "don't have to use your imagination when fapping to your friend on Facebook" app. But it's no more a violation than one's imagination.

      +1
  • Kommando_Kaijin - 4 years ago

    I'm honestly more surprised that people are surprised that this exists than the fact that nobody did it sooner now.

    +1
  • iotacom - 4 years ago

    I've seen lots of moral panic about this think but I've yet to see any evidence of how believable it actually was. For all I know, it wasn't any better than some random person with Photoshop.

    0
  • 767456456456456 - 4 years ago

    why is this gallery still trending it only has 7 images and there aren't even any nudes

    0
  • jioi987 - 3 years ago

    download it here

    https:// payhip. com/b/ x2eS

    2go windows only

    -3