‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!
I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.
I feel like what you did, and the reaction you had to it, is common. And yet, I don't feel like it's harmful unless other people see it. But this conversation is about to leave men's heads and end up in public discourse, where I have no doubt it will create a moral or ethical panic.
A lot of the technology challenges around AI are old concerns about things we've had access to for decades; it's just easier to do this stuff now. I think it's kind of pointless to try to stop or prevent it from happening. We should mostly focus on the harms and how to prevent them.
I've seen ads for these apps on porn websites. That ain't right.
Any moron can buy a match and a gallon of gasoline, freely and legally, and that's a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that's already a huge win.
These are terrible, but I'm honestly curious what it thinks I look like naked. I'm slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?
Are they just head-swapping onto model bodies, or do they actually approximate? I'm legit curious, but I would never trust one of these apps not to keep the photos. Privacy concerns.
Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don't send the images anywhere, I just make them to satiate my own curiosity).
You're essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It's not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don't match known information from the original photo. So, with current technology, you're not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you'd need to already know that's what you want to portray and load in a custom data set, like a LoRA.
Once you know what's going on under the hood, making naked photos of celebrities or other real people isn't the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future's getting pretty weird.
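For the technically curious: what these apps do is inpainting, and you can see the "best guess at puzzle pieces" behavior yourself with the open-source Hugging Face diffusers library. A minimal, benign sketch (the checkpoint name and file paths here are illustrative placeholders; any inpainting checkpoint behaves the same way):

```python
# Minimal inpainting sketch with Hugging Face diffusers (assumes a CUDA GPU).
# The model fills the masked region with a plausible guess conditioned on the
# prompt and the surrounding pixels; it cannot recover what was actually there.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # example checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("photo.png").convert("RGB").resize((512, 512))  # placeholder paths
mask = Image.open("mask.png").convert("RGB").resize((512, 512))    # white = area to repaint

result = pipe(
    prompt="a red brick wall",  # the fill is driven by this prompt, not by hidden facts
    image=image,
    mask_image=mask,
).images[0]
result.save("out.png")
```

Which is the point above: the output is stitched together from training-set statistics and the prompt, not recovered from the original photo.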
You'll have your moment when the lone elite ex-Ranger who is trying to save the world is told by the quirky, unconventional sidekick he's forced to work with, "I actually know a guy who might be able to help."
You open the door a crack to look back and forth between them before slamming it in their faces. They hear scrambled crashes as you hide stuff that shouldn't be seen by company before you return to the door. As they enter, you're still fixing and throwing things while you apologize that you don't get many guests. You offer them homemade kombucha. They decline.
Ethics will probably change... I guess in the future it'll become pretty irrelevant whether there are "nude" pictures of oneself somewhere, because everyone will know they could just be AI-generated. The transition period will be problematic, though.
It's the sexualization of people without consent that's the problem. Maybe casual nudity shouldn't be a problem, but it should be up to the individual whom they share that with. And "nudify" AI models go beyond casual, consensual nudity and into sexual objectification and harassment when used without consent.
I want to point out one slight flaw in your argument. Nudity isn't needed for people to sexually objectify you. And even if it were, the majority of people are able to strip you down in their head no problem.
There’s a huge potential for harassment though, and I think that should be the main concern.
Regardless of feelings on that subject, there's also the creep factor of people making these without the subjects' knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one's own... gratification. Any damage "revenge porn" can do, which I would guess most people would say is wrong, this can do as well.
These AI pictures are "make believe." They're just a guess at what someone might look like nude, based on what human bodies look like. While apparently they look realistic, it's still a "generic" nude, kind of like how someone might fantasize about someone they're attracted to.
Of course it's creepy, and sharing them is clearly unacceptable as it's certainly bullying and harassment. These AI nudes say more about those who share them than they do about who's portrayed in them.
However, sharing intimate videos without consent, and especially as revenge? That's a whole other level of fucked up. The AI nudes are ultimately "lies" about someone; they're fakes. Sharing an intimate video is betraying someone's trust; it's exposing something that is private but very real.
I agree with you about nudity being an issue, but I think the real problem is this app being used on children and teenagers, who aren't used to, and aren't supposed to be, sexualized.
Fully agree, but I do think that's more an issue of psychology and trauma in our world. Children being nude should not be a big deal; they're kids, you know?
It's a problem for adults too. Circulating an AI generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating and hurtful. Neighbors or other "friends" doing it could be just as bad.
People have a really unhealthy relationship with nudity. I wish we had more nude beaches as it really helps decouple sex from nudity. And for a decent number of people, helps with perceived body issues too.
I'm pretty squeamish about nudity when it comes to my own body, but fake nudes would not be pictures of my body, so I don't see what there would be for me to be upset about. It might be different if everyone thought they were real, but if people haven't yet figured out that any nudes they encounter of someone they know are probably fake, they will soon.
Here's a thought experiment: imagine a world where fake nudes of everyone are available all the time. Would everyone just be devastated all the time? Would everyone be a target of ridicule over it? Would everyone be getting blackmailed? We're probably going to be in that world very soon, and I predict everyone will just get over it and move on. Sharing fake nudes will reflect badly on the person doing it and no one else, and people who make them for their own enjoyment will do so in secret because they don't want to be seen as a creepy loser.
But some people don't agree with you. They're not comfortable with tech that can nudify them for millions to see. So if (and that may be an impossible task) there were a way to punish services that facilitate or turn a blind eye to these things, then you bet your ass many, many people would be for criminalizing it.
I'm genuinely curious: why do you consider this harmful? They might as well be drawing tits by hand on a picture of the "victim."
I mean, sure, I wouldn't want to be a teenage girl in high school right now, but I don't think it's the technology's fault so much as our culture as a society.
Possibly a good thing: oversaturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity.
Nobody knows the real naked you, and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top, ending this once and for all.
What were you thinking when you wrote your first version? That sounds like a creepy scenario. What if I don't want to see it and it's everywhere? I could click "Not Interested" and flood social media with reports, but if there are "billions upon billions" of AI nudes, who would be able to keep them out of their feed? I'd say that while locking them up won't change the sexist system that pushes this behavior, it's a far less creepy and weird scenario than having billions of nonconsensual nudes online.
Sounds like someone needs to make a community for that.
Otherwise, this is what technology is these days. And I’d say that staying blind to things like this is what got us into many messes.
I remember when tech news was mostly a press release pipeline. And when I see these comments, I see people who want press releases about new tech to play with.
I have seen a rise in techno-absolutists complaining that anyone else is complaining about the dangers of tech lately. They just want to go back to hearing about all the cool new things coming out, and it really speaks to people who don't actually want to interact with the real world anymore and would rather live in a bubble of illusory optimism. I get it. It's exhausting to be aware of all the negatives, but it's the stuff that is real that needs to be recognized.
It tells me we're less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress' nudity is real or simulated.
Not sure how this will be any more enforceable than stopping malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren't measured by the media exposure of their bodies or their history of lovers. Maybe in another age or two.
In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin-texture maps from media, and ultimately to render any coupling one could desire, with a robust physics engine and photographic effects to make it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and, by doing so, wreck her career.
Porn doesn't bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.
I use an ad blocker and haven't seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?
These things have been around since the onset of deepfakes, and truly, if you take a couple of seconds to look, you'll find them. It's a massive issue, and the content is everywhere.
Reminds me of Arthur C Clarke's The Light of Other Days. There's a technology in the book that allows anyone to see anything, anywhere, which eliminates all privacy. Society collectively adjusts, e.g. people masturbate on park benches because who gives a shit, people can tune in to watch me shower anyway.
Although not to the same extreme, I wonder if this could similarly desensitize people: even if it's fake, if you can effectively see anyone naked... what does that do to our collective beliefs and feelings about nakedness?
It could also lead to a human version of "Paris Syndrome" where people AI Undress their crush, only to be sorely disappointed when the real thing is not as good.
I live in a Scandinavian country, and it is illegal to make and distribute fake (and real) nudes of people without their permission. I expect this is the same in many other developed countries too.
I don't know if it is where you live, but here (Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.
Back in the day, cereal boxes contained "X-ray glasses." I feel like if those had actually worked as intended, we would already have this issue figured out.
It was inevitable. And it says more about those who use them.
I wonder how we'd adapt to these tools being that available, especially for blackmail, revenge-porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos won't be seen as a trusted source of information anymore; they won't have any unique worth, nothing to hunt for or worry about.
Our perception of human bodies was long ago distorted by movies, porn, Photoshop, and the subsequent filter apps, but we still kind of trusted that there was something there before the effects were applied. What comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with stunted imaginations, since the stimulus that develops them in our early years would be long gone?
There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend in clothing choices could get its start. Who knows?
I see the bad sides right now, and how it can be abused, but if these AI models are here to stay, what are the long-term consequences for us?
I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won't carry any weight, since they might as well be fake, and as society gets accustomed to that, we'll see those types of things disappear completely.
Yep, once anyone can download a phone app and do something like this effortlessly in real time, it's going to lose its shock value fast. It would be like sketching crude boobs and a vagina on someone's photo in MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.
It would be interesting to know how many people are using it on themselves. I'd think it would open up next-level catfishing: here's an actual pic of me, and here's a pic of what I might look like naked. I'm sure some people with Photoshop skills were already doing that to a certain extent, but now it's accessible to everyone.
I've messed around with some of the image generators (not what this article is about). Results vary from surprisingly nice to weird and misshapen. They never seem to get anything "hardcore" right, but a plain AI-generated pose shot sometimes looks surprisingly not bad.
That this thread is full of people defending this is disgusting. This is different from cutting someone's face out of a photo and pasting it on a magazine nude, or from imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary viewer cannot necessarily tell is fake. And even if it were easy to tell, it's an invasion of privacy to use someone's likeness against their will, without their consent, for purposes like these.
The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.
If you found out that someone had made a bunch of art of you naked you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.
Unfortunately, that sounds like par for the course for the internet. I've come to believe that the internet has its good uses for things like commerce and general information streaming, but by and large it's bringing out the worst in humanity far more than the best. Or it's all run by ultra-horny psychopathic teenagers pretending to be adults yet living by "I'm 13 and this is deep" logic.
I dunno why I'm perpetually surprised by this, though. It's such a cut-and-dried moral area, and the people who say it isn't are so clearly telling on themselves that it's kind of shocking. But I guess it shouldn't be.