The explicit AI-created images of Taylor Swift flooding the internet highlight a major problem with generative AI
Is it a problem though? I mean, it just makes Rule34 pics that much easier to create. And you wouldn't want to kink-shame anyone, now, would you? Why is it always heterosexual men who are kink-shamed? Why is liking naked women a bad thing?
Alright, I'll bite. To start, this is a real person that we are talking about. A real person who did not consent. Does that mean anything to you? The fact that there is a very real person that exists in very real life that has had this happen to them?
Otherwise, I agree. Nothing wrong with the male libido.
I don't need someone's consent to draw lewd fanart of them.
I wouldn't be so sure. Depending on where you are in the world, there are dozens of laws that might interfere, ranging from publicity rights to slander, especially if the images are photorealistic (enough).
Do you need their permission to distribute that art?
100%, which is why the fault lies with the bad actors and the platforms that let this proliferate and not the tool itself.
Can you imagine this headline but with Photoshop instead of AI? It would be utterly silly.
This is orchestrated to create anger against AI. There's a lot of money involved in it and that money triples if consumers aren't allowed to run and distribute models on their own PC.
I can agree with you in principle. As long as you aren't distributing things like this, I don't really see an issue. Not the tool, the distributor / platform. I also agree that these articles are meant to ensure that this technology can be held behind locked doors. I fully support the idea of making AI something that is self-hostable.
That being said there are people in this thread that see nothing wrong with distributing lewd pictures of real people (drawn, ai generated, or even hacked and stolen). That is the only thing that I was addressing.
Yeah, I agree. There's a very ugly side of things, and it's a shame certain people are rolling themselves in it.
There's also the fact that lewds of celebs and similar material will grow exponentially, but the same is true for all other media, so hopefully it balances itself out.
I think that's the main problem with a lot of these articles, they are missing the forest for the trees so to speak. We are looking at an explosion of culture, the bad stuff is just along for the ride. I'm personally excited for it.
This is actually super super tricky.
So, there's an exemption for 'Transformative' art, and while this is obviously pretty shady, it feels like there's a good chance this would qualify as transformative. Basically, you can't copy an existing photograph you don't own, but you can take an existing person and paint a new original picture of them.
We had a big lawsuit just last year where the Supreme Court clarified the line a bit. In that case, the art was found to be not Transformative, but they did a lot to explain why, and based on that, this would be super likely to fall on the side of 'Legally Allowed'.
Do you?
I'm not a lawyer and can't even begin to answer that question. I was merely trying to get the conversation started down that logical track, because I, personally, think it is at the heart of the matter.
Looking this up, it seems that, at least in California, it probably would be considered illegal, at least according to this site.
So, this line of thought is not going to get AI fakes permitted, it's going to get rule 34 banned.
We've been censoring sex shit for a long ass time, in case you haven't noticed. The recent trend of information freedom is not going to defeat that old religious puritanical bs, and people's wishes for privacy on top of it.
Trump would shut that shit down fast. Conservatives want people reproducing, not masturbating, and he has Christian supporters to keep in line, who do not like porn.