Earlier this month, the Senate passed the TAKE IT DOWN Act (S. 146) by a voice vote. The bill is meant to speed up the removal of non-consensual intimate imagery, or NCII, including videos that imitate real people, a technology sometimes called “deepfakes.” Protecting victims of these heinous...
I'm no legal expert, but isn't posting porn of someone without their consent already covered somewhere in the United States Code? Porn sites did quite the purge a few years ago. So I'm wondering what this solves, even if it were worded better.
Agreed. However, it also appears to apply too broadly:
The letter explains that the bill’s “takedown” provision applies to a much broader category of content—potentially any images involving intimate or sexual content at all—than the narrower NCII definitions found elsewhere in the bill. The bill contains no protections against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored.
Can't forget how the rich get legal representation while the poors do not. There is no justice in this country until legal counsel is affordable and accessible to everyone.
What is the implication for Lemmy and other federated platforms? Is running a Lemmy instance going to now come with a huge legal risk and moderation requirements that three people with day jobs can't handle?
I wouldn't think it would be any more of a legal requirement than keeping CSAM off your servers is under current law, just now with deepfakes too.