Choose wisely
Why is it censored?
I've seen this a lot recently. I'm guessing a certain social media platform is perpetuating this. Not sure if it's the platform itself taking action against accounts using these words or if it's the users themselves collectively deciding to censor these topics.
Either way, it's stupid to me how language is becoming increasingly taboo. We need to be more open to discussion, not less.
I watch a news show on YouTube that censored "human trafficking" in an article excerpt. Really stupid to have social media policing the news.
I've been seeing a lot of "trigger warnings" myself. Idk when we decided to treat triggers exactly how the professionals say they should not be treated (triggers are supposed to be identified, but it is your responsibility to learn how to deal with them and navigate life, not the world's responsibility to cater to your every whim), but even so, some of these "triggers" are patently ridiculous. On Mastodon people are blurring pictures of dogs, cats, even fucking cartoons, for "eye contact." With a picture. Of a cat. Shit has gone too far.
but it is your responsibility to learn how to deal with them and navigate life, not the world's responsibility to cater to your every whim
Is warning people really catering to them? Maybe some people are learning how to deal with it and the heads-up helps? Idk, that Mastodon situation sounds ridiculous though. What instance is that from?
Tbh warning of "eye contact," be it humans depicted or otherwise, is a bit much, as is "food" and probably some others I can't recall. I can't remember all the masto instances I've seen it on; it was standard practice before the Twitter people showed up. Some on the one I'm on even do it, though very few and rarely. Looks like it is less common now, but if you look around a bit you should still see it.
TikTok is the most aggressive. Saying "sex," "kill," "die," or "rape" at all in your video has a pretty strong chance of getting your post hidden to varying degrees.
I think this is because they were getting lawsuits when their wonderful algorithms kept sending suicidal kids more info on suicide.
Dropping negative topics down the rankings is not a terrible idea, but it does lead to bonkers workarounds when people want to talk about them anyway and worry about keeping their social media metrics up.
Hiding all the engagement metrics would probably do wonders for a lot of teens' mental health, as they become desperate to be influencers.
I think it's more of a business decision. A lot of parents won't let their kids on Reddit (for good reason lol), but TikTok actually does an alright job of keeping age-inappropriate content off the platform, much tighter than most other social media sites. Out of my own interest I went looking for NSFW stuff on TikTok; the most intense I could find was some tame thirst traps and some National Geographic-style nudity (lots of discussion of adult themes, but I don't think those are necessarily a problem).
On any other social media site I've ever used, even with child mode or whatever, I have been able to find beheadings and hardcore sex. That's a pretty big selling point if you want to primarily attract children.
I have been able to find beheadings and hardcore sex. That's a pretty big selling point if you want to primarily attract children.
ಠ_ಠ
( ͡° ͜ʖ ͡°)
What a dumb idea. Not surprised it's coming along now that common sense is gone.