YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead
The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.
It used to be video games and movies taking the blame. Now it's websites. When are we going to decide that people are just bat shit crazy and guns need some form of regulation?
I can see the nuance in an argument that an unmoderated online community could be using an algorithm to group these violent people together and amplify their views. The same can't really be said for most other platforms. Writing threats of violence over the internet should still be taken seriously, especially if they're later acted upon. I don't disagree with you that there's a lot of bat shit crazy out there, though.
It's not popular nowadays to mention that people need to have self-accountability; apparently there's always a website, service, game, or social media platform to "blame" for the actions of the individual.
This ineffectiveness is directly due to NRA lobbying, and their zero-tolerance attitude towards any new gun legislation. Any gun-friendly lawmaker who even gets close to writing gun control legislation will end up getting harassed (and likely primaried in the next election). So when gun control legislation passes, it's inevitably written by people who don't understand guns at all. No wonder it's all shit!
Maybe now that the NRA is having financial difficulties, legislators will have more leeway to enact things that might have a chance of working.
Neuroplasticity is not really relevant here - it's just the ability of the brain to form new connections. You'd need video games/entertainment to have an inherent causal effect on radicalization, and science does not support that position.
Even meta-analyses show no causal link between gaming/entertainment and aggression.
Anecdotally, I play a genocidal maniac in every game I can. I love playing Total War and killing every single thing I come across, razing and pillaging villages and enslaving the survivors. I've done it since I was a young child playing RTS games like Age of Empires. Adding up all my video game kills would probably put me literally in the billions. Can you guess how many people I've killed in real life?
He was treated like a joke candidate by the Democrats at the time. Facebook didn't get him elected, Hillary ran a weak campaign and didn't take the threat seriously. He used FB for fundraising and she could've done the same thing if she wanted to.
Coming from a country that had a couple of school shootings and then decided it wasn't worth the risk, and everyone handed in their guns with little complaint, I find it hard to comprehend.
Well, even Americans without guns are much more violent than people in other first-world countries. Our non-gun homicide rate is higher than the total homicide rate in (for example) France or Germany.
There's an interesting discussion of the statistics here.
So my interpretation is that gun control is likely to reduce the murder rate, but the change will not be nearly as dramatic as many gun-control supporters seem to expect. Guns aren't most of the problem.
Means≠motivation. Having the capacity to do something doesn't drive one to do so.
I'm not deeply researched on this case, but from what I know I'd imagine that poor socialization, combined with being accepted into a group that espoused those kinds of views, contributed to his actions. That's not to say that any of those websites did anything in particular to drive his actions.
You can't sue "the availability of guns", but you can sue YouTube, Reddit, the manufacturer, and whoever else is involved and at least try to get some money out of them.
Man, if the only thing that's preventing a country's populace from murdering each other is restricted access to weapons, then that country is a failed society.
There will always be murders. Humans are irrational creatures. Banning firearms makes murder attempts less likely to succeed, and mass murders significantly harder to plan, execute, and achieve actual mortality with.
I mean, I'm sure there are lots of other socioeconomic reasons, but it feels like you can solve this big one a lot quicker and easier than trying to solve all the abstract issues underlying it.
Start with a normal person, get them all jacked up on far right propaganda, then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that's a viable claim for wrongful death. It's about foreseeability and causation, not about who did the shooting. Really a lot of people coming in on this thread who obviously have no legal experience.
I just don't understand how hosting a platform to allow people to talk would make you liable since you're not the one responsible for the speech itself.
Is that really all they do though? That's what they've convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn't even be possible to start on DIY videos and end on white supremacy or whatever.
I wrote a longer version of this argument here, if you're curious.
I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.
Think of Facebook. They have algorithms which make sure you see what they think you want to see. It doesn't matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires, simply because they think it will make them more advertising revenue.
They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.
I'm not sure if Reddit works this way; with the upvote and downvote system, it may be more the users who decide the content you see. But Reddit has communities it could keep a closer eye on to prevent hateful and dangerous content from being shared.
Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.
Did reddit know people were being radicalized toward violence on their site and did they sufficiently act to protect foreseeable victims of such radicalization?
We should get the thought police in on this also, stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
Say what you want about YouTube and Reddit, but if you want them to censor more and more, you are creating a sword that can be used against you too. I also don't like the idea of shooting the messenger, no matter how much we may dislike the messages. When I hear about lawsuits like this, I always think it's greedy lawyers pushing people to sue because they see deep pockets.
It doesn't make sense to treat websites as utilities. Net neutrality can't be applied to websites; it would make most basic spam filtering infeasible and blow up operational costs. And by holding sites like YouTube accountable, I am loading a gun that can shoot me too. It's a double-edged sword that can be used to hurt me no matter what we do.
The article doesn't really expand on the Reddit point: apart from the weapon trading forum, it's about the shooter being a participant in PoliticalCompassMemes which is a right wing subreddit. After the shooting the Reddit admins made a weak threat towards the mods of PCM, prompting the mods to sticky a "stop being so racist or we'll get deleted" post with loads of examples of the type of racist dog whistles the users needed to stop using in the post itself.
I don't imagine they'll have much success against Reddit in this lawsuit, but Reddit is aware of PCM and its role and it continues to thrive to this day.
Who would be the right one to sue? Reddit is hosting it, but they are using admins to keep discussion civil and legal; the admins of PCM are most likely not employed by Reddit, but are they responsible for users egging each other on? At what point is a mod responsible for users using "free speech" to instigate a crime? They should have picked a few posts and users and held them accountable instead of going for the platform.
People will keep radicalizing themselves in social media bubbles, in particular when those bubbles are not visible to the public. Muting discussion on a platform will just make them go elsewhere or create their own. The better approach would be to expose them to different views and critique of what they are saying.
There's admins and there's moderators (mods). Please clarify which you mean.
Admins are Reddit employees and are supposed to enforce site-wide rules outlined in their policy and terms of use.
Moderators are unpaid volunteers whose identity is typically unknown to Reddit who are in charge of running a sub. Moderators can make up additional rules and enforce them.
He wasn't a participant. I was a mod there before I immolated my Reddit account, and the day it happened I trudged through his full 196-page manifesto. It mentions PCM exactly 0 times. What does he mention in it? /pol/ and /k/ specifically, with /pol/ taking around 40% of the entire manifesto. He made a single comment on /r/pcm. That comment? "Based." We have/had nearly 600k users, 150k active weekly. One person making one comment does not define the community. He was active on other parts of Reddit as well - much more than ours.
In the USA it's not a crime to be racist, promote a religion teaching that God wants you to be racist, say most racist things in public, or even join the American Nazi Party. The line is set at threatening, inciting, or provoking violence, and judges don't accept online arguments that saying racist garbage is inherently threatening.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
This seems like the only part of the suits that might have traction. All the other bits seem easy to dismiss. That's not a statement on whether others share responsibility, only on what seems legally actionable in the US.
Here's an install video of what I assume was the product in question based on the named LLC. https://youtu.be/EjJdMfuH9q4
Shy of completely destroying the lock and catch system by drilling out the mechanism, I don't see an effective way of removing it.
I don't think it'd meet the court's standard for "easily removable," given that removal would require power tools and would permanently and irreversibly alter the device.
They're just throwing shit at the wall to see what sticks hoping to get some money. Suing google for delivering search results? It shows how ridiculous blaming tools is. The only person liable here is the shooter.
Well, maybe. I want to be up-front that I haven't read the actual lawsuit, but it seems from the article that the claim is that youtube and reddit both have an algorithm that helped radicalize him:
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
I'd say that case is worth pursuing. It's long been known that social media companies tune their algorithms to increase engagement, and that pissed off people are more likely to engage. This results in algorithms that output content that makes people angry, by design, and that's a choice these companies make, not "delivering search results".
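To make the mechanism concrete, here is a purely illustrative toy sketch (not any real platform's system; the post fields and weights are invented) of an engagement-maximizing feed ranker. Nothing in the objective cares what the content is, only how much interaction it is predicted to generate - and heated, anger-inducing posts tend to predict the most interaction:

```python
# Toy sketch of an engagement-maximizing feed ranker.
# Purely illustrative: the Post fields, weights, and numbers
# are invented for this example, not taken from any company.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float    # model's click-through estimate
    predicted_comments: float  # arguments drive long comment threads
    predicted_shares: float

def engagement_score(post: Post) -> float:
    # Comments are weighted heavily here because heated arguments
    # keep users on the site longest - so enraging content rises.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 2.0 * post.predicted_shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first; content is never inspected.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Cute puppy video", 0.9, 0.1, 0.3),
    Post("Outrage bait headline", 0.5, 0.9, 0.6),
])
# The outrage post ranks first despite fewer predicted clicks,
# because its predicted comments and shares dominate the score.
```

The design point is that the objective function only measures engagement; whether a post informs, entertains, or enrages is invisible to it.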
On the very specific point of liability, while the shooter is the specific person that pulled the trigger, is there no liability for those that radicalised the person into turning into a shooter? If I was selling foodstuffs that poisoned people I'd be held to account by various regulatory bodies, yet pushing out material to poison people's minds goes for the most part unpunished. If a preacher at a local religious centre was advocating terrorism, they'd face charges.
Google's "common carrier" type of defence takes you only so far, as it's not a purely neutral party: it "recommends", not merely "delivers results", as @joe points out. That recommendation should come with some editorial responsibility.
This is more akin to if you sold a fatty food in a supermarket and someone died from being overweight.
Radicalizing someone to do this isn't a crime. Freedom of speech isn't absolute but unless someone gives them actual orders it would still be protected.
Don't apply UK's lack of freedom of speech in American courts.
YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.
The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
“We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.
Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.
RMA Armament is named for providing the body armor Gendron wore during the shooting.
No, he bought it.
Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.
Not their issue; he passed the background check.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
Any knob with a Dremel can make a gun full auto, let alone defeat a mag lock. And he broke NY law doing this.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
This is just absurd.
My guess is they are hoping for settlements vs going to trial where they lose.
Only responding to the last point, but if they can prove that Google somehow curated his content to push him towards fringe, terroristic websites, they could be found liable as a civil suit.
Next they will announce that they are suing Disney because he watched the History Channel, and that had violence on it which contributed to his actions.
Not saying something shouldn't be done about the gun situation. But I believe it's the community driving these kids to want a gun to kill people. Gun laws are just one part of the many problems that make up our broken community. I guess what I mean is that the guns are partly a result of a broken community. Banning guns alone, in my eyes, is an extremely oversimplified bandaid fix. Tbh these days I see the gun debate as crooked politics just trying to get votes... They want that free publicity.
Edit: a politician is never going to speak negatively about the general community. They can't; it would kill their career. I think that's a big part of why nothing changes. Politics is money and business; it's like gang life for white collars.
Ahh one of those "We're mad and we don't have anyone to be angry with." style lawsuits. Pretty much the Hail Mary from a lawyer who is getting their name in the paper but knows it won't go anywhere.
"Easy to remove gun lock" that has been tried multiple times and usually fails. "Gun lock" doesn't seem to be related to assault weapons and large capacity magazine but who knows what they mean, even when a gun is "Easily modifiable" it's usually not treated as illegal, because someone has to actually make those modifications. The same will probably be the case for the kevlar. (at the time of the shooting it was legal).
YouTube contributing to radicalization is a laugh; it's an attempt to get their name in the papers and will be dismissed easily. They'd have a better chance naming the channels that radicalized him, but First Amendment rights would be near absolute here. Besides which, "radicalization" isn't the same as a conspiracy or orders. It's the difference between someone riling up a crowd until they're in a fervor that ends in a riot, and someone specifically telling people how to riot and whom to target. (Even if both can be tried as crimes, one is a conspiracy and one is not, and radicalization would be neither.) Even "I wish someone would go shoot up ..." would be hyperbole, and thrown out as well. It's pretty hard to break the First Amendment protections in America. (And that's a good thing - if you think it's not, imagine the other party is in power and wants to squash your speech... yeah, let's keep that amendment in place.)
The same will be the case against Facebook for all the same reasons.
If you think Google should be responsible, then you think the park someone is radicalized in should be responsible for what's said in it, or that an email provider is responsible for every single piece of mail sent through it, even though it might not have access to see that mail. It's a silly idea, even assuming they could do it. Maybe they're hoping to scare Google into changing its algorithm, but I doubt that will happen either.
The case against the parents is another one that people try again and again. Unless there's more than they're saying, you still can't sue someone for being a bad parent. Hell, there's a better case against the parents of Ethan Crumbley, and even that case is still pretty shaky, and it involved the parents actively ignoring every warning sign and buying the kid the gun. Here there's nothing that seems pinnable on the parents.
You know it sucks, and I know there are a lot of hurt people, but lawsuits like this ultimately fail. It's like rolling the dice: history pretty much shows they're hoping for a one-in-a-million chance of getting lucky, and they won't, because it's one in a million. And even if they did win, they'd have to hope it's not overturned.
Interesting… as for whether the sites will be found liable, it's pretty unlikely, but it sure does shine a spotlight on how each is a magnet for alt-right crazies. I wonder if that will have any effect on their moderation?
They're also "magnets" for progressive, liberal, conservative and all other crazies and normal people. That's mostly because everyone uses them. It's the most popular video sharing site and (one of?) the most popular social media site.
I used to think censorship worked. Now I think that just encourages troubled individuals to find an even worse echo chamber somewhere on the internet.
I don't know what the right answer is regarding some of the parties in these lawsuits, I just see more and more stuff get censored and it never seems to get any better.
Can't see how the lawsuit against the tech giants gets past Section 230, which is unfortunate, as Spez and the people who run YouTube willfully helped enable and encourage this shooter.
You argue that the product is faulty; you don't play with 230. That's my guess as to their strategy, as it's the same strategy other lawyers are attempting to use.
"YouTube, Reddit and a body armour manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in an racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.
I didn't say that. You're putting words into my mouth. It still took a human to take up arms and use a tool. Youtube alone didn't do this. Reddit alone didn't do this. Guns alone didn't do this. Training and a license would not have prevented this.