Study: Social media probably can’t be fixed
If Meta and Twitter cease to exist tomorrow, 99% of the issues are solved, IMO.
The fediverse is social media and it doesn't have anything close to the same kinds of harmful patterns
It's almost like the problem isn't social media, but the algorithms that put content in front of your eyeballs to keep your engagement in order to monetize you. Like a casino.
Facebook was pretty boring before they tried to make money. Still ick, but mostly just people posting pictures of activities with family or friends.
Exactly, the one big issue with the modern world is the algorithms pushing for engagement as the only important metric.
Although I love Lemmy, I find it hard to recommend that a normal young person hop on Lemmy, Mastodon, Kbin, Misskey, Iceshrimp, etc. Most people on here talk about tech and politics. If you scroll through the main feed, you won't get stuff from other communities unless you seek it out.
Not diverse enough, but once it gets diverse, it will probably enshittify and turn the community into mainstream garbage. Then we're back to square one, with clickbait posts and attention-seeking people.
Amazon, Google and Microsoft would still be there, so the Internet seems to be suffering from a metastatic cancer at this point. Cutting off two revolting lumps helps, but the prognosis doesn’t look that great.
None of those have had much success in creating social networks that suck people in quite like the others
Not to say they don't have their own problems, but the bulk of problems with social media come squarely from meta & twitter.
There will be a big curtailing of Apple, Microsoft, Google, and Adobe if Facebook, TikTok, and Twitter (and YouTube) have their algorithmic feeds outlawed.
It would probably cause the AI bubble to burst too, so our OSes, applications, and search engines (and government) would become usable again.
It has. Discussions here are mostly, just like elsewhere, people throwing arrogant, smartass-looking text at each other and refusing to elaborate, explain, or reason. Having been through enough of those exchanges, people who would actually discuss something instead post cautiously, with a set of markers hinting at their opinions and possible arguments, and masquerade discussion as agreement. It's only a little less exhausting than getting into a shit-throwing contest, even if more rewarding.
lemmy does have problems though. Lots of emotional, judgemental and brigading content still. But it's less here than elsewhere, probably.
The amount of comments thinking that Lemmy is totally not like a typical social media is absurd.
Guys, the only thing we don't have here is major tracking of users. That's it! Everything else is the same fucking shit you'd see on Facebook. The moment Lemmy gets a couple tens of millions of users, we're gonna become a second Facebook.
It's that there's no incentive to have 80 million bots manipulate everything. Our user base is too small, and likely too jaded about fake internet points to be a target for scammers, ai slop bots, or advertisers.
Or at least that's what I thought when I drink a refreshing Pepsi! hiss-crack! glugg glugg Aaaah!! PEPSI! The brown fizz that satisfies! Pepsi!
Lemmy doesn't have a neural net prediction/recommendation engine. This is a HUGE difference.
And for the same reasons folks got hooked on old reddit, folks get hooked on Lemmy (its me I'm folks please unplug me from the machine I can't log out)
It's not a typical social media because it's decentralized, but it's not immune to all the problems of social media by any means. I'm not sure why you're using Facebook as an example rather than reddit.
I haven't used FB in half a decade, but at least with respect to reddit, there are definitely more good "features" in the threadiverse than just lack of tracking.
Not saying there aren't any issues or that scaling to 10 M MAUs won't create new problems, but lack of tracking isn't the only differentiating factor.
Yeah decentralization and open source software and protocols being big ones. It means that if the "main" culture turns reactionary, that we're not trapped in the same spaces as the shithead just because we share a platform.
There could absolutely be two main fediverses, with no changes to the technology.
Facebook has lots of misinformation and scams too, which we don't have here on Lemmy. Edit: if Lemmy were Facebook, then we would follow friends and share our locations and our photos.
Yes and no. What Facebook really lacks (along with the other top social media platforms) is strong negative feedback.
I don't think the village idiot gets very far with the flat-earth conspiracy when they're publicly downvoted to oblivion.
I beg to disagree.
The reason all these delusional posts get upvoted in the first place is that many like-minded people are gathered together in the same sub. Take reddit's r/democrats and r/republicans: one is clearly more sane than the other, yet say something in the wrong sub and you get downvoted to oblivion. But spill your delusional shit in r/republicans and it's upvotes galore and comments of praise.
Facebook groups are the same shit. And so is Lemmy. Something that's allowed on hexbear could be, and will be, the reason you get banned on .world. Up/downvotes can't fix that.
tl;dr Village idiots can join together to accumulate their own conspiracies in a big ass circlejerk, and social media has no power to stop it.
Reddit has downvotes. That hasn't saved it from misinformation, trolls, and radicalization.
If we're immune to the problems, it's because people here use critical thinking skills instead of swallowing large amounts of content. That's the sole reason; it has nothing to do with the network's size.
We're on the solution right now, lmao
Of course -corporate- social media can't be fixed ... it already works exactly the way they want it to...
Social media isn’t broken. It’s working exactly how it was meant to. We just need to break free of it.
First of all, it's a broad overgeneralization to assume that all social media was created with the intention of manipulating people. There were honest people running social media, but that's long past (in the corporate domain, at least).
But it's not possible to get unbiased content on the internet. Everything exists with an agenda behind it, for the sole reason that hosting anything is going to constantly cost money.
This wasn't a huge deal when individuals were paying to host and share content to a small audience, it was a small amount of money and you could see their motives clearly (a forum for a hobby, a passion project, an online store, etc...).
Social media is different because it presents itself as a public forum where anything can be shared and hosted (for free) to as many people as you want. But they're still footing a very large bill and the wide net of content makes their motives completely opaque. Nobody cares that much about the headaches of maintaining a free and open public forum, and any profit motive is just another way to sell manipulation.
No social media was created to manipulate people. (Most) social media is a business, optimised to make money. You make money by showing people ads. You can show more ads to people if they stay on the platform longer. You can make people stay longer by engaging them emotionally. End of conspiracy...
But that's not profitable.
rational discourse is one of the valuable options possible.
Yeah, can't say that I've seen a lot of that on social media.
You don't need social media to do rational discourse, anyway. All you need is two-way communication, a problem that the Internet solved long before any Facebooks or Twitters popped up. You can have rational discourse on IRC, an email list, or even through instant messaging.
Throwing away the whole internet because Xitter sucks is throwing the baby out with the bathwater.
I know you're being hyperbolic here, but unfortunately there are a lot of people now who really do see social media as "the whole Internet". And they have thrown a lot away as a result.
I think just going back to internet forums circa early 2000s is probably a better way to engage honestly. They're still around, just not as "smartphone friendly" and doomscroll-enabled, due to the format.
I'm talking stuff like SomethingAwful, GaiaOnline, Fark, Newgrounds forum, GlockTalk, Slashdot, vBulletin etc.
These types of forums allowed you to discuss timely issues and news if you wanted. You could go a thousand miles deep on some bizarre subculture or stick to general discussion. They also had protomeme culture before that was a thing - aka "embedded image macros".
Anything that is topic focussed rather than following individuals is a big difference, and then take away the engagement algorithm and it’s much better.
This is a good point. It's like asking the question: "What is more important in politics? People, or ideas?"
People respond very differently to that. To some it's people, and to some it's ideas. That is why you have Xitter-like microblogging which is focused around people, and reddit-like communities which are focused around topics/ideas.
That's what I've been hoping for with Reddit and now Lemmy. I don't care about individuals, I care about topic based discussion.
My problem with forums is they are more like a club, where you get lots of off-topic discussion by people who happen to share an interest. I don't care what tech nerds think about medicine on a tech nerd forum, and joining dozens of forums to get the right discussion is a huge pain.
Forums are cool, and I use a few, but I really want a place that connects different subjects.
just not as “smartphone friendly” and doomscroll-enabled, due to the format.
Boowahahahahaha, I've used those with the PSP's default web browser. With the Nintendo Wii's web browser. With a Java phone's web browser (admittedly that was only for reading, and very slowly).
Anyway, I have big, clumsy, sweaty fingers (unfortunately, due to my behavior, girls don't extrapolate that feature anywhere anymore) and strongly prefer anything with physical keys.
They also had protomeme culture before that was a thing - aka “embedded image macros”.
Images, links, enormous sets of smilies, colored text.
It's performing as expected.
Uhm, I seem to recall that social media was actually pretty good in the late 2000s and early 2010s. The authors used AI models as the users. Could it be that their models have internalized the effects of the algorithms that fundamentally changed social media from what it used to be over a decade ago, and are now reproducing those effects in their experiments? Sounds like they're treating models as if they're humans, and they are not. Especially when it comes to changing behaviour based on changes in the environment, which is what they were testing by trying different algorithms and mitigation strategies.
As long as you know you're in an echo chamber there's nothing wrong with it. Everything is an echo chamber of varying sizes.
Or do everything within your reach to make everything an echo chamber, cough cough fandom gatekeepers being toxic to people they don't like and think are responsible for changes to their beloved media.
The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren't people, and the authors have not convinced me that they will behave like people in this context.
The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There's no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
I mostly use social media to share pictures of birds. This contributes to some of the problems the source article discusses. It causes fragmentation; people who don't like bird photos won't follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict.
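Purely as an illustrative aside on the repost-discouraging intervention mentioned a couple of comments up (the kind of reshare limits the commenter says the study never tested): such a cap could be sketched roughly as below. The threshold and field names are hypothetical, not published platform values or anything from the paper.

```python
# Hypothetical sketch of a reshare-depth cap, in the spirit of the
# forwarding limits mentioned above. Illustrative only.
MAX_RESHARE_DEPTH = 2   # illustrative limit, not a documented platform value

def can_repost(post):
    """Allow a repost only if the content has not already been reshared
    through too many hops."""
    return post.get("reshare_depth", 0) < MAX_RESHARE_DEPTH

def repost(post):
    if not can_repost(post):
        return None                       # intervention: block further amplification
    shared = dict(post)
    shared["reshare_depth"] = shared.get("reshare_depth", 0) + 1
    return shared
```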
Social media was a mistake, tbh
No shit. Unless the Internet becomes democratised and publicly funded, like media such as the BBC or France24 in other countries, social media will always be toxic. These platforms thrive on provocation, and there are studies to prove it; social media moguls know this. Hell, there are people who make a living triggering others to gain attention and maintain engagement, which leads to advertising revenue and promotions.
As long as the profit motive exists, social media as we know it can never truly be fixed.
Yes and yes. What is crazy to me is that the owners of social media want more than profits. They also have a political agenda and are willing to tip the scales against any politician who opposes their interests or the interests of their major shareholders. Facebook promoted right-wing disinformation campaigns against leaders they disliked, such as Mark Carney. Their shareholders should be sued into oblivion and their C-levels thrown into prison. Yet our legal system forbids this.
Of course not. The issue with social media is the people. Algorithms just bring out the worst in us, but they didn't make us like that; we already were.
From my point of view something that brings out the worst in us sounds like a really big part of the issue.
We've always been modified by our situations, so why not create better situations rather than lamenting that we don't have the grit to break through whatever toxic society we find ourselves grafted onto?
Sorry I know I'm putting a lot on your comment that I know you didn't mean, but I see this kind of unintentional crypto doomerism a lot. I think it holds people to an unhealthy standard.
It is a big part of the issue, but as Lemmy clearly demonstrates, that issue doesn’t go away even when you remove the algorithm entirely.
I see it a lot like driving cars - no matter how much better and safer we make them, accidents will still happen as long as there’s an ape behind the wheel, and probably even after that. That’s not to say things can’t be improved - they definitely can - but I don’t think it can ever be “fixed,” because the problem isn’t it - it’s us. You can't fix humans by tweaking the code on social media.
It magnifies the worst in people.
The reason why it brings out the worst in people is because it has open borders. You can shit into the network and move on. If you were forced to stay and live with your shit, you'd shit less into the public domain. That means small networks, harder to move to other/new networks, ...
Neat.
Release the epstein files then burn it all down.
Fixing social media is like fixing guns so they can't hurt or kill anyone anymore. Both have been designed for a very particular purpose.
Lemmy is social media. So is Mastodon. So is peer tube. And everything else in the fediverse.
So I wouldn’t compare social media to a gun, across the board.
All those platforms work the same way. In the end it's all about the same social dynamics, about control. "We are the alternative to all the shitty peer groups out there! Join us!" is one of the oldest tricks in the playbook. There is no alternative. Because it's all based on human nature.
What is not social media? Were the forums from before Friendster, MySpace, Facebook social media too? I don't know anyone here. Is a mall a house?
Social media hasn't been designed to cause these problems, though. It's more a babelfish thing.
Every problem is an opportunity to earn even more money or gain even more power. Bad for average users, great for those who own and control the platform.
The original source is here:
https://arxiv.org/abs/2508.03385
Social media platforms have been widely linked to societal harms, including rising polarization and the erosion of constructive debate. Can these problems be mitigated through prosocial interventions? We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms. We create a minimal platform where agents can post, repost, and follow others. We find that the resulting following-networks reproduce three well-documented dysfunctions: (1) partisan echo chambers; (2) concentrated influence among a small elite; and (3) the amplification of polarized voices – creating a “social media prism” that distorts political discourse. We test six proposed interventions, from chronological feeds to bridging algorithms, finding only modest improvements – and in some cases, worsened outcomes. These results suggest that core dysfunctions may be rooted in the feedback between reactive engagement and network growth, raising the possibility that meaningful reform will require rethinking the foundational dynamics of platform architecture.
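For a rough sense of what that "generative social simulation" setup might look like in code, here is a minimal sketch: LLM-backed agents in a toy agent-based loop that can post, repost, follow, or stay passive. Everything here (the class names, the llm_choose_action stub, the random fallback) is an illustrative assumption, not the authors' actual implementation.

```python
# Minimal sketch of the simulation loop described in the abstract.
import random
from dataclasses import dataclass, field

@dataclass
class Post:
    author: int
    text: str
    reposts: int = 0

@dataclass
class Agent:
    agent_id: int
    persona: str                          # partisan persona supplied via prompt
    following: set = field(default_factory=set)

def llm_choose_action(agent, feed):
    """Stand-in for the LLM call: given a persona and a ranked feed,
    return an action ('post', 'repost', 'follow', 'pass') and a target post."""
    # A real run would prompt a language model here; this stub picks randomly.
    action = random.choice(["post", "repost", "follow", "pass"])
    target = random.choice(feed) if feed and action in ("repost", "follow") else None
    return action, target

def step(agents, posts, feed_size=10):
    """One simulation round: each agent sees an engagement-ranked feed and acts."""
    for agent in agents:
        feed = sorted(posts, key=lambda p: p.reposts, reverse=True)[:feed_size]
        action, target = llm_choose_action(agent, feed)
        if action == "post":
            posts.append(Post(author=agent.agent_id,
                              text=f"headline shared by agent {agent.agent_id}"))
        elif action == "repost" and target:
            target.reposts += 1            # reactive engagement feeds network growth
        elif action == "follow" and target:
            agent.following.add(target.author)
        # 'pass' means the agent stays passive this round

agents = [Agent(i, persona=random.choice(["left-leaning", "right-leaning"]))
          for i in range(20)]
posts = []
for _ in range(50):
    step(agents, posts)
```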
Should just be people can't be fixed....
Let's just pretend nothing after MySpace ever happened
Because how to use it is baked into what it is. Like many big tech products, it’s not just a tool but also a philosophy. To use it is also to see the world through its (digital) eyes.
I'm not surprised. I am surprised that the researchers were surprised, though.
Bridging algorithms seem promising.
The results were far from encouraging. Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects. In fact, some interventions actually made the problems worse. For example, chronological ordering had the strongest effect on reducing attention inequality, but there was a tradeoff: It also intensified the amplification of extreme content. Bridging algorithms significantly weakened the link between partisanship and engagement and modestly improved viewpoint diversity, but it also increased attention inequality. Boosting viewpoint diversity had no significant impact at all.
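To make the quoted interventions concrete, here is a hedged sketch of how the different feed orderings might be expressed as ranking functions. The scoring details (especially the min-of-both-sides bridging score) are illustrative guesses at the general idea, not the paper's actual formulas.

```python
# Sketch of the three feed orderings discussed above; posts are plain dicts.

def engagement_feed(posts):
    # Baseline: rank purely by engagement, which the study links to the
    # amplification of polarized voices.
    return sorted(posts, key=lambda p: p["reposts"], reverse=True)

def chronological_feed(posts):
    # Chronological ordering: newest first, no engagement signal at all.
    # (Per the quote, this reduced attention inequality but amplified
    # extreme content.)
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

def bridging_feed(posts):
    # Bridging-style ranking: favour posts approved by both sides rather
    # than posts with the most total engagement.
    return sorted(posts,
                  key=lambda p: min(p["left_approvals"], p["right_approvals"]),
                  reverse=True)
```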
Veerry interesting, yes...
Preprint journalism fucking bugs me, because the journalists themselves can't actually judge whether anything is worth discussing, so they just look for clickbait shit.
This methodology to discover what interventions do in human environments seems particularly deranged to me though:
We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms.
LLM agents trained on social media dysfunction recreate it unfailingly. No shit. I understand they gave them personas to adopt as prompts, but prompts cannot and do not override training data, as we've seen over and over. LLMs fundamentally cannot maintain an identity from a prompt. They are context engines.
Particularly concerning are the silo claims. LLMs riffing on a theme over extended interactions, because the tokens keep coming up that way, is expected behavior. LLMs are fundamentally incurious and even more prone than humans to locking into one line of text, since the longer conversation reinforces it.
Validating the functionality of what the authors describe as a novel approach might be more warranted than drawing conclusions from it.
The article argues that extremist views and echo chambers are inherent in public social networks where everyone is trying to talk to everyone else. That includes Fediverse networks like Lemmy and Mastodon.
They argue for smaller, more intimate networks like group chats among friends. I agree with the notion, but I am not sure how someone can build these sorts of environments without just inviting a group of friends and making an echo chamber.
There's actually some interesting research behind this - Dunbar's number suggests humans can only maintain about 150 meaningful relationships, which is why those smaller networks tend to work better psychologically than the massive free-for-alls we've built.
The dream was that social media would help revitalize the public sphere and support the kind of constructive political dialogue that your paper deems "vital to democratic life." That largely hasn't happened.
Their idea is basically that people need to be told the same things about what to believe so that democracy can work as it's supposed to, and social media is disrupting that with all the conspiracy shit, flame wars, and polarization of opinions. The issue is that this common idea was fomented by the boomer generation. They grew up in a really quite anomalous post-war world where, for the first time in human history, there was a basically monolithic mass media that people actually watched AND had high trust in, AND the system provided more for the masses than it does now. That led to high societal inclusion and high social cohesion, which in turn fed the prosperity. Now we have a fragmented information sphere, things are shit, the political center is hated by most, and radicalism is once again rising.
However, so-called democracy, or collective decision-making in general, does not itself require that people avoid believing crazy shit, that they be fed the best possible validated information, or, god forbid, that they have no unorthodox ideas of their own, no factionalism, and no totally different readings of reality. Those things make the process smoother and help avoid violence, but that "smoothness of process" boomers have come to expect is also why society in wider terms is politically stagnant and rotting. People seem to live in different realities because, in a sense, we do: our economic realities can be so different and so decoupled from the mainstream narrative. It never had to get this bad, but social media is only a venting mechanism, not the reason for the growing divides. The division in society and the general anguish are real IRL; they just take all kinds of irrational and counterproductive forms online. The problem isn't really that people are factional and can't agree with each other; it's that nobody can agree any longer with the monolithic, unpopular political center that is holding on to power for dear life.
Good thing is, you don't need to use it. Bad thing is, it affects reality.
Can't?
I'm on Lemmy, am I not?
It CAN be fixed; the question is whether the will is there. We need to inform and teach more people.
I’m on Lemmy, am I not?
It CAN be fixed; the question is whether the will is there.
While an improvement, Lemmy is far from perfect. The upvote/downvote system from reddit alone encourages groupthink and self-censorship. It doesn't really help much that we can go circlejerk in some other instance if we get hated on or banned by mods; we are still encouraged to keep in line to keep the bubble intact.
After 20 years of living with it, I've decided I don't like the downvote. The upvote is fine.
Reddit's founders tried, early on, to encourage people to treat the downvote as moderation. It was meant to mean that a thing didn't belong on reddit and people shouldn't see it. Of course that quickly became mere dislike or disagreement.
I'd prefer an approach that requires some input about what's wrong with a post in order to reduce its prominence; a restricted list of options as in Slashdot's moderation would be sufficient, I think. I'm not sure whether this should necessarily require also making a report to a more powerful admin/moderator, but I lean toward making that optional in most communities.
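A minimal sketch of what that reason-tagged downvote could look like, loosely in the spirit of Slashdot's restricted moderation labels. The label list, the optional report flag, and the prominence formula are all made up for illustration; nothing here is an existing Lemmy or Slashdot API.

```python
# Sketch of a downvote that requires a stated reason before reducing prominence.
from collections import Counter

DOWNVOTE_REASONS = {"off-topic", "troll", "redundant", "factually wrong"}

class ModeratedPost:
    def __init__(self, text):
        self.text = text
        self.upvotes = 0
        self.reasons = Counter()          # reason label -> count

    def downvote(self, reason, report_to_mods=False):
        # A downvote only counts if the voter says what is wrong with the post.
        if reason not in DOWNVOTE_REASONS:
            raise ValueError(f"reason must be one of {sorted(DOWNVOTE_REASONS)}")
        self.reasons[reason] += 1
        if report_to_mods:
            pass                          # optionally forward to instance moderators

    def prominence(self):
        # Visibility drops only through reason-tagged downvotes.
        return self.upvotes - sum(self.reasons.values())
```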
The problem is algorithms. Remember the whole Bluesky promo all over Lemmy while everyone was shitting on Mastodon? The only thing that's broken is the algorithms, and once you throw them out, social media is immediately fixed. But of course the primary argument for Bluesky over Mastodon was that Mastodon requires you to curate your own content (like joining a sub on reddit to see it on your front page, back before algorithms fucked that site; and people LOVED old reddit, so I fail to see how this is bad or doesn't work, but hey, all of Lemmy said so, so who am I to blame), whereas Bluesky, being a relaunch of Twitter, literally curates content for you whether you actually want to see it or not. For most people, reactionary content is the only content they happily interact with anyway, so algorithms make a lot of sense for them: they feel like they're engaging more with the site, even though it's pointless, empty engagement instead of interacting with real users and real content on sites where you have to actively curate your own feed instead of being fed the lowest-hanging fruit.
/ rant off
Right. We fix ourselves first, we are already here and we do not attempt to control others. We make and go our own way every moment.
Most people don't know about this experience, probably aren't looking for this experience, or would not know how to interact with it. I know it sounds crazy, but Reddit still confuses many people. Lemmy's a different ball of similar wax.
They want the saccharine-coated dopamine-filled mass-produced low-effort meme cesspool that IG, TikTok, etc. all provide. They don't know they want more until they decide they're done with it and start to look. Until then, it's like showing hieroglyphs to an iguana.
Getting banned from Facebook, after a decade of clapping back against racists, has been the best thing in my life. So glad to be out of there. Just wish I could have saved my pics first.
I mean, I feel like just shutting it down would solve at least some problems. Shuttering it all, video sharing platforms included.
Not a situation most anyone would agree on, but it's an idea.
Using Bluesky as the non-algorithmic example is problematic: they still need to show high user engagement numbers for their VC owners. Mastodon does not have the same problem; on the contrary, a Mastodon instance owner has an economic incentive to make sure spambots and troll-factory accounts get closed down ASAP.
As long as people worship themselves (but also, paradoxically, require everyone's attention and approval all the time just to make it to the next day), it will continue being that way. For those who see it for what it is and are disgusted by it, we have Lemmy/discussion boards.
Sounds like it's time to delete it, then.
Social media will be fixed by - wait for it...
Now.
Done. Fixed it, you may thank me later.
Yours,
B-TR3E - the man who fixed social media
Hi it's still broken on my machine. I've tried turning it off and on again
social media is what it's made to be. social media as we use it is flawed.
all of the platforms just do different colors of the same damn thing.
“Fixing” social media is like “fixing” capitalism. Any manmade system can be changed, destroyed, or rebuilt. It’s not an impossible task but will require a fundamental shift in the way we see/talk to/value each other as people.
The one thing I know for sure is that social media won’t ever improve if we all accept the narrative that it can’t be improved.
-Ursula K Le Guin
Seriously, read her books. I looooove „The Dispossessed“
The Left Hand of Darkness is excellent too. Sci-fi from the 1960s about a planet whose people have no fixed sex or gender, and a man from Earth who struggles to understand and function in this society. That description makes it sound very worthy, but it's actually gripping and moving.
LeGuin is a treasure.
Particularly apt given that many of the biggest problems with social media are problems of capitalism. Social media platforms have found it most profitable to monetize conflict and division, the low self-esteem of teenagers, lies and misinformation, envy over the curated simulacrum of a life presented by a parasocial figure.
These things drive engagement. Engagement drives clicks. Clicks drive ad revenue. Revenue pleases shareholders. And all that feeds back into a system that trades negativity in the real world for positivity on a balance sheet.
Yeah, this author is the pop-sci / sci-fi media writer at Ars Technica, not one of the actual science reporters who stick to their area of expertise, and you can tell by the overly broad clickbait headline, which is not actually supported by the research at hand.
The actual research uses limited LLM agents and explores only an incredibly small number of interventions. It does not come remotely close to answering the question of whether or not social media can be fixed, which is itself a different question from harm reduction.
This is spot on. The issue with any system is that people don’t pay attention to the incentives.
When a surgeon earns more if he does more surgeries with no downside, most surgeons in that system will obviously push for surgeries that aren’t necessary. How to balance incentives should be the main focus on any system that we’re part of.
You can pretty much understand someone else’s behavior by looking at what they’re gaining or what problem they’re avoiding by doing what they’re doing.
If you read the article, the argument they are making is that you cannot fix social media by simply tweaking the algorithm. We need a new form of social media that is not just everyone screaming into the void for attention, which includes Lemmy, Mastodon, and other Fediverse platforms.