
Posts: 2 · Comments: 16 · Joined: 7 mo. ago

  • While you all laugh at ChatGPT slop leaving "as a language model..." cruft everywhere, from Twitter political bots to published Springer textbooks, over there in lala land "AIs" are rewriting their reward functions and hacking the matrix and spontaneously emerging mind models of Diplomacy players and generally a week or so from becoming the irresistible superintelligent hypno goddess:

    https://www.reddit.com/r/196/comments/1jixljo/comment/mjlexau/

  • The problem with FOSS for me is the other side of the FOSS surplus: namely corporate encircling of the commons. The free software movement never had a political analysis of the power imbalance between capital owners and workers. This results in the "Freedom 0" dogma, which lets everything workers produce with a genuine communitarian, laudably pro-social sentiment be easily coopted and appropriated into the interests of capital owners (for example with embrace-and-extend, network effects, product bundling, or creative backstabbing of the kind Google did to Linux with the Android app store). LLM scrapers are just the latest iteration of this.

    A few years back various groups tried to tackle this problem with a shift to "ethical licensing", such as the non-violent license, the anti-capitalist software license, or the do no harm license. While license-based approaches won't stop capitalists from using the commons to target immigrants (NixOS), enable genocide (Meta) or bomb children (Google), this was in my view worthwhile as a rallying cry of sorts; drawing a line in the sand between capital owners and the public. So if you put your free time into a software project meant for everyone and some billionaire starts coopting it, you can at least make it clear it's non-consensual, even if you can't out-lawyer capital owners. But these ethical-license initiatives didn't seem to make any strides, due to the FOSS culture issue you describe: traditional software repositories didn't acknowledge them or build any infrastructure for them, and ethical licenses would still be generically "non-free" in FOSS spaces.

    (Personally, I've been using FOSS operating systems for 26 years now; I gave up on contributing or participating in the "community" a long time ago, burned out by all the bigotry, hostility, and First World-centrism of its forums.)

  • I hate programming, but if I wanted to waste any time programming stuff, my idea would be something akin to Yahoo! Directory from before Google, or del.icio.us from the 2000s, but distributed, and tied to a PGP-like web of trust system.

    You search for a topic; you get links saved with that tag, first by people you personally validated and trust, then by people they trust, then by people you don't know but who were vouched for as probably fine, and so on. Dunno how doable something like this would be.
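    The core ranking step could be sketched as a breadth-first walk over the trust graph. This is a minimal sketch under my own assumptions, not an existing system: the name `rank_by_trust`, the trust graph as a dict of who-trusts-whom, and bookmarks as (user, url, tags) tuples are all made up for illustration.

    ```python
    from collections import deque

    def rank_by_trust(me, trust, bookmarks, tag):
        """Rank links saved with `tag`, nearest trusted savers first.

        trust:     dict mapping a user to the set of users they directly trust
        bookmarks: iterable of (user, url, tags) tuples
        Distance 1 = someone I validated myself, 2 = trusted by them, etc.
        Savers unreachable in the trust graph are dropped entirely.
        """
        # BFS from `me` assigns each reachable user a trust distance.
        dist = {me: 0}
        queue = deque([me])
        while queue:
            user = queue.popleft()
            for peer in trust.get(user, ()):
                if peer not in dist:
                    dist[peer] = dist[user] + 1
                    queue.append(peer)

        # Keep matching bookmarks from reachable savers, closest first.
        hits = [(dist[user], url) for user, url, tags in bookmarks
                if tag in tags and user in dist]
        return [url for _, url in sorted(hits)]
    ```

    So with `trust = {"me": {"alice"}, "alice": {"bob"}}`, a link alice tagged would rank ahead of one bob tagged, and a stranger's link wouldn't appear at all; a real system would presumably cap the walk depth and append unknown savers at the end as "probably fine".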

  • TechTakes @awful.systems

    The broken search bar is symbiotic with the bullshitting chatbot

  • Yes to all that, plus the browser thing: how annoying browsers are about expired certificates. I mean, it has to be super hard to let me guess that the admin just forgot to renew the certificate, or it wouldn't protect me from the very common threat model of... um... uh...

    (It's to protect the CA business model, of course.)

  • I find it impressive how the gen-AI industry developed a technology that is fine-tuned to generate content that looks precisely passably plausible, but never good enough to be correct or interesting or beautiful or worthwhile in any way.

    Like if I were trying to fill the Internet with noise to ruin it, on purpose, I couldn't do better than this (mostly on account of me not having massive data centres nor the moral callousness to spew that much carbon, but still). It's like the ideal infohazard weapon if your goal is to worsen as many lives as you can.

  • OK so we're getting into deep rat lore now? I'm so sorry for what I'm about to do to you. I hope one day you can forgive me.

    LessWrong diaspora factions! :blobcat_ohno:

    https://transmom.love/@elilla/113639471445651398

    if I got something wrong, please don't tell me. gods I hope I got something wrong. "it's spreading disinformation" I hope I am

  • I got some very intense, frequent bullying in 90s Latin America for being perceived as queer, before even understanding myself that I was actually queer.

    I don't think there was ever anything like the jocks from US movies. Bullies tended to be troubled kids from difficult backgrounds, the kind of kid who would be themself exposed to violence and abuse at home or in their neighbourhood. A handful were from religious fundamentalist families.

    There was some hostility towards children who took school too seriously or were perceived as teacher's pets, but I don't think that in itself would have inspired "slapped every day" levels of bullying. I don't remember bullying due to what today are called fandoms or geeky interests; they were just much less known.

  • I find the polygraph to be a fascinating artifact. mostly on account of how it doesn't work. it's not that it kinda works, that it more or less works, or that if we just iron out a few kinks the next model will do what polygraphs claim to do. the assumptions behind the technology are wrong. lying is not physiological; a polygraph cannot and will never work. you might as well hire me to read the suspects' tarot, my rate of success would be as high or higher.

    yet the establishment pretends that it works, that it means something. because the State desperately wants to believe that there is a path to absolute surveillance, a way to make even one's deepest subjectivity legible to the State, amenable to central planning (cp. the inefficacy of torture). they want to believe it so much, they want this technology to exist so much, that they throw reality out of the window, ignore not just every researcher ever but the evidence of their own eyes and minds, and pretend very hard, pretend deliberately, willfully, desperately, that the technology does what it cannot do and will never do. just the other day some guy was condemned to use a polygraph in every statement for the rest of his life. again, this is no better than flipping a coin to decide if he's telling the truth, but here's the entire System, the courts the judge the State itself, solemnly condemning the man to the whims of imaginary oracles.

    I think this is how "AI" works, but on a larger scale.

  • TechTakes @awful.systems

    Disapproving of automated plagiarism is classist ableism, actually: Nanowrimo