436 comments
  • Now see, I like the idea of AI.

    What I don't like are the implications, and the current reality of AI.

    I see businesses embracing AI without fully understanding its limits. They stop hiring junior developers and often fire large numbers of seniors because they think AI, a group of cheap post-grad vibe programmers, and a handful of seasoned seniors will equal the workforce they got rid of, when AI, while very good, is not ready to sustain this. It is destroying career progression for the industry, and even if/when they realise it was a mistake, it might already have devastated the industry by then.

    I see the large tech companies tearing through the web, illegally sucking up anything they can access to pull into their ever more costly models, with zero regard for the effects on the economy, the cost to the servers they are hitting, or the environmental toll of the huge power draw that creating these models requires.

    It's a nice idea, but private business cannot be trusted to do this right; we're seeing how to do it wrong, live, before our eyes.

  • Distributed platform owned by no one, founded by people who support individual control of data and content access

    Majority of users are proponents of owning what one makes and supporting those who create art and entertainment

    AI industry shits on all of the above by harvesting private data and creative work without consent or compensation, while also being a money, energy, and attention tar pit

    Buddy, do you know what you're here for?

    EDIT: removed bot accusation, forgot to check user history

  • The currently hot LLM technology is very interesting, and I believe it has legitimate use cases if we develop them into tools that assist with work. (For example, I'm very intrigued by the stuff that's happening in the accessibility field.)

    I mostly have a problem with the AI business. Ludicrous use cases (shoving AI into places where it has no business being). Sheer arrogance about the sociopolitics in general. Environmental impact. LLMs aren't good enough for "real" work, but snake oil salesmen keep saying they can do it, and uncritical people keep falling for it.

    And of course, the social impact was just not what we were ready for. "Move fast and break things" may be a good mantra for developing tech, but not for releasing stuff that has vast social impact.

    I believe the AI business and the tech hype cycle is ultimately harming the field. Usually, AI technologies just got gradually developed and integrated into software where they served a purpose. Now the field is marred with controversy for decades to come.

    • if we develop them into tools that assist with work

      Spoilers: We will not

      I believe the AI business and the tech hype cycle is ultimately harming the field.

      I think this is just the American way of doing business. And it's awful, but at the end of the day people will adopt a technology if it makes them more profit (or at least screws over the correct group of people).

      But where Americanized AI seems to suffer most is in its marketing fully eclipsing its R&D. People seem to have forgotten how DeepSeek spiked the football on OpenAI less than a year ago by making some marginal optimizations to its algorithms.

      The field isn't suffering from the hype cycle nearly so much as it suffers from malinvestment. Huge efforts to make the platform marketable. Huge efforts to shoehorn clumsy chatbots into every nook and cranny of the OS interface. Vanishingly little effort to optimize material consumption, process data effectively, or segregate AI content from the human data it needs to improve.

  • I'd welcome actual AI. What is peddled every day as "AI" is just marketing bullshit. There's no intelligence in it. Language shapes perception, and we should take those words back and use them according to their original and inherent meaning. LLMs are not AI. Stable Diffusion is not AI. Neural networks trained for a singular task are not AI.

  • "B-But you don't understand, AI DESTROYS le epic self employed artists and their bottom line! Art is a sacred thing that we all do for fun and not for work, therefore AI that automates the process is EVIL!"

    Actual thought process of some people

    • AI does do this to a subsection. Claiming that everyone is overreacting is just as stupid, and lacks just as much nuance, as claiming AI is going to ruin all self-employed artists.

      Also, this ignores AI companies stealing blatantly copyrighted material to feed their AI. As an artist, I'd rather not have some randoms steal my stuff so some mid-tier corporation can generate its logos and advertisements without paying for it.

      • I'm not claiming that everyone is overreacting, just pointing out how stupid a lot of anti-AI arguments are. Drawing art for a living gets painted not as a job but as some sort of fun recreational activity, ignoring that artists have to do commissions or draw whatever's popular with the fan base that pays their bills via Patreon, which, in other words, is the process of commodifying oneself, a.k.a. work.

        Also, this ignores AI companies stealing blatantly copyrighted material to feed their AI.

        Not saying that you're necessarily one of those people, but this argument often pops up in leftist spaces that were previously anti-IP, which is a bit hypocritical. One moment people are against intellectual property, calling it abusable, restrictive, etc., but once small artists start getting attacked, suddenly the concept has to be defended.

        As an artist

        womp womp you'll have to get a real job now /s

    • Um, have you tried practicing? Just draw a stick figure or hire an artist; this will easily solve all of your problems. You're welcome.
