Posts: 0 · Comments: 9 · Joined: 6 mo. ago

  • As someone fairly new to Lemmy, y'all should just try making informative posts. I see lots of posts exactly like this, but there's never any substance. What exactly did those instances do to earn their reputation? Start there. I'm close to blocking some communities because the lore just has no entry point, so why bother?

  • I feel sorry for Americans. The better of the two viable parties is a party keen on supplying a genocide, and somehow if you vote third party you're more responsible than said genocide suppliers, the even worse Republicans, and non-voters. That's wild.

    The uncomfortable reality is that you aren't gonna solve your country's problems in the voting booths. Good luck to you, genuinely.

  • Yes, but they also enforced a carbon tax on provinces that removed their own program.

    There's also Carney scrapping the capital gains tax increase on cap gains over 250k.

    I don't deny they've always been a right wing party, but they seem to be moving further right - that's my only point.

  • Liberals when conservatives do conservative shit: "this is awful" (rightly so)

    Liberals when liberals do conservative shit: "this is fine" (it's not)

    Everything points to a right-wing pivot for the liberals and the voters seem just fine with that.

  • I agree with you, and I use it that same way. But I think it should be something the user explicitly seeks out. The problem is that everyone who uses Google now unintentionally uses an LLM in the exact same way they've always found human-written content. It's fundamentally different content, so shoving it into the existing interface is begging for confusion.

  • The main problem I see is that Google just shouldn't include AI results. And they definitely shouldn't put their unreliable LLM front and center on the results page. When you google something, you want accurate information, which the LLM might have, but only if that data was readily available to begin with. So the stuff it can help with is the stuff the search would surface first anyway.

    For anything requiring critical thought or research, the LLM will often hallucinate or misrepresent. The danger is that people do not always apply critical thinking. Defaulting to showing an LLM response is extremely dangerous, and it's basically pointless.