I think the problem with anthropomorphizing LLMs this way is that they don't have intent, so they can't have responsibility. If this piece of software had been given the tools to actually kill someone, I think we all understand that it wouldn't be appropriate to put the LLM on trial. Instead, we need to be looking at the people who are trying to give more power to these systems and dodge responsibility for their failures. If this LLM had caused someone to be killed, then the person who tied critical systems into a black-box piece of software that is poorly understood and not fit for the purpose is the one who should be on trial. That's my problem with anthropomorphizing LLMs: it shifts the blame and responsibility away from the people who are responsible for attempting to use them for their own gain, at the expense of others.
Fantastic article. I've had similar thoughts when reading articles on that Marist poll in particular: it seemed like a much weaker statement than most of the coverage was implying.