In NYC, companies will have to prove their AI hiring software isn't sexist or racist

www.nbcnews.com

AI-infused hiring programs have drawn scrutiny, most notably over whether they end up exhibiting biases based on the data they’re trained on.

26 comments
  • Well, the problem with that is you’d have to prove your hiring requirements aren’t exclusionary, which isn’t going to happen until people start to examine their biases on a societal level. The problem with AI isn’t that it’s biased. The problem is that both the dataset and the task given are biased, which will always result in a biased system.

  • It’s also not clear how the law will be enforced or to what extent.

    No shit. Isn't that the point? Use outrage to justify the growth of an impenetrable body of law addressing all social and economic behavior, then selectively enforce subjective interpretations to satisfy powerful groups and remain in power. So it goes for any population center whose rapid growth creates the illusion of independence.

    Can we sell NYC to Canada yet? We'd make a bundle, they wouldn't even be mad, and I'd sleep a lot better with the border of a superpower between me and a stack of nine million people who think privacy is a sin.

  • Well, when AI takes over everything, at least it won't be racist, unlike the current police force.

    It will disrespect and threaten all of us equally badly.