Well, what an AI does is look for patterns. Which is also what our brain instinctively does.
In the Stone Age, a primitive man would notice that one guy goes into a cave and doesn't come back, and then another guy goes into a cave and doesn't come back. Our man surely doesn't go into that cave.
Police profiling blacks or Muslims should be official policy.
Ironically you would also save more black lives by stopping more criminals, but we all know that BLM hates white people more than they love black people.
Note how the article doesn't even ask why that could be. It immediately jumps to the conclusion that a fucking computer is sexist.
Moreover, they'd been working on it for a long time before giving up, which proves it was not just a fluke: rational decisions would "discriminate" against women, if by "discriminate" you mean that they would hire them less.
Machine learning in general is... somewhat adversarial in nature. It optimizes towards the provided metrics, not the goal that you think the provided metrics are a proxy for.
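That metric-versus-goal gap is easy to demonstrate with a toy sketch (entirely made-up data, not any real system): a crude perceptron is trained on a set where a spurious proxy feature happens to agree perfectly with the label, so it latches onto the proxy, scores perfectly on its training metric, and stumbles the moment the proxy decouples from the real signal.

```python
# Toy sketch of "optimizing the metric, not the goal" (hypothetical data,
# not any real system). Feature 0 is a spurious proxy that agrees with the
# label everywhere in training; feature 1 is the signal we actually care
# about. The learner optimizes training accuracy, not "use the real signal".

train = [((1.0, 0.9), 1), ((1.0, 0.8), 1), ((0.0, 0.1), 0), ((0.0, 0.2), 0)]
test = [((0.0, 0.9), 1), ((1.0, 0.1), 0)]  # proxy flipped, real signal intact

def predict(w, x):
    return 1 if w[0] * x[0] + w[1] * x[1] > 0.5 else 0

# simple perceptron-style updates
w = [0.0, 0.0]
for _ in range(20):
    for x, y in train:
        err = y - predict(w, x)
        w = [wi + 0.1 * err * xi for wi, xi in zip(w, x)]

train_acc = sum(predict(w, x) == y for x, y in train) / len(train)
test_acc = sum(predict(w, x) == y for x, y in test) / len(test)
print("train accuracy:", train_acc)  # the metric it optimized
print("test accuracy:", test_acc)    # the goal we actually wanted
```

The model does exactly what it was rewarded for: it nails the training metric while leaning on the proxy, so its accuracy drops as soon as the proxy stops tracking the label.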
Yes, they tried AI for hiring but stopped when they saw it was hiring white males.
I don't even think they do psychometric tests anymore, or they couldn't fill their quotas.
The police made a crime-prevention AI, and guess what? It was pretty much always right... after it started to profile. (They shut it down.)
And AI learns how to cheat, too. People assume it's going to be "fair", but there's a reason humans have developed a "spidey sense" about people and situations. Anyway, here is the cheater:
https://techcrunch.com/2018/12/31/this-clever-ai-hid-data-from-its-creators-to-cheat-at-its-appointed-task/
I'd love a source on this. That's hilarious!
There are many articles. This one, for example.
https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G