I'd love a source on this. That's hilarious!
There are many articles. This one, for example:
Note how the article doesn't even ask why that could be. It jumps straight to the conclusion that a fucking computer is sexist.
Moreover, they worked on it for a long time before giving up, which shows it was not just a fluke: rational decisions would "discriminate" against women, if by "discriminate" you mean hire them at a lower rate.
https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G
Machine learning in general is... somewhat adversarial in nature. It optimizes towards the provided metrics, not the goal that you think the provided metrics are a proxy for.
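A minimal sketch of that point, with entirely made-up data: a model trained to imitate past hiring decisions minimizes disagreement with those labels, so it reproduces whatever pattern the labels contain, including any correlation with an irrelevant feature. None of the numbers here come from the article; they're chosen only so the effect is visible.

```python
# Hypothetical history: each resume is a tuple of
# (contains_womens_keyword, skilled, hired_in_the_past).
# In this invented data, the keyword lowered the hire rate
# independently of skill.
history = [
    (0, 1, 1), (0, 1, 1), (0, 0, 0), (0, 1, 1),
    (1, 1, 0), (1, 1, 0), (1, 0, 0), (1, 1, 1),
]

def hire_rate(rows):
    """Fraction of rows whose past-hiring label is 1."""
    return sum(h for *_, h in rows) / len(rows)

# A trivially "optimal" model: memorize the per-keyword hire rate.
# This is exactly what minimizing loss against the labels rewards --
# agreement with past decisions, not any notion of merit.
rates = {
    k: hire_rate([r for r in history if r[0] == k]) for k in (0, 1)
}
print(rates)  # the keyword=1 group ends up with a lower learned score
```

The "model" here is deliberately dumb, but a real classifier fit on the same labels faces the same incentive: the metric it optimizes is agreement with historical decisions, and that is not the same objective as "pick the best candidate".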