On Machine Intelligence 3 (5’28)
—— Zeynep Tufekci
I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum(产后的) depression from social media data.
我有一个朋友开发的就是这种计算系统,从社交媒体数据中,预测临床或产后抑郁的可能性。
The results are impressive.
结果令人印象深刻。
Her system can predict the likelihood of depression months before the onset(开端 发生) of any symptoms, months before.
她的系统能够在任何症状出现前几个月就预测出抑郁症的可能性。前几个月,
No symptoms, there's prediction.
还没有症状,就能预测出来。
She hopes it will be used for early intervention(干涉 干预). Great.
她希望这可以对早期干预有用。很棒!
But now put this in the context of hiring.
但是,现在,把这个放到招聘环境中,
But at this human resources managers conference, I approached a high-level manager in a very large company,
但是在这个人力资源经理们的会议上,我接触到了一家大型企业的一位高层经理,
and I said to her, "Look, what if, unbeknownst(未知的 不为人知的) to you, your system is weeding out(清除 淘汰) people with high future likelihood of depression?
然后我对她说:“看,如果在你不知情的情况下,你的系统正在清除那些未来有很高抑郁可能性的人,怎么办?
They're not depressed now, just maybe in the future, more likely.
他们现在没有抑郁,只是未来有可能,更有可能。
What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now?
如果它清除了那些未来一两年有可能怀孕的女性,但是目前没有怀孕,怎么办?
What if it's hiring aggressive people because that's your workplace culture?
如果它因为你的职场文化而招聘了一些富有侵略性的人,怎么办?
You can't tell this by looking at gender breakdowns. Those may be balanced.
你不能通过性别分布来分辨这一点。性别分布可能是平衡的。
And since this is machine learning, not traditional coding,
而且,因为这是机器学习,不是传统的编程,
there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale."
那里没有变量被标注为“更高的抑郁风险,更高的怀孕风险,好斗家伙的程度”。
Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box.
你不仅不知道你的系统在依据什么进行选择,甚至不知道从哪里开始寻找。它是一个黑箱。
It has predictive power, but you don't understand it.
它拥有预测能力,但是你并不了解它。
"What safeguards(保护 保障 捍卫)," I asked, "do you have to make sure that your black box isn't doing something shady(可疑的 鬼祟的 违法的)?"
我问:“你有什么保障措施,可以确保你的黑箱不会做一些见不得人的事情?”
So she looked at me as if I had just stepped on 10 puppy tails.
于是她看着我,就像我刚刚踩到了10条小狗的尾巴。
She stared at me, and she said, "I don't want to hear another word about this."
她凝视着我,说:“我不想再听到任何关于这件事的话。”
She turned around and walked away.
她转身走开了。
Mind you, she wasn't rude. It was clearly a "what I don't know isn't my problem, go away" death stare.
请注意,她并不是无礼。这显然是一种“我不知道的就不是我的问题,走开”式的死亡凝视。
Look, such a system may even be less biased than human managers in some ways.
看,这个系统可能在某些方式上,甚至比人类经理少一些偏见。
And it could make monetary(货币 钱的) sense.
而且它可能在经济上是划算的。
But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression.
但是,它也可能以一种持续而隐秘的方式,将抑郁风险较高的人挡在就业市场之外。
Is this the kind of society we want to build, without even knowing we've done this?
这是我们想要建造的那种社会吗?甚至不知道我们已经做了这些?
Because we turned decision-making over to machines we don't totally understand?
因为我们将决策权转交给了那些我们不完全了解的机器?
1. What does the system developed by Tufekci's friend do?
...It predicts the likelihood of depression.
2. Why is Tufekci concerned about letting machine intelligence hire employees?
...The system may be biased in unexpected ways.
3. To weed somebody out means...
...to get rid of them.
4. Fill in the blanks (选词填空)
Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making over to machines we don't totally understand?
Another problem is this: these systems are often trained on data generated by our actions, human imprints.
Well, they could just be reflecting our biases,
and these systems could be picking up on our biases and amplifying them, and showing them back to us,
while we're telling ourselves, "We're just doing objective, neutral computation."
Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs.
And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none.
Such hidden biases and black-box algorithms that researchers uncover sometimes, but sometimes we don't know, can have life-altering consequences.
In Wisconsin, a defendant was sentenced to 6 years in prison for evading the police.
You may not know this, but algorithms are increasingly used in parole and sentencing decisions.
He wanted to know: How is this score calculated?
It's a commercial black box; the company refused to have its algorithm be challenged in open court.
But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find,
and found that its outcomes were biased and its predictive power was dismal, barely better than chance,
and that it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.
So consider this case:
This woman was late picking up her godsister from a school in Broward County, Florida, running down the street with a friend of hers.
They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it.
As they were speeding off, a woman came out and said, "Hey, that's my kid's bike!"
They dropped it, they walked away, but they were arrested.
She was wrong, she was foolish, but she was also just 18.
She had a couple of juvenile misdemeanors.
Meanwhile, that man had been arrested for shoplifting in Home Depot - 85 dollars' worth of stuff, a similar petty crime.
But he had two prior armed robbery convictions.
But the algorithm scored her as high risk, and not him.
Two years later, ProPublica found that she had not reoffended.
It was just hard for her to get a job with her record.
He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime.
Clearly, we need to audit our black boxes and not have them have this kind of unchecked power.
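For readers curious about what such an audit can look like in practice: the disparity the talk describes boils down to comparing error rates across groups. Below is a minimal, hypothetical sketch of that idea in Python; the field names (race, risk_score, reoffended), the threshold, and the toy data are invented for illustration and are not ProPublica's actual code or dataset.

```python
# Minimal sketch of a group-wise false positive rate audit.
# All field names (race, risk_score, reoffended) and the threshold
# are hypothetical; this is not ProPublica's actual methodology.
from collections import defaultdict

def false_positive_rates(records, threshold=7):
    """Share of people in each group who did NOT reoffend
    but were still scored as high risk (score >= threshold)."""
    flagged = defaultdict(int)  # non-reoffenders scored high risk
    total = defaultdict(int)    # all non-reoffenders
    for r in records:
        if not r["reoffended"]:
            total[r["race"]] += 1
            if r["risk_score"] >= threshold:
                flagged[r["race"]] += 1
    return {group: flagged[group] / total[group] for group in total}

# Toy example: a large gap between groups (e.g. one rate roughly twice
# the other) is the kind of disparity described in the talk.
sample = [
    {"race": "black", "risk_score": 8, "reoffended": False},
    {"race": "black", "risk_score": 4, "reoffended": False},
    {"race": "white", "risk_score": 3, "reoffended": False},
    {"race": "white", "risk_score": 9, "reoffended": True},
]
print(false_positive_rates(sample))  # {'black': 0.5, 'white': 0.0}
```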
1. Why did Tufekci draw a contrast between the two defendants?
..To show how machine intelligence can be racially biased.
2. To audit something is...
...to closely examine it.
3. Fill in the blanks (选词填空)
Searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none.
4. Put the sentences in order (排序)
1) This woman was wrong, she was foolish, but she was also just 18. She had a couple of juvenile misdemeanors.
2) Meanwhile, that man had been arrested for shoplifting in Home Depot - 85 dollars' worth of stuff, a similar petty crime.
3) But he had two prior armed robbery convictions.
4) But the algorithm scored her as high risk, and not him.
5. Listen and retell (听 复述)
She looked at me as if I had just stepped on 10 puppy tails.
6. The company refused to have its algorithm be challenged in open court.