【学习笔记】懂你英语 核心课 Level 7 Unit 2 Part 3 (III) On Machine Intelligence 3

TED Talk    Machine intelligence makes human morals more important 机器智能使人类道德更重要  Speaker: Zeynep Tufekci    第三课

I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data.    我有一个朋友开发了这样的计算系统,用于根据社交媒体数据预测临床抑郁症或产后抑郁症的可能性。

The results are impressive.    结果令人印象深刻。

Her system can predict the likelihood of depression months before the onset of any symptoms -- months before.    她的系统可以在任何症状出现之前几个月就预测出患抑郁症的可能性,提前好几个月。

No symptoms, there's prediction.    还没有症状,就已经有了预测。

She hopes it will be used for early intervention.  她希望这将被用于早期干预。

Great! But now put this in the context of hiring.    太棒了!但现在把它放到招聘的场景中。


So at this human resources managers conference, I approached a high-level manager in a very large company,    所以在这次人力资源经理会议上,我找了一家非常大的公司的高级经理,

and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with high future likelihood of depression?  我对她说:“看,如果在你不知道的情况下,你的系统正在淘汰那些未来可能有抑郁症的人呢?

They're not depressed now, just maybe in the future, more likely.    他们现在并没有抑郁,只是未来患抑郁症的可能性更高。

What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now?  如果它淘汰的是未来一两年内更有可能怀孕、但现在并没有怀孕的女性,那怎么办?

What if it's hiring aggressive people because that's your workplace culture?"    如果它因为你们的职场文化就是如此,而专门雇用好斗的人,那怎么办?”

You can't tell this by looking at gender breakdowns.    你无法通过查看性别比例来发现这一点。

Those may be balanced.    这些比例可能是均衡的。

And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale."  而且由于这是机器学习,而不是传统的编程,系统里并没有标记为“抑郁症高风险”“怀孕高风险”“好斗程度量表”的变量。

Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box.  你不仅不知道你的系统在依据什么进行筛选,甚至不知道该从哪里查起。它是一个黑匣子。

It has predictive power, but you don't understand it.  它具有预测力,但你不理解它。
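(An illustrative aside, not part of the talk: the sketch below, in Python with scikit-learn and invented resume snippets, shows what "no variable labeled ..." means in practice. A model trained this way exposes only anonymous numeric weights over word features; there is nothing in the code you could point to and read off as a "depression risk" or "pregnancy" criterion.)

```python
# Minimal hypothetical sketch: a tiny text classifier trained on made-up
# past hiring outcomes. The fitted model is just unnamed numeric weights;
# no variable is labeled "higher risk of depression" or the like.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training data: resume text and whether the person was hired.
resumes = [
    "team lead, shipped projects on time, enjoys competitive sports",
    "volunteer, part-time roles, took a gap year for family reasons",
    "exceeded aggressive sales targets, works long hours",
    "careful researcher, prefers quiet focused work",
]
hired = [1, 0, 1, 0]  # hypothetical labels reflecting past human decisions

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(resumes, hired)

# What the model's "variables" actually look like: bare token weights.
vec = model.named_steps["tfidfvectorizer"]
clf = model.named_steps["logisticregression"]
for token, weight in zip(vec.get_feature_names_out(), clf.coef_[0]):
    print(f"{token:12s} {weight:+.3f}")
```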


"What safeguards," I asked, "do you have to make sure that your black box isn't doing something shady?"  “什么保障措施,”我问,“你必须确保你的黑匣子没有做什么可疑的事情吗?”

She looked at me as if I had just stepped on 10 puppy tails.  她看着我,就像我刚刚踩了10条小狗的尾巴。


She stared at me and she said, "I don't want to hear another word about this."  她盯着我说:“关于这件事,我一个字都不想再听了。”

And she turned around and walked away.  她转身走开了。

Mind you -- she wasn't rude.  注意--她并不粗鲁。

It was clearly: what I don't know isn't my problem, go away, death stare.  意思很明显:我不知道的就不是我的问题,走开,外加死亡凝视。


Look, such a system may even be less biased than human managers in some ways.  [跟读]看,这样的系统在某些方面甚至可能比人类管理者偏见更低。

And it could make monetary sense.  而且这在经济上可能是划算的。

But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression.  但它也可能导致抑郁症风险较高的人被持续而又悄无声息地挡在就业市场之外。

Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making to machines we don't totally understand?  这就是我们想要建立的社会吗?我们甚至不知道自己已经这么做了,因为我们把决策交给了自己并不完全理解的机器。

[选择]-What does the system developed by Tufekci's friend do?  -It predicts the likelihood of depression.

-What example does Tufekci give of a potential consequence of using machine intelligence for human hiring?  -People with higher risk of depression would be less likely to be employed.

To weed sb. out = to get rid of them



Another problem is this: these systems are often trained on data generated by our actions, human imprints.  另一个问题是:这些系统通常是由我们的行为、人类印记所产生的数据训练的。

Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us,  嗯,这些数据可能恰恰反映了我们的偏见,而这些系统可能会捕捉到我们的偏见,将其放大,再反馈给我们,

while we're telling ourselves, "We're just doing objective, neutral computation."  而我们告诉自己,“我们只是做客观的,中立的计算。”


Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs.  研究人员发现,在谷歌上,女性比男性更不容易看到高薪职位的招聘广告。

And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none.  而搜索非裔美国人的名字,更有可能出现暗示有犯罪记录的广告,即使当事人并没有犯罪记录。

Such hidden biases and black-box algorithms that researchers uncover sometimes but sometimes we don't know, can have life-altering consequences.  这种隐藏的偏见和黑箱算法,研究人员有时能发现,有时我们却无从知晓,它们可能带来改变人生的后果。


In Wisconsin, a defendant was sentenced to six years in prison for evading the police.在威斯康星州,一名被告因躲避警察被判六年监禁。

You may not know this, but algorithms are increasingly used in parole and sentencing decisions.你可能不知道这一点,但算法越来越多地用于假释和判决决定。

He wanted to know: How is this score calculated? It's a commercial black box.  他想知道:这个分数是怎么计算出来的?它是一个商业黑匣子。

The company refused to have its algorithm be challenged in open court.  该公司拒绝让其算法在公开法庭上接受质疑。

But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find,  但是ProPublica,一个调查性的非营利组织,用他们能找到的公开数据对这个算法本身进行了审计,

and found that its outcomes were biased and its predictive power was dismal, barely better than chance,  发现其结果存在偏见,而且其预测能力很糟糕,仅比随机猜测好一点,

[词义] dismal=not successful or of low quality

and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.  并且它把黑人被告错误地标记为未来罪犯的比率,是白人被告的两倍。

[选择]-What was the result of ProPublica's audit?  -The outcomes of the algorithm were biased against black people.
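(Another illustrative aside, not ProPublica's actual code or data: the hypothetical sketch below shows the kind of check such an audit involves, comparing false positive rates, that is, people labeled high risk who did not in fact reoffend, across groups, and checking overall accuracy against chance.)

```python
# Hypothetical audit sketch with invented records; ProPublica's real study
# used public criminal-justice data, not this toy example.
# Each record: (group, label_from_algorithm, actually_reoffended)
records = [
    ("black", "high", False), ("black", "high", True),
    ("black", "high", False), ("black", "low", True),
    ("white", "low", False), ("white", "high", True),
    ("white", "low", False), ("white", "low", True),
]

def false_positive_rate(rows):
    # Among people who did NOT reoffend, how many were labeled high risk?
    did_not_reoffend = [r for r in rows if not r[2]]
    flagged = [r for r in did_not_reoffend if r[1] == "high"]
    return len(flagged) / len(did_not_reoffend) if did_not_reoffend else 0.0

for group in ("black", "white"):
    rows = [r for r in records if r[0] == group]
    print(group, "false positive rate:", round(false_positive_rate(rows), 2))

# Predictive power vs. chance: how often the label matches what happened.
correct = sum((r[1] == "high") == r[2] for r in records)
print("overall accuracy:", correct / len(records))
```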


So, consider this case: This woman was late picking up her godsister from a school in Broward County, Florida, running down the street with a friend of hers.  所以,来看这样一个案例:这位女士去佛罗里达州布劳沃德县的一所学校接她的干妹妹,但是迟到了,正和一个朋友沿着街道奔跑。

They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it.  她们看到门廊上有一辆没上锁的儿童自行车和一辆滑板车,就愚蠢地骑了上去。

As they were speeding off, a woman came out and said, "Hey! That's my kid's bike!"  就在她们飞快骑走的时候,一个女人出来喊道:“嘿!那是我孩子的自行车!”

They dropped it, they walked away, but they were arrested.  她们扔下车走开了,但还是被逮捕了。


She was wrong, she was foolish, but she was also just 18.她错了,她很愚蠢,但她也只有18岁。

She had a couple of juvenile misdemeanors.  她有过几次少年时期的轻罪记录。

Meanwhile, that man had been arrested for shoplifting in Home Depot -- 85 dollars' worth of stuff, a similar petty crime.与此同时,那名男子因在家得宝商店行窃被逮捕——价值85美元的东西,类似的轻微犯罪。

But he had two prior armed robbery convictions.但他有两次持枪抢劫的前科。

But the algorithm scored her as high risk, and not him.  但该算法把她评为高风险,而不是他。

Two years later, ProPublica found that she had not reoffended.  两年后,ProPublica发现她没有再次犯罪。

It was just hard to get a job for her with her record.  只是有了这个案底,她很难找到工作。

He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime.  而他则再次犯罪,目前正因后来犯下的罪行服8年有期徒刑。

Clearly, we need to audit our black boxes and not have them have this kind of unchecked power.  【跟读】  显然,我们需要审计我们的黑匣子,而不能让它们拥有这种不受约束的权力。
