AI Brings Fairness (Part 1)

2022-12-12 | Source: 和谐英语

Culture

Book review: Technology and fairness

The algorithm’s mercy

Many books focus on the problems of big data and artificial intelligence; two new ones offer refreshingly positive solutions.

The Equality Machine. By Orly Lobel.

Escape from Model Land. By Erica Thompson.

Two years ago, when Elinor Lobel was 16, a “smart” insulin pump was attached to her body.

Powered by artificial intelligence (AI), it tracks her glucose levels and administers the right dose of insulin at the right time to keep her healthy.
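
At bottom such a device is a feedback loop: measure glucose, compare it with a target, dose, repeat. The Python sketch below illustrates only the shape of that loop; every constant in it is invented for illustration, and nothing here reflects the real pump's algorithm or constitutes medical guidance.

```python
# Toy closed-loop dosing sketch -- illustrative only, not a medical algorithm.
TARGET_MG_DL = 110        # hypothetical target glucose level
CORRECTION_FACTOR = 50    # hypothetical: 1 unit of insulin per 50 mg/dL excess
MAX_BOLUS_UNITS = 2.0     # hypothetical per-cycle safety cap

def correction_dose(glucose_mg_dl: float) -> float:
    """Return a capped correction dose for one control cycle."""
    excess = glucose_mg_dl - TARGET_MG_DL
    if excess <= 0:
        return 0.0                     # at or below target: deliver nothing
    return min(excess / CORRECTION_FACTOR, MAX_BOLUS_UNITS)

# One simulated cycle: a reading of 180 mg/dL yields a 1.4-unit dose.
print(correction_dose(180.0))
```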

It is a miraculous innovation for diabetes sufferers and just one of myriad new ways that data and AI can help improve lives.

Books that decry the dark side of data abound.

With menacing titles such as “Weapons of Math Destruction” and “Algorithms of Oppression”, they suggest that there is much more to fear than fete in the algorithmic age.

The public is duly alarmed; ditto policymakers.

For instance, a proposed European Union directive may hold back some educational applications of AI, such as its use in marking exams.

But the intellectual tide may be turning.

One of the most persuasive proponents of a more balanced view is Elinor Lobel’s mother, Orly, a law professor at the University of San Diego.

In “The Equality Machine” she acknowledges AI’s capacity to produce skewed and harmful results.

But she shows how, in the right hands, it can also be used to combat inequality and discrimination.

“We need to cut through the utopian/dystopian dualism,” she writes.

“The goal should be progress, not perfection.”

For example, women selling goods on eBay tend to receive less money than men for the same item.

Apprised of that bias, the website can hide vendors’ personal details until an offer is made, or alert them to higher prices in similar transactions.
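
The first of those remedies is easy to picture in code. Below is a minimal sketch, with hypothetical field names, of how a marketplace might withhold identity details from a listing until a bid arrives; eBay's actual systems are of course far more elaborate.

```python
# Hypothetical identity fields a marketplace might mask before the first offer.
HIDDEN_FIELDS = {"seller_name", "seller_photo"}

def listing_view(listing: dict, offer_made: bool) -> dict:
    """Return the listing as buyers see it, masking identity fields pre-offer."""
    if offer_made:
        return dict(listing)
    return {k: v for k, v in listing.items() if k not in HIDDEN_FIELDS}

item = {"title": "Vintage camera", "price": 120,
        "seller_name": "Jane D.", "seller_photo": "jane.jpg"}
print(listing_view(item, offer_made=False))  # identity withheld until a bid is made
```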

Meanwhile women looking for jobs are less likely than men to respond to postings that use military jargon such as “mission critical” and “hero”.

Textio, an AI firm, helps companies recruit female employees by scanning listings and recommending alternative language.
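
Textio's models are proprietary; the sketch below is only a crude stand-in for the general idea — scan a listing for militarised phrases and propose gentler substitutes, using a made-up phrase table.

```python
# Made-up phrase table -- illustrative only, not Textio's data or method.
ALTERNATIVES = {
    "mission critical": "essential",
    "hero": "key contributor",
    "battle-tested": "proven",
}

def suggest_rewrites(listing: str) -> list[tuple[str, str]]:
    """Return (flagged phrase, suggested alternative) pairs found in a listing."""
    text = listing.lower()
    return [(phrase, alt) for phrase, alt in ALTERNATIVES.items() if phrase in text]

posting = "Seeking a hero for a mission critical engineering role."
for phrase, alt in suggest_rewrites(posting):
    print(f'consider replacing "{phrase}" with "{alt}"')
```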

“The Equality Machine” buzzes with such examples, revealing a hidden world of coders, data scientists and activists who are working on the technical means to achieve ethical ends, not simply griping about AI’s lapses.

The book aptly describes the workings of various AI systems, but its main contribution is to reframe problems in constructive ways.

A tenet of privacy rules is “minimisation”: collect and retain as little information as possible, especially in areas such as race, gender and sexual orientation.

Ms Lobel flips the script, showing how in countless cases of medical diagnosis and treatment, as well as in hiring, pay and the legal system, knowing such characteristics can lead to fairer outcomes.

For example, in the past American regulators did not track the performance of medical devices by the sex of patients, though an independent study suggested women experience twice as many deaths and injuries as men.
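
The statistical point is straightforward: a disparity like that one is invisible unless outcomes are recorded and broken down by the attribute in question. A minimal sketch, with invented numbers, of the disaggregated audit that strict minimisation would forbid:

```python
from collections import Counter

# Invented adverse-event reports for one device, tagged by patient sex.
reports = ["F", "F", "M", "F", "M", "F", "F", "M"]
patients_treated = {"F": 1000, "M": 1000}  # hypothetical denominators

counts = Counter(reports)
for sex, treated in patients_treated.items():
    rate = counts[sex] / treated
    print(f"{sex}: {counts[sex]} events / {treated} patients = {rate:.3%}")
# Strip the sex tag and only the pooled rate survives -- the gap disappears.
```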
