
人脸识别时代的来临

2018-03-12 By Roman Krznari

英语学习 2018年1期
关键词:人脸


在最新的苹果手机发布会上,最大的亮点也是被人吐槽最多的恐怕要算它的人脸识别功能了。除了指纹之外,面部特征无疑也是区分个体差异的最有效方式。人脸识别技术在不远未来的广泛应用是可预见的,但其潜在的安全隐患也值得人们关注。

The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face's ability to send emotional signals, whether through an involuntary blush or the artifice1 of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate2.

Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers' attendance; in Britain, by retailers to spot past shoplifters. In 2017, Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing3 drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple's new iPhone is expected to use it to unlock the homescreen.

Set against human skills, such applications might seem incremental4. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

The final frontier

Start with privacy. One big difference between faces and other biometric5 data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte6, a social network, and can identify people with a 70% accuracy rate. Facebook's bank of facial images cannot be scraped7 by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. Photographs of half of America's adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens' privacy.

The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome8, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.9 But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm10 could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

Keys, wallet, balaclava11

Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants' faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts' decisions about bail and sentencing.12
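The "probabilities" point above can be made concrete. Face-recognition systems typically reduce each face image to a numeric embedding vector and compare vectors by similarity, accepting a match only above some tuned threshold, so every "match" is really a likelihood judgment. The sketch below is purely illustrative: the embedding values and the 0.8 threshold are invented for the example, not taken from any real system.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face embeddings: 1.0 means identical direction,
    # 0.0 means unrelated. Real systems compare learned embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_same_person(emb1, emb2, threshold=0.8):
    # A "match" is a probabilistic call, not a certainty: the system only
    # claims the faces are *likely* the same when similarity clears the
    # (hypothetical) threshold. Lowering it trades false rejections for
    # false matches — the bias-and-error trade-off the article describes.
    return cosine_similarity(emb1, emb2) >= threshold

# Toy embeddings: two similar faces match, two dissimilar ones do not.
print(is_same_person([1.0, 0.0], [0.9, 0.1]))  # similar vectors
print(is_same_person([1.0, 0.0], [0.0, 1.0]))  # dissimilar vectors
```

Because the decision is a thresholded score rather than a fact, any skew in the training data shifts where errors fall, which is how the biases described above arise.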

Eventually, continuous facial recording and gadgets13 that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life.14 If your partner can spot every suppressed yawn, and your boss every grimace15 of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone's face. Relationships might become more rational, but also more transactional.

In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include "faceprints", belongs to its owner and that its use requires consent16—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates' images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.17 Firms that use such technologies should be held accountable.

Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle18 facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes19. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook's plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.

人脸是一件了不起的作品。多到令人惊讶的面部特征帮助人们辨认彼此,而且对于复杂社会的形成至关重要。面部传递情绪信号的能力也是如此,无论是通过不自觉的脸红还是虚伪的假笑。人们花费大量醒着的时间在办公室、法庭以及酒吧和卧室观察人脸,来读取爱慕、敌意、信任和欺骗的迹象。他们也花费许多时间试图掩饰自己。

技术也在快速跟上人类识别脸部的能力。在美国,教堂利用面部识别来监测信徒的到场情况;在英国,零售商利用它来辨认有犯罪历史的扒手。2017年,威尔士警方利用面部识别在一场足球比赛的赛场外抓捕了一名嫌犯。在中国,面部识别被用来验证网约车司机的身份,允许游客进入景点以及让人们通过微笑付款。苹果的新iPhone也将使用它来解锁主屏幕。

与人类的技能相比,这些应用看起来似乎只是细枝末节的进步。一些突破性进步,比如飞机或互联网,显然改变了人类的能力;而面部识别似乎只是对人类能力的编码。虽然面孔是个人所特有的,但它们也是公开的,因此,乍看之下,技术并没有侵犯到私人领域。然而,低成本、高速度、大规模地记录、存储和分析面部图像的能力有一天会使隐私、公平和信任的概念从根本上发生改变。

最后的边界

从隐私说起。面部和其他生物识别数据(如指纹)之间的一大差异在于面部识别在一定距离之外就可以完成。任何人只要有部手机就可以拍张照供面部识别程序使用。FindFace是俄罗斯的一款应用程序,能将陌生人的照片与社交网络VKontakte上的照片进行比较,其人像识别的准确率高达70%。Facebook的面部图像库不能被其他人用程序自动抓取,但是这家硅谷巨头,比方说,可以获得那些光顾了汽车展厅的参观者的照片,之后利用面部识别技术向他们展示汽车广告。即使私营公司无法将照片和身份关联起来,国家却往往可以。美国一半成年人的照片存储在联邦调查局能够使用的数据库中。执法机构如今在追踪罪犯的能力方面拥有了一件强大的武器,但可能会以触及公民隐私作为巨大代价。

人脸并非只是一个姓名牌。人脸可以展示出许多其他信息——而机器也能读出这些来。当然,这肯定会带来好处。一些公司正在通过分析面部来自动诊断罕见的遗传病,比如遗传性骨发育不良并肢端溶骨症,诊断速度之快远远超过其他可能的诊断方法。情绪评估系统也许能帮助自闭症患者掌握他们觉得难以理解的社会信号。但这项技术也会带来威胁。斯坦福大学的研究人员已经表明,面对一张男同的照片和一张直男的照片,计算机算法能够以高达81%的准确率判断他们的性取向,而人眼的准确率只有61%。在同性恋尚不合法的国家,通过人脸就能准确推断性取向的软件令人感到前景堪忧。

钥匙、钱包、巴拉克拉法帽

不那么暴力的形形色色的歧视也可能会普遍起来。雇主早已经可以因为偏见而拒绝录用求职者。但是面部识别可能会使这种偏见变得司空见惯,让企业能够基于种族以及智力和性取向的面部特征来筛选求职申请。夜店和体育场可能会迫于保护民众的压力而不得不扫描入场人员的脸来防止暴力威胁——尽管,由于机器学习的本质,所有面部识别系统都不可避免地是在概率上做文章。此外,这些系统可能对有色人种存在偏见,因为这些算法主要是根据采集自白人的面部数据而训练得出的,因此并不能很好地适用于其他种族。这类偏见已经出现在用于供法院保释和量刑参考的自动化评估中。

最终,持续的面部记录和将计算机化的数据涂画到现实世界的电子设备可能会改变社交活动的肌理。掩饰内心有助于润滑日常生活的各个环节。如果你的伴侣能够发现每一个被强忍住的哈欠,你的老板可以注意到每一张恼怒的苦脸,婚姻和工作关系诚然多了一份真实,却少了一份和谐。社交的基础也可能会发生改变,从基于信任的种种承诺转变为计算机通过某人面部信息所计算出的风险和回报。人际关系可能会变得更加理性,但也感觉更像在做交易。

在民主国家,至少立法可以帮助调节利弊端之间的平衡。欧洲监管部门已经制定出了一套原则用于即将出台的数据保护法规,要求生物识别信息(包括“面部信息”)属于其所有者,其使用需要经过所有者授权同意——因此,在欧洲,与美国不同,Facebook不能向那些光顾了汽车展厅的参观者展示广告。反歧视的法律也可用于禁止雇主扫描应聘者面部图像的情况。商用面部识别系统的供应商可能会被要求接受审核,以表明其系统不会在无意中传播偏见。使用这类技术的公司应该承担责任。

然而,这些规定并不能改变大势所趋。照相机只会随着可穿戴设备的普及而更加普遍。无论是依靠戴太阳镜还是化妆来迷惑面部识别系统的尝试都已经不再奏效;剑桥大学的研究表明人工智能可以对那些伪装自己的人进行面部重构。谷歌已经明确表示不赞成将人脸与身份匹配,因为担心会遭到非民主政权的滥用。其他技术公司则似乎没有这么讲究。亚马逊和微软都在利用其云服务来提供面部识别;而其对于Facebook的计划也十分关键。政府不会让面部识别带来的好处白白溜走。变革正在来临。面对吧。

1. artifice: 诡计,狡诈。

2. dissimulate: 隐藏(真实情感或目的)。

3. ride-hailing: 叫车服务。

4. incremental: 逐步增长的。

5. biometric: 生物识别的。

6. VKontakte: 俄罗斯最大的社交网站,VKontakte为“保持联系”之意。

7. scrape: 本义是“艰难取得,勉强获得”,这里指利用爬虫程序抓取信息,爬虫程序是一种数据采集程序。

8. Hajdu-Cheney syndrome: 遗传性骨发育不良并肢端溶骨症,于1948年和 1965年分别由Hajdu和Cheney两位放射科医生进行了病例报道。

9. autistic: 自闭症的;elusive: 难懂的。

10. algorithm: 算法。

11. balaclava: 巴拉克拉法帽,一种仅露双眼和鼻子的羊毛头罩,本来用于御寒,后来由于其能掩盖脸部、隐藏身份,常被特种部队、恐怖分子、劫匪等佩戴。

12. crop up: 发生,出现;bail: 保释;sentence: 判决。

13. gadget:(电子或机械)小装置。

14. dissemble: 掩饰(真实的情感或想法);grease: 给……加润滑油。

15. grimace:(表示疼痛或厌恶等的)怪相,鬼脸。

16. embed sth. in: 使嵌入,使成为……的重要部分;decree: 下令,命令;consent: 同意,许可。

17. audit: 审核,严格检查;propagate: 宣传,传播。

18. bamboozle: 愚弄,蒙蔽。

19. regime: 政权,政体。
