

        人臉識別時代的來臨

        2018-03-12 19:11:41 By Roman Krznaric
        英語學習 2018年1期

        在最新的蘋果手機發布會上,最大的亮點也是被人吐槽最多的恐怕要算它的人臉識別功能了。除了指紋之外,面部特征無疑也是區分個體差異的最有效方式。人臉識別技術在不遠未來的廣泛應用是可預見的,但其潛在的安全隱患也值得人們關注。

        The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face's ability to send emotional signals, whether through an involuntary blush or the artifice1 of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate2.

        Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers' attendance; in Britain, by retailers to spot past shoplifters. In 2017, Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing3 drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple's new iPhone is expected to use it to unlock the homescreen.

        Set against human skills, such applications might seem incremental4. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

        The final frontier

        Start with privacy. One big difference between faces and other biometric5 data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte6, a social network, and can identify people with a 70% accuracy rate. Facebook's bank of facial images cannot be scraped7 by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. Photographs of half of America's adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens' privacy.
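The matching described above can be sketched in a few lines: modern systems typically convert each face photo into a numeric "embedding" and compare embeddings by similarity, declaring a match only above some threshold. The vectors, names and threshold below are invented for illustration; this is not FindFace's or Facebook's actual pipeline.

```python
import math

# Toy "embeddings": real systems derive hundreds of numbers per face
# from a neural network; these short vectors are purely illustrative.
ENROLLED = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    # Similarity of two embeddings: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(probe, threshold=0.95):
    """Return the best-matching enrolled name, or None below threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, e)) for n, e in ENROLLED.items()),
        key=lambda t: t[1],
    )
    return name if score >= threshold else None
```

A probe embedding close to an enrolled one returns that name; anything below the threshold returns None — which is why vendors quote accuracy rates rather than certainties.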

        The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome8, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.9 But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man and one straight man, the algorithm10 could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

        Keys, wallet, balaclava11

        Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants' faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts' decisions about bail and sentencing.12
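The point that such systems "deal in probabilities", and that one cut-off can treat groups unequally, can be sketched with invented numbers — nothing below comes from a real system:

```python
# Hypothetical similarity scores a face-recognition system assigns to
# pairs of DIFFERENT people (so any "match" is a false positive).
# Both lists are invented to illustrate the argument, not measured.
GROUP_A = [0.10, 0.20, 0.30, 0.40, 0.96]
GROUP_B = [0.30, 0.50, 0.96, 0.97, 0.98]

THRESHOLD = 0.95  # the same cut-off is applied to everyone

def false_positive_rate(scores, threshold=THRESHOLD):
    """Fraction of non-matching pairs wrongly declared a match."""
    return sum(s >= threshold for s in scores) / len(scores)

# A single global threshold can yield very different error rates per
# group when the score distributions differ, e.g. because the training
# data under-represented one group.
```

Here the shared threshold misidentifies 1 in 5 strangers from the first group but 3 in 5 from the second — the shape of the bias the bail-and-sentencing audits mentioned above have turned up.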

        Eventually, continuous facial recording and gadgets13 that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life.14 If your partner can spot every suppressed yawn, and your boss every grimace15 of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someones face. Relationships might become more rational, but also more transactional.

        In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent16—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates' images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.17 Firms that use such technologies should be held accountable.

        Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle18 facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes19. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook's plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.

        人臉是一件了不起的作品。多到令人驚訝的面部特征幫助人們辨認彼此,而且對于復雜社會的形成至關重要。面部傳遞情緒信號的能力也是如此,無論是通過不自覺的臉紅還是虛偽的假笑。人們花費大量醒著的時間在辦公室、法庭以及酒吧和臥室觀察人臉,來讀取愛慕、敵意、信任和欺騙的跡象。他們也花費許多時間試圖掩飾自己。

        技術也在快速跟上人類識別臉部的能力。在美國,教堂利用面部識別來監測信徒的到場情況;在英國,零售商利用它來辨認有犯罪歷史的扒手。2017年,威爾士警方利用面部識別在一場足球比賽的賽場外抓捕了一名嫌犯。在中國,面部識別被用來驗證網約車司機的身份,允許游客進入景點以及讓人們通過微笑付款。蘋果的新iPhone也將使用它來解鎖主屏幕。

        與人類的技能相比,這些應用看起來似乎只是細枝末節的進步。一些突破性進步,比如飛機或互聯網,顯然改變了人類的能力;而面部識別似乎只是對人類能力的編碼。雖然面孔是個人所特有的,但它們也是公開的,因此,乍看之下,技術并沒有侵犯到私人領域。然而,低成本、高速度、大規模地記錄、存儲和分析面部圖像的能力有一天會使隱私、公平和信任的概念從根本上發生改變。

        最后的邊界

        從隱私說起。面部和其他生物識別數據(如指紋)之間的一大差異在于面部識別在一定距離之外就可以完成。任何人只要有部手機就可以拍張照供面部識別程序使用。FindFace是俄羅斯的一款應用程序,能將陌生人的照片與社交網絡VKontakte上的照片進行比較,其人像識別的準確率高達70%。Facebook的面部圖像庫不能被其他人用程序自動抓取,但是這家硅谷巨頭,比方說,可以獲得那些光顧了汽車展廳的參觀者的照片,之后利用面部識別技術向他們展示汽車廣告。盡管私營公司無法將照片和身份關聯起來,國家卻往往可以。美國一半成年人的照片存儲在聯邦調查局能夠使用的數據庫中。執法機構如今在追蹤罪犯的能力方面擁有了一件強大的武器,但可能會以觸及公民隱私作為巨大代價。

        人臉并非只是一個姓名牌。人臉可以展示出許多其他信息——而機器也能讀出這些來。當然,這肯定會帶來好處。一些公司正在通過分析面部來自動診斷罕見的遺傳病,比如遺傳性骨發育不良并肢端溶骨癥,診斷速度之快遠遠超過其他可能的診斷方法。情緒評估系統也許能幫助自閉癥患者掌握他們覺得難以理解的社會信號。但這項技術也會帶來威脅。斯坦福大學的研究人員已經表明,面對一張男同的照片和一張直男的照片,計算機算法能夠以高達81%的準確率判斷他們的性取向,而人眼的準確率只有61%。在同性戀尚不合法的國家,通過人臉就能準確推斷性取向的軟件令人感到前景堪憂。

        鑰匙、錢包、巴拉克拉法帽

        不那么暴力的形形色色的歧視也可能會普遍起來。雇主早已經可以因為偏見而拒絕錄用求職者。但是面部識別可能會使這種偏見變得司空見慣,讓企業能夠基于種族以及智力和性取向的面部特征來篩選求職申請。夜店和體育場可能會迫于保護民眾的壓力而不得不掃描入場人員的臉來防止暴力威脅——盡管,由于機器學習的本質,所有面部識別系統都不可避免地是在概率上做文章。此外,這些系統可能對有色人種存在偏見,因為這些算法主要是根據采集自白人的面部數據而訓練得出的,因此并不能很好地適用于其他種族。這類偏見已經出現在用于供法院保釋和量刑參考的自動化評估中。

        最終,持續的面部記錄和將計算機化的數據涂畫到現實世界的電子設備可能會改變社交活動的肌理。掩飾內心有助于潤滑日常生活的各個環節。如果你的伴侶能夠發現每一個被強忍住的哈欠,你的老板可以注意到每一張惱怒的苦臉,婚姻和工作關系誠然多了一份真實,卻少了一份和諧。社交的基礎也可能會發生改變,從基于信任的種種承諾轉變為計算機通過某人面部信息所計算出的風險和回報。人際關系可能會變得更加理性,但也感覺更像在做交易。

        在民主國家,至少立法可以幫助調節利弊之間的平衡。歐洲監管部門已經制定出了一套原則用于即將出臺的數據保護法規,要求生物識別信息(包括“面部信息”)屬于其所有者,其使用需要經過所有者授權同意——因此,在歐洲,與美國不同,Facebook不能向那些光顧了汽車展廳的參觀者展示廣告。反歧視的法律也可用于禁止雇主掃描應聘者面部圖像的情況。商用面部識別系統的供應商可能會被要求接受審核,以表明其系統不會在無意中傳播偏見。使用這類技術的公司應該承擔責任。

        然而,這些規定并不能改變大勢所趨。照相機只會隨著可穿戴設備的普及而更加普遍。無論是依靠戴太陽鏡還是化妝來迷惑面部識別系統的嘗試都已經不再奏效;劍橋大學的研究表明人工智能可以對那些偽裝自己的人進行面部重構。谷歌已經明確表示不贊成將人臉與身份匹配,因為擔心會遭到非民主政權的濫用。其他技術公司則似乎沒有這么講究。亞馬遜和微軟都在利用其云服務來提供面部識別;而其對于Facebook的計劃也十分關鍵。政府不會讓面部識別帶來的好處白白溜走。變革正在來臨。面對吧。

        1. artifice: 詭計,狡詐。

        2. dissimulate: 隱藏(真實情感或目的)。

        3. ride-hailing: 叫車服務(wù)。

        4. incremental: 逐步增長的。

        5. biometric: 生物識別的。

        6. VKontakte: 俄羅斯最大的社交網站,VKontakte為“保持聯系”之意。

        7. scrape: 本義是“艱難取得,勉強獲得”,這里指利用爬蟲程序抓取信息,爬蟲程序是一種數據采集程序。

        8. Hajdu-Cheney syndrome: 遺傳性骨發育不良并肢端溶骨癥,于1948年和1965年分別由Hajdu和Cheney兩位放射科醫生進行了病例報道。

        9. autistic: 自閉癥的;elusive: 難懂的。

        10. algorithm: 算法。

        11. balaclava: 巴拉克拉法帽,一種僅露雙眼和鼻子的羊毛頭罩,本來用于御寒,后來由于其能掩蓋臉部、隱藏身份,常被特種部隊、恐怖分子、劫匪等佩戴。

        12. crop up: 發(fā)生,出現(xiàn);bail: 保釋;sentence: 判決。

        13. gadget:(電子或機械)小裝置。

        14. dissemble: 掩飾(真實的情感或想法);grease: 給……加潤滑油。

        15. grimace:(表示疼痛或厭惡等的)怪相,鬼臉。

        16. embed sth. in: 使嵌入,使成為……的重要部分;decree: 下令,命令;consent: 同意,許可。

        17. audit: 審核,嚴(yán)格檢查;propagate: 宣傳,傳播。

        18. bamboozle: 愚弄,蒙蔽。

        19. regime: 政權,政體。
