Every time we wake Siri, a pleasant female voice greets us. When our question gets answered, Siri's gentle voice sounds almost heavenly. Why is Siri a woman? Because people prefer female voices? And why is the icon for a website's live-chat support invariably a smiling young woman wearing a headset? At root, people assume that women are "naturally" the ones who provide help. If we want to rebalance this tilted scale, why not start by changing the voice of our digital assistants...
When the makers of Apple's Siri unveiled Viv at TechCrunch Disrupt NYC last month, the crowd—and press—swooned. Pitched as "the intelligent interface for everything," Viv is a personal digital assistant armed with a nearly transcendent level of sophistication. She is designed to move seamlessly across services and to fulfill complex tasks such as "Find me a place to take my peanut-free uncle if it rains tomorrow in Cleveland." Viv is also just the latest virtual helpmeet with a feminine voice and a female name. In addition to Siri (Norse for "beautiful woman who leads you to victory"), her sorority sisters include Amazon's Alexa and Microsoft's Cortana (named after a voluptuous character in the video game "Halo" who wears a "holographic body stocking").
Why are digital assistants overwhelmingly female? Some say that people prefer women's voices, while others note that in our culture, secretaries and administrative assistants are still usually women. Regardless, this much is certain: consistently representing digital assistants as female matters a lot in real life, because it hard-codes a connection between a woman's voice and subservience.
As social scientists explore the question of why women lag so far behind men in workplace leadership, there's increasing evidence that unconscious bias plays an important role. According to Erika Hall, a professor at Emory University's Goizueta Business School, unconscious bias has its origins in the "cultural knowledge" we absorb from the world around us. This knowledge can come from movies and television, from teachers and family members; we acquire it almost osmotically by living in our society. Unconscious bias happens when we then engage in discriminatory behaviors because we unwittingly use this knowledge to guide our actions.
And this knowledge is everywhere: our society largely depicts women as supporters and assistants rather than leaders and protagonists. A recent study found that women accounted for only 22 percent of protagonists in the top-grossing films of 2015 (and only 13 percent of protagonists in films directed by men). A comprehensive review of video game studies found that female characters are predominantly supporting characters, often "assistants to the leading male character." And a study of prime-time television found that women comprise the majority of aides and administrative support characters. These portrayals create "descriptive stereotypes" about what women are like—that women are somehow innately more "supporter-like" than "leader-like."
Because Viv and her fellow digital assistants are female, their usage adds to the store of cultural knowledge about who women are and what women do. Every time you say, "Viv, order me a turkey club" or "Viv, get me an Uber," the association between "woman" and "assistant" is strengthened. According to Calvin Lai, a Harvard University post-doc who studies unconscious bias, the associations we harbor depend on the number of times we are exposed to them. As these A.I. assistants improve and become more popular, the number of times we're exposed to the association between "woman" and "assistant" increases.
The real-world consequences of these stereotypes are well documented: research has shown that people tend to prefer women as supporters and men as leaders. A study of engineering undergraduates at the University of Michigan found that when students presented work, the men tended to present the material and the women tended to play the role of "supporter of the male expert." In another study, when people were shown identical resumes with either male or female names for a lab manager position, they rated the male candidate significantly more competent and hirable. A third study found that saleswomen earned less than salesmen in part because they'd been denied support staff—why would a supporter need a supporter, after all?
While "descriptive stereotypes" lead to women not being perceived as suitable for leadership positions, stereotypes can be prescriptive, too: women are expected to conform to the stereotype of being a supporter or helper, and are rejected or punished for failing to do so. Linguist Kieran Snyder's study of performance reviews in tech companies showed that women are routinely criticized for having personality traits that don't conform to feminine stereotypes. Women, but not men, were consistently docked for being "abrasive" and not "letting others shine." In other words, they were punished for not being good helpers and supporters.
In a study by New York University psychologist Madeline Heilman, a woman who stayed late to help a colleague was rated less favorably than a man who stayed to help—but penalized more when she declined to stay and help. Indeed, because women are expected to be helpers, they don't actually accrue any reward for helping—they're simply living up to the expectation. But if they decline to help, they are seen as selfish. Women are aware of this expectation, too: in a study of medical residents, a female resident reported that when leading others, "The most important thing is that when I ask for things they should not sound like orders."
Ultimately, the more our culture teaches us to associate women with assistants, the more real women will be seen as assistants, and penalized for not being assistant-like. At this moment in culture, when more and more attention is being paid to women's roles in the workplace, it's essential to pay attention to our cultural inputs, too. Let's eschew the false choice between male and female voices. If these A.I. assistants are meant to lead us into the future, why not transcend gender entirely—perhaps a voice could be ambiguously gendered, or shift between genders? At the very least, the default settings for these assistants should not always be women. Change Viv to Victor, and maybe one fewer woman will be asked to be the next meeting's designated note-taker.