There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one had really cool technical features. It had motors and touch sensors, and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.
(Laughter)
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot? And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield. Now, what would cause a hardened military officer and someone like myself to have this response to robots?
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.
(Laughter)
It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.
(Laughter)
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.
(Laughter)
So from this, and from many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous. But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?
(Laughter)
Because robots plus capitalism equals questions around consumer protection and privacy.
And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.
(Laughter)
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."
(Laughter)
And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.
(Laughter)
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots. But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog?
And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.
Thank you.
(Applause)