My job is to design, build and study robots that communicate with people. But this story doesn't start with robotics at all, it starts with animation. When I first saw Pixar's "Luxo Jr.," I was amazed by how much emotion they could put into something as trivial as a desk lamp. I mean, look at them -- at the end of this movie, you actually feel something for two pieces of furniture.
(Laughter)
And I said, I have to learn how to do this. So I made a really bad career decision.
(Laughter)
And that's what my mom was like when I did it.
(Laughter)
I left a very cozy tech job in Israel at a nice software company and I moved to New York to study animation. And there I lived in a collapsing apartment building in Harlem with roommates. I'm not using this phrase metaphorically -- the ceiling actually collapsed one day in our living room. Whenever they did news stories about building violations in New York, they would film the report in front of our building, as kind of a backdrop to show how bad things were.
Anyway, during the day, I went to school and at night I would sit and draw pencil animation, frame by frame. And I learned two surprising lessons. One of them was that when you want to arouse emotions, it doesn't matter so much how something looks; it's all in the motion, in the timing of how the thing moves. And the second was something one of our teachers told us. He actually animated the weasel in "Ice Age." And he said, "As an animator, you're not a director -- you're an actor." So, if you want to find the right motion for a character, don't think about it -- go use your body to find it. Stand in front of a mirror, act it out in front of a camera -- whatever you need -- and then put it back in your character.
A year later I found myself at MIT in the Robotic Life Group. It was one of the first groups researching the relationships between humans and robots. And I still had this dream to make an actual, physical Luxo Jr. lamp. But I found that robots didn't move at all in this engaging way that I was used to from my animation studies. Instead, they were all -- how should I put it -- they were all kind of robotic. (Laughter) And I thought, what if I took what I had learned in animation school and used it to design my robotic desk lamp? So I designed it frame by frame to make this robot as graceful and engaging as possible. And here you can see the robot interacting with me on a desktop -- and I'm actually redesigning the robot, so, unbeknownst to itself, it's kind of digging its own grave by helping me.
(Laughter)
I wanted it to be less of a mechanical structure giving me light, and more of a helpful, kind of quiet apprentice that's always there when you need it and doesn't really interfere. And when, for example, I'm looking for a battery that I can't find, it'll show me, in a subtle way, where the battery is. So you can see my confusion here. I'm not an actor. And I want you to notice how the same mechanical structure can, at one point, just by the way it moves, seem gentle and caring, and in another case seem violent and confrontational. It's the same structure; just the motion is different. Actor: "You want to know something? Well, you want to know something? He was already dead! Just laying there, eyes glazed over!"
(Laughter)
But moving in a graceful way is just one building block of this whole structure called human-robot interaction. At the time, I was doing my PhD, working on human-robot teamwork: teams of humans and robots working together. I was studying the engineering, the psychology, the philosophy of teamwork, and at the same time, I found myself in my own kind of teamwork situation, with a good friend of mine, who's actually here. And it was a situation where we can easily imagine robots being there with us in the near future. It was after a Passover Seder. We were folding up a lot of folding chairs, and I was amazed at how quickly we found our own rhythm. Everybody did their own part; we didn't have to divide our tasks. We didn't have to communicate verbally about this -- it all just happened.
And I thought, humans and robots don't look at all like this. When humans and robots interact, it's much more like a chess game: the human does a thing, the robot analyzes whatever the human did, the robot decides what to do next, plans it and does it. Then the human waits, until it's their turn again. So it's much more like a chess game, and that makes sense, because chess is great for mathematicians and computer scientists. It's all about information, analysis, decision-making and planning.
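If you want to see what that chess-game loop looks like written down, here is a minimal sketch in Python -- the class and method names are placeholders I'm making up for illustration, not code from any of my robots:

    import time

    class TurnTakingRobot:
        """A toy robot that interacts in strict turns, like a chess player."""

        def wait_for_human_turn(self):
            time.sleep(1.0)                       # stand-in: block until the human is done acting
            return "human folded a chair"

        def analyze(self, human_action):
            return {"last_action": human_action}  # stand-in for analyzing what the human did

        def plan(self, state):
            return "fold the next chair"          # decide and fully plan before moving

        def execute(self, plan):
            print("robot:", plan)                 # only now does the robot act, while the human waits

    robot = TurnTakingRobot()
    for _ in range(3):
        state = robot.analyze(robot.wait_for_human_turn())
        robot.execute(robot.plan(state))

Everything happens in sequence; nobody moves while the other one is thinking. That's the pattern I wanted to get away from.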
But I wanted my robot to be less of a chess player, and more like a doer that just clicks and works together. So I made my second horrible career choice: I decided to study acting for a semester. I took time off from the PhD and went to acting classes. I actually participated in a play -- I hope there's no video of that still around.
(Laughter)
And I got every book I could find about acting, including one from the 19th century that I got from the library. And I was really amazed, because my name was only the second name on the list -- the previous one was from 1889.
(Laughter)
And this book was kind of waiting for 100 years to be rediscovered for robotics. And this book shows actors how to move every muscle in the body to match every kind of emotion that they want to express.
But the real revelation was when I learned about method acting. It became very popular in the 20th century. And method acting said you don't have to plan every muscle in your body; instead, you have to use your body to find the right movement. You have to use your sense memory to reconstruct the emotions and kind of think with your body to find the right expression -- improvise, play off your scene partner. And this came at the same time as I was reading about this trend in cognitive psychology, called embodied cognition, which also talks about the same ideas. We use our bodies to think; we don't just think with our brains and use our bodies to move, but our bodies feed back into our brain to generate the way that we behave.
And it was like a lightning bolt. I went back to my office, I wrote this paper, which I never really published, called "Acting Lessons for Artificial Intelligence." And I even took another month to do what was then the first theater play with a human and a robot acting together. That's what you saw before with the actors. And I thought: How can we make an artificial intelligence model -- a computational model -- that will capture some of these ideas of improvisation, of taking risks, of taking chances, even of making mistakes? Maybe it can make for better robotic teammates. So I worked for quite a long time on these models and I implemented them on a number of robots.
Here you can see a very early example of the robot trying to use this embodied artificial intelligence to match my movements as closely as possible. It's sort of like a game. Let's look at it. You can see that when I psych it out, it gets fooled. And it's a little bit like what you might see actors do when they try to mirror each other to find the right synchrony between them. And then, I did another experiment, and I got people off the street to use the robotic desk lamp and try out this idea of embodied artificial intelligence. So, I actually used two kinds of brains for the same robot.
The robot is the same lamp that you saw, and I put two brains in it. For one half of the people, I put in a brain that's kind of the traditional, calculated robotic brain. It waits for its turn, it analyzes everything, it plans. Let's call it the calculated brain. The other half got more of a stage-actor, risk-taker brain. Let's call it the adventurous brain. It sometimes acts without knowing everything it has to know. It sometimes makes mistakes and corrects them. And I had them do this very tedious task that took almost 20 minutes, and they had to work together, somewhat simulating a factory job of repetitively doing the same thing. What I found is that people actually loved the adventurous robot. They thought it was more intelligent, more committed, a better member of the team, and that it contributed more to the team's success. They even called it "he" and "she," whereas people with the calculated brain called it "it," and nobody ever called it "he" or "she." When they talked about it after the task, those who worked with the adventurous brain said, "By the end, we were good friends and high-fived mentally." Whatever that means.
(Laughter)
Sounds painful. Whereas the people with the calculated brain said it was just like a lazy apprentice. It only did what it was supposed to do and nothing more -- which is almost exactly what people expect robots to do, so I was surprised that people had higher expectations of robots than anybody in robotics thought robots should be doing. And in a way, I thought, maybe it's time -- just like method acting changed the way people thought about acting in the 20th century, going from a very calculated, planned way of behaving to a more intuitive, risk-taking, embodied way of behaving -- maybe it's time for robots to have the same kind of revolution.
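In toy form, the difference between the two brains is roughly the sketch below -- the thresholds and the guesses about what the human wants are invented for illustration; the real controllers ran on the physical lamp, not on strings and numbers:

    def calculated_brain(belief):
        """Wait until the robot is almost certain before committing to an action."""
        best = max(belief, key=belief.get)
        return best if belief[best] > 0.95 else None   # otherwise keep waiting and analyzing

    def adventurous_brain(belief):
        """Commit early to the current best guess, accepting mistakes to correct later."""
        best = max(belief, key=belief.get)
        return best if belief[best] > 0.6 else None

    # The robot's current guess about what its human teammate needs next.
    belief = {"hand over the part": 0.7, "point the light at the desk": 0.3}

    print("calculated: ", calculated_brain(belief))    # None -- still waiting for more evidence
    print("adventurous:", adventurous_brain(belief))   # acts on its best guess right away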
A few years later, I was at my next research job at Georgia Tech in Atlanta, and I was working in a group dealing with robotic musicians. And I thought, music: that's the perfect place to look at teamwork, coordination, timing, improvisation -- and we had just gotten this robot that plays marimba. And the marimba, for everybody who, like me, didn't know, is this huge, wooden xylophone. When I was looking at this, I looked at other work in human-robot improvisation -- yes, there is other work in human-robot improvisation -- and it was also a little bit like a chess game. The human would play, the robot would analyze what was played and then improvise its own part. This is what musicians call a call-and-response interaction, and it also fits robots and artificial intelligence very well. But I thought, if I used the same ideas I used in the theater play and in the teamwork studies, maybe I could make the robots jam together like a band: everybody riffing off each other, nobody stopping for a moment. So I tried to do the same thing, this time with music, where the robot doesn't really know what it's about to play; it just sort of moves its body and uses opportunities to play, and does what my jazz teacher taught me when I was 17. She said, when you improvise, sometimes you don't know what you're doing, and you still do it. So I tried to make a robot that doesn't actually know what it's doing, but is still doing it. Let's look at a few seconds from this performance, where the robot listens to the human musician and improvises. And then, look how the human musician also responds to what the robot is doing, picking up on its behavior, and at some point can even be surprised by what the robot comes up with.
(Music)
(Music ends)
(Applause)
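For anyone who wants to see the difference between call-and-response and jamming written down, it's roughly the difference between these two little loops -- again a toy sketch with names I'm inventing here, not the actual marimba software:

    def call_and_response(human_phrases):
        """Listen to a whole phrase, analyze it, then answer -- strict turn-taking."""
        return ["variation on " + phrase for phrase in human_phrases]

    def jam(human_phrases):
        """Keep playing continuously off a running guess of where the music is going."""
        notes, guess = [], "opening riff"
        for phrase in human_phrases:
            notes.append("riff anticipating " + guess)  # play now, before the phrase is finished
            guess = phrase                              # update the guess as the human keeps playing
        return notes

    human = ["bluesy line", "faster run", "quiet ending"]
    print(call_and_response(human))
    print(jam(human))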
Being a musician is not just about making notes, otherwise nobody would ever go see a live show. Musicians also communicate with their bodies, with other band members, with the audience; they use their bodies to express the music. And I thought, we already have a robot musician on stage, why not make it a full-fledged musician? And I started designing a socially expressive head for the robot. The head doesn't actually touch the marimba, it just expresses what the music is like. These are some napkin sketches from a bar in Atlanta that was dangerously located exactly halfway between my lab and my home. So I spent, I would say, on average, three to four hours a day there. I think.
(Laughter)
And I went back to my animation tools and tried to figure out not just what a robotic musician would look like, but especially what a robotic musician would move like, to sort of show that it doesn't like what the other person is playing -- and maybe show whatever beat it's feeling at the moment.
So we ended up actually getting the money to build this robot, which was nice. I'm going to show you now the same kind of performance, this time with a socially expressive head. And notice two things: first, how the robot is really showing us the beat it's picking up from the human, while also giving the human a sense that the robot knows what it's doing. And second, how it changes the way it moves as soon as it starts its own solo.
(Music)
Now it's looking at me, showing that it's listening.
(Music)
Now look at the final chord of the piece again. And this time the robot communicates with its body when it's busy doing its own thing, and when it's ready to coordinate the final chord with me.
(Music)
(Music ending)
(Final chord)
(Applause)
Thanks. I hope you see how much this part of the body that doesn't touch the instrument actually helps with the musical performance. And at some point -- we are in Atlanta, so obviously some rapper will come into our lab at some point -- we had this rapper come in and do a little jam with the robot. Here you can see the robot basically responding to the beat. Notice two things: one, how irresistible it is to join the robot while it's moving its head. You kind of want to move your own head when it does it. And second, even though the rapper is really focused on his iPhone, as soon as the robot turns to him, he turns back to it. So even though the robot is just in the periphery of his vision, in the corner of his eye, it's very powerful. And the reason is that we can't ignore physical things moving in our environment. We are wired for that. So if you have a problem -- maybe your partner is looking at their iPhone or smartphone too much -- you might want to have a robot there to get their attention.
(Laughter)
(Music)
(Music ends)
(Applause)
Just to introduce the last robot that we've worked on, it came out of something surprising that we found: at some point, people didn't care about the robot being intelligent -- able to improvise and listen, and do all these embodied-intelligence things that I spent years developing. They really liked that the robot was enjoying the music.
(Laughter)
And they didn't say the robot was moving to the music, they said it was "enjoying" the music. And we thought, why don't we take this idea? So I designed a new piece of furniture. This time it wasn't a desk lamp, it was a speaker dock, one of those things you plug your smartphone into. And I thought, what would happen if your speaker dock didn't just play the music for you, but actually enjoyed it, too? And so again, here are some animation tests from an early stage.
(Laughter)
And this is what the final product looked like.
(Music)
(Music ends)
So, a lot of bobbing heads.
(Applause)
A lot of bobbing heads in the audience, so we can see that robots influence people. And it's not just fun and games.
I think one of the reasons I care so much about robots that use their bodies to communicate and to move is -- I'm going to let you in on a little secret we roboticists are hiding -- that every one of you is going to be living with a robot at some point in your life. Somewhere in your future, there will be a robot in your life. If not in yours, then in your children's lives. And I want these robots to be more fluent, more engaging, more graceful than they currently seem to be. And for that, I think maybe robots need to be less like chess players and more like stage actors and musicians. Maybe they should be able to take chances and improvise. Maybe they should be able to anticipate what you're about to do. Maybe they even need to be able to make mistakes and correct them, because in the end, we are human. And maybe as humans, robots that are a little less than perfect are just perfect for us.
Thank you.
(Applause)