So over the past few centuries, microscopes have revolutionized our world. They revealed to us a tiny world of objects, life and structures that are too small for us to see with our naked eyes. They are a tremendous contribution to science and technology. Today I'd like to introduce you to a new type of microscope, a microscope for changes. It doesn't use optics like a regular microscope to make small objects bigger, but instead it uses a video camera and image processing to reveal to us the tiniest motions and color changes in objects and people, changes that are impossible for us to see with our naked eyes. And it lets us look at our world in a completely new way.
So what do I mean by color changes? Our skin, for example, changes its color very slightly when blood flows under it. That change is incredibly subtle, which is why, when you look at other people, when you look at the person sitting next to you, you don't see their skin or their face changing color. When we look at this video of Steve here, it appears to us like a static picture, but once we look at this video through our new, special microscope, suddenly we see a completely different image. What you see here are small changes in the color of Steve's skin, magnified 100 times so that they become visible. We can actually see a human pulse. We can see how fast Steve's heart is beating, but we can also see the actual way that the blood flows in his face. And we can do that not just to visualize the pulse, but also to actually recover and measure our heart rates. And we can do it with regular cameras and without touching the patients. So here you see the pulse and heart rate we extracted from a video of a newborn baby that we took with a regular DSLR camera, and the heart rate measurement we get is as accurate as the one you'd get with a standard monitor in a hospital. And it doesn't even have to be a video we recorded. We can essentially do it with other videos as well. So I just took a short clip from "Batman Begins" here just to show Christian Bale's pulse. (Laughter) And you know, presumably he's wearing makeup, and the lighting here is kind of challenging, but still, just from the video, we're able to extract his pulse and show it quite well.
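To make the heart-rate part concrete, here is a minimal sketch in Python of how you might read a pulse off such a color signal. Everything here is an illustrative assumption, not the exact pipeline from the talk: it presumes you've already averaged the green channel over the face region in each frame into a 1-D array `trace`, sampled at the camera's frame rate `fps`.

```python
import numpy as np

def heart_rate_bpm(trace, fps):
    """Estimate pulse rate from a per-frame skin-color intensity trace."""
    trace = trace - trace.mean()                      # drop the DC component
    spectrum = np.abs(np.fft.rfft(trace))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)  # frequency of each bin
    # Keep only physiologically plausible pulse frequencies (40-180 bpm).
    band = (freqs >= 40 / 60.0) & (freqs <= 180 / 60.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]
```

Picking the strongest spectral peak in the pulse band is a stand-in for the more careful signal processing a hospital-grade measurement would need, but it captures the idea: the heart rate is simply the dominant frequency of those tiny color changes.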
So how do we do all that? We basically analyze the changes in the light that are recorded at every pixel in the video over time, and then we crank up those changes. We make them bigger so that we can see them. The tricky part is that those signals, those changes that we're after, are extremely subtle, so we have to be very careful when we try to separate them from the noise that always exists in videos. So we use some clever image processing techniques to get a very accurate measurement of the color at each pixel in the video, and of the way the color changes over time, and then we amplify those changes. We make them bigger to create those types of enhanced videos, or magnified videos, that actually show us those changes.
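As a rough illustration of that pipeline, here is a minimal sketch, with the caveat that it is not the exact published method: the real system also pools pixels spatially (a Gaussian pyramid) to suppress sensor noise before amplifying, which this skips, and the band edges and amplification factor below are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_color(video, fps, low=0.8, high=3.0, alpha=100.0):
    """Amplify per-pixel color changes in a temporal frequency band.

    video: float array of shape (frames, height, width, channels).
    low/high: band edges in Hz (0.8-3 Hz covers roughly 50-180 bpm).
    alpha: amplification factor (the talk magnifies color 100 times).
    """
    nyquist = fps / 2.0
    # Temporal bandpass filter isolating the changes we're after.
    b, a = butter(2, [low / nyquist, high / nyquist], btype="band")
    # Filter every pixel's time series (axis 0 runs over the frames).
    subtle = filtfilt(b, a, video.astype(np.float64), axis=0)
    # Crank up the filtered changes and add them back to the original.
    return np.clip(video + alpha * subtle, 0.0, 255.0)
```

Here the bandpass filter is doing the separation from noise: only changes that repeat at plausible pulse rates survive to be amplified.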
But it turns out we can do that not just to show tiny changes in color, but also tiny motions, and that's because the light that gets recorded in our cameras will change not only if the color of the object changes, but also if the object moves. So this is my daughter when she was about two months old. It's a video I recorded about three years ago. And as new parents, we all want to make sure our babies are healthy, that they're breathing, that they're alive, of course. So I too got one of those baby monitors so that I could see my daughter when she was asleep. And this is pretty much what you'll see with a standard baby monitor. You can see the baby's sleeping, but there's not too much information there. There's not too much we can see. Wouldn't it be better, or more informative, or more useful, if instead we could look at the view like this? So here I took the motions and magnified them 30 times, and then I could clearly see that my daughter was indeed alive and breathing. (Laughter) Here is a side-by-side comparison. So again, in the source video, in the original video, there's not too much we can see, but once we magnify the motions, the breathing becomes much more visible. And it turns out, there are a lot of phenomena we can reveal and magnify with our new motion microscope. We can see how our veins and arteries are pulsing in our bodies. We can see that our eyes are constantly moving in this wobbly motion. And that's actually my eye, and again this video was taken right after my daughter was born, so you can see I wasn't getting too much sleep. (Laughter) Even when a person is sitting still, there's a lot of information we can extract about their breathing patterns and small facial expressions. Maybe we could use those motions to tell us something about our thoughts or our emotions. We can also magnify small mechanical movements, like vibrations in engines, that can help engineers detect and diagnose machinery problems, or see how our buildings and structures sway in the wind and react to forces. Those are all things that our society knows how to measure in various ways, but measuring those motions is one thing, and actually seeing those motions as they happen is a whole different thing.
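As a brief aside on why the very same amplification trick magnifies motion, here is the standard first-order argument from the video magnification literature (the talk doesn't spell it out): for an image profile f(x) translated by a small displacement δ(t),

```latex
I(x,t) = f\big(x + \delta(t)\big) \approx f(x) + \delta(t)\, f'(x),
\qquad\text{so}\qquad
I(x,t) + \alpha\big[I(x,t) - f(x)\big] \approx f\big(x + (1+\alpha)\,\delta(t)\big).
```

In words: a tiny translation shows up, to first order, as a tiny intensity change at each pixel, so amplifying those intensity changes by a factor α looks like a translation (1 + α) times larger, which is exactly what magnifying the motions 30 times does to the breathing.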
And as soon as we developed this new technology, we made our code available online so that others could use and experiment with it. It's very simple to use. It can work on your own videos. Our collaborators at Quanta Research even created this nice website where you can upload your videos and process them online, so even if you don't have any experience in computer science or programming, you can still very easily experiment with this new microscope. And I'd like to show you just a couple of examples of what others have done with it.
So this video was made by a YouTube user called Tamez85. I don't know who that user is, but he, or she, used our code to magnify small belly movements during pregnancy. It's kind of creepy. (Laughter) People have used it to magnify pulsing veins in their hands. And you know it's not real science unless you use guinea pigs, and apparently this guinea pig is called Tiffany, and this YouTube user claims it is the first rodent on Earth that was motion-magnified.
You can also do some art with it. So this video was sent to me by a design student at Yale. She wanted to see if there's any difference in the way her classmates move. She made them all stand still, and then magnified their motions. It's like seeing still pictures come to life. And the nice thing with all those examples is that we had nothing to do with them. We just provided this new tool, a new way to look at the world, and then people find other interesting, new and creative ways of using it.
But we didn't stop there. This tool not only allows us to look at the world in a new way, it also redefines what we can do and pushes the limits of what we can do with our cameras. So as scientists, we started wondering, what other types of physical phenomena produce tiny motions that we could now use our cameras to measure? And one such phenomenon that we focused on recently is sound. Sound, as we all know, is basically changes in air pressure that travel through the air. Those pressure waves hit objects and they create small vibrations in them, which is how we hear and how we record sound. But it turns out that sound also produces visual motions. Those are motions that are not visible to us but are visible to a camera with the right processing. So here are two examples. This is me demonstrating my great singing skills. (Singing) (Laughter) And I took a high-speed video of my throat while I was humming. Again, if you stare at that video, there's not too much you'll be able to see, but once we magnify the motions 100 times, we can see all the motions and ripples in the neck that are involved in producing the sound. That signal is there in that video.
We also know that singers can break a wine glass if they hit the right note. So here, we're going to play a note at the resonant frequency of that glass through a loudspeaker that's next to it. Once we play that note and magnify the motions 250 times, we can very clearly see how the glass vibrates and resonates in response to the sound. It's not something you're used to seeing every day. But this made us think. It gave us this crazy idea. Can we actually invert this process and recover sound from video by analyzing the tiny vibrations that sound waves create in objects, and essentially convert those back into the sounds that produced them? In this way, we can turn everyday objects into microphones.
So that's exactly what we did. So here's an empty bag of chips that was lying on a table, and we're going to turn that bag of chips into a microphone by filming it with a video camera and analyzing the tiny motions that sound waves create in it. So here's the sound that we played in the room.
(Music: "Mary Had a Little Lamb")
And this is a high-speed video we recorded of that bag of chips. Again, it's playing. There's no chance you'll be able to see anything going on in that video just by looking at it, but here's the sound we were able to recover just by analyzing the tiny motions in that video.
(Music: "Mary Had a Little Lamb")
I call it -- Thank you. (Applause) I call it the visual microphone. We actually extract audio signals from video signals. And just to give you a sense of the scale of the motions here, a pretty loud sound will cause that bag of chips to move less than a micrometer. That's one thousandth of a millimeter. That's how tiny the motions are that we are now able to pull out just by observing how light bounces off objects and gets recorded by our cameras.
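To give a sense of how such tiny motions can become audio at all, here is a minimal Python sketch under simplifying assumptions: a grayscale high-speed video as a NumPy array of shape (frames, height, width), and a single global, subpixel, horizontal displacement estimated per frame by least squares. The published visual-microphone method instead averages local phase shifts over many scales and orientations; this is only the idea in miniature.

```python
import numpy as np

def recover_sound(video):
    """Recover a 1-D audio signal from tiny motions in a video.

    Each frame's global horizontal shift relative to the first frame
    becomes one audio sample; the audio sample rate therefore equals
    the camera's frame rate.
    """
    frames = video.astype(np.float64)
    ref = frames[0]
    grad = np.gradient(ref, axis=1)    # horizontal spatial gradient
    denom = np.sum(grad * grad)
    # First-order (Lucas-Kanade-style) global displacement estimate:
    # shift ~= sum(I_t * I_x) / sum(I_x ** 2), good to subpixel precision.
    audio = np.array([np.sum((f - ref) * grad) / denom for f in frames])
    audio -= audio.mean()              # remove the DC offset
    return audio / (np.abs(audio).max() + 1e-12)  # normalize for playback
```

One frame yields one audio sample, so hearing speech up to a few kilohertz requires a camera running at thousands of frames per second, which is why these demos use high-speed video.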
We can recover sounds from other objects, like plants.
(Music: "Mary Had a Little Lamb")
And we can recover speech as well. So here's a person speaking in a room.
Voice: Mary had a little lamb whose fleece was white as snow, and everywhere that Mary went, that lamb was sure to go.
Michael Rubinstein: And here's that speech again recovered just from this video of that same bag of chips.
Voice: Mary had a little lamb whose fleece was white as snow, and everywhere that Mary went, that lamb was sure to go.
MR: We used "Mary Had a Little Lamb" because those are said to be the first words that Thomas Edison spoke into his phonograph in 1877. It was one of the first sound recording devices in history. It basically directed the sounds onto a diaphragm that vibrated a needle that essentially engraved the sound on tinfoil that was wrapped around the cylinder.
Here's a demonstration of recording and replaying sound with Edison's phonograph.
(Video) Voice: Testing, testing, one two three. Mary had a little lamb whose fleece was white as snow, and everywhere that Mary went, the lamb was sure to go. Testing, testing, one two three. Mary had a little lamb whose fleece was white as snow, and everywhere that Mary went, the lamb was sure to go.
MR: And now, 137 years later, we're able to get sound of pretty much the same quality, but just by watching objects vibrate to sound with cameras, and we can even do that when the camera is 15 feet away from the object, behind soundproof glass.
So this is the sound that we were able to recover in that case.
Voice: Mary had a little lamb whose fleece was white as snow, and everywhere that Mary went, the lamb was sure to go.
MR: And of course, surveillance is the first application that comes to mind. (Laughter) But it might actually be useful for other things as well. Maybe in the future, we'll be able to use it, for example, to recover sound across space, because sound can't travel in space, but light can.
We've only just begun exploring other possible uses for this new technology. It lets us see physical processes that we know are there but that we've never been able to see with our own eyes until now.
This is our team. Everything I showed you today is a result of a collaboration with this great group of people you see here, and I encourage you and welcome you to check out our website, try it out yourself, and join us in exploring this world of tiny motions.
Thank you.
(Applause)