There is an ancient proverb that says it's very difficult to find a black cat in a dark room, especially when there is no cat. I find this a particularly apt description of science and how science works -- bumbling around in a dark room, bumping into things, trying to figure out what shape this might be, what that might be, there are reports of a cat somewhere around, they may not be reliable, they may be, and so forth and so on.
Now I know this is different than the way most people think about science. Science, we generally are told, is a very well-ordered mechanism for understanding the world, for gaining facts, for gaining data, that it's rule-based, that scientists use this thing called the scientific method and we've been doing this for 14 generations or so now, and the scientific method is a set of rules for getting hard, cold facts out of the data.
I'd like to tell you that's not the case. So there's the scientific method, but what's really going on is this. (Laughter)
[The Scientific Method vs. Farting Around]
And it's going on kind of like that.
[... in the dark] (Laughter)
So what is the difference, then, between the way I believe science is pursued and the way it seems to be perceived? So this difference first came to me in some ways in my dual role at Columbia University, where I'm both a professor and run a laboratory in neuroscience where we try to figure out how the brain works. We do this by studying the sense of smell, the sense of olfaction, and in the laboratory, it's a great pleasure and fascinating work and exciting to work with graduate students and post-docs and think up cool experiments to understand how this sense of smell works and how the brain might be working, and, well, frankly, it's kind of exhilarating.
But at the same time, it's my responsibility to teach a large course to undergraduates on the brain, and that's a big subject, and it takes quite a while to organize that, and it's quite challenging and it's quite interesting, but I have to say, it's not so exhilarating. So what was the difference? Well, the course I was and am teaching is called Cellular and Molecular Neuroscience - I. (Laughs) It's 25 lectures full of all sorts of facts, it uses this giant book called "Principles of Neural Science" by three famous neuroscientists. This book comes in at 1,414 pages, it weighs a hefty seven and a half pounds. Just to put that in some perspective, that's the weight of two normal human brains.
(Laughter)
So I began to realize, by the end of this course, that the students maybe were getting the idea that we must know everything there is to know about the brain. That's clearly not true. And they must also have this idea, I suppose, that what scientists do is collect data and collect facts and stick them in these big books. And that's not really the case either. When I go to a meeting, after the meeting day is over and we collect in the bar over a couple of beers with my colleagues, we never talk about what we know. We talk about what we don't know. We talk about what still has to get done, what's so critical to get done in the lab. Indeed, this was, I think, best said by Marie Curie who said that one never notices what has been done but only what remains to be done. This was in a letter to her brother after obtaining her second graduate degree, I should say.
I have to point out this has always been one of my favorite pictures of Marie Curie, because I am convinced that that glow behind her is not a photographic effect. (Laughter) That's the real thing. It is true that her papers are, to this day, stored in a basement room at the Bibliothèque nationale de France, in a concrete room that's lead-lined, and if you're a scholar and you want access to these notebooks, you have to put on a full radiation hazmat suit, so it's pretty scary business.
Nonetheless, this is what I think we were leaving out of our courses and leaving out of the interaction that we have with the public as scientists, the what-remains-to-be-done. This is the stuff that's exhilarating and interesting. It is, if you will, the ignorance. That's what was missing.
So I thought, well, maybe I should teach a course on ignorance, something I can finally excel at, perhaps. So I did start teaching this course on ignorance, and it's been quite interesting, and I'd like to tell you to go to the website. You can find all sorts of information there. It's wide open. And it's been really quite an interesting time for me to meet up with other scientists who come in and talk about what it is they don't know.
Now I use this word "ignorance," of course, to be at least in part intentionally provocative, because ignorance has a lot of bad connotations and I clearly don't mean any of those. So I don't mean stupidity, I don't mean a callow indifference to fact or reason or data. The ignorant are clearly unenlightened, unaware, uninformed, and present company today excepted, often occupy elected offices, it seems to me. That's another story, perhaps.
I mean a different kind of ignorance. I mean a kind of ignorance that's less pejorative, a kind of ignorance that comes from a communal gap in our knowledge, something that's just not there to be known or isn't known well enough yet or we can't make predictions from, the kind of ignorance that's maybe best summed up in a statement by James Clerk Maxwell, perhaps the greatest physicist between Newton and Einstein, who said, "Thoroughly conscious ignorance is the prelude to every real advance in science." I think it's a wonderful idea: thoroughly conscious ignorance.
So that's the kind of ignorance that I want to talk about today, but of course the first thing we have to clear up is what are we going to do with all those facts? So it is true that science piles up at an alarming rate. We all have this sense that science is this mountain of facts, this accumulation model of science, as many have called it, and it seems impregnable, it seems impossible. How can you ever know all of this? And indeed, the scientific literature grows at an alarming rate. In 2006, there were 1.3 million papers published. There's about a two-and-a-half-percent yearly growth rate, and so last year we saw over one and a half million papers being published. Divide that by the number of minutes in a year, and you wind up with three new papers per minute. So I've been up here a little over 10 minutes, I've already lost thirty papers. I have to get out of here actually. I have to go read.
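That back-of-the-envelope arithmetic does check out. Here is a minimal sketch, assuming the 1.3 million and 2.5 percent figures quoted above and roughly six years of compounding to reach "last year":

```python
# Rough check of the publication figures quoted above.
# Assumptions: 1.3 million papers in 2006, ~2.5% yearly growth,
# and about six years of compounding to "last year" of the talk.

papers_2006 = 1.3e6
growth_rate = 0.025
years = 6

papers_last_year = papers_2006 * (1 + growth_rate) ** years  # ~1.5 million
minutes_per_year = 365 * 24 * 60                             # 525,600

papers_per_minute = papers_last_year / minutes_per_year      # ~2.9, roughly three

print(f"papers last year:   {papers_last_year:,.0f}")
print(f"papers per minute:  {papers_per_minute:.1f}")
print(f"lost in 10 minutes: {papers_per_minute * 10:.0f}")   # ~29, call it thirty
```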
So what do we do about this? Well, the fact is that what scientists do about it is a kind of a controlled neglect, if you will. We just don't worry about it, in a way. The facts are important. You have to know a lot of stuff to be a scientist. That's true. But knowing a lot of stuff doesn't make you a scientist. You need to know a lot of stuff to be a lawyer or an accountant or an electrician or a carpenter. But in science, knowing a lot of stuff is not the point. Knowing a lot of stuff is there to help you get to more ignorance. So knowledge is a big subject, but I would say ignorance is a bigger one.
So this leads us to maybe think about, a little bit about, some of the models of science that we tend to use, and I'd like to disabuse you of some of them. So one of them, a popular one, is that scientists are patiently putting the pieces of a puzzle together to reveal some grand scheme or another. This is clearly not true. For one, with puzzles, the manufacturer has guaranteed that there's a solution. We don't have any such guarantee. Indeed, there are many of us who aren't so sure about the manufacturer.
(Laughter)
So I think the puzzle model doesn't work.
Another popular model is that science is busy unraveling things the way you unravel the peels of an onion. So peel by peel, you take away the layers of the onion to get at some fundamental kernel of truth. I don't think that's the way it works either. Another one, a kind of popular one, is the iceberg idea, that we only see the tip of the iceberg but underneath is where most of the iceberg is hidden. But all of these models are based on the idea of a large body of facts that we can somehow or another get completed. We can chip away at this iceberg and figure out what it is, or we could just wait for it to melt, I suppose, these days, but one way or another we could get to the whole iceberg. Right? Or make it manageable. But I don't think that's the case.
I think what really happens in science is a model more like the magic well, where no matter how many buckets you take out, there's always another bucket of water to be had, or my particular favorite, with the effect and everything, the ripples on a pond. So if you think of knowledge being this ever-expanding ripple on a pond, the important thing to realize is that our ignorance, the circumference of this knowledge, also grows with knowledge. So the knowledge generates ignorance. This was really well said, I thought, by George Bernard Shaw. This is actually part of a toast that he delivered at a dinner celebrating Einstein's work, in which he claims that science just creates more questions than it answers. ["Science is always wrong. It never solves a problem without creating 10 more."]
I find that kind of glorious, and I think he's precisely right, plus it's a kind of job security. As it turns out, he kind of cribbed that from the philosopher Immanuel Kant, who more than a century earlier had come up with this idea of question propagation, that every answer begets more questions. I love that term, "question propagation," this idea of questions propagating out there.
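To put a toy number on that ripple image: if knowledge is the area enclosed by the ripple, then ignorance is its circumference, the boundary where what we know touches what we don't. A minimal sketch, assuming the circle is nothing more than an analogy:

```python
import math

def boundary_of_knowledge(area):
    """Circumference of a circle with the given area: C = 2 * sqrt(pi * A)."""
    return 2 * math.sqrt(math.pi * area)

# As the "known" area grows, the boundary with the unknown keeps growing too.
for area in (1, 10, 100, 1000):
    print(f"knowledge (area) = {area:5d} -> ignorance (boundary) = {boundary_of_knowledge(area):7.2f}")
```

Every bucket of new area pushes the boundary outward, which is question propagation in miniature.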
So I'd say the model we want to take is not that we start out kind of ignorant and we get some facts together and then we gain knowledge. It's rather kind of the other way around, really. What do we use this knowledge for? What are we using this collection of facts for? We're using it to make better ignorance, to come up with, if you will, higher-quality ignorance. Because, you know, there's low-quality ignorance and there's high-quality ignorance. It's not all the same. Scientists argue about this all the time. Sometimes we call them bull sessions. Sometimes we call them grant proposals. But nonetheless, it's what the argument is about. It's the ignorance. It's the what we don't know. It's what makes a good question.
So how do we think about these questions? I'm going to show you a graph that shows up quite a bit on happy hour posters in various science departments. This graph plots the relationship between what you know and how much you know about it. So what you know, you can know anywhere from nothing to everything, of course, and how much you know about it can be anywhere from a little to a lot. So let's put a point on the graph. There's an undergraduate. Doesn't know much but they have a lot of interest. They're interested in almost everything. Now you look at a master's student, a little further along in their education, and you see they know a bit more, but it's been narrowed somewhat. And finally you get your Ph.D., where it turns out you know a tremendous amount about almost nothing. (Laughter) What's really disturbing is the trend line that goes through that because, of course, when it dips below the zero axis, there, it gets into a negative area. That's where you find people like me, I'm afraid.
So the important thing here is that this can all be changed. This whole view can be changed by just changing the label on the x-axis. So instead of how much you know about it, we could say, "What can you ask about it?" So yes, you do need to know a lot of stuff as a scientist, but the purpose of knowing a lot of stuff is not just to know a lot of stuff. That just makes you a geek, right? Knowing a lot of stuff, the purpose is to be able to ask lots of questions, to be able to frame thoughtful, interesting questions, because that's where the real work is.
Let me give you a quick idea of a couple of these sorts of questions. I'm a neuroscientist, so how would we come up with a question in neuroscience? Because it's not always quite so straightforward. So, for example, we could say, well, what is it that the brain does? Well, one thing the brain does, it moves us around. We walk around on two legs. That seems kind of simple, somehow or another. I mean, virtually everybody over 10 months of age walks around on two legs, right? So that maybe is not that interesting. So instead maybe we want to choose something a little more complicated to look at. How about the visual system? There it is, the visual system. I mean, we love our visual systems. We do all kinds of cool stuff. Indeed, there are over 12,000 neuroscientists who work on the visual system, from the retina to the visual cortex, in an attempt to understand not just the visual system but also general principles of how the brain might work.
But now here's the thing: Our technology has actually been pretty good at replicating what the visual system does. We have TV, we have movies, we have animation, we have photography, we have pattern recognition, all of these sorts of things. They work differently than our visual systems in some cases, but nonetheless we've been pretty good at making a technology work like our visual system. Yet somehow or another, in a hundred years of robotics, you never saw a robot walk on two legs, because it's not such an easy thing to do. A hundred years of robotics, and we can't get a robot that can move more than a couple of steps one way or the other. You ask them to go up an inclined plane, and they fall over. Turn them around, and they fall over. It's a serious problem. So what is it that's the most difficult thing for a brain to do? What ought we to be studying? Perhaps it ought to be walking on two legs, or the motor system.
I'll give you an example from my own lab, my own particularly smelly question, since we work on the sense of smell. Here's a diagram of five molecules, in a sort of chemical notation. These are just plain old molecules, but if you sniff those molecules up these two little holes in the front of your face, you will have in your mind the distinct impression of a rose. If there's a real rose there, those molecules will be the ones, but even if there's no rose there, you'll still have the memory of a rose. How do we turn molecules into perceptions? What's the process by which that could happen?
Here's another example: two very simple molecules, again in this kind of chemical notation. It might be easier to visualize them this way: the gray circles are carbon atoms, the white ones are hydrogen atoms and the red ones are oxygen atoms. Now these two molecules differ by only one carbon atom and two little hydrogen atoms that ride along with it, and yet one of them, heptyl acetate, has the distinct odor of a pear, and hexyl acetate is unmistakably banana. So there are two really interesting questions here, it seems to me. One is, how can a simple little molecule like that create a perception in your brain that's as clear as a pear or a banana? And secondly, how the hell can we tell the difference between two molecules that differ by a single carbon atom? I mean, that's remarkable to me, clearly the best chemical detector on the face of the planet. And you don't even think about it, do you?
So this is a favorite quote of mine that takes us back to the ignorance and the idea of questions. I like to quote because I think dead people shouldn't be excluded from the conversation. And I also think it's important to realize that the conversation's been going on for a while, by the way. So Erwin Schrödinger, a great quantum physicist and, I think, philosopher, points out how you have to "abide by ignorance for an indefinite period" of time. And it's this abiding by ignorance that I think we have to learn how to do. This is a tricky thing. This is not such an easy business.
I guess it comes down to our education system, so I'm going to talk a little bit about ignorance and education, because I think that's where it really has to play out. So for one, let's face it, in the age of Google and Wikipedia, the business model of the university and probably secondary schools is simply going to have to change. We just can't sell facts for a living anymore. They're available with a click of the mouse, or if you want to, you could probably just ask the wall one of these days, wherever they're going to hide the things that tell us all this stuff.
So what do we have to do? We have to give our students a taste for the boundaries, for what's outside that circumference, for what's outside the facts, what's just beyond the facts.
How do we do that? Well, one of the problems, of course, turns out to be testing. We currently have an educational system which is very efficient but is very efficient at a rather bad thing. So in second grade, all the kids are interested in science, the girls and the boys. They like to take stuff apart. They have great curiosity. They like to investigate things. They go to science museums. They like to play around. They're in second grade. They're interested. But by 11th or 12th grade, fewer than 10 percent of them have any interest in science whatsoever, let alone a desire to go into science as a career. So we have this remarkably efficient system for beating any interest in science out of everybody's head.
Is this what we want? I think this comes from what a teacher colleague of mine calls "the bulimic method of education." You know. You can imagine what it is. We just jam a whole bunch of facts down their throats over here and then they puke it up on an exam over here and everybody goes home with no added intellectual heft whatsoever.
This can't possibly continue to go on. So what do we do? Well, the geneticists, I have to say, have an interesting maxim they live by. Geneticists always say, you always get what you screen for. And that's meant as a warning. So we always will get what we screen for, and part of what we screen for is in our testing methods. Well, we hear a lot about testing and evaluation, and we have to think carefully when we're testing whether we're evaluating or whether we're weeding, whether we're weeding people out, whether we're making some cut. Evaluation is one thing. You hear a lot about evaluation in the literature these days, in the educational literature, but evaluation really amounts to feedback and it amounts to an opportunity for trial and error. It amounts to a chance to work over a longer period of time with this kind of feedback. That's different than weeding, and usually, I have to tell you, when people talk about evaluation, evaluating students, evaluating teachers, evaluating schools, evaluating programs, they're really talking about weeding. And that's a bad thing, because then you will get what you select for, which is what we've gotten so far.
So I'd say what we need is a test that says, "What is x?" and the answers are "I don't know, because no one does," or "What's the question?" Even better. Or, "You know what, I'll look it up, I'll ask someone, I'll phone someone. I'll find out." Because that's what we want people to do, and that's how you evaluate them. And maybe for the advanced placement classes, it could be, "Here's the answer. What's the next question?" That's the one I like in particular.
So let me end with a quote from William Butler Yeats, who said "Education is not about filling buckets; it is lighting fires."
So I'd say, let's get out the matches. Thank you.
(Applause)
Thank you. (Applause)