Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem. So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's.
But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out. So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time.
What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter) So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.
I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.
Thank you.
(Applause)