I'm James. I'm a writer and artist, and I make work about technology. I do things like draw life-size outlines of military drones in city streets around the world, so that people can start to think and get their heads around these really quite hard-to-see and hard-to-think-about technologies. I make things like neural networks that predict the results of elections based on weather reports, because I'm intrigued about what the actual possibilities of these weird new technologies are. Last year, I built my own self-driving car. But because I don't really trust technology, I also designed a trap for it.
(Laughter)
And I do these things mostly because I find them completely fascinating, but also because I think when we talk about technology, we're largely talking about ourselves and the way that we understand the world. So here's a story about technology.
This is a "surprise egg" video. It's basically a video of someone opening up loads of chocolate eggs and showing the toys inside to the viewer. That's it. That's all it does for seven long minutes. And I want you to notice two things about this. First of all, this video has 30 million views.
(Laughter)
And the other thing is, it comes from a channel that has 6.3 million subscribers, that has a total of eight billion views, and it's all just more videos like this -- 30 million people watching a guy opening up these eggs. It sounds pretty weird, but if you search for "surprise eggs" on YouTube, it'll tell you there's 10 million of these videos, and I think that's an undercount. I think there's way, way more of these. If you keep searching, they're endless. There's millions and millions of these videos in increasingly baroque combinations of brands and materials, and there's more and more of them being uploaded every single day. Like, this is a strange world. Right?
But the thing is, it's not adults who are watching these videos. It's kids, small children. These videos are like crack for little kids. There's something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours. And if you try and take the screen away from them, they'll scream and scream and scream. If you don't believe me -- and I've already seen people in the audience nodding -- if you don't believe me, find someone with small children and ask them, and they'll know about the surprise egg videos. So this is where we start. It's 2018, and someone, or lots of people, are using the same mechanism that, like, Facebook and Instagram are using to get you to keep checking that app, and they're using it on YouTube to hack the brains of very small children in return for advertising revenue.
At least, I hope that's what they're doing. I hope that's what they're doing it for, because there's easier ways of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really popular kids' cartoons like "Peppa Pig" or "Paw Patrol," you'll find there's millions and millions of these online as well. Of course, most of them aren't posted by the original content creators. They come from loads and loads of different random accounts, and it's impossible to know who's posting them or what their motives might be. Does that sound kind of familiar? Because it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from. It's basically fake news for kids, and we're training them from birth to click on the very first link that comes along, regardless of what the source is. That doesn't seem like a terribly good idea.
Here's another thing that's really big on kids' YouTube. This is called the "Finger Family Song." I just heard someone groan in the audience. This is the "Finger Family Song." This is the very first one I could find. It's from 2007, and it only has 200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I'm not going to play to you, because it will sear itself into your brain in the same way that it seared itself into mine, and I'm not going to do that to you. But like the surprise eggs, it's got inside kids' heads and addicted them to it. So within a few years, these finger family videos start appearing everywhere, and you get versions in different languages with popular kids' cartoons using food or, frankly, using whatever kind of animation elements you seem to have lying around. And once again, there are millions and millions and millions of these videos available online in all of these kind of insane combinations. And the more time you start to spend with them, the crazier and crazier you start to feel that you might be.
And that's where I kind of launched into this, that feeling of deep strangeness and deep lack of understanding of how this thing was constructed that seems to be presented around me. Because it's impossible to know where these things are coming from. Like, who is making them? Some of them appear to be made of teams of professional animators. Some of them are just randomly assembled by software. Some of them are quite wholesome-looking young kids' entertainers. And some of them are from people who really clearly shouldn't be around children at all.
(Laughter)
And once again, this impossibility of figuring out who's making this stuff -- like, this is a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore? And again, doesn't that uncertainty feel kind of familiar right now?
So the main way people get views on their videos -- and remember, views mean money -- is that they stuff the titles of these videos with these popular terms. So you take, like, "surprise eggs" and then you add "Paw Patrol," "Easter egg," or whatever these things are, all of these words from other popular videos into your title, until you end up with this kind of meaningless mash of language that doesn't make sense to humans at all. Because of course it's only really tiny kids who are watching your video, and what the hell do they know? Your real audience for this stuff is software. It's the algorithms. It's the software that YouTube uses to select which videos are like other videos, to make them popular, to make them recommended. And that's why you end up with this kind of completely meaningless mash, both of title and of content.
But the thing is, you have to remember, there really are still people within this algorithmically optimized system, people who are kind of increasingly forced to act out these increasingly bizarre combinations of words, like a desperate improvisation artist responding to the combined screams of a million toddlers at once. There are real people trapped within these systems, and that's the other deeply strange thing about this algorithmically driven culture, because even if you're human, you have to end up behaving like a machine just to survive.
And also, on the other side of the screen, there still are these little kids watching this stuff, stuck, their full attention grabbed by these weird mechanisms. And most of these kids are too small to even use a website. They're just kind of hammering on the screen with their little hands. And so there's autoplay, where it just keeps playing these videos over and over and over in a loop, endlessly for hours and hours at a time. And there's so much weirdness in the system now that autoplay takes you to some pretty strange places. This is how, within a dozen steps, you can go from a cute video of a counting train to masturbating Mickey Mouse. Yeah. I'm sorry about that. This does get worse. This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place. This is where all those deeply weird keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird, trollish in-jokes or something, and suddenly, you come to a very weird place indeed.
The stuff that tends to upset parents is the stuff that has kind of violent or sexual content, right? Children's cartoons getting assaulted, getting killed, weird pranks that actually genuinely terrify children. What you have is software pulling in all of these different influences to automatically generate kids' worst nightmares. And this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters. If you take one thing away from this, it's that if you have small children, keep them the hell away from YouTube.
(Applause)
But the other thing, the thing that really gets to me about this, is that I'm not sure we even really understand how we got to this point. We've taken all of this influence, all of these things, and munged them together in a way that no one really intended. And yet, this is also the way that we're building the entire world. We're taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we're building that into huge data sets and then we're automating it. And we're munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we're actually constructing the world today out of this data. And I don't know what's worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn't really understand the systems that we were building, and we didn't really understand how to do anything differently with it.
There's a couple of things I think that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are actually developing this content, the centralization of the power, the separation of those things. And I think however you feel about the use of advertising to kind of support stuff, the sight of grown men in diapers rolling around in the sand in the hope that an algorithm that they don't really understand will give them money for it suggests that this probably isn't the thing that we should be basing our society and culture upon, and the way in which we should be funding it.
And the other thing that's kind of the major driver of this is automation, which is the deployment of all of this technology as soon as it arrives, without any kind of oversight, and then once it's out there, kind of throwing up our hands and going, "Hey, it's not us, it's the technology." Like, "We're not involved in it." That's not really good enough, because this stuff isn't just algorithmically governed, it's also algorithmically policed. When YouTube first started to pay attention to this, the first thing they said they'd do about it was that they'd deploy better machine learning algorithms to moderate the content. Well, machine learning, as any expert in it will tell you, is basically what we've started to call software whose workings we don't really understand. And I think we have enough of that already. We shouldn't be leaving this stuff up to AI to decide what's appropriate or not, because we know what happens. It'll start censoring other things. It'll start censoring queer content. It'll start censoring legitimate public speech. What's allowed in these discourses, it shouldn't be something that's left up to unaccountable systems. It's part of a discussion all of us should be having.
But I'd leave a reminder that the alternative isn't very pleasant, either. YouTube also announced recently that they're going to release a version of their kids' app that would be entirely moderated by humans. Facebook -- Zuckerberg said much the same thing at Congress, when pressed about how they were going to moderate their stuff. He said they'd have humans doing it. And what that really means is, instead of having toddlers being the first person to see this stuff, you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.
(Laughter)
And I think we can all do quite a lot better than that.
(Applause)
The thought, I think, that brings those two things together, really, for me, is agency. It's like, how much do we really understand -- by agency, I mean: how we know how to act in our own best interests. Which -- it's almost impossible to do in these systems that we don't really fully understand. Inequality of power always leads to violence. And we can see inside these systems that inequality of understanding does the same thing. If there's one thing that we can do to start to improve these systems, it's to make them more legible to the people who use them, so that all of us have a common understanding of what's actually going on here.
The thing, though, I think most about these systems is that this isn't, as I hope I've explained, really about YouTube. It's about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands -- these are much, much larger issues. And they're issues not just of YouTube and not just of technology in general, and they're not even new. They've been with us for ages. But we finally built this system, this global system, the internet, that's actually showing them to us in this extraordinary way, making them undeniable. Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases and encoding them into the world, but it also writes them down so that we can see them, so that we can't pretend they don't exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them.
Thank you very much.
(Applause)
Thank you.
(Applause)
Helen Walters: James, thank you for coming and giving us that talk. So it's interesting: when you think about the films where the robotic overlords take over, it's all a bit more glamorous than what you're describing. But I wonder -- in those films, you have the resistance mounting. Is there a resistance mounting towards this stuff? Do you see any positive signs, green shoots of resistance?
James Bridle: I don't know about direct resistance, because I think this stuff is super long-term. I think it's baked into culture in really deep ways. A friend of mine, Eleanor Saitta, always says that any technological problems of sufficient scale and scope are political problems first of all. So all of these things we're working to address within this are not going to be addressed just by building the technology better, but actually by changing the society that's producing these technologies. So no, right now, I think we've got a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking about them super honestly, we can actually start to at least begin that process.
HW: And so when you talk about legibility and digital literacy, I find it difficult to imagine that we need to place the burden of digital literacy on users themselves. But whose responsibility is education in this new world?
JB: Again, I think this responsibility is kind of up to all of us, that everything we do, everything we build, everything we make, needs to be made in a consensual discussion with everyone who's avoiding it; that we're not building systems intended to trick and surprise people into doing the right thing, but that they're actually involved in every step in educating them, because each of these systems is educational. That's what I'm hopeful about, about even this really grim stuff, that if you can take it and look at it properly, it's actually in itself a piece of education that allows you to start seeing how complex systems come together and work and maybe be able to apply that knowledge elsewhere in the world.
HW: James, it's such an important discussion, and I know many people here are really open and prepared to have it, so thanks for starting off our morning.
JB: Thanks very much. Cheers.
(Applause)