I'm James. I'm a writer and artist, and I make work about technology. I do things like draw life-size outlines of military drones in city streets around the world, so that people can start to think and get their heads around these really quite hard-to-see and hard-to-think-about technologies. I make things like neural networks that predict the results of elections based on weather reports, because I'm intrigued about what the actual possibilities of these weird new technologies are. Last year, I built my own self-driving car. But because I don't really trust technology, I also designed a trap for it.
(Laughter)
And I do these things mostly because I find them completely fascinating, but also because I think when we talk about technology, we're largely talking about ourselves and the way that we understand the world. So here's a story about technology.
This is a "surprise egg" video. It's basically a video of someone opening up loads of chocolate eggs and showing the toys inside to the viewer. That's it. That's all it does for seven long minutes. And I want you to notice two things about this. First of all, this video has 30 million views.
(Laughter)
And the other thing is, it comes from a channel that has 6.3 million subscribers, that has a total of eight billion views, and it's all just more videos like this -- 30 million people watching a guy opening up these eggs. It sounds pretty weird, but if you search for "surprise eggs" on YouTube, it'll tell you there's 10 million of these videos, and I think that's an undercount. I think there's way, way more of these. If you keep searching, they're endless. There's millions and millions of these videos in increasingly baroque combinations of brands and materials, and there's more and more of them being uploaded every single day. Like, this is a strange world. Right?
But the thing is, it's not adults who are watching these videos. It's kids, small children. These videos are like crack for little kids. There's something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours. And if you try and take the screen away from them, they'll scream and scream and scream. If you don't believe me -- and I've already seen people in the audience nodding -- if you don't believe me, find someone with small children and ask them, and they'll know about the surprise egg videos. So this is where we start. It's 2018, and someone, or lots of people, are using the same mechanism that Facebook and Instagram are using to get you to keep checking that app, and they're using it on YouTube to hack the brains of very small children in return for advertising revenue.
At least, I hope that's what they're doing. I hope that's what they're doing it for, because there's easier ways of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really popular kids' cartoons like "Peppa Pig" or "Paw Patrol," you'll find there's millions and millions of these online as well. Of course, most of them aren't posted by the original content creators. They come from loads and loads of different random accounts, and it's impossible to know who's posting them or what their motives might be. Does that sound kind of familiar? Because it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from. It's basically fake news for kids, and we're training them from birth to click on the very first link that comes along, regardless of what the source is. That doesn't seem like a terribly good idea.
Here's another thing that's really big on kids' YouTube. This is called the "Finger Family Song." I just heard someone groan in the audience. This is the "Finger Family Song." This is the very first one I could find. It's from 2007, and it only has 200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I'm not going to play to you, because it will sear itself into your brain in the same way that it seared itself into mine, and I'm not going to do that to you. But like the surprise eggs, it's got inside kids' heads and addicted them to it. So within a few years, these finger family videos start appearing everywhere, and you get versions in different languages with popular kids' cartoons using food or, frankly, using whatever kind of animation elements you seem to have lying around. And once again, there are millions and millions and millions of these videos available online in all of these kind of insane combinations. And the more time you start to spend with them, the crazier and crazier you start to feel that you might be.
And that's where I kind of launched into this, that feeling of deep strangeness and deep lack of understanding of how this thing was constructed that seems to be presented around me. Because it's impossible to know where these things are coming from. Like, who is making them? Some of them appear to be made of teams of professional animators. Some of them are just randomly assembled by software. Some of them are quite wholesome-looking young kids' entertainers. And some of them are from people who really clearly shouldn't be around children at all.
(Laughter)
And once again, this impossibility of figuring out who's making this stuff -- like, this is a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore? And again, doesn't that uncertainty feel kind of familiar right now?
So the main way people get views on their videos -- and remember, views mean money -- is that they stuff the titles of these videos with these popular terms. So you take, like, "surprise eggs" and then you add "Paw Patrol," "Easter egg," or whatever these things are, all of these words from other popular videos into your title, until you end up with this kind of meaningless mash of language that doesn't make sense to humans at all. Because of course it's only really tiny kids who are watching your video, and what the hell do they know? Your real audience for this stuff is software. It's the algorithms. It's the software that YouTube uses to select which videos are like other videos, to make them popular, to make them recommended. And that's why you end up with this kind of completely meaningless mash, both of title and of content.
But the thing is, you have to remember, there really are still people within this algorithmically optimized system, people who are kind of increasingly forced to act out these increasingly bizarre combinations of words, like a desperate improvisation artist responding to the combined screams of a million toddlers at once. There are real people trapped within these systems, and that's the other deeply strange thing about this algorithmically driven culture, because even if you're human, you have to end up behaving like a machine just to survive.
And also, on the other side of the screen, there still are these little kids watching this stuff, stuck, their full attention grabbed by these weird mechanisms. And most of these kids are too small to even use a website. They're just kind of hammering on the screen with their little hands. And so there's autoplay, where it just keeps playing these videos over and over and over in a loop, endlessly for hours and hours at a time. And there's so much weirdness in the system now that autoplay takes you to some pretty strange places. This is how, within a dozen steps, you can go from a cute video of a counting train to masturbating Mickey Mouse. Yeah. I'm sorry about that. This does get worse. This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place. This is where all those deeply weird keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird, trollish in-jokes or something, and suddenly, you come to a very weird place indeed.
The stuff that tends to upset parents is the stuff that has kind of violent or sexual content, right? Children's cartoons getting assaulted, getting killed, weird pranks that actually genuinely terrify children. What you have is software pulling in all of these different influences to automatically generate kids' worst nightmares. And this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters. If you take one thing away from this, it's that if you have small children, keep them the hell away from YouTube.
(Applause)
But the other thing, the thing that really gets to me about this, is that I'm not sure we even really understand how we got to this point. We've taken all of this influence, all of these things, and munged them together in a way that no one really intended. And yet, this is also the way that we're building the entire world. We're taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we're building that into huge data sets and then we're automating it. And we're munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we're actually constructing the world today out of this data. And I don't know what's worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn't really understand the systems that we were building, and we didn't really understand how to do anything differently with it.
There's a couple of things I think that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are actually developing this content, the centralization of the power, the separation of those things. And I think however you feel about the use of advertising to kind of support stuff, the sight of grown men in diapers rolling around in the sand in the hope that an algorithm that they don't really understand will give them money for it suggests that this probably isn't the thing that we should be basing our society and culture upon, and the way in which we should be funding it.
And the other thing that's kind of the major driver of this is automation, which is the deployment of all of this technology as soon as it arrives, without any kind of oversight, and then once it's out there, kind of throwing up our hands and going, "Hey, it's not us, it's the technology." Like, "We're not involved in it." That's not really good enough, because this stuff isn't just algorithmically governed, it's also algorithmically policed. When YouTube first started to pay attention to this, the first thing they said they'd do about it was that they'd deploy better machine learning algorithms to moderate the content. Well, machine learning, as any expert in it will tell you, is basically what we've started to call software that we don't really understand how it works. And I think we have enough of that already. We shouldn't be leaving this stuff up to AI to decide what's appropriate or not, because we know what happens. It'll start censoring other things. It'll start censoring queer content. It'll start censoring legitimate public speech. What's allowed in these discourses, it shouldn't be something that's left up to unaccountable systems. It's part of a discussion all of us should be having.
But I'd leave a reminder that the alternative isn't very pleasant, either. YouTube also announced recently that they're going to release a version of their kids' app that would be entirely moderated by humans. Facebook -- Zuckerberg said much the same thing at Congress, when pressed about how they were going to moderate their stuff. He said they'd have humans doing it. And what that really means is, instead of having toddlers being the first person to see this stuff, you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.
(Laughter)
And I think we can all do quite a lot better than that.
(Applause)
The thought, I think, that brings those two things together, really, for me, is agency. It's like, how much do we really understand -- by agency, I mean: how we know how to act in our own best interests. Which -- it's almost impossible to do in these systems that we don't really fully understand. Inequality of power always leads to violence. And we can see inside these systems that inequality of understanding does the same thing. If there's one thing that we can do to start to improve these systems, it's to make them more legible to the people who use them, so that all of us have a common understanding of what's actually going on here.
The thing, though, I think most about these systems is that this isn't, as I hope I've explained, really about YouTube. It's about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands -- these are much, much larger issues. And they're issues not just of YouTube and not just of technology in general, and they're not even new. They've been with us for ages. But we finally built this system, this global system, the internet, that's actually showing them to us in this extraordinary way, making them undeniable. Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases and encoding them into the world, but it also writes them down so that we can see them, so that we can't pretend they don't exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them.
Thank you very much.
(Applause)
Thank you.
(Applause)
Helen Walters: James, thank you for coming and giving us that talk. So it's interesting: when you think about the films where the robotic overlords take over, it's all a bit more glamorous than what you're describing. But I wonder -- in those films, you have the resistance mounting. Is there a resistance mounting towards this stuff? Do you see any positive signs, green shoots of resistance?
James Bridle: I don't know about direct resistance, because I think this stuff is super long-term. I think it's baked into culture in really deep ways. A friend of mine, Eleanor Saitta, always says that any technological problems of sufficient scale and scope are political problems first of all. So all of these things we're working to address within this are not going to be addressed just by building the technology better, but actually by changing the society that's producing these technologies. So no, right now, I think we've got a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking about them super honestly, we can actually start to at least begin that process.
HW: And so when you talk about legibility and digital literacy, I find it difficult to imagine that we need to place the burden of digital literacy on users themselves. But whose responsibility is education in this new world?
JB: Again, I think this responsibility is kind of up to all of us, that everything we do, everything we build, everything we make, needs to be made in a consensual discussion with everyone who's avoiding it; that we're not building systems intended to trick and surprise people into doing the right thing, but that they're actually involved in every step in educating them, because each of these systems is educational. That's what I'm hopeful about, about even this really grim stuff, that if you can take it and look at it properly, it's actually in itself a piece of education that allows you to start seeing how complex systems come together and work and maybe be able to apply that knowledge elsewhere in the world.
HW: James, it's such an important discussion, and I know many people here are really open and prepared to have it, so thanks for starting off our morning.
JB: Thanks very much. Cheers.
(Applause)