Chris Anderson: Carole Cadwalladr, it's so nice to get to sit down with you. A few days ago, you opened the TED conference with an absolute blockbuster of a talk. Got a huge reaction from people. In a one-sentence summary, and I'd like you to expand on this, what you argued was that we're in the middle of what looks like a digital coup. That the combination of Trump and a collection of big tech leaders is in danger of creating a new kind of autocracy in America. Is that about the core of it?
Carole Cadwalladr: Yes, that's right. Well, I mean, I think, you know, I put up the photo from the inauguration, and that was one of the things that really resonated with people. It's the photo of the tech leaders behind Trump, and, you know, I called it tech bros in hostage situations. And it's this idea that Silicon Valley has been captured by the administration, and the administration is acting in all sorts of unlawful ways. And Silicon Valley is now part of that.
CA: And the main way in which Silicon Valley is helping advance this is what?
CC: Well, you know, I talked about, for example, for me, the big "Danger! Danger!" moment, which was when, the first weekend that the administration took office, Elon Musk sent his, I call them cyber troops, into the US Treasury, where they gained unlawful access. They got access to the nation's data, its financial data. And he now has that. And, you know, for me, data has always been the crack cocaine of Silicon Valley. You need data to feed the AI. And you can never put it back. I mean, that's one of the whole things. Once you've got the data, when you've got the entire nation's data, you can't just put that genie back in the bottle. And that, to me, is a power grab, one which goes beyond any of the guardrails of democracy. And it's not just about now, you know. Silicon Valley, as we know, does not think in four-year cycles. This is absolutely a land grab for the future. That's the thing I was really trying to say. It goes beyond politics.
CA: So I think probably the purpose of this conversation now is for me to gently try and play devil's advocate. TED is obviously -- we're trying to be a big tent. We want people of all political views and so forth; we want to listen and, you know, treat each other with curiosity and respect. So I'm going to frame what a different view of what's happening might be and see what you make of it.
I mean, one thing to say, first of all, is that Silicon Valley is not a thing. Like, from inside Silicon Valley, they would probably all say, "No, these are our competitors." Elon Musk and Mark Zuckerberg were both there, but they are competitive enemies, to say the least. And so, I think it was you who coined this powerful term "broligarchy."
CC: I believe so. I started using it a year ago, because it was like, oh, hold on a minute. What we're seeing here is this elite, this business elite, like an oligarchy, but it's tech bros. And I was like, of course, it's a broligarchy.
CA: So the way I think of an oligarchy is as a group of powerful people kind of acting in unison. And I think they would say that they are not acting in unison, that largely they are competitive with each other. And maybe there are some aligned interests, like having legislation that makes it easier for companies to expand dynamically and so forth. But that's one piece. If it's the case that they're generally competitors with each other, in what sense do you feel that they're acting, you know, sort of as a group?
CC: I don't think there's any conspiracy here, and I don't necessarily think they're acting as a group at all. And this is where I think it's really helpful to look at America now through the frame of what has happened in other countries. And I think that's one of the blocks, actually, to understanding this situation. So, you know, I think Russia is a template here for what is happening, with that breed we call oligarchs, right? They didn't agree with each other. They were suing each other, they were sometimes murdering each other. But what they all needed was a relationship with Putin, with power. And in some cases, it was about enriching themselves, about creating opportunities. But a lot of the time it was also just survival. And that's what I mean about them looking like hostages. There wasn't a choice, it feels to me, in terms of who was up there on the dais at the inauguration. Trump knew that he needed Silicon Valley, because in a standard coup, when the military takes over, when the junta takes over, the first thing you do is take over the radio station, right? You need to have the means of communication. And in this case, the means of communication are these big Silicon Valley companies.
And that's why it's such a colossal thing that is happening right now, and why it's not just America, of course; we can see that in many ways. The fact is, these are global communication platforms, and they are now in alignment with, captured by, whatever you want to call it, what is a coming autocratic regime.
CA: So even though they're in competition with each other, you're saying that they share a need to have the president's approval. So they are doing things to win that approval, and thereby they're helping to construct and empower the creation of a kind of autocracy.
CC: I mean, Trump was explicit in his threats, right? I mean, I think he sort of threatened Zuckerberg with jail. You know, it's partly carrot, but it is partly stick, and that is understood, I think. There is the carrot, which is that there are opportunities: he's going to tear up regulation, and it's going to make it much easier to do the things that they want to do. But there is also the stick: we can see that if you're not obeying, then life is going to be very difficult. And we see that playing out in all sorts of ways.
So, for example, with media organizations, we're seeing lawsuits on a daily basis. We're seeing it against big legal firms. That's one of the most shocking things for people. So it's not just that they're sucking up to Trump for the sake of it. They feel they don't have a choice.
CA: So it's unquestionably true that there was a big swing in Silicon Valley, which has traditionally been left of center, toward Trump over the last six months of the last campaign. If you talk to people there, most of them, I think, would have explained it as follows. They would have said two things. One, these companies are reacting against years of a sort of progressive culture that they didn't like, you know, that got in the way of building the stuff that they wanted to build. And two, a belief that the deregulation commitments of Trump, and the explicit effort he made to do things like embrace crypto and so forth, showed that he was interested in having a sort of positive environment in which technology could flourish. And from that standpoint, a defender of Elon would say: look, Elon is known for running businesses more efficiently than anyone on the planet. He cut 75, 80 percent of the workforce at Twitter and, at least operationally, functionally, added features and so forth. So what they would say is that it's fantastic and amazing to have, for the first time really, a really powerful businessman come into government and apply some of those tools to save what are, probably by common consent, crazy wasted costs in government. And the key to doing that is that you have to start with the actual information systems; that's the pathway. So what might look like "Oh, he's going in to seize the data for his own use" is not that; it's delivering the means to figure out how to make the cuts. Do you see any rationale for that at all?
CC: I can absolutely see that, from an engineer's brain, that looks totally, completely rational, and absolutely, why wouldn't you? And also, if you're an engineer, as we've seen in Silicon Valley: laws, regulation, you know, sod that, we know how to do it better. We know how to do it faster. If we do it quick enough, then actually it'll take them ages to catch up with us, and we've already done it. Like, that is the history of Silicon Valley.
CA: Move fast and break things.
CC: You know what, though? We've got to stop using that phrase. It sounds so innocent, and it's like, oh, it's like a baby breaking its toys. What that means is breaking the law and getting away with it. It's having absolute impunity and knowing that it takes ages for regulators to catch up. And that has created the situation that we're in. And this is exactly how DOGE is working. So everything that he has done, for example, the cuts to USAID -- devastating, devastating cuts. That is money which was allocated by Congress. This is not lawful. And this is, to use the Silicon Valley framework, it's because they've always gotten away with it. So, you know, if you do it fast enough, it's then too late. The damage has been done, and the world moves on. And that is the mistake that we have made with Silicon Valley time and time again. And it's why now, whilst this is happening in real time, this is the moment that people have to act. Because if you want to take the lessons from authoritarian countries, then it's too late after the fact. The longer that this goes on, the longer the breaking, the wrecking, the vandalism, the illegal and unlawful behavior goes on, the more it's consolidated, the harder it is to fight back.
And so my talk, in essence, was about the fact that even though it's confusing, people are in denial. They feel powerless. You know, there's this sort of moment of paralysis. But actually people do have power, and feeling that power was what I was trying to communicate in my talk, really, as somebody who has experienced powerlessness. And I think, as I said, it was only coming to TED that I had this revelation that, actually, when you're at your most powerless, it's often because you are powerful. That's why you have to be stopped. And the people of America are more powerful than these guys, right? There's more of you, and you have values and morals and ethics, you know, a belief in the law, on your side. So that's the thing that I was trying to communicate.
CA: Well, you really touched a nerve, in the most powerful way, by being eloquent, as we've just heard, and by being vulnerable and, you know, coming at it from a very personal space. And when it comes to the demolition of USAID, I personally know organizations and people who were wrecked by that. And I think history will show that it was a pretty brutal and reckless approach.
CC: Can I turn the tables, Chris, and ask you about that? Because, you know, TED has done this amazing thing of bringing together innovators, plus also people who think about the really hard problems that the world faces. And it's always been about a sort of synthesis between those and finding new ways. And this spirit of optimism has always run through the place. And for me personally, you know, it's been a big thing. I first came to TED in 2005. But this is something really different, isn't it? And do you find it hard to retain your optimistic frame?
CA: I've always described myself as a determined optimist, which means that no matter how dark things are, you look for a pathway forward that has some hope, and you try and shine a light on it, and hopefully, you know, people can find their way there. I've always believed that the worlds of ideas, innovation, technology are actually ultimately more powerful than politics. And I'm dismayed at the world of politics right now, dismayed because it seems pretty hopeless. There seems to be an impossible divide between two tribes. I actually think that over the next three or four years, the even bigger story will be how technology plays out. Because I think AI is growing in power at such a speed that, you know, it will be more important than the political decisions being made. So for me, my focus is: can we think of a way of ensuring that we get the best of AI and not the worst?
And that takes us, I think, into one of the key conversations I want to have with you, is around data. In your talk, you so powerfully talked about how these companies are extracting our data. It's surveillance capitalism, surveillance fascism, I think you called it.
CC: I cut that line, you know. Can I just say, I was really sad when I woke up the next morning. I was really sad because I had that line in there: "This is no longer surveillance capitalism. We're on our pathway to surveillance fascism." But I was mindful that everybody was like, "You've got to cut it down a bit." So I lost that, and it was one of my --
CA: Well, there we go, we've got it back. This I think is such -- I'm wrestling with myself here, because it's true that your data, used by someone in power against your interest, is a horrifying thing. It's also true that, in a way, everything works from data. You need information to have any kind of useful knowledge. So the sharpest way I can put this is this. Let's say I've got a very powerful AI companion that I'm consulting and getting wisdom from and so forth. And you ask the question: do I want it to know about me or not? I think most people will end up concluding that they do want that AI to know about them, because it's only by knowing about you that it can actually give you wise advice that's tailored to what you need and who you are. And the classic examples of the misuse of data are things like the advertiser knowing before you do that you're a target for Viagra or whatever ailment they can foist on you. And that feels very uncomfortable. But when it comes to an actual intelligence that you're working with, I wonder whether you're going to win the argument on data, or whether most people are actually going to voluntarily say, "No, please, literally, I want you to read all my emails and help me be wiser." Is that horrifying, or do you see some logic to that?
CC: I understand the beautiful vision that that is: there's a really helpful assistant who's going to know all your problems and, you know, help you reach the great solutions. The thing about it is that, first off, it's ownership, right? Who are these companies owned by? What are their values? Who are they aligned with? Where might that data end up? And do you trust them? Are they transparent about it? Do you know what's going to happen to that data, where it could end up? And the thing is, in the current environment we're in, none of those things are true, right? There are none of these companies where you could say, yes, this person is the Nelson Mandela of the tech industry, and I have complete trust and faith in them. And even if there was, the fact is, as we've seen with 23andMe, right? People have done these genetic tests, so this company now has their genetic code, and it's now up for sale. So where is that going to end up? And the thing is, to go back to it: you never get your data back. When it's gone, it's gone. And the ways that that can be weaponized against you -- a lot of women in America are starting to understand what that means. These period-tracking apps: women are now understanding that's a surveillance device, that if there's some instance in which they might have to seek health care, that could become evidence which could be used against them. This is really personal information. And that's the thing.
You know, a lot of people talk about data as property. It's really so much more than that. It's an aspect of you; it's like your blood, your bones, your skin, your cells. You have to think about how that can and will be used -- like, assume the worst. At this point in time, you have to assume the worst.
CA: The business models of the AI platforms are different from the business models of social media. Social media was dependent on advertising, and the core there is almost like: give the advertisers data that you've extracted and let them use it how they will. For the AI platforms to earn people's subscriptions, where you're literally paying an amount per month, I think they're going to conclude that it's in their interest to demonstrate that they are trustable. I mean, if they're not, people won't subscribe.
CC: But they're not trustworthy. Who's trustworthy in the AI space then, out of these companies? Who is being transparent, ethical and legitimate in their approach to data use and the models they're building?
CA: Obviously, you know, the stated policy of all of the companies is that they want to honor users' interests. I mean, I've spent time talking with Demis Hassabis, who's head of DeepMind and basically drives Google's most important AI efforts. I think he's an honorable person. I think he's trying really hard to do the right thing and to develop Google's AI products on fair principles. That's not to say it's an easy thing to do. But even if, let's say, you don't trust Sam Altman: if OpenAI is exposed as abusing data, they have literally billions of dollars that will go out the door from people who won't continue to subscribe to them. So you can say that maybe some individual at the top is not trustworthy. What I'm saying is that the actual system here doesn't obviously pull towards mistrust. It actually pulls the other way: winning people's trust is key to their success.
CC: But I think one of the best measures of people's behavior in the future is their behavior in the past. And this is actually how a lot of these systems work, right? And if you look at the behavior of OpenAI in the past, it illegally scraped data from numerous sources, without respecting property rights or any other laws, in different jurisdictions.
CA: You had that beautiful point in your talk where you said, "So I asked ChatGPT to write a TED Talk in the style of Carole Cadwalladr." And you showed what it was --
CC: Yeah, and it was basically the outline of my talk.
CA: It was compelling. Could have saved you a lot of time, Carole.
CC: Except as I said, it's like the opposite of human creativity.
CA: Well, so let me ask this, though. Here, you said, "I did not consent to this. And I do not consent." And it feels, you know, it just feels outrageous that they've been reading all your stuff and are now doing this, and anyone else could write a talk in the style of Carole Cadwalladr. Would you feel differently about it if there was an improved business model here, where the platforms committed to respecting individual talent? So that, for example, when a request is made to specifically embody the style of a musician or a writer or an artist, there would actually be some compensation back to that person. So that you could say, "Actually, this is a way in which I could amplify my impact on the planet, and I will actually be compensated for it." Does that change the -- ?
CC: So I think that's fundamental to it. But it goes back to the point that this was done without any of our permission, right? And we can see that there are big players who are able to make deals. And I use The Guardian as an example of that, right? The Guardian has done this syndication deal after the fact, because the damage has already been done. They've already scraped the entirety of The Guardian's website. So I understand the logic: well, you might as well try and make some money out of it. But of course, that's not respecting the IP of the individual contributors there. And individual contributors are not going to be in a position to do these deals with the platforms, because there's no collective ability to force a proper negotiation.
So in a theoretical world, there's an ethical AI company which asks your permission before it scrapes your data and then pays you whenever it uses it in some way. But as we know, it's so hard to make that assessment, right, because it's taken in such vast amounts of data and mixed it all up into some weird sausage, which it's now putting back out there.
CA: I mean, they would argue, and I'm not saying I agree with this, but they would argue that every time technology changes, the rules need to be worked out again. That you've got a situation where, you know, your words were published, put out freely for anyone on the internet to read. No matter how many more people read your past words, you don't get any more payment, and so the data is out there. They would argue that it's out there as a sort of public resource for fair use. I think it's right that people are challenging that, because the fact is that, say, a given artist could easily be displaced by an AI able to do much more --
CC: No, no, no. It's actually much, much deeper than that: every nation-state in the world has some form of property law, right? You can't walk into somebody's house and just steal the silver. Like, that's fundamentally the basis of law and order in our countries.
CA: But when it's intellectual --
CC: No, it's property. These are property laws. You know, in Britain, we've had this law since 1783. This isn't just some -- And if you can't respect the basic fundamental underlying principles with which we order society -- which is “Do not steal” -- then what are you left with? It's like, "It's fine, we're going to take your silver. And then if we sell it on eBay, we might give you like, five percent of it."
CA: Yeah, so I get the anger. There is a difference between a physical object, where if someone steals it, you no longer have it, versus a digital property, where if someone "steals" it, you still have access to it.
CC: Not under the law, there's no difference.
CA: You know, I think there's traditionally a difference in, like, when an idea is out there, it can be built on and amplified. For example, in the music business, there's constant building on one person's work by the next artist. You know, the most -- the kindest way of viewing what they're doing, for them, is to say: we're not stealing, we're amplifying.
CC: I think we are absolutely lost if we do not respect the law. And that's what we're seeing; this is what is happening.
CA: But the law isn't defined yet properly in AI. It's in the process of being defined.
CC: It's property. These are just property laws, it's no different.
And that's the point, to go back to the case, right? You know, the underlying basis of what Google did, where it digitized, it stole, you know, every single written book in the world, didn't it? That was one of its first acts. And as I sort of said, it's this acting with impunity that has led us to a place where that ideology is now embedded in the government of the biggest superpower in the world, and that is what's playing out now in real time. And if you don't like those laws, well, then you don't respect these ones either. This is where we're in a sort of cascading situation.
CA: Right, right. I want two things simultaneously. I want a world in which creators are respected and fairly compensated for what they do. I also want a world where I can search for the collective wisdom of humanity and find it. Like, I want to be able to read from all these books and discover them. And so --
CC: But we're not going to be able to have any further wisdom, because there's going to be no economic model for anybody to write another book, you know. So we're ceding, we're ceding.
CA: Where I agree with you is that artists and writers absolutely should be compensated. And if we could do that, it's just about possible to imagine a world where AI and data can actually amplify the best thinkers. You know, the fact that, in principle, someone's kid could have a conversation with Einstein based on his wisdom -- that's not something that was possible before now. Arguably that makes the world better, you know. Does it? I think it's a reasonable conversation. But I think most people here would agree that the law is not yet in a good place and that writers and artists are in severe danger of being --
CC: They're not in severe danger. It's happened. And going back to it, it's power. I think we just keep on having to come back to that, you know: it's power. This is power being concentrated in the hands of a very few companies, which are now aligned with a rogue state. That is what America is now in the world. And to go back to what you were saying before, one of the key things I wanted to get across in the talk is that technology is politics now, and politics is technology. There is no separation between them. And I really appreciate, Chris, it was so punchy of you to -- And we should talk about why you decided to put me first as the opening talk of the conference. Tell me, why did you decide that?
CA: I put you first because a huge number of people, probably the large majority, certainly of the TED community, are in a bit of a state of shell shock right now. I mean, the pace of change has not been seen before, either politically or technologically, and people don't know what to make of it. And you are unbelievably eloquent at naming it and helping people feel it. You expressed emotions and feelings that so many people in the room feel, and they were just so moved to hear that come from someone so powerfully. And, you know, you don't hold back. Most people are frightened to make bold accusations against named individuals. You're fearless. And it's really something.
I would actually love you, just speaking of the journey that you've been on, to explain a bit more of your own story here. Because you mentioned briefly in the talk that after the last time you spoke, you know, you ended up being sued, and it turned your life upside down. We're not going to go into naming names and all the rest of it, but I think it's fair to say that in that original talk, you described someone as a liar, based on prior reporting, and he sued you for that. And -- what was the court ruling there? At some point, the court ruled that you would have to pay his legal costs.
CC: What it is, is that I said words which we had published in The Guardian, which were perfectly defensible: that he had lied about his relationship with the Russian government. And that was based upon a series of secret meetings, or non-disclosed meetings, let's just say, that the Brexit donor had with Russian embassy officials in the lead-up to the Brexit vote. Now, that's just fact. And it was in our reporting.
But the thing which I got tripped up on, Chris, is that in the very arcane meshes of British libel law, a single judge decides on the meaning of your words. They take into context the entire talk, and then they formulate their view of that meaning for all time. So the judge came up with this formulation: that I had said he had accepted money in contravention of the law. And so therefore I had libelled him, because I had made an accusation that he had accepted foreign funding. I had never said those words at any point. Those words were never said in the talk. I certainly never meant to say them, but that's what I had to go into court to defend. And that is why the Kafkaesque quality of it was so confounding: I was having to defend something which I'd never said. And that turned the case on its head, because it meant I couldn't defend that judge's meaning. So I then had to defend on the public interest of why I gave the talk. And it put all of the onus on me and my reporting. That's why, instead of me getting to do discovery on the man, he got to do discovery on me. And this is the thing which is really relevant to what's going on in the US right now: the court case was what's called a SLAPP, a strategic lawsuit against public participation, which is a way of trying to shut down critical reporting or critical voices. And that's what we're seeing happening in the US, these weaponized lawsuits. Organizations across America are now preparing for this to happen to them. They know that they are going to be on the end of, you know, highly politicized lawsuits in which they're going to have to open up their computers, their laptops. And they also know that this is going to be accompanied, as it was in my case, by a sort of massive online hate campaign. And so that's the analogy which I was trying to make: what happened to me is a warning for what is coming for other people in America.
CA: Well, needless to say, for everyone at TED, it was horrifying to see what you went through.
CC: And I won. So just to be clear on that, I won the case. The public interest defense was upheld; my talk was absolutely lawful at the time that I gave it. And then on appeal, what happened is that in the year after I gave that talk, a police investigation into the Brexit donor was voided, and at that point the Court of Appeal decided that the defense fell away. So it was the continued publication by TED, which is a foreign media organization in a foreign jurisdiction, that I was held responsible for. And that's why damages were awarded against me. And that's the thing which we're now appealing at the European Court of Human Rights.
Chris, because the thing was complicated, nobody really understood it, it was the pandemic. And, you know, I think when you realized the gravity of what was happening, you rang me up after the trial had ended and made that very generous gesture, which I do really appreciate: you said, you know, we will see you right.
CA: Yes.
CC: So thank you for that. I don't want that to go unremarked.
CA: I mean, you're an amazing fighter. So, Carole, during your talk, you referred to the terrible personal experience you've had over the last few years, after your last talk. For someone who wants to understand more about what happened there, where can they go?
CC: In one week's time, I'm leaving my job. Not through choice, but because 100 journalists from The Guardian are being terminated, because The Guardian has sold our corner of it. So I have set up a Substack, and I will write a full account there, where I would love to be able to explain to people the bigger picture behind that. I really believe in independent media and independent film, and that, I think, is so vital at this time. And so, you know, as I was saying, my newspaper has been bought by unknown, unclear investors, and I don't feel it's possible to do the same kind of independent journalism there. But there is an explosion, a thirst and demand from people for, I think, these clear, independent voices. And I think out of the total crisis of media and what is happening with social media, information chaos, I call it, there is also an opportunity to grow properly sustainable media from the ground up, supported by readers who value it, without being dependent upon advertising, which we've seen has been a terrible game for media. And algorithms, which we've seen are another terrible game.
CA: Carole, thank you so much for coming to TED. Took lots of courage. You really touched people. Really wish you well as you continue your journey.
CC: Thank you for having me, I really appreciate it.