
ThinkIn with James Harding: The battle for truth | Does truth even exist anymore and, if so, who owns it? Welcome to Season One of ThinkIn with James Harding

Facebook and the battle for truth


In the court of public opinion, Facebook is Exhibit A in the battle for truth; the most powerful example on the planet of the damage unregulated social media is doing to the public square. Facebook’s response has been to set up a higher court, its own supreme court in fact. Whatever it does, whatever it says, it won’t be the last word.

Episode 5 in Season One of ThinkIn with James Harding, The Battle For Truth.


  • Noah Feldman, author, academic and professor of law at Harvard Law School 
  • Stephen Levy, Wired editor at large and author of Facebook: The Inside Story
  • Poppy Wood, senior advisor at RESET: resetting the internet for democracy
  • Alexi Mostrous, investigations editor at Tortoise 


James Harding: Hello.

Noah Feldman: Let me just put things down here, can you guys hear me well?

James: We can hear you very well. Noah, thanks for doing this, I know you have to juggle a hundred things.

Noah: Sure, no, no, I’m glad we were able to work the timing out. Thank you for asking me. 

James: Thank you. Also, I’ve been one of those people who, I think for about 20 years now, has been told, ‘Oh, you must know Noah Feldman’ and I’ve always said ‘No, but you’re the 74th person who’s said that’, so now I am able to say yes, funnily enough I was talking to him on a Zoom call. 

And Poppy, I do know you. I don’t know whether you remember, we met years ago, I think when you were in Downing Street.

Poppy Wood: That’s right.

James: Stephen, thanks for joining us. I don’t know why people in radio always ask what you had for breakfast but why don’t you tell us what you had for breakfast?

Stephen Levy: Okay, I had an omelette for breakfast.

James: Did you really?

Stephen: Yes, I did.

James: That’s good living. I’m just checking with my colleague, Katie, who is going under the name of Matt in this as part of our battle for truth, even Katie is a deep fake. Katie, is that all right in terms of the sound? 

Hello, and welcome to a ThinkIn, I’m James Harding and I think it’s fair to say that in today’s battle for truth there is one business that has become the battlefield. In the fights over facts and fake news, over accuracy and accountability, in the battle that seems to be consuming all of us and who and what we can trust, there is that one company that somehow finds itself the home to every argument, that seems to play host to a war on all fronts. It is, of course, Facebook. And you may say that well, in saying that, I have already made up my mind, that I start this ThinkIn with a point of view, and you’d be right, I do.

For some years now, I’ve been worried that democracy is in a losing battle with technology. I’ve looked on at the standoff between Capitol Hill and Silicon Valley, between the Hill and the Valley, and marvelled at government’s failure to stop tech’s assault on trust and truth, polarisation and privacy and political manipulation. And increasingly, when the big tech CEOs head to Washington to appear in front of those Congressional hearings, you can’t help feeling it’s government ignorance and impotence that’s on parade. It’s as if the politicians, not the execs, are in the dock. In all this, I suppose it’s the case that practically, as well as symbolically, it all feels as though it sits with Facebook. Democracy is up shit-creek and Mark Zuckerberg, it seems, has the paddle. 

There are any number of examples or incidents from around the world that we could take as Exhibit A in Zuckerberg’s own battle for truth. Facebook enabling genocide in Myanmar: 

“Some of the most inflammatory speeches have been made by this monk on YouTube and Facebook.” 

“A social media site has been used to incite violence against Rohingya refugees living in the country’s Rakhine State.” 

“The evidence is that in Myanmar, a country pushing out Rohingya Muslims, Facebook has been hopelessly ill-equipped to deal with dangerously violent posts.”

Cambridge Analytica’s wholesale data breach for political ends: 

“The data of as many as 87 million of its users may have been improperly shared with Cambridge Analytica.” 

“This is all down to widespread criticism of how Facebook protects your data.” 

“And it was my mistake and I’m sorry.”

The anti-vaxxer conspiracy theories that go viral:

“The top story is a call for social media companies to face tough financial and criminal penalties if they fail to clamp down on anti-vaccine content.” 

The to-and-fro over Holocaust denial; Instagram influencers peddling lies, insecurity and hatred and, of course, the Big Lie. 

“Trump: That is why I am determined to protect our election system which is now under coordinated assault and siege.” 

Facebook’s role in Donald Trump’s claim to a landslide election victory in 2020. 

“Facebook is pulling down groups aiming to oppose election results including ones organising action with the hashtag #stopthesteal.” 

“I’ve found posts since the insurrection on January 6th praising or calling for more violence on Facebook…”

For this ThinkIn, you’ll be pleased to hear we are not going to start with any of that, we are going to start somewhere else entirely. We’re going to begin with a sunny day and a bike ride.

It’s 2017 and Noah Feldman, the author, academic and Professor of Law at Harvard, has been staying with his college friend, Sheryl Sandberg, who is Zuckerberg’s right hand at Facebook. They are just outside Menlo Park in California and Noah Feldman decides to head out for a bicycle ride. As you can see, I am rather warming to my theme here and I’m rather tempted to start embellishing the story – some facts are too good to check, as the old journalistic jibe goes – but you’ll be pleased to know I don’t need to do that because Noah Feldman is actually here, he can describe the moment. 

James: So, Noah, why don’t you start by telling us that, what happened on that bicycle ride?

Noah: Thank you, James. I was climbing up Old Lahonda, which is an extremely beautiful Californian hill that was much too difficult for me to climb on my bicycle and I was probably a little bit oxygen deprived and the brain starts to do funny things. I had been thinking a lot in that context about two subjects: one was how constitutions come into existence. I had just finished a huge brick of a biography of James Madison, the lead draftsperson of the US constitution – I’m not sure if anybody read it but it got me thinking – and the other thing I was thinking about was how free expression in the world was really in a crisis because of Facebook and other social media platforms and because of the tremendous competing pressures, both to shut down more expression on the platforms because of the danger that the bad kind of expressions cause, that you were alluding to, and also because of the countervailing risk that all of the pressure that would be brought to bear on Facebook would be pressure to take down content, and of course that would mean that Facebook was in a position of being a censor and thereby limiting content and if much discourse takes place on social media platforms, then you have too little speech.

Somewhere on this ride it hit me that what we need, what Facebook needs – it wasn’t we in my mind at the time, what they need – is some kind of solution in the form of a Supreme Court, a constitutional court that would do some of the heavy lifting of balancing the very difficult, almost impossible to answer competing concerns of what content needs to be suppressed and what content needs to be permitted.

James: And so, what happens when you ride down the hill, what happens next?

Noah: Well, on the way down I of course started to play out how it would look in practice, you know. No sooner had I got to the top and said ‘I’ve got this great idea’ than on the way down I thought, wait a minute, how could this actually be done? The way down, which is always a little bit easier, was thinking look, how can you have something that is genuinely independent? And then I thought well, a government could do it and perhaps in Europe or elsewhere governments might impose regulation where they’d do such a thing but in the US the government can’t do it because of the US First Amendment, the US basically can’t lawfully serve as this kind of a censor and so then I thought, well could a whole industry do it? I said, well in principle, yes, but why would the rest of the industry want to associate itself with Facebook or undertake such an experiment? I thought really, as a first step, only Facebook itself could do it and it would have to do it by spinning out a lot of its powers that presently rest in the senior leadership, in Sheryl Sandberg, in Mark Zuckerberg and so whether this is practicable just totally depends on whether they think that they no longer want to do the job of themselves sitting in on their own meetings and deciding whether Holocaust denial is hate speech or Holocaust denial isn’t hate speech, which is genuinely I think …

I’m going to cut Noah off here because the ThinkIn that follows involves not just him but also Stephen Levy, Wired’s Editor-at-Large and the author of Facebook: The Inside Story. You are going to hear too from Poppy Wood, she’s Senior Advisor at Reset which is focused on resetting the internet for democracy and she previously worked for two years as an advisor in Downing Street on tech policy and Alexi Mostrous, who is Editor here at Tortoise and puts out our weekly Tech Nation Sensemaker newsletter, is joining us too. 

I should explain that a ThinkIn is at the heart of what we do at Tortoise. We’re a slow newsroom and the ThinkIn is an open news meeting, a deliberate effort to hear competing views and then come to a clearer sense of what to think and in this series of podcasts, we’re taking the ThinkIn, our series of organised listening, our forum for civilised disagreement and over the course of six conversations, trying to make sense of the battle for truth. As you’ll hear, rather than admire the problem of trust and truth in the internet age, we’ve chosen to kick off by trying to think about possible, even just partial answers. We try at least to frame the answer to perhaps the knottiest question of our times – who will set the rules for the internet?

You’re going to hear that regulating content at internet speed and scale, something that seemed impossible, is in fact doable. You’ll hear too that there’s a real, and I think worrying, prospect of the US setting the rules for the internet world-wide but rooted in America’s own political culture and preoccupations. You’ll hear what to me felt like a really useful distinction between freedom of speech and freedom of reach and you won’t have to listen too closely to hear what the architect of Facebook’s Supreme Court thinks should be its judgement in the case of Donald J. Trump.

Businesses can of course come up with the most anodyne names, particularly for the most interesting things, so Noah’s idea of Facebook Supreme Court became the Facebook Oversight Board. Yes, that’s right, our subject today is the Facebook Oversight Board, because in its own way isn’t it an admission of failure? Both government’s failure to regulate and the company’s unwillingness to do so. 

James: Poppy Wood, what did you think listening to Noah? Do you think from what you’ve seen of the Oversight Board, it has created a real court for the internet?

Poppy: I think in the absence of regulation it’s an admirable agenda but it’s an unelected upper chamber overseeing an unelected lower chamber. This is not the same as oversight and proper regulation. It all sounds very grand and powerful but the opportunity to effect change is very limited, it’s not involved in the overwhelming majority of cases, it takes a long time for cases to be referred to it, it’s not about testing actual law, it’s about testing Facebook’s own rules, and it makes the whole agenda about content rather than upstream prevention of harm and viewing these …

James: What do you mean by that, Poppy? Just explain upstream prevention.

Poppy: It becomes entirely a freedom of speech agenda rather than looking at how this content is being promoted broadly across a platform, so the Oversight Board views cases that have been referred to it in isolation and looks at whether or not it has breached Facebook’s own terms and conditions. It’s irrelevant to look at these individual pieces of content at the front end of Facebook if it can’t look at them in the broader context and get to the heart of what’s driving this content which is the algorithmic back end and that’s where I think the limits of the Oversight Board really are. It’s focused exclusively on content and not being able to get to the heart of what’s driving that which is the algorithms. 

James: It’s cases rather than systems?

Poppy: Absolutely.

James: Alexi Mostrous. Alexi, can I bring you in to do something entirely partial which is, if you like, to make the case against Facebook Supreme Court, against the Oversight Board?

Alexi Mostrous: Yes, sure. So, I’ve been thinking about this a bit and I think that their problems can be divided into three categories – problems of content, problems of structure and third, specific problems relating to Trump. So, problems of content, a couple of examples of how that works: there’s a problem of quantity, so since the Facebook Board started accepting cases in October 2020, they have had more than 150,000 cases appealed to them. How do you deal with that in practice? They’ve heard about five and Facebook are not obliged to read across decisions from one case to another so that whole idea of how one decision affects the ecosystem hasn’t really been worked out. 

Secondly, there’s obviously a problem in convening a global court to lay down moral rules that cut across multiple nationalities, so even in the testing of the Oversight Board there were major differences, as you’d expect, between what for instance was acceptable in Italy and what’s acceptable in Nigeria but over and above those kind of inherent issues around content, there are also particular problems around structure so the Board can only reverse decisions to take down content, it can’t reverse decisions made by Facebook to keep content up and a lot of people would say that’s half the problem with misinformation. So, their powers to affect content are automatically already limited in that quite important respect.

Secondly, as I mentioned, the decisions aren’t actually binding on Facebook in the same way as a Supreme Court setting down a precedent would be binding on any lower courts that came under it and third, and I think this is potentially a problem that can be ironed out, there’s a problem of transparency so is it fair to call this a court when you can go online and watch the Supreme Court making a deliberation but you can’t even know who the five members are of individual Facebook Oversight Board panels that will decide on individual cases. So, on the Trump case for instance, out of the 20 members of the Facebook Board as I understand it, five will decide that case but we don’t know who they are and we can’t go into a room while they’re making a decision, we don’t know how that works. So, the court analogy, query how accurate that is…

James: Alexi, excuse me, I’m just going to interrupt you because for all the many miracles of podcasting, the one thing you can’t quite get are people’s facial expressions on a Zoom call and if you just picked up Noah’s, you’d know that you’d want to hear his response to those things and I wonder, actually Noah, if we could do those things in turn: content, structure and Trump, separately and in particular I suppose the question about content, which is this question about scale and, to an extent, speed. Whether it is just simply impossible to regulate a platform where content is being uploaded at a speed and scale that is beyond the human, beyond just the national/local conversation.

Noah: The scale is a very grave challenge and at the moment that challenge rests not with the Oversight Board but with Facebook itself and specifically with the algorithmic tools that it uses not to choose to promote content in the first place, which I’ll come to in a moment, but with the algorithmic tools that it uses to take down content that offends terms of service or spreads disinformation or if it does other kinds of serious social harm. There, I think you have to think about it the way we think about large states and how they enforce laws. In principle, it ought to be impossible to enforce the law in a country of 300 million, certainly in a country of a billion and yet there are institutions that do a maybe not perfect but a reasonable job of doing so and yet we don’t trust that those institutions – the police, let’s say – will always do a good job, we have institutions of oversight to make sure, at least in a democracy, that they’re doing the best job that they can be given the circumstances.

Seen from that perspective, of course it will not be adequate in the long run just to have a single independent court or independent oversight board, that would not be sufficient. I think Poppy is entirely right about that but this is an experiment and this is the first step of the experiment and over time I very much hope that Facebook and other platforms will experiment with much more expanded and extended forms of devolved governance, like the structure of courts that Alexi was referring to. I will say that I do think that the Oversight Board does have binding authority on the decisions that it issues and it can then overturn any deviant decision by Facebook subsequently and in that sense, it is literally modelled on the way that courts operate with respect to precedent.

James: Just on your point about the first step, will you just pick up on Alexi’s point? You’re in the States, I’m in the UK, the truth about Facebook is that it is a US based company that operates globally. The implication of a global Oversight Board is that it answers initially, or at least in the first instance, the concerns that dominate its thinking inside the US and the point that Alexi was making was that there was something undemocratic and culturally insensitive about having something that tries to make decisions, if you like, centred in the US in cultures and public squares, country by country, region by region. So, when you mention a first step, is your thinking that we move to nation-by-nation Supreme Courts for Facebook?

Noah: Well, what I was thinking is in the first instance, national laws still apply to Facebook wherever it is, so in the UK, in France, in Germany it is already the case, there will be regulatory schemes created democratically by legitimate governments that impose standards that Facebook will be obligated by law to follow and that it will follow and that it is committed to following. It can’t operate in those countries, rule of law countries, without adhering to local law and that will include a lot of content regulation potentially, as it already does in Germany. So, if those countries decide that their people wish to impose regulatory standards that differ from those that the Oversight Board will use, those are the standards that are going to prevail, certainly in terms of taking down content. So, that’s the first point that I would make, that this is not in any way a substitute for regulation when governments wish to regulate and it’s in the charter of the Board that it can’t obligate Facebook to do anything that would make Facebook violate local law.

Now, you might have some countries that say, our only regulation is that we require all platforms to have an internal mechanism for managing these problems and cases because those countries, again democratically, would decide they don’t want to do the heavy lifting of the regulation themselves and that may be true for some countries, I think you’ll see a range of different options. 

James: I appreciate that but I just want to understand whether you think that the Oversight Board should operate nation by nation?

Noah: I think so. The Oversight Board has 20 members now, that will probably be expanded substantially over time. They are already from all over the world, they will continue to be from all over the world. I think it will be very difficult to have separate Oversight Boards for each country and I also think it would be, in my view, unwise to try to do that because the effort that Facebook has is to produce a global platform and so Facebook ought to have global principles and standards that it believes are valid no matter where it’s operating, subject to the very real constraint that different countries have the democratic right to impose their own rules and regulations when the people of those countries have made those decisions – provided, I would say, that those rules and regulations are created by a democratic state. I think it’s harder when Facebook has to face down an authoritarian or autocratic government, it’s not obvious to me that Facebook ought to comply but it is Facebook’s policy that it complies with local laws even if the government is illegitimate. I’m just speaking, as I am throughout this conversation, entirely for myself. 

James: Stephen, if you would?

Stephen: It shows how the Board is evolving. Alexi mentioned that the Board can’t comment on content that has been left up if someone complains it should be taken down; actually, today they announced, as was their plan all along, that they were starting to do that so the news is just fresh off the emails about that and I think it shows that we’re going to see evolution in this Board. 

There was one really interesting thing in the first set of decisions that people really didn’t notice: in some of the cases they ruled on, the offending piece of content had been taken down by the poster and Facebook said to the Board, ‘Don’t rule on this, it’s moot.’ The Board said, ‘Sorry, we’re not going to do that, we’re going to interpret the rule that we can rule on it anyway’ and to me that’s an indication that maybe the Board is not going to blindly respect all the lines that were drawn with Facebook in its charter. My hope for the Board is that at some point it will go rogue and address things like the algorithms that Poppy was talking about that are so critical to the way content appears on Facebook. There have been indications that some of the members of the Board are concerned about that and they actually would like to get to that issue in their rulings and suggest to Facebook that they should use different principles in their algorithms. To me, the best possible outcome would be, since the Board is independently funded, that it would say ‘Do you know what, we’re going to alter the charter to go beyond what Facebook wants us to do’. 

James: Poppy, can I just, before I go back to Noah to talk about the structure points that Alexi mentioned, I do just want to pick up on this idea that somehow the Facebook Oversight Board will be able to dovetail with nation-by-nation regulation on platforms and content and you obviously thought a bit about this within the UK government context but so far is it fair to say that governments have really struggled to figure out a way to set rules for internet operators and, if so, why?

Poppy: It’s very difficult and it has taken … yes, it is taking time but Noah, you were talking about national jurisdictions making their own content legislation, and that taking priority over decisions by the Oversight Board. If you look at what’s being cooked up in the UK and in Europe, it’s really not content regulation that’s being devised as a way of managing this problem; the best regulation is looking at risk mitigation rather than at content because …

James: Explain that, sorry Poppy, explain that. 

Poppy: So, the UK agenda, the Online Safety Agenda and what’s happening in the EU with the DSA, the Digital Services Act, focuses on platforms identifying harms before they arise and reducing their spread when they do so to avoid getting to a situation when we are talking about keeping content up or taking it down and that’s accounting for the difference between freedom of reach and freedom of speech. Facebook likes to roll out the public square analogy, it’s the public square of the internet, anyone can show up and speak but of course in a public square, you are speaking to the audience that has opted in to hear you and maybe you have a megaphone and you manage to get your message across to some passers-by but by no means by speaking in a public square do you have a right to reach the widest audience possible and I think there is some confusion about that with people when they engage online. It’s not just about Facebook, it’s about all of these platforms, that people feel they have a right to freedom of reach and that’s just not the case. 

So, the UK government and the DSA that’s being developed by the European Union is really looking at how do we reduce the spread of harmful content and really tries to avoid content being taken down where it’s legal. It helps platforms by making definitions about what’s illegal speech and what’s legal speech but then within that says, look platforms, this stuff is not very sexy, okay, this is risk assessment, this is monitoring, this is audit.

James: The risk of the spread.

Poppy: Exactly. 

James: Poppy, can I just stop you because I want to make sure I understand this right. It feels like we are then talking about two, if you like, completely different world views in terms of addressing this problem. On the one hand, you’ve got if you like the Supreme Court analogy, the Oversight Board, which case by case takes issues which themselves then are symbolic of behaviours online and provide precedent for Facebook in its behaviour and then there’s the harms outlook that says actually, what we’re worried about is the real-life impact of the outcomes and what we’re going to do is establish guard rails that try to limit the spread of damaging information and so those are two different world views: one is a regulatory one and the other one is a legal one. Noah?

Noah: I don’t think the distinction is quite as sharp as you describe, even in Poppy’s formulation. I do think that there is a question of whether speech should be prohibited altogether or whether it should be allowed but only allowed to exist in some corner of the internet and that is a distinction that one could draw but I think that the Facebook Oversight Board already, and certainly in the way that it’s likely to evolve especially in the terms that Stephen was talking about, is pretty likely in the long-run to be interested in spread, not only in whether something does or does not remain up. I see the question of whether or not it remains up as just a good first step for the Board to learn what it’s doing and to address some hard questions but over time, it’s going to have to think about the greater issue and I think here I would just note, in response to Poppy’s point, I think the listener should ask themselves, would you trust the government to decide not only what should or shouldn’t be permitted but how broadly certain ideas, which are nevertheless legal, would be allowed to spread? 

That’s a very difficult judgement and if you say something is false, there might be a reason to restrict it but what if you just think it’s misleading in some general sense but you’re not certain that it’s false. That raises really hard questions and those are the kind of questions that not only regulators but lawyers would also want to think about. You can imagine a newspaper where the government says, well, you can’t sell this newspaper because if you sell it, those ideas will spread. That doesn’t sound so good even though technically the government would be allowing one copy of the newspaper to be printed, it would hardly be the freedom of the press. 

James: But Noah, we sort of skipped past the question of scale because there’s a question here also about whether it’s doable. Is it possible, you talked about policing 300 million or a billion people, is it possible to police the volume of content online?

Noah: Yes, I mean it’s not possible to police it perfectly, it’s only possible to police it with substantial error rates and with good institutional mechanisms in place to deal with those error rates but can it be done? Yes. I mean think of it this way, material is being promoted algorithmically at that scale and so material can be down-ranked or not promoted or suppressed at that scale, just not perfectly and as the technology develops, it’s two-directional. It is technology that can promote speech and it is technology that can demote or suppress speech.

James: Poppy, you wanted to come back in? 

Poppy: Yes, certainly I see the question about over-reach by governments and is it for governments to decide what spreads at scale? But at the moment, and Noah, you went on to say this, what’s determining what spreads is algorithms and I think it’s much better to have elected governments steering regulators on the harms that play out online rather than platforms; it’s the best we’ve got and particularly in liberal democracies, it is something that governments are thinking very, very carefully about. Of course, regulation by certain state actors in this space is very worrying and there are some bad examples of regulation but if you look at the UK and the EU, they are really trying to ensure that they preserve freedom of speech as much as they can and in my mind, it is much better to have people who have really thought about this and thought about harm reduction, rather than algorithms that are trained to focus on the bottom line.

James: So, Poppy, let’s address that because in different ways you and Stephen, with his dream of a rogue Oversight Board, and Alexi with his worry about structure, all point in the same direction which is – and it is the suspicion, Noah, that to an extent the thing that you’ve created or inspired lets Facebook off the hook because in prosecuting individual issues, individual cases, you don’t go after the systemic problems – as Poppy says – with the algorithm, with anonymity, with a whole host of things that are endemic to the way in which Facebook works. Should the Supreme Court, should the Oversight Board, be able to go after the business model itself?

Noah: I haven’t noticed anybody letting Facebook off the hook. I didn’t notice it in your introduction, James, and in conversation with people that I know and people I encounter, I haven’t seen that yet and nor do I anticipate that it would be the case. One of my views about the way the world operates these days is you can’t beg, borrow or steal transparency and you can’t beg, borrow or steal legitimacy; you actually have to earn it because everybody’s watching. So, the Oversight Board will gain legitimacy if it makes good decisions; if it knuckles under to whatever Facebook wants it won’t be seen as legitimate. So, that remains to be seen and we’ll talk about the Trump decision and I think that lays out a good example. If they just rubber-stamp what Facebook did, I don’t think they’ll be seen as terribly legitimate.

That said, will the Oversight Board expand its reach? Yes, and that doesn’t have to be by going rogue, as Stephen said. It can also be that if it works well and Facebook sees that it’s gaining legitimacy, Facebook may be prepared to confer greater decision-making authority on the Oversight Board, including on front-end questions like how the algorithms work and questions that potentially implicate the business model in serious ways, because it’s in Facebook’s interest to do so. You can’t get a company that’s responsible to its shareholders to do something if it is not in the company’s self-interest. The company has a fiduciary duty to its shareholders to act in its own interests but its interests may correspond in this instance to devolution of power, including over questions of algorithmic reach, because it’s very costly to Facebook to be seen as merely interested in its bottom line. The company does not want to be seen that way and if you imagine yourself in the position of its managers and owners, what’s motivating them each morning is not making money, they have enough money – what’s motivating them is for the company to be thought well of and to be a net positive contributor to the world at a moment when many, many people clearly don’t see the company that way at all.

James: So, why don’t we take the Trump decision, because a) it’s one that we can all understand and, heaven help us, we’ve all got views on: the quote/unquote ‘indefinite ban’ that Facebook imposed on Trump following the riot on Capitol Hill or, depending on your point of view, the assault on Capitol Hill, on January 6th of this year, 2021. Stephen, why don’t you just set out, if you like, what you think is at stake in the Oversight Board’s decision on Donald Trump?

Stephen: I think in a way too much. The timing is not good, and Facebook did not do the Oversight Board a favour by kicking this essentially political question to it; it’s guaranteed to make enemies. Essentially, the issue at hand is the balance between letting a public figure express himself, in this case, and requiring a public figure to follow the policies of Facebook. In Trump’s case, Facebook had decided previously that because he was a public figure, first as a candidate and then as President of the United States, they would keep up his content even if it violated Facebook’s policies on hate speech and other things. Then they ruled that some things were so dangerous or hateful in terms of the [33:09] that they might have to tag or label them and ultimately, after the insurrection at the Capitol, they took him down entirely, indefinitely but not necessarily permanently. And by referring this decision to the Oversight Board itself, Facebook bypassed the other way the Board gets its cases, which is choosing among cases appealed by people whose content was taken down or, going forward, by people who thought content left up should have been taken off. So, whatever they decide is going to make enemies for the Oversight Board, and I think it’s unfortunate for the Board that they have to make this decision at this time.

James: Noah, give your read on the Trump case.

Noah: Well, Stephen’s right that it’s early in the life of an institution to make what will possibly be its most consequential decision for a long period of time. In a perfect world, it would have been lovely to have had six months or a year for the Board to get its feet wet and to build legitimacy, but the world doesn’t always correspond to our plans and the truth is, it would have been preposterous for the Board to exist at all if it did not have a chance to review this hugely consequential decision that the company made. So it’s going to happen; it’s going to be trial by fire.

I think Stephen is right that there will be some people angry at the Board either way, and that’s fine. The test will be whether the Board can come up with a solution that acknowledges both sides. On one side, there are real free expression issues, even for Donald Trump if I may say so. I am not a supporter of Donald Trump, I testified in front of Congress that he ought to be removed from office, I excoriate him in every way, but he nevertheless has some free expression interests and rights, and 70 million-plus people voted for him. On the other side, Facebook has rules, and Trump in at least several instances very clearly violated those rules. So, what I’m hoping is that the Board will be able to say something along the lines of ‘Listen, Facebook, if you want to de-platform somebody, anybody but especially someone so significant, you have to cross your T’s and dot your I’s; you have to be very clear: he’s violated strike one, he’s violated strike two, now comes strike three and he’s out.’ Trump has violated one or two, let’s say, but he has perhaps not violated the third, so he is reinstated, but on the condition that he continues to follow the rules and, if he fails to, he can be off and off permanently. I would like to see something clear along those lines.

James: So, two questions of process then, Noah. Question one: how do we, the public, gain trust in Facebook, the principle of free speech and political representation, all of which are at stake in this particular case, if we can’t see or hear the process of deliberation, the case on both sides? And question two: once that decision is made, how do we have confidence that it rolls out to everyone else who violates standards of truth and incitement to violence?

Noah: In constitutional courts around the world, we don’t hear the private deliberations of the judges or the justices. We hear two sides make an argument if it’s an adversarial system and we might hear a few questions being asked and then, subsequently, what we get is a written decision that articulates the different points of view and in some systems, they don’t even have an oral argument prior to the decision making, people just put in written submissions. 

So, the Oversight Board’s decision in that sense will be just as transparent as the decision of any constitutional court around the world. It will offer reasoning; it will have to acknowledge the costs and the benefits; and it will talk about Facebook’s arguments and presumably about any arguments Trump may have raised as well. Trump is free to go out in public right now and explain why he thinks he should be reinstated. Facebook has already made public statements about why they think he should be taken down, so I think the public discourse is already out there and, to that extent, it’s highly transparent.

James: Obviously this is not my Mastermind subject but, on justice being done and being seen to be done: there is a vast difference between the public discourse, the debate that’s slugged out in the media, and the debate that might be had within a courtroom, and I would have thought that someone who cares about supreme courts would want transparency in the process, the case on both sides, before you come to read the decision.

Noah: I do like the idea of publicly available briefing, but I also don’t think that the Oversight Board needs to replicate the 75-page briefs that, at least in the US, are often submitted on both sides of an issue.

James: Would you not think though that actually the 75-page briefs themselves would be part of us coming to understand what is and what is not permissible and the thinking behind that?

Noah: Yes, they’re available out there, they’re available. You could go on the Supreme Court’s website in the US and read them but nobody does.

James: No, but what I’m saying is that you can’t do that with the Oversight Board. 

Noah: I think if Trump wished, he could tomorrow release the document that he submitted, if he did submit the document to the Oversight Board explaining his views. 

James: And has Facebook submitted its document? 

Noah: Yes, I mean I think Facebook’s submission will be essentially what Facebook said publicly when it made its decision a) to take down Trump and b) to submit the issue to the Oversight Board. I think that is essentially what Facebook will be submitting; I don’t think there are any other longer submissions. We didn’t design this, as I said, to have 75-page briefs going in, although they could be posted publicly if people wished. There’s no hidden discourse going on here; the conversation will be out there. What will not be public is the internal conversation of the Board Members, in precisely the same way that it’s not public when the UK Supreme Court, the late-lamented committee of the House of Lords and now the UK Supreme Court, decides something.

James: And just to touch on that point about the waterfall, what happens if they do decide to reinstate or if they do set rules by which you are going to be de-platformed? How do those cascade to all the other people who abuse their position on any of the Facebook platforms?

Noah: Well, it’s in the Charter that Facebook agreed to that Facebook will be bound by a decision like that, not only in Donald Trump’s case but in all future cases. So if the Oversight Board says, look, these are the rules you must follow to de-platform somebody, then that will apply to you and to me as much as it applies to Donald Trump, and if Facebook doesn’t do a good job of implementing it, the Oversight Board will come back and say: you’re not doing a good job of implementing the orders that we put in place.

James: Stephen?

Stephen: It’s not my understanding that the Board is able to tell Facebook how to de-platform people in general. I thought this was just something regarding Trump, and that the Board makes a recommendation about the way Facebook goes forward but it’s up to Facebook to accept that recommendation or not. Is that not the case?

Noah: I don’t understand it that way. The way I understand it is that the Board has two kinds of powers: it can make a definitive decision in a given case based on principles and rules and then those rules are the rules that Facebook agrees to follow going forward, that’s like any court, right. The court decides a case, technically it only binds the parties in that case but the lower courts follow it because they know that if it goes back to the highest court, the highest court will strike down their decision unless they follow those rules.

Then, separately from that, the Oversight Board also has a recommendatory power where it can go beyond the facts before it and say ‘We think you should do thus and such in general’ and that is just a recommendation and Facebook doesn’t have to follow that unless it chooses to, so I see that it’s bifurcated in that sense. 

James: Poppy, can I ask you to do the difficult job, if you like, of reading across from what’s essentially a private sector innovation to the future of public regulation? I don’t know if you saw that Noah, some months ago, talked about the precedent by which private courts in certain jurisdictions preceded public courts, the one leading to the other, and I wonder, Poppy, whether you see what’s happening with the Oversight Board, with this Facebook Supreme Court, as a precursor, if you like, to some kind of public body that would have oversight not just over Facebook but over all internet platforms?

Poppy: Not really, for the UK and Europe. I think the Oversight Board is a really US-centric agenda, and the Trump decision in particular. Whatever the Board decides, it’s not going to rebuild trust in Facebook. This is a political decision: in the US you have 50% of people who think he should be reinstated and 50% who think he shouldn’t. I don’t think regulators and officials in the UK and in Europe are looking at the Facebook Oversight Board as the paragon of how to deliver this agenda. As I said, the regulators over here are much more focused on the systemic issues, on the business model, on algorithmic promotion of content and division, and all of this is just worlds apart from what’s happening at the Oversight Board.

James: And therefore you would buy, Poppy, this idea of a three-way split in internet standards: that the US goes one way for reasons of the First Amendment that Noah mentioned, that China and its internet culture go another way because the Communist Party is in charge, and that Europe goes a third way, with a systemic approach that is different?

Poppy: The best approach, of course, is a global one, but that’s never going to happen. Certainly, the platforms themselves are calling for government intervention here, and the best thing for them would be global standards, of course, but that’s kicking things far down the line, so you’ve got to go with what governments are able to do and their different approaches. I think the US will come further than it has done over the course of the next few years, certainly; the UK’s agenda started years ago and isn’t even in law yet, these things really take time. In a perfect world, of course, there would be a global standard and it would be robust, but at the moment we are looking at first-mover advantage, and it’s not just the UK and Europe: Canada, Australia, lots of nations are really trying to get to grips with this. I think the US for lots of reasons has been further behind but certainly, the First Amendment shouldn’t stop regulation of the internet in lots of forms in the US, if you look at regulation even for kids online and what can be done there, and it certainly doesn’t stop the systemic approach or the business model approach. There’s plenty that can be done, but the UK and Europe have made a head start, I’d say, and it’s looking pretty promising.

James: So, Noah, let me assume that you do get invited back to Sheryl Sandberg’s for a weekend at some point in the not-too-distant future. You get on a bicycle again and you think to yourself: right, taking on board what you’ve heard from Stephen and Alexi and Poppy, yes, okay, you can see it’s a first step; it could be a PR move, it could be a big regulatory step, it could be both. But there are issues here about the content that is addressed and the content that’s not, about whether it’s a systemic answer or a case-by-case answer and, perhaps most importantly, how it’s going to relate to government regulation as opposed to just being self-regulation. And you’re back on the bicycle thinking: right, which of those problems do we want to address next, and how do you go about it?

Noah: Well, I think they all need to be addressed over time but, as you hint, one wants to roll out experiments one at a time rather than trying to solve everything in one fell swoop. I think a very valuable next step would be to think about the point Poppy mentioned: an Oversight Board by its nature isn’t involved in most daily decisions, so an interesting question to ask would be, are there ways for Facebook to devolve power so that there’s independent and transparent decision-making, not driven by the business model, all the way through the range of important decisions?

Another important question to think about would be are there structural issues that can be addressed, not only by an Oversight Board but also by other independent entities that might be rolled out or imagined in the future that do touch on front-end algorithmic decisions and that therefore are more closely connected to the issues that European regulators are thinking about? Because again, the European regulators may end up saying some version of ‘We insist that you have a mechanism in place for considering and deciding about the spread of information that is dangerous’ rather than saying here are our precise substantive standards that we’re going to apply case in and case out. So, if they are to say that then perhaps the institutions within Facebook that make those decisions shouldn’t be the same ones that lead right up to the senior leadership, maybe they should also be independent in some meaningful sense and more transparent. 

A last issue to think about, and one that I think Mark Zuckerberg has been thinking about for many, many years, long before I ever met him, is how to get different kinds of legitimate public input into decision-making. Facebook is not just a company; it is a kind of community, a new kind of community, if that word can be applied to three billion people. It’s not a democracy, but are there ways to get public input that are meaningful, that would not fall prey to some of the problems of so-called democracy on the internet, which can lead to perverse results?

I think the punchline is that the technology of social media is unlikely to disappear, and if it’s not Facebook doing it, others will be. It will evolve, because technologies evolve. That technology has the kind of power and capacity, on both the negative side and the positive side, that it needs to be subjected to regulation: regulation that comes from governments in part, for sure, but that also comes from internal concern about ethics. What we need are institutions that would enable a better job of addressing those questions legitimately and transparently, with the decision-making further and further away from the people who just by chance turn out to be the people running the company.

James: Noah, thank you. Poppy, Stephen, Alexi, thank you. I know that we are just at the end of our time, but I wanted to say that I acknowledged when I went into this that it wasn’t exactly that my mind was made up, but I did have some clear views. What’s been quite fascinating and, for me, fun about this conversation is that a lot of them have been thrown up in the air. I went into this thinking that the real obstacle here is how you regulate the content of the internet, how you regulate at that scale and at that speed, and actually I am quite reassured that that is doable, that we’ll get to the tech that enables us to do it, imperfectly as you said, Noah, but that it’s an achievable ambition.

What becomes much more difficult, and what’s equally clear, is this tension between government and corporate, if you like the Madison/Facebook tension: we want our governments to have the powers to do things, but we don’t necessarily want them to use them. We don’t want governments to censor freedom of speech; at the same time, we’ve recognised that companies are not able, and probably don’t have the capability, to make these choices. The tension between the people who have the power but whom we don’t want to use it, and those who have the power but are not capable of administering it, is a real one, and I can understand much more clearly what that problem is. I do think, Poppy, that your point, which I hadn’t heard before, about the difference between freedom of reach and freedom of speech is hugely helpful, because it manages to defuse a huge range of issues. Freedom of speech is a political principle, a value; freedom of reach feels much more like a matter of industrial design, and you can help… go about fixing that. So, I think that is a distinction that’s helpful.

I should say two final things that really struck me. One is that I am really unsettled and unnerved by the idea, or the reality, of US dominance of online jurisdiction, perhaps a product of its dominance of online innovation. One of the consequences of a global Oversight Board is that you will have echoes of the problems in Myanmar, echoes of the WhatsApp problem in Brazil, because there’s no regional jurisdiction at this stage; it all goes up to a global Facebook Oversight Board and, as a result, the issues of Donald Trump, the issues that convulse America, are the issues that determine online behaviours. There is a US hegemony in terms of online culture that will come as a result of this, and that’s something I think we should worry about greatly. But I’m really struck, if anything, Noah, at the end by the fact that this innovation is probably going to be more important rather than less than I thought at the beginning of this conversation, partly because it’s evolving, and it’s clear that it is evolving, but partly, listening to you, Poppy, because I recognise quite how difficult it is going to be to introduce really effective government oversight of what happens on the internet, and so these innovations in self-regulation are going to matter more.

The question that I’m left with is, to use your phrase, Noah, how you police, how you enforce, that quote/unquote ‘internal concern about ethics’, because there is a personal element of responsibility here for what happens online and what happens on platforms, and it can’t just be up to the judgement of the people who run these big businesses to make those calls.

So, I wanted to say thank you, this is a massive subject and weirdly, we’ve done better in getting our arms around it than I’d thought we might and that’s only thanks to you. A big thank you, Noah Feldman, Poppy Wood, Stephen Levy and Alexi Mostrous, a huge thank you to you all. Thanks very much. 

As you’ll know, this ThinkIn came out of a series that we held in our newsroom at Tortoise, a series of ThinkIns on the battle for truth. We hold ThinkIns every day at Tortoise; you can easily book them in the app, and our thinking, our journalism, is better informed by your involvement and engagement. I really appreciate how much our members are helping us get a better, deeper, more thoughtful understanding of the world and the times that we’re in. Look forward to seeing you at a ThinkIn soon.

Thank you for listening to The Battle for Truth. I’m James Harding, my producer is Katie Gunning, Tom Kinsella wrote the original music, and this is a podcast from Tortoise Studios, which is run by Ceri Thomas.

James: Have you met each other before?

Poppy: No, we haven’t. 

Noah: No, it’s a great pleasure to meet.

Poppy: Likewise. When I was in government, I was working for the coalition; Nick Clegg was in office when I was in Number 10. I haven’t seen him for a while, but he is now your colleague. He used to be mine.

Noah: Yes, he very much is. 

James: Noah, can I just check one accuracy point. I know we said, or Alexi said, that it’s not clear who the five members of the Oversight Board are who are adjudicating on the Trump case. I thought I saw your brow furrow at that point; is that not right?

Noah: Yes, well, I was responding partly to his transparency point but, to be precise: the five judges, or Board Members, will make a decision and then show it to the other 15, and if the 15 want to overrule the five, they can. So all of the Board Members, all 20 of them, have input into what will ultimately be said. The Board Members can choose to sign the opinion or not, and they can choose to disclose who worked on which part of it, or they might choose not to.

James: Are there such things as dissenting opinions? Can you dissent?

Noah: Yes, dissenting opinions are allowed, yes. 

James: Fantastic. Okay, thank you all, have a very good day.

Noah: Okay, take care. 

Poppy: Thank you very much everyone, speak soon. 

Next in this file

Trump, Twitter and the Battle for Truth

You will never again see a Tweet from Donald Trump. He’s been banned from Twitter for life without the right to appeal. The world feels quieter, but is it better? The power to hand out megaphones in our democracies, and the power to take them away, now rests in a tiny number of hands