Friday 16 October 2020
Mark Zuckerberg’s choice to ban Holocaust denial on Facebook is undoubtedly the right one. But who adjudicates such decisions in parts of the world where he doesn’t know or understand the issues? What about the untruths and conspiracy theories that don’t cross his radar?
I should give you some context before I let loose. When I started out as a reporter, an editor at the Financial Times warned me that managing journalists wouldn’t be easy. As a colleague put it: “Reporters are like puppies; if you don’t give them plenty of exercise, they shit on the carpet.” Well, we had an example of life imitating journalism this week, when I brought my dog, Scout, into the office. She didn’t soil the newsroom, but I have to confess, she was a nuisance – excitable and disruptive. She ate Emma Sullivan’s lunch in the long-term editorial meeting; she chewed through a few soft furnishings too. She just couldn’t seem to sit still, and when people had more than enough on, she was another distraction.
By the time I got home, I was tired and embarrassed, and so I sent one of those apologetic notes – the editor’s dog had come to the office and everyone had to say how cute it was, when in fact it was just really annoying. No one replied. And yes, I knew that the editor’s dog was a new sin in the newsroom: it had been as annoying as I feared.
I tell you that because it gives you a little background to what happened next. It was late by then, so I settled into watching The Social Dilemma on Netflix, and while it didn’t say anything I hadn’t heard, read or thought about before, it wound me up.
Worse, I found myself scrolling through news, messages and emails even while I was watching The Social Dilemma. Proof, I suppose, of their point about how modern tech is corroding our minds.
And as I was doing that, I read this. “I’ve struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust.” It was written by Mark Zuckerberg on his blog, an announcement that, having previously accepted people’s right to publish views he disagrees with, even Holocaust denial, he had changed his mind; it would no longer be allowed on Facebook’s platforms.
I am Jewish; my family fled Germany in the 1930s; and I know that Holocaust denial is an exceptional example of where racism meets lies. So his decision resonates with me.
But, having been responsible for BBC News and, therefore, information output the world over, I know it will also resonate with people who care as deeply about other genocides, injustices and suffering. It’s not just people alarmed and endangered by misinformation about the Armenian or Rwandan genocides; it’s a long list of contested atrocities and poisonous conspiracy theories. And they will think that Mark has taken on the responsibility not just of publisher of what’s on Facebook, but editor-in-chief. Someone had to. It can’t keep putting misinformation into the public square and telling us it’s just a platform, not responsible for what’s said on it.
But did Facebook really want him to be the person who sits in judgment of which untruths are considered sufficiently dangerous and corrosive to be banned from its platform? On what basis does he balance the arguments of free speech vs lies, misinformation and hatred? Who adjudicates such decisions in other parts of the world where he doesn’t know or understand the issues? And what’s the system for appealing the publication of those things that don’t cross his radar?
If you ask Facebook, they’ll say that Holocaust denial is not really the acid test of whether it’s acting like a publisher or editor. It’s a special case, atypical when it comes to genocide denial. Anyway, they’d say, they’re a private company, they’re entitled to say there are certain things which users can’t post (nudity, incitement to violence etc). More than that, they say they don’t commission content, so it’s not fair to think of them as publishers. They live in a new part of the public square, somewhere between publisher and utility and, as they’ve said, they’re asking governments to step in and regulate them.
Mark Zuckerberg’s decision on Holocaust denial is undoubtedly the right one. How on earth could he want to allow Holocaust denial on free speech grounds and be so laissez-faire that he finds himself presiding over a business of the mind that stokes antisemitism, tramples over the truth and tears at society? But the longer they pick and choose what content they edit and then wait, I’d say rather cynically, for politicians to come to some agreement on their oversight, the more they lose trust and respect.
The more people are persuaded that they should be broken up. The more national governments will – or should – choose to impose their own national requirements on content liability, on third-party challenges, on open access to their data and, yes, on personal penalties for executive leadership that fails to meet public standards. This may sound to you like a stretch but, remember, within a matter of months of the financial crisis, the equivalent oversight, sanctions and penalties were put in place on the banks and bankers. Now, we are in the midst of an information crisis.
Facebook still behaves like a US company that happens to have global operations. But for all of us outside America, all of us who live on the platforms it has created, it’s a global company headquartered in the States. If Mark Zuckerberg has made the judgment that Facebook must stop publishing things that are dangerous and untrue, then it can’t just be the ones that concern him or make headlines in America. Zuckerberg’s decision on Holocaust denial crosses a line in terms of responsibility for editorial content. It is a recognition, and a welcome one, by Facebook’s founder of the requirement on the company to weigh truth against free speech. And as he well knows, it doesn’t stop with Holocaust denial. Nor should it.