The most significant moments in Apple’s history have mostly been product launches – from the Macintosh 128K saying “hello”, to the audience whooping as Steve Jobs announced a “revolutionary mobile phone” in 2007. But the most significant moment of Tim Cook’s leadership belongs in an entirely different category – and may be the most significant of them all.
It started with an atrocity. On 2 December 2015, Syed Rizwan Farook, an inspector for the San Bernardino County Department of Public Health in California, and his wife, Tashfeen Malik, fired more than 100 bullets into the crowd at a Christmas party. Fourteen people died and 22 were seriously injured. Farook and Malik were killed in a shootout with the police. They had been motivated, at least in part, by Islamist terror groups and the terrible catechisms they found on the internet.
The FBI began investigating the attack, but soon ran into a problem: they couldn’t unlock Farook’s iPhone, which had been issued to him by the Public Health Department, and which they wanted to search for evidence. The phone was locked by one of Apple’s standard security features, a four-digit passcode – and if the wrong passcode were entered ten times, all the data on the phone would be erased. The FBI could not break into this heavily encrypted system.
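To see why that limit mattered: a four-digit passcode has only 10,000 possible combinations, so it is the ten-attempt wipe, not the size of the key space, that defeats brute force. Here is a minimal sketch of that logic in Swift – purely illustrative, since Apple’s real version runs inside the phone’s dedicated security hardware and adds escalating delays between attempts; the type and its names are invented for this example:

```swift
// Illustrative sketch of a passcode lockout with an erase threshold.
// Not Apple's implementation – just the behaviour described above.
struct PasscodeLock {
    private let correctPasscode: String
    private var failedAttempts = 0
    private(set) var isWiped = false
    private let wipeThreshold = 10   // ten straight failures trigger a wipe

    init(passcode: String) { self.correctPasscode = passcode }

    /// Returns true on a correct guess; resets the failure counter.
    mutating func attempt(_ guess: String) -> Bool {
        guard !isWiped else { return false }   // nothing left to unlock
        if guess == correctPasscode {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        if failedAttempts >= wipeThreshold {
            isWiped = true   // stands in for destroying the data's encryption keys
        }
        return false
    }
}
```

With only ten guesses against 10,000 possibilities, a blind brute-force attempt has a 0.1% chance of success before the phone destroys its own contents – which is why the FBI wanted the limit switched off rather than trying its luck.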
Thus began a very public argument between the law enforcement agency and the tech giant. The FBI asked Apple to make new software for the phone – a “backdoor” key – so that the encryption could be overridden. Apple refused, on the grounds that the security of its devices was sacrosanct. The FBI then went from asking to demanding, through court orders. Still Apple refused. After a couple of months, the FBI paid an outside company – or perhaps one or more freelance hackers – to access the phone’s digital innards.
There are complicating addenda to this story; it wasn’t entirely iConoclasts versus the Man. One of the court motions against Apple, dated 19 February 2016, revealed that some of the company’s engineers had met with the FBI to discuss other methods of accessing the phone’s contents.
The most promising of these alternatives – an automatic backup of Farook’s most recent iPhone data to his iCloud account, from which it could then be retrieved – had been closed off because the Public Health Department had already changed the password to his iCloud account without speaking to Apple. Once the password was changed, the backup couldn’t happen without the four-digit passcode to his phone – and that, of course, is where Apple drew the line. They were okay with the iCloud data being accessed, in part because it was a work account and the password was known. They were not okay with coding their way around an iPhone’s security features.
In an open letter to customers, Tim Cook observed at the time: “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.” It was and remains important, for his company’s interests, that the encryption of those devices is seen as uncrackable.
This was especially true after the release of iOS 8 in 2014, over a year before the San Bernardino attacks. That was the operating system that included, for the first time, Apple’s Health and Pay apps – and the encryption built into it was strengthened accordingly. Even today, the “Legal Process Guidelines” that Apple has produced for “Government & Law Enforcement within the United States” are clear: for devices on iOS 4 to iOS 7, Apple may be able to extract data upon legal request; whereas for devices on iOS 8 or later, “Apple is unable to perform an iOS device data extraction as the data typically sought by law enforcement is encrypted, and Apple does not possess the encryption key”. Farook’s iPhone was on iOS 9.
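Why doesn’t Apple possess the key? Because, since iOS 8, the key protecting a phone’s files is derived on the device itself, by mathematically entangling the user’s passcode with a unique hardware key that never leaves the phone. The Swift sketch below illustrates the principle – it is a loose rendering of what Apple’s security documentation describes, not the real implementation, and the key and label names are invented:

```swift
import CryptoKit
import Foundation

// Stand-in for the per-device hardware key (the "UID"). On a real iPhone
// this is fused into the silicon and cannot be read out by any software,
// Apple's included – here it is just a random key for illustration.
let deviceUID = SymmetricKey(size: .bits256)

// Derive a file-encryption key by entangling the passcode with the
// hardware key. Because the hardware key is unreadable, this derivation
// can only ever run on the phone itself – so Apple has nothing to hand
// over, and every passcode guess must be made on the device.
func fileEncryptionKey(passcode: String) -> SymmetricKey {
    HKDF<SHA256>.deriveKey(
        inputKeyMaterial: deviceUID,
        salt: Data("file-keys".utf8),   // hypothetical label
        info: Data(passcode.utf8),
        outputByteCount: 32             // a 256-bit key
    )
}
```

The upshot: even with a complete copy of the phone’s flash storage, an attacker – or Apple – still lacks the hardware half of the recipe, which is why every guess has to run on the phone itself, under the ten-attempt limit.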
But there was still something brave about Cook’s stance – and risky. The San Bernardino case came after the Edward Snowden revelations, so there was already a lingering unease about the state and its access to people’s data; but it came some time before the Cambridge Analytica scandal, which made privacy an urgent question for all the tech giants. Apple’s CEO was opening up a very big question about the relationship between a company and a country: who makes the law? And he was doing so over a terrorist’s phone. This was not a sure-fire bet.
Did it pay off? Well, Apple’s value has doubled in the years since. And they have been able to distinguish themselves from other tech giants, such as Facebook and Google, who rely on a certain looseness with data so that it can be repurposed or perhaps even sold on. Apple’s devices are instead marketed as walled gardens: pretty places where you’re free from intrusion. As part of a financial filing made in late 2017, the company codified six “values” – one of which was that “privacy is a fundamental human right”.