
Thursday 9 May 2019


Data trail

  • ‘You need to see it to believe it,’ a British security official said of Palantir’s software
  • The tech giant’s systems, which analyse vast databases of public and private information, were initially used by intelligence and security services
  • Police are now using the same technology to monitor citizens and predict crime. The failures of accountability are glaring

By Jim Giles

Palantir Technologies has attracted so much negative attention that its press department seems to have given up. When I called, they sneered at the idea that I would write anything other than a hit piece. Then they stopped answering emails. “I’m yet to see an article on Palantir that is good,” one former employee told me.

It’s easy to see why they feel this way. Palantir, which specialises in data analysis, has earned something of a reputation for helping police departments and intelligence agencies to amass an unprecedented trove of data on US citizens, many of whom have never committed an offence. Then there is Peter Thiel, the billionaire who co-founded the company. Thiel’s greatest hits include lamenting that his preferred political ideology – libertarianism – has been hampered by “the vast increase in welfare beneficiaries and the extension of the franchise to women”.

Venture capitalist Peter Thiel, co-founder of Palantir

Palantir has fostered a culture of secrecy. After Thiel hired Alex Karp, a friend from law school, to be the company’s CEO, Karp set about building a company that could do business with the military and intelligence services, which meant hiring people from those worlds. The company is named after the palantíri, the far-seeing stones in Tolkien’s The Lord of the Rings, and teams were given military-sounding code names. “At times I would roll my eyes at these 22-year-old Stanford kids who would call themselves deltas and echos,” says a former Palantir engineer.

Inconveniently for a secretive company, Palantir is widely reported to be going public. One of Silicon Valley’s biggest private companies – a so-called unicorn – is working on a stock market listing later this year. It would not be the first. But, after the listings of the better-known Zoom, Lyft and Uber, Palantir represents much more than just another West Coast leviathan going public, more than just a small group of people making fortunes on tech.

Palantir sits at the intersection of private data, big business and public policy. It epitomises the confused libertarian ideology of the West Coast, which prizes individual freedoms and shuns government, but nonetheless is tooling up the state with super-charged surveillance powers. Its collection and analysis of our information poses awkward questions. Who owns the data? Who ends up more powerful, Palantir or its clients? Who holds the likes of Palantir to account?

Part of the reason for the focus on Palantir is surely Thiel himself. His prescient investments have won him respect and influence (none more so than his decision to put $500,000 into Facebook in 2004, gaining a stake that he would later sell for more than a billion dollars). He is also widely disliked. In 2016, he apologised for comments, made in a book he wrote 20 years previously, in which he speculated that “a multicultural rape charge may indicate nothing more than belated regret”.

Thiel backed Donald Trump’s presidential campaign the same year, donating $1.25 million of his own money, and has said that he’ll support Trump again in 2020. He bankrolled the lawsuit that shut down Gawker, a blog that earned the billionaire’s anger after identifying him as gay. Gawker ran some tawdry pieces, but it was also known for smart social commentary. Thiel’s vengeance looked to many like a wealthy man abusing his power.

Perhaps Palantir’s biggest problem is that it is too convenient a villain. “I think all these concerns are valid and true, but Palantir has become the bogeyman that we project all our anxieties about technologically mediated surveillance on to,” says Sarah Brayne, a sociologist at the University of Texas at Austin, who has studied the company’s software. Palantir did not invent policing’s privacy invasions and it is not alone in profiting from them. In fact, it may take privacy more seriously than its rivals. The question is whether, by focusing on Palantir, we give a free pass to something much more concerning – a network of companies and elected officials that uses taxpayers’ money to spy on Americans.

Palantir was founded around 15 years ago. Computing costs were tumbling and digital devices shrinking, allowing organisations of all stripes to amass increasingly large stores of data. Thiel was one of the first to see a connection between big data and public security. In 2002, at the age of 34, he had made $55 million when PayPal, another company he helped create, was sold to eBay. At PayPal, engineers had developed techniques for identifying the fingerprints of fraud in the data that flowed through the company’s servers. Thiel thought that similar methods could be used to catch bad guys in the real world, and founded Palantir the following year. In-Q-Tel, the investment arm of the CIA, was an early investor, and the Department of Defense an early customer.

Part of what Palantir provided was simply better digital plumbing. Assembling data in a single interface was a significant challenge. The intelligence services collect vast amounts of information, from internet browsing histories to mobile phone logs. The data is often held in different places and stored in different formats. What’s more, intelligence analysts also make use of “unstructured data” – facts contained in news articles, for example, rather than in a spreadsheet. Once the problem of collating all this information was solved, Palantir’s engineers had to build tools that allowed analysts to quickly and accurately spot useful patterns in the mass of data.
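The integration problem described above can be illustrated with a toy sketch. Nothing here reflects Palantir’s actual code or schemas – the sources, field names and the minimal shared record format are all invented – but it shows the basic move: mapping differently structured records onto one common shape so they can be searched together.

```python
# Toy records from two differently structured sources
phone_logs = [{"caller": "555-0101", "callee": "555-0199", "ts": "2019-01-05T10:00"}]
news_items = [{"headline": "Suspect seen downtown", "date": "2019-01-05"}]

def normalise(record, source):
    """Map heterogeneous records onto one minimal shared schema."""
    if source == "phone":
        return {"kind": "call", "when": record["ts"],
                "entities": [record["caller"], record["callee"]]}
    if source == "news":
        # Unstructured data keeps its raw text for later searching
        return {"kind": "article", "when": record["date"],
                "entities": [], "text": record["headline"]}
    raise ValueError(f"unknown source: {source}")

unified = [normalise(r, "phone") for r in phone_logs] + \
          [normalise(r, "news") for r in news_items]
print(unified)
```

The hard part in practice is not this mapping but doing it at scale, across hundreds of formats, without losing the provenance of each record.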

Palantir has said little about exactly how they solved these challenges, but documents obtained by The Intercept show that potential clients were impressed. “You need to see it to believe it,” GCHQ officials wrote in 2008 after watching a demo of Palantir’s software. A few years later, after apparently becoming a Palantir customer, a GCHQ presenter told officials from the Five Eyes intelligence-sharing alliance – the UK, US, Australia, New Zealand and Canada – that integrating Palantir into GCHQ operations led to “faster analytics” and other “unexpected benefits”.

“Palantir has courted a bad-boy image,” says Andrew Ferguson of the University of the District of Columbia, who studies law enforcement’s use of big data. The company could point, for instance, to prestigious clients without having to reveal exactly what it did for them.

As it goes public, don’t people have a right to know more about what it does for whom?

In 2011, the Los Angeles Police Department began using Palantir software. The following year, Sarah Brayne, the University of Texas sociologist, moved to LA, where she spent two and a half years observing the use of the new technology. Her research reveals a department that is generally happy with its purchase. Police told her about making an arrest after using Palantir to find all cars seen close to a series of robberies at around the time the crimes took place. Another time, police broke a case after matching data on a car spotted in the vicinity of a gang murder with records of gang membership.

A city map of Atlanta showing potential hotspots using a predictive crime algorithm

At the heart of these breakthroughs was Palantir’s ability to integrate previously siloed data. The data on the car movements, for example, comes from the city’s Automatic Licence Plate Reader (ALPR) system, a network of cameras mounted on buildings, street lights, and police cars. The cameras capture images of passing cars; computer vision software is then used to record the vehicles’ make, model, colour, and licence number. The LAPD won’t say how many cameras it has in the network, but reports suggest that the network is useful enough for police to search the data hundreds of times a day.
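The kind of query Brayne’s interviewees described – find every car seen near a series of robberies around the time they happened – amounts to a join between ALPR sightings and crime reports. This purely illustrative sketch uses invented plates, locations and times; a real system would match on geographic distance rather than exact camera location.

```python
from datetime import datetime, timedelta

# Toy ALPR sightings: (plate, camera location, timestamp)
sightings = [
    ("7ABC123", "Main & 5th", datetime(2019, 3, 1, 21, 40)),
    ("7ABC123", "Oak & 9th", datetime(2019, 3, 8, 22, 5)),
    ("4XYZ987", "Main & 5th", datetime(2019, 3, 1, 9, 15)),
]

# Toy robbery reports: (location, timestamp)
robberies = [
    ("Main & 5th", datetime(2019, 3, 1, 21, 30)),
    ("Oak & 9th", datetime(2019, 3, 8, 22, 0)),
]

window = timedelta(minutes=30)

def plates_near_crimes(sightings, robberies, window):
    """Count, per plate, how many robberies it was seen near in time and place."""
    counts = {}
    for plate, cam, seen_at in sightings:
        for loc, occurred_at in robberies:
            if cam == loc and abs(seen_at - occurred_at) <= window:
                counts[plate] = counts.get(plate, 0) + 1
    return counts

print(plates_near_crimes(sightings, robberies, window))
# A plate that recurs across several scenes stands out: {'7ABC123': 2}
```

A plate near one crime scene means little; a plate near several is what turns bulk camera data into a lead.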

Thiel and colleagues raised funding of $444 million in September 2014 and another $880 million little more than a year later. Too little is known about Palantir’s revenues to declare the company a success, but it does seem to have ridden out its bumps and successfully expanded beyond intelligence and security into the corporate sector. Writing in January, one analyst predicted that 70 per cent of Palantir’s 2019 revenue would come from non-government contracts. The Wall Street Journal estimates that the company’s revenue grew by more than a third in 2018, to $880 million. Earlier this year, Palantir landed a single contract – to supply the US army with battlefield intelligence software – reported to be worth $800 million or more.

Concern about the company’s activities has grown in tandem with its income. Palantir’s software offers users an exceptional degree of oversight, which, if not properly controlled, can easily be abused. According to Bloomberg Businessweek, JPMorgan scaled back its Palantir deployment after a rogue security official used the software to spy on company executives.

Palantir is often portrayed as driving forward the expansion of the surveillance state. But it’s actually only one player in the network of private and public organisations that help law enforcement to collect and analyse huge data sets. IBM has a longstanding relationship with the New York Police Department, which includes providing the city with predictive policing technology. Amazon provides its face recognition software, known as Rekognition, to police in at least two US states.

Palantir seems to take privacy seriously. Thiel and Karp have talked about the need to build sophisticated privacy controls into their software, and they appear to have delivered on that promise. In 2012, Palantir established a panel of well-respected experts to advise the company on privacy and civil liberties. Chris Hoofnagle, director of the Law and Technology programme at the University of California, Berkeley, is on the panel. “I work with Palantir because the company has developed the best-in-class privacy by design that I’ve ever seen,” he says.

Peter Thiel with Donald Trump in 2016

In the 1970s and 1980s, policing was largely reactive. Think about the TV cop dramas of the time: a radio crackles with an alert, tyres squeal, sirens wail, the cuffs come out. Around 25 years ago, senior figures in US policing began to argue that they could do a better job of reducing crime rates if officers were able to snuff out crime before it happened. And so they started investing in what became known as predictive policing. The idea is to run historic crime data through an algorithm that estimates the probability of a future crime occurring at a given time and place, so that the police can concentrate resources there.
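In its simplest form, the prediction described above is just a frequency estimate over historic records. This toy sketch (invented grid cells and incident counts; real systems use far more elaborate models) scores each place-and-time slot by its share of past crimes and sends patrols to the highest scorer.

```python
from collections import Counter

# Toy historic crime records: (grid cell, hour of day)
history = [
    ("cell_12", 22), ("cell_12", 23), ("cell_12", 22),
    ("cell_07", 14), ("cell_03", 22),
]

def hotspot_scores(history):
    """Estimate relative risk per (cell, hour) as each slot's share of past incidents."""
    counts = Counter(history)
    total = sum(counts.values())
    return {key: n / total for key, n in counts.items()}

scores = hotspot_scores(history)
# The highest-scoring slot gets patrol priority
best = max(scores, key=scores.get)
print(best, scores[best])  # ('cell_12', 22) 0.4
```

Note what the model actually learns: not where crime happens, but where crime was recorded – a distinction that matters later in this piece.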

Predictive policing relies on records of past crimes, and police departments that adopted the approach were incentivised to collect and store more and more data. Technological advances had a similar effect. The fall in price of video equipment, together with progress in computer vision, sparked the proliferation of ALPR networks and led to a huge increase in data on vehicle movements. The public helped, too, by voluntarily sharing stories and photos on social media. Police departments started off wanting to cut crime, but in the process they got into the big data business.

Police monitor security cameras and licence plate readers in New York City

This has changed policing in ways we are only beginning to appreciate. Compare law enforcement with intelligence work. One difference is that spies get more leeway. They face less interference from government and the courts. They are less likely to have to justify their methods to the public. Americans are generally OK with this, because the targets are both foreign and, often, genuinely scary. If the CIA is tracking a member of Isis who is planning a terrorist attack in the US, few people are going to care about the suspect’s privacy.

The rise of big data policing has led local law enforcement to adopt the same kind of thinking, but without the national security justifications. Police now routinely amass large amounts of data on people who haven’t been suspected of a crime, let alone had a warrant issued against them. US police can track their citizens’ movements, their bills, their drunken nights out, all with minimal oversight. The police have become spies, and Americans have become their targets.

This power can easily be misused. Take the case of the LA Police Department’s Chronic Offender Programme, which was part of a plan to focus police effort on gang members and individuals responsible for violent crime. Candidates for the programme were given a point score based on the information held on them in the LAPD’s Palantir system, which includes details of gang membership and prior arrests. Details of individuals with the highest scores were distributed to local police branches, whose strategies for dealing with these targets included “conducting door knocks and advising the offender of available programmes and services designed to reduce the risk of recidivism”.

The result, says Ferguson, provides an “object lesson” in the dangers of data-based policing. In the LAPD programme, a person can earn a point just by being stopped by the police in the street, regardless of whether they committed a crime or were even suspected of one. “It creates a self-fulfilling prophecy,” says Ferguson. If you’re stopped by police, you get a point, which makes it more likely that you’ll be stopped again. Perhaps as a result, an audit of the scheme, published in March of this year, found that almost half of those in the programme had no gun crime arrests. Los Angeles Police Chief Michel Moore responded to the audit by announcing plans to scrap the programme.
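Ferguson’s feedback loop is easy to demonstrate. In this hypothetical simulation (the names, starting scores and stop rule are all invented, not the LAPD’s actual formula), each police stop lands on someone with probability proportional to their current score, and every stop adds a point – so an initial gap widens on its own.

```python
import random

random.seed(0)

# Two hypothetical residents start with different point totals
points = {"person_a": 5, "person_b": 1}

def simulate_stops(points, rounds=100):
    """Each round, one stop lands on someone with probability proportional
    to their current score - and every stop adds another point."""
    pts = dict(points)
    for _ in range(rounds):
        names = list(pts)
        weights = [pts[n] for n in names]
        stopped = random.choices(names, weights=weights)[0]
        pts[stopped] += 1  # being stopped itself raises the score
    return pts

print(simulate_stops(points))
```

Run it and person_a, who started with more points for whatever reason, accumulates most of the new ones: the score predicts stops largely because it causes them.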

Those problems could plausibly be blamed on sloppy administration, but there is evidence of systematic failures. Earlier this year, researchers at New York University identified nine cases in which US police using predictive policing systems, including Palantir’s, had fed in historical data known to be corrupted by falsified reports, planted evidence and other forms of police misconduct. Communities of colour were often the target of this bias, raising fears that the supposedly scientific predictions made by the algorithms would also be biased against those communities.

“If law enforcement is pervaded by bias, the data that goes into secret algorithms will technologically launder those existing biases,” says Shahid Buttar at the Electronic Frontier Foundation, a civil liberties group based in San Francisco.

Palantir allows LAPD users to search across data sources for persons of interest

In conversations with LA police, Brayne learned that an extraordinary collection of data was being ingested into the city’s Palantir system. Its sources include records from debt collection agencies, social media, house repossession reports, road tolls and utility bills. LAPD staff also said that they were working on adding data from hospitals, parking lots and even calls made to Pizza Hut. “I’m so happy with how big Palantir got,” an LAPD captain told Brayne. “I mean it’s just every time I see the entry screen where you log on there’s another icon about another database that’s been added.”

As Brayne points out, the LAPD now owns huge amounts of data that many city residents would consider irrelevant, private, or both. “Quotidian activities are being codified by law enforcement organisations,” she writes. In addition to the ALPR data, the police are able to enter information about a suspect’s romantic partners and roommates. Which means that if you live in LA and your ex had a run-in with the law, you may come up in the LAPD’s Palantir system. In fact, given the rate at which the system is hoovering up data, it’s possible that if you live in LA you’re already in the system, regardless of who you know or what you’ve done.
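The mechanism Brayne describes – ending up in the system because of who you know – is a graph traversal. This toy sketch (entirely invented names and links) shows how everyone reachable from a recorded individual via relationship edges gets indexed, whatever their own record says.

```python
# Toy entity graph: person -> people linked to them in police records
links = {
    "suspect_x": ["ex_partner_y", "roommate_z"],
}

def people_in_system(seed_people, links):
    """Collect everyone reachable from recorded individuals via relationship edges."""
    seen = set()
    frontier = list(seed_people)
    while frontier:
        person = frontier.pop()
        if person in seen:
            continue
        seen.add(person)
        frontier.extend(links.get(person, []))
    return seen

print(sorted(people_in_system(["suspect_x"], links)))
# ['ex_partner_y', 'roommate_z', 'suspect_x']
```

One recorded suspect brings in two people with no records of their own – and each new data source adds more edges for the traversal to follow.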

“The technological capacity for surveillance is way outpacing laws governing the use of data,” says Brayne. “It’s a moment of institutional failure.”


All photographs Getty Images


What’s next?

No date for a Palantir public offering has been set but analysts say the market could value the company at over $40 billion – more than twice the level of most valuations before its battlefield intelligence contract with the Pentagon was announced in March.

There is no suggestion that Peter Thiel’s public support for President Trump had any bearing on that decision, in which Palantir was preferred over the established defence giant Raytheon.

This article was amended at 18:12 on 9 May to clarify that ownership of data analysed by Palantir systems remains with police and intelligence agencies.

Further reading

Peter Thiel’s radical thinking and iconoclastic style are on show in Zero to One (Currency, 2014), a highly readable memoir-cum-manual on how to start a business and why not to invest in a modern American university education.

Worries about predictive policing in the US were reported in an FT story last month, which was in turn based partly on a report by the Partnership on AI. Note that the Partnership draws on expertise from many Silicon Valley tech giants – but not Palantir.

Need proof that privacy is the Valley’s new religion? Check out Google’s CEO in the New York Times.