
Tuesday 22 January 2019

Trapped in the matrix

Young people with no record of offending can find themselves on a police database without knowing why or how to get off it

By Tom Goulding

Kavon Kelly gets up around 7.30am. His mother, Camille, drives him from their home in Leytonstone, East London, along the A12 that stretches into Essex, past Romford and on to Barking and Dagenham College, where he is studying to be an electrician.

Kavon has classes all morning, then heads to the canteen at lunchtime. Many of the 12,000 or so students doing apprenticeships and technical training at the college walk to Dagenham Road during their break. Kavon doesn’t leave the canteen. He doesn’t leave the college buildings once during the day.

When classes finish at 3pm, Camille waits in her black Audi outside the college to pick him up. They drive back to Leytonstone and from then until bedtime, Kavon doesn’t do much. “I just stay at home,” he says, and goes to bed around 10pm.

Weekends are similar. Occasionally he visits his uncles or gets a haircut. Apart from that, he doesn’t go outside the house. “I know if I go out, the police will stop me, try to talk to me and search me,” he says. “I can’t go anywhere.”

Throughout his teenage years, Kavon, now 20, noticed that the police took a keen, often inexplicable, interest in him. Heavily built, 6ft 2in and black, Kavon has autism and finds stressful or physical situations difficult. Camille often gets him a cab home from visits to family so he can avoid getting into difficulties on the street. “I’m constantly on edge when he’s out the house,” she says.

The family has learnt that since he was 13, Kavon has been on the Gangs Matrix, a Metropolitan Police database of about 4,000 mostly young, mostly black individuals in London, tracked and ranked as likely perpetrators of gang-related violence. The database was formally established in response to the 2011 London riots, although surveillance of gang-suspected youths has gone on a lot longer.

Kavon Kelly (Photo: Tom Pilston)

The matrix is one example of the growing use of data-driven algorithms and artificial intelligence by UK police forces – uses that have so far been legally messy and largely unaccountable, with little reliable evidence that they actually help policing. While police use of big data may be inevitable, shadowy algorithms, whose processes remain a mystery to most officers themselves, now have decisive power to determine whether certain people are destined for a cycle of surveillance, arrests and court cases.

The Gangs Matrix works as follows: police gather intelligence about a person from previous offences, tip-offs, tracking their behaviour, social media activity and their friends. They feed this into an algorithm – a mathematical formula – in this case written by consultants Accenture, which produces a “risk score” of red (the highest), amber or green (the lowest) for the likelihood of engaging in gang violence.
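Neither the Met nor Accenture has published the formula. Purely as an illustration of how a weighted scoring system of this kind could work – the inputs, weights and thresholds below are invented, not the matrix's real criteria – a red/amber/green banding might look like this:

```python
# Hypothetical sketch only: the real Gangs Matrix formula is not public,
# so every field name, weight and threshold here is an invented assumption.

def risk_band(intel: dict) -> str:
    """Combine weighted intelligence inputs into a red/amber/green band."""
    weights = {
        "previous_offences": 5.0,     # recorded offences
        "intelligence_reports": 3.0,  # tip-offs and officer reports
        "social_media_flags": 1.5,    # posts flagged as gang-related
        "known_associates": 1.0,      # contacts already on the database
    }
    score = sum(weights[field] * intel.get(field, 0) for field in weights)
    if score >= 20:
        return "red"
    if score >= 5:
        return "amber"
    return "green"

# No offences and no intelligence, but two flagged posts and three
# associates is already enough to clear the amber threshold here.
print(risk_band({"social_media_flags": 2, "known_associates": 3}))  # amber
```

Even this toy version shows how someone with no offences at all could land on amber through associations alone – which is precisely the pattern the 2017 figures below suggest.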

More than 1,500 individuals on the matrix had a "harm score" of zero, police revealed in 2017, meaning they had never been charged with an offence and no police intelligence linked them to gang violence. That is about 40 per cent of everyone on the matrix. Yet no one on it has a right to find out that they are there, even though the matrix shapes police monitoring of those on it, informing local patrols and contributing to court cases.

How UK Police are using big data

West Midlands – predictive intervention

The National Data Analytics Solution (NDAS) aims to prompt interventions for people deemed likely to commit a violent crime, especially those with mental health difficulties, to offer them support from social workers or a health programme. Using datasets from multiple police forces, the tool may be used nationwide if successful.

Norfolk – burglary-solving predictor

Norfolk Police are using an algorithm that takes 29 pieces of data from previous burglaries to suggest whether there is a realistic prospect of solving a crime.

Durham – reoffending risk calculator

Durham Police use a Harm Assessment Risk Tool (HART), which assigns an ex-offender a high, medium or low risk of reoffending. Using 34 pieces of data for an ex-offender (age, gender, criminal history), the tool has been shown to be 62.8 per cent accurate in its predictions.

South Wales – facial recognition technology

South Wales Police and other forces, including the Metropolitan Police, have tested the use of Automated Facial Recognition (AFR) to identify suspects. Police film people on the street, often at big events, and then try to match the footage to biometric maps of suspects' faces drawn by software. One use at the 2017 Champions League Final in Cardiff was found to have wrongly identified more than 2,000 people as possible criminals – the technology currently misidentifies women and black people at significantly elevated rates. (A worked illustration of how such numbers arise follows this box.)

Kent – predictive policing formula

Kent Police used a predictive policing tool from the private-sector firm PredPol to predict where crime was likely to take place, based on a machine-learning algorithm that ingested historic crime data. After criticism that the tool led to over-policing of particular neighbourhoods, and having struggled to show that it reduced crime, Kent dropped the service after five years.
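The Cardiff false-positive count in the facial recognition entry above is less surprising than it looks: when a vast crowd is scanned for a small watchlist, even a fairly accurate system produces mostly false alarms. A back-of-the-envelope sketch, in which every figure is an assumption rather than a South Wales Police statistic:

```python
# Illustrative base-rate arithmetic: all of these numbers are assumed,
# not taken from South Wales Police or the 2017 Champions League final.

crowd = 170_000              # faces scanned over the event
true_suspects = 200          # watchlist members actually present
hit_rate = 0.90              # chance a genuine suspect is flagged
false_positive_rate = 0.015  # chance an innocent face is flagged

true_alerts = true_suspects * hit_rate                        # ~180
false_alerts = (crowd - true_suspects) * false_positive_rate  # ~2,547

share_wrong = false_alerts / (true_alerts + false_alerts)
print(f"alerts that are wrong: {share_wrong:.0%}")  # ~93%
```

A per-face error rate of 1.5 per cent sounds small, yet under these assumptions more than nine in ten alerts point at the wrong person – the same order of magnitude as the Cardiff result.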

Being on the matrix has intruded into Kavon’s life in frequent and unpredictable ways. When the family moved house to Leytonstone in March 2016, police turned up at midnight, banging on the door. A known gang member had moved into the borough, they said. “‘Kavon Kelly, does he live here?’ they demanded,” says Camille. “I don’t know what they were trying to do apart from intimidate us because there was no arrest warrant, nothing Kavon was accused of.”

When a girl Kavon had dated disappeared, armed police raided his grandmother’s house at 2am. The girl was eventually found elsewhere; Kavon had nothing to do with it. He has been stopped and searched “countless” times, he says. When police thought they spotted Kavon in Enfield on the day a stabbing took place in the borough, he was taken from college for questioning. Denied bail, he was held overnight in a cell. He had been several miles away from the scene of the stabbing.

The matrix is made up of individual databases for each of London’s 32 boroughs, whose local police gang units share information with non-police agencies such as housing associations, job centres and schools. The protocol for how these institutions use its information is unclear. A school or jobcentre will know a person is on the matrix, but often not why – only the risk level may be visible.
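In practice that amounts to field-level redaction: the label travels, the reasoning behind it does not. A hypothetical sketch – the record layout and field names below are invented, not the matrix's actual schema:

```python
# Hypothetical sketch of field-level redaction: outside agencies receive
# the risk band but not the intelligence behind it. The schema is invented.

FULL_RECORD = {
    "name": "J. Doe",
    "borough": "Redbridge",
    "risk_band": "amber",
    "harm_score": 0,
    "intelligence_reports": ["stop-and-search 2016-03", "social media flag"],
}

POLICE_FIELDS = set(FULL_RECORD)                  # gang units see everything
AGENCY_FIELDS = {"name", "borough", "risk_band"}  # schools, jobcentres see less

def view(record: dict, allowed: set) -> dict:
    """Return only the fields this viewer is entitled to see."""
    return {field: value for field, value in record.items() if field in allowed}

print(view(FULL_RECORD, AGENCY_FIELDS))
# {'name': 'J. Doe', 'borough': 'Redbridge', 'risk_band': 'amber'}
# The agency learns the label, but nothing about why it was assigned.
```

That asymmetry is what gives the label its power: the receiving institution can act on "amber" without any way to weigh the evidence behind it.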

While police say that any action from other agencies should not be based solely on matrix inclusion, “people working in outside agencies are afraid of the term ‘gang’,” says Katrina Ffrench, director of the police monitoring organisation StopWatch. “They judge you and will engage with you differently, if not outright discriminate against you.”

Kavon’s amber status was used by his college in a disciplinary procedure after he got into an argument with another student. “When they saw he was on the matrix, their whole attitude, their whole demeanour changed,” says Camille. Kavon avoided being expelled only when a youth intervention programme vouched for him.

Metropolitan Police officers make an arrest as they raid a house in London as part of Operation Razorback (Getty Images)

The rolling story of London’s street violence explains the speed and depth with which the matrix has been deployed. There were 135 violent deaths in London in 2018, the highest in a decade. More than a third of the victims were young people, and over half the killings were stabbings. Every new death renews the conversation about who or what is to blame – from drill music (a type of rap) to austerity’s effect on youth services to middle-class cocaine use fuelling the street drug trade. The police, embattled by real-terms budget cuts of 19 per cent since 2010, appear to rely heavily on the matrix to track those who might be responsible.

Will algorithms be the new norm in policing?

Fewer than one in six police forces in the UK use algorithms, but there is no clear guidance for those that do on how to use them legally, fairly and in a way that balances the rights of the individual against the rights of the community. Trials, from Kent to Cardiff to Birmingham, have had at best mixed success in predicting where crime takes place, who commits it and when to intervene to prevent reoffending.

Durham Police use a Harm Assessment Risk Tool (HART) that, like the Gangs Matrix, assigns a risk score – this time to ex-offenders – to determine whether they should enter a special rehabilitation programme. Thirty-four pieces of data, such as age, gender and offending history, are entered into the algorithm to produce a risk score, but how that score is reached is a “black box”: it cannot be fully explained because the tool uses machine learning, in which the software builds a self-adjusting model from the data it receives.
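Oswald and Urwin’s published work on HART describes a random forest – an ensemble of decision trees that vote on each prediction. A minimal sketch with synthetic data; only the 34-predictor count comes from the reporting above, and everything else is an assumption:

```python
# HART-style sketch: a random-forest classifier trained on synthetic data.
# Only the 34-feature count reflects HART; the data and labels are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_people, n_features = 1_000, 34
X = rng.normal(size=(n_people, n_features))               # synthetic predictors
y = rng.choice(["low", "medium", "high"], size=n_people)  # synthetic labels

model = RandomForestClassifier(n_estimators=500).fit(X, y)

# The "black box" problem in miniature: each risk score is a vote across
# hundreds of trees, so no short human-readable rule explains any one answer.
print(model.predict(X[:1]))                    # e.g. ['medium']
print(len(model.estimators_), "trees voted on that score")
```

No single tree’s rule is the reason for the output; the score emerges from hundreds of overlapping ones, which is why the result resists the kind of explanation a court or an appellant might ask for.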

Both the HART tool and the Gangs Matrix show that any algorithm is only as good as the data that goes into it, just as police officers’ decisions are only as good as the factors taken into account in making them. The very nature of a predictive algorithm means that an individual is compared with the past behaviour of people with similar profiles, but who are separate individuals making independent decisions.

With HART, one input was a person’s postcode, but this was removed after it was shown to create bias against the poor. For the Gangs Matrix, “a lot of this data comes from decisions made by officers on the ground”, according to Rick Muir, director of the Police Foundation think tank. “And they may not be following very clear rules for what should be going into these databases.”

A man is given emergency medical treatment after being stabbed in the stomach at the Notting Hill Carnival (Getty Images)

Durham Police’s own tests on HART showed it to be 62.8 per cent accurate in predicting reoffending. The system is designed to overstate risk, to avoid the “false negative” of assigning a low score to someone on parole who later turns out to be violent. Given the unknowable inner workings of the algorithm, it is difficult for an individual to challenge its assessment of them as high-risk.
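That cautiousness is a design choice about the relative cost of errors, not a property of the data. A minimal sketch of cost-sensitive thresholding, with illustrative numbers that are not Durham’s:

```python
# Illustrative only: deliberately skewing a decision so that missing a
# dangerous person (a false negative) costs far more than over-flagging
# a safe one (a false positive). The probabilities and costs are invented.

p_reoffend = 0.30         # model's estimated probability of reoffending
COST_FALSE_NEGATIVE = 10  # cost of releasing someone who becomes violent
COST_FALSE_POSITIVE = 1   # cost of needlessly flagging someone high-risk

expected_cost_if_ignored = p_reoffend * COST_FALSE_NEGATIVE        # 3.0
expected_cost_if_flagged = (1 - p_reoffend) * COST_FALSE_POSITIVE  # 0.7

label = "high" if expected_cost_if_ignored > expected_cost_if_flagged else "low"
print(label)  # "high": a 30% probability is enough under these costs
```

With a 10:1 cost ratio the effective threshold drops to roughly a 9 per cent probability, so the tool calls “high” long before reoffending is actually likely – exactly the overstatement described above, and just as hard for the person scored to contest.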

Algorithmic predictions may sometimes be more scrutable than human decisions. The process behind an algorithm may be complex, but we should be able to know what it is and identify any bias. In judging the usefulness of algorithms, Marion Oswald, a criminologist, and Sheena Urwin, head of criminal justice for Durham Police, who have both written extensively about data use in policing, warn against “comparing algorithmic decisions with a mythical perfect human decision-maker”. Algorithms that don’t use machine learning will have been written by a human according to a particular police logic. Amending that logic may be quicker than, say, rooting out institutional racism from a police force.

It is not clear whether the Gangs Matrix algorithm is such a black box, but it effectively operates as one for the police officers and non-police agencies who use it: they do not know how it generates the risk score. Accenture, which built the algorithm, seemed to distance itself from it, refusing to answer specific questions on how it works and saying merely: “It was a pilot that lasted several weeks only. We haven’t worked on it since.”

Urwin and Oswald, who have worked with Parliament on the proper use of algorithms in policing, highlight the risk that officers, when resources are stretched, will “abdicate responsibility for what are risky decisions” to an algorithm shrouded in complexity.

Kavon was added to the matrix when he was 13 and a pupil at a school for children with social and mental difficulties. He got into a playground fight, which didn’t amount to much, but police in Redbridge placed him on the database.

An officer brought Kavon’s case to the attention of Leroy Logan, formerly a deputy in the neighbouring borough of Hackney’s police force, who now supports young people on the database. Logan’s insight into police monitoring of Kavon gave the family a rare glimpse of how the matrix works.

Kavon has never been involved in a “gang” and has never been to prison, but the algorithm deemed him an amber risk, and police at times seem to treat him as if he is one of Leyton’s most dangerous youths. “Officers on the street blindly trust the matrix, without questioning how someone got put on there,” says Logan. “They think the algorithm can’t be wrong.”

Police officers walk through the Peckwater Estate, London (Getty Images)

Does an ethical Matrix exist?

An algorithm whose decisions are unexplainable is simply not advisable in policing contexts, says Oswald, who, with Urwin, has written a framework to regulate police use of algorithms. Each letter of their acronym ‘ALGO-CARE’ refers to criteria such as being advisory, challengeable and accurate. It is unclear how many of their criteria for a legal and ethical database are fulfilled by the Gangs Matrix. It may be none.

Moreover, the notion of a “gang” that drives the matrix’s data collection is a “highly racialised” concept, according to Hannah Couchman, policy and campaigns officer at the human rights advocacy group Liberty. The Metropolitan Police define a gang as “a relatively durable, predominantly street-based group of young people who see themselves (and are seen by others) as a discernible group, and engage in a range of criminal activity and violence”.

However, 78 per cent of people on the matrix are black, according to 2016 police data, despite black people comprising 13 per cent of London’s population. Hanging out in groups, suggestive Snapchat posts or even being a victim of gang-related violence have been sufficient evidence for hundreds of young black people to be included. “Some of the gang ‘associations’ are so, so tenuous,” says Leroy Logan. “For many people it’s just hanging out with people already of interest to the borough’s gang unit.”

In data-driven policing, the definition of the target shapes the results. Organised crime groups involved in grooming or human trafficking barely show up in police mapping, because far less data exists on them than on, say, drug-trafficking groups. Equally relevant to the Gangs Matrix is the fact that black Britons are nine times more likely than white Britons to be stopped and searched for drugs, according to a 2018 report from the London School of Economics (LSE). “The data going into the database will reflect previous police activity,” says Muir, “and then the output is saying ‘keep going back to them, keep going back,’ which creates a feedback loop. That’s why you have so many young black men on it.”
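Muir’s feedback loop can be made concrete with a toy simulation, under assumptions that are mine rather than his: two areas with identical underlying offending, one starting with more recorded stops, and patrols allocated in proportion to past records:

```python
# Toy simulation of the feedback loop Muir describes; all numbers assumed.
# Areas A and B have identical underlying offending, but A starts with more
# recorded stops. Patrols follow the records; records follow the patrols.

records = {"A": 60.0, "B": 40.0}  # initial recorded stops (historical skew)
TOTAL_PATROLS = 100
HITS_PER_PATROL = 0.5             # new records each patrol generates

for year in range(1, 6):
    total_before = sum(records.values())
    for area in records:
        patrols = TOTAL_PATROLS * records[area] / total_before  # follow the data
        records[area] += patrols * HITS_PER_PATROL
    share_a = records["A"] / sum(records.values())
    print(f"year {year}: area A holds {share_a:.0%} of all records")

# Area A holds 60% of the records every single year: the initial skew never
# washes out, because the data keeps sending officers back to its own source.
```

Proportional allocation merely locks the skew in; any tendency to concentrate patrols more than proportionally on the “hottest” area would widen the gap year on year.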

The absence of fixed criteria for getting on the matrix means there are no fixed criteria for getting off it. “Kavon’s not a gang member, he’s not committing crimes,” says Logan. “If Kavon can be on the matrix, then my children can.”

There is validity in a tool to track youth street gangs, says Rick Muir: the sheer scale of big data requires sophisticated technology to make it useful. “Membership of gangs can be risk accelerators of violence, where the notions of retribution and reputation are strong,” he says.

“But what the Met Police need is a set of ethical rules and guidelines, enforced by a type of ethics panel which serves as oversight. This sort of transparency is crucial in building consent and legitimacy, which all policing must have.”

The Information Commissioner’s Office ruled in November 2018 that the way the matrix is shared with outside agencies breaches data protection laws, and the Mayor of London, Sadiq Khan, finally published his long-awaited review of the matrix in December 2018. The review was critical of many aspects of the matrix and called for a “comprehensive overhaul”, giving the Metropolitan Police a year to improve transparency and redress the disproportionate representation of young black men. But it did not call for the matrix to be scrapped altogether.

Without transparency and without an escape route, the matrix has trapped Kavon with little prospect of release. “I do miss people,” he says, as he watches his younger sister putting up a Christmas tree in their living room.

“He hasn’t got a social life,” says Camille. “He’s walking round, but really it’s like he’s living in a prison.”

Additional reporting by Tanya Nyenwa