What just happened
Long stories short
- European Union leaders called for a full investigation into the origins of Covid.
- More than 350,000 people in Ethiopia’s Tigray region are living in famine conditions, with nearly two million more at risk, according to the UN.
- German authorities disbanded an elite state police unit after finding its members were involved in a chat group that glorified the Nazis.
The human in the machine
On Thursday, Tortoise held its inaugural Responsible AI Forum in partnership with Lord Rothschild at Waddesdon Manor. Members of the Tortoise AI Network, who either came in person or dialled in remotely, heard Masayoshi Son, chief executive of SoftBank, Demis Hassabis, chief executive of DeepMind, Helle Thorning-Schmidt, former Danish prime minister, and Azeem Azhar, founder of Exponential View – among a host of other experts – give their views on the promises and threats of artificial intelligence.
It was a day full of insight – both optimistic and pessimistic – and we’re still processing what was said. Next week we’ll publish a full statement setting out the forum’s main conclusions and an agenda for how we think AI can be developed and deployed responsibly, by both corporations and governments.
For now, here are some of the key takeaways:
- China and US dominance. China is now home to 40 per cent of the world’s unicorns (startups valued at more than $1 billion), with 45 per cent located in the US, Masa Son said. The rest of the world has just 15 per cent. Moreover, Son added, China is no longer the “copycat” nation it once was. China now leads the US in fintech, education and e-commerce, while the US is ahead in drug discovery and autonomous driving. Everyone else is far behind. In a VC-focused session hosted by Siraj Khaliq of Atomico, Ash Fontana of Zetta Venture Partners suggested that AI companies ultimately had to choose between the US and China – and the values they represent. “If you go deep enough in this field you will have to make a choice about which sovereign states you align yourself with.” (To find out more about Europe’s AI decline, read our story on the latest findings of our Tortoise Global AI Index here.)
- World Institute of AI. Demis Hassabis mooted the idea of a world institute of AI with “real powers to set guidelines”, bolstered by a “cutting edge” research division. Hassabis suggested that the institute could come under the jurisdiction of a government body such as the UN and could house some of the best researchers in the world. Although geo-political differences were a major barrier to its creation, Hassabis described how the institute could consider AI problems like values, fairness and bias and lay down guidelines for appropriate use. Jack Clark, co-founder of Anthropic, an AI safety company, also had ideas on how to increase responsibility. “Treat the deployment of technology as an ecological event,” he said. “In the same way as [governments] measure and assess weather and provide public data… we need governments to be able to assess and monitor technological deployments as they are happening.”
- Threats. Some interesting points emerged relating to AI concerns. Specifically:
- Cyber. “When do the gloves come off on cyber?” asked Tom Hurd, former director-general at the Office for Security and Counter Terrorism. Malign actors were moving from targeting consumers to targeting logistics, he noted, potentially causing physical harm by shutting down a whole supply chain. “The controversial question is when will we put malign cyber actors at physical risk, because they are putting us at physical risk,” he said. “We’ve made that leap in the world of terrorism.” In his keynote, Masa Son said that computer violence should be treated the same as physical violence: “Those who commit these crimes should be as severely punished as… murder.” He added that cryptocurrency should be “severely” regulated and that AI weapons such as autonomous robots should be banned.
- Killing machines. Military AI posed some of the biggest threats, according to Stuart Russell, professor of computer science at Berkeley. He pointed out that autonomous cars have to be almost 100 per cent reliable before they’re released, whereas a military robot that offers a 50 per cent chance of killing someone is fine because “you can send two of them”. Russell said that he and two graduate students had built an “effective killing machine”.
- Mischaracterisation, deployment and autonomy. Demis Hassabis outlined three ways of thinking about AI dangers. Mischaracterisation is when problems are labelled as AI-based when they are not. Deployment is the human misuse of AI for the wrong tasks, such as using it to decide a parole hearing. Autonomy is the “sci-fi” problem: the fear that machines will develop their own learning patterns without a human set of values. “‘Move fast and break things’ is exactly what we should not be doing” in this case, he said.
The day was not all doom and gloom. Speakers were clear that AI has the potential to change our lives for the better. As Masa Son said, the most recent drug discoveries wouldn’t be happening without it. For Tom Hurd, AI allowed the government to anticipate critical security threats. And Demis Hassabis told us that he spends his week applying AI algorithms to nuclear fusion, complex maths problems and predicting protein structures through AlphaFold.
Wealth investment, fairness, prosperity
The first day of the G7-related fuss in Cornwall can be marked down as a success for Boris Johnson. The PM was able to brush aside reports that Joe Biden, the US president, had formally rebuked him over Brexit and Northern Ireland – and made the most of a seaside photo op with new wife Carrie and both Bidens. It’s a stay of execution rather than a solution, though. Emmanuel Macron, the French president, last night warned against attempts by the UK to renegotiate the Northern Ireland protocol; and EU leaders have confirmed that they will be bringing the subject up. Johnson won’t be able to avoid the tough questions all weekend.
On vaccines: Johnson this morning announced that the UK will donate 100 million vaccine doses to countries around the world. The numbers might sound impressive, but only 25 million will be donated this year – the rest by the end of 2022. Yesterday, Biden committed to 500 million. The BBC has a useful, if depressing, Reality Check on just how many will be needed – and how quickly – to significantly affect transmission of the virus globally. Hint: 600 million doesn’t touch the sides.
Our planet environment, natural resources, geopolitics
Flight of fancy
Boris Johnson’s decision to take a private jet 250 miles from London to Cornwall for a summit with climate change very firmly on the agenda raised some eyebrows. That’s just the start. Quartz has published a good piece about the financial commitments of G7 nations ($) to solving climate change, using analysis by the Overseas Development Institute. Yes, G7 nations have recently made a lot of noise about the new steps they are taking to tackle carbon pollution, but some of them are not yet fronting up the cash. When compared to the 2009 Copenhagen climate commitment for rich countries to raise $100 billion annually by 2020, all bar France and Germany fall short. One for the agenda this weekend?
New things technology, science, engineering
A blinking success
A giant “blinking” star – 100 times the size of the sun and 25,000 light years away – has been discovered near the heart of the Milky Way. An international team of astronomers noticed the star in data gathered by the Vista telescope in Chile, which has spent the past decade watching for stars that vary in brightness. This star began to fade in 2012, and over a period of 100 days dimmed by 97 per cent – before returning to its former glory. The Guardian reports that the most likely explanation is that a “dusty disc” around an orbiting planet or second star passed in front of it. Astronomers expect the same thing to happen again within the helpful timeframe of 20-200 years.
Belonging identity, society, beliefs, countries
Opposition MPs are calling for Priti Patel, the home secretary, to resign, after it was alleged that she misled Parliament by claiming that the Napier Barracks asylum camp was established in line with Public Health England (PHE) advice. Newly published correspondence shows that PHE had advised against housing asylum seekers in shared dormitories in the middle of a pandemic. Yet the Home Office still went ahead with Napier Barracks in September. Firmly in Boris Johnson’s good books, Patel has not yet been knocked off her perch at the Home Office by a high-profile bullying scandal. It seems unlikely that a government intent on defending its Covid track record – and seemingly keeping Napier Barracks open – will join the push for her to go.
For more on Napier Barracks, listen to our Slow Newscast with reporter Jack Shenker from January.
The 100-year life health, education, living, public policy
Time for change
Sexual harassment is so “commonplace” for some school children that they no longer bother to report it, according to an Ofsted report. The school watchdog spoke to over 900 children in 32 schools and colleges and revealed a crisis of epidemic proportions. Nine in 10 girls are regularly sent unsolicited explicit photos; at one school, girls described being asked for nude images by up to 10 or 11 different boys per night. Ministers and women’s groups have already called for better training for schools and staff to help deal with the problem. But as Soma Sara, the founder of anonymous survivor testimony website Everyone’s Invited, pointed out in a Slow View for Tortoise yesterday: people have been shouting about this issue for years. Instead of prompting concrete action, the problem has got worse. What’s so different this time?
Stay safe, wash your hands, open windows when you can.
Photographs by Tom Pilston for Tortoise and Getty Images
Machines have become our managers
Digital surveillance of employees has gone mainstream during the pandemic, but it’s not always effective and it erodes human dignity. Governments, companies and workers need a plan for the future