"Killer robots" are already being developed and used by lots of countries. But if these machines are really autonomous, who do we blame if things go wrong?
Hello, I'm Claudia and this is the Sensemaker.
One story, every day, to make sense of the world.
Today, slaughter bots: can we control the machines that are programmed to kill us?
In the United States, a major report from a "national security commission" on artificial intelligence talks about AI enabling a "new paradigm in war fighting" and urges massive amounts of investment in the field. (DW News)
Imagine you're in the middle of a war and it looks like there's no end in sight.
It's expensive and it's bloody. So instead of sending in more troops, you send in a swarm of slick, silver killer robots that operate on their own. And no, I'm not describing an episode of Black Mirror.
These killer robots are already being developed and used by China, Israel, South Korea, Russia, the United Kingdom and the United States.
Just last year Azerbaijan was using so-called "kamikaze drones" in strikes against Armenian forces in the disputed region of Nagorno-Karabakh.
The fighting between Azerbaijan and Armenia over the disputed territory of Nagorno-Karabakh is in its ninth day. Both sides have accused each other of attacking civilian areas and the casualties are going up. (BBC News)
You see, when people talk about AI and weapons, it's normally in terms of the future. But experts in machine learning and artificial intelligence have said that because the technology for autonomous weapons is now readily available, the age of the killer robot is now here.
Officially, they're called lethal autonomous weapons. They're mostly small drones that can select and engage targets without human intervention, and they're mainly being deployed in highly volatile situations.
You can see why they are called slaughter bots.
They can operate anywhere: in the air, on land, on and under water, and even in space. Unsurprisingly, their existence has generated a lot of controversy.
We've spent the last five years detailing all of the legal challenges raised by killer robots under international humanitarian law, international human rights law, the massive accountability gap that would be created by the use of a fully autonomous weapon. (France24 English)
To address this issue head on, the United Nations has been hosting talks in Geneva for the last four years to figure out if and how we should be using these killer robots. But last week the US rejected calls for a binding agreement that would regulate the use of these weapons. John Tasioulas, director of the Institute for Ethics in AI, called Joe Biden's position "sad but unsurprising."
The idea of a truly "autonomous" weapon raises eyebrows for one obvious reason: if these machines are really autonomous, who do we blame when things go wrong?
Already, today, autonomous weapons are being used to find a target over long distances and destroy it without human intervention. And this revolution is just getting started, turbocharged by artificial intelligence. (DW News)
The appeal of these machines is pretty clear: there's evidence to show they're better at the job than humans.
It costs the Pentagon about $850,000 a year for each soldier in Afghanistan, but a small rover equipped with weapons costs roughly $230,000 and kamikaze drones cost just $6,000 apiece. There is also the point about fewer soldiers' lives being put at risk.
As one British army commander put it, bluntly: "I can build a machine that can go into a dangerous place and kill the enemy, or we can send your son, because that's the alternative. How do you feel now?"
There have also been compelling arguments about how machines could be more ethical than people because they wouldn't be able to commit war crimes such as rape.
But it's just not that simple. Whilst for years many have tried to convince the world that AI is objective and unbiased, there is a lot of evidence to show the opposite.
In September 2019 the photograph of Joshua Bada from England was rejected when he applied for a new passport. The reason: the system mistook his lips for an open mouth. As a result, he was told his application had not been accepted because he needed to provide a neutral expression and a closed mouth, something he had in fact done. But the algorithms couldn't interpret his lips correctly because they were obviously not trained with a sufficient number of images of Black people. (DW Shift)
Even though these machines would undoubtedly be able to process lots of data far quicker than humans, if the data sets they're receiving are inaccurate or limited or riddled with human biases then atrocities like genocide and ethnic cleansing seem almost inevitable.
As of right now, lethal autonomous weapons are still largely controlled by humans, who have the final say. But experts warn that as they become more sophisticated, they will learn new capabilities that could eventually make their missions unstoppable.
It wouldn't be an overstatement to say that they could go rogue.
And as experts have pointed out, machine learning works best when there are distinctions between two groups: a good guy and a bad guy, for instance.
Most of the time, in a war setting, this isn't the case, especially with the rise of guerrilla and insurgent warfare.
The Iraqi air force is targeting the group in the area, but it looks like many civilians have been killed instead. An Iraqi MP urged the government to launch an investigation into how such a mistake could happen, saying it harms the reputation of the Iraqi army and raises questions about the intelligence they have on ISIL. (Al Jazeera English)
Even human soldiers aren't always able to distinguish enemy combatants from civilians.
Ultimately, the idea of ethical robotic killing machines seems contradictory and unrealistic.
The UN Secretary-General António Guterres has called these weapons "morally repugnant and politically unacceptable."
Having autonomous weapons means it's easier and cheaper to kill people. For army strategists, that might be a positive.
But society is left with a clear question: is this something we really want?
Today's story was written by Nimo Omer and produced by Ella Hill.
The super divorce: The Akhmedovs and a game of hide and seek
When a High Court Judge awarded Tatiana Akhmedova a record £450m divorce payout, it was only the beginning of a case that went on to become the most expensive family feud in history…