We’re running out of time to stop killer robot weapons

The fully autonomous AI weapons now being developed could disastrously transform warfare. The UN must act fast
Mock killer robot in central London. ‘Countries that recognise the dangers cannot wait another five years to prevent such weapons from becoming a reality.’ Photograph: Carl Court/AFP/Getty Images
It’s five years this month since the launch of the Campaign to Stop Killer Robots, a global coalition of non-governmental groups calling for a ban on fully autonomous weapons. This month also marks the fifth time that countries have convened at the United Nations in Geneva to address the problems these weapons would pose if they were developed and put into use.
The countries meeting in Geneva this week are party to a major disarmament treaty called the Convention on Certain Conventional Weapons. While some diplomatic progress has been made under that treaty’s auspices since 2013, the pace needs to pick up dramatically. Countries that recognise the dangers of fully autonomous weapons cannot wait another five years if they are to prevent the weapons from becoming a reality.
Fully autonomous weapons, which would select and engage targets without meaningful human control, do not yet exist, but scientists have warned they soon could. Precursors have already been developed or deployed as autonomy has become increasingly common on the battlefield. Hi-tech military powers, including China, Israel, Russia, South Korea, the UK and the US, have invested heavily in the development of autonomous weapons. So far there is no specific international law to halt this trend.
Experts have sounded the alarm, emphasising that fully autonomous weapons raise a host of concerns. For many people, allowing machines that cannot appreciate the value of human life to make life-and-death decisions crosses a moral red line.
Legally, the so-called “killer robots” would lack human judgment, making it very difficult to ensure that their decisions complied with international humanitarian and human rights law. For example, a robot could not be preprogrammed to assess the proportionality of using force in every situation, and it would struggle to judge accurately whether expected civilian harm outweighed military advantage in a particular attack.
Fully autonomous weapons also raise the question: who would be responsible for attacks that violate these laws if a human did not make the decision to fire on a specific target? In fact, it would be legally difficult and potentially unfair to hold anyone responsible for unforeseeable harm to civilians.
The Campaign to Stop Killer Robots, which Human Rights Watch co-founded and coordinates, argues that new international laws are needed to preempt the development, production and use of fully autonomous weapons. Many roboticists, faith leaders, Nobel peace laureates and others have reached the same conclusion, as is evident from their open letters, publications and UN statements: the world needs to prevent the creation of these weapons because once they appear in arsenals, it will be too late.
At the UN meeting going on now, one of two week-long sessions that will take place this year, countries are striving to craft a working definition of the weapons in question and to recommend options to address the concerns they raise. The countries have offered several possible ways to proceed. The momentum for a preemptive prohibition is clearly growing. As of Monday, the African Group and Austria have joined 22 other countries voicing explicit support for a ban. Other countries have aligned themselves with a French/German proposal for a political declaration, a set of nonbinding guidelines that would be an interim solution at best. Still others have explicitly expressed opposition to a preemptive prohibition and a preference for relying on existing international law.
Despite this divergence of opinion, the discussion on the first day had a significant common thread. Almost all countries that spoke talked about the need for some degree of human control over the use of force. The widespread recognition that humans must have control over life-and-death decisions is heartening. If countries agree that such control needs to be truly meaningful, a requirement for human control and a prohibition on weapons that operate without such control are two sides of the same coin.
These developments are positive, but the countries meeting this week clearly have much work ahead of them. To stay in front of technology, they should negotiate and adopt a new legally binding ban by the end of 2019. Only then will they have a chance to prevent the creation of a weapon that could revolutionise warfare in a frightening way.
 Bonnie Docherty is a senior arms researcher at Human Rights Watch and associate director of armed conflict and civilian protection at Harvard Law School’s International Human Rights Clinic
