The dangers of killer robots spark a new arms race

Autonomous weapon systems - also known as killer robots - may have killed human beings for the first time last year, according to a recent United Nations Security Council report on the Libyan civil war.

Killer robots could be key players in future conflicts. Photo: Automationworld

History may well identify this as the starting point of the next great arms race, one that could potentially become humanity's last.

Autonomous weapon systems are lethally armed robots that can operate independently, selecting and attacking targets without a human making the decision.

Militaries around the world are investing heavily in autonomous weapons research and development. The US alone spent $18 billion on autonomous weapons between 2016 and 2020.

According to Professor James Dawes, an expert on the weaponization of artificial intelligence at Macalester College (US), autonomous weapons destabilize the world's already fragile nuclear safeguards, such as the minimally constrained authority of the President of the United States to launch a strike.

Unmanned military robot in a US Navy exercise. Photo: US Navy

Professor Dawes says autonomous weapons pose four main dangers.

Target misidentification and algorithmic errors

When selecting a target, can an autonomous weapon distinguish between hostile soldiers and a 12-year-old child playing with a toy gun? Can it distinguish civilians fleeing a conflict zone from rebels making a tactical retreat?

The problem here is not that machines will make such mistakes while humans will not. Rather, the difference between human error and algorithmic error is like the difference between mailing a letter and posting a tweet.

The scale, range and speed of killer robot systems - all dictated by a single targeting algorithm - mean that misidentifying a human, as in the recent US drone strike in Afghanistan, can happen at a far larger scale.

Damaged house and car after a US drone strike that killed nine members of a family in Kabul, August 2021. Photo: AFP

Autonomous weapons expert Paul Scharre, of the Center for a New American Security, uses the metaphor of the runaway gun to explain the difference. A runaway gun is a defective machine gun that continues to fire after the trigger has been released. It keeps firing until it runs out of ammunition because, of course, the gun does not know it is making a mistake.

Runaway guns are extremely dangerous, but they are at least operated by humans: the wielder can try to point the weapon in a safe direction. Autonomous weapons, by definition, offer no such safeguard.

As numerous studies of algorithmic errors across industries have shown, the best algorithms - operating exactly as designed - can produce internally correct results that nonetheless spread serious errors rapidly across populations. The problem is not just that when AI systems go wrong, they go wrong in bulk; it is that their creators often do not know why, and therefore cannot fix them.

Weaponizing artificial intelligence can lead to serious risks. Illustration

Cheap killer robots could proliferate widely

The next two risks are the problems of low-end and high-end proliferation of autonomous weapons.

At the low end, militaries developing autonomous weapons assume they can contain and control their use. But if the history of weapons technology has taught the world anything, it is this: weapons spread.

Market pressures could lead to the mass production and sale of what could be considered the autonomous-weapon equivalent of the Kalashnikov assault rifle: killer robots that are cheap, effective, and virtually impossible to control as they circulate around the world.

Autonomous weapons of the "Kalashnikov" type could fall into the hands of people outside government control, including international and domestic terrorists.

Autonomous weapons have terrible destructive power and could easily lead to war

At the high end of proliferation, countries could compete to develop increasingly destructive versions of autonomous weapons, including ones capable of delivering chemical, biological, radiological and nuclear arms. The ethical dangers of escalating weapon lethality would only grow.

Advanced autonomous weapons are likely to make conflict and war more common. These weapons would also reduce both the need for and the risk to a belligerent's own troops, dramatically altering the cost-benefit analysis that parties face in launching and sustaining wars.

Asymmetric wars – i.e. wars waged on the soil of nations lacking competitive technology – are likely to become more common.

The Kargu-2, a hybrid between a drone and a bomb, made by a Turkish defense contractor, may have hit civilians during the Libyan civil war. Photo: The Conversation

Undermining the law of war

Ultimately, autonomous weapons would undermine humanity's last barricade against war crimes and atrocities: the international laws of war. These laws, codified in treaties dating back to the Geneva Convention of 1864, are the thin line separating war from massacre.

The laws of war are premised on the idea that people can be held accountable for their actions even in wartime, and that the right to kill enemy soldiers in battle does not extend to a right to kill civilians.

But how can autonomous weapons be held accountable? Who is responsible for a robot that commits war crimes? Who would be put on trial: the weapon? The soldier? The soldier's commanders? The corporation that made the weapon?

NGOs and international law experts are concerned that autonomous weapons will lead to a serious accountability gap.

Source: The Conversation
