Possible first use of combat drones with artificial intelligence worried experts

The alleged use of such aircraft in Libya is described in a U.N. report.

WASHINGTON – Western military experts are verifying reports that deaths in Libya last year were caused by an autonomous drone controlled by artificial intelligence, with no human involvement.

A panel report released last week said the drone operating in Libya hunted down and engaged soldiers fighting on the side of Libyan Gen. Khalifa Haftar.

U.N. member states have been debating for months whether a global agreement on the use of combat drones, autonomous or otherwise, is warranted, and what restrictions should be placed on their use.

The U.N. report on Libya adds to that debate.


Recent advances in drones have big implications regionally and globally, said Ziya Meral of Britain's Royal United Services Institute.

“The time has come to assess the status of Turkish drones and advanced combat technologies and their significance for the region and for NATO,” he said at an event organized by the institute in London.

According to the U.N. report, Turkish-made Kargu-2 autonomous drones attacked Haftar's militia last March, probably on behalf of the Government of National Accord. This would be the first known successful attack by a drone equipped with artificial intelligence.

Many human rights organizations oppose the use of autonomous drones, which, once programmed, require no remote human operator.

There have been reports that Turkish-supplied drones with artificial intelligence, along with remotely piloted drones, were used last year by Azerbaijani forces in clashes with Armenia in disputed Nagorno-Karabakh and neighboring territories.

If drones with artificial intelligence have indeed carried out a deadly attack, it opens a new chapter in the use of autonomous weapons, the Bulletin of the Atomic Scientists said. Critics say such lethal vehicles, which can use facial recognition technology, raise a number of moral, ethical and legal dilemmas.

These types of weapons work on the basis of software algorithms trained on large data sets, for example, to classify different objects. Computer vision programs can be trained to recognize school buses, tractors or tanks. However, the data they are trained on may not be complex or reliable enough, and the artificial intelligence may learn the wrong lesson, the nonprofit newsletter warns.
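The failure mode the newsletter describes can be illustrated with a toy sketch. The code below is hypothetical and greatly simplified (real systems use deep neural networks on images, not two-number feature vectors): a minimal nearest-centroid classifier is trained on a small, unrepresentative data set and then confidently assigns the wrong label to an object it has never seen.

```python
from math import dist

# Hypothetical training data: each "image" is reduced to a
# (length_m, height_m) feature vector. The set is deliberately
# small and unrepresentative -- it contains only buses and tanks.
training = {
    "school_bus": [(12.0, 3.2), (11.5, 3.0)],
    "tank":       [(7.5, 2.4), (8.0, 2.5)],
}

def centroid(points):
    # Average the feature vectors of one class.
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

centroids = {label: centroid(pts) for label, pts in training.items()}

def classify(features):
    # Pick the label whose class centroid is nearest in feature space.
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# A long articulated truck (16.5 m long, 3.1 m high) was never in the
# training data, so the classifier confidently mislabels it.
print(classify((16.5, 3.1)))  # → school_bus
```

The truck's features sit closer to the bus centroid than the tank centroid, so the model answers "school_bus" with no notion that it is guessing: exactly the kind of silent misclassification critics worry about when such systems select targets.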

Kargu-2 manufacturer STM told Turkish media last year that its drones are equipped with facial recognition technology, allowing them to identify and neutralize targets without deploying ground forces.