
Sunday, June 20, 2021

A.I. Drone May Have Acted on Its Own in Attacking Fighters, U.N. Says

A United Nations report suggested that a drone, used against militia fighters in Libya’s civil war, may have selected a target autonomously.

Published June 3, 2021

https://www.nytimes.com/2021/06/03/world/africa/libya-drone.html

A military drone that attacked soldiers during a battle in Libya’s civil war last year may have done so without human control, according to a recent report commissioned by the United Nations.

The drone, which the report described as “a lethal autonomous weapons systems,” was powered by artificial intelligence and used by forces backed by the government based in Tripoli, the capital, against enemy militia fighters as they ran away from rocket attacks.

The fighters “were hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems,” according to the report, which did not say whether there were any deaths or injuries.

The weapons systems, it said, “were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect a true ‘fire, forget and find’ capability.”

The United Nations declined to comment on the report, which was written by a panel of independent experts. The report has been sent to a U.N. sanctions committee for review, according to the organization.

The drone, a Kargu-2, was used as soldiers tried to flee, the report said.

“Once in retreat, they were subject to continual harassment from the unmanned combat aerial vehicles and lethal autonomous weapons systems,” according to the report, which was written by the U.N. Panel of Experts on Libya and released in March. The findings about the drone attack, described briefly in the 548-page document, were reported last month by The New Scientist and by the Bulletin of the Atomic Scientists, a nonprofit organization.

Human-operated drones have been used in military strikes for over a decade. President Barack Obama for years embraced drone strikes as a counterterrorism strategy, and President Donald J. Trump expanded the use of drones in Africa.

Nations like China, Russia and Israel also operate drone fleets, and drones were used in the war between Azerbaijan and Armenia last year.

Experts were divided about the importance of the findings in the U.N. report on Libya, with some saying it underscored how murky “autonomy” can be.

Zachary Kallenborn, who studies drone warfare, terrorism and weapons of mass destruction at the University of Maryland, said the report suggested that for the first time, a weapons system with artificial intelligence capability operated autonomously to find and attack humans.

“What’s clear is this drone was used in the conflict,” said Mr. Kallenborn, who wrote about the report in the Bulletin of the Atomic Scientists. “What’s not clear is whether the drone was allowed to select its target autonomously and whether the drone, while acting autonomously, harmed anyone. The U.N. report heavily implies, but does not state, that it did.”

But Ulrike Franke, a senior policy fellow at the European Council on Foreign Relations, said that the report does not say how independently the drone acted, how much human oversight or control there was over it, and what specific impact it had in the conflict.

“Should we talk more about autonomy in weapon systems? Definitely,” Ms. Franke said in an email. “Does this instance in Libya appear to be a groundbreaking, novel moment in this discussion? Not really.”

She noted that the report stated the Kargu-2 and “other loitering munitions” attacked convoys and retreating fighters. Loitering munitions, which are simpler autonomous weapons that are designed to hover on their own in an area before crashing into a target, have been used in several other conflicts, Ms. Franke said.

“What is not new is the presence of loitering munition,” she said. “What is also not new is the observation that these systems are quite autonomous. How autonomous is difficult to ascertain — and autonomy is ill-defined anyway — but we know that several manufacturers of loitering munition claim that their systems can act autonomously.”

The report indicates that the “race to regulate these weapons” is being lost, a potentially “catastrophic” development, said James Dawes, a professor at Macalester College in St. Paul, Minn., who has written about autonomous weapons.

“The heavy investment militaries around the globe are making in autonomous weapons systems made this inevitable,” he said in an email.

So far, the A.I. capabilities of drones remain far below those of humans, said Mr. Kallenborn. The machines can easily make mistakes, such as confusing a farmer holding a rake for an enemy soldier holding a gun, he said.

Human rights organizations are “particularly concerned, among other things, about the fragility or brittleness of the artificial intelligence system,” he said.

Professor Dawes said countries may begin to compete aggressively with each other to create more autonomous weapons.

“The concern that these weapons might misidentify targets is the least of our worries,” he said. “More significant is the threat of an A.W.S. arms race and proliferation crisis.”

The report said the attack happened in a clash between fighters for the Tripoli-based government, which is supported by Turkey and officially recognized by the United States and other Western powers, and militia forces led by Khalifa Hifter, who has received backing from Russia, Egypt, the United Arab Emirates, Saudi Arabia and, at times, France.

In October, the two warring factions agreed to a cease-fire, raising hopes for an end to years of shifting conflict.

The Kargu-2 was built by STM, a defense company based in Turkey that describes the weapon as “a rotary wing attack drone” that can be used autonomously or manually.

The company did not respond to a request for comment.

Turkey, which supports the government in Tripoli, provided many weapons and defense systems, according to the U.N. report.

“Loitering munitions show how human control and judgment in life-and-death decisions is eroding, potentially to an unacceptable point,” Mary Wareham, the arms advocacy director at Human Rights Watch, wrote in an email. She is a founding coordinator of the Campaign to Stop Killer Robots, which is working to ban fully autonomous weapons.

Ms. Wareham said countries “must act in the interest of humanity by negotiating a new international treaty to ban fully autonomous weapons and retain meaningful human control over the use of force.”

Libya: Possible First Use of AI-Armed Drones Triggers Alarm Bells (07.06.2021)

Voice of America

Jamie Dettmer

Western military experts are assessing whether an autonomous drone operated by artificial intelligence, or AI, killed people in Libya last year, in what would be the first time a drone attacked humans without a human controller directing it remotely to do so.

A report issued last week by a United Nations panel of experts concluded that an advanced drone deployed in Libya “hunted down and remotely engaged” soldiers fighting for the Libyan general Khalifa Haftar, prompting a frenetic debate among Western security officials and analysts.

Governments at the United Nations have been debating for months whether a global pact should be agreed on the use of armed drones, autonomous and otherwise, and what restrictions should be placed on them. The U.N.’s Libya report is adding urgency to the debate. Drone advances have “a lot of implications regionally and globally,” says Ziya Meral of Britain’s Royal United Services Institute, a defense think tank.

"It is time to assess where things are with Turkish drones and advanced warfare technology and what this means for the region and what it means for NATO," he said at a RUSI-hosted event in London.

According to the U.N. report, Turkish-made Kargu-2 lethal autonomous aircraft launched so-called swarm attacks, likely on behalf of Libya’s Government of National Accord, against the warlord Haftar’s militias in March last year, in what may have been the first successful attack carried out by AI-equipped drones. Remnants of a Kargu-2 were recovered later.

The use of autonomous drones that do not require human operators to guide them remotely once they have been programmed is opposed by many human rights organizations. There were rumors that Turkish-supplied AI drones, alongside remote-guided ones, were used last year by Azerbaijani forces in their clashes with Armenia in the disputed region of Nagorno-Karabakh and its surrounding territories.

Myriad of dilemmas
If AI drones did launch lethal swarm attacks, it would mark a “new chapter in autonomous weapons,” warns the Bulletin of the Atomic Scientists. Critics of AI drones, which can use facial-recognition technology, say they raise a number of moral, ethical and legal dilemmas.

"These types of weapons operate on software-based algorithms 'taught' through large training datasets to, for example, classify various objects. Computer vision programs can be trained to identify school buses, tractors, and tanks. But the datasets they train on may not be sufficiently complex or robust, and an artificial intelligence (AI) may 'learn' the wrong lesson," the non-profit Bulletin warns.

The manufacturer of the Kargu-2, Defense Technologies and Trade (STM), told Turkish media last year that its drones are equipped with facial-recognition technology, allowing individual targets to be identified and neutralized without having to deploy ground forces. And company executives say Kargu-2 drones can swarm together, overwhelming defenses.

Last month, Turkish President Recep Tayyip Erdogan lauded the success of Turkish unmanned aerial vehicles (UAV), saying the results they had produced "require war strategies to be rewritten." Turkey has deployed them in military operations in northern Syria, Turkish officials have acknowledged.

Speaking at a parliamentary meeting of his ruling Justice and Development Party (AKP) in Ankara, Erdogan said Turkey plans to go further and is aiming to be among the first countries to develop an AI-managed warplane. Recently the chief technology officer of Baykar, a major Turkish drone manufacturer, announced the company had slated 2023 for the maiden flight of its prototype unmanned fighter jet.

'A significant player'
Sanctions and embargoes on Turkey in recent years have been a major driving force behind Ankara pressing ahead to develop a new generation of unconventional weapons, says Ulrike Franke of the European Council on Foreign Relations. “Turkey has become a significant player in the global drone market,” she said at the RUSI event. When it comes to armed drones, she noted, four states dominate drone development: the U.S., Israel, China and Turkey. The latter pair, the “new kids on the block,” are driving drone proliferation because, unlike the U.S., they are not reticent about export sales, she said.

"Turkey has shown that a mid-sized power, when it puts its mind and money behind it, can develop very sophisticated armed drones," says Franke.

Last October when the disputed enclave of Nagorno-Karabakh saw the worst fighting there since 1994, Turkish drones were assessed as having given Azerbaijan a key edge over the Armenians. Turkish drones sliced through Armenia's air defenses and pummeled its Russian-made tanks.

Analysts calculate around 90 countries have military drones for reconnaissance and intelligence missions and at least a dozen states have armed drones. Britain is believed to have ten; Turkey around 140. The U.S. air force has around 300 Reaper drones alone. The deployment of armed drones to conduct targeted killings outside formal war zones has been highly contentious. But AI drone development is adding to global alarm.

"With more and more countries acquiring armed drones, there is a risk that the controversies surrounding how drones are used and the challenges these pose to international legal frameworks, as well as to democratic values such as transparency, accountability and the rule of law, could also increase," Britain's Chatham House noted in a research paper published in April.

"This is accentuated further, given that the use of drones continues to expand and to evolve in new ways, and in the absence of a distinct legal framework to regulate such use," say the paper's authors Jessica Dorsey and Nilza Amaral.