From: LDigh...@att.net (Larry Dighera)
Newsgroups: rec.aviation.piloting
Subject: Autonomous AI Equipped Flying Killer Robots Are Here! - Autonomous AI Equipped Flying Killer Robots_1579732425_kargu-en.pdf (0/1)
Date: Fri, 04 Jun 2021 08:55:30 -0700
Message-ID: <sdgkbg12hel86iei6qeot2s3cvqrbt08b6@4ax.com>
User-Agent: ForteAgent/8.00.32.1272

Video:
https://www.stm.com.tr/en/kargu-autonomous-tactical-multi-rotor-attack-uav

Autonomous AI Equipped Flying Killer Robots, what could possibly go wrong?
:-(

---------------------------------------------------------------------------
https://www.npr.org/2021/06/01/1002196245/a-u-n-report-suggests-libya-saw-the-first-battlefield-killing-by-an-autonomous-d

A Military Drone With A Mind Of Its Own Was Used In Combat, U.N. Says

June 1, 2021, 3:09 PM ET
By Joe Hernandez

[Photo: A Kargu rotary-wing attack drone loitering munition system manufactured by the STM defense company of Turkey. A U.N. report says the weapons system was used in Libya in March 2020. Credit: Emre Cavdar/STM]

Military-grade autonomous drones can fly themselves to a specific location,
pick their own targets and kill without the assistance of a remote human
operator. Such weapons are known to be in development, but until recently
there were no reported cases of autonomous drones killing fighters on the
battlefield.

Now, a United Nations report about a March 2020 skirmish in the military
conflict in Libya says such a drone, known as a lethal autonomous weapons
system — or LAWS — has made its wartime debut. But the report does not say
explicitly that the LAWS killed anyone.

"If anyone was killed in an autonomous attack, it would likely represent an
historic first known case of artificial intelligence-based autonomous
weapons being used to kill," Zachary Kallenborn wrote in Bulletin of the
Atomic Scientists.
https://thebulletin.org/2021/05/was-a-flying-killer-robot-used-in-libya-quite-possibly/

The assault came during fighting between the U.N.-recognized Government of
National Accord and forces aligned with Gen. Khalifa Haftar, according to
the report by the U.N. Panel of Experts on Libya.

"Logistics convoys and retreating [Haftar-affiliated forces] were
subsequently hunted down and remotely engaged by the unmanned combat aerial
vehicles or the lethal autonomous weapons systems such as the STM Kargu-2
... and other loitering munitions," the panel wrote.

The Kargu-2
https://www.stm.com.tr/en/kargu-autonomous-tactical-multi-rotor-attack-uav
is an attack drone made by the Turkish company STM that can be operated both
autonomously and manually and that purports to use "machine learning" and
"real-time image processing" against its targets.

The U.N. report goes on: "The lethal autonomous weapons systems were
programmed to attack targets without requiring data connectivity between the
operator and the munition: in effect, a true 'fire, forget and find'
capability."

"Fire, forget and find" refers to a weapon that once fired can guide itself
to its target.

The idea of a "killer robot" has moved from fantasy to reality

Drone warfare itself is not new. For years, military forces and rebel groups
have used remote-controlled aircraft to carry out reconnaissance, target
infrastructure and attack people. The U.S. in particular has used drones
extensively to kill militants and destroy physical targets.

Azerbaijan used armed drones to gain a major advantage over Armenia in
recent fighting for control of the Nagorno-Karabakh region. Just last month,
the Israel Defense Forces reportedly used drones to drop tear gas on
protesters in the occupied West Bank, while Hamas launched loitering
munitions — so-called kamikaze drones — into Israel.

What's new about the incident in Libya, if confirmed, is that the drone that
was used had the capacity to operate autonomously, which means there is no
human controlling it, essentially a "killer robot," formerly the stuff of
science fiction.

Not everyone in the security world is concerned.

"I must admit, I am still unclear on why this is the news that has gotten so
much traction," Ulrike Franke, a senior policy fellow at the European
Council on Foreign Relations, wrote on Twitter.

Franke noted that loitering munitions have been used in combat for "a while"
and questioned whether the autonomous weapon used in Libya actually caused
any casualties.

Jack McDonald, a lecturer in war studies at King's College London, noted
that the U.N. report did not make clear whether the Kargu-2 was operating
autonomously or manually at the time of the attack.

While this incident may or may not represent the first battlefield killing
by an autonomous drone, the idea of such a weapon is disquieting to many.

A global survey commissioned by the Campaign to Stop Killer Robots last year
found that a majority of respondents — 62% — said they opposed the use of
lethal autonomous weapons systems.
https://www.stopkillerrobots.org/2021/01/poll-opposition-to-killer-robots-strong/
------------------------------------------------------------------------

https://thebulletin.org/2021/05/was-a-flying-killer-robot-used-in-libya-quite-possibly/

Was a flying killer robot used in Libya? Quite possibly
By Zachary Kallenborn | May 20, 2021

[Image: A screenshot from a promotional video advertising the Kargu drone. In the video, the weapon dives toward a target before exploding.]

Last year in Libya, a Turkish-made autonomous weapon—the STM Kargu-2
drone—may have “hunted down and remotely engaged” retreating soldiers loyal
to the Libyan General Khalifa Haftar, according to a recent report by the UN
Panel of Experts on Libya. Over the course of the year, the UN-recognized
Government of National Accord pushed the general’s forces back from the
capital Tripoli, signaling that it had gained the upper hand in the Libyan
conflict, but the Kargu-2 signifies something perhaps even more globally
significant: a new chapter in autonomous weapons, one in which they are used
to fight and kill human beings based on artificial intelligence.

The Kargu is a “loitering” drone that can use machine learning-based object
classification to select and engage targets, with swarming capabilities in
development to allow 20 drones to work together. The UN report calls the
Kargu-2 a lethal autonomous weapon. Its maker, STM, touts the weapon’s
“anti-personnel” capabilities in a grim video showing a Kargu model in a
steep dive toward a target in the middle of a group of manikins. (If anyone
was killed in an autonomous attack, it would likely represent an historic
first known case of artificial intelligence-based autonomous weapons being
used to kill. The UN report heavily implies they were, noting that lethal
autonomous weapons systems contributed to significant casualties of the
manned Pantsir S-1 surface-to-air missile system, but is not explicit on the
matter.)

Many people, including Stephen Hawking and Elon Musk, have said they want to
ban these sorts of weapons, saying they can’t distinguish between civilians
and soldiers, while others say they’ll be critical in countering fast-paced
threats like drone swarms and may actually reduce the risk to civilians
because they will make fewer mistakes than human-guided weapons systems.
Governments at the United Nations are debating whether new restrictions on
combat use of autonomous weapons are needed. What the global community
hasn’t done adequately, however, is develop a common risk picture. Weighing
risk vs. benefit trade-offs will turn on personal, organizational, and
national values, but determining where risk lies should be objective.

It’s just a matter of statistics.

At the highest level, risk is a product of the probability and consequence
of error. Any given autonomous weapon has some chance of messing up, but
those mistakes could have a wide range of consequences. The highest risk
autonomous weapons are those that have a high probability of error and kill
a lot of people when they do. Misfiring a .357 magnum is one thing;
accidentally detonating a W88 nuclear warhead is something else.
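
To put that framing in concrete terms, here is a minimal sketch in Python of the expected-harm calculation the paragraph describes; every number in it is hypothetical, chosen only to illustrate the comparison:

    # Risk as probability of error times consequence of error.
    # All figures below are invented for illustration.

    def expected_harm(p_error: float, casualties_per_error: float) -> float:
        """Expected casualties per engagement under this simple model."""
        return p_error * casualties_per_error

    # A weapon that errs often but with limited consequence...
    small_arm = expected_harm(p_error=0.05, casualties_per_error=1)
    # ...versus one that errs rarely but catastrophically.
    warhead = expected_harm(p_error=0.0001, casualties_per_error=100_000)

    print(f"small arm: {small_arm} expected casualties per use")   # 0.05
    print(f"warhead:   {warhead} expected casualties per use")     # 10.0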

There are at least nine questions that are important to understanding where
the risks are when it comes to autonomous weapons.

How does an autonomous weapon decide who to kill? Landmines—in some sense an
extremely simple autonomous weapon—use pressure sensors to determine when to
explode. The firing threshold can be varied to ensure the landmine does not
explode when a child picks it up. Loitering munitions like the Israeli Harpy
typically detect and home in on enemy radar signatures. Like with landmines,
the sensitivity can be adjusted to separate civilian from military radar.
And thankfully, children don’t emit high-powered radio waves.
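
The adjustable-threshold idea amounts to a simple comparison against a configurable cutoff. A toy sketch in Python, with the cutoff and sensor readings entirely hypothetical:

    # Toy illustration of a tunable firing threshold, as with a pressure
    # sensor: the cutoff and readings below are hypothetical values.

    FIRING_THRESHOLD_KG = 150.0  # tuned above a person's weight, below a vehicle's

    def should_trigger(pressure_kg: float,
                       threshold_kg: float = FIRING_THRESHOLD_KG) -> bool:
        """Trigger only when measured pressure exceeds the configured cutoff."""
        return pressure_kg >= threshold_kg

    assert not should_trigger(30.0)    # a child's footstep stays below the cutoff
    assert should_trigger(1500.0)      # a vehicle's axle load exceeds it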

But what has prompted international concern is the inclusion of machine
learning-based decision-making as was used in the Kargu-2. These types of
weapons operate on software-based algorithms “taught” through large training
datasets to, for example, classify various objects. Computer vision programs
can be trained to identify school buses, tractors, and tanks. But the
datasets they train on may not be sufficiently complex or robust, and an
artificial intelligence (AI) may “learn” the wrong lesson. In one case, a
company was considering using an AI to make hiring decisions until
management determined that the computer system believed the most important
qualification for job candidates was being named Jared and playing high
school lacrosse. The results wouldn’t be comical at all if an autonomous
weapon made similar mistakes. Autonomous weapons developers need to
anticipate the complexities that could cause a machine learning system to
make the wrong decision. The black box nature of machine learning, in which
how the system makes decisions is often opaque, adds extra challenges.
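
To see how such a spurious correlation creeps in, consider this small sketch with synthetic data (NumPy and scikit-learn assumed available; the features are invented): an irrelevant attribute that merely happens to track the label in a biased training set still earns substantial weight in the fitted model.

    # Sketch of a model "learning" a dataset accident rather than a real cause.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    relevant = rng.normal(size=n)          # feature genuinely tied to the label
    label = (relevant > 0).astype(int)
    # Biased collection: an irrelevant attribute agrees with the label 90% of
    # the time (the "named Jared and played lacrosse" effect).
    spurious = np.where(rng.random(n) < 0.9, label, 1 - label)

    X = np.column_stack([relevant, spurious])
    model = LogisticRegression().fit(X, label)
    print("learned weights [relevant, spurious]:", model.coef_[0])
    # The nonzero weight on the spurious column was learned from the dataset
    # accident, not from anything about the real world.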

