1. Lethal autonomous weapons (LAWs) can independently search for and engage targets based on programmed instructions and constraints, but their behavior in real-world situations is unpredictable.
2. Autonomous weapons carry risks of unintentional escalation due to their speed and scale of operation. Recent research has shown autonomous weapons can lead to inadvertent escalation in war games.
3. Artificial intelligence enables mass production of autonomous weapons at low cost, which is highly destabilizing: it removes the limit on the number of weapons systems a human can adequately supervise, allowing a single individual to activate hundreds, thousands, or millions of weapons simultaneously.
Original Description:
Original Title
Scalability of Weapons Systems and Proliferation Risk
Lethal autonomous weapons (LAWs) are a type of autonomous military robot that can independently search for and engage targets with the help of programmed instructions and constraints.
1. Unpredictable Performance
Lethal autonomous weapons (LAWs) have been called "unpredictable by design," mainly because of how unpredictably these weapons behave in real-world settings.
2. Escalation Risk
Given the speed and scale at which autonomous weapons are capable of operating, these weapons systems carry risks of unintentional escalation. Interacting adversarial AI systems offer a good parallel for how such dynamics can rapidly spiral out of control, and keeping a human in control of an autonomous weapon is said to eliminate 80% of these problems. Recent research by the RAND Corporation has noted that "the speed of autonomous weapons did lead to inadvertent escalation in the wargame."
3. Scalability of Weapons Systems and Proliferation Risk
Artificial intelligence enables tasks to be accomplished at scale and at lower cost. The resulting ability to mass-produce autonomous weapons cheaply creates a dynamic that is highly destabilizing to society. When a human is required to make the decision to kill, there is an inherent limit on how many weapons they can adequately supervise, on the order of one to a few individual weapons. Removing human judgment also removes the limit on the number of weapons systems that can be activated, meaning a single individual could activate hundreds, thousands, or even millions of weapons at once.
4. Selective Targeting of Groups
Selecting individuals to kill based on sensor data alone, especially through facial recognition or other biometric information, introduces substantial risks of the selective targeting of groups based on perceived age, gender, race, ethnicity, or dress.
Ways to Reduce Damage from LAWs:
1. The Positive Obligation of Human Control: The first is a commitment by countries that all weapons systems must operate under meaningful human control. This means that humans, not algorithms, decide to kill.
2. Prohibitions on Systems Incapable of Human Control: The second element is for countries to agree to prohibit weapons systems that are incapable of meeting the human control requirement.