
Humanity at the Crossroads: Human Rights Challenges in the Age of Lethal Autonomous Weapon Systems

Published online by Cambridge University Press:  24 April 2025

Asif Ali
Affiliation:
PhD Candidate, Rajiv Gandhi School of Intellectual Property Law, Indian Institute of Technology Kharagpur, Kharagpur, West Bengal, India. Email: [email protected].
Subramanian Ramamurthy
Affiliation:
Associate Professor, Rajiv Gandhi School of Intellectual Property Law, Indian Institute of Technology Kharagpur, Kharagpur, West Bengal, India. Email: [email protected].

Abstract

Artificial Intelligence (AI) has enriched the lives of people around the globe. However, the emergence of AI-powered lethal autonomous weapon systems (LAWS) has become a significant concern for the international community. LAWS are computer-based weapon systems capable of completing their missions, including identifying and engaging targets without direct human intervention. The use of such weapons poses significant challenges to compliance with international humanitarian and human rights law. Scholars have extensively examined LAWS in the context of humanitarian law; however, their implications for human rights warrant further discussion. Against this backdrop, this paper analyzes the human rights challenges posed by LAWS under international law. It argues that using LAWS in warfare and domestic law enforcement operations could violate human rights, such as the rights to life, human dignity, and remedy, among others. Thus, it calls for a prohibition of the use of killer robots against humans.

© The Author(s), 2025. Published by International Association of Law Libraries

Introduction

The renowned director James Cameron’s iconic film The Terminator presents a bleak future in which a malevolent artificial intelligence (AI) known as Skynet triggers a nuclear conflict against humanity to ensure its own survival. Released in 1984, the film showed remarkable foresight in anticipating concerns that now dominate discussions about intelligent computer systems.Footnote 1 The late physicist Stephen Hawking characterized AI as the foremost threat to human civilizationFootnote 2—an assertion consistent with how earlier scientific and technological advances, such as the invention of nuclear weapons, have been perceived as existential risks. In recent years, activists, scientists, and governments have advocated for United Nations (UN)-level prohibitions on “killer robots,” including Lethal Autonomous Weapon Systems (LAWS). While the technology depicted in The Terminator remains speculative, the emergence of “general AI,” a self-aware form of AI, is anticipated in the future.Footnote 3

According to the United States Department of Defense (US DOD), “AI refers to the ability of machines to perform tasks that normally require human intelligence—for example, recognizing patterns, learning from experience, drawing conclusions, making predictions, or taking action […].”Footnote 4 AI has grown significantly during the past decade and is now used in many spheres of life, including banking, healthcare, and transport.Footnote 5 However, certain applications of AI remain highly controversial, particularly the development of LAWS,Footnote 6 often referred to as killer robots. Once developed, LAWS would constitute the third revolution in warfare, after the inventions of gunpowder and nuclear weapons.Footnote 7 AI is the principal driver behind LAWS, which can independently select targets and resort to deadly force. LAWS are classified as fully autonomous weapons because of their capacity to make targeting decisions without immediate human control. These weapon systems can swiftly determine whether and how to respond to a specific threat without requiring any intervention from human operators.Footnote 8 The rapid progress of these weapon systems has the potential to transform the way warfare is conducted and could significantly affect law enforcement operations. It also raises serious concerns about the infringement of human rights, potentially compromising the right to life among various other rights.Footnote 9

The rise of autonomous weapons has prompted grave concern and growing criticism from international organizations and non-governmental organizations (NGOs).Footnote 10 For instance, Christof Heyns stated that the use of lethal autonomous robots (LARs) “may not be acceptable due to the lack of an effective legal accountability system, and because robots should not possess the authority to make decisions regarding human life and death.”Footnote 11

Additionally, members of the international community, including international organizations and numerous States, are advocating for the explicit prohibition of LAWS. They are urging the adoption of international regulations to govern the development and deployment of these weapon systems. Given the multiple issues that occur with autonomous weapons, such as possible harm to civilians, lack of accountability, prospective arms races, and ethical concerns around granting machines the power to decide matters of life and death, resistance to these weapons is not surprising.Footnote 12 Concerning the deployment of killer robots, UN Secretary-General António Guterres opined that “the prospect of machines with the discretion and power to take human life is morally repugnant.”Footnote 13

However, strong military nations such as the US, the United Kingdom (UK), and Russia oppose new international norms on the regulation of autonomous weapons.Footnote 14 For them, the existing international humanitarian law (IHL) that regulates warfare is sufficient to regulate future weapon systems equipped with cutting-edge technologies. They also highlight the potential advantages of the increased autonomy that AI weapon systems can bring, arguing that such weapons could be more selective in their targeting and thereby decrease the threats posed to civilians.Footnote 15 China, Israel, Russia, South Korea, the UK, and the US are among the States making significant investments in the development of autonomous weaponry. Furthermore, States such as Australia and Turkey are also allocating resources for the production of LAWS.Footnote 16

This paper proceeds as follows: The second part defines LAWS. The third part discusses the advantages and major concerns of robotic weapon systems. The fourth part analyzes the human rights challenges posed by using LAWS in armed conflict and law enforcement operations. The last part concludes the paper.

Defining LAWS: What are they?

Before defining LAWS, it is important to understand autonomy. Autonomy may be defined as a machine’s ability to function without further human intervention once it has been initiated.Footnote 17 Different weapons exhibit different degrees of autonomy. Simple weapons like landmines display a minimal level of “autonomy”: they are planted by a party and subsequently explode automatically when triggered, without any further interaction with the party that deployed them. Semi-autonomous weapon systems, such as the fire-and-forget missiles commonly used in modern warfare, demonstrate a moderate level of autonomy—after the pilot designates a target, these missiles lock onto it and carry out the attack without requiring further human involvement.Footnote 18

However, certain existing weapon systems exhibit a higher degree of autonomy. For instance, the US Navy and the Israeli Army employ weapon systems capable of autonomously detecting, tracking, and firing at incoming missiles based on pre-programmed parameters. Similarly, autonomous sentry guns positioned at the border between North and South Korea automatically engage any objects within their range. Nevertheless, the weapon systems currently under design and development by many countries are expected to possess significantly higher levels of autonomy. They are designed to possess the capacity to control their movements, identify targets independently, and make decisions to engage and eliminate these targets without any human intervention. By incorporating robust AI, these future weapon systems might even learn and autonomously determine the most effective strategies for targeting.Footnote 19

There is no single, universally accepted definition of LAWS.Footnote 20 However, the US DOD’s definition is widely discussed globally; it describes autonomous weapon systems as “weapon systems that, once activated, can select and engage targets without further operator intervention.”Footnote 21 The core requirement of the US DOD definition is that the system must operate independently and “select and engage targets” after activation without further human involvement. Human Rights Watch shares this view, stating that any robot capable of selecting and striking targets without human assistance is a fully autonomous weapon.Footnote 22

Likewise, in a United Nations (UN) Human Rights Council report, the former UN Special Rapporteur on extrajudicial, summary, or arbitrary executions, Christof Heyns, stated that “the term LARs refers to robotic weapon systems that, once activated, can select and engage targets without further intervention by a human operator. The important element is that the robot has an autonomous ‘choice’ regarding selection of a target and the use of lethal force.”Footnote 23

One of the most vocal critics of LAWS, the Campaign to Stop Killer Robots (an NGO), stated that “[k]iller robots are weapon systems that would automatically select and attack targets without meaningful human control. This means the decision to kill would be made by a machine.”Footnote 24

However, various nations have developed their own definitions of LAWS based on distinct criteria that particularly focus on the technological complexity of the weapon system. According to these definitions, LAWS are considered weapon systems that have the capability to exhibit cognition at a level comparable to humans. Conversely, some argue that a definition of LAWS is unnecessary or even undesirable. Nonetheless, despite these disparities, most participants in the discussions on LAWS generally concur that the defining characteristics of LAWS encompass complete autonomy (without manual human control) and the capacity to cause lethal outcomes.Footnote 25

The Use of LAWS in Armed Conflict: Potential Advantages and Major Concerns

Advantages

Many States are investing significant amounts of money in developing LAWS because of their several battlefield advantages.Footnote 26 Firstly, LAWS can process data and make complex targeting decisions more quickly than human soldiers. Secondly, because these weapon systems are machines, their decisions would not be influenced by emotions such as fear or anger; they could respond to stimuli more consistently than human forces.Footnote 27 Thirdly, LAWS would not possess the human instinct of self-preservation, which can cause rash decision-making and a shoot-first psychology. LAWS could also perform a military task for a longer duration without getting bored, stressed, or fatigued.Footnote 28

Additionally, LAWS would be more suitable than conventional weapons for use in remote areas because they would not require a human operator to function.Footnote 29 Furthermore, LAWS would reduce casualties among human soldiers by reducing the number of soldiers sent to the front lines, and they could access dangerous areas of the battlefield that are beyond the reach of armed forces. The use of LAWS can also have economic advantages by lowering the costs of military missions; for example, fewer personnel may be required to complete the same task, freeing soldiers for more complex assignments.Footnote 30 In addition, LAWS would cost less than soldiers on the battlefield, increasing the likelihood of their deployment over human soldiers. For instance, deploying an American soldier in Afghanistan costs around US$850,000 per year, while a TALON robot can cost around US$230,000 annually.Footnote 31

Lastly, the supportersFootnote 32 of autonomous weapons contend that these systems can adhere to international law more effectively than human soldiers in combat situations. They may act more cautiously and judiciously than soldiers because they do not possess the instinct of self-protection.Footnote 33

Major concerns

OpponentsFootnote 34 of LAWS, such as Human Rights Watch and the Campaign to Stop Killer Robots, argue that autonomous weapons should be outlawed altogether because they cannot be programmed to respect the laws of war; complying with the rules of IHL requires human qualities that are absent in LAWS.Footnote 35 In essence, critics argue that LAWS cannot comply with the rules of armed conflict and should therefore be banned. One of the major concerns linked to the employment of LAWS is the potential lack of accountability for IHL infringements.Footnote 36 Under IHL, individuals are held accountable for unlawful killings of civilians. Accordingly, any weapon employed during warfare that fails to satisfy the criteria outlined in the laws of war must not be deployed.Footnote 37 Furthermore, failures or glitches in LAWS might lead to the loss of innocent lives. Even if incorporating autonomy into LAWS enabled differentiation between non-combatants and combatants, a catastrophic malfunction in the robots would have devastating consequences.

Indeed, these devices are susceptible to faults, and AI systems struggle to adjust to unanticipated conditions.Footnote 38 An adversary might also undermine AI weaponry through data corruption or software manipulation. Furthermore, opponents of LAWS contend that not every scenario LAWS may encounter during a conflict can be programmed in advance, and that any mistake in the weapon systems might result in significant civilian fatalities or an inadvertent attack on the enemy. Moreover, once triggered, LAWS may be impossible to turn off in the event of a failure; they would continue the attack until their energy or ammunition runs out.Footnote 39 Critics further caution that the employment of LAWS may erode compassion and human dignity in warfare. A human combatant would not, for example, assault an enemy who is asleep or taking a shower, but LAWS would not make similar moral judgments. Giving autonomous robots the authority to decide between life and death also strips soldiersFootnote 40 of their human dignity and reduces them to inanimate things.Footnote 41

Human Rights Challenges Posed by LAWS in Armed Conflict and Law Enforcement Operations

This section analyzes the human rights issues raised by LAWS. In doing so, the authors argue that the deployment of LAWS could violate human rights, including the rights to life, human dignity, and remedy.Footnote 42 The argument rests on the ground that LAWS are machines lacking the human attributes needed to comply with the rules of international law on the battlefieldFootnote 43 and in domestic law enforcement operations.Footnote 44 The circumstances that arise in war and law enforcement are effectively infinite and cannot be entirely programmed into machines;Footnote 45 LAWS therefore cannot be coded to comply with international law, especially human rights law.

Right to Life and LAWS

In the context of warfare, both human rights law and IHL apply. The principles of humanitarian law mandate that belligerent parties consistently differentiate between combatants and civilians during armed conflicts; only combatants are legitimate targets, while civilians are entitled to protection. The parties must also distinguish between civilian objects (such as schools, hospitals, and churches) and military objectives, meaning that only military objectives may be targeted. Belligerents must take steps to minimize harm to civilians and civilian objects, and an attack must not inflict disproportionate harm on them.Footnote 46

However, LAWS would be unlikely to abide by IHL norms such as distinction and proportionality, since assessing conditions during combat is a complicated process that requires human judgment. As Christof Heyns, the former UN Special Rapporteur on extrajudicial, summary, or arbitrary executions, observed, such assessments necessitate human qualities and judgment that lethal robots lack, making it difficult for them to meet international standards.Footnote 47 The right to life is therefore at risk in war. Even during armed conflict, the right to life remains intact due to its inviolable nature; arbitrary killing amounts to an unlawful act even in wartime.Footnote 48 According to Rule 89 of the International Committee of the Red Cross’s customary IHL rules, “arbitrary deprivation of the right to life under human rights law includes the unlawful killing in the conduct of hostilities.”Footnote 49 As mentioned previously, the use of LAWS jeopardizes compliance with the rules of distinction and proportionality under IHL. As a result, LAWS could arbitrarily kill civilians and thereby infringe their right to life during warfare.Footnote 50

Likewise, in the realm of law enforcement, States must strictly adhere to international human rights law (IHRL) and uphold generally accepted policing principles, such as the UN Basic Principles on the Use of Force and Firearms (UNBPUFF). The deployment of LAWS without meaningful human control in such contexts would inevitably lead to unlawful harm and fatalitiesFootnote 51—thus violating human rights law standards. The use of LAWS in law enforcement may endanger essential human rights, such as the right to life outlined in article 6(1)Footnote 52 of the International Covenant on Civil and Political Rights (ICCPR). Under human rights principles, only a limited number of situations warrant the application of deadly force. Firstly, such force must be legally permitted in conformity with accepted international standards. Secondly, it should be used only to safeguard human life. Thirdly, it should be used only as a final option. Fourthly, the force used must be proportionate to the threat at hand. Lastly, officials must be held responsible for breaches of the law.Footnote 53

Under Principle 9 of the UNBPUFF, law enforcement officials may employ lethal force solely in situations where there is an immediate and substantial risk of death or serious injury.Footnote 54 This requires a thorough assessment of potential or imminent threats: identifying the source of danger, exploring non-violent alternatives, evaluating whether lethal force is necessary to neutralize the threat, using various communication methods to defuse the situation, determining appropriate weapon or equipment use, and prioritizing the preservation of the right to life. Given the constantly shifting, volatile, and unpredictable nature of law enforcement operations, lethal autonomous robots cannot adequately reproduce these essential decision-making skills, which are unique to humans.Footnote 55 The right to life is therefore at risk when LAWS are used in domestic law enforcement. Moreover, before resorting to force, law enforcement officials are expected to employ non-violent methods, such as negotiation, persuasion, and de-escalation. These methods rely on human qualities—empathy, negotiation skills, an understanding of crowd behavior, and extensive training—to respond effectively to fluid and unpredictable situations. Such attributes cannot be effectively replicated by algorithms.Footnote 56 Thus, such machines may violate the right to life granted under international law.

Human Dignity and LAWS

Many scholars argue that the use of autonomous weapons would violate human dignity.Footnote 57 The preamble of the Universal Declaration of Human Rights (UDHR) states that “recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice, and peace in the world.”Footnote 58 All human beings possess inherent dignity,Footnote 59 which may be threatened by LAWS, as delegating the power to take human lives to machines devalues human life.Footnote 60 Killer robots can take human lives, yet such weapons are merely machines, not humans; as a result, they would be unable to uphold the dignity of those they target.Footnote 61 Being machines, they cannot understand the value of an individual human life or the significance of its loss.Footnote 62 LAWS not only risk the lives of civilians but also threaten human dignity by giving machines the power to kill humans.Footnote 63 Likewise, Mary Ellen O’Connell argues that the mechanized killing of humans by LAWS raises concerns for human dignity, as killer robots lack a human conscience when taking lives. In the US, a growing number of robots already slaughter cattleFootnote 64 without a human conscience; in the future, machines could likewise decide who dies and carry out the killing. The Vatican’s Archbishop Silvano Tomasi emphasizes that human qualities like compassion and insight are essential for decisions over life and death. Machines cannot possess these qualities, whereas even emotionally or psychologically impaired individuals remain humans with dignity. It follows that machines taking human lives violates human dignity.Footnote 65

Additionally, the deployment of LAWS could infringe the prohibition of torture and other cruel, inhuman, or degrading treatment or punishment. Like the ban on willful killing, torture is prohibited under all circumstances, including war, and cannot be excused or waived. No exceptions are allowed to this basic principle of international law, which binds all countries regardless of their treaty obligations.Footnote 66 Moreover, LAWS may infringe the right to privacy under article 17 of the ICCPRFootnote 67 and the rights to equality and freedom from discrimination under article 26.Footnote 68

It would take massive amounts of data to construct targeting algorithms and establish the data patterns that LAWS would use to decide when and against whom to employ force. Owing to this potential for enormous data collection and indiscriminate mass surveillance, LAWS pose a serious threat to the right to privacy.Footnote 69 The extensive gathering and analysis of personal data may also affect the principle of equality and non-discrimination. Machine-learning systems can significantly and swiftly reinforce or alter existing power dynamics, as the datasets used to train algorithms often encode historical prejudices that are then replicated and intensified.Footnote 70 As research conducted by the American Civil Liberties Union (ACLU) demonstrated, the facial recognition software known as “Rekognition” misidentified twenty-eight members of the US Congress as individuals who had previously been arrested.Footnote 71 Notably, the inaccuracies were disproportionately skewed towards people of color, with six members of the Congressional Black Caucus among those falsely matched. Hence, LAWS have the capacity to perpetuate systemic discrimination, potentially leading to severe and life-threatening circumstances.Footnote 72

Right to Remedy and LAWS

The use of autonomous weapons presents significant obstacles to the realization of the right to seek remedies for human rights infringements in both armed conflict and domestic law enforcement. Article 8Footnote 73 of the UDHR and article 2(3)Footnote 74 of the ICCPR stipulate that the right to remedy applies to all instances of human rights violations, and the right is enshrined in several other human rights treaties. Under the right to remedy, States must guarantee personal accountability by prosecuting those implicated in severe breaches of human rights. According to General Comment no. 31 of the UN Human Rights Committee, States must investigate allegations of misconduct and, where evidence of violations is found, take appropriate measures to hold the offenders accountable as the ICCPR mandates. Failure to investigate and bring the offender(s) to justice may itself amount to a distinct violation of the ICCPR. The Basic Principles and Guidelines on the Right to a Remedy and Reparation, adopted by the UN General Assembly in 2005, likewise uphold the duty to conduct investigations and pursue legal remedies, requiring States to hold liable those proven to have committed human rights infringements.Footnote 75

The responsibility to initiate legal proceedings encompasses activities conducted in the context of law enforcement operations or military hostilities. According to the Basic Principles and Guidelines of 2005, action must be taken against grave IHRL breaches, including arbitrary killings. The principles also encompass “serious violations of international humanitarian law,” which specifically apply to armed conflicts.Footnote 76

The Fourth Geneva ConventionFootnote 77 and Additional Protocol IFootnote 78 are essential legal frameworks for safeguarding civilians under IHL. These agreements mandate that the parties undertake legal proceedings against “grave breaches,” which encompass war crimes that involve intentional attacks on civilians or assaults that are disproportionate.Footnote 79 The right to remedy goes beyond criminal prosecution and includes reparations, which encompass restitution, compensation, rehabilitation, satisfaction, and assurances of non-recurrence. Moreover, the Basic Principles and Guidelines of 2005 impose an obligation on States to implement decisions on claims brought by victims against individuals or entities.

The preceding norms highlight access to legal remedies and reparations for transgressions of IHL and IHRL. As a result, individuals who have suffered injury are entitled to some form of remedial action, which may go beyond penal measures to include reparations. The right to remedy serves two important purposes. Firstly, it aims to deter future violations by implementing measures to prevent the recurrence of similar violations. Secondly, it provides retribution, giving victims the satisfaction of seeing someone held accountable for the harm they endured.Footnote 80 Further, punishment conveys to wrongdoers that they have committed an offense, acknowledging the victim’s suffering and upholding their moral claims.

However, when it comes to determining a remedy and establishing accountability for the actions of LAWS, robots lack agency, moral or otherwise.Footnote 81 In other words, there is an accountability gap in the use of LAWS, which leads critics to deem such weapons illegal and unethical.Footnote 82 The deployment of LAWS not only raises concerns about an accountability gap for violations during wartime or law enforcement scenarios; it would also violate the victim’s right to remedy.

The decision-making process of these autonomous machines, which involves life-and-death determinations without meaningful human intervention, creates an accountability gap. It becomes unclear who should be held liable in such situations. Assigning responsibility to the autonomous machine itself is impractical since it cannot experience physical or psychological pain and cannot be punished like a human being. Other potential candidates for accountability, such as the manufacturer, programmer, superior officer, or commander, also face legal and practical obstacles. Anticipating all possible scenarios and predicting the actions of the autonomous machine becomes a challenge. Existing laws may not establish accountability fairly, leading to an accountability gap where no human can be accused or held responsible. The absence of accountability or retribution for the victim’s suffering could result in feelings of frustration. Furthermore, this approach could potentially compromise the objectives of deterrence and retribution, which are crucial for attaining justice in such circumstances.Footnote 83

Moreover, persons in authority who deliberately use or develop LAWS to commit serious violations of the right to life would be guilty under relevant laws and punished accordingly. However, proving criminal intent in such cases can be challenging and is likely to be uncommon, especially among representatives of States that generally adhere to international legal standards. The more concerning scenario arises when a fully autonomous weapon causes an arbitrary killing without any evidence of human intent or foreseeability. In such situations, no individual can be directly held accountable for the decision to initiate the attack, making it difficult to establish even indirect liability.Footnote 84

For instance, recent drone use has demonstrated a lack of investigation into unlawful killings resulting from drone strikes. An Amnesty International report highlighted grave concerns regarding US drone strikes in Pakistan: the US government failed to conduct proper investigations, and no prosecutions were pursued for the resulting human rights violations.Footnote 85 Even though drones are not fully autonomous weapons, investigating associated human rights abuses has proven difficult, and those responsible remain unpunished. As a result, Pakistani victims of the drone attacks and their families did not obtain justice, even though humans retained control over the US drones.Footnote 86

It is clear from the above that even with human control over armed drones, accountability for drone strikes and prosecution for the resulting human rights abuses have remained elusive. Removing human control altogether, as with LAWS, would make it almost impossible to establish accountability for IHRL violations and to vindicate the right to remedy.Footnote 87

Conclusion

The advent of AI-powered LAWS marks a transformative yet deeply concerning shift in warfare and law enforcement. While LAWS’ compliance with IHL has been extensively debated, this paper has analyzed the challenges that LAWS pose to IHRL. Devoid of the human qualities needed to interpret complex ethical and legal contexts, LAWS pose a grave risk to fundamental rights, including the rights to life, security, privacy, and human dignity. Additionally, the lack of accountability mechanisms for LAWS threatens to undermine affected individuals’ right to remedy.

To address these challenges, the international community must act decisively. This includes maintaining meaningful human control over LAWS to ensure adherence to international legal standards and prevent unlawful harm to civilians. Furthermore, the authors underscore the urgency of prohibiting the use of killer robots against humans and advocate the establishment of robust international regulations to safeguard human rights and dignity in the face of advancing AI technologies. Only through such concerted efforts can humanity ensure that technological progress does not come at the expense of our shared legal and ethical values.

Footnotes

1 Joe Burton and Simona R. Soare, “Understanding the Strategic Implications of the Weaponization of Artificial Intelligence,” 11th International Conference on Cyber Conflict (CyCon) 1, no. 1 (2019): 1–17, https://doi.org/10.23919/CYCON.2019.8756866.

2 Rory Cellan-Jones, “Stephen Hawking Warns Artificial Intelligence Could End Mankind,” BBC News, Dec. 2, 2014, https://www.bbc.com/news/technology-30290540.

3 Burton and Soare, “Understanding the Strategic Implications of the Weaponization of Artificial Intelligence” (n 1).

4 “Summary of the Department of Defense Artificial Intelligence Strategy: Harnessing AI to Advance Our Security and Prosperity,” U.S. Department of Defense, accessed Nov. 23, 2024, https://media.defense.gov/2019/feb/12/2002088963/-1/-1/1/summary-of-dod-ai-strategy.pdf.

5 Soha Rawas, “AI: The Future of Humanity,” Discover Artificial Intelligence 4, no. 25 (2024), https://doi.org/10.1007/s44163-024-00118-3; see also Adib Bin Rashid and MD Ashfakul Karim Kausik, “AI Revolutionizing Industries Worldwide: A Comprehensive Overview of Its Diverse Applications,” Hybrid Advances 7 (2024), https://doi.org/10.1016/j.hybadv.2024.100277.

6 Coley Felt, “Autonomous Weaponry: Are Killer Robots in Our Future?,” Henry M. Jackson School of International Studies, Feb. 14, 2020, https://jsis.washington.edu/; see also Stephen Harwood, “A Cybersystemic View of Autonomous Weapon Systems (AWS),” Technological Forecasting & Social Change 205 (2024): 1–15, https://doi.org/10.1016/j.techfore.2024.123514.

7 “Open Letter on Autonomous Weapons,” Future of Life Institute, accessed Nov. 23, 2024, https://futureoflife.org/open-letter/open-letter-autonomous-weapons-ai-robotics/.

8 Erica H. Ma, “Autonomous Weapons Systems under International Law,” New York University Law Review 95, no. 5 (2020): 1437.

9 “Autonomous Weapon Systems: Five Key Human Rights Issues for Consideration,” Amnesty International, accessed Nov. 23, 2024, https://www.amnestyusa.org/wp-content/uploads/2017/04/autonomous_weapons_systems_report.pdf.

10 Damien Cottier, “Lethal Autonomous Weapons Systems (LAWS): Apprehensions and Implications,” Parliamentary Assembly of the Council of Europe (2022), accessed Nov. 23, 2024, https://assembly.coe.int/LifeRay/JUR/Pdf/TextesProvisoires/2022/20221116-LawsApprehension-EN.pdf; Jack M. Beard, “Autonomous Weapons and Human Responsibilities,” Georgetown Journal of International Law 45, no. 3 (2014): 619.

11 See Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions, United Nations Human Rights Council, 23rd Session, A/HRC/23/47, accessed Nov. 23, 2024, https://www.ohchr.org/sites/default/files/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

12 Charles P. Trumbull IV, “Autonomous Weapons: How Existing Law Can Regulate Future Weapons,” Emory International Law Review 34 (2020): 533.

13 See “Secretary-General’s Address to the General Assembly,” United Nations, Sept. 25, 2018, https://www.un.org/sg/en/content/sg/statement/2018-09-25/secretary-generals-address-general-assembly-delivered-trilingual.

14 Hayley Evans, “Too Early to Ban? U.S. and U.K. Positions on Lethal Autonomous Weapons Systems,” Lawfare, accessed Nov. 23, 2024, https://www.lawfaremedia.org/article/too-early-ban-us-and-uk-positions-lethal-autonomous-weapons-systems.

15 Trumbull IV, “Autonomous Weapons,” 535 (n 12).

16 Human Rights Watch, Stopping Killer Robots: Country Positions on Banning Fully Autonomous Weapons and Retaining Human Control, Aug. 10, 2020, https://www.hrw.org/report/2020/08/10/stopping-killer-robots/country-positions-banning-fully-autonomous-weapons-and.

17 Beard, “Autonomous Weapons and Human Responsibilities,” 622 (n 10).

19 Ibid., 6.

20 Congressional Research Service, Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems, updated Jan. 13, 2023, https://crsreports.congress.gov/product/pdf/IF/IF11294/6.

21 US Department of Defense, Directive 3000.09: Autonomy in Weapon Systems, Jan. 25, 2023, https://media.defense.gov/2023/Jan/25/2003149928/-1/-1/0/DOD-DIRECTIVE-3000.09-AUTONOMY-IN-WEAPON-SYSTEMS.PDF.

22 Human Rights Watch, Losing Humanity: The Case Against Killer Robots, accessed Nov. 23, 2024, https://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf.

23 Heyns, Report of the Special Rapporteur (n 11).

24 See Amnesty International, Stop Killer Robots Activism Toolkit, accessed Nov. 23, 2024, https://www.amnesty.org.uk/files/2020-08/Stop%20Killer%20Robots%20Activism%20Toolkit.pdf?xPcBuAZX_UoV_XSsYKOQ_gkjh_luRDep=.

25 Heyns, Report of the Special Rapporteur (n 11).

26 Christopher P. Toscano, “Friend of Humans: An Argument for Developing Autonomous Weapons Systems,” accessed Nov. 23, 2024, https://jnslp.com/wp-content/uploads/2015/05/Friend-of-Humans.pdf.

27 Ma, “Autonomous Weapons Systems under International Law,” 1445 (n 8).

28 Ibid.

29 Amitai Etzioni and Oren Etzioni, “Pros and Cons of Autonomous Weapons Systems,” Military Review (May-June 2017), accessed Nov. 23, 2024, https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/pros-and-cons-of-autonomous-weapons-systems.pdf; “Pros and Cons of Autonomous Weapons Systems,” RoboticsBiz, accessed Nov. 23, 2024, https://roboticsbiz.com/pros-and-cons-of-autonomous-weapons-systems/.

30 “Pros and Cons of Autonomous Weapons Systems,” RoboticsBiz (n 29).

31 Ma, “Autonomous Weapons Systems under International Law,” 1445 (n 8).

32 See Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics,” Harvard National Security Journal 4 (2013): 1–37; Marco Sassoli, “Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to Be Clarified,” International Law Studies Series 90 (2014): 308–40; Ronald C. Arkin, “The Case for Ethical Autonomy in Unmanned Systems,” Journal of Military Ethics 9, no. 4 (2010): 332–41, https://doi.org/10.1080/15027570.2010.536402; Kenneth Anderson, Daniel Reisner, and Matthew Waxman, “Adapting the Law of Armed Conflict to Autonomous Weapon Systems,” International Law Studies 90 (2014): 386–411; M.N. Schmitt and J.S. Thurnher, “‘Out of the Loop’: Autonomous Weapon Systems and the Law of Armed Conflict,” Harvard National Security Journal 4, no. 2 (2013): 231–81.

33 James Foy, “Autonomous Weapons Systems: Taking the Human out of International Humanitarian Law,” Dalhousie Journal of Legal Studies 23 (2014): 47.

34 See Mary Ellen O’Connell, “Banning Autonomous Weapons: A Legal and Ethical Mandate,” Ethics & International Affairs 37, no. 3 (2023): 287–98; Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making,” International Review of the Red Cross 94, no. 886 (2012): 687–709; Noel E. Sharkey, “The Evitability of Autonomous Robot Warfare,” International Review of the Red Cross 94, no. 886 (Summer 2012): 787–800; Philip Alston, “Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law,” Journal of Law, Information and Science 21, no. 2 (2011/2012): 35–60; Noel Sharkey, “Saying ‘No!’ to Lethal Autonomous Targeting,” Journal of Military Ethics 9, no. 4 (2010): 369–83, https://doi.org/10.1080/15027570.2010.537903; Bonnie Lynn Docherty, Shaking the Foundations: The Human Rights Implications of Killer Robots, Human Rights Watch, 2014, https://www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-implications-killer-robots.

35 Ma, “Autonomous Weapons Systems under International Law,” 1446 (n 8).

36 Beard, “Autonomous Weapons and Human Responsibilities,” 619 (n 10).

37 Foy, “Autonomous Weapons Systems,” 47 (n 33).

38 Trumbull IV, “Autonomous Weapons: How Existing Law Can Regulate Future Weapons,” 551 (n 12).

39 Ibid., 552.

40 Ibid.

41 Ibid., 553.

42 Christof Heyns, Report of the Special Rapporteur (n 11); Amanda Sharkey, “Autonomous Weapons Systems, Killer Robots and Human Dignity,” Ethics and Information Technology 21 (2019): 75–87; Christof Heyns, “Autonomous Weapons Systems: Living a Dignified Life and Dying a Dignified Death,” in Autonomous Weapons Systems: Law, Ethics, Policy, ed. Nehal Bhuta et al. (Cambridge University Press, 2016), 3–20; Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective,” South African Journal on Human Rights 33, no. 1 (2017): 46–71; Docherty, Shaking the Foundations (n 34); Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making,” International Review of the Red Cross 94, no. 886 (2012): 687–709.

43 N. Sharkey, “The Evitability of Autonomous Robot Warfare” (n 34).

44 Docherty, Shaking the Foundations (n 34).

45 Noel Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill,” Connection Science 29, no. 2 (2017): 177–86, https://doi.org/10.1080/09540091.2017.1310183; see also N. Sharkey, “Saying ‘No!’” (n 34).

46 Rasha Abdul Rahim, The Weaponisation of AI: An Existential Threat to Human Rights and Dignity, accessed Nov. 23, 2024, https://giswatch.org/sites/default/files/gisw2019_web_th7.pdf.

47 Ibid., 42–43.

48 Docherty, Shaking the Foundations (Footnote n 34).

49 International Committee of the Red Cross, Database on Customary International Humanitarian Law, Rule 89, accessed Nov. 23, 2024, https://ihl-databases.icrc.org/en/customary-ihl/v1/rule89.

50 Docherty, Shaking the Foundations (n 34); see also Heyns, Report of the Special Rapporteur (n 11); O’Connell, “Banning Autonomous Weapons: A Legal and Ethical Mandate,” 287–98 (n 34).

51 N. Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill” (n 45).

52 Article 6(1) states that every human being has the inherent right to life. This right shall be protected by law. No one shall be arbitrarily deprived of his life. See https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-civil-and-political-rights.

53 N. Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill,” 43 (n 45).

55 N. Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill,” 43 (n 45).

56 Ibid.

57 See O’Connell, “Banning Autonomous Weapons: A Legal and Ethical Mandate,” 287–98 (n 34); Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making,” International Review of the Red Cross 94, no. 886 (2012): 687–709; Robert Sparrow, “Robots and Respect: Assessing the Case Against Autonomous Weapon Systems,” Ethics & International Affairs 30, no. 1 (2016): 93–116, https://doi.org/10.1017/S0892679415000647; A. Sharkey, “Autonomous Weapons Systems,” 75–87 (n 42); Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Doc. A/HRC/23/47, 2013, pp. 1–22.

58 United Nations, Universal Declaration of Human Rights, accessed Nov. 23, 2024, https://www.un.org/en/about-us/universal-declaration-of-human-rights.

59 Ibid.

60 Frank Sauer, “Stepping Back from the Brink: Why Multilateral Regulation of Autonomy in Weapons Systems Is Difficult, Yet Imperative and Feasible,” International Review of the Red Cross 102, no. 913 (2020): 253–54, https://doi.org/10.1017/S1816383120000466.

61 “Mind the Gap: The Lack of Accountability for Killer Robots,” Human Rights Watch (2015): 1–38, https://www.hrw.org/sites/default/files/reports/arms0415_ForUpload_0.pdf.

62 Ibid.

63 Ibid., 10; see also A. Sharkey, “Autonomous Weapons Systems” (n 42).

64 Alicia Wallace, “Tyson and Other Meat Processors Are Reportedly Speeding Up Plans for Robot Butchers,” CNN (July 10, 2020), https://www.cnn.com/2020/07/10/business/tyson-meatpacking-plants-automation/index.html.

65 O’Connell, “Banning Autonomous Weapons: A Legal and Ethical Mandate,” 294–95 (n 34).

66 Amnesty International, “Autonomous Weapons Systems: Five Key Human Rights Issues for Consideration,” 11 (2015), https://www.amnestyusa.org/wp-content/uploads/2017/04/autonomous_weapons_systems_report.pdf.

68 Ibid., art. 26.

69 N. Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill,” 44 (n 45).

70 Ibid.

71 Jacob Snow, “Amazon’s Face Recognition Falsely Matched 28 Members of Congress with Mugshots,” American Civil Liberties Union (ACLU), July 26, 2018, https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28.

72 N. Sharkey, “Why Robots Should Not Be Delegated with the Decision to Kill,” 44 (n 45).

75 Docherty, Shaking the Foundations, 17 (n 34).

76 Ibid., 17–18.

77 See Geneva Convention Relative to the Protection of Civilian Persons in Time of War of 12 August 1949, https://www.un.org/en/genocideprevention/documents/atrocity-crimes/Doc.33_GC-IV-EN.pdf.

78 See Protocols Additional to the Geneva Conventions of 12 August 1949, https://www.icrc.org/sites/default/files/external/doc/en/assets/files/other/icrc_002_0321.pdf.

79 Docherty, Shaking the Foundations, 18 (n 34).

80 Ibid.

81 N. Sharkey, “The Evitability of Autonomous Robot Warfare” (n 34).

82 Docherty, Shaking the Foundations, 18 (n 34).

83 Ibid., 19.

84 Ibid.; see also L. Monnett, Sending Up a Flare: Autonomous Weapons Systems Proliferation Risks to Human Rights and International Security [Research brief] (Geneva Academy, 2024), https://www.geneva-academy.ch/joomlatools-files/docman-files/Sending%20Up%20a%20Flare%20Autonomous%20Weapons%20Systems%20Proliferation%20Risks.pdf.

85 Amnesty International, “Will I be next?”: U.S. drone strikes in Pakistan (2013), https://www.amnesty.org/en/wp-content/uploads/2021/06/asa330132013en.pdf.

86 Amnesty International, “Autonomous Weapons Systems: Five Key Human Rights Issues for Consideration,” 25 (2015), https://www.amnesty.org/en/wp-content/uploads/2023/05/ACT3014012015ENGLISH.pdf.

87 Ibid., 26.

References

Rashid, Adib Bin, and Kausik, MD Ashfakul Karim. “AI Revolutionizing Industries Worldwide: A Comprehensive Overview of Its Diverse Applications.” Hybrid Advances 7 (2024). https://doi.org/10.1016/j.hybadv.2024.100277.
Wallace, Alicia. “Tyson and Other Meat Processors Are Reportedly Speeding Up Plans for Robot Butchers.” CNN, July 10, 2020. https://www.cnn.com/2020/07/10/business/tyson-meatpacking-plants-automation/index.html.
Sharkey, Amanda. “Autonomous Weapons Systems, Killer Robots and Human Dignity.” Ethics and Information Technology 21, no. 2 (2019): 75–87. https://doi.org/10.1007/s10676-018-9494-0.
Etzioni, Amitai, and Etzioni, Oren. “Pros and Cons of Autonomous Weapons Systems.” Military Review (May-June 2017). Accessed November 23, 2024. https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/pros-and-cons-of-autonomous-weapons-systems.pdf.
Amnesty International. “Autonomous Weapon Systems: Five Key Human Rights Issues for Consideration.” Accessed November 23, 2024. https://www.amnestyusa.org/wp-content/uploads/2017/04/autonomous_weapons_systems_report.pdf.
Amnesty International. “Will I be next?”: U.S. Drone Strikes in Pakistan (2013). Accessed November 23, 2024. https://www.amnesty.org/en/wp-content/uploads/2021/06/asa330132013en.pdf.
Docherty, Bonnie Lynn. Shaking the Foundations: The Human Rights Implications of Killer Robots. Human Rights Watch, 2014. https://www.hrw.org/report/2014/05/12/shaking-foundations/human-rights-implications-killer-robots.
Trumbull, Charles P., IV. “Autonomous Weapons: How Existing Law Can Regulate Future Weapons.” Emory International Law Review 34 (2020): 533.
Heyns, Christof. Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions. A/HRC/23/47 (2013). United Nations Human Rights Council, 23rd Session. Accessed November 23, 2024. https://www.ohchr.org/sites/default/files/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.
Toscano, Christopher P. “Friend of Humans: An Argument for Developing Autonomous Weapons Systems.” Accessed November 23, 2024. https://jnslp.com/wp-content/uploads/2015/05/Friend-of-Humans.pdf.
Felt, Coley. “Autonomous Weaponry: Are Killer Robots in Our Future?” The Henry M. Jackson School of International Studies, February 14, 2020. https://jsis.washington.edu/.
Congressional Research Service. Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems. Updated January 13, 2023. https://crsreports.congress.gov/product/pdf/IF/IF11294/6.
Cottier, Damien. Lethal Autonomous Weapons Systems (LAWS): Apprehensions and Implications. Parliamentary Assembly of the Council of Europe, 2022. Accessed November 23, 2024. https://assembly.coe.int/LifeRay/JUR/Pdf/TextesProvisoires/2022/20221116-LawsApprehension-EN.pdf.
Ma, Erica H. “Autonomous Weapons Systems under International Law.” New York University Law Review 95, no. 5 (2020): 1437.
Sauer, Frank. “Stepping Back from the Brink: Why Multilateral Regulation of Autonomy in Weapons Systems Is Difficult, Yet Imperative and Feasible.” International Review of the Red Cross 102, no. 913 (2020): 253–54. https://doi.org/10.1017/S1816383120000466.
Future of Life Institute. “Open Letter on Autonomous Weapons.” Accessed November 23, 2024. https://futureoflife.org/open-letter/open-letter-autonomous-weapons-ai-robotics/.
Evans, Hayley. “Too Early to Ban? U.S. and U.K. Positions on Lethal Autonomous Weapons Systems.” Lawfare. Accessed November 23, 2024. https://www.lawfaremedia.org/article/too-early-ban-us-and-uk-positions-lethal-autonomous-weapons-systems.
Human Rights Watch. Losing Humanity: The Case Against Killer Robots. Accessed November 23, 2024. https://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf.
Human Rights Watch. Stopping Killer Robots: Country Positions on Banning Fully Autonomous Weapons and Retaining Human Control. August 10, 2020. https://www.hrw.org/report/2020/08/10/stopping-killer-robots/country-positions-banning-fully-autonomous-weapons-and.
Beard, Jack M. “Autonomous Weapons and Human Responsibilities.” Georgetown Journal of International Law 45, no. 3 (2014): 617–81.
Snow, Jacob. “Amazon’s Face Recognition Falsely Matched 28 Members of Congress with Mugshots.” American Civil Liberties Union (ACLU), July 26, 2018. https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28.
Foy, James. “Autonomous Weapons Systems: Taking the Human out of International Humanitarian Law.” Dalhousie Journal of Legal Studies 23 (2014): 47–70.
Burton, Joe, and Soare, Simona R. “Understanding the Strategic Implications of the Weaponization of Artificial Intelligence.” 11th International Conference on Cyber Conflict (CyCon) 1, no. 1 (2019): 1–17. https://doi.org/10.23919/CYCON.2019.8756866.
Anderson, Kenneth, Reisner, Daniel, and Waxman, Matthew. “Adapting the Law of Armed Conflict to Autonomous Weapon Systems.” International Law Studies 90 (2014): 386–411.
Monnett, L. Sending Up a Flare: Autonomous Weapons Systems Proliferation Risks to Human Rights and International Security. Research brief. Geneva Academy, 2024. https://www.geneva-academy.ch/joomlatools-files/docman-files/Sending%20Up%20a%20Flare%20Autonomous%20Weapons%20Systems%20Proliferation%20Risks.pdf.
Schmitt, M.N., and Thurnher, J.S. “‘Out of the Loop’: Autonomous Weapon Systems and the Law of Armed Conflict.” Harvard National Security Journal 4, no. 2 (2013): 231–81.
Sassoli, Marco. “Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to Be Clarified.” International Law Studies Series 90 (2014): 308–40.
O’Connell, Mary Ellen. “Banning Autonomous Weapons: A Legal and Ethical Mandate.” Ethics & International Affairs 37, no. 3 (2023): 287–98. https://doi.org/10.1017/S0892679423000357.
Schmitt, Michael N. “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics.” Harvard National Security Journal 4 (2013): 1–37.
Sharkey, Noel E. “The Evitability of Autonomous Robot Warfare.” International Review of the Red Cross 94, no. 886 (Summer 2012): 787–800. https://doi.org/10.1017/S1816383112000732.
Sharkey, Noel E. “Saying ‘No!’ to Lethal Autonomous Targeting.” Journal of Military Ethics 9, no. 4 (2010): 369–83. https://doi.org/10.1080/15027570.2010.537903.
Sharkey, Noel E. “Why Robots Should Not Be Delegated with the Decision to Kill.” Connection Science 29, no. 2 (2017): 177–86. https://doi.org/10.1080/09540091.2017.1310183.
Asaro, Peter. “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making.” International Review of the Red Cross 94, no. 886 (2012): 687–709. https://doi.org/10.1017/S1816383112000768.
Alston, Philip. “Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law.” Journal of Law, Information and Science 21, no. 2 (2011/2012): 35–60.
Rahim, Rasha Abdul. The Weaponisation of AI: An Existential Threat to Human Rights and Dignity. Accessed November 23, 2024. https://giswatch.org/sites/default/files/gisw2019_web_th7.pdf.
Sparrow, Robert. “Robots and Respect: Assessing the Case Against Autonomous Weapon Systems.” Ethics & International Affairs 30, no. 1 (2016): 93–116. https://doi.org/10.1017/S0892679415000647.
RoboticsBiz. “Pros and Cons of Autonomous Weapons Systems.” Accessed November 23, 2024. https://roboticsbiz.com/pros-and-cons-of-autonomous-weapons-systems/.
Arkin, Ronald C. “The Case for Ethical Autonomy in Unmanned Systems.” Journal of Military Ethics 9, no. 4 (2010): 332–41. https://doi.org/10.1080/15027570.2010.536402.
Cellan-Jones, Rory. “Stephen Hawking Warns Artificial Intelligence Could End Mankind.” BBC News. Last modified December 2, 2014. https://www.bbc.com/news/technology-30290540.
Rawas, Soha. “AI: The Future of Humanity.” Discover Artificial Intelligence 4, no. 25 (2024). https://doi.org/10.1007/s44163-024-00118-3.
Harwood, Stephen. “A Cybersystemic View of Autonomous Weapon Systems (AWS).” Technological Forecasting & Social Change 205 (2024): 1–15. https://doi.org/10.1016/j.techfore.2024.123514.
United Nations. “Secretary-General’s Address to the General Assembly.” September 25, 2018. https://www.un.org/sg/en/content/sg/statement/2018-09-25/secretary-generals-address-general-assembly-delivered-trilingual.
United Nations. Universal Declaration of Human Rights. Accessed November 23, 2024. https://www.un.org/en/about-us/universal-declaration-of-human-rights.
US Department of Defense. Directive 3000.09: Autonomy in Weapon Systems. January 25, 2023. https://media.defense.gov/2023/Jan/25/2003149928/-1/-1/0/DOD-DIRECTIVE-3000.09-AUTONOMY-IN-WEAPON-SYSTEMS.PDF.
US Department of Defense. “Summary of the Department of Defense Artificial Intelligence Strategy: Harnessing AI to Advance Our Security and Prosperity.” Accessed November 23, 2024. https://media.defense.gov/2019/feb/12/2002088963/-1/-1/1/summary-of-dod-ai-strategy.pdf.