Authored by Shrishti Mishra, a fourth-year student at the Institute of Law, Nirma.
DATE OF PUBLICATION- 09/02/2021
Lethal autonomous weapon systems (‘LAWS’) are machines capable of independently selecting targets and deploying lethal force. Such weapons can be built with different degrees of autonomy; however, those that retain little or no human control beyond the development stage have been singled out as conflicting with international humanitarian law (‘IHL’). This concern has valid grounds, since States are already advancing towards higher levels of autonomy in their weapons: Israel’s loitering munition system HARPY and China’s progress in developing swarm technology are but a few examples. Currently, there is no specific international treaty or customary law that regulates the use of LAWS; hence, recourse must be had to existing law to assess their validity. Against this backdrop, one principle of IHL that has frequently appeared in the discourse around LAWS is the Martens Clause.
Interpreting the Martens Clause
The Martens Clause prevents the assumption that anything not explicitly prohibited is permitted in international law. The Hague and Geneva Conventions institutionalize this notion through phrases such as “principles of humanity” and “dictates of public conscience”, which regulate States’ conduct in situations of armed conflict. Although the status of the Martens Clause as a principle of Customary International Law (‘CIL’) is not disputed among commentators, there is much confusion about its application.
The modern application of this principle can be seen in the Nuclear Weapons Case, where the International Court of Justice (‘ICJ’) deliberated extensively on the use of nuclear weapons. The deliberations produced two contrasting interpretations of the clause. One view suggests that the Martens Clause simply reaffirms that, in the absence of any specific international legal instrument, States remain bound by already established rules of CIL. This would essentially mean that the Martens Clause does not provide a separate norm of State conduct and serves only as a reminder of States’ obligations under other customary laws. Countries such as the United Kingdom expressed in their written opinions that the existence of the Martens Clause would not be sufficient to prohibit the use of nuclear weapons in the absence of other established rules of CIL prohibiting the same.
The other view holds that the Martens Clause is sufficient in itself to regulate States’ conduct. On this reading, weapons would not merely have to demonstrate compliance with existing CIL norms; they would additionally be tested to see whether they outrage the “principles of humanity” and the “dictates of public conscience.” The necessary implication is that the Martens Clause supplies additional criteria against which the conduct of States can be regulated. Support for this view can be found in the dissenting opinion of Judge Shahabuddeen, who argued that Additional Protocol I of 1977 provides for the protection of civilians and belligerents under principles of international law in the absence of a written code of conduct. The clause notes that principles of international law arise from “established customs, principles of humanity and from the dictates of public conscience.” Had the text merely intended to accord protection under rules of CIL, it would have restricted itself to the first phrase. The mention of the other two phrases indicates separate sources that can give rise to international law.
In the context of LAWS, the first view would mean that LAWS are judged against the already existing CIL principles of unnecessary suffering and indiscriminate effect, since there is no specific CIL norm prohibiting autonomous weapons. This would require a technical analysis of the weapon, weighing the proportionality of the suffering it causes against the military advantage it offers. Notably, the IHL standard for satisfying proportionality is relatively low and practical: States only have to adhere to a standard of “feasibility” in situations of armed conflict. The standard of proportionality therefore requires States to anticipate a balance between collateral damage and the military advantage offered by the weapon as a whole. Further, Additional Protocol I stresses that the military advantage emanating from an attack is subject to constant change according to the “circumstances ruling at that time”.
States have on multiple occasions used this interpretation to justify collateral damage such as civilian deaths caused by attacks. See, for instance, the responses of the United Kingdom, the U.S.A. and India in the Nuclear Weapons Case. These nuclear-armed nations have stated that the harm a nuclear attack causes to civilians and civilian property would not be disproportionate in certain circumstances, such as reprisals. This interpretation would engage the issues of predictability and reliability that are central to the debate on the legality of LAWS.
Predictability and reliability are widely accepted legal standards for assessing the legality of a weapon. Many commentators have expressed that LAWS cannot satisfy these criteria because, first, there is concern regarding the availability of effective mechanisms to test predictability and, second, these programmes function in an opaque or “black box” manner which makes it almost impossible to understand how they reach a particular output. This raises the threat that the algorithm may operate well outside its creator’s goals in the already unpredictable environment of armed conflict. These concerns are not out of place, since incidents of AI weapons going rogue are not unprecedented. Hence, the first view restricts itself to a technical evaluation of LAWS. Importantly, there is no definitive understanding of how these criteria apply, which would inevitably allow technologically empowered States to benefit from the lack of a consistent interpretation. Judging from this, it is not unimaginable that States would be tempted to deploy LAWS because of their immense military potential compared to other weapons.
On the other hand, ethical normativity lies at the heart of the second interpretation, where the discussion is framed by the question whether the decision to kill a human should be given to a machine. This position has been supported by multiple bodies, including the International Committee of the Red Cross, Human Rights Watch, Article 36, the United Nations Institute for Disarmament Research and the UN Special Rapporteur on Arbitrary Executions, who have expressed that the use of fully autonomous LAWS undermines human dignity and “denigrates the value of life itself” by removing human agency from the decision to kill, and therefore attracts the application of the Martens Clause.
Hence, evaluating LAWS on this premise makes the machine’s predictability and reliability secondary to this interpretation. This is not to say that the dictates of public conscience would be unaffected by the technical capabilities of LAWS, but it would ensure that the power to determine their legality does not remain confined to a few powerful States possessing LAWS, as happened in the case of nuclear weapons. The second view would give legal validation to voices across the world that would eventually have to face any undesired consequences arising from violations committed through the use of LAWS.
Conclusion
Admittedly, there is no settled interpretation of the Martens Clause, yet it would be erroneous to dismiss it as redundant. The ICJ in Nicaragua recognized the separate status and importance of CIL principles even where they are codified in treaties. Hence, rules of CIL continue to exist and apply separately from a treaty embodying the same rule. Applying the same reasoning to the Martens Clause, it becomes clear that even though instances of armed conflict can be covered by conventional instruments of IHL, the separate status of the Martens Clause as a legally binding norm cannot be disputed. It is also admitted that, at present and even in the foreseeable future, robot warfare seems restricted to science fiction. However, the Secretary-General’s report on Chemical and Biological Weapons, while arguing on the legality of chemical warfare, also admitted that there was insufficient knowledge of any comparable substance likely to be used as a chemical agent; still, the potentially disruptive nature of such an event could prompt a ban on the development and deployment of chemical weapons. Similarly, noting the potential that AI could reach and the unique conflicts such technology will have with international law, it is not incorrect to evaluate possible regulations within the existing framework, and the Martens Clause would be indispensable to that discussion.