

Autonomous Weapon Systems: Understanding and Operationalizing Human Control


Much has already been written on autonomous weapon systems (AWS), and repeating the same conceptual description would be unnecessary. Here we shall briefly discuss the difficulties in objectifying AWS[1] and survey current thinking on the concept of human control over AWS. The essay analyzes the combined report of the Stockholm International Peace Research Institute (SIPRI) and the International Committee of the Red Cross (ICRC) released in June 2020 (Boulanin et al., 2020), and Ajay Lele's article 'Debating Lethal Autonomous Weapon Systems' (Lele, 2019). To have a construct for the discussion ahead, let us define what exactly an AWS is in this article: an AWS is understood as a military-grade machine that can make its own decisions without human intervention (Lele, 2019). If that is broadly the understanding of AWS, then defining Lethal Autonomous Weapon Systems (LAWS) runs into problems similar to those of defining 'terrorism,' because of the subjectivity involved in the term 'lethal.' For example, cyber warfare can be equally or more lethal than an airstrike. Are cyber-attacks assisted by Artificial Intelligence (AI) to be considered LAWS? No consensus has been reached on that question.

If the definition is that contested, the 'autonomy' of the machine itself is also contextual, which makes a universal legal consensus difficult to reach. During the World Wars, remote-controlled tanks and guided missiles were considered autonomous because they could carry out their physical movements without soldiers manning them directly. Take another example: a pilot flying at supersonic speed cannot observe targets with the naked eye. Decisions must be made within a fraction of a second, something the human body is not built for, so the decision is made by computers working with high-precision cameras. Is that not autonomy, as far as vision is concerned? Or consider the US Tomahawk, a subsonic cruise missile capable of maneuvering its way towards a target without constant human supervision. Even this is autonomous.

But the concerns around developing and deploying those systems were not like those raised by today's AI-based AWS. However advanced the autonomy, decision-making power and control over actions in the field remained the sole prerogative of humans. The introduction of AI changes that. We have arrived at a point in history where no human can comprehend the societal structure (Winner, 1978, p. 290). Even within the military, the interdependence of technology and humans has grown to an incomprehensible level. AI-enabled weapon systems have aggravated the 'black-box' concern, pushing states to revisit the humanitarian and ethical standards applied to AWS.

As for current AWS deployment, airborne autonomous systems are largely confined to Unmanned Aerial Vehicles (UAVs), land-based robots are at a preliminary stage (for example, the US SWORDS, built on the TALON platform), and sea-based systems are missile systems assisted by automatic detection. However, the threat of machines taking cognitive decisions without any human input becomes possible only with Artificial General Intelligence (AGI) and Artificial Superintelligence (ASI)[2]. ASI has not yet been invented and scientists are not sure it is possible, but some strongly argue that it is not impossible and is likely to be realized in the first third of the twenty-first century (Bostrom, 1998). Such a superintelligence would have the capacity to become an uncontrolled offensive system, but current developments largely fall under controlled, defensive systems (Lele, 2019).

For the past eight years, expert groups on emerging technologies have focused on two main concerns regarding AWS: human control and accountability. The 2019 report of the Group of Governmental Experts (GGE) set out four principles on which further AWS policy research would be undertaken. They cover the applicability of International Humanitarian Law (IHL), human control and accountability, the applicability of international law to the use of AWS, and the requirement that development, deployment, and use adhere to the Convention on Certain Conventional Weapons (CCW) and the relevant international laws.

AWS: Ethics and Human Control

Human control of AWS is the major ongoing debate in international fora. Never before has a military operation been carried out entirely autonomously by munitions, so no laws govern such situations. These ex-ante debates on the probable loss of human control are anchored in the machine's uncertain capabilities: its predictability, its ability to analyze the environment, and its ability to differentiate civilians from combatants. While the applicability of humanitarian law to the deployment of AWS is unquestioned, the ethical standards to be set are far more complex because of their subjectivity: the ethics of a soldier differ from the ethics of a civilian. Debates on ethical standards for AWS are of two types: result-driven (the consequentialist approach) and action-driven (the deontological approach). The latter depends on the moral judgment of the user and considers the rights of combatants and civilians alike while engaging in conflict; the former weighs the probable consequences of the military operation. Since the research is ex-ante, international norms would take both approaches into consideration in arriving at a final draft.

For a sense of this subjectivity, brood over the question: 'Save a fellow soldier or save a civilian? Which is ethical?'

There are three ways to exercise ethics-based human control over AWS: strict control of the weapon, control of the environment, and hybrid human-machine interaction. The last is the most sophisticated and challenging. It keeps a human in the loop and leaves the entire decision to that human, who is responsible for identifying the target and analyzing the environment, supported by AI-based analysis. At the current stage of AI development this is necessary, as the intelligence of algorithms does not match that of a human; a minimal sketch of such a decision gate follows.
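To make the hybrid human-machine option concrete, the sketch below shows, in Python, a purely hypothetical decision gate: the algorithm may classify and propose, but every path that could lead to action passes through a human operator. The names (Detection, human_confirms, decide) and the confidence threshold are invented for illustration and do not come from the SIPRI/ICRC report or any fielded system.

```python
# Minimal, hypothetical sketch of a human-in-the-loop control gate.
# Names and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Detection:
    track_id: str
    classification: str   # e.g. "combatant", "civilian", "unknown"
    confidence: float     # algorithm's confidence, 0.0 to 1.0

def human_confirms(d: Detection) -> bool:
    """Stand-in for the operator's judgment: the machine proposes, the human decides."""
    answer = input(f"Engage {d.track_id} ({d.classification}, {d.confidence:.2f})? [y/N] ")
    return answer.strip().lower() == "y"

def decide(d: Detection) -> str:
    # Ambiguity defaults to no action; nothing is engaged without human approval.
    if d.classification != "combatant" or d.confidence < 0.9:
        return "hold"
    return "engage" if human_confirms(d) else "hold"

if __name__ == "__main__":
    print(decide(Detection("track-017", "combatant", 0.95)))
```

The point of the sketch is the structure, not the numbers: the algorithm can only ever recommend, and the human retains agency over the outcome.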

Operational Challenges

Militaries have always used technology to enhance their capabilities and to ensure dominance of force. AWS would be an exceptional addition to any arsenal and probably a leap forward for the military. Precisely for that reason, human control becomes all the more necessary, so that AWS serves the tactical and strategic advantage of commanders rather than becoming the commander itself.

The challenge that militaries across the world face is the knowledge required to operate such sophisticated AI-based weapons. To take control of an AWS when required, the supervisor of the system must understand how it works, including how its algorithms work. In addition, a deployed AWS will not always be actively operated by its controller; it will run autonomously most of the time, which leaves the operator dormant. The SIPRI report offers the concept of a 'safe human-machine ratio' to address these challenges of human-machine interaction and to arrive at an optimum number of operational personnel: with too many humans in the loop, coordination becomes difficult; with too few, handling the decision-making becomes strenuous.

Nh = Nv + Np + 1

where:

Nh – number of humans needed

Nv – number of vehicles

Np – number of payloads on those vehicles

+1 – an additional safety officer
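As a quick worked illustration of the formula (with made-up figures, not numbers from the report), a hypothetical deployment of three vehicles carrying four payloads in total would call for 3 + 4 + 1 = 8 people:

```python
# Worked example of the SIPRI 'safe human-machine ratio' Nh = Nv + Np + 1.
# The deployment figures below are invented for illustration.

def humans_needed(num_vehicles: int, num_payloads: int) -> int:
    """Vehicles + payloads on those vehicles + one additional safety officer."""
    return num_vehicles + num_payloads + 1

print(humans_needed(num_vehicles=3, num_payloads=4))  # prints 8
```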

However, these three approaches (control of the weapon, control of the environment, and human-machine interaction) are mutually dependent. On the whole, the report advises establishing structural, cognitive, and educational frameworks to embed humans into the working of AWS.[3]

Proceeding further, who, what, when, and how are the key questions arising with human control of AWS. The questions of who supervises and what is supervised present a scenario technically similar to already deployed systems such as THAAD: the commander in control of strategy, deployment, and decisions has the obligation to ensure that usage is in line with IHL. As for when, the GGE report considers that humans must be involved not just at the stage of use but also at the pre-development and development stages. The last question, how, concerns the extent and type of human control: it requires compliance with applicable international law, the ability to retain and exercise human agency and moral responsibility for the use of force and its consequences, and military effectiveness while mitigating risks to friendly forces.

Even if supervision becomes mandatory, AWS suffer from three distinct challenges: the human inclination towards automation bias, out-of-the-loop control problems, and under-trust. The probable solution, again, is sophisticated human-machine interaction supported by a new structure for educating and training operators.

Characteristics to be considered in drafting norms

The key characteristics to be considered are:

Weapon system: type of target; type of effect; mobility; types and capabilities of sensors; system complexity; duration of autonomous operation.

Environment: predictability; observability; controllability.

User: the physical and cognitive abilities of humans; the user's ability to understand the system; and the distribution of human control.

The first category, the weapon system, indicates the developmental and operational limits of AWS. Of course, in view of the ethical and humanitarian concerns, if a scientific solution to those limits is found, a situation may arise in which the military establishment considers realizing Ellul's technological society.

The second category, the environment, emphasizes restrictions on operations to avoid civilian harm; one can imagine not approving the use of AWS in civilian spaces. There is always the counter-argument that machines might be more efficient at differentiating combatants from innocent civilians, given their semantic sensors and facial-recognition algorithms. Surprisingly, the report does not touch on this aspect.

The third category, the user, amounts to a clear endorsement of human supervision through human-machine interaction and of retaining the ability to intervene in the AWS at any point.

Finally, the overarching concern regarding human control and ethical use turns on efficient international norms. The accountability problem and the ethical debates show that states are not concerned with the technology itself but with the absence of laws. The debate should therefore revolve around establishing legal structures, both national and international, for developing and using AI systems in the military. The attributes categorized above become central to drafting the human-control structures under which AWS would be deployed in the armed forces. The complex interconnectedness of AI development and its integration into the latest weapon systems requires states to have their own norms on AWS while adhering to commonly agreed international law. This lets states retain their authority to determine the extent of human control while encouraging the international scientific community to engage actively in developing scientific solutions to uncertain autonomy.

On a concluding note, the contested and contextual definitions of AWS, the fear of unethical calls being made by autonomous systems, and the loss of human agency take us to the texts of the French philosopher Jacques Ellul. His account, 'The Technological Society,' holds that human agency would be completely taken over by techniques and technology, given current developments and humans' advancing dependence on technology. Such a society of ubiquitous technology would restrict the knowledge systems of human civilization. Reading George Orwell's 1984 with this in mind, one would surely advocate a ban on AWS development. However, Winner's 'Autonomous Technology' offers an excellent scrutiny of Ellul's work, reiterating the importance of understanding the change that technology brings to society and how social structures change accordingly to accommodate such development. Read through Winner's account, the SIPRI report and Lele's article examined critically here offer the best possible path towards incorporating AWS into the military, with the necessary considerations to account for while drafting international norms. It is, after all, in the ethos of the military to adopt advanced technology and improve its efficiency.

References

Bostrom, N. (1998). How Long Before Superintelligence? International Journal of Future Studies, 2.

Boulanin, V., Davison, N., Goussac, N., & Peldán Carlsson, M. (2020). Limits of Autonomy in Weapon Systems: Identifying Practical Elements of Human Control. Stockholm: SIPRI.

Lele, A. (2019, January- March). Debating Lethal Autonomous Weapon Systems. Journal of Defence Studies, 13(1), 33-49.

Winner, L. (1978). Autonomous Technology. USA: MIT Press.


[1] Objectifying the definition is necessary for states to arrive at common, consensual norms for the development and deployment of AWS.

[2] Intelligence far ahead of human intelligence, with the capacity to cognitively comprehend a wide range of variables in the surroundings and to calculate numerous aspects simultaneously.

[3] I deliberately chose the articulation 'embed humans into AWS' because training operators, providing a sufficient number of them for an AWS, involving them in the process, and so on all arise from the preconception that soldiers should be able to learn and use AWS. It is seldom considered that an AWS should be designed to meet the requirements of a particular commander.



India’s Sprouting Counterforce Posture


In recent years, India's technological advancements in the domain of counterforce military capabilities have increased the vulnerability of the South Asian region. By disturbing strategic stability in South Asia through its adventurous counterforce posture against Pakistan, India is on the verge of becoming a rogue state. Notwithstanding the repercussions, India is voyaging towards destabilization of the South Asian region.

India’s enhanced strategic nuclear capabilities which includes-the development of Multiple Independent Reentry Vehicles (MIRVs), Ballistic Missile Defence System (BMD), Inter-Continental Ballistic Missiles (ICBMs), supersonic and hypersonic cruise missiles, and acquisition of nuclear-capable submarines- indicate that India is moving away from its declared policy of ‘No First Use’ (NFU) towards a more aggressive, counterforce posture against Pakistan. The BMD and MIRV technology along with the provision of an advanced navigation system under BECA would embolden India to go for the first strike against Pakistan. While having reliance on BMD, as to be sheltered in return. These technological advancements made by India are sprouting a new era of counterforce posture, which would further make the South Asian region volatile and vulnerable to conflicts.

India’s urge to acquire counterforce capability is strongly associated with its doctrinal shift. As the stated posture requires flexibility in the use of nuclear weapons, which fortifies the first strike capability, and thus a deviation in India’s declared policy of ‘No First Use’ (NFU) has become more significant, particularly concerning its impact on regional stability. India’s declared policy of NFU, set out in Draft Nuclear Doctrine in 1999, followed by its first amendment in January 2003 has since then been into hot debates. Pakistan has long doubted the Indian policy of NFU, as the actions and statements by the officials of the latter have always been aggressive and protruding towards the former. India, now, is drifting away from its policy of NFU with the acquisition of counterforce capabilities, particularly against Pakistan. This is further evident from the statement issued by India’s Defense Minister Mr. Rajnath Singh, back in August 2019. It stated “Till today, our nuclear policy is ‘no-first-use’ (NFU). What happens in the future depends on the circumstances.” A change at the doctrinal level is evident in the Indian strategic enclave. Notwithstanding the challenges and repercussions caused by the counterforce strategy and with an attempt to destabilize the nuclear deterrence in the region, India would go unjustifiably low to attain such measures.  

In the same vein, India has been enhancing its nuclear capabilities for strategic flexibility against its regional rivals. By the same token, it wants to attain nuclear dominance, which would ultimately result in chaos in the region. India's counterforce capability would compel its adversaries to consider a preemptive strike in a crisis, out of fear that the declared enemy would use nuclear weapons first. Moreover, counterforce capability pushes the enemy to put nuclear weapons on hair-trigger alert, which is directly linked to crisis escalation. The acquisition of counterforce capability by India would likely provoke a new arms race and further destabilize the already volatile South Asian region. The far-reaching destabilization that India is trying to create, just to gain an edge over its nuclear adversary, would blow back on India faster than it expects.

On the contrary, Pakistan has been maintaining a posture of Credible Minimum Deterrence (CMD) and does not claim a No-First-Use (NFU) policy. Pakistan's nuclear capability is defensive in principle and a tool for deterrence. Given India's evolving notions of counterforce preemption, Pakistan would be left with no choice but to leave room for carrying out a 'first strike' as a feasible deterrent against India. Nevertheless, with every technological innovation, its countermeasure soon follows. At present, there are two aspects Pakistan should take into consideration: the growing Indo-US nexus and India's concealed innovations in nuclear posture. Though India is far from being able to conduct counterforce strikes against Pakistan's nuclear targets, concrete steps are required to maintain future deterrence stability. To that end, Pakistan may need to look to its allies to obtain modern capabilities, including advanced communication and navigation systems, sensors, and artificial intelligence, which are essential for strengthening its deterrent. Pakistan should work towards an assured second-strike capability, for what is survivable today could be vulnerable tomorrow. Advancements in technology should therefore be made to preserve nuclear deterrence in the future as well.

Summarizing it all, the existence of Pakistan's nuclear deterrence has created a stable environment in the region by deterring, on multiple occasions, full-scale wars that might have resulted in a nuclear exchange. With the revolution in nuclear technology, the threat of nuclear war has emerged again. Instead of working towards peace and stability in the region, India has been enhancing its counterforce capabilities, which will likely remain a significant threat to deterrence stability. Moreover, any failure to maintain nuclear deterrence in South Asia could result in an all-out war without escalation control. India, in its lust for power and hegemonic designs, has been destabilizing the region. Both nuclear states in South Asia need to engage in arms restraint and escalation control measures. This seems the more concrete and plausible way out; otherwise the new era of destabilization could be even more disastrous.



A Pig in a Poke for the Lithuanian Armed Forces


The proverb "a chain is only as strong as its weakest link" perfectly reflects the situation in the Lithuanian armed forces. It is unclear how the army will carry out its tasks if everything that happens there runs counter to common sense.

Conscription has taken place in Lithuania, and the recruits were once again determined by electronic lottery on January 7, 2021: 3,828 recruits were selected from a list of 38,000 conscripts aged 18 to 23.

The idea of using an electronic lottery for such a serious procedure raises a lot of questions among Lithuanians. Young people are suspicious of the method and readily admit the possibility of corruption. Nobody can check the results, and so nobody can be blamed for the random selection. Moreover, the armed forces could end up with weaker recruits than they would through the usual ways of choosing among candidates. In short, the army buys a pig in a poke.

This approach to recruitment results in the presence of recruits with criminal intent and inclinations. Cases of crimes committed by Lithuanian military personnel have increased, and incidents involving the military occurred regularly in Lithuania in 2020.

A soldier of the Lithuanian army was detained in Jurbarkas in October for driving under the influence of alcohol; another Lithuanian soldier suspected of drunk driving was detained in Siauliai in December; the Panevėžys County Chief Police Commissariat was searching for a soldier who had deserted from the Lithuanian Armed Forces; and so forth.

Such behaviour poses serious risks to public safety and erodes society's confidence in the Lithuanian army.

Lithuanian military officials have also chosen a new way to discourage young people from serving in an army that is already unpopular.

"The road to hell is paved with good intentions." The Ministry of Defence decided to run a photo contest that would reflect service in the country's armed forces. It is doubtful that such pictures will attract anyone to the army, but they do show the real situation.

Usually, popularization means making something attractive to the general public. This contest served the opposite goal. Look at the pictures and draw your own conclusions.



Fatah-1: A New Security and Technological Development About Pakistan’s Indigenous GMLRS


Islamabad: 2021 seems to have started well for Pakistan, specifically with regard to stepping up its missile testing. On 7 January, the Pakistan military successfully conducted the test flight of a purely indigenously developed system known as Fatah-1. According to various reports, Fatah-1 is an extended-range Guided Multi-Launch Rocket System (GMLRS), itself a developed variant of the guided MLRS family.

According to a recent statement by Inter-Services Public Relations (ISPR) about the newly developed rocket: "The weapon system will give Pakistan Army capability of a precision target deep in the enemy territory." The Director-General of ISPR, Major General Babar Iftikhar, tweeted on 7 January: "Pakistan today conducted a successful test flight of indigenously developed Fatah-1, Guided Multi Launch Rocket System, capable of delivering a conventional Warhead up to a range of 140 km."

Defense analyst Syed Muhammad Ali also stated that "the new system was very fast, accurate, survivable, and difficult to intercept." A video was also shared by ISPR on its official website, in which the missile can be seen being fired from its launcher; however, details of when and where the test flight took place, along with the specifications of the rocket system, are yet to be announced.

Currently, the Pakistan Army fields a wide range of short-range ballistic missiles (SRBMs), medium-range ballistic missiles (MRBMs), battlefield ballistic missiles (BBMs), rocket artillery, and surface-to-surface cruise missiles (SSCMs). In the previous year, Pakistan also successfully tested the Ra'ad-II cruise missile and the Ghaznavi surface-to-surface ballistic missile (SSBM). In addition, on 30 December the Pakistan Air Force (PAF) announced progress on the national air defense arsenal: it is beginning production of the state-of-the-art JF-17 Thunder Block 3 fighter jet while acquiring 14 dual-seat JF-17 aircraft.

According to various reports, the JF-17 Thunder Block 3 is said to feature a new radar capability that will be far better in the practical domain than that of the Rafale aircraft acquired by India. The 14 dual-seat aircraft, manufactured through Pak-China cooperation, were also handed over to the PAF and will be used for extensive training.

The recent successful testing of Fatah-1 is considered another milestone for Pakistan, as it is a fitting response to India's recent developments in conventional capabilities and to India's Cold Start Doctrine.

