The use and deployment of nuclear weapons must firmly remain in the hands of human beings and never be delegated to any technological marvel, no matter how much it emulates human behaviour
This piece is part of the series, 25 Years Since Pokhran II: Reviewing India’s Nuclear Odyssey
The uses and applications of emerging technologies, an umbrella term encompassing technologies as varied as Artificial Intelligence (AI), quantum computing, cloud computing, additive manufacturing, and autonomous systems, have largely been viewed from the perspective of either applying them in conventional warfighting scenarios or accentuating a nation’s capabilities in the ‘grey zone’. Barring some rare exceptions, the security studies field has seen hardly any debate, for or against, on the impact of emerging technologies on a country’s nuclear arsenal. The intent of this analysis is to examine the convergences between these two fields, especially with regard to AI, quantum computing, and autonomous systems, and to explore how they may interact with each other in the future.
Major tenets of India’s nuclear doctrine
India’s nuclear doctrine, at least in its publicly released version, was officially announced on 4 January 2003 and rested on eight key points, chiefly: a “No First Use” (NFU) nuclear posture; a credible minimum deterrent; massive retaliation against a first strike, with the aim of inflicting “unacceptable damage”; and nuclear retaliatory attacks to be authorised only by the civilian political leadership through the Nuclear Command Authority. The doctrine also provides for the appointment of a Commander-in-Chief (C-in-C) of the Strategic Forces Command (SFC), who will manage and administer the strategic forces. Further issues, such as command and control structures, states of readiness, targeting strategy for a retaliatory attack, and operating procedures for various stages of alert and launch, were also examined in detail in the doctrine. These publicly and officially available attributes of India’s nuclear doctrine provide the scaffolding on which the computing-heavy layer of AI, quantum computing, and autonomous systems will be applied.
Artificial Intelligence and nuclear weapons
The world’s first programmable, electronic, general-purpose computer, ENIAC (the Electronic Numerical Integrator and Computer), was completed in December 1945. Initially meant to calculate artillery shell trajectories, its first use was solving a set of three partial differential equations to determine the amount of tritium required to ignite the ‘Super’, a theoretical model of a hydrogen fusion bomb designed by Edward Teller. From the very beginning, computing and nuclear power have been intertwined. Compute, along with data, talent, and institutions, forms the core of AI development across the world. Data and compute are essential for the machine learning/deep learning (ML/DL) variant of AI, which relies on extremely powerful Graphics Processing Unit (GPU) chips capable of parallel processing, and on massive amounts of data, to find interlinkages among data sets and produce a response. In cases where data is scarce or difficult to obtain, for example, in simulating war scenarios or extracting data from actual wars, researchers have resorted to using game-theoretic scenarios to derive various contingencies. This helps model the behaviour of rational actors in both zero-sum and non-zero-sum games, as well as in games involving more than two actors. Such models can help countries plan their nuclear strategies in a multipolar world. AI can also be used to build near-complete pictures of an adversary’s conventional and nuclear inventory and forces, fusing satellite imagery and open-source intelligence (OSINT) with human, signals, and geospatial intelligence.
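As a toy illustration of the game-theoretic approach described above, the sketch below solves a two-player zero-sum “posture” game by simple maximin/minimax reasoning. The payoff numbers, the posture labels, and the zero-sum framing are all illustrative assumptions, not derived from any real doctrine or model.

```python
# Hypothetical payoff matrix for state A in a zero-sum posture game.
# Rows are A's choices (restrain, escalate); columns are B's.
# B receives the negative of each entry.
payoffs = [
    [0, -10],    # A restrains:  (B restrains, B escalates)
    [2, -100],   # A escalates:  (B restrains, B escalates)
]

# Maximin for A: pick the row whose worst-case payoff is largest.
row_worst = [min(row) for row in payoffs]
a_choice = row_worst.index(max(row_worst))

# Minimax for B: pick the column that minimises A's best-case payoff.
col_best = [max(row[j] for row in payoffs) for j in range(2)]
b_choice = col_best.index(min(col_best))

# Here maximin and minimax values coincide (both -10), so the game has a
# pure-strategy saddle point: A restrains (0), B escalates (1).
print(a_choice, b_choice)
```

Real deterrence interactions are, of course, rarely zero-sum; richer models of the kind the paragraph above alludes to relax exactly that assumption.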
Apart from modelling scenarios, AI can be especially helpful in creating new variants of the nuclear materials used in fission- and fusion-based nuclear weapons, materials that straddle the sweet spot between maximum yield and minimal volume. AlphaFold, a neural-network-based AI tool, is currently used to predict the three-dimensional folding structures of proteins. It takes a computational approach that “incorporates physical and biological knowledge about protein structure (…) into the design of the deep learning algorithm”. This idea, if not the process itself, could in future be used to create, design, or discover novel elements, or mixes thereof, for an optimal output. Similarly, AI techniques can be used in conjunction with additive manufacturing to improve the efficiency and design of complex procedures and operations such as outage scheduling, in-core fuel management, and fuel cycle parameters. Another aspect gaining prominence is the use of digital twins for predictive maintenance. AI can be used to create an exact digital replica of a nuclear reactor, a fission or fusion device, or any delivery platform and its maintenance cycles, examined either in real time or sped up (e.g., 2x, 3x, 5x, 10x and so on) to spot structural issues that may cause breakdowns and to introduce mitigating measures at an early stage. In the military domain, the United Kingdom has set out an ambitious programme for creating a next-generation air combat system, nicknamed Project Tempest, which involves first creating a digital twin of the aircraft on which all modifications will be tested.
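The sped-up digital-twin idea can be reduced to a very simple sketch: step a simulated component’s wear forward at an accelerated clock and flag maintenance before a failure threshold is crossed. The wear rate, threshold, and speed-up factor below are entirely hypothetical placeholders; a real twin would model physics, not a single counter.

```python
# Hypothetical parameters for a toy component-wear model.
WEAR_PER_HOUR = 4       # wear accrued per simulated operating hour (arbitrary units)
MAINT_THRESHOLD = 800   # schedule maintenance once wear reaches this level
SPEEDUP = 10            # the twin runs 10x faster than real time

def hours_until_maintenance(wear_per_hour=WEAR_PER_HOUR,
                            threshold=MAINT_THRESHOLD,
                            speedup=SPEEDUP):
    """Step the twin forward one simulated hour at a time until wear
    crosses the threshold; return (simulated hours, wall-clock hours)."""
    wear, hours = 0, 0
    while wear < threshold:
        wear += wear_per_hour
        hours += 1
    return hours, hours / speedup

sim_hours, wall_hours = hours_until_maintenance()
# 800 / 4 = 200 simulated operating hours, explored in 20 wall-clock hours.
print(sim_hours, wall_hours)
```

The payoff of the speed-up is visible even in this caricature: the twin surfaces a maintenance point 200 operating hours out after only 20 hours of simulation.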
AI and quantum computing can also assist the military in solving targeting challenges related to the use of nuclear weapons. Despite the writings of a few analysts who contend that India may also be looking at counterforce options, India’s nuclear doctrine remains resolutely NFU. If that were to change, however, the SFC would be tasked with drawing up a list of targets. The challenge is twofold: ascertaining the need for additional and/or alternate deployment, and matching targets with weapon systems. In both cases, quantum computing and AI can help: the former by generating various contingencies through parallel processing and by modelling chaos using “imperfect information”, and the latter by fusing and prioritising sensor inputs and proposing multiple courses of action. Quantum computing can also be applied to traffic optimisation (a practical problem when it comes to mobilising conventional forces in India’s hinterland) or to the perennially tricky travelling salesman problem (TSP), which is analogous to the challenge of matching targets and platforms.
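The target-platform matching challenge described above can be cast as a classical assignment problem, a close cousin of the TSP. The sketch below brute-forces the minimum-cost one-to-one matching over all permutations, using an entirely hypothetical cost matrix; real targeting involves far more constraints, and at scale one would use the Hungarian algorithm (or, as the paragraph speculates, quantum optimisation) rather than enumeration.

```python
from itertools import permutations

# Hypothetical cost matrix: cost[i][j] is a notional score (e.g. flight
# time or risk) of assigning platform i to target j. Numbers are made up.
cost = [
    [4, 2, 8],
    [4, 3, 7],
    [5, 1, 9],
]

def best_assignment(cost):
    """Brute-force the minimum-cost one-to-one platform-target matching."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):  # perm[i] = target for platform i
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

assignment, total = best_assignment(cost)
print(assignment, total)  # platform 0 -> target 0, 1 -> 2, 2 -> 1; cost 12
```

Enumeration is O(n!), which is exactly why the paragraph above frames this family of problems as a candidate for quantum and AI-assisted optimisation.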
Autonomous systems and nuclear weapons
Autonomous systems are touted as the future of conventional warfare. The major reason is the proliferation of anti-access/area denial (A2/AD) platforms, such as anti-ship missiles, long-range artillery, and sophisticated surface-to-air missiles (SAMs). These systems will be supplemented by electronic warfare systems and cyber weapons, which will create “denied environments”, such as GPS-denied environments. Autonomous systems with the ability to identify, select, and neutralise targets on their own, with humans either on the loop or out of the loop, are the weapons platforms of the future. However, this creates its own challenges, the most critical being the challenge posed to the conception of war as an act fought by humans. Though some may dismiss this as a concern of ethics, it is one of cold rationality. If war remains within the realm of human society (as a social act), it can still involve issues of proportionality, morality, human rights, and the most human act of all: bargaining. If wars are fought by machines, ostensibly on behalf of humans, then questions of yields, optimisation, and results, especially in conflicts fought in denied environments under the nuclear overhang, will predominate. Autonomous weapons could never make the decision taken by Lt Col Stanislav Petrov, who in 1983 correctly judged a Soviet early-warning alert of incoming US missiles to be a false alarm.
While emerging technologies can be used to maximise the potential of the nuclear deterrent in a country, optimising for size, efficiency, and maintenance parameters, the use and deployment of nuclear weapons must firmly remain in the hands of human beings and never be delegated to any technological marvel, no matter how much it emulates human behaviour.
Lt Col Akshat Upadhyay is a Research Fellow at Manohar Parrikar Institute for Defence Studies and Analyses (MP-IDSA) in the Strategic Technologies Centre.
The views expressed above belong to the author(s).