
The time to negotiate rules for AI in nuclear weapons is now

March 8, 2023
Written by: Anja Manuel

Lessons from the aftermath of the first nuclear strikes demonstrate it can take decades to establish peaceful stability

Anja Manuel is co-founder and principal of Rice, Hadley, Gates & Manuel LLC, a consulting firm

In the early days of the cold war, the US had won the race to acquire nuclear weapons, the most powerful and deadly on Earth. America then did something unprecedented and noble: in 1946, less than one year after the tragedies of Hiroshima and Nagasaki, the US government proposed that the UN should control nuclear resources and ensure they were used only for peaceful purposes.

We are now witnessing the advent of truly powerful artificial intelligence, and we do not yet fully understand either the promise or the perils of this new technology.

All the more reason then for our governments to be as wise and judicious as in the immediate aftermath of the first nuclear strikes: we must try to control the most harmful military aspects of AI — and ensure that the gains accrue to all.

Naive, critics say. Quixotic. The Chinese and Russians will never go for it. That’s likely to be true — in the short run.

It is unrealistic to expect that China, Russia, the US and EU will immediately place constraints on military uses of AI. After the initial US proposal in 1946 it took almost two decades — years which saw the testing of hydrogen bombs in the south Pacific, swelling nuclear arsenals, the Cuban missile crisis — before the USSR and US agreed to the Limited Test Ban Treaty in 1963 and 1968’s Nuclear Non-Proliferation Treaty.

But that doesn’t mean early efforts were wasted, just as they wouldn’t be now, for two important reasons.

First, simply by beginning this conversation with allies, but also with China and Russia, we would exercise the muscle of co-operation. Key technical experts and government officials would get to know each other, learn more about how competitor countries are approaching military uses of AI, and surface the potentially dangerous assumptions we each make about the other. Second, through repeated meetings of private sector, academic and government groups on these topics, a sketch would naturally emerge of which uses of AI are seen as straightforwardly beyond the pale, and which must be guarded against. Even short of an agreement, this is powerful, valuable information for maintaining stability in the face of a rapidly developing technology.

Bonnie Jenkins of the US state department has said ‘we have an obligation to create strong norms of responsible behaviour concerning military uses of AI’ © Denis Balibouse/Reuters

In a little-noticed announcement, the US state department recently made a small but promising foray in this direction. In February in The Hague, Bonnie Jenkins, the department’s under secretary for arms control, put forward 12 non-legally binding norms to govern military uses of AI. They include an exhortation that humans should always control any launch of nuclear weapons, and that the Geneva Conventions should apply. She emphasised that “we have an obligation to create strong norms of responsible behaviour concerning military uses of AI”.

This is a great, if limited, start. Urgent steps should be taken to double down on these efforts. As Henry Kissinger, former Google chief executive Eric Schmidt and others have warned, China, the US and Europe are all in danger of sleepwalking into conflict in this era of new technology that we don’t really understand and can’t control. This is not just hyperbolic scaremongering: in the Ukraine conflict, drones will probably soon be used to select and attack targets without human intervention. As ChatGPT and similar technologies develop, they will soon be capable of writing computer viruses more potent and damaging than anything we saw with WannaCry and NotPetya.

It is true, as sceptics may argue, that even 75 years of painstakingly negotiated arms control agreements have not banished nuclear weapons from the world. Those negotiations have, however, succeeded in the most important measure of all: since the horror of Hiroshima and Nagasaki, nuclear weapons have not once been used in war.
