Lawmakers in the US have proposed a new bill, the Block Nuclear Launch by Autonomous AI Act, which would ban the use of federal funds to launch a nuclear weapon using an autonomous weapons system not subject to meaningful human control. The move comes amid growing fears about artificial intelligence, which could cause catastrophic harm if not properly controlled.
Although US Department of Defense policy already prohibits AI from autonomously launching nuclear weapons, the proposed bill would codify the existing Pentagon rules in federal law, ensuring that no autonomous system without meaningful human oversight can launch a nuclear weapon or select or engage targets for such a launch.
The bill’s sponsors, Senator Edward Markey (D-MA) and Representatives Ted Lieu (D-CA), Don Beyer (D-VA), and Ken Buck (R-CO), noted that a 2021 National Security Commission on AI report recommended affirming a ban on autonomous nuclear weapons launches, not only to prevent such launches within the US government but also to encourage similar commitments from China and Russia.
The sponsors hope that publicizing the bill will draw attention to the potential dangers of current-generation autonomous AI systems, a growing concern in both Congress and the tech world. The proposal also highlights the sponsors’ other nuclear non-proliferation efforts, such as a recent bill restricting the president’s power to unilaterally declare nuclear war.
Although the ban on autonomous nuclear weapons launches is already Pentagon policy, the bill’s sponsors hope that enshrining it in federal law will add an extra layer of protection. The measure would also signal to the rest of the world that the United States is committed to the safe use of AI and will not accept its use in nuclear launch decisions.