Some scientists warn that artificial intelligence could bring about a “Terminator-style” war; to others, that sounds like apocalyptic fantasy. Nevertheless, according to publications by defense experts and nuclear scientists, the world may be inching closer to an era in which such a war becomes possible, as control over atomic weapons is yielded to artificial intelligence.
Artificial intelligence experts have been telling The Jerusalem Post for years that the fear made famous by the “Terminator” movies — that AI could make a catastrophic mistake with a nuclear weapon — is no mere fairy tale.
Recently, the Bulletin of the Atomic Scientists, a top group of nuclear scientists, together with other defense experts, published an article concerning artificial intelligence. They said there is a possibility that Russia is already integrating AI into its new nuclear torpedo, the Poseidon, in order to make the weapon autonomous.
According to the Bulletin's report, China and the US have also considered injecting artificial intelligence deeper into their nuclear weapons programs, as both countries seek to overhaul and modernize their nuclear inventories.
There are no explicit reports of Israel integrating artificial intelligence into its nuclear apparatus, which foreign reports estimate at 80-200 nuclear weapons. There are, however, reports of the IDF integrating artificial intelligence into conventional weapons, such as the Spice bomb carried by F-16s.
The report voices concern on this issue: it says that integrating AI into nuclear weapons systems may become culturally inevitable, because there is a strong chance that non-conventional weapons will become increasingly dominated by AI.
Artificial Intelligence and Nuclear War Risk
When the experts and scientists write about the risks of a nuclear holocaust, they do not envision a hostile takeover by AI. Rather, the danger is that the AI could be hacked, or could slip out of control by badly misjudging a situation or through a technical error.
Unmanned vehicles could magnify such risks: they would carry nuclear weapons with no one on board, and with no person responsible for making the final decision to deploy a nuclear weapon.
There is a secondary but still severe risk: artificial intelligence integrated into early-warning systems might overwhelm human decision-makers. Because AI can act faster, those with their fingers on the nuclear trigger may yield to the technology despite any reservations they might have.
Some studies show that automated evidence in general, and AI in particular, can reinforce bubble-style thinking, making it more difficult for analysts to entertain alternative narratives about what might be occurring in high-stress, murky situations.
An instructive example is the 1983 incident in which Stanislav Petrov, a Soviet officer, disregarded automated visual and audible warnings that US nuclear missiles were inbound. Petrov was right, and the technology was wrong; had he trusted the technology, a nuclear war could have followed.
Despite all this, artificial intelligence does offer some valuable benefits in the nuclear weapons arena: it can gather more comprehensive and accurate data, so that decision-makers are not left guessing in the dark.
So, is that advantage worth the risk of a Terminator-style war? Most experts answer no, it is not.