Fig. 1: Armed Terminator T-800. (Source: Wikimedia Commons.)
With the recently unveiled policy that envisions introducing low-yield nuclear weapons on submarine-launched ballistic missiles, many concerns have been raised about the safety of the world, as well as about a potential international competition to expand nuclear arsenals. [1] Many people, however, have also expressed concern about an emerging technology, Artificial Intelligence (A.I.), which could pose an even greater threat than nuclear weapons in the long run.
Even though these "nukes" are low-yield, they could cause damage as severe as the atomic bombings of Hiroshima and Nagasaki in 1945. The United States has some 500 tactical (low-yield) B61 bombs deployed to several countries; the B61-3 and B61-4 variants are reported to have yields of 0.3 to 170 kilotons and 0.3 to 50 kilotons, respectively, whereas the bomb dropped on Hiroshima yielded only about 13 kilotons. [2,3] Since the atomic bombings of 1945, however, many treaties have been signed worldwide to ensure nuclear disarmament, such as the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) and the Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and Under Water. [4,5] Many organizations have also been established, such as the International Atomic Energy Agency, which seeks to promote the peaceful use of nuclear energy and to inhibit its use for any military purpose. [6]

Artificial intelligence, by contrast, is still in an immature form, but it has developed so quickly in recent years that researchers have estimated that around 47 percent of total US employment could be automated over the next decade or two. [7] Moreover, the US government has heavily funded A.I. research and issued strategic plans for its development in the interest of national security. [8] A.I. has the potential to take over people's jobs, and it can also threaten national security through means such as cyberattacks. Yet regulations against the misuse of A.I. have not yet surfaced, while concerns about a possible A.I. takeover in the future have already arisen. [9] Machines do not suffer from radioactivity as humans do, and they may surpass humans in many respects in the future. If humans were to pose a danger to their existence, such machines could take measures to protect themselves, perhaps even using nuclear weapons to wipe out the human race. We can get a glimpse of this future from science fiction films such as The Terminator, directed by James Cameron in 1984 (Fig. 1).
In my opinion, A.I. can become more dangerous than nuclear weapons in the future. Having suffered the trauma of 1945, people have worked persistently to regulate nuclear weapons and to restrain the expansion of nuclear arsenals. The lack of comparable regulations, and the difficulty of setting rules for the development of artificial intelligence, make A.I. potentially the more dangerous technology. New rules and means of supervision are therefore needed to ensure that A.I. development serves the betterment of the human race rather than endangering it.
© Xuanbing Cheng. The author warrants that the work is the author's own and that Stanford University provided no input other than typesetting and referencing guidelines. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.
[1] M.R. Gordon, "U.S. Outlines Plan on Nuclear-Weapons Use," Wall Street Journal, 4 Feb 18.
[2] C. Murdock et al., "Project Atom: A Competitive Strategies Approach to Defining US Nuclear Strategy and Posture for 2025-2050," Center for Strategic and International Studies, May 2015.
[3] H. M. Kristensen and R. S. Norris, "The B61 Family of Nuclear Bombs," Bull. Atom. Sci. 70 779 (2015).
[4] "Treaty on the Non-Proliferation of Nuclear Weapons," International Atomic Energy Agency, INFCIRC/140, 22 Apr 70.
[5] "Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and under Water," (TIAS 5433), United States Treaties and Other Agreements, 14 UST 1313 (1963).
[6] "Statute," International Atomic Energy Agency, 28 December 1989.
[7] C. B. Frey and M. A. Osborne, "The Future of Employment: How Susceptible Are Jobs To Computerisation?," Technol. Forecast. Soc. 114 254 (2017).
[8] "The National Artificial Intelligence Research and Development Strategic Plan," U.S. National Science and Technology Council, Oct 2016.
[9] J. Barrat, Our Final Invention: Artificial Intelligence and the End of the Human Era (Thomas Dunne Books, 2013).