Dozens of scientists and health care professionals have written a letter to the U.N. calling for an international ban on autonomous weapons, saying recent advances in artificial intelligence “have brought us to the brink of a new arms race in lethal autonomous weapons.”
The letter, which is signed by more than 70 health care professionals and was compiled by the Future of Life Institute, argues that lethal autonomous weapons could fall into the hands of terrorists and despots, lowering the threshold for armed conflict and becoming “weapons of mass destruction enabling a very few to kill very many.”
“Moreover, autonomous weapons are morally abhorrent, as we must never cede the decision to take a human life to algorithms,” the letter continues. “As health care professionals, we believe that breakthroughs in science have enormous potential to benefit society and should not be used for the automation of harm. We therefore call for an international ban on lethal autonomous weapons.”
USING ‘KILLER ROBOTS’ IN WAR WOULD BE A VIOLATION OF INTERNATIONAL LAW, LAWYERS SAY
In addition to the letter, an accompanying analysis by Dr. Emilia Javorsky notes that a number of countries are working on lethal autonomous weapons systems, which “would represent a third revolution in warfare,” following gunpowder and nuclear weapons.
The Future of Life Institute’s effort follows a 2018 pledge signed by more than 2,400 individuals from companies and organizations around the world, including Google DeepMind, the European Association for AI and University College London, who said they would “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.”
Others have taken their concerns about the benefits and costs of killer robots to the U.N. Experts from several countries met in August 2018 at the U.N.’s Geneva offices to focus on lethal autonomous weapons systems and explore possible ways of regulating them, among other issues.
Fully autonomous, computer-controlled weapons do not yet exist, U.N. officials said at the time. The debate is still in its infancy, and the experts have at times struggled with definitions. The United States has argued that it is premature to settle on a definition of such systems, let alone regulate them.
Some groups say governments and militaries should be prevented from developing such systems, which have sparked fears and led some critics to envision distressing scenarios of their use.
In 2017, Tesla CEO Elon Musk and other leading artificial intelligence experts called on the United Nations to enact a global ban on killer robots, including drones, tanks and machine guns. “Once this Pandora’s box is opened, it will be hard to close,” Musk and 115 other specialists from around the world wrote in the letter.
IS SKYNET A REALITY? AS TRUMP SIGNS EXECUTIVE ORDER ON ARTIFICIAL INTELLIGENCE, TECH GIANTS WARN OF DANGER
‘The greatest risk that we face’
Musk has repeatedly expressed concern about the rise of artificial intelligence, previously saying it could be “the biggest risk we face as a civilization.” The tech exec has even gone so far as to say that it could lead to World War III.
Research firm IDC expects global spending on robots and drones to reach $201.3 billion in 2022, up from an estimated $95.9 billion in 2018.
Over the years, a number of prominent figures, including Musk and legendary theoretical physicist Stephen Hawking, have warned against the rise of artificial intelligence.
In September 2017, Musk tweeted that he thought AI could play a direct role in causing World War III. Musk’s comments came in response to remarks by Russian President Vladimir Putin, who said that whoever “becomes the leader in this sphere [artificial intelligence] will become the ruler of the world.”
In November 2017, prior to his death, Hawking said he believed AI could eventually “destroy” mankind if we are not careful with it.
The AP and Fox News’ Christopher Carbone contributed to this report.