Even as healthcare adopts artificial intelligence to save lives, an open letter signed by 116 founders of robotics and AI companies from 26 countries has urged the United Nations to urgently ban “killer robots” developed for waging war.
Coinciding with the opening of the world’s largest AI conference in Melbourne today, it is the first time founders of AI and robotics companies, including Tesla and SpaceX founder Elon Musk, have jointly called for an end to the robot arms race.
The open letter, released at the International Joint Conference on Artificial Intelligence, warns of the danger to humankind from the development of autonomous weapons of war.
“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter states.
“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”
The letter calls on the UN “to find a way to protect us all from these dangers.”
Other company founders to have joined as signatories to the letter include Mustafa Suleyman of Google’s DeepMind, Esben Østergaard of Universal Robots and Element AI’s Yoshua Bengio.
In 2015, a letter released at the IJCAI conference in Buenos Aires was signed by thousands of researchers in AI and robotics and endorsed by British physicist Stephen Hawking, Apple Co-founder Steve Wozniak and cognitive scientist Noam Chomsky, and contributed to UN recognition of the issue.
The first formal discussions on autonomous weapons by the UN’s Group of Governmental Experts, established under the Convention on Certain Conventional Weapons, were due to begin today but have been delayed until November.
Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales and an organiser of both letters, said he hoped the warning from the AI and robotics industry would lead the UN to act before time runs out.
“Nearly every technology can be used for good and bad, and artificial intelligence is no different. It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialise war,” Walsh said.
In healthcare, AI’s transformative impact is gaining a foothold in surgery, cancer diagnosis, radiology, personalised medicine and drug development, with predictions that future robots may be able to take on some of the responsibilities of GPs.
A June report from Accenture Consulting, Artificial Intelligence: Healthcare’s New Nervous System, projected the market for healthcare-related AI would grow from around $600 million in 2014 to $6.6 billion in 2021.
But while progress in AI is revolutionising patient care, Walsh said the UN needs to ban lethal autonomous weapons systems under its Convention on Certain Conventional Weapons (CCW), as it has already done with intentionally blinding laser weapons, and as the international community has done with chemical weapons under a separate treaty.
“We need to make decisions today choosing which of these futures we want. I strongly support the call by many humanitarian and other organisations for a UN ban on such weapons, similar to bans on chemical and other weapons,” he said.