Two years ago, the Future of Life Institute presented an open letter at the 2015 International Joint Conference on Artificial Intelligence (IJCAI) urging the United Nations to ban the development of weaponized artificial intelligence. Now a second open letter has been released, again coinciding with the start of the 2017 IJCAI. This new letter is co-signed by over 100 founders of robotics and AI companies from around the world, and demands the UN stop delaying its talks and take action.
Just a few years ago, the idea of autonomous weaponry resided solely within the realms of science fiction, but the rapidly advancing fields of AI and robotics have turned a frightening fiction into a dawning reality. With global arms manufacturer Kalashnikov recently launching a fully automated range of combat modules and startup Duke Robotics attaching machine guns to drones, the future of robotic and autonomous warfare seems incredibly close.
The original 2015 letter, directed at the UN, was co-signed by over 1,000 scientists and researchers from around the world, including Stephen Hawking, Noam Chomsky and Steve Wozniak. The UN slowly but surely responded, formally convening a group of experts in late 2016 under the banner of the Convention on Conventional Weapons (CCW), with a view towards discussing and implementing a global ban.
The first discussions of this newly formed UN group were set to take place this month, but they were canceled back in May due to "insufficient funding". This bureaucratic bungle, stemming from several nations apparently falling into arrears with promised contributions, also threatens to cancel the second scheduled meeting on lethal autonomous weapons set for November this year.
These delays inspired this second open letter, which concentrated on recruiting support from those on the business and industry side of robotics and AI. One hundred and sixteen founders of major companies from around the world have already co-signed this new letter, including Elon Musk, Mustafa Suleyman (co-founder of Google's DeepMind), and Esben Østergaard (founder of Denmark's Universal Robots).
"Lethal autonomous weapons threaten to become the third revolution in warfare," the letter states. "Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
Despite getting a notable collection of industry luminaries on board, this appeal is looking like it will face an uphill battle over the coming months and years. Advocates of a ban on lethal autonomous weapons want all development in the field to be considered for prohibition, just as is done with biological and chemical weapons, but not all countries are agreeable.
While most UN member countries, including the US and UK, have agreed to forming this panel of experts, any actual proposal for a ban will likely face strong opposition. In 2015 the UK foreign office told The Guardian that the government does not see a need for these new laws. Russia, of course, has not expressed support for this entire process either.
The United States has not staked out a firm position on the matter, and while it supported the convening of this UN group, it is hard to imagine the world's biggest military power willingly supporting a proposal that would stifle its ability to develop complex new weapons systems – especially when Russia has already indicated support for the Kalashnikov AI systems.
Whether such broad collective support across academic, research, and industry fields actually amounts to anything remains to be seen, but this second open letter will hopefully prompt a conversation on AI weapons development that the world urgently needs to have.
Source: University of New South Wales