Kaiser Derden
I'm sure the Russians, Chinese and North Koreans would love the UN to ban this ... what's the old saying? Ban guns and only criminals will have guns ...
Nairda
People seem to forget that the future of war with fully utilized autonomous platforms will be precise and vicious. Further, such a conflict will be matched on both sides with their own version of the same. The battle, when waged, will look something akin to two swarms of bees colliding. If one side tries to have humans micromanage decisions, it will simply lose the battle.
This notion of micromanaging the actions of autonomous weapons at this time is a sign of low confidence in the immature code and the limitations of field processing. And I support that, at this time.
As field computing improves to the point where it can recognize the difference between a mother holding her baby and a soldier holding a gun, this whole topic will fall by the wayside and will have to be revisited. If you look at many examples in history, human soldiers have been able to tell one thing from another, but have chosen to ignore the mandate and make ethically questionable decisions.
A machine with clear guidelines could not easily be swayed. Any failure to apply these ethical principles in battle would be the fault of the programmer for not identifying them in the initial equipment acceptance testing/simulation. Once found, the fault could be remedied across all machines, so the same mistake is never repeated.
I support the idea of a kill switch as a last resort, but it will be impossible to keep weapons out of AI's hands, because AI will likely be integrated heavily into civilian security initiatives such as perimeter/checkpoint defense and law enforcement assistance in the foreseeable future.
Deres
Those people are not quite up-to-date on military technology ... "Fire-and-forget" projectiles, a crude form of AI in weapons, have been in use since WWII, when the first acoustic torpedoes were developed. With such an oversimplified vision, artillery should also be banned, because there is usually no one at the target to verify that there are no civilians ...
Gaëtan Mahon
I recently watched a slightly dated History Channel documentary about DARPA. If what they said about its mentality is true, and it's still like that, then we're surely going to see something like this researched and developed by them, for the sole reason of having such a system before anyone else does, so the US is never surprised by unknown technology.
So yeah... Good luck with that proposed ban.
swaan
A soldier by definition is a biological robot. Do not hesitate, follow orders. Shoot first, ask questions later. The enemy is not like us, they are monsters. All the same risks apply as with AI.
Robert Walther
At least Skynet was not intent on converting the world to a mad religion.
Captain Danger
A bit late for that. I think the Israelis have automatic machine guns that fire on the Gaza Strip. I am sure that technology is used elsewhere along long borders, e.g. North/South Korea. Would minefields qualify for this? There has been no hesitation in deploying those.
And as for the reasoning and compassion of humans, we have several millennia of conflict to see how well that works.
joebloeIDAHO
I've been using this AI weapon for years. http://atlanticpondsupply.ca/image/data/scarecrow-sprinklers.jpg
I've taken out many dogs - and their owners. At first I thought "ooo... collateral damage," but then I realized it was the owners who were the real targets anyway...
drgnfly004
I honestly question what the UN is supposed to do about it. What are they going to do if some country develops this? ... impose sanctions? ... If I were the rogue country that developed it, the UN would be the least of my worries.
Yankie007
People who are against this "ban" truly underestimate the danger we will face in the near future if we bring this AI technology into the military! We already have enough WMDs like nuclear arsenals, which are dangerous enough on their own; we're already playing Russian roulette, as many volatile countries have this deadly technology, like North Korea, Pakistan, Russia, China, Iran (coming soon) and many others who will not hesitate to push the button if a conflict starts! Drones in the army are already human killing machines, so adding AI-weaponized robots on top of these deadly technologies is putting the future of mankind in jeopardy!