Telling international arms traders they can't make killer robots is like telling soft-drinks makers that they can't make orangeade, says science writer Philip Ball
One response to the call by experts in robotics and artificial intelligence for a ban on killer robots (lethal autonomous weapons systems, or Laws in the language of international treaties) is to say: shouldn't you have thought about that sooner?
Figures such as Tesla's CEO, Elon Musk, are among the 116 specialists calling for the ban. "We do not have long to act," they say. "Once this Pandora's box is opened, it will be hard to close." But such systems are arguably already here, such as the unmanned combat air vehicle Taranis developed by BAE and others, or the autonomous SGR-A1 sentry gun made by Samsung and deployed along the South Korean border. Autonomous tanks are in the works, while human control of lethal drones is becoming just a matter of degree.
Yet killer robots have been with us in spirit for as long as robots themselves. Karel Čapek's 1920 play RUR (Rossum's Universal Robots) gave us the word (meaning "labourer" in Czech). His humanoid robots, made by the eponymous company for industrial work, rebel and slaughter the human race. They've been doing it ever since, from Cybermen to the Terminator. Robot narratives rarely end well.
It's hard even to think about the issues raised by Musk and his co-signatories without a robot apocalypse looming in the background. Even if the end of humanity isn't at stake, we just know that one of these machines is going to malfunction, with the messy consequences of Omni Consumer Products' police droid in RoboCop.
Such allusions could seem to make light of a deadly serious subject. OK, so a robot Armageddon might not be exactly frivolous, but these stories, for all that they draw on deep-seated human fears, are ultimately entertainment. It's all too easy, though, for a debate like this to settle into the polarisation of good and bad technologies that science-fiction movies can encourage, with the attendant implication that, so long as we avoid the really bad ones, all will be well.