We systematically use tools to kill each other, and even autonomous machines are a tried and tested method for killing humans, both soldiers and civilians. Land mines are perhaps the best-known example of such autonomous killing machines, although they are of course rather simple. But even the deterministic simplicity of a killing machine can be its greatest asset. During the Cold War, the Soviet Union maintained an autonomous system called the Dead Hand that, once activated, would automatically launch intercontinental missiles if a nuclear attack on the Soviet Union was detected. Today’s weapon systems have become more complex, such as the Phalanx gun system, which already caused the death of a sailor in 1989. Despite the increase in complexity, the fundamental questions remain the same. Who will take responsibility for a non-deterministic weapon system? And how do we define autonomy?
Interestingly, it is a requirement for a just war that the participating agents be responsible for their actions. For an autonomous killing machine to fight a just war, we would need to grant it the legal status of a person, so that it can take responsibility for its actions. But the idea of a just war has fallen out of fashion. We no longer declare or end wars; we directly invade or attack from the air. Maybe the only glimmer of hope is that autonomous weapon systems, left to their own devices, will quickly run out of battery or fuel. Land mines, in contrast, remain a deadly threat long after the original conflict has ended.
This legal requirement of responsibility becomes even more pressing in a civilian context. We will need to make our autonomously driving cars legal persons so that they can be held responsible for the deaths they will cause. I am convinced that in the near future we will have to deal with more fatalities from autonomous cars than from autonomous war machines. The first documented death by robot occurred as early as 1979, when a factory worker was hit by a robotic arm.
It is also important to make a clear distinction between science and fiction. Many of the recent articles on “killer robots” used imagery from the “Terminator” movies to illustrate human-like killing machines. Using fiction to talk about real-world problems is misleading at best. The autonomous weapon systems we will be dealing with in the near future come from the air, not the ground. The media is abusing the Frankenstein complex to stir up fears of androids, which may seriously harm the research and development of androids.
The famous writer Isaac Asimov wrote: “Violence is the last refuge of the incompetent.” By that definition, autonomous killing machines are hopelessly incompetent, since all they know is violence. Let’s remain competent in our decisions.
I am interested in anthropomorphism; one of my PhD students is even working on it full time. I came across the “The drunken octopus wants to fight you” meme and found it very inspiring. It is amazing how we can see all sorts of creatures in everyday objects, even the most mundane. I went to my local DIY store, actually found that exact coat hook and took a picture of it. I then turned it into a t-shirt. Do you wanna fight?
I am playing The Simpsons: Tapped Out, and based on the data at the Simpsons Wiki I confirmed a suspicion I had held for some time. The houses in Springfield are set up so that the houses that pay out at a higher frequency also give you the highest hourly income rate (see figure 1). This means that the more often you have to tap on a house, the higher your hourly income. You are being rewarded for wasting your time in this game.
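The pattern is easy to sketch in a few lines of Python. The payout values and house names below are purely hypothetical placeholders (the real figures are on the Simpsons Wiki and in figure 1); they only illustrate how an hourly income rate is computed from payout and payout period, and what it means for that rate to rise with tapping frequency:

```python
# Hypothetical example data -- NOT the real game values.
# payout: income per tap, period_h: hours between payouts.
houses = {
    "slow house":   {"payout": 150, "period_h": 24},  # tap rarely
    "medium house": {"payout": 75,  "period_h": 8},
    "fast house":   {"payout": 22,  "period_h": 1},   # tap often
}

def hourly_rate(house):
    """Income per hour = payout divided by hours between taps."""
    return house["payout"] / house["period_h"]

# List houses from least to most frequent tapping; with numbers like
# these, the hourly rate increases as the tap frequency increases.
for name, house in sorted(houses.items(), key=lambda kv: -kv[1]["period_h"]):
    print(f"{name}: {hourly_rate(house):.2f} per hour")
```

With real data the same comparison would show whether the game indeed pays a premium for frequent tapping.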
Yesterday we took our Nao robots out for a presentation. We drove there by car and wondered whether robots also need to be buckled up. We decided that it probably would be a good idea. Better safe than sorry.
Eduardo Sandoval helped the Computer Science Field Guide by contributing to the introduction video for the artificial intelligence chapter. It got slightly out of hand and the results are just really entertaining. Have a look: