The University of Canterbury is currently running a new campaign and we had the privilege of showcasing our work. The photo was taken at a professional studio and shows a typical human-robot interaction study setup.
We are already systematically using tools to kill each other, and even autonomous machines are a tried and tested method of killing humans, both soldiers and civilians. Land mines are perhaps the best example of such autonomous killing machines, although they are of course rather simple. But even the deterministic simplicity of killing machines can be their greatest asset. The Soviets maintained an autonomous system during the Cold War called the Dead Hand that, once activated, would automatically launch intercontinental missiles if a nuclear attack on the Soviet Union was detected. Today's weapon systems have become more complex, such as the Phalanx gun system, which already caused the death of a soldier in 1989. Despite the increase in complexity, the fundamental questions remain the same. Who will take responsibility for a non-deterministic weapon system? And how do we define autonomy?
Interestingly, it is a requirement of a just war that the participating agents be responsible for their actions. For an autonomous killing machine to fight a just war, we would need to give it the legal status of a person, so that it can take responsibility for its actions. But the concept of a just war has fallen out of fashion. We no longer declare or end wars; we directly invade or attack from the air. Maybe the only glimmer of hope is that autonomous weapon systems, left to their own devices, will quickly run out of battery or fuel. Land mines, however, remain a deadly threat long after the original conflict has ended.
This legal requirement of responsibility becomes even more pressing in a civilian context. We would need to make our autonomously driving cars legal persons so that they can be held responsible for the deaths they will cause. I am convinced that in the near future we will have to deal with more fatalities from autonomous cars than from autonomous war machines. The first documented death by robot occurred as early as 1979, when a factory worker was hit by a robotic arm.
It is also important to make a clear distinction between science and fiction. Many of the recent articles on "killer robots" used imagery from the "Terminator" movies to illustrate human-like killing machines. Using fiction to talk about real-world problems is misleading at best. The autonomous weapon systems we will be dealing with in the near future come from the air, not the ground. The media is exploiting the Frankenstein complex to stir fears of androids. This may seriously harm the research and development of androids.
The famous writer Isaac Asimov wrote: "Violence is the last refuge of the incompetent." By that definition, autonomous killing machines are hopelessly incompetent, since all they know is violence. Let's remain competent in our decisions.
Update: The media has become overwhelming. Have a look at the press section to get an impression of the coverage.
The media response to our LEGO Minifigure study has been overwhelming. Not only did the New Zealand Herald report on it, but also Die Welt and the Tagesschau. There has also been an interview on Australia's ABC Radio:
I also gave a TV interview today for TV3 News. You can watch it below:
Today the LEGO Mindstorms EV3 arrived. Now I need to figure out what to build with it. Any suggestions? I am very much looking forward to being able to use four motors, the SD card slot, and iOS compatibility.
It is sad that just as I have come to appreciate LEGO Digital Designer (LDD), it is already out of date. LEGO still allows you to use it, but the new L motors are not in the database, and it is unclear if they will ever be added. In any case, I built my RACE model in it, and you can download the LXF file.
We had our first official kick-off meeting for the Wordovators project. The Wordovators project is a collaboration between Northwestern University, the New Zealand Institute of Language, Brain and Behaviour (NZILBB), and the Human Interface Technology Laboratory New Zealand (HIT Lab NZ).