Robert Sparrow wrote an excellent paper about Robots, Rape, and Representation. I was shocked to see exactly what he was talking about, namely non-consensual sex with androids, in one of my favourite movies, Blade Runner. Rick Deckard clearly forces himself on the android Rachel and even demands that she tell him she wants him. The scene ends after some more rather forceful kissing, so we don’t know what happens afterwards, but the way Rick controls Rachel and stops her from leaving his apartment is highly problematic.
UPDATE: It took YouTube and Warner Brothers 30 seconds to file a copyright claim against my video. Amazing.
My talk on Persuasive Robots at the Emotional Machines Conference.
I was invited to give a talk at the Interdisciplinary Conference on Emotional Machines in Stuttgart on September 21st, 2017. My talk focused mainly on the work I did in collaboration with Jürgen Brandstetter (doi: 10.1145/2909824.3020257, doi: 10.1177/0261927X15584682, doi: 10.1109/IROS.2014.6942730). My main argument was that the number of robots in our society will increase dramatically and that robots will participate in the formation of our language. Through their influence on our language they will be able to nudge our valence towards certain terms. Moreover, it will only take 10% of us owning a robot for robots to dominate the development of our language.
This is also the first time I used a 360-degree camera to record a talk. This technology becomes particularly useful when following the discussion between the speaker and the audience. YouTube’s 360 video feature does not work in all web browsers (it does not work in Safari, for example); Chrome and Firefox should be fine.
A holonomic robot uses omni-directional wheels to drive and turn in any direction on the spot. Agilis is an example of an early LEGO holonomic robot; my model is much simpler and more robust. Essential to all holonomic robots is the use of omni-directional wheels, such as the ones from Rotacaster. I am using a compass sensor so that the robot can be remote controlled on an absolute grid using 3Dconnexion’s SpaceNavigator. This 3D input device can be mapped to the unique movements and rotations of a holonomic robot.
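To give an idea of how the compass sensor enables absolute control, here is a minimal kinematics sketch. It assumes a three-wheel layout with the omni wheels mounted 120 degrees apart; the class name, the geometry, and all parameters are my assumptions for illustration, not the robot’s actual code. The compass heading is used to rotate the joystick command from the world frame into the robot frame, so "north" on the SpaceNavigator stays north no matter which way the robot is facing:

```java
// Hypothetical sketch of holonomic (kiwi) drive kinematics, not the actual robot code.
// Maps a world-frame velocity command (vx, vy) plus spin rate omega onto three
// omni wheels mounted 120 degrees apart, using the compass heading for absolute control.
public class KiwiDrive {

    // Mounting angles of the three wheels around the robot, in radians.
    static final double[] WHEEL_ANGLES = { 0.0, 2.0 * Math.PI / 3.0, 4.0 * Math.PI / 3.0 };

    // Returns the three wheel speeds for a world-frame command.
    // heading = robot orientation from the compass sensor, in radians.
    static double[] wheelSpeeds(double vx, double vy, double omega, double heading) {
        // Rotate the command from the world frame into the robot frame.
        double rx =  Math.cos(heading) * vx + Math.sin(heading) * vy;
        double ry = -Math.sin(heading) * vx + Math.cos(heading) * vy;

        double[] speeds = new double[3];
        for (int i = 0; i < 3; i++) {
            double a = WHEEL_ANGLES[i];
            // Each wheel contributes along its drive direction, plus the spin term.
            speeds[i] = -Math.sin(a) * rx + Math.cos(a) * ry + omega;
        }
        return speeds;
    }

    public static void main(String[] args) {
        // Drive straight "north" with no spin while the robot also faces north:
        double[] s = wheelSpeeds(0.0, 1.0, 0.0, 0.0);
        System.out.printf("%.2f %.2f %.2f%n", s[0], s[1], s[2]); // prints 1.00 -0.50 -0.50
    }
}
```

Because the heading rotation happens before the wheel mixing, turning the robot on the spot does not change where a given joystick deflection sends it, which is what makes the absolute-grid remote control feel natural.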
You can play TicTacToe against this LEGO Mindstorms EV3 robot. It uses three motors to drop the balls into the right field. An NXTCam views the board, and the robot then calculates the best move using the MiniMax algorithm: all future moves are explored and rated according to their winning chances. The work is based on the TicTacToe code of Thomas Kaffka. An IR sensor detects your hand when you drop your ball. The robot plays red balls and the human player uses blue balls. The Java code is available over at GitHub, and the building instructions are available for LEGO Digital Designer. I used the MinuteBot baseplate, which is useful for building static Technic/Mindstorms models.
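For readers unfamiliar with MiniMax, here is a minimal sketch of the idea for 3x3 TicTacToe. This is my own illustration, not Thomas Kaffka’s code or the robot’s actual implementation: the board is an int array with 1 for the robot, -1 for the human, 0 for empty, and every empty cell is tried recursively, assuming both players play perfectly:

```java
// Minimal MiniMax sketch for 3x3 TicTacToe (illustration only, not the robot's code).
// Board: int[9], 1 = robot (red balls), -1 = human (blue balls), 0 = empty.
public class TicTacToeMinimax {

    // All eight winning lines: rows, columns, diagonals.
    static final int[][] LINES = {
        {0,1,2},{3,4,5},{6,7,8},{0,3,6},{1,4,7},{2,5,8},{0,4,8},{2,4,6}
    };

    // Returns 1 if the robot has won, -1 if the human has won, 0 otherwise.
    static int winner(int[] b) {
        for (int[] l : LINES) {
            int s = b[l[0]] + b[l[1]] + b[l[2]];
            if (s == 3)  return 1;
            if (s == -3) return -1;
        }
        return 0;
    }

    static boolean full(int[] b) {
        for (int c : b) if (c == 0) return false;
        return true;
    }

    // Rates a position from the robot's point of view; 'player' is whose turn it is.
    static int minimax(int[] b, int player) {
        int w = winner(b);
        if (w != 0) return w;
        if (full(b)) return 0;
        int best = (player == 1) ? -2 : 2;
        for (int i = 0; i < 9; i++) {
            if (b[i] != 0) continue;
            b[i] = player;                       // try the move
            int score = minimax(b, -player);     // rate the resulting position
            b[i] = 0;                            // undo the move
            best = (player == 1) ? Math.max(best, score) : Math.min(best, score);
        }
        return best;
    }

    // Picks the robot's best move (index 0..8), or -1 if the board is full.
    static int bestMove(int[] b) {
        int bestScore = -2, move = -1;
        for (int i = 0; i < 9; i++) {
            if (b[i] != 0) continue;
            b[i] = 1;
            int score = minimax(b, -1);
            b[i] = 0;
            if (score > bestScore) { bestScore = score; move = i; }
        }
        return move;
    }

    public static void main(String[] args) {
        // The human (blue) threatens the top row, so the robot (red) must block at index 2.
        int[] board = { -1, -1, 0,  0, 1, 0,  0, 0, 0 };
        System.out.println(bestMove(board)); // prints 2
    }
}
```

For a game as small as TicTacToe the full tree can be searched every turn without any pruning, which keeps the code short and guarantees the robot never loses.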
LDD does not have all the required parts in its database: you will have to replace part 22961 with 27940, and you will also need to add a worm wheel, 27938. In addition, you should use a lamp to provide consistent lighting. I used a circular LED lamp that can be powered through the USB port of the EV3; I only had to take out the lens in the middle so that the camera fits through the hole, and a rubber band holds the light in place. To calibrate the robot I added a little arm at the end of the base plate against which the robot arm rotates. The position of the camera can be centered on the board using the wrench and by sliding it along the axles.
You can also find information about the robot over at Rebrickable. The inventory there is correct and complete, except for the base plate of course.