By analogy with living creatures in nature, two robots will inhabit a constrained environment that forces them to compete for resources. The robots will have certain attributes and behaviours, which will evolve through an adaptive system, leading to the development of different survival strategies. Observers will be able to follow the situation and its dynamics by watching and listening to the emotional expressions of the robots.
This is a multidisciplinary project, involving expertise in the areas of adaptive systems and user-system interaction. We believe this to be a major reason why the project could be considered relevant for the Eindhoven Embedded Systems Institute (EESI). Furthermore, the project will provide EESI with a compelling demonstration of an embedded system.
The LEGO Mindstorms robots offer a flexible and easy-to-use implementation environment. Because the Mindstorms robot has very limited memory and processing power, an external computer takes control: it executes the software and steers the robot over the infrared connection. The robots can serve as a platform for further development and research; with different software they could be applied to entirely new problems.
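The split described above, with decisions on the PC and only low-level execution on the robot, can be sketched as follows. This is a minimal illustration: the opcodes, byte layout, and threshold are assumptions for the sketch, not the real Mindstorms infrared protocol.

```python
MOTOR_FORWARD = 0x01   # hypothetical opcode, not the real IR protocol
MOTOR_STOP = 0x02      # hypothetical opcode

def encode_command(opcode, motor, power):
    """Pack a command into three bytes for the IR link (illustrative)."""
    if not (0 <= power <= 7):
        raise ValueError("power must be 0..7")
    return bytes([opcode, motor, power])

def control_step(sensor_value, send):
    """One decision step on the external computer: drive forward until
    an obstacle is sensed, then stop. `send` transmits bytes over the
    infrared tower."""
    if sensor_value < 50:          # threshold chosen for illustration
        send(encode_command(MOTOR_FORWARD, motor=0, power=5))
    else:
        send(encode_command(MOTOR_STOP, motor=0, power=0))

# Example with a stand-in transmitter that records what would be sent:
sent = []
control_step(sensor_value=10, send=sent.append)
control_step(sensor_value=90, send=sent.append)
```

Keeping the decision logic on the PC side means the robot's limited firmware only ever sees fixed-size command packets, which is what makes the infrared link a workable control channel.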
Emotional computing is a new field of study within user-system interaction. The interaction between two robots is a challenge for synthesizing an emotion architecture and emotional expressions. These expressions create an interface through which humans can understand the robots' internal processes. Acoustic and behavioural communication skills will be used for this purpose.
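One way such an interface could work is to map internal variables to a coarse emotional label and render each label as a distinct tone. The variable names, thresholds, and tone values below are illustrative assumptions, not the project's actual emotion architecture.

```python
def classify_emotion(energy, threat):
    """Derive a coarse emotional label from two hypothetical internal
    variables in [0, 1]: stored energy and perceived threat."""
    if threat > 0.7:
        return "fear"
    if energy < 0.3:
        return "distress"
    return "content"

# Each emotion is rendered as a tone (frequency in Hz, duration in ms)
# so that a listener can follow the robot's internal state acoustically.
EXPRESSION = {
    "fear": (880, 100),      # short, high-pitched beep
    "distress": (220, 600),  # long, low tone
    "content": (440, 200),   # neutral mid-range tone
}

label = classify_emotion(energy=0.9, threat=0.8)  # threat dominates
freq, duration = EXPRESSION[label]
```

The point of the mapping is that observers never see the raw internal variables; the tones are the whole interface.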
One of the major goals of the project is to explore the advantages of learning adaptive systems in a real-world situation where the environment changes constantly and it is impossible to hard-code the most suitable action for every possible occurrence into the system beforehand. Moreover, the evolutionary aspects will enable us to explore issues related to the emergence of complex behaviour.
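The evolutionary idea can be sketched in a few lines: a behaviour is a vector of parameters, a fitness function scores it, and mutation plus selection improve the population over generations. The target vector and fitness function here are stand-ins for illustration, not the project's real survival measure.

```python
import random

random.seed(0)  # reproducible illustration

TARGET = [0.2, 0.8, 0.5]  # hypothetical "ideal" behaviour parameters

def fitness(genome):
    """Higher is better: negative squared distance to the target."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    """Perturb each parameter slightly, clamped to [0, 1]."""
    return [min(1.0, max(0.0, g + random.uniform(-rate, rate)))
            for g in genome]

def evolve(generations=200, pop_size=20):
    """Select the fitter half each generation and refill with mutants."""
    population = [[random.random() for _ in range(3)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(g) for g in survivors]
    return max(population, key=fitness)

best = evolve()
```

No behaviour is hard-coded: the population converges towards whatever the environment (here, the fitness function) happens to reward, which is exactly the property the paragraph above argues for.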
Working prototypes of two robots
Software to control the robots
No verbal or facial expressions
Project start: 01.02.2000
Project end: 30.06.2000