
Meerbeek, B., Saerbeck, M., & Bartneck, C. (2009). Towards a Design Method for Expressive Robots. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, San Diego, pp. 277-278.

Towards a Design Method for Expressive Robots

Bernt Meerbeek

Philips Research
High Tech Campus 34 (WB51)
5656 AE Eindhoven
bernt.meerbeek@philips.com

Martin Saerbeck, Christoph Bartneck

Department of Industrial Design
Eindhoven University of Technology
Den Dolech 2, 5600MB Eindhoven, NL
m.saerbeck@tue.nl, christoph@bartneck.de

Abstract - Autonomous robots tend to induce the perception of a personality through their behavior and appearance. It has been suggested that the personality of a robot can be used as a design guideline and as a mental model of the robot. We propose a method to design and evaluate personality and expressions for domestic robots.

Keywords: Design, Experimentation, Human Factors


1. Introduction

Robots will induce the perception of being life-like and having a personality through their appearance and behavior [1]-[3]. As early as 1944, Heider and Simmel [4] demonstrated that people attribute motivations, intentions, and goals to simple inanimate objects based solely on the pattern of their movements. More recently, Reeves and Nass [2] have demonstrated that users are naturally biased to ascribe certain personality traits to machines such as PCs and other types of media. For a designer, it is therefore important to understand how the perception of product personality influences the interaction.

Personality is an extensively studied concept in psychology [5]. The Big-Five personality trait theory is currently supported by the most empirical evidence and is generally accepted in the scientific community [6]. It describes personality along 5 dimensions: extraversion, agreeableness, conscientiousness, neuroticism, and openness to experience. The theory can be used as a framework to describe and design the personality of products, and in particular of robots. Norman [1] describes personality as 'a form of conceptual model, for it channels behavior, beliefs, and intentions into a cohesive, consistent set of behaviors', which indicates that deliberately equipping a robot with a personality helps to provide people with a good model of the robot's behavior. Important questions that arise are: What kind of personality is appropriate for the robot? How can personality be expressed in the behavior of a product? We propose a method to address these questions in the design process.

2. Related Work

We identified 3 main perspectives on designing the expressive behavior of a robotic product: the (1) technological, (2) artistic, and (3) user-centered perspective. When the first robots were constructed, their behavior was determined entirely from a technological, functional point of view. Several architectures for designing the behavior of robotic characters have been proposed [8][9]. Even though the underlying technology is an essential factor for the feasibility of a product, such approaches tend to narrow the design space by technical limitations rather than by user insights. The artistic perspective, on the other hand, focuses on how people perceive the behavior; the underlying idea of conveying messages through expressive behavior is borrowed from the field of movies and animations and applied to robots [10]. Finally, the user-centered perspective draws on design methods from the human-computer interaction field and is characterized by a strong focus on users. Its key principle is an iterative design cycle to evaluate and refine the product, and many user-centered methods have been reported ([11]-[13]).

3. Personality design method

Although several approaches to designing personalities for expressive autonomous products have been proposed, a practical method that integrates a user-centered, artistic, and technical approach to designing personalities is still missing. In this section, we describe the process that we followed to design the personality and expressions of a domestic robot and propose it as a design method for autonomous products in general. The method consists of five main steps, which are described below.

1. Create a personality profile

We use the notion of personality as a central design guideline to create consistent and understandable behavior (a mental model), but what kind of personality should be designed for a robot? We adopted a user-centered approach to create a personality profile. As a starting point, we used the Big-Five dimensions (see Section 1). For each personality dimension, we selected several traits as triggers for potential end-users to talk about the personality of a product. The traits were presented on cards, and participants were asked to explain what each characteristic would mean for the behavior of the robot. Next, they were asked to indicate how desirable the characteristic was, and we recorded why a particular trait was desired or not. Based on the user feedback, a descriptive personality profile of about 300-400 words was created, illustrating the character of the robot. This profile can be used in a similar way as personas [12]. Whereas personas are often used to describe users in the target group and communicate them to a development team, the personality profile describes what ('who') the product is and provides a frame of reference for later stages of the product development.
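As an illustration only, the sketch below shows how desirability ratings collected during such a card session could be aggregated per Big-Five dimension into a rough profile summary. The trait list, rating scale, and data structures are assumptions made for this sketch and do not reflect the instrument or data used in the study.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical card-sorting data: each participant rates how desirable a trait
# would be for the domestic robot on a 1 (undesirable) to 5 (desirable) scale.
# Trait-to-dimension mapping and ratings are illustrative, not the study data.
BIG_FIVE = {
    "talkative": "extraversion",
    "reserved": "extraversion",
    "helpful": "agreeableness",
    "organized": "conscientiousness",
    "anxious": "neuroticism",
    "curious": "openness",
}

ratings = [
    {"trait": "helpful", "rating": 5, "comment": "should feel supportive"},
    {"trait": "talkative", "rating": 2, "comment": "would become annoying"},
    {"trait": "curious", "rating": 4, "comment": "fits an exploring robot"},
]

def summarize(ratings):
    """Group ratings per Big-Five dimension and compute mean desirability."""
    per_dimension = defaultdict(list)
    for r in ratings:
        per_dimension[BIG_FIVE[r["trait"]]].append(r["rating"])
    return {dim: mean(vals) for dim, vals in per_dimension.items()}

if __name__ == "__main__":
    for dimension, score in summarize(ratings).items():
        print(f"{dimension}: mean desirability {score:.1f}")
```

Such a summary would only support the written profile; the qualitative comments recorded per trait remain the main input for the 300-400 word description.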

2. Get inspiration for expressions

To explicitly address the artistic aspect of the design process for a robotic application and to get inspiration for robotic expressions, we organized a workshop with 4 actors from an improvisational theatre group. The workshop was held in a realistic living-room setting in the ExperienceLab facility [14] and recorded with video cameras. First, the actors studied the personality profile to identify with the character. Next, during various 'exercises' that are commonly used in improvisational theatre, the actors acted out the robot's behavior in particular situations, focusing on movements and sounds (but not speech). Video cards [15] were used to group, compare, and analyze the large amount of video material. The clustered video cards, with descriptions of the behaviors and example video clips, were discussed in the project team, and additional ideas for expressions were generated.

3. From actor to robot expressions

The expressions of the actors were translated into expressions for the domestic robot. Since human expressions cannot be mapped one-to-one onto expressions of the robot, we first abstracted the human expressions. For example, an actor looked around and pretended to take pictures of the room to express that he was exploring the environment. For the domestic robot, this was converted into repetitive turns to the left and right ('looking around'), flashing white lights ('camera flash'), and a click sound ('picture taken'). The designed expressions were sketched in a written scenario and an animated storyboard, both of which were used to discuss the expressions within the project team. The final storyboard served as input for the visualization in 3D animations.
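A minimal sketch of this abstraction step is given below: the 'exploring' expression is encoded as a timed sequence of generic robot primitives (turn, light, sound). The primitive names, parameters, and timings are illustrative assumptions, not the commands of the actual platform.

```python
from dataclasses import dataclass
from typing import List

# Illustrative encoding of the abstracted 'exploring the room' expression as a
# timed sequence of robot primitives. Names and timings are assumptions.

@dataclass
class Primitive:
    action: str       # e.g. "turn", "light", "sound"
    params: dict      # action-specific parameters
    duration_s: float # nominal duration of the step

EXPLORING = [
    Primitive("turn",  {"direction": "left",  "angle_deg": 45}, 1.0),
    Primitive("turn",  {"direction": "right", "angle_deg": 90}, 2.0),  # 'looking around'
    Primitive("light", {"color": "white", "pattern": "flash"},  0.2),  # 'camera flash'
    Primitive("sound", {"clip": "click.wav"},                   0.3),  # 'picture taken'
]

def play(expression: List[Primitive]) -> None:
    """Walk through the expression; a real implementation would dispatch each
    primitive to the robot's motion, light, and sound controllers."""
    for step in expression:
        print(f"{step.action}({step.params}) for {step.duration_s:.1f}s")

play(EXPLORING)
```

Keeping the expression at this level of abstraction makes it equally usable as input for a storyboard, a 3D animation, or a hardware behavior script.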

4. Visualize in 3D animation

We used 3D graphical simulations for prototyping and testing scenarios of robotic behavior, because these offer the possibility to gather feedback from users without the effort of building a fully functional hardware prototype. Furthermore, a 3D simulation gives an impression of the dynamic behavior, which is important since the timing of movements and behaviors is a crucial element of the meaning of an expression [16]. Several software packages are currently used to simulate robotic behavior, each with its own advantages and disadvantages [17]. We used the Open Platform for Personal Robotics (OPPR) framework as described in [18] to develop visual impressions of the robotic behavior in a realistic setting. One particular strength of OPPR is that it uses physical simulation for rendering animated behavior, so the simulated behavior closely resembles that on the hardware platform. Hence, behaviors can be developed and tested virtually with users and reused on a real robot at a later stage.
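The reuse property can be illustrated with the following sketch, in which one behavior script is written against an abstract backend so that it can first drive a simulation and later the physical robot. This is not the OPPR API; all class and method names are assumptions made for illustration.

```python
from abc import ABC, abstractmethod

# Sketch of the simulate-then-reuse idea: the behavior script targets an
# abstract backend, so the same script can run in a 3D simulation during user
# evaluation and on the hardware afterwards. Hypothetical interface, not OPPR.

class RobotBackend(ABC):
    @abstractmethod
    def turn(self, angle_deg: float) -> None: ...
    @abstractmethod
    def flash_light(self, color: str) -> None: ...

class SimulatedBackend(RobotBackend):
    def turn(self, angle_deg: float) -> None:
        print(f"[sim] animating turn of {angle_deg} degrees")
    def flash_light(self, color: str) -> None:
        print(f"[sim] rendering {color} light flash")

class HardwareBackend(RobotBackend):
    def turn(self, angle_deg: float) -> None:
        print(f"[hw] sending turn command: {angle_deg} degrees")
    def flash_light(self, color: str) -> None:
        print(f"[hw] driving LEDs: {color} flash")

def exploring_behavior(robot: RobotBackend) -> None:
    """The same scripted expression runs unchanged on either backend."""
    robot.turn(-45)
    robot.turn(90)
    robot.flash_light("white")

exploring_behavior(SimulatedBackend())   # during user evaluation
exploring_behavior(HardwareBackend())    # later, on the physical robot
```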

5. Evaluate with think-out-loud

The 3D animations were used in a think-out-loud evaluation. The objective of the evaluation was to find out how people perceive the expressions of the robot: How do people interpret the designed behaviors? Why is some behavior preferred over other behavior? Twelve people participated individually in a one-hour session in which the 3D animations of the expressive robot behavior were shown. The think-out-loud evaluation and verbal protocol analysis resulted in valuable qualitative user feedback on the designed expressions. The results clearly indicated which expressions were easy to interpret and which were appreciated by the participants. With this evaluation, we finished the first iteration of our design cycle: we have established a clear view of the desired product personality and gathered user feedback on the designed behavior. The results will be used as input for the next iteration.
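One possible way to tabulate coded think-out-loud data is sketched below: for each designed expression, the fraction of participants whose interpretation matched the design intent is computed. The expression labels and counts are hypothetical; the study itself reports qualitative findings.

```python
from collections import Counter

# Hypothetical coded protocol data: (expression shown, did the participant's
# interpretation match the intended meaning?). Values are illustrative only.
coded_observations = [
    ("exploring", True), ("exploring", True), ("exploring", False),
    ("greeting",  True), ("greeting",  False), ("greeting",  False),
]

def interpretation_rate(observations):
    """Fraction of participants whose interpretation matched the design intent."""
    totals, correct = Counter(), Counter()
    for expression, matched in observations:
        totals[expression] += 1
        correct[expression] += int(matched)
    return {e: correct[e] / totals[e] for e in totals}

for expression, rate in interpretation_rate(coded_observations).items():
    print(f"{expression}: {rate:.0%} interpreted as intended")
```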

4. Conclusion

We have described the process used to design the behavior of a domestic robot and proposed it as a method for designing a personality and appropriate expressions for autonomous products. The iterative design process integrates technical, artistic, and user-centered perspectives on designing a personality for domestic robots and consists of five main steps. The proposed method combines proven methods from HCI and translates them to the field of HRI. We will continue to apply the proposed method in various product development activities and hope to improve it further.

5. References

  1. Norman, D. A. (2001). How humans might interact with robots. Retrieved from http://www.jnd.org/dn.mss/how_might_human.html
  2. Reeves, B., & Nass, C. (1996). The Media Equation. Cambridge: CSLI Publications / Cambridge University Press.
  3. Meerbeek, B., Hoonhout, J., Bingley, P., & Terken, J. M. B. (2008). The influence of robot personality on perceived and preferred level of user control. Interaction Studies - Social Behaviour and Communication in Biological and Artificial Systems, 9(2), 204-229. | DOI: 10.1075/is.9.2.04mee
  4. Heider, F., & Simmel, M. (1944). An experimental study of apparent behavior. American Journal of Psychology, 57, 243-249.
  5. Carver, C. S., & Scheier, M. F. (2003). Perspectives on Personality (5th ed.). Boston: Allyn & Bacon.
  6. McAdams, D. P., & Pals, J. L. (2006). A new Big Five: Fundamental principles for an integrative science of personality. American Psychologist, 61(3), 204-217. | DOI: 10.1037/0003-066X.61.3.204
  7. Dryer, D. C. (1999). Getting Personal With Computers: How To Design Personalities For Agents. Applied Artificial Intelligence, 13(3), 273 - 295. | DOI: 10.1080/088395199117423
  8. Duffy, B. R., Dragone, M., & O'Hare, G. M. P. (2005). Social Robot Architecture: A Framework for Explicit Social Interaction. Proceedings of the Android Science: Towards Social Mechanisms, CogSci 2005, Stresa, pp 18-28.
  9. Snibbe, S., Scheeff, M., & Rahardja, K. (1999). A layered architecture for lifelike robotic motion. Proceedings of the 9th International Conference on Advanced Robotics, Tokyo.
  10. Breemen, A. (2004). Bringing Robots To Life: Applying Principles Of Animation To Robots. Proceedings of the CHI2004 Workshop on Shaping Human-Robot Interaction - Understanding the Social Aspects of Intelligent Robotic Products, Vienna.
  11. Benyon, D., Turner, P., & Turner, S. (2005). Designing interactive systems : people, activities, contexts, technologies. Harlow, England ; New York: Addison-Wesley.
  12. Pruitt, J., & Grudin, J. (2003). Personas: practice and theory. Proceedings of the 2003 Conference on Designing for User Experiences, San Francisco, California, pp. 1-15. | DOI: 10.1145/997078.997089
  13. Osada, J., Ohnaka, S., & Sato, M. (2006). The scenario and design process of childcare robot, PaPeRo. Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, Hollywood, California, Article No. 80. | DOI: 10.1145/1178823.1178917
  14. Aarts, E., & Diederiks, E. (2006). Ambient Lifestyle: From Concept to Experience. Amsterdam: BIS Publishers.
  15. Buur, J., & Soendergaard, A. (2000). Video card game: an augmented environment for user centred design discussions. Proceedings of DARE 2000 on Designing Augmented Reality Environments, Elsinore, Denmark, pp. 63-69. | DOI: 10.1145/354666.354673
  16. Hendrix, J., Ruttkay, Z., Hegen, P. t., Noot, H., Lelievre, A., & Ruiteer, B. d. (2000). A facial repertoire for avatars. Proceedings of the Workshop on Interacting Agents, Enschede, pp 27-46.
  17. Saerbeck, M., & Breemen, A. (2007). Design guidelines and tools for creating believable motion for personal robots. Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2007), Jeju, Korea, pp. 386-391. | DOI: 10.1109/ROMAN.2007.4415114
  18. Breemen, A. (2005). Scripting technology and dynamic script generation for personal robot platforms. Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), pp. 3487-3492. | DOI: 10.1109/IROS.2005.1545522

© ACM, 2009. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in "Meerbeek, B., Saerbeck, M., & Bartneck, C. (2009). Towards a Design Method for Expressive Robots. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, San Diego, pp. 277-278." http://doi.acm.org/10.1145/1514095.1514174