DOI: 10.1007/978-3-540-74796-3_49

Mubin, O., Mahmud, A. A., & Bartneck, C. (2007). TEMo-chine: Tangible Emotion Machine. In C. Baranauskas, P. Palanque, J. Abascal & S. D. J. Barbosa (Eds.), Human-Computer Interaction – INTERACT 2007, LNCS 4662 (pp. 511-514). Berlin: Springer.

TEMo-chine: Tangible Emotion Machine

Omar Mubin, Abdullah Al Mahmud

User-System Interaction Program
Eindhoven University of Technology
Den Dolech 2, 5600MB Eindhoven, The Netherlands
o.mubin@tm.tue.nl, a.al-mahmud@tm.tue.nl

Christoph Bartneck

Department of Industrial Design
Eindhoven University of Technology
Den Dolech 2, 5600MB Eindhoven, The Netherlands
christoph@bartneck.de

 

Abstract - We examine whether it is possible to determine, recognize and/or report the emotional state of a group of people through touch and/or body motion. We present the initial design of a mechanism for an asynchronous yet anonymous means of communication, in which the basic framework is set up by defining interaction with the system and aggregating the individual interaction components. We present the results of our initial user evaluation, which followed a scenario-based methodology. The results indicate that users tend to exhibit similar emotional expressions and interaction modalities, which could be used to determine general emotional states.

Keywords: emotion, interaction, scenario


Introduction

In a public environment it would be interesting to gauge the collective emotional state of the people present, for example, whether the majority of the community is happy or satisfied. We attempt to motivate the design of an interface that allows the general public to express themselves freely in a tangible manner.

We describe the design and evaluation of an affective and interactive machine, the TEMo-chine (Tangible Emotion machine). Our design concentrates on a unidirectional interaction mechanism that captures the physical actions of users and maps them to emotions. Other design studies [2] have worked on a more bidirectional approach (i.e., the system also responds based on the input) using Artificial Intelligence and various other paradigms. There have been numerous works in the area of Affective Computing (computing that relates to, arises from and influences emotions [3]) that have employed physiological measures such as blood pressure and skin conductance to measure the emotions of users [2]. However accurate such measures may be, they do not allow users to explicitly express their emotions to the system [5, 6]. There has also been work in the area of gesture recognition to capture affect. We try to motivate the design of a more tangible yet physical interaction methodology.

There has been recent work along the lines of attaining affective feedback by using the body of the user or bodily gestures rather than verbal self-report measures [1, 4]. Our goal was to ascertain whether we could generalize the emotional state of a collective group of people via affective feedback. We chose to investigate whether physical actions could be used to interpret the basic emotional state and the direction (e.g., the targeted object or avatar) of an emotion.

Scenario Based Methodology

For the purposes of our evaluation and design, we employed a scenario-based methodology. The idea behind TEMo-chine was to establish a design concept that could aggregate individualistic yet not personal interactions of people. We determined that one of the useful applications of such a design would be in a social context with a high degree of variation in emotional state. We decided to evaluate our design idea within the workspace of a company office. An anonymous form of interaction would be desirable to avoid any ethical and/or privacy concerns. Choosing the scenario of an office was a relatively simple decision: considerable emotional friction prevails in office settings, and there could hardly be a more competitive and stressful environment. The TEMo-chine was based on the paradigm that users would express their emotions with a physical action targeted towards an avatar [6].

Prototype and User Test. To physically represent our design idea we developed a simple non-working physical mock-up (Fig. 1). Our TEMo-chine design consisted of two tangible iconic representations, or avatars, related to employees in the context of a company office. These intended targets (avatars) of emotion were defined as i) Environment: Peers, Workspace and ii) Authority: Manager, Boss, Lawyers, Partners. The avatars were labeled accordingly (see Fig. 1). In a working environment, we felt these were the likely sources of emotional distress or emotional appeasement. We conducted the test with 15 participants (8 male, 7 female). Each user chosen for the prototype study had professional working experience in an office environment and was currently employed.

Fig. 1. The TEMo-chine prototype

Test Setup and Measurement. We predefined emotional states into two primary and basic categories: 'Happy' and 'Angry'. Our reasoning was that most levels of emotional interaction could easily be placed into these broad categories, that this would make the experiment methodologically less convoluted, and that two levels of emotional categorization would suffice for an initial design. However, it is worth mentioning that future research should investigate interaction modalities for other, more subtle types of emotions. To further complement the context, participants were presented with 6 scenarios in random order, 3 for each emotion type. For example, one scenario was: an employee realizes that a peer is promoting him/herself at his/her expense in front of the boss.

Participants were instructed to perform one interaction of their own choice upon a preset avatar, based on their prevailing emotion, once a scenario was read out. Users were observed and video recorded during their interactions with the prototype. The video coding scheme defined a set of plausible, most likely yet discernible actions for each emotion. For emotion type 'Happy' the following four possibilities were identified: A) a smooth and soft physical hand gesture, such as a pet, a caress or a rub; B) a hug; C) a kiss; and D) neutral (no physical gesture carried out) or an indiscernible/unrecognizable/invalid action. For emotion type 'Angry' there were three options: A) a rounded fist gesture, such as a hit, bang or punch; B) an open-handed (one or both hands) gesture, such as a slap or flick; and C) neutral (no physical gesture carried out). Therefore, for each participant we had a frequency count of which actions were performed for each emotion type.
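To illustrate how such coded observations could be tallied into per-emotion frequency counts, the following is a minimal sketch in Python. The dictionary layout, the observation format and the function name are our own illustrative assumptions and are not part of the original study.

```python
from collections import Counter

# Hypothetical encoding of the video coding scheme described above.
CODING_SCHEME = {
    "Happy": {"A": "soft hand gesture (pet/caress/rub)",
              "B": "hug",
              "C": "kiss",
              "D": "neutral or indiscernible"},
    "Angry": {"A": "rounded fist gesture (hit/bang/punch)",
              "B": "open-handed gesture (slap/flick)",
              "C": "neutral"},
}

def tally_gestures(observations):
    """Count how often each coded gesture occurred per emotion type.

    `observations` is a list of (participant, emotion, gesture_code)
    tuples as produced by the video coders (hypothetical format).
    """
    counts = {emotion: Counter() for emotion in CODING_SCHEME}
    for participant, emotion, code in observations:
        if code not in CODING_SCHEME[emotion]:
            raise ValueError(f"unknown code {code!r} for emotion {emotion!r}")
        counts[emotion][code] += 1
    return counts

# Example with made-up observations for two participants:
demo = [("P1", "Happy", "A"), ("P1", "Angry", "B"),
        ("P2", "Happy", "A"), ("P2", "Angry", "C")]
print(tally_gestures(demo))
```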

Results and Discussion

After conducting the user study, we analyzed the results and obtained a frequency distribution of which user actions were typically performed for a particular hypothetical emotional situation. The resulting coded data was analyzed as a within-subjects design. Pairwise comparisons of means were carried out between each interaction gesture and the other gestures within each emotion. For emotion type 'Happy', gesture A was significantly the most often adopted interaction gesture (p = 0.019, p = 0.003, p < 0.001). For emotion type 'Angry', gesture B was significantly the most often adopted interaction gesture (p = 0.017, p < 0.001). No significant differences were found between the other physical actions. Moreover, gender did not have an influence on the interaction modalities.
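The paper does not name the exact statistical test used for these pairwise comparisons; the sketch below assumes paired t-tests on per-participant gesture frequencies, which is one plausible way to carry out within-subjects comparisons of means. The data format and function name are hypothetical, and the example counts are made up for illustration only.

```python
from itertools import combinations
from scipy.stats import ttest_rel

def pairwise_gesture_tests(per_participant_counts, gestures):
    """Pairwise comparisons of gesture frequencies within one emotion.

    `per_participant_counts` maps participant -> {gesture: frequency}
    for a single emotion type (hypothetical format). A paired t-test
    is assumed here; pairs with no variation across participants
    yield nan p-values.
    """
    results = {}
    for g1, g2 in combinations(gestures, 2):
        x = [c.get(g1, 0) for c in per_participant_counts.values()]
        y = [c.get(g2, 0) for c in per_participant_counts.values()]
        stat, p = ttest_rel(x, y)
        results[(g1, g2)] = p
    return results

# Example with made-up counts for three participants ('Happy' gestures):
happy = {"P1": {"A": 3, "B": 0, "C": 0, "D": 0},
         "P2": {"A": 2, "B": 1, "C": 0, "D": 0},
         "P3": {"A": 3, "B": 0, "C": 0, "D": 0}}
print(pairwise_gesture_tests(happy, ["A", "B", "C", "D"]))
```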

As far as the 'Happy' emotion was concerned, the mock-up of the avatars might have been one reason why participants were not ready to adopt very personal interactions such as a hug or a kiss. Many participants were also observed to express some reluctance at first, before actually deciding which action to carry out. On the other hand, with regards to the 'Angry' emotion, participants might have been hesitant to employ a rounded fist gesture on the avatars as they seemed more animate. Generally, we could expect that humans would be less concerned about aggressive behavior towards objects and artifacts than towards agents that are social and/or represent some lifelikeness.

We did experience and observe some interesting interaction modalities. In one particular situation a participant exhibited a dual form of interaction, as gesture A from the 'Happy' category was subsequently followed by gesture A from the 'Angry' category. In this particular instance, the interaction was categorized under the 'Angry' gesture C category. A dual input in one interaction on the same avatar would likely be unrecognizable for a real machine, since even when the input is limited to touch, humans have many intricate forms of tangible interaction that machines are unlikely to completely understand for now.

Our design has various limitations. Firstly, we feel a cultural bias might hinder accurate generalizations across large samples of the population. Moreover, the user study was carried out in an artificial setting using a scenario-based methodology; hence, the results might not generalize to all situations. Humans' physical actions might also evolve over time. The ideal extension of this research would be to carry out an extensive experiment in a real context by placing a working prototype in the environment. Based on our design, in order to identify affect the system needs to recognize the physical action (via sensory information) and match it to an appropriate emotion.
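As a purely conceptual illustration of this last step, the sketch below shows how sensed touch features might be mapped to a coded gesture and, in turn, to an emotion. The sensor features, thresholds and function name are hypothetical assumptions, not part of any implemented prototype.

```python
# Conceptual sketch (our own assumption, not an implemented system) of how a
# working TEMo-chine might map touch-sensor readings to a coded gesture and
# its associated emotion, following the coding scheme used in the study.

def classify_gesture(peak_force, contact_area, duration):
    """Rough rule-based classifier over hypothetical touch-sensor features.

    Thresholds are illustrative placeholders, not calibrated values.
    Returns an (emotion, gesture_code) pair or (None, None) if indiscernible.
    """
    if peak_force > 8.0 and contact_area < 0.2:
        return ("Angry", "A")   # hard, concentrated contact: hit/bang/punch
    if peak_force > 4.0:
        return ("Angry", "B")   # open-handed slap or flick
    if duration > 1.0 and peak_force < 2.0:
        return ("Happy", "A")   # soft, sustained contact: pet/caress/rub
    return (None, None)         # indiscernible or neutral

emotion, gesture = classify_gesture(peak_force=1.2, contact_area=0.5, duration=2.3)
print(emotion, gesture)  # -> Happy A
```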

We have presented an initial design idea for a system that can effectively sum up the individual emotional interaction vectors of users into an aggregated output. This is not done by explicitly recognizing emotions but rather by interpreting physical actions and matching them to emotions. This relationship still needs to be fully quantified, although we feel that our initial results could be used for emotion recognition/emotion self-report. Future work should also concentrate on analyzing interaction for more subtle categories of emotions, e.g., irritation and sadness. Other interesting aspects to investigate include determining the exact visualization of the aggregated results for the group of people concerned and providing feedback to them, either on an individual level or on a shared medium. However, as we have suggested, the next iteration would be to implement a working automated prototype and to test the system in a real environment.

Acknowledgement

The authors would like to acknowledge the contribution of Jeff Burkham, Yuan Gao, Zhihui Zhang and Kristina Höök towards the initial phase of the study.

References

  1. Isbister, K., Höök, K., Sharp, M., & Laaksolahti, J. (2006). The sensual evaluation instrument: developing an affective evaluation tool. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Montreal, Quebec, Canada, pp. 1163-1172. DOI: 10.1145/1124772.1124946
  2. Overbeeke, C. J., Vink, P., & Cheung, F. K. (2001). The emotion-aware office chair - An exploratory study. In Proceedings of the International Conference on Affective Human Factors Design, London, pp. 262-267.
  3. Picard, R. W. (1997). Affective Computing. Cambridge: MIT Press.
  4. Sundström, P., Stahl, A., & Höök, K. (2005). A User-Centered Approach to Affective Interaction. In Affective Computing and Intelligent Interaction, LNCS 3784 (pp. 931-938). Berlin: Springer. DOI: 10.1007/11573548
  5. Wensveen, S., Overbeeke, K., & Djajadiningrat, T. (2002). Push me, shove me and I show you how you feel: recognising mood from emotionally rich interaction. In Proceedings of the Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, London, England, pp. 335-340. DOI: 10.1145/778712.778759
  6. Wensveen, S., Overbeeke, K., & Djajadiningrat, T. (2000). Touch me, hit me and I know how you feel: a design approach to emotionally rich interaction. In Proceedings of the Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, New York City, New York, United States, pp. 48-52. DOI: 10.1145/347642.347661

This is a pre-print version | last updated February 5, 2008