HCI and the Face

Call for Participation for the CHI2006 Workshop on April 22nd, 2006

The human face plays an important role in many aspects of verbal and non-verbal communication. As such, the face is a rich source of information relevant to human-computer interaction. The fields of eye gaze tracking and face recognition have both reached sufficient maturity that several companies now offer commercial products based on these technologies. By contrast, other aspects of facial information processing, including expression and gesture recognition, have yet to reach a comparable stage of development.

This workshop will offer a general assessment of the state of the art of facial information processing in HCI. By examining a broad range of HCI topics related to this theme, we will attempt to understand why certain areas of face-based HCI, such as facial expression processing and robotic facial display, have lagged behind others, such as gaze tracking and identity recognition. The goal is to collectively arrive at a set of research strategies to bring the more slowly developing areas up to speed.

An extended list of possible topics will be made available on the workshop's web site. In brief:

  • Innovative methods for controlling robotic faces
  • Expression and gesture measurement and tracking tools
  • The role of context in processing facial information
  • Multimodality: combining facial information with affective data such as EMG and galvanic response
  • Theory and research paradigms
  • Key privacy, trust, and security issues for face processing applications
  • Face technology for the disabled
  • The face in technology for the aging population
  • Face technology for entertainment computing and games
  • Funding, publication and exchange of research


The workshop will follow a day-long, highly interactive format designed to encourage group dialogue and knowledge sharing.

Submission Details

Submissions should consist of 2–4 page position statements in ACM format. Statements should be submitted as PDF files not exceeding 5 MB in size, and should be sent by e-mail to mlyons@atr.jp.

Important Dates

December 15th, 2005: Submission deadline
February 1st, 2006: Notification of acceptance

March 1st, 2006: Accepted papers published on workshop web site
April 22nd, 2006: Workshop


Organizers

Michael J. Lyons
ATR Intelligent Robotics and Communication Labs, mlyons@atr.jp

Christoph Bartneck
TU/e Department of Industrial Design, christoph@bartneck.de

Further information is available on the workshop's web site.


Workshop Schedule

Time         Activity
09:00-09:10  Welcome and introduction to the workshop (Michael Lyons & Christoph Bartneck)
             Reflection on Robotic Intelligence (Christoph Bartneck)
09:10-09:45  Keynote: Looking Behind the Face: Research on Robotic and Computational Modeling of Oropharyngeal Anatomy for Speech Synthesis (Sidney Fels)
09:45-10:00  Coffee break
10:00-10:15  Emotions and EMG measures of facial muscles in interactive contexts (Sascha Mahlke)
10:15-10:30  Concurrence of Facial & Bodily Expression: A Feasibility Study (Elizabeth Crane)
10:30-10:45  The Face in Activity Analysis and Gesture Interfaces (Alejandro Jaimes)
10:45-11:00  ECAs Capabilities (Catherine Pelachaud)
11:00-11:15  Emotion-mapped Robotic Facial Expressions based on Philosophical Theories of Vagueness (Phil Serchuk)
11:15-11:30  Faces as Content (David J. Chatting)
11:30-11:45  Usability Indicators – In Your Face (Pedro Branco)
11:45-12:00  When the Interface is the User's Face: Ideas for Research and Applications (Ian Li)
12:00-12:15  Searching for Emotional Content in Digital Video (Aleksandra Sarcevic)
12:15-12:30  Identify issues for discussion and form themed groups for the afternoon
12:30-14:00  Lunch
14:00-15:30  Group work
15:30-15:45  Coffee break
15:45-17:00  Discussion, summary, and wrap-up
             Dinner (optional)
