Facial Expressions as Game Input with Different Emotional Feedback Conditions

Michael Lankes, Stefan Riegler, Astrid Weiss, Thomas Mirlacher, Michael Pirker, Manfred Tscheligi
ICT&S Center, Univ. Salzburg, Sigmund-Haffner-Gasse 18, 5020 Salzburg
[email protected], [email protected], [email protected]

ABSTRACT
We propose a game design approach that utilizes facial expressions as an input method under different emotional feedback configurations. A study was conducted in a shopping centre to assess our game "EmoFlowers", focusing on user experience and user effectiveness. The study revealed that interaction with a game via facial expression is perceived as natural, is easy to learn, and provides a positive user experience.

Categories and Subject Descriptors
H.5.m [Information interfaces and presentation]: Miscellaneous

Keywords
Affective interfaces, facial expression, emotion, games, feedback, user experience, user effectiveness

1. INTRODUCTION
Emotions are one of the major factors when assessing user experience in games. According to Minge [7], emotional factors have to be integrated in Human-Computer Interaction (HCI) in order to assess complex interaction experiences. They have a significant impact on user experience, as they influence people's actions, expectations, and future evaluations [?]. Mahlke [5] introduces a taxonomy that classifies emotion in HCI into affective computing and emotional design. The concept of affective computing implies systems that are able to perceive the "user's emotion, model the user's affective states, adapt to the user's affective state and express emotion" [?]. This approach to user interaction is rooted in the fact that people often tend to treat computers socially [11]. Thus, followers of the affective interface approach try to incorporate interaction modalities that resemble human communication habits. The other major field concerning emotion in HCI is emotional design. It postulates that emotion is an important factor of the user's experience with interactive systems and aims to incorporate emotional aspects in the design process of interactive systems [8]. Concerning games and emotion, both categories, affective computing and emotional design, appear to be relevant: researchers are interested in the emotional impact of affective interfaces on players, and in the emotionally affected user experience during gameplay.

To observe emotions as input for gameplay, we developed the game "EmoFlowers". The proposed game design approach includes aspects of affective computing, should provide a positive user experience, and is easy to learn via innovative emotional input methods. By utilizing facial expressions as an explicit input method for games, users are enabled to interact with a system in a natural way. Our goal is the investigation of various affective feedback conditions and their impact on user experience and user effectiveness.

2. RELATED WORK
Getting an overview of the factor emotion in games appears to be a difficult undertaking.
One main reason can be seen in the fact that different aspects and goals under different theoretical premises are pursued when examining the state of the art in this field. One approach is the investigation of emotional response patterns elicited by video games [9]. Ravaja et al. presented different types of computer games (Tetris, James Bond: Nightfire, etc.) to 37 subjects. To measure their emotional response patterns, they employed categorical (fear, joy, etc.) and dimensional (arousal and valence) measurement methods. They conclude that different types of games elicit different types of emotional dispositions. Currently, most research seems to be carried out by recording physiological reactions to emotion-eliciting gaming situations. Hazlett [4] identified that activity of the corrugator supercilii and the zygomaticus major corresponds to negative or positive valence values in a gaming situation. Mandryk and colleagues [6] merged various physiological measurement methods to quantify the emotional status during gameplay. Sykes et al. [?] measured the affective state of players and assumed a correlation between arousal and the pressure "used to depress buttons on a gamepad": when arousal was high (high difficulty level in the game), players pressed the buttons harder.

Our approach focuses on explicit emotional facial expression and builds on the research efforts of Bernhaupt et al. [2]. Their game, "Emotional Flowers", was conceptualized for working contexts and harnesses the player's emotions as the primary means of game interaction. The player's facial expression of emotion serves as an input method to control the growth of a virtual flower. The goal is to grow the flower as fast as possible, based on positive emotions like happiness. The game concept supports multiple players; the flowers of all participants are additionally displayed on an ambient display in a public area. It was hypothesized that the emotions not only have an impact on the individual user, but also affect the social interaction within the group of players. The researchers found that the concept of utilizing facial expressions as an input method for playing a game is easy to learn.

3. OUR APPROACH
The previously introduced research by Bernhaupt et al. [2] served as the basis for our project. However, their game was designed for continuous usage in a working environment and had to be redesigned, since our game was evaluated in a shopping centre. Within this setting, interaction time is deemed to be short and feedback has to be provided immediately. Another aspect that distinguishes our research from previous work is the investigation of different emotional feedback configurations within the same game design, in order to observe differences in user experience and user effectiveness. Emotional feedback is provided via video, via icons, or solely via the current game status (a minimal sketch of switching between these conditions is given at the end of this section). Concerning game interaction, players have to reproduce specific facial expressions to progress in the game. Via emotions, interpreted by a software tool [?], players are enabled to influence the weather status in the game. The weather state has a direct influence on the growth of a virtual flower (see figure 1).

Focusing on the shopping setting, applicable game design strategies are required that pursue the following goals: the games should have easy rules, they are often adaptations of traditional and well-known games with a short duration, and they should not induce a high cognitive load. In contrast to many PC games, no excessive concentration is needed [?]. To incorporate the requirements of the shopping setting in our game design, we utilized the concept of computer supported cooperative play (CSCP). Al-Zubaidi et al. [1] defined specific guidelines to develop and design games that respect the requirements of CSCP at work. The authors recommend defining simple game mechanics that do not require a long learning process. The players should be able to communicate during game play. Another issue is to keep the interaction time fairly short to reduce cognitive overload. We adapted the game "EmoFlowers" to these requirements. It is a short and self-explanatory game, which creates a few minutes of fun and entertainment while shopping.
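The following is a minimal, hypothetical sketch of how the three feedback configurations could be represented and switched in a game like ours. The enum values, the function name, and the returned descriptions are illustrative assumptions and are not taken from the original implementation.

```python
from enum import Enum, auto

class FeedbackCondition(Enum):
    """Three emotional feedback configurations (names are assumptions)."""
    VIDEO = auto()        # live video of the player's face ("affective mirror")
    ICONS = auto()        # emoticon showing the interpreted emotion category
    STATUS_ONLY = auto()  # only the emotion bar, flower, and growth bar

def render_feedback(condition: FeedbackCondition, detected_emotion: str) -> str:
    """Return a description of the extra feedback shown in this condition.

    In the real game this would draw on screen; here it only illustrates the
    dispatch between the three experimental conditions.
    """
    if condition is FeedbackCondition.VIDEO:
        return "show live webcam image of the player's face"
    if condition is FeedbackCondition.ICONS:
        return f"show emoticon for the detected emotion: {detected_emotion}"
    return "show only emotion bar, flower, and growth bar"

# Conditions were presented sequentially to each participant in the study
for condition in FeedbackCondition:
    print(condition.name, "->", render_feedback(condition, "joy (high intensity)"))
```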
Figure 1: Screenshot of the game interface. Left: flower growth bar (0-10); bottom: emotion bar (from left to right: sadness with high intensity, sadness with low intensity, neutral, joy with low intensity, joy with high intensity); below the emotion bar: counter; right: video feedback (first condition).

4. GAME DESCRIPTION
The game "EmoFlowers" utilizes facial expression as an input method to influence the growth of a virtual flower. The player has to reproduce specific facial expressions to progress in the game. By means of specific emotions, players are enabled to alter the current weather state in the game. The goal is to achieve the appropriate weather conditions to grant the optimal growth environment for the flower (growth: 10) within a predefined time frame (3 minutes). Depending on the current need of the flower, the players have to produce the required weather condition through facial expression.

In the context of "EmoFlowers", two basic emotions [3] with two different intensities are employed in the gameplay: joy and sadness. Showing joy with high intensity sets the weather state to bright sunshine, while joy with low intensity causes a small amount of sunshine. The emotion sadness (in both intensities) leads to rain. A neutral facial expression is also implemented, as the software in some cases cannot categorize an expression, or the expressed emotion is not applicable to the current game situation.

The interaction script is fairly simple. In idle mode the game installation displays the basic game rules. Players start the game by showing the joy expression for 5 seconds. Afterwards, players are asked to perform the four relevant facial expressions (joy with high intensity, joy with low intensity, sadness with high intensity, sadness with low intensity) in a short tutorial (duration: 30 seconds) to get familiar with the input method. The game starts immediately after the tutorial session. One emotion icon of the interface displays the currently detected facial expression, while another icon (indicated by a red circle) shows the required emotion (see figure 1). The required emotions are chosen randomly and change within a short period of time. Since the interaction time should be short, the player has a time limit of 3 minutes to fulfil the objective. The game ends when the growth of the flower reaches its maximum value (10) or when the time limit is exceeded.
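To make the mapping just described concrete, the following is a minimal sketch of the core game-state update, assuming one update per detected expression. The class and function names, the growth increment, and the tick length are illustrative assumptions and do not reflect the original implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Mapping from (emotion, intensity) to the resulting weather state,
# following the rules described above (labels assumed).
WEATHER_BY_EXPRESSION = {
    ("joy", "high"):     "bright sunshine",
    ("joy", "low"):      "light sunshine",
    ("sadness", "high"): "rain",
    ("sadness", "low"):  "rain",
    ("neutral", None):   None,  # neutral input leaves the weather unchanged
}

@dataclass
class FlowerGame:
    growth: float = 0.0             # flower growth; the game is won at 10
    required_weather: str = "rain"  # chosen randomly in the real game
    time_left: float = 180.0        # 3-minute time limit in seconds

    def update(self, emotion: str, intensity: Optional[str], dt: float) -> None:
        """Advance the game by one detection step of length dt seconds."""
        weather = WEATHER_BY_EXPRESSION.get((emotion, intensity))
        if weather == self.required_weather:
            self.growth = min(10.0, self.growth + 0.5)  # assumed growth increment
        self.time_left -= dt

    @property
    def finished(self) -> bool:
        return self.growth >= 10.0 or self.time_left <= 0.0

# Example: a player repeatedly shows high-intensity sadness while rain is required
game = FlowerGame()
while not game.finished:
    game.update("sadness", "high", dt=1.0)
print(f"growth={game.growth}, time_left={game.time_left:.0f}s")
```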
5. TECHNICAL SETUP
The interpretation of facial movement is achieved via the "Realtime FaceDetector" provided by the Fraunhofer Institute for Integrated Circuits (IIS) [?]. The software is able to interpret facial expressions based on the categorical approach [3]. The categorical approach claims that there are some "fundamental" emotions (joy, fear, sadness, anger, disgust, surprise). The term "fundamental" refers to those patterns of responses to the world that evolution has equipped us with, due to their necessity for our survival; all other emotions are somehow derived from this small set of simpler emotions. The "Realtime FaceDetector" is able to interpret the type and intensity of four basic emotions (joy, sadness, anger, surprise). The technical setup of our game design approach consists of a PC, an LCD display, and a webcam. The webcam is mounted on top of the LCD screen, and players interact in front of the camera (focused on the face area) so that the facial features of the player can be captured.

Figure 2: Scheme of the technical setup
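The API of the "Realtime FaceDetector" is not documented here, so the following sketch only illustrates how such a detector could be wired into the game loop. The detect_expression function is a hypothetical stand-in for the actual SDK; only the webcam capture uses the standard OpenCV interface.

```python
import cv2  # webcam capture; the actual detector SDK is not shown here

def detect_expression(frame):
    """Hypothetical stand-in for the detector: returns an (emotion, intensity)
    tuple such as ("joy", "high"), or ("neutral", None) if no applicable
    expression is recognized in the frame."""
    return ("neutral", None)

def capture_loop(game, camera_index: int = 0) -> None:
    """Read frames from the webcam mounted on top of the screen and feed the
    interpreted expression into the game-state update (see the game-state
    sketch above)."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while not game.finished:
            ok, frame = cap.read()
            if not ok:
                break  # camera unavailable or stream ended
            emotion, intensity = detect_expression(frame)
            game.update(emotion, intensity, dt=1.0 / 30.0)  # assume ~30 fps
    finally:
        cap.release()
```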
6. USER STUDY - EXPERIMENTAL SETUP
To evaluate the user experience and effectiveness of "EmoFlowers", we conducted a three-day user study in a shopping centre in November 2007. We randomly asked visitors of the shopping centre whether they would like to play the game; it is worth mentioning that many children, but also adults, actively requested to take part in our study. We implemented three feedback conditions in our game, which were presented sequentially to the participants. The first condition features feedback via video and serves as an affective mirror: the interface element displays video information of the player's face while playing the game, so the player is informed about his or her currently portrayed emotion. The second feedback condition uses emoticons, which show abstract representations of the emotion category interpreted by the software. The last condition only delivers feedback via the emotion bar (see figure 1, bottom), the growth of the plant, and the growth bar. We hypothesized that the feedback conditions have a significant impact on user effectiveness and user experience when playing the game.

After interacting with the system, the participants were interviewed with a standardized questionnaire consisting of thirteen questions on the user experience factors fun, engagement, and cognitive load. The questions had to be answered on a five-point Likert scale (1 = "absolutely agree" to 5 = "absolutely disagree"). The time needed to reach the game goal was also recorded to measure user effectiveness. Moreover, participants were asked whether the game had an impact on their current emotional status.

Figure 3: Children playing the game

7. RESULTS
The study was carried out with 105 participants with an age range from 7 to 68 years; 85.7% of the participants were younger than 14 years. The questionnaire revealed that 92.4% of the participants reported a positive user experience while playing, and 63.8% of the players stated that the game had an impact on their emotional disposition. The overall design of the interface was also perceived positively (92.4%). The rating of the amount of time available to fulfil the game goals is evenly distributed over all feedback conditions. With the exception of the first condition (video feedback), participants neglected the feedback displays and concentrated on the emotion bar (see figure 1, bottom). In some cases players insisted on playing the first condition, as they perceived it to be more fun. Furthermore, it is worth mentioning that the majority of participants was interested in the input method "facial expression for games" and could imagine integrating such games into their shopping activities.

Concerning the impact of the three conditions on the factor efficiency, there was a significant main effect of condition on the completion of the game (F(2, 101) = 3.97, p < .05). A post-hoc test (Scheffé, p < .05) revealed a difference between feedback via icons (M = .75, SD = .19) and the no-feedback condition (M = .61, SD = .20). Gender did not reveal any significant differences in any condition, either for the questionnaire or for efficiency. We conducted several nonparametric tests (Mann-Whitney U tests) in order to find differences between children and adolescents/adults. We found highly significant differences for several variables indicating user experience, i.e. diversion of the game (Z = -2.49, p < .05), perceived game speed (Z = -3.67, p < .01), fun (Z = -4.46, p < .01), and the agreeableness of the game's interface (Z = -3.06, p < .01). Younger participants (aged 11 or below) gave higher ratings to all these questions, indicating that they liked the game more than older participants did. Interestingly, younger participants also agreed significantly more often that the game had an influence on their emotions (Z = -2.72, p < .05). Nevertheless, there was no significant difference in the efficiency measures (time, percent) between the age groups. The means and standard deviations for each item are given in figure 4.

Figure 4: Means and standard deviations for each item
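To illustrate how the reported tests could be computed with standard Python tooling, the following sketch runs a one-way ANOVA for the main effect of condition and a Mann-Whitney U test for the child/adult comparison. The variable names, group sizes, and values are illustrative assumptions, not the study data, and the Scheffé post-hoc comparison reported above would require an additional post-hoc package not shown here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative stand-in data: completion proportion per participant in each
# feedback condition (the real study had 105 participants in total).
video       = rng.normal(0.68, 0.20, 35)
icons       = rng.normal(0.75, 0.19, 35)
status_only = rng.normal(0.61, 0.20, 35)

# One-way ANOVA for the main effect of feedback condition on game completion
f_stat, p_value = stats.f_oneway(video, icons, status_only)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Mann-Whitney U test comparing children and adolescents/adults on one
# Likert item (1 = absolutely agree ... 5 = absolutely disagree)
children = rng.integers(1, 4, 60)  # illustrative ratings only
adults   = rng.integers(2, 6, 45)
u_stat, p_u = stats.mannwhitneyu(children, adults, alternative="two-sided")
print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {p_u:.3f}")
```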
8. CONCLUSION & FUTURE WORK
This paper gives an overview of our game "EmoFlowers". The game uses facial expression as an input method, which allows a natural way of interacting with a system. Different feedback conditions were employed to evaluate user effectiveness and user experience. CSCP guidelines were successfully applied in a shopping setting, and a majority of users reported a positive user experience when playing the game.

This paper is only a first step in our ongoing research efforts. In the future we will investigate the gathered research data in more detail and use it as a basis for developing a generalized user experience questionnaire for short-duration games. To identify further design improvements, it seems reasonable to evaluate the game once more with a focus on older participants, since this group rated the game lower than the younger participants. However, the lower rating can partly be explained by empirical findings that younger people tend to give higher scores in game evaluations [10]. Another goal is the implementation of additional emotion categories (fear, anger, etc.) in the game concept. We also observed that participants produced sounds when portraying an emotion; we will therefore examine additional input modalities (audio, tone of voice). Moreover, we will evaluate the user experience of the game concept in other contexts, such as home or learning environments.

9. ACKNOWLEDGMENTS
We would like to thank the Fraunhofer Institute for Integrated Circuits (IIS) for providing their "Realtime FaceDetector".

10. ADDITIONAL AUTHORS
Additional authors: Thomas Scherndl (ICT&S Center, Univ. Salzburg, email: [email protected])

11. REFERENCES
[1] K. Al-Zubaidi and G. Stevens. CSCP at Work. In Mensch & Computer 2004: Allgegenwärtige Interaktion, pages 137-146. Oldenbourg Verlag, 2004.
[2] R. Bernhaupt, A. Boldt, T. Mirlacher, D. Wilfinger, and M. Tscheligi. Using emotion in games: emotional flowers. In ACE '07: Proceedings of the International Conference on Advances in Computer Entertainment Technology, pages 41-48, New York, NY, USA, 2007. ACM Press.
[3] P. Ekman, W. Friesen, and P. Ellsworth. Emotion in the Human Face. Pergamon, New York, NY, USA, 1972.
[4] R. L. Hazlett. Measuring emotional valence during interactive experiences: boys at video game play. In CHI '06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 1023-1026, New York, NY, USA, 2006. ACM Press.
[5] S. Mahlke, M. Minge, and M. Thüring. Measuring multiple components of emotions in interactive contexts. In CHI 2006, 2006.
[6] R. L. Mandryk and K. M. Inkpen. Physiological indicators for the evaluation of co-located collaborative play. In CSCW '04: Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, pages 102-111, New York, NY, USA, 2004. ACM Press.
[7] M. Minge. Methoden zur Erhebung emotionaler Aspekte bei der Interaktion mit technischen Systemen. Master's thesis, Freie Universität Berlin, Fachbereich Erziehungswissenschaften und Psychologie, 2005.
[8] D. A. Norman. Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books, January 2004.
[9] N. Ravaja, M. Salminen, J. Holopainen, T. Saari, J. Laarni, and A. Järvinen. Emotional response patterns and sense of presence during video games: potential criterion variables for game design. In NordiCHI '04: Proceedings of the Third Nordic Conference on Human-Computer Interaction, pages 339-347, New York, NY, USA, 2004. ACM.
[10] J. Read. An Investigation of Participatory Design with Children - Informant, Balanced and Facilitated Design. Shaker Publishing, Eindhoven, Netherlands, 2002.
[11] B. Reeves and C. Nass. The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. CSLI Lecture Notes. Center for the Study of Language and Information, January 2003.