Squeeze Me: Gently Please

Jelle Stienstra & Patrizia Marti
Communication Science Department, University of Siena, Italy
[email protected], [email protected]

ABSTRACT
This paper presents the Squeeze Me, a research-through-design case that explores the emergence of empathic behavior between human and machine by sparking an expression-rich relation. The Squeeze Me is a squeezable device used to grab attention from a robot, providing ground for expressive values to be shared. The expressions exerted on the mediating device by the human are mapped to expressive behaviors of the robot in the modality of motion in the forthcoming interaction. We propose a double-layered interaction paradigm to achieve a natural and socially acceptable synthesis: firstly, a direct mapping, inherently exhibiting a natural relationship; secondly, an amplifying and reductive mapping that constructs a personalizing relationship through vivid and lively interactions fed by the intentions of the robot as well as the user. The design case serves to explore the consequences of a phenomenological approach to the constitution of empathy in the field of human-robot interaction. With this work we intend to inspire design engineering to shift from representational and discrete mechanisms towards rich, continuous and embodied mechanisms for interaction when aiming for empathic behavior to emerge.

Author Keywords
Interaction design; continuous mapping; empathy

ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g., HCI): User Interfaces – theory and methods, input devices and strategies, interaction styles.

General Terms
Design

INTRODUCTION
Addressing people is what we consider to be our main duty while designing intelligent products and systems intended to transform society. We design for people, for our experiences, for the flow of our everyday lives. We use phenomenology as inspiration for design to respect the uniqueness of our life-world's perceptions, from a bodily [4] and a contextual experience perspective [6]. In succession of earlier work that focused on addressing human capabilities in a balanced and respectful manner [9], we here aim to explore the consequences of a phenomenological design approach by focusing on human social skills in addition to the earlier explored perceptual-motor, emotional and cognitive skills. For this research-through-design case we utilize the design-practical notion that social skills concern our capabilities to perceive and act in relation to other living beings and entities. In other words, they allow us to affirm and act upon action possibilities in others, as well as allowing us to appreciate and act upon someone else's feelings. This refers to a complementary relation between one's own and others' perceptual-motor, emotional and cognitive skills within a shared context. This turns us back to earlier phenomenology-inspired design work that focused on addressing our perceptual-motor and emotional skills, as they are the media of our social skills.
The particular aim of this design case, the constitution of empathy within a human-robot context, thus points us to bringing together earlier work on empathy in the field of robotics with interaction design paradigms such as natural, rich and embodied interaction, which build upon the emergence of meaning in the interaction. The aesthetic quality of our research-through-design case can therefore be found in the interaction rather than in its visual form. In this paper we explore the possibility of achieving, by design, an empathic relation with interactive devices in order to enrich the experience of use as an emergent and dynamic outcome of the interaction.

Phenomenology-inspired design principles
Design and Human-Computer Interaction have been inspired by phenomenology as advanced in the past decades by Overbeeke [9], Dourish [2], Winograd and Flores [17] and Fällman [3]. Exemplary commercial outcomes found in HCI are touch-screen interactions, real-time car navigation, the Fonckel (www.fonckel.com) and various gaming products such as the Kinect and Wii. For this research-through-design case we mainly build upon design principles derived from Merleau-Ponty's phenomenology of perception [6], which concerns notions such as embodiment, the emergence of meaning and intersubjectivity, pushing the phenomenological stance beyond an individual experience of the world.

While we interact with the world around us, using our body and our perceptual-motor skills, we do so in a sustained, continuous way rather than through the discrete state changes seen in most interface design. Therefore our primary design principle is to allow for continuous interaction. Further, while we interact with the world we are surrounded by rich and expressive, form-shifting interactions rather than one-dimensional interactions (i.e. buttons that merely allow for on/off states) as found in most interface design. In order to feel what is good or tasteful, you would also need to feel what is bad, less good, sweeter, more bitter, tighter and so on. This concerns interaction, not merely an input or an output: it concerns the relation that constitutes meaning in the interaction. Therefore our secondary design principle is to allow for rich interaction.

In order to address our emotional skills, we utilize our perceptual-motor skills, drawing upon 'natural interaction' [8] as can be found in the flow of everyday life and in the kinetic and dynamic laws found in nature. The Frogger framework by Wensveen et al. [16, 11] provides designerly handles to design for such embodied, natural interactions that address emotional skills. This can be achieved by closely mapping (or even coupling) the input to the output in six aspects: time, location, direction, modality, dynamics, and expression. Two projects that use this framework in a similar manner are the Glowve [15] and the Sensible Alternative [12]. The Glowve is a soft toy for children who are afraid of the dark when trying to sleep. When the child squeezes the soft toy, an action that naturally follows from being afraid and seeking comfort, the room lights up coherently. The child can then see that there is nothing to be afraid of. The Sensible Alternative is a smart-phone interface that uses the back side of the phone to push in relevant applications. Here the pressure exerted is mapped to the appearance of the icons: if pushed gently the icons appear gently, while if pushed firmly the icons appear more vibrantly. Like squeezing a freshly cut orange, the way of squeezing it influences the way the juice flows out.
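To make the closeness of such a mapping concrete, the sketch below illustrates a Frogger-style continuous coupling in the spirit of the Glowve: squeeze pressure is mapped directly and coherently onto a light's brightness, instead of toggling a discrete on/off state once a threshold is crossed. This is our own minimal illustration, not code from either project; the simulated sensor and the print-based "light" are assumptions standing in for real hardware.

```python
# Minimal sketch of a Frogger-style continuous coupling: pressure in, light out.
# The sensor is simulated and the "light" just prints; a real prototype would
# read an actual squeeze sensor and dim an actual lamp.

import math
import time

def read_pressure(t):
    """Simulated squeeze pressure in [0.0, 1.0] (hypothetical sensor)."""
    return 0.5 + 0.5 * math.sin(t)

def set_light_brightness(level):
    """Stand-in actuator: print the brightness instead of driving a lamp."""
    print(f"brightness: {level:.2f}")

def continuous_coupling(duration_s=5.0, update_rate_hz=20):
    """Couple input to output in time (immediately) and in dynamics/expression
    (the harder the squeeze, the brighter the light), rather than switching a
    discrete on/off state."""
    start = time.time()
    while time.time() - start < duration_s:
        t = time.time() - start
        set_light_brightness(read_pressure(t))
        time.sleep(1.0 / update_rate_hz)

if __name__ == "__main__":
    continuous_coupling()
```

The point of the sketch is that the output changes at every update and in proportion to the input, so the richness of the user's action is preserved in the feedback.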
Both projects make use of the human perceptual-motor skill to hold and exert pressure. By closely mapping this embodied and continuously rich pressure input to coherent feedback, emotional skills are addressed.

Empathy
Both examples exploit a single human-product interaction, addressing our perceptual-motor and emotional skills through mappings that are experienced as natural. However, these products are merely responsive to a human interaction. We aim to utilize our social skills and push towards meaning emerging in interaction, where responsiveness is not merely a direct coupling. We aim for empathy between human and product. Empathy is a controversial construct, evoking debate over its nature, definition and measurement in any context. It implies the apprehension of another's inner world and a joint understanding of emotions. Notwithstanding this debate, its beneficial effect on attitudes and social behavior is widely recognized. This is the reason why a growing number of applications of the concept have emerged in different fields. For example, in HCI, Bickmore [1] and Paiva et al. [10] attempted to emulate empathy in virtual agents. In design research, Koskinen et al. [5] developed methods and techniques, as inspiration for design, to understand how people make sense of emotions. In the field of assistive robotics, Tapus and Matarić [13, 14] developed a model of empathic interaction based on verbal and non-verbal communication with robots.

Taking a phenomenological approach to action and perception, we believe that empathy is not the result of an internal judgment or a merely cognitive activity. It is a social product emerging dynamically as an outcome of the interaction, whereby the actions and perceptions of people synergize with one another. In the context of human-robot interaction, synergizing occurs, for example, when both robot and human look in the same direction (e.g. towards a source of noise), when the robot anticipates by opening a door, or when joy or fear is shared while watching a movie together. However, these synergies may become actualized in different ways. In our design case we explore this empathic resonance with a bottom-up and 'behavior-based' design strategy. In particular, we developed a concept of moody interaction to enrich people's experience through a continuous dialogue with artifacts of everyday use. The concept is based on the earlier mentioned Frogger framework, where responsiveness is mapped and coupled in a continuous action-perception loop. We extend this with a dynamic layer that continuously adjusts the mapping through interaction. This enables the interacting entities to show their shared intentionality to engage in interaction and to resonate with one another [7]. In this way we aim at making interaction expressive, embodied and responsive through a continuous action-perception loop, without using a previous representation or plan of the interaction itself.

This work has been carried out in the context of ACCOMPANY (Acceptable robotiCs COMPanions for AgeiNg Years), a European project funded under FP7-ICT-2011-7 (http://accompanyproject.eu). The project develops a robot companion as part of a smart environment to facilitate independent living of the elderly at home. In the context of this project, we explore rich and natural ways of interaction, focusing on empathy as a means to enable meaningful and engaging relations between human and machine.
DESIGN CASE
We explore the emergence of empathy provided by coherently natural and moody responsiveness via two input modalities, haptic and auditory: the 'Squeeze Me' and the 'Call Me'. With these input devices the user, in our case an independently living elderly person, is enabled to request attention from a caregiving robot, the Care-O-Bot. In this exploration we focus on the locomotive qualities of the robot as the carrier of expression. We further focus only on the requesting of attention by the elderly person, assuming the robot can complete the requested tasks. Just as in the way people approach other people, the speed, the acceleration, the angle and the closeness of coming near are technical descriptors that carry expressive qualities addressing our feelings, such as appropriateness, shyness, comfort, care, laziness, enthusiasm and so on. The main path of movement of the robot is predefined by the system architecture of the main platform; we therefore apply the expressive qualities to the movement through dynamic movement profiles describing the pace of movement over time.

… just brushed my teeth, and surprisingly I was able to take a refreshing shower after yet another warm night. It is not easy facing the heat; it makes me tired, even more. Time to sit down and explore some TV channels, amusing myself by what the world is concerned with nowadays. Thirst is getting to me; perhaps the effort to shower myself took the best of me for the day. I need a drink, pff, getting out of this chair? No way, it is comfortable, but I do not even have the strength to get out if I have to move around all day with this sun pushing the energy out of me. No, the sun is not what it used to be. Ok, focus, water … Help, now what? Care-O-Bot is there to assist me, but it is early. I didn't see him move around yet, perhaps he is still asleep. Well, let's just wake him up to get me some water; after that I might have the strength to do some things on my own again … his help is needed, I squeeze gently, it is merely a pinch in the remote. Would he have even felt it? Yes, slowly but smoothly Care-O-Bot turns towards me. Calmly driving in my direction. He is awake, and actually seems rather helpful today, perhaps not as grumpy as last night ...

Naturally Approaching
The Squeeze Me device is a simple (analog) force-sensing resistor whose values we directly map to the movement of the robot. The expressions exerted on the mediating device by the human are mapped to expressive behaviors of the robot in the modality of motion in the forthcoming interaction. A short pinch results in a sturdy movement, a hard squeeze results in a quick movement, and a gentle touch in a slow approach. This direct mapping inherently exhibits a natural relationship while maintaining the richness exhibited by the user.

… Care-O-Bot has been helpful today, yes, life can be lazy. Living from my chair, and the more firmly I squeeze, the more rapidly I am attended to … What is going on, he turns his back to me. That's not the way it is supposed to be, I will squeeze again! Listen! No? Fine, I do it myself! ... ow come on, give me a hand. I simply do not have the strength for this. Maybe I shouldn't have used you as my slave today. Making you run around, for me to live like a king. Sorry, please, just take me to bed, I'll ask gently ...

Figure 1. The Care-O-Bot platform used to explore expressive qualities in its movement to achieve empathy.
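As a minimal sketch of this direct layer, and not the actual Care-O-Bot implementation, the code below maps a normalized squeeze reading onto a dynamic movement profile: a velocity scaling over the predefined approach path, where a gentle squeeze yields a slow, easing approach and a hard squeeze a quick, abrupt one. The names read_fsr() and send_velocity_scale() are hypothetical stand-ins for the real sensor read-out and platform interface.

```python
# Sketch of the direct 'Squeeze Me' mapping (our own illustration): the
# force-sensing-resistor value shapes a movement profile, i.e. the pace of
# movement over time along the robot's predefined path.

def read_fsr():
    """Return a normalized squeeze force in [0.0, 1.0] (hypothetical sensor read)."""
    return 0.3  # e.g. a gentle squeeze

def movement_profile(squeeze, steps=10):
    """Build a velocity-scale profile for the approach: gentle squeezes ease in
    slowly at low pace, hard squeezes ramp up almost immediately to a high pace."""
    peak = 0.2 + 0.8 * squeeze   # overall pace follows the squeeze intensity
    ease = 1.0 - squeeze         # gentler squeezes ease in more gradually
    profile = []
    for i in range(steps):
        progress = (i + 1) / steps
        profile.append(peak * progress ** ease)
    return profile

def send_velocity_scale(scale):
    """Stand-in for commanding the platform; prints instead of driving the robot."""
    print(f"velocity scale: {scale:.2f}")

if __name__ == "__main__":
    for scale in movement_profile(read_fsr()):
        send_velocity_scale(scale)
```

The continuous input is thus preserved all the way into the robot's motion, so the user's expression, rather than a discrete command, determines how the robot approaches.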
Mood
On top of the directly mapped expressions of pinching to movement, interactions that exhibit a natural relationship, we here provide an amplifying and reductive mapping layer. This layer shifts the direct mapping towards a less expected mapping throughout the day (through interaction). The dynamic adjustments of the expressive mapping, sometimes even inverting the input-to-response relation, provide a vivid and lively interaction. With the natural relationship as a reference, the moody interaction evokes denial, over-enthusiasm and stubbornness, and requires the elderly person to adjust their behavior in interaction. A relationship is constructed through these shifting interactions; we target empathy through moodiness within a natural response (a minimal code sketch of such a shifting layer is given at the end of this section).

… Why am I on the floor? Hmm my knee, what to do now? It hurts so bad, I am too old for this ... HELP! … Hey that is quick, Care-O-Bot is here to offer me a hand. He came rushing at me just as I wanted ...

Similar to the 'Squeeze Me', the 'Call Me' design case utilizes expressive messages given by the elderly person, directly mapped and thereby inherently exhibiting a natural relationship. The expressiveness of a whisper results in a gentle movement, while a shout holds more abrupt values, demanding rapid attention. Intonation (loudness, length, dynamics, timbre) is continuously mapped to the movement of the Care-O-Bot. The symbolic meaning of the vocal command is ignored for now; in other words, we do not know what is said, we merely utilize how it is said. We foresee opportunities to combine both within a user interface. The moody interaction, providing an empathic experience through a continuous dialogue as described in the 'Squeeze Me' scenario, applies to the 'Call Me' as well. Shouting at the robot all day long is not going to make him work more and harder. Actually, he will ignore it and provoke other ways of asking for attention from the user. Whispering all day long, on the other hand, will probably make the Care-O-Bot invade the elderly person's personal space in order to 'hear' properly, which will transform the way of interacting as well. Through this changing behavior in the interaction between human and machine we target empathy.

… Nice, finally sitting down to enjoy a movie. It is not like the cinema, but well, at least I am not alone. Care-O-Bot joined me, next to my couch… (scary scene, the tension builds up, to be heard within the soundscape of the movie) Ha, yes this is scary, but you don't have to look away because this is a scary scene. You do not want to see the movie? Blocking your eyes with your fingers… (Bang) … Relax, it is just a movie, are you scared? You almost ran into the closet with your sudden reaction… Ahh, you do not want to miss the scene, right? Peeking through your fingers…

Active participation
Surrounding noise could be considered a disturbance to our expressive design proposal, a technological difficulty in achieving natural interaction and empathy, since the robot responds to the environment as well. With the above-described scenario we hope to illustrate that such a shared context can instead provoke empathy through active participation.
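Referring back to the Mood subsection, the sketch below gives a minimal, assumed reading of the moody second layer, not the paper's implementation: a simple mood state drifts with the history of interaction (for instance, being squeezed hard all day) and amplifies, reduces or even inverts the otherwise natural mapping from squeeze intensity to responsiveness. All names and constants are illustrative choices of our own.

```python
# Sketch of a 'moody' mapping layer on top of the direct, natural mapping.
# Persistent forceful demands wear the robot down; gentle asking restores it.

class MoodyMapping:
    def __init__(self):
        self.mood = 0.0      # -1.0 (grumpy) .. 0.0 (neutral) .. 1.0 (eager)
        self.fatigue = 0.0   # grows when demands are persistently forceful

    def update_mood(self, squeeze):
        """Let interaction history shift the mood over the day."""
        self.fatigue = 0.9 * self.fatigue + 0.1 * squeeze
        self.mood = max(-1.0, min(1.0, 1.0 - 2.0 * self.fatigue))

    def respond(self, squeeze):
        """Map squeeze to responsiveness: natural when in a good mood,
        damped when reluctant, inverted (turning away) when grumpy."""
        self.update_mood(squeeze)
        natural = squeeze                          # the direct, natural layer
        if self.mood < -0.5:
            return -natural                        # inverted: the robot turns away
        return natural * (0.5 + 0.5 * self.mood)   # amplified or reduced

if __name__ == "__main__":
    robot = MoodyMapping()
    # A demanding day of hard squeezes, followed by gentle asking.
    for squeeze in [0.9] * 30 + [0.2] * 10:
        print(f"squeeze {squeeze:.1f} -> response {robot.respond(squeeze):.2f}")
```

Run as is, the robot first responds eagerly, gradually becomes reluctant, at some point turns away despite hard squeezing, and slowly recovers once it is asked gently; the shifting relation, not any scripted plan, is what the user experiences.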
CLOSING REMARKS
In this paper we presented a research-through-design case exploring empathy as a means to achieve expressive and rich interactions with a robot companion. The concept of empathic behavior is enabled by inherently meaningful and moody interactions implemented in two experience prototypes. The first one, 'Squeeze Me', explores an expressive tactile interaction, while the second one, 'Call Me', plays with the intonation of vocal commands to obtain personalized and moody responses. Expression in the movement is limited to the general movement of the Care-O-Bot, though in the future we plan to apply the expressive movement profiles to more 'body' parts, such as the tray and the arm, while performing tasks. Furthermore, a full action-perception loop is not achieved here, as the movement of the robot is of a responsive rather than an interactive nature. Uncoupling the torso's movement from the system architecture will allow us to fully explore an action-perception loop in the future.

As stated in the introduction, the design cases are developed as research-through-design, to make the concept of empathic interaction tangible and experienceable, and to reflect on and challenge the theoretical framework. This will support the development of a new set of research questions and guide the development of the next iteration of prototypes. Elderly users will use, experience and co-reflect on these prototypes as part of the iterative process. These prototypes should not be regarded as final solutions. They serve as physical hypotheses to explore a phenomenological approach to the design of empathic behavior. Our prototypes show that empathic demeanors in human-robot interaction can be achieved in a direct, perceptual way, not necessarily mediated by the use of complex and predefined procedures or sequences of actions. The proposed design does not require the representation of complex internal states and inferential mechanisms in the robot for empathic behavior to emerge. It basically relies on coupling and mapping actions and their effects through a continuous action-perception loop, exploiting the richness and continuity of our embodied skills.

ACKNOWLEDGMENTS
We thank Ernesto Di Iorio for his profound support in prototyping the research-through-design case. The first author is further associated with the Designing Quality in Interaction group, Department of Industrial Design, Eindhoven University of Technology, where the mentioned design-practical notion of social skills was initiated and is being developed further.

REFERENCES
1. Bickmore, T.W. Relational Agents: Effecting Change through Human-Computer Relationships. Doctoral dissertation, Massachusetts Institute of Technology (2003).
2. Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. Cambridge, MA: MIT Press (2001).
3. Fällman, D. In Romance with the Materials of Mobile Interaction: A Phenomenological Approach to the Design of Mobile Information Technology. Doctoral dissertation, Umeå Universitet (2003).
4. Gibson, J.J. The Ecological Approach to Visual Perception. London: Lawrence Erlbaum Associates (1979).
5. Koskinen, I., Battarbee, K. & Mattelmäki, T. (Eds.) Empathic Design. Helsinki: IT Press (2003).
6. Merleau-Ponty, M. Phenomenology of Perception (C. Smith, Trans.). New York: Humanities Press (1962; original work published 1945).
7. Marti, P. Perceiving while being perceived. International Journal of Design, 4(2) (2010), 27-38.
8. Norman, D.A. The Design of Everyday Things. New York, NY: Doubleday (2002).
9. Overbeeke, C.J. The Aesthetics of the Impossible. Eindhoven: Eindhoven University of Technology (2007).
10. Paiva, A., Dias, J., Sobral, D., Aylett, R., Sobreperez, P., Woods, S., Zoll, C. & Hall, L. Caring for agents and agents that care: Building empathic relations with synthetic agents. In Proc. AAMAS '04, New York: ACM (2004).
11. Stienstra, J.T., Bruns Alonso, M., Wensveen, S.A.G. & Kuenen, C.D. How to design for transformation of behavior through interactive materiality. Accepted for NordiCHI 2012 (2012).
12. Stienstra, J.T., Overbeeke, C.J. & Wensveen, S.A.G. There is More in a Single Touch: Mapping the Continuous to the Discrete. In Proc. CHItaly 2011, New York: ACM (2011), 27-32.
13. Tapus, A. & Matarić, M.J. Socially Assistive Robots: The Link between Personality, Empathy, Physiological Signals, and Task Performance. In AAAI Spring Symposium on Emotion, Personality, and Social Behavior (2008).
14. Tapus, A. & Matarić, M.J. Emulating Empathy in Socially Assistive Robotics. In AAAI Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics, Palo Alto, CA (2007).
15. Trotto, A., Hummels, C.C.M., Overbeeke, C.J., Frens, J.W. & Cianfanelli, E. Rights through Making - Wearing Quality. Firenze: Edizioni Polistampa (2009).
16. Wensveen, S.A.G., Djajadiningrat, J.P. & Overbeeke, C.J. Interaction Frogger: A design framework to couple action and function through feedback and feedforward. In Proc. DIS '04, New York: ACM (2004), 177-184.
17. Winograd, T. & Flores, F. Understanding Computers and Cognition. Norwood, NJ: Intellect (1986).