Towards ICT Support for Elderly Displaced People: Looking for Natural Gestures

Alessandra Melonio, Laura Tarantino, and Tania Di Mascio

Dipartimento di Ingegneria elettrica e dell'informazione, Università degli Studi dell'Aquila, L'Aquila, Italy
e-mail: [email protected]; [email protected]; [email protected]

Abstract  In the aftermath of a natural disaster, ICT-based tools can support technology-erudite people, but they risk sharpening the isolation of vulnerable population groups because of, e.g., the grey digital divide. In this paper we discuss some usability studies performed after the L'Aquila earthquake on elderly people at risk of social isolation, in order to single out interaction needs and platform requirements for easy-to-use, elderly-oriented tools that ideally require no learning. In particular, we focus our analysis on the intuitiveness of multifinger gestures on tablets.

1 Introduction

The study reported here originated within the context of more general studies launched after the L'Aquila earthquake, aimed at designing ICT-based support tools for populations hit by natural disasters. Though in the immediate aftermath of a disaster most of the attention is focused on material needs and material damages, social and psychological immaterial damages may prove even more relevant than material ones (see, e.g., [9, 14]). Social interaction is recognized as a key factor for the reconstruction of broken social ties and of a new universe of legitimate shared meanings [14], but this process can be hampered by the geographical dispersion of the population caused by post-disaster evacuation and displacement. In the cyberspace era, which offers immaterial spaces where communication may take place, it is legitimate to investigate what kind of ICT-based interaction may support social recovery [1]. Indeed, many ICT-enhanced support tools, categorized as Disaster Management Systems, have been developed to this end [3, 8, 12], mostly aimed at enabling real-time, categorized, geo-referenced data sharing and based on communication via SMS or web portals. Unfortunately, while the recourse to technology may help technology-erudite people to cut distances and re-tie broken linkages, at the same time it may exacerbate the isolation of vulnerable population groups, as in the case of elderly people, emphasizing the imbalance deriving from the grey digital divide. To overcome the reluctance of users of this kind, who are often nearly technophobic, it is necessary to conceive easy-to-use tools tailored to the capabilities of aging users and usable with almost no learning. While cellular phones, though widespread, suffer from small displays and small buttons, not appropriate for the visual and motor impairments typical of the aging process [4, 5], recent tablets based on touch interfaces promise to be an appropriate platform on which to base elderly-centered applications that exploit the potential of networked communication in disaster recovery phases.
In the rest of the paper we report on a preliminary usability study on the iPad, conducted in the aftermath of the L'Aquila earthquake with senior people at risk of social isolation and with scarce e-literacy: after a brief discussion of the context of use, we discuss the platform requirements and the experiment results.

2 Context-Specific Platform Requirements

The 6.3 quake that hit L'Aquila on April 6, 2009 made about 50% of residential buildings and most public buildings unlivable, forcing the evacuation of more than 70,000 people. In the immediate aftermath of the disaster, evacuees were scattered over the whole Abruzzo region. Official reports [11] show that more than 2 years after the quake more than 35,000 people are still displaced: about 20,000 are housed in more than 20 "new villages" built by the Civil Defense after the quake and located all around the city territory (along a 100 km closed path), about 1,000 are hosted in temporary shelters (like barracks and hotels, some still on the coast), and nearly 15,000 live in autonomously arranged housing solutions. The historical city center, which previously hosted almost the entire public life, is still inaccessible to citizens, as are the historical centers of the surrounding villages.

The condition of elderly people is particularly hard: they are often separated from family members, far from their homes and their original communities, and with scarce travelling or commuting capabilities. In the series of interviews we conducted with 15 senior (65–85 years) citizens hosted in temporary shelters and nursing homes, a recurring concern was being unable to "see my house", to "see what is going on in the city", or to "get news about my former neighbors". It was soon clear to us that, besides the needs traditionally addressed by research on assistive technology, ICT-supported ageing and elderly care networks (see, e.g., [2, 4, 13]), specific displacement-related needs also had to be addressed. Based on our interviews, we determined functional and non-functional requirements for an elderly-centered application that would appeal also to "technology-reluctant" users (nearly 50% of the interviewees had never used cellular phones, let alone PCs) [6]. In this paper we focus on platform requirements.

In a post-disaster context a specific aspect to be considered is the instability of housing conditions, with many evacuees already moved two or three times after the disaster. This suggests preferring mobile computing solutions to, e.g., smart-house oriented approaches [2]. Indeed, recent mobile devices relying on relatively large (about 10″) touch screens make it possible to overcome usability problems typical of cellular phones, since text and icons can be displayed reasonably large and buttons can in principle be made scalable [10]. The benefit of touch screen interfaces over indirect input devices for elderly users has already been demonstrated [7, 15]. Furthermore, studies show that older users may be slower than younger ones in performing simple gestures, but not necessarily less accurate, and that the familiarity of a gesture can influence performance [10].
While these studies are conducted mostly at the syntax level, it remains unclear to what extent gestures are natural to use and effective also at the semantic level, that is, whether novice older users (in our case often with a low education background) succeed in easily associating gestures with their function/effect and in remembering the association for later re-use. We aim at making one step further in this research area by providing answers to this open problem. In the following we report on a preliminary usability study.

3 Evaluation

The main research issue of the study was to evaluate whether customary iPad gestures are intuitively understood, accepted and adopted by elderly users with no or scarce e-literacy. To this end, we studied how five elderly users interacted with five different applications (iBook, Skype, Photos, VLC and iScopa), based on a set of seven simple gestures: Tap, Double Tap, Press, Drag, Pinch, Flick/Swipe, and Flick/Scroll (as defined in http://www.lukew.com/ff/entry.asp?1071).

Participants: On a voluntary basis, five novice older users, aged 68–83, took part in the test. None of them had ever used a touch device before the experiment; one of them had never used even a cellular phone, one uses a smartphone, while the others regularly use a first-generation mobile phone. Their formal education varies from illiterate to highly educated. Overall, the sample covers the classes singled out in the users' analysis: User A (female, age 71) is dynamic and socially active; User B (male, 79) lived part of his life in the United States as an emigrant involved in a variety of jobs, from reception desk to driving, is used to interacting with people and is familiar with everyday consumer items like cellular phones and video cameras; User C (male, 83) has always worked as a farmer in a small town and has no familiarity with any type of technology; User D (female, 70) is a housewife who does not like to interact with people or with technological artifacts; User E (male, 68) used to work in a bank and owns a PC and a smartphone.

Venue and dates: The evaluation took place in the users' living environments, to make them feel comfortable and avoid extra pressure. All users were living in temporary accommodations: users A and B in removable wooden houses installed after the quake (in Villa Sant'Angelo (AQ)), users C and D in a nursing home (Opera Santa Maria della Pace, Fontecchio (AQ)), and user E in a hotel (on the Abruzzo coast). The sessions took place on three different days in November 2010.

Procedure: To recruit users in the nursing home we presented the project to the manager, who introduced us to users C and D. The other users were asked directly whether they wanted to participate. Before performing the tests, users were met individually to explain ethical considerations concerning consent, withdrawal and confidentiality, and to ask for permission to record videos. They were informed about the expected duration of the session, what the experiment intended to evaluate, the characteristics of the applications and the order in which these would be presented, and what they were expected to do. A brief practice session was then conducted to help participants understand the iPad's basic behavior and let them get familiar with it. Users were reassured that it was not a problem if they did not know how to perform a task and that they could freely try to touch or ask for help.
When the participants had assured the experimenter that they fully understood the tasks and that they were ready to proceed, the experiment began. The evaluation was conducted using a combination of the thinking-aloud protocol and the "cognitive walkthrough" inspection method [16]. Four users were filmed with two cameras, to record both gestures and faces and to be able to later study, from facial expressions, the users' emotions related to the interaction (see Fig. 1). One user preferred not to be filmed.

Fig. 1  Pictures from the tests: a room of the nursing home (left) and a user in trouble (right)

Tasks: The selected gesture set was tested in five different applications, each allowing us to evaluate the effectiveness of a subset of gestures, as follows:

1. iBook: Tap, Press, Flick/Swipe, Drag
2. Skype: Tap, Flick/Scroll, Drag
3. Photos: Tap, Double Tap, Flick/Scroll, Flick/Swipe, Pinch
4. VLC (video player): Tap, Drag
5. iScopa (traditional Italian card game): Tap, Double Tap, Drag.

The use of the same gesture in more than one application allowed us to evaluate gesture memorability by looking at the user's behavior in successive steps. There were five consecutive rounds for each participant, one per application. Different users were assigned different application orders, to avoid biases deriving from particular orderings (see Table 1).

Table 1  Descriptions of evaluation sessions

User      Round order
User A    2, 1, 3, 4, 5
User B    1, 2, 3, 4
User C    5, 1, 3
User D    1, 5, 3, 2, 4
User E    3, 1, 2, 5, 4

For each round and each user, the first task was always a free interaction with the application, during which the user was invited to comment aloud. Then we tried to enforce a sequence of specific tasks (e.g., "go to next page" while using iBook), depending on the application being used. For example, Fig. 2 shows how we designed the walk through the iBook application (other evaluation rounds were similarly structured [6]).

Fig. 2  Structure of the iBook evaluation tasks

In many cases the tasks were performed in a sequence different from the planned one, or not performed at all (users B and C gave up on some applications); in these cases we asked for explanations about the choices the user made or the reason for abandoning the task. Users were not forced into any action, but when in trouble (unexpected system behaviors or long task execution times) or when they asked for help, they received help from us.

Experimental measures: System usability was evaluated by qualitative efficacy measures, qualitative and quantitative efficiency measures, and subjective and objective satisfaction measures. In particular:

• Efficacy: we studied which gesture is selected to achieve a particular goal, whether gestures are correctly performed, whether one or two hands are used for multifinger gestures, whether cursors are used, whether numbers and letters are correctly typed, and whether visualized items are readable and comprehended;
• Efficiency: we counted the number of gestures performed to reach a goal, and we considered the path followed to reach a goal, whether the user remembers already performed gestures, whether the user is able to associate a gesture with a goal and re-utilize it to reach the same goal later, and whether the user remembers the position of the interactive items necessary to activate a function (see the coverage sketch after this list);
• Satisfaction: video-recorded facial expressions helped us evaluate whether the user was relaxed, worried or amused while performing a task (see Fig. 1, right: a user in trouble diverting attention from the device to ask for help); direct questions posed during task execution helped us evaluate whether the user has readability problems, whether s/he appreciates interface properties (e.g., colors, icons, etc.), and whether s/he understands the meanings of symbols, icons, buttons and the like.
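The gesture coverage across applications, which underlies both the memorability checks and the efficiency measures above, can be encoded as plain data. The following Python sketch is our own illustration and not part of the original study; the names and structure are assumptions introduced only to make the coverage-based memorability checks concrete.

# Hypothetical encoding (ours, not from the paper) of which gestures each
# tested application exercises. A gesture appearing in two or more
# applications can be checked for re-use: did the user repeat it in a later
# round after discovering it in an earlier one?
from collections import Counter

APP_GESTURES = {
    "iBook":  {"Tap", "Press", "Flick/Swipe", "Drag"},
    "Skype":  {"Tap", "Flick/Scroll", "Drag"},
    "Photos": {"Tap", "Double Tap", "Flick/Scroll", "Flick/Swipe", "Pinch"},
    "VLC":    {"Tap", "Drag"},
    "iScopa": {"Tap", "Double Tap", "Drag"},
}

# Count in how many applications each gesture appears.
coverage = Counter(g for gestures in APP_GESTURES.values() for g in gestures)

# Gestures seen in at least two applications are candidates for memorability checks.
memorability_candidates = sorted(g for g, n in coverage.items() if n >= 2)
print(memorability_candidates)
# -> ['Double Tap', 'Drag', 'Flick/Scroll', 'Flick/Swipe', 'Tap']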
Results and discussion: Due to space limitations, we report here a general discussion of the evaluation sessions and summary data on task execution, presented according to a Likert scale (0–5), as follows:

• 5 (Accurate): the user accurately identifies objects and performs gestures;
• 4 (Good): the user completes the task in a short time without problems;
• 3 (Fair): the user completes the task with some age-related problems;
• 2 (Poor): the user has many difficulties in completing the task and often requires help;
• 1 (None): the user does not understand what to do;
• 0: the user needs help.

For each tested application, evaluation results were organized in a summary table arranged according to task similarity, determined on the basis of the gestures a task requires (see the example for iBook in Table 2).

Table 2  Results summary for iBook

Task               Gestures                        User A  User B  User C  User D  User E
T.1.1, T.1.5       Tap                             4       3       4       4       5
T.1.2, T.2.2.1     Tap, press                      4       2       2       4       5
T.1.3              Double tap, flick/swipe         3       3       4       4       5
T.1.4.1            Drag, tap                       4       1       3       3       5
T.1.4.2, T.1.4.3   Press and tap for near object   4       3       3       2       5
T.1.4.4            Tap, memorability               3       2       4       1       4
T.1.E              Tap, physical button            4       3       4       2       4
T.1.S                                              □       □               □

Then a global summary table for individual gestures (Table 3) was derived according to the scores obtained by each gesture in the different tasks of the different rounds.

Table 3  Summary figures

Gesture            User A  User B  User C  User D  User E
Tap                4       3       5       3       5
Double tap         3       1       3       2       4
Drag               3       2       4       3       5
Press              3       2       3       3       5
Flick/swipe        4       3       5       4       5
Flick/scroll       4       3       4       4       5
Pinch              3       2       3       3       4
Physical button    4       4       5       3       5
Support            □       □               □

The scores were weighted to take into account the influence that individual gestures have on task completion when the task requires more than one gesture (e.g., for regulating brightness (Task T.1.4.2) a sequence of Tap and Press is necessary). To assign weights, the relative difficulty of gestures within the task was evaluated on the basis of the observation of users during task execution.
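As a purely illustrative aid, not part of the original paper, the weighting step can be sketched as a weighted average of the scores of the tasks in which a gesture is involved. The helper function and the example weights below are hypothetical assumptions, since the actual weights used in the study are not reported.

# Minimal sketch of the weighting step, assuming a simple weighted average.
# Both the helper and the example weights are hypothetical illustrations.

def gesture_score(task_scores, weights):
    """Weighted average of the 0-5 scores of the tasks requiring the gesture.

    weights encode the relative difficulty of the gesture within each task,
    as judged from the observation of the users during task execution.
    """
    return sum(s * w for s, w in zip(task_scores, weights)) / sum(weights)

# Example: Press is involved in tasks T.1.2/T.2.2.1 (Tap, press) and
# T.1.4.2/T.1.4.3 (press and tap for near object). With example task scores
# of 2 and 3 and assumed weights of 0.7 and 0.6 (Press judged the harder
# part of both tasks), the weighted Press score is about 2.5.
print(round(gesture_score([2, 3], [0.7, 0.6]), 1))  # -> 2.5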
Our findings can be summarized as follows.

General aspects: all users instinctively put the tablet in horizontal position; the expected problems related to the lack of haptic feedback were found (e.g., users keep pressing a specific interface item even if they do not get any system response); scroll bars and tab-bar items are not perceived as interactive objects.

Age-specific impairments: users constantly look for a support on which to lean the tablet; they have problems in selecting items smaller than 2 cm and in reading characters below 17 pt.

Gesture performance: the expected slowness in performing gestures is confirmed [10]. Furthermore, the following observations can be made:

• Flick/Swipe is friendly for all users;
• Drag is done easily and is quite intuitive;
• Flick/Scroll is sometimes complex to perform, and the vertical scroll, from bottom to top, is preferred over the horizontal one (e.g., in scrolling photos, the user sees the empty spaces at the top and bottom of the screen and makes an upward movement, whereas horizontal white space is not seen);
• Users have difficulty with Press, and fast single Taps often become Presses;
• Double Tap is never used except by mistake (e.g., by double tapping on a photo, or double tapping on the left side of the page of a book, users accidentally achieve a result, but they do not notice which gesture was done and are then not able to redo it to achieve the same result);
• Pinch is never done intuitively but only after an example, and is always done with two hands.

Memorability: most users do not remember the gestures and interactive actions done to achieve a specific goal, ask for confirmation on whether the present action was already done before, and follow different paths in different tasks to achieve the same goal.

Emotion: while users initially show anxiety and concern in using the tablet, as they acquire familiarity with the tool they start to appreciate what they can get from it: they look interested in the book readings (which they do aloud), they are amazed while interacting with photos, and they enjoy the card game with emotional participation, commenting aloud while playing as if the iPad were a human player.

4 Conclusion and Future Work

We discussed some issues about platform requirements for an ICT application aimed at supporting elderly people after natural disasters. It is necessary to rely on intuitive interaction based on direct input languages, preferably on mobile devices with touch-sensitive, medium-sized screens (about 10″). In this scenario, the evaluation study was aimed at verifying flaws and deficiencies of existing iPad applications with respect to gesture use. Though our results are preliminary with respect to the number of involved users, the overall experimental set-up contributes to the research area by providing a setting for evaluating gestures at the semantic level. The evaluation results constituted the basis for the design of a prototype that, in a first round of usability tests at mockup level, proved to be more elderly-oriented than the traditional iPad applications (errors and numbers of attempts necessary to reach the goals were lower than in the tests reported here). Future work will be aimed at the implementation of the system and at more extensive evaluation experiments.

References

1. Barbini, F.M., D'Atri, A., Tarantino, L., Za, S. (2011), ICT support to the reconstruction of social meanings after a disaster. In G. Bradley, D. Whitehouse and G. Singh (Eds.), Proc. of the IADIS Intl. Conf. on ICT, Society and Human Beings 2011 (pp. 224–228).
2. Callejas, Z. & López-Cózar, R. (2009), Designing smart home interfaces for the elderly. ACM SIGACCESS Newsletter, 95, 10–16.
3. Careem, M. et al. (2006), Sahana: Overview of a Disaster Management System. In Proc. of the Int. Conf. on Information and Automation (ICIA), Colombo, Sri Lanka, 361–366.
4. Czaja, S.J. & Lee, C.C. (2007), The impact of aging on access to technology. Universal Access in the Information Society (UAIS), 5(4), 341–349.
5. Fisk, A.D., Rogers, W.A., Charness, N., Czaja, S.J., Sharit, J. (2004), Designing for Older Adults. CRC Press, Boca Raton, USA.
6. Melonio, A. (2010), Interazione naturale con dispositivi di nuova generazione [Natural interaction with new-generation devices]. Master Thesis, University of L'Aquila.
7. Murata, A. & Iwase, H. (2005), Usability of touch-panel interfaces for older adults. Human Factors, 47(4), 767–776.
8. Okolloh, O. (2009), Ushahidi, or 'testimony': Web 2.0 tools for crowdsourcing crisis information. Participatory Learning and Action, 59, 65–70.
9. Perry, R.W. & Lindell, M.K. (1978), The psychological consequences of natural disaster: A review of research on American communities. Mass Emergencies, 3, 105–115.
10. Stossel, C. (2009), Familiarity as a factor in designing finger gestures for elderly users. In Proc. of the 11th Int. Conf. on Mobile HCI. New York: ACM.
11. http://www.commissarioperlaricostruzione.it/Informare/Situazione-della-popolazione-postsisma/Report-sulla-situazione-della-popolazione-post-sisma-al-28-giugno-2011
12. Turoff, M. et al. (2004), The design of emergency response management information systems. Journal of Information Technology Theory and Application, 5(4), 1–35.
13. Vercruyssen, M. (1997), Movement control and speed of behavior. In A.D. Fisk, W.A. Rogers (Eds.), Handbook of Human Factors and the Older Adult (pp. 55–86). San Diego, CA, US: Academic Press.
14. Weick, K.E. (1993), The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38(4), 628–652.
15. Rau, P.P. & Hsu, J. (2005), Interaction devices and web design for novice older users. Educational Gerontology, 39, 19–40.
16. Nielsen, J. & Mack, R.L. (1994), Usability Inspection Methods. John Wiley & Sons, New York.