
Human-Centered Design in Action: Designing and Performing Testing
Sessions with Users with Special Needs
Dominik Hagelkruys
University of Vienna, Austria
Faculty of Computer Science, Educational Technologies (CSLEARN)
[email protected]
Renate Motschnig
University of Vienna, Austria
Faculty of Computer Science, Educational Technologies (CSLEARN)
[email protected]
Christina Böhm
University of Vienna, Austria
Faculty of Computer Science, Educational Technologies (CSLEARN)
[email protected]
Vera Vojtova
Pedagogicko-psychologická poradna Brno
[email protected]
Maud Kotasová
Pedagogicko-psychologická poradna Brno
[email protected]
Kristyna Jurkova
Pedagogicko-psychologická poradna Brno
[email protected]
Abstract: The Human-Centered Design approach relies heavily on the inclusion of users in the
design process. In special needs projects in particular, it is vital to include the target users due to
their specific requirements for the end product. However, conducting testing sessions with users
with special needs always requires extensive preparation and careful consideration. In the case of
the LITERACY-project this special user group consists of people with dyslexia, a learning
disability. Moreover, testing is performed across three countries. This article describes and
analyzes the process of planning, conducting and evaluating such testing sessions with a special
user group. It discusses the considerations involved in planning the testing sessions, the special
preparation needed to create a suitable testing environment, the choice of feedback channels, and
the analysis of the data collected in the sessions.
Introduction
The LITERACY-project is an ongoing Europe-wide research project, which will conclude in February 2015 after
a runtime of three years. It is funded by the European Commission and aims to support adults and teenagers over 16
years of age with dyslexia (or limited reading literacy) and to improve their social inclusion. Its goal is to provide an
ICT solution which enables dyslexic youth and adults to acquire learning skills, accommodation strategies and
methods for succeeding at literacy-related tasks at work and at home. The ICT solution, a web portal, has been
designed to enable users with dyslexia to operate independently online by providing interactive services such as:
● a comprehensive analysis of one's strengths and weaknesses and, based on it,
● personalized e-learning tools and assistive technology, and
● a community zone with a specialized human-computer interface – helping users to socialize in ways they
find meaningful.
(LITERACY Project, 2012)
The LITERACY-Portal is based on a human-centered design (HCD) approach. HCD is a design approach which
places the focus on the users of a product. The HCD process is standardized (ISO 9241-210:2010) and follows the
philosophy that a product can only be good if it suits the needs of the people who use it. Through the application of
various HCD methods an initial working version of the website was created. This version is the product of
extensive involvement of dyslexic users, who provided valuable input and insight and participated in multiple
testing sessions.
The aim of the testing sessions described in this paper was to improve the usability of the LITERACY-portal.
Although the design of the portal was chosen and altered incrementally by dyslexic users, some of the features had
only been tested in theory or as detached parts of the portal. Therefore, feedback on the functionality of the portal
as a whole was necessary to further improve its usability.
The following section will analyze the special challenges connected with dyslexia and with creating a
dyslexia-friendly testing environment, as well as the difficulties in recruiting participants and supervisors.
Subsequently, the set-up of the face-to-face and group testing sessions will be described. Then the results of those
testing sessions will be presented and analyzed. The paper concludes with a short summary and a brief reflection on
the testing procedure.
Special challenges connected with the target audience
In order to obtain feedback that is valuable and analyzable, a lot of thought was put into the set-up of the testing
sessions. It was decided that task-based testing sessions, held in the form of face-to-face meetings in combination
with group sessions, would provide the best feedback for the design team. The following paragraphs describe the
problems and obstacles, but also the key factors that informed the decisions on how the testing sessions should be
held and which tools and methods should be used.
The first thing that needed to be considered were the targeted participants and their specific needs and traits.
Conducting testing sessions with a special needs user group is always challenging and usually requires some
alterations to standard procedures. In the case of this testing, four main issues that are partly interconnected were
identified:
● Creating a test environment that is dyslexia-friendly.
● Recruiting professional supervisors who are able to carry out the testing sessions.
● Recruiting participants who can take part in both testing sessions.
● Coordinating and facilitating the testing in three different countries.
Creating a dyslexia-friendly test-environment
Based on the knowledge acquired through previous testing sessions with dyslexic users and on relevant literature
such as “Understanding User Centred Design (UCD) for People with Special Needs” (Thimbleby, 2008), which
offers valuable insights into the complex topic of Human-Centered Design in a special needs context, a few crucial
criteria for success were identified.
The testing sessions with dyslexics needed to be conducted by dyslexia specialists, who would act as supervisors
during the process. The testing would optimally take place in a calm and comfortable environment without too
many distracting stress factors. The main goal was to keep the cognitive load of the participants as low as possible.
As the testing procedure itself was straining enough, it was crucial to minimize additional distractions or
complications. In order to create such a calm and open atmosphere, the ideas of the Person-Centered Approach
(Rogers and Roethlisberger, 1991; Motschnig and Nykl, 2011), active listening (Rogers and Farson, 1957) and
appreciative inquiry (Barret and Fry, 2005) were incorporated into the design of the testing.
Recruiting professional supervisors
Conducting testing sessions with participants from a special needs group requires highly professional supervisors
who are not only able to execute the testing but also able to empathize with the participants and respect their needs.
Therefore, supervisors were deemed a key success factor for these testing sessions. Due to the composition of the
LITERACY-Consortium, which included specialists from different areas and multiple countries, contacts with
dyslexia and special education experts had already been established. With the help of these contacts it was possible
to recruit teams in the Czech Republic, Hungary and Israel. All supervisors were provided with guidelines and
written procedures in order to ensure comparability between the countries.
Recruiting participants
Although the recruiting of participants was handled by partners in the respective countries, it proved to be rather
difficult. The pool of potential and willing participants is quite small, and requiring them to come on two different
occasions made recruiting even more difficult. During the planning of the testing it was decided that the best
feedback would come from “new” users who were using the portal for the first time. As multiple testing sessions
covering different parts of the LITERACY-portal had already been conducted, this further drained the pool of
potential participants.
Therefore it was decided to recruit only a small group of users per country and to keep the testing on a small scale.
In the end, 7 participants were recruited in each of the three countries, all of whom participated in the face-to-face
testing and the subsequent group session.
Coordinating between three countries
Coordinating and facilitating the testing in three different countries created a number of problems. Firstly, the
testing was planned and set up in English, as this made coordination within a multilingual team easier. The
guidelines and workflows therefore had to be translated into the respective native languages without distorting the
meaning of specific phrases. This was important because keeping the correct context of a question was the basis for
comparing the results from different countries.
Secondly, there was the problem of providing comparable tasks across multiple language versions of an online
portal. As the content of each language version differed slightly from the others, the task-based testing had to be
adapted to the content available in the specific version while keeping the tasks as similar as possible.
Lastly, there was the issue of different timeframes. The difficulty of recruiting suitable participants resulted in
different testing windows in the different countries. In order to create comparable results, the technical partners had
to ensure that the portal design stayed the same throughout the testing phases. This required constant coordination
between partners.
Set-up of the testing sessions
As described before, the testing consisted of two connected sessions: an individual face-to-face meeting and a
subsequent group session with all participants from the respective country. In the following, these two testing
procedures will be explained in detail.
Face-to-face sessions
The face-to-face sessions were conducted by an expert in the field of dyslexia and/or special education. The chosen
participants from each country were individuals with dyslexia who had no prior experience with the
LITERACY-portal. As the goal of the testing was to improve the usability of the online portal, the sessions were
designed to gather data in multiple ways: user feedback, system feedback and observation. In order to produce this
multi-faceted feedback, the testing was designed as a four-step process: exploring the portal without restrictions,
completing predefined tasks, filling out a questionnaire, and a concluding verbal discussion about the portal. The
basic testing procedure is listed below:
1. Introduce the participant to the LITERACY-project, its goals and the purpose of the testing: improving the
design and usability of the portal interface.
2. Open the LITERACY-portal and give the participant some time to freely browse the portal without any
instructions.
3. Provide the participant with the list of tasks and ask him/her to perform them one by one. Kindly ask them
to “think aloud”, i.e. to verbalize how they intend to go about solving each task.
4. After all tasks are finished, ask the participant to fill out the usability questionnaires.
5. Finally, ask the participant the following concluding questions:
● What is your overall impression regarding the portal?
● Would you use the portal in the future?
○ If yes, how would you use it?
● What are the portal's strengths?
● What are its weaknesses?
● What would need to happen for you to use the portal regularly?
Design of the task-based testing
While defining the tasks to be performed by the participants, a few issues had to be considered. Firstly, it was
important not to overwhelm dyslexic users with too many tasks. A multi-step testing procedure takes a lot of time
to finish and also strains the attention of the participants, as they have to switch their focus with every step. It was
therefore necessary to limit the number of tasks to a minimum while still gathering enough feedback to draw
reasonable conclusions. Secondly, the content differed between the portal languages, so the tasks had to be adapted
for each individual country. Lastly, it was crucial to find clear and concise wordings for the various tasks. As
dyslexics already have a hard time reading, it was important to lower their cognitive load during the tasks as much
as possible. If participants had trouble understanding the questions, the results would not be exploitable in a
meaningful way. Therefore, experts in dyslexia were asked to help with the wording of the tasks. Table 1 shows the
list of tasks used for the validation of the Czech-language portal:
1. Change the language settings to your native language.
2. Try to find some general information about dyslexia.
3. Find an article about how to succeed at your workplace.
   3.1. Find information that might help you to learn foreign languages.
   3.2. Get back to the home-screen.
4. You want to share your opinions and experiences as a dyslexic with others. Where would you do that?
   Find a suitable area on the portal.
   4.1. Go to the Forum and find a thread about the Literacy-Portal.
      4.1.1. Create a new topic describing your life-story (i.e.: “Peter's life story”).
      4.1.2. Post a comment in reply to an existing topic.
   4.2. Enter the chat module and see if someone is online.
      4.2.1. Make an entry into the chat.
      4.2.2. Log out of the chat-module.
   4.3. Read about LITERACY's latest activities via the Twitter and Facebook feeds.
5. You want to assess yourself with the help of the LITERACY-Portal. Find an area where you can do that.
   5.1. Fill out the initial assessment questionnaire.
6. Where would you search for tools that help you while using the computer?
   6.1. Where can you find contact information?
   6.2. Where would you look for a manual or other help on the portal?
   6.3. Do you think you can express your creative side through the portal? If yes, where?
Table 1: List of validation tasks in Czech
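Since the tasks had to keep the same structure across three language versions, one conceivable way to organize them is a small data structure in which the task identifiers stay fixed while the wording is localized. This is a hypothetical sketch only; the paper does not describe the project's actual tooling, and all names in it are illustrative:

// Hypothetical sketch: keeping task IDs identical across countries while
// localizing the wording, so that results remain comparable between languages.
interface ValidationTask {
  id: string;                    // e.g. "4.1.1" – shared across all countries
  text: Record<string, string>;  // task wording per language code
  subtasks?: ValidationTask[];
}

const task1: ValidationTask = {
  id: "1",
  text: {
    en: "Change the language settings to your native language.",
    // cs, hu, he versions would be filled in by the local teams
  },
};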
Creation of the questionnaires
In order to capture the input of the participants in an effective and productive way, the use of already existing
usability questionnaires was considered. For this reason multiple usability questionnaires were analyzed, including
“AttrakDiff” (User Interface Design GmbH, 2014), the “System Usability Scale” (Brooke, 2004), the “Computer
System Usability Questionnaire” (Lewis, 1995), “ISONORM 9241-110” (Prümper, n.d.), and “ISOMETRICS”
(Gediga & Hamborg, 1999). Unfortunately, none of these existing questionnaires met the requirements set for these
testing sessions. Most of the questionnaires were quite extensive and elaborate, which would be too much for the
participants to handle, considering that they had already finished a whole task-based testing procedure before
filling out the questionnaire. Additionally, almost all questionnaires included questions that were not applicable to
these testing sessions and would therefore confuse the participants. Furthermore, the wording of the questions was
quite abstract and sometimes used technical terminology that cannot be considered common knowledge, which
made using the questionnaires difficult. It was therefore decided to create new questionnaires based on
“ISONORM 9241-110” and the “System Usability Scale”.
While the “System Usability Scale”, which consists of only 10 questions, merely needed some rephrasing to make
it applicable for dyslexic users, “ISONORM 9241-110” had to be adjusted in multiple ways. The ISONORM
questionnaire is very detailed and therefore takes a long time to finish; even the short version consists of 21
questions. It was therefore shortened to 8 questions that complemented the results from the “System Usability
Scale”. Additionally, the original questionnaire layout with 7 answer options (worst – best) was changed to a
5-option layout in order to match the “System Usability Scale”. This change made the transition between the two
questionnaires easier and reduced the cognitive load for the participating users.
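To make the scoring concrete, the following sketch computes the conventional score for the standard 10-item, 5-option System Usability Scale (Brooke, 2004). It illustrates the standard procedure only; the adapted questionnaires used in this testing may have been scored differently, and the function name is purely illustrative:

// Standard SUS scoring (Brooke, 2004): odd-numbered items are positively
// worded (contribution = response - 1), even-numbered items are negatively
// worded (contribution = 5 - response); the sum is then scaled to 0-100.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS expects exactly 10 responses on a 1-5 scale");
  }
  const sum = responses.reduce(
    (acc, response, i) => acc + (i % 2 === 0 ? response - 1 : 5 - response),
    0
  );
  return sum * 2.5; // each item contributes 0-4, so the maximum sum is 40
}

// Example with hypothetical answers: susScore([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]) yields 85.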
Collecting of data and tracking of users
The set-up of this testing session made it possible to gather data through multiple channels: user feedback, system
records and observations from the supervising expert. In order to manage the gathered data more easily, these three
channels were structured through the provided instructions. This made it possible to collect feedback on specific
areas of the portal that is quantifiable and easily comparable. Nevertheless, users were still able to provide
unfiltered feedback without any boundaries at given points of the testing sessions.
The results of the task-based part of the session were captured in two ways. Firstly, the system tracked the
movements of the user on the portal and stored them for later analysis. Secondly, the supervisor filled out a
protocol noting the progress of the participant. Screenshots of both can be seen in Image 1.
Image 1: Supervisor-sheet and output of tracking tool
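The paper does not detail how the tracking tool is implemented. As a rough, hypothetical sketch under that caveat, client-side navigation logging of the kind described could look as follows (the interface, field names and buffering approach are all assumptions for illustration):

// Hypothetical sketch of client-side navigation logging; the actual
// LITERACY tracking tool is not described in the paper.
interface NavigationEvent {
  participantId: string; // anonymized ID assigned by the supervisor
  timestamp: string;     // ISO 8601 time of the page visit
  page: string;          // path of the visited portal page
  taskId?: string;       // current validation task, e.g. "4.3", if any
}

const events: NavigationEvent[] = [];

function logNavigation(participantId: string, taskId?: string): void {
  events.push({
    participantId,
    timestamp: new Date().toISOString(),
    page: window.location.pathname,
    taskId,
  });
}

// The buffered events could later be uploaded for analysis, e.g. to
// reconstruct the click path a participant took while solving task 4.3.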
The questionnaires were filled out through a web interface, which made analyzing the data easier and avoided
many of the problems that would arise from the multi-language approach and the independent execution of the
testing in the participating countries. The web interface was developed with the help of experts in dyslexia and
interface design as well as actual dyslexic users. Furthermore, examples and good practices were drawn from
sources such as the articles “Web Accessibility: Designing for Dyslexia” (Bell, 2009) and “Multimedia Software
Interface Design for Special-Needs Users” (Lányi, 2009), which helped in creating a suitable interface for dyslexic
users.
Image 2 shows a screenshot of the resulting interface.
Image 2: Online-questionnaire
Group sessions
One group session was conducted in each of the three countries. They were carried out by native experts in dyslexia
and attended by the participants of the preceding face-to-face sessions. The goal of these sessions was to gather
more feedback about the portal in a guided think-aloud process. In the basic procedure, the expert, acting as a
moderator, browsed through different parts of the portal and asked the participants for feedback, opinions, solutions
or alternatives. Generally the tour included most parts of the portal, and all feedback was documented in a protocol.
Example questions for each part of the portal were provided beforehand in order to start and guide the discussion.
The following is an example of how the different areas of the portal were discussed:
● What are your thoughts regarding the starting page? What impression does it make?
● Do you feel overwhelmed by the provided information?
● Is it hard to start navigating the portal? Do you know where to start?
The group session concluded with a general discussion about the portal and further suggestions for improvement.
Results
The results of the face-to-face as well as the group sessions were analyzed in detail. The task-based testing in
particular highlighted areas and parts of the navigation concept that did not work as intended and had to be
improved. An example of such an area, shown in Image 3, was uncovered by task 4.3, "Read about LITERACY's
latest activities via the Twitter and Facebook feeds":
Image 3: Results of task 4.3 (CZ-testing)
Combining this data with the verbal feedback provided by the participants made it possible to identify that the
social media quick-links, which had recently been added to the portal, confused the users. When asked to look for
feeds, they immediately tried to click on the Facebook or Twitter logo to share the current page via social media
and were confused when they did not find a feed.
The comments collected by the supervisors, in combination with the data from the user tracking, also provided
valuable insights into the thought processes of the participants. While analyzing these various pieces of data it
became clear, for example, that the log-in/sign-up interface was very confusing for the users and regularly created
problems. As a result, the interface will be redesigned so that it produces the system responses the users expect.
The results of the questionnaires also provided valuable input. As can be seen in Image 4, the main concern of
users was that the portal was inconsistent and that different functions were not integrated very well. As the portal
was still in the process of being assembled, this feedback was in line with the feedback from experts and designers
within the LITERACY-consortium. Nevertheless, the questionnaires also showed that the portal was easy to use
and did not require a lot of previous knowledge to operate, which was a main goal when designing the interface.
Image 4: Results of the adapted SUS questionnaire
The participants of the testing also provided some ideas to further improve the portal and its interface. One
suggestion was to add some fun and relaxing activities to the portal that could help users get into the right state of
mind for training exercises. Other suggestions were to provide a tutorial on how to use the text-to-speech software
properly and to add a list of dyslexia experts who could be contacted.
Conclusion
This paper illustrated the process of planning, designing, executing and analyzing testing sessions with a special
needs target group within the context of the international LITERACY-project, which develops an interactive
web application intended to support the social inclusion of users who are struggling readers. The aim of the testing
was to gather information regarding the usability and design of the web portal in order to make it easier to use,
navigate and understand. Because the testing sessions were conducted with special needs users, a lot of preparation
went into the planning and design of the testing. The specific needs and prerequisites of dyslexic users had to be
considered in order to create a suitable environment for the participants. Experts in the field of dyslexia were
therefore included in the planning phase of the sessions and in the adaptation of the instruments used for the
testing, and also acted as supervisors conducting the sessions. Multiple sources of feedback were created in order to
gather as much information as possible, resulting in a triangulation of oral user feedback, supervisor protocols and
system logs. This rich set of qualitative and quantitative data allowed for a detailed analysis of specific user
concerns and problems that surfaced during the testing sessions. The findings gathered through this testing directly
influence the design of the LITERACY-portal. The research group's experience with the HCD process was very
affirming; nevertheless, specific techniques needed to be sensitively, thoughtfully and skillfully adapted to the
special needs of the target users.
Acknowledgements
This work was supported by the European Commission (Grant agreement no: 288596). The authors thank their
consortium partners, in particular Vera Vojtova, Maud Kotasová, Éva Gyarmathy, Tsipi Egozi, Ofra Razel, Břetislav
Beránek, Jakub Hrabec and Jakub Marecek, for their collaboration, without which this paper could not have been
written.
References
Barret, F. J., & Fry, R. E. (2005). Appreciative inquiry: A positive approach to building cooperative capacity.
Chagrin Falls, Ohio.

Bell, L. (2009). Web Accessibility: Designing for Dyslexia. Retrieved October 15, 2013, from
http://lindseybell.com/documents/bell_dyslexia.pdf

Brooke, J. (2004). SUS – A quick and dirty usability scale. Retrieved December 5, 2014, from
http://www.usabilitynet.org/trump/documents/suschapt.doc

Gediga, G., & Hamborg, K.-C. (1999). IsoMetrics: An usability inventory supporting summative and formative
evaluation of software systems. In H.-J. Bullinger & J. Ziegler (Eds.), Human-Computer Interaction: Ergonomics
and User Interfaces, Proceedings of HCI International '99, Volume I. New Jersey: Lawrence Erlbaum Associates.

ISO 9241-210:2010. Ergonomics of human-system interaction – Part 210: Human-centered design for interactive
systems.

Lányi, C. S. (2009). Multimedia Software Interface Design for Special-Needs Users. In M. Khosrow-Pour (Ed.),
Encyclopedia of Information Science and Technology, Second Edition (pp. 2761-2766). Hershey, PA: IGI Global.

Lewis, J. (1995). IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for
use. International Journal of Human-Computer Interaction, 7(1), 57-78.

LITERACY Project (2012). Social Inclusion of People with Dyslexia. Retrieved December 4, 2014, from
http://www.literacyproject.eu/

Motschnig, R., & Nykl, L. (2011). Komunikace zaměřená na člověka: rozumět sobě i druhým [Person-centered
communication: Understanding oneself and others]. Prague: Grada.

Prümper, J. (n.d.). Fragebogen ISONORM 9241/110-S: Beurteilung von Software auf Grundlage der
Internationalen Ergonomie-Norm DIN EN ISO 9241-110 [Questionnaire ISONORM 9241/110-S: Evaluation of
software based on the international ergonomics standard DIN EN ISO 9241-110]. Retrieved December 5, 2014,
from http://www.seikumu.de/de/dok/dok-echtbetrieb/Fragebogen-ISONORM-9241-110-S.pdf

Rogers, C. R., & Farson, R. E. (1957). Active Listening. In R. G. Newman, M. A. Danziger, & M. Cohen
(Eds., 1987), Communication in Business Today. Washington, D.C.: Heath and Company.

Rogers, C. R., & Roethlisberger, F. J. (1991). Barriers and Gateways to Communication. Harvard Business Review,
91610, 105-111.

Thimbleby, H. W. (2008). Understanding User Centred Design (UCD) for People with Special Needs. In K.
Miesenberger, J. Klaus, W. L. Zagler & A. I. Karshmer (Eds.), Computers Helping People with Special Needs,
Proceedings of the 11th International Conference ICCHP, Lecture Notes in Computer Science, 5105 (pp. 1-17).
Berlin-Heidelberg: Springer.

User Interface Design GmbH (2014). AttrakDiff. Retrieved December 5, 2014, from http://attrakdiff.de/