Journal of Digital Information, Vol 7, No 1 (2006)
End-User Quality of Experience Oriented Adaptive Elearning System
Cristina Hava Muntean and Dr. Jennifer McManis
Performance Engineering Laboratory, Dublin City University, Glasnevin, Dublin 9, Ireland.
Tel: (+353) 1 700 7645 194 Fax: (+353) 1 700 5508
Email: {havac, mcmanisj}@eeng.dcu.ie
Abstract
With the proliferation of new devices and network technologies that allow access to the
Internet, providers of e-learning materials have to ensure that users have a positive
experience with their e-learning systems and are willing to re-use them. Adaptive Hypermedia
research aims to provide personalised educational material that ensures a positive learning
experience for the end-users. However, user experience is dependent not only on the content served
to them, but also on the user perceived performance of the e-learning system. This leads to a new
dimension of individual differences between Web users: the end-user Quality of Experience (QoE). We
have proposed a solution for Adaptive Hypermedia Systems (AHS) that provides satisfactory end-user
QoE through the use of a new QoE layer. This layer attempts to take into account multiple factors
affecting QoE in relation to the delivery of a wide range of Web components such as text, images,
video and audio.
The effectiveness of our QoE layer has been tested in comparison to a standard educational AHS and
the results of these tests are presented in this paper. Different educational-based evaluation
techniques such as learner achievement analysis, learning performance assessment, usability survey
and correlation analysis between individual student performance and judgment on system usability
were applied in order to fully assess the performance of the proposed QoE layer. Results of the tests
showed that the use of the QoE layer brought significant improvements in terms of user learning
performance, system usability and user satisfaction with the personalised e-learning system while not
affecting the user learning achievement.
Keywords
end-user QoE, adaptive hypermedia, e-learning, end-user perceived performance, learning
performance
1 Introduction
Extensive research in the area of Web-based adaptive hypermedia has demonstrated the benefit of
providing personalized content and navigation support for specific users or user categories. A
comprehensive review of adaptive hypermedia systems, the techniques used during the adaptation
process and the areas of applicability of these systems can be found in Brusilovsky 2001
and Brusilovsky 1996.
Web users differ in skills, aptitudes, goals and preferences for processing accessed information. They
may have different perceptions of the same content and performance factors. Finally, they may have
special needs due to disabilities. Therefore, Web-based Adaptive Hypermedia Systems (AHS) try
to capture and analyse these user-related features in order to optimise the user experience with the
Web site. A variety of AHS have been applied in the educational area, providing e-learning services.
This research area has attracted huge interest due to its capability for facilitating personalized e-learning, its distributed nature and its simplicity of interaction. Several good examples exist in the
academic community including ELM-ART II, AHA! and JointZone. These systems build a model of the
goals, knowledge and preferences of each individual person and use this model throughout the
interaction with the user in order to propose content and link adaptations that best suit e-learners. Lately, researchers have started to integrate learning styles into the design of AHS along with the
classic learner features. Several systems providing adaptation to users' learning styles have been
created, such as INSPIRE and AES-CS.
With the advance in computer and communication technology a variety of Internet access devices
(e.g. laptop, pocketPC, PDA, mobile phone) have been launched on the market. The type and
capacity of the access device, the network the device operates on, the available bandwidth, the state
of the network (which may vary dynamically over the course of a session) and the complexity of the
Web pages delivered all affect the quality of experience for the end-user. Thus, end-users of
educational and training services expect not only high-quality and efficient educational material but
also seamless integration of this material with their day-to-day operational environment and network framework. In
this context it is significant to highlight a new problem faced by network-based education over the
Internet: providing a good level of end-user perceived Quality of Service (QoS), also called Quality of
Experience (QoE).
Currently Adaptive Hypermedia Systems for Education (AHSE) place very little emphasis on QoE and
its effect on the learning process. This QoE-unaware approach is perhaps unsuited to a general
learning environment (Figure 1), where one can imagine a student with a laptop moving from a low
bandwidth home connection, to a higher bandwidth school connection, and potentially to a mobile
connection with widely varying bandwidth while on public transport. It should be noted
that some adaptive hypermedia systems have taken into consideration some performance features
(e.g. device capabilities, the type of the access, state of the network, etc.) in order to improve the
end-user QoE. For example, the GUIDE system uses hand-held units as tools for navigation and display
of an adaptive tourist guide. INTRIGUE, a tourist information system that assists the user in the
organization of a tour, provides personalized information that can be displayed on WAP phones.
Merida et al. have considered the HTTP protocol, the type of access and the server load in the design
of SHAAD. However, these account for only a limited range of factors affecting performance and do
not fully address QoE.
Figure 1. A New E-learning Environment
Therefore, adaptive hypermedia systems should also take into consideration QoE characteristics when
the user profile is built and regularly monitor in real-time any change in the system that might
indicate variations of QoE. These include changes in the user's operational environment and also
modifications of user behaviour that might indicate dissatisfaction with the service (such as an
abort action). This would allow for better Web content adaptation that suits varying delivery
conditions.
This paper presents an approach that introduces a new QoE-based content adaptation strategy that
enhances the functionality of a classic adaptive hypermedia system and aims to improve the end-user
QoE. The QoE-based enhancement (QoE layer) measures and analyses various factors that may affect
QoE. The QoE layer consists of several components (Figure 2). The Performance Monitor measures a
variety of performance metrics in order to learn about the Web user's operational environment
characteristics, changes in network connectivity between the user's computer and Web server, and
assesses the consequences of these changes on the user's QoE. This information is synthesized in the
Perceived Performance Model, which proposes strategies for tailoring Web content in order to
optimise QoE.
In order to demonstrate the benefits of the proposed QoE layer we have deployed it in the open-source AHA! system, creating the Quality of Experience-aware AHA! (QoEAHA). In this paper we present
results from subjective evaluation in the educational area. The goal of this evaluation was to assess
the learning outcome, learning performance, system usability and user QoE when the original AHA!
and the QoEAHA systems are used in a low bit rate home-like environment. The results indicated that
QoEAHA significantly improves performance and user satisfaction with their experience. The usage of
the QoE layer did not affect the user learning outcome.
2 Quality of Experience
The term Quality of Experience (QoE) relates to end-user expectations for QoS. QoE is defined by
Empirix as the collection of all the perception elements of the network and performance relative to
expectations of the users. The QoE concept applies to any kind of network interaction such as Web
navigation, multimedia streaming, voice over IP, etc. Depending on the type of application the user
interacts with, different QoE metrics that assess the user's experience with the system in terms of
responsiveness and availability have been proposed. QoE metrics include subjective elements and can
be influenced by any sub-system between the service provider and the end-user. ITU-T
Recommendation G.1010 provides guidance on the key factors that influence QoS from the
perspective of the end-user (i.e. QoE) for a range of applications that involve voice, video, images
and text.
In the area of World Wide Web applications, QoE has also been referred to as end-to-end QoS or end-user perceived QoS. Measuring end-to-end service performance as it is perceived by end-users is a
challenging task. Previous research (Bhatti et al. 2000, Krishnamurthy et al. 2000, Bouch et al. 2000)
shows that many QoS parameters such as download time, perceived speed of download, successful
download completion probability, user's tolerance for delay, and frequency of aborted connections
factor into user perception of provided quality. Measurement of these parameters may be used to
assess the level of user satisfaction with performance. The interpretation of these values is complex,
varying from user to user and also according to the context of the user task.
End-user perceived QoS has also been addressed in the area of multimedia streaming. Research such
as (Blakowski 1996, Ghinea 1998, Watson 1997) assesses the effect of different network-centric
parameters (e.g. loss, jitter, delay), the continuous aspect of multimedia components that require
synchronization, or the effect of multimedia clip properties (e.g. frame size, encoding rate) on end-user perceived quality when streaming different types of content.
In this paper QoE is addressed only in the area of Web-based AHS with applicability in education.
Typical e-learning systems may involve a combination of text, images, audio and video, and their
quality of service is based on the combination of all of these rather than any individual component.
The educational context also has its own set of requirements and user expectations in terms of
learning outcome and it is against these that user perceptual quality will be evaluated.
3 QoE-aware Adaptive Hypermedia System for Education
Starting from a generic architecture of an AHS that consists of a Domain Model (DM), a User Model
(UM), an Adaptation Model (AM) and an AHS engine (Wu 2001), we have enhanced the system with a
QoE layer, presented in Muntean 2004a and Muntean 2004b. The QoE layer includes the
following new components (see Figure 2): the Perceived Performance Model (PPM), the Performance
Monitor (PM), the Adaptation Algorithm (AA) and the Perceived Performance Database (PP DB).
Figure 2. QoE-aware AHSE Architecture
3.1 Performance Monitor
The PM is in charge of monitoring and measuring, in real time, performance metrics which are then
used to infer information regarding user QoE. The performance metrics include download time, round-trip time, throughput and user behaviour-related actions (e.g. abort requests). The utility of a session
(Bouch et al. 2000) is also calculated and reflects the fact that users become less tolerant to delay as
time passes. Tests over high-speed connections showed that a 10 sec download time was considered
acceptable by 95 % of the participants during the first four Web page accesses and still acceptable
by 80 % of the participants during the next six page accesses, but was acceptable for only 60 % of
accesses from the 11th page onwards (Bouch et al. 2000). Similar tests were performed for download
time values between 6 sec and 16 sec and for different types of connections. The conclusion of these
results was that the download time should improve over the duration of a session in order to keep
the client's experience of navigating a Web site positive.
The information gathered by the PM during the user access sessions is delivered to the PPM. The
mechanism used to measure the performance metrics is based on filtering the TCP packets that carry
the Web traffic and on monitoring the signals exchanged via the HTTP protocol.
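To make the derivation of these metrics concrete, the sketch below shows how the basic quantities for a single page access could be computed from raw observations (timestamps, bytes received, abort events). Function and field names are illustrative assumptions on our part; the actual PM works at the TCP packet level, not from application-level timestamps:

```python
def page_metrics(request_ts, response_ts, bytes_received, aborted=False):
    """Derive basic per-access performance metrics from raw observations.

    Illustrative only: the real Performance Monitor filters TCP packets
    on the server side rather than timing requests at application level.
    """
    download_time = response_ts - request_ts            # seconds
    throughput = bytes_received / download_time if download_time > 0 else 0.0
    return {
        "download_time_s": download_time,
        "throughput_Bps": throughput,                   # bytes per second
        "aborted": aborted,                             # user hit "stop"?
    }
```

A 56 kbps link delivering a 56 kB page, for instance, would yield a download time of roughly 8 seconds, which such a record would capture alongside any abort action.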
3.2 Perceived Performance Model and Perceived Performance Database
The PPM has the important function of providing a dynamic representation of the user-perceived
QoE. It models the performance-related information in order to learn about the characteristics of the
user's operational environment, about changes in network connectivity and about the consequences of
these changes on the user's quality of experience. The PPM also considers the user's explicitly
expressed subjective opinion about his/her QoE. This introduces a degree of subjective assessment
that is specific to each user. The user-related information is modelled using a stereotype-based
technique that makes use of probability and distribution theory (Muntean 2004a) and is saved in the
PP database.
Finally, the PPM suggests the optimal Web content characteristics (e.g. the number of embedded
objects in the Web page, the size of the base Web page without components and the total size of the
embedded components) that would best meet the end-user's QoE expectations. The PPM aims to
ensure that the access time per delivered page, as perceived by the user, respects the user's
tolerance for delay and does not fall outside the satisfaction zone.
Based on a survey of the current research into user tolerance for delay, three zones of duration that
represent how users feel were proposed in (Sevcik 2002): zone of satisfaction, zone of tolerance and
zone of frustration. A number of studies (Bhatti et al. 2000, Bouch et al. 2000, Servidge
1999, Ramsay et al. 1998) on the effects of download time on users' subjective evaluation of Web
site performance indicated that users have thresholds (user tolerance) for what they consider an
adequate or reasonable delay. A user is "satisfied" if a page is loaded in less than 10-12 sec;
higher values cause disruption and users become distracted. Any delay higher than 30 sec causes
frustration. At the same time, it is significant to mention that when the user is aware of the
existence of a slow connection, he/she is willing to tolerate a delay that averages 15 sec but does
not exceed 25 sec (Chiu 2001).
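These thresholds can be summarised as a simple classification of a measured download time into the three zones. The concrete cut-off values below are taken from the figures cited above, while the function itself is our own illustrative sketch:

```python
def delay_zone(download_time_s, slow_connection_known=False):
    """Classify a page download time into Sevcik's three zones.

    Thresholds follow the values cited in the text: roughly 10 sec for
    satisfaction and 30 sec for frustration, shifted to 15 sec (tolerated
    on average) and 25 sec (upper limit) when the user knows the
    connection is slow (Chiu 2001). Illustrative sketch only.
    """
    satisfied_limit = 15 if slow_connection_known else 10
    frustrated_limit = 25 if slow_connection_known else 30
    if download_time_s <= satisfied_limit:
        return "satisfaction"
    if download_time_s <= frustrated_limit:
        return "tolerance"
    return "frustration"
```

The PPM's goal, in these terms, is to choose page characteristics so that the classification stays at "satisfaction" throughout the session.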
3.3 Adaptation Algorithm
The objective of the Adaptation Algorithm (AA) is to determine and apply the correct transformations
on the personalised Web page (according to the User Model) in order to match the PPM suggestions on
the Web page characteristics. Two types of transformations are considered: modifications in the
properties of the embedded components (presented as concepts in the DM) and/or elimination of
some of the components. These actions are applied to those components the user is the least
interested in, as recorded by the UM. The work presented in this paper considers Web pages that
consist of text and images. Since images contribute the largest share of a Web page's total size,
they were the only components taken into consideration by the algorithm in this work.
In order to match the PPM suggestion related to the total size of the embedded images, image
compression is first applied and, if further reduction is necessary, image elimination is applied.
Different compression rates (expressed as percentage) are applied to each image depending on: the
total reduction suggested on the total size of embedded images, the image size and user interest in
the image as specified in the UM. Thus, if a user is more interested in image A than image B, image A
will be reduced less than image B. If one of the computed compression rates cannot be applied to an
image (e.g. because the resulting quality would be lower than acceptable to end-users), an image
elimination strategy is applied instead. When an image is eliminated, a link to it is
introduced; in this way, if a user really wants to see the image, the link offers this
possibility. The algorithm used for image compression/elimination in the tests reported in this paper
is described in Muntean 2004c. Naturally, the quality of an image relative to its size depends on the
sophistication of the compression technique, as does the decision regarding user perception of image
quality. This is a subject of ongoing research.
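The published algorithm is described in Muntean 2004c; as a rough illustration of the idea, the sketch below distributes a required size reduction across embedded images in proportion to size and inverse user interest, falling back to elimination when the required scale factor would push quality below an acceptable floor. The weighting scheme, the names and the `min_quality` threshold are our own assumptions, not the published method:

```python
def allocate_compression(images, target_total_size, min_quality=0.3):
    """Distribute a size reduction across embedded images.

    `images` maps name -> (size_bytes, user_interest in [0, 1]).
    Larger, less interesting images absorb more of the reduction; an
    image whose required scale factor falls below `min_quality` is
    scheduled for elimination (to be replaced by a link in the page).
    Illustrative weighting only, not the algorithm of Muntean 2004c.
    """
    total = sum(size for size, _ in images.values())
    if total <= target_total_size:
        return {name: 1.0 for name in images}, []   # nothing to do
    reduction = total - target_total_size
    # weight = size * (1 - interest): big, low-interest images shrink most
    weights = {n: s * (1.0 - i) for n, (s, i) in images.items()}
    wsum = sum(weights.values()) or 1.0
    plan, eliminate = {}, []
    for name, (size, _interest) in images.items():
        cut = reduction * weights[name] / wsum
        scale = max(0.0, (size - cut) / size)       # fraction of size kept
        if scale < min_quality:
            eliminate.append(name)                  # replace with a link
        else:
            plan[name] = round(scale, 2)
    return plan, eliminate
```

For two equally sized images where the user cares far more about A than B, B absorbs most of the reduction, matching the behaviour described above.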
Further extension of the algorithm may consider multimedia clips (audio and/or video) that could be
embedded in a Web page. For this situation, techniques that involve size and quality adjustments for
audio and video can be applied (e.g. for video, compression techniques involving frame rate,
resolution and colour depth modifications; for audio, silence detection and removal). These
adaptation techniques are studied in the multimedia networking area and are not addressed in this
paper. In addition, the component elimination strategy may be replaced by one that substitutes a
less bandwidth-intensive equivalent for the eliminated information. For example, if video clips could
not be supported, an image or a sequence of images could be sent instead.
4 Assessing the Benefits of the QoE Layer
For illustration and testing purposes the proposed QoE Layer was deployed on the open-source AHA!
system, creating QoEAHA. The AHA! system was developed at the Eindhoven University of Technology,
in the Database and Hypermedia group. The system is used in the educational area as an adaptive
hypermedia courseware application that supports the "Hypermedia Structures and Systems" course
(TU/e course, DeBra 1997). AHA! was used in order to demonstrate the benefits brought by the QoE
Layer.
Among the advantages of the AHA! system are the following:
AHA! has been extensively tested and accepted by the research community.
AHA! is a simple general-purpose hypermedia adaptive system.
The AHA! architecture respects the general reference AHAM model (DeBra 1998).
AHA! is open source and therefore allows for extensibility.
The AHA! system also provides an adaptive tutorial as testing material. The content of this tutorial was
used as educational material for the students in the experimental tests performed during our
research. As the material was already designed prior to the proposal of the QoE Layer, it provides
independent testing material for the subjective evaluation.
4.1 Objectives of the Evaluation Experiment
The QoE evaluation investigates the feasibility and usability of applying the QoE Layer in order to
support performance-based adaptation. This adaptation is performed based on the end-user
perceived performance and their experience with the adaptive system when interacting with the
system in a low bit rate operational environment (connection bandwidth up to 128 kbps).
The objectives of the experiment were the following:
To investigate the impact of QoE extension on student performance
To assess the usability and effectiveness of the QoEAHA system in comparison to the original
AHA! system
To determine the improvements brought by the QoEAHA in terms of end-user QoE
The impact of the QoE Layer on student performance was investigated by comparing the performance
of the students when the two systems, AHA! and QoEAHA, were used. Students' performance was
assessed in terms of the two most important metrics: learner achievement and learning performance.
The number of revisited web pages was also investigated.
Usability evaluation was performed through an online usability questionnaire filled out by the
students after they completed a study task.
The analysis of students' QoE was performed through an online questionnaire that assessed user
opinion in relation to performance issues and user satisfaction with the perceived quality.
4.2 Setup Conditions
The experiments took place in the Performance Engineering Laboratory, School of Electronic
Engineering, Dublin City University. A task-based scenario involving an interactive study session was
developed and carried out in laboratory settings. The test environment was designed to be uniform
for all participants. The tests took place in a closed, medium-sized laboratory room where no other
people were allowed in and no other activities were performed. The room had no windows and the
level of artificial light was the same for all participants.
The laboratory-network setup used for testing involved four Fujitsu Siemens desktop PCs with
Pentium III (800 MHz) processors and 128 MB of memory, an IBM NetFinity 6600 Web server with two
Pentium III (800 MHz) processors and 1 GB of memory, and a Fujitsu Siemens router with a Pentium III
(800 MHz) processor and 512 MB of RAM on which the NISTNET network emulator was installed.
NISTNET, which allows for the emulation of various network conditions characterized by a given
bandwidth, delay, loss rate and loss pattern, was used to create a low bit rate modem-like operational
environment with a 56 kbps connection speed (Figure 3). This setup offers similar connectivity to that
experienced by residential users. The emulated network conditions determined performance-related
adaptations when the QoEAHA was used.
Figure 3. Laboratory-Network Configuration for the Subjective Testing
The subjects involved in this study were forty-two postgraduate students from the Faculty
of Engineering and Computing at Dublin City University. They were randomly divided into two groups.
One group used the original AHA! system, whereas the second used QoEAHA. The subjects were
not aware of which system version they were using during the experiment. No time limit was
imposed on the execution of the required tasks. None of the students had previously used either
version of the AHA! system and none of them had accessed the test material prior to taking the
tests. Therefore no previous practice with the environments was assumed for any of them. The
material on which the students performed the task consisted of the original adaptive tutorial
delivered with the AHA! system version 2.0.
Interactive Study Session
The students were asked to complete a learning task that involved the study of the AHA! installation
chapter from the AHA! tutorial over a 56 kbps connection speed. At the start of the study session the
subjects were asked to read a short explanation concerning the use of the system and the required
duties. Their duties were as follows:
Complete an online Pre-Test evaluation consisting of a questionnaire with six questions
related to the learning topic. The test is used to determine the subject's prior knowledge of this domain.
Log onto the system and proceed to browse and study the material.
Complete an online Post-Test evaluation at the end of the study period. The Post-Test
consists of a questionnaire with fifteen questions that test recollection of facts, terms and concepts
from the supplied material, as suggested in Bloom's taxonomy.
Answer a Usability Questionnaire consisting of ten questions related to navigation,
accessibility, presentation, perceived performance and subjective feedback.
In order to fully assess the subjects' learning achievement, both Pre-Test and Post-Test questionnaires
(Muntean 2005) were devised from the four types of test items most commonly used in the
educational area: "Yes-No", "Forced-Choice", "Multi-Choice" and "Gap-Filling". These test
items have different degrees of difficulty, and their corresponding answers were assigned weights in
the final score accordingly. The maximum score for the Pre-Test is 10 points and the maximum score
for the Post-Test is 30 points. The final scores were normalized to the range 0 to 10.
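The normalization step is simple linear scaling onto a common 0-10 scale so that Pre-Test (max 10) and Post-Test (max 30) scores are directly comparable; a minimal sketch (the function name is ours):

```python
def normalize_score(raw_score, max_score, scale=10.0):
    """Map a weighted raw test score onto the common 0-10 scale used
    in the paper (Pre-Test max 10 points, Post-Test max 30 points)."""
    return round(raw_score * scale / max_score, 2)
```

A Post-Test raw score of 21 out of 30, for example, normalizes to 7.0.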
4.3 Learner Achievement
Learner achievement is defined as the degree of knowledge accumulation by a person after studying
certain material. It continues to be a widely used barometer for determining the utility and value of
distance learning technologies.
During the study-based scenario, learner achievement was assessed by comparing the Pre-Test and Post-Test scores achieved by the subjects using the QoEAHA and AHA! systems respectively. The results of
the Pre-Test and Post-Test are shown in Table 1 and Table 2.
System     Mean Score   Min Score   Max Score   SDEV
AHA!       0.35         0.0         2.0         0.552
QoEAHA     0.30         0.0         2.0         0.530
Table 1. Pre-Test Results
System     Mean Score   Min Score   Max Score   SDEV
AHA!       6.70         4.30        9.30        1.401
QoEAHA     7.05         4.60        9.0         1.395
Table 2. Post-Test Results
A two-sample T-Test analysis, with equal variance assumed, performed on the Pre-Test scores shows
that statistically both groups of students had the same prior knowledge of the studied subject
(significance level alpha=0.01, t=0.21, t_critical= 2.42, p(t)=0.41). This result means that the learner
achievement can be assessed by processing only the Post-Test score.
Following the Post-Test evaluation, the mean score of the subjects that used QoEAHA was 7.05
and the mean score of those that used AHA! was 6.70. A two-sample T-Test analysis on these mean
values does not indicate a significant difference in the final marks of the two groups of users
(alpha=0.05, t=-0.79, t_critical=1.68, p(t)=0.21). Therefore it can be stated that there is no
significant difference in learning outcome between the users of the QoEAHA and AHA! systems.
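The comparisons above use a standard two-sample T-Test with equal variances assumed (pooled variance). A minimal stdlib-only sketch of the statistic being computed follows; the critical value and p-value would then be read from the t distribution with the returned degrees of freedom:

```python
from statistics import mean, variance

def pooled_t_statistic(a, b):
    """Two-sample t statistic with equal variances assumed, as used for
    the Pre-Test and Post-Test group comparisons. Returns (t, df)."""
    na, nb = len(a), len(b)
    # pooled sample variance across the two groups
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2
```

With the two groups' individual scores as inputs, |t| below the critical value at the chosen alpha leads to the "no significant difference" conclusions reported here.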
Since the answers for three questions from the Post-Test questionnaire required the subjects to study
the images embedded in the Web pages affected by performance-based adaptations, an analysis of
the students' learning outcome on these questions was performed. After the scores related to these
three questions were normalized to the range 0 to 10, the mean value of the students' scores was 6.30
for the QoEAHA group and 6.40 for the AHA! group. A two-sample T-Test analysis, with equal variance
assumed, again indicates with a 99% confidence level that there is no significant difference in the
students' learning achievements (t=-0.08, t_critical=2.71, p(t)=0.93, alpha=0.01). This
result is very important, as an adaptive degradation of up to 34 % in image quality was performed by
the QoEAHA.
In summary, these test results indicate that the QoEAHA system did not affect the learning outcome
and offered similar learning capabilities to the classic AHA! system.
4.4 Learning Performance
The term learning performance refers to how fast a study task (learning process) takes place. The
completion time for a learning session (Study Session Time) is measured from the start of the session,
when the subject logs into the system and starts to study, until the student starts answering the Post-Test questionnaire.
The distribution of the Study Session Time measured for the students involved in performing the
required learning task using the AHA! and QoEAHA systems respectively is presented in Figure 4.
Figure 4. Distribution of the Study Session Time Measured for the Students Involved in the Learning
Task
On average, students that used the QoEAHA system (Average Study Time = 17.77 min)
performed better than those that used AHA! (Average Study Time = 21.23 min). This fact was
confirmed at a 99% confidence level by a T-Test analysis. The large majority of the students
that used QoEAHA (71.43 %) performed the task in less than 20 minutes, with a large number of
students (42.87 %) requiring between 15 and 20 minutes. In comparison, when the AHA! system was
used, only 42.85 % of the students finished the learning task within 20 min. The majority of
them (71.42 %) completed it in less than 25 minutes, with the largest group of students (28.57 %)
completing in the interval 20-25 minutes (Table 3).
Study Time        QoEAHA                   AHA!
Interval (mins)   Number of Students (%)   Number of Students (%)
0-10              9.50                     0.0
0-20              71.43                    42.85
0-25              85.71                    71.42
20-25             14.28                    28.57
25-35             14.29                    28.57
Table 3. Percentage of Students that Completed the Learning Task within Different Periods of Time
Apart from the Study Session Time, the Number of Accesses to a page performed by a person was also
measured in order to investigate the students' learning performance. This metric can provide an
indication of the quality of learning. Any re-visit to a page may indicate that the student was not
able to recall the information provided on the page, and thus that the learning process was of poor quality.
On average, the students from the QoEAHA group performed a smaller number of re-visits (avg. = 1.40)
to a page than those from the AHA! group (avg. = 1.73). An unpaired two-tailed T-Test with unequal
variance assumed confirmed with 92% confidence that there is a significant difference in the number
of visits performed by a student between the two versions of the AHA! system.
Another important metric for assessing the quality of the learning process is Information Processing
Time per page (IPT/page). IPT represents the time taken by a student to read and assimilate the
information displayed on a Web page. It was measured from the moment when the web page was
delivered and displayed on the computer screen until the user sent a request for another page. The
Web page was not loaded in a progressive way. The test results indicated that on average a student
spent less time per page processing the information when the QoE-aware version was used
(IPT=4.31 min) than when the AHA! system was used (IPT=4.95 min).
Summarising these results, the students that used the QoEAHA system had shorter Study Session Times
than those that used the AHA! system. This was due to the fact that the material was delivered
faster. Since the download time per page did not exceed the user tolerance for delay threshold, the
students were constantly focused on their task, resulting in shorter Information Processing Time per
page as well. Results showed that an improvement of 16.27 % in the Study Session Time for the whole
learning session was obtained when the QoE-aware version was used. On average, an improvement of
26.5 % in the access time per page was obtained. An access time per page no higher than 12 sec,
provided by the QoE-aware system, ensured a smooth learning process. This observation is confirmed
by the number of re-visits to a page (on average a 19 % decrease with QoEAHA) and the information
processing time per page (on average a 13 % decrease with QoEAHA).
4.5 Usability Assessment
The main goal of the usability evaluation strategy is to measure the usability and effectiveness of the
QoE-aware AHA system in comparison to the original AHA! system. The methodology of study involved
the usage of the online questionnaire technique. This is one of the most widely used techniques in the
education area.
At the end of the interactive study session both groups of subjects were asked to complete an online
usability evaluation questionnaire consisting of ten questions with answers on a five-point scale
(1 - poor to 5 - excellent). The questions were devised to respect the widely used guidelines suggested by
Preece for evaluating Web sites. They relate to navigation, presentation, subjective feedback,
accessibility and user perceived performance. The accessibility and user perceived performance
questions assess the end-user QoE. Four questions of the survey relate to these two categories. These
four questions assess user opinion in relation to the overall delivery speed of the system (Q6), the
download time of the accessed information in the context of Web browsing experience (Q7), the user
satisfaction in relation to the perceived QoS (Q9) and whether the slow access to the content has
inhibited them or not (Q5). The results of the QoE related questions for both AHA! and QoEAHA
systems are graphically presented in Figure 5.
Figure 5. Usability Evaluation Results on Questions that Assessed the End-User QoE
As seen from the chart, the QoEAHA system provided a better QoE for the end-users, improving user
satisfaction, which was above the "good" level for all questions. The AHA! system scored just above
the "average" level, significantly lower than QoEAHA. This good performance was obtained in spite of
the subjects using a slow connection (56 kbps) during the study session without being explicitly
informed about this. Overall, the mean value of the QoE usability assessment was 4.22 for QoEAHA
and 3.58 for AHA!, an improvement of 17.8% brought by the QoEAHA system. A two-sample T-Test
analysis on the results of these four questions confirmed with a confidence level above 99% (p < 0.01)
that users' opinion of their QoE is significantly better for QoEAHA than for AHA!.
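The two-sample T-Test used in this comparison can be reproduced with a short standard-library sketch. The per-student score lists below are illustrative placeholders, not the study's actual data; the `welch_t` helper implements the unequal-variance (Welch) form of the test statistic:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample (Welch) t statistic and approximate degrees of freedom."""
    va, vb = variance(a), variance(b)        # sample variances (n - 1 denominator)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                  # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical per-student mean scores on the four QoE questions (1-5 scale)
qoeaha_scores = [4.5, 4.0, 4.25, 4.5, 3.75, 4.25, 4.0]
aha_scores    = [3.5, 3.75, 3.25, 4.0, 3.5, 3.75, 3.25]
t_stat, dof = welch_t(qoeaha_scores, aha_scores)
```

A positive `t_stat` indicates the QoEAHA group rated its QoE higher; the p-value would then be read from the t distribution with `dof` degrees of freedom.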
The usability assessment on the other questions related to the navigation and presentation features
achieved an average score of 3.83 for AHA! and 3.89 for QoEAHA, demonstrating that these features
were not affected by the addition of the QoE enhancements.
Finally, an overall assessment in which all ten questions were considered of equal importance shows
that the students found the QoEAHA system (mean value = 4.01) significantly more usable than the
AHA! one (mean value = 3.73). These results were also confirmed by an unpaired two-tailed T-Test
(t = 2.44, p < 0.03) at a 97% confidence level. This increase of 7.5% in the overall QoEAHA usability
was mainly achieved due to the higher scores obtained on the questions related to end-user QoE.
4.6 Correlation Analysis
One aspect worth examining is whether there is any correlation between the performance of
individual students and their perception of system usability. Therefore, the goal of this analysis,
which computes the Spearman coefficient, is to examine whether students who performed well in the
Post-Test evaluation thought that the system was more usable, while students with much lower scores
expressed negative opinions. A strong correlation between the two sets of results (Post-Test and
Usability) would discredit to a certain extent the results of the usability evaluation experiment.
The correlation analysis has been performed for both the QoEAHA and AHA! systems. For QoEAHA the
value of the Spearman coefficient was rs = 0.23, while for AHA! it was rs = 0.03. Values lower than
0.33 indicate a weak correlation, while values higher than 0.67 indicate a strong correlation. As both
computed coefficients are below 0.33, there is only a weak correlation between the two data sets.
Summarising the results, no meaningful correlation was found between the students' learning
outcomes and their judgment of the system usability. The opinions expressed by the students in the
usability questionnaire were not influenced by their final scores in the Post-Test evaluation.
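The Spearman coefficient used in this analysis is the Pearson correlation computed on ranks. A minimal standard-library sketch follows; the helper names are our own and the score lists in any use would be the paired Post-Test and usability scores per student:

```python
def ranks(xs):
    """Assign ranks to values, giving tied values their average (mid) rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of values tied with xs[order[i]]
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = midrank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A value near 0 (as found for both systems) means a student's Post-Test rank says little about their usability-rating rank.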
5 Conclusions
This paper has proposed the Quality of Experience (QoE) as another dimension of user
characterisation that should be taken into consideration by the personalisation process provided by
adaptive hypermedia applications. QoE is directly influenced by the operational environment through
which the user interacts with the AHS (bandwidth, delay, loss, device capabilities, etc.) and by the
user's subjective assessment of the perceived performance. The goal of any AHS should be not only to
provide the content that best suits the user's goals, knowledge or interests, but also to provide the
best content that fits the user's operational environment. In this context we have proposed a
QoE-Layer enhancement for AHS that analyses some key factors that influence QoE and correlates
their values with the Web page characteristics that provide the best QoE for the end-user.
The QoE layer was implemented as an independent module providing an extra layer of adaptation
(performance-based content adaptation) on top of the personalised content generated by an AHS. It
can be easily integrated with a classic AHS that satisfies the following conditions: the application
domain is defined as a collection of concepts and concept relationships (Domain Model), and the
system builds a user profile and maintains a sorted list (e.g. by percentage) of the user's interest in
the concepts defined in the Domain Model.
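The integration conditions above can be sketched as a minimal data structure. All names here are hypothetical illustrations of the two requirements (a concept-based Domain Model and a ranked per-user interest list), not the actual AHA! or QoE-layer interfaces:

```python
from dataclasses import dataclass, field

@dataclass
class DomainModel:
    """Application domain as a collection of concepts and concept relationships."""
    concepts: set
    relations: list          # e.g. (prerequisite_concept, concept) pairs

@dataclass
class UserProfile:
    """User model holding a percentage-interest value per Domain Model concept."""
    interest: dict = field(default_factory=dict)   # concept -> % interest

    def ranked_concepts(self):
        """Concepts sorted by decreasing interest -- the sorted list an AHS
        must expose for a performance layer of this kind to integrate with it."""
        return sorted(self.interest, key=self.interest.get, reverse=True)
```

For example, a profile with interests `{"networking": 50.0, "qos": 30.0, "history": 10.0}` would rank `networking` first, so content about it would be prioritised when page characteristics must be adapted.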
For evaluation purposes the QoE Layer was deployed on the open-source AHA! system, creating
QoEAHA. QoEAHA was tested in the educational area. This paper presents a study on the impact of
the usage of the QoE Layer as part of an adaptive e-learning system when the students access the
educational material using a low bit rate, home-like operational environment. Different
educational-based evaluation techniques, such as learner achievement analysis, learning performance
assessment, usability survey and correlation analysis between individual student performance and
judgment on system usability, were applied in order to fully assess the performance of QoEAHA.
The most significant conclusions drawn from the subjective testing presented in this paper are the
following:
- The usage of the QoE Layer brought significant learning performance improvements. These
improvements were achieved because the educational material is delivered faster and the students
remain constantly focused on performing their tasks. Long periods of waiting for a required page
annoy people and disturb their concentration on the task.
- A 16.27% improvement in the execution time of a learning task was obtained when QoEAHA was
used. Most of the QoEAHA group students (71.43%) finished the task in less than 20 minutes, while
only 42.85% of the AHA! group students finished in the same period of time.
- The students from the QoEAHA group performed, on average, a smaller number of re-visits to a
Web page than those from the AHA! group.
- An overall assessment of system usability has shown that students considered the QoEAHA system
significantly more usable than AHA!. QoEAHA achieved a 7.5% increase in the usability survey results
due to the higher marks awarded on the QoE-related questions. Focusing only on the end-user
QoE-related usability, an improvement of 17.8% was obtained.
- The QoE Layer does not affect the learning outcome of the students. Both groups of students
received similar marks on the final evaluation test. Therefore, QoEAHA offers learning capabilities
similar to those of the classic AHA! system.
6 Further Work
The proposed QoE layer has been shown to bring improvements to learning performance for material
consisting of text and images delivered in a low bitrate environment. Further work is necessary to
explore its effectiveness in a wider range of situations. Two possible directions are the extension to
multimedia content where performance problems may arise even in higher bandwidth environments
and the application to Adaptive Hypermedia Systems in areas other than education.
The delivery of multimedia content to end-users over heterogeneous networks with variable delivery
conditions presents significant challenges. We intend to broaden the use of the QoE layer to
applications that deliver personalised multimedia content to e-learners. The extended QoE layer will
monitor and analyse in real-time the values of multimedia-streaming-related parameters (e.g. delay,
loss, jitter, end-user perceived multimedia quality estimation metrics) and will make suggestions
about the optimal type and characteristics of multimedia stream (bit rate, frame rate, resolution)
delivered to the user in order to provide a good level of QoE. The adaptation algorithm will consider
new adaptation strategies for the multimedia clips that may involve size and quality adjustment
techniques for audio and video and, in the worst case, substitution with alternative forms of material
such as sequences of images.
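The planned adaptation step could take a form like the following sketch: measured delivery conditions are mapped to suggested stream characteristics, with the worst-case fallback to an image sequence. The function name, thresholds and parameter values are purely illustrative assumptions, not part of the implemented system:

```python
def suggest_stream(bandwidth_kbps, loss_pct):
    """Hypothetical mapping from measured delivery conditions to suggested
    multimedia stream characteristics (all thresholds are illustrative).
    Worst case: substitute the clip with a sequence of still images."""
    if bandwidth_kbps < 64 or loss_pct > 10:
        return {"type": "image-sequence"}
    if bandwidth_kbps < 256:
        return {"type": "video", "bitrate_kbps": 128, "fps": 10, "resolution": "176x144"}
    if bandwidth_kbps < 1000:
        return {"type": "video", "bitrate_kbps": 384, "fps": 15, "resolution": "352x288"}
    return {"type": "video", "bitrate_kbps": 768, "fps": 25, "resolution": "704x576"}
```

In a real deployment the inputs would come from the real-time monitoring of delay, loss and jitter described above, and the output would feed the content-selection stage of the AHS.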
Another direction to explore is to combine the QoE layer with adaptive hypermedia systems applied in
other areas, such as on-line information systems, and to investigate the usability of and benefits
brought by the new system. These systems differ from educational ones by providing a larger
navigational space and greater flexibility for users to navigate the hyperspace, and their users have
different objectives.
References
AHA! Project http://aha.win.tue.nl/
Ardissono, L., Goy, A., Petrone, G., Segnan, M., Torasso, P., (2002) "Ubiquitous User Assistance in a
Tourist Information", 2nd International Conference on Adaptive Hypermedia and Adaptive Web Based
Systems (AH2002), Malaga, Spain, pp. 14-23
Bhatti, N., Bouch, A. and Kuchinsky, A. (2000), "Integrating User Perceived Quality Into Web Server
Design". Computer Networks Journal, Vol. 33, No. 1-6, pp.1-16
Blakowski, G. and Steinmetz, R. (1996), "A Media Synchronisation Survey: Reference Model,
Specification, and Case Studies". IEEE Journal on Selected Areas in Communications, Vol. 14, No.1,
pp. 5-35
Bloom, B. S., Mesia, B. B. and Krathwohl, D. R. (1964), "Taxonomy of Educational Objectives (two
vols: The Affective Domain & The Cognitive Domain)", New York. David McKay Inc.
Brusilovsky, P., (2001), "Adaptive Hypermedia", User Modeling and User-Adapted Interaction Journal,
Vol. 11 No. 1-2, pp. 87-110
Brusilovsky, P., (1996), "Methods and Techniques of Adaptive Hypermedia", User Modeling and
User-Adapted Interaction Journal, Special Issue on Adaptive Hypertext and Hypermedia, Vol. 6,
No. 2-3, pp. 87-129
Bouch, A., Kuchinsky, A. and Bhatti, N. (2000), "Quality is in the Eye of the Beholder: Meeting Users'
Requirements for Internet Quality of Service". In Proceedings of the ACM CHI 2000 Conference on
Human Factors in Computing Systems, Hague, Netherlands
Cheverst, K., Mitchell, K., Davies, N. (2002), "The Role of Adaptive Hypermedia in a Context-Aware
Tourist Guide", Communications of the ACM Journal, Vol. 45, No. 5, pp. 47-51
Chiu, W. (2001), "Best Practices for High Volume Web Sites", IBM RedBooks
De Bra, P. (1997), "Teaching Through Adaptive Hypertext on the WWW". Journal of Educational
Telecommunications, Vol. 3, No. 2/3, pp. 163-180
De Bra, P. and Calvi, L. (1998), "AHA! An Open Adaptive Hypermedia Architecture". The New Review
of Hypermedia and Multimedia, Vol. 4, pp. 115-139
Empirix, "Assuring QoE on Next Generation Networks", White Paper
Ghinea, G. and Thomas, J. P. (1998), "QoS Impact on User Perception and Understanding of
Multimedia Video Clips". In Proceedings of ACM Multimedia '98, Bristol, UK, pp. 49-54.
Krishnamurthy, B. and Wills, C.W. (2000) "Analysing Factors that Influence End-to-End Web
Performance". Computer Networks Journal, Vol. 33, No.1-6, pp. 17-32
ITU-T Recommendation G.1010 (2001) "End-User Multimedia QoS Categories"
Merida, M., Fabregat, R., Marzo, J. L. (2002), "SHAAD: Adaptable, Adaptive and Dynamic Hypermedia
System for Content Delivery", 2nd International Conference on Adaptive Hypermedia and Adaptive
Web Based Systems (AH2002), Workshop on Adaptive Systems for Web-based Education, Málaga, Spain
Muntean, C. H. and McManis, J. (2004a) "A QoS-Aware Adaptive Web-based System". In Proceedings
of the IEEE International Conference on Communications (ICC04), Paris, France
Muntean, C. H. and McManis, J. (2004b) "QoSAHA: A Performance Oriented Learning System". In
Proceedings of AACE ED-MEDIA '04 Conference, Lugano, Switzerland
Muntean, C. H. and McManis, J. (2004c) "End-User Quality of Experience Layer for Adaptive
Hypermedia Systems". In Proceedings of the 3rd International Conference on Adaptive Hypermedia
and Adaptive Web-based Systems, Workshop on Individual Differences in Adaptive Hypermedia,
Eindhoven, The Netherlands, pp. 87-96
Muntean, C. (2005) Pre-Test and Post-Test Questionnaires for QoEAHA Evaluation
http://www.eeng.dcu.ie/havac/QoEAHAEvalForms/
Ng, M. H., Hall, W., Maier, P., Armstrong, R. (2002), "The Application and Evaluation of Adaptive
Hypermedia Techniques in Web-based Medical Education", Association for Learning Technology
Journal, Vol.10, No. 3, pp. 19-40
Papanikolaou, K. A., Grigoriadou, M., Kornilakis, H., Magoulas, G. D. (2003), "Personalizing the
Interaction in a Web-based Educational Hypermedia System: the Case of INSPIRE", User Modeling and
User-Adapted Interaction Journal, Vol. 13, No. 3, pp. 213-267
Preece, J. (2000) "Online Communities: Designing Usability, Supporting Sociability". John Wiley &
Sons, Chichester, UK
Ramsay, J., Barbesi, A., Preece, J. (1998), "A Psychological Investigation of Long Retrieval Times on
the World Wide Web", Interacting with Computers Journal, Elsevier, March
Selvidge, P. (1999), "How Long is Too Long to Wait for a Web Site to Load?", Usability News
Sevcik, P. J. (2002) "Understanding How Users View Application Performance", Business
Communications Review, Vol. 32, No. 7, pp. 8-9
Triantafillou, E., Pomportsis, A. and Georgiadou, E. (2002), "AESCS: Adaptive Educational System
Based on Cognitive Styles", 2nd International Conference on Adaptive Hypermedia and Adaptive Web
Based Systems (AH'2002), Workshop on Adaptive Systems for Web-Based Education, Malaga, Spain,
pp.10-20.
TU/e course 2L690 Hypermedia Structures and Systems, http://wwwis.win.tue.nl/2L690/
Watson, A., Sasse, M. A. (1997), "Multimedia Conferencing via Multicasting: Determining the Quality
of Service Required by the End User". In Proceedings of AVSPN '97, Aberdeen, UK, pp. 189-194.
Weber, G., Brusilovsky, P. (2001), "ELM-ART: An Adaptive Versatile System for Web-based
Instruction", Artificial Intelligence in Education, Special Issue on Adaptive and Intelligent Web-based
Educational Systems, Vol. 12, No. 4, pp. 351-384
Wu, H., De Kort, E., De Bra, P. (2001), "Design Issues for General-Purpose Adaptive Hypermedia
Systems". In Proceedings of the 12th ACM Conference on Hypertext and Hypermedia, Aarhus,
Denmark, pp.141-150