
European Evaluation Society 2012 Conference, Helsinki, Finland
Using Open Source Survey Tools for Qualitative Inquiries on Educational
Development at a Distance Online University
Keywords: open source survey tools, utilization-focused evaluation, online qualitative evaluation,
educational development, distance online university
Authors
Corinne Bossé, Learning Designer, Centre for Learning Design and Development, Athabasca University
Cindy Ives, Director, Centre for Learning Design and Development & Associate Vice President Academic
(Learning Resources), Athabasca University
Derek Briton, Chair, Faculty of Interdisciplinary Studies, Athabasca University
Abstract
This paper reports on two open source survey tools that were used to gather data related to Athabasca
University’s educational development activities within a qualitative evaluation framework. A
comparative review of a Moodle questionnaire module and Lime Survey aims to analyze the multiple
uses of evaluative instruments as part of a broader discussion on ‘utilization-focused evaluation’ for
Higher Education projects.
Introduction
In his seminal book, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide
Web, Tim Berners-Lee (2000) states: “The web is more a social creation than a technical one. I designed
it for a social effect—to help people work together—and not as a technical toy. The ultimate goal of the
Web is to support and improve our weblike existence in the world” (p. 123). This interconnected view of
technology aligns with the focus of this paper, which reports on the purposeful uses of web-based
evaluative tools to assist in the implementation of overarching organizational goals and objectives. At
the institutional level, the move to the networked environment has implications for assessing
educational development activities within a distance online university such as Athabasca University
(AU).
Institutional context
As part of its mandate, the Canadian Open University seeks to: remove barriers to undergraduate and
graduate education; provide high-quality, interactive learning environments; and actively pursue
technological innovations that can enhance teaching, research and administrative functions. AU has a
distributed workforce of 1300 faculty and staff members working at four locations within Alberta,
Canada, and from home offices. Over 900 courses are offered across more than 50 undergraduate and graduate
programs. The learning environment is characterized by a self-paced mode of undergraduate course
delivery and continuous enrolment, which means that every month students register to start a course.
Graduate courses are mostly paced. These key institutional features create both unique challenges and
opportunities.
AU’s recent restructuring and organizational change are affecting its operational processes at all levels.
The Educational Development initiative aims to prepare the university’s constituents for the changes to
come and their expected outcomes, as illustrated in the roadmap (see Figure 3). Within this
context, the Centre for Learning Design and Development (CLDD) sponsored two formative evaluations
within the past three years as part of its educational development activities. This paper reports on two
open source survey tools that were used to gather data related to Athabasca University’s (AU)
educational development activities within a qualitative evaluation framework. First, a Moodle
questionnaire module was used to assess the educational development needs of faculty. In another
instance, Lime Survey served to gather qualitative information for an expert review on the usability of
course learning objects along both technical and pedagogical dimensions. A comparative review of
both tools will be provided from an educational development perspective. It aims to analyze the
multiple uses of evaluative instruments as part of a broader discussion on ‘utilization-focused
evaluation’ (Patton, 2008) in the context of Higher Education projects.
Conceptual framework
A formative evaluation framework was adopted for the qualitative enquiries conducted for both
evaluation projects. From an educational development perspective, the purpose of the formative
evaluation framework was to identify teaching and learning needs as well as to document and assess an
ongoing process of improvements to online course learning activities. It drew upon elements of
participatory methods and a utilization-focused approach (Patton, 2008). The input of various
constituencies within the university was solicited at different stages of the evaluation projects related to
educational development activities. The use of a Moodle questionnaire and Lime Survey facilitated
the consultative process to obtain relevant input that would be of most use to stakeholders with a view
to improving future designs and integration of online course learning resources.
Uses of Open Survey Tools
Open education is an integral part of Athabasca University’s organizational culture as one of the
pioneering online and distance teaching universities. There is therefore strong institutional support
for open source tools such as Lime Survey and Moodle, which is the university’s learning management
system (LMS). The databases and servers for each tool are hosted within different units of the Canadian
Open University. This level of technical integration within the institution makes it easier to access and
use these open source survey tools as part of the academic practice for both faculty and professionals.
Within this institutional context, integrating open source tools to conduct qualitative inquiries on recent
educational development initiatives sponsored by AU‘s Centre for Learning Design and Development
(CLDD) can be viewed as a strategic alignment towards supporting innovative teaching and learning
activities. In fact, one of the rationales for using Moodle to conduct a needs assessment was to build on
AU faculty’s familiarity with the LMS to raise their awareness about the Moodle questionnaire module.
One of the outcomes is to make use of this feature to gather additional qualitative feedback from
students to enhance course design and development. Similarly, the expert review conducted through
Lime Survey provided an opportunity for faculty and professionals to test the tool as well as to respond
to the qualitative inquiry focused on improving the design of future course learning objects. Each
open source online survey tool gave participants the option to return more than once to complete their
questionnaire. Upon completion, the online survey could no longer be accessed by them.
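To illustrate how such a workflow can be scripted against an institutionally hosted instance, the sketch below uses Lime Survey's RemoteControl 2 (JSON-RPC) interface to register a survey participant. It is a minimal, hypothetical example: the endpoint URL, survey ID, credentials and participant details are placeholders, not AU's actual configuration, and the interface must be enabled on the server.

```python
# Hypothetical sketch (not AU's actual configuration): registering a participant for a
# Lime Survey questionnaire through the RemoteControl 2 JSON-RPC interface, which must
# be enabled on the server. With token-based access, a participant can save and return
# to an unfinished response; once submitted, the survey is closed to that token.
import requests

API_URL = "https://survey.example.edu/index.php/admin/remotecontrol"  # placeholder URL
SURVEY_ID = 12345                                                      # placeholder survey ID


def rpc(method, params):
    """Minimal JSON-RPC call against the LimeSurvey RemoteControl endpoint."""
    reply = requests.post(API_URL, json={"method": method, "params": params, "id": 1}, timeout=30)
    reply.raise_for_status()
    return reply.json()["result"]


# Open an API session with administrator credentials (placeholders).
session_key = rpc("get_session_key", ["admin_user", "admin_password"])

# Register one participant token; invitations are sent separately from the survey itself.
participants = [{"email": "reviewer@example.ca", "firstname": "Expert", "lastname": "Reviewer"}]
rpc("add_participants", [session_key, SURVEY_ID, participants, True])

# Release the API session when finished.
rpc("release_session_key", [session_key])
```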
Evaluation Project I: Moodle questionnaire
The main purpose of the needs assessment was to help determine which educational development
activities could contribute to an enhanced teaching and learning experience at Athabasca University.
Over the course of an academic year, several informal and formal discussions were conducted with
various stakeholders (e.g. Deans, Centre Chairs, etc.) to identify which potential educational
development activities to include in a Moodle questionnaire. Given its brief format, with five main
multiple-choice questions, it was essential to receive feedback on the relevance of the items that were
emerging from stakeholders’ meetings. The evaluation team decided that a short online questionnaire
would ensure the user-friendliness of the open source web-based evaluative tool for participants who
had various levels of experience and skills with the institutional learning management system.
Following the pilot of the Moodle Questionnaire, a link to the online survey was sent to AU faculty to
assess their educational development needs. About a third of the faculty responded to the survey during
the spring of 2010. None reported having difficulties using the Moodle questionnaire. The main results
of the faculty needs assessment for educational development activities are displayed in Figure 1.
Figure 1
The top five topics of interest emerging from the survey were: course design for online learning (66%);
interactive online activities (64%); creating simple and effective content for audio and video (64%);
integrating online resources (55%); and using social media in courses (55%). The information collected
through the Moodle questionnaire served as a baseline to identify the specific aspects of the teaching and
learning experience, and of the technologies used to support that experience, that would be most
relevant to AU faculty. Subsequent educational development workshops were developed based on the
results of the needs assessment. A further analysis of the findings indicated that there were
opportunities for innovation that needed to be capitalized on. The second formative evaluation project
built on the top five needs identified by targeting several areas of design and integration of online
course learning resources.
Evaluation Project II: Lime Survey
A formative evaluation of Athabasca University’s CAF (Community Adjustment Fund, https://projects.athabascau.ca/caf) showcase course enhancements was conducted
over the course of Spring-Fall 2011. The CAF Showcase Course project focused on up to twenty of AU's
largest registration courses with the goal of increasing accessibility, usability, interactivity, and efficacy
by including a combination of assisted learning enhancements such as webcasts, podcasts, video, audio,
interactive multimedia, wikis, blogs, discussion forums and other social software. Another goal was to
develop resources that are reusable. The courses chosen for enhancement met most of the following
criteria:
- representative of the disciplines offered at AU
- high enrolment (most will come from the top 100 for greatest impact on learners)
- representative of the faculties
- representative of academic centres
- representative of enhancements
- already available online (preferably)
- willing course professors with the support of their chair and dean
- recently revised and/or up-to-date content
- political optics
The CLDD director, in collaboration with design staff and in consultation with professors, chairs and
deans, chose the courses to be enhanced. Upon completion of the showcase course project milestones,
a formative evaluation was conducted to assess the usability of the learning objects from both a
technical and a pedagogical dimension.
From an evaluator’s perspective, one of the most useful features of Lime Survey was the ease with which
descriptive statistics could be analyzed and exported in a selection of multiple formats, as illustrated in
Figure 2. It facilitates the data analysis process because data are represented in several modes
(numerical, textual, graphical, etc.) with accessible feedback. The design of Lime Survey also enables more
opportunities to run sophisticated analyses than the responses generated using a Moodle questionnaire
module. This open source web-based tool also helps ensure that privacy laws applying to Canadian
participants are followed, as their data are not stored on US or other foreign servers, unlike some survey
tools commonly used by evaluators.
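As an illustration of this kind of analysis, the hypothetical sketch below pulls completed responses out of a Lime Survey instance through the same RemoteControl 2 (JSON-RPC) interface and summarizes them with pandas. The endpoint URL, survey ID, credentials and the 'easeOfUse' column are placeholders rather than the actual instrument, and the interface must be enabled on the server.

```python
# Hypothetical sketch (placeholder URL, survey ID, credentials and column name): exporting
# completed Lime Survey responses as CSV through the RemoteControl 2 JSON-RPC interface
# and producing simple descriptive statistics with pandas, comparable to the summaries the
# tool generates in its own interface.
import base64
import io

import pandas as pd
import requests

API_URL = "https://survey.example.edu/index.php/admin/remotecontrol"  # placeholder URL
SURVEY_ID = 12345                                                      # placeholder survey ID


def rpc(method, params):
    """Minimal JSON-RPC call against the LimeSurvey RemoteControl endpoint."""
    reply = requests.post(API_URL, json={"method": method, "params": params, "id": 1}, timeout=30)
    reply.raise_for_status()
    return reply.json()["result"]


session_key = rpc("get_session_key", ["admin_user", "admin_password"])

# export_responses returns the requested document base64-encoded; ask for completed responses only.
encoded = rpc("export_responses", [session_key, SURVEY_ID, "csv", None, "complete"])
rpc("release_session_key", [session_key])

csv_text = base64.b64decode(encoded).decode("utf-8")
# Adjust sep= to match the separator the instance uses for CSV exports (comma or semicolon).
responses = pd.read_csv(io.StringIO(csv_text))

# Frequency table for a placeholder Likert-type question, plus an overall numeric summary.
print(responses["easeOfUse"].value_counts(normalize=True))
print(responses.describe())
```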
Purpose of Expert Review
The expert review was a key component of the formative evaluation of more than 20 CAF showcase
course enhancements. This ‘heuristic evaluation’ provided a relatively rapid way of generating a
systematic examination of the design of the LOs to detect usability problems and issues (McNaught, C.,
Lam, P. & Cheng, K., 2009; Nielsen, 2003). It aimed to obtain feedback from a community of expert
practitioners in learning design and educational development that would inform and improve future
designs and uses of AU course learning objects (LOs). More specifically, the purpose of the expert review
was to assess the usability of selected learning objects on two design dimensions: technical and
pedagogical.
Reviewers’ selection
Learning designers (LDs) and instructional media analysts (IMAs) with five to more than ten years of
experience in their field participated in the expert review of showcase-course learning objects.
They were randomly selected; however, efforts were made to minimize conflicts of interest and to
ensure that the few LDs/IMAs who had been directly involved in the design or development of the selected
LOs did not evaluate them. Overall, eight expert reviewers participated in the formative
evaluation.
Questionnaire Design
The criteria and questions used in the expert review for evaluating the learning objects were drawn
from five main sources: the CLOE draft guidelines (in Haughey & Muirhead, 2005); the Le@rning
Federation Soundness Specification (Freebody & Muspratt, 2007); the Learning Object Review Instrument
(Nesbit, Belfer & Vargo, 2002) and its revised version (Li, Nesbit & Richards, 2006); Nielsen’s (2003)
usability guidelines as well as Lund’s (2001) USE questionnaire; and usability metrics retrieved from a U.S.
governmental Web site (http://www.usability.gov). Relevant sections of these validated evaluative
instruments were adapted and contextualized to examine both the technical and pedagogical
dimensions of the CAF showcase course enhancements. The design of the questionnaire was an iterative
process. It was piloted with an AU expert team in learning design and educational development and it
was revised a number of times before finalizing the version used in the formative evaluation. The online
and open Lime Survey tool was selected as the final format of the questionnaire.
Scope
It was estimated that it would take between 30 and 45 minutes to review the selected learning objects
and complete the online Lime Survey. It was suggested that the LDs/IMAs first review the learning
object within its context before completing the survey. The reviewers had the opportunity to go back to
the Lime Survey more than once to complete it. Overall, 21 showcase course enhancements were
reviewed by the experts. The first rounds of reviews started during the last two weeks of April 2011 and
were completed by the end of June 2011. A follow-up debriefing session took place with the reviewers
during that period.
Evaluation Findings
An evaluation matrix was constructed based on a thematic literature review focused on evaluation of
learning objects and multimedia resources. It was also used to generate the codes for data collection
and data analysis purposes. A documentary review (e.g., proposals, status reports, CAF demo
presentation notes) was also conducted to triangulate the evaluation data sources.
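The sketch below is a hypothetical illustration of how such a matrix can be expressed as a simple coding structure and used to tally expert ratings. The two dimensions echo those discussed in this paper, but the criterion labels, rating values and sample reviews are illustrative only, not the actual instrument.

```python
# Hypothetical sketch: an evaluation matrix expressed as criteria grouped by dimension,
# with expert ratings tallied per criterion. The dimension names echo this paper, but the
# criterion labels, rating values and sample reviews are illustrative only.
from collections import Counter

EVALUATION_MATRIX = {
    "technical": ["presentation_design", "ease_of_use", "technical_integration", "interactivity"],
    "pedagogical": ["intended_use", "design_interaction", "accessibility", "reuse",
                    "learner_interaction", "motivation_engagement", "effectiveness"],
}

# Example coded responses: one dict per expert review of a learning object.
coded_reviews = [
    {"ease_of_use": "agree", "interactivity": "disagree", "reuse": "neutral"},
    {"ease_of_use": "strongly_agree", "interactivity": "disagree", "reuse": "disagree"},
]

# Tally agreement levels for each criterion listed in the matrix.
for dimension, criteria in EVALUATION_MATRIX.items():
    print(f"-- {dimension} --")
    for criterion in criteria:
        counts = Counter(review[criterion] for review in coded_reviews if criterion in review)
        print(f"{criterion}: {dict(counts)}")
```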
Technical Dimension
An overview of the expert review key findings for all the CAF showcase course enhancements follows. In
the technical dimension of the evaluation, the expert review indicates that there are high levels of
agreement about the presentation design with 97% indicating that the overall look and feel of the LOs is
effective; 96% finding the LOs easy to use; and around 82% indicating an appropriate level of technical
integration of the CAF showcase courses within the learning management system (i.e. Moodle). The
perceived level of technological interactivity tends to be low for the majority (74%) of LOs. This
information has subsequently been used to improve the latest versions of all the learning objects that
were assessed in the formative evaluation. Biology course enhancements were perceived to have the
highest level of technological interactivity.
Pedagogical Dimension
Several types of LOs were designed, as indicated in Figure 2:
Figure 2
More than half (52%) of the reviewed LOs were videos, followed in order of importance by audio (41%),
quizzes (22%), interactive tutorials (11%) and graphics (11%). Other types of LOs were also included in
smaller percentages, such as decision-making trees (4%), interactive case studies (4%), podcasts (4%),
web pages (4%) and pictures (2%). About 7% of LOs did not fit into any listed type. None were classified as simulations.
Discipline/Field
The bulk (65%) of the LOs were primarily intended to be used in the Humanities and Social Sciences.
The rest were split among Sciences and Technology (15%), Business (13%), and
Health Studies (7%).
Purpose/Intended Use
The top five intended uses of LOs amongst the ten listed choices are as follows:
- The majority of LOs (57%) are intended to help learners with foundational concepts
- More than half (52%) of LOs are intended to be used for revision or review of new knowledge, a concept or a skill, and to provide for multiple learning preferences beyond reading
- Slightly less than half (48%) of LOs are intended to help learners develop new knowledge, a concept or a skill
- About a third (35%) of the LOs are intended to be used as an orienting or tuning-in activity
- Slightly less than a third (28%) are intended to be used as an instructor-directed demonstration tool
In general, several components along the pedagogical dimension were rated with overall high levels of
agreement in terms of intended use, design interaction, accessibility, reuse, learner interaction,
motivation/engagement and effectiveness.
Thematic Observations on the Findings
Some themes emerging from triangulating the data collected throughout the formative
evaluation are as follows.
Technical dimension
- Presentation Design of the course enhancements obtained the highest levels of agreement among experts, with the overall look and feel of the LOs perceived as being effective
- Ease of Use: the LOs score relatively highly in terms of overall ease of use
- Technical Integration of the LOs into the Learning Management System (i.e. Moodle) is perceived to be relatively seamless
- Technical Interactivity was the weakest component of the usability of the LOs. Even though the taxonomy of technological interactivity used in the evaluation might have been ambiguous, given that some elements such as file sharing did not necessarily apply to the LOs, there seems to be a need to revisit the level of technical interactivity in future. A selected review of the objectives in the CAF proposals reveals a discrepancy between the intent to design interactive LOs and the lack of clarity in defining what kinds of interactivity are expected to be achieved. A revised taxonomy might guide the design of LOs to define, assess and achieve greater levels of technological interactivity.
Overall, the technical dimension of the usability of the LOs integrates relatively well several usability
principles outlined by Nielsen in terms of user interface design (Nielsen, 2003).
Pedagogical Dimension
- Intended purpose: there is a diverse range of intended purposes associated with the LOs that cover the whole spectrum of Bloom’s learning taxonomy and Conole’s (2009) learning design activities
- Design Interaction:
  - Clear instructions should be consistent for all LOs that are integrated into courses.
  - The overall low level of agreement and high level of ambiguity regarding the existence of clear learning outcomes stands out. A cross-reference to the CAF proposals reveals some clearly stated objectives but no explicit learning outcomes associated with LOs. This might be an area to improve in terms of clearly defining the learning outcomes associated with LOs, as this pedagogical practice is associated with producing higher-quality learning environments (Hirumi, 2005).
  - The type of LO design might have played a role in the perceived opportunity for learners to obtain feedback within or outside the LO itself. Built-in automated feedback might increase the level of interactivity along both the technical and pedagogical dimensions, and it is worth considering extending this feature beyond quizzes.
  - Learners’ interaction was perceived to be mostly passive and associated with clicking through material. This might correlate in part with the low level of technological interactivity associated with the LOs. It might also be an opportunity to include a greater range of design controls that would enable learners to manipulate their learning environment; designing LOs as active learning episodes might also increase learner interaction.
- Accessibility: in the context of this formative evaluation, accessibility focused on examining the extent to which multiple formats were available to learners to address different learning preferences. The evaluation indicates that there is a need to provide closed captioning for audio and video to a greater extent. Transcripts of audio and video should be used consistently whenever applicable.
- Reuse: reuse of the LOs in another learning context is perceived to be low; this is an area that will need to be revisited in terms of clearly defining reuse and ensuring it occurs to a greater extent.
- Learner Interaction tends to be influenced by the design of the LO.
- Motivation/Engagement: the LOs are widely perceived to be motivating and engaging for intended learners; this was one of the strongest pedagogical components.
- Learning effectiveness: the learning effectiveness of the LOs seems to correspond relatively well to their perceived intended uses; overall, the LOs were mostly perceived to help learners at the lower levels of Bloom’s taxonomy of learning domains, that is, knowledge and comprehension.
Based on findings from the evaluation, key areas to be taken into account in the future integration of
learning objects into courses can be summarized with Leacock and Nesbit’s (2007) framework for
evaluating the quality of multimedia learning resources. It focuses on verifying that the following
components are part of the design of LOs:
- learning goal alignment with activities and assessments
- adaptive feedback or content driven by learner input
- motivation of the target learners
- presentation design that enhances learning and efficient processing
- interaction usability focused on ease of use
- accessibility, with a view to including multiple formats and design controls to accommodate diverse learners
- reusability of LOs in various learning contexts (Leacock & Nesbit, 2007).
Recommendations/Future Directions
In sum, it is hoped that the findings of the formative evaluation will be useful ‘to inform an ongoing
cycle of reflection and innovation’ (Patton, 2008, p. 116). Potential courses of action that can be
inferred from the expert review results of the course enhancements follow:
- Specifying both the intent and the expected outcomes while designing and integrating LOs into courses
- Examining the showcase course enhancements from the students’ perspective by integrating targeted evaluations of revised technical and pedagogical elements of usability into the various courses, and triangulating the perceived technical and pedagogical dimensions with students’ actual use of the learning resources and actual outcomes
- Revisiting the reuse pedagogical component to integrate the multiple formats and configurations into which these learning resources can be embedded in online learning environments
- Further adapting and revising a taxonomy of technological interactivity to better define and assess how well LOs achieve interactivity
- Incorporating built-in automated feedback into the design of LOs to a greater extent, to increase learners’ control (Nielsen, 2003)
- Expanding the accessibility design feature to be more inclusive by taking into account a range of teaching and learning contexts comprised of diverse learners who might experience disability (The Le@rning Federation, 2007)
- Using expert reviews as one of the learning design strategies to enhance the integration of multimedia learning resources into online course design and development
- Tracking the actual use and reuse of these multimedia learning resources by integrating them into open education resources platforms that use analytics
- Monitoring and evaluating the usability of the learning resources that are integrated into online courses over time to establish institutional design benchmarks.
Future formative evaluations of online course components will be an opportunity to further explore
these issues internally as well as to refine the instruments that will be used. Some recommendations are
already being incorporated in current courses. They also align with broader educational development
goals as illustrated in Figure 3.
Figure 3
Conclusions
Patton (2008) argues persuasively that the participation and involvement of evaluation stakeholders
substantially increase the likelihood that the results of an evaluation will be used to advance societal as
well as institutional interests. The collaborative dimension built into the design of both qualitative
enquiries was thus a key component of the evaluation process undertaken using Moodle and Lime
Survey. From an educational development standpoint, a valuable outcome of using open source web-based
evaluative tools is to engage people both in the process of evaluation itself and in using the tools to
improve the online learning environment in which they operate. Although the two qualitative evaluation
projects differed in objective and scope, one of the reasons for using these web-based open source survey
tools stems from an institutional commitment to accessibility, flexibility and ease of use. This factor may
have had an effect on participants’ responses and on the findings emerging from both online qualitative inquiries.
At this exploratory stage of the comparative review, it is anticipated that Moodle and Lime Survey will
be embedded as part of AU’s systematic, research-based responses to appropriately identify and address
educational development needs and challenges. As Athabasca University continues to forge ahead with
various educational development activities, there is an internal commitment to an iterative evaluation
process. Hence, the intentional alignment of evaluation methodology with the larger pursuit of
institutional AU goals derives from a sense of purposefulness and an openness to sharing evaluation
processes and results more effectively within a networked environment.
BIBLIOGRAPHY
Ally, M., & Cleveland-Innes, M. (2006). Learners' use of LOs. Journal of Distance
Education, 21(2), 44-57.
Berners-Lee, T. & Fischetti, M. (2000). Weaving the Web: The Original Design and Ultimate Destiny of
the World Wide Web. New York: HarperCollins Publishers.
Boyle, T. (2009). The design of learning objects for pedagogical impact. In Lockyer, L., Bennett, S.,
Agostinho, S. & Harper, B. (Eds.), The Handbook of Research on Learning Design and Learning Objects:
issues, applications and technologies. Retrieved April 12, 2011 from
http://www.cuhk.edu.hk/clear/download/paper/McNLC_Lockyer.pdf
Conole, G. (2009). The role of Mediating Artefacts in Learning Design. In Lockyer, L., Bennett, S.,
Agostinho, S. & Harper, B. (Eds.), The Handbook of Research on Learning Design and Learning Objects:
issues, applications and technologies. Retrieved April 12, 2011 from
http://www.cuhk.edu.hk/clear/download/paper/McNLC_Lockyer.pdf
Freebody, P. & Muspratt, S. (2007). Uses and effects of The Le@rning Federation’s learning objects: An
experimental and observational study. Retrieved April 12, 2011 from
http://www.ndlrn.edu.au/working_with_jurisdictions/reports_and_research/reports_and_research_1.h
tml
The Le@rning Federation (2007). Le@rning Federation Soundness Specification. Melbourne, Victoria:
The Le@rning Federation. Retrieved April 12, 2011 from
http://www.ndlrn.edu.au/verve/_resources/tlf_report_final_2007.pdf
Haughey M., & Muirhead, B. (2005). Evaluating learning objects for schools. The e-Journal of
Instructional Science and Technology, 8(1), Article 3. Retrieved April 12, 2011 from
http://www.ascilite.org.au/ajet/e-jist/docs/vol8_no1/fullpapers/Haughey_Muirhead.pdf
Hirumi, A. (2005). In search of quality: An analysis of e-learning guidelines and specifications. Quarterly
Review of Distance Education, 6(4), 309-330.
Kay, R., & Knaack, L. (2007). Evaluating the learning in learning objects. Open Learning: The Journal of
Open and Distance Learning, 22(1), 5–28.
Lau, S. & Woods, P.C. (2009). Understanding learner acceptance of learning objects: The roles of
learning object characteristics and individual differences. British Journal of Educational Technology,
40(6).
Leacock, T. L., & Nesbit, J. C. (2007). A Framework for Evaluating the Quality of Multimedia Learning
Resources. Educational Technology & Society, 10 (2), 44-59.
Li, J. Z., Nesbit, J. C., & Richards, G. (2006). Evaluating learning objects across boundaries: The semantics
of localization. Journal of Distance Education Technologies, 4(1), 17-30.
Lund, A. (2001). Measuring usability with the USE questionnaire. Usability Interface, 8(2). Retrieved April
12, 2011 from http://www.stcsig.org/usability/newsletter/0110_measuring_with_use.html
McGee, P. (2003). Observing interactivity in learning objects. Proceedings of the 7th Annual E-Learn
Conference, Phoenix, AZ.
McGreal, R. (2004). Learning objects: A practical definition. International Journal of Instructional
Technology and Distance Learning, 1 (9).
McNaught, C., Lam, P. & Cheng, K. (2009). Using expert reviews to enhance learning designs. In
Lockyer, L., Bennett, S., Agostinho, S. & Harper, B. (Eds.), The Handbook of Research on Learning Design
and Learning Objects: issues, applications and technologies. Retrieved April 12, 2011 from
http://www.cuhk.edu.hk/clear/download/paper/McNLC_Lockyer.pdf
Nesbit, J., Belfer, K., & Vargo, J. (2002). A convergent participant model for evaluation of learning
objects. Canadian Journal of Learning and Technology, 28 (3).
Nielsen, J. (2003). Ten usability heuristics. Retrieved April 12, 2011 from
http://www.useit.com/papers/heuristic/heuristic_list.html
Nurmi, S., & Jaakkola, T. (2006a). Effectiveness of learning objects in various instructional settings.
Learning, Media and Technology, 31(3), 233-247.
Patton, M.Q. (2008). Utilization-Focused Evaluation. (4th ed.). Thousand Oaks, CA: Sage.
Scriven, M. (1991). Beyond formative and summative evaluation. In Evaluation and education: At a
quarter century. Chicago: University of Chicago Press.
Scriven, M. (1996). Types of evaluation and types of evaluator. Evaluation Practice, 17(2), 151-161.
The Le@rning Federation (2007). Quality Assurance Framework for Online Content Development.
Retrieved April 12, 2011 from
http://www.ndlrn.edu.au/verve/_resources/quality_assurance_framework_v3.3_2007.pdf
Usability.gov. http://www.usability.gov
Wiley, D. (2001). Connecting learning objects to instructional design theory: A definition, a metaphor,
and a taxonomy. In D. Wiley (Ed.), The Instructional Use of Learning Objects. Retrieved April 12, 2011 from
http://www.reusability.org/read/chapters/wiley.doc