Valuing Evaluation: A Case Study of Professional Development to Support Academic Engagement in Online Evaluation Processes and Outcomes

Dr. Diana Quinn, Senior Lecturer: Professional Development, Learning Connection, Magill Campus, University of South Australia, [email protected]

Abstract: Governments and universities are adopting student evaluation data as a definitive measure of teaching and learning quality. Online evaluations are a time- and cost-efficient way of enhancing the capacity of academics to respond to student feedback and implement course improvements. Planned strategies for instituting online evaluations produce the strongest response rates and therefore the most accurate information on which to base course development. This case study describes the strategy and outcomes of a professional development project in the Division of Education, Arts and Social Sciences at the University of South Australia that aimed to support academics attempting to improve student engagement with online evaluations of course and teaching quality. A Choice theory-informed approach to professional development was adopted that employed the collation and sharing of best practice and the modelling of preferred behaviour. An improvement in response rate of 16.6% across the entire Division was achieved during the test period. A survey of staff found that the most successful strategies for improving response rates to online surveys were those aimed at improving both students' understanding of the power of the evaluation process and their access to online survey instruments. Participating staff reported strong support for the professional development approach adopted. Although learning outcomes were strong for the academics surveyed, continuing support in the use of the internet as an interactive teaching and learning tool was identified as an area for development.

Keywords: online evaluation; response rates; professional development; case study; Choice theory

Introduction

Professional development (PD) is required to help university educators better value the role of evaluation as a mechanism to facilitate scholarly improvement in teaching and learning. At the University of South Australia (UniSA), this task is addressed through induction and mandatory teaching programs that have recently been extended to include selected sessional staff. However, there still exists a large group of continuing and sessional academics who have incomplete understandings of evaluation as a tool to improve student learning outcomes. This group's PD needs are negotiated annually as projects under a service agreement between academic divisions and the central unit for supporting teaching and learning at UniSA, the Flexible Learning Centre (FLC). This paper describes one component of a project called Valuing Evaluation, formulated as part of the Division of Education, Arts and Social Sciences (EASS) FLC 2003 service agreement, and the selection, application and evaluation of the PD approach used to support it.

An important question to ask, however, is why so many continuing and sessional university teaching staff have a limited understanding of the role of evaluation in improving teaching and learning. Why do they place so little value on evaluation? One answer is that most of these teachers have gleaned their working knowledge about teaching from their experiences as learners.
By analysing the nature of a teacher's learning experiences, we may then understand how, in many instances, one-way perspectives about evaluation have developed. Teachers, as learners, may have participated in evaluation, but it is unlikely that they have personally experienced the end results of effective evaluation. They might never have had the feeling of power that a learner experiences when their feedback influences future teaching and learning activities. It is also unlikely that they would appreciate how evaluation data are collated and interpreted prior to use. Without such experiences, it is unrealistic to expect that teachers would understand the evaluation process as a method of promoting continuing improvement in teaching and learning.

From each cohort of learners, the next generation of teachers emerges. But given a common perception that providing feedback about teaching and courses is futile, many of tomorrow's teachers are likely to carry a negative attitude towards the processes associated with evaluation, if not its outcomes. University policy requires yesterday's students, now as teachers, to collect feedback from today's students. The techniques they use to obtain responses to paper surveys must, to some extent, be based on their own experiences. Thus, they often rely on 'capture methods', where students are not allowed to leave a class until the survey has been completed and returned. However, without knowledge of what to do with the data, the completed surveys would often sit in piles in the bottom drawer of filing cabinets, gathering dust.

University policy, as well as proposed government policy, has meant that academics need to engage with evaluation. UniSA's policy A35A requires that every course be evaluated in some way, every time it is taught. The University has developed standardised evaluation tools, the Course Evaluation Instrument (CEI) and Student Evaluation of Teaching (SET), that have been described elsewhere (Reid 2001). Collated student feedback from these surveys is required for program review, as well as evidence to support applications for academic promotion and scholarly teaching awards. Currently, the CEI and SET allow for online or paper delivery methods. Although the paper method has significant human resource implications, it has often been preferred. The University has decreed that by 2005 the option to use paper-based CEI and SET surveys will no longer be available, except in exceptional circumstances relating to particular students. Therefore, a need existed to help academics identify ways to improve students' engagement with evaluation delivered in an online format. In a similar way to student-centred learning, academic-centred professional development (ACPD) was required to meet the needs of this group of academic staff. This process starts with understanding the learners' current conceptions of the topic, and then selecting an approach that will help to satisfy their distinct learning needs.

Choosing a professional development approach to bring about change

The next task was to determine the appropriate PD approach. A recent article provides some insights into what academics require as motivation to comply with institutional effectiveness activities. It is based on data gathered by a survey of 386 'faculty' from 168 American higher education institutions (Welsh and Metcalfe 2003).
The surveyed academics indicated that their primary motivation for complying with institutional programs was understanding how the change would improve the institution, its programs and services. Secondly, they were much more likely to support university-wide programs 'if they, and their colleagues led, owned and participated in the process' (p. 40). Thirdly, they would support programs that were outcomes-orientated, indicating they valued evidence of 'real results arising from instructions and efforts to improve' (p. 41). An additional finding was that effective communication of the process and results was critical for academics to support institution-wide campaigns.

These data from Welsh and Metcalfe (2003) support the recommendations of others. For example, Rogers' (1983) framework for the implementation of innovation suggested several factors that influence whether or not faculty will choose to adopt a new practice: access to models, trial without commitment, observing best practice, support for the task, and compatibility with their current practice (Rogers 1983, cited in McLoughlin 2000). David Boud (1999) was influenced by Knowles' adult learning theory when he recommended peer learning within local working environments as an ideal form of PD for Australian universities. Peer learning was recommended because it allowed informal collaboration, critique, visibility of scholarly practice and the building of an academic's self-esteem within local contexts. Boud wrote, 'Academic development has been successful when it has drawn on a deep understanding of the ethos of higher education institutions, their cultural practices and the discourse of academia' (Boud 1999, p. 10). This approach was also supported by Ryan et al. (2000), who recommended PD that was 'lightly formalised' to allow the development of local communities of practice. Ryan found that this approach was responsive to academic needs, enhanced the visibility of exemplars, and integrated local and centralised support systems. However, the time required for building relationships of trust between PD staff and academics, the lack of visible structure, and hard-to-measure outcomes were identified as limitations of this approach.

Using this information, it is possible to piece together a strategy for ACPD to facilitate a change in behaviour in relation to evaluation. But academics are people too. Why do we need a separate system for understanding human behaviour just for academics, or even adults? Surely we are all humans with brains that work the same way? Would the psychological literature on human behaviour offer more, or different, information to help understand how people make choices in life to change their practice or behaviour? Using Boyer's scholarship of integration, which encourages cross-discipline investigations (Boyer 1990, pp. 18-21), a psychological framework for planning an ACPD approach was explored.

Exploring the use of Choice theory to formulate ACPD approaches

The role of PD is often about facilitating change. The decision to change, however, ultimately rests with the individual; it is their choice. This concept has been developed by William Glasser into a psychological theory describing human behaviour called Choice theory (Glasser 1999), which has been extensively applied to education and work settings (Glasser 1998, Glasser 1999, Piltz 2002).
Although complex, this relationship-based theory can be used to offer insights into designing effective student-centred learning activities for all people (Glasser 1998) and has some parallels with current theories of teaching and learning. Glasser argues that we all see the same world differently (Glasser 1998, p. 44). Individuals filter incoming information from the 'real world' based on their current knowledge and values. If information passes through these 'filters' it will become part of their Perceived World (Figure 1). Only the information in a person's Perceived World will be analysed. This aligns well with student-centred learning approaches that commence with efforts to find out what the learner's current conceptions of a topic are before teaching starts (Moore & Patrick 2000; Biggs 1999, p. 18). For, if the learner is not receptive to incoming information, they will not engage, and the information will not be considered.

[Figure 1: Diagrammatic representation of how people choose to behave based on Choice theory: information from the Real World passes through knowledge and values filters into the Perceived World, where it is compared with the Quality World and the basic needs of belonging, power, freedom and fun before behaviour choices are made (Glasser 1999; cartoon based on http://www.sctboces.org/choicetheory/theory.htm, viewed 13/10/2003).]

Another component of Glasser's theory is that each person has individual needs: a physiological need for survival, and four distinct psychological needs for belonging, power, freedom and fun, required in different amounts by each individual. These needs provide individuals with the motivation for what they do in their lives (Glasser 1998). This concept parallels the extrinsic, social, achievement and intrinsic motivation drivers, features of the 'presage' that impact on learning (Biggs 1999, pp. 59-60). Information that enters an individual's Perceived World is then compared with their current conception of their ideal, or Quality World. The Quality World is full of images of the people, things and activities that mean the most to that person. When the incoming information in the Perceived World conflicts with the images stored in the Quality World, frustration is experienced and a change in behaviour is considered. Glasser proposes that all behaviour is purposeful and that people will choose to behave in a way they believe will satisfy their needs and re-establish an equilibrium between their Perceived and Quality Worlds. He describes behaviour as having four components, those of thought, action, feeling and physiology, and although most people are only aware of one or two components of their behaviour, each component is occurring to make Total behaviour (Glasser 1998). When learning activities encourage awareness of the 'affective domain' through emotion-rich techniques such as role play, transformational learning occurs, with the result being a change in the way the learner sees and behaves in the world (Di Biase 1998). This parallels the awareness of Total behaviour that Glasser encourages to bring about change (Glasser 1998).

In this project, Choice theory has been interpreted within the field of ACPD to support evaluation. Choice theory advocates a cooperative system that aims to develop strong, trusting relationships between the leaders of teaching and learning in divisions and schools, PD staff and academics, satisfying individuals' needs for belonging.
A focal topic about evaluation was selected such that it was likely to be perceived by EASS academics as relevant to them, allowing the information about evaluation to pass into their Perceived World. The techniques of valuing input, sharing best practice and modelling expectations were used to support academics' needs for belonging, power and freedom. Similarly, academics' needs for power and freedom were respected by ensuring they remained in control of monitoring the quality of their teaching, and became aware that quality comes from a process centred on continual improvement (Glasser 1998, p. 291).

Method

Determining academics' current conceptions about online evaluation

To commence ACPD, it was important to ascertain the learners' current conceptions of online evaluation. Information gleaned from EASS teaching and learning meetings indicated that many academics' current conceptions related to response rates to online surveys and the potential ramifications for promotion and teaching awards (images in their Quality World). Other thoughts were that students had 'evaluation overload' and therefore did not want to engage in evaluation. Using Glasser's model, it is possible to predict that information couched in terms of improving response rates would be more likely to be accepted into the Perceived World of an academic at that particular time, with that particular group of academics. Thus, the topic of strategies to improve response rates to online evaluation was the preferred starting place for this project, which aimed to improve academics' understanding of evaluation and commitment to engaging in its processes.

Analysis of gross Divisional changes in the number of responses to online CEIs

To determine the engagement of students in anonymous and non-compulsory course evaluation processes, the CEI database was queried to determine the number of CEIs published in EASS in semester 1, 2002, semester 2, 2002 and semester 1, 2003, as well as the total number of responses returned in each of these time periods.

Applying Choice theory to ACPD: sharing and modelling best practice

To collect current understandings of best practice in motivating students to participate in online CEI surveys, CEI reports from semester 2, 2002 were reviewed by the Dean of Teaching and Learning in EASS to identify those academics achieving good response rates relative to the number of students enrolled. PD staff interviewed these academics in early semester 1, 2003, to identify the strategies they believed were contributing to their strong response rates to online evaluations, and then collated their responses. To share these understandings of best practice, PD staff emailed the collated recommendations for improving online CEI response rates to all academic staff in EASS in week 6 of semester 1, 2003. The brief email indicated that the strategies were recommendations from their peers of ways they might consider to improve response rates to online surveys. To model best practice, PD staff sent an email in week 10 (of 13) of semester 1, 2003 to academic staff in EASS. The communication contained two examples of email messages that could be used to encourage students to participate in online course evaluation (based on the earlier recommendations of their peers).

Evaluating ACPD

To evaluate the effectiveness of the ACPD approach used, academic staff in EASS were surveyed in week 16 of semester 1, 2003 using an online TellUs2 survey. Academics were asked if an improvement in response rate was detected in semester 1, 2003. Opinions were also sought about the effectiveness of the ACPD strategies that had been informed by Choice theory.
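As an illustration of the kind of summary used for the Divisional analysis above, the short sketch below computes semester-on-semester changes in the number of surveys published and responses returned. It is a minimal sketch only: the file name, column names and semester labels are assumptions made for the example and do not reflect the actual CEI database schema.

# Minimal sketch: summarise a hypothetical CSV export of CEI data and report
# the change in surveys published and responses returned between semesters.
# The file name, column names and semester labels are assumed for illustration.
import csv
from collections import defaultdict

surveys = defaultdict(set)    # semester -> distinct CEI survey ids
responses = defaultdict(int)  # semester -> number of responses returned

with open("cei_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        sem = row["semester"]              # e.g. "2002 S1", "2003 S1"
        surveys[sem].add(row["survey_id"])
        if row["response_id"]:             # blank when a survey row has no response
            responses[sem] += 1

def pct_change(old, new):
    """Percentage change from old to new."""
    return 100.0 * (new - old) / old

before, after = "2002 S1", "2003 S1"
print(f"Surveys published: {len(surveys[before])} -> {len(surveys[after])} "
      f"({pct_change(len(surveys[before]), len(surveys[after])):+.1f}%)")
print(f"Responses returned: {responses[before]} -> {responses[after]} "
      f"({pct_change(responses[before], responses[after]):+.1f}%)")
print(f"Mean responses per survey: {responses[before] / len(surveys[before]):.2f} -> "
      f"{responses[after] / len(surveys[after]):.2f}")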
Results

Changes in the number of CEI responses returned

The CEI database was queried to determine if there had been a change in the number of responses received in semester 1, 2003 compared to semester 1, 2002 within EASS. The absolute number of CEI surveys published in EASS did not change greatly, with a 4.3% increase noted (350 in semester 1, 2002 compared to 365 in semester 1, 2003). In contrast, the absolute number of responses increased by 16.6% in the same period (5591 in semester 1, 2002; 6522 in semester 1, 2003) (see Figure 2). The average number of student responses returned per CEI survey increased from 15.97 to 17.86.

An anonymous survey of academic staff in the Division of EASS was conducted. In total, 16 responses were received from seven of the eight schools in the Division. All but one of the respondents gave permission for their responses to be used as part of a research publication. Respondents were asked if their response rates to online CEIs increased in semester 1, 2003. Sixty percent agreed (n=9 of 15) and 6% disagreed (n=1 of 15).

[Figure 2: A comparison of the number of Course Evaluation Instrument surveys published and the absolute number of responses received in the period semester 1, 2002 to semester 1, 2003 within the Division of Education, Arts and Social Sciences.]

Collating best practice

To determine best practice in encouraging students to respond to online CEIs, telephone interviews were conducted with seven academics who had been identified as having strong response rates in semester 2, 2002 online CEI surveys, to ascertain the strategies they had used to motivate student participation. These strategies were classified into two categories: those that improve students' access to, and familiarity with, the online environment, which could be implemented in the short term; and those that improve students' understanding of evaluation such that they value the evaluation process, which were longer-term strategies. The list of recommendations was circulated by email to academic staff in EASS and then published as a Teaching Guide on UniSA's Learning Connection web site (Learning Connection 2003).

Assessing the effectiveness of the Choice theory-informed PD strategy

To determine if the type of PD employed in the Valuing Evaluation project had impacted positively on learning outcomes for academic staff, respondents were asked to respond to seven Likert items on a survey. The resulting data are presented in Table 1. The sharing and modelling aspects of Choice theory were well received (87% agreement), and 66% of respondents agreed that their learning outcomes (that they can 'continue to maximise response rates' and that they had an 'improved understanding of online evaluation') had been achieved.
Table 1. Summary of responses from EASS academics about the use of Choice theory-informed professional development strategies (number of respondents, n = 15)

Item (Agree / Neutral / Disagree)
- I thought that collating and sharing successful strategies from EASS academics to improve response rates was a useful way to conduct professional development: 13 / 1 / 1
- I trusted these strategies as I knew they had worked for my peers: 8 / 6 / 0
- I appreciated being sent the model emails suggesting how I could communicate with students about the CEI: 13 / 1 / 0
- The model emails were inappropriate for use with my students: 1 / 5 / 9
- I think I can continue to maximise response rates by applying these strategies: 9 / 6 / 0
- Compared to last year I have improved my understanding of online evaluation: 9 / 6 / 0
- Compared to last year, I feel more confident about using online techniques for evaluation: 7 / 6 / 2

Discussion

ACPD to support the understanding of evaluation is essential to improve the quality of teaching and learning. For academics without education training, particularly those who graduated at a time when evaluation was less essential to professional survival, what happens to student feedback surveys about teaching and learning beyond the collection of the data can be a complete mystery. Because they do not understand the process, they have little chance of valuing and implementing effective evaluation. New continuing academics and selected sessional staff at UniSA are engaging in teaching programs that will help raise awareness about evaluation. The purpose of the Valuing Evaluation project was to break the cycle of superficial understandings about evaluation at UniSA in the large remaining group of continuing and sessional academics who may never have been exposed to, or have chosen to avoid, information relating to evaluation and its impact on teaching and learning quality.

Proposed government policy and current university policies have created an environment in Australian universities that requires academics to engage with evaluation. By using 'capture techniques' with paper surveys, many academics have enjoyed high response rates for student evaluations. The validity of data collected by such standover techniques should be questioned. The reality for UniSA academics, however, is that the CEI and SET will only be able to be administered online from 2005. Online surveys offer an anonymous environment for students who choose to provide feedback about courses and teachers. Compared to paper surveys, online evaluations provide significant efficiencies in the cost and time spent processing data. However, the online dimension of surveys of student feedback also creates hurdles for academic staff and students. Only 12.5% of respondents to the survey indicated they were teaching interactively online (data not shown). If many academic staff are not using online teaching techniques, how can they expect to communicate to their students how to participate in online evaluation? It is this group of academics who feel most vulnerable (a loss of power and freedom) to the types of edicts announced by the University, such as the requirement that all surveys be online by 2005.

In this case study, the PD approach was informed by Choice theory, which can act as a window into understanding student-centred learning, or its equivalent, ACPD. Moore and Patrick (2000) wrote that student-centred learning requires a conceptual shift for the teacher, from 'I will tell you this and therefore you will learn' to 'I want to help you in ways which are effective for you and match your needs'.
Similarly, ACPD aims to put the academic at the centre of the learning process. This case study demonstrated that Choice theory can be used as a framework for selecting an approach most suited to a particular group of learners. Techniques that aimed to build trust, foster a sense of belonging and protect learner power and freedom were used. The collaboratively produced document about strategies to improve response rates to online evaluation was assembled by PD staff following interviews with EASS staff identified as having good response rates to online CEIs. Embedded within this document was all the information required for an academic to make an informed choice about how to change their teaching practice to value evaluation. In essence, a guise had been used to increase the likelihood that information about the value of evaluation could pass from the real world to the Perceived World of many academics within EASS. Once there, this information would be compared to images within their Quality World (possibly containing images of promotion, teaching awards and an enjoyable working environment) and analysed against their needs profile, for action.

Staff responding to an anonymous survey indicated that they were satisfied with the PD approach used in this case study (Table 1). Unsolicited email communication, received following dissemination of the Strategies Teaching Guide, also strongly supported the approach: 'I can't thank you enough for this – it really helped the penny drop for me as far as online evaluation was concerned' (personal communication, 2003).

An increase in response rates to online surveys was identified in EASS during the period semester 1, 2002 to semester 1, 2003. Although the PD approach used may have contributed to this outcome, other factors may also have had an impact, such as better advertising of the evaluation tool to students by central mechanisms and improved access to online environments for students.

According to Glasser's description of needs, all people have five basic needs (Glasser 1999). The first, survival, is common to everyone, but the remaining four, belonging, power, freedom and fun, are needed in different amounts by different people. Appreciating that learners' needs are diverse is an important consideration for ACPD. It also helps explain why, in the survey, not all respondents showed enthusiasm for sharing the best practice of their peers (differing needs for belonging and power).

The main theme of this paper is an analysis of the most effective way to influence someone to change their practice. A literature review of recommendations for changing behaviour reveals a relationship between these recommendations and the human behaviour model of Choice theory (Table 2). The basic needs identified by Glasser of belonging, power and freedom could be matched; however, the literature was noticeably devoid of references to the impact of fun on learning. Fun and humour are different. Ramsden found that humour ranked low in a list of learners' needs from their teachers at university (Ramsden 1992, p. 90), yet making material interesting and engaging ranked the highest. Making learning fun can certainly foster interest and engagement. Using Glasser's model, however, it is possible to predict that efforts to incorporate fun into the learning environment for ACPD would also help some academics make the choice to change, as it would help them satisfy this basic need.
The process used in this case study, of collating and sharing best practice in relation to strategies to improve online response rates and modelling required behaviour, was designed in response to the evidence available and may not be directly applicable to other environments. The approach, however, of using ACPD guided by Choice theory is transferable.

Table 2: A comparison of recommendations for influencing academics to change their practices (Glasser's Choice theory, interpreted for ACPD, matched against Welsh (2003), Ryan (2000), Boud (1999) and Rogers (1983))

Knowledge and values filters. Information needs to pass through the current knowledge and values filters to be perceived. ACPD needs to recognise learners' current conceptions, accurate or not.
Rogers (1983): people will comply if the recommendation is compatible with their current practice.

Belonging. Everyone has a basic need for belonging, but this need may be greater in some than in others. ACPD approaches need to work at building collaborative professional teams. This can be done through sharing local best practice and fostering peer learning. Honest and open communication also develops the sense of trust necessary to help individuals work collaboratively.
Welsh (2003): better communication at all levels of the process. Ryan (2000): process lightly formalised to develop local communities of practice; trust needed, but time consuming to develop. Boud (1999): peer learning in local environments for collaboration and critique. Rogers (1983): observing best practice.

Power. Everyone has a basic need for power, but this need may be greater in some than in others. ACPD approaches need to respect this need for power. This can be done through involving academics in the decision-making process, modelling what it is that academics need to do, demonstrating that they will get the outcomes for the effort, ensuring the materials and support to do the requested task are available, and reinforcing that what they do will help make their institution, and therefore themselves, more powerful.
Welsh (2003): academics led, owned and participated in the process; evidence that results will be achieved (outcomes orientation); emphasise how it will improve the institution. Ryan (2000): visible exemplars; integration of local and centralised support systems. Boud (1999): visibility of scholarly practice; building of academics' self-esteem. Rogers (1983): access to models; support for the task.

Freedom. Everyone has a basic need for freedom, but this need may be greater in some than in others. ACPD approaches need to respect this need for freedom. This can be done in various ways, including allowing academics to explore processes without commitment and providing models and examples for them to utilise, or ignore.
Rogers (1983): trial without commitment.

Fun. Everyone has a basic need for fun, but this need may be greater in some than in others. ACPD approaches would aim to make learning fun.
No matching recommendations were identified in the literature reviewed.

Conclusion

Choice theory, when used as a window to understand ACPD, may well assist PD staff in leading academics towards complying with institutionally desired goals. This process first requires determination of academics' current conceptions of the topic held within their Perceived World, and secondly the provision of useful and pertinent information, presented in such a way as to emphasise how adopting a new practice or behaviour will satisfy their basic needs for belonging, power, freedom and fun.

References

Biggs, J 1999. Teaching for Quality Learning at University, The Society for Research into Higher Education, Open University Press, Buckingham, UK.
Boud, D 1999. 'Situating academic development in professional work: using peer learning', International Journal for Academic Development, vol. 4, no. 1, pp. 3-10.

Boyer, E 1990. Scholarship reconsidered: priorities for the professoriate, The Carnegie Foundation for the Advancement of Teaching, Jossey-Bass, New York.

Di Biase, W 1998. 'Mezirow's theory of transformative learning with implications for science teacher educators', International Conference of the Association for the Education of Teachers in Science, Charlotte, USA, viewed 7 Nov. 2003, <http://www.ed.psu.edu/CI/Journals/1998AETS/s2_1_dibiase.rtf>.

Glasser, W 1998. Choice theory in the classroom, Harper Perennial, New York.

Glasser, W 1999. Choice theory, Harper Perennial, New York.

Learning Connection 2003. 'Evaluation: strategies to maximise online evaluation response rates', Teaching Guide, University of South Australia, Adelaide, viewed 7 Nov. 2003, <http://www.unisanet.unisa.edu.au/learningconnection/teachg/evaluation_strategies.doc>.

McLoughlin, C 2000. 'Creating partnerships for generative learning and systemic change: redefining academic roles and relationships in support of learning', The International Journal for Academic Development, vol. 5, no. 2, pp. 116-128.

Moore, K & Patrick, K 2000. Student centred learning, Royal Melbourne Institute of Technology, Teaching and Learning web site, viewed 7 Nov. 2003, <http://www.teaching.rmit.edu.au/progimprov/sclearn.html>.

Piltz, W 2002. 'Applying Choice theory and reflection to enhance student outcomes in Group Dynamics', Australian Institute of Sport Conference, viewed 7 Nov. 2003, <http://www.ausport.gov.au/fulltext/2002/achper/Piltz2.pdf>.

Ramsden, P 1992. Learning to teach in higher education, Routledge, London.

Reid, I 2001. 'Strategic collaboration for online delivery', electronic proceedings from Educause in Australia 2001, viewed 7 Nov. 2003, <http://www.gu.edu.au/conference/educause2001/papers/Ian_Reid.doc>.

Ryan, M, Hanrahan, M & Duncan, M 2000. 'The professional engagement model of academic induction into online teaching', electronic proceedings from the Australian Association for Research in Education conference, viewed 7 Nov. 2003, <http://www.aare.edu.au/00pap/han00419.htm>.

Welsh, JF & Metcalfe, J 2003. 'Cultivating faculty support for institutional effectiveness activities: benchmarking best practices', Assessment & Evaluation in Higher Education, vol. 28, no. 1, pp. 33-45.

© Diana Quinn, University of South Australia 2003.