Led to change
The National Leadership Education for School Principals in
lower and upper secondary schools in Norway; change in the
schools, goal achievement and recommendations
Final report from the Evaluation of the National Leadership Education for
School Principals
Ingunn Dahler Hybertsen, Bjørn Stensaker, Roger Andre
Federici, Marit Schei Olsen, Anniken Solem and Per Olaf
Aamodt
Final report from the Evaluation of the National Leadership Education for School Principals

Published by: NIFU Nordic Institute for Studies in Innovation, Research and Education
Address: P.O. Box 5183 Majorstuen, N-0302 Oslo. Office address: Wergelandsveien 7, N-0167 Oslo
Translator to English: Daniel Christopher Engen
Order placed by: The Norwegian Directorate for Education and Training
Address: P.O. Box 9359 Grønland, NO-0135 Oslo
Printed: Link Grafisk
Cover photo: Scanpix
ISBN 978-82-327-0106-3 (print)
ISBN 978-82-327-0107-0 (online)
www.nifu.no
Preface
This final report is the last of four reports from the evaluation of the national leadership education for school principals – an initiative that the Directorate for Education and Training has implemented to strengthen the leadership competence of principals and other leaders in lower and upper secondary schools in Norway. The evaluation was conducted as a cooperation project between NIFU and NTNU Social Research during the period 2010-2014. The final report is written by Ingunn Dahler Hybertsen (NTNU Social Research), Bjørn Stensaker (NIFU, project leader), Roger Andre Federici, Marit Schei Olsen, Anniken Solem (all three NTNU Social Research) and Per Olaf Aamodt (NIFU). Hybertsen and Stensaker have coordinated the work on the report, and the researchers have cooperated in both the collection and analyses of the different sources of data. Federici and Aamodt have been especially responsible for quantitative data and analyses, while Hybertsen, Olsen and Solem have contributed to collecting and analyzing the qualitative data. The authors would like to thank Randi Røthe and Trude Røsdal (NIFU), who contributed to the data collection and analyses early in the project period. Thanks also to Per Morten Schiefloe from NTNU/NTNU Social Research, Jannecke Wiers-Jenssen from NIFU and Arne Mastekaasa at UiO, as well as the Directorate for Education and Training, for constructive comments on an earlier draft of the final report.
Oslo/Trondheim, November 2014
NIFU
Sveinung Skule
Director
NTNU Social Research
Jan Tøssebro
Director
Contents
Preface ........................................................................ 3
Summary ........................................................................ 7
1 Introduction ................................................................ 11
  1.1 Leadership as a tool for quality development in school .................. 11
  1.2 The evaluation's focus areas and the report's structure ................. 13
2 Method and data ............................................................. 15
  2.1 Framework and evaluation level .......................................... 15
  2.2 Surveys .................................................................. 17
    2.2.1 Population and data collection ...................................... 17
    2.2.2 Instruments .......................................................... 20
    2.2.3 Analyses and significance ........................................... 21
  2.3 Case studies of schools ................................................. 22
    2.3.1 Selection of schools and data collection ............................ 22
    2.3.2 Instruments and analyses ............................................ 25
  2.4 Summary .................................................................. 25
3 Participants' assessments of change ......................................... 27
  3.1 Participants' experience of the school's learning culture ............... 27
  3.2 The participants' capacities for learning and development ............... 30
  3.3 The participants' assessments of change over time ....................... 31
  3.4 Summary .................................................................. 32
4 Change in the schools ....................................................... 33
  4.1 Change in learning culture .............................................. 33
  4.2 Change in leadership practice ........................................... 40
  4.3 Summary .................................................................. 42
5 The impact of school leadership education ................................... 43
  5.1 The TALIS-survey ........................................................ 43
    5.1.1 Use of time, tasks, and academic development ........................ 44
    5.1.2 Experience of school leadership ..................................... 46
  5.2 Summary .................................................................. 47
6 Central findings, goal achievement and recommendations ..................... 48
  6.1 Central findings in the evaluation ...................................... 48
  6.2 The leadership education's goal achievement ............................. 51
  6.3 Implications and recommendations ........................................ 52
References .................................................................... 55
List of Tables ................................................................ 57
List of Figures ............................................................... 58
Appendix ...................................................................... 59
Summary
This is the final report from the evaluation of the national leadership education for school principals that was initiated by the Directorate for Education and Training in 2009. This report supplements the three reports already published by adding complete survey data from all cohorts in the program so far, and by reporting from the case study schools that were selected for follow-up of the program. These case studies shed more light on how participants who have completed the leadership education are able to instigate change and development at their home institutions. The report also summarizes the main findings of the evaluation.
The national program is organized by the Directorate for Education and Training, and is provided by six institutions in Norway: the Administrative Research Institute (AFF), the Norwegian Business School (BI), the University of Oslo (UiO), Oslo and Akershus University College (HiOA), the University of Bergen (UiB) and the Norwegian University of Science and Technology (NTNU).
The evaluation has built on different data sources. The participants' assessments of the program were collected through a variety of surveys and through a smaller selection of interviews. Document analysis of the program provisions has been carried out, as well as a series of interviews with representatives of the program providers. In addition, 12 case studies of schools whose school leadership has attended the leadership program have been undertaken.
Key findings in the evaluation
The experience of the participants in the national leadership education for school principals has been
reported earlier in the evaluation, and key findings are that they rate the various programs offered
by the providers very highly regardless of the specific program they have attended. The participants
experienced high pedagogic and didactical quality in the programs, and they rated the relevance of
the program as very good. When asked whether their initial expectations of the program have been
met, a large majority confirms this. Another finding is that the capacity of the participants to change
and develop as leaders has been strengthened as a result of the program. The participants report
that they are more capable of undertaking a number of key leadership tasks after completing the
program, although the increase reported is quite small. In the current report, another cohort of
participants in the leadership education has been included in the analysis, and the new analysis
conducted supports earlier findings. When asked whether the culture for learning in their own school
has changed as a result of the program, the participants report a significant, although small, positive
change. This finding suggests that there might be a link between the change and development of the
participants and the change and development in the culture for learning in the schools where they
work. When comparing the cohorts, there are few differences. The impact of the national leadership
education has not changed in the period analyzed.
In the current report, a small sample of schools has also been studied in more detail, where the key question investigated is whether the staff in these schools experience changes in the culture for learning as a result of the principal's participation in the program. The analysis has also tried to isolate any common characteristics of the schools with respect to how changes have been
implemented. Interviews and survey data from the case studies suggest that staff experience
changes along a number of dimensions, although most of the informants have problems relating
change directly to the leadership education. Common characteristics between the schools are a
reduction in the number of projects and priorities undertaken (concentration), the development of a
joint management/leadership team within the schools, and an increased emphasis on building
competence throughout the school. A number of the schools also report changes in the collaboration
between the school and the local municipality (the school owner), and that this is seen as important
in changing practices within the schools.
Are the objectives of the national program met?
In general, the national leadership education for school principals has had three main objectives.
First, to strengthen the quality of the Norwegian school system in general. Second, to strengthen the
school leadership training in Norway. Third, to strengthen the competence of those participating in
the program. To assess the first objective is very difficult due to a number of methodological
challenges in identifying the direct causality between the national initiative and the quality of the
Norwegian school system. However, data from another survey (TALIS survey) undertaken among
school leaders does suggest that leadership training in general has a positive, although varied, effect
on leadership practice. At the same time, the program has clearly been seen as a very attractive program by school leaders in Norway, and it is possible to argue that this, at least indirectly, is positively linked to improved quality. However, one should emphasize that the improvement of quality within Norwegian schools is most likely influenced by a number of factors, and that school leadership is an important, but not sufficient, factor in triggering improvement. Not least, the
evaluation has highlighted that the cooperation between schools and the local municipalities (the
school owners) can be improved.
It can be strongly argued that the national school leadership education has contributed to
strengthening school leadership training in Norway. Through this initiative, several new arenas for
communication and development of school leadership have been created, and the evaluation also
shows that there is an increased integration between theory and practice in school leadership
training. The fact that the higher education institutions offering the program have collaborated with
independent consultants in the private sector seems to have benefitted the programs.
Based on the answers of the participants, it is also fair to conclude that the leadership education has contributed to strengthening the competence of the school leaders in the program, and that they are more confident with regard to their role as leaders. In general, self-efficacy measures can be said to be a central indicator of the capacity to develop as a leader. The expectations the participants report on these issues suggest that their capacity has increased. In addition, a side effect of the program is that participants have developed new social networks that they find very useful in their jobs.
Recommendations for future programs
As indicated, the leadership education program can be said to be a very successful initiative.
However, the current design and organization of the program can still be adjusted, and the following
suggestions are put forward for consideration:
The existence of a national program for school leadership training is important. Such a program
seems to create motivation and interest among participants, and to strengthen the legitimacy of
school leadership in general.
The education program seems to have functioned both as a recruitment channel for future leaders and as a competence-building scheme for existing school leaders. This double function works well and should be continued.
Future initiatives should – as is also the case today – aim to integrate the national education program
in other school leadership programs provided by higher education institutions in Norway. Such
integration may benefit both institutions and future participants wanting to deepen and develop
their competence in leadership further.
Many participants report that the current program is challenging to attend alongside their regular
job. Initiatives that ease the burden on the participants with respect to the overall workload should
be considered.
The participants in the national leadership education report that the program has led to a valuable and dynamic social network that stimulates individual and group learning afterwards. These “learning environments” should be supported and further developed as part of the program, where theoretical and practical learning are combined.
The links between the local municipalities (the school owners) and the individual schools should be
further developed. Change and development in schools following increased leadership competence
is dependent on collaboration with the local municipalities.
School leadership is an important factor in developing quality in schools, but specific initiatives such
as the national program could benefit from being tied to other developmental projects organized by
the Directorate of Education and Training.
1 Introduction

1.1 Leadership as a tool for quality development in school

Development of the Norwegian school is high on the political agenda, with ever more attention being paid to the importance of school leadership and to lower and upper secondary schools having competent and knowledgeable leadership (Parliament Report No. 31 (2007-2008)). The creation of the national leadership education in 2009 can be considered an operational continuation of key elements of the Knowledge Promotion reform, where Parliament Report No. 30 (2003-2004) stressed how important goal and result management, empowerment of and accountability in the profession, and knowledge-based professional practice were in developing quality in schools.

The emphasis on school leadership is based on the recognition that schools that function well as organizations achieve better student results, and that school leadership has a significant impact on students' learning and learning environment (OECD, 2008). Not least, the principal can be very important in following up on the teachers, increasing teacher motivation and facilitating good working conditions - something which is also significant for students' learning (Robinson et al., 2009). At the same time, the principal's position as a leader has not always been strong in Norwegian schools. The individual teacher's autonomy remains strong, and at many schools there is a predominant tacit understanding that leadership should not interfere too much with teachers' work. The fact that many principals have been assigned new administrative duties because of municipal efficiency measures and the transition to two-level management may have further weakened the scope for active leadership.
Today the role of the principal is regulated in Section 9-1 of the Education Act, which states that each school shall have sound professional, pedagogical and administrative leadership, vested in the principal. However, the law does not operationalize what the content of this professional, pedagogical and administrative leadership actually is, or how it can best be exercised. Parliament Report No. 31 (2007-2008) points out that Norway, compared with other countries, has a fairly large number of national requirements in connection with appointment to principal positions. Although some courses within education leadership exist in Norway, it has been argued that many of these programs have had a weak connection to practice (Parliament Report No. 31 (2007-2008)). Even though it is not unusual for principals to have supplementary training and continuing education in administration and leadership subjects, the principal survey from 2005 showed that almost 40 percent had no formal leadership training. Although an increasing number of principals acquire leadership training, many principals still do not have such a formal background.
With this as a starting point, the Ministry of Education and Research announced, in Parliament Report No. 31 (2007-2008), that a national leadership education for newly appointed principals and other school leaders who lack formal leadership competence would be established. It was argued that a change in the leader role in schools requires that the principal has the competence and the will to lead, but also that there is an acceptance among the employees that leadership is exercised.

The Ministry of Education and Research gave the Directorate for Education and Training the task of defining the requirements and expectations for an education provision for principals in lower and upper secondary schools, and of conducting a tender for a national education provision. It was pointed out that the education should be related to practice, and that it could be part of a more extensive Master's program within education or school management. The program should correspond to 30 credits within the university/university college system and have a duration of 1.5 to 2 years, organized as a series of workshops.
After the first tender in 2009, the Directorate for Education and Training gave four academic communities the task of developing and delivering the leadership education for a program period of 5 years. After an assessment of whether the education capacity should be increased, a new tender was conducted in 2010, and a further two communities were added. From autumn 2010, six education institutions offered the education as part of the national effort:

• University of Oslo (UiO)
• University of Bergen (UiB)
• The Norwegian Business School (BI)
• The Administrative Research Institute at the Norwegian School of Economics (AFF)
• Oslo and Akershus University College (HiOA)
• The Norwegian University of Science and Technology (NTNU)
In terms of content, the various program providers had a certain amount of autonomy in how the program could be organized. A key boundary, however, has been that the competence requirements that the Directorate for Education and Training has set for the education relate to 1) students' learning results and learning environment, 2) management and administration, 3) cooperation and organizational development, and 4) development and change, and that these should be reflected in the provisions. Increased confidence in the leadership role was a central aim of the education. The total provision could be accommodated in more comprehensive Master's programs in education management. The academic provisions should also be practice-oriented, and the program providers were required to cooperate with consulting firms that have experience with this. In this way, the leadership education opened for cooperation between education institutions and other actors who have expertise in the more practical aspects of leader development in the form of skills training. The program has a general admission each fall and participants receive their education at one of the six providers.

For the period 2010-2014, the Directorate for Education and Training also wanted a follow-up evaluation of the six education provisions that were developed, with a focus on both the quality of the program and effects over time at the schools.
The purpose of the evaluation was to develop knowledge of how the various education provisions worked in practice. The evaluation should therefore also shed light on the quality of the education provided, participant satisfaction and the participants' assessments of their own development upon completion of the education. The central point of the continuous evaluation was, however, to answer whether the education has contributed to the individual participant becoming a better leader, and whether it is possible to trace any changes in the schools back to the principals' participation in the leadership education.

Following a competitive tender in 2010, NIFU, in cooperation with NTNU Social Research, was commissioned to undertake this follow-up evaluation. This final report of the evaluation aims to provide an overall representation of how the national program has achieved the goals that were defined for this initiative, and to discuss implications for future leadership education.
1.2 The evaluation's focus areas and the report's structure

The design that has formed the basis for this follow-up evaluation has had the ambition to capture the complexity of the national leadership education through method triangulation and the combination of qualitative and quantitative data. Since it has been a follow-up evaluation, the reporting has occurred continuously. Three earlier reports have been completed (Hybertsen Lysø et al., 2011; 2012; 2013), and the reports' focus and perspectives, findings and main conclusions are summarized briefly here. For details on the methodological approach and the types of data used, we refer to the individual reports.
In the first report (Hybertsen Lysø et al., 2011) the Norwegian initiative was placed in a larger theoretical perspective and compared with similar measures in other countries. The purpose was to see if the Norwegian leadership education was in line with international development trends. The report also contained a brief summary of international theory and practice in both school leadership and leader development, and put in place important guidelines for the analytical framework underpinning the evaluation. An important conclusion was that linking school leadership training, the exercise of leadership and the assessment of results is a difficult exercise that requires a diverse foundation of data and a multidimensional approach to how data should be interpreted.
The second report (Hybertsen Lysø et al., 2012) conveyed the first results from the leadership education, and the spotlight was put on how the six providers had organized their education programs within the framework that the Directorate for Education and Training had defined. The data were based on interviews with the six providers at different times, analyses of a number of documents from the providers (tenders, general descriptions of the programs, curricula, syllabi and work requirements), and visits to workshops. In addition, participants' views on the education provisions were obtained through participant surveys designed to measure program quality and results. The report concluded that the six provisions emerged as partly different in educational terms, but the participants' experience of the programs' quality and relevance to practice was found to be very positive, regardless of which provision they were tied to. The explanation for this was that the national provision, generally speaking, recognizes school leadership and provides a learning arena that covers the participants' needs for support and networking.
In the third report (Hybertsen Lysø et al., 2013) the spotlight moved to the participants themselves and their assessments of change in their own learning and development. The three cohorts of participants included in this evaluation consist of a total of over 1,100 principals and school leaders who have been enrolled in the education program. Through surveys, which participants in each cohort answered before and after participation in the education, various areas were examined and compared to the participants' expectations. The purpose was to identify any correlations between the program provisions that school leaders had attended and their experience of development as leaders. The conclusions were that the participants' expectations were essentially fulfilled and that their capacities for learning and development - measured as their confidence in performing various school leadership tasks - had increased along many dimensions after completing the education.
In this final report, the dimension of change is even more in focus than in previous reports. Firstly, the post-test for the third cohort of participants is included in the analyses of whether participants are satisfied with the education provision and whether the provision's quality has been maintained. Secondly, selected participants have been followed "home" to their schools, and through interviews with principals and their leadership teams, changes in practice in their own schools have been assessed, as well as whether these can be attributed to the principal's participation in the education. Since such data collection is relatively demanding in terms of time and resources, only a small number of participants and schools have been followed up in this way.
The analyses in the present report are based on four research questions, where the first two questions are addressed in Chapter 3 and the last two in Chapter 4:

1. Based on the participants' evaluation of the benefits and their own development, can any changes in the program provisions' quality from cohort to cohort be identified?
2. Do the participants experience a change in the learning culture in their own schools from before to after participation in the education?
3. Do the staff at selected schools experience that the learning culture has changed in the schools from before to after participation in the education?
4. Can any common characteristics at the selected schools, when it comes to changes in practice following the principal's participation in the education, be identified?
Chapter 5 illustrates the importance of school leadership education more generally through analyses from the TALIS survey. In the concluding chapter we discuss the national leadership education for school leaders overall, and the most important findings are highlighted by compiling the various parts of the evaluation. The discussion addresses whether the national initiative has achieved the objectives that were formulated, and what practical implications the central findings may have in the form of recommendations for a future education in school leadership.
2 Method and data

NIFU and NTNU Social Research have, since autumn 2010, collected several types of quantitative and qualitative data on the national leadership education: from the six program providers, from three cohorts totaling over 1,100 participants, and from a selection of twelve school cases.

This method chapter mainly describes the data for the new analyses presented, which are based on the participants' experience of their own development and of the school's learning culture, the descriptions of changes in practice at the schools given by the principals and their leadership teams, and the school staff's assessments of changes in the learning culture. For the methodology and data underlying the TALIS analyses presented in the report, a brief introduction is given before the results are presented, and further reference is made to Caspersen, Aamodt, Vibe and Carlsten (2014). First, the various reports are placed at different evaluation levels.
2.1 Framework and evaluation level

There is no way to unambiguously measure the direct effect of investment in leader development measures on an organization's performance (Hybertsen Lysø et al., 2011). Development of leaders and change in organizations are complicated interactions, and it is methodologically difficult to isolate variables in order to measure the effects of individual measures. For example, it is not possible to identify a simple causal relationship between school leadership and a better school. Even though research on school leadership shows what works, the studies are conceptually weak (Hybertsen Lysø et al., 2011). Most studies of programs for leaders have concentrated on measuring learning and development at the individual level, and most studies show that participants experience participation as positive. Results from the participant survey in this evaluation, as presented in Report 3 (Hybertsen Lysø et al., 2013), show that the participants experience increased capacity for learning and development upon completion of the program, and thereby increased confidence in the leader role. These results are in line with most studies on leadership development and education at the individual level (Hybertsen Lysø, 2014).
Systematic studies of how participation in leadership programs contributes to concrete changes at the organizational level are rarer, and such studies show no unambiguous results (Hybertsen Lysø et al., 2011). An interesting question is whether research attempts to find such causal relationships may have helped to create or reinforce expectations about changes in organizations as a result of individual leaders' participation in leadership development (Hybertsen Lysø, 2014). Firstly, it is argued that the relationship between individual learning and organizational change cannot be seen as a linear process when both the organizational context and the learning context are complex. Secondly, it is argued that such a presumed relationship is based on an instrumental view of what learning of leadership is, and that recognition of leaders' practice-based knowledge and of how leadership is learned is seldom included in studies. The transfer of individual learning and development into specific changes in the organization can therefore be both a challenge to achieve and difficult to measure. Even though the steadily increasing investment in measures for learning and development of leadership does not have unambiguous support in research, this does not mean that such programs have no effect (Hybertsen Lysø, 2014).
The first report aimed at creating a theoretical and methodological framework for the evaluation and
for the empirical studies that form the basis for the other reports. Through a recognition of the
complexity of evaluating the effects of leader development, the project has attempted to evaluate
the national leadership education from different perspectives - from representatives of the six
providers, participants and staff at schools - and at different levels of analysis. Kirkpatrick's (1998)
evaluation levels are used to categorize different forms of assessment of learning measures.
Although these levels cannot be directly translated into either leadership learning or the school context, the model is used to identify the various parts of the evaluation and the relationship between the different levels of analysis. The model is illustrated in Figure 1.
Figure 1 Evaluation levels (cited in Kirkpatrick, 1998)
In the second report, the analyses were primarily based on the participants' satisfaction (evaluation level 1) with the provisions, in terms of how they experienced the quality of the education and its relevance to practice, but also on their expectations - both through data from the participant survey and from interviews with principals. Participant satisfaction was seen in relation to the intentions the six providers had for the leadership education, based on analyses of both documents and interviews with program providers. The third report focused on the participants' assessments of their own development from the education by measuring changes in their capacities for learning (evaluation level 2), based both on data from the participant survey (pre- and post-tests) on changes in their expectations about mastering various dimensions of school leadership and on interviews with principals. In the present report, the focus moves upward in the model to the principal's application (evaluation level 3) of learning in terms of changes in practice in schools, and to implementation (evaluation level 4) in terms of change at the organizational level. This forms a basis for assessing whether any changes in practice in schools can be traced back to the principal's participation in the education. The data and analyses concerning the participants' learning and the changes in practice and learning culture in schools are described in the next section.
2.2 Surveys

The quantitative approach was mainly implemented to investigate development in participants over time in the three cohorts that have completed the leadership education during the evaluation period. This type of evaluation design is characterized by repeated measurements over a longer period and is often intended to describe stability and change (Ringdal, 2007). Based on a longitudinal design, participants in the principal program were encouraged to fill in a questionnaire in connection with the education, one at the start and one after completing the education. This is referred to as a pre-post-test design (Gall, Gall & Borg, 2007). In the present report, data from the participant survey for the three cohorts are compared to identify any changes in the provisions' quality over time, and to examine whether Cohort 3 shows the same positive results in terms of participants' increased capacities for learning and development as found in Report 3 (Hybertsen Lysø et al., 2013).
2.2.1 Population and data collection

The population is well defined and consists of all the participants in the principal program. Because all participants were encouraged to participate in the survey, the sample is equal to the population. To get in touch with the population, the various program providers were asked to submit their respective participant lists to the evaluation group. This information was then entered into Select Survey [1], a web-based system for electronic data collection. Respondents in the present report consist of participants who started in autumn 2010 (referred to as Cohort 1 [2]), autumn 2011 (referred to as Cohort 2) and autumn 2012 (referred to as Cohort 3). Table 1 shows an overview of the number of participants, the time of distribution, the number of reminders, the number of responses and the response percentage, distributed by pre- and post-tests.

[1] The service is purchased by NTNU, which administers the system.
[2] The evaluation refers to the class of participants in the period 2010/2011 as Cohort 1, but for the four providers that started in 2009 this would in reality be their second cohort.
Table 1 Overview of selection distributed by pre- and post-tests and by three cohorts

            Cohort     Number of participants   Time       Reminders   Number of answers   Percent
Pre-test    Cohort 1   334                      21.02.11   3           320                 95.8
            Cohort 2   380                      01.12.11   3           313                 82.3
            Cohort 3   413                      01.10.12   3           287                 69.5
Post-test   Cohort 1   334                      29.05.12   3           183                 54.8
            Cohort 2   380                      18.04.13   3           160                 42.1
            Cohort 3   413                      01.02.14   2           226                 54.7

Note. The table shows the number of participants and answers distributed by cohort. The given number of participants is based on the lists that the providers sent out. Because of late sign-ups and some attrition, these numbers deviate somewhat from the actual number of participants.
Table 1 provides a comprehensive overview of all participants and shows that the response rate for the pre-tests is satisfactory for all cohorts. One possible explanation for the difference between Cohorts 1 and 2 is that the evaluation group attended Workshop 2 for Cohort 2 at all providers and gave oral information about the evaluation. We have no explanation for the declining pre-test response rate from Cohort 2 to Cohort 3, but we cannot rule out that the way the evaluation was communicated had some significance. The number of respondents in the post-tests is somewhat lower, but still above 50 percent in Cohorts 1 and 3. The post-test for Cohort 2 was answered by 42.1 percent. Generally, the response rate is considered satisfactory (Babbie, 2004; Gall et al., 2007) for the pre-tests, but for the post-tests special caution should be exercised when interpreting findings and results. This also means that any changes after the education, measured as changes from pre-test to post-test, must be interpreted with particular caution, because the uncertainty in the change includes statistical uncertainty at two points of measurement.
There may be several different reasons why participants fail to respond. A known cause that has affected the response rate is that several participants, for various reasons, have left the program midway or have changed jobs. We have no exact figures on this. Respondents were also given the opportunity to decline receiving the post-test (Cohort 1: 15 people, Cohort 2: 17 people and Cohort 3: 6 people). Another cause may be the size of the questionnaires, which are relatively extensive considering the number of questions that participants have to relate to. This is a known cause of some of the attrition between the two questionnaires.

Respondents' answers from the pre- and post-tests were paired at the individual level. This means that the answers in the first survey are linked to the same participants' answers in the second survey. Table 2 shows the number of respondents who could be paired.
Table 2 Overview of selection who answered both pre- and post-tests

Cohort     Number of participants   Number of answers   Percentage   Number paired   Percent
Cohort 1   334                      183                 54.8         170             50.8
Cohort 2   380                      159                 41.8         130             34.2
Cohort 3   413                      226                 54.7         175             42.4
As the table shows, the overall response rate varies between about 42 and 55 percent. This is somewhat lower than desirable, but a response rate that is relatively normal in surveys today. As expected, the number of responses that can be paired across the pre- and post-tests is even lower. Low response rates generally increase the likelihood of random deviations. The fact that there are three cohorts means that response patterns from the different cohorts can be used as a test of whether such deviations occur, and in this report the different cohorts are therefore compared to take this into account. No findings were made that suggest such deviations, and the relatively low response rate should therefore not pose a problem for the robustness of the results.
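As a concrete illustration of the pairing described above, the sketch below shows how pre- and post-test responses could be linked at the individual level and how the paired share per cohort could be calculated. It is a minimal sketch with made-up respondent IDs, cohorts and scores; the column names are hypothetical and this is not the evaluation's own data pipeline.

```python
# Minimal sketch: pairing pre- and post-test answers at the individual level.
# Respondent IDs, cohorts and scores are made up, and the column names are
# hypothetical; the point is the inner join, which keeps only respondents who
# answered both rounds.
import pandas as pd

pre = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5],
    "cohort": ["Cohort 1", "Cohort 1", "Cohort 2", "Cohort 2", "Cohort 3"],
    "score_pre": [5.1, 4.8, 5.3, 4.9, 5.0],
})
post = pd.DataFrame({
    "respondent_id": [1, 3, 5],  # respondents 2 and 4 did not answer the post-test
    "score_post": [5.4, 5.5, 5.2],
})

# Inner join keeps only the respondents who answered both the pre- and the post-test.
paired = pre.merge(post, on="respondent_id", how="inner")

# Number of paired answers per cohort, and the paired share of pre-test respondents.
paired_counts = paired.groupby("cohort")["respondent_id"].count()
pre_counts = pre.groupby("cohort")["respondent_id"].count()
print(paired_counts / pre_counts)
```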
The comparison of participants' experiences across program providers was the focus of Reports 2 and 3, and the results showed that there were no significant differences between the six provisions based on the data collected from participants (see Hybertsen Lysø et al., 2012; 2013). The comparison of program providers is not the focus of this report, but the distribution of the number of participants and the number of responses for each provider is included only to show that the cohort size is relatively unchanged during the period for each of the six providers. The distribution between providers is shown in Table 3.
Table 3 Overview of population and answers distributed by program provider

            Pre-test                                             Post-test
            Cohort 1       Cohort 2       Cohort 3               Cohort 1   Cohort 2   Cohort 3
Provider    No.    Resp    No.    Resp    No.    Resp            Resp       Resp       Resp
AFF         12     11      24     21      24     12              7          12         10
BI          140    118     119    101     119    81              65         43         51
HiOA        49     42      62     56      62     50              23         19         24
NTNU        61     57      69     52      69     54              35         20         36
UiO         60     60      70     41      70     47              26         21         25
UiB         27     21      30     30      30     22              11         12         13
The table also shows that the number of participants varies considerably between providers, from AFF with a small cohort to BI with several cohorts spread out in different parts of the country. For details about the program providers' intentions and practices, as well as differences in pedagogical conditions in the provisions, we refer to Report 2 (Hybertsen Lysø et al., 2012).
2.2.2 Instruments

The participant survey is based on a combination of established instruments and new instruments designed on the basis of the Directorate for Education and Training's model for school leadership and the theoretical approach of the evaluation (see Hybertsen Lysø et al., 2012; 2013). The data underlying this report are mainly drawn from the established instruments NPSES, DLOQ and UWES, as described in the following.

The participants' experience of change in their own capacities for learning and development is measured through a validated instrument for efficacy expectations, the Norwegian Principal Self-Efficacy Scale (NPSES) (Federici & Skaalvik, 2011; 2012), which consists of 36 different questions categorized in the following dimensions of school leadership: (1) pedagogical leadership, (2) economics, (3) administrative leadership, (4) satisfaction, (5) support, (6) guidance of teachers, (7) teaching, (8) performance monitoring, (9) relationship with parents, (10) relationship with school owners and (11) relationship with the local community and businesses. An example of a question about performance monitoring is: "How confident are you on a scale from 1 to 7 that you can implement specific measures to improve student learning outcomes?" For details about the dimensions and an overview of the questions, we refer to Report 3 (Hybertsen Lysø et al., 2013). The questions were formulated identically in the pre- and post-tests to examine changes in the leaders' expectations about coping. Efficacy expectation is generally a measure of capacity for learning and development, and is interpreted as an indication of increased confidence in the leadership role. The analyses of changes in efficacy expectations in the present report are based on data from the participant survey.
The instrument that measures organizational learning is based on the Dimensions of the Learning Organization Questionnaire (DLOQ) developed by Watkins and Marsick (1997; 1999). It builds on the research by Argyris and Schön (1996), which, over time, has documented that a good learning culture within the organization helps to create conditions for applying individual learning from participation in leadership programs. The dimensions that we sought to map do, however, correlate largely with the attitudes, actions and structures that school leadership research has demonstrated to be significant for creating good learning outcomes for students (Hybertsen Lysø et al., 2011). The instrument has been validated and applied in a number of studies in different sectors and countries (Marsick, 2013; Watkins & Dirani, 2013), and was, for the present evaluation, translated into Norwegian and adapted to the school context.
The survey consists of 43 questions that are categorized into seven dimensions of organizational learning: (1) continuous learning, (2) dialogue, (3) cooperation and teams, (4) knowledge sharing, (5) inclusion and shared vision, (6) systems thinking and (7) leadership, role models and support. Participants were asked to consider how often these occur in their own schools on a scale from 1 to 7, where 1 = almost never and 7 = almost always. An example question is: "At this school ... the employees openly discuss challenges related to work in order to learn from them." In addition, nine questions were formulated and tailored to the evaluation: four questions were based on the competencies for education leadership that underlie the leadership education, three questions were intended to identify which kinds of knowledge the changes and development were based on, and two questions concerned scope. An example of such a question is: "The school has on the whole been managed and administrated better this year compared with the year before." The questions were formulated identically in the pre- and post-tests to examine any change in learning culture, competence and application of knowledge types [3]. The instrument is included both in the participant survey and in the survey that was used to collect quantitative data from the staff in the selected school cases, and the results from both of these data collections are included in the present report.

[3] The survey questions that are used in the analysis can be found in the appendix.
The instrument that deals with the experience of engagement, or job satisfaction, is taken from the Utrecht Work Engagement Scale (UWES) (Schaufeli & Bakker, 2004). The survey consists of nine statements and focuses on respondents' overall experience of having a positive connection to work. On a scale of 1 to 7, respondents were asked to consider how often they feel this way, and an example of a statement is: "I am enthusiastic about my job." The numbers on the scale represented: never, a few times a year, monthly, a few times a month, weekly, a few times a week, daily. Identical questions were asked in the pre- and post-tests, and the data used in this report are derived from the surveys at the selected schools.
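Since all three instruments aggregate 1-to-7 item responses into dimension scores, the sketch below illustrates that step. The item names and the item-to-dimension mapping are hypothetical placeholders, not the actual NPSES, DLOQ or UWES item keys, which are documented in the appendix and in Report 3.

```python
# Minimal sketch: aggregating 1-7 Likert items into dimension mean scores.
# The item names and the item-to-dimension mapping are hypothetical placeholders,
# not the actual NPSES/DLOQ/UWES item keys.
import pandas as pd

responses = pd.DataFrame({          # one row per respondent, values on the 1-7 scale
    "item_01": [5, 6, 4],
    "item_02": [6, 5, 5],
    "item_10": [4, 6, 3],
    "item_11": [5, 7, 4],
})

dimensions = {                      # hypothetical grouping of items into dimensions
    "pedagogical_leadership": ["item_01", "item_02"],
    "administrative_leadership": ["item_10", "item_11"],
}

# Dimension score = mean of the items belonging to that dimension, per respondent.
scores = pd.DataFrame({
    name: responses[items].mean(axis=1) for name, items in dimensions.items()
})
print(scores)
print(scores.agg(["mean", "std"]))  # descriptive statistics per dimension
```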
2.2.3 Analyses and significance

Descriptive and parametric analyses of the data from the participant survey were conducted. Initially, the data were examined using descriptive analysis. Such analyses are used to study the properties of a variable and how respondents are distributed on it. These may be measures such as the mean and standard deviation. For example, we use average values to present respondents' answers in the various thematic areas. It is important to note that the average values must be related to the scales used. For example, the questions about efficacy expectations are interpreted such that the higher the value, the higher the leaders' efficacy expectations.
Furthermore, parametric analyses in the form of t-tests and analyses of variance (ANOVA) were conducted. Such analyses compare averages between different points in time and/or different groups. In this report, t-tests are used to examine whether respondents' answers have changed significantly between the pre- and post-tests and whether there are differences between cohorts.
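To make the tests described above concrete, the sketch below runs a paired t-test between pre- and post-test scores and a one-way ANOVA across cohorts on small made-up numbers. It illustrates the type of analysis described in the text; it is not the evaluation's own code or data.

```python
# Minimal sketch: paired t-test (pre vs. post) and one-way ANOVA across cohorts,
# run on small made-up numbers; an illustration of the tests described in the text.
import pandas as pd
from scipy import stats

paired = pd.DataFrame({
    "cohort":     ["Cohort 1"] * 4 + ["Cohort 2"] * 4 + ["Cohort 3"] * 4,
    "score_pre":  [5.1, 4.8, 5.3, 4.9, 5.0, 4.7, 5.2, 4.6, 5.4, 4.9, 5.1, 5.0],
    "score_post": [5.4, 5.0, 5.5, 5.1, 5.2, 4.9, 5.3, 4.8, 5.6, 5.1, 5.2, 5.3],
})

# Paired t-test: has the mean score changed significantly from pre- to post-test?
t_stat, p_value = stats.ttest_rel(paired["score_pre"], paired["score_post"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

# One-way ANOVA: do the post-test scores differ between the cohorts?
groups = [group["score_post"] for _, group in paired.groupby("cohort")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA across cohorts: F = {f_stat:.2f}, p = {p_anova:.4f}")
```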
When the term significance is used, it means, simply put, that the result is not due to chance. Consequently, it is not coincidence or measurement errors that create the changes or differences, but properties of the groups, or of what we measure, that enable us to find this result. A statistically significant result is thus a measure of how confident we can be that the results found in a sample can be generalized, and of how certain we can be that the results apply to the population. It is important to note that there are weaknesses in only reporting whether results are significant or not. Significant results may be trivial and unimportant, and sample size also plays a role here. Effect sizes are therefore also calculated. These are analyses that measure the strength of the difference between two means, for example the change in self-efficacy before and after the education. This provides a better measure than a significance test alone.
In relation to t-tests, the effect size Cohen's d is often used. This is calculated as the difference in average value between two groups divided by the pooled standard deviation of those two groups (Cohen, 1988). The standard deviation is a measure of dispersion, that is, the extent to which the distribution of responses congregates around the average or is more scattered across the scale. Cohen's d is therefore a measure that shows whether a difference between groups is trivial or whether it has practical significance. Cohen's d indicates whether the effect is small (0 to 0.29), moderate (0.3 to 0.49) or large (> 0.5). In the ANOVA analyses the effect size eta squared is calculated. This coefficient indicates whether the differences between the groups are small (0.01), moderate (0.06) or large (0.14). In the analyses of changes between pre- and post-tests for both efficacy expectations (NPSES) and organizational learning (DLOQ), the size of the change in terms of effect size is indicated. Although the changes are described as moderate, extreme caution must be exercised in interpreting the results.
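As an illustration of the effect sizes described above, the sketch below computes Cohen's d for a pre-post change and eta squared for a one-way ANOVA, following the definitions given in the text (mean difference divided by the pooled standard deviation; between-group sum of squares divided by the total sum of squares). The numbers are made up.

```python
# Minimal sketch of the effect sizes described in the text, on made-up numbers:
# Cohen's d = difference in means / pooled standard deviation,
# eta squared = between-group sum of squares / total sum of squares.
import numpy as np

pre = np.array([4.8, 5.1, 5.0, 4.7, 5.2, 4.9])
post = np.array([5.2, 5.4, 5.1, 5.0, 5.5, 5.2])

pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
cohens_d = (post.mean() - pre.mean()) / pooled_sd
print(f"Cohen's d = {cohens_d:.2f}")

# Eta squared for three hypothetical cohort groups
groups = [np.array([5.2, 5.4, 5.1]), np.array([5.0, 5.3, 4.9]), np.array([5.6, 5.2, 5.3])]
all_values = np.concatenate(groups)
grand_mean = all_values.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_values - grand_mean) ** 2).sum()
print(f"eta squared = {ss_between / ss_total:.2f}")
```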
2.3 Case studies of schools

Data were collected using qualitative and quantitative methods from a small number of schools at several points in time over a longer period. The two types of data collection were carried out in parallel and independently of each other, but are seen in relation to each other in the analysis process. For example, the focus of the interviews was not designed on the basis of the quantitative results, and vice versa. The purpose of the case studies is to describe any changes in learning culture (quantitative data), to identify any similarities in terms of changes in practice at the selected schools over time (qualitative data), and thus to assess whether these can be traced back to the principal's participation in the education.
2.3.1 Selection of schools and data collection

The selection was made from the first cohort that attended the leadership education in 2010, and 12 case schools were selected: two principals from each of the six program providers. The principals vary in terms of age, gender and number of years of experience as leaders. The selection is strategic with respect to differences in conditions such as type of school, size and geography. The schools range from large upper secondary schools with a relatively wide range of programs, through lower secondary schools, to small and large primary schools. Because the purpose is to identify any similarities between the schools when it comes to changes based on the principal's participation in the education, the individual schools will not be described and analyzed in the report. The choice of schools does, however, represent the variety found in the group of participants that the leadership education has as its target group. Although different schools' challenges and opportunities can vary a lot, it is precisely this variation that provides an opportunity to identify similarities and relate them to the principal's participation in the program. The schools are therefore regarded as instrumental cases in the evaluation.
Three rounds of data collection were completed in the period March 2011 to September 2014, and for various reasons two of the schools (schools 7 and 12) dropped out between these rounds [4]. The qualitative material, which is used to identify similarities, is extracted from the ten schools in the original selection that also completed the second round of data collection. Four of these ten schools have employed a new principal during the period, and the principals who participated in the education are now either employed as principals at other schools or employed by the school owners. There have also been changes in all the leadership teams interviewed during the period. Depending on when the principals changed positions, we have nevertheless interviewed some of them about the education and about practice in the schools, while at the schools with new principals, the new principal participated together with his or her leadership team. Data collection in the schools has been demanding and has largely been adapted to the individual school. The timing of the three rounds has therefore varied depending on different conditions at the schools, but also because the six providers' programs started at different times and had different durations. This shows that empirical studies of changes in practice in schools as a result of leader development measures are challenging when it comes to continuity of informants and other matters arising. At the same time, the turnover of leaders is itself an example of change in schools that can be traced back to the principals' participation in the leadership program, in that they seek new challenges after completing the education.

[4] School 7 pulled out before the second round of qualitative and quantitative data collection due to time pressure. School 12 pulled out in connection with the first round of data collection due to changes in management.
The first round of data collection was conducted in spring 2011 in the form of school visits and interviews with principals and, where applicable, with the leadership teams in the schools, as well as the collection of first-round data based on questionnaires (DLOQ, UWES) from leaders and school staff. For some of the smaller schools, interviews with the principal were conducted by telephone and questionnaires were sent by mail. Follow-up interviews with some of the principals were also conducted by phone. These were done after the principals had completed the education, since at the time the school visits were conducted they had not yet completed the program. The interviews with the principals focused on expectations and the perceived quality of the program they had attended. Participants were also asked to reflect on their own learning processes, on what they had applied from the education at their own schools right after the program was completed, and on the challenges of applying what had been learned from the program at their own schools.
The second round of data collection was done by telephone in September 2013 in connection with Report 3. The main focus was that the principals should first describe their own change and development as leaders in recent years; they were then asked whether and in what way participation in the leadership education had contributed to their own development, and finally about any other conditions or factors that had contributed to this development. In this round, the principals also reflected on changes in practice at their schools at the given time, and on whether and how they intended to work further on change and development at the schools. The third round of data collection was conducted in a similar manner to the first round, in the period April to September 2014. It consisted of school visits to most of the schools (some by telephone), interviews with principals and leadership teams where applicable, and an identical survey on organizational learning as in round one.
The quantitative data material, which is analyzed to examine changes in the schools, is based on the
questionnaire answered by the staff at two specific times: a pre-test in 2011/2012 and a post-test in
2014. Two of the twelve schools dropped out of the survey around the time of the pre-test, while two
schools (schools 10 and 11) dropped out in connection with the post-test 5. The quantitative data used
in the present report is therefore based only on the schools where staff have answered both the pre-
and post-tests (Schools 1, 2, 3, 4, 5, 6, 8 and 9). The survey was distributed in paper format to the staff
who were present at the time of the survey, and it is not taken into account that some schools have
had some changes in staff between the two dates. The numbers of respondents at the individual
schools who answered the pre- and post-tests were paired at the school level and are shown in the
table below.
Table 4 Overview of number of respondents pre- and post-tests distributed by schools

School (participants)   Pre-test answers   Pre-test %   Post-test answers   Post-test %
School 1 (35)           32                 91           32                  91
School 2 (130)          61                 47           93                  72
School 3 (150)          78                 52           85                  57
School 4 (31)           22                 71           23                  74
School 5 (30)           26                 87           29                  97
School 6 (60)           43                 72           11                  18
School 8 (19)           15                 79           7                   37
School 9 (22)           14                 64           17                  77
SUM                     291                70           297                 65

Note. Total number of participants is in brackets. The percentage of answers is only an estimate, as it is calculated from the
number of personnel in pedagogical and leadership positions at the time of the last round of data collection. We have not
taken into account that the number who received the form on the survey day could be somewhat lower, so the actual
percentage of answers could be somewhat higher than given in the table.
The table reflects the schools´ variation in size, showing that the two largest schools (2 and 3) make
up nearly half of the respondents if we see the selection as a whole. The response rate for the schools is
generally relatively high, but varies somewhat between schools. Some have almost one hundred
percent response, while others have a somewhat lower response, and the table shows that there is
some variation for each school in the number of responses to the pre- and post-tests. The average
response rate across schools is 70 percent at the pre-test and 65 percent at the post-test.
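The answer percentages in Table 4 are straightforward to reproduce. The sketch below is for illustration only and uses the published participant and respondent counts; the dictionary structure and variable names are our own and are not taken from the evaluation's data files.

# Illustrative only: reproduce the answer percentages in Table 4 from the
# published participant and respondent counts (data structures are our own).
participants = {"School 1": 35, "School 2": 130, "School 3": 150, "School 4": 31,
                "School 5": 30, "School 6": 60, "School 8": 19, "School 9": 22}
answers_pre  = {"School 1": 32, "School 2": 61, "School 3": 78, "School 4": 22,
                "School 5": 26, "School 6": 43, "School 8": 15, "School 9": 14}
answers_post = {"School 1": 32, "School 2": 93, "School 3": 85, "School 4": 23,
                "School 5": 29, "School 6": 11, "School 8": 7,  "School 9": 17}

def answer_percentages(answers):
    # answers divided by the number of staff in pedagogical/leadership positions
    return {school: round(100 * answers[school] / participants[school])
            for school in participants}

pre = answer_percentages(answers_pre)
post = answer_percentages(answers_post)
print(pre)                                   # per-school pre-test percentages
print(round(sum(pre.values()) / len(pre)))   # average pre-test rate, about 70
print(round(sum(post.values()) / len(post))) # average post-test rate, about 65

Averaging the rounded school-level percentages gives roughly 70 percent for the pre-test and 65 percent for the post-test, in line with the table.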
5 Schools 10 and 11 have participated in both rounds of data collection, but round two consisted only of qualitative data collection, as they did not return the forms filled in as agreed.
2.3.2 Instruments and analyzes
The instruments used in the first and third rounds of qualitative data collection are based on the
theoretical framework for the evaluation and the objectives of the leadership education. The
interviews with the leadership teams consisted of three interview instruments: the four areas of
expertise in the Directorate for Education and Training´s model for leadership as described in the
introduction, Mintzberg's model of leadership as practice 6, and seven dimensions of organizational
learning. Informants were first asked individually to circle the areas of the competence model that
they felt characterized practice at their own schools, then to reflect on this together, and to make
suggestions for areas that the model may not cover.
Subsequently the leadership teams, based on Mintzberg's model, were asked to describe the types of
knowledge the leadership team uses in its leadership work, and finally asked to reflect on conditions
in their own schools based on the dimensions of organizational learning. These seven dimensions are
the same as in the survey to the school staff, and correspond to the question battery DLOQ that is
applied in the participant survey (described earlier in this chapter). Additionally, the survey included
questions about the four areas of the competence model and the application of knowledge types in
transformational and development work. In interviews with the principals who had participated in
the education (where they were still employed) in the last round of data collection, the focus was on a
more direct description of changes at the school based on the four areas of expertise in the Directorate
for Education and Training´s model. The researchers have also looked into different documents
(strategies, plans and surveys) from the schools that wanted to share them, but these are not
reproduced in the report for reasons of anonymity.
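To make the survey structure more concrete, the following minimal sketch shows how item responses on a 1-7 scale could be aggregated into DLOQ dimension scores. The item codes, the item-to-dimension mapping and the column names are hypothetical; the actual DLOQ battery and its scoring are documented in the earlier reports.

# Hypothetical sketch: aggregating 1-7 survey items into DLOQ dimension scores.
# Item codes, the item-to-dimension mapping and column names are assumed.
import pandas as pd

responses = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2],
    "item":       ["cl_1", "cl_2", "dlg_1", "cl_1", "cl_2", "dlg_1"],
    "score":      [5, 6, 4, 3, 4, 5],
})

item_to_dimension = {"cl_1": "Continuous learning",
                     "cl_2": "Continuous learning",
                     "dlg_1": "Dialogue"}

responses["dimension"] = responses["item"].map(item_to_dimension)

# mean item score per respondent and dimension, then mean across respondents
dimension_scores = (responses
                    .groupby(["respondent", "dimension"])["score"].mean()
                    .groupby("dimension").mean())
print(dimension_scores)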
To identify similarities across schools, interviews with principals and leadership teams were first
transcribed and then analyzed through data reduction, coding and categorization. Data reduction
was done by organizing the relevant information in a school matrix, both to extract categories for
change in practice that were similar across schools and to capture some unique examples. The
quantitative results from the survey on changes in learning culture, after the principal´s participation
in the education, were first analyzed in order to say something about overall changes in the schools,
and then for each school separately. The results from these analyzes then served as a starting point
for further exploration of interesting trends in the qualitative material.
The selection of schools is strategic, based on variation, and consists of schools at all levels. The
purpose of the survey is to identify patterns across the different schools when it comes to change in
learning culture, but also any unique circumstances. Both similarities and unique features at some
schools will be elaborated on using the qualitative material.
2.4 Summary
The methods and data basis that are described in this chapter mainly consist of data that serves as
the basis for the analyzes presented in Chapter 3 and Chapter 4. The data from the participants is
based on the participant survey with altogether over 1100 participants distributed in three cohorts
and six program providers. Data from the school cases is based on interviews with principals and
leadership teams, as well as surveys on organizational learning. For the method and data basis for the
analyzes from TALIS that are presented in Chapter 5, we refer to Caspersen, Aamodt, Vibe and
Carlsten (2014). In the next three chapters the results from the analyzes are presented, before these
are summarized in Chapter 6.

6 For a detailed description of the Directorate for Education and Training´s model for leadership and Mintzberg´s theories of leadership as practice, see Report 1 and Report 2 (Hybertsen Lysø et al., 2011; 2012).
3 Participants´ assessments of change
In the evaluation period, the providers have completed the education for three cohorts, and they
reported that they have continually developed the provision based on various input. The previous
report included data up to the post-test for Cohort 2, and in the present report the post-test for Cohort
3 is also included in the analyzes. Some of the analyzes correspond to those in Report 3 but with a
larger selection, while analyzes of the participants' experience of learning culture have not been carried
out earlier in the evaluation. The following two research questions are examined based on data from
the participant survey:
- Do the participants experience that the learning culture at their own schools has changed from pre- to post-participation in the education?
- Based on the participants´ assessments of benefits and own development, can any changes in the program provisions´ quality from cohort to cohort be identified?
The first question is addressed through an analysis of the dimensions of organizational learning more
generally, while the second question is examined through the results from questions about efficacy
expectations and organizational learning from cohort to cohort, and further analysis of key findings
from Report 3. The purpose of these studies is twofold. Firstly, we attempt to map changes in the
participants' assessments of own development, and of the learning culture in the school, that the
leadership education could conceivably have contributed to. Secondly, we want to highlight any
changes in the program provisions´ quality based on changes in the participants' evaluation of
benefits and own development from cohort to cohort during the survey period. In the absence of
"objective" measures of the benefits of the leadership education, the analyzes are based on changes in
the participants' experience of learning culture in the schools and their expectations of mastering
different aspects of school leadership.
3.1 Participants´ experience of the school´s learning culture
To assess the participants´ experience of learning culture, they were asked how often the following
conditions occur at their own schools, on a scale from 1 to 7. Changes after the education are
measured along the seven dimensions of organizational learning by comparing answers from the pre-
and post-tests. The pre-test is based on Cohorts 2-3, as organizational learning was not included in the
pre-test before Cohort 2, while the post-test is based on Cohorts 1-3. The respondents´ average scores
from pre- to post-tests are shown in Figure 2.
Figure 2 Dimensions of organizational learning - general changes (average scores on a scale of 1-7 for the pre-test (Cohort 2-3) and the post-test (Cohort 1-3) along the seven dimensions: continuous learning; dialogue; cooperation and team; knowledge sharing; inclusion and common vision; systematic thinking; leadership, role models and support)
The figure shows that respondents scored higher on the post-test than the pre-test. This indicates
that participants in the education generally have an altered perception of learning culture in their
own schools in the wake of the program, measured along all seven dimensions of organizational
learning. However, this is only the participants´ perceptions of their own schools´ learning culture,
and the analysis says nothing about changes in learning culture in the schools from the staff´s
perspective. We will come back to this in the next chapter where a similar analysis is conducted for
the selection of schools included in the evaluation. The analyzes of organizational learning are also
used to examine whether any changes in program provisions´ quality from cohort to cohort can be
identified.
To investigate whether the differences were significant, a t-test was performed (see appendix). An
ANOVA of all dimensions of organizational learning was also conducted to investigate whether
scores on the post-test were significantly different between cohorts (post-test scores only).
There were no significant differences between cohorts in these analyzes. Table 5 shows the mean
scores for all dimensions.
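As an illustration of the tests just described, the sketch below runs a paired t-test on pre- and post-test scores and a one-way ANOVA on post-test scores across three cohorts. The data are simulated and the variable names are assumptions; only the choice of tests mirrors the text.

# Simulated illustration of the tests described above: a paired t-test on
# pre- vs post-test scores, and a one-way ANOVA on post-test scores by cohort.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(4.8, 0.6, size=300)            # simulated pre-test scores (1-7 scale)
post = pre + rng.normal(0.2, 0.5, size=300)     # simulated post-test scores
cohort = rng.integers(1, 4, size=300)           # cohort labels 1, 2 or 3

t_stat, p_prepost = stats.ttest_rel(pre, post)  # did scores change from pre to post?

groups = [post[cohort == c] for c in (1, 2, 3)] # post-test scores split by cohort
f_stat, p_cohort = stats.f_oneway(*groups)      # do cohorts differ on the post-test?

print(f"paired t-test: t = {t_stat:.2f}, p = {p_prepost:.3f}")
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_cohort:.3f}")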
Table 5 Average organizational learning for Cohort 1-3

Variable                               Cohort 1   Cohort 2   Cohort 3
Continuous learning                    4.84       4.89       4.88
Dialogue                               5.12       5.13       5.22
Cooperation and team                   4.97       4.95       5.07
Knowledge sharing                      4.64       4.58       4.64
Inclusion and common vision            5.34       5.50       5.40
Systematic thinking                    5.00       5.10       5.13
Leadership, role models and support    5.42       5.56       5.61
The table shows that average values for all dimensions of organizational learning are relatively stable
between the three cohorts. Figure 3 illustrates further that the differences between cohorts are
small.
Figure 3 Dimensions of organizational learning per cohort (post-test) (average post-test scores, scale 1-7, for Cohorts 1-3 along the seven dimensions)
The findings show that the participants experience that the learning culture at their own schools has
changed from the start to the end of the education, but that no significant changes between cohorts
can be identified. This means that any changes made to the provisions during the program period
did not have an effect on the participants´ experience of the school´s learning culture, measured
through the seven dimensions of organizational learning.
3.2 The participants´ capacities for learning and development
Report 3 investigated whether there was a correlation between participants' expectations and
experienced benefits of the education, as well as changes in the participants' capacities for learning
and development. A key finding was that participants´ capacities for learning and development -
measured through efficacy expectations - had increased along several dimensions from the beginning
to the end of the program. The different dimensions used to measure efficacy expectations are taken
from the Norwegian Principal Self-Efficacy Scale (NPSES) (Federici & Skålvik, 2011) and consist
of different questions about school leaders´ confidence in performing various tasks assigned to them.
The analyzes showed that the changes had an effect size from small to moderate for the first two
cohorts in the leadership education. The changes are not necessarily a direct result of participation in
the program and can also be caused by other conditions, such as having longer experience as principal.
The results nevertheless provide evidence that there has been a positive development among the
participants, which was interpreted as the education contributing to increased confidence in the
leadership role.
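The report does not state which effect-size formula lies behind the "small to moderate" characterization; a common choice for pre-post designs is Cohen's d for paired observations, sketched below with purely illustrative numbers.

# Illustrative numbers only; the report does not state its effect-size formula.
import numpy as np

def cohens_d_paired(pre, post):
    """Cohen's d for paired observations: mean difference / SD of the differences."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    return diff.mean() / diff.std(ddof=1)

pre = np.array([4.2, 5.6, 4.9, 5.3, 4.1, 5.8])   # efficacy scores before the program
post = np.array([4.5, 5.5, 5.3, 5.6, 4.3, 5.9])  # efficacy scores after the program
print(round(cohens_d_paired(pre, post), 2))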
To investigate whether these positive results are maintained over time, we have in the present
analysis included Cohort 3 (post-test) and conducted analyzes similar to those in Report 3 (Hybertsen
Lysø et al., 2013). The analyzes are done at the individual level, and they show that participants´
efficacy expectations are stable over time, as we did not find any significant differences between
cohorts. To exemplify this, the average scores for efficacy expectations with and without Cohort 3
are illustrated in Table 6.
Table 6 Average efficacy expectations for Cohort 1-2 and Cohort 1-3

Variable             Pre (Cohort 1-2)   Pre (Cohort 1-3)   Post (Cohort 1-2)   Post (Cohort 1-3)
Supervision          4.58               4.51               4.89                4.93
Local community      3.61               3.64               3.91                3.95
Economy              5.10               5.00               5.35                5.31
Pedagogical          5.25               5.21               5.55                5.58
Satisfaction         5.04               5.05               5.35                5.37
School owner         4.61               4.59               4.91                4.88
Parent               5.05               5.06               5.26                5.32
Support teachers     5.31               5.31               5.46                5.49
Leadership           4.74               4.68               5.08                5.07
Teaching             5.03               5.05               5.35                5.35
Results follow-up    4.86               4.81               5.22                5.19
The table shows that the numbers do not change much, something that indicates that the
participants´ positive experience of their own capacities for learning and development is sustained
over time. Figure 4 illustrates this further.
Figure 4 The participants´ efficacy expectations - inclusion of Cohort 3 (average scores, scale 1-7, for Pre (Cohort 1-2), Pre (Cohort 1-3), Post (Cohort 1-2) and Post (Cohort 1-3))
Summed up, the present analyzes, which include Cohort 3, show that participants´ experience of own
development is both positive and stable over time. The results show that the participants´ efficacy
expectations have a significant increase (effect size small to moderate) along all dimensions from the
start to the end of the education for all three cohorts. The fact that we find the same results in the
analyzes of three subsequent cohorts strengthens the earlier results reported in Report 3
(Hybertsen Lysø et al., 2013).
3.3 The participants´ assessments of change over time
To get an indication of whether the program providers developed their programs underway, and
based on the results of the analyzes in Reports 2 and 3, differences between the three cohorts were
examined. Those analyzes showed significant differences between cohorts only in the variables
"application" of different types of knowledge and understanding of education policy, as well as
experienced "time spent on administration" (Hybertsen Lysø et al., 2013). Cohort 3 is included in the
present analysis and, based on the post-test, a one-way analysis of variance (ANOVA) was conducted.
The analysis examines whether there are significant differences between the three cohorts in these
variables, and the results of the analysis are shown in Table 7.
Table 7 "Application" and "Time spent on administration" (ANOVA)
Variable
Application
Time spent on administration
Df
2 (558)
2 (433)
F
6.300
4.350
P
.002
.013
ETA
0.02
0.02
The table shows effect size ETA. This coefficient tells us to what extent the differences between
groups are small (0.01), moderate (0.06), or large (0.14). In other words, there are small differences
between cohorts. Table 8 shows average scores for the relevant variables, distributed by cohort.
Table 8 Average "Application" and "Time spent on administration" for Cohort 1-3
Variable
Application
Time spent on administration
Cohort 1
3.90
3.05
Cohort 2
4.13
3.03
Cohort 3
4.06
2.88
From the table we see that the average for "application" is somewhat higher for Cohorts 2 and 3.
Furthermore, it is interesting that the experienced time spent on administration tasks seems to decline,
something that may indicate a change in the program provisions´ focus areas. It is in any case
important to note that this could be due to coincidence and that the differences between cohorts are
not of great significance.
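The ETA coefficient in Table 7, judging by the 0.01/0.06/0.14 thresholds cited above, corresponds to eta squared from the one-way ANOVA. A minimal sketch of that calculation, on simulated data with group sizes matching the degrees of freedom for "application" (2 and 558), is given below; neither the data nor the exact group sizes per cohort come from the evaluation files.

# Eta squared = SS_between / SS_total for a one-way design; the data below are
# simulated, with group sizes matching Df = 2 (558) for the "application" variable.
import numpy as np

def eta_squared(groups):
    values = np.concatenate(groups)
    grand_mean = values.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((values - grand_mean) ** 2).sum()
    return ss_between / ss_total

rng = np.random.default_rng(1)
cohort1 = rng.normal(3.90, 0.9, 180)   # simulated "application" scores, Cohort 1
cohort2 = rng.normal(4.13, 0.9, 190)   # Cohort 2
cohort3 = rng.normal(4.06, 0.9, 191)   # Cohort 3
print(round(eta_squared([cohort1, cohort2, cohort3]), 3))   # a small eta-squared value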
3.4 Summary
The analyzes show that the participants experience that the learning culture in their own schools has
changed from the beginning to the end of the leadership education. The significant changes in
participants' capacities for learning and development, measured through efficacy expectations
(presented in Report 3), are supported by the analyzes in this chapter, where all three cohorts are
included. The results show relatively similar patterns when it comes to participants' assessments of
change at the individual and organizational levels. This gives no indication that the education program
contributes to specific changes in the schools, but may indicate a correlation between the
participants' perceptions of own learning and development and their experience of the learning culture
in their own schools. It is therefore interesting to examine whether staff at the selected school cases
experience similar changes in learning culture at the schools about two years after the principals have
completed the leadership education.
To identify any changes in the program provisions´ quality from cohort to cohort, based on
participants' evaluations of benefits and own development, we analyzed whether there are any
differences between the three cohorts that have completed the leadership education in the evaluation
period. The analyzes show that no significant changes in participants' benefits can be identified as the
program providers have delivered the education over time. The small differences may be an
indication of changes in the providers' intentions, but they can also be due to the participants'
expectations having changed (based on previous findings on the correlation between benefits and
expectations) or to the total group of participants having changed prerequisites. The results can be
interpreted as indicating that program quality is stable over time after several cohorts have completed
the leadership education. Despite all providers expressing that they developed their programs during
the period, this is to a small extent reflected in the present analyzes. The small differences between
cohorts can be explained in a similar manner as previous findings: all participants are very satisfied
regardless of which provision they attended.
4 Change in the schools
A central problem statement in the evaluation is how the leadership education contributes to change
in leadership practice. Based on the data material from the case studies of the schools, the following
two research questions will be addressed in this chapter:
- Do the staff at the selected schools experience that the learning culture has changed in the schools from pre- to post-participation in the education?
- Can some common characteristics at the selected schools, when it comes to changes in practice from the participation in the education, be identified?
The first question examines change in learning culture in the selected schools based on the survey
about organizational learning to the staff. These results will be combined with the qualitative
descriptions of dimensions of organizational learning. The second question sheds light on any change
in practice at the schools based on the qualitative data from interviews with principals and/or the
leadership teams. The purpose is to identify common characteristics between the schools when it
comes to change, and to present a few unique examples. The analyzes seek to illuminate how the
participants' learning and development from the education have resulted in changed practice at their
schools.
4.1 Change in learning culture
The analysis of organizational learning based on the participant survey, presented in the previous
chapter, shows that participants in all three cohorts experience that the learning culture has changed
from the beginning to the end of participation in the leadership education. It is therefore interesting to
examine whether similar changes can be found among the staff in the selection of schools that
participated in the evaluation. The same survey was used both for the participants in the three
cohorts (presented in the previous chapter) and as part of the case studies, but it was executed as two
separate data collections. The question posed in this chapter is whether the staff feel that the learning
culture has changed in the schools from when the principal participated in the education to about
two years after completion. In addition to the survey data from the pre- and post-tests at the
schools, the analysis is based on qualitative interview data. It is important to note that the
quantitative analyzes are done at the school level. This makes the selection very small (N = 8), and it
is therefore not possible to do parametric analyzes or indicate significance. The results are
illustrated through descriptive representations and must therefore be interpreted with caution.
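The school-level comparison can be illustrated with a small sketch: staff responses are averaged per school and wave, and the change is simply the difference between the two means, with no significance testing given N = 8. The data frame layout, column names and values below are assumptions, not the evaluation data.

# Assumed layout: one row per staff respondent, with school, wave and a dimension
# score; means are taken per school and wave, and the change is post minus pre.
import pandas as pd

staff = pd.DataFrame({
    "school": ["School 1", "School 1", "School 1", "School 1",
               "School 2", "School 2", "School 2", "School 2"],
    "wave":   ["pre", "pre", "post", "post", "pre", "pre", "post", "post"],
    "continuous_learning": [4.6, 4.8, 4.5, 4.7, 5.0, 5.2, 4.9, 5.1],
})

school_means = (staff.groupby(["school", "wave"])["continuous_learning"]
                     .mean()
                     .unstack("wave"))
school_means["change"] = school_means["post"] - school_means["pre"]
print(school_means)     # descriptive only: no significance test with N = 8 schools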
Analysis of the schools' overall scores for learning culture in the pre- and post-tests, based on the
seven dimensions of organizational learning, is presented in Table 9.
Table 9 Overall scores on dimensions of organizational learning pre- and post-tests

Variable                                          Average   Standard deviation   Change
Continuous learning (pre-test)                    4.58      0.465
  Post-test                                       4.55      0.379                -0.03
Dialogue and discovery (pre-test)                 5.10      0.279
  Post-test                                       5.08      0.318                -0.02
Cooperation and team learning (pre-test)          4.87      0.457
  Post-test                                       4.77      0.302                -0.10
Knowledge sharing (pre-test)                      4.30      0.439
  Post-test                                       4.26      0.488                -0.04
Inclusion and common vision (pre-test)            4.90      0.449
  Post-test                                       4.73      0.380                -0.17
Systematic thinking (pre-test)                    4.48      0.444
  Post-test                                       4.47      0.456                -0.01
Leadership, role models and support (pre-test)    4.84      0.402
  Post-test                                       4.73      0.360                -0.11
The table shows that we cannot identify a pattern of change in organizational learning, measured along
the seven dimensions from pre- to post-tests, corresponding to the one found for the participants. This
means that an analysis of participants´ experience of change in learning culture at their own schools is
generally not sufficient to say anything about change in the schools.
In the following we present analyzes of the seven dimensions of organizational learning for all eight
schools, where the staff have answered how often something occurs at their own schools on a scale
from 1 to 7. The analyzes for each of the eight schools are then compared in order to find any possible
changes in the staff´s experience of learning culture. The scores on the pre- and post-tests,
distributed by dimensions of organizational learning, are illustrated in Figure 5.
Figure 5 Combined - change of dimensions of organizational learning pre- and post-tests (average staff scores, scale 1-7, pre and post, along the seven dimensions)
The figure shows no great change in any of the dimensions of organizational learning in the schools
following the principals´ participation in the education. To find any similarities across schools, but also
any unique aspects of individual schools that can be traced to the principal´s participation in the
leadership education, we have analyzed the change from pre- to post-tests for each of the schools. The
purpose of this comparison is not to generalize any changes from pre- to post-tests, but the
quantitative analyzes will form a basis for further qualitative research. Change from pre- to post-tests
is analyzed on a scale from 1 to 7 (see appendix "analyzes schools"). In cases where the post-test shows
a higher score, we refer to this as a positive change. The degree of change from pre- to post-tests for
the schools is illustrated in Figure 6.
Figure 6 The schools - change of dimensions for organizational learning (change in average score from pre- to post-test for Schools 1, 2, 3, 4, 5, 6, 8 and 9 on each of the seven dimensions)
The figure illustrates that all schools have a weak change from pre- to post-tests: four of the schools
(3, 4, 6 and 8) have had a change in a positive direction, while four of the schools (1, 2, 5 and 9) have
had a change in a negative direction. We note that the changes for the individual schools are weak and
need to be interpreted with care, and cannot be seen as an "objective" measure of organizational
changes at the individual school, as they only say something about the staff´s experience of learning
culture. Caution must also be exercised in terms of which changes can be attributed to the principal´s
participation in the education.
In the time period between the pre- and post-tests, a number of factors can explain the changes.
Continuity or change in leadership could be a possible explanation. For example, Schools 1, 2 and 5
hired a new principal in this period, while the four schools with a change in a positive direction have
had the same principal, with otherwise only some changes in the leadership team at two of these
schools. The analyzes will be used to find similarities between the schools when it comes to change,
and will be further examined using qualitative data. Meanwhile, exceptions to the pattern will be
identified by presenting unique examples. It should here be emphasized that School 4 is interesting in
the sense that the principal and two members of the leadership team attended the leadership education
together, and that there have been some changes in the leadership team during the period when the
case study was undertaken. School 9 differs somewhat from this pattern, as the analyzes show a
negative change in the experience of learning culture among staff despite the school having had the
same principal in the investigation period. Although the school has had the same principal for the
entire period, a great deal of structural changes, in both the leadership team and the organization,
have been implemented. This indicates that changes in practice initiated by the principal do not
necessarily have a positive impact on the staff´s experience of learning culture; when the leadership
implements changes, the impact can also be negative. Another interpretation is that continuity in the
leadership and the leadership team is important for the development of learning culture.
The survey also contained questions on performance goals based on the four competency areas
taken from the model of school leadership, some questions on which types of knowledge change and
development in the school are based on (from Mintzberg), and questions about experienced latitude
and view of the job (UWES). Table 10 shows total scores for each of these measures.
Table 10 Total - performance goals, change/development, and latitude, view of the job

Variable                             Average   Standard deviation   Change
Performance goals (pre-test)         4.35      0.463
  Post-test                          4.28      0.362                -0.07
Change and development (pre-test)    4.36      0.214
  Post-test                          4.22      0.152                -0.14
Latitude (pre-test)                  3.53      0.492
  Post-test                          3.70      0.525                0.17
View of the job (pre-test)           5.69      0.309
  Post-test                          5.62      0.280                -0.07
The table only shows weak changes, and the total scores for each of the four question categories are
illustrated in Figure 7.
Figure 7 Total - performance goals, change/development, and latitude, view of the job (average pre- and post-test scores, scale 1-7, for performance goals, transformation and development work, latitude, and view of the job)
The figure shows that the total scores for the schools show a negative change in performance goals
(competency areas for school leadership), in the application of knowledge types in change and
development work, and in view of the job (engagement), while the scores for latitude show a weak
positive change. As for organizational learning, possible similarities across the different schools are
examined, but also possible unique conditions. The changes from pre- to post-tests at the schools for
the different conditions are analyzed and presented in Table 11.
Table 11 Schools - performance goals, change/development, and latitude, view of the job

Variable                                 School 1   School 2   School 3   School 4   School 5   School 6   School 8   School 9
Performance goals (pre-test)             4.37       4.91       4.15       3.39       4.41       4.34       4.38       4.83
  Post-test                              4.15       3.74       4.10       4.30       4.04       4.75       4.82       4.33
  Change                                 -0.22      -1.17      -0.05      0.91       -0.37      0.41       0.45       -0.50
Change and development work (pre-test)   4.29       4.31       4.13       4.08       4.35       4.43       4.74       4.54
  Post-test                              4.12       4.07       4.29       4.36       3.96       4.33       4.38       4.21
  Change                                 -0.17      -0.24      0.16       0.28       -0.39      -0.10      -0.36      -0.33
Latitude (pre-test)                      3.61       3.93       3.77       2.52       3.25       3.56       4.14       3.46
  Post-test                              3.79       3.63       3.67       3.04       3.50       3.95       4.79       3.25
  Change                                 0.18       -0.31      -0.10      0.52       0.25       0.39       0.64       -0.21
View of the job (pre-test)               5.83       5.62       5.67       5.11       5.79       5.43       5.94       6.10
  Post-test                              5.58       5.62       5.57       5.10       5.57       5.83       6.10       5.58
  Change                                 -0.25      0.00       -0.10      -0.01      -0.22      0.40       0.15       -0.51
The table shows that the changes are weak two years after the principal´s participation in the education.
The changes for the eight schools are illustrated in Figure 8.
Figure 8 Schools - performance goals, change/development, latitude, view of the job (change from pre- to post-test per school for performance goals, transformation and development work, latitude, and view of the job)
The figure shows that the pattern has certain similarities with the scores for organizational learning,
something that can be explained by the changes at Schools 1, 2 and 5, which had hired a new principal.
The table does, however, show a somewhat different pattern of change across the schools than the
analyzes of organizational learning, performance goals and knowledge types. For latitude, all schools
except Schools 2, 3 and 9 had a weak positive change. The results must be interpreted very
cautiously in terms of whether the principal´s participation in the education contributes to the staff
feeling greater freedom to prioritize with regard to use of time and work tasks. Changes in
experienced latitude can have many explanations, everything from absence of leadership to extra
resources and good organization of the work.
Performance goals vary somewhat more between the schools, and these will therefore be looked at
in more detail to see whether the changes can be attributed to particular competency areas in the
Directorate for Education and Training´s model more than to others. The questions ask the
respondents to assess whether they experience improvement in the following areas compared with the
year before: (1) the students´ learning outcomes, (2) leadership and administration, (3) cooperation,
organization building and supervision of teachers, and (4) development of the school. The scores,
distributed by school, are presented in Figure 9.
Figure 9 Schools - change in performance goals by competence area (students´ learning outcomes; management and administration; cooperation, organization building, and supervision of teachers; development of the school)
The figure shows that among the schools that had the same principal throughout the period (3, 4, 6, 8
and 9), School 4 has had a somewhat more positive change in performance goals, based on the four
areas of competence, than the other four schools. As mentioned, School 4 had continuity in the
leadership team during the entire survey period. Of the three schools that had a new principal (1, 2
and 5), School 2 showed the greatest negative change. Here it is worth noting that School 2 is the
largest in the selection and has a relatively large leadership group.
Analysis of the types of knowledge the schools´ change and development work is based on was included
to examine both whether the schools work systematically with knowledge and which types of knowledge
are applied. Changes from pre- to post-tests are examined for each school. The staff were asked to
assess whether they apply (1) research and statistics, (2) experiences from practice and (3) visions and
ideals in the school´s development work. The scores for the schools are presented in Figure 10.
Figure 10 Schools - application of knowledge types in change and development work (change from pre- to post-test per school for research and statistics; experiences and practice; visions and ideals)
The figure shows weak changes in the schools´ application of different types of knowledge in
development work, but the differences between schools resemble the differences in changes in
organizational learning and performance goals. At the same time, the results show only weak changes
in the application of research and statistics in change and development work after the principal had
participated in the education. With a starting point in the patterns presented, and based on the
quantitative analyzes, the last part of the chapter summarizes the qualitative analyzes to describe the
changes in practice in the schools.
4.2 Change in leadership practice
A general feature of the interviews is the experience that changes have occurred in the schools
during the investigation period. These can concern either the role of the principal and internal
conditions at the school, and/or the relationship to the school owner or external actors. Nevertheless,
the changes described appear not very concrete and are difficult to relate directly to the principal's
participation in the education. Examples of this include changes in which the organization has moved
in the right direction, cooperation has improved, or staff have more focus on learning. We identified
some similarities, but also some unique examples, to illustrate how the principals have converted
learning and development from the education into changed practice. First we describe similarities
between the schools when it comes to changes within the school: reduction in the number of priority
areas, development of an effective leadership team and increased focus on knowledge development.
Further, we describe changes in the leadership's relationship to both the school owner and external
actors.
The most obvious similarity between the schools in the survey is that most have reduced the number
of priority areas. Several schools have expressed that they now have only one area or major project
they concentrate on, that they are working to reduce the number of priority areas, or that they want
to make this happen. A common feature of many of the schools is priority areas across disciplines
and/or cohorts involving all staff. Examples of this are projects that deal with the basic skills of the
students, such as reading or writing, and projects relating to skills among staff, such as classroom
leadership or assessment work. However, there are relatively large differences between the schools in
terms of practice in how the focus area is decided, the work processes, and how the work on the
project is organized in terms of time and place. Input from the leadership education seems to have
significance for the principals when it comes to leadership of the school's priority areas, and several
express that they experience greater confidence in making decisions in the processes, as well as a
clearer steering of these. However, we can identify relatively large variations between the schools in
the perceived level of ambition among leadership and staff, and in how the level of ambition is
connected to these priority areas and/or students´ learning outcomes. Several express that an
increased focus on one project creates an arena for change, while widening the focus is perceived to
produce little change.
Another key feature is whether the group of leaders appears as a coordinated leadership team,
something that is expressed in interviews with both principals and leadership teams. This is for example
reflected in the organization of leadership work when it comes to time use and tasks, but also in
whether the group of leaders appears coordinated regarding various practices. Here it is worth
noting that at four of the schools a new principal was employed during the survey period, and there
have been changes in the leadership team at seven of the ten schools. Participation in the education
has made several of the principals more aware of the importance of a well-functioning leadership
team for creating change and development. Examples include the preparation of clear goals and
expectations for the school´s development work, the importance of appearing as role models for
staff and students, and turning the focus towards learning processes.
Sharing experiences and reflecting together is something the principals themselves have realized the
importance of through participation in the leadership education (see Report 3), and several want to
create or develop structures and processes for collective knowledge building in their own schools.
Several schools are concerned with improving systems for knowledge sharing in order to achieve
development, but there are different practices when it comes to what is shared and how this is done.
Some schools have expressed that they have a "sharing culture", but a challenge that several have
mentioned is getting people to come forward and share examples of good practice. Several are
working to improve this, for example through sharing on Fronter, requirements to share acquired
knowledge after courses, colleague visits, school walkthroughs and internships at other schools.
In addition, several of the schools describe that they are now working more on systematizing their own
experiences, and that they often use research as a basis for development work. A good proportion of
the leaders - both principals and others - also express that they use research to legitimize arguments
for new measures towards the staff, the direction of the work, as well as areas for common reflection.
The ability to apply research in different ways seems to create greater confidence in the role as leader,
and several have spent time in the leadership group discussing research.
We refer, however, to the findings on the schools´ change in organizational learning, which show
that four of the five schools that have had the same principal for the entire period saw a slight
positive change. These schools all differ in school type and size. Two examples can be
drawn up on the basis of the most divergent results from the quantitative analyzes, and changes at
these schools are described in more detail based on the similarities. At School 4, which has had the
most positive change in learning culture, the entire leadership team attended the leadership
program simultaneously. The team highlights, on several occasions, that joint participation in the
education is one of several success criteria for the school´s development work and change. They
emphasize that the leadership education has helped to strengthen cooperation in the leadership
team, exemplified through the focus on learning and organizational development, with the consequence
that there is less focus on administration and leadership.
The strengthened cooperation in the team is explained by the development of a common frame of
reference and terminology for reflection on school practices already during the education. The
leadership team says that there is a relatively high level of ambition for the school and the staff,
which extends beyond positioning in school rankings. The focus is on collective knowledge building
through organization and structures; for example, restructuring of the school´s schedule has given
staff more shared time for collaboration across cohorts and subjects. The school has one major
project related to basic skills that has taken place over a longer period, which the leadership
team expresses has helped the staff become more confident in and motivated for knowledge
sharing.
The results for School 9 show the strongest negative change in learning culture. The school has had the
same principal in the research period, but has a history of very frequent leader changes and has been
characterized by a lack of continuous and clear leadership for a relatively long time. In the research
period, a new leadership team was established, and a lot of reorganization was implemented to break old
patterns that have been obstacles to developing the school. One of the changes is the establishment
of a new and larger leadership team that has had to work to find its modus operandi. According to the
principal, this has at the same time contributed to parts of the staff experiencing changes in their own
positions and reduced influence at the school. The leadership team tells of a somewhat divided staff
regarding both cooperation and work methods, but realizes that the school is in an early phase of an
extensive alteration of structures and processes. This situation again seems to be significant both for
the number of parallel projects and for collective knowledge development, but the leadership team is
aware of what can contribute positively. They have a desire to, among other things, clarify the
school´s visions and goals, and to prioritize certain areas in relation to these. Like several of the
other schools, they see it as a challenge to set aside enough time for transformational work, and to
prioritize between competing activities. Both the principal and the team express that such transformation
processes require time, but they experience confidence and support among the staff. The principal
experiences that the school is only now beginning with local knowledge development, and that they
have created structures to share knowledge and are ready on the launch pad. The leadership
education has contributed to increased confidence in the principal to implement changes, and the
principal highlights the learning community in the education, in combination with the lectures and the
literature. The education has had the greatest impact on the work within the school, but has also had
significance in relation to the school owner.
Several of the principals and/or leadership teams describe that the relationship to both the school owner
and external actors has in fact been key to changing practices at the school. This means, among other
things, that the municipality has laid down guidelines for the school´s focus, priority areas and vision,
as well as providing support for structures, processes and analyzes. The education, however, appears
to have had an impact in that some of the principals are clearer about their expectations of school
owners, and the participation has in some cases created an arena for closer cooperation. At the same
time, it seems that several schools are more receptive to input and guidance from school owners,
which may indicate an improved capacity for change. Examples are that the principals, as a result of
participation in the education, have become more conscious of the use of external actors in the school´s
development work, such as the use of a leadership coach and supervisors in processes within the staff
over time, or professional input at individual workshops with staff.
4.3 Summary
The results show that, from the principal´s participation in the education to about two years after
completion, the learning culture in the selected schools has changed only slightly, either positively or
negatively. This may be due to several factors, but we point to continuity of leadership as a main
explanation. Similarities that can be identified in terms of changes in practice within the schools are
the reduction in the number of projects and priority areas, the development of well-functioning
leadership teams and an increased focus on knowledge development. Regarding external changes, we
point to input and guidance from the school owners, and the use of various external actors. The
purpose of the case studies has been to find similarities across schools and examine the significance
the principal´s participation in the leadership education has had for changes in practice. The selection
of schools is limited, so caution should be exercised in interpreting the results for each school. In the
following, the focus will more generally be on the importance of school leadership education.
5 The impact of school leadership education
We place the results of the analyzes in the two preceding chapters into a larger context in order to
say something about the importance of school leadership education in general. The leadership education
is a measure to improve the competence of principals and school leaders, and the participants
generally seem to have had a positive experience when it comes to their own development. Although
we have suggested, in the previous chapter, that the program has contributed to changes in the
principals´ practices in the schools, we still have limited empirical evidence about the importance of
leadership education for changes in leadership practice. To provide some indications of this, we have
drawn on data from the so-called TALIS survey, which was conducted in 2013 in a larger selection of
Norwegian lower and upper secondary schools (Caspersen, Aamodt, Vibe & Carlsten, 2014).
5.1 The TALIS survey
Only data from schools where at least 50 percent of the teachers answered has been included from
the TALIS survey. Initially, 200 primary schools, 200 lower secondary schools and 150 upper secondary
schools were selected. The number of schools with more than a 50 percent response rate is 144, 145
and 106 respectively. The core of TALIS is the questionnaires to teachers, but principals are also asked
questions, both about the school and about a number of conditions related to education leadership. The
reason for using the data in this context is a question put to the principals about whether they have
undergone a leadership education program or courses. In the TALIS survey, we cannot specifically
distinguish those who have attended the national leadership education, but we can give some
indications of whether such education seems to be important for how principals believe they perform
their duties. In all, 72 percent of the principals confirmed that they have participated in leadership
education or courses. The proportion is 56 percent in schools that have fewer than 10 students; other
than that, there is no clear correlation with school size. There is also no difference by school level.
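The selection rule and the comparison described here can be illustrated as follows; the column names and the toy data are assumptions, not the TALIS files.

# Toy data with assumed column names: keep schools with at least 50 percent
# teacher response, then compare the share of principals reporting leadership
# education across school types.
import pandas as pd

schools = pd.DataFrame({
    "school_type": ["primary", "primary", "lower_secondary", "upper_secondary"],
    "teacher_response_rate": [0.62, 0.41, 0.75, 0.55],
    "principal_has_leadership_education": [True, True, False, True],
})

included = schools[schools["teacher_response_rate"] >= 0.50]
share = (included.groupby("school_type")["principal_has_leadership_education"]
                 .mean() * 100)
print(share.round(0))   # percent of principals with leadership education, by school type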
5.1.1 Use of time, tasks, and academic development
The first thing we look at is how principals who have attended leadership education, and those who
have not, distribute their working time. Use of time on various tasks is shown in Table 12.
Table 12 TALIS: Principal´s use of time on various tasks, hours on average

                                Participated in leadership education
Task                            Yes     No
Internal administration         13.5    13.1
Teaching-related tasks          9.6     13.5
Contact with the students       6.0     4.7
Contact with the parents        4.8     3.6
Contact with the authorities    8.4     6.1
Other                           8.2     6.1
The time used for internal administration is about the same for both groups, while the table shows some
differences in time spent on other tasks. While those who have participated in the leadership education
spend less time on teaching-related tasks, they have somewhat more contact with students, parents and
authorities. The differences are small and may be related to other factors.
Furthermore, the principals were presented with some statements about work tasks and asked to
answer how often they carried out these activities. A comparison between those who had participated
in the school leadership education and those who had not, based on those who answered "often" or
"very often", is shown in Table 13.
Table 13 TALIS: The principal´s work tasks (proportion that have answered "often" or "very often")

                                                                      Participated in leadership education
                                                                      Yes    No
I cooperate with the teachers in order to solve behavioral
problems in the classroom                                             55     53
I observe teaching                                                    19     14
I support the teachers through developing new teaching practices     54     50
I ensure that the teachers are held responsible for the students´
learning outcomes                                                     45     40
I solve problems related to schedules                                 45     49
Where it concerns work tasks, the table shows that there are no large differences between those
who have participated in the leadership education and those who have not. However, for all
statements about different forms of academic leadership, those who have completed the leadership
education show a somewhat larger proportion who often or very often carry out these tasks. It is only
for the statement concerning problem solving in terms of scheduling that the difference points in the
opposite direction; those without leadership education have a somewhat higher score than those who
have it. This could indicate a somewhat more teaching-oriented form of leadership among those who
have participated in the leadership education.
This can also be reflected in the follow-up of teachers, which is an important topic in TALIS. Table 14
shows how often a principal conducts an assessment of teachers and gives feedback on their work.
Table 14 TALIS: Principal´s assessment of teachers

Have given an assessment of the teachers at the school    Participated in leadership education
                                                          Yes   No
Never                                                     8     18
Not every year                                            12    19
Annually                                                  62    41
Two or several times a year                               18    22
The table shows clear differences. Among those who had participated in the leadership education,
the proportion that conducts an assessment at least once annually is 80 percent, and 63 percent
among those who did not participate.
The principals were asked if they had established an academic development plan for the school. Here
there is a very apparent difference between those who had and those who had not participated in the
leadership education. While 82 percent of those who had participated in the leadership education had
established such a plan, the proportion in the other group was 63 percent.
5.1.2 Experience of school leadership
The principals also received questions on a series of problematic conditions that could hinder them
from performing the leadership job in an effective way. The comparison between those who had
completed the leadership education and those who had no such education is shown in Table 15.
Table 15 TALIS: Principal´s experience of obstacles to effective leadership

Proportion who answer "very little" or "not at all"                   Participated in leadership education
                                                                      Yes   No
Insufficient budgets and resources                                    27    17
Public regulations                                                    52    46
The teachers are absent or arrive late                                73    61
Lack of support and involvement from the parents                      72    83
Lack of support and possibilities for own professional development    80    68
Lack of support and possibilities for teachers´ professional
development                                                           69    62
High level of work pressure and responsibility at work                17    21
Lack of possibility to share the leader responsibility with
other employees                                                       62    52
The obstacles mentioned are partly external causes, partly factors within the school or the
principal's own working conditions. Consistently, there is a higher proportion of those who have
participated in leadership education who answer that the various conditions have only a moderately
negative effect. The exceptions are lack of support and involvement from the parents and high work
pressure in the job, where those who do not have the education experience these as obstacles to
effective leadership to a lesser extent. These results can be interpreted in slightly different ways, and
the causal direction cannot be determined. For example, the financial conditions may actually be better
in the schools where the principal has completed leadership education, but the answers may just as
well be a result of different abilities to handle the problem. The same may well be true of the extent to
which government regulations are an obstacle. Lack of support for professional development, for both
the principal and the teachers, is a lesser problem in instances where the principal has participated in
leadership education, but this may also reflect that the conditions actually are better facilitated. The
fact that the opportunity to share leadership responsibility is a minor problem may be because the
principal is better at implementing distributed leadership, but it can also be a result of it being easier
for the principal to participate in leadership development at such schools.
Finally, we find that those who have completed the leadership education have to a lesser degree
regretted becoming principals, and that they are more satisfied with their jobs.
5.2 Summary
Data from the TALIS survey indicate that leadership education is significant for both the
execution and the experience of the leadership job. The school leaders who participate in such
courses and programs seem to pay more attention to, and give higher priority to, teaching-oriented
leadership and closer follow-up of their own teachers. Even though the participants in the national
leadership education cannot be specifically identified in this material, the results give indications that
programs of this kind can in general create changes in how principals and other school leaders carry
out their work.
6 Central findings, goal achievement and recommendations
In this final chapter, we summarize the central findings from the entire evaluation period. In
order to create an overview, the summary is kept brief. For more in-depth analyzes tied to the
various focus areas that the evaluation has looked at, we refer to the three previous reports (Hybertsen
Lysø et al., 2011; 2012; 2013). The chapter starts with a summary of the central findings in the
evaluation. The goal achievement of the leadership education as a national initiative is then
discussed. The chapter concludes with a discussion of recommendations for changes and adjustments
to the leadership education in the future.
6.1 Central findings in the evaluation
Generally, the national leadership education has many characteristics that are typical of
development trends in modern leadership education: stronger national governance through the
establishment of standards and stated objectives, a professional content that emphasizes the school's
core tasks, and methods that allow for individual development and practice-related exercise of
leadership.
In Norway, it seems that the framework for the national leadership education is to a small degree
tied to a specific leadership direction or a particular view of leadership. The Directorate for Education
and Training has drawn strongly on research about what promotes good school leadership, but this
research generally appears very eclectic. The focus on practice is in line with the fact that leading
international research does not seem to agree on which theories are most relevant for developing
good education leadership.
The short review of international research that was done in the first report (Hybertsen Lysø et al.,
2011) provides a picture of a field of research where there is a lot of activity, but where it is difficult
to draw a complete picture of the status of the research. This is because many of the theories in the
field are conceptually rather weakly developed. The result is that falsification of hypotheses is
difficult, and that some of the findings can be interpreted from different theoretical perspectives.
At the same time, empirically based research indicates that setting up goals for students´ learning,
developing teachers´ competence and adapting the organization to the learning objectives are key
factors for students' learning (Jacobson, 2011). The research points out that such factors are
highly dependent on conditions that principals and education leaders control only to a small degree -
including demographic and personal characteristics of students, the leadership structure that the school
is subject to, and various socioeconomic and family conditions (Jacobson, 2011: 41). Thus, all good
education leadership is about finding and exploiting the latitude that is limited by these factors
(Mulford & Silins, 2011). Here the research seems to point out that there actually are opportunities
(Leithwood et al., 2010: 27-28; Robinson, 2009: 39), not least by setting high academic goals and
communicating these internally and externally; by participating in the planning and design of teaching
and learning, which has a positive effect on teacher motivation and ensures better working conditions
for them; and by giving teachers latitude and autonomy while also being clear when it comes to
assessment and feedback. Generally, the mantra seems to be that "the closer educational
leaders get to the core business of teaching and learning, the more likely they are to have a positive
impact on students' outcomes" (Robinson et al., 2009: 664). This perspective seems to pervade the
Norwegian leadership education. The emphasis on practice is a recurrent theme, from the
Directorate for Education and Training´s requirement specifications to the program providers´ design
of the specific academic provisions.
This emphasis on practice has nevertheless not made the program provisions identical. On the
contrary, the programs of the six institutions that offer the leadership education appear very
different, and the providers' different intentions are expressed in how the education programs have
been designed. As shown in Report 2 (Hybertsen Lysø et al., 2012), the programs differ considerably
in organization, number of workshops, curriculum scope and how they think practice should be
changed. Participants' experience of the leadership education is still very positive, regardless of which
course they attended, both in terms of the pedagogical programs and their own development. The
education's pedagogical quality and relevance to practice are perceived as high by all participants who
have completed the education, regardless of which provider and which cohort they belonged to.
Four complementary explanations for these results can be highlighted. Firstly, the data show that the
programs' intentions largely correspond with the participants' own expectations of the program.
Secondly, the providers of the leadership education appear to have chosen their own pedagogical
approach to organizing the provision based on their institutional characteristics, drawing on their own
experience and established expertise in education leadership and leadership training. The program
providers are perceived as competent and autonomous suppliers in the field. Thirdly, the variation
between the provisions is subordinate to the fact that the education provision exists at all, which is
perceived as an important recognition of the role of principal and school leader. Fourthly, satisfaction
with the provisions can also be explained by the leadership program being an important social arena
covering school leaders' need for support and networking. The context itself seems to matter more
than differences in other pedagogical aspects of the academic provisions.
The fact that the participants are satisfied with the provision they have attended need not, however,
mean that they have benefited greatly from the education in terms of their own development. To say
something about this, participants were asked to answer a survey both before and after the
education. Although it is methodologically challenging to measure the benefit of leadership education,
the expectations of benefit that participants had in advance appear to have largely been fulfilled
(Hybertsen Lysø et al., 2013). Meeting high expectations over time can be a big challenge, but the
program providers seem to have achieved this for all cohorts that have completed the education.
Furthermore, the participants' capacities to learn and develop - measured by their own expectations
of mastering different tasks as education leaders - seem to have increased along many dimensions.
Efficacy expectations are a key indicator of the individual capacity to develop as a leader, and affect
effort, endurance, aspiration level and goals. Through the leadership education, participants seem to
have achieved increased expectations of mastering various tasks upon completion. It must be stressed
that the positive change in efficacy expectations is relatively small, but it is also significant along a
number of dimensions. This is an indication that the leadership education's objective of developing
increased confidence in the leader role is fulfilled.
Participants' experience of benefits and changes does not seem to depend on the program they have
been affiliated with. Participation in a learning community, which all provisions strongly emphasize,
seems to be important for education leaders' capacities for learning and development. The program
provisions - in spite of differences in several pedagogical conditions - seem to create a relatively
similar change-oriented arena for learning. This learning arena is characterized by a strong element
of research-based knowledge and a close connection between theory and practice through both
academic writing and practical skills training. These overarching characteristics seem to matter more
for participants' learning than more detailed pedagogical schemes. Individual characteristics of
participants (position, age, gender and experience) do not seem to have had any particularly great
significance for the benefits reported afterwards. The programs all emphasize a balance between
individual and group-based activities, and guidance related to group processes and writing. The
change-oriented learning arena provides a basis for creating greater confidence in the role of school
leader.
Has this confidence then had consequences for how participants in the education later act as
principals? To find out more about this, a small selection of participants in the national leadership
education have been followed "home" to their own schools, and the evaluation has examined
whether the school staff feel that the learning culture at the schools has changed as a result of the
principal's participation in the education. The evaluation has also been interested in identifying
common features across these schools when it comes to changes in practice. The results show that
the learning culture of the selected schools has changed only slightly - in both positive and negative
directions. Interviews also suggest that principals and leadership teams feel that their schools changed
during the period, including how the principal performs the role, internal organizational and
pedagogical changes, and/or the relationship to school owners and external parties. For many, the
changes are nevertheless difficult to relate directly to the principal's participation in the education -
several factors may contribute to change. The common traits identified at the schools, when it comes
to changes in practice, are primarily a reduction in the number of projects and priority areas, the
development of well-functioning leadership teams, and an increased focus on knowledge
development. Here it may seem as if the leadership education has contributed to an increased
awareness of the school as an organization. Many schools also report that collaboration with school
authorities and external stakeholders has been important in changing practice.
6.2 The leadership education's goal achievement
Parliament Report No. 31 (2007-2008) «Quality in Education» advocated a national leadership
education for school principals. The rationale for the initiative was to strengthen the quality of
Norwegian education, where education leadership was considered an important factor. The
Directorate for Education and Training has emphasized that the education provision should be
closely tied to school owners, as it is the latter actor who is responsible for the individual school
(Hybertsen Lysø et al., 2011: 47).
Regarding the organization of the education, a central condition was that the leadership education
should be closely linked to practice - and not just be a theoretical study. Moreover, the program could
form part of a more comprehensive master's education if participants wanted further
supplementation.
After completing the leadership education, the participants should be able to apply research-based
knowledge, information about their own school and their own experiences as a basis for
organizational development and improvement of students' learning outcomes. Increased knowledge
about the relationship between students' learning environment and learning outcomes was
considered especially relevant for improving student learning outcomes. Furthermore, confidence in
the leadership role was underlined, and a number of competency requirements were identified, with
information about learning outcomes, leadership and administration, cooperation and organizational
development, as well as development and change, especially emphasized. In summary, one can say
that the leadership education had three general objectives:
- Contribute to strengthening the quality of Norwegian education in general, not least by tying school
owners to the education
- Contribute to strengthening school leadership education in Norway, not least by connecting
theoretical and practical leadership training
- Strengthen the leadership competence of the participants, not least by giving them increased
confidence in the leader role.
Whether the national leadership education has helped strengthen the quality of Norwegian schools
in general is not something this evaluation can answer well. The reason is simply that it is difficult to
develop an empirical basis for answering this question. For example, it is difficult to identify a causal
relationship between the school leader education and student learning outcomes - if one sees this as
a measure of quality. At the same time, the leadership education has had a good number of
applicants, and many education leaders in Norway have strengthened their expertise since its
inception. There is good reason to argue that this provides a basis for enhancing the quality of
Norwegian education - at least indirectly. The data from the TALIS survey included in this report
indicate that principals attending courses and programs in education leadership prioritize their time
and work tasks somewhat differently from principals who have not participated in such provisions.
Not least, education leadership seems to provide a stronger focus on teaching-oriented leadership and
better follow-up of teachers - priorities that research has shown to have a positive impact on students'
learning.
At the same time, the findings we made when we followed the principals "home" to their own
schools show that there is no simple connection between principals' competence and change and
development in the school. It should be emphasized that quality development in the school depends
on the interaction between several factors, and education leadership alone is not always sufficient.
Not least, one can argue that the relationship between the school leadership and the school
authorities is important for improving the quality of the school. The ambition to connect school
owners closely to the education is only partly fulfilled. Although school owners today have a central
role in the selection of candidates for the education, the institutions organizing the education
provisions report that it is at times difficult to engage school owners during the education. The fact
that many participants must attend the education while working full time, without having time set
aside for it, is also an indication that the commitment of the school owners to this education provision
could have been stronger.
The national program has, however, clearly strengthened the leadership education in Norway in
different ways. Interviews with representatives of the program providers clearly indicate that many
institutions have actively conducted development work in connection with the national leadership
education - a development that has also had an impact on the other education leadership provisions
available at the education institutions. The fact that the Directorate for Education and Training has
arranged regular seminars for representatives of the providers also seems to be positively received.
Here the Directorate has created an arena where education institutions can exchange experiences
and views on the education provisions and on the experience gained as more cohorts have completed
the programs. Interviews with representatives of the education institutions also show that the
experience of linking up with partners from independent consultancy/expert environments has been
positive. Through the formal requirement to have such partners, new connections between
theoretical and practical leadership education have been created. One interesting aspect of the
national program is that the creation of a national provision does not seem to have unified the
leadership education. The provisions offered by the various institutions are, as mentioned, different,
without this having great significance for the benefits to the participants. The conclusion is
nonetheless that the program has undoubtedly helped to enhance participants' competence in school
leadership.
6.3 Implications and recommendations
It has already been decided that the current leadership education will be continued in the years to
come. Based on the results from this evaluation, it is possible to point to some implications and
recommendations that the Directorate for Education and Training and future providers of this
education should reflect on.
The fact that there actually is a national leadership education has previously been pointed out as a
factor that should not be underestimated (Hybertsen Lysø et al., 2012). The provision is a recognition
of the school leader role, both nationally and locally - something the participants in the education
have underlined. When participants are very satisfied with the education they have attended, this
may be because the very existence of a provision of good quality has a stronger effect than the
pedagogical design of the specific provisions. That the education is a national initiative may also affect
both providers and participants, as an additional recognition that can create both motivation and
increased interest in school leadership - not least among those who do not currently hold a principal
position in education, but are considering this as an alternative. Many of the participants in the
leadership education did not hold principal positions (though they held other leader roles) while they
attended the education, which may indicate that the current leadership education also serves as a
recruitment channel. Keeping the future leadership education open to those who do not hold a
leadership position in the school, but who are considering one, therefore seems sensible.
The framework conditions for attending the leadership education appear, however, to vary greatly,
and two factors come into play. Firstly, there are large variations between the participants in terms of
time off for the leadership education. The evaluation has revealed that many of the participants find
the program demanding to complete while working full time, and it is not inconceivable that more
would have completed the education if their everyday work had been better arranged for
participation. Secondly, such framework conditions concern the interest school owners take in the
leadership education. Here, too, the variations seem substantial. Measures that can contribute to a
stronger commitment from school owners, both to the participants and to the education, would
therefore be appropriate. The Directorate for Education and Training has in various ways attempted
to connect school owners to the education, but more can arguably be done. Such a commitment from
school owners can have several benefits - both strengthening school owners' knowledge of and
interest in the school and school leadership, and enabling a more coordinated effort related to later
changes in the schools.
The fact that school owners should be connected more closely to the national leadership education
also suggests that the education should be seen more in the context of other development measures
that the Directorate for Education and Training and the school owners implement. As discussed in the
previous chapter, competence in education leadership seems to be an important, but hardly a
sufficient, condition for creating change in schools. Many other factors come into play and require
capacity for change. Not least, it can be questioned whether participation in the leadership education
risks becoming a "single action" that does not give participants the long-term support they need in
their leadership. When several of the participants in the leadership education highlight the
importance of the social network they have acquired through the program, several implications can
be drawn from this. Could it be that principals have a more lasting need for support and an arena
where they can talk about, reflect on, and receive and give advice on the direction of their schools?
Is there a need for a more solid and systematic network to safeguard participants' needs after
finishing the education? Could the future leadership education include more regular follow-up
meetings among participants?
Without it being possible to draw firm conclusions from the few schools that have been followed up
in this evaluation, it is interesting that where the entire leadership team at a school has attended the
leadership education, a stronger force for change seems to have been created than where schools
only had individual participants. In the future, the knowledge base could be strengthened by
increasing the number of participants from selected schools in order to test this more systematically.
Having several people from the same school attend the same leadership education is, however, not
unproblematic. These individuals must also be given time off to attend the program, which can
collectively create major challenges for the individual school. An advantage would be that they would
create a strong and dense network locally. On this basis, one might experiment with the scope and
organization of the current leadership education in the future. Norwegian schools are diverse along
several dimensions, and being a school leader clearly reflects this diversity. The fact that the current
education can be built into a longer education program is undoubtedly an advantage for those who
want to strengthen their expertise in this field further. At the same time, one can also imagine that
the current leadership education might have more flexibility built into it in terms of how it is
implemented. If many school leaders must attend the leadership education in addition to a full-time
job, an alternative approach might be to offer a more flexible education in the coming years.
References
Argyris, C. & Schön, D. A. (1996). Organizational Learning II: Theory, Method and Practice. Reading,
Mass: Addison Wesley.
Babbie, E. R. (2004). The practice of social research. Belmont, California: Thomson/Wadsworth.
Caspersen, J., Aamodt, P.O., Vibe, N. & Carlsten, T.C. (2014). Kompetanse og praksis blant norske
lærere. Resultater fra TALIS-undersøkelsen i 2013. Oslo: NIFU-rapport 41
Federici, R. A., & Skaalvik, E. M. (2011). Principal self-efficacy and work engagement: assessing a
Norwegian Principal Self-Efficacy Scale. Social Psychology of Education, 14(4), 575-600. doi:
10.1007/s11218-011-9160-4
Federici, R. A., & Skaalvik, E. M. (2012). Principal self-efficacy: relations with burnout, job satisfaction
and motivation to quit. Social Psychology of Education, 15(3), 295-320. doi: 10.1007/s11218-012-9183-5
Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: an introduction. Boston: Allyn and
Bacon.
Hybertsen Lysø, I. (2014). Tett på lederutvikling. In Klev, R. & Vie, O.E. (Eds.), Et praksisperspektiv på
ledelse. Oslo: Cappelen Damm Akademisk.
Jacobson, S. (2011). Leadership effect on student achievement and sustained school success.
International Journal of Educational Management, 25, 33-44.
Kirkpatrick, D. L. (1998). Evaluating Training Programs: the Four Levels (2nd ed.). San Francisco, CA:
Berrett-Koehler.
Leithwood, K., Patton, S. & Jantzi, D. (2010). Testing a conception of how school leadership influences
student learning. Educational Administration Quarterly, 46, 671-706.
Lysø, I. H., Stensaker, B., Aamodt, P. O. & Mjøen, K. (2011). Ledet til ledelse: Nasjonal
rektorutdanning i grunn- og videregående skole i et internasjonalt perspektiv. Delrapport 1 fra
Evaluering av den nasjonale rektorutdanningen. Oslo: NIFU rapport 2011.
Lysø, I. H., Stensaker, B., Røthe, R., Federici, R., Olsen, M. S. & Solem, A. (2012). Ledet til lederutvikling:
Nasjonal rektorutdanning i grunn- og videregående skole; forskjeller og likheter mellom de seks
programtilbudene. Delrapport 2 fra Evaluering av den nasjonale rektorutdanningen. Oslo: NIFU
rapport 2012.
Lysø, I. H., Stensaker, B., Federici, R., Solem, A. & Aamodt, P. O. (2013). Ledet til læring: Nasjonal
rektorutdanning i grunn- og videregående skole; deltakernes vurdering av egen utvikling.
Delrapport 3 fra Evaluering av den nasjonale rektorutdanningen. Oslo: NIFU rapport 2013.
Marsick, V.J. (2013). The Dimensions of a Learning Organization Questionnaire (DLOQ): Introduction
to the Special Issue Examining DLOQ Use Over a Decade. Advances in Developing Human
Resources, 15(2), 127-132.
Marsick, V. & Watkins, K.E. (1999). Facilitating learning organizations: Making learning count.
London: Gower Press.
Mintzberg, H. (2009). Managing. San Francisco: Berrett-Koehler Publishers.
Mulford, B. & Silins, H. (2011). Revised models and conceptualization of successful school
principalship for improved student outcomes. International Journal of Educational Management,
25, 61-82.
OECD (2008). Improving School Leadership: Policy and Practice. Organisation for Economic
Cooperation and Development, Paris.
Ringdal, K. (2007). Enhet og mangfold: Samfunnsvitenskapelig forskning og kvantitativ metode.
Bergen: Fagbokforlaget.
Robinson, V., Hohepa, M. & Lloyd, C. (2009). School leadership and student outcomes: Identifying what
works and why (Best Evidence Synthesis Iteration). Auckland: University of Auckland.
Schaufeli, W. B., & Bakker, A. (2004). UWES Utrecht work engagement scale. Preliminary Manual.
Møller, J., Sivesind, K., Skedsmo, G. & Aas, M. (2006) Skolelederundersøkelsen 2005. Om
arbeidsforhold, evalueringspraksis og ledelse i skolen. Oslo: Universitetet i Oslo.
Stortingsmelding nr. 30 (2003-2004). Kultur for læring. Oslo: Statens Trykksaktjeneste.
Stortingsmelding nr. 31 (2007-2008). Kvalitet i skolen. Oslo: Statens Trykksaktjeneste.
Watkins, K. E. & Marsick, V. (1997). Dimensions of the learning organization. Warwick, RI: Partners
for the Learning Organization.
Watkins, K.E. & Dirani, K.M. (2013). A Meta-Analysis of the Dimensions of a Learning Organization
Questionnaire: Looking Across Cultures, Ranks, and Industries. Advances in Developing Human
Resources, 15(2), 148-162.
Western, S. (2008). Leadership – A critical text.
List of Tables
Table 1 Overview of selection distributed by pre- and post-tests and by three cohorts ..................... 18
Table 2 Overview of selection who answered both pre- and post-tests .............................................. 19
Table 3 Overview population and answers distributed by program provider ...................................... 20
Table 4 Overview of number of respondents pre- and post-tests distributed by schools ................... 24
Table 5 Average organizational learning for Cohort 1-3 ....................................................................... 29
Table 6 Average efficacy expectations for Cohort 1-2 and Cohort 1-3 ................................................. 30
Table 7 "Application" and "Time spent on administration" (ANOVA) .................................................. 31
Table 8 Average "Application" and "Time spent on administration" for Cohort 1-3 ............................ 32
Table 9 Overall scores on dimensions of organizational learning pre- and post-tests .......................... 34
Table 10 Total - Performance goals, change/development, and latitude, view of job ......................... 36
Table 11 Schools - performance goals, change/development, and latitude, view of the job .............. 37
Table 12 TALIS: Principal´s use of time on various tasks, hours on average ......................................... 44
Table 13 TALIS: The principal´s work tasks............................................................................................ 45
Table 14 TALIS: Principal´s assessment of teachers .............................................................................. 45
Table 15 TALIS: Principal´s experience of obstacles to effective leadership......................................... 46
Table 16 T-test dimensions from organizational learning Cohort 2 and 3 ............................................ 62
Table 17 Schools´ scores on dimensions of organizational learning pre- and post-tests ..................... 63
Table 18 Schools´ scores on performance goals (competency areas school leadership) ..................... 64
Table 19 Schools´ scores on knowledge types that contribute to change and development .............. 64
Table 20 Schools´ scores on latitude ..................................................................................................... 65
List of Figures
Figure 1 Evaluation level – (cited in Kirkpatrick, 1998) ......................................................................... 16
Figure 2 Dimensions of organizational learning - general changes....................................................... 28
Figure 3 Dimensions of organizational learning per cohort (post-test) ................................................ 29
Figure 4 The participants´ efficacy expectations - Inclusion of Cohort 3 .............................................. 31
Figure 5 Combined - change of dimensions of organizational learning pre- and post-tests ................ 35
Figure 6 The schools - change of dimensions for organizational learning ............................................ 35
Figure 7 Total - Performance goals, change/development, and latitude, view of the job ................... 37
Figure 8 Schools - performance goals, change/development, latitude, view of the job ...................... 38
Figure 9 Schools - performance goals, change/development, latitude, view of the job ...................... 39
Figure 10 Schools - application of knowledge types and change- and development work .................. 39
Appendix
Instrument for organizational learning
On a scale from 1 to 7, how often do the following occur at your school? Take as a starting point your
own work day, and place only one check mark per statement.
At this school......
1. ... the employees openly discuss the challenges connected to the work in order to learn from
them.
2. ... the employees identify the abilities and traits they need to solve future tasks
3. ... the employees help each other learn.
4. ... the employees can get economic and other resources to support their own teaching
5. ... the employees get spare time for their own learning
6. ... challenges are considered a possibility to learn more
7. ... the employees are rewarded for learning
8. ... the employees give open and honest feedback to each other
9. ... the employees listen to others´ points of view before they state their own ideas
10. ... the employees are encouraged to ask questions regardless of their position
11. ... the employees ask for others´ opinions when they promote their own opinions
12. ... the employees treat each other with respect
13. ... the employees spend time to build trust in each other
14. ... the teams have freedom to set their own goals
15. ... the teams treat their members equally, independent of education, position or cultural
background
16. ... the teams focus both on the tasks and how the group itself cooperates
17. ... the teams change how they think, should a group discussion or gathered information imply this
18. … the teams trust that the school will act in accordance with the recommendations they give
20. ... two-way communication is used on a daily basis
21. ... we have routines that make it easier for the employees to get information
22. ... we have available, updated and open information about the employees´ competence and traits
23. … we have routines for investigating the differences between the actual and the expected
performances of the employees
24. ... individual employees´ experiences are made available for all
25. ... we have routines for assessing the benefit of the employees' academic development and
updating
26. ... employees who take extra initiative gain recognition
27. ... the employees have choices when it comes to how the work is to be done
28. ... the employees are invited to participate in developing the school´s vision and platform
29. ... the employees control the resources which are required to execute good work
30. ... support is given to employees who dare to take risks
31. ... the visions from the various teams and groups are synchronized
32. ... a good foundation is laid for a good balance between job and spare time
33.... the employees are encouraged to think from a global perspective
34.... the employees are encouraged to consider the students´ perspectives in the decision-making
process
35. ... consideration is given that decisions can influence employee work morale
36. ... there is cooperation with environments outside the school to promote common goals
37. ... the employees are encouraged to search for solutions from all colleagues when they face
problems or challenges
38. ... the employees are generally supported in their desires for academic improvement and
continuation training
39. ... information is shared about changes in the environment or organizational changes
40. ... the employees are encouraged and supported in executing the school's vision
41. ... the employees receive supervision regarding academic matters
42. ... there is a continuous focus on the opportunities to learn
43. ... only decisions that coincide with the school's values are made
Think about and take a position on the following claims:
44. The students have, all in all, had a greater learning outcome this year than the year before.
45. The school has, in its entirety, been managed and administrated better this year compared with
the year before.
46. Cooperation, organizational building and supervision of teachers have, overall, been better this
year than the year before.
47. The school has had a more positive development this year compared with the previous year.
48. At our school change and development are mainly based on research and statistics
49. At our school change and development are mainly based on experience and practice
50. At our school change and development are mainly based on visions and ideals
51. I have had greater freedom to prioritize what I am to use my time on this year compared with last
year
52. I have had greater freedom to prioritize the work tasks I find important this year compared with
last year
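As an illustration of how responses to an instrument of this kind can be turned into the dimension scores used in Tables 16 and 17, the sketch below (in Python) averages a respondent's 1-7 item answers within each dimension. The item-to-dimension mapping and the answers shown are purely hypothetical and are not the instrument's actual scoring key or the evaluation's data.

from statistics import mean

# Hypothetical mapping from dimensions to item numbers (illustrative only).
DIMENSIONS = {
    "Continuous learning": [1, 2, 3, 5],
    "Dialogue and exploration": [8, 9, 10, 11],
    "Cooperation and team learning": [14, 15, 16, 17],
}

# Hypothetical answers from one respondent: item number -> score on the 1-7 scale.
answers = {1: 5, 2: 4, 3: 6, 5: 4, 8: 5, 9: 5, 10: 6, 11: 4, 14: 5, 15: 6, 16: 5, 17: 4}

# Dimension score = mean of the respondent's answers to that dimension's items.
scores = {dim: mean(answers[i] for i in items) for dim, items in DIMENSIONS.items()}

for dim, score in scores.items():
    print(f"{dim}: {score:.2f}")

Averaging such respondent-level dimension scores across a cohort or a school gives values of the kind reported in the tables that follow.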
Analyses of the participant survey
The table shows a t-test of the average pre- and post-test scores for Cohorts 2 and 3, as well as the results of the test.
Table 16 T-test dimensions from organizational learning Cohort 2 and 3
Variable                         Test        Average   Std. dev.      t       p    Cohen's d
Continuous learning              Pre-test      4.58      0.89
                                 Post-test     4.94      0.96      -5.509   .000     0.39
Dialogue                         Pre-test      4.94      0.86
                                 Post-test     5.29      0.92      -5.345   .000     0.39
Cooperation and team             Pre-test      4.81      0.87
                                 Post-test     5.07      0.98      -3.903   .000     0.28
Knowledge sharing                Pre-test      4.18      0.97
                                 Post-test     4.60      0.95      -6.041   .000     0.44
Inclusion and common vision      Pre-test      5.13      0.78
                                 Post-test     5.42      0.78      -5.365   .000     0.38
Systematic thinking              Pre-test      4.78      0.82
                                 Post-test     5.10      0.89      -4.995   .000     0.38
Leadership, roles, and support   Pre-test      5.22      0.84
                                 Post-test     5.59      0.87      -5.909   .000     0.43
The table shows that the respondents score significantly higher on all seven dimensions of
organizational learning on the post-test compared with the pre-test. The effect size Cohen's d
indicates whether an effect is small (0-0.29), moderate (0.3-0.49) or large (>0.5). From the table one
can read that the effects are mostly moderate, and all are significant.
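To illustrate how figures of this kind can be computed, the following minimal sketch in Python (with hypothetical scores, not the evaluation's actual data or analysis scripts) runs a paired t-test and calculates Cohen's d from the difference scores for one dimension.

import numpy as np
from scipy import stats

# Hypothetical pre- and post-test scores (1-7 scale) for the same respondents.
pre = np.array([4.2, 4.8, 5.0, 3.9, 4.6, 5.1, 4.4, 4.9])
post = np.array([4.6, 5.1, 5.3, 4.4, 4.8, 5.5, 4.7, 5.2])

# Paired (dependent samples) t-test: is the mean change different from zero?
t, p = stats.ttest_rel(pre, post)

# Cohen's d based on the difference scores (one common variant; the report does
# not state which formula was used).
diff = post - pre
d = diff.mean() / diff.std(ddof=1)

print(f"mean pre = {pre.mean():.2f}, mean post = {post.mean():.2f}")
print(f"t = {t:.3f}, p = {p:.3f}, Cohen's d = {d:.2f}")

With the rule of thumb above, d values around 0.3-0.5 would be read as moderate effects.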
Analyses of the school cases
The scores on the dimensions of organizational learning for the eight schools that answered both pre- and post-tests are presented in Table 17.
Table 17 Schools´ scores on dimensions of organizational learning pre- and post-tests
School                                   1      2      3      4      5      6      8      9
Continuous learning
  Pre-test                            4.43   4.76   4.35   3.69   4.91   4.64   5.29   4.55
  Post-test                           4.46   4.20   4.61   4.25   4.57   4.73   5.35   4.21
  Change                              0.03  -0.57   0.26   0.57  -0.34   0.09   0.06  -0.34
Dialogue and exploration
  Pre-test                            5.28   5.09   4.78   4.64   5.30   4.99   5.39   5.36
  Post-test                           5.13   4.80   5.30   4.88   4.83   5.39   5.57   4.71
  Change                             -0.15  -0.29   0.52   0.25  -0.47   0.40   0.18  -0.65
Cooperation and team learning
  Pre-test                            5.22   4.87   4.52   3.92   5.11   4.94   4.98   5.36
  Post-test                           4.82   4.44   4.86   4.65   4.66   5.36   4.95   4.44
  Change                             -0.40  -0.43   0.34   0.73  -0.45   0.42  -0.02  -0.92
Knowledge sharing
  Pre-test                            4.53   4.46   3.89   3.43   4.42   4.24   4.73   4.68
  Post-test                           4.00   4.04   4.49   4.00   4.25   4.53   5.19   3.56
  Change                             -0.53  -0.42   0.60   0.57  -0.18   0.29   0.46  -1.12
Inclusion and common vision
  Pre-test                            5.28   4.79   4.39   4.18   5.13   4.81   5.10   5.52
  Post-test                           4.64   4.25   4.68   4.26   4.73   5.21   5.29   4.81
  Change                             -0.63  -0.54   0.28   0.09  -0.40   0.41   0.19  -0.70
Systematic thinking
  Pre-test                            4.49   4.80   4.47   3.58   4.41   4.26   4.89   4.97
  Post-test                           4.26   4.32   4.61   4.02   4.30   5.03   5.24   3.98
  Change                             -0.23  -0.48   0.14   0.45  -0.11   0.77   0.35  -0.99
Leadership, role models and support
  Pre-test                            4.90   5.16   4.64   3.97   5.26   4.77   4.94   5.03
  Post-test                           4.67   4.41   4.85   4.59   4.64   4.86   5.50   4.34
  Change                             -0.23  -0.75   0.21   0.62  -0.62   0.09   0.56  -0.68
The table shows that the differences between the schools are small, and that the scores for each
school lie fairly close together across all dimensions. Table 18 shows the schools' scores on
performance goals, and Table 19 shows the schools' scores for the knowledge types that contribute to
change and development.
Table 18 Schools´ scores on performance goals (competency areas school leadership)
School                                   1      2      3      4      5      6      8      9
Students' learning outcomes
  Pre-test                            4.55   4.50   4.12   3.95   4.09   4.31   4.43   4.31
  Post-test                           4.17   3.91   3.99   4.48   4.00   5.09   4.86   3.56
  Change                             -0.38  -0.59  -0.13   0.53  -0.09   0.78   0.43  -0.75
Leadership and administration
  Pre-test                            4.42   5.22   4.08   3.05   4.50   4.39   4.43   5.50
  Post-test                           4.19   3.79   4.10   4.13   4.11   4.55   4.57   4.69
  Change                             -0.23  -1.43   0.02   1.08  -0.39   0.16   0.14  -0.81
Cooperation, organization building and supervision of teachers
  Pre-test                            4.10   4.67   4.09   3.38   4.57   4.21   4.14   4.54
  Post-test                           3.93   3.60   4.20   4.22   4.00   4.45   4.86   4.69
  Change                             -0.16  -1.07   0.11   0.84  -0.57   0.24   0.71   0.15
Development of the school
  Pre-test                            4.42   5.17   4.27   3.19   4.45   4.44   4.50   5.07
  Post-test                           4.47   3.67   4.11   4.39   4.08   4.91   5.00   4.38
  Change                              0.05  -1.50  -0.16   1.20  -0.38   0.47   0.50  -0.70
Table 19 Schools´ scores on knowledge types that contribute to change and development
School                                   1      2      3      4      5      6      8      9
Research and statistics
  Pre-test                            3.59   3.81   3.94   3.90   3.83   4.55   3.79   4.00
  Post-test                           3.50   3.61   4.01   4.17   3.73   4.45   3.14   3.69
  Change                             -0.09  -0.21   0.08   0.27  -0.10  -0.09  -0.64  -0.31
Experience from practice
  Pre-test                            4.91   4.88   4.36   3.76   4.76   4.17   5.14   5.23
  Post-test                           4.60   4.18   4.53   4.43   4.15   4.09   5.29   4.56
  Change                             -0.31  -0.70   0.17   0.67  -0.61  -0.08   0.14  -0.67
Visions and ideals
  Pre-test                            4.38   4.28   4.09   4.57   4.41   4.57   5.29   4.38
  Post-test                           4.27   4.46   4.35   4.48   4.00   4.45   4.71   4.38
  Change                             -0.11   0.18   0.26  -0.09  -0.41  -0.12  -0.57  -0.01
The schools' total scores show that latitude has changed only weakly. The staff's experience of
latitude at the schools is presented in Table 20.
Table 20 Schools´ scores on latitude
School                                   1      2      3      4      5      6      8      9
Latitude 1
  Pre-test                            3.58   3.95   3.72   2.52   3.29   3.63   4.07   3.57
  Post-test                           3.65   3.57   3.58   3.13   3.50   3.91   4.86   3.19
  Change                              0.06  -0.38  -0.14   0.61   0.21   0.28   0.79  -0.38
Latitude 2
  Pre-test                            3.65   3.91   3.82   2.52   3.21   3.50   4.21   3.36
  Post-test                           3.94   3.68   3.75   2.96   3.52   4.00   4.71   3.31
  Change                              0.29  -0.23  -0.06   0.43   0.31   0.50   0.50  -0.04
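The change scores in Tables 17-20 are simply each school's post-test average minus its pre-test average. As a minimal sketch of that calculation (in Python, with hypothetical respondent-level data rather than the evaluation's own material):

import pandas as pd

# Hypothetical respondent-level scores (1-7 scale) for two schools, pre and post.
data = pd.DataFrame({
    "school": [1, 1, 1, 2, 2, 2, 1, 1, 1, 2, 2, 2],
    "wave":   ["pre"] * 6 + ["post"] * 6,
    "score":  [4.4, 4.6, 4.3, 4.8, 4.7, 4.9, 4.5, 4.7, 4.6, 4.3, 4.2, 4.4],
})

# Average per school and wave, then change = post-test average - pre-test average.
means = data.groupby(["school", "wave"])["score"].mean().unstack("wave")
means["change"] = means["post"] - means["pre"]
print(means.round(2))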