Evidence Brief for Illinois CSI Theory of Action and Systemwide Continuous Improvement
Introduction
Based on an extensive review of the research, the Illinois Center for School Improvement (Illinois CSI) created a
theory of action for its work:
IF a district (a) establishes interdependent, data-informed collaborative teams at all levels, (b) implements an
intentional data system, (c) puts into operation a monitoring system with feedback loops, and (d) implements
focused work deeply and with fidelity,
THEN collective adult practices and student performance will improve and
RESULT in increased student achievement and students being college and career ready.
The continuous improvement approach crafted by Illinois CSI includes a determination of prioritized needs,
planning for improvement based on research, evidence, and innovative thinking, implementation of a focused plan
with worthy targets, progress monitoring of collective adult practices and student performance, and evaluation
of the results and process. This approach relies on interdependent, data-informed collaborative teams at the
district, school, and classroom levels and an intentional data system that informs each phase of the continuous
improvement approach.
This brief describes the research evidence base
for the Illinois CSI theory of action and systemwide
continuous improvement cycle, illustrated in the figure.
The brief specifically addresses the following:
•• Interdependent team structures
•• Data to inform decisions
•• Direction and focus
•• Deep implementation with fidelity
•• Systemic progress monitoring and feedback
Interdependent Team Structures
DISTRIBUTED LEADERSHIP
•• Studies have suggested that when principals work with a team of teacher leaders, forming a school-based
leadership team, the speed at which transformational efforts occur is increased (Yager, Pedersen, & Yager,
2010).
•• Research indicates that district leadership plays a key part in positive change and should be embedded in
district improvement practices and strategies to make immediate and significant changes (Herman et al., 2008).
•• Leadership is an important element for districtwide improvement, and the ability of district leaders to
build capacity through a culture of candor and trust is integral for sustained improvement and student
achievement (Kirtman, 2012).
•• Longitudinal research suggests that distributed leadership, in which each member of a team can “call the
shots” in his or her area of expertise, is an ideal leadership design (“Organizational Researchers
Honor,” 2013).
•• Distributed leadership is a framework that helps describe how leadership practices work in a district or
school. It moves beyond the actions and beliefs of individual positional leaders and looks at the complex
and interconnected interaction of leaders as they influence instructional practice (Henry, 2009).
•• Elmore (2004) redefines leadership as the “improvement of instructional practice and performance,
regardless of role,” recognizing that improvement is everyone’s responsibility, at all levels of the district
and in all districts, and that it requires a common approach and focus across all programs, departments,
and offices within the district.
•• Leadership should be viewed as the cumulative activities of a broad set of leaders, both formal and informal,
rather than as one positional leader (Spillane, Halverson, & Diamond, 2004).
TEAM SIZE
•• As a team gets bigger, the number of links that need to be managed among members increases at an
accelerating, almost exponential rate (Hackman, 2004).
•• Effective teams should comprise nine members or fewer (Hackman, 2004).
•• Teams should be stable and not change their composition frequently, but they should be given time to settle
in together (Hackman, 2004; Wharton School of the University of Pennsylvania, 2009).
•• Fear of seeming exclusionary, or putting people on the team for purely political reasons, creates a
dysfunctional team. Putting together a team involves some callous decisions about membership; not
everyone who wants to be on the team should be included, and some individuals may need to be forced off
(Hackman, 2004).
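The link-management point above can be made concrete. The number of pairwise links among n team members follows the handshake formula n(n − 1)/2, so coordination cost grows quadratically rather than linearly. The short Python sketch below is illustrative only (the formula, not the code, comes from the literature) and shows why a nine-member ceiling is plausible:

```python
def pairwise_links(n: int) -> int:
    """Number of member-to-member links in a team of n people: n(n-1)/2."""
    return n * (n - 1) // 2

# Coordination burden grows quadratically with team size:
for size in (4, 9, 15, 25):
    print(f"{size:2d} members -> {pairwise_links(size):3d} links")
# A 9-member team has 36 links to manage; a 25-member group has 300.
```

Each new member must maintain a link with every existing member, which is why adding the tenth person costs more coordination than adding the fifth.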
TEAM ROLES
•• Teams with a person who challenges the group and opens up new ideas will outperform teams without them
(Hackman, 2004).
•• The most powerful predictor of a successful team is emotionally stable members. By contrast, anxious
individuals (those who are easily agitated, worry a lot, or have a strong temper) are generally bad for a
team (Supovitz & Klein, 2002).
SYSTEMIC APPROACH
•• Effective and linked administrative leadership at every level is key to the success of any systemic change
initiative. Leaders should be specifically trained to guide systemic change, be held accountable for the
development of planned changes, work together effectively, and sit at key decision-making tables when
budget and other fundamental decisions are discussed (Adelman & Taylor, 2007).
•• Analyses correlating measures of systemic work and student outcomes across 23 districts and 49
schools in five states show a statistically significant relationship between increased capacity to work
systemically and student achievement in 2003 and 2004 (SEDL, 2014).
•• A key determinant to successful change within any large organization is the extent to which the language of the
reform is shared across members of the organization (Supovitz & Weathers, 2004, citing Daft & Huber, 1987).
•• An evaluation report by Sam and Riggan (2013) on Cincinnati Public Schools supported by the General
Electric Foundation’s Developing Futures program evaluated the district’s capacity to support systemwide
instructional improvements. Specific practices identified in the report that have taken root in Cincinnati
Public Schools to foster internal and external engagement are as follows:
§§ Formal organizational structures to encourage horizontal and vertical collaboration, with expert cadres and
learning teams that are prepared, knowledgeable, and ready to participate fully
§§ Input in planning and decision making
§§ Open and transparent communication
§§ Boundary spanners—people who connect and communicate with other stakeholders and serve as a
bridge between and among groups
§§ Leveraging resources to support change efforts
TEAM PERFORMANCE
•• Five conditions—real team, compelling direction, enabling structure, supportive context, and competent
coaching—enhance team performance effectiveness (Hackman, 2002).
•• Highly competent coaching cannot reverse the impact of a flawed team design (Hackman & Wageman, 2005).
•• In a study of self-managing teams, team design (see the five conditions) was found to be four times as
influential as leader coaching in affecting a team’s level of self-management and almost 40 times as
powerful in affecting team performance (Wageman, 2001).
•• Coaching to build capacity significantly helped well-designed teams take advantage of their favorable
circumstances but made almost no difference for poorly designed teams (Wageman, 2001).
Data-Driven Decisions
•• Existing research on using data to make instructional decisions does not yet provide conclusive evidence of
what works to improve student achievement (Hamilton et al., 2009).
•• Districts that have demonstrated improvement use data to monitor results, to make instructional and
resource allocation decisions, and to support accountability (Shannon & Bylsma, 2004).
•• Research studies report a range of data types and uses, including (1) creating a data-driven culture; (2) using
multiple measures of student and school performance and organizational condition information; (3) defining
the responsibility of districts to collect, analyze, and provide easy-to-use and accessible data to schools;
(4) providing professional development to district and school staff in interpreting and using data in decision
making; (5) analyzing data for planning; and (6) establishing a system to manage the data. Across all of these,
data use was found to significantly support systemwide improvement (Shannon & Bylsma, 2004, as
cited by the Research and Evaluation Office at the Washington Office of Superintendent of Public Instruction).
•• Data collection is one of three “cross-cutting levers” districts have that effect change (McLaughlin & Talbert,
2003).
•• Data-driven decision making, with teachers, principals, and administrators collecting and analyzing various
types of data to make decisions to help improve the success of students and schools, guides improvement
across all levels of the education system and holds individuals, schools, and districts accountable (Marsh,
Pane, & Hamilton, 2006).
•• Student achievement increases substantially in schools with collaborative work cultures that foster a
professional learning community among teachers and others, focus continuously on improving instructional
practice in light of student performance data, and align standards and staff development support (Fullan, 1998).
Direction and Focus
•• Districts need to develop data-driven, high-reliability district systems that include clear, no-excuses goals for
teaching and learning (Goodwin, 2010).
•• There is a statistically significant relationship between how district leaders, central office staff, and teachers
perceive district leadership-related variables and student achievement. The following leadership behaviors
focused on districtwide goals significantly correlated to student achievement (Waters & Marzano, 2006):
1. Engaging in collaborative goal setting
2. Establishing non-negotiable goals for achievement and instruction
3. Ensuring board alignment and support of district goals
4. Monitoring goals for achievement and instruction
5. Using resources to support instruction and achievement goals
•• Districts and schools should stop trying to address every problem with a unique solution and focus their
improvement plans on systemic strategies that are small enough to be manageable but large enough to
make a difference in student achievement (SEDL, 2014; SRI International, 2009).
•• To increase the probability of successfully improving student achievement in low-performing systems, the
district needs first to concentrate its efforts on aligning curriculum, instruction, and assessment to state
standards (SEDL, 2014; SRI International, 2009).
•• Leaders at all levels of the system (including teacher leaders) need to support the selected focus for
improvement so that the resources of time, personnel, and energy are targeted on that focal point (SEDL,
2014; SRI International, 2009).
•• Goal-setting theory states that the source of motivation is the desire and intention to reach a goal (Redmond
& Menet, 2014). If individuals or teams find that their current performance is not achieving desired goals, then
they typically become motivated to increase effort or change their strategy (Locke & Latham, 2006). Five
goal-setting principles that improve the chances of success were identified: clarity, challenge, commitment,
feedback, and task complexity.
•• Hattie (2009) developed a way of ranking various influences in different meta-analyses according to
their effect size. He ranked those influences that are related to learning outcomes from very positive
effects to very negative effects on student achievement. Hattie found that the average effect size of all
the interventions he studied was 0.40. Therefore, he decided to judge the success of influences relative
to this “hinge point” to find an answer to the question “What works best in education?” Goals had an
average effect size of 0.56 (Hattie, 2009).
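Hattie’s rankings rest on the effect size, a standardized mean difference; one common form is Cohen’s d, the difference in group means divided by the pooled standard deviation. A minimal sketch of that arithmetic follows; the sample scores are invented for illustration and are not from Hattie’s data:

```python
import statistics

def cohens_d(treatment: list[float], control: list[float]) -> float:
    """Standardized mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    pooled_var = ((n_t - 1) * statistics.variance(treatment)
                  + (n_c - 1) * statistics.variance(control)) / (n_t + n_c - 2)
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_var ** 0.5

# Invented scores for illustration only:
control = [60, 65, 70, 75, 80]
treatment = [66, 71, 76, 81, 86]
d = cohens_d(treatment, control)  # about 0.76, above Hattie's 0.40 "hinge point"
```

Note that Hattie’s 0.40 is the average across the interventions he studied, not a universal threshold; the hinge point simply flags influences that outperform the typical intervention.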
Deep Implementation With Fidelity
TIME
•• Professional development efforts that ranged between 30 and 100 hours, with an average of 49 hours,
showed positive and significant effects on student achievement. Efforts of fewer than 30 hours showed no
significant effects on student learning (Yoon, Duncan, Lee, Scarloss, & Shapley, 2007).
•• Professional development efforts that relate directly to actual practice, are integrated with other
improvement efforts, and engage adults in collaborative learning are more effective than those that
do not (Yoon et al., 2007).
•• Because change involves coming to grips with new ideas, practices, and behaviors, and with different
requirements for how individuals perform their work, it is inevitable that implementation will not go
smoothly at the beginning. Leaders who are aware of this implementation dip can carry out plans in
less time; without knowledge of the dip, problems persist and people quit without giving the
innovation or idea a chance (Fullan, 1998).
CRITICAL MASS
•• A critical mass must be reached before desired results are realized. Until 90 percent of a school faculty can
honestly report active engagement in the use and implementation of the innovation, the needle on student
performance will remain unchanged (Reeves, 2006, 2009).
•• Schools are struggling with a knowing-doing gap—they know what works but are not necessarily doing it (or
doing it deeply and with fidelity). The knowing-doing problem describes the challenge of converting knowledge
about how to boost performance into actions consistent with that knowledge (Pfeffer & Sutton, 2000).
FIDELITY OF IMPLEMENTATION
•• “Fidelity of implementation means adherence to both the proper execution of the specific practices and
the effective coordination of all the practices as they are intended to be combined” (Perlman & Redding,
2011, p. 81).
•• There has been limited research on fidelity of implementation in the social sciences or human services.
However, research in drug abuse prevention provides evidence that poor implementation is likely to result in
a loss of program effectiveness (Dusenbury, Brannigan, Falco, & Hansen, 2001).
•• Guidelines, policies, and educational information alone, or practitioner training alone, are not effective for
implementation with fidelity (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).
•• Longer-term, multilevel implementation strategies are more effective than short-term strategies (Fixsen et al.,
2005).
Systemic Progress Monitoring and Feedback
•• Education leaders typically lack a clear, detailed, and timely perspective on what is happening in schools and
classrooms as a consequence of their reform initiatives. If results are weak or mixed (as is often the case),
then leaders typically cannot distinguish between ineffective reform ideas and poor implementation because
they lack an accurate picture of the depth of implementation (Leithwood & Aitken, 1995).
•• Systematic implementation monitoring of instructional improvement efforts often is missing or weak. Districts
and schools commonly formulate missions that incorporate instructional goals and determine strategies to
achieve their missions. Then, they monitor student performance results to determine the effectiveness of
their strategies. This sequence is explicit in the school improvement planning process that is commonplace
in schools across the country, although the instructional goals in these missions often lack the specificity
required for adequate monitoring and assessment of results (Supovitz & Klein, 2002).
•• In a 2004 study of Duval Public Schools, researchers concluded that the monitoring of instructional
reform initiatives is a powerful, but relatively untapped, way of distributing common understandings of
practice throughout large systems. They assert that the way in which systems are designed to measure
the implementation of a district’s reform initiative provides insight into the depth of implementation. In the
Duval Public Schools situation, key design issues included the careful selection and alignment of the data
topics, the selection of data collectors as broadly representing leadership of the system, the professional
development of the data collectors, the iterative development process of the data rubrics and evidence
forms, the decision to provide results aggregated to the district level, and the opportunities to communicate
the results. Therefore, a system planned to monitor implementation may contribute to the deepening of the
implementation it is intended to capture (Supovitz & Weathers, 2005).
•• Cincinnati Public Schools employed three practices for success in its evaluation and monitoring: identifying
specific metrics for major initiatives, regularly monitoring those indicators, and deciding whether to stop,
continue, or expand the initiatives based on the monitoring and evaluation data (Sam & Riggan, 2013).
•• Monitoring of continuous improvement plans at the district and school levels is a substantial factor in
improving academic achievement. Monitoring, supporting, and evaluating progress toward goals are among
the most important duties of school leaders (Marzano, Waters, & McNulty, 2005).
•• Feedback loops are necessary for continuous improvement (Rorrer, Skrla, & Scheurich, 2008).
References
Adelman, H. S., & Taylor, L. (2007). Systemic change for school improvement. Journal of Educational and
Psychological Consultation, 17(1), 55–57.
Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. (2001). A review of research on fidelity of implementation:
Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256.
Elmore, R. F. (2004). School reform from the inside out: Policy, practice, and performance. Boston, MA: Harvard
Education Press.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A
synthesis of the literature. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute,
National Implementation Research Network. Retrieved from http://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/
files/resources/NIRN-MonographFull-01-2005.pdf
Fullan, M. (1998). Leadership for the 21st century: Breaking the bonds of dependency. Educational Leadership,
55(7), 6–10. Retrieved from http://www.ascd.org/publications/educational-leadership/apr98/vol55/num07/
Leadership-for-the-21st-Century@-Breaking-the-Bonds-of-Dependency.aspx
Goodwin, B. (2010). Changing the odds for student success: What matters most. Denver, CO: Mid-continent
Research for Education and Learning. Retrieved from http://files.eric.ed.gov/fulltext/ED544634.pdf
Hackman, J. R. (2002). Leading teams: Setting the stage for great performances. Boston, MA: Harvard Business
School Press.
Hackman, J. R. (2004). What makes a great team? Retrieved from http://www.apa.org/science/about/
psa/2004/06/hackman.aspx
Hackman, J. R., & Wageman, R. (2005). A theory of team coaching. Academy of Management Review, 30(2),
269–287.
Hamilton, L., Halverson, R., Jackson, S. S., Mandinach, E., Supovitz, J. A., & Wayman, J. C. (2009). Using
student achievement data to support instructional decision making (NCEE 2009–4067). Washington, DC: U.S.
Department of Education, Institute of Education Sciences, What Works Clearinghouse. Retrieved from http://
ies.ed.gov/ncee/wwc/pdf/practice_guides/dddm_pg_092909.pdf
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY:
Taylor & Francis.
Henry, S. (2009). Usable knowledge: A new view: Distributed leadership (Unpublished doctoral dissertation). Harvard
Graduate School of Education, Cambridge, MA.
Herman, R., Dawson, P., Dee, T., Greene, J., Maynard, R., & Redding, S. (2008). Turning around chronically low-performing schools: A practice guide (NCEE 2008–4020). Washington, DC: U.S. Department of Education,
Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved
from http://ies.ed.gov/ncee/wwc/pdf/practice_guides/Turnaround_pg_04181.pdf
Kirtman, L. (2012). Four steps to building leadership capacity. Harvard Education Letter, 2(2). Retrieved from
http://hepg.org/hel/article/530
Leithwood, K., & Aitken, R. (1995). Making schools smarter: A system for monitoring school and district progress.
Thousand Oaks, CA: Corwin.
Locke, E. A., & Latham, G. P. (2006). New directions in goal-setting theory. Current Directions in Psychological
Science, 15(5), 265–268.
Marsh, J., Pane, J., & Hamilton, S. (2006). Making sense of data-driven decision making in education. Santa Monica,
CA: RAND. Retrieved from http://www.rand.org/content/dam/rand/pubs/occasional_papers/2006/RAND_
OP170.pdf
Marzano, R. J., Waters, T., & McNulty, B. A. (2005). School leadership that works: From research to results.
Alexandria, VA: ASCD.
McLaughlin, M. W., & Talbert, J. E. (2003). Reforming districts: How districts support school reform (Research
report). Seattle: University of Washington. Retrieved from https://depts.washington.edu/ctpmail/PDFs/
ReformingDistricts-09-2003.pdf
Organizational researchers honor J. Richard Hackman’s legacy. (2013). Observer, 26(6). Retrieved from
http://www.psychologicalscience.org/index.php/publications/observer/2013/July-August-13/organizational-researchers-honor-j-richard-hackmans-legacy.html
Perlman, C., & Redding, S. (Eds.). (2011). Handbook on effective implementation of school improvement grants.
Lincoln, IL: Center on Innovation and Improvement. Retrieved from http://www.centerii.org/handbook/
Pfeffer, J., & Sutton, R. I. (2000). The knowing-doing gap: How smart companies turn knowledge into action.
Boston, MA: Harvard Business School Press.
Redmond, B. F., & Menet, B. A. (2014). PSYCH 484: Work attitudes and job motivation: Goal setting theory. Retrieved
from http://wikispaces.psu.edu/display/PSYCH484/6.+Goal+Setting+Theory
Reeves, D. B. (2006). Leadership maps. Free Resource for Educators.
Reeves, D. B. (2009). Leadership and learning. Englewood, CO: The Leadership and Learning Center. Retrieved
from https://www.mbaea.org/documents/filelibrary/pdf/principals_leadership_acadmey/0809/dr_reeves_
june_15_Presentation_6FEB06CE5C083.pdf
Rorrer, A., Skrla, L., & Scheurich, J. (2008). Districts as institutional actors in educational reform. Educational
Administration Quarterly, 44(3), 307–358.
Sam, C., & Riggan, M. (2013). Building district capacity for system-wide instructional improvement in Cincinnati
Public Schools (Working paper). Philadelphia, PA: Consortium for Policy Research in Education. Retrieved from
http://www.cpre.org/sites/default/files/workingpapers/1531_districtreportcps.pdf
SEDL. (2014). SEDL’s working systemically approach: A process grounded in research. Retrieved from http://www.
sedl.org/ws/approach.html
Shannon, G. S., & Bylsma, P. (2004). Characteristics of improved school districts: Themes from research. Olympia,
WA: Office of Superintendent of Public Instruction. Retrieved from http://www.k12.wa.us/research/pubdocs/
DistrictImprovementReport.pdf
Spillane, J. P., Halverson, R., & Diamond, J. B. (2004). Towards a theory of leadership practice: A distributed
perspective. Journal of Curriculum Studies, 36(1), 3–34.
SRI International. (2009). Systemic vs. one-time teacher professional development: What does research say?
(Research Note 15). Dallas, TX: Texas Instruments.
Supovitz, J., & Klein, V. (2002). Mapping a course for improved student learning: How innovative schools
systematically use student performance data to guide improvement. Philadelphia, PA: Consortium for Policy
Research in Education.
Supovitz, J. A., & Weathers, J. (2004). Dashboard lights: Monitoring implementation of district instructional reform
strategies. Philadelphia, PA: Consortium for Policy Research in Education. Retrieved from http://www.cpre.org/
sites/default/files/researchreport/820_snapshotstudy.pdf
Wageman, R. (2001). How leaders foster self-managing team effectiveness: Design choices versus hands-on
coaching. Organization Science, 12, 559–577.
Waters, J. T., & Marzano, R. J. (2006). School district leadership that works: The effect of superintendent leadership
on student achievement. Denver, CO: Mid-continent Research for Education and Learning. Retrieved from http://
files.eric.ed.gov/fulltext/ED494270.pdf
Wharton School of the University of Pennsylvania. (2009). “Goals gone wild”: How goal setting can lead to disaster.
Retrieved from http://knowledge.wharton.upenn.edu/article/goals-gone-wild-how-goal-setting-can-lead-to-disaster/
Yager, S., Pedersen, J., & Yager, R. (2010, Fall). Impact of variations in distributed leadership frameworks on
implementing a professional development initiative. Academic Leadership Online Journal.
Yoon, K. S., Duncan, T., Lee, S. W., Scarloss, B., & Shapley, K. L. (2007). Reviewing the evidence on how teacher
professional development affects student achievement (Issues & Answers Report, REL 2007-No. 033).
Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education
Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved from http://ies.
ed.gov/ncee/edlabs/regions/southwest/pdf/rel_2007033.pdf