Educator Evaluation: Improving Instruction for All Students

General Session:
Educator Evaluation: Improving Instruction for All Students
Facilitators:
Janice Poda
Mary Dean Barringer
Irv Richardson
State Consortium on Educator Effectiveness (SCEE)
Background and Overview
• Changing educator evaluation systems from single observations to multiple data sources is complex and challenging work.
• Our initial entry into this work was reactive (e.g., Widget Effect, RTTT, waivers).
• Like any complex change, there are implementation issues that create “problems of practice.”
• The good news: addressing implementation issues gives us the opportunity to be proactive with actions and messaging.
SCEE States’ Self-Reporting on Their Use of the MET Study’s Nine Principles for Using Measures of Effective Teaching
MEASURE EFFECTIVE TEACHING
Set expectations: Completely 61%, Somewhat 29%, Slightly 11%, Not at All 0%
Use multiple measures: Completely 68%, Somewhat 14%, Slightly 14%, Not at All 4%
Balance weights: Completely 39%, Somewhat 27%, Slightly 23%, Not at All 12%

ENSURE HIGH-QUALITY DATA
Monitor validity: Completely 12%, Somewhat 42%, Slightly 27%, Not at All 19%
Ensure reliability: Completely 15%, Somewhat 31%, Slightly 35%, Not at All 19%
Assure accuracy: Completely 12%, Somewhat 42%, Slightly 31%, Not at All 15%

INVEST IN IMPROVEMENT
Make meaningful distinctions: Completely 35%, Somewhat 27%, Slightly 12%, Not at All 27%
Prioritize support and feedback: Completely 39%, Somewhat 39%, Slightly 4%, Not at All 19%
Use data for decisions at all levels: Completely 35%, Somewhat 31%, Slightly 23%, Not at All 12%
We Asked SEA Personnel What They Wanted Chiefs to Know About Educator Evaluation…
“As states have redesigned educator
evaluations over the past few years, we
have learned many lessons.
At this point, what would you like all chiefs to
know and understand about educator
evaluations?”
SEA Personnel Would Like Chiefs to:
• Stay the course, but recognize that designing educator evaluation systems takes time and requires continuous improvement.
• Keep the focus on improved teaching practice that results in student growth and learning.
• Integrate the evaluation system with other state reforms rather than treating it as one of many seemingly disparate initiatives.
• Develop communication pipelines that allow effective delivery of information about evaluation systems to districts.
• Help the SEA listen, question, and admit when it doesn’t have the answers.
Activity: Exploring Problems of Practice
• Continuous improvement requires proactively anticipating issues as they arise and addressing them with effective actions and targeted messaging.
• We will engage you in an activity that provides an opportunity to explore messaging and actions that address some of the issues states are facing as they roll out their educator evaluation systems.
Steps for the Activity
1. Choose a facilitator who will also serve as the spokesperson for your group
2. A recorder has been assigned to your group
3. Facilitator reads directions and scenario (5 minutes)
4. Discuss scenario (20 minutes)
5. Develop three key message points (5–8 minutes)
6. Develop three action steps the SEA could take to keep the educator evaluation work moving forward (5–8 minutes)
7. Recorder will record your thoughts on chart paper
8. We will be calling on selected groups to report out (15 minutes)
9. All chart pages will be posted for a “Gallery Walk” during the break
Times
9:30 – 10:05: Choose a facilitator/reporter for the group, read the task and scenario, and discuss the issues in the scenario to develop:
• Three key message points
• Three actions the SEA should take now to keep educator evaluation on track
10:05 – 10:20: Report out from selected groups
10:20 – 10:30: Observations and “ahas” from the activity
10:30 – 11:00: Break and Gallery Walk