March 9, 2011

DAC Meeting
March 9, 2011, 4-7pm

Presenters: Ann Best, Dina Hasiotis, Bill Horwath, Carla Stevens, Rafael Reyes

Attendees: Natalie Abrameit, Drew Bissell, Paulette Bogert, Jose Contreras, Tony D'Angelo, Rosa DeAnda, Andrew Dewey, Reba Goodman, Hermas Grayless, Mary Hacopian, Andrea Holberg, John Lengers, Danette Maldonado, Valerie Manby, Brent McCowan, Sheryl Oliver, Laurie Parkin, Callie Pettway, Marie Pousson

Meeting Notes:

I. Introduction (Ann Best)
   A. Welcome DAC
   B. Objectives:
      a. Ensure a common understanding of the draft proposal.
      b. Identify existing questions and preliminary recommendations DAC members have on specific aspects of the draft proposal.
      c. Draft an appeals process.
      d. Provide feedback on the rating methodology.
      e. Build an initial draft of the final recommendation for the student performance criteria.

II. What's Next (Ann Best)
   A. Review the SDMCs' recommendations.
   B. Finalize recommendations by 3/24.

III. Review of Cycle 4 Recommendations and Public Comment Period (Dina Hasiotis)
   A. Overview of Cycle 4 recommendations from the presentation.
   B. Public comment feedback suggested that teachers and appraisers are generally supportive of the draft proposal, but many raised important questions and concerns.
   C. The process has continued to gather recommendations and feedback from SDMCs and other stakeholders.
   D. Remaining submissions will be analyzed and available the Monday after Spring Break in condensed form, but open source is always an option.

IV. Review of Principal Appraisal System (Bill Horwath)
   A. Designing a new system for school leaders.
   B. Discussing the concept of upward feedback.
   C. Any input or feedback would be helpful.

V. Review of New Support and Development (Dina Hasiotis)
   A. The IPDP is being redesigned to be a living document for teachers.
   B. New role, Teacher Development Specialist (TDS): will observe lessons, provide feedback, and connect teachers to job-embedded opportunities.
   C. The district is recruiting both internally and externally for strong TDS candidates.

VI. First Breakout Session (Table Groups)
   A. Instructional Practice and Professional Expectations Criteria
      a. Questions for Breakout Session
         1. Are there any outstanding questions for clarification? If so, please list.
         2. Are there any outstanding concerns? If so, please list.
         3. Based upon the public comment feedback and the working groups' recommendations, are there any changes to the current Instructional Practice and Professional Expectations criteria that you believe the DAC should consider to improve clarity, fairness, and accuracy?
      b. General Feedback and Concerns
         1. The rubrics are very fine-grained and need to be condensed for ease and usability.
         2. The level of expectation to be rated "Highly Effective" will be difficult to attain, based upon the descriptors in the rubric.
         3. The rubric does not apply to all teachers and does not have enough description for ancillary teachers in subjects like PE, Art, and Music, as the descriptors may not apply to them at all times.
      c. Recommended Changes to Instructional Practice and Professional Expectations Criteria (*The group only got through reviewing the suggested changes for Instructional Practice)
         1. Combine "Sets and implements classroom routines and procedures" with "Maximizes instructional time" to create one criterion.
         2. Combine "Demonstrates pedagogy and content knowledge" with "Designs lesson plans, units and assessments" by including descriptors from the "Demonstrates..." criterion in the "Designs..." criterion.
         3. Add "effective" to the criterion ("Designs [effective] lesson plans, units and assessments") to increase clarity.
         4. Collapse the Engagement and Classroom Environment domains under the Instruction domain, to create two domains in Instructional Practice, "Instruction" and "Planning"; all the criteria remain the same, but the two headers "Engagement" and "Classroom Environment" are removed.
         5. The group chose to decline the recommendation that "Engages students in work that develops higher-level thinking skills" be combined with "Students actively participating in lesson activities," as they are different in their focus.
   B. Appeals Process
      a. Questions for Breakout Session
         1. Should teachers have a right to a written rebuttal at any point in the appraisal process, or should they have the right to receive a second appraisal?
         2. If a second appraisal, what should that appraisal consist of? When should they be able to request this appraisal?
      b. Outcomes from Discussion
         1. Teachers should be given the right to receive a second appraisal.
         2. The group discussed having the appeal occur at the end of the year, when teachers receive their final summative ratings or "final grade." There was also discussion of a right to appeal at the mid-year conference; if such a right existed, teachers would only be able to appeal once a school year.
         3. A second appraisal would consist of one additional observation and a review of the existing documentation for the other domains.
         4. Teachers may be able to request that their second appraiser have content knowledge in their subject area, if that is the reason for the appeal.
         5. The issue of the timing of summative ratings was discussed. The Cabinet is still deciding between two options: (1) a final overall summative rating is given in May, based upon this year's IP/PE scores and last year's Student Performance; or (2) teachers receive final IP and PE scores in May at the end-of-year conference and, if their student performance scores are not yet available, receive those scores along with the overall summative rating over the summer or at the start of the next school year, which would impact the cycle of appeals. Under the latter option, teachers could request an appeal on their final IP/PE score in May, which would then trigger a second appraisal.
   C. Rating Methodology Feedback
      a. Questions for Breakout Session
         1. Are there any outstanding questions for clarification? If so, please list.
         2. Are there any outstanding concerns? If so, please list.
         3. What changes would you recommend be made to the highlighted squares in the insides of the look-up tables?
         4. What steps could the district put in place to audit ratings to help ensure they are as fair and reliable as possible?
      b. Outcomes from Discussion
         1. Note 1: In their discussion, the group raised points about: (1) considering alleviating teacher concerns/fears by rating "up" where appropriate; (2) giving teachers ample opportunity to score at the "Effective" level; and (3) the validity of the value-added measure.
         2. Note 2: While the student performance table is currently labeled as the "average score of...", analysis is underway to determine whether combining scores from multiple preps in a different way provides fairer and more accurate assessments of the teacher's impact on student learning. (Straight averaging causes the percentage of teachers at both the high and low ends of the scoring spectrum to decrease significantly.)
In the look-up tables below, the circled cells are where the group discussed the possibility of adjusting the value. In most discussions, consensus was not reached by the group. However, in the circled cells where there are two numbers, consensus was reached by the group that the value of that cell should be adjusted to the number in the parenthesis. For the summative look-up tables below, the group did not go through each table cell by cell (because of time restrictions), but instead individually reviewed the tables and provided comment. The circled cells are where questions were raised about the value in the cell, but no consensus was reached to recommend that the value actually be changed. There were no proposed considerations for change in the table for teachers without value-added data.

[Look-up tables not reproduced: "Summative Table for Teachers with Value-Added Data" and "Summative Table for Teachers without Value-Added Data"]
   D. Process & Synthesis
      a. The group confirms deadlines for each conference: 12/30, 1/31, 5/1.
      b. The group suggests:
         1. There should be a retraining or refresher on the rubric.
         2. Show a video example for a check-in, and of what a 4 looks like, etc.
         3. "Back-end checking," or triangulation of the quality of the appraisal by consulting with the TDS and looking at value-added data to provide reference points.
         4. There will need to be a shift in mindset for teachers to accept the system and let it work. Everyone thinks he/she deserves a 4.
         5. Possibly have teachers "walk in the shoes" of the appraiser utilizing the block schedule, seeing the process from both sides and learning from being part of the process.

VII. Second Breakout Session - Measures of Student Learning
   A. Summary of Questions
      1. How will any teacher be assured of equity in the number of measures used in "multiple measures"?
      2. How will the district provide equity in the number of measures used by campus, grade, etc.?
      3. Is a campus going to be able to say, "We're going to use end-of-year PK literacy and math assessments and a portfolio"? How do we know whether campus B uses those same measures and they are equitable?
      4. Will student work and portfolios be a measure?
      5. Teacher evaluation incorporating different tasks: we're using that and included it in development for early childhood. How will that be accounted for?
      6. How will PE be covered in the new system, for teachers without EVAAS?
      7. What about alignment issues with the Stanford? While the approach of having an additional measure helps where it is only partially aligned, I am still concerned that it does not go far enough and is not fair to teachers.
      8. How will the system address alignment concerns in 3rd grade language on the Stanford?
      9. How will EVAAS be used differently than in ASPIRE?
      10. How will teachers with high-performing students be measured? What "stretch" considerations will be in place?
      11. Is there potential for leniency in implementation in the first year, given this is all new?
      12. How would teachers go about setting goals for student progress with their appraiser?
      13. How are measures assigned to teachers?
   B. Summary of Concerns
      1. Concern: What pre-test should be used for AP courses? Could the PSAT be used?
      2. Concern: Some teachers have classes with disproportionate numbers of students who are low-performing, and high growth is harder to achieve in those classrooms.
      3. Concern: The difference between TAKS Reading and Writing, and how teachers are distinguished, must be addressed, as teachers who are "English" have been responsible for both.
      4. Concern: EVAAS is not a tool to help improve my instruction and should not be included.
      5. Concern: Some core teachers feel very strongly that this is unfair because of the difference in the ratings of non-core and core teachers. There are few chances to score high.
      6. Concern: Montessori teachers: TAKS should not be the first measure; it would be more suitable for the appraiser-approved measure to be the primary measure.

VIII. Conclusion
   a. The next DAC meeting will be on March 24th from 4-7pm.
   b. Please check your email for the summary of SDMC Recommendations, which we expect you to review prior to the meeting on March 24th.
   c. Please complete your homework and send your responses to [email protected].