Update on Accountability and Continuous Improvement System
State Board of Education – July 13, 2016 Meeting
by Martha Alvarez, ACSA Legislative Advocate
[email protected]

LCFF EVALUATION RUBRICS

The initial phase of the evaluation rubrics is scheduled for adoption at the September State Board of Education (SBE or Board) meeting, with the understanding that the rubrics will continue to evolve over time. Based on the May and July Board actions, the state indicators that will be incorporated into the initial phase of the rubrics include:
- Academic indicator based on student test scores in English Language Arts and math for grades 3-8;
- High school graduation rate;
- Progress of English learners toward English language proficiency (note: SBE staff's recommended approach to EL progress is an indicator that incorporates reclassification rate data);
- Suspension rates by grade span; and
- College and career indicator, which combines grade 11 test scores in ELA and math with other measures of college and career readiness.

College and Career Indicator (CCI): The CCI is intended to be operational this year, but it will likely evolve as the SBE incorporates additional measures by which students could satisfy performance levels. The draft model prepared by CDE staff, presented below, serves as a preliminary list of four levels of preparedness, each reached by accomplishing one of a number of paths, including test outcomes, career technical education (CTE) and others yet to be finalized. Several Board members and stakeholders, including ACSA, noted that the current draft is too academically focused and does not offer enough options to balance career preparedness. The Board directed staff to prepare a final CCI for additional review and consideration at the September meeting. ACSA expressed support for additional analysis and future consideration of a grade 8 indicator that assesses high school readiness.

Elements to Be Added to the CCI in the 2017–18 School Year
- State Seal of Biliteracy
- Golden State Seal of Merit Diploma

Elements that Need Further Data Analysis
- California Student Aid Commission (CSAC) grade point average (GPA) of 3.0 or higher[1] (Prepared)
- CSAC GPA of 2.0 or higher[1] (Approaching Prepared)
- Completing A-G courses without maintaining an average grade of C or better
- Completion of a state-approved portfolio (requires development of a state approval process for well-developed instruments such as student portfolios)
- Industry credential and/or career assessment

[1] CSAC uses two cut points for senior GPA (which is unweighted and excludes PE, ROTC and remedial courses): a minimum 3.00 high school GPA is required for Cal Grant A; a minimum 2.00 GPA is required for Cal Grant B.

Local Elements
LEAs may be able to include local data on college and career readiness to augment the CCI model.

Draft 2016–17 College and Career Indicator Model[2]

WELL PREPARED: Does the student meet at least 1 measure below?
- Career technical education (CTE) pathway completion with a "C" or better
- Scored "Ready" on both the math and ELA EAP[3]
- 3 or more Advanced Placement (AP) exams (score of 3 or higher)
- 3 or more years of dual enrollment
- International Baccalaureate (IB) Diploma

PREPARED: Does the student meet at least 1 measure below?
- A-G completion with a "C" or better plus one other CCI measure
- Articulated CTE pathway completion with a "C" or better
- Scored "Ready" and "Conditionally Ready" on the EAP
- CTE concentrator plus one year of dual enrollment
- 2 years of dual enrollment
- 2 AP exams (score of 3 or higher)
- At least 4 IB exams (score of 4 or higher)

APPROACHING PREPARED: Does the student meet at least 1 measure below?
- A-G completion with a "C" or better
- 1 or more non-articulated CTE pathway completions
- CTE concentrator (2 courses in the same pathway)
- Scored "Conditionally Ready" on both the ELA and math EAP
- Scored "Ready" and "Not Ready" on the EAP
- 1 year of dual enrollment
- 1 AP exam (score of 3 or higher)
- 2-3 IB exams (score of 4 or higher)

NOT YET PREPARED: The student did not meet any of the measures above and has not yet demonstrated readiness for college and career.

[2] Based on student data from the California Longitudinal Pupil Achievement Data System (CALPADS) four-year graduation cohort (i.e., the original ninth grade class), plus data from CSAC.
[3] "Ready" requires an SBAC score of 4/Standard Exceeded. "Conditionally Ready" requires an SBAC score of 3/Standard Met. "Not Yet Ready" requires an SBAC score of 2/Standard Nearly Met.
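The draft model is essentially a rule set: a student is placed at the highest preparedness level for which at least one measure is met. The sketch below is a minimal illustration of that logic, assuming a simplified student record; the field names and the subset of measures encoded are assumptions for demonstration only, not a complete implementation of the draft CCI.

```python
# Minimal sketch of the draft CCI placement logic: place the student at the
# highest preparedness level for which at least one measure is met.
# Field names and the subset of measures encoded here are illustrative
# assumptions; the draft model contains additional paths (e.g., CTE
# concentrator combinations, A-G plus one other measure) omitted for brevity.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StudentRecord:
    ap_exams_passed: int = 0               # AP exams scored 3 or higher
    ib_exams_passed: int = 0               # IB exams scored 4 or higher
    ib_diploma: bool = False
    dual_enrollment_years: int = 0
    cte_pathway_completed: bool = False    # CTE pathway with "C" or better
    a_g_completed: bool = False            # A-G courses with "C" or better
    eap_results: List[str] = field(default_factory=list)  # ELA and math EAP results

def cci_level(s: StudentRecord) -> str:
    """Return the draft CCI preparedness level for a simplified record."""
    eap = sorted(s.eap_results)
    if (s.cte_pathway_completed or s.ib_diploma
            or eap == ["Ready", "Ready"]
            or s.ap_exams_passed >= 3
            or s.dual_enrollment_years >= 3):
        return "Well Prepared"
    if (eap == ["Conditionally Ready", "Ready"]
            or s.ap_exams_passed >= 2
            or s.dual_enrollment_years >= 2
            or s.ib_exams_passed >= 4):
        return "Prepared"
    if (s.a_g_completed
            or eap in (["Conditionally Ready", "Conditionally Ready"],
                       ["Not Ready", "Ready"])
            or s.ap_exams_passed >= 1
            or s.dual_enrollment_years >= 1
            or s.ib_exams_passed >= 2):
        return "Approaching Prepared"
    return "Not Yet Prepared"

# Example: two AP exams passed and no other measures met -> "Prepared"
print(cci_level(StudentRecord(ap_exams_passed=2)))
```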
The SBE and CDE staff provided further explanations and clarifications on the other state indicators, including a reminder that most of the data is a year old.

Academics – Test Scores: 2015 assessment data is available now, and 2016 data will be available in August. As a result, the first year's change measure will reflect only one year of data; over time, it may incorporate a three-year average. Individual student-level growth measures are under development, possibly for 2017-18.

Graduation Rate: A four-year cohort rate. ESSA allows for five- and six-year graduation rates, and the SBE will consider incorporating those into a future version, possibly in 2017-18. For the initial phase, the change rate will compare 2014-15 to 2013-14 data.

English Learner Performance: Will begin using CELDT and reclassification rate data, comparing 2014-15 to 2013-14. For long-term English learners, SBE staff is considering using 2015-16 data. There will not be comparable prior-year data, so this information is intended to be used only at the district level, not the school level.

School Climate: The SBE approved inclusion of standards for the use of local climate surveys to support a broader assessment of performance on Priority 6 (School Climate). The intent is to review student engagement more comprehensively than through suspension rates alone, one of the designated state indicators. The CDE will establish a working group to develop a survey model that addresses students, parents and staff.

METHODOLOGY FOR ESTABLISHING STANDARDS FOR REMAINING LCFF PRIORITIES

The state indicators approved by the SBE in May 2016 address only three LCFF priorities (Priority 4, pupil achievement; Priority 5, pupil engagement; and Priority 6, school climate). As a result, the SBE directed staff to prepare a recommendation for the July 2016 SBE meeting on a method for establishing standards for the remaining LCFF priorities not included in the state indicators, and on how those standards will inform a local educational agency's (LEA) eligibility for technical assistance and intervention.
With the advice of a working group, CDE staff will develop a state-supported local survey or self-assessment tool for school districts to use in measuring and reporting their own progress toward meeting the additional state priorities listed below:
- Priority 1 (Basics / Williams settlement requirements: appropriately assigned teachers, access to instructional materials, and functional school facilities)
- Priority 2 (Implementation of State Academic Standards)
- Priority 3 (Parent Engagement)
- COE Priority 9 (Coordination of services for expelled students)
- COE Priority 10 (Coordination of services for foster youth)

Staff's initial recommendation includes developing objective descriptions of practice that, if implemented locally, are likely to enhance local decision making for the relevant LCFF priority. The proposed standards involve collecting additional information and reporting it through the LCFF evaluation rubrics, with the intent of providing additional insight for local decision makers in assessing performance within the LCFF priorities and informing local stakeholder conversations. Based on a preliminary example, the self-assessment would describe the criteria that LEAs would use to assess progress toward meeting the standard (i.e., [Met / Not Met / Not Met for Two or More Years]). An example for Priority 1 (Williams settlement requirements) is listed below:

Priority 1: Appropriately Assigned Teachers, Access to Curriculum-Aligned Instructional Materials, and Safe, Clean and Functional School Facilities
Standard: The LEA / school meets Williams settlement requirements at 100 percent, promptly addresses any complaints or other deficiencies identified throughout the academic year, and provides information on progress meeting this standard in the evaluation rubrics.
Evidence: LEAs would use locally available information, including data currently reported through the School Accountability Report Card, and determine whether they report progress in the evaluation rubrics.
Criteria: LEAs would assess their performance on a [Met / Not Met / Not Met for Two or More Years] scale.

TOP-LEVEL DATA DISPLAY AND EQUITY REPORT

The top-level summary data display will be a summary report for use by LEAs and schools showing performance relative to the standards established for all LCFF priorities. The system continues to pursue the 5x5 color grid showing performance categories (e.g., Blue, Green, Yellow, Orange, Red) for each of the indicators, with the goal of getting LEAs, schools and student subgroups to Green status. The current design anticipates a single color for each indicator that is a composite of both status and change (illustrated in the sketch at the end of this section). Some groups are advocating for separate color coding for each, but the SBE has yet to make that final determination. There is no intent to produce a single overall color ranking, as the SBE does not intend to use a summative rating. Based on the Board's direction, the display will prominently reflect equity by showing areas where there are significant disparities in performance for any student subgroups. It remains unresolved whether the display will show performance levels for all subgroups or, as currently proposed, only the specific student subgroup(s) with a valid n-size that fall in the two lowest performance categories (Red/Orange). Within the web-based system, this will likely serve as a main "landing page" for each LEA and school, with the opportunity for school leaders and local stakeholders to generate more detailed reports showing all performance categories for all student subgroups.
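To make the composite idea concrete, the sketch below shows one hypothetical way a 5x5 status-by-change lookup could assign a single color to an indicator. The level labels and the color assignments in the grid are illustrative assumptions, not a mapping the SBE had adopted.

```python
# Hypothetical illustration of a 5x5 status-by-change color lookup.
# The level labels and color assignments below are assumptions for
# demonstration only; the SBE had not adopted a final mapping.

STATUS_LEVELS = ["Very Low", "Low", "Medium", "High", "Very High"]
CHANGE_LEVELS = ["Declined Significantly", "Declined", "Maintained",
                 "Increased", "Increased Significantly"]

# One illustrative grid: rows follow STATUS_LEVELS (low to high),
# columns follow CHANGE_LEVELS (declined significantly to increased significantly).
COLOR_GRID = [
    ["Red",    "Red",    "Orange", "Orange", "Yellow"],
    ["Red",    "Orange", "Orange", "Yellow", "Yellow"],
    ["Orange", "Orange", "Yellow", "Green",  "Green"],
    ["Orange", "Yellow", "Green",  "Green",  "Blue"],
    ["Yellow", "Yellow", "Green",  "Blue",   "Blue"],
]

def indicator_color(status: str, change: str) -> str:
    """Return the composite performance color for one indicator.

    Raises ValueError if the status or change label is not recognized.
    """
    row = STATUS_LEVELS.index(status)
    col = CHANGE_LEVELS.index(change)
    return COLOR_GRID[row][col]

# Example: an indicator with "Low" status whose change was "Increased".
print(indicator_color("Low", "Increased"))  # -> "Yellow"
```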
The draft display includes a narrative section to allow for local explanation of indicator results. Below is the latest prototype, although the Board suggested SBE staff work with a graphic designer to provide several options for the Board's consideration and adoption at the September meeting.

According to an earlier memorandum by SBE staff, the web-based tool and evaluation rubrics will also include the following components:

Data Analysis Tool: Will allow users to generate more detailed data reports that include both state and local indicators. State-collected data will be prepopulated where available. The tool will also support the upload of local data using standardized file formats. This will allow local upload of data for indicators with standard definitions but where the data is locally held, as well as inclusion of locally determined indicators that an LEA may add to align with its LCAP goals. For indicators without a standard statewide definition or data source (e.g., parent involvement), the data analysis tool may identify a limited number of options that are based in research and are considered valid and reliable measures. LEAs would use a "local data selection" menu to select one or more of those options to track their progress over time using local data.

Statements of Model Practices: Will include descriptions of research-supported and evidence-based practices related to the indicators; these are optional and may be helpful to LEAs in their analysis of progress. The Statements of Model Practices component of the evaluation rubrics supports the Data Analysis Tool and will:
- Serve as qualitative statements describing examples of effective practices and processes for LEAs to consider and compare to practices and processes already in place.
- Describe qualitative information that cannot be assessed only through quantitative analysis of state and local indicators.
- Provide additional data that can assist users as they review local practices to improve student achievement at the system, school and classroom levels.
- Be organized to correspond to the organization of the indicators in the data analysis tool.

Users could access the statements of model practices directly from the main landing page. They would also be able to access relevant statements from the data analysis tool interface, which will support users in reflecting on local actions relative to the model practices while they are reviewing performance data.

Links to External Resources: Will include links to existing resources and sources of expert assistance (e.g., the CDE digital library, the CDE LCFF Resources webpage, the website for the California Collaborative for Educational Excellence, and research-based resources identified by stakeholders). These links will connect users to more detailed information about implementing specific programs or services that align with the statements of model practices. The links would be organized by indicator as optional resources for use by LEAs and will be accessible to local stakeholders. This component could evolve over time, for example, by directing users to a centralized clearinghouse of successful local practices, information about local or regional networks, etc.

LOCAL CONTROL AND ACCOUNTABILITY PLAN (LCAP) ANNUAL TEMPLATE REDESIGN

The Board heard an informational item on the progress being made toward simplifying the LCAP template, with anticipated Board action to adopt revisions for the 2017-18 school year at the September meeting.
Based on the feedback provided by ACSA members, ACSA staff believes the revised LCAP template is an improvement over the current template. The Board is moving toward adopting a "static" LCAP template (covering an inclusive three-year period) instead of a "rolling" LCAP template (adding another year to the plan at the time of the annual update), with the goal of promoting strategic educational planning over multiple years. The revised version also includes an "LCAP Summary" section that would require districts to provide an executive summary of their goals, actions, services and, potentially, budget information.

Several outstanding issues remain, including whether 100 percent of an LEA's funding sources must be reported in the LCAP or only LCFF supplemental and concentration funding, and whether the template is turning into a compliance document rather than a planning document as a result of trying to link and report specific funding information. Other questions include whether the LCAP annual review process needs to require a specific response to underperforming subgroups on indicators highlighted in the LCFF evaluation rubrics, and whether there should be a direct link between these documents. It also remains to be decided whether the Board will mandate use of the proposed electronic LCAP template rather than making it optional.