SAGE
Student Assessment of Growth and Excellence
Utah’s
Computer Adaptive
Assessment
Judy W. Park, Ed.D.
Associate Superintendent
Utah State Office of Education
April 22, 2014
Computer Adaptive Testing
A Computer Adaptive Test (CAT) is a test that dynamically adjusts to the ability
level of each examinee as the test is being administered.
This provides a testing environment that is unique to each student.
The technology responds to individual student responses, increasing the difficulty
of questions when the student answers correctly and decreasing the difficulty of
questions when the student answers incorrectly.
The assessment is able to more accurately identify each student’s knowledge and
abilities.
The test is designed to provide diagnostic information about students
performing above or below grade level.
CAT typically decreases assessment time because students are
not responding to large numbers of questions that are outside
their ability level.
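The adaptive loop described above can be sketched in a few lines of Python. This is a deliberately simplified illustration: operational CATs like SAGE select items using item response theory models, not a fixed step rule, and the difficulty scale, function names, and parameters here are invented for the example.

```python
def run_adaptive_test(items, answer_fn, num_questions=10):
    """Simplified adaptive-test sketch (hypothetical; not the SAGE algorithm).

    items: dict mapping a difficulty level (1-10) to a list of questions.
    answer_fn(question): returns True if the student answers correctly.
    Returns the list of (difficulty, correct) pairs, one per question asked.
    """
    difficulty = 5          # start near the middle of the difficulty range
    history = []
    for _ in range(num_questions):
        question = items[difficulty].pop()      # draw an item at the current level
        correct = answer_fn(question)
        history.append((difficulty, correct))
        if correct:
            difficulty = min(10, difficulty + 1)   # correct answer -> harder next item
        else:
            difficulty = max(1, difficulty - 1)    # incorrect answer -> easier next item
    return history
```

Because the difficulty tracks the student's responses, a high-achieving student quickly climbs to the hardest items and a struggling student settles at easier ones, which is why each student sees a different, shorter test than a fixed-form exam would require.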
SAGE Assessment System
• Formative
– Optional instructional tools for teachers
• Interim
– Optional Fall and Winter Testing
• Summative
– Spring End of Course Testing
SAGE Assessment System
• Computer Adaptive Testing
– English Language Arts
• Grades 3 – 11
– Mathematics
• Grades 3 – 8, Math I, Math II, Math III
– Science
• Grades 4 – 8, Earth Science, Biology, Physics, Chemistry
• Writing
  – 60-minute Writing Prompt (opinion/argumentative)
  – 30-minute Writing Prompt (informative/explanatory)
  – Grades 3 – 11
  – Replaces Direct Writing Assessment (DWA)
    • Grades 5 & 8
SAGE exists due to the vision,
foresight, planning, and participation
of:
• Governor’s Office
• Legislature
• State Board
• Districts & Charter Schools
• Education Stakeholders
• Students/Parents
Historical Review
• 2007
– Governor Huntsman’s Blue Ribbon Panel on Assessment
– Recommendations approved by the Panel and State Board of
Education
• Implement EPAS (ACT) testing for all 8th, 10th and 11th grade students
• Implement Computer Adaptive Testing to replace criterion-referenced tests (CRTs).
• 2008
– SB 2002 Establishes K-12 Computer Adaptive Pilot
– Sevier District
– Juab District
• 2010
– SB 16 Allows computer adaptive testing to replace CRT testing
• Number of pilot schools/districts increase
Historical Review
• 2011
– Computer adaptive testing allowed for federal
accountability
– Number of schools/districts in pilot increases
• 2012
– K-12 Computer Adaptive Testing Pilot
• Charters - 8
• Districts - 10 (some or all of their schools)
– HB 15 – allocates $6.7 M for state-wide
implementation of computer adaptive testing
Pilot Success (6 Years): Lessons Learned
• Adaptive Testing
– Low achieving to high achieving students
• Interim Testing – multiple times during year
• Fall to Spring growth
• Spring to Spring growth
• Robust Reporting
• Formative testing
Historical Review - 2012
• HB 15 provides $6,700,000 in ongoing funding for Adaptive
Assessment
• SB 97 provides $7,600,000 for technology
• State Board of Education appoints RFP committee
• RFP written, statewide review
• State Board of Education appoints RFP selection committee
• Proposals reviewed, scored and vendor selected
American Institutes for Research
(AIR)
• Washington, D.C.-based non-profit
• Only organization currently delivering statewide,
online adaptive tests approved for ESEA
accountability
• 1,600 people working in the areas of assessment,
education research and technical assistance,
health, human development, and international
development.
Historical Review – 2013
SAGE developed by Utah Educators
• Every test question has been developed and/or reviewed by
293 Utah residents
  – Educators
  – Parents
  – Stakeholders focused on fairness (cultural, gender, and ethnic sensitivity)
• Every test question was developed and/or reviewed in 4-5 separate
committees. After each committee review, the questions were edited and
revised to incorporate the committee's feedback.
  – Development committee
  – Content committee
  – Fairness committee
  – Passage review committee
  – Parent review committee
Historical Review – 2013
SAGE developed by Utah Educators/Residents
• SAGE item bank (number of questions)
  – 11,783 questions
    • 400 – 450 questions per grade/course
    • 5,533 new Utah-developed questions
    • 5,599 questions previously used in the Utah CRTs
    • 651 questions from Delaware and Hawaii
      – These questions were reviewed in every Utah
      committee except development
SAGE Parent Review Committee
53A-1-603(9)
(a) The State Board of Education shall establish a committee consisting of 15 parents of Utah
public education students to review all computer adaptive test questions.
(b) The committee established in Subsection (9)(a) shall include the following parent members:
(i) five members appointed by the chair of the State Board of Education;
(ii) five members appointed by the speaker of the House of Representatives; and
(iii) five members appointed by the president of the Senate.
(c) The State Board of Education shall provide staff support to the parent committee.
(d) The term of office of each member appointed in Subsection (9)(b) is four years.
(e) The chair of the State Board of Education, the speaker of the House of Representatives,
and the president of the Senate shall adjust the length of terms to stagger the terms of
committee members so that approximately 1/2 of the committee members are appointed every
two years.
(f) No member may receive compensation or benefits for the member's service
on the committee.
SAGE Parent Review Committee
• 15 Parents
  – Children currently enrolled in public school
• November 4 – 8, 2013
  – 9:00 am to 5:00 pm
  – Began as early as 7:00 am
  – Ended as late as 6:00 pm
• All 11,783 questions reviewed
  – At least 2 parents reviewed each question
  – Some parents reviewed all questions
• Parent Panel Discussion on final review day
  – State Board members
  – Legislators
  – Parents discussed their review experience
SAGE Parent Review Committee
• Review Summary
  – During the panel discussion, parents agreed that their
main concerns were not about questions promoting a
social agenda or containing inappropriate language
Subject          Flagged   Removed   Changed   Review after Field Test
Eng Lang Arts      297        27        97        173
Mathematics         71         4         4         63
Science            229        12        56        161
SAGE Parent Review Committee
• Process for flagged questions
  – Each assessment item that received a comment from the parent review
panel was individually reviewed by USOE content experts.
  – Items were then either removed from the item bank, edited based on the
parent review panel's recommendations, or flagged for further review after
the item is field tested.
  – All non-removed items were brought before an item review committee consisting of
regular education teachers, content experts, special education teachers, and minority
group representatives for reevaluation. All parent concerns were addressed by
checking functionality, bias, style, facts, and content on all flagged items.
  – The majority of flagged items were targeted for further review after field testing. For
example, the parent review panel may have identified an item as too difficult
for the grade level; the item was re-checked by content experts and deemed
grade appropriate, but it will be reviewed again after field testing by
both content experts and the parent review panel.
  – Items flagged by the parent review panel that were not removed will
be presented to the parent review panel for further review.
SAGE - 2014
February Trimester Administration
– 5,860 high school students
March 17
SAGE writing 3-week window opens
April 1
SAGE adaptive window opens
June 20
SAGE window closes
SAGE - 2014
Comments from the Field
• "The SAGE system is overwhelmingly working. It has been a really positive experience."
  – Hal Sanderson, Canyons District Assessment Director
• "CRTs were so easy they were boring. SAGE made me think."
  – Moab HS student
• "I hate these tests! I can't just guess anymore; I actually have to think and show that I
understand."
  – Woods Cross HS student
• "The new item types really make me have to look at my instruction. No longer will my
students do well on tests just because they can read well; they really have to know the
content."
  – Weber District Jr. High science teacher
SAGE - 2014
Feedback to inform 2015
• Focus Groups
• Teacher Surveys
• Solicit feedback in all presentations/meetings
• E-mails and phone calls
SAGE - 2014
• SAGE Standard Setting - August 2014 Process:
  – Broad range of stakeholders (210 participants)
  – "Ordered Item Booklet"
  – Impact data, including national and international benchmarks
  – Resolution process to ensure consistency across grades
• State Board of Education approves the process and
cut scores.
  – Cut scores determine what scale score equals proficiency
The process to determine student scores and levels of
proficiency will use current assessment industry standards.
The process and outcome will be reviewed and approved by
the Technical Advisory Committee (assessment expertise in
and outside of Utah) and the State Board of Education.
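The cut-score idea in the bullets above can be illustrated with a short sketch. The threshold numbers and level names below are hypothetical, invented purely for the example; the actual SAGE cut scores come out of the August 2014 standard-setting process described above.

```python
import bisect

# Hypothetical cut scores on the scale-score axis: each value is the minimum
# scale score needed to enter the next proficiency level (illustrative only).
CUT_SCORES = [300, 350, 400]
LEVELS = ["Below Proficient", "Approaching Proficient",
          "Proficient", "Highly Proficient"]

def proficiency_level(scale_score):
    """Map a scale score to a level by counting how many cut scores it meets."""
    return LEVELS[bisect.bisect_right(CUT_SCORES, scale_score)]
```

With this convention, a student scoring exactly at a cut score lands in the higher level, which is why adjusting a single cut score (as the Board approves) directly changes the percentage of students reported as proficient.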
SAGE & Accountability Timelines
• June 20
SAGE window closes
• Aug 11-15
Standard Setting
• Sept 5
Board approval of cut scores
• Sept 30
Assessment results to LEAs
• Oct 30
Accountability reports released to LEAs for 30 day review
• Nov 30 – Dec 15
Accountability reports released to public
SAGE Results Reality
Simple Equation
New, more rigorous standards
+
New, more rigorous assessments
=
Reduced % of students proficient
SAGE Results Reality
• Reduced proficiency is
– A result of more rigorous standards
– A result of more rigorous assessments
– A result of raising the bar/expectations for all students
• Reduced proficiency is not
– Decreased student performance
– Decreased instructional excellence
– Decreased school achievement
• Student proficiency will increase as students, parents, and
teachers work together to implement
the standards and assessments
SAGE Results
• 2014 SAGE results in September
• Beginning October 2014
– SAGE results are immediate
• Extensive Reporting System
– Student Reports for Parents/Students
– Classroom Reports for Teachers
– School and District Reports for School Administrators
• Interim and Summative results link to formative tools
SAGE & Accountability
• SAGE Proficiency
– Percentage of students proficient
– 2014 is baseline year
– State and Federal Reporting
– Grading Schools
– UCAS Federal Accountability
– PACE Report Card
SAGE & Accountability
Growth
• We cannot align scale scores between CRT and SAGE
• The two assessments are too different to compare

CRT                                        SAGE
Fixed form – every student has the         Adaptive – every student has
same questions                             different questions
Multiple-choice questions only             Technology-based questions
Old core standards                         New Utah core standards
Mostly recall                              All levels of critical thinking
Designed to determine proficiency          Measures lowest to highest
                                           performance

Comparing CRT and SAGE scores violates accepted standards of measurement
(American Psychological Association, National Council on Measurement in
Education, & American Educational Research Association)
SAGE & Accountability
• Growth 2014
– Baseline year – only 1 year of data
– Required by School Grading
– Required by UCAS (Federal Accountability)
– State Board will approve methodology
SAGE & Accountability
• Growth 2015
– Two years of data
– Can compare scale scores
– School Grading
– UCAS (Federal Accountability)
– PACE Report Card
– Ongoing evaluation to determine validity of
calculation
SAGE Test Questions
Live Demo of Training Tests
http://sageportal.org/training-tests/
Questions