Creating an Assessment Bridge

The Academic Program Review
Bridging Standards 7 and 14
Middle States Annual Conference
December 10, 2010
Presenters
Mr. H. Leon Hill
Director of Institutional Research
Dr. Joan E. Brookshire
Associate Vice President of
Academic Affairs
Overview
• Framework to Address the APRs
• Structure/Challenges/Approach
• Examples of Metrics
• Current Action Plan
• Integration of “End User” Technology
• Next Steps
• Benefits of Our Approach
• Questions
Assessment Cycle - 2005
[Diagram: a repeating cycle of “Plan to meet” and “Meet to plan,” ending with “Report out on planning”]
What we had to build on
• Strong focus on programs.
• State-mandated 5-year academic program review in need of revision.
• Institutional Effectiveness Model (IEM) with performance indicators benchmarked through State and National databases.
[Diagram: Mission → Strategic Initiative: Access & Success → Institutional Effectiveness]
IEM
• Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared to other institutions.
• Historical/Trend data
• Benchmark data
– Pennsylvania & National Peers
• Institutional Effectiveness Model
Where we started
• Restructured the Academic
Program Review process
• Incorporated the use of technology
Goal of the restructuring
• Measure student performance as evidenced by results of assessment of student learning outcomes.
• Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).
Challenges
• Usual issues with assessment in
general.
• Faculty had little knowledge of the
College’s performance indicators.
• Organizational separation of
assessment of institutional and
student learning outcomes.
Approach
Began by building it backwards
from the IEM by mapping out
specific core indicators to program
data, making additions where
needed.
Examples of Metrics Used for APR
College's Graduation Rate by Cohort
[Chart: graduation rates for the Fall 2002 through Fall 2006 cohorts, ranging from roughly 12.7% to 14.1%]

TARGETS
Graduation Rate: Caution <19%, Acceptable 19%-23%, Aspirational >23%
College's Transfer Rate by Cohort
[Chart: transfer rates for the Fall 2002 through Fall 2006 cohorts, ranging from roughly 26.4% to 32.3%]

TARGETS
Transfer Rate: Caution <29%, Acceptable 29%-32%, Aspirational >32%
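The target bands on these slides follow one pattern: anything below the acceptable range is Caution, anything above it is Aspirational. A minimal sketch of that scheme (the function name is ours; the thresholds and sample rates come from the slides):

```python
def classify(rate, acceptable_low, acceptable_high):
    """Map a rate onto the IEM target bands: Caution, Acceptable, Aspirational."""
    if rate < acceptable_low:
        return "Caution"
    if rate <= acceptable_high:
        return "Acceptable"
    return "Aspirational"

# Graduation rate bands: <19% Caution, 19%-23% Acceptable, >23% Aspirational
print(classify(0.1414, 0.19, 0.23))   # Caution
# Transfer rate bands: <29% Caution, 29%-32% Acceptable, >32% Aspirational
print(classify(0.3227, 0.29, 0.32))   # Aspirational
```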
Definitions of Success & Retention
Success = Grades of (A, B, C & P) / (A, B, C, D, F, P & W)
Retention = Grades of (A, B, C, D, F & P) / (A, B, C, D, F, P & W)
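These definitions reduce to simple ratios over final grades. A minimal sketch, assuming a plain list of grades per course section (helper names are ours; retention is read as the non-withdrawal share, consistent with the roughly 90% retention rates on the next slides):

```python
SUCCESS = {"A", "B", "C", "P"}
RETAINED = {"A", "B", "C", "D", "F", "P"}          # every counted grade except W
COUNTED = {"A", "B", "C", "D", "F", "P", "W"}

def success_rate(grades):
    """Grades of (A, B, C & P) over all counted grades (A, B, C, D, F, P & W)."""
    counted = [g for g in grades if g in COUNTED]
    return sum(g in SUCCESS for g in counted) / len(counted)

def retention_rate(grades):
    """Non-withdrawal grades over all counted grades (A, B, C, D, F, P & W)."""
    counted = [g for g in grades if g in COUNTED]
    return sum(g in RETAINED for g in counted) / len(counted)

sample = ["A", "A", "B", "B", "C", "D", "F", "P", "W", "W"]
print(f"success:   {success_rate(sample):.0%}")    # 6 of 10 -> 60%
print(f"retention: {retention_rate(sample):.0%}")  # 8 of 10 -> 80%
```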
Retention Rates in Core Courses
Course             Fall 2005   Fall 2006   Fall 2007   Fall 2008   Fall 2009
Comp I             90.3%       89.9%       90.9%       91.6%       91.7%
Comp II            85.6%       85.4%       86.7%       86.9%       87.5%
College Algebra    72.4%       76.0%       74.5%       81.9%       79.0%
Speech             93.1%       90.5%       90.5%       90.3%       89.6%
Success Rates in Core Courses
Course             Fall 2005   Fall 2006   Fall 2007   Fall 2008   Fall 2009
Comp I             76.7%       74.6%       76.9%       76.9%       75.6%
Comp II            74.1%       80.3%       77.1%       79.3%       78.8%
College Algebra    61.2%       61.7%       62.9%       61.8%       66.2%
Speech             88.0%       83.3%       83.1%       86.3%       84.3%
Added a curricular analysis
• How well program goals support
the college’s mission.
• How well individual course
outcomes reinforce program
outcomes.
• How well instruction aligns with
the learning outcomes.
• Specific assessment results.
• Changes made based on the
assessment findings.
• Evidence of closing the loop.
• Changes made to the assessment
plan.
Action Plan
• Outcomes expected as a result of appropriate action steps.
• Timelines and persons
responsible for each action step.
• Resources needed with specific
budget requests.
• Evaluation plan with expected
benefits.
Bottom Line
• Is there sufficient evidence that
the program learning outcomes
are being met?
• Is there sufficient evidence that
the program is aligned with the
college on specific key indicators?
The Framework
STANDARDS 1-7
Focus on Institutional Performance
STANDARDS 8-14
Focus on Student Performance
Standard 7: Assessment of Institutional Effectiveness
How well are we collectively doing what we say we are doing, with a specific focus on supporting student learning? Assessment must be included in Standards 1-6.





 Strategic Analysis
 Institutional data
 Link to the IEM through use of common data sets
Standard 14: Assessment of Student Learning
Do we have clearly articulated learning goals, offer appropriate learning activities, assess student achievement of those learning outcomes, and use the results of assessment to improve teaching and learning and inform budgeting and planning? Also in Standards 8-13.
Curriculum Analysis
 Course assessment
 Program Assessment
 Core assessment
Planning and Budgeting (Standard 2)
 APR Action Plan
 APR Annual Report
 Annual Academic Planning
 Assessment Results
 Curriculum Committee
 President’s Office
 Curriculum BOT & BOT
Addition of Technology
• Worked in concert with Information
Technology to integrate iStrategy
with ERP (Datatel).
• The implementation permitted end users to obtain the data needed for program assessment without going through an intermediary (IR and/or IT).
Next Steps in the Evolution of College and Program Outcomes
Example of APR Report Card
Persistence

Fall to Fall - based on a cohort of first-time students from a specific semester, following their enrollment patterns one year out.

          Full-time              Part-time
Cohort    Program    College     Program    College
2005      75%        64.39%      25%        33.52%
2006      50%        63.16%      43.48%     37.09%
2007      70%        50%         33.33%     34.41%

Fall to Spring - based on a cohort of first-time students from a specific semester, following their enrollment patterns from the Fall semester to the following Spring semester.

          Full-time              Part-time
Cohort    Program    College     Program    College
2006      70%        78.19%      69.57%     45.12%
2007      80%        80.98%      45.83%     41.60%
2008      76.92%     80.93%      46.15%     39.00%

Graduation and Transfer

Graduation and Transfer Rate: Cohort Entering 2005 (first-time students to the College, no transfer credits)

                                   Full-time              Part-time
                                   Program    College     Program    College
Total Students                     14         1524        9          990
Degrees                            1          192         0          26
Graduation Rate (within 3 years)   7.14%      12.60%*     0%         2.62%
Transfers                          5          402         4          232
Transfer Rate (within 3 years)     35.71%     26.38%      44.44%     23.43%

*Acceptable: 19%-23%

Pass Rates on Licensure Exams

Degrees Conferred - these are the actual number of degrees conferred, not the degrees earned.

           2006    2007    2008    Three-Year Percent Change
Program    9       9       8       -11.11%
College    1058    1046    1127    6.52%

Fiscal
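The persistence figures in the report card reduce to a simple cohort calculation: the share of a first-time Fall cohort that is enrolled again in a later term. A minimal sketch (student IDs and the set-based data layout are hypothetical):

```python
# Hypothetical cohort data: IDs of first-time Fall students, and IDs of
# everyone enrolled in each comparison term.
fall_cohort = {"s01", "s02", "s03", "s04", "s05"}
next_spring = {"s01", "s03", "s04", "s99"}   # s99 is not in the cohort
next_fall = {"s01", "s04"}

def persistence_rate(cohort, later_term_enrolled):
    """Share of the cohort still enrolled in the later term."""
    return len(cohort & later_term_enrolled) / len(cohort)

print(f"fall-to-spring: {persistence_rate(fall_cohort, next_spring):.0%}")  # 60%
print(f"fall-to-fall:   {persistence_rate(fall_cohort, next_fall):.0%}")    # 40%
```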
Evidence of High Priority and Employment
Predictions
• Examples of Course Success
Success in ACC 111
Total College Success: ACC 111

                 2003/FA   2004/FA   2005/FA   2006/FA   2007/FA   2008/FA   2009/FA
% Success        61.4%     57.1%     55.4%     55.3%     51.4%     44.2%     48.3%
# Success        329       276       253       281       261       244       249
% Non-Success    38.6%     42.9%     44.6%     44.7%     48.6%     55.8%     51.7%
# Non-Success    207       207       204       227       247       308       267
Success in ACC 111
ACC 111: Success by Gender

                   2003/FA   2004/FA   2005/FA   2006/FA   2007/FA   2008/FA   2009/FA
% Female Success   63.3%     57.5%     58.8%     57.7%     57.3%     51.8%     58.7%
# Female Success   145       111       104       123       114       115       105
% Male Success     59.8%     56.8%     53.2%     53.6%     47.2%     39.1%     42.1%
# Male Success     180       163       149       158       145       127       133
Success in Math 010
Total College Success: Math 010

                 2003/FA   2004/FA   2005/FA   2006/FA   2007/FA   2008/FA   2009/FA
% Success        53.6%     46.3%     47.3%     45.7%     44.8%     43.3%     47.4%
# Success        310       266       276       293       297       288       344
% Non-Success    46.4%     53.7%     52.7%     54.3%     55.2%     56.7%     52.6%
# Non-Success    268       309       307       348       366       377       381
Success in Math 010
Math 010: Success by Race

                             2003/FA   2004/FA   2005/FA   2006/FA   2007/FA   2008/FA   2009/FA
% African American Success   42.6%     37.7%     38.5%     25.8%     26.9%     29.9%     34.7%
# African American Success   43        46        40        33        45        56        51
% Caucasian Success          58.2%     51.8%     50.3%     52.7%     53.5%     48.4%     52.0%
# Caucasian Success          202       184       180       217       206       180       141
Benefits
• Built a bridge between Standards 7 and 14.
• Better data.
• Putting data in the hands of faculty actively engages them in using data for decisions and planning.
• IR time better used.
• Annual planning cycle developed.
• Built a culture of assessment in
several of the academic divisions.
• Curricular changes that align with
graduation initiative.
• Curricular and program
improvement.
• Created a college-wide model for
improvement of student learning.
Evolution of the Dashboard
• Creation of a Student Success
Dashboard






Metrics:
 Course-level success and retention (developmental and college-level)
 Persistence (fall to spring and fall to fall)
 Progression of various cohorts of students
 College-level success in Math or English after developmental Math or English
 Graduation
 Transfer
Graphic Representation for
the SSD
Final Thoughts
 It’s not perfect, but it works for us.
 Do the research on which tools are appropriate for your college.
 Assessment of the core curriculum.
 Launching of assessment software.
 It all starts with asking the right question.
 PRR 2010
Questions
Presenters
Mr. H. Leon Hill
[email protected]
Director of Institutional Research
Dr. Joan E. Brookshire
[email protected]
Associate Vice President of
Academic Affairs