2012 Final Report

North Central State College
DEI Final Narrative & Financial Report
Submission Date: 8-31-12
Name and E-mail of Contact Person for this Report: Peg Moir [email protected]
Section 1: Final Narrative
1. (A) Assessment & Placement Policy Initiative/Strategic Direction One
Original goal: An extensive review and update of developmental placement policies and procedures for
all disciplines in light of cohort data, AtD national research, and evolving state policy. The review will
entail much more than a simple cut-score change; it will encompass all actions influencing the
assessment and placement process.
Action: With the assistance of an Ohio-based consultant who also serves as an AtD Data Coach, the team
produced an extensive research paper, which in turn led to committee policy recommendations in early
2011. At the same time, NC State was implementing a new strategic planning model under Carver Policy
Governance. The college selected Academic Preparation for College as a strategic focus area and linked
it to implementation of the DEI committee recommendations. The subcommittee then implemented
policy changes beginning in fall 2011. The major change areas included:
• New placement policy for English composition. To address concerns over gateway English
completion rates and to adapt to state directives lowering the COMPASS cut score from 81 to 69, the
college developed a multi-pronged placement and instructional strategy. This included pairing
the highest developmental writing course with gateway composition and creating a composition
“lab” course for students who fall into the 11-point margin between the former and
current cut-off scores. These students take a secondary writing assessment (COMPASS eWrite)
and, depending on their score, are placed into standard composition, the composition-lab
course, or developmental writing.
• New placement policy for reading. In winter 2011, the college implemented a policy requiring all
students testing below 60 on COMPASS (6th-grade level) to seek remediation through the Solutions
ABLE partnership before admission to college classes. See the Solutions section for detail.
• Policy changes and improvements in COMPASS preparation. The college initiated several
reforms to improve assessment and placement, some in connection with the Case Management
Advising strategy, such as largely eliminating “cold testing” on COMPASS without a preparation
opportunity. The college implemented and publicized a variety of preparation options, including
the Guilford Tech Moodle website, access to PLATO software, COMPASS preparation workshops,
and preparation help through the Tutoring Center. To incentivize up-front test preparation, a
$25 COMPASS retest fee was implemented for fall 2013, with an exception for students who
subsequently attend a boot camp or the Solutions Transition program.
• Free computer literacy workshops (funded by a state grant) and free tutoring to help more students
and potential students prepare for the computer literacy assessment.
Summary progress results: NC State’s placement levels have fallen dramatically, from 65.5% of
students placing into at least one developmental class for the 2010 cohort to 58.4% for the 2011
cohort. Further, the average number of developmental placements per student (across disciplines
and placement levels) dropped from 1.9 to 1.0. This was most noticeable in reading and writing
given the policy changes described above, including a one-year drop in writing placements from
41.2% to 26.7% of the cohort. Of the 206 students in the 2011 cohort taking the secondary writing
assessment, seven were diverted to developmental writing, 94 placed into the lab course, and 105
placed fully college-level.
Gateway English results: The results of the cut-score change were mixed, though the overall data for
AY 2011-12 appear to have been clouded by some academic departments advising college-ready
students to defer taking gateway English until semester conversion to ensure no credit loss. This
contributed to an overall cohort completion rate of only 32%. However, the 199 students placed
college-level via eWrite generally attempted English, resulting in a completion rate nearly double that
of COMPASS-placed college-level students. In fact, the eWrite group’s one-year attempt rate (74%)
was higher than that of any college-ready referral cohort since 2002. The first-attempt pass rate for
the eWrite group (61%) was lower than for COMPASS college-ready students (77%).
2011 Cohort Performance, First-Attempt ENG 101 (chart data)
                              Attempt    Complete
English lab course              72%        43%
eWrite, college level           76%        48%
College-level on COMPASS        35%        27%
Developmental                   38%        18%
Total                           48%        30%
COMPASS Prep: In spring 2011, only 30% of students surveyed on the CCSSE reported using
COMPASS prep materials. While the college will not re-administer CCSSE until 2014, there are
indicators that potential students are using preparation tools. For example, there were 50 users of
PLATO COMPASS prep courseware in FY 2012 for 210 system hours. Further, 2011 cohort placement
in math dropped from 55% to 52% despite no discipline-specific policy changes.
Nearly 500 community residents and students have attended free computer literacy workshops or
received individual tutoring. The pass rate on the computer literacy test was 70% in AY 2012, compared to 62%
in AY 2011 and 55% during the benchmark year (CY 2009) of the study.
Post-DEI Plans: The college has already started the process of expanding the gateway English lab course
strategies into the lowest-level developmental courses for fall 2012. The lower-level writing course has
been fully converted into lecture/supplemental lab and a special lab/lecture section has been created
for students who miss the cut-off for second-tier math by at least 15 points.
1. (B) Math Boot Camps (College and High School)/Strategic Direction Two
Original goal: The college will conduct quarterly math boot camps with a goal to serve 125 students
annually. At least 50% of camp completers will move up a level on their COMPASS retest. This has been
merged with the intervention for improved alignment with secondary education. This intervention had a
goal for increased collaboration with high school teachers to develop curriculum revisions toward
improving college-readiness.
Action: During AY 2012, math boot camps served 153 students (103 campus-based and 50 high school). This included
14% of targeted campus students based on COMPASS scores and 42% of invited high school students
based on their ACT PLAN scores. The chart below details camp attendance over three years.
During AY 2012, 44% of students retested advanced at least one level, of whom 32% tested college-ready.
While campus-based performance improved, high school-based performance declined.
NC State tracks the success of eligible campus boot camp attendees vs. eligible non-attendees per the
chart below. The academic outcomes of attendees in terms of “total success” continued to outpace
non-attendees in AY 2012, as nearly 40% of attendees either tested out of the major math requirement or
immediately passed a math course. Nearly 30 students were able to accelerate their developmental
education and save money as a result of the camp (a minimum of $16,000 in savings assuming each
advanced just one level), with some advancing multiple levels.
Ten of the high school students who attended the spring camp matriculated to NC State, of whom five
attempted math (all had advanced levels) and three succeeded. Of the 46 high school students from the
2012 camp, 18 have enrolled this fall at NC State.
Year 1 Math AtD Cohort Math Sequence Completion (chart data)
                2009    2010    2011
Attend BC        58%     50%     43%
No Attend BC     61%     50%     34%
Year-one math sequence completion was compared for eligible attendees and non-attendees. Note the
cohort attendee group sizes were 16, 28, and 20. The latter cohort attendees had much higher sequence
completion rates than non-attendees.
Post-DEI plans: The college has embedded boot camps within the new policy for assessment and
placement. It offers quarterly boot camps for math and English at $25 each; the fee covers the new
$25 COMPASS retest, giving students an incentive to attend the camps. Moreover, the math and
English departments now offer pre-COMPASS prep workshops for $10 each to help maximize
performance on the first test. For fall 2012, 23 students attended the math boot camp. The English
Department has also adopted pre-quarter boot camps. Based on the high school project, the college
leveraged a $100,000 federal Race to the Top grant in winter 2012 for regional colleges (3) and K-12
schools (16) to develop alignment strategies in math and English. These strategies will dovetail with
implementation of the Common Core curriculum in Ohio and the movement to a new assessment tool
replacing the Ohio Graduation Test (pegged at 10th-grade proficiency).
1. (C) Expansion of ABLE Skills Remediation Partnership/Strategic Direction Two
Original goal: The college will expand its partnership with Mansfield City Schools Adult Basic Literacy &
Education to offer on-campus, no cost assistance for individuals wanting to improve placement on
COMPASS or other entrance assessments such as Work-Keys and ASVAB. The partnership will annually
serve 130 individuals through the program, including a pre-college success skills workshop component
for which completers earn credit for NC State’s first-year experience course.
Actions in AY 2012: Under a new coordinator, the program finished its second year of complete
integration and co-location with the college Tutoring Resource Center (TRC). According to TRC activity
reports, approximately 35% of tutor hours are dedicated to serving Solutions students through the self-paced program. The program continued to strengthen relationships with the on-campus life skills
program administered for public assistance recipients, as well as the local GED program which was the
third largest “high school” provider of students in AY 2012.
Summary of AY Outcomes: The tutoring section of the program served 76 individuals, of whom 55 were
new program entrants. The success workshop served 53, of whom 46 were granted equivalency credit.
Solutions/Tutoring Participation (chart data)
                                       AY 2009  AY 2010  AY 2011  AY 2012
Unique individuals served                 34       62       91       76
People completing Solutions tutoring      25       51       68       68
Nearly 70 students completed tutoring services in AY 2012. The chart below reflects post-test
outcomes by the year in which the student started Solutions. Of the 194 post-tests, 56% have
resulted in an increase and 23% have post-tested college-ready. Most post-tests are in math
(78), followed by reading (62) and writing (54).
Post-Test Outcomes by Start Year (chart data)
                  AY 2009  AY 2010  AY 2011  AY 2012
Post-tests           25       63       77       29
Increased level      12       44       43        9
College-ready         4       22       17        1
The college tracks outcomes of course attempts by Solutions start year. Of 155 math attempts (136
developmental, 19 college) by students pre-testing developmental in math, students achieved a 62%
pass rate (62% developmental, 58% college). Of the 104 writing/English attempts (69 developmental, 33
college) by students pre-testing developmental in writing, students achieved a 62% pass rate (68%
developmental, 49% college). Of 28 reading attempts by students pre-testing developmental in reading,
20 have passed.
Math Course Success by Solutions Entrance Year (chart data)
            AY 2009  AY 2010  AY 2011  AY 2012
Attempts       23       57       53       23
Passed          8       38       35       15

Writing/English Success by Solutions Start Year (chart data)
            AY 2009  AY 2010  AY 2011  AY 2012
Attempts       20       36       39        9
Passed         10       22       27        5
The college compares the retention of Solutions students who enroll and are counted within the fall AtD
cohorts (18, 11, and 11 students, respectively). The following charts compare retention against the full
cohort of students referred developmental. Retention rates are generally higher for students who went
through Solutions.
Solutions AtD Cohort Fall-to-Fall Retention (chart data)
             FA2009  FA2010  FA2011
Solutions      56%     55%     45%
Dev cohort     46%     42%     42%

Solutions AtD Cohort Fall-to-Winter Retention (chart data)
             FA2009  FA2010  FA2011
Solutions      72%     82%     91%
Dev cohort     78%     72%     75%
Post-DEI plans: The college intends to continue this program through an investment of institutional
funds, ABLE funds, and other local, state and federal grants. For example, NC State student tutors
working with Solutions students are paid from Perkins/Work Study. The college has also received a state
grant to expand the Solutions program to an adjoining county, combining the services with GED prep.
1. (D) Expansion of Self-Paced Learning Lab/Strategic Direction Two
Original goal: The college will expand the number of seats available through its PLATO courseware
learning lab to accommodate demand for alternative delivery of developmental coursework, while
ensuring quality of education. Note that CCSSE surveys in 2006, 2008 and 2011 have consistently shown
that the hours NC State students spend per week on dependent care and work for pay were statistically
significantly above those of peer colleges, helping to confirm the need for such an intervention.
The college has set a service target of 1,000 duplicated enrollments.
Action: During AY 2012, the college upgraded to the new generation of PLATO PLE, offering numerous
enhanced options, including software benefitting the Assessment/Placement policy and Tutoring
initiatives. In addition, the English department required that the courseware be integrated into the
curriculum for all reading and lower-level writing classes. Finally, the math department mandated that
students who fail a PLATO-based math course retake it in a lecture format to provide greater oversight.
Summary of AY 2012 Outcomes: There were 863 attempts through PLATO, comprising 45% of
developmental attempts, compared with 32% and 35% in the two prior years.
Math PLATO v. Lecture Attempts (chart data)
          AY 10  AY 11  AY 12
PLATO       619    602    488
Lecture    1231   1227    918

Math PLATO v. Lecture Success (chart data)
          AY 10  AY 11  AY 12
PLATO       52%    61%    62%
Lecture     63%    58%    63%
The duplicated success rate of students who attempt PLATO remains competitive with lecture. The
lowest-level PLATO math was the most successful course (68%) after several years of being least
successful, while performance in the higher levels dipped four percentage points apiece.
Writing PLATO v. Lecture Attempts (chart data)
          AY 10  AY 11  AY 12
PLATO        83    148    205
Lecture     671    437    142

Writing PLATO v. Lecture Success (chart data)
          AY 10  AY 11  AY 12
PLATO       53%    61%    65%
Lecture     67%    62%    68%
For the first time, developmental writing attempts integrated with PLATO exceeded lecture attempts.
The success rates for both methods continue to increase and remain competitive with one another.
Reading PLATO v. Lecture Attempts (chart data)
          AY 10  AY 11  AY 12
PLATO       238    169    170
Lecture      99     98      0

Reading PLATO v. Lecture Success (chart data)
          AY 10  AY 11  AY 12
PLATO       57%    53%    61%
Lecture     62%    58%     –
All reading is now delivered with the assistance of PLATO courseware. The success rate did increase in
AY 2012, but this is also likely impacted by a newer college policy diverting applicants who score below
60 (6th-grade level) on COMPASS reading to ABLE/Solutions.
Post-DEI Plans: The college now charges a technology fee to students taking PLATO courseware to help
cover the subscription cost. However, these students are not required to purchase the standard text of
the lecture course and save approximately $75. The college intends to continue monitoring performance
and making necessary adjustments to balance convenience and quality – as the math department has
done throughout this grant. In addition, other departments, especially tutoring and advising, will
continue making use of the courseware to enhance their services.
1. (E) Enhanced Case Management/Strategic Direction Three
Original Goal: Building on “intrusive advising” practices, the college staff advising center will shift to an
assigned case-manager model for advising. It will invest in comprehensive student engagement software
connecting various support offices and interfacing with the student information system to support the
new model. As a result, cohort students will more quickly progress through AtD success milestones.
Action: During AY 2012, the College continued to update and refine this strategy based on new data and
connection with other DEI initiatives. For example:
• Based on a study from the Institutional Research and Math departments, beginning in fall 2012,
students testing into only the highest developmental math level will no longer be mandated into
the advising program.
• The advising department increased its level of referrals to the ABLE/Solutions program,
especially for students entering pre-health programs at semester conversion. This was done
to help students more efficiently meet new health science gateway prerequisites requiring
completion of all developmental coursework. The department also increased referrals to
ABLE/Solutions of existing developmental students who were midway through their sequence.
Summary progress results: Since the fall 2011 cohort was the first to extensively receive this intervention,
its performance outcomes are tracked against prior cohorts using the key AtD Year 1 measures.
The 2011 cohort continued to show strong performance in developmental math outcomes. Note
gateway outcomes are not tracked because most students at NC State did not require college-level math
until semester conversion in fall 2012. Writing outcomes did fall, but that may be partially attributed to
the new placement policy which moved nearly 200 previously developmental students to the college-
level.
Reading outcomes also declined, even though the college implemented a placement change in 2011
diverting all students testing below 60 on COMPASS to ABLE/Solutions. Despite having fewer students
with predicted reading difficulty, the attempt and completion rates remained stagnant.
Term-to-term retention increased slightly, while fall-to-fall remained constant and kept pace with the
whole cohort. The 2011 cohort also completed a higher ratio of credits than past cohorts. Also,
developmental students with fewer than 30 credit hours have reported increasingly positive perspectives
on advising through the CCSSE survey – the most recent covering partial implementation of the model.
Post-DEI plans: In summer 2012, the college merged its admissions intake and staff advising
departments after an extensive internal operational study. Based on this consolidation, it will have
increased capacity to service students in the developmental advising program. Moreover, the
conversion to semesters decreases by one term the number of mandatory meetings for continuing
students, which will also add to program advising capacity. The college will also investigate the best
means of balancing this developmental intake/advising strategy with consumer demand for rapid
registration. Finally, the college has completely moved away from the RetentionZen software
originally purchased through DEI funds, and this year will implement a new early alert module.
1. (F) Tutoring Center Initiative/Strategic Direction Three
Original goal: The college will create a robust tutoring environment through enhancements in
space/equipment, personnel, and training practices – with a special focus on developmental
students. The college will employ an accredited full-time Tutor Coordinator to oversee the hiring,
coordination, training and evaluation of all tutors. This person serves as liaison to the classroom,
ensuring proper coordination of requirements, techniques and advisements for alignment/integration
of tutoring with class curriculum.
Action: The Tutoring Resource Center (TRC) completed its third year of operation in summer 2012.
While serving numerically fewer students due to declining enrollment, it served an increased percentage
of students (both unique and duplicated attempts). The chart below compares overall usage statistics.
Over three years (summer 2012 numbers pending), 26% of unique developmental students at North
Central State have visited the TRC, comprising 17% of all duplicated developmental attempts. The
college tracks tutoring services according to quarterly frequency thresholds: no use, lower use (less
than 1 hour for writing/reading and less than 2 hours for math), and greater use (at least 1 hour for
writing/reading and at least 2 hours for math). The charts below track frequency of use and success by
frequency of use. Through spring 2012, 62% of tutored students (duplicated) had greater use of services
and 38% had lower use. The following charts track use and performance by discipline from fall through
spring for each academic year:
Frequency of Math Tutoring (chart data)
           AY2010  AY2011  AY2012
No tutor     86%     79%     80%
< 2 hr        6%     10%      8%
>= 2 hr       8%     12%     13%

Frequency of Writing Tutoring (chart data)
           AY2010  AY2011  AY2012
No tutor     88%     84%     81%
< 1 hr        3%      4%      6%
>= 1 hr       8%     14%     14%
Summary progress results: The college has tracked success (ABC grade) in the tutored developmental
course, disaggregating by frequency of tutoring. The following charts compare success in developmental
math and writing through the first three years of the grant (fall through spring, as summer 2012 data are
not yet finalized).
Dev Math Success by Tutor Frequency (chart data)
           AY2010  AY2011  AY2012
No tutor     58%     58%     62%
< 2 hr       51%     62%     75%
>= 2 hr      68%     60%     64%

Writing Success by Tutor Frequency (chart data)
           AY2010  AY2011  AY2012
No tutor     63%     64%     64%
< 1 hr       92%     65%     79%
>= 1 hr      70%     64%     77%
AtD Cohort Fall-to-Fall Retention by Tutoring (chart data)
                2009  2010  2011
All dev          46%   42%   42%
No tutor         37%   31%   30%
Any frequency    48%   49%   51%
>= threshold     51%   50%   49%
In both disciplines, the data show that students who receive tutoring succeed at higher rates than
students with no tutoring regardless of frequency. However, frequency of tutoring does not correlate
with success, as the most successful students were those with less than two hours of quarterly tutoring.
An analysis of outcomes by class shows tutoring has the greatest impact at the second developmental
math level. Another key outcome measurement is correlation of tutoring to retention:
AtD Cohort Fall-to-Winter Retention by Tutoring (chart data)
                2009  2010  2011
All dev          78%   72%   75%
No tutor         76%   71%   72%
Any frequency    87%   79%   81%
>= threshold     69%   83%   73%
The data reflect outcomes for cohort students referred developmental, breaking them out by students
who received no tutoring, tutoring of any frequency, and those with at least one record of greater-frequency
use. The fall-to-winter chart represents outcomes in the first fall term, while the fall-to-fall chart
represents outcomes in the first year (fall through spring, with tutored students receiving service in at
least one term). In every instance, students who received any frequency of tutoring achieved higher
retention rates than their peers with no tutoring and than the developmental cohort as a whole. Those
receiving greater amounts of tutoring, who may reflect greater academic need, outperform the
developmental cohort as a whole fall-to-fall. Further, fall-to-fall retention of tutored students is growing
while the developmental cohort has stagnated and the no-tutoring group is falling.
Post-DEI Plans: The college is sustaining the TRC beyond DEI through increased institutional investment
as well as several supplemental state, federal and private grants. Moreover, the TRC is embedding more
tutors into the classroom. In fall 2012, tutors will align with classes representing six academic disciplines.
2. Obstacles
We began our DEI work with state leadership that seemed to be very focused on making community
college an integral part of a university system. It felt like we in DEI had the wind at our backs because of
the support of the state leadership. State leaders developed student success-oriented performance
measures and tied our funding to these, mandated seamless transfer from community colleges to state
universities, required a common semester calendar, and strongly encouraged ABLE-community college
partnerships. This vision and focus faded after the 2010 election, which resulted in another party taking office.
The change in leadership seemed to slow the momentum and the certainty of the state’s direction for
community colleges and higher education in general. There was no longer the interest in and focus on
developmental education that had existed previously. In addition, many of the mid-level state leaders
who had been strong supporters of DEI and Ohio’s work in developmental education either resigned,
retired or were replaced after the election.
Internally, we had some distractions. Our Institutional Research capacity needed to be strengthened,
and our understanding of effective research design was lacking. Our faculty organized in 2008, and
negotiating the first contract was a long process, not without disagreements. This process
sometimes seemed to distract from the student-centric approach that we were seeking. Some faculty
were skeptical about the intent or perceived value of our DEI interventions. While we considered it an
honor to be selected to participate in DEI and were very proud of our work, there were some who
questioned whether the publicity we received relative to DEI could be a double-edged sword – feeding
some outsiders’ negative perceptions of the students that we serve and the rigor of the courses that we
teach. Some asked “Do we really want to be known for our work in developmental education? Do we
want an emphasis on developmental education to become part of our brand?”
3. What would you do differently next time?
Require that every intervention be initiated and owned by faculty. Require a strong research design
prior to implementation of any intervention. Have data coaches for DEI and have a team of 2-3 data
coaches approve and “sign off” on the proposed research design for each intervention.
Focus beyond developmental education and look closely at pedagogy. The focus on developmental
education is not a “call to action” for many faculty members and other members of the college
community. Faculty do not always realize that while they may not teach developmental courses, they
all teach developmental students. The challenge to develop different and better pedagogy can serve as
a call to action for all faculty.
4. Additional outcomes not directly associated with DEI objectives
We were fortunate in Ohio to have had five colleges participating in DEI, and a great deal of coaching,
resource sharing, collaboration, and learning from one another has resulted. The Ohio Student Success
Center will be led by Ruth Silon from Tri-C, and she will tap the DEI colleges to coach other Ohio colleges
in the future.
North Central was able to deepen its relationships with local K-12 partners, and the dialogue and action
toward aligning mathematics and English expectations have been extremely productive.
It also seems that DEI gave the state of Ohio the impetus to push for collaboration between ABLE and
higher education, resulting in a number of pilot programs around the state.
Progress Toward Scale
5. Describe progress you made toward scaling up your DEI interventions, i.e., significantly increasing
the number and/or proportion of the intervention’s targeted population participating in the
intervention.
• Case management. We already had the Directions program, but this intervention certainly
increased the intensity and effectiveness of services.
• PLATO usage increased from 33% to 45% of developmental attempts in AY 2012.
• The assessment and placement policy initiative impacted all potential students, including
areas outside of traditional developmental education (computer literacy).
• The Tutoring Center impacts nearly one-quarter of unique developmental students, as well
as serving Solutions, COMPASS prep and other groups linked to DEI.
• The math boot camps touched nearly 325 college and high school students, including 14%
of the “eligible” college population.
• The Solutions initiative, operated through the TRC, is serving 130 persons annually through
academic help and success skills workshops at no cost to the student through our ABLE
partnership.
6. What effect did scaling up have on student outcomes? Did you see results proportional to the
increase in reach and effort? If so, what approaches were most successful?
• Overall, it is important to note the rapid increase in student risk factors during the
recession. Over the past three years, about 40% of the student population has had an EFC of 0
on the FAFSA, and aggregate fall Noel-Levitz CSI data taken by all developmental students show
predicted academic difficulty rising from 43 in 2008 to 66 in 2009 and maintaining that level
throughout. Given these risk factors, the fact that we were able to continue improvements in
areas such as sequence completion in math while holding steady in other areas such as
developmental student term-to-term retention and credit completion shows impact.
• Case management. Even when this intervention was only partially implemented, our CCSSE
(spring 2011) and SENSE (fall 2010) responses were very positive on questions dealing with
academic advising. In fact, the SENSE benchmark score for developmental students on
“Clear Academic Plan and Pathway” was 63.5 for NC State, in comparison to 54.4 for small
colleges and 52.0 for all colleges. All five questions comprising the benchmark were
statistically significant in comparison to peers.
• Pass rates via PLATO consistently improved over the course of the grant and are now
competitive with those of lecture formats for the same disciplines.
• Placement into any developmental course dropped from 66% for the fall 2010 cohort to 58%
for fall 2011. At the same time, the main target group (the 11-point COMPASS spread from the
writing cut-score reduction) had a first-attempt completion rate much higher than that of
students who scored higher on COMPASS. Also, the pass rate on the computer literacy test
rose to 70% from 55%.
• The retention rates for tutored students, even those who receive minimal levels of
tutoring, are much higher than those of the developmental cohort as a whole.
• The subsequent course success and sequence completion rates of boot camp students are
much higher than those of non-attendees and the cohort as a whole.
• Cohort students who go through Solutions and subsequently enroll achieve higher
retention rates than the overall developmental cohort.
7. What did you learn by expanding interventions to more students that will help you reach most (if
not all) that are eligible as you continue to offer these interventions?
We learned that one of the best ways to move expediently from more to most is to integrate the
intervention into the classroom experience rather than offering it as an option or add-on.
Final Reflections
8. In your opinion, what is the most remarkable accomplishment or finding of DEI on your campus?
We are extremely encouraged by the success rates of our students who have participated in
the tutoring intervention, with Solutions as a subset of the tutored population. Delivering
embedded tutoring in all developmental writing and mathematics courses is our plan to move
from some to more to most.
We also believe that our work in AtD and DEI has transformed our college. Our Board of
Trustees (following the Carver Model for Policy Governance) reevaluated their ENDS policies
for the College using the latest input from the Ownership. What they discovered was that the ENDS
policy focus for the College had shifted toward student success and completion rather than merely
student access. This shift forced the College to develop a strategic plan geared toward student
success and completion. Our DEI work is woven throughout the institutional strategic plan.
9. What additional questions about developmental education student success did DEI raise for you?
• What should orientation and the first-year experience look like for students who place
into developmental education? Should it be different from orientation for college-ready
students? In what ways? Will a different first-year experience increase success
and completion rates for these students? (We recently initiated an AQIP action
project that is studying the first-year experience and is grappling with some of these
questions.)
• What is the value and effectiveness of standardized academic assessments in
determining student placement into developmental education? What about
characteristics like student motivation and commitment to college? Can they be
measured? How might we use that information to help with career decisions or give
course placement advice?
10. How will you sustain the benefits, momentum, and spirit of this project? How will the lessons
learned from this grant affect your future work or the work of others?
The momentum and spirit of this work is reflected in the Ends Policy that has been set forth by
our Board of Trustees. Our DEI work is woven through our institutional strategic plan. We have
incorporated all of our DEI interventions into our institutional budget. Our DEI team will
continue as our AtD Core Team. Our continued affiliation with AtD, the DREAM conference, and
Ohio’s new Student Success Center will provide opportunity for us to collaborate with other
colleges, share our past and future work, and learn from our counterparts.