PERSISTENCE AND SUCCESS: A STUDY OF COGNITIVE, SOCIAL, AND
INSTITUTIONAL FACTORS RELATED TO RETENTION
OF KALAMAZOO PROMISE RECIPIENTS
AT WESTERN MICHIGAN UNIVERSITY
by
Michelle Ann Bakerson
A Dissertation
Submitted to the Faculty of the Graduate College
in partial fulfillment of the requirements for the
Degree of Doctor of Philosophy
Department of Educational Leadership, Research and Technology
Advisor: Gary Miron, Ph.D.
Western Michigan University
Kalamazoo, Michigan
December 2009
PERSISTENCE AND SUCCESS: A STUDY OF COGNITIVE, SOCIAL, AND
INSTITUTIONAL FACTORS RELATED TO RETENTION
OF KALAMAZOO PROMISE RECIPIENTS
AT WESTERN MICHIGAN UNIVERSITY
Michelle Ann Bakerson, Ph.D.
Western Michigan University, 2009
The Kalamazoo Promise, a universal scholarship program announced in
November 2005, provides four years of tuition and fees at any of Michigan's two- or four-year
public colleges or universities for students who have attended Kalamazoo Public
Schools. This investment in the community is being replicated elsewhere across the
nation, including in Denver and Pittsburgh. The scholarship program lowers the cost of
postsecondary education, thereby increasing incentives for high school graduation,
college enrollment, and college completion. Of the 307 Kalamazoo Promise Scholarship
recipients who have attended Western Michigan University since the program's inception,
16% have been academically dismissed.
The main objectives of this study were to: (1) examine persisters, those on
probation, and non-persisters in terms of the Cognitive, Social and Institutional factors of
retention; (2) examine persisters, those on probation, and non-persisters in terms of
average courses taken per term and number of courses taken the first year; and (3)
examine non-response bias in terms of respondents, late respondents, and non-respondents.
The following are some of the key findings from the dissertation:
Persisters had higher high school GPAs and higher ACT composite scores and were more
likely to be White. Similarly, persisters took, on average, more courses per term and
more courses the first year than either those on probation or non-persisters. As a
contribution to research and evaluation, a number of different approaches were used to
study potential non-response bias among scholarship recipients. Depending on the
approach, small or insignificant differences in non-response bias were identified.
Because non-response bias was minimal, the overall findings and conclusions were
viewed as valid and did not need to be adjusted.
Various factors found in the literature to contribute to student retention, such as
parental income and living in a dorm, did not function as expected with this population.
In addition, the examination of non-response error, and therefore possible non-response
bias, was an extra step taken to help ensure the quality of the generalizations being made.
It is hoped that further research using these results as a benchmark will be conducted in
order to more fully understand the persistence and success of Kalamazoo Promise
recipients.
Copyright by
Michelle Ann Bakerson
2009
ACKNOWLEDGEMENTS
A dissertation is a strange experience. In the beginning it feels like a distant,
vague dream, one that you can barely see. Towards the middle you think, 'Maybe I will
be able to do this, just maybe I can get this done.' While everyone encourages you, and
those who have been through it know you can do it, you never really believe them,
imagining that the end is just too far away. And when, after all the hours, days, late, late
nights, and weekends, one day that last section is finally done, it is an awesome feeling to
know that you reached high and achieved a goal that took so long to accomplish.
Without Dr. Gary Miron, my chairperson, this goal of mine would not have been
possible. Through all the tireless hours you spent helping me, bouncing ideas, solving
various problems, connecting me with stakeholders, helping with participant incentives,
or even helping me speak to participants when I had laryngitis, you were always supportive, encouraging and
positive. I could not have asked for a better chair or a better experience. You have been
an extremely knowledgeable mentor and I appreciate everything that you have done for
me.
My sincere thanks go to Dr. Jessaca Spybrook for your substantial feedback,
attention to detail and extensive knowledge. Your time and guidance have been very
much appreciated. Dr. Andrea Beach, thank you for agreeing to be on my committee;
your feedback has also been very helpful and your expertise very much appreciated. I
could not have asked for a better committee.
Dr. Brooks Applegate, thank you for always challenging me and encouraging me
to reach farther. Your classes were some of the most challenging and worthwhile. I
appreciate everything you have taught me over the years and thank you for always
making me think outside of the norm.
Special thanks go to Bob Jorth, administrator of the Kalamazoo Promise; Tracy
Pattok, director of the Student Academic and Institutional Research office at Western
Michigan University; and to Patricia Williams, facilitator of the Kalamazoo Promise at
WMU. Without the three of you, I would not have been able to complete this research.
Thank you for your time and willingness to help with this endeavor.
A sincere thank you goes to my business partner, Nakia James, for pulling the
weight at Momentum Consulting and Evaluation, LLC, and for being so supportive.
Katya L. Gallegos Custode and Tammy DeRoo, thank you for your help with the
interviews. Without your willingness to help this all would have taken so much longer.
Thank you to all of my friends at Western, including June Gothberg, John Hoye,
Michael Noakes, Maxine Gilling, Nadini Persaud, Fatma Ayyad, Julien Kouane, Manuel
Brennes, Chris Coryn, Daniella Schroter, Brandon Youker, Christian Gugiu, Wesley
Martz, Amy Gullickson, Otto Gustafson, Willis Thomas, and Thomaz Chianca: it was
always a pleasure working with you. I enjoyed our many conversations together.
Thank you to all of my friends and colleagues at Indiana University South Bend
for your support and encouragement, including but not limited to Dr. Yvonne Lanier,
Dr. Denise Skarbeck, Dr. Gwendolyn Mettetal and Erika Zynda. Thank you to
Dr. Michael Horvath, Dr. Karen Clark and Dr. Bruce Spitzer for believing in me and
giving me the opportunity to be part of the IUSB community.
A very special thank you goes to my family for hanging in there with me. To my
brothers, Charley and Andy, thank you for continually calling and checking on me, even
when I didn't have time to call you back. Your belief in me and support have helped me
get this far. I love you guys. Thank you to Tina, Brandan, Courtney, Alex and Tanner,
too.
Thank you to Lori, Michael, Mika, Taylor, Sandy, Vera and Judy: you have been
such a tremendous support to me. Your help with the kids and your encouragement have meant
so much to me. Michael and I could not do the things we do without you. Thank you to
all of my friends for not giving up on me and still inviting me when you knew I wouldn't
be able to go. You made me feel loved and that is exactly what I needed during this time.
A special thank you also goes to Tori Davies, University of Notre Dame, for your
wonderful editing; to Erik Gunn of Great Lakes Editorial Services for your final editing;
and to Dr. Sandra Harris, Troy University, Maureen Hogue, University of Notre Dame,
and Dr. Martin Klubek, University of Notre Dame.
My deepest thank you goes to my husband, who has relentlessly stood by my side,
encouraging me all the way through this process. Thank you for picking up the pieces I
have dropped along the way and keeping our family together. Without you I would not
have been able to accomplish this. I love you with all of my heart. Equally important, a
huge thank you goes to my children, Bailey, Aliea and Audrey; you have been extremely
understanding of my lack of time. Without your help around the house, your
understanding and your patience, I am sure I would not have been able to finish this.
Thank you for always being happy for me when I was one step closer. You three are my
life and I cannot wait to finally do all the things we have been planning.
Bailey, Aliea and Audrey, this dissertation is dedicated to you: may you reach high and
dream big.
Michelle Ann Bakerson
TABLE OF CONTENTS
ACKNOWLEDGEMENTS .... ii

LIST OF TABLES .... x

LIST OF FIGURES .... xiv

CHAPTER

I. INTRODUCTION .... 1

    Background of the Kalamazoo Promise .... 2
    Development of the Kalamazoo Promise .... 3
    Problem Statement and Research Questions .... 6
        Research Question One .... 6
        Research Question Two .... 8
        Research Question Three .... 8
    Methodological Overview .... 10
    Rationale for the Dissertation .... 10
    Structure and Overview of the Dissertation .... 11

II. REVIEW OF RELATED LITERATURE .... 12

    Kalamazoo Promise .... 12
    College Retention .... 18
    Non-response Bias .... 23
    Conclusion .... 31

III. METHODOLOGY .... 32

    Purpose .... 32
    Research Design .... 33
    Sample .... 34
    Procedures for Data Collection .... 36
    Informed Consent Process .... 38
    Research Procedure .... 39
    Data Analysis .... 43
        Research Questions 1 and 2 .... 47
        Research Question 3 .... 48
    Ethical Considerations .... 52
    Limitations .... 55
    Summary .... 56

IV. RESULTS .... 57

    Summary Academic Data .... 57
    Survey Summary .... 64
    Research Question One Results .... 73
        Research Question 1.1 Results .... 74
        Research Question 1.2 Results .... 84
        Research Question 1.3 Results .... 85
    Research Question Two Results .... 87
        Research Question 2.1 Results .... 87
        Research Question 2.2 Results .... 91
    Research Question Three Results .... 93
        Data Analysis for Research Question 3 .... 94
        Research Question 3.1 Results .... 98
        Research Question 3.2 Results .... 102
        Research Question 3.3 Results .... 105

V. CONCLUSIONS .... 111

    Central Findings .... 111
    Research Question One Conclusion .... 112
    Research Question Two Conclusion .... 118
    Research Question Three Conclusion .... 120
        Research Question 3.1 .... 120
        Research Question 3.2 .... 123
        Research Question 3.3 .... 125
    Future Research .... 129
    Potential Implications .... 130

REFERENCES .... 135

APPENDICES

    A. Participant Paperwork .... 142
    B. Cognitive, Social and Institutional Factors of Retention and Corresponding Survey Items, Academic Data Variable Names with Measurement Type, and Cognitive, Social and Institutional Factors of Retention and Corresponding Survey Items with Subscales Identified .... 171
    C. Summary Demographic Data on Interval Level Data Across .... 178
    D. Tests of Normality and Persistence, Those on Probation and Non-persistence, Subscale Scores of the Survey of Promise Recipients by Response Category .... 183
    E. Tests of Homogeneity of Variance and Subscales of the Subscale Scores of the Survey of Promise Recipients by Response Category .... 185
    F. Summary Results of Item Analysis, Institutional Support, Social Engagement, Social Demands and Cognitive Engagement .... 187
    G. Survey Summary Tables .... 198
LIST OF TABLES
1. Kalamazoo Promise Summary Data .... 13

2. College or University Attendance for Current Promise Users as of Fall 2009 .... 14

3. Bias and Percentage Bias in Respondent Mean Relative to Total Sample Mean .... 30

4. Data Obtained from the Facilitator of the WMU Kalamazoo Promise and the Office of Student Academic and Institutional Research .... 35

5. Breakdown by Type of Data Collected and Persistence .... 35

6. Summary Research Questions 1 and 2 with Independent and Dependent Variables, Data Source and Method of Analysis .... 50

7. Summary Research Question 3 with Independent and Dependent Variables, Data Source and Method of Analysis .... 51

9. Average High School GPA by Probation Status at WMU .... 59

10. Most Recent WMU GPA by Probation Status at WMU .... 60

11. First Promise Semester .... 61

12. FTIAC Cohort by Persistence and Non-Persistence .... 62

13. High School Attended by WMU Kalamazoo Promise Recipients .... 63

14. Distribution of Promise Students by Race, Gender and High School .... 63

15. High School by Persistence and Non-Persistence .... 64

16. Distribution of Promise Students who Answered the Survey by Probation Status .... 65

17. Did You Begin College at WMU or Elsewhere? .... 66

18. Do You Expect to Enroll for an Advanced Degree When, or if, You Complete Your Undergraduate Degree? .... 67

19. Where Do You Live During the School Year? .... 67

20. What is the Highest Level of Education Obtained by Your Father or Mother? .... 68

21. About How Many Hours Do You Spend in a Typical 7-day Week Doing Each of the Following? .... 69

22. If You Have a Job, How Does it Affect Your School Work? .... 69

23. How Likely is it That the Following Issues Would Cause You to Withdraw From Class or From WMU? .... 70

24. Are You a Member of a Social Fraternity or Sorority? .... 70

25. How Supportive Are Your Friends of Your Attending WMU? .... 71

26. How Supportive is Your Immediate Family of Your Attending WMU? .... 71

27. Which Best Represents the Quality of Your Relationship With Students at WMU? .... 72

28. Which Best Represents the Quality of Your Relationships With Instructors at WMU? .... 72

29. Which Best Represents the Quality of Your Relationship With Administrative Personnel & Office Staff at WMU? .... 72

30. If You Could Start Over Again, Would You Still Attend WMU? .... 73

31. Would You Recommend WMU to a Friend or Family Member? .... 73

32. Summary of Omnibus MANOVA Test of Group Differences across Dependent Variables .... 76

33. Summary of Test for Differences by Race across the Dependent Variables .... 77

34. Summary of Test for Differences by Gender across the Dependent Variables .... 79

35. Summary of MANOVA Results for Persistence across the Dependent Variables .... 80

36. Summary of MANOVA Results for Persistence and Race Interaction across the Dependent Variables .... 81

37. Crosstab for Tests of Differences Among Persisters, Those on Probation, and Non-persisters Across the Variables of Persistence and Taking Remedial Courses at WMU, AP Credit, Gender, and Race .... 83

38. Crosstab for Tests of Differences Among Persisters, Those on Probation, and Non-persisters Across the Variables of Persistence and Taking Remedial Courses at WMU, AP Credit, Gender, and Race .... 86

39. ANCOVA Results of Persisters and Non-persisters, Controlling for Race and Gender for Average Number of Courses Taken Per Term .... 88

40. Descriptive Statistics for Gender, Race and Persistence for Average Number of Courses Taken Per Term .... 89

41. ANCOVA Results of Persisters, Those on Probation, and Non-persisters, Controlling for Race and Gender for Number of Courses Taken the First Year .... 90

42. Descriptive Statistics for Gender, Race and Persistence for Number of Courses Taken the First Year .... 90

43. ANCOVA Results of Respondents, Late Respondents and Non-respondents, Controlling Across Race and Gender for Average Number of Classes Taken Per Term .... 91

44. Descriptive Statistics for Gender, Race and Response for Average Number of Courses Taken Per Term .... 92

45. ANCOVA Results of Respondents, Late Respondents and Non-respondents, Controlling across Race and Gender for the Number of Courses Taken the First Year .... 93

46. Descriptive Statistics for Gender, Race and Response for Number of Courses Taken the First Year .... 93

47. Summary of Results from the Reliability Analysis and Descriptive Statistics for Subscales of the Survey of Promise Scholarship Recipients .... 96

48. Distribution of Promise Students who Answered the Survey by Probation .... 99

49. Summary of MANOVA Comparison of Survey of Promise Scholarship Recipients at WMU Spring 2009 across the Demographic Variables .... 101

50. Test Groups Differences across the Categorical Variables .... 102

51. Summary of MANOVA Results for Early and Late Respondents of the Survey of Promise Scholarship Recipients on the Four Summated Subscale Scores .... 103

52. Summary of Comparison of Early Respondents and Late Respondents across the Categorical Variables of the Survey of Promise Scholarship Recipients .... 104

53. Descriptive Statistics for the Subscale Scores of the Survey of Promise Scholarship Recipients .... 106

54. Response Bias Estimates for the Subscale Scores of the Survey of Promise Scholarship Recipients Based on Mean Scores of the Participants Using the 53% Response Rate .... 107

55. Response Bias Estimates for the Subscale Scores of the Survey of Promise Scholarship Recipients Based on Median Scores of the Participants Using the 53% Response Rate .... 108

56. Response Bias Estimates for the Subscale Scores of the Survey of Promise Scholarship Recipients Based on Mean Scores of the Participants Using the 33% Rate .... 109

57. Response Bias Estimates for the Subscale Scores of the Survey of Promise Scholarship Recipients Based on Median Scores of the Participants Using the 33% Response Rate .... 110

58. Results of Mean and Median used in Bias Ratio Formula at the 53% Response Rate .... 125

59. Results of Mean and Median used in Bias Ratio Formula at the 53% and 33% Response Rate .... 126
LIST OF FIGURES
1. Kalamazoo District Enrollment Trend .... 16

2. District Enrollment Trends for Kalamazoo, Battle Creek and Flint City Public School Districts .... 17
CHAPTER I
INTRODUCTION
Across the nation, states and school districts struggle with high dropout rates and
fairly low college enrollment rates (Cataldi, E., Laird, J., & KewalRamani, A. [NCES],
2009). These issues can leave them at a competitive disadvantage as state and federal
education authorities increasingly tie aid to measures of performance such as test scores
and graduation rates. In addition, the cost of higher education has increased 7% more
than the rate of inflation over three years (Cunningham, A. [NCES], 2005). The disparity
between the cost of education and the meager rise in family incomes leaves some families
wondering whether or not post-secondary education will be attainable. Recognizing these
issues, many states, cities and citizens have acknowledged the need to improve the
education systems, hoping to increase their communities' human and social capital.
In Kalamazoo, Michigan, an anonymous group of concerned citizens created a
novel and innovative post-secondary education scholarship program solely for
Kalamazoo district school children. The idea of this universal scholarship is very simple:
students who live in the district and have gone to school in the district for at least four
years qualify to receive scholarship money; all they have to do is be accepted into a
Michigan higher education institution. The one-page application is straightforward and
simple for families to understand and fill out.
In short, this program provides extraordinary access to higher education. Yet
since the first scholarships were awarded in 2006, 49 Western Michigan University
(WMU) Kalamazoo Promise recipients have attended WMU but have not been
retained—nearly 1 out of every 6. This is of major concern to WMU, the Kalamazoo
Public School district, and to administrators of the Kalamazoo Promise scholarship itself.
Cognitive, social or institutional factors may all contribute to retention or non-retention
under this program. Before these factors can be explored in depth, however, it is
necessary to understand the backdrop of the Kalamazoo Promise program.
Background of the Kalamazoo Promise
Since its announcement in November 2005, the Kalamazoo Promise universal
scholarship program has already garnered attention far beyond its home community;
places as far-flung as Denver, Colorado, Pittsburgh, Pennsylvania, and El Dorado,
Arkansas, are attempting to replicate the program in their own communities. For students
who already planned to attend college, the Promise may ease the financial burden of a
post-secondary education. For students who are unsure if they will attend college, the
Promise gives a tangible opportunity to do so, and may enable them to attend full-time,
work less, or both. For students who have written off college as financially out of reach,
the Promise helps lessen the financial burden and offers low-income students new hope
and possibilities. Younger children, especially, who may not have role models who have
gone to college, will benefit from a K-12 experience that expects them to continue their
education into post-secondary school.
The Kalamazoo Promise Scholarship, unlike other scholarship programs, does not
look at need or merit; the only stipulation is having attended Kalamazoo Public Schools
for at least four years and getting accepted into a Michigan higher education institution.
Students who meet these two criteria will have all four years of tuition and fees paid for,
regardless of their financial need or other scholarships they may have received. In
essence, all Kalamazoo Public School children now have the opportunity for a free K-16
education if they are able to take advantage of it.
Despite this opportunity, 49 students who met the criteria of the Kalamazoo
Promise Scholarship and attended WMU have left without finishing their degrees.
Understanding why is imperative to determine whether WMU
can do something to retain these students or whether, instead, keeping them in college is
more appropriately the responsibility of the Kalamazoo Public Schools or the Promise
program itself.
Development of the Kalamazoo Promise
Initially, the Kalamazoo Promise was run out of the Kalamazoo Public School
district superintendent's office with no official staff. Alex Lee, Kalamazoo Public
Schools' (KPS) executive director for communications, fielded communication from the
press while the KPS website was the primary source for information regarding the
scholarship.
The first official staff member of the program, Robert Jorth, was hired in March
2006 as the Kalamazoo Promise administrator. The official duties of this position
included determining eligibility, maintaining a database of eligible students, and
disbursing Promise funds to colleges and universities. Jorth was solely responsible for
setting up and managing the scholarship program and has managed to create a system in
which data is easily usable and accommodates the terms of the scholarship. The system
he has established works, and it's simple: for example, the Kalamazoo Promise
scholarship application form is a one-page document.
Initially, there was confusion about KPS and the Kalamazoo Promise being one
entity because they were located in the same building, but in the fall of 2006, the Promise
administrator moved out of the school district building and into an office in the
Kalamazoo Communities in Schools (KCIS) downtown facility, which was donated by a
real estate developer. KCIS organizes community partners in their efforts to help children
learn and stay in school. The Kalamazoo Promise incorporated as a 501(c)(3)
organization and established a web site with its own domain name,
https://www.kalamazoopromise.com. These changes helped more concretely distinguish
the Promise from KPS. Robert Jorth continues to run the program and has disbursed
$10.45 million in scholarship money to 26 schools (Jorth, 2009).
Jorth was the sole employee dedicated to the program until September 2008,
when Janice Brown joined as executive director. Brown, who had been superintendent of
the Kalamazoo school district until August of 2007, has had a key role in implementing
the Kalamazoo Promise, arranging for anonymous donors to fund the scholarship.
Kalamazoo's Vice Mayor, Hannah McKinney, stated, "I don't think we would have the
Promise were it not for Janice. She's a little Energizer bunny with a lot of integrity"
(Yung, 2007, p. 1). Brown credits the donors themselves with the concept: giving every
Kalamazoo student the opportunity to attend post-secondary school. "It was really the
idea of the donors after a long, long series of discussions about a question. What can we
do to make a turnaround, to make an impact on an urban city in which we really wish to
invest? And over and over again, the answer seemed to be invest in education, invest in
out youth" (Yung, 2007, p.l). Brown continues to be the bridge between donors and the
community and is the only person who has publicly acknowledged knowing the identity
of the donors.
Backdrop. Kalamazoo, Michigan, is located in the southwest corner of the
state. According to the U.S. Census Bureau (2009), Kalamazoo County's high school
graduation rate is 88.8%, 31.2% of its population holds a bachelor's degree or higher, and
the median household income is $43,450, with 13.8% of the population living below the poverty line.
Even during the current economic downturn, college costs continue to rise. In-state costs
are up 37% at public post-secondary institutions, with charges for tuition, fees, room and
board of $14,333 for in-state students and $25,200 for out-of-state students. Private
colleges in the state charge an average of $34,132 (Jayson, 2009). Such expenses are
increasingly burdensome for families, leaving many unable to even consider the option of
higher education. Thus, for people especially hard-hit by the economy, the Kalamazoo
Promise Scholarship is seen by some as "a miracle" (Miron, Spybrook & Evergreen,
2008, p. 18).
This universal scholarship, the first of its kind, represents an enormous
commitment on the part of its funders. The anonymous donors have promised to carry
the financial burden for all students who live within the district boundaries and attend
Kalamazoo Public Schools for at least four years or 130 credits, whichever comes first.
This program is intended to last in perpetuity and involves no public funds. Despite this
opportunity, however, the office of Institutional Research at Western Michigan
University reports that approximately 49 Kalamazoo Promise Scholarship recipients who
attended Western Michigan University (WMU) are no longer at WMU and have not
finished their degrees. Understanding the influences that led these students to leave
college would be valuable information for the Kalamazoo Promise and Western
Michigan University, and may aid in decisions regarding retention of students who are
Kalamazoo Promise Scholarship recipients.
Problem Statement and Research Questions
The purpose of this dissertation was to determine to what extent persisters, those
on probation, and non-persisters differ on demographic characteristics, and on each of the
following in Swail's (2003) Geometric Model of Student Persistence and Achievement:
Cognitive, Social and Institutional retention factors. This study also examined the
average number of courses taken per term and the number of courses taken the first year
by persisters, those on probation, and non-persisters; and respondents, late respondents
and non-respondents. Additionally, the extent to which respondent, late-respondent and
non-respondent Kalamazoo Promise recipients differ on each of the factors in Swail's (2003)
Geometric Model of Student Persistence and Achievement was examined using known
characteristics from the academic data. Similarly, the difference between early
respondents and late respondents was examined using the same model, but this time using
the actual survey data. Lastly, non-response error was examined using Groves and
Couper's (1998) bias ratio formula; the formula was modified to use the median, so that
indications of non-response bias in the survey could be compared depending on
whether the mean or the median estimator is used.
The research questions are as follows:
Research Question One
To what extent do persister, those on probation, and non-persister Kalamazoo
Promise recipients differ by demographic characteristics on each of the following
selected factors in Swail's (2003) Geometric Model of Student Persistence and
Achievement: (a) Cognitive Factors, (b) Social Factors, and (c) Institutional Factors?
More specifically:
1.1 Among the three groups of students, persister, those on probation, and non-persister Kalamazoo Promise recipients by gender and race, are there any
differences in the cognitive factors from Swail's (2003) Geometric Model of
Student Persistence and Achievement using the following dependent variables
from the academic data: (a) high school GPA, (b) most recent WMU GPA, (c)
ACT composite score, (d) taking a remedial math course at WMU, (e) taking a
remedial reading course at WMU, (f) taking a remedial writing course at
WMU, or (g) taking AP credit?
1.2 Among the three groups of students, persister, those on probation, and non-persister Kalamazoo Promise recipients by gender and race, are there any
differences in the social factors from Swail's (2003) Geometric Model of
Student Persistence and Achievement using the following dependent variables
from the academic data: (a) living in a dorm, (b) being an athlete or (c)
parental income?
1.3 Among the three groups of students, persister, those on probation, and non-persister Kalamazoo Promise recipients by gender and race, are there any
differences in the institutional factors from Swail's (2003) Geometric Model
of Student Persistence and Achievement using the following dependent
variables from the academic data: (a) first year experience (FYE), and (b)
which high school Promise students came from?
Research Question Two
Research question two is broken into two sections, both of which examine the
numbers of courses taken by students. Research question 2.1 examines persisters, those
on probation, and non-persisters, while research question 2.2 examines respondents, late
respondents, and non-respondents, controlling for race and gender.
2.1 Is there a difference in the average number of courses taken per term and number
of courses taken the first year among the three groups of students: Persister,
those on probation, and non-persister WMU Kalamazoo Promise recipients,
controlling for gender and race, using the course summary data?
2.2 Is there a difference in the average number of courses taken per term and number
of courses taken the first year among the three groups of students: Respondents,
late respondents, and non-respondents of the Survey of Promise Scholarship
Recipients at WMU Spring 2009, controlling for gender and race, using the
course summary data?
Research Question Three
To what extent do respondent, late respondent, and non-respondent Kalamazoo
Promise recipients differ on each of the following selected factors in Swail's (2003)
Geometric Model of Student Persistence and Achievement: (a) Cognitive Factors, (b)
Social Factors, and (c) Institutional Factors using known characteristics from the
academic data? In addition, to what extent do early respondents differ from late
respondents on variables from the Survey of Promise Scholarship Recipients at WMU
Spring 2009? Lastly, using Groves and Couper's bias ratio formula, is there an indication
of non-response bias? More specifically:
3.1 Among the three groups of students, respondent, late respondent, and non-respondent Kalamazoo Promise recipients, are there any differences in the
following known dependent variables from the academic data: (a) high school
GPA, (b) most recent WMU GPA, (c) ACT composite score, (d) taking a
remedial math course at WMU, (e) taking a remedial reading course at WMU,
(f) taking a remedial writing course at WMU, (g) taking AP credit, (h) living
in a dorm, (i) being an athlete, (j) parental income, (k) first year experience
(FYE), or (l) high school that could indicate possible non-response bias?
3.2 Between the two groups of students, early respondent and late respondent
Kalamazoo Promise recipients, are there any differences in the cognitive,
social or institutional factors from Swail's (2003) Geometric Model of
Student Persistence and Achievement using the dependent variables from the
Survey of Promise Scholarship Recipients at WMU Spring 2009 indicating
possible non-response bias?
3.3 Using and modifying Groves and Couper's bias ratio formula (1998), is there
an indication of non-response bias, and is there a difference between using
the mean and using the median, a more robust statistic, in determining a bias
estimate on the dependent variables from the Survey of Promise Scholarship
Recipients at WMU Spring 2009?
Methodological Overview
The mixed methods analysis is a non-experimental relational design of three
samples of Kalamazoo Promise recipients, those Western Michigan University (WMU)
retained (persisters), those on probation and those WMU did not retain (non-persisters).
These samples were compared using the variables from the Survey of Promise
Scholarship Recipients at WMU Spring 2009 survey and variables obtained from
academic records. In addition, non-persisters were examined in depth to identify what
factors influenced their decision to leave WMU. Lastly, non-response bias was examined
to account for those who responded to the Survey of Promise Recipients at WMU Spring
2009 and those who did not respond.
Rationale for the Dissertation
The results of this dissertation may have far-reaching implications not only for
Western Michigan University and the Kalamazoo Promise Scholarship program, but also
for the Kalamazoo community as a whole. The findings could offer insight into where retention
efforts should be focused and by whom. The factors determined to affect retention might
be associated with the efforts of WMU, the Kalamazoo Promise, the Kalamazoo Public
School system, or individual circumstances that other community agencies might need to
address. This process could be helpful to various stakeholders in establishing a baseline
of information and determining a direction for future dialogue and interactions in higher
education retention.
One contribution of this dissertation is to evaluation technique, in that it illustrates
how non-response error and possible non-response bias can be detected even in small-scale
research projects or program evaluations. This study offers insights about sampling bias in
social science research and its effects on this population. Non-response bias has not been
widely addressed in the field of evaluation; this dissertation is an opportunity to help
bridge the fields of evaluation and research. This is one sampling error that evaluators
need to take into account when collecting and interpreting data.
Structure and Overview of the Dissertation
This study is organized into five chapters. Chapter I, the introduction, provides
the background and development of the Kalamazoo Promise scholarship, the problem
statement, research questions, methodological overview, and rationale for this
dissertation. Chapter II reviews the literature and is broken into three sections: 1) an
overview of the Kalamazoo Promise program, its participants and community impacts; 2)
a discussion of college retention, including definitions, a historical overview, rationale
and factors for successful retention; and 3) a discussion of non-response bias, including
definitions, an explanation of non-response and non-response bias, identification of non-response categories, and information on detecting non-response bias. Chapter III
elucidates the methodology of this study in detail, including participant selection, types of
and sources of data, primary and secondary data collection, data analysis, data
verification and ethical considerations. The results are presented in Chapter IV, and the
conclusion and implications are summarized in Chapter V.
CHAPTER II
REVIEW OF RELATED LITERATURE
The review of relevant literature and research is divided into the following
sections: 1) an overview of the Kalamazoo Promise Scholarship and its impact on the
Kalamazoo community; 2) a framework on higher education retention, exploring what
factors assist successful retention of students; and 3) a discussion of relevant issues
regarding non-response bias in social science research, its definitions and implications.
Kalamazoo Promise
Overview. The Kalamazoo Promise Scholarship, announced in November 2005,
is funded by anonymous contributors. It is unique in that it is a universal scholarship
awarded to all qualifying students regardless of income or other scholarships, and is the
first program of this kind. It is estimated that current in-state tuition per semester ranges
from $2,000 at a community college to more than $9,000 at the University of Michigan.
This means that a family could receive as much as $18,000 per year per child. Once
there are four graduated classes, donors will be spending approximately $12 million per
year to fund the scholarship. As of 2009, 1,521 graduates have used the Kalamazoo
Promise Scholarship, or 82.7% of those eligible (see Table 1).
Table 1. Kalamazoo Promise Summary Data

                                                          2006    2007    2008    2009   Total
Number of KPS graduates                                    517     579     549     515    2160
Eligible for the Promise                                   409     502     475     455    1841
% of graduates eligible for the Promise                   79.1    86.7    86.5    88.3    85.3
Number of graduates using the Promise the
  first semester after graduation                          303     359     370
% of eligible students using the Promise the
  first semester after graduation                         72.7    74.6    78.1
Number of graduates who have used the Promise (a)          339     414     388     370    1521
% of eligible students who have used the Promise (a)      82.9    82.5    81.7    81.3    82.7

(a) Students who have used at least some portion of their scholarships as of September 8, 2009.
Note. Data provided by the Kalamazoo Promise administrator.
According to data obtained from Mr. Robert Jorth, the Kalamazoo Promise
administrator, of those 1,521 graduates who have used the Promise, 323, or 21.2%,
currently attend Western Michigan University (see Table 2). This is second only to
Kalamazoo Valley Community College (KVCC), which currently enrolls 367, or 24.1%,
of the Promise students. Michigan State University and University of Michigan come in
a distant third and fourth, with 140, or 9.2%, and 110, or 7.2%, respectively, of
Promise students currently enrolled. Ferris State University / Kendall School of Art &
Design is in fifth place, with only 25, or 1.6 percent, of the Promise students currently
enrolled.
Table 2. College or University Attendance for Current Promise Users as of Fall 2009

College / University                                            Total
Community Colleges (total)                                        401
    Glen Oaks Community College                                     2
    Grand Rapids Community College                                  7
    Jackson Community College                                       2
    Kalamazoo Valley Community College                            367
    Kellogg Community College                                       3
    Lake Michigan College                                           2
    Lansing Community College                                       9
    Mott Community College                                          2
    Muskegon Community College                                      2
    Northwestern Michigan College                                   0
    Oakland Community College                                       1
    Washtenaw Community College                                     4
Universities (total)                                              693
    Central Michigan University                                    16
    Eastern Michigan University                                    11
    Ferris State University / Kendall School of Art & Design       25
    Grand Valley State University                                  23
    Michigan State University                                     140
    Michigan Technological University                               8
    Northern Michigan University                                   12
    Oakland University                                              3
    Saginaw Valley State University                                 2
    University of Michigan                                        110
    University of Michigan Dearborn                                 2
    University of Michigan Flint                                    1
    Wayne State University                                         17
    Western Michigan University                                   323
Grand Total                                                      1076

Grand Total by graduating class: 2006 = 200, 2007 = 257, 2008 = 277, 2009 = 360
% Retained by graduating class:  2006 = 59.0, 2007 = 62.1, 2008 = 71.4, 2009 = 97.3

Note. The data for 2009 is projected. Data provided by the Kalamazoo Promise administrator.
The Kalamazoo Promise has had a significant impact on the surrounding colleges
and universities, with 64.1% (690/1076) of currently enrolled recipients attending either
WMU or KVCC. Other communities are trying to replicate this scholarship program: the
El Dorado Promise in El Dorado, Arkansas, the Denver Scholarship Foundation, and the
Pittsburgh Promise are among the first to be established. More communities around the
country also are considering the Kalamazoo Promise as a potential model. These
scholarship programs were represented at the first annual PromiseNet conference held in
Kalamazoo in June of 2008 (Eberts, 2008).
Recipients. Eligible recipients of the Kalamazoo Promise Scholarship are
students who have attended Kalamazoo Public Schools—Kalamazoo Central High
School, Loy Norrix High School, or Phoenix High School—for at least four years and
who have been accepted by a Michigan public higher education institution. Recipients
who have attended Kalamazoo Public Schools since Kindergarten receive 100% of
tuition and fees paid. Students who have attended Kalamazoo Public Schools from ninth
grade forward receive 65% of tuition and fees paid. For students who have attended
KPS for longer than high school but less than the full K-12 period, the scholarship's total
value is prorated accordingly. The only requirement to maintain eligibility is that
students work toward a degree and maintain a 2.0 GPA in their college courses.
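To make the award schedule concrete, the sketch below (in Python, as in the later sketches) treats the award as a simple function of the grade at which a student entered KPS. Only the two endpoints are stated above (100% for attendance since kindergarten, 65% for grades 9-12); the linear proration used for entry grades in between is a hypothetical placeholder, not the official schedule.

    def award_fraction(entry_grade: int) -> float:
        """Fraction of tuition and fees covered by the Promise, given the grade
        at which the student entered Kalamazoo Public Schools. Only the endpoints
        (kindergarten entry = 100%, ninth-grade entry = 65%) are stated in the text;
        the linear proration for the grades in between is a hypothetical placeholder."""
        if entry_grade > 9:
            raise ValueError("entered after ninth grade: fewer than four years of KPS attendance")
        if entry_grade <= 0:  # kindergarten (or earlier)
            return 1.00
        return round(1.00 - (entry_grade / 9) * 0.35, 2)

    print(award_fraction(0))  # 1.0  (K-12 attendance)
    print(award_fraction(9))  # 0.65 (grades 9-12 only)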
Community Impacts. The community impacts of such a scholarship are immense.
Kalamazoo public school enrollment began increasing dramatically after the Promise was
announced (see Figure 1), reaching in the 2008-09 school year its highest enrollment in
approximately 10 years.
Figure 1. Kalamazoo District Enrollment Trend
Note. Data provided by Michigan DOE website located at: http://www.michigan.gove/cepi.html
This trend is even more explicit when examining similar school districts in the
State of Michigan. Over the past seven years Battle Creek Public Schools' enrollment has
decreased 19%, from 7,922 in 2002-'03 to 6,439 in 2008-'09. Flint City Public Schools'
enrollment has decreased 34%, from 21,007 in the '02-'03 school year to 13,798 in the
'08-'09 school year. Lastly, Lansing School district has dropped 19% from 17,376 in the
'02-'03 school year to 14,160 in the '08-'09 school year. These downward trends are seen
across the state. Kalamazoo Public School district was on this same downward trend. In
the '02-'03 school year, the district enrolled 11,084 students; enrollment declined
consistently through the '05-'06 school year to 10,238, an 8% decrease. Immediately
after the Kalamazoo Promise was announced, enrollment started rising in the Kalamazoo
school district as other districts continued to decline. From the '06-'07
school year until the '08-'09 school year, student enrollment increased 13%, to 11,696.
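The rounded percentages quoted in this paragraph follow from simple start-to-end comparisons of the cited enrollment figures; a minimal sketch of that arithmetic:

    def pct_change(start: float, end: float) -> float:
        """Simple start-to-end percentage change."""
        return (end - start) / start * 100

    # Enrollment figures cited above ('02-'03 versus '08-'09, except Kalamazoo,
    # which is '02-'03 versus its '05-'06 low point before the Promise).
    districts = {
        "Battle Creek": (7922, 6439),
        "Flint City": (21007, 13798),
        "Lansing": (17376, 14160),
        "Kalamazoo (through '05-'06)": (11084, 10238),
    }
    for name, (start, end) in districts.items():
        print(f"{name}: {pct_change(start, end):+.1f}%")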
Figure 2. District Enrollment Trends for Kalamazoo, Battle Creek and Flint City Public
School Districts
Note. Data provided by Michigan DOE website located at: http://www.michigan.gove/cepi.html
Beyond the boundaries of the K-12 or even the K-16 school system, the
Kalamazoo Promise is envisioned as a catalyst for change and economic growth (Miron,
G. & Evergreen, S., 2007). For students who already planned to attend college, the
Promise may ease the financial burden of a post-secondary education. For students
unsure if they will attend college, the Promise offers a tangible opportunity not only to do
so, but to do so full time and, possibly, work less as well. For children who want to go to
college but see the financial burden as insurmountable, the Promise lifts that burden.
Low-income students now have hope and possibilities they lacked before the Kalamazoo
Promise (Miller-Adams, 2009). Younger children who lack role models who have gone to
college may especially benefit from a K-12 experience that routinely expects them to
continue their education through post-secondary school.
Besides easing or even removing some families' financial burdens, the
scholarship now creates an incentive for families to stay in the community and for new
families with children to move into the district so they can take advantage of this
scholarship (Miron & Cullen, 2008). Community leaders hope the scholarship program
will stabilize the housing market and increase property values. In addition, the
community looks much more attractive to businesses seeking to invest, expand or
relocate. The Promise can help local companies attract employees drawn by the
availability of this scholarship for their own children, and it can help foster a well-trained
future workforce as well. The resulting economic growth can create a domino effect, with
more families and businesses moving in, increasing the need for other businesses, thus
further expanding the job market.
More Kalamazoo students than ever before have the opportunity to obtain a post-secondary education. The issue now becomes the capacity of these students to succeed in
this environment and their ability to persist to achieve the college degree that they have
started to pursue.
College Retention
Definitions. College retention and attrition are complex and difficult to define.
The simplest definitions are those of persisters and non-persisters. Persisters are those
students who stay in college and finish their degree; with more persisters, the higher
education institution's retention increases. Non-persisters are those who leave for
whatever reason; when they do so, the college's retention decreases (Astin, 1975; Astin,
1999; Bean & Eaton, 2000; Bean & Metzner, 1985; Braxton, 2004; Cabrera, Nora &
Cabrera, 1993; Pascarella & Terenzini, 1980; Spady, 1970; Stage, 1989; Swail, 2003;
Tinto, 1975). Persistence in the literature refers to the student, while retention refers to
the institution. The other term used for non-persistence is "dropout." Where retention
means staying in school, dropout means leaving school before degree completion. This
seems straightforward. Alexander Astin, however, sees the concept of dropout as
problematic because it is
imperfectly defined. The so-called dropouts may
ultimately become non-dropouts and vice versa. ... But there
seems to be no practical way out of the dilemma: A
"perfect" classification of dropouts versus non-dropouts
could be achieved only when all of the students had either
died without ever finishing college or had finished college
(1971, p. 15).
Vincent Tinto (1987) states that there are definite limits to the understanding of
student departure; the controversy lies in the labeling of persisters and non-persisters, and
how the quality of persisting or non-persisting is measured. College students take
numerous avenues in pursuit of their educational goals. One student might enroll, attend
full- or part-time for a couple of years, leave, and then return five years later to finish.
Others might transfer to another college, enroll in more than one college at the same time,
take a full load but then only complete one class, or be put on academic probation. The
scenarios are limitless; the lack of a clear way to measure persisters or non-persisters
complicates any definition of college retention.
Agreeing with Tinto, Bean and Metzner (1985) acknowledge that many students
leave college because they have met their goals. Perhaps this process of self-discovery
resulted in individual growth and maturation; thus, Bean and Metzner argue, leaving
college should not be considered a failure by the student or the institution; that any
definition of retention should consider student educational goals; and that a "dropout,"
therefore, would be defined in light of the student's original intent and outcome (1985).
Western Michigan University defines its persisters as those first-time, full-time,
degree-seeking beginners (FTIAC) who start in the Fall semester and who are still
attending the following Fall semester. According to A Comprehensive Report of
Retention Rates, written by WMU's Office of Student Academic and Institutional
Research, the current retention rate of the Fall 2006 cohort to their second year in Fall of
2007 is 75.1%, which is 1.1 percentage points lower than the average retention rate of
Michigan public universities, 76.2% (2009). Along with the difficulties found in the
definitions come the complexities of various models and theories of retention.
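Under WMU's definition, a fall-to-fall retention rate is simply the share of a fall FTIAC cohort still enrolled the following fall. A minimal sketch of that calculation, using hypothetical student identifiers rather than WMU data:

    def fall_to_fall_retention(fall_cohort_ids, next_fall_enrolled_ids) -> float:
        """Share of a fall FTIAC cohort still enrolled the following fall,
        following the WMU definition of a persister described above."""
        cohort = set(fall_cohort_ids)
        return len(cohort & set(next_fall_enrolled_ids)) / len(cohort)

    # Hypothetical example: 3 of 4 cohort members return the next fall -> 0.75
    print(fall_to_fall_retention([101, 102, 103, 104], [101, 103, 104, 999]))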
Retention Models and Theories. Research on college retention, student
persistence, student departure and achievement of higher education degree attainment has
been enormous (Astin, 1975; Astin, 1984; Braxton, 2004; Bean & Eaton, 2000; Bean &
Metzner, 1985; Cabrera, Nora & Cabrera, 1993; Cabrera, Stampen & Hansen, 1990;
Carroll, 1988; Pascarella & Terenzini, 1980; Spady, 1970; St. John, Cabrera, Castaneda,
Nora & Asker, 2004; Stage, 1989; Stoecker, Pascarella & Wolfe, 1988; Swail, 2003;
Tinto, 1975; Tinto, 1993). Because of the complexity and importance of the issue of
retention, research, theories and models continue to grow, develop and be evaluated in
the hopes of illuminating its dynamics.
Factors for Successful Retention. Many factors contribute to successful retention
and much research has investigated what factors contribute to that success. One heavily
researched area is the characteristics of persisters and non-persisters. Persisters are more
likely to attend college full-time, while non-persisters are more likely to attend part-time
(Adelman, 1999; Chan, 2002; Feldman, 1993; Lanni, 1993; Moore, 1995; Naretto, 1995;
NCES, 1998; Panos & Astin, 1968; Price, 1993; St. John, 1990; Windham, 1994). Part-
time students are also more likely to be first-generation students, which increases the chance
for non-persistence (NCES, 1998). Typically, non-persisters work more hours than
persisters (Naretto, 1995). The factor of age is contentious (Grosset, 1991). Much
research shows that non-persisters are usually older, while persisters are typically
younger (NCES, 1998; Price, 1993; Windham, 1994). Conversely, some research reports
the exact opposite (Feldman, 1993). Further factors that have been found to contribute
to a student's decision to drop out of college include financial concerns (GAO, 1995),
full-time employment, family responsibilities, low grade-point average (NCES, 1998),
being an ethnic minority other than Asian, and being female (Bonham &
Luckie, 1993; Guloyan, 1986; Levin & Levin, 1991; Rendon, Jalomo & Nora, 2004).
The factors and models in the literature are largely based on Tinto's model, which Tinto and
others have modified and improved over time. The model used in this study is the
Geometric Model of Student Persistence and Achievement developed by Scott Swail
(2003), which is an improvement over Tinto's model. Unlike other models, Swail's is
student-centered; nevertheless, it also incorporates the factors found in other
models.
In Swail's Retaining Minority Students in Higher Education: A Framework for
Success (2003, p. 92), Figure 19 illustrates these factors and their components and how
they relate to the student and persistence. Swail depicts the "Student Experience" as
the center of an equilateral triangle. The triangle's base is "Institutional Factors,"
consisting of financial aid, student services, recruitment and admissions, academic
services, and curriculum and instruction. The left-hand side of the
triangle is "Cognitive Factors," consisting of academic rigor, quality of learning, aptitude,
content knowledge, critical-thinking ability, technology ability, study skills, time
management, and academic-related extracurricular activities. The right-hand side of the
triangle is "Social Factors," consisting of financial issues, educational legacy, attitude
toward learning, religious background, maturity, social coping skills, communication
skills, attitude toward others, cultural values, expectations, goal commitment, family
influence, peer influence, and social lifestyle. When all categories are in balance,
persistence is most likely to happen.
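For the analyses in this study, the three factor categories can be thought of as a simple grouping of the variables just listed; the sketch below restates that grouping as a data structure. The labels merely mirror the factors enumerated in this paragraph and are an illustrative restatement, not an official taxonomy.

    # The three factor categories of Swail's (2003) Geometric Model, as
    # enumerated above; this grouping is only an illustrative restatement.
    SWAIL_FACTORS = {
        "Institutional Factors": [
            "financial aid", "student services", "recruitment and admissions",
            "academic services", "curriculum and instruction",
        ],
        "Cognitive Factors": [
            "academic rigor", "quality of learning", "aptitude", "content knowledge",
            "critical-thinking ability", "technology ability", "study skills",
            "time management", "academic-related extracurricular activities",
        ],
        "Social Factors": [
            "financial issues", "educational legacy", "attitude toward learning",
            "religious background", "maturity", "social coping skills",
            "communication skills", "attitude toward others", "cultural values",
            "expectations", "goal commitment", "family influence", "peer influence",
            "social lifestyle",
        ],
    }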
The Swail model has been used in several other research studies, either in its
entirety or in modified form (EPI, 2007; Hayman, 2007; NCRA, 2006). For example, the
Minority Engineering Recruitment and Retention Program at the University of Illinois at
Chicago used Swail's model with modifications:
Borrowing from Terenzini's (2006) model in EC 2000 and
using variables in Swail's (2005) geometric model of
student persistence and achievement, MERRP has
developed a model of engineering student experience that
identifies six background variables affecting an engineering
student's experience in the academy, which in turn
influence college matriculation outcomes (Hayman, D.,
2007, p. 5).
This study examines two types of retention: institutional-level and course-level.
Two other types, major retention and system retention, will not be considered in this
research. Institutional retention is a measure of students who remain at the same school
year after year. System retention is a measure of students who remain in school, but
without specifying that they continue in the same institution from one year to the next.
Course retention is a measure of course completion. Major retention is a measure of
specific major completion. Each measure has its own pitfalls: institutional- and system-level
retention could be improved by including students who are part-time, transfer
students, continuing education students, and all students regardless of their start date or
fall cohort group.
A shortcoming of Fall-to-Fall retention measures is that some institutions only
admit students with high ACT or SAT scores in the fall semester, while admitting a
second wave of students in the spring with lower ACT and SAT scores. Thus,
institutions can admit students of better academic standing whose likelihood of retention
is higher than for students admitted later, resulting in potentially inflated retention rates.
Course retention is measured by a tool called Successful Course Completion
Ratio (SCCR) (Hagedorn, 2004), the ratio of courses completed to courses taken. For
example, a student who takes four courses and completes three of them has an SCCR of
75%. Institutions with high levels of student "stopouts," students enrolled in more than
one institution, students who are not degree-seeking, or students with diverse academic
goals all find this tool useful. It offers an alternative to the limitations of only examining
degree-completion rates.
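A minimal sketch of the ratio, matching the worked example above:

    def successful_course_completion_ratio(courses_taken: int, courses_completed: int) -> float:
        """SCCR (Hagedorn, 2004): courses completed as a share of courses taken."""
        return courses_completed / courses_taken

    print(successful_course_completion_ratio(4, 3))  # 0.75, the example given above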
Non-response Bias
Reliable and valid techniques for measuring variables and constructs are the basis
for all social science inquiries (Ary, Jacobs & Razavieh, 1996). Population parameters
are estimated through various sampling procedures; the ability of researchers to
generalize to broader populations hinges on these sampling procedures. Regardless of
the quality of the sample and sampling technique, one major issue in survey research and
evaluation is the declining response rate to surveys in the wealthier parts of the world (de
Leeuw and de Heer, 2002), which in some cases can produce non-response bias.
Definition. Response rates measure all responses that are returned and usable as a
proportion of all surveys distributed to the sample population. For example, 80 returned
surveys out of 100 would give a response rate of 80%. If 20 of the returned surveys were
incomplete or had other problems making them not usable, however, the response rate
would fall to 60%. This might seem sufficient, but in fact might skew the sample.
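A minimal sketch of the response-rate arithmetic used in this example:

    def response_rate(surveys_distributed: int, usable_returns: int) -> float:
        """Usable returned surveys as a proportion of all surveys distributed."""
        return usable_returns / surveys_distributed

    print(response_rate(100, 80))  # 0.80: 80 usable returns out of 100 distributed
    print(response_rate(100, 60))  # 0.60: once 20 of those returns prove unusable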
Non-response bias exists when there is a difference in the interpretation of results
that would be made regarding those who respond and those who do not respond. "The
bias created by non-response is a function of both the level of non-response and the
extent to which non-respondents are different from respondents" (Kano, Franke, Afifi &
Bourque, 2008, p. 1). For example, in a study of gender issues with 50 men and 50
women in the population, 50 returned and usable surveys received might seem quite good
if 25 were from men and 25 from women. If all 50 were from men and no women
responded, however, the results are likely to be severely distorted: the resulting response
bias would make the research or evaluation interpretations and conclusions invalid.
A high response rate can be obtained, but may require costly follow-up
procedures. Even with a high response rate, non-response bias may exist. Miller and
Smith (1983) stated that even with a response rate as high as 90%, non-response bias may
still be present.
Dealing with Non-response and Non-response Bias. There are many ways to deal
with non-response. These include ignoring non-respondents, following up with non-respondents, comparing sample estimates of respondents to the population, comparing
respondents to non-respondents, re-sampling non-respondents, and comparing sample
estimates of respondents with other sources. Each method has advantages and
disadvantages.
Ignore non-respondents. Ignoring non-respondents means that the research can
only be generalized to the sample responding and not to the population. This is not a
good method of controlling for non-response error (Miller & Smith, 1983).
Follow-up with non-respondents. A better alternative to ignoring them is to
follow-up with non-respondents by sending out reminders, such as e-mails, postcards, or
phone calls, or by redistributing the survey. "Two to three reminders (and even more)
have proven effective" (Diem, 2004, p. 1).
Compare Respondents with Non-respondents. Another method for controlling
non-response error is to compare respondents with non-respondents (Miller & Smith,
1983). After a comparison on known characteristics shows no statistically significant
difference, the results can be generalized both to the sample and the population (Diem,
2004). "While the level of non-response does not necessarily translate to bias, large
differences in the response rates of subgroups serve as indicators that potential biases
may exist" (Brick, Bose, W., and Bose, J., 2001, p. 2).
A commonly used formula to calculate bias of the mean between respondents and
non-respondents is:
B(ȳ_r) = (1 - r)(ȳ_r - ȳ_nr)
where the subscript r signifies the respondents while nr signifies the non-respondents, the
notation 1 - r is the non-response rate (Brick, Bose, W., and Bose, J., 2001), and ȳ
represents the mean of the chosen variable. This suggests that the larger the non-response
rate and the larger the difference between the respondent and non-respondent means, the
larger the bias. In some social science research this simple formula may not work due to
the use of weighting, imputation of missing items, or any other non-response adjustments
made to the data; in this study, however, no such adjustments were made. The formula
can only be used when variables are known for both the respondents and non-respondents;
here, academic records provided data on both groups, allowing the formula to function as intended.
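A minimal sketch of this calculation is shown below; it simply restates the expression above in code, assuming the response rate is given as a proportion and that the two group means are already known (for example, from academic records). The function name and example values are hypothetical.

```python
# Sketch of the bias of the mean, B(y_r) = (1 - r)(y_r - y_nr), where r is the
# response rate as a proportion, y_r is the respondent mean, and y_nr is the
# non-respondent mean.

def nonresponse_bias(response_rate: float,
                     mean_respondents: float,
                     mean_nonrespondents: float) -> float:
    """Return the estimated bias of the respondent mean."""
    nonresponse_rate = 1.0 - response_rate
    return nonresponse_rate * (mean_respondents - mean_nonrespondents)

# Example: a 95% response rate with a small difference between group means.
print(round(nonresponse_bias(0.95, 201.0, 228.0), 2))   # -1.35
```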
Compare Sample Estimates of Respondents to the Population. Another way to test
for response bias is to compare sample estimates of respondents to population values
"computed from the sampling frame" (Brick, Bose, W., and Bose, J., 2001; Miller
& Smith, 1983). With this method the problems of weighting and non-response
adjustments are not an issue.
Re-sampling of Non-respondents. Diem (2004) describes another approach when the
response rate is low: by taking another sample of "10 to 20 percent of the non-respondents, and securing responses from this subsample, a statistical comparison can be
made with subjects responding by the original deadlines and if they are similar, the data
can be pooled and generalized to the sample/population" (Diem, 2004, p. 2). This method
is called "double-dip" by Miller and Smith (1983). This validation approach works, but
two samples from the same population are needed. When a response rate of less than
80% is achieved, Gall, Borg, and Gall (1996) suggest that a random sample of 20% of the non-respondents be contacted and "double-dipped" and that responses from the non-respondent sub-sample be compared with each item on the instrument to establish whether non-response error is indicated.
Compare Early to Late Respondents. Comparing early or on-time respondents
with late or reluctant respondents is commonly done in social science research to
determine the effect, if any, of non-response on the statistics being considered (Miller &
Smith, 1983; Smith, 1984). Extrapolation methods are used to compare early with late
respondents. There are three types of extrapolation methods: successive waves, time
trends, and concurrent waves. Each method of extrapolation has its drawbacks, but all are
based on the assumption that participants who respond later are more like non-respondents. "Evidence has shown that late respondents are often similar to non-respondents. If a statistical comparison of late respondents shows no difference from
early respondents, then data from respondents can be generalized to the population"
(Diem, 2004, p. 2).
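As an illustration of how such an early-versus-late comparison might be carried out on a single known variable, the sketch below applies an independent-samples t test to two hypothetical lists of high school GPAs; it is offered only as an example of the general approach, not as the analysis code used in this study.

```python
# Sketch: compare early and late respondents on one known variable.
# If the groups do not differ, late respondents (a stand-in for
# non-respondents) give no indication of non-response bias on that variable.
from scipy import stats

early_gpa = [3.6, 3.2, 3.9, 2.8, 3.4, 3.7]   # hypothetical early respondents
late_gpa  = [3.5, 3.0, 3.8, 2.9, 3.3]        # hypothetical late respondents

t_stat, p_value = stats.ttest_ind(early_gpa, late_gpa, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A non-significant p-value suggests the two groups are similar on this variable.
```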
Successive waves refers to repeated stimuli administered over time, such as reminder e-mails, postcards, and follow-up calls. It is assumed that subjects who respond in later waves responded
because of the increased contacts made; therefore, they are expected to be similar to non-respondents (Armstrong & Overton, 1977). Time trends refers to examining the time
between when the subject received notice of the survey or interview and his or her
completion time. The drawback to this method is that it is sometimes difficult to know
when the subjects were aware of the survey. If it is possible to determine when subjects
became aware, subjects who respond later are assumed to be similar to non-respondents.
The advantage of using a time trend method over the use of waves is the elimination of
bias being introduced by the stimulus itself (Armstrong & Overton, 1977). Finally,
concurrent waves refers to the same survey or stimulus being sent out to several
randomly selected subsamples at the same time. "Wide variations are used in the
inducements to ensure a wide range in rate of return among these subsamples. This
procedure allows for an extrapolation across the various subsamples to estimate the
response for 100% rate of return" (Armstrong & Overton, 1977, p. 2). The advantage of
this method is that only one wave is needed from each subsample; therefore, an early
cutoff date can be used.
Compare Sample Estimates of Respondents with other Sources. Lastly, the
existence of response bias can be established by comparing sample estimates of
respondents with other sources, such as surveys, that ask similar questions (Brick, Bose,
W., and Bose, J., 2001). Large differences could indicate bias, or at least suggest that
further consideration is necessary. This method comes with many limitations, however,
as survey items "may not be comparable because of coverage disparities, time periods
that are not the same, differences in question wording, context effects and a host of other
non-sampling error sources" (Brick, Bose, W., and Bose, J., 2001, p.4).
Non-response Categories. In order to give the most accurate interpretation of the
data, non-response can be broken into three categories: non-contacts, refusals and other.
Non-contacts for this research are those students for whom contact information was either
not available or not correct. Refusals for this research are those students who opted out
of taking the online survey, which they could do by selecting the opt-out link instead of
the link to the survey. Once a student opts out no further contact (such as reminders) is
made. The third category, other, includes late respondents and those who did not respond
at all.
Detecting Response Bias. Groves and Couper (1998) present four figures that
illustrate potential frequency distributions for non-respondents and respondents based on
a hypothetical variable, y, measured on all cases in a hypothetical population.
Figure 1-1a depicts conditions in which respondents and non-respondents are
similar and there is a high response rate. If the response rate is 95%, which is extremely
high, the mean for respondents is $201.00 and the mean for non-respondents is $228.00,
then the non-response error is .05($201.00-$228.00) = -$1.35. Figure 1-1b depicts
conditions in which respondents and non-respondents are not similar and there is a high
response rate. If the response rate is again 95%, but the mean for respondents is $201.00
and the mean for non-respondents is $501.00, then the non-response error is
.05($201.00-$501.00) = -$15.00. Figure 1-1c depicts conditions in which respondents and
non-respondents are similar and there is a low response rate. If the response rate this time is
60%, the mean for respondents is $201.00 and the mean for non-respondents is $228.00,
then the non-response error is .40($201.00-$228.00) = -$10.80. Lastly, Figure 1-1d
depicts conditions in which respondents and non-respondents are not similar and there is
a low response rate. If the response rate is again 60%, but this time the mean for
respondents is $201.00 and the mean for non-respondents is $501.00, then the
non-response error is .40($201.00-$501.00) = -$120.00. This means that the bias is 37% with
regard to the total sample mean ($321, see Table 3) (Groves & Couper, 1998).
Table 3 summarizes data from Groves and Couper (1998) to illustrate the bias and
percentage bias in the respondent mean under each of the four scenarios.
Table 3. Bias and Percentage Bias in Respondent Mean Relative to Total Sample Mean

Response   Difference   Response   Respondent   Non-respondent   Total Sample   Bias      Percentage
Rate                    Rate %     Mean         Mean             Mean                     Bias
High       Small        95         $201         $228             $202           $1.35      -0.7
High       Large        95         $201         $501             $216           $15.00     -6.9
Low        Small        60         $201         $228             $212           $10.80     -5.1
Low        Large        60         $201         $501             $321           $120.00   -37.4

Note. From Non-response in Household Interview Surveys, by R. M. Groves & M. P. Couper, 1998,
New York: Wiley and Sons, p. 19.
As shown in the section Detecting Response Bias, the formula to calculate the
bias term is as follows:
Bias = (1 - Response Rate)(Respondent Mean - Non-respondent Mean), that is, B(ȳ_r) = (1 - r)(ȳ_r - ȳ_nr)
For example, line four in Table 3 represents a low response rate with a large
difference between the respondents and non-respondents. Therefore, (1 - 0.60)($201 - $501) =
-$120.00, which is approximately a -37% bias relative to the total sample mean, a large (over 10%) bias percentage.
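The same expression can be checked numerically against the fourth row of Table 3. The short sketch below, using the table's own figures, computes the total sample mean, the bias, and the percentage bias.

```python
# Reproduce the fourth row of Table 3: low response rate (60%), large
# difference between respondent and non-respondent means.

response_rate = 0.60
respondent_mean = 201.0
nonrespondent_mean = 501.0

# Total sample mean is the response-rate-weighted average of the two means.
total_sample_mean = (response_rate * respondent_mean
                     + (1 - response_rate) * nonrespondent_mean)       # 321.0

bias = (1 - response_rate) * (respondent_mean - nonrespondent_mean)    # -120.0
percentage_bias = 100 * bias / total_sample_mean                       # about -37.4

print(total_sample_mean, bias, round(percentage_bias, 1))
```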
If it is believed that non-respondents are different from respondents in ways
critical to the research or evaluation questions being asked, non-response bias should be
examined thoroughly so that accurate generalizations can be made about the population of interest.
Because the Kalamazoo Promise Scholarship is such a new program and carries
enormous implications for Kalamazoo students, the community and for other cities
replicating the universal scholarship program, it is imperative to know that
generalizations made of this population are not skewed by non-response error or possible
non-response bias within the population. The purpose of this research is to consider this
issue in depth by looking at not only the factors of retention affecting WMU's
Kalamazoo Promise Scholarship recipients but also looking at non-response error to
determine if bias exists.
Conclusion
This chapter reviewed the relevant literature and research focused on the
Kalamazoo Promise Scholarship and what impacts this universal scholarship has on the
Kalamazoo community. Second, it examined a framework for analyzing higher education
retention, exploring what factors assist successful retention of students. The last section
covered relevant issues regarding non-response bias in social science research, including
its definitions and implications. This framework is intended to help in understanding the
purpose and results of this dissertation, namely retention factors for Western Michigan
University Kalamazoo Promise recipients. It also enables the discussion of non-response
bias in the survey research conducted for this dissertation.
Understanding, or at a minimum, describing, retention factors of WMU
Kalamazoo Promise recipients will facilitate future efforts to examine retention and
factors that enable students to succeed in higher education institutions.
CHAPTER III
METHODOLOGY
The previous chapter explained the Kalamazoo Promise Scholarship and the
impacts of this universal scholarship on the Kalamazoo community; presented a
framework on higher education retention, exploring what factors assist successful
retention of students; and discussed relevant issues regarding non-response bias in social
science research, its definitions and implications. This chapter discusses the methods and
procedures to be used in conducting this research. It is outlined as follows: (a) purpose,
(b) research design, (c) population and sample, (d) procedure for data collection, (e)
informed consent process, (f) research procedure, (g) data analysis, (h) ethical
considerations, (i) limitations and (j) summary.
Purpose
The purpose of this study was twofold, centering on retention factors and non-response bias in the population of WMU Kalamazoo Promise recipients. It sought to
determine to what extent persisters, those on probation, and non-persisters differ on
demographic characteristics and on Cognitive, Social and Institutional retention factors as
identified in Swail's (2003) Geometric Model of Student Persistence and Achievement.
As part of that inquiry, this study also examined the average courses taken per term and
number of courses taken in the first year for persisters, those on probation, and non-persisters, and for respondents and non-respondents. Additionally, the extent to which
respondent, late respondent, and non-respondent Kalamazoo Promise recipients differed
on each factor of Swail's (2003) Geometric Model of Student Persistence and Achievement was
compared with known characteristics from the academic data. Differences between early
respondents and late respondents also were compared, this time using the actual survey
data. Lastly, non-response error was examined using Groves and Couper's bias ratio
formula (1998), using both the mean, as called for in the formula, and the median, a
modification of the formula, to determine whether there was any indication of non-response bias in the survey data, and whether use of the more robust median produced
different results than using the mean.
The implications of these results could have an impact on decisions to budget (or
not) extra funds to encourage a higher response rate, so as to ensure an unbiased
representation of results in future studies of this population. Because the research
concept of non-response bias is not widely used in evaluation, examining it here will also
have important implications for evaluation and evaluation theory. Being aware of non-response bias and having the tools to detect and address it in order to achieve accurate results and
make sound interpretations is imperative in research and evaluation alike.
Research Design
The research design used in this study consisted of a non-experimental relational
design, using mixed methods. The research type is descriptive, which lends itself to
detailed descriptions of a phenomenon. Descriptive research designs are compatible with
the study of behavior and specific attributes of individuals. The study involved all
Kalamazoo Promise recipients who are attending or who have attended Western
Michigan University (WMU) since 2006, when the first Promise students entered WMU.
Some of these students are now juniors at WMU. For the first part of the research,
participants were examined in three groups: those who are attending WMU and are in
good standing (persisters, N=200), those on probation (N=51), and those who are no
longer attending WMU (non-persisters, N=49). For the non-response bias part of the
research, participants are examined by respondents (N=101) and non-respondents (N=90).
The research design involved the administration of an online survey to collect data.
Along with the survey, in-depth interviews were conducted with a random sample of
students. In addition, academic records were obtained from the Office of Student
Academic and Institutional Research for all Kalamazoo Promise Recipients who have
attended or are attending WMU. The purpose was to collect data from the sample of
Kalamazoo Promise recipients associated with WMU and to be able to make
generalizations about this particular population. Surveys were used because of the
efficiency and low cost of this type of instrumentation. In-depth interviews were chosen
because they offered detailed data. Academic records were used in order to compare
respondents, late respondents and non-respondents on key variables to determine the
possibility of non-response bias.
Sample
The population for this study consisted of all Kalamazoo Promise Scholarship
recipients who are attending or have attended WMU since the beginning of the
scholarship, a total of 307 students. Originally, data were obtained from two sources: the
Facilitator of the WMU Kalamazoo Promise and the Office of Student Academic and
Institutional Research. Initially, contact names of all WMU Kalamazoo Promise
recipients were obtained from the Facilitator, who listed 191 students. After the surveys
were returned, these 191 coded names were given to Institutional research to add
academic data to the existing data already collected. This procedure was used to comply
with the Family Educational Rights and Privacy Act (FERPA). This breakdown is
detailed in Table 4.
Table 4. Data Obtained from the Facilitator of the WMU Kalamazoo Promise and the
Office of Student Academic and Institutional Research

                  Facilitator   Institutional Research
Persisters            155                200
On Probation            0                 51
Non-persisters         36                 49
Total                 191                307
The breakdown of the population of students by persistence and type of data
obtained is detailed in Table 5. Surveys were sent to 191 students, the number initially
provided by the Facilitator of the Kalamazoo Promise at WMU. Of these 191 students,
101 responded and 90 did not respond. Of the 72 students invited to interview, only
14 agreed to an interview. Academic records and course data were obtained from the
Office of Student Academic and Institutional Research on all 307 WMU Promise
students.
Table 5. Breakdown by Type of Data Collected and Persistence

                                 Persisters   On Probation   Non-persisters   Total
Survey Respondents (191 sent)        87             10               4          101
Interviews                           12              2               0           14
Academic Records/Course Data        200             51              49          300

Note. N=300 WMU Kalamazoo Promise Recipients. Seven did not have probation status listed. Surveys
were sent to 191 students; 101 responded, 90 did not respond.
Procedures for Data Collection
Subject Recruitment. The subject selection initially was obtained from Patricia
Williams, the facilitator of the Kalamazoo Promise support program at Western Michigan
University (WMU). Patricia Williams had access to all Kalamazoo Promise students at
WMU for her job. She provided the researcher with the names and current contact
information of all Kalamazoo Promise recipients who are attending or who have attended
WMU to which she had access.
Survey. The entire population of Kalamazoo Promise recipients known from the
facilitator of the Kalamazoo Promise (N=191) who are attending or who have attended
WMU was asked to participate in a survey titled: Survey of Promise Scholarship
Recipients at WMU Spring 2009. This survey was developed by the researcher based on
the three factors of Swail's (2003) Geometric model of student persistence and
achievement. A copy of the survey questions sorted by the three factors can be found
in appendix A. The Promise recipients were contacted through an initial e-mail,
Introduction Survey E-mail protocol (see Appendix A).
Interview. In addition to the survey of the population of WMU's Kalamazoo
Promise scholarship recipients, two samples of recipients were invited to learn more
about this study. Those who responded and then consented to do so were invited to
participate in an in-depth interview.
The first sample was all of the Kalamazoo Promise scholarship recipients no
longer attending WMU, N=36. The second sample (N=36) was a random sample of
Kalamazoo Promise scholarship recipients still at WMU. This random sampling
occurred by using the contact list of names from the facilitator of the Kalamazoo Promise
scholarship program. Names were randomized so they were not alphabetical and every
seventh person on the list was chosen until there were 36 names in the random sample.
All 72 students in the two samples were contacted via e-mail by the researcher.
The e-mail (Interview E-mail Invitation, appendix A) explained the research and asked
students, the potential participants, if they were willing to learn more about the project.
Those who wanted to learn more about the project were instructed to contact the
researcher to set up a time to meet. Once a student contacted the researcher, the
researcher contacted the student by phone and arranged a convenient time to meet
(Interview Phone Invitation in appendix A).
Academic Records. In addition to the survey and interviews, academic records
were obtained from all Kalamazoo Promise Scholarship recipients who are attending
WMU or who attended WMU in the past (see Table 5). This collection was done after
the initial collection of data using the survey in order to link academic data to the survey
data without using names. In order to meet the Family Educational Rights and Privacy
Act of 1974 (FERPA or the Buckley Amendment) guidelines, no academic records had
any identifying information on them (such as student names or social security numbers);
therefore a strict coding system was enforced.
The data from each sample was compared to determine what differences, if any,
there were between recipients WMU retained versus those WMU did not retain. In
addition, the question of non-response error was examined by comparing students who
responded to the online survey with those who did not respond in order to see if there was
an indication of non-response bias associated with this study.
Informed Consent Process
There were two separate consent processes, one for the survey and one for the
interview. The following section is broken into a consent process for the survey and
another consent process for the interview.
Survey Consent Process. Students were contacted by e-mail (Introduction Survey
E-mail Protocol appendix A) which explained the research and asked the student if he or
she were willing to participate. A student who wanted to participate was instructed to
click on the link to the Survey Monkey online survey (see appendix A) to read more
information about the study and read the informed consent (see appendix A). There was
a check box for participants to click if they wanted to proceed, which constituted their
consent to use the information they provided for the dissertation research. Participants
who did not click the box to confirm they had read and were giving their consent were
not able to proceed to the survey.
Interview Consent Process. Students were contacted by e-mail (Interview E-mail
Invitation, appendix A) which explained the research and asked students, the potential
participants, if they were willing to learn more about the project. Those who wanted to
learn more about the project were instructed to contact the researcher to set up a time to
meet. Once a student contacted the researcher, she contacted the student by phone, and
arranged a time that worked for the individual to meet (Interview Phone Invitation,
appendix A). At the meeting the consent document was reviewed, with the option to
continue and sign the document or discontinue. If the potential participants chose to sign
the consent form and agreed to participate in an interview, the interview took place at that
time. No one participated in the interview until the consent document had been
thoroughly reviewed, questions answered, and the consent document signed.
Academic Records. No consent was needed as all records are de-identified,
meaning there were no student names attached to this information.
Research Procedure
Method of Data Collection. Data were collected through an online survey, an
interview, or both, as well as from academic records. The survey was sent out first and
the interview process started. After this phase had been completed the office of
Institutional Research at WMU was contacted to obtain academic records grouped
aggregately by categories. No student names were linked to this data.
Survey. E-mail addresses were then put into Survey Monkey to enable
distribution of the survey electronically. This also allowed follow up e-mails to only go
to those who had not yet completed a survey. Only the primary researcher had access to
this information. An e-mail was sent to all Kalamazoo Promise recipients with a link to
the Survey Monkey survey. (See appendix A for Introduction Survey Email Protocol).
Non-response was monitored automatically in Survey Monkey through the e-mail
addresses, and non-respondents were sent a reminder email (First Reminder E-mail to
Take Survey, see appendix A) regarding the survey one week after the survey was e-mailed. A second e-mail reminder (Second Reminder E-mail to Take Survey, appendix
A) was sent three days later to those who had not yet filled out the survey. Any student
whose survey came in after this second reminder was considered a late respondent. This non-response data was compared to the response data (data that arrived before the second
reminder). The survey stayed open until the end of April 2009.
At the end of the electronic survey, participants were offered the opportunity to
enter a random drawing with a chance to receive one of ten $20 WMU bookstore gift
cards. Participants needed to provide their name and e-mail address, which was not
linked to their survey results, to enter the drawing.
Interview. Two samples were invited to learn more about the study. They were
contacted first through e-mail to invite them to learn more about the study (Interview Email Invitation, appendix A). A week later they were contacted by e-mail again to invite
them to participate in an interview if they agreed to participate (Interview E-mail
Invitation appendix A).
The interview invitation was separate from the survey invitation because of the
preference for taking a random sample of Kalamazoo Promise scholarship recipients and
not just those who answered the survey. That these respondents might be different was
understood and was to be investigated for the non-response bias part of this research.
It was hoped that the researcher would be able to start setting up interviews right
away. An e-mail reminder of the date, time and location of the interview was sent a few
days before the interview, and a reminder call was made the day before the interview. A
copy of this E-mail Reminder of Interview and Reminder Phone Call of Interview can be
found in the appendix. The interview did not last more than one hour. Participants were
given a copy of the study information sheet and the consent form to keep (see appendix
A). Both forms were discussed. One copy of the consent form was signed by the
participant and kept by the researcher; the other copy, along with the study information
sheet, was the participant's to keep. The interview was electronically recorded with a
digital audio recorder and transcribed later. After the interview was completed, all
participants were given a $20 WMU bookstore gift card as a thank you for their
participation.
All interviews were completed by the researcher, her business partner, Nakia
James, or Katya L. Gallegos Custode and Tammy DeRoo, both graduate research
assistants at the College of Education working with Dr. Miron. Both the researcher and
Nakia James are partners with Momentum Consulting and Evaluations, L.L.C. and are
experienced in interviewing and proficient in working together on extensive projects. All
interviewers took and passed the Human Subject Institutional Review Board exam at
Western Michigan University.
Academic Records. Academic records were obtained through the office of
Institutional Research at WMU. All records were de-identified, meaning there were no
student names on this information. Records were grouped only by persisters, those on
probation, non-persisters, respondents and non-respondents.
Instrumentation. The survey and the interview protocol (see appendix A) were
developed from research-based factors found in the literature review, principally Swail's
Geometric Model, and modified from other validated surveys, including the National
Survey of Student Engagement (NSSE), the Community College Survey of Student
Engagement (CCSSE), and the College Student Experience Questionnaire (CSEQ). In
addition, a separate reliability analysis was run on the newly created survey by each
retention factor: cognitive, social and institutional.
Items for the continuous/interval level data from the cognitive, social, and
institutional factors included from the Survey of Promise Scholarship Recipients at WMU
Spring 2009 were grouped to create the following four summated subscales: cognitive
engagement, social demands, institutional support, and social engagement. The exact
details of each of the scales can be found in Appendix C.
The Cognitive Engagement Subscale consisted of 20 items (Items C7 through
C27) from the Cognitive Factor of the survey. Participants responded to items on the
Cognitive Engagement subscale using a 5-point Likert-type scale where the responses
ranged from 1 = not likely to 5 = very likely.
The Social Demands subscale consisted of five items (Items S13 through S17)
from the Social Factor of the survey. Participants responded to items on the Social
Demands subscale using a 5- point Likert-type scale, where the responses ranged from 1
= never to 5 = often.
The Institutional Support subscale consisted of six items (Items I7 through I13)
from the Institutional Factor of the survey. Participants responded to items on the
Institutional Support subscale using a 5-point Likert-type scale, where the responses
ranged from 1 = never to 5 = very often.
The Social Engagement subscale consisted of five items (Items I14 through I18)
from the Institutional Factor of the Survey. Participants responded to items on the Social
Engagement subscale using a 5- point Likert-type scale, where the responses ranged from
1 = very little to 5 = very much.
Summated scales offer an advantage over single-item scales in that such scales
can be assessed for reliability and for the unidimensionality of the construct being
measured (Thorndike, 1967). Items assigned to each scale were summated together to
yield total scores. Before running statistical procedures on data from the survey, the
researcher assessed the internal consistency of the scales using reliability analysis.
Cronbach's coefficient alpha was used to measure the internal consistency of the scales
included in the survey (Cohen, 1988; Trochim, 2007). While any test developer hopes to
obtain a reliability coefficient that approaches 1.0, such a value is rarely obtained in
behavioral and social science research. The obtained alphas were evaluated against the
criterion value of alpha = .70 suggested by Kaplan and Saccuzzo (2005). The research
indicates that values of .70 or greater indicate that a scale is internally consistent
(Kaplan & Saccuzzo, 2005; Mertler &
Vanatta, 2005). Table 25 presents a summary of the descriptive statistics for the four
scales.
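For readers unfamiliar with the computation, the sketch below shows one common way to obtain Cronbach's coefficient alpha for a summated subscale from an item-by-respondent matrix; the item responses are hypothetical and the code is illustrative rather than the study's own.

```python
# Sketch of Cronbach's coefficient alpha for a summated scale:
# alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to a five-item, 5-point Likert subscale.
responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [2, 2, 1, 2, 2],
    [5, 4, 5, 5, 4],
    [3, 4, 3, 3, 4],
])
print(round(cronbach_alpha(responses), 2))   # values of .70 or greater suggest internal consistency
```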
Location of Data Collection. Data collection for the survey took place online. It
was the participants' choice where they took the online survey: in their own homes or
wherever else they felt comfortable taking it.
Interviews took place in conference room 1411 of Sangren Hall or in the
Multicultural Affairs conference room in Trimpe Hall. Both
locations were private but convenient for WMU students.
Duration of the Study. The study lasted from the beginning of April until the end
of June 2009. Survey respondents took approximately 20 minutes to fill out the survey
online. Interview respondents took no more than one hour. All surveying was completed
by the end of April, and interviews were completed by mid-May.
Data Analysis
The purpose of this study was to determine to what extent persisters, those on
probation, and non-persisters differ on demographic characteristics, and on each of the
following select factors in Swail's (2003) Geometric Model of Student Persistence and
Achievement: Cognitive, Social and Institutional retention factors. This study also
examined the average number of courses taken per term and number of courses taken the
first year of persisters, those on probation, and non-persisters, and of respondents and
non-respondents. Additionally, the extent to which respondent, late respondent, and non-respondent Kalamazoo Promise recipients differ on each factor of Swail's (2003) Geometric
Model of Student Persistence and Achievement was examined against known
characteristics from the academic data. Differences between early respondents and late
respondents also were examined, using the actual survey data. Lastly, non-response error
was examined using Groves and Couper's bias ratio formula (1998), using first the mean,
as called for in the formula, and then the median, a modification of the formula, to
determine whether there was any indication of non-response bias in the survey data and
whether using the mean estimator produced different results than using the median
estimator. Descriptive and inferential statistics methods, including frequency tables,
Multivariate Analysis of Variance analyses (MANOVA) and Chi-square Test of
Independence analysis, were used along with the bias formula in order to accomplish this
task.
MANOVA is a parametric statistical procedure used to measure differences in
scores between at least two groups (Mertler & Vanatta, 2007). As such the MANOVA is
preferred over the use of several univariate ANOVAs for the following reasons: a)
several dependent variables can be assessed simultaneously; b) results may be obtained
that may not be detected in univariate tests, such as interaction effects among variables;
c) it reduces the overall likelihood of a Type I error by statistically maintaining the
overall rate at the level determined by the researcher. As a parametric statistical
procedure, the following assumptions apply to the data (Kilpatrick & Feeney, 2007;
Sprinthall, 2007): a) independence, b) normality, and c) homoscedasticity.
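A minimal sketch of such a procedure, using the MANOVA implementation in the statsmodels library and hypothetical column names for the academic variables, is shown below; it illustrates the kind of model involved rather than the exact analysis run for this study.

```python
# Sketch: MANOVA testing whether several continuous dependent variables
# (e.g., high school GPA, most recent WMU GPA, ACT composite) differ
# across persistence groups. Data and column names are hypothetical.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.DataFrame({
    "group":   ["persister"] * 4 + ["probation"] * 4 + ["non_persister"] * 4,
    "hs_gpa":  [3.8, 3.5, 3.9, 3.4, 3.1, 3.2, 3.0, 3.3, 3.0, 2.9, 3.1, 2.8],
    "wmu_gpa": [3.4, 3.1, 3.6, 2.9, 1.9, 2.2, 1.7, 2.0, 1.2, 1.0, 1.4, 0.9],
    "act":     [25, 22, 27, 23, 20, 21, 19, 22, 19, 18, 21, 17],
})

manova = MANOVA.from_formula("hs_gpa + wmu_gpa + act ~ group", data=df)
print(manova.mv_test())   # Wilks' lambda, Pillai's trace, etc., for the group effect
```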
Independence of scores. The researcher assures independence of scores at the
outset of participant selection. MANOVA is not very sensitive to this violation, but it
must be addressed (Kilpatrick & Feeney, 2007). Because the participants submitted the
surveys anonymously online, the researcher presumed that participants completed the
surveys independently without the assistance or input from other participants.
Multivariate normality. This assumption posits that each group's pattern of
scores should reflect the shape of the normal distribution (Hair, Anderson, Tatham &
Black, 1995). The Kolmogorov-Smirnov Test Statistic was used to test this assumption
(Hair, et al., 1995; Kilpatrick & Feeney, 2007). Separate test statistics were computed for
each dependent variable. A summary of the results is presented in Appendix D, which
indicates that the assumption of normality was violated for several variables. This
violation, however, does not affect the analysis run as the F-test is robust and violations
of the assumptions of normality have minimal effect under certain conditions (Creswell,
2005), including the conditions of this non-experimental relational design.
Homogeneity of variance. This assumption posits that there must be equal
variances between groups. The Levene Test Statistic (Kilpatrick & Feeney, 2007) was
used to test this assumption. Separate test statistics were computed for each dependent
variable. A summary of the results is presented in Appendix E. Results indicate the
assumption was upheld for all of the academic variables except composite ACT scores.
The assumption was also upheld for all four subscale scores of the Survey of Promise
Scholarship Recipients at WMU Spring 2009.
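The sketch below illustrates, with hypothetical data, how the Kolmogorov-Smirnov and Levene statistics referred to above can be computed for one dependent variable across the three persistence groups using SciPy; it is an example of the technique, not the study's own code.

```python
# Sketch: test normality (Kolmogorov-Smirnov) and homogeneity of variance
# (Levene) for one dependent variable across three groups. Data are hypothetical.
import numpy as np
from scipy import stats

persisters     = np.array([3.8, 3.5, 3.6, 3.9, 3.4, 3.7, 3.2, 3.5])
on_probation   = np.array([3.1, 3.2, 2.9, 3.3, 3.0, 3.2])
non_persisters = np.array([3.0, 2.8, 2.9, 3.1, 2.7, 3.0])

# Kolmogorov-Smirnov test against a normal distribution with the sample's
# own mean and standard deviation (repeat per group or per variable).
ks_stat, ks_p = stats.kstest(persisters, "norm",
                             args=(persisters.mean(), persisters.std(ddof=1)))
print(f"K-S: statistic = {ks_stat:.3f}, p = {ks_p:.3f}")

# Levene test for equal variances across the three groups.
lev_stat, lev_p = stats.levene(persisters, on_probation, non_persisters)
print(f"Levene: statistic = {lev_stat:.3f}, p = {lev_p:.3f}")
```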
Interpretations of the assumptions. The assumptions of normality and
homogeneity of variance are most critical in the case of experimental research designs
(Creswell, 2005). The research for this study was a non-experimental relational design.
While the assumptions of normality and homogeneity of variance were not upheld for the
data in this study, research suggests that violations of the assumption of normality have
little effect under certain conditions. Creswell (2005) also states that the F-test is robust
and violations of the assumptions of normality and homogeneity of variance have
minimal effect under certain conditions. Specifically, Creswell (2005) states that if the
larger group variance is no more than four times the smallest group variance, then
violations of the assumption of homogeneity of variance will have minimal effect on the
results of the MANOVA procedure. The review of the descriptive statistics for the groups
across the dependent variables revealed no cases in which the 4-to-1 ratio for the
differences in group variances was upheld. Consequently, the researcher deemed that the
violations of the assumptions were acceptable considering the exploratory nature of the
research.
The Chi-square test of independence. This statistical procedure measures the
degree to which a sample of data comes from a population with a specific distribution
(Mertler & Vanatta, 2007; Rosenberg, 2007; Stevenson, 2007). It tests whether the
observed frequency count of a distribution of scores fits the theoretical distribution of
scores. This issue was addressed through the use of the Pearson's Chi-square (χ²)
procedure (Mertler & Vanatta, 2007; Rosenberg, 2007). A non-significant finding
indicates no statistically significant differences between the observed and expected
frequencies on the variables of interest.
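As a small illustration of the procedure, the sketch below runs a Pearson chi-square test of independence on a hypothetical contingency table of persistence group by a categorical variable (for example, living in a dorm); the counts are invented for the example.

```python
# Sketch: Pearson chi-square test of independence on a contingency table.
# Rows: persistence groups; columns: a categorical variable (e.g., lived in
# a dorm: yes / no). Counts are hypothetical.
from scipy.stats import chi2_contingency

observed = [
    [120, 80],   # persisters:     dorm yes / no
    [ 25, 26],   # on probation
    [ 20, 29],   # non-persisters
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A non-significant result indicates no detectable association between
# persistence group and the categorical variable.
```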
Research Questions 1 and 2
All research questions and sub-questions with corresponding variables and analysis
are detailed in Table 6. To determine whether differences exist on the selected factors
among persisters, those on probation, and non-persisters, by gender and race, Chi-square
Test of independence analysis and MANOVA analysis were applied using the academic
data provided by the Office of Student Academic and Institutional Research.
In order to examine persisters, those on probation, and non-persisters on average
number of courses taken per term and number of courses taken the first year, by gender
and race, ANCOVA analyses were employed using the course summary data provided by
the Office of Student Academic and Institutional Research. First, however, two new
variables were created manually. The average number of courses taken per student per
term was calculated by adding up all of the fall and spring courses taken per student and
dividing by the number of fall and spring terms the student attended. The first-year
courses were calculated by adding the courses from the first fall and spring term that
students attended WMU.
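A sketch of how these two derived variables might be built from a term-level course file, and then carried into an ANCOVA-style model with gender and race entered as categorical covariates, is given below. The column names, simulated data, and model call are illustrative assumptions rather than the study's actual data or code; the same model form would apply whether the grouping variable is persistence status or response status.

```python
# Sketch: derive average courses per term and first-year courses from a
# hypothetical term-level file (one row per student per fall/spring term),
# then fit an ANCOVA-style model. Column names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
n = 60  # hypothetical number of students

rows = []
for sid in range(n):
    n_terms = int(rng.integers(2, 7))           # 2 to 6 fall/spring terms attended
    for term in range(1, n_terms + 1):
        rows.append({"student_id": sid, "term_order": term,
                     "courses": int(rng.integers(2, 6))})
terms = pd.DataFrame(rows)

# Average courses per term: all fall/spring courses divided by terms attended.
avg_per_term = terms.groupby("student_id")["courses"].mean().rename("avg_courses")
# First-year courses: courses from the first fall and first spring terms.
first_year = (terms[terms["term_order"] <= 2]
              .groupby("student_id")["courses"].sum().rename("first_year_courses"))

students = pd.concat([avg_per_term, first_year], axis=1).reset_index()
students["group"]  = rng.choice(["persister", "probation", "non_persister"], size=n)
students["gender"] = rng.choice(["F", "M"], size=n)
students["race"]   = rng.choice(["White", "Black", "Other"], size=n)

# ANCOVA-style model: group effect on average courses per term,
# adjusting for gender and race as categorical covariates.
model = ols("avg_courses ~ C(group) + C(gender) + C(race)", data=students).fit()
print(sm.stats.anova_lm(model, typ=2))
```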
In order to examine respondents, late respondents, and non-respondents of the
Survey of Promise Scholarship Recipients at WMU Spring 2009 in average number of
courses taken per term and number of courses taken the first year, by gender and race,
ANCOVA analyses were employed using the course summary data provided by the
Office of Student Academic and Institutional Research. The Survey of Promise
Scholarship Recipients at WMU Spring 2009 was used only to distinguish between
respondents, late respondents, and non-respondents for this question.
Research Question 3
All research sub-questions with corresponding variables and analysis are detailed in
Table 7. To determine if possible non-response bias exists, several procedures were
undertaken. The first was determining whether there was a statistically significant
difference between respondents (n=52), late respondents (n=49), and non-respondents
(n=84) using known variables of each group from the academic data set (see Table 15).
This is one of the suggested ways to determine if bias is a possibility. "After examining
the respondents with the non-respondents on known characteristics, if no statistically
significant difference is found, then the results can be generalized both to the sample and
the population" (Diem, 2004, p. 2). In addition, variables from the Survey of Promise
Scholarship Recipients at WMU Spring 2009 were used to compare early respondents and
late respondents. This type of comparison is commonly used in social science research to
determine an effect, if any, on the statistic being considered when examining non-response and possible bias (Miller and Smith, 1983; Smith, 1984). Next the means from
each variable were plugged into Groves and Couper's bias ratio formula (1998), which
includes the mean for respondents and non-respondents or, in this case, late respondents.
Lastly, the median from each variable was plugged into a modified version of Groves and
Couper's bias ratio formula (1998), replacing the mean, because the median is a more
robust estimate than the mean. When samples are small, as is the case in
this research, the mean is sensitive to extreme scores; the median, the point where half
the scores fall below and half above, is less sensitive to extreme scores.
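To make the mean-versus-median modification concrete, the sketch below applies the same bias expression twice, once with group means and once with group medians, to a small hypothetical set of subscale scores containing one extreme value; it illustrates the idea rather than reproducing the study's calculations.

```python
# Sketch: bias ratio computed with the mean (as in Groves & Couper, 1998)
# and, as a modification, with the more outlier-resistant median.
# Scores are hypothetical subscale totals for respondents and late respondents.
import numpy as np

respondents      = np.array([78, 74, 81, 69, 77, 80, 30])   # one extreme low score
late_respondents = np.array([72, 70, 75, 68, 74])

response_rate = len(respondents) / (len(respondents) + len(late_respondents))

def bias(center_fn) -> float:
    """(1 - r) * (center of respondents - center of late respondents)."""
    return (1 - response_rate) * (center_fn(respondents) - center_fn(late_respondents))

print("bias using the mean:  ", round(bias(np.mean), 2))
print("bias using the median:", round(bias(np.median), 2))
# The extreme score pulls the mean-based estimate; the median-based
# estimate is much less affected.
```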
If no statistical difference is found among respondents, late respondents, and non-respondents using the academic data set, and no indication of bias exists using the mean
or the median calculations using the survey data on the respondents and late respondents,
then the results from the survey data can be generalized to the population of Kalamazoo
Promise Students across cognitive, social and institutional factors of retention without
concern of non-response bias. If an indication of bias is found, the results of the survey
data can only speak for the respondents of the survey and not for those who did not
respond. Tables 6 and 7 organize all of the research questions with their dependent and
independent variables along with the source of the data and method of analysis.
Table 6. Summary Research Questions 1 and 2 with Independent and Dependent
Variables, Data Source and Method of Analysis

1.1 Difference in Cognitive Factors
    Independent variables: Persisters = Good Standing; On Probation and Non-persister = Academically dismissed; Gender (M, F); Race (White, Black, Other)
    Dependent variables: High school GPA, most recent WMU GPA, ACT composite score (Academic Records; MANOVA); taking remedial math, reading, or writing, or taking AP credit (Academic Records; Chi-Square)

1.2 Difference in Social Factors
    Independent variables: Persisters = Good Standing; On Probation and Non-persister = Academically dismissed; Gender (M, F); Race (White, Black, Other)
    Dependent variables: Parental income (Academic Records; MANOVA); living in a dorm, being an athlete (Academic Records; Chi-Square)

1.3 Difference in Institutional Factors
    Independent variables: Persisters = Good Standing; On Probation and Non-persister = Academically dismissed; Gender (M, F); Race (White, Black, Other)
    Dependent variables: First year experience and high school (Academic Records; Chi-Square)

2.1 Difference in Persistence
    Independent variables: Persisters = Good Standing; On Probation and Non-persister = Academically dismissed; Gender (M, F); Race (White, Black, Other)
    Dependent variables: Average number of courses taken per term and number of courses taken the first year (Course Summary; ANCOVA with gender and race as covariates)

2.2 Difference in Respondents
    Independent variables: Respondent, Late Respondent, and Non-respondent; Gender (M, F); Race (White, Black, Other)
    Dependent variables: Average number of courses taken per term and number of courses taken the first year (Course Summary; ANCOVA with gender and race as covariates)
Table 7. Summary Research Question 3 with Independent and Dependent Variables, Data
Source and Method of Analysis

3.1 Differences among respondents, late respondents, and non-respondents on dependent variables from academic data
    Independent variables: Respondents, late respondents, and non-respondents
    Dependent variables: (a) high school GPA, (b) most recent WMU GPA, (c) ACT composite score, (d) parental income (Academic Records; MANOVA); (e) taking a remedial math course at WMU, (f) taking a remedial reading course at WMU, (g) taking a remedial writing course at WMU, (h) taking AP credit, (i) living in a dorm, (j) being an athlete, (k) first year experience (FYE), or (l) high school (Academic Records; Chi-Square)

3.2 Differences between early and late respondents on dependent variables from survey data
    Independent variables: Early respondent and late respondent
    Dependent variables: Cognitive, Social and Institutional factors from Swail's (2003) Geometric Model of Student Persistence and Achievement (see factors/questions table in Appendix) (Survey; MANOVA for subscales, Chi-Square for categorical level data)

3.3 Examining indication of non-response bias using the bias ratio formula on subscales from the survey
    Independent variables: Early respondent and late respondent
    Dependent variables: (See factors/questions table in Appendix) (Survey; Bias Ratio Formula)
Ethical Considerations
Every effort was made to ensure the ethical treatment of all participants. The
researcher complied with all standards of the Human Subject Institutional Review Board
(HSIRB). It was the responsibility of the researcher to revere the needs and rights of all
participants (Locke, Spirduso & Silverman, 2000). The process of informed consent was
followed to achieve this. The researcher: (1) acquired written permission of Western
Michigan University's Human Subjects Institutional Review Board (HSIRB); (2) clearly
explained the objectives of the study in writing to the survey participants and in writing
and orally to the interview participants; (3) acquired consent from each interview
participant with the consent form found in Appendix A, and from each survey participant
with the same consent form, with the survey participants placing a check mark indicating
their consent after reading the form in Survey Monkey, an online survey program; and (4)
gave each interview participant a $20 WMU book store gift card as a thank you for
participating.
Risks and Costs to and Protections for Subjects. There were no known
anticipated physical, psychological, social or economic risks to the participants.
Participants in the survey could complete it in the comfort and privacy of their own
homes or wherever was convenient for them. Interview participants who did not live on
campus or were no longer attending WMU may have been inconvenienced because
interviews were held on campus. For those living on campus or who had class on
campus, arrangements were made to schedule the interview at a time most convenient for
them. Regardless of whether or not the participant lived on or off campus, the interview
was scheduled at a time that was most convenient for him or her. Also, a $20 WMU
book store gift card was given for participating in the interview and, it is hoped, offset
any inconvenience.
Interviews were conducted at each student's convenience; therefore, there was no
disruption to any class or administrative function. The interviews took place in a
conference room in either Sangren Hall or Trimpe Hall on campus to ensure a neutral
environment as well as to take advantage of the excellent audio capabilities. Interviews
were digitally recorded.
Benefits of Research. This research may benefit institutions working on their own
retention issues or those that would like to replicate the Kalamazoo Promise Scholarship.
It may also shed some light on possible non-response bias that exists in social science
research and evaluation. The knowledge base of the Kalamazoo Promise, college
retention, persistence and non-response bias was expanded upon.
The participants in the research may have benefited from the study through self-reflection on their academic journey. Since the Kalamazoo Promise Scholarship is free to
these students, they may have felt good about "giving back" to help this program
succeed. Participants may have also learned more about the Kalamazoo Promise. For
example, one question asked if participants were aware that they could still use their
Kalamazoo Promise Scholarship even if they took time off from college. Participants not
aware of this benefit may have found such information helpful.
Confidentiality of Data. Every effort was made to keep all information
confidential.
Survey Data. Survey data was only linked by e-mail address and was seen only
by the researcher and was kept confidential. Students' names were never used.
Interview Data. Interview data initially had a student's name attached to it. In
order to make the interviewee comfortable the participant was referred to by name during
the signing of the consent form. They were told, however, that once the recording device
was turned on their name would not be used and that a code number would be given for
the transcription of the interview. In that way their names were kept confidential even in
the transcription. Consent forms, names and code numbers were only seen by the
researcher. It was anticipated that there would be approximately 72 interviews, with the
code consisting of numbers 1 through 72. All interview data was reported without using
student names. The digital recorders were kept in a locked filing cabinet in the
researcher's home until transcribed. No one except the researcher had access to these
recorders. Once transcription was completed the digital interview recordings were
deleted immediately and no record of the audio files was kept. The transcription was
kept in a locked filing cabinet in the researcher's home. The consent forms were kept in
a manila envelope in a locked filing cabinet in Dr. Miron's office to which no one except
Dr. Miron has access. After the study's completion, the data continues to be stored at
WMU in Dr. Miron's office in a locked filing cabinet. In three years these consent
forms, transcriptions and data will be disposed of by burning.
The doctoral researcher does not work at WMU, has no way to know who these
students are, and has no ability to make any decisions regarding their academic
performance at WMU. Anyone who reads this dissertation will have no way of knowing
who participated other than that participants are recipients of the Kalamazoo Promise
Scholarship. Since there are 307 who have attended or attend WMU, it would be
impossible to link any aggregate data to any individual student. Confidentiality is
assured.
Academic Records. Confidentiality was not an issue for academic records as no
student names are linked to any of this data.
Limitations
The limitations of this research, first and foremost, are due to the limited sample
size. Ideally, all Kalamazoo Promise recipients would be included in such a study.
However, because these recipients have chosen to go to 26 different higher education
institutions (Jorth, 2009), it was impossible to include all Promise recipients within the
scope of this project. Therefore, the scope of this project was limited to only those
Kalamazoo Promise Scholarship recipients who attended Western Michigan University at
some point since 2006, when the first recipients attended their first semester at a higher
education institution. The entire population of Western Michigan University Kalamazoo
Promise recipients was included in this project.
Only 4 out of 13, or 31%, of those WMU students who were academically
dismissed and therefore are no longer at WMU responded to the survey. This data came
from the responses and non-responses of the Survey of Kalamazoo Promise Recipients at
WMU Spring 2009 (147 in Good Standing, 25 on Probation, and 13 Academically
Dismissed, 6 no probation status listed, 191 total surveyed) and from the academic data
obtained from the Office of Student Academic and Institutional Research. This response
rate of 31% is considered very low. Considering the nature of this research and the
information needed directly from these students to examine differences between
persisters, those on probation, and non-persisters, it is essential to take note of this
response rate. Generalizations can only be made to the sample that responded. The goal
here was to generalize to the population of WMU Kalamazoo Promise recipients, and in
order to accomplish this non-response bias was examined in depth.
Summary
This chapter has provided an overview of the methods and procedures used in
compiling this mixed methods research project, including the ethical treatment of all
participants. Mixed methods were used based on the needs of the research questions and
the types of data used to answer these questions. Using mixed methods allowed the
research questions to be answered without being restricted by the demands of any single
approach. "Mixed methods research is formally defined here as the class of research where
the researcher mixes or combines quantitative and qualitative research techniques,
methods, approaches, concepts or language into a single study" (Johnson &
Onwuegbuzie, 2004, p. 17). The research questions lent themselves to each method.
Chapter IV presents the findings from the analysis of these data sources.
CHAPTER IV
RESULTS
Due to the complexity of the research questions, the results have been broken into
five sections. The initial results are a summary of the academic data received from
WMU's Office of Student Academic and Institutional Research, which are included in
this first section of the results under the heading Summary Academic Data. The second
section is a summary of the Survey of Promise Scholarship Recipients at WMU Spring
2009 under the heading Survey Summary. Each of the other sections was broken up by
research question. The research sub-questions are then each answered separately using
the academic data, which includes the course summary data and the survey data as
needed.
Summary Academic Data
The summary academic data is intended to give a brief overview of the study's
population in terms of students' probation status, gender, race, and survey participation.
This data was obtained on all 307 students who have received the Kalamazoo Promise
and are attending or have attended WMU. This background provides a description of the
sample involved in this study. The socio-demographic characteristics of these 307
Western Michigan Promise scholarship recipients can be seen in Table 8, delineated by
whether these students have persisted (still attend), are on probation or have not persisted,
meaning that they were academically dismissed from WMU.
Table 8. Socio-Demographic Characteristics of WMU Kalamazoo Promise Recipients by
Persistence

                     Persister   On Probation   Non-persister   Total by          % Non-persist
                     n/%         n/%            n/%             Population n/%    within Group(a)
Gender
  Male               102/34      29/10          32/11           163/53            19.6
  Female              98/33      22/7           17/6            137/46            12.4
Race
  White              130/43      29/10          24/8            183/61            13.8
  Black               35/12      13/4           17/6             65/22            26.2
  Asian               14/5        1/.3           1/.3            16/5              6.3
  Hispanic             8/3        5/2            3/1             16/5             18.8
  American Indian      2/.7       1/.3           0/0              3/1              0
  Unknown             11/4        2/.7           4/1             17/6             23.5

Note. N=300, not 307, as seven students had no probation status listed. Total percent may be off because of
rounding. (a) % non-persistence within group, e.g., male = 32/163 = 19.6%. Source: academic data from
the Office of Student Academic and Institutional Research.
Results show that males represented the largest percentage (53.4%) in the sample.
In terms of race, the initial data analysis revealed that there were too few participants in
the Hispanic, Asian, Native American, and Unknown categories to conduct meaningful
statistical comparisons. Therefore the researcher collapsed the responses for those four
groups into one group, which was labeled Other.
The summary academic data is reported through several different variables:
Probation, Group, and Group by Response. The variable Probation had six levels:
Good Standing, Academic Dismissal, Extended Probation, Final Probation, Probation,
and Academic Warning. The Probation status variable was collapsed into three levels by
combining Extended Probation, Final Probation, Probation, and Academic Warning into
a new category, On Probation, after this initial description.
Initially, 191 Promise recipients were identified with the help of the WMU
Kalamazoo Promise Scholarship facilitator. The 191 were given the opportunity to
complete the Survey of Promise Scholarship Recipients at WMU Spring 2009. However,
WMU's Office of Student Academic and Institutional Research had 307 Promise students
on record as attending or having attended WMU. Of those, 7 had no record of probation
status and were excluded from the analysis. Of the remaining 300, the overall percentage
of persisters was 66.66%, with another 17% on probation. The remaining 16% had been
academically dismissed and therefore did not persist. Table 9 gives an exact count of
students in each probation status group with their average high school GPA.
Table 9. Average High School GPA by Probation Status at WMU

                                                       95% Confidence Interval
Probation Status             N     Mean     SD       Lower Bound   Upper Bound    Min     Max
Persisters
  Good Standing             200    3.496    0.762       3.390         3.602       0.00    4.65
On Probation
  Extended Probation          6    2.920    0.361       2.541         3.299       2.43    3.46
  Final Probation             1    2.990      .            .             .        2.99    2.99
  Probation                  23    3.163    0.477       2.957         3.369       2.36    3.98
  Academic Warning           21    3.209    0.422       3.016         3.401       2.57    3.85
Non-persisters
  Academic Dismissal         49    3.044    0.578       2.878         3.210       0.00    3.89
Total                       300    3.363    0.713       3.282         3.444       0.00    4.65

Note. Seven students had no record of probation status and were not included in this analysis, therefore N = 300 instead of 307. Academic Dismissal = Non-persister.
An exploratory analysis using an Analysis of Variance (ANOVA), not surprisingly, found the mean high school GPA based on probation status at WMU to be statistically different [F(5, 294) = 4.711, p = 0.000]. The source of this difference was not determined, however.
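For readers who wish to reproduce this kind of exploratory comparison, the sketch below shows how a one-way ANOVA of high school GPA across probation-status groups could be run. It is only an illustration: the file name promise_academic.csv and the column names hs_gpa and probation_status are assumed placeholders, not the study's actual data set.

    # Illustrative sketch only: one-way ANOVA of high school GPA across
    # probation-status groups, under assumed file and column names.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("promise_academic.csv")            # hypothetical extract of the academic data
    df = df.dropna(subset=["probation_status", "hs_gpa"])

    # Collect the high school GPAs for each probation-status group.
    groups = [g["hs_gpa"].values for _, g in df.groupby("probation_status")]

    f_stat, p_value = stats.f_oneway(*groups)            # omnibus F test across all groups
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")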
Looking at the same students again grouped by probation status, similar results
can be seen with these students' most recent WMU GPA (see Table 10).
Table 10. Most Recent WMU GPA by Probation Status at WMU

                                                        95% Confidence Interval
Probation Status             N     Mean      SD       Lower Bound   Upper Bound    Min     Max
Persisters
  Good Standing             200    3.0575    0.5011      2.9877        3.1274      2.06    4.00
On Probation
  Extended Probation          6    1.7500    0.1249      1.6189        1.8811      1.59    1.94
  Final Probation             1    1.9000      .             .             .       1.90    1.90
  Probation                  23    1.0735    0.7647      0.7428        1.4041      0.00    1.98
  Academic Warning           21    2.1943    0.1877      2.1088        2.2797      2.00    2.63
Non-persisters
  Academic Dismissal         49    1.1371    0.5632      0.9754        1.2989      0.00    1.99
Total                       300    2.5013    0.9755      2.3905        2.6122      0.00    4.00

Note. Seven students had no record of probation status and were not included in this analysis, therefore N = 300 instead of 307. Academic Dismissal = Non-persister.
An exploratory analysis using ANOVA found that the most recent WMU GPA mean based on probation status was also statistically different [F(5, 294) = 153.391, p = 0.000]. The source of this difference was not determined, however.
Students who had low high school GPAs also had low WMU GPAs. Unlike high school GPAs, which clustered around 3.00 regardless of probation status, the most recent WMU GPAs of students in any of the probation groups dropped from the 3.00s into the 1.00s and low 2.00s.
Because the exploratory analyses suggest that there are differences, before
examining this any further the probation status variable was collapsed into three
categories: Good Standing, On Probation, and Academic Dismissal. Extended Probation,
Final Probation, Probation, and Academic Warning were all put under one category, On
Probation, in order to simplify the analyses from here forward.
This data was used not only to answer the research questions but also in an in-depth look at non-response error, which can sometimes lead to non-response bias. Table
11 summarizes when each of the 307 Promise students started at WMU, while Table 12
reports the exact incoming first-time, full-time, degree-seeking beginners (FTIAC) cohort
counts by persistence, on probation, and non-persistence. FTIAC cohorts are used at
WMU when examining retention.
Table 11. First Promise Semester

                      Frequency   Percentage
2006 Summer II             1          0.3
2006 Fall                100         32.6
2007 Spring                5          1.6
2007 Fall                 90         29.3
2008 Spring                5          1.6
2008 Summer II             4          1.3
2008 Fall                100         32.6
2009 Spring                2          0.7
Total                    307        100.0
One might expect the first FTIAC Cohort group, matriculating in 2006, to have more students on probation or academically dismissed than subsequent year cohort groups, because the First Year Experience (FYE), which provides supports for first-year students, was introduced in 2005 and did not officially start until the Fall of 2006. The FYE should be an effective mediator of dropping out and therefore help retain more students; thus, it would not be surprising to see a decrease in dropping out, or an increase in retention, in the later Cohorts after the FYE was in place. Chi-square analysis reveals, however, that there was no statistically significant difference [χ²(6, N = 300) = 10.139, p = 0.119] between FTIAC Cohort groups and probation status, which means that the same proportion of students from each FTIAC Cohort group have been academically dismissed or are in good standing.
Data was therefore analyzed in aggregate rather than by FTIAC Cohort. Aggregating the cohort groups, along with being able to include the 27 students who are not in a cohort, gave more power to the analysis because of the larger group sizes.
Table 12. FTIAC Cohort by Persistence and Non-Persistence

                    Persisters    On Probation    Non-Persisters
                       n/%            n/%               n/%
2006 Fall             68/23          15/5              17/6
2007 Fall             56/19          12/4              15/5
2008 Fall             53/18          23/8              14/5
Not in a Cohort       23/8            1/.3              3/1
Total                200/67          51/17             49/16

Note. N=300, seven students had no probation status listed.
About 98% of the WMU Promise students came from Loy Norrix High School
(139) and Kalamazoo Central High School (164), with only one student coming from
Galesburg to attend WMU. In addition, two students did not have a high school listed on
record.
Table 13. High School Attended by WMU Kalamazoo Promise Recipients

                                    Frequency   Percentage
Loy Norrix High School                 139         45.3
Kalamazoo Central High School          164         53.4
Galesburg Augusta High School            1          0.3
Not Reported                             2          0.7
Total                                  307        100.0
Table 14. Distribution of Promise Students by Race, Gender and High School

                     Loy Norrix (n=139)    Kalamazoo Central (n=164)    Galesburg (n=1)
                      Male     Female          Male       Female         Male    Female
Unknown                 3         3               7           3            0        1
White                  47        45              59          38            0        0
Black                  11        14              17          21            0        0
Asian                   3         5               5           3            0        0
Hispanic                5         3               4           4            0        0
American Indian         0         0               1           2            0        0
Total                  69        70              93          71            0        1

Note. N = 304. One Black female and one Black male had no high school reported and are therefore not calculated into this analysis.
Males and females are fairly evenly distributed among the schools, with no statistically significant difference found [χ²(1, 153.5) = 1.436, p = 0.231]; race, however, was found not to be evenly distributed [χ²(5, 51.1) = 4.921, p = 0.000] (see Table 14). This means that there are higher percentages of White students than of any other racial group at both high schools. At Loy Norrix High School, the racial makeup is White 66% and Black 18%; at Kalamazoo Central High School, it consists of White 59% and Black 23%.
Students were not evenly distributed across high schools [χ²(3, 76.5) = 298.209, p = 0.000]. If Galesburg, which had only one student attending WMU, is taken out along with the two students whose high school is unknown, however, the high school student distribution is found to be even between the two remaining schools [χ²(1, 151.5) = 2.063, p = .151]. Table 15 gives the details of persistence, on probation, and non-persistence by high school.
Table 15. High School by Persistence and Non-Persistence

                           Persisters   On Probation   Non-Persisters
                              n/%           n/%              n/%
Loy Norrix                   88/29          23/8             23/8
Kalamazoo Central           109/36          28/9             25/8
Galesburg Augusta             1/.3           0/0              0/0
Unknown                       2/.7           0/0              1/.3
Total                       200/67          51/17            49/16

Note. Percentages add to only 96.3% because of rounding and because percentages were calculated by dividing by 307. Seven students had no probation status listed. Three students had no high school listed.
Survey Summary
This section reports the results of the Survey of Kalamazoo Promise Recipients at
WMU Spring 2009. The response rate for this survey was 53% (101/191). It was
intended, however, that the survey be sent to the entire population of WMU Kalamazoo
Promise recipients. Had this occurred and still only 101 surveys were returned, the
response rate would have been 33% (101/307) (see Table 4).
Initially, 191 names were obtained from the facilitator of the Kalamazoo Promise
at WMU. After the surveys were sent to these students and returned, however, the Office
of Student Academic and Institutional Research was contacted. This was done in order to
meet FERPA regulations that require keeping student data anonymous, and because of
the researcher's desire to link academic data to the survey data. Once data was obtained
from the Office of Student Academic and Institutional Research, 307 rows of students
without names were listed.
An exact description of the distribution of Promise students who answered the
survey and their probation status can be seen in Table 16. A total of 101 students
responded to the survey, a response rate of 53%. This is quite high for a college student
response rate. Unfortunately, only 4 out of the 13 students on academic dismissal
responded, a response rate of only 31% within this group of students. This is actually not
a bad response rate, either, but because of the nature of this research and wanting to
understand what factors influence students to leave the university, it would have been
better to have more who had been academically dismissed from WMU respond to the
survey. The response rate within the group of students on probation was 40% (10/25).
Lastly, the response rate within the group of students in good standing was 59% (87/147).
Table 16. Distribution of Promise Students who Answered the Survey by Probation Status

                      Good Standing   On Probation   Academic Dismissal   Total n/%
Did Not Respond             60             15                 9             84/45
On Time Response            48              2                 2             52/28
Late Response               39              8                 2             49/26
Total                      147             25                13            185/100%

Note. Seven students had no record of probation status and were not included in this table; therefore, N = 300, not 307, and the total of Responded and Did Not Respond does not equal the 191 surveyed because of this.
The survey, Survey of Kalamazoo Promise Recipients at WMU Spring 2009,
consisted of 50 quantitative and qualitative questions with sub-questions. The survey
was broken into seven sections: Background questions; Cognitive, Social and
Institutional questions; About Kalamazoo Public Schools; questions about the Kalamazoo
Promise; and lastly, Changes due to the Kalamazoo Promise. Only highlights of the first
two sections are reported here, as they deal directly with retention factors addressed in
this dissertation. Please see Appendix G for details on the rest of the sections.
Table 17. Did You Begin College at WMU or Elsewhere?

                                       Response Percent   Response Count
Started elsewhere                           15.8%                16
Started at WMU                              84.2%                85
If elsewhere, please specify where:                              16
Most students reported starting at WMU, while almost 16% reported starting elsewhere first; see Table 17. Kalamazoo Valley Community College was the most frequently reported, with Michigan State University second. A few students also reported attending Grand Valley State, Eastern Michigan, and University of Michigan.
Most students responding thought they would enroll for an advanced degree after
completing their undergraduate degree; see Table 18. This indicates that most of the
Kalamazoo Promise recipients at WMU responding have goals, which is a determining
factor in retention (Bean & Metzner, 1985).
Table 18. Do You Expect to Enroll for an Advanced Degree When, or if, You Complete Your Undergraduate Degree?

        Response Percent   Response Count
No            34.7%               34
Yes           65.3%               64

Note. Three did not answer this question.
Thirty-five percent of students who answered the survey reported that they live in
the dorm; see Table 19. The academic data, however, indicate that almost 70%
(214/307) of the WMU Kalamazoo Promise recipients live in dorms.
Table 19. Where Do You Live During the School Year?

                                                          Response Percent   Response Count
Dormitory or other campus housing                               35.1%               34
Residence (house, apartment, etc.) within walking
  distance of Western                                            19.6%               19
Residence (house, apartment, etc.) within driving
  distance                                                       45.4%               44
Fraternity or sorority house                                      0.0%                0

Note. N=97, 4 did not answer.
Almost 40% of the parents or guardians of the WMU Promise students do not
have a college degree; see Table 20. In other words, 40% of those students who
responded to the survey are first-generation college students.
Table 20. What is the Highest Level of Education Obtained by Your Father or Mother?

                                              Father or male   Mother or female   Response   Response
                                                 guardian          guardian         Count     Percent
Not a high school graduate                           3                 3               6        3.57%
High school diploma or GED                          18                13              31       18.45%
Some college, did not complete degree               17                13              30       17.86%
Associate degree                                     8                15              23       13.69%
Bachelor's degree                                   20                24              44       26.19%
Master's degree                                      9                14              23       13.69%
Doctorate degree and/or Professional degree          4                 2               6        3.57%
Unknown                                              2                 3               5        2.98%

Note. N=96, 5 did not answer.
Most students spend anywhere from 1 to 20 hours preparing for class; see Table
21. Almost 22% of the students who responded do not work, while just over 28% work
more than 21 hours per week. Of the students who answered this question, a little over
45% do not participate in any college-sponsored activities or organizations. Being
involved with these types of activities is a key indicator of retention (Swail, 2003).
A little over 47% of the WMU Promise students report that their jobs take time
away from their school work; see Table 22.
Table 21. About How Many Hours Do You Spend in a Typical 7-day Week Doing Each of the Following?

                                                 None   1-5   6-10   11-20   21-30   31+   Response Count
Preparing for class (studying, reading,
  writing, rehearsing or other activities
  related to your program)                         1     27    31      30       6      2        97
Working for pay                                   21     10    14      24      21      6        96
Participating in college-sponsored activities
  (organizations, campus publications, student
  government, intercollegiate or intramural
  sports, etc.)                                   43     38    10       3       1      0        95
Providing care for dependents living with
  you (parents, children, spouse, etc.)           67     23     2       0       2      2        96
Commuting to and from classes                     17     73     5       0       0      0        95

Note. N=97, 4 did not answer.
Table 22. If You Have a Job, How Does it Affect Your School Work?

                                                       Response Percent   Response Count
I don't have a job                                           25.8%               25
My job does not interfere with my school work                26.8%               26
My job takes some time from my school work                   39.2%               38
My job takes a lot of time from my school work                8.2%                8

Note. N=97, 4 did not answer.
Almost 61% of students felt they were academically prepared for classes at WMU. Almost 19% felt they were not academically prepared and that this might lead to them withdrawing from classes. The rest were neutral. Working too many hours was also a factor that may cause students to withdraw; see Table 23.
Table 23. How Likely is it That the Following Issues Would Cause You to Withdraw From Class or From WMU?

Students rated each of the following issues on a 5-point scale from 1 (Not Likely) to 5 (Very Likely), with 96 to 97 responses per item: working full-time; caring for dependents; academically unprepared; lack of finances; don't fit in; and don't offer program of study that I want.

Note. N=97, 4 did not answer.
Established research suggests that the more involved students are, the more likely
they are to persist (Swail, 2003); however, only about 3% of the students responding to
the survey are involved in any social fraternity or sorority; see Table 24.
Table 24. Are You a Member of a Social Fraternity or Sorority?

                        Response Percent   Response Count
No                           96.8%                91
Yes                           3.2%                 3
If yes, which one?                                 4

Note. N=94, 7 did not answer.
Regarding attending WMU, most students reported having supportive friends and
family; see Tables 25 and 26. The more supportive people there are surrounding
students, the more likely they are to persist (Swail, 2003).
Table 25. How Supportive Are Your Friends of Your Attending WMU?

               Response Percent   Response Count
Not Very              2.1%                2
Somewhat             15.5%               15
Quite a bit          36.1%               35
Extremely            46.4%               45

Note. N=94, 7 did not answer.
Table 26. How Supportive is Your Immediate Family of Your Attending WMU?

               Response Percent   Response Count
Not Very              0.0%                0
Somewhat              6.2%                6
Quite a bit          23.7%               23
Extremely            70.1%               68

Note. N=94, 7 did not answer.
Most students responding to the survey find their relationships with students at
WMU friendly and supportive. They feel a sense of belonging. They feel their instructors
are available, helpful and sympathetic. Almost 63% of students reported that
administrative and office personnel are helpful, informative and flexible; see Tables 27,
28 and 29.
Table 27. Which Best Represents the Quality of Your Relationship With Students at WMU?

                                                    Response Percent   Response Count
1 Unfriendly, unsupportive, sense of alienation           0.0%                0
2                                                          6.2%                6
3                                                         19.6%               19
4                                                         36.1%               35
5 Friendly, supportive, sense of belonging                38.1%               37

Note. N=94, 7 did not answer.
Table 28. Which Best Represents the Quality of your Relationships With Instructors at WMU?

                                            Response Percent   Response Count
1 Unavailable, unhelpful, unsympathetic            0.0%                0
2                                                  4.1%                4
3                                                 33.0%               32
4                                                 44.3%               43
5 Available, helpful, sympathetic                 18.6%               18

Note. N=94, 7 did not answer.
Table 29. Which Best Represents the Quality of Your Relationship With Administrative Personnel & Office Staff at WMU?

                                        Response Percent   Response Count
1 Unhelpful, inconsiderate, rigid              0.0%                0
2                                             14.4%               14
3                                             33.0%               32
4                                             33.0%               32
5 Helpful, considerate, flexible              19.6%               19

Note. N=94, 7 did not answer.
Almost 85% of students responding, when asked whether they would still attend WMU if they could start over again, answered "probably yes" or "definitely yes." Only a little over 15% said "probably no" or "definitely no"; see Table 30. Ninety-five percent would recommend WMU to a friend or family member, while 4.3% would not; see Table 31.
Table 30. If You Could Start Over Again, Would You Still Attend WMU?

                 Response Percent   Response Count
Definitely no           3.3%                3
Probably no            12.0%               11
Probably yes           53.3%               49
Definitely yes         31.5%               29

Note. N=92, 9 did not answer.
Table 31. Would You Recommend WMU to a Friend or Family Member?

        Response Percent   Response Count
No             4.3%                4
Yes           95.7%               88

Note. N=92, 9 did not answer.
Research Question One Results
This section reports the results associated with all three sub-questions of Research
Question One. This question seeks to determine to what extent persister, those on
probation, and non-persister Kalamazoo Promise recipients differ by demographic
characteristics on each of the following selected factors in Swail's (2003) Geometric
Model of Student Persistence and Achievement: (a) Cognitive Factors, (b) Social Factors,
and (c) Institutional Factors.
The Probation variable was used to determine if a student persisted or not.
Students who were academically dismissed were considered not to have persisted.
Students whose probation status was in Good Standing were considered to have persisted.
Those in the On Probation category fall in between these two groups. For now they are
persisters, but they look more like non-persisters and as such are examined separately.
Research Question 1.1 Results
1.1: Among the three groups of students (persister, those on probation, and non-persister Kalamazoo Promise recipients, by gender and race), are there any significant differences in the cognitive factors from Swail's (2003) Geometric Model of Student Persistence and Achievement using the following dependent variables from the academic data: (a) high school GPA, (b) most recent WMU GPA, (c) ACT composite score, (d) taking a remedial math course at WMU, (e) taking a remedial reading course at WMU, (f) taking a remedial writing course at WMU, or (g) taking AP credit?
The research question was addressed using the MANOVA procedure to assess
group differences in the continuous variables (high school GPA, most recent WMU
GPA, and ACT composite score). The Chi-square Test of Independence was used to
assess group differences in the categorical variables (taking remedial courses at WMU
and taking AP credit in high school). Table 6 delineates the details of each question with
related variables and analysis.
The first step taken in interpreting the results from the MANOVA procedure was to assess the value of Box's M, which tested the assumption of equal variance across groups (Creswell, 2005). Had the assumption of equal variance been upheld, the Wilks' Lambda test statistic would have been used to interpret the results. Because the assumption of equal variance across groups was violated, however, Pillai's Trace was used to interpret the results.
Results of the data analysis revealed Box's M = 341.268 to be statistically significant [F(90, 4983.931) = 3.226, p = .000]. Therefore the assumption of equal variance was not upheld. Consequently, the Pillai's Trace overall results across the three dependent variables were used to interpret results from the MANOVA procedure. The results showed statistically significant differences among groups based on persistence, race, and gender. There were no statistically significant interactive effects due to gender, but there was one statistically significant interaction between persistence and race [F(12, 846) = 2.110, p = .014] (see Table 32). The significant value of Pillai's Trace warranted further investigation to determine the source of the statistically significant differences.

The obtained test statistic for comparisons of the groups based on persistence revealed a statistically significant difference [F(6, 562) = 39.234, p = .000]. The obtained test statistics for comparisons of the race groups [F(6, 562) = 7.747, p = .000] and for comparisons of the groups based on gender [F(3, 280) = 2.876, p = .037] also revealed statistically significant differences between the groups (see Table 32). Lastly, a statistically significant test statistic for the interaction of persistence and race [F(12, 846) = 2.110, p = .014] was obtained.
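A factorial MANOVA of this kind can be sketched with statsmodels, as below. This is only an illustration under assumed column names (hs_gpa, wmu_gpa, act, persistence, race, gender) and an assumed file; it reports Pillai's trace alongside the other multivariate statistics, but it does not compute Box's M, which in this study was obtained separately.

    # Illustrative sketch: factorial MANOVA on three dependent variables.
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    df = pd.read_csv("promise_academic.csv")   # hypothetical data extract

    # Three continuous dependent variables modeled against persistence, race,
    # gender, and their interactions; mv_test() prints Pillai's trace (and
    # Wilks' lambda, Hotelling's trace, Roy's root) for each effect.
    m = MANOVA.from_formula(
        "hs_gpa + wmu_gpa + act ~ C(persistence) * C(race) * C(gender)",
        data=df,
    )
    print(m.mv_test())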
Table 32. Summary of Omnibus MANOVA Test of Group Differences across Dependent Variables

Effect                         Pillai's Trace     F       Hypothesis   Error df    Sig.   Partial Eta   Observed
                                   Value                      df                           Squared       Power
persistence                         .590        39.234        6        562.000     .000      .295        1.000
race                                .153         7.747        6.000    562.000     .000      .076        1.000
gender                                           2.876        3.000    280.000     .037      .030         .684
persistence * race                               2.110       12.000    846.000     .014      .029         .942
persistence * gender                              .687        6.000    562.000     .660      .007         .275
race * gender                                    1.425        6.000    562.000     .203      .015         .558
persistence * race * gender         .060         1.428       12.000    846.000     .147      .020         .788

Note. Alpha = .05
Post hoc analysis. The overall MANOVA revealed statistically significant group differences across the dependent variables and one interaction; follow-up tests were therefore conducted to locate the source of the differences. The follow-up tests consisted of a univariate ANOVA for each dependent variable (Gay, Mills, & Airasian, 2006; Mertler & Vanatta, 2007; Spinthall, 2007). To reduce the occurrence of Type I error when conducting a series of ANOVAs, the Bonferroni correction procedure was used. This procedure sets alpha at a more stringent level to keep the alpha across the set of comparisons at a predetermined level. In this case the adjusted alpha equals the overall alpha for the analysis (.05) divided by the number of dependent variables (3). Therefore the critical alpha for the post hoc univariate analyses was alpha = .016 (.05 / 3 = .016).
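The follow-up step can be sketched as a loop of univariate ANOVAs judged against the Bonferroni-adjusted alpha, as below. The column names and file are again illustrative assumptions rather than the study's actual variables.

    # Illustrative sketch: univariate follow-up ANOVAs with a Bonferroni-adjusted alpha.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("promise_academic.csv")        # hypothetical data extract
    dependent_vars = ["hs_gpa", "wmu_gpa", "act"]
    adjusted_alpha = 0.05 / len(dependent_vars)      # Bonferroni correction: .05 / 3

    for dv in dependent_vars:
        model = smf.ols(f"{dv} ~ C(persistence)", data=df).fit()
        table = sm.stats.anova_lm(model, typ=2)      # univariate ANOVA table
        p = table.loc["C(persistence)", "PR(>F)"]
        verdict = "significant" if p < adjusted_alpha else "not significant"
        print(dv, "p =", round(p, 4), verdict)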
Differences by race. Table 33 presents a summary of the univariate tests for the groups based on race. With the adjusted critical value of alpha, .016, only two statistically significant results were found. The obtained test statistic for comparisons of the race groups in terms of ACT composite score was F(2, 282) = 9.410, p = .000. The magnitude of the effect size was η² = .063, which according to Cohen (1988) is a small to medium effect. The obtained power of .978 revealed that the differences in ACT composite score between the three groups were large enough to be detected 97.8% of the time. A post hoc analysis and a review of the descriptive statistics revealed that White students, on average, had significantly higher ACT composite scores (M = 20.989, sd = .492) than Black students (M = 17.368, sd = .698) or students in the Other category (M = 18.874, sd = .925). There were no statistically significant differences between the ACT composite scores of Black students and students in the Other category.
Table 33. Summary of Test for Differences by Race across the Dependent Variables

Dependent Variable        df       F       Sig.   Partial Eta Squared   Observed Power
High School GPA            2     14.484    .000          .093                .999
Most Recent WMU GPA        2      3.190    .043          .022                .607
ACT Composite Scores       2      9.410    .000          .063                .978
The obtained test statistic for comparisons of the race groups in terms of high school GPA was F(2, 282) = 14.484, p = .000. A post hoc analysis and a review of the descriptive statistics revealed that White students, on average, had significantly higher high school GPAs (M = 3.419, sd = .064) than Black students (M = 2.818, sd = .091) or students in the Other category (M = 3.188, sd = .121). Students in the Other category had significantly higher high school GPAs than Black students. This means that of the three racial groups, Black students have the lowest high school GPAs. Most recent WMU GPA, although p = .043, is not significant because of the adjusted alpha (.05/3 = .016) used to control for Type I error.
Differences by gender. Table 34 presents a summary of the univariate tests for the groups based on gender. With the adjusted critical value of alpha, .016, no statistically significant result was found. The obtained test statistic for comparisons of the gender groups in terms of ACT composite score was F(1, 282) = 4.385, p = .037. A post hoc analysis and a review of the descriptive statistics revealed that male students, on average, had higher ACT composite scores (M = 21, sd = 5) than female students (M = 19, sd = 5); however, because of the adjusted critical value of alpha, .016, the ACT composite score difference is considered not statistically significant. This means that, statistically, males and females score similarly on the ACT composite.

There was not a statistically significant difference between males and females in high school GPA or most recent WMU GPA; see Table 34. This means that males and females have similar GPAs, both in high school and at WMU.
Table 34. Summary of Test for Differences by Gender across the Dependent Variables

Dependent Variable        df       F       Sig.   Partial Eta Squared   Observed Power
High School GPA            1       .980    .323          .003                .167
Most Recent WMU GPA        1       .350    .555          .001                .091
ACT Composite Scores       1      4.385    .037          .015                .551

Note. Adjusted alpha = .016 (.05/3)
Differences among persisters, those on probation, and non-persisters. Table 35 presents a summary of the results. With the adjusted critical value of alpha = .016, two statistically significant results were found. The obtained test statistic for comparison of the most recent WMU GPA for persisters, those on probation, and non-persisters was F(2, 282) = 198.560, p = .000. A review of the pair-wise comparisons and descriptive statistics revealed that the most recent WMU GPAs of persisters (M = 2.982, sd = .046) were higher than those of students on probation (M = 1.702, sd = .095) and higher than those of non-persisters (M = 1.212, sd = .088). Also, those on probation had a statistically higher most recent WMU GPA than did non-persisters. This means that persisters had the highest most recent WMU GPAs, while non-persisters had the lowest; students on probation fell between these two groups.
Table 35. Summary of MANOVA Results for Persistence across the Dependent Variables

Dependent Variable        df        F       Sig.   Partial Eta Squared   Observed Power
High School GPA            2       4.356    .014          .030                .752
Most Recent WMU GPA        2     198.560    .000          .585               1.00
ACT Composite Score        2        .183    .833          .001                .078

Note. Alpha is .016 (.05/3)
The obtained test statistic for comparison of high school GPA for persisters, those on probation, and non-persisters was F(2, 282) = 4.356, p = .014. A review of the pair-wise comparisons and the descriptive statistics revealed that persisters had higher high school GPAs (M = 3.50, sd = .76) than did non-persisters (M = 3.04, sd = .58). There was no statistically significant difference between those on probation (M = 3.15, sd = .44) and non-persisters, or between those on probation and persisters. Students on probation had high school GPAs that fell in between those of persisters and non-persisters; persisters had the highest high school GPAs. There was not a statistically significant difference in ACT composite scores among persisters, those on probation, or non-persisters.
Statistical comparison for persistence by race interaction. Table 36 presents a summary of the univariate tests for the interaction. With the adjusted critical value of alpha, .016, only one statistically significant result was found. The test for interaction between persistence and race in terms of most recent WMU GPA revealed a test statistic of F(2, 282) = 6.257, p = .000. A review of the descriptive statistics revealed that White students in good standing (persisters) had higher most recent WMU GPAs (M = 3.136, sd = .047) than Black students (M = 2.826, sd = .091) or students in the Other category (M = 2.985, sd = .090) who were also in good standing. Students from the Other category who were in good standing had higher WMU GPAs than Black students in the same category. There were few differences in the WMU GPAs of students classified as being on probation. Black students who were dismissed (non-persisters) had higher WMU GPAs (M = 1.336, sd = .135) than students in the White (M = .982, sd = .115) or Other (M = 1.316, sd = .195) racial groups who were also dismissed.
Table 36. Summary of MANOVA Results for Persistence and Race Interaction across the Dependent Variables

Dependent Variable        df       F       Sig.   Partial Eta Squared   Observed Power
High School GPA            2       .548    .701          .008                .182
Most Recent WMU GPA        2      6.257    .000          .082                .988
ACT Composite Score        2       .933    .445          .013                .295

Note. Alpha is .016 (.05/3)
The second part of research question 1.1 was examined using the Chi-square Test of Independence because of the nominal variables: remedial courses, advanced placement credit, gender, and race.

A crosstabs procedure, using the Chi-square Test of Independence, revealed a statistically significant difference, χ²(2, N = 300) = 21.854, p = .000, among persisters, those on probation, and non-persisters in terms of whether the students had taken remedial courses at WMU. The data revealed that approximately 35% of non-persisters had taken at least one remedial course at WMU, and approximately 22% of those on probation had taken at least one remedial course at WMU, while only 9% of the persisters had taken a remedial course. This means that students who took remedial classes were more likely not to persist.
The results also revealed a statistically significant difference, χ²(2, N = 300) = 10.696, p = .005, among persisters, those on probation, and non-persisters in terms of having taken AP courses in high school. The data revealed that 12% of the persisters had taken at least one AP course in high school, none of those on probation had taken an AP course, and 2% of non-persisters had taken AP courses in high school. This means that students who take AP courses in high school are more likely to persist.
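A chi-square test of independence of this form can be sketched as follows; the file and the column names persistence and remedial_course are assumed placeholders, not the study's actual variable names.

    # Illustrative sketch: chi-square test of independence on a crosstab.
    import pandas as pd
    from scipy.stats import chi2_contingency

    df = pd.read_csv("promise_academic.csv")               # hypothetical data extract

    # Cross-tabulate persistence group against whether a remedial course was taken.
    crosstab = pd.crosstab(df["persistence"], df["remedial_course"])

    chi2, p, dof, expected = chi2_contingency(crosstab)    # Pearson chi-square test
    print(f"chi-square({dof}) = {chi2:.3f}, p = {p:.3f}")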
Table 37. Crosstab for Tests of Differences Among Persisters, Those on Probation, and Non-persisters Across the Variables of Persistence and Taking Remedial Courses at WMU, AP Credit, Gender, and Race

                                            Persister   On Probation   Non-Persister   Total
No Remedial Course    Count                    182           40              32          254
                      % within persistence     91%          78.4%           65.3%         85%
Yes Remedial Course   Count                     18           11              17           46
                      % within persistence      9%          21.6%           34.7%         15%
Total                 Count                    200           51              49          300
No AP Credit          Count                    176           51              48          275
                      % within persistence     88%          100%             98%         91.7%
Yes AP Credit         Count                     24            0               1           25
                      % within persistence     12%            0%              2%          8.3%
Total                 Count                    200           51              49          300
Male                  Count                    102           29              32          163
                      % within persistence     51%          56.9%           65.3%        54.3%
Female                Count                     98           22              17          137
                      % within persistence     49%          43.1%           34.7%        45.7%
Total                 Count                    200           51              49          300
White                 Count                    130           29              24          183
                      % within persistence     65%          56.9%           49.0%        61.0%
Black                 Count                     35           13              17           65
                      % within persistence    17.5%         25.5%           34.7%        21.7%
Others                Count                     35            9               8           52
                      % within persistence    17.5%         17.6%           16.3%        17.3%
Total                 Count                    200           51              49          300
The results failed to reveal a statistically significant difference in terms of gender, χ²(2, N = 300) = 3.405, p = .182. A crosstabs procedure, the Chi-square Test of Independence, also failed to reveal a statistically significant difference, χ²(4, N = 300) = 7.648, p = .105, among persisters, those on probation, and non-persisters in terms of race. See Table 37 for details.
Research Question 1.2 Results
1.2 Among the three groups of students: persister, those on probation, and non-persister Kalamazoo Promise recipients, by gender and race, are there any differences in the social factors from Swail's (2003) Geometric Model of Student Persistence and Achievement using the following dependent variables from the academic data: (a) living in a dorm, (b) being an athlete, or (c) parental income?
This research question was addressed through the use of the One-way Analysis of
Variance procedure (ANOVA) and a Chi-square Test of Independence. The ANOVA
procedure was used to assess the differences among the groups on the interval level data
(parental income). The Chi-square Test of Independence was used to assess the
differences among the groups on the categorical level data of living in the dorm. The
One-way ANOVA failed to reveal a statistically significant difference, F(2, 299) = 1.926, p = .148, among the three groups on parental income, which means that students were persisters, on probation, or non-persisters regardless of income: students whose parents had high incomes were not found to have persisted more than students whose parents had low incomes.
There were four athletes, all male; one White male was in good standing, two White males were on probation, and the fourth athlete was a Black male who had been academically dismissed. This was too small a sample to examine statistically, so the variable was not addressed in the statistical procedures.
Lastly, the variable living in a dorm was examined using the crosstabs procedure. There was no statistically significant difference, χ²(2, N = 300) = .445, p = .801, in the observed and expected frequency counts of persisters, those on probation, and non-persisters in terms of living in the dorm. This means that living in a dorm made no difference in whether a student persisted, was on probation, or did not persist.
Research Question 1.3 Results
1.3 Among the three groups of students: persister, those on probation, and non-persister Kalamazoo Promise recipients, by gender and race, are there any
differences in the institutional factors from Swail's (2003) Geometric Model
of Student Persistence and Achievement using the following dependent
variables from the academic data: (a) first year experience (FYE), and (b)
which high school Promise students came from?
This research question was addressed through the use of the Chi-square Test of
Independence to assess the differences among the groups on the categorical levels of
which high school the students attended and First Year Experience (FYE).
The Chi-square Test of Independence indicated that there was not a statistically significant difference, χ²(6, N = 299) = 2.479, p = .871, in the observed and expected frequency counts of persisters and non-persisters in terms of which high school they attended. One non-persister did not have a high school listed, therefore n = 299 instead of n = 300. This means that approximately the same proportion of students from Loy Norrix High School and Kalamazoo Central High School persisted, were on probation, or did not persist. One high school did not produce more non-persisters than the other high school.
The Chi-square Test of Independence, however, did reveal a statistically significant difference, χ²(2, N = 300) = 10.101, p = .006, among the three groups in terms of First Year Experience at WMU, which means that there was a difference among students who persisted, were on probation, or did not persist in terms of whether they participated in the First Year Experience. Only 37.7% of the WMU Kalamazoo Promise students participated in the FYE; the other 62.3% did not (see Table 38). About half of those on probation and about half of the non-persisters participated in the FYE, compared with only about 30% of persisters. This means that a higher percentage of those on probation and of the non-persisters participated in the FYE than of persisters.
Table 38. Crosstab for Tests of Differences Among Persisters, Those on Probation, and Non-persisters Across the Variables of Persistence and First Year Experience Participation

                                     Persister   On Probation   Non-Persister   Total
No FYE     Count                        137           24              26          187
           % within persistence        68.5%        47.1%           53.1%        62.3%
Yes FYE    Count                         63           27              23          113
           % within persistence        31.5%        52.9%           46.9%        37.7%
Total      Count                        200           51              49          300
Research Question Two Results
Research question two is broken into two sections, both of which examine the
number of courses taken per term and the number of courses taken the first year.
Research question 2.1 examines persisters, those on probation, and non-persisters, while
research question 2.2 examines respondents, late respondents, and non-respondents
controlling for race and gender in both questions.
Research Question 2.1 Results
2.1 Is there a difference in the average number of overall courses taken per term and
number of courses taken the first year among the three groups of students:
Persister, On probation and Non-persister WMU Kalamazoo Promise recipients,
controlling for gender and race, using the course summary data?
An analysis of covariance (ANCOVA) was conducted to assess the differences in the average number of overall courses taken per term by persisters, those on probation, and non-persisters, when controlling for gender and race. Table 39 presents a summary of the results. Results indicate that there was no statistically significant difference [F(1, 300) = 1.027, p = .312] in the average number of classes taken per term for the three groups across the variable gender. Results, however, indicated that there were statistically significant differences [F(1, 300) = 4.148, p = .043] in the average number of classes taken per term for the three groups across the variable of race.

The descriptive statistics showed that students in the Other category took an average of 5.26 courses per term, while Black students took an average of 5.00 courses per term. White students took, on average, the fewest courses per term, 4.94 (see Table 40). The final analysis revealed statistically significant differences [F(1, 300) = 4.440, p = .013] in the average number of courses taken per term by persisters, those on probation, and non-persisters. Persisters took, on average, more courses per term than non-persisters; there were no other pairwise differences. The ANCOVA failed to indicate any interactive effects among gender, race, and persistence for the average number of classes taken per term.
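An ANCOVA of this kind can be sketched with statsmodels as an ordinary least squares model with the covariates entered alongside the grouping factor, as below. The file and variable names are assumed placeholders rather than the study's actual course summary data.

    # Illustrative sketch: ANCOVA of average courses per term on persistence,
    # controlling for gender and race.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("promise_courses.csv")   # hypothetical course summary extract

    model = smf.ols(
        "avg_courses_per_term ~ C(persistence) + C(gender) + C(race)",
        data=df,
    ).fit()
    # anova_lm() gives the F test for each term (Type II sums of squares here).
    print(sm.stats.anova_lm(model, typ=2))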
Table 39. ANCOVA Results of Persisters and Non-persisters, Controlling for Race and Gender for Average Number of Courses Taken Per Term

              df   Mean Square      F      Sig.   Partial Eta Squared   Observed Power
Race           2      5.625       4.148    .043          .014                .528
Gender         1      1.393       1.027    .312          .003                .173
Persistence    2      6.021       4.440    .013          .029                .760
Table 40. Descriptive Statistics for Gender, Race and Persistence for Average Number of Courses Taken Per Term

                                 Mean   Standard Deviation   Variance   Total N
Persistence   Persister          5.13          1.30            1.70       200
              On Probation       4.85           .83             .70        51
              Non-persister      4.65           .84             .70        49
              Total              5.00          1.18            1.39       300
Race          White              4.94          1.01            1.02       189
              Black              5.00          1.18            1.40        66
              Other              5.26          1.63            2.66        52
              Total              5.01          1.18            1.38       307
Gender        Male               5.05          1.10            1.21       164
              Female             4.97          1.26            1.59       143
              Total              5.01          1.18            1.38       307
An analysis of covariance (ANCOVA) was conducted to assess the differences in the number of courses taken the first year by persisters, those on probation, and non-persisters, when controlling for gender and race. Table 41 presents a summary of the results. Results indicate that there was no statistically significant difference [F(1, 300) = .999, p = .318] in the number of courses taken the first year for the three groups across the variable gender. Nor was a statistically significant result found across the variable race [F(1, 300) = 3.204, p = .074]. Results, however, indicated that there was a statistically significant difference [F(1, 300) = 3.574, p = .029] in the number of courses taken the first year among persisters, those on probation, and non-persisters.

The descriptive statistics (see Table 42) showed that persisters took, on average, more courses (M = 10.17) their first year than those on probation (M = 9.12) and non-persisters (M = 9.31). What is interesting is that non-persisters took, on average, slightly more courses than those on probation. The ANCOVA failed to indicate any interactive effects among gender, race, and persistence for the number of courses taken the first year.
Table 41. ANCOVA Results of Persisters, Those on Probation, and Non-persisters, Controlling for Race and Gender for Number of Courses Taken the First Year

              df   Mean Square      F      Sig.   Partial Eta Squared   Observed Power
Race           1     32.470       3.204    .074          .011                .528
Gender         1     10.122        .999    .318          .003                .173
Persistence    1      6.021       4.440    .013          .029                .760
Table 42. Descriptive Statistics for Gender, Race and Persistence for Number of Courses Taken the First Year

                                 Mean   Standard Deviation   Variance   Total N
Persistence   Persister         10.17          3.53            12.48      200
              On Probation       9.12          2.73             7.43       51
              Non-persister      9.31          1.91             3.63       49
              Total              9.85          3.22            10.34      300
Race          White              9.51          3.08             9.47      189
              Black             10.08          3.19            10.19       66
              Other             10.29          3.81            14.52       52
              Total              9.76          3.24            10.52      307
Gender        Male               9.94          3.01             9.05      164
              Female             9.56          3.49            12.19      143
              Total              9.76          3.24            10.52      307
Research Question 2.2 Results
2.2 Is there a difference in the average number of overall courses taken per term and
number of courses taken the first year among the three groups of students:
Respondents, late respondents and non-respondents of the Survey of Promise
Scholarship Recipients at WMU Spring 2009 controlling for gender and race
using the course summary data?
An analysis of covariance was conducted to assess the differences in the average number of courses per term taken by respondents, late respondents, and non-respondents, controlling for gender and race. Table 43 presents a summary of the results. Results indicate that there were no statistically significant differences among the three groups in the number of classes taken per term when compared by race [F(1, 185) = 2.305, p = .131] or by gender [F(1, 185) = .344, p = .558], and also no statistically significant difference [F(2, 185) = .314, p = .731] among the three groups in the average number of classes taken per term. The ANCOVA also failed to indicate any interactive effects among gender, race, and category of respondent for the average number of classes taken per term.
Table 43. ANCOVA Results of Respondents, Late Respondents and Non-respondents, Controlling Across Race and Gender for Average Number of Classes Taken Per Term

                   df   Mean Square      F      Sig.   Partial Eta Squared   Observed Power
Race                1      4.011       2.305    .131          .013                .327
Gender              1       .598        .344    .558          .002                .090
GroupByResponse     2       .547        .314    .731          .003                .099
Table 44. Descriptive Statistics for Gender, Race and Response for Average Number of Courses Taken Per Term

                                      Mean   Standard Deviation   Variance   Total N
Survey Group     Respondent           5.13          1.07            1.15        52
by Response      Late Respondent      5.06          1.06            1.13        49
                 Non-respondent       5.22          1.57            2.47        84
                 Total                5.15          1.32            1.73       185
Race             White                4.94          1.01            1.02       189
                 Black                5.00          1.18            1.40        66
                 Other                5.26          1.63            2.66        52
                 Total                5.01          1.18            1.38       307
Gender           Male                 5.05          1.10            1.21       164
                 Female               4.97          1.26            1.59       143
                 Total                5.01          1.18            1.38       307
An analysis of covariance was conducted to assess the differences in the number of courses taken the first year by respondents, late respondents, and non-respondents, controlling for gender and race. Table 45 presents a summary of the results. Results indicate that there were no statistically significant differences among the three groups in the number of courses taken the first year by race [F(1, 185) = 2.305, p = .131] or gender [F(1, 185) = .015, p = .903]. The final analysis revealed that there was no statistically significant difference [F(2, 185) = .271, p = .763] in the number of courses taken the first year for the three groups. The ANCOVA also failed to indicate any interactive effects among gender, race, and category of respondent for the number of classes taken the first year.
Table 45. ANCOVA Results of Respondents, Late Respondents and Non-respondents, Controlling across Race and Gender for the Number of Courses Taken the First Year

                   df   Mean Square      F      Sig.   Partial Eta Squared   Observed Power
Race                1     13.987       1.139    .287          .006                .186
Gender              1       .181        .015    .903          .000                .052
GroupByResponse     2      3.325        .271    .763          .003                .092
Table 46. Descriptive Statistics for Gender, Race and Response for Number of Courses Taken the First Year

                                      Mean   Standard Deviation   Variance   Total N
Survey Group     Respondent          10.23          2.56            6.53        52
by Response      Late Respondent      9.98          3.81           14.48        49
                 Non-respondent      10.39          3.80           14.41        84
                 Total               10.24          3.48           12.12       185
Race             White                9.51          3.08            9.47       189
                 Black               10.08          3.19           10.19        66
                 Other               10.29          3.81           14.52        52
                 Total                9.76          3.24           10.52       307
Gender           Male                 9.94          3.01            9.05       164
                 Female               9.56          3.49           12.19       143
                 Total                9.76          3.24           10.52       307
Research Question Three Results
Research question three examines to what extent respondent, late respondent, and
non-respondent Kalamazoo Promise recipients differ on each of the following selected
factors in Swail's (2003) Geometric Model of Student Persistence and Achievement: (a)
93
Cognitive Factors, (b) Social Factors, and (c) Institutional Factors, using known
characteristics from the academic data. In addition, to what extent do early respondents
differ from late respondents on variables from the Survey of Promise Scholarship
Recipients at WMUSpring 20091 Lastly, using Groves and Couper's bias ratio formula,
was there an indication of non-response bias?
Data Analysis for Research Question 3
The data analysis for research question three was accomplished in several phases.
First, the Chi-square Test of Independence was used to assess the differences among the
groups on the categorical level variables in the cognitive, social, and institutional factors
included from the Survey of Promise Scholarship Recipients at WMU Spring 2009. The
data were assessed using the Pearson Chi-square Test of Independence, which is a
nonparametric statistical procedure that addresses the differences between observed and
expected frequency counts for a distribution of scores (Mertler & Vanatta, 2007;
Rosenberg, 2007; Stevenson, 2007). Nonparametric tests are not as powerful as
parametric statistical procedures; however the Pearson Chi-square Test of Independence
is an appropriate procedure for assessing the differences in the observed and expected
frequency counts of the two groups across the dependent variables.
Second, items for the continuous/interval level data from the cognitive, social, and
institutional factors included from the Survey of Promise Scholarship Recipients at WMU
Spring 2009 were grouped to create the following four summated subscales: cognitive
engagement, social demands, institutional support, and social engagement. The exact
details of each of the scales can be found in Appendix C.
The Cognitive Engagement Subscale consisted of 20 items (Items C7 through
C27) from the Cognitive Factor of the survey. Participants responded to items on the
Cognitive Engagement subscale using a 5-point Likert-type scale where the responses
ranged from 1 = not likely to 5 = very likely.
The Social Demands subscale consisted of five items (Items S13 through S17)
from the Social Factor of the survey. Participants responded to items on the Social
Demands subscale using a 5- point Likert-type scale, where the responses ranged from 1
= never to 5 = often.
The Institutional Support subscale consisted of six items (Items I7 through I13)
from the Institutional Factor of the survey. Participants responded to items on the
Institutional Support subscale using a 5-point Likert-type scale, where the responses
ranged from 1 = never to 5 = very often.
The Social Engagement subscale consisted of five items (Items I14 through I18)
from the Institutional Factor of the Survey. Participants responded to items on the Social
Engagement subscale using a 5- point Likert-type scale, where the responses ranged from
1 = very little to 5 = very much.
Summated scales offer an advantage over single-item scales in that such scales
can be assessed for reliability and the unidimensionality of the construct being measured
(Thorndike, 1967). Items assigned to each scale were summated together to yield total
scores. Before running statistical procedures on data from the survey, the researcher
assessed the internal consistency of the scales using reliability analysis. Cronbach's
coefficient alpha was used to measure the internal consistency of the scales included in
the survey (Cohen, 1988; Trochim, 2007). While any test developer hopes to obtain a reliability coefficient that approaches 1.0, such a value is rarely obtained in behavioral and social science research. The significance of the obtained alpha coefficients was therefore tested against the value of alpha = .70 suggested by Kaplan and Saccuzzo (2005). The research indicates that values of .70 or greater indicate that a scale is internally consistent (Kaplan & Saccuzzo, 2005; Mertler & Vanatta, 2005).
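Cronbach's alpha for a summated subscale can be computed directly from the item responses, as in the sketch below. The file and item column names are illustrative assumptions, not the survey's actual variable names.

    # Illustrative sketch: Cronbach's coefficient alpha for a summated scale.
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha computed from a set of item columns."""
        items = items.dropna()
        k = items.shape[1]                               # number of items in the scale
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summated score
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    survey = pd.read_csv("promise_survey.csv")           # hypothetical survey extract
    social_demands = survey[["s13", "s14", "s15", "s16", "s17"]]
    print("Social Demands alpha =", round(cronbach_alpha(social_demands), 3))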
Table 47 presents a summary of the descriptive statistics for the four scales. The
results indicate that Social and Cognitive Engagement Subscales obtained coefficient
alphas were statistically significant atp < .05. The results further revealed that the other
two scales obtained acceptable internal consistency estimates for the scores obtained
from this study. The results indicate that the four scales collected reliable data from the
participants in this study.
Table 47. Summary of Results from the Reliability Analysis and Descriptive Statistics for Subscales of the Survey of Promise Scholarship Recipients

                           Cronbach's     95% Confidence Interval
Subscale                     Alpha      Lower Bound    Upper Bound      Mean        sd
Social Engagement             .687          .565           .782        12.352     3.594
Institutional Support         .782          .704           .845        23.207     4.698
Social Demands                .748          .660           .820        12.465     5.165
Cognitive Engagement          .830          .774           .877        54.111    10.308
In addition to assessing the internal consistency of the scales contained in the survey, an item analysis was performed on the individual items in each scale. This statistical analysis provided information on the internal consistency of single items as they relate to the homogeneity of the items contained in a scale (Thorndike, 1967). Appendix F presents a summary of the results. The item analysis was conducted by investigating the item-total correlation for each item in a scale. Items with a correlation of .30 or higher were retained for inclusion in subsequent analytic procedures. This value was chosen because it represents the critical value of r with alpha set at .01 and df = 100 (Ary, Jacobs &
Razavieh, 1996). Items with lower correlations were excluded from the subsequent
statistical procedures, if excluding the items did not decrease alpha of the scale to which
the item was assigned. In addition, items with correlation less than .30 were considered
for either modification or removal from the questionnaire (Ary, Jacobs & Razavieh,
1996; Thorndike, 1967). Items were considered for removal if removing the item did not
decrease the alpha for the scale.
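The item analysis described here can be sketched as corrected item-total correlations, where each item is correlated with the sum of the remaining items and compared against the .30 cut score. The file and column names below are illustrative assumptions.

    # Illustrative sketch: corrected item-total correlations for one subscale.
    import pandas as pd

    def item_total_correlations(items: pd.DataFrame) -> pd.Series:
        """Corrected item-total correlation for each item in a scale."""
        items = items.dropna()
        total = items.sum(axis=1)
        # Correlate each item with the scale total computed without that item.
        return pd.Series(
            {col: items[col].corr(total - items[col]) for col in items.columns}
        )

    survey = pd.read_csv("promise_survey.csv")   # hypothetical survey extract
    scale = survey[["i14", "i15", "i16", "i17", "i18"]]
    corrs = item_total_correlations(scale)
    print(corrs[corrs < 0.30])                   # items flagged for modification or removal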
A review of the results, as presented in Appendix F, indicated that all items on the
Institutional Support subscale and Social Demands subscale were good items as all
achieved inter-item correlations that exceeded the cut score of .30. However, one item
from the Social Engagement subscale (Attended Art Exhibit, Play, Dance, or Music) was
excluded from subsequent statistical analyses due to a low inter-item correlation.
Deleting the item resulted in four items remaining for the scale: Often Exercise/Participated in Physical Education; Often Participated in Spiritual Activities; Often Tried Understand Someone Else's Eyes; and Often Learned Something Changed Way Understand Issue.
A review of the results for the Cognitive Engagement subscale resulted in two items (C8, Come to class without completing readings or assignments, and C23, Skipped class) being excluded from subsequent statistical analyses due to negative inter-item correlations with other items in the scale. The researcher initially reverse coded the items in an effort to improve the total inter-item correlation; however, doing so did not raise the correlations above the critical cut score of .30. Deleting the items resulted in 18 items remaining for the scale. All items in each subscale can be seen in detail in Appendix F.
Research Question 3.1 Results
3.1 Among the three groups of students: respondent, late respondent, and non-respondent Kalamazoo Promise recipients, are there any differences in the
following known dependent variables from the academic data: (a) high school
GPA, (b) most recent WMU GPA, (c) ACT composite score, (d) taking a
remedial math course at WMU, (e) taking a remedial reading course at WMU,
(f) taking a remedial writing course at WMU, (g) taking AP credit, (h) living
in a dorm, (i) being an athlete, (j) parental income, (k) first year experience
(FYE), or (l) high school, that could indicate possible non-response bias?
As academic data obtained from the Office of Student Academic and Institutional
Research provides data on all students regardless of whether they responded to the Survey
of Promise Scholarship Recipients at WMU Spring 2009 or not, this data was used to
determine if these students are similar. If these students are similar, then the survey data
could be generalized to the population of Western Michigan University Kalamazoo
Promise Scholarship recipients and not just to those students who responded to the
survey. If students who responded to the survey are different from those who did not
respond to the survey, then possible non-response bias must be examined in more depth.
Table 48. Distribution of Promise Students who Answered the Survey by Probation Status (N = 185)

                      Good Standing   On Probation   Academic Dismissal   Total
Did Not Respond             60             15                 9             84
On Time Response            48              2                 2             52
Late Response               39              8                 2             49
Total                      147             25                13            185

Note. Seven students had no record of probation status and were not included in this table; therefore, N = 300, not 307, and the total of Responded and Did Not Respond does not equal the 191 surveyed because of this.
Non-respondents (N = 84) in this research are considered to be those survey recipients who did not fill out the online survey. Late respondents are those who responded only after more than two reminders (April 20 and later was the cutoff date). "Persons who respond in later waves are assumed to have responded because of the increased stimulus and are expected to be similar to non-respondents" (Armstrong & Overton, 1977, p. 2). Because non-respondents did not fill out the survey, non-respondents and respondents could not be compared using the survey data. Therefore, using the successive wave method of examining non-response bias, on-time respondents (N = 52) were compared to late respondents (N = 49) using the data obtained from the administration of the survey. Successive waves refer to the stimuli applied over time, i.e., reminder emails, post cards, and follow-up calls (Armstrong & Overton, 1977).
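Classifying returns into waves for this kind of comparison can be sketched as below, using an assumed response-date column and the April 20 cutoff noted above; the file and column names are illustrative, not the actual survey export.

    # Illustrative sketch: assigning each sampled student to a response wave.
    import pandas as pd

    survey = pd.read_csv("promise_survey.csv", parse_dates=["response_date"])  # hypothetical
    cutoff = pd.Timestamp("2009-04-20")

    def wave(row):
        # Non-respondents have no response date; returns on or after the cutoff
        # count as the late wave.
        if pd.isna(row["response_date"]):
            return "non-respondent"
        return "late respondent" if row["response_date"] >= cutoff else "on-time respondent"

    survey["wave"] = survey.apply(wave, axis=1)
    print(survey["wave"].value_counts())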
In addition, because additional data was obtained on all students through academic records, non-respondents (N = 84) were compared to respondents (N = 101), and non-respondents were compared to late respondents, to detect whether any differences exist for the population of WMU Promise students.
This research question was addressed through the use of the MANOVA procedure
and a Chi-square Test of Independence. The MANOVA procedure was used to assess the
differences among the groups on the interval level data (high school GPA, ACT
composite score, most recent WMU GPA, and parents' AGI). The Chi-square Test of
Independence was used to assess the differences among the groups on the categorical
level data (taking remedial courses, taking AP credit, living in the dorm, high school
attended and first year experience).
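As an illustration only, the sketch below shows how a comparable one-way MANOVA could be run in Python with statsmodels rather than in a commercial statistical package. The data frame, column names, and generated values are hypothetical stand-ins, not the variables or data used in this study.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Synthetic stand-in data: 60 students split across the three response groups,
# with four interval-level dependent variables mirroring the analysis above.
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "response_group": rng.choice(["on_time", "late", "none"], size=n),
    "act": rng.normal(21, 4, n),
    "hs_gpa": rng.normal(3.0, 0.4, n),
    "wmu_gpa": rng.normal(2.8, 0.7, n),
    "parent_agi": rng.normal(55000, 15000, n),
})

manova = MANOVA.from_formula(
    "act + hs_gpa + wmu_gpa + parent_agi ~ response_group", data=df
)
# mv_test() reports Pillai's trace, Wilks' lambda, and related multivariate tests;
# univariate follow-up ANOVAs would then be inspected for each dependent variable.
print(manova.mv_test())
```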
Table 49 presents a summary of the MANOVA procedure. The results revealed two statistically significant differences among the three groups. These statistical differences were found in ACT composite score and most recent WMU GPA.
The comparison of ACT composite scores across the three groups—on-time response, late response, and no response—revealed a test statistic of F(2, 298) = 4.597 and p = .011. The magnitude of the effect size for the results was η² = .048, which according to Cohen (1988) is a small effect. The obtained power of .773 revealed that the differences in the ACT composite scores across the three groups were large enough to be detected 77.3% of the time. The null hypothesis for this research question was rejected. A review of the descriptive statistics revealed that on-time respondents reported significantly higher ACT composite scores (M = 21.8, sd = 6.672) than did non-respondents (M = 19.05, sd = 6.381). There were no statistically significant differences between the scores (M = 21.04, sd = 4.975) of late respondents and the scores of the other two groups.
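For readers unfamiliar with this effect-size measure, the note below gives the standard textbook definition of partial eta squared as reported by common statistical packages; it is included only as a reminder and is not a reproduction of the study's own computations.

```latex
\eta_p^2 \;=\; \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}
```

Commonly cited benchmarks attributed to Cohen (1988) treat values near .01 as small, .06 as medium, and .14 as large, which is the scale applied to the effect sizes reported in this section.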
Table 49. Summary of MANOVA Comparison of Survey of Promise Scholarship
Recipients at WMU Spring 2009 across the Demographic Variables

Dependent Variable          df     F      Sig.   Partial Eta Squared   Observed Power
ACT Composite Score          2   4.597    .011          .048                .773
High School GPA              2   1.646    .196          .018                .344
Most Recent WMU GPA          2   8.796    .000          .088                .969
Parents Aggregate Income     2   1.006    .368          .011                .223
The most recent WMU GPA across the three groups revealed a test statistic of F(2, 298) = 8.796 and p = .000. The magnitude of the effect size for the results was η² = .088, which according to Cohen (1988) is a medium effect. The obtained power of .969 revealed that the differences in the most recent WMU GPA for the three groups were large enough to be detected 96.9% of the time. The null hypothesis for this research question was rejected. A review of the descriptive statistics revealed that on-time respondents reported significantly higher WMU GPAs (M = 3.076, sd = .672) than did non-respondents (M = 2.542, sd = .813). There were no statistically significant differences between the scores (M = 2.856, sd = .669) of late respondents and the scores of the other two groups.
The second part of research question 3.1 was examined using the Chi-square Test
of Independence due to the categorical nature of these variables. A summary of the
results is presented in Table 50. There were no statistically significant differences
between the observed and expected frequency counts among the three groups. The null
hypothesis for this research question was upheld.
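As a minimal sketch of how such a test can be reproduced, the example below runs a Chi-square Test of Independence on a 3 x 2 contingency table with SciPy. The counts shown are hypothetical placeholders standing in for one categorical variable (e.g., FYE participation) crossed with the three response groups; they are not the study's actual frequencies.

```python
from scipy.stats import chi2_contingency

# Rows: on-time respondents, late respondents, non-respondents
# Columns: participated in FYE, did not participate (hypothetical counts)
observed = [
    [20, 32],
    [18, 31],
    [30, 54],
]

chi2, p, dof, expected = chi2_contingency(observed)
# A p-value above .05 would mean the observed and expected frequencies do not
# differ significantly, i.e., no evidence of non-response bias on this variable.
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```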
Table 50. Test of Group Differences across the Categorical Variables

Variable                  χ²       df   Asymp. Sig. (2-sided)
First Year Experience    4.491a     2          .106
Remedial courses          .894      2          .640
Being an athlete         2.431a     2          .297
Living in the dorm       4.239a     2          .120
Taking AP credit         1.018a     2          .601
Gender                   4.685a     2          .096
Race                     2.145a     4          .709
High school attended     3.782a     4          .436
Note. a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 6.36.
Research Question 3.2 Results
3.2 Between the two groups of students: early respondent and late respondent
Kalamazoo Promise recipients, are there any differences in the cognitive,
social, or institutional factors from Swail's (2003) Geometric Model of
Student Persistence and Achievement using the dependent variables from the
Survey of Promise Scholarship Recipients at WMU Spring 2009 indicating
possible non-response bias?
Non-respondents are not used for this question, as survey data are not available for them. Late respondents are those who answered after the second e-mail reminder but before the third e-mail reminder. Late respondents were considered to be similar to non-respondents for the purpose of this research. "Persons who respond in later waves are
assumed to have responded because of the increased stimulus and are expected to be
similar to non-respondents" (Armstrong & Overton, 1977, p. 2).
This research question was addressed through the use of the MANOVA and the Chi-square Test of Independence. The MANOVA procedure was used to assess the differences among the groups on the four summated subscales of the Survey of Promise Scholarship Recipients at WMU Spring 2009. Pillai's Trace was used to interpret results
from the MANOVA procedure.
Table 51 presents a summary of the MANOVA results. The value of Pillai's trace was not statistically significant [F(4, 73) = 1.080, p = .373]. There was not enough evidence to reject the null hypothesis for this research question. There were no statistically significant differences between the two groups for the data obtained from the Survey of Promise Scholarship Recipients at WMU Spring 2009.
Table 51. Summary of MANOVA Results for Early and Late Respondents of the Survey
of Promise Scholarship Recipients on the Four Summated Subscale Scores

Dependent Variable          df     F      Sig.   Partial Eta Squared   Observed Power
Social Demands               1   2.749    .101          .035                .374
Cognitive Engagement         1    .751    .389          .010                .137
Institutional Engagement     1    .080    .778          .001                .059
Social Engagement            1   1.662    .201          .021                .247
Table 52. Summary of Comparison of Early Respondents and Late Respondents across
the Categorical Variables of the Survey of Promise Scholarship Recipients

Variable                            χ²        df     p
ssDegreePursuing                   80.982a    88    .689
ssCurrentMajor                     85.987a    84    .419
ssAdvancedDegree                    1.972a     1    .160
ssIfYesDescribe                    53.001a    52    .435
ssDescribeCareerGoals              95.662a    95    .462
ssFreeReducedLunch                  1.834a     1    .176
sGenderfromSurvey                   3.414a     1    .065
ssLivingWhereDuringSchoolYear       5.433a     2    .066
ssWhoLiveWithSchoolYear             2.893a     6    .822
ssHighestEducationFather            4.159a     5    .527
ssHighestEducationMother            5.827a     5    .323
siLevelAwarenessPromise             5.827a     5    .323
siPercentScholarshipEligibleFor     4.515a     7    .719
scHoursPreparingForClass            5.055a     5    .409
ssHoursWorkingForPay                3.774a     5    .582
scHoursCollegeActivities            4.204a     4    .379
ssHoursDependentCare                4.849a     5    .435
ssHoursCommuting                    5.175a     2    .075
ssJobEffectSchoolWork               2.755a     3    .431
Note. a. 0 cells (.0%) have expected count less than 5.
The Chi-square Test of Independence was used to assess the differences between
the groups on the categorical level data. The results failed to reveal any statistically
significant differences between the two groups across the various categorical variables;
see Table 52.
Research Question 3.3 Results
3.3 Using and modifying Groves and Couper's bias ratio formula (1998), is there an indication of non-response bias, and is there a difference between using the mean or the median, a more robust statistic, in determining a bias estimate on the dependent variables from the Survey of Promise Scholarship Recipients at WMU Spring 2009?
The null hypothesis for this research question was addressed by using and
modifying Groves and Couper's bias ratio formula (1998). The formula was applied
twice: once with the means and once with the medians obtained from the descriptive data
for the four subscale scores of the Survey of Promise Scholarship Recipients at WMU
Spring 2009. The calculations were performed using the following formula:
Bias = (1 - Response Rate)(Respondent Mean - Late-respondent Mean)

B(ȳ_r) = (1 - r)(ȳ_r - ȳ_nr)

where r is the response rate, ȳ_r is the respondent mean, and ȳ_nr is the late-respondent mean used as a proxy for non-respondents. Table 53 gives the descriptive statistics used in the bias ratio formula.
Table 53. Descriptive Statistics for the Subscale Scores of the Survey of Promise
Scholarship Recipients

Survey Group by Response               Social      Cognitive     Institutional   Social
                                       Demands     Engagement    Support         Engagement
On Time Respondent   Mean              13.104       52.783         14.844          22.073
                     N                     48           46             45              41
                     Std. Deviation     5.832       10.840          4.106           4.698
                     Variance          34.010      117.507         16.862          22.070
                     Median            12.000       52.500         14.000          21.000
Late Respondent      Mean              11.761       55.500         15.256          24.217
                     N                     46           44             43              46
                     Std. Deviation     4.321        9.648          3.971           4.511
                     Variance          18.675       93.093         15.766          20.352
                     Median            12.000       54.500         15.000          24.000
Total                Mean              12.447       54.111         15.046          23.207
                     N                     94           90             88              87
                     Std. Deviation     5.165       10.308          4.023           4.698
                     Variance          26.680      106.257         16.182          22.073
                     Median            12.000       54.000         14.500          23.000
The results of the comparison of means are presented in Table 54. The bias estimates ranged from a low of -1.2884% to a high of 5.0853%. If the response rate is 53%, the mean Social Demands subscale score for the respondents is 13.1042, and the mean for the late respondents (the proxy for non-respondents) is 11.7609, then the non-response error relative to the total mean of 12.4468 is (1 - .5288)(13.1042 - 11.7609)/12.4468 = 0.0509. This means that the non-response bias is 5% with regard to the total sample mean for the Social Demands subscale score.
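The arithmetic above can be reproduced with a few lines of code. The sketch below is an illustration only: it assumes, as the worked example does, that the bias estimate is expressed relative to the total sample mean, and it plugs in the Table 54 values for the Social Demands subscale.

```python
def relative_nonresponse_bias(resp_mean: float, late_mean: float,
                              total_mean: float, response_rate: float) -> float:
    """Bias ratio of Groves and Couper (1998), with the late-respondent mean
    standing in for the non-respondent mean and the result expressed as a
    fraction of the total sample mean (assumed from the worked example)."""
    return (1 - response_rate) * (resp_mean - late_mean) / total_mean

# Social Demands subscale, 53% response rate (values from Table 54)
bias = relative_nonresponse_bias(13.1042, 11.7609, 12.4468, 101 / 191)
print(f"bias = {bias:.4f}  ({bias * 100:.2f}%)")  # approximately 0.0509, i.e. 5.09%
```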
The same bias percent calculations were completed for each of the subscales—Cognitive Engagement, Institutional Support, and Social Engagement—with the resulting percentages of 2%, 1%, and 4%, respectively. These results failed to reveal any large (10% or higher) non-response bias between respondents and non-respondents.
Table 54. Response Bias Estimates for the Subscale Scores of the Survey of Promise
Scholarship Recipients Based on Mean Scores of the Participants Using the 53%
Response Rate

                        Response   Late Response    Total    Non-response    Bias
                          Mean         Mean          Mean        Bias       Percent
Social Demands          13.1042      11.7609       12.4468      0.0509      5.0853
Cognitive Engagement    52.7826      55.5000       54.1111     -0.0237     -2.3663
Institutional Support   14.8444      15.2558       15.0455     -0.0129     -1.2884
Social Engagement       22.0732      24.2174       23.2069     -0.0435     -4.3536
Note. Response rate = 101/191 = .5288 = 53%
The results of the comparison based on the median are presented in Table 55. The bias estimates ranged from a low of 0% to a high of 6.1461%. If the response rate is 53%, the median Social Demands subscale score for the respondents is 11.7610, and the median for the late respondents is 11.7610, then the non-response error is (1 - .5288)(11.7610 - 11.7610) = 0. This means that the bias is 0% with regard to the total sample median for the Social Demands subscale score.
The same bias percent calculations were completed for each of the subscales, with the resulting percentages of 2%, 3%, and 6%, respectively. These results failed to reveal any large bias between respondents and non-respondents.
Table 55. Response Bias Estimates for the Subscale Scores of the Survey
of Promise Scholarship Recipients Based on Median Scores of the Participants
Using the 53% Response Rate

                        Median     Median Late    Total     Non-response    Bias
                        Response    Response      Median        Bias       Percent
Social Demands          11.761       11.761       12.000       0             0
Cognitive Engagement    52.500       54.500       54.000       0.0175       -1.7452
Institutional Support   14.000       15.000       14.500       0.0325       -3.2497
Social Engagement       21.000       24.000       23.000       0.0615       -6.1461
Note. Response rate = 101/191 = .5288 = 53%
The same analysis was run using the response rate of 33% based on the entire population, instead of the 53% rate of those who were e-mailed the survey and responded. The results of the comparison of means are presented in Table 56. The bias estimates ranged from a low of 3.3697% to a high of 26.0868%. If the response rate is 33%, the mean Social Demands subscale score for the respondents is 13.1042, and the mean for the late respondents (the proxy for non-respondents) is 11.7609, then the non-response error relative to the total mean of 12.4468 is (1 - .3289)(13.1042 - 11.7609)/12.4468 = 0.0724. This means that the bias is 7% with regard to the total sample mean for the Social Demands subscale score.
The same bias percent calculations were completed for each of the subscales, with the resulting percentages of 3%, 8%, and 26%, respectively. These results failed to reveal any large bias between respondents and late respondents, except for the Social Engagement subscale, for which there is a 26% bias percent between respondents and late respondents.
Table 56. Response Bias Estimates for the Subscale Scores of the Survey
of Promise Scholarship Recipients Based on Mean Scores of the Participants
Using the 33% Rate

                        Response   Late Response    Total    Non-response    Bias
                          Mean         Mean          Mean        Bias       Percent
Social Demands          13.1042      11.7609       12.4468      0.0724      7.2418
Cognitive Engagement    52.7826      55.5000       54.1111     -0.0336      3.3697
Institutional Support   14.8444      15.2558       15.0455     -0.0772      7.7202
Social Engagement       22.0732      24.2174       23.2069     -0.2609     26.0868
Note. Response rate = 101/307 = .3289 = 33%
The results of the comparison based on the median are presented in Table 57. The bias estimates ranged from 0% to 26.0870%. If the response rate is 33%, the median Social Demands subscale score for the respondents is 11.7610, and the median for the late respondents is 11.7610, then the non-response error is (1 - .3289)(11.7610 - 11.7610) = 0. This means that the bias is 0% with regard to the total sample median for the Social Demands subscale score.
The same bias percent calculations were completed for each of the subscales, with the resulting percentages of 4%, 7%, and 26%, respectively. These results failed to reveal any large differences between respondents and late respondents except for the Social Engagement subscale, for which there is a 26% bias percent between respondents and late respondents.
Table 57. Response Bias Estimates for the Subscale Scores of the Survey of Promise
Scholarship Recipients Based on Median Scores of the Participants Using the 33%
Response Rate

                        Median     Median Late    Total     Non-response    Bias
                        Response    Response      Median        Bias       Percent
Social Demands          11.761       11.761       12.000       0              0
Cognitive Engagement    52.500       54.500       54.000      -0.0370        -3.7037
Institutional Support   14.000       15.000       14.500      -0.0690         6.8966
Social Engagement       21.000       24.000       23.000      -0.2609       -26.0870
Note. Response rate = 101/307 = .3289 = 33%
This chapter presented the summary results of both the academic data and the survey data. Along with this, the results of all three research questions and their sub-questions were presented in detail.
CHAPTER V
CONCLUSIONS
This concluding chapter consists of three sections. First, the central findings of
the three research questions and sub-questions are discussed. Second, suggestions for
future research are enumerated. Lastly, the potential implications of these results for
future studies dealing with Kalamazoo Promise recipients are discussed, with
implications for WMU, KPS, the Kalamazoo community, researchers and evaluators to
consider.
Central Findings
The central findings of this dissertation are several, as each of the three research
questions had a different focus. Research question one focused on persisters, those on
probation, and non-persisters in terms of the Cognitive, Social and Institutional factors
from Swail's (2003) Geometric Model of Student Persistence and Achievement, using the
academic data obtained from the Office of Student Academic and Institutional Research.
The second research question also focused on persisters, those on probation, and
non-persisters, but this time in terms of the average number of courses taken per term and
the number of courses taken the first year. This data was also obtained from the Office of
Student Academic and Institutional Research.
The last research question focused on examining non-response bias and whether
there was an indication of bias in this research that would limit the generalizations that
can be made back to the population of WMU Kalamazoo Promise Scholarship recipients.
Part one of this question examined respondents, late respondents, and non-respondents in terms of the academic data obtained on all students. The second part compared early respondents with late respondents using the Survey of Promise Scholarship Recipients at WMU Spring 2009, as the literature shows that late respondents are very similar to non-respondents: "Persons who respond in later waves are assumed to have responded because of the increased stimulus and are expected to be similar to non-respondents" (Armstrong & Overton, 1977, p. 2). Non-respondents could not be examined using the Survey of Promise Scholarship Recipients at WMU Spring 2009; because they did not answer the survey, no survey data on them exists. The last part of question three used the bias ratio formula of Groves and Couper (1998) to determine a bias ratio on the dependent variables from the Survey of Promise Scholarship Recipients at WMU Spring 2009. This formula was also modified, using the median in place of the mean, in hopes of determining a better indication of bias. Because of the number and scope of these questions, the central findings are organized into three sections: one for each research question and its corresponding sub-questions.
Research Question One Conclusion
Difference by race. Although high school GPAs for all Kalamazoo Promise
participants on average are around 3.00 regardless of probation status, there is still a
significant difference between those in good standing (persisters), those on probation, and
those that were academically dismissed (non-persisters) from WMU. A closer
examination revealed that White students had on average significantly higher high school
GPAs than Black students or students in the Other category. Also, students in the Other
category had significantly higher high school GPAs than Black students. Black students
have the lowest high school GPAs, while White students have the highest.
White students also scored significantly higher on their ACT composite score
than did Black students or students in the Other category. There was no significant
difference between Black students and students in the Other category on ACT composite
score. The interesting thing here is not that the White students scored higher on their
ACTs and had higher high school GPAs than the Black students or students in the Other
racial group; rather it is that there was no significant difference found among racial
groups on most recent WMU GPAs. This indicates that Western Michigan University
Kalamazoo Promise recipient students perform similarly regardless of race, which is not
the case for these same students in high school. This prompts the question: what is different, in terms of race, between the high school setting and the university setting? This area needs more research in order to determine exactly what is indicated here.
Difference among persisters, those on probation, and non-persisters. By race, there was a significant difference for high school GPA and ACT composite score. Yet there were no significant differences by race for most recent WMU GPA. The absence
of differences by race was found among those who persisted, those on probation, and
those who did not persist. It would be expected that persisters had higher GPAs and
higher ACT composite scores, just as was found in terms of race. This, however, was not
the case for ACT composite score.
The opposite of what holds true for race in terms of ACT composite score was
found between persisters, those on probation and non-persisters. There was no
significant difference found between persisters, those on probation and non-persisters in
terms of ACT composite score. There was a significant difference found between
persisters, those on probation and non-persisters with regard to high school GPA and
most recent WMU GPA. Persisters had higher high school GPAs than non-persisters.
Those on probation performed similarly to both persisters and non-persisters, falling right in between the two groups. This means that students who persist had the highest high
school GPAs. This makes sense; it is assumed that students who have higher high school
GPAs are better prepared for a post-secondary education than students who have low
high school GPAs.
Persisters also had higher most recent WMU GPAs than either those on probation
or non-persisters. In addition, those on probation had higher most recent WMU GPAs
than non-persisters. Persisters have the highest WMU GPAs, while non-persisters have
the lowest WMU GPAs. Obviously, this would be the case as those at WMU whose
GPAs are low are academically dismissed from WMU.
In addition, in the group of persisters, White students had higher most recent
WMU GPAs than either Black students or students in the Other category. Students from
the Other category who were also persisters had higher WMU GPAs than Black students
in the same category. There was little difference in the WMU GPAs of students
classified as being on probation. Out of students in the non-persister category, Black
students had higher WMU GPAs than non-persister students in the White or Other racial
group. This suggests some Black students may have been on the edge of not being
academically dismissed, and could have persisted with a little academic support.
Students coming to WMU start off very similar in terms of their high school
GPA, whether they persist, find themselves on probation, or do not persist: all three
groups average in the 3.00 range, and ACT composite scores are statistically no
different. Thus, neither high school GPA nor ACT composite scores appear to be
predictors of whether a student will persist or not.
The second part of this question examined persisters, those on probation, and non-persisters in terms of whether the participants had taken remedial courses at WMU.
Thirty-five percent of non-persisters had taken at least one remedial course at WMU,
approximately 22% of those on probation had taken remedial courses at WMU, while
only 9% of the persisters reported taking a remedial course. This means that non-persisters took the most remedial courses. In addition, 12% of persisters had taken at
least one AP course in high school, while those on probation did not take any, and only
2% of non-persisters took AP courses in high school. Thus, students who take AP
courses in high school are more likely to persist than students who do not. Obviously,
students need to be academically prepared before they come to WMU in order to have a
higher success rate with their post-secondary education.
By gender and race, there were no statistically significant differences between
persisters, those on probation, and non-persisters, which indicates that statistically the
same number of males and females persisted, and regardless of racial group, persistence
or lack thereof was evenly distributed.
Research question 1.2. Although factors of retention found in the literature
indicate that higher parental income and living in the dorm increase retention rates (Astin,
1999; Seidman, 2005), this was not found to be true with this population. This study
determined that there was no statistically significant difference between persisters, those
on probation, and non-persisters in terms of parental income or living in a dorm.
Research suggests that living in a dorm increases the social involvement of students,
helping them stay involved with their institution and, in turn, helping to retain them.
Such was not the case for WMU Kalamazoo Promise students, however. Western
Michigan University and Kalamazoo Promise Scholarship recipients are a special case;
all of these students have gone to high school and lived in the community for at least four
years, fostering for them community connections that students from outside areas might
not have. "Students who have money, if it came from their parents, also probably have
high levels of social and human capital. Isolating the effects of money from cultural
capital is difficult. These factors are likely to be mutually supportive in terms of
retention, and a student who has both money and cultural capital will benefit in terms of
social integration and institutional fit" (Seidman, 2005, p. 235). These students may or
may not have high levels of "social and human capital" through their parents, as the
research suggests. Promise students obtain their financial support through a scholarship,
which is much different from what existing research discusses. Yet despite existing
research that associates retention with parental income, no such connection is evident
here.
Existing research also indicated that living in a dorm would make an impact on
retention. There was no difference, however, among Promise students at WMU in terms
of whether students lived in a dorm or not. Literature associating living in a dorm with
improved retention seems more relevant in the cases of students who move away from
home and lack other connections in the college community. For Kalamazoo Promise
students and WMU, however, there are special circumstances to account for. Promise
students attending WMU have already gone to high school in the local community. They
presumably have many social connections and social supports not found among students
in previous research connecting dorm living with improved retention; such students are
more likely to have moved away from home to a new community. Supposing WMU had
decided to offer free dorm rooms to Kalamazoo Promise recipients on the assumption
that dorm living would improve their retention, in keeping with previous research on the
subject, the evidence indicates such a policy would have made no difference in the retention of
this population. The Promise scholarship appears to affect this population in ways that
existing factors of retention cannot predict.
Research Question 1.3. There was no statistically significant difference based on which high school students attended prior to enrolling at WMU. Statistically, the same number of students from each high school succeeds or fails at
WMU. This is good news for the high schools: it indicates that one high school is not
doing better or worse than the other in preparing students for WMU.
The First Year Experience program that Western Michigan University provides new students
seems to make a difference, although not a strong one. Out of all of the WMU Promise
students, only 37% participated in FYE; about half of those on probation and about half
of non-persisters participated in it. By comparison, only about 30% of persisters
participated. That only 37% of Promise recipients participated in FYE, while half of
those on probation and half of non-persisters participated in it, suggests that WMU may
need to rethink the program for Promise recipients and that FYE does not seem to be
functioning as intended. WMU may want to examine in greater depth what services the
university actually provides for Promise students.
Research Question Two Conclusion
Research question two was answered using the course data provided by the Office
of Student Academic and Institutional Research at WMU. The first sub-question looked at the average number of courses taken per term, controlling for gender and race, by persisters, those on probation, and non-persisters. It also examined the number of courses taken the first year, controlling for gender and race. The second sub-question examined the respondents, late respondents, and non-respondents of the Survey of
Promise Scholarship Recipients at WMU Spring 2009 in terms of the average number of
courses taken per term, controlling for gender and race. It also examined the number of
courses taken the first year, controlling for gender and race.
Question 2.1, Average number of courses taken per term by persistence. The
difference in the average number of courses taken per term between persisters, those on
probation and non-persisters controlling for gender and race was examined. Not
surprisingly, there was a significant difference in the average number of courses taken per
term by persisters (M= 5.13), compared with the average number those on probation (M
= 4.85) and non-persisters (M = 4.65) each took. This concurs with existing research, which suggests that taking more courses per term relates to persistence. There was no
difference found across gender, but there was a difference found across race. Students in
the Other racial category took the most classes per term (M= 5.26), while Black students
took an average of 5.00 courses per term. White students took, on average, the fewest courses per term (M = 4.94). This was surprising, because minority students
in question one were found to have lower high school GPAs and lower ACT composite
scores than White students.
Question 2.1, Number of courses taken the first year by persistence. The
difference in the number of courses taken the first year among persisters, those on
probation, and non-persisters, controlling for gender and race, was examined. Not
surprisingly, there was a significant difference between the number of courses taken the
first year by persisters (M = 10.17), compared with the number those on probation (M =
9.12) and non-persisters (M = 9.31) each took. Existing research indicates that when a higher number of courses is taken, students are more likely to persist, and that is evident here.
Question 2.2, Average number of courses taken per term by response and the
number of classes taken the first year. The difference in the average number of courses
taken per term and the number of courses taken the first year among respondents, late
respondents, and non-respondents, controlling for gender and race, was examined. There
were no statistically significant differences among respondents, late respondents, and non-respondents in the average number of courses taken per term or the number of courses taken the first year. This means that non-response error is not an issue here; respondents, late respondents, and non-respondents are similar, or at least statistically no different. In terms of non-response bias, because the course data cover the entire population, generalizations can be made from the responses to the population of WMU Kalamazoo Promise students.
Research Question Three Conclusion
If it is believed that non-respondents are different from respondents in ways that
are critical to the research or evaluation questions being asked, then non-response bias
should be examined thoroughly in order to make accurate generalizations from the
population being examined. The key phrase here is "critical to the research or evaluation
questions." With the Kalamazoo Promise Scholarship being so new and having such
enormous implications for Kalamazoo students, its community, and other cities
replicating this universal scholarship program, it was imperative to know whether non-response error or bias would prevent study findings from being generalized to the larger population of Promise recipients attending WMU.
Research Question 3.1
This research question looked at the three groups of students—respondents, late
respondents, and non-respondents—in terms of the academic data provided by the Office
of Student Academic and Institutional Research at WMU. One method for controlling
non-response error is to compare respondents to non-respondents (Miller & Smith, 1983).
If no statistically significant difference is found between respondents and non-respondents on known characteristics, then the results can be generalized both to the
sample and the population (Diem, 2004). In other words, if the two groups of students
are similar on known variables, then assumptions can be made for the unknown variables.
In examining respondents, late respondents, and non-respondents, two items were
found to be statistically significant: ACT Composite Scores and Most Recent WMU
GPA. Respondents had higher ACT Composite Scores than the non-respondents. The
late respondents, however, were statistically no different than either the respondents or
the non-respondents. For the respondents to have higher scores than the other two groups
might be expected; they presumably are more ambitious, as indicated by answering the
survey. This is in keeping with the tendency of respondents to have taken the most courses, compared with non-respondents. It would also have been unsurprising for non-respondents to have had either lower scores or, perhaps, much higher scores. In other words, they might be either lower-achieving and less ambitious, or higher-achieving, with little time to respond to a survey. Regardless,
research suggests that late respondents and non-respondents should be similar
(Armstrong & Overton, 1977, p.2).
In this case, late respondents were found to be not statistically different from non-respondents, as the research suggests. Late respondents were also found not to be statistically different from respondents, however. The scores of late respondents fell right between those of respondents and non-respondents. This finding suggests the population in question departs in some way from the norm described in existing literature, which points to a further avenue for research.
Respondents also had higher Most Recent WMU GPAs than the non-respondents. Again, as with the ACT Composite Scores, the late respondents were statistically no different from either the respondents or the non-respondents; the Most Recent WMU GPA of late respondents fell right in between those of respondents and non-respondents. This coincides with existing research that suggests that late respondents and non-respondents are similar (Armstrong & Overton, 1977, p. 2).
This study shows that assuming the similarity of respondents and non-respondents may not always be wise, despite reliance on previous literature.
Late respondents in this study were similar to both respondents and non-respondents. This would be fine if respondents and non-respondents were similar; in this study, however, respondents and non-respondents are indeed different in two of the variables. This suggests that researchers seeking to ensure there is no response bias by
comparing respondents with late respondents are making assumptions and generalizing to
the population in ways that may not be accurate.
Surprisingly, no significant difference was found among respondents, late
respondents, and non-respondents on the rest of the known variables: High School GPA,
Parents' Aggregate Income, First Year Experience, Remedial Courses, Being an Athlete,
Living in the Dorm, Taking AP credit, Gender, Race, or High School Attended. The course data in question 2.2, however, did show a difference across race. Non-response error should be examined when the variables in question are critical to the interpretations; here, the course data are not as critical as the academic data.
Using established literature, it can be cautiously concluded from this data that
respondents to the Survey of Kalamazoo Promise Recipients at WMU 2009 are more
likely to persist at WMU as they have higher ACT composite scores as well as higher
WMU GPAs than do non-respondents to the survey.
The question of response error and therefore possible bias is still open, as
respondents are statistically different from non-respondents on two variables, which
would allow only for generalizations made to the sample, not to the population.
However, because this study is examining the factors of retention, and the two factors
ACT composite scores and most recent WMU GPAs were the only two variables found
to be different between respondents and non-respondents, this may be acceptable for this
population, because ACT composite scores were not found to be associated with whether
a student persisted or not. By contrast, the most recent WMU GPA is an obvious
variable that, of course, affects persistence.
Another method for examining non-response error is by comparing early or on-time respondents with late respondents. Comparing early or on-time respondents with
either late or reluctant respondents is commonly done in social science research to
determine the effect, if any, of non-response on the data under consideration (Armstrong
& Overton, 1977; Diem, 2004; Miller & Smith, 1983; Smith, 1984).
Late respondents are not statistically different from either respondents or non-respondents. In order for there to be no further examination into non-response bias, non-respondents should have been found to be similar to late respondents. In this case, however, late respondents were found to be similar to both respondents and non-respondents, which would be fine if there were no statistically significant difference between respondents and non-respondents. Further examination into non-response bias is
indicated here. As it stands, the data can only be generalized to the sample of students
who responded and not to the population of WMU Kalamazoo Promise recipients.
Research Question 3.2
This research question looked at the two groups of students, early respondents and
late respondents, in terms of the cognitive, social and institutional factors from Swail's
(2003) Geometric Model of Student Persistence and Achievement, using the dependent
variables from the Survey of Promise Scholarship Recipients at WMU Spring 2009.
Another way of examining non-response error and possible non-response bias is by
comparing early or on-time respondents with late or reluctant respondents. This is commonly done in social science research to determine the effect, if any, of non-response on the statistics under consideration (Miller & Smith, 1983; Smith, 1984). Non-respondents and late respondents are assumed to behave similarly in research question 3.2 (Armstrong & Overton, 1977, p. 2). The late respondents' data from the survey are assumed to be similar to those of non-respondents for this question, as no data are, of course, available on non-respondents. This is commonly done in survey research due to declining response rates to surveys in the richer parts of the world (de Heer & de Leeuw, 2002).
No statistically significant difference was found between the two groups, early
and late respondents, in any of the four subscales of the survey: Social Demands,
Cognitive Engagement, Institutional Engagement and Social Engagement. There was
also no statistically significant difference found in any of the categorical variables. It
could therefore be assumed that respondents and late respondents, and therefore non-respondents, are similar. Subsequently, generalizations could be made to the population
of WMU Kalamazoo Promise recipients from the data obtained from the Survey of
Kalamazoo Promise Recipients at WMU Spring 2009.
As can be seen, depending on the method chosen to look at non-response error,
different conclusions can be made. In question 3.1 it was determined that generalizations
could only be made to the sample of those who responded to the survey. In question 3.2,
however, using a different method to look at non-response error, it was determined that
generalizations could be made to the population, and not just the sample of those who
responded. Question 3.3, using yet another method to look at non-response error, offers
an opportunity to clear this up.
Research Question 3.3
This research question examined the dependent variables from the Survey of
Promise Scholarship Recipients at WMU Spring 2009 using Groves and Couper's bias
ratio formula (1998). No high bias ratios were found in any of the subscale scores of the
Survey of Promise Scholarship Recipients at WMU Spring 2009 using Groves and
Couper's bias ratio formula (1998) and the 53% response rate. No high bias ratios were
found with the modified formula using the 53% response rate again and the median in
place of the mean, as the median is more robust. Using the median, however, did change
the percentages somewhat; see Table 58.
Table 58. Results of Mean and Median used in Bias Ratio Formula
at the 53% Response Rate

                        Bias Percent (Mean)   Bias Percent (Median)
Social Demands                5.0853                  0
Cognitive Engagement         -2.3663                 -1.7452
Institutional Support        -1.2884                 -3.2497
Social Engagement            -0.0435                 -6.1461
Note. 53% response rate is the 101 returned surveys / 191 sent out.
This 53% response rate was calculated using the 101 students who answered the
survey divided by the number of surveys sent out (191). The actual population of this group totals 307; however, this discrepancy occurred at the onset of the research and was explained in detail earlier under the question three results.
A high bias ratio was found in one of the subscale scores of the Survey of Promise
Scholarship Recipients at WMU Spring 2009 using Groves and Couper's bias ratio
formula (1998) and the 33% response rate. The same high bias ratio was also found with
the modified formula using the 33% response rate again and the median in place of the
mean as the median is more robust. Using the median, however, did change the
percentages somewhat. Table 59 shows the bias percents for the mean and median using the 33% response rate, which is the 101 surveys returned divided by the population of 307, along with the 53% response rate for comparison.
Table 59. Results of Mean and Median used in Bias Ratio Formula at the 53% and 33%
Response Rate

                          53% Response Rate               33% Response Rate
                        Bias Percent  Bias Percent     Bias Percent  Bias Percent
                            Mean         Median            Mean         Median
Social Demands             5.0853        0                7.2418        0
Cognitive Engagement      -2.3663       -1.7452           3.3697       -3.7037
Institutional Support     -1.2884       -3.2497           7.7202        6.8966
Social Engagement         -0.0435       -6.1461          26.0868      -26.0870
Note. 53% response rate is the 101 returned surveys/191 sent out. 33% response rate is the 101 returned surveys/307 population.
Using the 33% response rate indicates that generalizations of the data from the
Survey of Kalamazoo Promise Recipients at WMU Spring 2009 can be made in three of
the four subscales of the survey: Social Demands, Cognitive Engagement and
Institutional Support. Generalizations to the population cannot be made for the Social
Engagement subscale, however. This information can only be generalized to the sample
of those who answered the survey.
The only place where there was a difference found that should be noted is with
the modified bias ratio formula using the 33% response rate. This response rate is
theoretical, however, as the actual response rate of the surveys sent out was 53%. With
that said, generalizations from the survey can be made to the population with perhaps the
exception of the items on the Social Engagement subscale.
The results from the academic data obtained from the Office of Student Academic and Institutional Research, however, are based on the population of WMU Promise students, as data was obtained on all 307 of them. Therefore these results can be generalized to the population of WMU Kalamazoo Promise recipients. Very briefly, these results are:
•  Significant differences:
   o  White students have higher high school GPAs and higher ACT composite scores than Black students or Other students
   o  Students in the Other category had higher high school GPAs than Black students
   o  Persisters had higher high school GPAs than non-persisters
   o  Persisters had higher WMU GPAs than those on probation and non-persisters
   o  In the persister category, White students had the highest WMU GPA
   o  In the probation category, there was no difference by race
   o  In the non-persister category, White students had the lowest most recent WMU GPA
   o  Non-persisters took more remedial courses
   o  Persisters took more AP courses
   o  Students on probation took no AP courses
   o  Only 37.7% of WMU Promise students participated in FYE
   o  Half of students on probation and half of those who did not persist participated in the First Year Experience (FYE) program; only 30% of persisters participated in FYE
   o  Persisters took on average more classes per term and more classes the first year than those on probation or non-persisters
   o  White students took on average the fewest courses per term
•  No difference found:
   o  Between racial groups and WMU GPA
      -  Interesting because a difference was found for high school GPA and ACT composite scores
   o  Between persisters, those on probation, and non-persisters in:
      -  FTIAC cohort group
      -  ACT composite score
      -  Parental income
      -  Living in a dorm
      -  High school attended
         -  Good news for the high schools, as both produced the same number of persisters, those on probation, and non-persisters
   o  Between respondents, late respondents, and non-respondents on the average number of courses taken per term or the number of courses taken the first year
Some existing predictors known from a whole body of established research, such as parental income, living in a dorm, or ACT scores, are not functioning as research suggests they should; their effects are not replicated here. Perhaps the playing field has changed with the scholarship money in place. When financial issues are evened out, the findings of past research do not appear to hold true.
Future Research
Two types of future research are suggested here. The first set of suggestions is based
on the data obtained for this research that gave rise to more questions and thus the need
for additional research and data. The second set of suggestions for future research uses
the data already gathered for this project to examine questions beyond the scope of this
project.
•  Research using new data:
   o  A larger sample including ALL recipients from all 26 institutions and their college retention factors.
   o  Ideally, interviewing all recipients, regardless of whether they are attending, have attended, or are no longer attending any higher education institution, to determine their individual circumstances and to examine any developing trends.
   o  Investigating whether there is grading bias among racial groups at the high school level or higher education institution level.
   o  Examining high school student records in greater detail for information on high school courses taken and preparation for higher education. High school GPA obtained from the Office of Student Academic and Institutional Research at WMU does not indicate whether it is based on higher level or lower level classes taken in high school. Students who took harder classes could have received lower grades, but presumably would have a higher chance of persisting at a higher education institution than would students who took lower level courses in high school.
•  Research using the existing data from the survey and academic data:
   o  Non-response error:
      -  Examine each individual survey item and run it through the bias ratio formula of Groves and Couper (1998).
      -  Examine each individual variable from the academic data and run it through the bias ratio formula of Groves and Couper (1998).
   o  Compare self-reported data from the survey with the academic data obtained from the Office of Student Academic and Institutional Research.
Potential Implications
The implications of this dissertation are threefold. First, it offers the Kalamazoo Public Schools district, its community, and WMU suggestions on how these results could affect them. Second, it examines non-response bias and the implications of this type of analysis for small-scale research as well as for evaluation. Lastly, it examines retention factors and how they functioned here.
The issue for the Kalamazoo school district is that more minority and low-income students are staying in high school because of the Promise (Miron & Cullen, 2008). Whereas many of these low-income and minority students used to drop out of
high school, the evidence now suggests that they are staying in school with plans to
graduate. This will have implications for WMU as well, because a high proportion of
these students are likely to attend either KVCC or WMU. As more and more students
graduate who perhaps otherwise would not have without the incentive of the Promise,
more students will struggle with the requirements and demands of higher education.
Minority students—in particular Black students and those with low high school GPAs—
may need particular attention such as WMU provides with the Multicultural Affairs
office or the First Year Experience. Also, supports and services currently provided by an
array of community organizations offer examples that can be expanded to a larger scale
involving more stakeholders. All of the needs covered in Social, Cognitive and
Institutional factors that affect retention should, ideally, be addressed by community
groups, schools and higher education institutions.
The trends suggest that at least for the next couple of years, graduation or
completion rates are likely to go up, although a higher proportion of the students may be
less well prepared for attending a university. Although WMU is presently serving at-risk
and minority students very well, as this population increases WMU will need to better
prepare for meeting the needs of this special population in hopes of retaining them and
ensuring their success. Perhaps one option could be making the First Year Experience
mandatory for Promise students. This dissertation determined that minority students, in
particular Black students, had lower high school GPAs and lower ACT scores than White
students and, therefore, are in need of services provided by WMU, the community, and
KPS school district in order for them to persist at post-secondary education. Systematic
services need to be in place for these students, including not only academic services but
social services as well. These students are also likely to benefit from mentors and a go-to
person for questions and support.
Students who were in elementary or middle schools when the Promise was
announced will have more time to improve their performance in school before they reach
critical decisions that need to be taken in high school regarding the extent to which they
will take college preparatory classes. These students will have had much more time to
prepare and think about post-secondary education in terms of what they are doing in high school, whereas the first cohorts of Promise recipients did not have this time. It is
predicted that more students will take advanced placement classes in high school; as seen
in this study with this population, students who took AP courses persisted at a much
higher rate than those who did not take AP courses. Until then, however, additional
services and supports need to be provided to the minority and low-income students as
well as students who have low high school GPAs or are first-generation college
attendees. Forty percent of the WMU Promise students reported being first-generation
college students. WMU also needs to encourage students to take more courses per term
at WMU, as existing research indicates this as a factor of persistence and, in this study,
persisters took more courses than non-persisters or students who were on probation.
In addition to implications for KPS, the community and WMU to consider,
another implication of this dissertation is that it illustrates how non-response error and
non-response bias can be examined in small scale and relatively low budget research
studies and program evaluations. Determining the quality of outcomes has been and still
is of utmost importance in research and evaluation work. Assuming a representative
sample has been obtained, research and evaluators alike would like to generalize their
findings to the population and not just to the sample for which the data was obtained.
Even with a representative sample and a reasonably high response rate, response error
can still occur. Taking the extra steps needed to ensure that an indication of non-response
error and therefore possible response bias does not exist is a minimal measure to help
ensure the accuracy of the findings. This dissertation took extra steps to determine if the
results found here could be generalized to the population of the WMU Kalamazoo
Promise Scholarship recipients without the concern of response error and possible non-response bias. It was hoped that this non-response bias analysis would help contribute to
evaluation. As an example to researchers and evaluators, this dissertation illustrates how
this can be done, even with relatively small numbers and a limited budget.
In order to ensure the accurate use of research data, anyone making assumptions
from existing research needs to be concerned with the generalizations made, how these
were made, and if they are appropriate in the context in which they are used. When
stakes are high, it is better to take measures to ensure accuracy. Currently, non-response
error analysis is more commonly performed in large-scale research; small-scale research and program
evaluation have not seen the value in this type of analysis. In order to ensure the quality
and validity of the findings, this dissertation illustrated that it can easily be completed and
that it is imperative to examine non-response error in order to generalize with confidence.
One final implication that requires attention is the finding that some of the
established factors that typically predict persistence or non-persistence do not appear to
function with this population as the existing literature indicates they should. Cognitive
factors such as high school GPA, WMU GPA, and taking remedial or advanced
placement courses did function as expected. Other factors of retention—ACT composite
scores, parental income, living in a dorm, and First Year Experience provided by WMU,
which according to the literature would be expected to have an effect on persistence—did
not. Interestingly, most of these factors are Social factors or Institutional factors, which
were not found to differ among persisters, those on probation, and non-persisters. This is
a fascinating finding. Factors that normally have a relationship with persistence do not exhibit one here among scholarship recipients, which raises more questions than it answers. The factors that do seem to be associated with persistence among Promise scholarship recipients are academic ones; those that do not are the factors that deal with more than just academics. One possible explanation is that the funding and support provided by the
Kalamazoo Promise, along with the supports provided by WMU, KPS, and the
community, have mitigated or neutralized many of the factors that typically contribute to
non-persistence in higher education. This possible explanation has huge implications
and, of course, deserves further consideration in future research.
REFERENCES
Adelman, C. (1999). Answers in the toolbox: Academic intensity, attendance patterns,
and bachelor's degree attainment. Washington, D.C.: U.S. Department of
Education Office of Educational Research and Improvement.
Armstrong, S.J., & Overton, T.S. (1977). Estimating non-response bias in mail surveys.
Journal of Marketing Research, 14, 396-402.
Ary, D., Jacobs, L.C., & Razavieh, A. (1996). Introduction to research in education (5th ed.). Fort Worth, TX: Harcourt Brace College Publishers.
Astin, A.W. (1971). Predicting academic performance in college: Selectivity data for
2300 American colleges. New York: The Free Press.
Astin, A.W. (1975). Preventing students from dropping out. San Francisco: Jossey-Bass.
Astin, A.W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25, 297-308.
Astin, A. W. (1999). Student involvement: A developmental theory for higher education.
Journal of College Student Development, 40(5), 518-529.
Bean, J. P., & Eaton, S. B. (2000). A psychological model of college student retention. In
Braxton, J. M., (Ed.), Reworking the student departure puzzle. Nashville, TN:
Vanderbilt University Press.
Bean, J.P., & Metzner, B. (1985). A conceptual model of nontraditional undergraduate student attrition. Review of Educational Research, 55(4), 485-540.
Bonham, L. A. & Luckie, J.A.I. (1993). Taking a break in schooling: Why community
college students stop out. Community College Journal of Research and Practice
77(3), 257-70.
Braxton, J.M. (Ed.). (2004). Reworking the student departure puzzle. Nashville, TN:
Vanderbilt University.
Braxton, J.M., & Lien, L. A. (2004). The viability of academic integration as a central construct in Tinto's interactionalist theory of college student departure. In J. M. Braxton (Ed.), Reworking the student departure puzzle. Nashville, TN: Vanderbilt University Press.
Brick, J. M., Bose, W., & Bose, J. (2001, Aug. 5-9). Analysis of potential nonresponse bias. Proceedings of the Annual Meeting of the American Statistical Association. Rockville, MD.
Cabrera, A. F., Nora, A., & Castaneda, M.B. (1993). College persistence: Structural
equations modeling test of an integrated model of student retention. Journal of
Higher Education, 64(2), 123-139.
Cabrera, A. F., Stampen, J. O. & Hansen, W. L. (1990). Exploring the effects of ability to
pay on persistence in college. The Review of Higher Education, 13(3), 303- 336.
Carroll, J. (1988). Freshman retention and attrition factors at a predominantly Black
urban community college. Journal of College Student Development, 29, 52-59.
Cataldi, E.F., Laird, J., and KewalRamani, A. (2009). High School Dropout and
Completion Rates in the United States: 2007(NCES 2009-064). National Center
for Education Statistics, Institute of Education Sciences, U.S. Department of
Education. Washington, DC. Retrieved from
http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2009064
Chan, E. (2002). Patterns of full-time/part-time attendance and their effects on retention
and graduation. Showcase presented at the Association for Institutional Research
42nd Annual Forum, Toronto, Canada. Retrieved from
http://ocair.org/files/presentations/paper2002_03/poster2_EvaChan.pdf
Cohen, J., & Cohen, P. (1988). Applied multiple regression/correlation analysis for the
behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods
approaches. Thousand Oaks, CA: Sage Publications.
Creswell, J. W. (2005). Educational research: Planning, conducting, and evaluating
quantitative and qualitative research (2nd ed.). Upper Saddle River, NJ:
Merrill/Prentice Hall.
Cunningham, A.F. (2005). Changes in Patterns of Prices and Financial Aid (NCES
2006-153). U.S. Department of Education. Washington, DC: National Center for
Education Statistics.
De Heer, W., & De Leeuw, E. (2002). Trends in household survey nonresponse: A
longitudinal and international comparison. In R. Groves et al. (Eds.), Survey
nonresponse (pp. 41-54). New York: John Wiley.
Diem, K. G. (2004). Maximizing response rate and controlling non-response error
in survey research. Rutgers Cooperative Research & Extension, JAES, Rutgers,
The State University of New Jersey.
Eberts, R., & Kitchens, R. (2008). Communities investing in education and economic
development. Kalamazoo, MI. (pp. 1-40), Retrieved from:
www.upjohninst.org/promise/2008_promisenet_proceedings.pdf
Educational Policy Institute (EPI), (2007). Institutional Student Retention Assessment:
Program manual. Retrieved from:
http://www.isra-online.com/assets/ISRA_Manual.pdf
Feldman, M. J. (1993). Factors associated with one-year retention in a community
college. Research in Higher Education 34(4), 503-12.
Gall, M. D., Borg, W.R., & Gall, J.P. (1996). Educational research: An introduction (6th
ed.). White Plains, NY: Longman.
Gay, L. R., Mills, G. E., & Airasian, P. (2006). Educational research: Competencies for
analysis and application (8th ed.). Upper Saddle River, NJ: Merrill/Prentice Hall.
General Accounting Office (GAO). (1995). Higher education: Restructuring student aid
could reduce low-income student dropout rate. (GAO/HEHS-95-48). Washington
DC: U.S. Government Printing Office.
Grosset, J. M. (1991). Patterns of integration, commitment, and student characteristics
and retention among younger and older students. Research in Higher Education,
32(2), 159-78.
Groves, R. M., & Couper, M.P. (1998). Non-response in household interview surveys.
New York: Wiley and Sons.
Guloyan, E. V. (1986). An examination of white and non-white attitudes of university
freshmen as they relate to attrition. College Student Journal, 20, 396-402.
Hair, J.F., Jr., Anderson, R.E., Tatham, R. L., & Black, W. C. (1995). Multivariate data
analysis: With readings (4th ed., pp. 617-671). Englewood Cliffs, NJ: Prentice-Hall,
Inc.
Hagedorn, L. (2006). How to Define Retention: A New Look at an Old Problem. Los
Angeles: Transfer and Retention of Urban Community College Students, 26,
Dialog, ERIC, ED 493674.
Hayman, D., (2007). Rising above the gathering storm: Engineering undergraduate
student affairs 2007 report. College of Engineering, University of Illinois at
Chicago. Retrieved from:
http://www.namepa.org/region_c/programs/uic_undergraduate_student_affairs_20
07report.pdf
Jayson, S. (2009, January 7). Getting the most bang for your college buck. USA Today.
Retrieved from http://www.usatoday.com/news/education/2009-01-07-best-valuecolleges_N.htm
Johnson, R. B. & Onwuegbuzie, A., (2004, October). Mixed methods research: A
research paradigm whose time has come. Educational Researcher, 33 (7).
Retrieved from http://www.aera.net/uploadedFiles/Journals_and_Publications/
Journals/Educational_Researcher/Volume_33_No_7/03ERv33n7_Johnson.pdf
Jorth, R., (2009). Data from the Administrator of the Kalamazoo Promise Scholarship.
Kano, M., Franke, T., Afifi, A., & Bourque, L. (2008). Adequacy of reporting results of
school surveys and nonresponse effects: A review of the literature and a case
study. Educational Researcher, 37(8), 480-490.
Kaplan, R. M., & Saccuzzo, D. P. (2005). Psychological testing: Principles, applications,
and issues (5th ed.). Belmont, CA: Thomson-Wadsworth.
Kilpatrick, L. A., & Feeney, B. C. (2007). SPSS for Windows step by step: A simple guide
and reference 15.0 update (8th ed.). Pearson Education, Inc.
Lanni, J. C. (1993, March 18-22). The longitudinal student success study: The entering
student survey. Paper presented at the 17th Annual Meeting of the National
Association for Equal Opportunity in Higher Education, Washington, DC.
(ED350 017).
Levin, J. R. & Levin, M. E. (1991). A critical examination of academic retention
programs for at-risk minority college students. Journal of College Student
Development 32, 323-334.
Locke, L. F., Spirduso, W. W., & Silverman, S. J. (2000). Proposals that work (4th ed.).
Thousand Oaks, CA: Sage Publications.
Mertler, C. A., & Vanatta, R. A. (2005). Advanced and multivariate statistical methods
(3rd ed.). Glendale, CA: Pyrczak Publishing.
Miller-Adams, M. (2009). The power of a promise: Education and economic renewal in
Kalamazoo. Kalamazoo, MI: W.E. Upjohn Institute for Employment Research.
Miller, L. E., & Smith, K.L. (1983). Handling nonresponse issues. Journal of Extension,
21(5), 45-50.
Miron, G., & Cullen, A., (2008, October). Trends and patterns in student enrollment for
Kalamazoo public schools working paper #4. Western Michigan University: The
Evaluation Center. Retrieved from
http://www.wmich.edu/evalctr/promise/documents/WorkingPaper4.pdf
Miron, G., & Evergreen S. (2007, January). The Kalamazoo Promise as a catalyst for
change in an urban school district: A theoretical working paper #1. Western
Michigan University: The Evaluation Center. Retrieved from
http://www.wmich.edu/evalctr/promise/documents/WorkingPaper1.pdf
Miron, G., Spybrook, J., & Evergreen, S. (2008, April). Key findings from the 2007 survey
of high school students. Evaluation of the Kalamazoo Promise working paper #3.
Western Michigan University: The Evaluation Center. Retrieved from
http://www.wmich.edu/evalctr/promise/documents/WorkingPaper3.pdf
Moore, N. (1995). Persistence and attrition at San Juan college. Farmington, NM:
Office of Institutional Research, Grant Development, and Planning, San Juan College.
(ED 380 159).
Naretto, J. A. (1995). Adult student retention: The influence of internal and external
community. NASPA Journal, 32(2), 90-97.
National Center for Education Statistics (NCES). (June, 1998). First generation students:
Undergraduates whose parents never enrolled in postsecondary education. (U.S.
Department of Education Office of Educational Research and Improvement/
NCES 98-082). Washington DC: U.S. Government Printing Office.
National Court Reporters Association (NCRA). (May, 2006). Report education
commissioner: report to the members. Retrieved from:
http://www.ncraonline.org/NR/rdonlyres/9370CllD-8D57-41CE-BB883D94439AD3AC/0/rec_report.pdf
Panos, R. J. & Astin, A. W. (1968). Attrition among college students. American
Educational Research Journal, 5, 57-72.
Pascarella, E. T., & Terenzini, P. T. (1980). Predicting freshman persistence and
voluntary dropout decisions from a theoretical model. The Journal of Higher
Education, 51(1), 60-75.
Price, L. A. (1993). Characteristics of early student dropouts at Allegany Community
College and recommendations for early intervention. Cumberland, MD: Allegany
Community College, 1993. (ED 361 051).
Rendon, L. I., Jalomo, R. E., & Nora, A. (2004). Theoretical considerations in the study
of minority student retention in higher education. In J. M. Braxton (Ed.),
Reworking the student departure puzzle (pp. 127-156). Nashville, TN: Vanderbilt
University Press.
Rosenberg, K. M. (2007). The Excel statistics companion. Belmont, CA: Thomson Higher
Education.
Seidman, A. (Ed.). (2005). College student retention: Formula for student success.
Westport, CT: ACE/Praeger.
Smith, T. W. (1984). Estimating non-response bias with temporary refusals. Sociological
Perspectives, 27 (4), 473-489.
Spady, W. G. (1970). Dropouts from higher education: An interdisciplinary review and
synthesis. Interchange, 1(1), 64-85.
Sprinthall, R. C. (2007). Basic statistical analysis (8th ed.). Boston, MA: Pearson Allyn
& Bacon.
Stage, F. K. (1989). Motivation, academic and social integration, and the early dropout.
American Educational Research Journal, 26(3), 385-402.
St. John, E. P. (1990). Price response in persistence decisions: Analysis of the high
school and beyond senior cohort. Proceedings for the Seventh Annual Conference
of the NASSGP/NCHELP Research Network, 29-56. New Jersey Higher
Education Assistance Authority: Trenton.
St. John, E. P., Cabrera, A. F., Castaneda, M. B., Nora, A., & Asker, E. (2004). Economic
influences on persistence reconsidered: How can finance research inform the
reconceptualization of persistence models? In J. M. Braxton (Ed.), Reworking the
student departure puzzle (pp. 29-47). Nashville, TN: Vanderbilt University Press.
Stevenson, J. P. (2007). Applied multivariate statistics for the social sciences (5th ed.).
New York, NY: Routledge.
Stoecker, J., Pascarella, E., & Wolfe, L. (1988). Persistence in higher education: A 9-year
test of a theoretical model. Journal of College Student Development, 29, 196-209.
Swail, W. S. (2003, January 1). Retaining minority students in higher education: A
framework for success. ASHE-ERIC Higher Education Report. Jossey-Bass Higher
and Adult Education Series. (ERIC Document Reproduction Service No.
ED483024). Retrieved from ERIC database.
Terenzini, P. T., & Pascarella, E. T. (1980). Toward the validation of Tinto's model of
college student attrition: A review of recent studies. Research in Higher
Education, 12, 271-282.
Thomas, R. O. (1990). Programs and activities for improved retention. In Hossler, Bean,
and Associates, The strategic management of college enrollments (Chap. 11, pp.
186-201). San Francisco: Jossey-Bass.
Thorndike, R. L. (1967). Reliability. In D. N. Jackson & S. Messick (Eds.), Problems in
human assessment (pp. 201-214). New York, NY: McGraw-Hill Book Company.
Tinto, V. (1975). Dropouts from higher education: A theoretical synthesis of recent
research. Review of Educational Research, 45, 89-125.
Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition.
Chicago: University of Chicago Press.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition.
Chicago, IL: University of Chicago Press.
Trochim, W. M. K., & Donnelly, J. P. (2007). Research methods knowledge base (3rd ed.).
Mason, OH: Thomson.
U.S. Census Bureau. (2009). Kalamazoo County QuickFacts from the U.S. Census Bureau.
Washington, DC: Author. Retrieved from
http://quickfacts.census.gov/qfd/states/26/26077.html
Western Michigan University. (2009). A comprehensive report of retention rates. Office
of Student Academic and Institutional Research. Unpublished report.
Kalamazoo, MI.
Windham, P. (1994, August 1-3). The relative importance of selected factors to attrition
at public community colleges. Paper presented at the 23rd Annual Conference of
the Southeastern Association for Community Colleges, Savannah, Georgia. (ED
373 833).
Yung, K. (2007, August 20). Kalamazoo school chief leaves with promise secret intact.
Detroit Free Press. Retrieved from: http://www.usatoday.com/news/education/
2007-08-20-kalamazoo-promise_N.htm
Appendix A
Participant Paperwork
Survey of Promise Scholarship Recipients at WMU Spring 2009
(Note: This survey was formatted for use as an online survey, rather than a paper and
pencil survey.)
Please complete the following survey to help us understand the impact of the
Kalamazoo Promise on recent graduates of Kalamazoo Public Schools. Your responses
also will help us understand how improvements can be made to ensure that all students
will have the opportunity to benefit from this scholarship program and succeed in
college. The survey is part of a dissertation study on the factors related to success in
college. This is an anonymous survey, so please do NOT write your name or any
identifying information on the questionnaire. Thank you for your assistance.
BACKGROUND INFORMATION
1. Which high school did you attend?
• Loy Norrix   • Kalamazoo Central   • Phoenix
In which year did you graduate high school?
• 2006   • 2007   • 2008
2. In which month and year did you start studying at Western Michigan University? _
3. Did you begin college at Western Michigan University (WMU) or elsewhere?
• Started at WMU • Started elsewhere. Where?
4. If you graduated in 2006 or 2007, describe where you have gone to school or what you
did prior to the current school year.
5. What is your classification in college?
• Freshman/first-year
• Sophomore
• Junior
• Senior
• Unclassified
6. How many credit hours are you taking this term? ____ credits
7. Please estimate the total credit hours you have earned at WMU, not including those you
are taking this semester. ____ credits
8. When do you most frequently take classes? (Mark only ONE)
• Day classes (morning or afternoon)
• Evening classes
• Weekend classes
9. What degree(s) do you wish to pursue?
10. What is your current major?
11. Do you expect to enroll for an advanced degree when, or if, you complete your
undergraduate degree?
• No
• Yes
If yes, describe the next degree you wish to pursue.
12. Briefly describe your career goals.
13. When enrolled in high school, did you qualify for the free/reduced lunch program at
your school?
• No
• Yes
14. What is your gender? • Female • Male
15. What is your marital status? • not married • married • separated • divorced • widowed
16. What is your race/ethnicity? (Mark the one ethnic group with which you most identify)
• American Indian or other Native American
• Asian, Asian American, or Pacific Islander
• Black or African American
• White (non-Hispanic)
• Mexican or Mexican American
• Puerto Rican
• Other Hispanic or Latino
• Multiracial
• I prefer not to respond
• Other:
17. Is English your native (first) language?
• No
• Yes
18. Where do you live during the school year?
• Dormitory or other campus housing
• Residence (house, apartment, etc.) within walking distance of Western
• Residence (house, apartment, etc.) within driving distance
• Fraternity or sorority house
19. With whom do you live during the school year? (Fill in all that apply)
• No one, I live alone
• One or more other students
• My spouse or partner
• My child or children
• My parents
• Other relatives
• Friends who are not students at WMU
• Other people, who?
20. What is the highest level of education obtained by your father or male guardian, and by
your mother or female guardian? (Mark one response in each column)
• Not a high school graduate
• High school diploma or GED
• Some college, did not complete degree
• Associate degree
• Bachelor's degree
• Master's degree
• Doctorate degree and/or professional degree
• Unknown
21. Estimate your grade point average (GPA) in high school: ____ (Straight "A"s are
equivalent to a 4.0 GPA. A "B" average would be equal to 3.0 GPA. A "C" average would
be equal to 2.0 GPA.)
22. Estimate your grade point average (GPA), thus far at WMU: ____
23. Please rate your level of awareness about the Kalamazoo Promise.
(Not at all familiar ... Very familiar)
24. What additional information would you like to have regarding the Kalamazoo Promise?
25. How much tuition scholarship are you eligible for under the Kalamazoo Promise?
(Please fill in the blank) ____%
Length of Attendance....Benefit
K-12 ....100%
1-12 ....95%
2-12 ....95%
3-12 ....95%
4-12 ....90%
5-12 ....85%
26. About how many hours do you spend in a typical 7-day week doing each of the
following? (Hours per week: 0 / 1-5 / 6-10 / 11-20 / 21+)
a. Preparing for class (studying, reading, writing, rehearsing or other activities related to
your program)
b. Working for pay
c. Participating in college-sponsored activities (organizations, campus publications, student
government, intercollegiate or intramural sports, etc.)
d. Providing care for dependents living with you (parents, children, spouse, etc.)
e. Commuting to and from classes
27. If you have a job, how does it affect your school work?
• I don't have a job
• My job does not interfere with my school work
• My job takes some time from my school work
• My job takes a lot of time from my school work
28. How likely is it that the following issues would cause you to withdraw from class or
from WMU? (Mark the most appropriate response for each item, where 1=Not Likely &
5=Very Likely.)
a. Working full-time
b. Caring for dependents
c. Academically unprepared
d. Lack of finances
e. Don't fit in
f. Don't offer program of study that I want
29. Are you a member of a social fraternity or sorority? • No • Yes
30. Are you a student athlete on a team sponsored by WMU's athletics department? • No
• Yes
If yes, on what team(s) are you an athlete (i.e., football, swimming)?
31. Are you involved in student associations or organizations, if so please list:
32. How supportive are your friends of your attending WMU?
• Not Very
• Somewhat • Quite a bit • Extremely
33. How supportive is your immediate family of your attending WMU?
• Not Very
• Somewhat • Quite a bit • Extremely
34. Mark the box that best represents the quality of your relationships with people at
WMU. (Mark one box on each six-point scale.)
Your relationship with:
a. Other students: 1 = Unfriendly, unsupportive, sense of alienation ... 6 = Friendly,
supportive, sense of belonging
b. Instructors: 1 = Unavailable, unhelpful, unsympathetic ... 6 = Available, helpful,
sympathetic
c. Administrative personnel and office staff: 1 = Unhelpful, inconsiderate, rigid ...
6 = Helpful, considerate, flexible
35. In your experiences at WMU during the current school year, about how often have you
done each of the following? (Mark the most appropriate response for each item, where
1=Never & 5=Very Often.)
a. Asked questions in class or contributed to class discussions
b. Made a class presentation
c. Come to class without completing readings or assignments
d. Worked with classmates outside of class to prepare class assignments
e. Tutored or taught other students (paid or voluntary)
f. Participated in a community-based project as a part of a regular course
g. Used instant messaging to work on an assignment
h. Used e-mail to communicate with an instructor
i. Discussed grades or assignments with an instructor
j. Talked about career plans with an instructor or adviser
k. Discussed ideas from your readings or classes with instructors outside of class
l. Received prompt feedback (written or oral) from instructors on your performance
m. Worked harder than you thought you could to meet an instructor's standards or
expectations
n. Worked with instructors on activities other than coursework
o. Discussed ideas from your readings or classes with others outside of class (students,
family members, co-workers, etc.)
p. Had serious conversations with students who differ from you in terms of their religious
beliefs, political opinions, or personal values
q. Had serious conversations with students of different race or ethnic background than
your own
r. Skipped class
s. Included diverse perspectives (different races, religions, genders, political beliefs, etc.)
in class discussions or writing assignments
t. Put together ideas or concepts from different courses when completing assignments or
during class discussions
36. To what extent does WMU emphasize each of the following? (Mark the most appropriate
response for each item, where 1=Very Little & 5=Very Much.)
a. Spending significant amounts of time studying
b. Providing the support you need to help you succeed academically
c. Encouraging contact among students from different economic, social, and racial or
ethnic backgrounds
d. Helping you cope with your non-academic responsibilities (work, family, etc.)
e. Providing the support you need to thrive socially
f. Attending campus events and activities (special speakers, cultural performances, athletic
events, etc.)
g. Using computers in academic work
37. During the current school year, about how often have you done each of the
following? (Mark the most appropriate response for each item, where 1=Very Little &
5=Very Much.)
a. Attended an art exhibit, play, dance, music, theater, or other performance
b. Exercised or participated in physical fitness activities
c. Participated in activities to enhance your spirituality (worship, meditation, prayer, etc.)
d. Tried to better understand someone else's views by imagining how an issue looks from
his or her perspective
e. Learned something that changed the way you understand an issue or concept
38. Overall, how would you evaluate the quality of academic advising you have received at
WMU?
• Poor
• Fair
• Good
• Excellent
39. How would you evaluate your entire educational experience at WMU?
• Poor
• Fair
• Good
• Excellent
40. If you could start over again, would you still attend WMU?
• Definitely no
• Probably no
• Probably yes
• Definitely yes
41. Would you recommend WMU to a friend or family member?
• No
• Yes
42. What can WMU do better?
QUESTIONS ABOUT KALAMAZOO PUBLIC SCHOOLS
43. To what extent do you agree or disagree with the following statements about your high
school? (Mark the most appropriate response for each item, where 1=Strongly Disagree &
5=Strongly Agree; a "Don't Know" option was also provided.)

TEACHER-STUDENT RELATIONS
a. Teachers were patient when a student had trouble learning
b. Teachers made extra efforts to help students
c. Teachers understood and met the needs of each student
d. Teachers were fair to students

STUDENT ACADEMIC ORIENTATION
a. Students at my high school understood why they were in school
b. At my high school, students were interested in learning new things
c. Students at my high school had fun but also worked hard on their studies
d. Students at my high school worked hard to complete their school assignments

STUDENT ASPIRATIONS
a. Getting good grades in high school was important to me
b. I pushed myself in high school to do better academically
c. In high school, I believed that I could be successful
d. Going to college was important to my future

TEACHERS' EXPECTATIONS OF STUDENTS
a. My high school teachers believed that I would graduate from high school
b. My high school teachers believed that I would succeed in college
c. My high school teachers had high expectations of me in class
d. I had a teacher who was a positive role model for me

GUIDANCE/COLLEGE READINESS
a. Teachers or counselors encouraged students to think about ...
b. Teachers or counselors helped students plan for future ...
c. Teachers or counselors helped students with personal ...
d. Students at my high school could get help and advice from ...
e. I received the assistance I needed to go to college
f. My high school prepared me well for my future
44. What changes could be made by Kalamazoo Public Schools to better prepare students
for college and other post-secondary options?
QUESTIONS ABOUT THE KALAMAZOO PROMISE
45. To what extent do you agree or disagree with the following statements regarding the
Kalamazoo Promise? (Mark the most appropriate response for each item, where 1=Strongly
Disagree & 5=Strongly Agree.)
• Teachers and/or school staff at my high school spoke to me about the Kalamazoo Promise
• My parents/guardians have spoken with me about the Kalamazoo Promise
• My parents/guardians encouraged me to work harder in school because of the Promise
• The Kalamazoo Promise gives me more flexibility about which college ...
• The Promise hasn't really made a difference to my educational goals or plans
• I changed my career goals because of the Kalamazoo Promise
• I worked harder in high school because I knew that the Promise would pay for college
• I wanted to go to college even before the announcement of the Kalamazoo Promise
• I was confident before the Promise that I could afford to go to college
• I wasn't sure that I could afford college before the Promise. I didn't ...
• I still am not sure if I can afford college, because I am not eligible for ...
CHANGES DUE TO THE KALAMAZOO PROMISE
46. To what extent do you agree or disagree with the following statements? (Mark the most
appropriate response for each item, where 1=Strongly Disagree & 5=Strongly Agree.)
a. My attendance in high school improved
b. More academic support was provided after school
c. My school started offering more college prep courses
d. I enrolled in more college prep courses
e. Teachers expected that more students would go to college
f. The amount of homework increased
g. Students became better behaved and were getting into less trouble
h. More information was provided about higher education opportunities
i. My peers were more motivated to succeed in school
j. I talked about college more often with peers
k. The quality of student academic performance improved
l. More support from community organizations was provided to students and families
47. Describe changes in your family as a result of the Kalamazoo Promise
48. How has the Kalamazoo Promise changed your life?
Thank you for your assistance with this survey!
Interview Protocol - Kalamazoo Promise Students
Hi, I am (your name), thank you so much for coming today for this interview. Let's go
over the study information sheet and the consent form first and then we can start the
interview.
(Go over informed consent, giving participant a copy to keep and a copy to sign.)
Project: Kalamazoo Promise Scholarship Recipients: A Comparative Analysis of Higher
Education Retention and Non-response Bias
Date:
Time of Interview:
Location:
Interviewer:
Interviewee:
Semi-Structured Questions:
Background
Name:
Age:
Gender:
Race:
Last School:
1.
Are you currently attending college?
a. If yes, where?
b. If no, do you plan to?
i. If yes, how, where, when?
ii. If no, why not?
[Note: Participants who indicate that they have left WMU should answer questions 2-4.
Those that are still at WMU should skip to question 5.]
2. What were the circumstances that influenced your decision to leave WMU?
3. Is there anything someone could have done to help you stay in college?
4. Did you know that you could return to college and still use the Kalamazoo
Promise Scholarship?
5. Where do you get information regarding the Kalamazoo Promise?
6. Do you have a support system around you? Please describe this support system.
Cognitive Factors
7. Do you (or did you) enjoy college?
a. What did you enjoy the most?
b. What did you enjoy the least?
8. I am going to ask you to rate the difficulty level of various things on a scale of 1 to 5,
where a 1 means "very difficult" and a 5 means "very easy." A 3 would be in
the middle, which would indicate that it was just right. [Show illustration of the
scale to the interviewee.] You are welcome to provide examples or explain the
reasoning for the rating you give. [Ask for examples when the score is 1 or 5.]
a. How would you rate course work at WMU?
b. How would you rate course work at your high school?
i. [If there is a difference ask them to explain]
c. For you studying is...
d. For you learning is...
Social Factors
e. Making new friends at college is/was...
9. Did getting the Kalamazoo Promise Scholarship solve your financial issues as
they relate to college expense? Please explain.
10. How do you feel about learning?
11. Are/were you involved in any extra curricular activities at WMU? Please explain.
12. Are you involved in your community? Please explain.
13. Do you have goals? If so, can you share them.
14. What does your family think about your decision to attend (not attend) WMU?
15. What do your friends think about your decision to attend (not attend) WMU?
16. Please finish the following sentences...
a. On the weekend I like to...
b. After a long day at work I like to...
c. My friends are the...
d. My family always...
e. If I could do anything I wanted I would...
Institutional Factors
17. Did WMU provide you with the services that you needed? Please explain.
18. Did you attend an orientation?
19. How do you feel about the admissions process at WMU?
20. How do you feel about the instructors at WMU?
21. How do you feel about the classes you took at WMU?
22. How do you feel about the programs offered at WMU?
23. Is there anything WMU could do better? Please explain.
Kalamazoo Promise
24. What changes would you suggest to KPS to help more students get what they
need so they can be prepared for college? (Should they be doing anything
differently?)
a. Academically
b. Socially
c. Other support (health, counseling, mentoring, tutoring, technology)
25. Is there anything else you are thinking about with the Promise that I haven't asked
you about yet?
This completes the interview. Do you have any follow-up questions or comments?
Thank you for participating in this interview. Remember, if you have any follow-up
comments, concerns, or questions, please contact me or Dr. Miron. Our contact
information is on the consent form.
Give participant their phone card and thank them again.
Interview Rating Scale
Rating: Difficulty Level
[Illustration of the 1-5 rating scale shown to interviewees: Very Difficult (1), No Change (3),
Very Easy (5)]
Date: March 18, 2009
To:
Gary Miron, Principal Investigator
Michelle Ann Bakerson, Student Investigator for dissertation
From: Amy Naugle, Ph.D., Chair
Re:
HSIRB Project Number: 09-03-10
This letter will serve as confirmation that your research project entitled "Kalamazoo
Promise Scholarship Recipients: A Comparative Analysis of Higher Education Retention
and Nonresponse Bias" has been approved under the expedited category of review by
the Human Subjects Institutional Review Board. The conditions and duration of this
approval are specified in the Policies of Western Michigan University. You may now
begin to implement the research as described in the application.
Please note that you may only conduct this research exactly in the form it was approved.
You must seek specific board approval for any changes in this project. You must also
seek reapproval if the project extends beyond the termination date noted below. In
addition if there are any unanticipated adverse reactions or unanticipated events
associated with the conduct of this research, you should immediately suspend the project
and contact the Chair of the HSIRB for consultation.
The Board wishes you success in the pursuit of your research goals.
Approval Termination:
March 18, 2010
Study Information Sheet/Consent Form for Online Survey
Western Michigan University
Department of Educational Leadership, Research and Technology
Dr. Gary Miron, Principal Investigator
Michelle Ann Bakerson, Student Investigator
Title: Kalamazoo Promise Scholarship Recipients: A Comparative Analysis of Higher
Education Retention and Non-Response Bias
You are invited to participate in a study entitled, "Kalamazoo Promise Scholarship
Recipients: A Comparative Analysis of Higher Education Retention and Non-Response Bias."
The study is being conducted by Michelle Ann Bakerson, a doctoral student in the
Evaluation, Measurement and Research doctoral program at Western Michigan
University, under the direction of Dr. Gary Miron, her dissertation chair.
The following information is being provided for you to decide whether you wish
to participate in this study as well as to inform you that you are free to decide not to
participate in it, or to withdraw at any time, without affecting your relationship with the
researchers, Western Michigan University or the Kalamazoo Promise Scholarship.
The purpose of the study is to examine retention factors of Kalamazoo Promise
recipients who are attending and who have attended Western Michigan University
(WMU) to determine if there is a difference between those students that WMU retained
and those it did not retain.
If you agree to participate you will be asked to complete an on-line survey
regarding your experiences in high school and at WMU. It should only take you about 20
minutes to complete and you will be one of approximately 300 subjects to participate.
Your name will not be associated with the research findings in any way, and your
identity as a participant will be known only to the researcher. The survey information
that you provide will be maintained online, secured with a password. Once the data are
aggregated with no identifying information, they will be stored in a locked file cabinet in
the residence of the researcher for a period of three years. At that time all data will be
destroyed. There are no known risks and/or discomforts associated with this study.
While there are no direct benefits to you from participating in the study, we hope to learn
about your academic experiences, and the quality of education and support provided by
WMU and KPS may improve based on these findings.
You also have the option to enter a random drawing to receive one of ten $20
WMU book store gift cards if you choose to fill out the survey. You can withdraw from
participating at any time and can skip any question you do not wish to answer. This will
not affect your opportunity to enter the drawing or your chances of receiving one of the
gift cards.
If you have any questions about this study, you may contact the primary researcher,
Michelle Ann Bakerson, M.A., at (269-362-1620) (office), (269-684-5566) (home), or by
e-mail at [email protected]. You may also contact the Dissertation Chair,
Gary Miron, Ph.D., (269-387-3883), Human Subjects Institutional Review Board
(269-387-8293) or the Vice President for Research (269-387-8298) if questions or problems
arise during the course of the study.
This consent document has been approved for use for one year by the Human Subjects
Institutional Review Board (HSIRB) as indicated by the stamped date and signature of
the board chair in the upper right corner. Do not participate in this study if the stamped
date is older than one year.
If you decide to participate and give your consent, please put a check in the box below
[this box will be online] and continue on to answer the questions of this electronic survey.
Study Information Sheet/Consent Form for Interview
Western Michigan University
Department of Educational Leadership, Research, and Technology
Dr. Gary Miron, Principal Investigator
Michelle Ann Bakerson, Student Investigator
Title: Kalamazoo Promise Scholarship Recipients: A Comparative Analysis of Higher Education
Retention and Non-Response Bias
You are invited to participate in a study about the "Kalamazoo Promise Scholarship
Recipients: A Comparative Analysis of Higher Education Retention and Non-Response Bias."
The study is being conducted by Michelle Ann Bakerson, a doctoral student in the
Evaluation, Measurement and Research doctoral program at Western Michigan
University, under the direction of Dr. Gary Miron, her dissertation chair.
The following information is being provided for you to decide whether you wish
to participate in this study as well as to inform you that you are free to decide not to
participate in it, or to withdraw at any time, without affecting your relationship with the
researchers, Western Michigan University or the Kalamazoo Promise Scholarship.
The purpose of the study is to examine retention factors of Kalamazoo Promise
recipients who are attending and who have attended Western Michigan University
(WMU) to determine if there is a difference between those students that WMU retained
and those they did not retain.
If you agree to participate you will be asked to participate in an in depth interview
regarding your experiences in high school and at WMU. This interview will be held on
WMU's campus and should take about 50 minutes to complete. You will be one of about
72 subjects to participate.
Digital audio recording equipment will be used to ensure accuracy of the
information received and written transcripts of all interviews will be produced. You may
request the interviewer to turn off the audio recorder at any time during the interview.
Your interview will be given a number for transcription. Only the interviewer will know
your name. Your name will not be on the transcription and your name will not be
reported in any way. There will be no way to identify any individual student.
Do not hesitate to ask any questions about the study either before participating or
during the time that you are participating. Your name will not be associated with the
research findings in any way, and your identity as a participant will be known only to the
researcher during the interview.
The digital recorders will be kept in a locked filing cabinet in the researcher's home until
transcribed. No one except the researcher will have access to these recorders. Once
transcription has been completed the digital interview recordings will be deleted
immediately. No record of the audio files will be kept once the transcription is
completed. The transcription will be kept in a locked filing cabinet in the doctoral
researcher's home. The consent forms will be kept in a manila envelope and locked in a
filing cabinet in the doctoral researcher's home as well. No one except the researcher
will have access to this cabinet. After the study's completion the data will be stored at
WMU in Dr. Miron's office in a locked filing cabinet. In three years these consent
forms, transcriptions and data will be disposed of by burning. There are no known risks
and/or discomforts associated with this study.
While there are no direct benefits to you from participating in the study we hope
to learn about your academic experiences.
As a thank you for participating you will be given a $20 WMU book store gift
card that can be used for books, merchandise, phone cards or beverages at the WMU
book store. You can withdraw from participating at any time and can skip any question
you do not wish to answer. This will not affect your receiving the gift card.
If you have any questions about this study, you may contact the primary
researcher, Michelle Ann Bakerson, M.A., at (269-362-1620) (office), (269-684-5566)
(home), or by e-mail at [email protected]. You may also contact the
Dissertation Chair, Gary Miron, Ph.D., (269-387-3883), Human Subjects Institutional
Review Board (269-387-8293) or the Vice President for Research (269-387-8298) if
questions or problems arise during the course of the study. This consent document has
been approved for use for one year by the Human Subjects Institutional Review Board
(HSIRB) as indicated by the stamped date and signature of the board chair in the upper
right corner. Do not participate in this study if the stamped date is older than one year. A
copy of this consent form will be given to you to keep for your own records.
Participant
Date
Interviewer/Researcher
Date
Introduction Survey E-mail Protocol:
Kalamazoo Promise Recipients
Hi "Name of Student",
My name is Michelle Bakerson and I am a doctoral student at WMU. I am
conducting a study on Kalamazoo Promise Scholarship Recipients. I am a doctoral student in
the Evaluation, Measurement and Research doctoral program at the College of Education.
I would love for you to click on the following link (http:
) to complete a brief
survey so I can gather information about recipients of the Kalamazoo Promise who are
attending or have attended WMU. You of course are free to participate or not. The
choice you make will not affect your relationship with the researcher, Western Michigan
University or the Kalamazoo Promise Scholarship at all.
If you are interested in learning more, you will be given more information about
the study and your rights before you take the survey. The survey needs to be completed
by April 10th and will only take about 20 minutes to fill out. At the end of the survey you
will be given the opportunity to enter a drawing to receive one of ten WMU book store
gift cards worth $20 that can be used for anything at the WMU book store, such as:
books, magazines, merchandise, phone cards or beverages at the WMU book store.
If you have any questions about this study, you may contact me at (269-362-1620)
(office), (269-684-5566) (home), or by e-mail at [email protected].
Thank you so much for your time and I look forward to hearing from you.
Thank you,
Michelle Bakerson
Doctoral Student
WMU
First Reminder Email to Take Survey
Hi "Name of Student",
My name is Michelle Bakerson and I am a doctoral student at WMU. I sent you
an e-mail last week asking for your participation in a research project I am doing to finish
my graduate degree. Your participation is important to my research and won't take much
time at all. Please consider going to the following link (http:
) and taking a brief
survey about Kalamazoo Promise Scholarship Recipients.
You of course are free to participate or not. The choice you make will not affect
your relationship with the researcher, Western Michigan University or the Kalamazoo
Promise Scholarship at all.
If you are interested in learning more, you will be given more information about
the study and your rights before you take the survey. The survey needs to be completed
by April 10th and will only take about 20 minutes to fill out. At the end of the survey you
will be given the opportunity to enter a drawing to receive one of ten WMU book store
gift cards worth $20 that can be used for anything at the WMU book store, such as:
books, magazines, merchandise, phone cards or beverages at the WMU book store.
If you have any questions about this study, you may contact me at (269-362-1620)
(office), (269-684-5566) (home), or by e-mail at [email protected].
Thank you so much for your time and I look forward to hearing from you.
Thank you,
Michelle Bakerson
Doctoral Student
WMU
Second Reminder Email to Take Survey
Hi "Name of Student",
It's me again. My name is Michelle Bakerson and I am a doctoral student at
WMU. I have sent you a couple of e-mail requests to complete a survey regarding
Kalamazoo Promise Recipients. Please consider completing this quick survey at the
following link (http:
). Your help is greatly appreciated.
You of course are free to participate or not. The choice you make will not affect
your relationship with the researcher, Western Michigan University or the Kalamazoo
Promise Scholarship at all.
If you are interested in learning more, you will be given more information about
the study and your rights before you take the survey. The survey needs to be completed
by April 10th and will only take about 20 minutes to fill out. At the end of the survey you
will be given the opportunity to enter a drawing to receive one of ten WMU book store
gift cards that can be used for anything at the WMU book store, such as: books,
magazines, merchandise, phone cards or beverages at the WMU book store.
If you have any questions about this study, you may contact me at (269-362-1620)
(office), (269-684-5566) (home), or by e-mail at [email protected].
Thank you so much for your time and I look forward to hearing from you.
Thank you,
Michelle Bakerson
Doctoral Student
WMU
Interview E-mail Invitation
Hi "Name of Student",
My name is Michelle Bakerson and I am a student at WMU. I am conducting a
study on Kalamazoo Promise Scholarship Recipients. I am a doctoral student in the
Evaluation, Measurement and Research doctoral program at the College of Education.
I am inviting you to learn more about this project. If after this information you
are still interested and agree to participate I would like to invite you to an interview so
that I can gather information about recipients of the Kalamazoo Promise who are
attending or have attended WMU. You of course are free to participate or not. The
choice you make will not affect your relationship with the researcher, Western Michigan
University or the Kalamazoo Promise Scholarship at all.
If you are interested in learning more, you will be given more information about
the study and your rights before you complete the interview. Your name will never be
used in reporting any information. All information will be reported aggregately only.
Your name and participation will be held strictly confidential.
At the end of the interview you will be given a $20 WMU book store gift card
that can be used for anything at the WMU book store, such as: books, magazines,
merchandise, phone cards or beverages at the WMU book store as a thank you for your
time.
If you have any questions about this study or would like to schedule an interview,
you may contact me at (269-362-1620) (office), (269-684-5566) (home), or by e-mail at
[email protected].
Thank you so much for your time and I look forward to hearing from you.
Thank you,
Michelle Bakerson
Doctoral Student
WMU
Reminder E-Mail for Interview
Hello (student's name), this is Michelle Bakerson. I would like to express my
appreciation for your willingness to meet with me and learn more about the project for
my Kalamazoo Promise study. This message is a reminder of the meeting which will be
held on (date) at (time) in (location). If you decide to participate you will receive a
$20.00 WMU book store gift card as compensation for your participation. I look forward
to meeting with you. If you have any questions in the meantime, please e-mail me at
[email protected] or call me at 269-362-1620.
Also here are the directions to get to (location)...describe directions.
Reminder Phone Call for Interview
Hello (student's name), this is Michelle Bakerson. I would like to express my
appreciation for your willingness to meet with me and learn more about the project for
my Kalamazoo Promise study. This message is a reminder of the meeting which will be
held on (date) at (time) in (location).
Give directions for location.
If you decide to participate you will receive a $20.00 WMU book store gift card as
compensation for your participation. I look forward to meeting with you. If you have any
questions in the meantime, please e-mail me at [email protected] or call
me at 269-362-1620.
Appendix B
Cognitive, Social and Institutional Factors of Retention and Corresponding Survey Items,
along with Academic Data Variable Names with Measurement Type and Cognitive,
Social and Institutional Factors of Retention and Corresponding Survey Items with
Subscales Identified.
Cognitive, Social and Institutional Factors of Retention and Corresponding Survey Items
Factor Name
Cognitive
Items
Estimate your grade point average (GPA) in high school.
Estimate your grade point average (GPA), thus far at WMU.
About how many hours do you spend in a typical 7-day week
preparing for class (studying, reading, writing, rehearsing or
other activities related to your program)?
About how many hours do you spend in a typical 7-day week
participating in college-sponsored activities (organizations,
campus publication, student government, intercollegiate or
intramural sports, etc)?
How likely is it that being academically unprepared would
cause you to withdraw from class or from WMU?
In your experience at WMU during the current school year,
about how often have you done each of the following?
o Asked questions in class or contributed to class discussions
o Made a class presentation
o Come to class without completing readings or assignments
o Worked with classmates outside of class to prepare class assignments
o Tutored or taught other students (paid or voluntary)
o Participated in a community-based project as a part of a regular course
o Used instant messaging to work on an assignment
o Used e-mail to communicate with an instructor
o Discussed grades or assignments with an instructor
o Talked about career plans with an instructor or adviser
o Discussed ideas from your readings or classes with instructors outside of class
o Received prompt feedback (written or oral) from instructors on your performance
o Worked harder than you thought you could to meet an instructor's standards or
expectations
o Worked with instructors on activities other than coursework
o Discussed ideas from your readings or classes with others outside of class (students,
family members, co-workers, etc.)
o Had serious conversations with students who differ from you in terms of their religious
beliefs, political opinions, or personal values
o Had serious conversations with students of different race or ethnic background than
your own
o Skipped class
o Included diverse perspectives (different races, religions, genders, political beliefs, etc.)
in class discussions or writing assignments
o Put together ideas or concepts from different courses when completing assignments or
during class discussions
Social
What degree(s) do you wish to pursue?
What is your current major?
Do you expect to enroll for an advanced degree when, or if, you
complete your undergraduate degree?
Briefly describe your career goals.
When enrolled in high school, did you qualify for the
free/reduced lunch program at your school?
Where do you live during the school year?
With whom do you live during the school year?
What is the highest level of education obtained by your father
or male guardian?
What is the highest level of education obtained by your mother
or female guardian?
About how many hours do you spend in a typical 7-day week
working for pay?
About how many hours do you spend in a typical 7-day week
providing care for dependents living with you (parents,
children, spouse, etc.)?
About how many hours do you spend in a typical 7-day week
commuting to and from class?
If you have a job, how does it affect your school work?
How likely is it that working full-time would cause you to
withdraw from class or from WMU?
How likely is it that caring for dependents would cause you to
withdraw from class or from WMU?
How likely is it that not fitting in would cause you to withdraw from class or from WMU?
How likely is it that the lack of finances would cause you to withdraw from class or from
WMU?
Are you a member of a social fraternity or sorority?
Are you a student athlete on a team sponsored by WMU's
athletics department?
Are you involved in student associations or organizations?
How supportive are your friends of your attending WMU?
How supportive is your immediate family of your attending WMU?
What is the quality of your relationship with other students?
Institutional
Please rate your level of awareness about the Kalamazoo
Promise.
What additional information would you like to have regarding
the Kalamazoo Promise?
How much tuition scholarship are you eligible for under the
Kalamazoo Promise?
How likely is it that not offering the program of study would
cause you to withdraw from class or from WMU?
What is the quality of relationship with instructors at WMU?
What is the quality of relationship with the administrative
personnel and office staff at
WMU?
To what extent does WMU emphasize:
o Spending significant amounts of time studying?
o Providing the support you need to help you succeed
academically?
o Encouraging contact among students from different
economic, social, and racial or ethnic backgrounds?
o Helping you cope with your non-academic
responsibilities (work, family, etc.)?
o Providing the support you need to thrive socially?
o Attending campus events and activities (special
speakers, cultural performances, athletic events, etc.)?
o Using computers in academic work?
During the current school year, about how often have you done
each of the following?
o Attended an art exhibit, play, dance, music, theater, or
other performance
o Exercised or participated in physical fitness activities
o Participated in activities to enhance your spirituality
(worship, meditation, prayer, etc.)
o Tried to better understand someone else's views by
imagining how an issue looks from his or her
perspective
o Learned something that changed the way you
understand an issue or concept
Overall, how would you evaluate the quality of academic
advising you have received at WMU?
How would you evaluate your entire educational experience at
WMU?
If you could start over again, would you still attend WMU?
Would you recommend WMU to a friend or family member?
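To make the factor-to-item mapping above concrete, the short sketch below (not part of the
original study) illustrates one way Likert-type responses to these survey items could be
grouped and averaged into factor or subscale scores once the data are in a statistical
package. The item codes (C6, C7, C13, S14, S15) anticipate the subscale labels listed
further below; the specific columns, example values, and the choice of a simple item mean
are illustrative assumptions only.

    # A minimal, hypothetical sketch of subscale scoring; column names and data
    # are assumptions for illustration, not the dissertation's actual data file.
    import pandas as pd

    # Two fictitious respondents' 1-5 ratings on a handful of survey items.
    responses = pd.DataFrame({
        "C6": [4, 2],   # Asked questions in class or contributed to discussions
        "C7": [3, 1],   # Made a class presentation
        "C13": [5, 4],  # Used e-mail to communicate with an instructor
        "S14": [1, 3],  # Likelihood that working full-time would cause withdrawal
        "S15": [1, 2],  # Likelihood that caring for dependents would cause withdrawal
    })

    subscales = {
        "cognitive_engagement": ["C6", "C7", "C13"],
        "social_demands": ["S14", "S15"],
    }

    # One score per respondent: the mean of the items in each subscale.
    for name, items in subscales.items():
        responses[name] = responses[items].mean(axis=1)

    print(responses[["cognitive_engagement", "social_demands"]])

A mean of item ratings is only one plausible summary; the study itself may have analyzed
items individually or used other composites.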
Academic Data Variable Names with Measurement Type

Variable Name              Measurement
StudentID                  Nominal
FirstPromiseSemester       Nominal
FTIACCohort                Nominal
DualEnrolled               Nominal
HSGPA                      Scale
HSGradDate                 Nominal
HSName                     Nominal
ACTComposite               Scale
Gender                     Nominal
Race                       Nominal
TransHrsPass               Scale
Probation                  Nominal
MostRecentWMUGPA           Scale
MostRecentGPASemester      Nominal
RemedialMath               Nominal
RemedialReading            Nominal
RemedialWriting            Nominal
ParentsAGI                 Scale
Athelete                   Nominal
Dorm                       Nominal
APCredit                   Nominal
FYE                        Nominal
GPA200630SumII             Scale
GPA200640Fall              Scale
GPA200710Spring            Scale
GPA200720SumI              Scale
GPA200730SumII             Scale
GPA200740Fall              Scale
GPA200810Spring            Scale
GPA200820SumI              Scale
GPA200830SumII             Scale
GPA200840Fall              Scale
GPA200910Spring            Scale
Group                      Nominal
DateResponded              Nominal
GroupByResponse            Nominal
GroupByLateResponse        Nominal
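As an illustration of how the nominal/scale measurement levels in the table above might be
carried into analysis software, the sketch below reads a hypothetical extract of the academic
data and casts nominal variables to categorical types and scale variables to numeric types.
The file name "academic_data.csv" and the column subset are assumptions for illustration;
they are not the study's actual data files.

    # Hypothetical sketch: applying the nominal/scale measurement types when
    # loading the academic data. File name and column subset are illustrative.
    import pandas as pd

    nominal_vars = ["StudentID", "FTIACCohort", "Gender", "Race", "Probation", "Group"]
    scale_vars = ["HSGPA", "ACTComposite", "MostRecentWMUGPA", "ParentsAGI"]

    df = pd.read_csv("academic_data.csv")

    # Nominal variables become unordered categories; scale variables become numeric.
    df[nominal_vars] = df[nominal_vars].astype("category")
    df[scale_vars] = df[scale_vars].apply(pd.to_numeric, errors="coerce")

    # Confirm each variable ended up with the intended measurement level.
    print(df.dtypes[nominal_vars + scale_vars])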
Cognitive, Social and Institutional Factors of Retention and Corresponding Survey Items
with Subscales Identified.
Factor Name
Cognitive
Items
C1 Estimate your grade point average (GPA) in high school.
C2 Estimate your grade point average (GPA), thus far at WMU.
C3 About how many hours do you spend in a typical 7-day week
preparing for class (studying, reading, writing, rehearsing or other
activities related to your program)?
C4 About how many hours do you spend in a typical 7-day week
participating in college-sponsored activities (organizations, campus
publication, student government, intercollegiate or intramural sports,
etc)?
C5 How likely is it that being academically unprepared would cause
you to withdraw from class or from WMU?
In your experience at WMU during the current school year, about how
often have you done each of the following? Cognitive Engagement
Subscale
o C6 Asked questions in class or contributed to class discussions
o C7 Made a class presentation
o C8 Come to class without completing readings or assignments
o C9 Worked with classmates outside of class to prepare class
assignments
o C10 Tutored or taught other students (paid or voluntary)
o C11 Participated in a community-based project as a part of a regular course
o C12 Used instant messaging to work on an assignment
o C13 Used e-mail to communicate with an instructor
o C14 Discussed grades or assignments with an instructor
o C15 Talked about career plans with an instructor or adviser
o C16 Discussed ideas from your readings or classes with instructors outside of class
o C17 Received prompt feedback (written or oral) from instructors on your performance
o C18 Worked harder than you thought you could to meet an instructor's standards or
expectations
o C19 Worked with instructors on activities other than coursework
o C20 Discussed ideas from your readings or classes with others outside of class
(students, family members, co-workers, etc.)
o C21 Had serious conversations with students who differ from you
in terms of their religious beliefs, political opinions, or personal
values
o C22 Had serious conversations with students of different race or
ethnic background than your own
o C23 Skipped class
o C24 Included diverse perspectives (different races, religions,
genders, political beliefs, etc.) in class discussions or writing
assignments
o C25 Put together ideas or concepts from different courses when
completing assignments or during class discussions
Social
• S1 What degree(s) do you wish to pursue?
• S2 What is your current major?
• S3 Do you expect to enroll for an advanced degree when, or if, you complete your undergraduate degree?
• S4 Briefly describe your career goals.
• S5 When enrolled in high school, did you qualify for the free/reduced lunch program at your school?
• S6 Where do you live during the school year?
• S7 With whom do you live during the school year?
• S8 What is the highest level of education obtained by your father or male guardian?
• S9 What is the highest level of education obtained by your mother or female guardian?
• S10 About how many hours do you spend in a typical 7-day week working for pay?
• S11 About how many hours do you spend in a typical 7-day week providing care for dependents living with you (parents, children, spouse, etc.)?
• S12 About how many hours do you spend in a typical 7-day week commuting to and from class?
• S13 If you have a job, how does it affect your school work?
Social Demands Subscale
o S14 How likely is it that working full-time would cause you to withdraw from class or from WMU?
o S15 How likely is it that caring for dependents would cause you to withdraw from class or from WMU?
o S16 How likely is it that not fitting in would cause you to withdraw from class or from WMU?
o S17 How likely is it that the lack of finances would cause you to withdraw from class or from WMU?
• S18 Are you a member of a social fraternity or sorority?
• S19 Are you a student athlete on a team sponsored by WMU's athletics department?
• S20 Are you involved in student associations or organizations?
• S21 How supportive are your friends of your attending WMU?
• S22 How supportive is your immediate family of your attending WMU?
• S23 What is the quality of your relationship with other students?
Institutional
• I1 Please rate your level of awareness about the Kalamazoo Promise.
• I2 What additional information would you like to have regarding the Kalamazoo Promise?
• I3 How much tuition scholarship are you eligible for under the Kalamazoo Promise?
• I4 How likely is it that not offering the program of study would cause you to withdraw from class or from WMU?
• I5 What is the quality of relationship with instructors at WMU?
• I6 What is the quality of relationship with the administrative personnel and office staff at WMU?
• To what extent does WMU emphasize: (Institutional Support Subscale)
o I17 Spending significant amounts of time studying?
o I18 Providing the support you need to help you succeed academically?
o I19 Encouraging contact among students from different economic, social, and racial or ethnic backgrounds?
o I20 Helping you cope with your non-academic responsibilities (work, family, etc.)?
o I21 Providing the support you need to thrive socially?
o I22 Attending campus events and activities (special speakers, cultural performances, athletic events, etc.)?
o I23 Using computers in academic work?
• During the current school year, about how often have you done each of the following? (Social Engagement Subscale)
o I24 Attended an art exhibit, play, dance, music, theater, or other performance
o I25 Exercised or participated in physical fitness activities
o I26 Participated in activities to enhance your spirituality (worship, meditation, prayer, etc.)
o I27 Tried to better understand someone else's views by imagining how an issue looks from his or her perspective
o I28 Learned something that changed the way you understand an issue or concept
• I29 Overall, how would you evaluate the quality of academic advising you have received at WMU?
• I30 How would you evaluate your entire educational experience at WMU?
• I31 If you could start over again, would you still attend WMU?
• I32 Would you recommend WMU to a friend or family member?
Appendix C
Summary Demographic Data on Interval Level Data Across
Descriptive Statistics
Entries are Mean (Std. Deviation), n, by academic persistence group, race, and gender.

Academic Cognitive High School GPA
Persister, Race 1:       Male 3.5815 (.59654), 71;    Female 3.8051 (.36166), 59;    Total 3.6830 (.51419), 130
Persister, Race 2:       Male 2.6940 (1.16989), 15;   Female 3.1065 (.87016), 20;    Total 2.9297 (1.01469), 35
Persister, Race 3:       Male 3.5506 (.48484), 16;    Female 3.2147 (1.19125), 19;   Total 3.3683 (.94011), 35
Persister, Total:        Male 3.4462 (.75492), 102;   Female 3.5481 (.76941), 98;    Total 3.4961 (.76185), 200
On Probation, Race 1:    Male 3.3411 (.42700), 18;    Female 3.3127 (.42408), 11;    Total 3.3303 (.41848), 29
On Probation, Race 2:    Male 2.7275 (.27035), 4;     Female 2.9178 (.38986), 9;     Total 2.8592 (.35771), 13
On Probation, Race 3:    Male 2.9143 (.34708), 7;     Female 3.2450 (.19092), 2;     Total 2.9878 (.34084), 9
On Probation, Total:     Male 3.1534 (.45519), 29;    Female 3.1450 (.42789), 22;    Total 3.1498 (.43926), 51
Non-persister, Race 1:   Male 3.2231 (.28922), 16;    Female 3.2500 (.41689), 8;     Total 3.2321 (.32805), 24
Non-persister, Race 2:   Male 2.8964 (.29354), 11;    Female 2.5650 (1.29157), 6;    Total 2.7794 (.77575), 17
Non-persister, Race 3:   Male 2.8560 (.40321), 5;     Female 3.3467 (.62083), 3;     Total 3.0400 (.51722), 8
Non-persister, Total:    Male 3.0534 (.34518), 32;    Female 3.0253 (.87726), 17;    Total 3.0437 (.57763), 49
All groups, Race 1:      Male 3.4857 (.54944), 105;   Female 3.6787 (.43376), 78;    Total 3.5680 (.51114), 183
All groups, Race 2:      Male 2.7727 (.84100), 30;    Female 2.9651 (.86303), 35;    Total 2.8763 (.85178), 65
All groups, Race 3:      Male 3.2675 (.54098), 28;    Female 3.2337 (1.07128), 24;   Total 3.2519 (.82024), 52
All groups, Total:       Male 3.3170 (.66554), 163;   Female 3.4185 (.76457), 137;   Total 3.3633 (.71305), 300

Academic Cognitive Most Recent Western Michigan University GPA
Persister, Race 1:       Male 3.1145 (.49243), 71;    Female 3.1578 (.52141), 59;    Total 3.1342 (.50427), 130
Persister, Race 2:       Male 2.6513 (.40438), 15;    Female 3.0005 (.38093), 20;    Total 2.8509 (.42327), 35
Persister, Race 3:       Male 3.0463 (.43124), 16;    Female 2.9237 (.56227), 19;    Total 2.9797 (.50325), 35
Persister, Total:        Male 3.0357 (.49464), 102;   Female 3.0803 (.50934), 98;    Total 3.0576 (.50113), 200
On Probation, Race 1:    Male 1.7906 (.58799), 18;    Female .9482 (.85781), 11;     Total 1.4710 (.80358), 29
On Probation, Race 2:    Male 1.9700 (.16207), 4;     Female 1.9233 (.58402), 9;     Total 1.9377 (.48420), 13
On Probation, Race 3:    Male 1.6329 (.87036), 7;     Female 1.9450 (.12021), 2;     Total 1.7022 (.76739), 9
On Probation, Total:     Male 1.7772 (.62104), 29;    Female 1.4377 (.85564), 22;    Total 1.6308 (.74318), 51
Non-persister, Race 1:   Male .9613 (.71620), 16;     Female 1.0038 (.58977), 8;     Total .9754 (.66393), 24
Non-persister, Race 2:   Male 1.2745 (.41972), 11;    Female 1.3983 (.30407), 6;     Total 1.3182 (.37778), 17
Non-persister, Race 3:   Male 1.0020 (.31027), 5;     Female 1.6300 (.45398), 3;     Total 1.2375 (.46855), 8
Non-persister, Total:    Male 1.0753 (.58234), 32;    Female 1.2535 (.52198), 17;    Total 1.1371 (.56319), 49
All groups, Race 1:      Male 2.5594 (.99966), 105;   Female 2.6253 (1.10646), 78;   Total 2.5875 (1.04406), 183
All groups, Race 2:      Male 2.0557 (.74725), 30;    Female 2.4489 (.78855), 35;    Total 2.2674 (.78891), 65
All groups, Race 3:      Male 2.3279 (1.02104), 28;   Female 2.6804 (.71120), 24;    Total 2.4906 (.90085), 52
All groups, Total:       Male 2.4269 (.97665), 163;   Female 2.5899 (.97026), 137;   Total 2.5013 (.97551), 300

Academic Cognitive ACT Composite Score
Persister, Race 1:       Male 23.04 (4.695), 71;      Female 21.46 (5.361), 59;      Total 22.32 (5.050), 130
Persister, Race 2:       Male 17.00 (5.529), 15;      Female 17.00 (4.611), 20;      Total 17.00 (4.947), 35
Persister, Race 3:       Male 21.38 (7.228), 16;      Female 15.95 (7.884), 19;      Total 18.43 (7.968), 35
Persister, Total:        Male 21.89 (5.639), 102;     Female 19.48 (6.243), 98;      Total 20.71 (6.050), 200
On Probation, Race 1:    Male 20.78 (4.152), 18;      Female 19.91 (2.879), 11;      Total 20.45 (3.690), 29
On Probation, Race 2:    Male 20.00 (4.082), 4;       Female 16.56 (1.424), 9;       Total 17.62 (2.873), 13
On Probation, Race 3:    Male 19.29 (3.684), 7;       Female 18.50 (2.121), 2;       Total 19.11 (3.296), 9
On Probation, Total:     Male 20.31 (3.947), 29;      Female 18.41 (2.754), 22;      Total 19.49 (3.580), 51
Non-persister, Race 1:   Male 21.50 (2.757), 16;      Female 19.25 (2.121), 8;       Total 20.75 (2.739), 24
Non-persister, Race 2:   Male 16.82 (1.662), 11;      Female 16.83 (2.483), 6;       Total 16.82 (1.912), 17
Non-persister, Race 3:   Male 19.80 (2.588), 5;       Female 18.33 (3.215), 3;       Total 19.25 (2.712), 8
Non-persister, Total:    Male 19.63 (3.170), 32;      Female 18.24 (2.538), 17;      Total 19.14 (3.014), 49
All groups, Race 1:      Male 22.42 (4.428), 105;     Female 21.01 (4.876), 78;      Total 21.82 (4.664), 183
All groups, Race 2:      Male 17.33 (4.310), 30;      Female 16.86 (3.647), 35;      Total 17.08 (3.942), 65
All groups, Race 3:      Male 20.57 (5.827), 28;      Female 16.46 (7.126), 24;      Total 18.67 (6.721), 52
All groups, Total:       Male 21.17 (5.037), 163;     Female 19.15 (5.477), 137;     Total 20.25 (5.329), 300
Appendix D
Tests of Normality and Persistence, those on Probation and Non-persistence, Subscale
Scores of the Survey of Promise Recipients by Response Category
Persistence, those on probation and non-persistence
Tests of Normality
                                                                 Kolmogorov-Smirnov(a)          Shapiro-Wilk
                                                                 Statistic   df    Sig.         Statistic   df    Sig.
Academic Cognitive High School GPA          Persister            .165        200   .000         .708        200   .000
                                            On Probation         .084        51    .200*        .968        51    .186
                                            Non-persister        .137        49    .022         .747        49    .000
Academic Cognitive ACT Composite Score      Persister            .147        200   .000         .862        200   .000
                                            On Probation         .188        51    .000         .886        51    .000
                                            Non-persister        .138        49    .021         .961        49    .107
Academic Cognitive Most Recent Western
Michigan University GPA                     Persister            .060        200   .075         .975        200   .001
                                            On Probation         .225        51    .000         .833        51    .000
                                            Non-persister        .111        49    .182         .936        49    .010
Academic Social Parents Aggregate Income    Persister            .210        200   .000         .644        200   .000
                                            On Probation         .137        51    .018         .904        51    .001
                                            Non-persister        .146        49    .010         .932        49    .007
a. Lilliefors Significance Correction
*. This is a lower bound of the true significance.
Subscale scores of the Survey of Promise Recipients by Response Category
Kolmogorov-Smirnov(a)
Survey Group by Response                                         Statistic   df   Sig.
Social Demands           On Time Respondent                      .147        37   .042
                         Late Respondent (so nonresponse)        .153        41   .017
Cognitive Engagement     On Time Respondent                      .106        37   .200*
                         Late Respondent (so nonresponse)        .145        41   .029
Social Engagement        On Time Respondent                      .138        37   .071
                         Late Respondent (so nonresponse)        .164        41   .007
Institutional Support    On Time Respondent                      .144        37   .052
                         Late Respondent (so nonresponse)        .105        41   .200*
Appendix E
Tests of Homogeneity of Variance and Subscales of the Subscale Scores of the Survey of
Promise Recipients by Response Category
Levene's Test of Equality of Error Variances(a)
                                                                  F       df1   df2   Sig.
Academic Cognitive High School GPA                                2.608   17    282   .001
Academic Cognitive Most Recent Western Michigan University GPA    3.080   17    282   .000
Academic Cognitive ACT Composite Score                            1.442   17    282   .116
Tests the null hypothesis that the error variance of the dependent variable is equal across groups.
a. Design: Intercept + persistence + ethnicity + aGender + persistence * ethnicity + persistence * aGender + ethnicity * aGender + persistence * ethnicity * aGender
Test of Homogeneity of Variances
Academic Social Parents Aggregate Income
Levene Statistic   df1   df2   Sig.
3.473              2     297   .032
Subscales of the Subscale scores of the Survey of Promise Recipients by Response Category
                        F       df1   df2   Sig.
Cognitive Engagement    2.267   1     76    .136
Social Engagement       .625    1     76    .432
Institutional Support   .915    1     76    .342
Social Demands          .280    1     76    .598
Appendix F
Summary Results of Item Analysis, Institutional Support, Social Engagement, Social
Demands and Cognitive Engagement
Institutional Support

Item Statistics
                                                     Mean     Std. Deviation   N
siWMUEmphasizeTimeStudying                           3.5747   .99571           87
siWMUEmphasizeSupportAcademically                    3.5172   .98668           87
siWMUEmphasizeContactStudentsDifferentEconSocRace    2.9310   1.17921          87
siWMUEmphasizeHelpCopeNonacademics                   2.4828   1.06599          87
siWMUEmphasizeProvidingSupportThriveSocially         2.9080   1.10635          87
siWMUEmphasizeAttendingCampusEventsActivities        3.5632   .99652           87
siWMUEmphasizeUsingComputersAcademicWork             4.2299   .75792           87

Summary Item Statistics
                 Mean    Minimum   Maximum   Range   Maximum / Minimum   Variance   N of Items
Item Means       3.315   2.483     4.230     1.747   1.704               .335       7
Item Variances   1.040   .574      1.391     .816    2.421               .065       7

Item-Total Statistics
                                                     Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
                                                     Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
siWMUEmphasizeTimeStudying                           19.6322         16.979            .500                .317               .755
siWMUEmphasizeSupportAcademically                    19.6897         16.217            .614                .423               .733
siWMUEmphasizeContactStudentsDifferentEconSocRace    20.2759         14.574            .678                .528               .715
siWMUEmphasizeHelpCopeNonacademics                   20.7241         15.993            .580                .446               .739
siWMUEmphasizeProvidingSupportThriveSocially         20.2989         15.445            .622                .462               .729
siWMUEmphasizeAttendingCampusEventsActivities        19.6437         18.813            .262                .164               .799
siWMUEmphasizeUsingComputersAcademicWork             18.9770         19.627            .279                .128               .790

Scale Statistics
Mean      Variance   Std. Deviation   N of Items
23.2069   22.073     4.69819          7
Social Engagement

Case Processing Summary
                      N     %
Cases   Valid         88    28.7
        Excluded(a)   219   71.3
        Total         307   100.0
a. Listwise deletion based on all variables in the procedure.

Item Statistics
                                                    Mean     Std. Deviation   N
ssOftenAttendedArtExhibitPlayDanceMusic             2.6932   1.39257          88
ssOftenExercisedParticipatedPhysEd                  3.4545   1.30348          88
ssOftenParticpatedSpiritualActivities               2.0568   1.35916          88
ssOftenTriedUnderstandSomeoneElsesView              3.2955   1.27900          88
ssOftenLearnedSomethingChangedWayUnderstandIssue    3.5455   1.03845          88

Summary Item Statistics
             Mean    Minimum   Maximum   Range   Maximum / Minimum   Variance   N of Items
Item Means   3.009   2.057     3.545     1.489   1.724               .394       5

Item-Total Statistics
                                                    Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
                                                    Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
ssOftenAttendedArtExhibitPlayDanceMusic             12.3523         12.920            .132                .021               .687
ssOftenExercisedParticipatedPhysEd                  11.5909         11.693            .313                .130               .592
ssOftenParticpatedSpiritualActivities               12.9886         10.287            .464                .237               .510
ssOftenTriedUnderstandSomeoneElsesView              11.7500         10.351            .510                .460               .488
ssOftenLearnedSomethingChangedWayUnderstandIssue    11.5000         11.494            .513                .434               .507

Scale Statistics
Mean      Variance   Std. Deviation   N of Items
15.0455   16.182     4.02266          5

Reliability Statistics
Cronbach's Alpha   Cronbach's Alpha Based on Standardized Items   N of Items
.687               .697                                           4

Item Statistics
                                                    Mean     Std. Deviation   N
ssOftenExercisedParticipatedPhysEd                  3.4545   1.30348          88
ssOftenParticpatedSpiritualActivities               2.0568   1.35916          88
ssOftenTriedUnderstandSomeoneElsesView              3.2955   1.27900          88
ssOftenLearnedSomethingChangedWayUnderstandIssue    3.5455   1.03845          88

Summary Item Statistics
             Mean    Minimum   Maximum   Range   Maximum / Minimum   Variance   N of Items
Item Means   3.088   2.057     3.545     1.489   1.724               .483       4

Item-Total Statistics
                                                    Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
                                                    Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
ssOftenExercisedParticipatedPhysEd                  8.8977          8.645             .336                .130               .708
ssOftenParticpatedSpiritualActivities               10.2955         7.544             .473                .229               .622
ssOftenTriedUnderstandSomeoneElsesView              9.0568          7.411             .556                .460               .564
ssOftenLearnedSomethingChangedWayUnderstandIssue    8.8068          8.502             .551                .433               .586

Scale Statistics
Mean      Variance   Std. Deviation   N of Items
12.3523   12.920     3.59450          4

Intraclass Correlation Coefficient
                   Intraclass        95% Confidence Interval         F Test with True Value .7
                   Correlation(a)    Lower Bound   Upper Bound       Value   df1   df2   Sig
Single Measures    .355(b)           .245          .473              .309    87    261   1.000
Average Measures   .687(c)           .565          .782              .959    87    261   .582
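A note for the reader (not part of the SPSS output above): the average-measures intraclass correlation of .687 coincides with the Cronbach's alpha of .687 reported for the four-item version of the scale. For a consistency-type, average-measures ICC this agreement is expected rather than coincidental, since both statistics reduce to the same ratio of mean squares from the two-way persons-by-items ANOVA, sketched here for reference:

\alpha \;=\; \mathrm{ICC}_{\text{average, consistency}} \;=\; \frac{MS_{\text{persons}} - MS_{\text{residual}}}{MS_{\text{persons}}}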
Social Demands

Case Processing Summary
                      N     %
Cases   Valid         94    30.6
        Excluded(a)   213   69.4
        Total         307   100.0
a. Listwise deletion based on all variables in the procedure.

Reliability Statistics
Cronbach's Alpha   Cronbach's Alpha Based on Standardized Items   N of Items
.748               .752                                           6

Item Statistics
                                     Mean     Std. Deviation   N
ssWithdrawWorkingFullTime            2.2660   1.37704          94
ssWithdrawCaringForDependents        1.9362   1.21645          94
ssWithdrawAcademicallyUnprepared     2.1809   1.27813          94
ssWithdrawLackFinances               2.4681   1.51482          94
ssWithdrawDontFitIn                  1.3404   .74130           94
siWithdrawDontOfferProgramWanted     2.2553   1.48060          94

Summary Item Statistics
             Mean    Minimum   Maximum   Range   Maximum / Minimum   Variance   N of Items
Item Means   2.074   1.340     2.468     1.128   1.841               .159       6

Item-Total Statistics
                                     Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
                                     Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
ssWithdrawWorkingFullTime            10.1809         18.085            .572                .340               .687
ssWithdrawCaringForDependents        10.5106         19.392            .542                .330               .698
ssWithdrawAcademicallyUnprepared     10.2660         18.713            .573                .356               .688
ssWithdrawLackFinances               9.9787          17.333            .559                .340               .691
ssWithdrawDontFitIn                  11.1064         23.601            .351                .138               .747
siWithdrawDontOfferProgramWanted     10.1915         19.640            .369                .147               .750

Scale Statistics
Mean      Variance   Std. Deviation   N of Items
12.4468   26.680     5.16526          6
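As an arithmetic check on the reliability reported above (this worked example is not part of the original output), Cronbach's alpha can be recovered from the item standard deviations and the scale variance in the preceding tables. Squaring the six item standard deviations and applying the usual formula reproduces the reported coefficient of .748:

\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} s_i^{2}}{s_{\text{scale}}^{2}}\right)
\;=\; \frac{6}{5}\left(1 - \frac{1.896 + 1.480 + 1.634 + 2.295 + 0.550 + 2.192}{26.680}\right)
\;\approx\; .748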
Cognitive Engagement

Case Processing Summary
                      N     %
Cases   Valid         90    29.3
        Excluded(a)   217   70.7
        Total         307   100.0
a. Listwise deletion based on all variables in the procedure.

Reliability Statistics
Cronbach's Alpha   Cronbach's Alpha Based on Standardized Items   N of Items
.830               .831                                           18

Item Statistics
        Mean     Std. Deviation   N
sc1     3.8000   .92651           90
sc2     3.2778   1.14193          90
sc4     3.2556   1.25922          90
sc5     2.0556   1.19325          90
sc6     1.7667   1.02825          90
sc7     2.0667   1.27904          90
sc8     4.0667   .93376           90
sc9     3.6222   1.05551          90
sc10    2.7444   1.25027          90
sc11    2.3111   1.25102          90
sc12    3.3222   1.00368          90
sc13    3.2667   1.09954          90
sc14    1.8556   1.11739          90
sc15    3.5000   1.15389          90
sc16    3.1333   1.27376          90
ss17    3.4889   1.17315          90
sc19    3.1667   1.09391          90
sc20    3.4111   1.01555          90

Summary Item Statistics
             Mean    Minimum   Maximum   Range   Maximum / Minimum   Variance   N of Items
Item Means   3.006   1.767     4.067     2.300   2.302               .489       18

Item-Total Statistics
        Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
        Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
sc1     50.3111         97.183            .450                .327               .820
sc2     50.8333         95.287            .434                .351               .821
sc4     50.8556         93.833            .444                .481               .820
sc5     52.0556         96.120            .372                .392               .824
sc6     52.3444         94.610            .529                .395               .816
sc7     52.0444         97.256            .292                .324               .829
sc8     50.0444         99.728            .303                .507               .827
sc9     50.4889         97.758            .354                .594               .825
sc10    51.3667         95.785            .364                .496               .825
sc11    51.8000         89.061            .662                .632               .807
sc12    50.7889         98.483            .340                .390               .825
sc13    50.8444         96.470            .397                .368               .822
sc14    52.2556         93.855            .515                .425               .816
sc15    50.6111         95.656            .411                .439               .822
sc16    50.9778         95.325            .374                .508               .824
ss17    50.6222         94.800            .441                .515               .820
sc19    50.9444         94.323            .505                .601               .817
sc20    50.7000         97.583            .381                .423               .823

Scale Statistics
Mean      Variance   Std. Deviation   N of Items
54.1111   106.257    10.30811         18
Appendix G
Survey Summary Tables
Table G1
Which High School Did You Attend?
                      Response Percent   Response Count
Loy Norrix            40.6%              41
Kalamazoo Central     59.4%              60
Phoenix               0.0%               0

Table G2
In Which Year Did You Graduate High School?
                      Response Percent   Response Count
2006                  39.6%              40
2007                  33.7%              34
2008                  26.7%              27

Table G3
Did You Begin College at WMU or Elsewhere?
                                      Response Percent   Response Count
Started elsewhere                     15.8%              16
Started at WMU                        84.2%              85
If elsewhere, please specify where:                      16
Table G4
What is your classification in college?
                      Response Percent   Response Count
Freshman/first-year   25.0%              25
Sophomore             31.0%              31
Junior                38.0%              38
Senior                6.0%               6
Unclassified          0.0%               0
Note. One person did not respond.

Table G5
When do you most frequently take classes?
                                             Response Percent   Response Count
Day classes (morning or afternoon classes)   96.0%              96
Evening classes                              4.0%               4
Weekend classes                              0.0%               0
Table G6
Do you expect to enroll for an advanced degree when, or if, you complete your undergraduate degree?
      Response Percent   Response Count
No    34.7%              34
Yes   65.3%              64
Note. Three did not respond.
Table G7
When enrolled in high school, did you qualify for the free/reduced lunch program at your school?
      Response Percent   Response Count
No    70.1%              68
Yes   29.9%              29
Note. N=97, 4 did not answer.

Table G8
What is your gender?
         Response Percent   Response Count
Female   47.9%              46
Male     52.1%              50
Note. N=96, 5 did not answer.
Table G9
What is your race/ethnicity? (Mark the one ethnic group with which you most identify)
                                             Response Percent   Response Count
American Indian or other Native American     0.0%               0
Asian, Asian American, or Pacific Islander   6.5%               6
Black or African American                    18.3%              17
White (non-Hispanic)                         61.3%              57
Mexican or Mexican American                  3.2%               3
Puerto Rican                                 0.0%               0
Other Hispanic or Latino                     1.1%               1
Multiracial                                  5.4%               5
I prefer not to respond                      4.3%               4
Other (please specify)                                          3
Note. N=93, 8 did not answer.
Table G10
Is English your native (first) language?
      Response Percent   Response Count
No    9.4%               9
Yes   90.6%              87
Note. N=96, 5 did not answer.

Table G11
Where do you live during the school year?
                                                                          Response Percent   Response Count
Dormitory or other campus housing                                         35.1%              34
Residence (house, apartment, etc.) within walking distance of Western    19.6%              19
Residence (house, apartment, etc.) within driving distance               45.4%              44
Fraternity or sorority house                                              0.0%               0
Note. N=97, 4 did not answer.
Table G12
With whom do you live during the school year? (Fill in all that apply)
                                       Response Percent   Response Count
No one, I live alone                   6.2%               6
One or more other students             54.6%              53
My spouse or partner                   5.2%               5
My child or children                   1.0%               1
My parents                             34.0%              33
Other relatives                        3.1%               3
Friends who are not students at WMU    5.2%               5
Other people, who?                                        2
Note. N=97, 4 did not answer.
Table G12
What is the highest level of education obtained by your Father or Mother
Not a high school graduate
High school diploma or GED
Some college, did not complete degree
Associate degree
Bachelor's degree
Master's degree
Doctorate degree and/or Professional
degree
Unknown
Father or
Male
guardian
Mother or
Female
guardian
Response
Count
3
18
17
8
20
9
3
13
13
15
24
14
6
31
30
23
44
23
4
2
6
2
3
5
Note. N=96, 5 did not answer.
Table G13
Please rate your level of awareness about the Kalamazoo Promise.
                      Response Percent   Response Count
Not at all familiar   1.0%               1
Not familiar          0.0%               0
Neutral               10.4%              10
Familiar              41.7%              40
Very familiar         46.9%              45
Note. N=96, 5 did not answer.
Table G14
About how many hours do you spend in a typical 7-day week doing each of the following?
Preparing for class (studying,
reading, writing, rehearsing or other
activities related to your program
Working for pay
Participating in college-sponsored
activities organizations, campus
publications, student government,
intercollegiate or intramural sports,
etc.)
Providing care for dependents living
with your (parents, children, spouse,
etc.)
Commuting to and from classes
XT
None
1c
1-5
c m
6-10
11-
1
27
31
30
21
10
14
43
38
67
17
31+
Response
^
6
2
97
24
21
6
96
10
3
1
0
95
23
2
2
0
2
96
73
5
0
0
0
95
2Q
213Q
Note. N=97, 4 did not answer
Table G15
If you have a job, how does it affect your school work?
                                                  Response Percent   Response Count
I don't have a job                                25.8%              25
My job does not interfere with my school work     26.8%              26
My job takes some time from my school work        39.2%              38
My job takes a lot of time from my school work    8.2%               8
Note. N=97, 4 did not answer.
Table G16
How likely is it that the following issues would cause you to withdraw from class or from WMU?
                                            Not Likely 1   2    3    4    Very Likely 5   Response Count
Working full-time                           42             16   16   16   7               97
Caring for dependents                       51             15   18   6    6               96
Academically unprepared                     42             17   20   11   7               97
Lack of finances                            39             16   13   12   16              96
Don't fit in                                75             15   3    4    0               97
Don't offer program of study that I want    48             11   13   13   12              97
Note. N=97, 4 did not answer.
Table G17
Are you a member of a social fraternity or sorority?
                     Response Percent   Response Count
No                   96.8%              91
Yes                  3.2%               3
If yes, which one?                      4
Note. N=94, 7 did not answer.
Table G18
Are you a student athlete on a team sponsored by WMU's athletics department?
      Response Percent   Response Count
No    100.0%             97
Yes   0.0%               0
Note. N=97, 4 did not answer.
Table G19
How supportive are your friends of your attending WMU?
              Response Percent   Response Count
Not Very      2.1%               2
Somewhat      15.5%              15
Quite a bit   36.1%              35
Extremely     46.4%              45
Note. N=97, 4 did not answer.
Table G20
How supportive is your immediate family of your attending WMU?
              Response Percent   Response Count
Not Very      0.0%               0
Somewhat      6.2%               6
Quite a bit   23.7%              23
Extremely     70.1%              68
Note. N=97, 4 did not answer.
Table G21
Which best represents the quality of your relationship with students at WMU?
                                                  Response Percent   Response Count
1 Unfriendly, unsupportive, sense of alienation   0.0%               0
2                                                 6.2%               6
3                                                 19.6%              19
4                                                 36.1%              35
5 Friendly, supportive, sense of belonging        38.1%              37
Note. N=97, 4 did not answer.
Table G22
Which best represents the quality of your relationships with instructors at WMU?
                                          Response Percent   Response Count
1 Unavailable, unhelpful, unsympathetic   0.0%               0
2                                         4.1%               4
3                                         33.0%              32
4                                         44.3%              43
5 Available, helpful, sympathetic         18.6%              18
Note. N=97, 4 did not answer.
Table G23
Which best represents the quality of your relationship with administrative personnel & office staff at WMU?
                                      Response Percent   Response Count
1 Unhelpful, inconsiderate, rigid     0.0%               0
2                                     14.4%              14
3                                     33.0%              32
4                                     33.0%              32
5 Helpful, considerate, flexible      19.6%              19
Note. N=97, 4 did not answer.
Table G24
In your experience at WMU during the current school year, about how often have you done each of the following?
                                                                                              Never 1   2    3    4    Very Often 5
Asked questions in class or contributed to class discussions                                  0         6    32   28   30
Made a class presentation                                                                     5         21   27   26   17
Come to class without completing readings or assignments                                      7         39   26   20   4
Worked with classmates outside of class to prepare class assignments                          10        17   26   25   18
Tutored or taught other students (paid or voluntary)                                          44        21   15   12   3
Participated in a community-based project as a part of a regular course                       54        21   14   5    2
Used instant messaging to work on an assignment                                               49        18   12   11   6
Used e-mail to communicate with an instructor                                                 0         5    20   29   42
Discussed grades or assignments with an instructor                                            1         15   24   30   26
Talked about career plans with an instructor or adviser                                       18        23   21   24   10
Discussed ideas from your readings or classes with instructors outside of class               30        26   20   10   9
Received prompt feedback (written or oral) from instructors on your performance               5         11   39   25   15
Worked harder than you thought you could to meet an instructor's standards or expectations    5         16   33   25   16
Worked with instructors on activities other than coursework                                   49        21   17   3    5
Discussed ideas from your readings or classes with others outside of class (students,
family members, co-workers, etc.)                                                             7         11   27   30   21
Had serious conversations with students who differ from you in terms of their religious
beliefs, political opinions, or personal values                                               13        14   31   21   17
Had serious conversations with students of different race or ethnic background than your own  6         12   32   22   23
Skipped class                                                                                 15        45   22   12   2
Included diverse perspectives (different races, religions, genders, political beliefs, etc.)
in class discussions or writing assignments                                                   8         15   35   25   13
Put together ideas or concepts from different courses when completing assignments or
during class discussions                                                                      2         15   31   31   17
Note. N=96, 5 did not answer.
Table G25
To what extent does WMU emphasize each of the following?
                                                                                 Very Little 1   2    3    4    Very Much 5
Spending significant amounts of time studying                                    2               8    33   31   18
Providing the support you need to help you succeed academically                  3               8    32   33   14
Encouraging contact among students from different economic, social, and
racial or ethnic backgrounds                                                     14              18   28   24   8
Helping you cope with your non-academic responsibilities (work, family, etc.)    17              31   29   11   4
Providing the support you need to thrive socially                                10              20   34   19   7
Attending campus events and activities (special speakers, cultural
performances, athletic events, etc.)                                             2               11   33   26   19
Using computers in academic work                                                 0               2    16   37   37
Note. N=92, 9 did not answer.
Table G26
During the current school year, about how often have you done each of the following?
Very
Little
1
2
3
4
Very
Much
5
25
19
20
15
12
Exercised or participated in physical fitness
activities
13
3
31
19
25
Participated in activities to enhance your
spirituality (worship, meditation, prayer, etc.)
47
13
18
2
11
23
21
20
21
14
28
27
19
Attended an art exhibit, play, dance, music,
theater, or other performance
Tried to better understand someone else's views
by imagining how an issue looks from his or
her perspective
Learned something that changed the way you
understand an issue or concept
1
Note. N=9l, 10 did not answer
Table G27
Overall, how would you evaluate the quality of academic advising you have received at WMU?
            Response Percent   Response Count
Poor        8.6%               8
Fair        23.7%              22
Good        46.2%              43
Excellent   21.5%              20
Note. N=93, 8 did not answer.
Table G28
How would you evaluate your entire educational experience at WMU?
            Response Percent   Response Count
Poor        1.1%               1
Fair        20.7%              19
Good        57.6%              53
Excellent   20.7%              19
Note. N=92, 9 did not answer.
Table G29
If you could start over again, would you still attend WMU?
                 Response Percent   Response Count
Definitely no    3.3%               3
Probably no      12.0%              11
Probably yes     53.3%              49
Definitely yes   31.5%              29
Note. N=92, 9 did not answer.
Table G30
Would you recommend WMU to a friend or family member?
      Response Percent   Response Count
No    4.3%               4
Yes   95.7%              88
Note. N=92, 9 did not answer.
Table G31
To what extent do you agree or disagree with the following statements about your high
school?
Strongly
disagree
°
2
4
4
29
My high school teachers made
extra efforts to help students
2
7
27
Teachers at my high school
understood and met the needs of
each student
.
--
.
Teachers at my high school were
fair to students
4
4
38
27
13
31
24
At my high school, students were
interested in learning new things
8
28
Students at my high school had
fun but also worked hard on their
studies
_
My high school teachers were
patient when a student had trouble
learning
Students at my high school
understood why they were in
school
Students at my high school
worked hard to complete their
school assignments
1Q
agree 5
Know
Rating
.
Average
31
15
5
3.59
32
16
4
3.63
,
- _„
12
3
3.46
11
4
5
2.54
34
13
1
4
2.65
oo
34
14
1
3
268
^
33
10
2
3
Strongly
4
c
.
Don't
T,
4
2 .55
Getting good grades in high
school was important to me
0
1
10
14
59
4
4.56
I pushed myself in high school to
do better academically
1
3
15
24
41
4
4.20
In high school, I believed that I
could be successful
0
4
4
18
When I was in high school I
believed that going to college was
important to my future
My high school teachers believed
that I would graduate from high
school
57
5
4.54
72
5
4.83
69
9
4.84
My high school teachers believed
that I would succeed in college
1
2
2
9
64
10
4.71
My high school teachers had high
expectations of me in class
2
4
1
15
59
7
4.54
I had a high school teacher who
was a positive role model for me
1
5
8
15
54
5
4.40
4
2
10
34
32
6
4 07
3
5 2 3
32
19
6
3.72
15
24
14
9
3.37
At my high school teachers or
counselors encouraged students to
think about their future
At my high school teachers or
counselors helped students plan
for future classes and for future
jobs
At my high school teachers or
counselors helped students with
personal problems
Students at my high school could
_.! ,
j _, . .cx i
get help and advice from teachers
or counselors
When I was in high school I
. ,j
•
T
J i
received the assistance I needed
to go to college
My high school prepared me well
for my future
22
_ „ „ - „
3
27
28
1r,
5
19
,
6
- ,.
3.65
,
6
.
4
,,
16
„,
26
_-,
31
,
5
_ „„
3.87
7
14
18
30
15
4
3.38
c
Note. N=88, 13 did not answer
Table G32
To what extent do you agree or disagree with the following statements regarding the
Kalamazoo Promise?
Strongly
Disagree
1
2
3
4
Strongly
Agree 5
Teachers and/or school staff at my high school spoke
to me about the Kalamazoo Promise
4
6
17
28
32
My parents/guardians have spoken with me about the
Kalamazoo Promise
2
9
17
15
44
21
22
30
15
28
My parents/guardians encouraged me to work harder
in school because of the Promise
The Kalamazoo Promise gave me more flexibility
about which college or university I may choose to
attend
12
12
20
The Promise hasn't really made a difference to my
educational goals or plans
29
17
15
I changed my career goals because of the Kalamazoo
Promise
43
15
15
I worked harder in high school because I knew that
the Promise would pay for college
I was confident before the Promise that I could afford
to go to college, using financial aid, scholarships,
and/or my family's resources
21
12
24
14
16
11
13
20
18
24
I wasn't sure that I could afford college before the
Promise. I didn't know if I would be able to get the
scholarships, financial aid, or loans that I would need
26
21
15
13
12
I still am not sure if I can afford college, because I am
not eligible for 100% of tuition from the Promise
65
I wanted to go to college even before the
announcement of the Kalamazoo Promise in
November 2005
Note. N=87, 14 did not answer
18
69
Table G33
To what extent do you agree or disagree with the following statements regarding changes in
your high school after the announcement of the Kalamazoo Promise?
Strongly
Disagree
1
2
3
4
Strongly
Agree 5
23
14 28
15
6
16
12 33
19
6
16
22 27
14
5
27
24 22
11
2
5
3
18
26
33
23
26 32
4
1
Students became better behaved and were
getting into less trouble
29
24
23
7
3
More information was provided about
higher education opportunities
6
7
25
33
13
My peers were more motivated to succeed
in school
9
15
30
27
5
I talked about college more often with
peers
8
8
24
33
12
The quality of student academic
performance improved
10
1937
16
4
„ „ , _ . .
7 25
31
16
My attendance in high school improved
school
My school started offering more college
prep courses
I enrolled in more college prep courses
Teachers expected that more students
would go to college
The amount of homework increased
More support from community organizations was provided to students
and families
Note. N=86, 15 did not answer
,
6