
International Journal of Research & Method in Education
Vol. 31, No. 2, July 2008, 133–142
The promise of mixed methods: discovering conflicting realities
in the data
Paula Reams* (Kettering College of Medical Arts, Kettering, OH, USA) and Darla Twale (University of Dayton, Dayton, OH, USA)
(Received 27 April 2007; final version received 17 February 2008)
The purpose of our article is to illustrate the use of mixed methods research as necessary
to uncover maximum information and perspective, increase corroboration of the data,
and render less biased and more accurate conclusions. This study used the case study
method to closely examine the people, issues, programmes and topics related to the
implementation of service learning at the health professions college. Using document
analysis, Stages of Concern Questionnaire, and faculty and administrative interviews, the
mixed method approach allowed the researchers to use both quantitative and qualitative
data to answer the research questions. The data gathered in this mixed method manner
assisted the change process of institutionalization of service learning in this College.
Keywords: mixed method research; service learning; institutionalization; change process;
qualitative and quantitative process
Graduate research methods courses in education have typically been the most dreaded by
our students because of the complicated statistical components (DeLuca, Twale, and
Herrelko 2005). As a result, students who do not feel comfortable or competent doing quantitative research indicate early in their master's or doctoral programme that they will be doing a qualitative research study. Most have not yet written their research question. Students may also
be swayed by professors who hold a bias or preference for one type of research over the other
(Guba 1990). Letting the research questions direct the study is the most practical way to
undertake a research project and the approach most likely to uncover the reality of the situation (Tashakkori 2007). The purpose of our article is to illustrate the use of mixed methods
research as necessary to uncover maximum information and perspective, increase corroboration of the data, and render less biased and more accurate conclusions. In order to illustrate
this, we use the first author’s dissertation research on the factors within a small, religiously
affiliated, health professions college that facilitated or hindered the implementation of service
learning into their curriculum. How the methods were applied to this topic and the subsequent
outcome are the focus rather than specific findings from the actual research.
Review of literature
The purpose of the original research study was to understand the extent to which the organizational infrastructure, institutional leadership and faculty facilitate or hinder institutionalizing service learning across the curriculum in a small college. We formulated research
*Corresponding author. Email: [email protected]
ISSN 1743-727X print/ISSN 1743-7288 online
© 2008 Taylor & Francis
DOI: 10.1080/17437270802124509
http://www.informaworld.com
questions and determined what data would need to be gathered to answer them. Quantitative
and qualitative data collection procedures answered the following: (1) How does the organizational structure facilitate or hinder the incorporation of service learning in the curriculum? (2) How does the administration’s leadership facilitate or hinder the incorporation of
service learning in the curriculum? and (3) How does faculty facilitate or hinder the
incorporation of service learning in the curriculum? We were able to see the bigger picture
through the multiple perspectives of the campus constituents, either through interview,
instrument, or meeting minutes (Greene 2005). In essence, we combined a phenomenological approach with a quantitative instrument because we desired to hear and see how faculty
and administrators in this small college described their reality regarding service learning
(Fox, Martin, and Green 2007). The primary researcher chose to interview faculty and
administrators regarding their perceptions of service learning in the curriculum after they
completed a Likert-type instrument on their receptivity toward and use of service learning
in their courses. While these data were illuminating and supportive, the fact remained that
service learning was not present in the curriculum beyond a single programme area. The
examination of campus artefacts through document and discourse analysis ultimately
determined that the college infrastructure might be a problematic area, since faculty and the
administration were largely supportive. Using mixed methods, she uncovered unexpected
as well as expected interactions (Hays 2004). This pragmatic approach allowed for incremental understanding of the data and over time showed these discrepancies taking place.
Choosing a combination of quantitative and qualitative methods for this study allowed
the research to be conceptualized holistically. We used Newman and Benz’s (1998)
interactive qualitative–quantitative philosophy of education research methodology. Conceptually, the circular pattern of the model allows theory neither at the beginning nor at the end
of the study, but illustrates the process as a continued cycle. Alone, neither quantitative nor
qualitative research makes a complete whole. Multiple data collection allows for complementarity and a counterbalance of strengths and weaknesses of each technique. Following
a purely empirical model would have yielded scores regarding faculty and administrative
perceptions, but subsequent interviews with these groups added greater richness to those
scores. However, it was the documents gathered that offered contradiction and explanation.
These data were integral to the holistic picture.
Greene (2005) regarded mixed methods research as a means to gain insight. She went
on to add how such enquiry generated ‘important understandings and discernments through
the juxtaposition of different lenses, perspectives, and stances’ (208). By the same token,
faculty issues expressed in the interviews were not always captured in the documents or
highlighted in the scores of the instrument used. Participants perhaps used either pencil-and-paper responses or in-person conversations to raise issues or concerns with the researcher, but not all
of those issues were captured in the meeting minutes. The goal was to understand the
faculty’s perspective in context and to peer into their ‘reality’ with regard to service learning
in the curriculum college wide.
Because we were collecting data from multiple sources at one institution, we referred to
the product as a case study. Orum, Feagin, and Sjoberg referred to case study as an 'in-depth, multifaceted investigation, using qualitative methods, of a single social phenomenon'
(1991, 6). They saw the case study method as evolving over time, permitting the grounding
of observations, in a holistic, complex and evolving system. Merriam defined case study as
‘an examination of a specific phenomenon such as a program, an event, a person, a process,
an institution, or a social group' (1988, 9). Using the pragmatic research paradigm in this study, as opposed to only the empirical or constructivist paradigms, does not appear to conflict with identifying the study as a case study. Educational research as a user of
sociological research techniques regards case study as practical and illuminating (Stake
1995; Hays 2004).
Method
This study used the case study method to closely examine the people, issues, programmes
and topics related to service learning at a Midwestern religiously affiliated health professions college in a suburb of a moderately sized city. Started in the 1960s, the college was
the first in the country to offer associate degrees as part of a hospital affiliated institution.
The college majors consist of nursing, premed and other allied health professions, with two- and four-year degrees offered. The physician assistant programme offers a master's degree.
Fulltime students number between 700 and 800 with an average age of 27 years. The
College employs 52 full-time faculty along with clinical and adjunct instructors.
Document analysis
Document content analysis helped establish the existence and frequency of concepts related
to service learning in the curriculum (Busch et al. 2004). This technique enabled the
primary researcher to study the institutionalization of service learning through analysis of
multiple communications. Because much of human activity is not directly observable, nor is it always possible to get information from people who might know of such activity from firsthand experience, content analysis enabled the campus to be studied in an unobtrusive way (Fraenkel and Wallen 2006). Content analysis proved a suitable method of study
because it began as a quantitative method for gathering existing data but also evolved into
a qualitative form of discourse analysis of that data (Newman and Benz 1998). The primary
researcher performed a content analysis of college documents archived from 2000 to 2004, such as
minutes from faculty meetings, curriculum committee meetings, course syllabi, accreditation self-study documents, assessments, evaluations, college bulletin, faculty handbook,
website, Honours programme and budget narratives. The authenticity of these documents
emerged as all documents chronicled group proceedings, group reports, or contracts
between faculty and students. The documents contained information that pertained to the
governance, management or mission of the institution with specific attention to service
learning.
Documents were categorized into primary (syllabi, meeting minutes) and secondary
documents (accreditation reports, websites) (Merriam 1998; Love 2003). Tallying of the
information required two matrices: counting the number of times service learning was
mentioned in the documents (quantitative) and how it was mentioned (qualitative) (Mills
1997). The primary researcher assigned meaning to the contents of each group of
documents based on where they were found in the hierarchical pattern reflective of the
campus governance structure. She performed discourse analysis on the content, themes,
structures, and underlying messages and assumptions conveyed in the documents (Mills
1997). Categorizing and coding of the documents were then compared and contrasted
(Dey 1993).
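The quantitative half of this tallying step can be sketched in a few lines of Python. This is an illustrative sketch only: the sample documents, their texts and the governance-level labels below are invented for demonstration, not drawn from the study's 766 actual documents.

```python
from collections import Counter

# Hypothetical sample of archived documents, labelled by their place in the
# governance hierarchy and by type (primary = syllabi, meeting minutes;
# secondary = accreditation reports, websites). Texts are invented.
documents = [
    {"source": "curriculum committee minutes", "level": "committee", "type": "primary",
     "text": "Service learning was proposed for the nursing capstone. "
             "Members discussed how service learning aligns with the mission."},
    {"source": "administrative council minutes", "level": "administration", "type": "primary",
     "text": "Budget priorities were reviewed; no curricular initiatives discussed."},
    {"source": "college bulletin", "level": "college", "type": "secondary",
     "text": "The college mission emphasizes service to the community."},
]

def count_mentions(text: str, phrase: str = "service learning") -> int:
    """Quantitative tally: how often the phrase appears in one document."""
    return text.lower().count(phrase)

# Matrix 1 (quantitative): total mentions per governance level. The second,
# qualitative matrix (how the phrase was mentioned) required human coding.
mentions_by_level = Counter()
for doc in documents:
    mentions_by_level[doc["level"]] += count_mentions(doc["text"])

print(dict(mentions_by_level))  # {'committee': 2, 'administration': 0, 'college': 0}
```

A tally like this only locates the mentions; assigning meaning to them, as the study did through discourse analysis, remains a qualitative judgement.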
Stages of concern questionnaire
Administrators and faculty completed Hall and Hord’s (2001) Stages of Concern Questionnaire (SoCQ) which addressed the attitudes and perceptions related to innovation introduced
as part of a change process. In this instance, introducing service learning campus wide into
the curriculum served as the innovation. In the 35-item questionnaire, participants utilized
an eight-point Likert-type scale ranging from 0 (completely irrelevant) through 1–2 (not true of me now) and 3–4 (somewhat true of me now) to 5–7 (very true of me now). Sample
questions included: ‘At this time, I am not interested in learning about this innovation’ and
‘I would like to revise the innovation’s instructional approach’.
Hall and Hord (2001) constructed seven corresponding stages to relate to the points on
the Likert-type scale. Groups of five questions constituted a scale and corresponded to one
of the stages. If participants consistently selected 0, they were determined to be in the
Awareness Stage. If participants did not know what service learning was or how to use it in
their curriculum, this would be interpreted as a barrier to institutionalization and indicative
of the need for Awareness (Stage 0). The remaining stages, in order, are Informational, Personal, Management, Consequence, Collaboration and Refocusing (Stage 6).
The SoCQ instrument achieved criterion and construct validity through 'intercorrelation matrices, judgments of concerns based on interview data, and confirmation of expected group differences and changes over time' (Hall, George, and Rutherford 1979, 12). Reliability coefficients ranged from .64 to .83, indicating moderate to strong reliability (Hall, George, and Rutherford 1979). Two open-ended questions assisted in the interpretation of stage
scores. Individual scores were plotted on SoCQ graphs and analysed for patterns. Once the
highest stage score was calculated for each participant, scores were grouped to produce an
overall picture of each defined group, in this case, by department as well as faculty versus
administration. Hall, George, and Rutherford (1979) recommended examining the second
highest score to note the general pattern of attitudes toward an innovation.
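As an illustration of this scoring logic, the sketch below computes raw stage scores and the two peak stages for one respondent. It is a simplified, hypothetical sketch: the contiguous item-to-stage grouping and the sample responses are assumptions for demonstration, not the published SoCQ scoring key, which maps specific items to each stage and converts raw totals to percentile scores.

```python
# Seven stages, five items per stage, each item rated 0-7 on the SoCQ's
# Likert-type scale (35 items in total).
STAGES = ["Awareness", "Informational", "Personal", "Management",
          "Consequence", "Collaboration", "Refocusing"]

def stage_scores(responses):
    """Sum each consecutive block of five ratings into a raw stage score
    (assumed grouping; the real instrument uses a published item key)."""
    assert len(responses) == 35
    return {STAGES[i]: sum(responses[i * 5:(i + 1) * 5]) for i in range(7)}

def peak_stages(responses):
    """Return the highest and second-highest stages; Hall, George, and
    Rutherford (1979) recommend reading both to see the concern profile."""
    scores = stage_scores(responses)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[0], ranked[1]

# A hypothetical respondent whose concerns cluster at the early stages.
responses = [7, 6, 7, 6, 7] + [5, 5, 4, 5, 4] + [2, 1, 2, 1, 2] + [1] * 20
print(peak_stages(responses))  # ('Awareness', 'Informational')
```

Grouping such peak stages by department, or by faculty versus administration, gives the overall picture described above.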
Faculty and administrator interviews
SoCQ participants provided the pool of interviewees. Interviewing was used in the study
because it served as a validity check on responses given in surveys (Glesne 1999). Ten 15- to 30-minute face-to-face interviews were conducted by the primary researcher in a private
space. The semi-structured interview protocol combined experience and behaviour with
opinion and value (Patton 1990). We wanted to know participants' use, goals, beliefs, attitudes and values related to service learning, whether service learning fits the mission, and
whether it would be beneficial to both students and faculty. Notes taken during and immediately following the interviews were reviewed with the transcribed interviews. The
review produced codes of identifiable concepts which were gathered into major codes and
labelled as themes with brief descriptions in order to construct a conceptual framework
(see Figure 2). Interviewees were offered the opportunity to review and clarify the data
(Fraenkel and Wallen 2006). To strengthen internal validity, all respondents answered the
same interview questions; however, selection bias still existed because while all were
invited to participate in the SoCQ and the interview, not all chose to be interviewed. The
primary researcher increased objectivity by spending a considerable amount of time at the
College gathering data and checking respondent perceptions against collected data
(Glesne 1999; Charmaz 2004).
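The coding step described above can be sketched as a small data transformation: coded excerpts are grouped under major codes, which are labelled as themes with brief descriptions. Everything in this sketch (excerpts, code names, theme labels) is invented for illustration and is not drawn from the study's actual interview data.

```python
# Interview excerpts tagged with codes during review of notes and transcripts.
# All excerpts and code names below are hypothetical.
coded_excerpts = [
    ("service learning fits our mission", "mission_alignment"),
    ("administration should lead this", "leadership_expectation"),
    ("not sure learning actually happens", "pedagogical_doubt"),
    ("departments could collaborate more", "leadership_expectation"),
]

# Major codes gather related codes under a labelled theme with a brief
# description, forming the basis of a conceptual framework.
themes = {
    "Facilitators": {"codes": {"mission_alignment"},
                     "description": "perceived fit with the College service mission"},
    "Barriers": {"codes": {"leadership_expectation", "pedagogical_doubt"},
                 "description": "waiting on leadership; doubts about learning outcomes"},
}

def excerpts_for_theme(theme):
    """Collect every excerpt whose code belongs to the given theme."""
    codes = themes[theme]["codes"]
    return [text for text, code in coded_excerpts if code in codes]

print(len(excerpts_for_theme("Barriers")))  # 3
```

In the study itself this grouping was done interpretively, with interviewees offered the chance to review and clarify the data; the sketch only shows the shape of the codes-to-themes step.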
Results
To make sense of the 766 documents collected, a diagram (Figure 1) was created to show
how many documents were generated by each department or committee in the institutional
hierarchy and how many times service learning was mentioned in those documents. Data
showed that service learning was frequently mentioned by faculty and curriculum
Figure 1. Hierarchy of the documents (nodes include: Institutionalization of Service Learning; Administrative Council; College Senate; General Assembly; Support Committees; Curriculum Committee; Assessment Committee; Nursing Faculty/Curriculum Committees; Service Learning Task Force; KMC Mission Committee; and dissemination of information via the Bulletin, College Website, Budget and Reports to Outside Agencies).
committees sitting at the bottom of the hierarchy but that as one moved up the College
infrastructure, service learning was mentioned much less. Although service to the community was a key component of the College mission, the documents revealed that it was not
pervasive. In fact, some departments were discussing service learning far more than others.
Although service learning was part of the College mission, it was found in only a few
courses and mentioned infrequently in college curriculum documents.
How service learning was discussed in those documents was equally compelling.
Service was part of strategic campus goals; however, service learning as a pervasive goal of
the health professions’ curriculum was not discussed in key committee meetings and was
not encouraged through rewards to faculty. In fact, the committee spearheading the service
learning initiative was not officially recognized for its progress in terms of placement in the
governance hierarchy or through budget allotments, and no funding was allotted to develop
service learning as a teaching strategy.
The SoCQ analysis indicated that of the top three administrators at the College, one was
deemed to be between Stage 0 and 1, another was a Stage 4 and the third was in Stage 5.
Because of the disparity when taken as a group, the analysis indicated possible implementation problems, because one showed resistance while another was concerned with the
consequences of incorporating service learning into the curriculum. The open-ended questions further revealed that, despite being at Stages 4 and 5, two administrators expressed hesitation and concern; however, their positive comments indicated that they were willing to work
together.
Forty-five out of 52 faculty members completed usable SoCQ questionnaires. Awareness Stage 0 had the highest overall mean (48%), followed by Stage 1 Informational (20%).
Eleven per cent of the faculty was at Stage 5 Collaboration and included the smallest
departments. Additional analyses showed that 10 individuals were concerned more with
their personal position in the College than with changing the curriculum. Open-ended
questions revealed concerns with understanding the concept, pleas for more information,
and little concrete data on service learning as a productive teaching strategy.
Interviews with the administrators further indicated a lack of understanding of service
learning but recognized that it supported the College mission. All three administrators
stated enthusiasm and support for service learning in the curriculum and acknowledged
that they encouraged faculty to use it. One administrator noted the need for ‘leadership on
the academic side that gives service learning a very substantial priority … we will
become service-learning oriented to the degree that the leadership of the institution takes
it seriously’. Another administrator, however, was worried that service learning could
compromise students’ capacity to pass board exams. Interviews with the faculty indicated
their recognition of service learning as part of the service mission. With the initiative to
move to a learner-centred teaching methodology, some faculty regarded ‘service learning
as a valued pedagogy’. But sceptics voiced the notion of ‘how to ensure that learning is
taking place’. One nursing faculty member indicated that administration ‘could foster
more of a collaboration effort between the departments in order to implement service
learning’. Another faculty member stated that ‘administrators need to have a greater
understanding of what kind of work is involved’. A few faculty voiced concern that
service learning was not ‘part of the culture’ and saw this as part of administration’s job
to lead the College in this direction. Both groups were clearly expecting the other to lead
them toward fulfilling the mission even though most faculty and most administrators
acknowledged service learning as important to the College mission. Clearly, the mix of
methods held the key to more completely understanding the frustration expressed by both
faculty and administrators.
Discussion
The research addressed a practical problem, holistically, in its natural setting, through a
flexible but evolving design that employed quantitative and qualitative methods that
complement as well as verify each other. It combined a phenomenological approach, capturing the meaning campus faculty and administrators attached to integrating service learning into the curriculum, with results from a quantitative instrument. The goal
of this case study research was to solve the puzzle and reconcile the dissonance in the
findings in order to effect any change in the system.
In order to accomplish that, the primary researcher inadvertently exposed what Fox,
Martin, and Green (2007) call the shadow side of the organization. The primary researcher
discovered that using mixed methods was not just for the purposes of triangulation and
confirmation but also for complementarity and, in this case, serendipity. Data analysis
from multiple collection techniques revealed discrepancies and inconsistencies between
what the administration professed, the faculty lived and what was portrayed in the document
analysis, for example. It ultimately indicated that what leaders were saying in the interviews
was not consistently based upon what the SoCQ results revealed nor what was housed in the
documents. The use of mixed methods indicated that leadership was as much the problem
as the solution. However, without the combination of methods in this case study, that fact
might have been overlooked. Burke Johnson and Onwuegbuzie (2004) referred to this
process as ‘development’, that is, the results of the quantitative data informed or clarified
data gathered from qualitative methods, which in this case allowed us to determine where the problem lay.
Because changes in College curricula require at least three layers of governance for
voted acceptance, service learning became mired in the infrastructure. The results studied
in isolation might have indicated that administration is supportive of service learning, that
faculty is predominantly unsure of what service learning is, and that the College maintains
a mission of community service. We also learned that senior administration was a barrier to
institutionalization of service learning in the College despite their rhetoric. The interviews
with administration if taken alone (as compared to the SoCQ results, interviews with faculty
and the documents) indicated they were supportive and enthusiastic. While they may have
believed this to be true, actions as chronicled in the documents and perceived by faculty
corroborated otherwise. While most people at the College believed that service learning should be utilized more in the curriculum, collaboration or implementation seemed to come
from certain pockets of faculty and not others. In fact, the nursing faculty and a task force were located at the bottom of the College hierarchy. While each data source appears to produce
feasible results, they are only tentative conclusions that are refutable given the results of
data from all sources (Burke Johnson and Onwuegbuzie 2004). Each method inadvertently
supported or contradicted the others such that, conclusions from the data agreed and
sometimes they did not (Newman et al. 2002; Teddlie and Tashakkori 2002; Mertens 2005;
Greene 2007).
This mixed methods approach capitalized on data collection and interpretation (Mertens
2005; Wiersma and Jurs 2005; Greene 2007). Had we followed purists like Guba (1990),
our data analysis and outcomes would have been very different, that is, at best, incomplete
and at worst, incorrect. Without the combination of methods providing a chronicle of events
regarding the path of service learning through the College, there would have been no clear
understanding of why service learning had not come to fruition as expected from the service
mission. Examination of all the data shows that administration is not understanding and
communicating about service learning, so a bottom-up approach opened up the infrastructure. Without using mixed methods, it is unlikely that any acknowledgement of the problem
would have taken place or that any subsequent change would have followed in a timely
fashion.
Implications
While one conclusion of the study is that new hypotheses could be generated and tested to
determine further why service learning had not been implemented or if a faulty infrastructure created other problems, our message is broader. We argue that one epistemology may
have been insufficient to determine the gravity and reality of the research problem.
Tashakkori (2007) argues that you should follow your research question like a road map and
let it determine the depth of your methodology rather than follow one epistemological track, especially to the exclusion of another. Uncovering reality or 'truth' is the goal, by whatever
means that entails. Using multiple methods enhances the picture to illuminate the problem
under study (Wiersma and Jurs 2005). That said, limits may be placed on the data one
wishes to gather, that is, it may be too costly or time consuming, or impossible or unethical
for that matter. Furthermore, mixing methods because you can is not reason enough to do
so. However, one method may offer only one view (albeit, the favoured one hypothesized)
and, while valid, may be incomplete.
We also encountered another issue. Data collected were filtered through the primary
researcher who worked at the College. To avoid jeopardizing interpretive validity, she was
encouraged to construct multiple frameworks from the data rather than try to fit the data into
a preconceived framework (Fox, Martin, and Green 2007). The research design also had to overcome potential researcher bias: the primary researcher was housed in the nursing department and the Honours Programme, both of which were impassioned
about incorporating service learning in the curriculum to comply with the College service
mission. In a two-way bind were College faculty who held preconceived notions about
service learning and its alignment with mission as well as their individual stances on
incorporating it in their individual courses. Some faculty struggled with national standards
for health care professionals as paramount to the College service mission while other faculty
believed both masters could be served (see Mezirow 1991).
Yet another issue arose: as a health service professional, the primary researcher was steeped in the post-positivist research tradition; thus the addition of qualitative methods was foreign not only to her but to many of her colleagues. On the one hand, her choice of mixed
methods answered her research questions, but would the result be valued by the faculty and
the administration enough to make them change their goal as well (Fox, Martin, and Green
2007)? Ultimately, she was able to paint the larger picture (see Figures 1 and 2) that opened
up the situation to scrutiny. Because of the trustworthiness of the data and the data collector,
the administration acted on her findings. While they were able to see flaws in the infrastructure as well as in leadership, and to spur change, we recommend that additional types of data be gathered to assist the changes.
Throughout the project, the primary researcher needed to be cognizant of her faculty
position, research role and her relationship with colleagues and administration as it related
to data collection. She was able to be reflexive to the extent allowed by the collaboration
with the second author qua dissertation chair (Fox, Martin, and Green 2007). Given her
position, however, she alone needed to practice reflection as a way of interpreting the data
and the gravity of the information as she gathered it. In other words, how did her use of
service learning in her courses affect her ability to be value-neutral when a college
expressed little willingness to incorporate it? Furthermore, because of the sequence in which
she gathered the data, doing the document analysis before the interviews, she became aware
of those contradictions.
Figure 2. Facilitators and barriers.
Conclusions
Perhaps the way we prepare researchers is ill-conceived in that we traditionally offer
quantitative methods in one course, qualitative methods in another and statistics in one or
two subsequent courses. Furthermore, mixing methods often appears as the shortest chapter
in a research methods text (see Wiersma and Jurs 2005; Fraenkel and Wallen 2006)! Where
applicable, mixed methods may be dismissed if a dissertation chair favours one method over
another and sways the advisee toward a preferred method. A need to follow the research
questions prevailed in this case study. However, wishing to graduate soon, advisees may
take the shortest path with the least resistance rather than following what needs to be
followed to answer their research questions. Furthermore, graduate programmes or specific
disciplines may be socializing biased researchers by emphasizing one epistemology or
paradigm over another (Mertens 2005).
Using mixed methods often dovetails with model development or building. Because this case
study was confined to a small college and the whole population was invited to participate,
we were able to identify the variables and find some links between them. This becomes
more difficult, however, as the natural setting and the participant pool grow larger and more complex. The model, depicted in Figures 1 and 2, speaks to a larger body of knowledge on isolation, communication and fragmentation that needs to be further tested (see
Cleary and Benson 1998; Pribbenow 2002; Heim and Murphy 2003). Our conclusions
emerge from the holistic or practical approach to research and would have been different
had we followed the belief that one epistemology or one method is more valid than, or superior to, another.
Perhaps, as the study found, administrators were convinced they were leading and administrating, but instead a pocket of nursing faculty was leading rather than following. By the
same token, perhaps research methods faculty can take the lead and demonstrate that
research questions determine method rather than personal preference. For the ‘pocket’ of
those researchers who have discovered the value of mixed methods, our study attempts to
illustrate to others that value.
References
Burke Johnson, R., and A. Onwuegbuzie. 2004. Mixed methods research: A research paradigm
whose time has come. Educational Researcher 33, no. 7: 14–26.
Busch, C., P. DeMaret, T. Flynn, R. Kellum, S. Le, and B. Meyers. 2004. Content analysis. http://writing.colostate.edu/references/research/content/index.cfm
Cleary, C., and D. Benson. 1998. The service integration project: Institutionalizing university
service learning. Journal for Experiential Education 21, no. 3: 124–9.
DeLuca, B., D. Twale, and J. Herrelko. 2005. Linking action research to professional development:
Applications for university faculty. Journal of Faculty Development 20: 69–78.
Dey, I. 1993. Qualitative data analysis. London: Routledge.
Fox, M., P. Martin, and G. Green. 2007. Doing practitioner research. London: Sage.
Fraenkel, J.R., and N.E. Wallen. 2006. How to design and evaluate research in education. 6th ed.
Boston, MA: McGraw-Hill.
Glesne, C. 1999. Becoming qualitative researchers: An introduction. New York: Longman.
Greene, J.C. 2005. The generative potential of mixed methods inquiry. International Journal of
Research & Method in Education 28, no. 2: 207–11.
———. 2007. Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.
Guba, E.G. 1990. The alternative paradigm dialog. In The paradigm dialog, ed. E.G. Guba, 17–27.
Newbury Park, CA: Sage.
Hall, G.E., A.A. George, and W.L. Rutherford. 1979. Measuring stages of concern about the innovation:
A manual for use of the SoC questionnaire. 2nd ed. Austin, TX: University of Texas.
Hall, G.E., and S.M. Hord. 2001. Implementing change: Patterns, principles, and potholes. Boston,
MA: Allyn & Bacon.
Hays, P.A. 2004. Case study research. In Foundations for research: Methods of inquiry in
education and social sciences, ed. K. deMarrais, and S.D. Lapan, 56–78. Mahwah, NJ: Lawrence
Erlbaum.
Heim, P., and S. Murphy. 2003. In the company of women: Indirect aggression among women: Why we
hurt each other and how to stop it. New York: Jeremy P. Tarcher/Putnam.
Love, P. 2003. Document analysis: Approaches and methods. In Research in the college context, ed.
F.K. Stage, and K. Manning, 83–96. New York: Brunner-Routledge.
Merriam, S.B. 1988. Case study research in education: A qualitative approach. San Francisco, CA:
Jossey-Bass.
———. 1998. Qualitative research and case study applications in education. San Francisco, CA:
Jossey-Bass.
Mertens, D. 2005. Research and evaluation in education and psychology. 2nd ed. Thousand Oaks, CA:
Sage.
Mezirow, J. 1991. Transformative dimensions of adult education. San Francisco, CA: Jossey-Bass.
Mills, S. 1997. Discourse: The new critical idiom. New York: Routledge.
Newman, I., and C. Benz. 1998. Qualitative-quantitative research methodology: Exploring the interactive continuum. Carbondale, IL: Southern Illinois University Press.
Newman, I., C. Ridenour, C. Newman, and G. DeMarco. 2002. A typology of research purposes and
its relationship to mixed methods. In Handbook of mixed methods in social and behavioral
research, ed. C. Teddlie, and A. Tashakkori, 167–88. Thousand Oaks, CA: Sage.
Orum, A.M., J.R. Feagin, and G. Sjoberg. 1991. A tale of two cases. In A case for the case study, ed. J.R. Feagin and G. Sjoberg, 26–38. Chapel Hill, NC: University of North Carolina Press.
Patton, M.Q. 1990. Qualitative evaluation and research methods. Newbury Park, CA: Sage.
Pribbenow, C., and G. Golde. 2000. Understanding faculty learning in residential learning communities. Journal of College Student Development 41, no. 1: 27–40.
Stake, R.E. 1995. The art of case study research. Thousand Oaks, CA: Sage.
Tashakkori, A. 2007. The state of mixed methods: Discussion of the philosophical and methodological
issue. Keynote address at the annual meeting of the Eastern Educational Research Association,
February, in Clearwater, FL.
Teddlie, C., and A. Tashakkori. 2002. Handbook of mixed methods in social and behavioral research.
Thousand Oaks, CA: Sage.
Wiersma, W., and S. Jurs. 2005. Research methods in education. 8th ed. Boston, MA: Allyn & Bacon.