
education policy analysis archives
A peer-reviewed, independent,
open access, multilingual journal
EPAA Response Table
Manuscript Number: http://epaa.asu.edu/ojs/author/submission/2792
Title: Investigating Student Exposure to Competency-Based Education
Manuscript re-submitted: January 18, 2017
Reviewer #1’s Comments:
1. The literature and conceptual framework provide a strong
rationale for your study. However, I did not have access to
Table A1, which would have been helpful to better understand
the indicators and the descriptive statistics related to them.
2. There seems to be a disagreement between Table 1 and your
description on page 16. On page 16, you indicate that CB
PACE1 became a fourth indicator of student demonstration of
ownership over learning. In the text, you named this OWN4,
but in the table it seems you labeled it RSPNS4.
3. Without the actual survey, it was difficult to consider the factors
associated with your models (though I realize the survey will be
linked once published).
4. Finally, you suggest on page 7 that this research "...can support
practitioners and policymakers..." You do a nice job in the
discussion of linking back to practitioners (and researchers), but
do not spend time discussing (potential) policy implications.
Doing so would strengthen the piece, particularly in light of the
nature of this journal.
Responses:
We agree with the reviewer that the information in Table A1 will
help the reader understand the indicators and their descriptive
statistics. Table A1 was submitted as an appendix table and will be
available to the reader.
The reviewer is correct; this was an oversight on our part. We have
corrected this as “OWN4” in Table 1.
The link to the survey was masked for blind peer review. The
manuscript is now unblinded, and a link to the publicly available
survey is provided in the text.
We appreciate this suggestion, offered by both Reviewers. In
response, we have added a new section to the manuscript
summarizing several implications for competency-based education
policy that our research highlights. At the same time, we are careful
to note that policy solutions are best informed by a cumulative body
of evidence rather than by a single study. See pp. 19-20.
Reviewer #2’s Comments:
1. Competency-based education (CBE) has been increasingly used
across schools and districts, but little empirical work has been
done about its effectiveness. As such, this manuscript fits the
scope of an education policy journal. While the authors discuss
the relationship between CBE and state and district policies in
the introduction, more discussion of how their survey might
inform district and state policies could be included in the
implications and conclusion sections.
2. The authors do a good job of sharing the landscape and
research around CBE. While the authors adopt four essential
elements of CBE, there have been a number of frameworks that
have varying levels of overlap with the Sturgis, 2016 framework.
Since the authors adopt that framework wholesale and use it to
develop their survey, it would be helpful to provide a brief
summary or description of other frameworks and definitions.
This will help readers understand the broader landscape of how
CBE is defined (similarly, differently) and how compatible the
survey is with these other definitions.
3. It would also be very informative to include the survey items in
the article, or at the very least examples as illustrations of each
construct, to give readers a better idea of how these constructs
are operationalized.
4. Understanding the items could also help answer my question
about whether or not there is a disconnect between students’
perceptions of what is going on in the classroom and what the
teacher is doing. While a CBE teacher might be implementing
practices that are CBE-aligned, students might not pick up on
those practices and how the survey is worded might help inform
this question. This point could be included in the discussion
about triangulating the survey with other data sources.
Responses:
Per both Reviewers’ suggestion, we have added a new section to the
manuscript summarizing several implications for policy. See pp. 19-20.
We appreciate this comment. We now acknowledge more explicitly
that a variety of overlapping frameworks for competency-based
education exist, and we include several additional examples
(see p. 3). See the revised paragraph that begins, "Over time,
competency-based education…"
The link to the survey was masked for blind peer review. The
manuscript is now unblinded, and a link to the publicly available
survey is provided in the text on p. 9.
In addition, Table A1 was submitted as an appendix table and will
be available to the reader. This table includes the indicators and
their descriptive statistics.
We thank the Reviewer for this helpful comment. We have
incorporated this point into the text on p. 17 where we discuss
triangulating survey data with other data sources. See the paragraph
that begins, “Further, either collectively or in collaboration with
researchers…”
5. The authors also include a brief review of the literature on
implementation and the dimensions of implementation and say
that the survey measures exposure, adherence, and program
differentiation. While I was not able to see the specific items the
survey uses, it seems to me like the survey might not actually
measure exposure (the amount of time students are exposed to
different elements of the program).
6. It would also be helpful to have more of a discussion about how
the survey will help measure program differentiation. While the
authors discuss how the various models they tested provide
some information on program differentiation, would the
validated survey itself be useful for measuring program
differentiation? I could imagine that the survey could be given to
students who are in a traditional classroom and students who are
in a CBE classroom and used to determine if those practices
measured in the survey are actually different than the traditional
classroom. Discussion of that nature would be helpful and has
been brought up in the literature on program implementation
before.
7. There is also some discussion in the literature on
implementation fidelity about the specificity of innovations and
how that impacts measurement. Since the authors discuss how
CBE might not be a specific program or “thing,” including
some information on measurement of implementation of broad
policies or programs versus very specific programs or curricula
could be informative.
8. As is, the discussion of fidelity of implementation seems pretty
sparse, but I think adding additional background in that area
could help strengthen the implications and conclusions of the
article.
Responses:
We have updated this section to include additional explanation on p.
5 about which aspects of exposure the survey assesses. See the
sentence that begins, "At the student level, this survey captures
information about student exposure…"
This suggestion is also appreciated. We have added text on p. 16
where we discuss the need for research that builds on our study to
further explore how well the survey differentiates both among
schools at different levels of implementation fidelity and between
competency-based and non-competency-based schools.
We have added text to this effect on p. 18 where we now write: "In
other words, competency-based education may actually unfold as a
more arbitrary mix of instructional practices that, at the level of
implementation, are combined in ways that can't necessarily be
distinguished from other models for secondary education—a
possibility consistent with research indicating that whole-school
models often lack the specificity required to achieve, and to
measurably detect, high implementation fidelity (Desimone, 2002;
O'Donnell, 2008)."
We appreciate the Reviewer's suggestions regarding various
dimensions of implementation fidelity. As described above, we have
included new text in several places to incorporate these suggestions,
and we agree that these additions strengthen the manuscript.
Editorial Board's Comments:
1. Development of a section much more focused on implications for
educational policy.
2. Some work on tables and figures and their alignment with the
text is needed.
3. Please be sure to include your survey instrument in an appendix
or somewhere online. As an open-access journal we value making
instruments, data, etc. accessible; hence, this component is critical.
Responses:
We have developed a new section focused on the implications of our
research for educational policy. This section is located at pp. 19-20
of the resubmitted manuscript.
We have reviewed and made revisions to the tables and figure so
that they are aligned with each other and with the text.
We have now unblinded the link to the survey, which is publicly
available online. The link is provided in the text on p. 9.
Accept with Revisions Original Email: ***Copy/paste your blinded “accept with revisions” email with all revisions originally
detailed below (i.e., so that the editorial board and reviewers can re-review what the manuscript’s reviewers originally requested
as ultimately addressed and detailed in the table above).
**This email was sent on behalf of EPAA/AAPE**
Dr. Sarah Ryan:
We have reached a decision regarding your submission to education policy analysis archives, "Investigating
Student Exposure to Competency-Based Education".
Our decision is to: Accept the manuscript for publication pending revisions.
Please see the attached for more information.
Audrey Amrein-Beardsley
Arizona State University
[email protected]
------------------------------------------------------
Reviewer A:
Thank you for submitting your manuscript to EPAA. I believe this work will be of interest to the journal's
audience. Your writing is clear and concise.
The literature and conceptual framework provide a strong rationale for your study. However, I did not have
access to Table A1, which would have been helpful to better understand the indicators and the descriptive
statistics related to them.
There seems to be a disagreement between Table 1 and your description on page 16. On page 16, you indicate
that CB PACE1 became a fourth indicator of student demonstration of ownership over learning. In the text,
you named this OWN4, but in the table it seems you labeled it RSPNS4.
Without the actual survey, it was difficult to consider the factors associated with your models (though I
realize the survey will be linked once published).
Your discussion brought up some good points, particularly on page 24 when you talk about the idea that
"...competency-based education may instead reflect a more arbitray mix of instructional practices that, at
the level of implementation, are comgined in ways that can't necessarily be distinguished from other modesl
for secondary education." This has immense potential for educational research broadly, as it seems we are
always trying to simplify the complicated nature of teaching and learning into models that are easily
digested. Seems it could be at the detriment of the endeavor.
Finally, you suggest on page 7 that this research "...can support practitioners and policymakers..." You do
a nice job in the discussion of linking back to practitioners (and researchers), but do not spend time
discussing (potential) policy implications. Doing so would strengthen the piece, particularly in light of
the nature of this journal.
-----------------------------------------------------------------------------------------------------------
Reviewer D:
I would recommend accepting the manuscript with some minor modifications. My full review and suggested
modifications are below.
Competency-based education (CBE) has been increasingly used across schools and districts, but little
empirical work has been done about its effectiveness. As such, this manuscript fits the scope of an
education policy journal. While the authors discuss the relationship between CBE and state and district
policies in the introduction, more discussion of how their survey might inform district and state policies
could be included in the implications and conclusion sections.
The authors do a good job of sharing the landscape and research around CBE. While the authors adopt four
essential elements of CBE, there have been a number of frameworks that have varying levels of overlap with
the Sturgis, 2016 framework. Since the authors adopt that framework wholesale and use it to develop their
survey, it would be helpful to provide a brief summary or description of other frameworks and definitions.
This will help readers understand the broader landscape of how CBE is defined (similarly, differently) and
how compatible the survey is with these other definitions.
It would also be very informative to include the survey items in the article, or at the very least examples
as illustrations of each construct, to give readers a better idea of how these constructs are
operationalized. Understanding the items could also help answer my question about whether or not there is a
disconnect between students’ perceptions of what is going on in the classroom and what the teacher is
doing. While a CBE teacher might be implementing practices that are CBE-aligned, students might not pick up
on those practices and how the survey is worded might help inform this question. This point could be
included in the discussion about triangulating the survey with other data sources.
While I don’t have a strong background in CFA and can’t comment on the specific methods used, I feel that
the authors laid out their methods and results in a coherent way that helped readers understand what they
did and why they did it.
The authors also include a brief review of the literature on implementation and the dimensions of
implementation and say that the survey measures exposure, adherence, and program differentiation. While I
was not able to see the specific items the survey uses, it seems to me like the survey might not actually
measure exposure (the amount of time students are exposed to different elements of the program).
It would also be helpful to have more of a discussion about how the survey will help measure program
differentiation. While the authors discuss how the various models they tested provide some information on
program differentiation, would the validated survey itself be useful for measuring program differentiation?
I could imagine that the survey could be given to students who are in a traditional classroom and students
who are in a CBE classroom and used to determine if those practices measured in the survey are actually
different than the traditional classroom. Discussion of that nature would be helpful and has been brought
up in the literature on program implementation before. There is also some discussion in the literature on
implementation fidelity about the specificity of innovations and how that impacts measurement. Since the
authors discuss how CBE might not be a specific program or “thing,” including some information on
measurement of implementation of broad policies or programs versus very specific programs or curricula
could be informative. As is, the discussion of fidelity of implementation seems pretty sparse, but I think
adding additional background in that area could help strengthen the implications and conclusions of the
article.