MSP Project Document Self-Appraisal

Instructions
The MSP Project Document Self-Appraisal is intended for use by internal evaluators of National
Science Foundation (NSF)-funded and U.S. Department of Education-funded Mathematics and
Science Partnership (MSP) projects. This instrument provides a checklist of key aspects of
high-quality project evaluation plans and reports that internal evaluators can use to examine
evaluation plans and documents that summarize evaluation results. In some cases both the
evaluation plan and a report are combined into a single document, whereas in other cases they
are separate. Additionally, the checklist can be used during the development of evaluation
plans and reports to ensure that important aspects of high-quality evaluations are addressed.
This document corresponds to the more comprehensive MSP Project Document Review Rubric
designed for use by external evaluators of MSP projects.
References
Bobronnikov, E., Sahni, S.D., Fernandes, M., & Bozzi, L. (2013). A guide for reporting on rigorous
evaluations for the U.S. Department of Education mathematics and science partnerships (MSP): A
user-friendly guide for MSP project officials and evaluators. Retrieved May 2014 from
http://teams.mspnet.org/index.cfm/26828
Callow-Heusser, C., Chapman, H.J., & Torres, R.T. (2005). Evidence: An essential tool—Planning
for and gathering evidence using the Design-Implementation-Outcomes (DIO) cycle of evidence. Logan,
UT: Utah State University.
Coalition for Evidence-Based Policy and National Opinion Research Center. (2005). How to
conduct rigorous evaluations of mathematics and science partnerships (MSP) projects
(ED-01-DO0028/0001). Washington, DC: U.S. Department of Education.
Heck, D.J., & Minner, D.D. (2010). Technical report: Standards of evidence for empirical research, math
and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon
Research, Inc.
Yarbrough, D.B., Shulha, L.M., Hopson, R.K., & Caruthers, F.A. (2011). The program evaluation
standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.
Version: June 2014
Funding provided by the National Science Foundation Division of Research on
Learning in Formal and Informal Settings, Award No. 1238120
MSP Project Document
Self-Appraisal
Document Title: ____________________________________________________________________
Reviewer: ____________________________________________________________________
Date: ____________________________________________________________________
Purpose of the Evaluation

Project Description
☐ The project description clearly states the project goals and objectives.
☐ The description provides a rationale for the project by clearly identifying the issue(s) or problem(s) the project is designed to address.
☐ The description provides a concise theory of action that describes how the project will address the issue(s) or problem(s). This theory of action may also be stated as a testable hypothesis.

Intervention
☐ The project intervention is clearly described in terms of activities, events, professional development, scope, type, purpose, etc.

Target Population(s)
☐ The project’s proximal target population(s) (those directly subject to the intervention) is/are clearly defined in terms of role, grade level, subject, etc.
☐ The project’s distal target population(s) (those who indirectly benefit from the intervention) is/are clearly defined in terms of role, grade level, subject, etc.
Evaluation Design and Measurement

Logic Model
☐ The evaluation design is based on a logic model and is addressed in the evaluation plan and reports.
☐ The logic model includes a theory of action for the project.
☐ The logic model clearly shows why project activities are expected to lead to the intended outcomes and impacts.

Evaluation Questions
☐ The evaluation questions are clearly stated.
☐ The evaluation questions address all of the project goals and objectives.
☐ The evaluation questions address key aspects of the logic model (e.g., activities, outputs, outcomes, impacts), including project implementation as appropriate.

Design and Attribution
☐ The evaluation design is clearly articulated (e.g., randomized controlled trial, matched comparison groups, pre-post comparison, case study).
☐ The design addresses how any findings derived from the evaluation can be attributed to the project interventions.
☐ For quantitative studies, independent variables, dependent variables, and other covariates are clearly identified.
☐ The unit of analysis or change is clearly identified and corresponds to the unit of assignment to the treatment.

Measures and Indicators
☐ The evaluation plan or report describes the measures and indicators used to address each evaluation question and how these relate to the expected outcomes.
☐ The data sources or informants (e.g., teachers, principals, students) for each measure or indicator are identified.

Recruitment
☐ The evaluation plan or report clearly describes how the treatment and any comparison groups were recruited or identified.
☐ Any potential bias introduced by the recruitment process is addressed.
Sampling
☐ If the evaluation involves sampling, the evaluation plan or report clearly describes how the sampling was carried out, including how participants were selected and whether the sampling was stratified to include all relevant groups.
☐ Evidence that the sample is representative of the population of interest is provided.
☐ Any potential bias introduced by the sampling process, or potential threats to the sampling process introduced during implementation, is addressed.
☐ Evidence that the sample provides sufficient statistical power is provided.

Grouping
☐ If the evaluation involves assigning participants to groups, the evaluation plan or report clearly describes the criteria and process for group assignment.
☐ A rationale for the comparability of the groups is provided through identification of the matching parameters (e.g., achievement level, demographics, propensity score).

Instrumentation
☐ The evaluation plan or report includes information about the instruments (e.g., surveys, interview protocols, observation protocols, assessments) used to collect data.
☐ Evidence that the instruments are reliable and valid for the purpose for which they are used is provided.
☐ The instruments are described in terms of specific measures, indicators, or constructs relevant to the evaluation design.

Data Collection
☐ The data collection plan describes how each instrument will be used, the data sources or informants, and the frequency and timing of the data collection.
☐ The evaluation plan or report describes how data will be handled to ensure subjects’ confidentiality.

Analysis

Analytic Strategies
☐ The evaluation plan or report clearly describes the quantitative and qualitative analytic strategies used.
☐ The strategies are appropriate for addressing the evaluation questions.
☐ The analytic methods are appropriately rigorous for the goals and type of evaluation.

Threats to Validity and Bias
☐ The evaluation plan or report identifies potential threats to validity (e.g., attrition, low response rates, missing data, group assignment problems) and addresses how these threats could bias the evaluation results.
☐ Explanations of how these threats to validity arose and how they were addressed are provided.
Generalizability, Representativeness, and Utility

Findings
☐ The evaluation plan or report clearly states the evaluation findings and provides adequate documentation to justify and support claims regarding the results (e.g., descriptive statistics, statistical significance, effect size).

Alternative Explanations
☐ Findings are presented objectively and impartially.
☐ Viable alternative explanations are adequately discussed.

Generalizability
☐ The scope of the evaluation (i.e., evaluation design, sample size, etc.) enables the findings to be generalized to a larger population or other projects.

Interpretation
☐ An interpretation of the data that is informative with respect to the project is provided.

Logical Conclusion
☐ The conclusions and implications presented are logically derived from the evaluation findings.