Unit Assessment Handbook

THE UNIVERSITY OF AKRON
COLLEGE OF EDUCATION
UNIT ASSESSMENT HANDBOOK

EDUCATOR AS DECISION MAKER

AUGUST 2009

College of Education Unit Assessment Handbook
TABLE OF CONTENTS

I. Introduction
II. Conceptual Framework Proficiencies
III. Standards Alignment
IV. Development of the Unit's Assessment System
V. Assessment Philosophy
VI. Transition Points
VII. Key Assessments
VIII. Fairness, Accuracy, Consistency, and Elimination of Bias in Candidate Assessments
IX. Relationship of Data Sources
X. Assessment of Support for Candidate Learning
XI. Procedures for Data Collection, Aggregation, Disaggregation, and Dissemination of Data
XII. Analysis and Use of Assessment Data
XIII. Summary
Glossary
References
I. INTRODUCTION

The Unit Assessment System is designed to collect, analyze, and evaluate data that inform
the Unit about candidates' qualifications and performance as they progress through their
programs; candidates' knowledge, skills, and dispositions as they perform in the field; and unit
operations in the course of delivering programs. The Unit Assessment Handbook describes the
comprehensive approach the College of Education (the “Unit”) takes in measuring the
effectiveness of our efforts in preparing candidates for roles in K-12 schools. Its purpose is to
provide all stakeholders with information about the assessment system that collects and analyzes
data on applicant qualifications, candidate and graduate performance, and unit operations in order
to evaluate and improve the performance of candidates, the unit, and its programs at the initial
teacher preparation and advanced program levels.

The assessment system is based on the Unit’s Conceptual Framework (The University of
Akron, 2008). This framework guides the Unit in achieving its primary goal of providing
educators with the knowledge, skills, and dispositions to become effective decision makers within
the education profession. The handbook therefore begins with a description of our Conceptual
Framework, which serves as our foundation of practice. The assessment system is also
standards-based: alignment matrices have been developed to show how assessments correspond
to the conceptual framework and to applicable state and national standards.

Within this context, the development of the Unit’s Assessment System is detailed for the
reader, including a list of the evaluation tools used to assess candidates and the unit. Procedures
for the collection and dissemination of data are outlined, as are procedures that help
ensure fairness, accuracy, consistency, and the elimination of bias. A master timeline for the
collection of these evaluations is provided.
II. CONCEPTUAL FRAMEWORK PROFICIENCIES
The College of Education (COE) is guided by its Conceptual Framework which has the
theme of Educator as Decision Maker (University of Akron, 2008). Four components of
professional practice identified in the conceptual framework undergird the assessment system:
Knowledge, Technology, Diversity, and Ethics. Proficiencies for each of the four components
have been identified. It is the expectation that all candidates in initial and advanced programs will
meet the proficiencies listed below:
A. Knowledge
Candidates will:
K1. demonstrate knowledge of the content necessary for optimum practice and/or
research in their respective employment settings (content knowledge).
K2. demonstrate an understanding of students’ and individuals’ cognitive, social,
academic, linguistic, physical, and emotional development to explain and present
content in multiple ways that facilitate cognitive, academic achievement, linguistic,
physical and affective development (pedagogical knowledge).
K3. demonstrate knowledge of the interaction of subject matter and effective
strategies to make cognitive, academic achievement, linguistic, physical and
affective growth attainable for all students and individuals (pedagogical content
knowledge).
K4. demonstrate an understanding of professional, state and institutional standards,
the role of assessment, and the use of formative and summative assessments.
B. Technology
Candidates will:
T1. demonstrate an ability to integrate appropriate technology to facilitate learning
and development for all students and individuals.
T2. demonstrate an ability to use technology for assessment, analysis of data, and
research to support and enhance student learning and individual development.
C. Diversity
Candidates will:
D1. demonstrate the knowledge, skills and dispositions necessary to meet the
individual needs of students and individuals based on gender, socio-economic
status, race, ethnicity, sexual orientation, religion, language, and exceptionalities
(both disabilities and giftedness).
D2. demonstrate dispositions that value fairness and learning for all students and
individuals.
D. Ethics
Candidates will:
E1. demonstrate an ability to collaborate and communicate with other educators,
administrators, community members, students and parents to support student
learning.
E2. demonstrate knowledge of and adherence to the roles and responsibilities of the
profession and to respective professional ethics and codes of conduct including the
Licensure Code of Professional Conduct for Ohio Educators.
E3. demonstrate an ability to reflect on their effectiveness in helping all students or
individuals learn and develop to their fullest potential.
A full text version of the Conceptual Framework is available at:
http://www.uakron.edu/colleges/educ/docs/CF-Fall08.pdf
III. STANDARDS ALIGNMENT
The College of Education’s assessment system is standards-based. The Ohio
Standards for the Teaching Profession (Ohio Department of Education, 2007) have been
aligned with the COE's Conceptual Framework and with numerous national standards,
including the Interstate New Teacher Assessment and Support Consortium (INTASC)
standards, the National Council for Accreditation of Teacher Education (NCATE) standards,
Praxis II, Praxis III, NBPTS, and the Value-Added metric as applied in Ohio. The Ohio
Principal Standards (Ohio Department of Education, 2007) have been aligned with the
Interstate School Leaders Licensure Consortium (ISLLC) standards, Educational Leadership
Constituents Council (ELCC) standards, NCATE standards, Praxis II Educational Leadership
Test categories, and the Value-Added metric as applied in Ohio. The master alignment for
each is summarized below and is provided in Excel format: Ohio Standards for Teachers
Alignment Matrix with Conceptual Framework and Ohio Standards for Principals Alignment
Matrix with Conceptual Framework.
Ohio's Standards for the Teaching Profession

The alignment matrix maps each teacher standard and its elements to the UA Conceptual
Framework proficiencies, the INTASC principles, NCATE standards, Praxis II, Praxis III,
NBPTS, the Value-Added metric, and the Unit Assessments. The seven standard areas are:

1. Students: Teachers understand student learning and development, and respect the diversity of the students they teach.
2. Content: Teachers know and understand the content area for which they have instructional responsibility.
3. Assessment: Teachers understand and use varied assessments to inform instruction, evaluate and ensure student learning.
4. Instruction: Teachers plan and deliver effective instruction that advances the learning of each individual student.
5. Learning Environment: Teachers create learning environments that promote high levels of learning and achievement for all students.
6. Collaboration and Communication: Teachers collaborate and communicate with other educators, administrators, students and parents and the community to support student learning.
7. Professional Responsibility and Growth: Teachers assume responsibility for professional growth, performance, and involvement as an individual and as a member of a learning community.

Unit Assessments referenced in the matrix: 1. GPA, 2. Praxis II PLT, 3. Praxis II Content,
4. Student Teaching Evaluation, 5. Impact on Student Learning, 6. Dispositions Assessment.
The complete cell-level alignments are provided in the Ohio Standards for Teachers Alignment
Matrix with Conceptual Framework (Excel format).
Ohio's Principal Standards

The alignment matrix maps each principal standard and its elements to the UA Conceptual
Framework proficiencies, the ELCC standards, NCATE standards, the ISLLC standards,
Praxis, and the Value-Added metric. The standard areas include:

1. Principals help create a shared vision and clear goals for their schools and ensure continuous progress toward achieving those goals.
2. Principals support the implementation of high-quality standards-based instruction that results in higher levels of achievement for all students.
3. Principals allocate resources and manage school operations in order to ensure a safe and productive learning environment.
4. Principals establish and sustain collaborative learning and shared leadership to promote learning and achievement of all students.
5. Principals engage parents and community members in the educational process and create an environment where community resources support student learning, achievement, and well-being.

The complete cell-level alignments are provided in the Ohio Standards for Principals Alignment
Matrix with Conceptual Framework (Excel format).
IV. DEVELOPMENT OF THE UNIT’S ASSESSMENT SYSTEM
The College of Education developed a program assessment model based on the
philosophy of assessment developed by the Graduate Studies Committee in 1998. The model
was originally approved by College faculty in August 2000, at which time departments began
preparing assessment plans for specific initial and advanced programs. In 2001, faculty from
the various licensure area met during a day-long retreat to design an assessment using
guiding questions from Campbell, Melenyzer, Nettles, and Wyman (2000). What do our
candidates know and what can they do when they graduate? How will we assess the extent
to which our candidates have attained the standards that we have adopted? What type of
evidence will we offer to indicate quality? A standard format for the portfolio was developed
that included a section of assignments in the professional and pedagogical core that reflect
the Ohio/INTASC, a section of assessments based on the standards of the Specialized
Professional Association Standards guiding the specific program, a reflection prior to student
teaching, and a culminating assessment at the end of student teaching. Faculty from each
program area determined the assessments that would best reflect the standards of the
program. As the portfolio design was developed, it was reviewed by school partners from the
P-12 community and from candidates enrolled in the teacher preparation program.
As the NCATE program review process was revised, program area faculty met to
review performance assessments and make revisions to the assessments and accompanying
rubrics as indicated. With the impetus provided by the increased emphasis on advanced
programs, faculty in these programs met to analyze assessments being used in these programs
and make the revisions and additions required. In 2007-2008, a Professional Education
Council (PEC) NCATE Standard 2 workgroup led the effort to review the model, analyze the
alignment with the NCATE 2008 standards, and include explicit links to Conceptual
Framework proficiencies. PEC, a standing committee of the college, has wide representation
from the professional community which includes COE faculty, Arts & Sciences faculty, Fine
and Applied Arts Faculty, Dean (or designee), NCATE coordinator, and P-12 educators. The
model collaboratively developed by this group reflects a systemic approach to the collection,
aggregation, and analysis of data at critical points in the program to evaluate candidate
learning and develop plans for the improvement of programs.
V. ASSESSMENT PHILOSOPHY
The College of Education seeks to cultivate a culture in which assessment is an
essential part of teaching and learning. Assessment and evaluation are extremely important
elements for the improvement of academic programs and for both internal and external
accountability. Many of the assessment activities performed by the College of Education are
required for professional accreditation. The College recognizes the importance of assessment
and evaluation as tools for decision-making and increasing College effectiveness.
The College of Education has developed an outcomes assessment program intended
to provide an ongoing review of the College’s effectiveness. The program for assessing
effectiveness has three specific and complementary purposes:
1. to improve candidate learning and performance,
2. to improve programs, program planning, and program development, and
3. to improve support for programs and candidate learning.
These purposes will be achieved by gathering and compiling information on the extent of the
College’s accomplishments in achieving defined purposes and using such information for
planning and program improvements. The College of Education’s assessment efforts can be
characterized as:
• Integrated
• Participatory
• Comprehensive
Each of these facets is described in detail below.
Integrated – Assessment efforts within the College of Education begin with the mission
of the College. Academic programs, candidate support services, and other college
activities should work together to fulfill the mission. Assessments within the College are
directly related to the mission as identified in the Conceptual Framework. The assessment
program is intended to be an integral part of the institutional assessment process of
planning, review, and revision.
Participatory – The College of Education’s assessment program is an ongoing
collaborative effort by faculty, staff, administrators, and the extended professional
community. The College follows a combined centralized/decentralized approach
to assessment, with departments and faculty groups responsible for establishing and
assessing specific candidate outcomes. The administration’s role is to coordinate and
document assessment activities occurring at the department level, coordinate college-wide
activities, and provide college data to various constituencies. It is an administrative
responsibility to ensure that assessment activities provide useful and usable data in a
cost-effective manner.
Comprehensive – Assessment activities in the College reflect the following areas of
concentration:
• Candidates: Outcomes Assessment – Assessment of Candidate Learning
• Programs: Academic Program Evaluation
• Support: Field Experiences, Diversity, Faculty, and Governance
These areas assess the effectiveness of all college functions, with the highest priority
placed on the assessment of candidate learning and effectiveness. The sections that follow
address the assumptions, structure, and focus of the College’s assessment efforts. It should be
noted that assessment efforts in the College of Education began in earnest in 1992 with the
College’s short-term assessment models. These models established the groundwork for the
current, more comprehensive model.
General Assumption
The development of an assessment framework presumes a reference base. The
College of Education has identified principles that are appropriate to assessment at every level
(candidate, program, and faculty) and that guide the assessment practices employed. These
principles serve as a guide for all assessment activities in the college. The first nine principles,
quoted directly below, were developed under the auspices of the American Association for
Higher Education's Assessment Forum with support from the Fund for the Improvement of
Postsecondary Education with additional support for publication and dissemination from the
Exxon Education Foundation (Astin et al., 1996). The principles are patterned on Chickering
and Gamson’s (1987) Seven Principles of Good Practice in Undergraduate Education. The
tenth principle was offered by Banta, Lund, Black, and Oblander (1996) in Assessment in
Practice: Putting Principles to Work on College Campuses.
AAHE Assessment Forum
9 Principles of Good Practice for Assessing Student Learning
1) The assessment of student learning begins with educational values.
Assessment is not an end in itself but a vehicle for educational improvement. Its
effective practice, then, begins with and enacts a vision of the kinds of learning we
most value for students and strive to help them achieve. Educational values should
drive not only what we choose to assess but also how we do so. Where questions
about educational mission and values are skipped over, assessment threatens to be an
exercise in measuring what’s easy, rather than a process of improving what we really
care about (Astin et al.,1996).
2) Assessment is more effective when it reflects an understanding of learning as
multidimensional, integrated, and revealed in performance over time.
Learning is a complex process. It entails not only what students/candidates know but
what they can do with what they know; it involves not only knowledge and abilities
but values, attitudes, and habits of mind that affect both academic success and
performance beyond the classroom. Assessment should reflect these understandings
by employing a diverse array of methods, including those that call for actual
performance, using them over time so as to reveal change, growth, and increasing
degrees of integration. Such an approach aims for a more complete and accurate
picture of learning, and therefore firmer bases for improving our candidates’
educational experience (Astin et al.,1996).
3) Assessment works best when the programs it seeks to improve have clear,
explicitly stated purposes.
Assessment is a goal-oriented process. It entails comparing educational performance
with educational purposes and expectations – these derived from the institution’s
mission, from faculty intentions in program and course design, and from knowledge
of students’ own goals. Where program purposes lack specificity or agreement,
assessment as a process pushes a campus toward clarity about where to aim and what
standards to apply; assessment also prompts attention to where and how program
goals will be taught and learned. Clear, shared, implementable goals are the
cornerstone for assessment that is focused and useful (Astin et al.,1996).
4) Assessment requires attention to outcomes but also and equally to the
experiences that lead to those outcomes.
Information about outcomes is of high importance; where students “end up” matters
greatly. But to improve outcomes, we need to know about student experience along
the way – about the curricula, teaching, and kind of student effort that lead to
particular outcomes. Assessment can help us understand which students learn best
under what conditions; with such knowledge comes the capacity to improve the
whole of their learning (Astin et al.,1996).
5) Assessment works best when it is ongoing, not episodic.
Assessment is a process whose power is cumulative. Though isolated, “one-shot”
assessment can be better than none, improvement is best fostered when assessment
entails a linked series of activities undertaken over time. This may mean tracking the
progress of cohorts of students; it may mean collecting the
same examples of student performance or using the same instrument semester after
semester. The point is to monitor progress toward intended goals in a spirit of
continuous improvement. Along the way, the assessment process itself should be
evaluated and refined in light of emerging insights (Astin et al.,1996).
6) Assessment fosters wider improvement when representatives from across the
educational community are involved.
Student learning is a campus-wide responsibility, and assessment is a way of enacting
that responsibility. Thus, while assessment efforts may start small, the aim over time
is to involve people from across the educational community. Faculty play an
especially important role, but assessment’s questions can’t be fully addressed without
participation by student services educators, librarians, administrators, and students.
Assessment may also involve individuals from beyond the campus (alumni/ae,
trustees, employers) whose experience can enrich the sense of appropriate aims and
standards for learning. Thus understood, assessment is not a task for small groups of
experts but a collaborative activity; its aim is wider, better-informed attention to
student learning by all parties with a stake in its improvement (Astin et al.,1996).
7) Assessment makes a difference when it begins with issues of use and
illuminates questions that people really care about.
Assessment recognizes the value of information in the process of improvement. But
to be useful, information must be connected to issues or questions that people really
care about. This implies assessment approaches that produce evidence that relevant
parties will find credible, suggestive, and applicable to decisions that need to be
made. It means thinking in advance about how the information will be used, and by
whom. The point of assessment is not to gather data and return “results”; it is a
process that starts with the questions of decision-makers, that involves them in the
gathering and interpreting of data, and that informs and helps guide continuous
improvement (Astin et al.,1996).
8) Assessment is most likely to lead to improvement when it is part of a larger
set of conditions that promote change.
Assessment alone changes little. Its greatest contribution comes on campuses where
the quality of teaching and learning is visibly valued and worked at. On such
campuses, the push to improve educational performance is a visible and primary goal
of leadership; improving the quality of undergraduate education is central to the
institution’s planning, budgeting, and personnel decisions. On such campuses,
information about learning outcomes is seen as an integral part of decision making,
and avidly sought (Astin et al.,1996).
9) Through assessment, educators meet responsibilities to students and to the
public.
There is a compelling public stake in education. As educators, we have a
responsibility to the publics that support or depend on us to provide information about
the ways in which our students meet goals and expectations. But that responsibility
goes beyond the reporting of such information; our deeper obligation – to ourselves,
our candidates, and society – is to improve. Those to whom educators are accountable
have a corresponding obligation to support such attempts at improvement (Astin et
al.,1996).
10) Assessment is most effective when undertaken in an environment that is
receptive, supportive, and enabling.
More specifically, successful assessment requires an environment characterized by
effective leadership, administrative commitment, adequate resources (for example,
clerical support and money), faculty and staff development opportunities, and time.
(Banta et al., 1996).
Assessment Measures
Methodologies provide the vehicle to obtain data for evaluation of effectiveness.
Relative to assessment activities in the College of Education, multiple measures better assure
a well-rounded assessment, especially of candidate learning and performance. Recognizing the
need for alternatives to standardized testing, the COE agrees that standardized
tests provide limited measures of learning, that their overuse narrows the curriculum, that
they are poor diagnostic tools, and that they do not reflect or capture the diversity of
students’ backgrounds and experiences (Darling-Hammond, Ancess, & Falk, 1995;
Darling-Hammond, 1999). Assessment works best when it is embedded and ongoing (Stroble, 2000).
Assessment activities in the COE focus on both content standards and performance
standards. Content standards identify what is important to learn and performance standards
describe what students should be able to do with what they know, i.e., the kind of
performance that will be assessed. Performance indicators must be varied to allow for
diverse and complex kinds of student learning. The interpretation of the data is used in both
formative and summative contexts. Methodologies used may include:
• Standardized tests of basic skills and academic aptitudes (e.g., Praxis)
• Performance assessments embedded in courses
• Observations
• Attitude inventories
• Alumni surveys/focus groups
• Persistence studies
• Exit surveys/interviews
• Capstone courses
• Portfolio analysis
Use of Results
The results of assessment activities are valuable tools for decision-making and
improvement. They serve to increase effectiveness and to meet the stated
objectives of improving candidate learning and performance and improving programs,
program planning, and program development. On a yearly basis, data are aggregated and
analyzed by appropriate decision-makers. On the basis of these data, program and support
improvements are made. In this way, a continuous cycle of improvement has been
established. Results of assessment activities can be demonstrated in various ways:
• Modifications of course assignments, assessments, and rubrics
• Changes in instructional styles
• Reorganization of courses
• Development of courses
• Elimination of courses
• Changes in major requirements
• Changes in admission or exit requirements
• Modification of course and teaching evaluation instruments
• Modification of course schedules
• Revision of syllabi and changes in course emphasis
• Provision of additional or specialized technology facilities for candidates and faculty
• Addition of capstone courses
• Development of portfolio assessment within courses or programs
• Revision of student services activities
• Adjustments in operating procedures
Structure
As previously stated, assessment in the College of Education is comprehensive,
reflecting three areas of concentration: candidates, program, and support for candidate
learning.
Candidates – Outcomes Assessment: Assessment of Candidates
Unit Assessment charts were developed for initial teacher preparation programs.
The transition points for initial teacher preparation programs are as follows:
1) Program entry – These assessment activities focus on those indicators identified to
allow entry into the College of Education or a specific program within the college.
The criteria might include standardized test scores, writing sample, interviews,
completion of required coursework in general education, and/or performance
assignments.
2) Entry to extended field/clinical experience – These assessment activities focus on
those indicators identified to evidence competence as progress is made. Activities in
initial teacher education programs might, for example, reflect the INTASC standards,
specialized program association (SPA) standards, and the domains of Praxis. These
standards guide the initial teacher preparation program and assessments performed
with a focus on the competencies to indicate progression. Syllabi clearly reflect the
expected course outcomes and identify the standards that are introduced or reinforced
during the course. In addition, competency in the content knowledge demonstrated
through Praxis specialty area scores is required.
3) Exit from extended field/clinical experience – These assessment activities focus on
indicators of initial teacher competencies. Evidence of a candidate’s impact on
student learning should be a major part of this assessment point. The criteria include
student teaching evaluations, portfolio components, and candidate reflection on
his/her own performance and decision-making.
4) Program completion – At this point, assessment focuses on evidence of
achievement of program or College criteria. These indicators include successful
completion of coursework designed to provide the content, professional, and
pedagogical knowledge for beginning teachers and an acceptable evaluation of a
portfolio.
5) Follow-up – Praxis III observational evaluations of candidates’ performance in
their first two years of teaching were implemented in Fall 2002 and were required
throughout the 2008-2009 academic year. In addition, focus group interviews are
conducted to collect qualitative data on the effectiveness of the programs. Faculty and
school colleagues review this data and data collected through surveys and focus group
interviews on an annual basis for the purpose of analyzing and improving program
quality.
The transition points established for advanced teacher preparation programs are:
1) Program entry – These assessment activities focus on those indicators identified
to allow entry into the College of Education or a specific program within the
college. The criteria might include standardized test scores, writing sample,
and/or interviews.
2) Midpoint – Progress is reviewed and evaluated at the point of advancement to
candidacy.
3) Program completion – At this point, assessment focuses on evidence of
achievement of program or College criteria.
4) Follow-up – Alumni surveys are periodically conducted by Institutional Research.
In addition, focus group interviews are conducted to collect qualitative data on the
effectiveness of the programs.
Program – Academic Program Evaluation
Program evaluation in the college is a systematic, ongoing process that is considered
a routine feature of the work. While program evaluations necessarily focus on academic
learning as evidenced by candidate outcome assessment, other components such as the
degree to which a program supports the academic mission of the college and the Conceptual
Framework should be considered. Together, these statements provide the overall guiding
framework for the operation of the individual programs. Additionally, advising, student
services, and human and financial support resources should be reflected in the program
evaluation.
Following the general mission statements and the more specific program purposes
statement, program assessment plans address the three general components of the College of
Education Outcomes Assessment System: program goals and standards, assessment criteria
and procedures, and use of results.
1) Program goals and standards – The goals for candidate outcomes should be
programmatically identified through relevant Specialized Program Associations,
accrediting bodies, licensure, and/or other faculty determined requirements or
expectations.
2) Assessment criteria and procedures – This area identifies the criteria and related
procedures that are used to assess the success of the program in assuring that the
program goals are met. As data are collected on candidates at the transition points, the
data are aggregated and analyzed. Trend data reports are produced and presented to
stakeholders for analysis.
3) Use of assessment results - The assessment system requires regular and
systematic review and use of performance operations data to initiate changes in
programs and unit operations.
Support
• Field experience, student teaching assignments, and internship data are collected
and trend data reported.
• Data on diversity of candidates and faculty are collected and trend data reported.
• Faculty, administration, and staff are responsible for fulfilling the mission of the
College of Education. Their collective performance contributes to the
effectiveness of the College; therefore, evaluation is a necessary component.
- Staff and administrative assessment occurs through annual institutional
  performance appraisals.
- The performance of faculty for tenure and promotion is based on
  guidelines outlined in the College of Education Retention, Promotion, and
  Tenure document. These guidelines reflect the department, college and
  University missions, and faculty members are evaluated according to their
  contributions to the objectives of the department, College and University
  relative to the areas of research, teaching, and service.
- Department faculty are evaluated for merit as per the merit guidelines
  developed by each department. Each faculty member needs to meet the
  minimum criteria to be eligible for across-the-board and merit raises.
- Department Chairs develop annual Professional Development Plans with
  each of their faculty members to spell out, in detail, the mutually agreed
  upon objectives the faculty member will work towards for that year in the
  areas of research, teaching, and service. Chairs and faculty review these
  plans at the end of each year, not only to assist in the retention, promotion,
  and tenure process, but also as they relate to merit and improvement. The
  process of professional development planning also helps the department
  chairs plan for the following year.
- Candidates evaluate the effectiveness of their instructors’ teaching at the
  end of each course, each semester.
Operations data that are produced for annual review include:
• Enrollment data
• Graduation statistics
• Retention and time-to-degree data
• Candidate complaints record/documentation of resolution
• Advisor/advisee assignment lists
• Budget
• Personnel
• Facilities
• Unit resources, including technology
• Specific study data and university comparison data
Assessment Models
The College of Education assessment model has been developed for the previously
identified comprehensive areas of candidates, program, and support. Developed
collaboratively, this model reflects criteria deemed critical to evaluate and evidence
candidate learning, candidate performance, program effectiveness, and effective support for
candidates and programs. Furthermore, this model includes a cyclical process that involves
aggregation of data and review by decision-makers for the purpose of improving programs
and policies. This constitutes a continuous improvement process.
VI. TRANSITION POINTS

In accordance with the assessment philosophy, the College of Education has
identified transition points at which candidate performance and progress are evaluated. These
points for initial and advanced levels are grouped according to program and are described in
the following documents:
Transition Points – Initial Teacher Licensure

Admission [Candidate not admitted until requirements are met]
• Background Clearance Investigation
• Computer Literacy Test (hands-on test)
• GPA (2.50 or higher)
• PRAXIS I scores: Reading (173), Writing (172), Math (172); or SAT score of 1050 or higher or ACT score of 22 or higher; or B or better in general education English and math courses
• General Education courses: 30 semester hours distributed as indicated by audit sheets (UG)
• Admission to Graduate School (G)

Entry to Student Teaching [Candidate not admitted until requirements are met]
• GPA: 2.50 overall, in education courses, and in major
• Portfolio (Core, Content, Reflective Essay): copy of review on file with appropriate signature
• PRAXIS II content test

Exit from Student Teaching [Candidate cannot complete Student Teaching until requirements are met]
• Student Teaching Evaluation (PRAXIS III-based)
• Student Teaching Evaluation (SPA specific)
• Completers’ Surveys
• Impact on Student Learning

Program Completion (Licensure Application) [Candidate cannot be recommended for licensure until requirements are met]
• Portfolio (all items, including Core, Content, Reflective Essay, Impact on Student Learning, and Student Teaching Evaluations)
• Program Completers Surveys
• Cooperating Teacher Survey
• PRAXIS II Principles of Learning and Teaching (UG/G)
• Comprehensive Examination (G only)
• Degree Clearance: 128 credits minimum, 2.50 GPA overall, 2.50 in education courses, 2.50 in major (UG)

Follow-up
• PRAXIS III Evaluations
• Focus Group Interviews
• Employers’ Survey

Responsibility for these assessments rests with the Office of Student Services, the Technology
Coordinator, the Data Manager, the Licensure Officer, cooperating and supervising teachers,
the Director of Student Teaching, colloquium instructors, faculty and faculty advisors, the
Department of Curricular & Instructional Studies, candidates, the state Pathwise evaluation
(ODE), and the Assessment Director. Each assessment serves candidate assessment purposes
(dispositions; technology; content knowledge; and content, professional, and pedagogical
knowledge, skills, and dispositions) or program assessment purposes.
Transition Points – Master's Programs in Curricular and Instructional Studies (C&I)

Admission [Candidate not admitted until requirements are met]
• GPA
• Professional Experience
• Graduate School Admission
Responsibility: C&I Department Chair; COE Office of Student Services; Graduate School (which sends a letter of acceptance with specific information). Purpose: candidate assessment (content knowledge).

Mid-point [Candidate cannot progress in program until requirements are met]
• Advancement to Candidacy and graduation: a) B or better in 15 credit hours of program course work; b) successful completion of the field experience in 5610:605 or 5500:600, as evidenced by a score of 3 (target) or 2 (acceptable); and c) a score of 3 (target) or 2 (acceptable) on the Dispositions Measurement in 5610:605 or 5500:600
Responsibility: C&I Faculty Advisor signature asserting completion of 15 credits and acceptable scores on the field experience and dispositions; C&I faculty rating of the candidate's performance in 5500:600 or 5610:605; Office of Student Services. (NOTE: This requirement is outlined on the Program Course Distribution Plan (PCD). The candidate's acceptance letter into the program instructs the candidate to meet with the assigned advisor to complete the PCD; during this time, the C&I advisor explains each requirement, including the Mid-point assessment described above.) Purpose: candidate assessment (content knowledge, professional growth, and dispositions).

Program Completion [Candidate cannot complete program until requirements are met]
• Successful completion of the Master's Written Comprehensive Examination, as evidenced by a score of 3 (target) or 2 (acceptable) on each section of the examination
• Capstone assessment: 1) successful completion of the Master's Research Project/Problem, as evidenced by a score of 3 (target) or 2 (acceptable) on each section; 2) a score of 3 (target) or 2 (acceptable) on the Dispositions Measurement; 3) Completers Survey (submitted during the final class meetings of the Master's Research Project/Problem course)
Responsibility: C&I faculty rating of the candidate's performance on the Comprehensive Examination; C&I faculty rating of the candidate's performance on the Master's Research Project/Problem and Dispositions; C&I faculty collect the Completers Survey. Purpose: candidate assessment (content, professional and pedagogical knowledge, and dispositions).

Follow-up
• Alumni Surveys
• Focus Group Interviews
Responsibility: Institutional Research; COE Assessment Office. Purpose: program assessment.
Transition Points – Classroom Guidance for Teachers Master's Program in Counseling

Admission [Candidate not admitted until requirements are met]
• Bachelor's degree
• GPA: 2.75 or higher (full admission into the Graduate School)
• Teaching Certificate/License
• Professional Experience
Responsibility: School Counseling Coordinator; COE Office of Student Affairs; Graduate School (which sends a letter of acceptance with specific information). Purpose: candidate assessment (content knowledge).

Mid-point [Candidate cannot progress in program until requirements are met]
• Advancement to Candidacy and graduation: a) B or better in 15 credit hours of program course work; b) successful completion of the field experience in 5600:695 with a B; and c) passing the annual candidate review by School Counseling faculty on the Statement of Expectations for Counseling Students
Responsibility: Counseling Faculty Advisor signature asserting completion of 15 credit hours (NOTE: This requirement is outlined on the Program Course Distribution Plan (PCD). The candidate's acceptance letter into the program instructs the candidate to meet with the assigned advisor to complete the PCD; during this time, the Counseling advisor explains each requirement, including the Mid-point assessment described above.); School Counseling faculty review of field experience performance and annual review of candidates. Purpose: candidate assessment (content knowledge; Statement of Expectations for Counseling Students).

Program Completion [Candidate cannot complete program until requirements are met]
• Successful completion of the Master's Comprehensive Examination, as evidenced by a score of at least 53 of 75 questions (70%)
• Completers Survey (submitted during the comprehensive examination)
Responsibility: review of scores by the School Counseling Coordinator; review by School Counseling faculty. Purpose: candidate assessment (content, professional and pedagogical knowledge); program assessment (Completers Survey).

Follow-up
• Alumni Surveys
• Focus Groups
Responsibility: Assessment Office; Institutional Research. Purpose: program assessment.
Transition
Admission
[Candidate not admitted
until requirements are
met.]
Mid-point
Transition Points - Master's Programs in Educational Foundations and Leadership
Master’s Programs in Instructional Technology, Technology Facilitation Endorsement,
Principalship Master's, and Principalship Licensure Programs
Assessment
Responsibility
Purpose
GPA
Professional Experience
Graduate School Admission
Assistant Department Chair (EFL)
Candidate Assessment
(Content knowledge)
Teaching Experience for Advanced Programs
Office of Student Services
Confirmation of a completed undergraduate
degree
Program Course Requirements
Graduate School
Candidate Assessment
(Experience required for Endorsements, and Licensures)
Candidate Assessment (Content Knowledge)
Advisor
Candidate Assessment (Rubrics to assess meeting
program standards, including aligned dispositions)
Advancement to Candidacy and graduation
a) 3.0 GPA or better in at least 12 credit hours
of program course work and
b) Successful completion of Master's Portfolio
with a score of 3 (target) or 2 (acceptable)
overall on the portfolio rubric
EFL Faculty Advisor signature
asserting completion of at least 12
credits of 3.0 GPA or better.
EFL Faculty rating of the candidate's
performance on final program
portfolio
Candidate Assessment
Master’s Project
Office of Student Services
Completers’ Survey
Focus Group Interviews
Candidates
Assessment Office
Candidate Assessment (Content, professional and
pedagogical knowledge, and dispositions)
Program Assessment
Program Assessment
[Candidate can not
progress in program until
requirements are met.]
Program Completion
[Candidate can not
complete program until
requirements are met.]
Follow-up
29
Candidate Assessment (Content, professional and
pedagogical knowledge, and dispositions)
Transition Points – Doctoral Program in Educational Foundations and Leadership (Ed.D)
Assessment | Responsibility | Purpose

Admission [Candidate not admitted until requirements are met.]
GPA | Office of Student Services | Candidate Assessment (Content Knowledge)
Controlled Writing Sample | Office of Student Services | Candidate Assessment (Content Knowledge)
Interview | Office of Student Services | Candidate Assessment (Content, professional and pedagogical knowledge, and dispositions)
GRE Score | Office of Student Services | Candidate Assessment (Content Knowledge)

Mid-Point [Candidate cannot progress in program until requirements are met.]
Internship | Faculty Advisor; Office of Student Services | Candidate Assessment (Content knowledge, professional and pedagogical knowledge, and dispositions)
Advancement to Candidacy | Faculty Advisor; Office of Student Services | Candidate Assessment

Program Completion [Candidate cannot complete program until requirements are met.]
Dissertation Proposal | Faculty Advisor; Office of Student Services | Candidate Assessment (Content knowledge, professional and pedagogical knowledge, and dispositions)
Dissertation Defense | Faculty Advisor; Office of Student Services | Candidate Assessment (Content knowledge, professional and pedagogical knowledge, and dispositions)

Follow-up
Completers Survey; Focus Group Interviews | Candidates; Assessment Office | Program Assessment

30
VII.
KEY ASSESSMENTS
Admissions criteria have been established for all programs. Once admitted to a
program, key assessments are in place to monitor the progress of candidates as they move
through and complete the programs.
For initial teacher preparation programs, performance assessments that address
Professional and Pedagogical Knowledge and Skills for Teacher Candidates have been
implemented. There are thirteen separate assessments at the undergraduate level and ten at
the graduate level that have been aligned with both the Ohio/INTASC Standards and the
Ohio Standards for the Teaching Profession (Ohio Department of Education, 2007). For each
initial and advanced licensure program, there are six to eight key assessments. These have
been aligned with the Specialized Professional Association (SPA) Standards which are
specific to teaching fields. Collectively, they delineate what candidates should know and be
able to do within their chosen teaching field. The assessments are completed at various points
during the programs and are reviewed at the identified transition points. Some assessments
are unique to a specific program; others are unit-wide assessments. For initial teacher
preparation, the key assessments constitute the candidate assessment portfolio. [Portfolio
checklists are located through the following link:
http://www.uakron.edu/dotAsset/576616.pdf.]
At the advanced level, assessments have also been implemented to determine what
candidates know and are able to do and to evaluate the rigor and effectiveness of the
programs. The assessments for the Master’s Degree in Educational Administration/
Principalship and the Post-Master’s Principalship Licensure Program are aligned with
Educational Leadership Constituent Council (ELCC) standards. The assessments for the
Master’s Degree in Instructional Technology and the Technology Facilitation Endorsement
are aligned with the standards of the International Society for Technology in Education (ISTE). The advanced
programs in Curricular and Instructional Studies reflect the Ohio Standards for the Teaching
Profession which have been aligned with the National Board for Professional Teaching
Standards (NBPTS).
The Ed.D in the Department of Educational Foundations and Leadership primarily
prepares candidates for continuing roles in K-12 schools. Therefore, transition points and
corresponding assessments that are reviewed at each point have been identified for this
program.
Details of the key assessments are reflected in the following tables.
31
INITIAL TEACHER PREPARATION PROGRAMS
Professional and Pedagogical Core

Professional and Pedagogical Core – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | Ohio/INTASC Standards* | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Beginning Philosophy of Education (5100:200) | C, D, E, G, H, I, J | 2 – Entry to Student Teaching | Unit | Tk20
2 | Field Synthesis Report (5100:200) | B, C, I | 2 – Entry to Student Teaching | Unit | Tk20
3 | Comprehensive Project (5100:220) | B, C, H, I | 2 – Entry to Student Teaching | Unit | Tk20
4 | Field Synthesis Report (5100:220) | B, C, I | 2 – Entry to Student Teaching | Unit | Tk20
5 | Electronic Presentation (5500:230) | D, E, F, G | 2 – Entry to Student Teaching | Unit | Tk20
6 | Field Report (5610:225) | C, D, E, F, G, J | 2 – Entry to Student Teaching | Unit | Tk20
7 | Multicultural Pedagogical Project (5100:300) | C, G, I, J | 2 – Entry to Student Teaching | Unit | Tk20
8 | Unit Plan (5500:360) | B, C, D, E, F, H, J | 2 – Entry to Student Teaching | Unit | Tk20
9 | Lesson Plan (5500:360) | B, C, D, E, F, H | 2 – Entry to Student Teaching | Unit | Tk20
10 | Management Plan (5500:360) | C, E, F, J | 2 – Entry to Student Teaching | Unit | Tk20
11 | Personal Management Plan (5500:370) | B, C, E, F, J | 2 – Entry to Student Teaching | Unit | Tk20
12 | Assessment Plan (5500:370) | B, F, G, H, I, J | 2 – Entry to Student Teaching | Unit | Tk20
13 | Praxis II – Principles of Learning & Teaching | B, C, D, F, G, H, I, J | 4 – Program Completion | Unit | COE Database

Professional and Pedagogical Core – Master's with Licensure
Assessment | Assessment Name | Ohio/INTASC Standards* | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Philosophy of Education Statement (5100:604) | C, D, E, G, H, I, J | 2 – Entry to Student Teaching | Unit | Tk20
2 | Comprehensive Project (5100:620) | B, C, H, I | 2 – Entry to Student Teaching | Unit | Tk20
3 | Field Synthesis Report (5100:695) | B, C, I | 2 – Entry to Student Teaching | Unit | Tk20
4 | Candidate Created Assessment (5100:642) | B, D, G, H, I | 2 – Entry to Student Teaching | Unit | Tk20
5 | Classroom Management Plan (5500:619)** | B, C, F, G, I, J | 2 – Entry to Student Teaching | Unit | Tk20
6 | Unit Plan, Lesson Plan, Micro-Teach Rubric (5500:617)** | D, E, G, H, I | 2 – Entry to Student Teaching | Unit | Tk20
7 | Case Study Presentation (5500:617)** | B, C, D, F, H, I, J | 2 – Entry to Student Teaching | Unit | Tk20
8 | Praxis II – Principles of Learning & Teaching | B, C, D, F, G, H, I, J | 4 – Program Completion | Unit | COE Database

* Ohio/INTASC Standards have been aligned with the Ohio Standards for the Teaching Profession
** Content specific courses required in the Intervention Specialist programs
32
Specialized Professional Association (SPA) Key Assessments

Early Childhood Education – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | NAEYC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Education of Young Children (#20021) | 1, 2, 3, 4, 5 | 2 – Entry to Student Teaching | Unit | COE Database
2 | Make Learning Visible (5200:325) | 1, 2, 3 | 2 – Entry to Student Teaching | Unique | Tk20
3 | Collaborative Primary Unit Plan (5200:425) | 3, 4 | 2 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation/NAEYC Specific Evaluation (5200:495/496) | 1, 2, 3, 4, 5 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5200:498) | 1, 3, 4, 5 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Family School Relationship (2006-2007); Family Interview (2007-2008) (5610:460) | 1, 2 | 2 – Entry to Student Teaching | Unique | Tk20
7 | Letter to Congressman (5200:425) | 2, 5 | 2 – Entry to Student Teaching | Unique | Tk20
8 | Classroom Management Plan (5200:420) | 1, 5 | 2 – Entry to Student Teaching | Unique | Tk20
Middle Childhood Education – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | NMSA Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Principles of Learning and Teaching – Grades 5-9 (#30523); Middle Level Language Arts (#10049); Middle Level Mathematics (20069); Middle Level Science (#10439); Middle Level Social Studies (#20089) | 2, 4 | PLT: 4 – Program Completion; Content: 2 – Entry to Student Teaching | Unit | COE Database
2 | Summary of Research Article (5250:300) | 1, 3, 4, 5 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Interdisciplinary Unit (5200:300) | 1, 2, 3, 4, 5, 6, 7 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation (5250:495/496) | 1, 4, 5, 6, 7 | 3 – Exit from Student Teaching | Unit | Tk20
5 | Impact on Student Learning: Modified Teacher Work Sample (5250:498) | 1, 2, 3, 4, 5, 6, 7 | 3 – Exit from Student Teaching | Unique | Tk20
6 | Parent Communication (5250:300) | 2, 6 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Field Research Project (5250:300) | 1, 2, 7 | 3 – Entry to Student Teaching | Unique | Tk20
33
Early Childhood Intervention Specialist – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II: Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on a Disability (5610:448) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Individualized Education Plan (5610:485) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC specific evaluation (5610:485) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5610:403) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Practicum Case Study (5610:470) | 3, 6, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Family Interview (5610:460) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20

Early Childhood Intervention Specialist – Master's Degree
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II: Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on a Disability (5610:548) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Individualized Education Plan (5610:553) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC specific evaluation (5610:690) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Practicum Case Study – Impact on Student Learning (5610:570) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Assessment Report (5610:564) | 3, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
34
Mild to Moderate Intervention Specialist – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on a Disability (5610:447) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Analysis of Best Practices for Youth with Mild/Moderate Disabilities (5610:451) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC Specific Evaluation (5610:486) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5610:403) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Practicum Case Study (5610:470) | 3, 6, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Family Interview (5610:460) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20

Mild to Moderate Intervention Specialist – Master's Degree
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on a Disability (5610:547) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Analysis of Best Practices for Youth with Mild/Moderate Disabilities (5610:551) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC Specific Evaluation (5610:690) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Practicum Case Study – Impact on Student Learning (5610:570) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Assessment Report (5610:563) | 3, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
35
Moderate to Intensive – Baccalaureate/Post-Baccalaureate
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on a Disability (5610:448) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Individualized Education Plan (5610:453) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC specific evaluation (5610:487) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5610:403) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Practicum Case Study (5610:470) | 3, 6, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Family Interview (5610:460) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20

Moderate to Intensive – Master's Degree
Assessment | Assessment Name | CEC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Special Education: Knowledge-based Core Principles (data provided on test #0351; test #0353 has now replaced #0351) | 1, 3, 5, 6, 9, 10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Research Paper on Disability (5610:548) | 1, 2 | 3 – Entry to Student Teaching | Unit | Tk20
3 | Individualized Education Program (5610:553) | 4, 7, 10 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including CEC Specific Evaluation (5610:690) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Practicum Case Study – Impact on Student Learning (5610:570) | 4, 5, 7, 8 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Assessment Report (5610:563) | 3, 8, 9 | 3 – Entry to Student Teaching | Unique | Tk20
36
AYA Integrated Mathematics Education – Baccalaureate/Master's
Assessment | Assessment Name | NCTM Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Content Knowledge Exam 0061 | 1, 5, 9, 10, 11, 12, 13, 14, 15 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Grades in required mathematics content courses | 1, 3, 4, 5, 6, 9, 10, 11, 12, 13, 14, 15 | 3 – Entry to Student Teaching | Unique | PeopleSoft
3 | Mock Praxis III Assessment (5300:420/5500:520) | 3, 7, 8 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Assessment (5300:495/5500:694) | 7, 8 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5300:496/5500:692) | 1, 2, 3, 4 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Proof Skills Assessment – Lower Level (3450:307) | 1, 2, 3, 4, 5, 11 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Proof Skills Assessment – Upper Level (3450:307) | 6, 8 | 3 – Entry to Student Teaching | Unique | Tk20
8 | Standards-Based Strategy Portfolio (5300:420/5500:520) | 7, 8 | 3 – Entry to Student Teaching | Unique | Tk20

AYA English Language Arts – Baccalaureate/Master's
Assessment | Assessment Name | NCTE Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Subject Area Test 0041 | 3.1, 3.2, 3.3, 3.4, 3.5, 4.9 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Grades in required English language arts content courses | 2.1, 2.2, 2.3, 2.4, 2.5, 3.1, 3.3, 3.4, 3.5, 3.6, 4.1, 4.2, 4.7, 4.8, 4.9, 4.10 | 3 – Entry to Student Teaching | Unit | PeopleSoft
3 | Mentoring Report (5300:420/5500:520) | 2.2, 2.4, 4.4, 4.5, 4.7, 4.8, 4.9 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation (5300:495/5500:694) | 2.1, 2.3, 2.4, 2.5, 2.6, 3.2, 3.5, 3.6, 4.1, 4.2, 4.3, 4.6, 4.10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5300:496/5500:692) | 2.1, 2.2, 2.5, 3.1, 3.3, 3.4, 4.1, 4.2, 4.3, 4.4, 4.9, 4.10 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Language Development Response (5300:480) | 2.1, 2.2, 2.4, 3.1, 3.2, 3.3, 3.6 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Integrated Language Arts Unit Plan/Teach/Reflect (5300:420/5500:520) | 2.1, 2.2, 2.3, 2.4, 2.6, 3.4, 4.1, 4.2, 4.4, 4.5, 4.6, 4.7, 4.8, 4.9, 4.10 | 3 – Entry to Student Teaching | Unique | Tk20
8 | Critical Analysis and Reflection on the Teaching of Literature (5300:330) | 3.3, 3.4, 3.5, 3.7 | 3 – Entry to Student Teaching | Unique | Tk20
37
AYA Science Education – Baccalaureate/Master's
Assessment | Assessment Name | NSTA Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Content Knowledge Tests | 1a | 3 – Entry to Student Teaching | Unit | COE Database
2 | Grades in required science content courses | 1a | 3 – Entry to Student Teaching | Unit | PeopleSoft
3 | Unit Plan (5300:420/5500:520) | 1a, 1b, 1c, 2a, 2b, 2c, 3b, 4a, 4b, 6a, 6b, 7a, 7b, 8a, 8b, 8c | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluations - Praxis III-based and Science-specific (5300:495/5500:694) | 1a, 1c, 5a, 5b, 5c, 5d, 5e, 5f, 6a, 6b, 8a, 8b, 8c, 9a, 9b | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning – Modified for AYA Science (5300:496/5500:692) | 1a, 1b, 2c, 3b, 4b, 8c | 3 – Exit from Student Teaching | Unit | Tk20
6 | Safety Plan (5300:420/5500:520) | 9a, 9b | 3 – Entry to Student Teaching | Unique | Tk20
7 | Research Report Reflection (5300:420/5500:520) | 1d, 1e | 3 – Entry to Student Teaching | Unique | Tk20
8 | Portfolio – NSTA Standards (5300:420/5500:520) | 1b, 2a, 2b, 3a, 4a | 3 – Entry to Student Teaching | Unique | Tk20
38
AYA Social Studies Education – Baccalaureate/Post-baccalaureate
Assessment | Assessment Name | NCSS Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II "Social Studies Content Knowledge" (10081) | 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.10 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Grades in required social studies courses | 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 | 3 – Entry to Student Teaching | Unit | PeopleSoft
3 | NCSS Lesson Plans (5300:420/5500:520) | 1.1, 1.3 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluations (5300:495/5500:694) | 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5300:496/5500:692) | 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Content Portfolio/Curriculum Connections Portfolio (5300:420/5500:520) | 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 | 3 – Entry to Student Teaching | Unique | Tk20

P-12 Multi-age Foreign Language – Baccalaureate/Master's
Assessment | Assessment Name | ACTFL Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Content Knowledge Exam French (#0173); Praxis II Content Knowledge Exam Spanish (#0191) | 1, 2 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Essay/Writing Assessment (3520:302, 3580:402) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
3 | Big Book Unit Plan (5300:495/5500:694) | 2, 3, 4, 5 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation - including ACTFL specific section (5300:495/5500:694) | 1, 2, 3, 4, 5, 6 | 3 – Exit from Student Teaching | Unit | SPSS
5 | Impact on Student Learning (5300:496/5500:692) | 3, 4, 5 | 3 – Exit from Student Teaching | Unit | Tk20
6 | Candidate Oral Proficiency (OPI) | 1 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Culture Project (3520:309,310/3580:431,432) | 1, 2 | 3 – Entry to Student Teaching | Unique | Tk20
8 | Field Journal (5200:321/5500:621) | 3, 4, 6 | 3 – Entry to Student Teaching | Unique | Tk20
39
P-12 Multi-age Physical Education – Baccalaureate
Assessment | Assessment Name | AAHPERD-NASPE Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Praxis II Physical Education: Content Knowledge Test (#10091) | 1, 2 | 3 – Entry to Student Teaching | Unit | COE Database
2 | Grades in required physical education courses | 1, 2, 4, 5, 6, 7, 9 | 3 – Entry to Student Teaching | Unit | PeopleSoft
3 | Physical Education Lesson Plans (5550:345) | 2 | 3 – Entry to Student Teaching | Unique | Tk20
4 | Student Teaching Evaluation (5550:494/495) | 3, 4, 5, 6, 7, 8, 10 | 3 – Exit from Student Teaching | Unique | Tk20
5 | Impact on Student Learning (5550:494) | 7, 8 | 3 – Exit from Student Teaching | Unique | Tk20
6 | Adapted PE Clinical Report (5550:245) | 3, 5 | 3 – Entry to Student Teaching | Unique | Tk20
7 | Microteaching Portfolio (5550:345) | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 | 3 – Entry to Student Teaching | Unique | Tk20
ADVANCED PROGRAMS

Curricular and Instructional Studies – Master's Programs for Practicing Teachers
Assessment | Assessment Name | Ohio Standards for the Teaching Profession | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Comprehensive Examination | OSTP 2.1, 2.2, 2.3, 2.4, 2.5 | Program Completion | Unique | Excel
2 | Field Experience Report (5500:600/5610:605) | OSTP 5.1, 5.2, 5.3, 5.4, 5.5 | Mid-point | Unique | Excel
3 | Master's Project/Problem (5500:696/5610:698) | OSTP 1.2, 1.4, 2.1, 2.2, 2.3, 3.1, 3.2, 3.3, 4.3, 4.4, 5.5, 6.1, 6.3, 7.1, 7.2, 7.3 | Program Completion | Unique | Excel
4 | Theory to Practice Applied Project (5100:604) | OSTP 1.2, 1.4, 4.2, 4.4, 5.3, 5.5, 7.2 | Program Completion | Unique | Excel
5 | Field Experience | OSTP 5.1, 5.2, 5.3, 5.4, 5.5 | Program Completion | Unique | Excel
6 | Dispositions Assessment #1; Dispositions Assessment #2 | OSTP 1.2, 1.4, 4.5, 5.1, 5.2, 5.5, 7.1, 7.2, 7.3 | Mid-point; Program Completion | Unique | Excel
40
Counseling – Classroom Guidance for Teachers
Assessment | Assessment Name | CACREP (School Counseling) Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Comprehensive Examination | KII.4, KII.7 | Program Completion | Unique | Excel
2 | Field Experience (5600:695) | A4, 6, 8 & 9; B1, 4, & 6; C1a, C1b | Mid-point | Unique | Excel
3 | Ethical Dilemma Presentation (5600:631) | KII.4, KII.7 | Mid-point | Unique | Excel
4 | Grades in Required Courses | | Program Completion | Unique | Excel
5 | Theory to Practice Applied Project (5600:663) | Consultation Standard | Program Completion | Unique | Excel

Principalship: Master's/Post-Master's Licensure
Assessment | Assessment Name | ELCC Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1* | Praxis II Education Leadership: Administration and Supervision (#410) | 1.1, 1.3, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 6.1 | Program Completion | Unit | COE Database
2 | Foundations of Educational Leadership: Part A – Vision Project; Part B – Implications of Law; Part C – School Contexts | 1.1, 1.2, 1.3, 1.4, 1.5, 6.1, 6.2, 6.3 | Within Program Coursework | Unique | Excel
3 | Leading and Evaluating School Improvement and Cultures Projects: Part A – School Cultures Projects; Part B – Supervision and Professional Development Project | 2.1, 2.2, 2.3, 2.4 | Within Program Coursework | Unique | Excel
4* | Internship Project | 1.5, 2.2, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, 6.3, 7.3 | Program Completion | Unique | Excel
5* | Employer Survey | 1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3, 6.1, 6.2, 6.3 | Post-program | Unique | Excel
6 | Capstone Project with Portfolio | 1.1, 1.2, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 6.1 | Exit From MA Program/Midpoint in Licensure | Unique | Excel
7 | Organizational Management and Community Relations Projects: Part A – Building and Facilities Safety; Part B – Human Resources; Part C – School Community Audit | 3.1, 3.2, 3.3, 4.1, 4.2, 4.3 | Within Program Coursework | Unique | Excel
* Assessments #1, 4, & 5 are only required for those candidates completing the licensure track.
41
42
Technology Facilitation Endorsement – Instructional Technology Master's
Assessment | Assessment Name | ISTE Standards | Transition Point | Unit/Unique to Program | Data Technology Source
1* | Competencies Verifications (GPA, Teaching, and NETS-T Verifications) | TF-IA, TF-IB, TF-IIA, TF-IIIB, TF-IIIC, TF-VA, TF-VB, TF-VC, TF-VIA, TF-VIB, TF-VIIA | Admissions | Unique | Excel
2 | E-Portfolio Assessment | TF-IA, TF-IIIA | Program Completion | Unique | Excel
3 | Web-Based Deliverable Instruction Project | TF-IIA, TF-IIB, TF-IIC, TF-IID, TF-IIE, TF-IIF, TF-IIIA, TF-IIIC, TF-IVA, TF-IVB, TF-IVC, TF-VC, TF-VD, TF-VIIA | Mid-Point | Unique | Excel
4 | Field Experience | TF-IA, TF-IIF, TF-VC, TF-VD, TF-VID, TF-VIE, TF-VIIA, TF-VIIB, TF-VIIC, TF-VIIIA, TF-VIIIB, TF-VIIIC, TF-VIIID, TF-VIIIE | Program Completion | Unique | Excel
5 | Technology Integration Classroom Project | TF-IB, TF-IIA, TF-IIB, TF-IIC, TF-IID, TF-IIE, TF-IIIA, TF-IIIB, TF-IIIC, TF-IIID, TF-IIIE, TF-IVA, TF-IVB, TF-IVC | Mid-point | Unique | Excel
6 | Technology Plan Case Study | TF-IIA, TF-IIB, TF-IID, TF-IIE, TF-IIF, TF-IIIA, TF-IIIB, TF-IIID, TF-IIE, TF-IVA, TF-IVC, TF-VA, TF-VB, TF-VC, TF-VD, TF-VIA, TF-VIB, TF-VIC, TF-VID, TF-VIE, TF-VIIA, TF-VIIB, TF-VIIC, TF-VIIIA, TF-VIIIB, TF-VIIIC, TF-VIIID | Mid-point | Unique | Excel
7 | Instructional Design Project | TF-IIA, TF-IIB, TF-IIE, TF-IIF, TF-IIIA, TF-IIC, TF-IIID, TF-IVA, TF-IVB, TF-IVC, TF-VC, TF-VD, TF-VIIIC | Beginning of Program | Unique | Excel
* Assessment #1 is only required for those candidates completing the Technology Facilitation Endorsement.
43
Educational Foundations and Leadership – Ed.D.
Assessment | Assessment Name | Conceptual Framework Proficiencies | Transition Point | Unit/Unique to Program | Data Technology Source
1 | Internship Evaluation | K1, E2 | Mid-Point | Unique | Excel
2 | Dissertation Proposal | K1, E2 | Program Completion | Unique | Excel
3 | Dissertation Defense | K1, E2 | Program Completion | Unique | Excel
44
VIII.
FAIRNESS, ACCURACY, CONSISTENCY, AND ELIMINATION OF BIAS IN
CANDIDATE ASSESSMENTS
The Unit utilizes several methods to ensure fairness, accuracy, consistency, and the
elimination of bias as required by NCATE Standard 2. For the external measures in place,
such as the Praxis I, Praxis II, and Praxis III evaluations, the Unit relies on the validity and
reliability studies and the fairness review conducted by the Educational Testing Service and
on the module selection and benchmarking processes at the state level. Research shows that
while these measures may have a disparate effect on certain populations, the measures in
themselves are not biased (Gitomer, Latham & Ziomeck, 1999). The COE, however, is
monitoring any differential impact of these measures.
For internal measures, procedures have also been implemented to provide this
assurance. For the design of measures such as surveys and focus group interviews, an
Assessment Design Matrix has been developed. This matrix ensures alignment with the
components of the COE Conceptual Framework and the appropriate standards. For
assessments embedded in coursework and aggregated in candidate assessment portfolios,
rubrics have been developed. At the orientation session, candidates are provided copies of
portfolio checklists for their respective programs and the assessment expectations are
discussed. Inter-rater reliability exercises have been conducted for selected assessments in
candidate portfolios for Early Childhood and Intervention Specialist programs. A
continuation of these exercises to cover the unit and program assessments utilizing rubrics is
planned to enhance the assurance of accuracy, consistency, fairness and avoidance of bias. A
student teaching evaluation based on the 19 Praxis III criteria is employed to assess the
performance of candidates in the culminating clinical experience. Supervisors of student
teachers have received training on the four domains of Praxis and on using this assessment to
evaluate candidates.
IX.
RELATIONSHIP OF DATA SOURCES
Two tables demonstrate the relationship among the data sources and uses of the data.
The Assessment Relationship Table (see p. 45) outlines the relationship among the levels of
data aggregated to address the operation of the programs, the unit, and the institution. The
table also indicates the reciprocal manner in which the data are used by the institution, the
Unit, and the programs to improve and enhance the outcomes for candidates and the students
with whom they will be working.
The key assessments in place at both the unit and program levels provide data for
decision making at all levels, draw on multiple sources of data both internal and external to
the Unit, and are administered at multiple points in the candidates' programs. Identified key
assessments provide information to the Unit about how candidates are performing in relation
to the competencies delineated in the Conceptual Framework and are represented in the
Relationship of Conceptual Framework Proficiencies and Key Unit Assessments Table (see p. 46).
45
Assessment Relationship Table
Key Assessments by Level

Institutional Level: student (candidate) satisfaction survey; graduate survey
Key Assessments Employed: National Survey of Student Engagement (NSSE); IR graduate follow-up survey
Responsibility for Data Collection: Institutional Research
Responsibility for Summary & Analysis: Dean, Associate Dean, Assistant Dean, Advisory Committees (Administrative Council, PEC, NCATE Steering)
Consumers of Data: Dean, Associate Dean, Assistant Dean, Advisory Committees
Use of Data: Review and revision of policies and programs

Unit Level: transition data aggregated for all candidates; candidate satisfaction surveys (unit)
Key Assessments Employed: GPA; Praxis II & III; core key assessments; dispositions; PIII Student Teaching Evaluation; Impact on Student Learning (ISL); Teacher Quality Partnership (TQP) Survey; Completers' Survey
Responsibility for Data Collection: Dean's Office; state-wide Teacher Quality Partnership (TQP) project
Responsibility for Summary & Analysis: Dean, Associate Dean, Assistant Dean, Advisory Committees (Administrative Council, PEC, NCATE Steering)
Consumers of Data: Dean, Associate Dean, Assistant Dean, Advisory Committees
Use of Data: Review and revision of policies and programs

Program Level: Professional & Pedagogical Core, dispositions, Impact on Student Learning, PIII Student Teaching (disaggregated by program area); program-specific key assessments
Key Assessments Employed: GPA; Praxis II & III; core key assessments; dispositions; PIII Student Teaching Evaluation; Impact on Student Learning (ISL); Completers' Survey; Employer's Survey; program key assessments
Responsibility for Data Collection: Program faculty, department chairs, Dean's Office
Responsibility for Summary & Analysis: Program faculty
Consumers of Data: Dean, Associate Dean, Assistant Dean, Director of Assessment & Accreditation, department chairs, program faculty
Use of Data: Review and revision of curriculum, assessments, field/clinical experiences

Candidate Level: key assessments (core & content), aggregated by transition points
Key Assessments Employed: GPA; Praxis II & III; core key assessments; dispositions; PIII Student Teaching Evaluation; Impact on Student Learning (ISL); Completers' Survey; program key assessments
Responsibility for Data Collection: Candidates, course instructors, supervisors
Responsibility for Summary & Analysis: Supervisors, course instructors, program faculty
Consumers of Data: Supervisors, course instructors, program faculty
Use of Data: Improvement of candidate knowledge, skills, dispositions, and effect on student learning

46
Relationship of Conceptual Framework Proficiencies and Key Unit Assessments
[Matrix indicating which Conceptual Framework proficiencies each key unit assessment addresses.]
Conceptual Framework Proficiencies: Knowledge (K1, K2, K3, K4); Technology (T1, T2); Diversity (D1, D2); Ethics (E1, E2, E3)
Key Assessments: Grade Point Average (Admissions); Praxis II Content Area Tests; Praxis II Principles of Learning and Teaching; Professional & Pedagogical Core Courses; Praxis III Entry-year Assessment; Praxis III Student Teaching Evaluation; Impact on Student Learning (ISL) Assessment; Teacher Quality Partnership (TQP) Survey; Dispositions Assessment; Completers Survey; National Survey of Student Engagement (NSSE)
47
X.
ASSESSMENT OF SUPPORT FOR CANDIDATE LEARNING
Key indicators that measure the support for candidate learning provided by operations and
student services are reported on an annual basis. These reports are reviewed by Unit
administrators, and changes are made to goals, policies, and procedures as indicated. The
reports serve as a basis for discussion of the Unit's support for candidate learning.
Furthermore, this review provides information to assist the dean in planning for the
following year's budget and personnel requests. These discussions also allow the Unit to make
the necessary operational changes to administrative policies and procedures and help guide the
development of the following year's goals.
Operations
The associate dean has identified the operations data that will be collected each semester.
PeopleSoft provides data on many of the operations key indicators. An Access database is
employed to collect and record faculty data for teaching, research, and service. The data are
aggregated and reported in the unit operations reports.
The operations review process includes the submission of reports by center directors and
department chairs. Other reports include budget, personnel, external funding, candidate
enrollment data, and facilities, including technology. The faculty members in the Unit are
required to go through an annual merit process. This requires faculty to submit their
accomplishments in teaching, research, and service to their department chairs. These
accomplishments are reviewed and discussed with each faculty member, and merit points are
assigned accordingly. During spring, faculty members discuss their professional development
plans for the following academic year, which helps the department chairs plan for the allocation
of resources for the next fiscal year.
Key Indicators – Operations
Key Indicators | Key Documents | Responsible to Collect Data | Responsibility for Summary | Consumers of Data | Use of Data | Technology Data Source
Budget: Total Operating Budget | Budget Documents | Fiscal Administrator from Budget Office | Fiscal Administrator | Dean, Administrative Council | Budget planning, development of goals, resource allocation, operations review | PeopleSoft Financials
Student (candidate) Enrollment: Undergraduate Students, Graduate Students | Zip Reports | Institutional Research | Associate Dean | Administrative Council, alumni, faculty | Development of goals, resource allocation, operations review | PeopleSoft
Average ACT Score | Institutional Research Report | Institutional Research | Associate Dean | Department Chairs | | PeopleSoft
Average SAT Score | Institutional Research Report | Associate Dean | Associate Dean | Department Chairs | | PeopleSoft
Number of Employees: FT Tenure Track Faculty, FT Faculty Non Tenure Track, PT Faculty, Contract Professionals, GAs, Staff | Zip Reports | Institutional Research | Associate Dean | Administrative Council, alumni, faculty | Development of goals, resource allocation, operations review | PeopleSoft HR
Student/Teacher Ratio | Zip Reports | Institutional Research | Associate Dean | Administrative Council | | PeopleSoft
Faculty Teaching Reports | TAARS Reports, Candidate Evaluation Reports | Associate Dean, Director of Data Management | Associate Dean | Administrative Council | Development of goals, resource allocation, operations review | PeopleSoft, Excel documents
Faculty Research Reports: Faculty Publications, Faculty Presentations | Professional Development Plans | Director, Data Management | Associate Dean | Administrative Council | Annual Reports | Faculty Database
Faculty Service Reports: Faculty Collaboration, Faculty Collegiate Activities, Faculty Membership Activities, Faculty Professional Assignments | | Director, Data Management | Associate Dean | Administrative Council | Annual Reports | Faculty Database
Research Productivity | Monthly and annual research reports to Board of Trustees | Office of Research Services and Sponsored Programs | Associate Dean | Administrative Council, alumni, faculty | Development of goals, resource allocation, operations review | PeopleSoft grant module
Technology | Instructional Technology Services Annual Report | Computer Support Assistant | Associate Dean | Administrative Council, faculty | Development of goals, resource allocation, operations review | PeopleSoft financials, ITS scheduling database
Degrees Awarded | Zip Reports | Institutional Research | Associate Dean | Administrative Council, alumni, faculty | Development of goals, resource allocation, operations review | PeopleSoft
Student Credit Hours Generated | Zip Reports | Institutional Research | Associate Dean | Administrative Council | Development of goals, resource allocation, operations review | PeopleSoft
Living Education Alumni | Alumni Report | Alumni Office | Associate Dean | Administrative Council, alumni, faculty, alumni board | Development of goals and plans | Alumni database
Facilities: Square Footage | Facilities Usage Report | Capital Planning | Associate Dean | Administrative Council | Operations review | Facilities database
Candidate Progression & Completion (reflection of candidate admission, progression through programs, and completion) | Admissions; changes in majors or advisors; Advancement to Candidacy; comps; dissertations; degrees awarded | Student Services | Assistant Dean | Dean, Department Chair | Enrollment review, time to completion, faculty load | COE Database; DARS for undergraduate degree clearance forms effective fall 2009
Student Issues/Alerts/Complaints (reflection of student issues brought to the attention of the assistant dean) | Student Issues/Alerts/Complaints | Student Services | Assistant Dean | Dean & Assistant Dean | Review and document for needed improvement | Word
Field Placements | Field Placement Database; Description of Fields | Student Services & Student Teaching and Field Experiences | Director of Student Teaching | Director of Student Teaching | Placement decisions | Excel
Student Teaching Entrance and Exit | Applications; Midterm; Final | Student Services & Student Teaching and Field Experiences | Director of Student Teaching | Director of Student Teaching | Review of candidate performance and eligibility | Excel
Audit for Licensure | Licensure Application & Supporting Documentation | Student Services Coordinator of Licensure | Director of Student Teaching; Faculty & Student Teaching Committee | Assistant Dean | Confirmation of licensure program completion & review of number of candidates obtaining a license | Word
Lists and Criteria for School Based Faculty | Resumes; Copies of Licenses | Student Services & Student Teaching and Field Experience | Director of Student Teaching | Director of Student Teaching | Employment decisions | Excel and hard copy
48
49
50
Student Services
The Assistant Dean for Student Services assists with the coordination of data collected,
aggregated, and disseminated by the Office of Student Services. This includes data that are
instrumental in determining student success. Specifically, data from pre-admission
advising, scholarships, student issues, and other documentation of program
progression/completion are identified below. In addition, data regarding field placements,
student teaching, and licensure are included. Some reports are cyclical; for example, the
review of academic program advisor assignments is run every spring semester, checked for
accuracy, and updated over the summer for the Fall Day of Development.
Key Indicators: Student Services
Key Indicators | Key Documents | Responsible to Collect Data | Responsibility for Summary | Consumers of Data | Use of Data | Technology Data Source
Advisor/advisee: list that identifies advisor and advisee assignments | Advisor/Advisee Lists | Student Services | Assistant Dean | Department Chairs | Review of resource needs | Excel, PeopleSoft
Advising satisfaction | Advising Satisfaction Survey; Advising Tickets showing time in/out | Advisors | Advisors and Assistant Dean | Assistant Dean | Review for efficiency | Excel
Scholarships: scholarship applicants, ratings, and amounts disbursed | Scholarship Applicants and Disbursements | Student Services | Assistant Dean | Dean, Assistant Dean, Development Office | Resource allocation for student retention | Excel
Satisfaction of school based faculty and student teachers | Satisfaction surveys for school based faculty and student teachers | Student Services & Student Teaching and Field Experiences | Director of Student Teaching | Director of Student Teaching | Placement decisions | Excel
51
XI. PROCEDURES FOR DATA COLLECTION, AGGREGATION, DISAGGREGATION, AND
DISSEMINATION OF DATA
“Everyone agrees; colleges of education need databases. They need databases to be accountable,
to manage programs, and to implement data-based change and development” (Schroeder, 2000, p. 1).
The College of Education uses multiple technologies to support the data management system.
Tk20 HigherEd™, PeopleSoft, the College of Education Access database, Excel templates, and other
data sources such as SPSS are employed to manage the data needed for decision making in the COE. The
Information Management System input diagram (Figure 1) identifies the modules containing the
data that are collected by the unit. The unit system, developed in collaboration with unit
stakeholders, includes the actuarial data module, candidate performance assessment module,
field/student teaching module, operations module, faculty module, and outreach module. The
Information Management System output diagrams (Figures 2, 3, and 4) identify the data used to
create reports that provide information on the quality of candidate learning and the effectiveness of
support for candidate learning.
Assessment of Candidate Learning
Initial Teacher Licensure
The unit uses the Tk20 HigherEd™ system for managing most initial program
performance assessment data. Prior to the beginning of each academic year, faculty members
review the key performance assessments and rubrics in their courses for the upcoming academic
year and submit them to the director of data management for upload to Tk20. The rubrics for the
assessments are then distributed to the candidates, who submit their artifacts for scoring in Tk20
by faculty. The Tk20 system provides ready access to data for candidates in their courses. The
Tk20 HigherEd™ system also provides the capability to aggregate and disaggregate data by
program, standard, course, and assignment.
Data from the Tk20 system are augmented by Praxis II licensure test scores from the
College of Education database, the student teaching scores from SPSS data files, and dispositions
assessment data from Excel. The director of data management creates the key performance
assessment data reports at the end of each academic year. Reports are shared with the College
Administrative Council and placed on the SharePoint server for faculty review and analysis prior
to the Fall Day of Development. Analysis reports are then generated to inform any curriculum
proposals that need to be developed.
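To illustrate the kind of aggregation and disaggregation described above, the following sketch shows how exported scores from several sources might be combined and summarized by program and transition point. It is illustrative only: the file names, column names, and 1-3 rubric scale are assumptions made for the example and do not describe the actual Tk20, SPSS, or COE database exports.

```python
# A minimal sketch of combining exported key-assessment data and summarizing it
# by program and transition point. File names and column names are hypothetical;
# actual Tk20, SPSS, and COE database exports will differ.
import pandas as pd

# Hypothetical exports: each file has columns
# candidate_id, program, assessment, transition_point, score (1-3 rubric scale).
tk20 = pd.read_csv("tk20_key_assessments.csv")
praxis = pd.read_csv("coe_praxis_ii.csv")               # e.g., test results recoded to the 1-3 scale
student_teaching = pd.read_csv("student_teaching_eval.csv")
dispositions = pd.read_excel("dispositions.xlsx")

# Stack the sources into one long table so they can be aggregated together.
data = pd.concat([tk20, praxis, student_teaching, dispositions], ignore_index=True)

# Aggregate: mean score and candidate count for each program/assessment/transition point.
report = (
    data.groupby(["program", "transition_point", "assessment"])["score"]
        .agg(mean_score="mean", candidates="count")
        .reset_index()
)

# Disaggregate the same data another way, e.g., one worksheet per program for faculty review.
with pd.ExcelWriter("key_assessment_report.xlsx") as writer:
    for program, rows in report.groupby("program"):
        rows.to_excel(writer, sheet_name=str(program)[:31], index=False)
```

A summary produced this way could be posted alongside the official reports for faculty review and analysis prior to the Fall Day of Development.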
52
Advanced Programs
Prior to the beginning of each academic year, faculty members also review assessments
and rubrics for the key performance assessments in advanced programs and changes as indicated
are made. Since the Tk20 HigherEd™ system has not yet been implemented for advanced
programs, multiple approaches are taken to collecting this data.
For assessments in the Master's for Practicing Teachers, the Principalship Master's and
Principalship Licensure programs, and the endorsements in TESOL and Reading, the criteria
identified in the rubrics are uploaded to Excel spreadsheets. The spreadsheets are distributed to
faculty by the director of data management at the beginning of the semester. Faculty members
assess candidate artifacts and record the scores in the Excel spreadsheets, which are then
returned to the director of data management at the end of each semester. For assessments in
the Instructional Technology Master's and the Technology Facilitation Endorsement programs,
Google Docs has been piloted. Faculty members also assess candidate dispositions in designated
courses and return the dispositions data in Excel spreadsheet format to the director of data
management at the end of the semester.
Utilizing the performance data from the Excel spreadsheets and from Google Docs, the
director of data management creates reports at the end of each academic year. As with the initial
programs, these advanced programs reports are shared with Administrative Council and faculty
for review and analysis.
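The sketch below illustrates one way the semester spreadsheets returned by faculty could be consolidated into a single year-end summary. The folder layout, sheet structure, and column names are assumptions made for the example rather than a description of the Unit's actual templates.

```python
# Minimal sketch: consolidate the rubric-score spreadsheets returned by faculty each
# semester into one year-end summary. Folder and column names are hypothetical.
from pathlib import Path
import pandas as pd

frames = []
for workbook in Path("advanced_program_scores").glob("*.xlsx"):
    scores = pd.read_excel(workbook)        # expected columns: program, assessment, candidate_id, criterion, score
    scores["source_file"] = workbook.name   # keep an audit trail of which faculty file each row came from
    frames.append(scores)

combined = pd.concat(frames, ignore_index=True)

# Average criterion scores per candidate and assessment, then summarize by program.
by_candidate = (
    combined.groupby(["program", "assessment", "candidate_id"])["score"]
            .mean()
            .reset_index()
)
summary = (
    by_candidate.groupby(["program", "assessment"])["score"]
                .agg(mean_score="mean", candidates="count")
)
summary.to_excel("advanced_programs_year_end_summary.xlsx")
```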
53
Figure 1
College of Education
Unit Information Management System: Input
[Input diagram: the unit data system draws on an actuarial data module (bio-demographic, admission, licensure, undergraduate, graduate, and completion data; advisors), initial teacher preparation key assessment reports, advanced programs key assessment and disposition reports, faculty teaching, research, and service reports, field experience and student teaching reports, operations reports, and an outreach module. The supporting technologies shown are PeopleSoft, Tk20, the Access database, Excel spreadsheets, and SPSS.]
54
Figure 2
College of Education
Unit Information Management System: Output
Assessment of Candidate Learning
[Output diagram: unit data feed professional/pedagogical core reports (undergraduate and graduate, by assessment and by standards body), SPA 6-8 key assessment reports (including the Principalship and Technology Facilitation Endorsement programs), dispositions assessment reports, and assessments at identified transition points. Sources shown include populations of completers, class rosters, and course grades (PeopleSoft); Praxis II PLT and subject test scores (Access database); performance assessments (Tk20, Excel, and Google Docs for the Technology Facilitation Endorsement and Instructional Technology programs); and student teaching evaluations (SPSS).]
55
Figure 3
College of Education
Unit Information Management System: Output
Support for Candidate Learning – Operations
[Output diagram: unit data feed personnel reports (number of employees, PeopleSoft HR), budget reports (operating budget, PeopleSoft Financials), external funding reports, and candidate data reports covering student/teacher ratio, undergraduate and graduate enrollment, student credit hours, faculty teaching, ACT and SAT scores, faculty research productivity, degrees awarded, alumni (alumni database), facilities usage (facilities database), facilities including technology, and instructional technology usage (ITS database), drawn largely from PeopleSoft and Excel.]
56
Figure 4
College of Education
Unit Information Management System: Output
Support for Candidate Learning – Student Services
[Output diagram: unit data feed advising reports (advisor lists, advising satisfaction survey), student teaching reports (student teaching entrance and exit, criteria for school-based faculty, satisfaction of school-based faculty and student teachers), retention reports (retention/time to degree), field reports (field placements), scholarships, candidate progression and completion reports (admissions through program completion), licensure reports (audit for licensure), and student issues/alerts/complaints, maintained in Excel, PeopleSoft, Word, the Access database, and DARS.]
57
In addition to measures regarding candidate performance assessments, operations,
and student services, an array of surveys and focus group interviews is employed to collect
data from multiple stakeholders. These measures provide information to improve candidate
learning, the quality of our programs, and the support for student learning.
Surveys and Focus Group Interviews
Inventory and Dissemination Plan
Instrument | Description | Data Collected | Dissemination | Groups/Units
Completers Survey | Survey aligned with Conceptual Framework that is distributed to initial program completers | Each semester | Fall Day of Development | PEC; Department Chairs; Faculty
Cooperating Teacher Survey | Survey aligned with Conceptual Framework that is distributed to cooperating teachers | Each semester | Fall Day of Development | PEC; Department Chairs; Faculty
Employer Survey | Survey aligned with Conceptual Framework that is distributed to employers of initial program completers | Piloted in Spring 2009; full implementation in Spring 2010 | Fall Day of Development | PEC; Department Chairs; Faculty
Evaluation of Student Teaching Experience by Candidate | Survey designed to collect operational data about the student teaching experience | Each semester | Fall | PEC; NCATE Steering; Office of Student Teaching & Field Experiences
Completers Survey Principalship | Survey aligned with Conceptual Framework that is distributed to Principalship completers | Piloted in Spring 2009; full implementation in Spring 2010 | Fall Day of Development | PEC; Department Chairs; Faculty
Principalship Internship Survey: Evaluation by Candidate | Survey designed to collect operational data about the internship experience | Piloted in Spring 2009; full implementation in Spring 2010 | Fall Day of Development | Department Chair; Faculty; Internship Coordinator
58
Instrument | Description | Data Collected | Dissemination | Groups/Units
Principalship Internship Field Administrator Survey | Survey designed to collect operational data regarding the internship experience from the field administrator's perspective | Piloted in Spring 2009; full implementation in Spring 2010 | Fall Day of Development | Department Chair; Faculty; Internship Coordinator
Focus Group Interviews | Structured focus groups conducted with completers of initial and advanced programs | Rotating schedules | Fall Day of Development | PEC; NCATE Steering; Department Chairs; Faculty
Teacher Quality Partnership (TQP) Survey | Statewide survey of pre-service candidates at the point of initial program completion | Each semester | Fall | PEC; NCATE Steering; Department Chairs; Faculty
National Survey of Student Engagement (NSSE) | National survey of undergraduate candidates' perceptions of various aspects of their experience, administered by Institutional Research | Rotating basis (approximately every 3 years) | Spring | PEC; NCATE Steering; Department Chairs; Faculty
XII.
ANALYSIS AND USE OF ASSESSMENT DATA
A Program Review Cycle based on candidate assessments that speak to both candidate
competencies and program quality has been implemented. The candidate data collected are
aggregated, analyzed, and summarized to determine candidate learning and effectiveness of
the programs of study offered. Stakeholders review the assessment results each August at the
Day of Development meeting, propose program or course changes to improve programs and
facilitate candidate learning, and direct these through necessary college and university
governance procedures. An analysis form has been developed to facilitate this task. The
results of this process have included changes in program portfolio requirements, changes in
course requirements/assignments, changes in course content and objectives, changes in course
delivery methods, changes in GPA requirements, changes in Praxis requirements, revision of
rubrics, and numerous other changes.
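As a simple illustration of the kind of summary that supports this review, the sketch below tallies how many candidates reached the target, acceptable, or unacceptable level on each key assessment by program. The 3/2/1 scale and the column names are assumptions for the example, not the Unit's actual analysis form.

```python
# Minimal sketch: tally how many candidates reached target (3), acceptable (2),
# or unacceptable (1) on each key assessment, by program, as input to program review.
# The 3/2/1 scale and column names are assumptions for illustration.
import pandas as pd

scores = pd.read_csv("key_assessment_scores.csv")   # columns: program, assessment, candidate_id, score

labels = {3: "target", 2: "acceptable", 1: "unacceptable"}
scores["rating"] = scores["score"].map(labels)

# Cross-tabulate ratings for each program/assessment combination.
review_table = pd.crosstab(
    index=[scores["program"], scores["assessment"]],
    columns=scores["rating"],
)

# Percentage of candidates at target or acceptable, a figure that could be discussed at review.
review_table["pct_meeting"] = (
    review_table.get("target", 0) + review_table.get("acceptable", 0)
) / review_table.sum(axis=1) * 100

print(review_table)
```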
In addition to candidate performance data aggregated by program area, other data
measures are included in the assessment system. Two large-scale data sources, the Ohio
Teacher Quality Partnership (2006, 2007) pre-service report and the National Survey of
Student Engagement (NSSE) survey report, provide data for review by stakeholders (Kuh,
2001). The TQP five-year survey research initiative provides candidates in the last semester
of their preparation programs the opportunity to respond to questions about learning
experiences within their programs. All fifty institutions in the state of Ohio that prepare
59
teachers have participated. Data reported to each institution include responses from
candidates at each institution and aggregate data for all candidates statewide. A report has
been developed that aligns candidate responses with the NCATE Standards.
The NSSE survey of student opinion of the educational experience yields data for the College
of Education and the institution as a whole. This survey, which has been administered three
times since 2004, provides trend data on candidate responses. Several surveys have also been
developed by the college. These include the Completers Survey, the Cooperating Teachers
Survey, and the Employers Survey, which have been constructed to directly relate to the
Conceptual Framework.
Finally, an Operations Review Cycle that speaks to the sufficiency of support for
candidate learning has also been implemented. Operational data are summarized for review at the
end of each fiscal year. This allows judgments to be made about how well unit operations support
candidate learning and program quality and indicates changes that need to be made.
XIII.
SUMMARY
It is the intention of the College of Education to “ensure that its programs and
graduates are of the highest quality” (NCATE, 2008, p. 27). Our assessment system includes
multiple sources of data aligned with candidate proficiencies and the Conceptual Framework.
The unit collects, analyzes and uses these sources to both assess candidate learning and
evaluate unit operations and programs. The unit recognizes that to ensure quality the work
must be ongoing. In this way an effective, continuous cycle of improvement has been
operationalized.
60
Glossary
*Advanced Programs. Programs at postbaccalaureate levels for (1) the continuing education
of teachers who have previously completed initial preparation or (2) the preparation of other
school professionals. Advanced programs commonly award graduate credit and include
master’s, specialist, and doctoral degree programs as well as non-degree licensure programs
offered at the post baccalaureate level. Examples of these programs include those for teachers
who are preparing for a second license at the graduate level in a field different from the field
in which they have their first license; programs for teachers who are seeking a master’s
degree in the field in which they teach; and programs not tied to licensure, such as programs
in curriculum and instruction. In addition, advanced programs include those for other school
professionals such as school counselors, school psychologists, educational administrators,
and reading specialists.
*Assessment System. A comprehensive and integrated set of evaluation measures that
provides information for use in monitoring candidate performance and managing and
improving unit operations and programs for the preparation of professional educators.
*Candidates. Individuals admitted to, or enrolled in, programs for the initial or advanced
preparation of teachers, teachers continuing their professional development, or other school
professionals. Candidates are distinguished from “students” in P–12
schools.
*Conceptual Framework. An underlying structure in a professional education unit that
gives conceptual meaning to the unit's operations through an articulated rationale and
provides direction for programs, courses, teaching, candidate performance, faculty
scholarship and service, and unit accountability.
*Content (knowledge). The subject matter or discipline that teachers are being prepared to
teach at the elementary, middle level, and/or secondary levels. Content also refers to the
professional field of study (e.g., special education, early childhood, school psychology,
reading, or school administration).
*Dispositions. Professional attitudes, values, and beliefs demonstrated through both verbal
and nonverbal behaviors as educators interact with students, families, colleagues, and
communities. These positive behaviors support student learning and development. NCATE
expects institutions to assess professional dispositions based on observable behaviors in
educational settings. The two professional dispositions that NCATE expects institutions to
assess are fairness and the belief that all students can learn. Based on their mission and
conceptual framework, professional education units can identify, define, and operationalize
additional professional dispositions.
61
Diversity. Differences among groups of people and individuals based on socioeconomic
status, race, ethnicity, sexual orientation, language, religion, exceptionalities (both
disabilities and giftedness), and geographical area. The types of diversity necessary for
addressing the elements on candidate interactions with diverse faculty, candidates, and P–12
students are stated in the rubrics for those elements.
Educator as Decision Maker. The theme adopted by the College of Education to reflect the
complexity of practitioners’ roles in their practice. As a Unit, we strive to
prepare candidates to use reflective processes and make sound judgments.
ELCC (Educational Leadership Constituent Council). A project of the National Policy
Board for Educational Administration that develops standards for advanced programs in
educational leadership for principals, superintendents, curriculum directors, and supervisors.
Ethics. The College of Education’s commitment to creating an ethical environment that
promotes a culture of intellectual excellence, respect for diversity, caring, civility, and
responsibility.
Field/Clinical Experiences. A variety of early and ongoing field-based opportunities in
which candidates may observe, assist, tutor, instruct, and/or conduct research. Field
experiences may occur in off-campus settings such as schools, community centers, or
homeless shelters. Field experiences are identified as urban or suburban based upon whether
more than one ethnicity is significantly represented in the setting according to the U.S. Census.
As field placements are made, the candidate’s history of prior placements is reviewed, and
future placements are based upon candidate need.
*Initial Teacher Preparation. Programs at baccalaureate or post baccalaureate levels that
prepare candidates for the first license to teach.
Inquiry. Reflected in faculty inquiry in research and scholarly activities and student inquiry
in problem solving and decision making.
*INTASC (Interstate New Teacher Assessment and Support Consortium). A project of
the Council of Chief State School Officers (CCSSO) that has developed model performance-based standards and assessments for the licensure of teachers.
ISLLC (Interstate School Leaders Licensure Consortium). A project of the Council of Chief
State School Officers (CCSSO). ISLLC Standards are organized around the core proposition
that the most critical aspect of a school leader’s work is the continuous improvement of school
learning.
*Licensure. The official recognition by a state governmental agency that an individual has
met certain qualifications specified by the state and is, therefore, approved to practice in an
occupation as a professional.
*NBPTS (National Board for Professional Teaching Standards). An organization of
teachers and other educators, which has developed both standards and a system for assessing
the performance of experienced teachers seeking national certification.
NCATE (National Council for Accreditation of Teacher Education). NCATE is a
coalition of 33 specialty professional associations of teachers, teacher educators, content
specialists, and local and state policy makers. All are committed to quality teaching, and
together, the coalition represents over 3 million individuals. NCATE is the profession’s
mechanism for helping to establish high-quality teacher preparation. Through the process of
professional accreditation of schools, colleges, and departments of education, NCATE works
to make a difference in the quality of teaching and teacher preparation today, tomorrow, and
for the next century.
Outcomes Assessment. See Performance Assessment.
*Pedagogical Content Knowledge. The interaction of the subject matter and effective
teaching strategies to help students learn the subject matter. It requires a thorough
understanding of the content to teach it in multiple ways, drawing on the cultural
backgrounds and prior knowledge and experiences of students.
*Pedagogical Knowledge. The general concepts, theories, and research about effective
teaching, regardless of content areas.
*Performance Assessment. A comprehensive assessment through which candidates
demonstrate their proficiencies in subject, professional, and pedagogical knowledge, skills,
and professional dispositions, including their abilities to have positive effects on student
learning.
*Portfolio. An accumulation of evidence about individual proficiencies, especially in
relation to explicit standards and rubrics, used in evaluation of competency as a teacher or
other school professional. Contents might include end-of-course evaluations; tasks used
for instructional or clinical experience purposes, such as projects, journals, and observations
by faculty; videos; and reflective essays on the student teaching application.
Praxis™ tests. Praxis encompasses three categories of assessment provided by Educational
Testing Service (ETS) that are used as part of the teacher licensure process. Praxis I® is
taken prior to entry to the teacher education program; Praxis II® assesses Principles of
Teaching and Learning and subject specialty area(s); Praxis III® assesses classroom
performance.
*Professional Knowledge. The historical, economic, sociological, philosophical, and
psychological understandings of schooling and education. It also includes knowledge about
learning, diversity, technology, professional ethics, legal and policy issues, pedagogy, and
the roles and responsibilities of the profession of teaching.
*School Partners. P–12 schools that collaborate with the higher education institution in
designing, developing, and implementing field experiences, clinical practice, delivery of
instruction, and research.
*Standards. Written expectations for meeting a specified level of performance. Standards
exist for the content that P–12 students should know at a certain age or grade level.
*Technology, Use of. What candidates must know and understand about information
technology in order to use it in working effectively with students and professional colleagues
in the (1) delivery, development, prescription, and assessment of instruction; (2) problem
solving; (3) school and classroom administration; (4) educational research; (5) electronic
information access and exchange; and (6) personal and professional productivity.
Tk20. Tk20's HigherEd™ is an online assessment, accountability, and management system
developed to support college accreditation needs in areas such as course, program, and unit-level
assessments, standards-based portfolios, data aggregation, and report generation (Tk20, n.d.).
*Unit. The college, school, department, or other administrative body in colleges, universities,
or other organizations with the responsibility for managing or coordinating all programs
offered for the initial and advanced preparation of teachers and other school professionals,
regardless of where these programs are administratively housed in an institution. Also known
as the “professional education unit.” The professional education unit must include in its
accreditation review all programs offered by the institution for the purpose of preparing
teachers and other school professionals to work in pre-kindergarten through
twelfth grade settings.
Zip Report. A report made available to the College of Education by The University of
Akron's Office of Institutional Research.
* From NCATE (2008) glossary.
References
Astin, A.W., Banta, T.W., Cross, P., El-Khawas, E., Ewell, P.T., Hutchings, P., et al.
(1996). AAHE assessment forum: 9 principles of good practice for assessing
student learning. Retrieved June 19, 2009, from
http://www.academicprograms.calpoly.edu/pdfs/assess/nine_principles_good_practice.pdf
Banta, T., Lund, J.P., Black, K.E., & Oblander, F.W. (1995). Assessment in practice:
Putting principles to work on college campuses. San Francisco: Jossey-Bass.
Campbell, D., Melenyzer, B., Nettles, D., & Wyman, R. (2000). Portfolio and
performance assessment in teacher education. Boston: Allyn and Bacon.
Chickering, A. & Gamson, Z. F. (1987). Seven principles for good practice in
undergraduate education. Retrieved June 19, 2009, from
http://www.csuhayward.edu/wasc/pdfs/End%20Note.pdf
Darling-Hammond, L., Ancess, J., & Falk, B. (1995). Authentic assessment in action:
Studies of schools and students at work. New York: Teachers College Press.
Darling-Hammond, L. (1999). Teacher quality and student achievement: A review of state
policy evidence. Retrieved June 26, 2009, from the University of Washington, Center for
the Study of Teaching and Policy Web site:
http://www.nctaf.org/resources/archives/documents/LDH_State_Policy_Evidence.pdf
Darling-Hammond, L., & Snyder, J. (2000). Authentic assessment of teaching in context.
Teaching and Teacher Education, 16(5-6), 523-545.
Gitomer, D.H., Latham, A.S., & Ziomek, R. (1999). The academic quality of prospective
teachers: The impact of admissions and licensure testing. Princeton, NJ: Educational
Testing Service.
Kuh, G.D. (2001). Assessing what really matters to student learning: Inside the National
Survey of Student Engagement. Change, 33(3), 10-17, 66.
National Council for Accreditation of Teacher Education (NCATE). (2008). Professional
standards for the accreditation of teacher preparation institutions. Washington, DC:
Author.
Ohio Department of Education. (2007). Standards for Ohio's educators. Retrieved June 16,
2009, from http://esb.ode.state.oh.us/PDF/Standards_OhioEducators.pdf
Schroeder, G.G. (2005). The UK College of Education database issue sets. Retrieved August
7, 2009, from the University of Kentucky, College of Education Web site:
http://ukdame.coe.uky.edu/dameportal/documents/UK%20Database%20Issue%20Sets%20v3.pdf
Stroble, E. (2000). Unit assessment systems. Retrieved June 16, 2009, from
http://www.ncate.org/documents/articles/stroble_unit%20assessment%20systems.pdf
Teacher Quality Partnership (2006). [2006 preservice cohort III state norm].
Unpublished raw data.
Teacher Quality Partnership (2007). [2007 preservice cohort IV state norm].
Unpublished raw data.
Tk20. (n.d.). Retrieved June 18, 2009, from http://www.tk20.com/
University of Akron (2008). Conceptual framework. Retrieved June 18, 2009, from
The University of Akron, College of Education Web site:
http://www.uakron.edu/colleges/educ/docs/CF-Fall08.pdf