Regional Mapping Report on Assessment in the Arab States
Survey of Student Assessment Systems in the Arab States
Systems Approach for Better Education Results (SABER)

Adnan El-Amine

Commissioned by the UNESCO Regional Bureau for Education in the Arab States - Beirut
and the Arab League Educational, Cultural and Scientific Organization (ALECSO)
Arab Regional Agenda for Improving Education Quality
November 2014

UNESCO Regional Bureau for Education in the Arab States - Beirut
Arab League Educational, Cultural and Scientific Organization (ALECSO)
N/2014/10/009
ISBN: 978-9973-15-356-2

Cataloguing record: Arab League Educational, Cultural and Scientific Organisation, the Arab Regional Agenda for Improving Education Quality, Tunis. Regional Mapping Report on Assessment in the Arab States / by Adnan El Amine. - Tunis: Arab League Educational, Cultural and Scientific Organization, Education Department; Beirut: UNESCO Regional Bureau for Education in the Arab States, 2014. - 155 p.
CONTRIBUTORS
National Researchers (Annex I)
Research Assistants
Rana ABDUL LATIF (UNESCO Beirut)
Salia HOTEIT
Questionnaire review and validation, and national report preparation
Marguerite Clarke (World Bank)
Julia Liberman (World Bank)
Vidyasri Putcha (World Bank)
Jem Heinzel Nelson (World Bank)
Supervision
Said Belkachla (UNESCO Beirut)
Monia Mghirbi (ARAIEQ Director - ALECSO)
© All rights reserved Arab League Educational, Cultural and Scientific Organization - ALECSO
Tunis, 2014
LB/2014/ED/PI/79
The designations employed and the presentation of material throughout this publication do not imply the expression
of any opinion whatsoever on the part of ALECSO and UNESCO concerning the legal status of any country, territory,
city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.
The ideas and opinions expressed herein are those of the author and do not necessarily represent those of ALECSO
and UNESCO.
Table of Contents

List of Tables
Introduction
Part I: Classroom Assessment (CA)
  I. Enabling Context
  II. Assessment Quality
  III. Benchmarking for Classroom Assessment
Part II: Examinations (EX)
  Introduction: Major Standardized Examinations
  I. Enabling Context
  II. System Alignment
  III. Assessment Quality
  IV. Benchmarking for Examinations
Part III: National Large-Scale Assessment (NLSA)
  I. Enabling Context
  II. System Alignment
  III. Assessment Quality
  IV. Benchmarking for NLSA
Part IV: International Large-Scale Assessment (ILSA)
  I. Enabling Context
  II. System Alignment
  III. Assessment Quality
  IV. Benchmarking for ILSA
General Conclusion
  1. Overall Picture
  2. Enabling Context
  3. System Alignment
  4. Assessment Quality
ANNEXES
  Annex I: List of National Researchers
  Annex II: List of National Validation Workshops
  Annex III: SABER-SA Questionnaires
    • Classroom Assessment
    • Examinations
    • National Large-Scale Assessment
    • International Large-Scale Assessment
List of Tables

Table 1: Assessment types and their key differences
Table 2: Framework for building an effective assessment system, with indicator areas
Table 1.1: Country system-level documents on classroom assessment guidelines
Table 1.2: Availability of system-level documents on CA guidelines to the public
Table 1.3: Teacher resources on classroom assessment
Table 1.4: System-level mechanisms for teachers' skills and expertise development
Table 1.5: Reasons for conducting classroom assessment activities
Table 1.6: Knowledge and skills measured in classroom assessment
Table 1.7: Issues and challenges in classroom assessment activities
Table 1.8: Records of student results
Table 1.9: Required uses of classroom assessment
Table 1.10: Benchmarking results for classroom assessment (by country and status)
Table 2.1: Standardized examinations
Table 2.2: Types of examination documents
Table 2.3: Stakeholders' support for policy
Table 2.4: Activities covered by funding allocated for the examination
Table 2.5: Bodies responsible for running the examination
Table 2.6: Facilities available to carry out the examinations
Table 2.7: Issues in the performance of human resources
Table 2.8: Learning opportunities in educational measurement and evaluation
Table 2.9: Publicly available material on the examination
Table 2.10: Examination-related tasks performed by teachers
Table 2.11: Mechanisms in place to ensure examination quality
Table 2.12: Inappropriate behaviors that diminish the credibility of the examination
Table 2.13: Options for students who do not perform well
Table 2.14: Mechanisms in place to monitor the consequences of the examination
Table 2.15: Benchmarking results for examinations (by country and status)
Table 3.1: Name of large-scale assessment, frequency and population
Table 3.2: Policy document nature and date of issuing
Table 3.3: Staffing adequacy
Table 3.4: Opportunities available for professional development on educational measurement and evaluation
Table 3.5: Teacher training provision on NLSA
Table 3.6: Frequency of mechanisms in place to ensure the quality of the NLSA
Table 3.7: Frequency of mechanisms to disseminate NLSA results
Table 3.8: Frequency of mechanisms in place to monitor consequences of the NLSA
Table 3.9: Benchmarking results for national large-scale assessment (by country and status)
Table 4.1: Country participation in previous international assessments
Table 4.2: Country participation in upcoming international assessments
Table 4.3: Country policy documents addressing participation in international assessments
Table 4.4: Benchmarking results for international large-scale assessment (by country and status)
Table 5.1: Benchmarking - overall picture
Table 5.2: Enabling context, sorting indicators based on benchmarking results
Table 5.3: System alignment, sorting indicators based on benchmarking results
Table 5.4: Enabling context, system alignment, and assessment quality comparison
Table 5.5: Assessment quality, sorting indicators based on benchmarking results
INTRODUCTION
It is not enough for children to be enrolled in school
and sitting in classrooms. For the benefits of education
to accrue, children must be learning. But how do we
measure whether children are learning and what do we
do with that information?
Effective assessment of student learning and
achievement is a key component of any successful
education system. Research shows that the right kinds
of assessment activities, and the right uses of data
resulting from those activities, can contribute to better
learning outcomes and more well-informed policy
decisions. As governments strive to improve student
learning outcomes, it is vital for them to develop strong
systems for assessing students’ learning and academic
achievement.¹

¹ Much of the assessment-related text in the Introduction section of this report is taken directly from Marguerite Clarke (2012), What matters most for student assessment systems: a framework paper, The World Bank. Please refer to that paper for further details.
ARAIEQ
Improving the quality of education is one of the most
important and urgent challenges for the future of
the Arab world. This was acknowledged at the Doha
Colloquium on Quality Education for All (September 2010). The Doha Colloquium concluded with a request from the Arab Ministers of Education to the three organizing institutions - the Arab League Educational, Cultural and Scientific Organization (ALECSO), the Qatar Foundation and the World Bank - to propose an Action Plan that would systematically address the challenge of
improving education quality in the region. The resulting
Action Plan proposed a programmatic response in
the form of an Arab Regional Agenda for Improving
Education Quality (ARAIEQ). ARAIEQ is an umbrella
initiative that aims to tie a number of current programs
and institutions together into a coherent framework,
and is based on the strategic principle of regional
consensus and partnership, intended as a formal and
enduring relationship among prominent regional and
global organizations sharing the mission of improving
education quality. On December 22, 2010, ARAIEQ was
endorsed by the Arab Ministers of Education.
Structure of ARAIEQ
The Arab League Educational, Cultural and Scientific
Organization (ALECSO) is responsible for the
management and the coordination of ARAIEQ. ARAIEQ
comprises five regional programmes, each executed by
a host institution:
1. Arab Program on Curriculum Innovation,
Qualifications, and ICTs in Education (APIQIT),
hosted by the National Center for Education
Technologies (CNTE) in Tunis, Tunisia;
2. Arab Program on Teacher Policies (APTP), hosted
by the Queen Rania Teacher Academy (QRTA) in
Amman, Jordan;
3. Arab Program on Education Evaluation and
Policy Analysis (APEEPA), hosted by the UNESCO
Regional Bureau for Education in the Arab States
in Beirut, Lebanon;
4. Arab Program on Early Childhood Development
(APECD), hosted by the Arab Resource Collective
in Beirut, Lebanon;
5. Arab Program on Entrepreneurship (APEEI),
hosted by Injaz El Arab in Amman, Jordan.
APEEPA
The UNESCO Regional Bureau for Education in the Arab
States – Beirut is the home to APEEPA. The overall goal
of APEEPA is to strengthen national capacity in analyzing
assessment data and offering contextual benchmarking
for all aspects of education quality. The focus of APEEPA
is on analyzing, interpreting and exploring results. Arab
States are heavily engaged in national and international
assessments, but little work is done on the translation of
data into information, policy and practice.
APEEPA consists of a number of activities, including:
1. Mapping of national evaluation systems in the
Arab countries;
2. Technical assistance to individual countries for
the development of comprehensive national
capacity in the area of assessment;
3. Technical workshops that aim at identifying
national educational issues, improving analysis
skills and developing national reports; and
4. Policy seminars that aim at presenting evidence-based policy guidance to policy makers in order to improve education quality.
This regional report discusses the findings of the
mapping exercise that was conducted with 17 Arab
countries between February and April 2013.
In order to gain a better understanding of the
strengths and weaknesses of the assessment systems
in the Arab region, Arab countries were invited to
take part in benchmarking their evaluation systems
using standardized tools developed by the World Bank
Systems Approach for Better Education Results (SABER)
program. SABER is an evidence-based program to help
countries systematically examine and strengthen the
performance of different aspects of their education
systems.
SABER-Student Assessment
SABER-Student Assessment is a domain of the SABER
program. The goal of SABER-Student Assessment is to
promote stronger assessment systems that contribute to
improved education quality and learning for all. One of
the activities of the SABER-Student Assessment domain
is to help countries benchmark their student assessment
systems.
National governments and international agencies are
increasingly recognizing the key role that assessment of
student learning plays in an effective education system.
The importance of assessment is linked to its role in:
(i) providing information on levels of student
learning and achievement in the system;
(ii) monitoring trends in education quality over
time;
(iii) supporting educators and students with real-time information to improve teaching and learning; and
(iv) holding stakeholders accountable for results.
SABER-Student Assessment methodology
The SABER-Student Assessment framework is built
on the available evidence base for what an effective
assessment system looks like. The framework provides
guidance on how countries can build more effective
student assessment systems. The framework is structured
around two main dimensions of assessment systems: (a)
the types or purposes of assessment activities and (b)
the quality of those activities.
Assessment Types and Purposes
Assessment systems tend to be comprised of three main
types of assessment activities, each of which serves a
different purpose and addresses different information
needs. These three main types are: classroom assessment,
examinations, and large-scale system level assessments.
Classroom assessment provides real-time information
to support ongoing teaching and learning in individual
classrooms. Classroom assessments use a variety of
formats, including observation, questioning, and paper-and-pencil tests, to evaluate student learning, generally
on a daily basis.
Examinations provide a basis for selecting or certifying
students as they move from one level of the education
system to the next (or into the workforce). All eligible
students are tested on an annual basis (or more often
if the system allows for repeat testing). Examinations
cover the main subject areas in the curriculum and
usually involve essays and multiple-choice questions.
Large-scale system-level assessments provide feedback
on the overall performance of the education system
at particular grades or age levels. These assessments
typically cover a few subjects on a regular basis (such
as every 3 to 5 years), are often sample based, and use
multiple-choice and short-answer formats. They may be
national or international in scope.
Table 1 summarizes the key features of these main types
of assessment activities.
Table 1. Assessment types and their key differences

Classroom assessment
• Purpose: to provide immediate feedback to inform classroom instruction
• Frequency: daily
• Who is tested: all students
• Format: varies from observation to questioning to paper-and-pencil tests to student performances
• Coverage of curriculum: all subject areas
• Additional information collected from students: yes, as part of the teaching process
• Scoring: usually informal and simple

Examinations - Exit
• Purpose: to certify students as they move from one level of the education system to the next (or into the workforce)
• Frequency: annually, and more often where the system allows for repeats
• Who is tested: all eligible students
• Format: usually essay and multiple choice
• Coverage of curriculum: covers main subject areas
• Additional information collected from students: seldom
• Scoring: varies from simple to more statistically sophisticated techniques

Examinations - Entrance
• Purpose: to select students for further educational opportunities
• Frequency: annually, and more often where the system allows for repeats
• Who is tested: all eligible students
• Format: usually essay and multiple choice
• Coverage of curriculum: covers main subject areas
• Additional information collected from students: seldom
• Scoring: varies from simple to more statistically sophisticated techniques

Large-scale assessment surveys - National
• Purpose: to provide feedback on the overall health of the system at particular grade/age level(s), and to monitor trends in learning
• Frequency: for individual subjects offered on a regular basis (such as every 3-5 years)
• Who is tested: sample or census of students at a particular grade or age level(s)
• Format: usually multiple choice and short answer
• Coverage of curriculum: generally confined to a few subjects
• Additional information collected from students: frequently
• Scoring: varies from simple to more statistically sophisticated techniques

Large-scale assessment surveys - International
• Purpose: to provide feedback on the comparative performance of the education system at particular grade/age level(s)
• Frequency: for individual subjects offered on a regular basis (such as every 3-5 years)
• Who is tested: a sample of students at a particular grade or age level(s)
• Format: usually multiple choice and short answer
• Coverage of curriculum: generally confined to one or two subjects
• Additional information collected from students: yes
• Scoring: usually involves statistically sophisticated techniques

Source: Marguerite Clarke (2012), What matters most for student assessment systems: a framework paper, The World Bank.
Quality Drivers of an Assessment System
The key considerations when evaluating a student
assessment system are the individual and combined
quality of assessment activities in terms of the adequacy
of the information generated to support decision
making. There are three main drivers of information
quality in an assessment system: (a) enabling context,
(b) system alignment, and (c) assessment quality.
Enabling context refers to the broader context in which
the assessment activity takes place and the extent to
which that context is conducive to, or supportive of,
the assessment. It covers such issues as the legislative or
policy framework for assessment activities; institutional
and organizational structures for designing, carrying
out, or using results from the assessment; the availability
of sufficient and stable sources of funding; and the
presence of trained assessment staff.
System alignment refers to the extent to which the
assessment is aligned with the rest of the education
system. This includes the degree of congruence
between assessment activities and system learning
goals, standards, curriculum, and pre- and in-service
teacher training.
Assessment quality refers to the psychometric quality
of the instruments, processes, and procedures for the
assessment activity. It covers such issues as design and
implementation of assessment activities, analysis and
interpretation of student responses to those activities,
and the appropriateness of how assessment results are
reported and used.
Crossing the quality drivers with the different
assessment types/purposes provides the framework and
broad indicator areas shown in Table 2. This framework
is a starting point for identifying indicators that can be
used to review assessment systems and plan for their
improvement.
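
Purely as an illustration of how this two-dimensional framework can be held in a structured form for analysis, the sketch below expresses the crossing of quality drivers and assessment types as a simple data structure. The structure and the helper function are the author's assumptions for illustration only; they are not part of the SABER toolkit. The driver and indicator-area labels are those listed in Table 2.

```python
# Illustrative sketch only: a minimal representation of the Table 2 framework.
# The driver and indicator-area labels come from Table 2; the data structure
# and helper function are assumptions made for illustration, not SABER tools.

ASSESSMENT_TYPES = [
    "Classroom assessment",
    "Examinations",
    "Large-scale, system-level assessment",
]

INDICATOR_AREAS = {
    "Enabling Context": [
        "Policies",
        "Leadership and public engagement",
        "Funding",
        "Institutional arrangements",
        "Human resources",
    ],
    "System Alignment": [
        "Learning/quality goals",
        "Curriculum",
        "Pre- and in-service teacher training opportunities",
    ],
    "Assessment Quality": [
        "Ensuring quality (design, administration, analysis)",
        "Ensuring effective uses",
    ],
}


def indicator_areas(assessment_type: str, driver: str) -> list:
    """List the indicator areas reviewed for one assessment type and driver.

    The same indicator areas apply to every assessment type, so the type only
    scopes the review; it does not change the list of areas.
    """
    if assessment_type not in ASSESSMENT_TYPES:
        raise ValueError(f"Unknown assessment type: {assessment_type}")
    return INDICATOR_AREAS[driver]


# Example: the enabling-context indicator areas for examinations.
print(indicator_areas("Examinations", "Enabling Context"))
```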
Mapping Exercise Methodology
All 19 Arab countries covered by UNESCO Beirut were
invited to take part in the mapping exercise through
their National Commissions for UNESCO. Countries
were requested to nominate at least three national
researchers. Once the nominations were received,
UNESCO Beirut selected a national researcher from
each country based on certain identified required
qualifications. The national researchers were responsible
for completing the SABER-Student Assessment tools,
which are comprised of four questionnaires, by
collecting information from key informants and official
sources supported by all necessary documentation, in
relation to the questions raised in the questionnaires on:
1. Classroom Assessment (CA),
2. Examinations (Ex),
3. National Large-Scale Assessments (NLSA), and
4. International Large-Scale Assessment (ILSA).
Table 2: Framework for building an effective assessment system, with indicator areas

For each of the three assessment types - classroom assessment, examinations, and large-scale, system-level assessment - the framework covers the following indicator areas:

Enabling Context
• Policies
• Leadership and public engagement
• Funding
• Institutional arrangements
• Human resources

System Alignment
• Learning/quality goals
• Curriculum
• Pre- and in-service teacher training opportunities

Assessment Quality
• Ensuring quality (design, administration, analysis)
• Ensuring effective uses

Source: Marguerite Clarke (2012), What matters most for student assessment systems: a framework paper, The World Bank.
With the exception of Algeria and Morocco, 17 out of
the 19 contacted countries confirmed their participation
in the mapping exercise. These are: Bahrain, Egypt,
Iraq, Jordan, Kuwait, Kingdom of Saudi Arabia (KSA),
Lebanon, Libya, Mauritania, Oman, Palestine, Qatar,
Sudan, Syria, Tunisia, United Arab Emirates (UAE) and
Yemen. Annex I lists the national consultants responsible
for conducting the field survey in each country.
Regarding the main phases of the survey, the
questionnaires were first provided by the World Bank
in two languages, English and Arabic. Validation of the
Arabic version of the questionnaire took place at the
UNESCO Office in Beirut. The updated version was sent
to the national researchers for feedback and additional
comments. The UNESCO office then finalized the Arabic
version of the questionnaire and sent it to the relevant
national researchers. The survey was launched in 16
countries, since Kuwait had already been surveyed by the World Bank, and a national report had
been written about the country on the basis of the data
collected using the aforementioned questionnaires.
A training session was organized by the regional
consultant for the national researchers individually by
Skype or phone, and the data collection methodology
was reviewed, with added emphasis on two rules: (1) to
double check the data provided by the informants, and
(2) to collect any available supporting documents.
Validation of the completed questionnaires took place
at the UNESCO Office - Beirut initially, by checking the
internal alignment of the answers (between questions
themselves and between questions and comments),
their clarity and the relevance of the supporting
documents. The national researchers were asked to
clarify, correct or provide supporting documents where
needed. General instructions were often sent to all the
national researchers to clarify some frequently-occurring
matters and to create a common understanding of the questionnaires.
The completed and validated questionnaires were then
translated into English, with answers and comments
included. Following translation and review, the English
version of the questionnaires were sent to the World
Bank. The SABER-Student Assessment team at the World
Bank performed a second validation of the questionnaires
by examining the internal alignment, clarity and
completeness of the information on the national
evaluation system. The SABER-Student Assessment
team’s comments and questions on the individual
questionnaires were sent to the national consultants for
feedback. Accordingly, all new information, comments
or corrections were then integrated into the finalized
questionnaires and submitted to the World Bank.
The SABER-Student Assessment team benchmarked
each country’s student assessment system using
standardized rubrics that provide each country with
some sense of the development level of its assessment
activities compared to best or recommended practice in
each area. For each driver, the rubric displays four
development levels—Latent, Emerging, Established,
and Advanced. This phase led to the production of national reports on the student assessment systems of the participating countries.
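
The rubric itself is a qualitative World Bank document and is not reproduced here. Purely as an illustrative aid, the sketch below assumes a simple enumeration of the four development levels named above, together with a hypothetical container for recording the level assigned to each assessment type in a country report; the class, field and country names are the author's assumptions, not part of the SABER methodology.

```python
# Illustrative sketch only: the four development levels named in the text,
# plus a hypothetical container for a country's benchmarking results.
# Nothing here reproduces the actual SABER rubrics, which are qualitative.
from dataclasses import dataclass, field
from enum import Enum


class DevelopmentLevel(Enum):
    LATENT = 1
    EMERGING = 2
    ESTABLISHED = 3
    ADVANCED = 4


@dataclass
class CountryBenchmark:
    country: str
    # Maps an assessment type (e.g. "CA", "EX", "NLSA", "ILSA") to the
    # development level assigned for it in the national report.
    levels: dict = field(default_factory=dict)

    def set_level(self, assessment_type: str, level: DevelopmentLevel) -> None:
        self.levels[assessment_type] = level


# Hypothetical usage with made-up values, shown only to illustrate the structure.
example = CountryBenchmark("Exampleland")
example.set_level("CA", DevelopmentLevel.EMERGING)
example.set_level("EX", DevelopmentLevel.ESTABLISHED)
print({name: level.name for name, level in example.levels.items()})
```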
UNESCO then started a process of validating the national reports. For this purpose, the UNESCO National Commission offices in every participating country were asked to organize a workshop aimed at reviewing and
validating the national report. It was agreed to invite
to this workshop stakeholders and staff concerned with
the national system of assessment, and to provide them
with a copy of the report prior to the workshop.
Each workshop included four sessions:
1. The UNESCO expert’s presentation of the
project background, its phases, methodology
and the manner in which the national reports
were prepared;
2. The national expert’s presentation of the
national report;
3. The review and discussion of the national report
by three working groups: the first reviews the
part related to classroom assessment, the second
reviews the part related to examinations, and
the third reviews the parts related to national
and international large-scale assessments; and
4. The presentation of the working groups’
remarks, as agreed upon by the group. These
remarks were provided in writing and included
changes in the content of the report as well as
in the initial responses to questionnaires that
were completed by the national consultant.
Thirteen workshops were held; the first in Sudan on 31
October 2013 and the last in Jordan on 28 August 2014
(see Annex II for list of national workshops and dates).
UNESCO then incorporated the comments that emerged from the validation workshops into the English version of the national reports and sent them to the SABER-Student Assessment team at the World Bank. In turn, the team examined the proposed changes and finalized the national reports. At the time of submitting
this regional report, not all national reports had been
finalized.
This regional comparative report has been prepared
in two phases. In the first phase, the comparison was
based on data collected through the questionnaires. In
the second phase, the comparison was based on the
national reports, that is, on the benchmarking. The first
draft of the report was submitted on 26 May 2014. The
present and second version of the report was prepared
after validation of national reports during the national
validation workshops. In fact, only 12 validated reports
were considered in the revision, namely those completed by the
1st of September. These countries are Bahrain, Egypt,
Iraq, KSA, Lebanon, Mauritania, Oman, Palestine,
Tunisia, Sudan, UAE, and Yemen. The data related to
the remaining reports was kept as it was before the
validation workshops.
This report provides a description of student assessment
systems in seventeen Arab countries. It covers their
four types of assessment (classroom assessment,
examinations, national large-scale assessment, and
international large-scale assessment) and their three
quality drivers:
(a) enabling context,
(b) system alignment, and
(c) assessment quality.
The purpose of this report is to present a descriptive
comparison of these assessment systems, and to provide
a summary of the benchmarking exercise displaying the
development levels for each assessment type in each
surveyed country.
PART ONE
Classroom Assessment (CA)

I. Enabling Context
II. Assessment Quality
III. Benchmarking for Classroom Assessment (CA)
I. ENABLING CONTEXT
1. Policy Documents
Out of the 17 surveyed countries, 14 reported having
a formal state-level document that provides guidelines
for classroom assessment, such as content, format,
expectations, scoring criteria and uses. Palestine and
Sudan reported having an informal or draft document
providing classroom assessment guidelines, while Iraq
reported not having any such document. Table 1.1 shows
the title of the relevant document in each country as
well as the authorizing body and year of authorization.
Table 1.1: Country system-level documents on classroom assessment guidelines (q2)
(Note: "q2" refers to the number of the relevant question in the questionnaire pertaining to the assessment.)

Country | Official Document | Authorizing Body | Year of Authorization
1. Bahrain | Educational Assessment System; Teachers' Guide for Assessing the Daily Tasks of Students in Basic Education | Center of Measurement and Assessment | 2010
2. Egypt | Ministerial decision number 313 of 2011 regarding the re-organization of the Comprehensive Educational Assessment applied to Basic Education in both of its cycles, Primary and Intermediate | Ministry of Education | 2011
3. Iraq | - | - | -
4. Jordan | Guidance brochures on classroom assessment and student assessment report | Department of Examinations and Tests - Directorate of Tests | 2011
5. KSA | Rules of Student's Assessment (2007-2008); Education Policy in the Kingdom (1995) | Ministry of Education | 2007-2008
6. Kuwait | Fundamental Document for (Primary, Intermediate, Secondary) Stages in the State of Kuwait | Ministry of Education | 2008-2009
7. Lebanon | Minister Decision no. 666/m/2000 and its amendments, Decision no. 940/m/2001 | Ministry of Education and Higher Education | 2000-2001
8. Libya | Rules of Grades Distribution for Courses in the Secondary Education for the school year 2012-2013 AD; Rules of Grades Distribution for Courses in Basic Education for the school year 2012-2013 AD | Center for Educational Curricula and Research - Curricula Department at the Ministry of Education | Updated yearly; 1995
9. Mauritania | Primary Level: Evaluator's Factsheets according to the competences approach; Secondary Level: Integration and Remediation Guidebooks (by subject) | Ministry of National Education and the Inspectorate General of National Education (MEN IGEN) | Primary: 2008; Secondary: 2000
10. Oman | The General Document for Learning Assessment of Students in Grades 1 to 12 and student assessment documents for each subject | General Directorate of Educational Assessment - Ministry of Education | 2012
11. Palestine | Basis of Success, Completion and Repetition for Grades 1 to 12 | Ministry of Education | 2010-2011
12. Qatar | Assessment Policies for Grades Four to Eleven | Evaluation Institute - Supreme Education Council | 2010
13. Sudan | Guidance for the Two Levels | National Center for Curricula and Educational Research | 1995-1996 (latest edition 2007)
14. Syria | By-laws for Basic Education Schools; By-laws for Intermediate and Secondary Schools | Ministry of Education | 2004; 1994
15. Tunisia | Calendar of the three examinations and their organizational structure; Promotion system in primary education | Ministry of Education | Yearly publications
16. UAE | Guidelines to the Implementation of Ongoing Assessment Tools | Directorate of Evaluation and Examinations - Ministry of Education | 2010
17. Yemen | General Regulations for Examinations | General Directorate for Examinations and Educational Assessment - Ministry of Education | 2001
The documents are available to the public in 13 countries
in various forms. In only seven of the 14 countries that
have a formal classroom assessment document, this
document is available online for all to access. This is the
case for Bahrain, Egypt, KSA, Libya, Oman, Qatar and
UAE. The document can be found in the public library in
Sudan, while for the rest of the countries, it is available
for internal institutions only. Table 1.2 shows where
these documents are made available.
Table 1.2: Availability of system-level documents on CA guidelines to the public (q4)
Columns: Online | Public library | Teacher training colleges | In-service courses for teachers | Other.
Other channels recorded in the survey include: schools; education departments and schools; school libraries; a public version available at schools and educational districts; the General Directorate of Education and the archive of the Center for Educational Research and Development; distribution to teachers; dissemination to all public, private and UNRWA schools; hard copies made available to every subject coordinator in independent schools; school principals and teachers; and distribution to teachers via educational institutions' directors.
In the UAE, the document provides instructions about
the format of classroom assessment only for public and
private schools that apply the Ministry’s curriculum.
Scoring criteria and application standards are provided
on the Electronic Student Information System.
Expectations are available in the assessment guidebooks
that support the textbooks. In Egypt, this document
is considered to be a referential guide for all teachers
given that it contains a great deal of details that are
used by teachers and directors almost on a daily basis.
In Lebanon, teachers have been trained on the policy
in the early phase of the implementation of the new
curricula in 2000-2001. As for Yemen, the document
is only available at the Ministry and at educational
offices archives, which makes it inaccessible to teachers
and therefore, they do not benefit from it. However,
there are training manuals for teachers, inspectors and
school principals that contain classroom assessment
instructions. Some have been trained; however, the
training has not yet covered the rest.
Aside from the formal documents, some countries,
such as Sudan, use unofficial documents that include
guidelines and training material related to good
examination measurements and evaluation principles,
criteria and control. In Libya, classroom assessment
guidelines are found in other documents as well, such as "Division of courses for basic education and secondary education cycles", a document that outlines the expected learnings in the different subjects, and "scoring criteria or rubrics for students' work". However, a concern raised
by some teachers was the length of the documents. In
Oman, all types of assessment documents are available,
in addition to scope and sequence matrices for each
subject.
Finally, some countries revisit and revise these documents. For example, in Qatar, the document is currently under revision for development purposes, while in Bahrain, it has been modified in light of field feedback and the results of teachers' utilization of the document.

2. Teacher Resources

In terms of resources available to teachers on a system-wide basis for their classroom assessment activities, and as Table 1.3 shows, the majority of countries surveyed make available to teachers a document that outlines what students are expected to learn in different subject areas at different grade levels, as well as textbooks and workbooks that provide support for classroom assessment. In four countries, online assessment resources are available (Jordan, KSA, Kuwait and Oman). None of the countries utilize computer-based testing with instant reports on students' performance as a resource for classroom assessment activities.
Table 1.3: Teacher resources on classroom assessment (q5)
Columns: Document outlines student learning expectations | Document outlines expected student performance level(s) | Textbooks or workbooks that provide support for classroom assessment | Scoring criteria or rubrics for student work | Item banks or pools with examples of questions | Online assessment resources | Computer-based testing with instant reports on student performance.
Most countries also provide documents that outline
what students are expected to learn in different subject
areas at different grade levels (except for Sudan and
Iraq). However in Egypt, Iraq, Libya, Palestine, Qatar,
Syria, and UAE, these documents don’t specify the
level of performance that students are expected to
reach in different subject areas at different grade levels.
However, there is a Scope and Sequence Matrix for each
educational subject that specifies the objectives that
should be achieved in each grade. For Palestine, the
document is the general curricula guidelines developed
in 1998 and 1999. It was adopted in principle for the
elaboration of textbooks; however, it was not released
to the public and remained limited to authors. This
document defines teaching objectives and a simple
description of the content and proposed assessment
methods by subject and grade. In Iraq, there are no
general educational objectives, but every book includes
behavioral objectives that need to be achieved in each
term and for each subject. In Yemen, these documents
are found in the Teacher Guide. However, this guide is
not made available to many school teachers, despite the
calls from school directors to make it accessible. These
topics are rather addressed through workshops and
training sessions.
A considerable number of countries provide teachers with scoring criteria or rubrics for assessing students' work. For example, in Palestine, scoring criteria are provided in the standardized tests that are centrally prepared by the Ministry, as well as the regional tests that are elaborated by the Directorates of Education. They are available in schools as "set scoring manuals". In Yemen, scoring criteria are more common in private schools and are usually applied on the basis of personal initiative and effort. In Kuwait, according to the Heads of Departments (HODs), each curriculum area has a website available to assist in creating tests and guiding assessment. The Ministry of Education has a website archive of previous tests to be used as examples.

As for item banks, these are only available in KSA, Kuwait, Oman, and Qatar. In Mauritania, a document on item banks is currently "under printing", proposing a contextualization methodology and a number of classroom assessment examples. In Syria, sample questions and scoring instructions are circulated to schools annually, and in Bahrain, a system of questions repository is currently being set up for the sciences and mathematics subjects. As for Libya, there are no question banks available for classroom assessment; however, teachers use question banks supplied by the Department of Examinations related to the final examinations for grade 9 of basic education and grade 12 of secondary education. Teachers use these questions as examples that benefit students of these two grades only. None of the countries surveyed provides computer-based testing with instant reports on students' performance. However, in Qatar, computerized examinations were conducted as a trial on a group of schools during the current academic year.

Some countries mentioned other resources available to teachers. For example, Qatar and Syria offer sample questions from previous examinations, while the UAE provides pilot examples. In Jordan, Performance Standards and Indicators Guides for all levels (except grades 1-3 and grade 11, as they have not been completed yet) are distributed to subject teachers (in Arabic, English, Science, and Mathematics). Finally, in Egypt, some teachers and directors are endeavoring to prepare guides and instructions pertaining to classroom assessment, but this is still on an individual initiative basis.

3. Teachers' Capacities
All countries reported having, in one way or another,
system-level mechanisms to ensure that teachers develop
skills and expertise in classroom assessment. While
eight countries reported having a plan for classroom
assessment within pre-service training (KSA, Lebanon,
Mauritania, Palestine, Qatar, Syria, Tunisia and Yemen),
the two most frequently adopted mechanisms are in-service teacher training and inclusion of a component
focused on classroom assessment in school inspection
or teacher supervision.
As shown in Table 1.4, in four countries (Kuwait, Oman,
Qatar and Syria), these mechanisms are well diversified, which demonstrates these countries' high political awareness of the importance of developing teachers' capacities in classroom assessment. Political awareness seems to be low in Iraq, Libya and Sudan, where the sole existing mechanism relies on in-service training or inspectors' supervision alone. In only eight countries do teachers have the opportunity to participate in exam item development for, or scoring of, large-scale assessments or exams.
In Egypt, the Professional Academy of Teachers
endeavors to train teachers on some of the classroom
assessment procedures while the National Authority
for Quality Assurance and Accreditation in Education
(NAQAAE) conducts external reviews of schools
applying for accreditation so as to ensure the fulfillment
of teachers and learners’ standards, including classroom
assessment.
Table 1.4: System-level mechanisms for teachers' skills and expertise development (q8)
Columns: Pre-service teacher training | In-service teacher training | All teacher training programs include CA required course | On-line resources on CA | Opportunities to participate in conferences and workshops | Participate in exam item development or scoring | School inspection or teacher supervision includes CA component.
In Libya, surveyed participants reported that training
courses are few and sparse, commercially oriented with
for-profit purposes and lacking any focus on educational
assessment methodology. Another issue in Libya is the
weakness in inspectors’ guidance of teachers because
the inspector-teacher relationship is not participatory
or collaborative; the inspector is rather focused on
assessment and catching errors without giving any
feedback or useful guidance to teachers.
In Mauritania, teacher education schools (TES) have
as part of their training a module on assessment,
which also incorporates courses dedicated to curricula
that integrate assessment examples for the different
competencies to be acquired. At the secondary level,
this is not quite the case, and young teachers
usually encounter assessment difficulties. Moreover,
there is a huge gap between the disciplinary education
provided by the TES for the secondary cycle teachers
and the concepts these teachers have to teach in
their classes. Continuous training remains rather
underdeveloped in Mauritania; almost all of the public
sector teachers have received an initial professional
training in the TES. However, the new National
Development Plan of the Mauritanian Educational
System (PNDSE II) encompasses a development strategy
for continuous training that incorporates aspects
related to teachers’ classroom assessment.
In Tunisia, in-service teacher training is conducted by pedagogical inspectors and assistants through field visits, or through continuous training during pedagogical days. In addition to that, teachers' promotion
from one rank to another is based on a pedagogical
score assigned by the educational inspector, elements
of which may include assessment methods. That is
why teachers pay a lot of attention to this issue as it
represents an important factor in the improvement
of their income. In-service teacher trainings are also
conducted in Bahrain on the topics of elaborating
achievement tests and using classroom assessment
methods. Follow-up on teachers’ implementation of
classroom assessment methods is also conducted.
In Oman, the Directorate of Educational Achievement
and Assessment and the Directorate of Educational
Supervision play a complementary role in both
the preparation and implementation of varied
relevant training programs; they also supervise the
implementation and monitor the training effect. As part
of the professional development programs, teachers
and supervisors are trained each year on implementing
classroom assessment. This takes place within planned
training programs and within the on-going follow
up by specialists from the Directorate of Educational
Achievement and Assessment on one hand, and
educational supervisors on the other.
In Palestine, mechanisms to ensure the development of
teachers’ skills in classroom assessment depend on the
availability of special funds. As for Syria, the Ministry
of Education embarked on a three-year curricula
development project for the General Education System
since the academic year 2009. During each academic
year, the Ministry of Education trains teachers and educators on the developed curricula. Among the most prominent and important training topics was classroom assessment. Recently, a center for education measurement and assessment was created. The center will be in charge of evaluating all the elements of the educational process, designing appropriate measurement tools and training teachers to develop their classroom assessment skills.
II. ASSESSMENT QUALITY
1. Purpose, Focus and Characteristics of
Classroom Assessment Activities
Classroom assessment activities are undertaken for
three main reasons (as summarized in Table 1.5 below):
to meet school-level and/or system-level requirements or
information needs, and to inform teaching and student
learning.
With the exception of Iraq and Libya, all surveyed
countries reported carrying out classroom assessment
activities in order to inform teaching and student
learning. In more than half of the countries, classroom assessment serves all three purposes (Bahrain, KSA, Kuwait, Mauritania, Oman, Palestine, Qatar, Sudan, and UAE). Egypt, Jordan, Lebanon, Libya, Syria, and Tunisia reported not using this assessment to meet school-level requirements, while in Iraq, Lebanon, Libya, Syria, and Yemen, classroom assessment does not serve system-level requirements. Only Lebanon and Syria reported that classroom assessment is used exclusively to inform teaching and their students' learning. Surprisingly, Iraq reported using classroom assessment only to meet school-level requirements. As for Libya, classroom assessment is only conducted as a prerequisite to students' success or failure, without specifying whether this is to meet school-level or system-level requirements.
Table 1.5: Reasons for conducting classroom assessment activities (q9)

Country | To meet system-level requirements or information needs | To meet school-level requirements or information needs | To inform teaching and student learning
1. Bahrain | ✓ | ✓ | ✓
2. Egypt | ✓ | - | ✓
3. Iraq | - | ✓ | -
4. Jordan | ✓ | - | ✓
5. KSA | ✓ | ✓ | ✓
6. Kuwait | ✓ | ✓ | ✓
7. Lebanon | - | - | ✓
8. Libya | - | - | -
9. Mauritania | ✓ | ✓ | ✓
10. Oman | ✓ | ✓ | ✓
11. Palestine | ✓ | ✓ | ✓
12. Qatar | ✓ | ✓ | ✓
13. Sudan | ✓ | ✓ | ✓
14. Syria | - | - | ✓
15. Tunisia | ✓ | - | ✓
16. UAE | ✓ | ✓ | ✓
17. Yemen | - | ✓ | ✓
In Mauritania’s secondary education, the system-level
pressure on teachers is quite strong (examinations) and
simultaneously real pressure exists at the school level to
get teachers to fulfill a significant number of classroom
assessments in between the examinations. As for Syria,
although it reported that teachers conduct classroom
assessment only to inform their own teaching and their
students’ learning at the present time, the assessment
process is highlighted as one of the most important
elements in the new curriculum. Yemen reported that
when it comes to meeting school-level requirements, some say that private schools use this assessment for
publicity purposes.
As can be seen in Table 1.6, classroom assessment
activities in all of the surveyed countries focus on
knowledge and skills in core curriculum areas. Most of
the countries also assess knowledge and skills in non-
core curriculum areas, with the exception of Egypt, Iraq,
Libya, Sudan and Yemen. In Mauritania for example, all
taught disciplines are assessed. However, teachers are
not aware of the weighting of the different skills and
knowledge.
Only seven countries assess non-cognitive skills such
as teamwork and self-discipline. These countries are
Bahrain, Egypt, Jordan, Palestine, Syria, Tunisia and
UAE. For example, in Bahrain, classroom assessment
activities evaluate reasoning, scientific thinking,
conflict resolution, critical thinking and creativity skills.
In Palestine, achievement exams measure the various
cognitive skills in almost all subject matter, with the
exception of physical education and art education
where grades are based on students’ performance. In
addition to that, 10-15% of the score of each subject
matter is based on students’ behavior and discipline
in the classroom, as well as daily performance and perseverance. As for Egypt, non-cognitive skills are limited to teamwork and perseverance skills in the primary cycle of basic education.

Table 1.6: Knowledge and skills measured in classroom assessment (q10)

Country | Knowledge and skills in core curriculum areas | Knowledge and skills in non-core curriculum areas | Non-cognitive skills
1. Bahrain | ✓ | ✓ | ✓
2. Egypt | ✓ | - | ✓
3. Iraq | ✓ | - | -
4. Jordan | ✓ | ✓ | ✓
5. KSA | ✓ | ✓ | -
6. Kuwait | ✓ | ✓ | -
7. Lebanon | ✓ | ✓ | -
8. Libya | ✓ | - | -
9. Mauritania | ✓ | ✓ | -
10. Oman | ✓ | ✓ | -
11. Palestine | ✓ | ✓ | ✓
12. Qatar | ✓ | ✓ | -
13. Sudan | ✓ | - | -
14. Syria | ✓ | ✓ | ✓
15. Tunisia | ✓ | ✓ | ✓
16. UAE | ✓ | ✓ | ✓
17. Yemen | ✓ | - | -
Some countries are still challenging the traditional views
of assessment. In Syria for instance, the focus group
reported that the traditional view of assessment still
prevails among teachers, educators and even parents.
Educators still focus solely on measuring information, even though the new curricula are based on national standards built around essential components such as the basics of knowledge; the development of life skills, most importantly teamwork; and critical thinking and interpersonal communication skills. The Ministry is
keen to develop educators and teachers’ assessment
skills and tools and to focus on measuring skills.
As for Libya, the current educational assessment,
including classroom assessment, is cognitive-based,
which is insufficient particularly in the first basic
education cycles. According to a group of teachers in
Secondary Education, final examinations for basic and
secondary education certificates are fundamentally
cognitive-based. This has negatively impacted the
assessment and evaluation methods used in the
classroom. The teachers have confirmed that there
is no document to assess the behavioral, emotional,
social and psychological aspects. The teachers also
indicated that the combined aspects must be taken into
consideration in the context of the assessment given
that they provide an integrated view of the level of a
student’s achievement as well as several psychological,
behavioral and moral elements and their impact on
educational achievement. Moreover, in basic education,
both Science and Computer teachers faced a problem in
carrying out classroom assessments, because the “rules
of grades distribution for courses in the basic cycle”
for the academic year 2012-2013 prepared by the
Ministry do not give importance to practical assessment and, in fact, do not mention it at all. When teachers applied practical assessment as a fundamental method, problems occurred with parents, who protested against it given that it is not mentioned in the grades distribution document.
It appears that in all countries, with the exception of
Kuwait, classroom assessment activities are mainly
about recalling information. Moreover, ten countries
reported that classroom assessment activities provide
little feedback to students. This does not seem to be
the case for Bahrain, Egypt, Jordan, Kuwait, Mauritania,
Qatar and Tunisia. Ten countries also reported grade
inflation as a serious problem they face in their
assessment activities; the exceptions are Egypt, Kuwait, Lebanon, Libya, Qatar, Sudan, and Tunisia.
Relying mainly on multiple choice and selection-type
questions is also a challenge in Egypt, Jordan, Kuwait,
Libya, Palestine, Qatar, Sudan, Syria, UAE, and Yemen.
In Syria, teachers typically develop a scale or criteria for marking written examinations. As for students' work
throughout the year, it is assessed without clear tools
and is measured based on written assignments and oral
recitations. Most teachers tend to assess a student’s
performance according to his/her written examination
achievement. This may be attributed to traditional
teaching methods used by teachers and educators
(relying for the most part on lecturing), as well as to
crowded classrooms. It could therefore be said that
there is a gap between the teaching strategies of the
education system and the implementation on the
ground. In other countries, such as Oman, classroom
assessment for grades 5 to 12 is diversified and carried
out in several forms including oral tests, quizzes,
observation, periodic reports and quarterly tests.
When asked about certain characteristics of classroom
assessment activities, the respondents to the survey
identified a number of issues and challenges that countries
are facing. These are highlighted in Table 1.7.
Table 1.7: Issues and challenges in classroom assessment activities (q11)

Columns: (1) Rely mainly on multiple-choice, selection-type questions; (2) Rely mainly on recalling information; (3) Teachers do not use explicit scoring criteria; (4) Errors in scoring or grading of students' work; (5) Uneven application of standards for grading students' work; (6) Grade inflation is a serious problem; (7) Parents poorly informed about students' grades; (8) Provide little useful feedback to students; (9) Mainly used as administrative or control tool; (10) Not aligned with pedagogical framework.

Country | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) | (9) | (10)
1. Bahrain | NC | C | R | R | R | C | R | R | NC | R
2. Egypt | C | VC | C | NC | UT | NC | NC | NC | NC | R
3. Iraq | NC | VC | VC | R | C | VC | R | VC | NC | R
4. Jordan | C | C | C | NC | VC | C | R | R | NC | R
5. KSA | NC | C | R | NC | NC | VC | NC | C | NC | R
6. Kuwait | VC | NC | R | R | R | NC | R | R | R | R
7. Lebanon | NC | C | C | NC | NC | NC | NC | C | NC | NC
8. Libya | VC | VC | VC | C | NC | NC | C | VC | VC | -
9. Mauritania | R | C | R | NC | C | VC | VC | NC | NC | C
10. Oman | NC | VC | C | C | C | C | R | C | C | NC
11. Palestine | C | C | C | NC | C | C | C | C | C | NC
12. Qatar | C | C | NC | NC | R | R | NC | NC | NC | NC
13. Sudan | C | VC | C | R | NC | R | NC | C | R | R
14. Syria | C | VC | NC | C | C | C | C | C | C | VC
15. Tunisia | NC | C | NC | R | R | NC | R | NC | NC | R
16. UAE | C | VC | NC | NC | VC | C | NC | C | NC | NC
17. Yemen | C | VC | VC | C | NC | C | R | C | VC | VC

Note: VC = Very Common; C = Common; NC = Not Common; R = Rarely; UT = Unable to Tell
In Egypt, Iraq, Jordan, Lebanon, Libya, Oman,
Palestine, Sudan, and Yemen, responses revealed that
it is common for teachers not to use explicit or a priori
criteria for scoring or grading students’ work. In UAE,
teachers differ to a great extent in their understanding
of the classroom assessment criteria and scoring, and
in their implementation of assessment and evaluation
of assessment goals. Uneven application of standards
for grading students’ work is also common in seven
countries, namely Iraq, Jordan, Mauritania, Oman,
Palestine, Syria, and UAE. However, observing errors in
scoring or grading students’ work is not a major issue
as it is only observed in Libya, Oman, Syria and Yemen.
Aligning classroom assessment activities with
pedagogical or curricular frameworks is common
in most countries surveyed. Only Mauritania, Syria
and Yemen expressed that a lack of alignment exists
between assessment and curriculum frameworks.
Moreover, 12 countries denied that assessment activities
are mainly used as administrative or control tools; the exceptions are Libya, Oman, Palestine, Syria, and Yemen. Finally, in
terms of assessment characteristics, informing parents
about students’ grades seems to be a common practice
in most countries as well, except for Libya, Mauritania,
Palestine, and Syria. In Bahrain for example, parents
are supplied with student performance assessment
criteria in advance, and they receive feedback regarding
their children’s performance by way of parent-teacher
meetings, text messages (SMS) and report cards in hard
copy and online.
2. Monitoring Mechanisms
In terms of system-level mechanisms that are in place to
monitor the quality of classroom assessment activities,
all the surveyed countries reported that classroom
assessment is a required component of school inspection
or teacher supervision. Classroom assessment is also
a required component of a teacher’s performance
evaluation in all countries except Iraq and Lebanon.
Bahrain, Egypt, Kuwait, Oman, Palestine, Qatar, Syria,
Tunisia, and UAE all have in place national or other
system-wide reviews of the quality of education that
include a focus on classroom assessment.
Aside from the monitoring mechanisms mentioned
above, government funding for research on the quality
of classroom assessment activities and how to improve
classroom assessment is only available in KSA, Kuwait,
Tunisia, and UAE. Oman and Qatar are the only two
countries that reported having an external moderation
system that reviews the difficulty of classroom
assessment activities, the appropriateness of scoring
criteria, etc.
3. Student Results
In all countries, results for individual students are
recorded in the teacher’s record book. With the
exception of Iraq, all countries also have a classroom
or a school database where student results are
recorded. Egypt, Jordan, KSA, Kuwait, Lebanon,
Libya, Mauritania, Oman, Qatar and Tunisia reported
that classroom assessment results are also recorded
in students’ own copybooks. Table 1.8 shows where
classroom assessment results for individual students in
each country are typically recorded.
Some countries reported having district-wide databases
or information systems to record student results. These
are KSA, Kuwait, Mauritania, Oman, Palestine, Sudan,
Tunisia, UAE, and Yemen. Among these, five countries
record student results as well in system-wide databases
or information systems. However, three countries, namely Bahrain, Jordan and Qatar, have records only in the system-wide databases and not at the district level.
In UAE, the students’ database is standardized for all
private and public schools that apply the curriculum
of the Ministry of Education. Student grades are
saved in an electronic system adopted by the Ministry
of Education and as such, the students’ performance
evaluation cards are extracted from the same system.
In Jordan, results are also recorded in EduWave, an
e-learning System.
Other record keeping systems include a register in
Bahrain available in both hard and electronic copies
for monitoring student scores. In Tunisia, assessment
documents and results are returned to students once
they have been duly corrected. At the end of every
semester, a score card is sent to the students’ parents.
Furthermore, the educational institution saves all
assessment data, which may not be communicated
to the students’ parents until the section council
has debated and validated them. This applies to all
Table 1.8: Records of student results (q13)
Country
Student’s own
copy book
1. Bahrain
2. Egypt
ü
3. Iraq
Teacher’s
record book
Classroom
or school
database
ü
ü
ü
ü
System-wide
database or
information
system
ü
ü
4. Jordan
ü
ü
ü
5. KSA
ü
ü
ü
ü
ü
6. Kuwait
ü
ü
ü
ü
ü
7. Lebanon
ü
ü
ü
8. Libya
ü
ü
ü
9. Mauritania
ü
ü
ü
ü
10. Oman
ü
ü
ü
ü
ü
ü
ü
ü
ü
13. Sudan
ü
ü
14. Syria
ü
ü
ü
ü
ü
16. UAE
ü
ü
ü
ü
17. Yemen
ü
ü
ü
ü
11. Palestine
12. Qatar
15. Tunisia
ü
ü
education levels. In Oman, for grade 12, there exists a
specific database at the central level. This system can be
referred to at any time by specialists of the Directorate of
Tests, the Directorate of Examinations, and by technical
personnel from the Department of Information Systems
if there is a need to amend or further develop it. Students and parents can view the results only during the results announcement period, via the Educational Portal.
4. Required Uses of Classroom Assessment
There are different required uses of classroom
assessment activities to promote and inform students’
learning. As Table 1.9 shows, all countries, except for
Libya, use assessment to provide feedback to students
on their learning. However, parents are not informed
about this feedback in Egypt, Iraq and Libya. Iraq, Libya
and Yemen do not use the feedback from classroom
assessment as a diagnostic tool for student learning
issues. In addition to these three countries,
Sudan and Syria also do not make use of this feedback
for planning purposes. Ten countries reported using
classroom assessment for grading students for internal
classroom uses. Classroom assessment activities are
used for providing input to external examination
programs in Palestine, Qatar, Sudan, Syria, Tunisia, and
UAE.
In UAE, classroom assessment activities are also used to
develop students’ self-evaluation skills and to evaluate
teachers’ performance in teaching and planning their
courses and activities, with the aim of improving them.
Palestine, Qatar and UAE seem to be the only countries
that make full use of classroom assessment for all
purposes of promoting and informing student learning.
In all surveyed countries, schools or teachers are required
to report on individual student’s performance to parents
and students. However, they are required to report to
the school district or Ministry of Education officials only
in ten countries - Bahrain, KSA, Mauritania, Oman,
Palestine, Qatar, Sudan, Tunisia, UAE, and Yemen.
Table 1.9: Required uses of classroom assessment (q15)
Country
Diagnosing
student
learning
issues
Providing
feedback to
students on
their learning
Informing
parents about
their child’s
learning
Planning
next
steps in
instruction
1. Bahrain
ü
ü
ü
ü
2. Egypt
ü
ü
3. Iraq
Grading
students
for internal
classroom
uses
Providing
input to an
external
examination
program
ü
ü
4. Jordan
ü
ü
ü
ü
5. KSA
ü
ü
ü
ü
ü
6. Kuwait
ü
ü
ü
ü
ü
7. Lebanon
ü
ü
ü
ü
8. Libya
ü
9. Mauritania
ü
ü
ü
ü
ü
10. Oman
ü
ü
ü
ü
ü
11. Palestine
ü
ü
ü
ü
ü
ü
12. Qatar
ü
ü
ü
ü
ü
ü
13. Sudan
ü
ü
ü
14. Syria
ü
ü
ü
15. Tunisia
ü
ü
ü
ü
16. UAE
ü
ü
ü
ü
ü
ü
17. Yemen
ü
ü
ü
ü
ü
ü
ü
III. BENCHMARKING FOR CLASSROOM ASSESSMENT
Overall View
1. Enabling Context and System Alignment
The surveyed countries range in their overall
development levels on the indicators of Classroom
Assessment between Emerging and Established. Nine
countries - Iraq, Jordan, Libya, Oman, Palestine, Sudan,
Syria, UAE, and Yemen – were found to have Emerging
levels of Classroom Assessment, with weak system-wide
institutional capacity to support and ensure the quality
of classroom assessment practices. The remaining eight
countries were found to have Established development
levels in this regard, with sufficient system-wide
institutional capacity to support and ensure the quality
of classroom assessment practices.
This quality driver assesses “the overall policy and
resource framework within which classroom assessment
activity takes place in a country or system, and the degree
to which classroom assessment activity is coherent with
other components of the education system”.
Three area indicators are included in this driver:
a. Setting clear guidelines for classroom
assessment
The majority of the countries seem to be well
developed (Established to Advanced levels), having
formal system-level documents that provide
guidelines for classroom assessment, made publically available to varying extents. The sole exception is Iraq, which shows latent development in this area with its absence of a system-level policy document. Palestine has an informal policy document, but it is widely available to the public.

Table 1.10: Benchmarking results for classroom assessment (by country and status)
Country          Latent   Emerging   Established   Advanced
1. Bahrain                               ü
2. Egypt                                 ü
3. Iraq                      ü
4. Jordan                    ü
5. KSA                                   ü
6. Kuwait                                ü
7. Lebanon                               ü
8. Libya                     ü
9. Mauritania                            ü
10. Oman                     ü
11. Palestine                ü
12. Qatar                                ü
13. Sudan                    ü
14. Syria                    ü
15. Tunisia                              ü
16. UAE                      ü
17. Yemen                    ü
b. Aligning classroom assessment with
system learning goals
The levels of development of the countries in this
indicator are more varied. Only two countries (Iraq
and Tunisia) have scarce system-wide resources
for teachers for classroom assessment (Emerging
level), while the rest have a range of available
resources (Established to Advanced levels). With
the exception of Iraq and Yemen, all the countries
have an official curriculum or standards document
that specifies what students are expected to learn.
In nine countries, this document also specifies the
level of performance required (Advanced level). Iraq
and Yemen have no official curriculum or standards
document for the purpose (Latent development).
c. Having effective human resources to carry out classroom assessment activities
The Arab countries are well developed in this area,
with 14 countries providing some system-level
mechanisms to ensure that teachers develop skills
and expertise in classroom assessment (Established
level), and the remaining three countries providing
a variety of such mechanisms (Advanced level).
Thus, in this quality driver, with the exception of
Iraq, the Arab countries are fairly established in
the overarching policy and resource framework
that provides the enabling context that is
conducive to, or supportive of, the classroom
assessment activities taking place in the country.
These activities appear to be coherent with other
components of the education system, providing
adequate system alignment and a fair degree of
congruence between assessment activities and
system learning goals, standards, curriculum, and
teacher training.
2. Assessment Quality
This driver assesses the “quality of classroom assessment
design, administration, analysis, and use”. It includes two
indicators: ensuring the quality of classroom assessment
and ensuring effective uses of classroom assessment.
a. Ensuring the quality of classroom assessment
In this area, all the countries were found to be at a rather average level: 10 countries have classroom assessment practices that were found to be weak (Emerging level), and the remaining seven have practices of moderate quality (Established level). Egypt, Iraq, Lebanon and
Mauritania have ad hoc mechanisms to monitor
the quality of the classroom assessment practices
(Emerging level), eight countries have limited
systematic monitoring mechanisms (Established
level), while the remaining five countries have varied
and systematic mechanisms in place to monitor the
quality of the assessment practices (Advanced level).
b. Ensuring effective uses of classroom
assessment
The surveyed countries appear to be better in
ensuring effective uses of classroom assessment
than in ensuring its quality. Dissemination is
well–developed, for in all the countries classroom
assessment information is required to be
disseminated to key stakeholders – to some (for
seven countries; Established level) or all stakeholders
(10 countries; Advanced level). As for the uses of
classroom assessment to support student learning,
the countries range in development. Egypt, Iraq,
Libya, Sudan, and Yemen have limited required
uses of classroom assessment to support student
learning (Emerging level). The remaining countries
show adequate required uses of classroom
assessment to support student learning, but more
than half of these countries exclude its use as an
input for external examination results while the rest
include it (Established to Advanced levels).
In this driver of Classroom Assessment Quality, the
surveyed countries seem to be more able to ensure
effective use of classroom assessment than to
ensure the quality of the assessment activity itself.
PART TWO
Examinations (EX)
INTRODUCTION: MAJOR STANDARDIZED EXAMINATIONS ..... 32
I. ENABLING CONTEXT ......................................................... 34
II. SYSTEM ALIGNMENT ......................................................... 41
III. ASSESSMENT QUALITY ..................................................... 44
IV. BENCHMARKING FOR EXAMINATIONS ............................. 49
INTRODUCTION: MAJOR STANDARDIZED EXAMINATIONS
The surveyed countries were asked to present their
three major standardized examinations. Three countries
reported the existence of one examination, seven
countries reported two examinations, and the remaining seven countries reported three examinations.
Almost all countries use the official examinations
for graduation from secondary education and/or as a condition for entrance to university. Only KSA does not
have such an exam; it only has school examinations
instead. To enter university, students in KSA have to pass
a General Aptitude Test (GAT). Table 2.1 provides a brief
description of these examinations.
With the exception of Qatar, all the secondary
examinations mentioned were first administered in the
respective countries more than 10 years ago, and most at
the 12th grade level, except Tunisia and Mauritania (at the
13th grade), and Sudan (at the 11th grade). Accordingly,
the modal age of students is 18 years old. For Qatar,
the standardized examination was first administered
five to ten years ago. The examination in Libya was first
introduced in 1989; that of Iraq was introduced more
than thirty years ago, while the examinations in Lebanon
and Bahrain date all the way back to 1925 and 1958,
respectively. These examinations deal with general
education. Vocational and technical education at the
secondary level has different examination arrangements.
Table 2.1: Standardized examinations (q1)
Country
First major exam
Second major exam
Third major exam
1. Bahrain
Examination for the Certificate
of the Completion of
General Secondary Education
Completion
Educational achievement quality Promotion exams
control examinations
2. Egypt
Certificate of Completion of the
General Secondary Education
IGCSE certificate
American High School
Diploma
3. Iraq
Ministerial Examination
(Baccalaureate) for Intermediate
sixth Grade- Scientific and
Literary Sections
Ministerial Examination
(Baccalaureate) for Primary
Sixth Grade
Ministerial Examination
(Baccalaureate) for Third
Grade Intermediate
4. Jordan
General Secondary Examination
National Examination (education
quality control examination)
5. KSA
General Aptitude Test
Achievement Test for Science
Majors - Male and Female
Students
6. Kuwait
Thunaweya Ama
(General Secondary)
7. Lebanon
General Secondary Diploma,
(Baccalaureate)
Intermediate Certificate (Brevet)
8. Libya
End of secondary education
exam (the main examination for
university admission)
End of basic education
examination
9. Mauritania
Baccalaureate
Entrance examination for the
1st year of secondary education
Hssen Test Program
Lower secondary school
certificate (9th grade)
Country
First major exam
Second major exam
10. Oman
General Education Diploma
Examination
Cognitive Development Program
Examination
11. Palestine
General Secondary Education –
Tawjihi
12. Qatar
General Secondary Examination
Comprehensive Educational
Evaluation Examination
13. Sudan
Secondary Certificate
Examinations
Basic Education Certificate
Examination
14. Syria
General Secondary Certificate
Basic Education Certificate
Examination
15. Tunisia
Baccalaureate
Certificate of completion of
basic education
16. UAE
Twelfth grade examination
Common Educational
Proficiency Assessment (CEPA)
17. Yemen
General Secondary Certification
Examination (Scientific section)
General Secondary Certification
Examination (Literature section)
1. Main Purpose
For most countries, the standardized examinations at
the secondary level have a double function: (1) student
certification for grade or school cycle completion,
and (2) student selection to university or other higher
education institution. In Bahrain, Egypt and Kuwait the
examinations have a third function, which is student
selection or promotion for grades/courses/tracks in
secondary school. This does not imply a negation of the
two main functions. It should be read that the results of
secondary examinations (students’ scores) are the basis upon which students are enrolled in appropriate majors
(or tracks) at universities. Success in secondary education
in Egypt, for example, is necessary but not sufficient to
enroll in public university. With relatively low scores,
students may not find places at the university and may
have to go to private universities or to study abroad.
2. Content and Format
Many countries have branches or tracks in their secondary
education level, such as Humanities or Sciences for
example, and each track has its own examination.
The general trend is to assess the greatest number of
subjects taught. The lists of these subjects are not always
available; however, when details are provided, one
may observe that foreign language examination is very
common, almost as much as Arabic, and that religious
studies are included in countries such as Egypt, Oman,
Qatar, Syria, UAE and Yemen. Twelve countries test their
students on all subjects taught in Grade 12. Sociology,
Civics, and Philosophy are examined as independent
subjects in only two countries: Lebanon and Syria.
The examinations are conducted in all countries in paper-and-pencil format. Exceptions appear in: Bahrain and
Kuwait, where they use performance assessment; Libya,
where computer-based assessment is used; Oman,
where oral and computer-based assessment are used;
and Tunisia, where portfolio, performance assessment
and computer-based forms are also applied. The three
most common forms of examination questions are:
multiple-choice, supply/open-ended, and essays.
Usually standard examinations are completely
independent from school evaluations. One exception is
observed in the UAE where sixty percent of the student’s
grade comes from the examination average and forty
percent from the on-going classroom assessment
throughout the school year.
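For illustration only (the survey response does not specify the grading scale or any rounding rules), this weighting amounts to: final grade = 0.60 × examination average + 0.40 × classroom assessment average. For example, an examination average of 75 and a classroom assessment average of 85 would give 0.60 × 75 + 0.40 × 85 = 79.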
I. ENABLING CONTEXT
1. Policy Documents
In all countries, there are formal policy documents that
authorize the examination. These documents are of
different types, but most have been issued recently. In
Egypt, this document is in the form of a law, while in
KSA, Lebanon and Mauritania, the policy documents are
decrees issued by the president, the prime minister or
the council of ministers (see Table 2.2). In the remaining
countries, the policy documents are “instructions” or
“regulations” issued by the Ministry of Education.
In most cases, the document is available and easily
accessible to the public. In the few cases where it is not
available (Bahrain, Kuwait, Sudan, and Yemen), it either
exists at schools (Sudan) or at the ministry “to parties
in charge of the preparation and implementation of
examinations” (Bahrain and Yemen).
Table 2.2: Types of examination documents (q4)
(Official document; authority and year of authorization; availability to the public)
1. Bahrain: MoE General Examinations Regulations; MOE, 2010; not available to the public
2. Egypt: Law 20 of 2012 on the amendment of certain provisions of the Education Law 139 of 1981; President, 2012; available to the public
3. Iraq: General Examinations System number 18 of 1987; MOE, 1987; available to the public
4. Jordan: General Secondary Certificate Instructions; MOE, 2011; available to the public
5. KSA: Royal Decree for the establishment of the National Center for Assessment in Higher Education; Council of Ministers, 2000; available to the public
6. Kuwait: (not specified); not available to the public
7. Lebanon: Decree number 5697 of 15 June 2001; President, 2001; available to the public
8. Libya: Examination administration document; available to the public
9. Mauritania: Decree organizing the National Baccalaureate number 2011-034; Prime Minister, 1973; available to the public
10. Oman: Regulations of the committees for general diplomas examinations; MOE, 2010; available to the public
11. Palestine: Instructions for the General Secondary Education Examination Certificate; MOE, 2013; available to the public
12. Qatar: Evaluation Policy for grade twelve (General Secondary); Supreme Education Council, 2012; available to the public
13. Sudan: Examinations Regulations and student’s guidebook; MOE, 2010; not available to the public
14. Syria: Executive Instructions of General Exam for Secondary Education; MOE, 2012; available to the public
15. Tunisia: Decision regulating the Baccalaureate examination; MOE, yearly; available to the public
16. UAE: Assessment and Examinations System for Grades 1 – 12 (Ministerial decision 355); MOE, 2010; available to the public
17. Yemen: Examinations Rules, ministerial decision, first issued in 1965; MOE, 1965; not available to the public
The policy document authorizing the examination
usually includes many components. First, in 15 of the 17
countries, it describes the purpose of the examination (in
all countries except Lebanon and Sudan) and specifies
who can sit for the examination (all countries except
Iraq and UAE). Second, the policy document outlines the
governance, distribution of power and responsibilities
among key entities, and/or outlines procedures to
investigate and address security breaches, cheating, or
other forms of inappropriate behavior (13 countries).
Third it identifies rules about preparation (12 countries).
In six countries, the document explains alignment with curricula and standards, and in seven countries it explains the format of the examination questions. Only in rare cases (three countries) does it state funding sources.
2. Leadership and Stakeholders
The typical situation in all countries is that policymakers support or strongly support the program. Educators and parents are usually supportive, while the position of teacher unions is not always clear.
Opposition to the examination program is hardly visible (Table 2.3). When it exists, this opposition usually emanates from educators (Palestine), students and parents (KSA and Palestine), the media (KSA, Palestine and Iraq), think tanks or NGOs (Palestine and Egypt) and universities (Egypt). Palestine is the country showing the most opposition to the examination program from several stakeholders.
Typically, the team or the people that guide the development of the examination questions are located at the examination office. In Qatar, Sudan, and Syria the same people are in charge of large-scale assessment.
In all countries, except for Kuwait, Mauritania, Oman, and Palestine, coordinated efforts have been made by these stakeholder groups in order to improve examinations. In Sudan, for example, the administration conducts an analysis of the examinations on a yearly basis, and the resulting comments are exploited the following year. In Oman and Palestine, attempts to improve examinations are based on independent, rather than coordinated, efforts. In Kuwait, independent efforts by different stakeholders have been made, including curriculum specialists from the Ministry of Education and some teachers.
Efforts to improve the examination are generally welcomed by the leadership in charge of the examination, except in Palestine and the UAE.
Table 2.3: Stakeholders’ support for policy (q8)
Stakeholder Group                 Strongly Support   Support   Neutral   Oppose   Strongly Oppose   Unable to Tell   Total
Policymakers                             14              3         0        0            0                 0           17
Teacher Unions                            4              4         4        0            0                 5           17
Educators                                 6             10         0        1            0                 0           17
Students                                  3              6         5        2            0                 1           17
Parents                                   5              6         3        2            0                 1           17
Media                                     2              9         1        3            0                 2           17
Think-tanks, NGOs or equivalent           4              2         4        2            0                 5           17
Universities                              6              6         2        1            0                 2           17
Employers                                 4              3         6        0            0                 4           17
3. Funding
When asked about funding allocated for the examination,
all countries, except for KSA, reported having a regular
budget allocated by the government for the examination.
In Bahrain, for instance, government funding for examination activities is included in the yearly budget and covers the physical requirements of the examination as well as payment rewards for teachers who handle the correction of the examination. In Yemen, government funding
exists, but it is scarce and insufficient.
Another source of funding in some countries is student
fees, as is the case in Egypt, Jordan, KSA, Palestine,
Sudan, and Tunisia. For example, in Palestine, students
pay a small fee to sit for the examination. Student
subscription fees are also common in Jordan, whereas
in Mauritania, only independent candidates pay a
subscription fee, a minimal one nonetheless.
Funding allocated for the examination is used to cover
different in-house and outsourced activities. As Table
2.4 shows, all the countries use the funds to cover
activities related to the design and administration of
the examination. Most countries also use the funding
to cover data analysis activities (except for Bahrain, Iraq,
Tunisia and Yemen) and data reporting activities (except
for Bahrain, Libya and Yemen).
A number of countries use the funding for planning
program milestones and for staff training. Research and
development activities related to the examination are
also covered by the funding in KSA, Oman, and Qatar.
Other funded activities include courses and conferences
on evaluation in Qatar and stationery and examination
requirements including transportation for the examination
development and execution teams in Palestine.
Table 2.4: Activities covered by funding allocated for the examination (q12)
Country
Exam
Exam
Data
design administration analysis
1. Bahrain
ü
ü
2. Egypt
ü
ü
3. Iraq
ü
ü
4. Jordan
ü
ü
ü
ü
ü
5. KSA
ü
ü
ü
ü
ü
6. Kuwait
ü
ü
ü
ü
ü
7. Lebanon
ü
ü
ü
ü
8. Libya
ü
ü
ü
9. Mauritania ü
ü
ü
ü
ü
10. Oman
ü
ü
ü
ü
ü
ü
ü
11. Palestine
ü
ü
ü
ü
12. Qatar
ü
ü
ü
ü
ü
ü
ü
13. Sudan
ü
ü
ü
ü
ü
14. Syria
ü
ü
ü
ü
ü
ü
15. Tunisia
ü
ü
ü
ü
ü
16. UAE
ü
ü
ü
ü
17. Yemen
ü
ü
ü
Data
reporting
Long- or
mediumterm
planning
Research and
development
Staff
training
ü
ü
Activities
not
related to
examination
ü
ü
ü
ü
ü
ü
4. Organizational Structures
With the exception of KSA and Qatar, all countries
reported that an office or a unit within the Ministry of
Education holds the primary responsibility for running
the examination. In Qatar, the Evaluation Institute of
the Supreme Education Council is the responsible body.
As for KSA, it is the National Center for Assessment
in Higher Education that holds this responsibility. In
Yemen, the Higher Committee for Examinations,
which was formed by a ministerial decision, shares the
responsibility of running the examination.
All the surveyed countries reported that the examination
results are officially recognized by certification and
selection systems in the country. The examination results
for the countries are also recognized by more than one
certification and selection system abroad.
Table 2.5 shows the responsible bodies in charge of
running the examination in each country, the year of
taking on this responsibility and the external body, if
any, to which they are held accountable.
In terms of facilities available to the office or institution
responsible for the examination, it appears that most
countries confirm the availability of secure storage
facilities; adequate communications tools such as
telephones, emails and the Internet; a secure building
(except for Palestine); computers for technical staff
(except for Lebanon and Libya); access to adequate
computer servers (except for Lebanon); and data
backup services (except for Lebanon and Libya). Table
2.6 summarizes the availability of the required facilities
needed to carry out the examination.
Table 2.5: Bodies responsible for running the examination (q14)
(Name of body responsible for running the examination; year of taking on this responsibility)
1. Bahrain: Examination Directorate; 1958
2. Egypt: General Directorate of Examinations; before 1952
3. Iraq: General Directorate of Assessment and Examination; 1925
4. Jordan: Department of Examinations and Tests; around 1960
5. KSA: The National Center for Assessment in Higher Education; 2000
6. Kuwait: (not specified); 1960
7. Lebanon: Examination Department; 1949
8. Libya: Examinations Department; since the start of the education system
9. Mauritania: Directorate of Examinations and Assessment; 2004
10. Oman: Department of Tests and Examinations Management – Directorate General for Educational Assessment; 1972
11. Palestine: General Directorate of Examination and Assessment; early 1960s
12. Qatar: Evaluation Institute – Supreme Education Council; 2000-2001
13. Sudan: General Directorate of Examination, Measurement and Assessment; 1950
14. Syria: Examination Directorate – Ministry of Education and its Examination Departments in each province; 1958
15. Tunisia: Directorate General of Examinations; since the beginning
16. UAE: Administration of Assessment and Examinations; 1972
17. Yemen: General Administration for Examinations / Higher Committee for Examinations; since the beginning
Table 2.6: Facilities available to carry out the examinations (q17)
Columns: (1) Computers for all technical staff; (2) Secure building; (3) Secure storage facilities; (4) Access to adequate computer servers; (5) Ability to backup data; (6) Adequate communication tools
Country          (1)   (2)   (3)   (4)   (5)   (6)
1. Bahrain       SA    SA    A     SA    SA    SA
2. Egypt         A     SA    A     A     SA    SA
3. Iraq          SA    SA    SA    SA    SA    SA
4. Jordan        A     SA    SA    A     SA    SA
5. KSA           SA    SA    SA    SA    SA    SA
6. Kuwait        SA    SA    SA    SA    SA    SA
7. Lebanon       D     A     SA    D     D     A
8. Libya         D     A     A     UT    D     A
9. Mauritania*   SA    SA    SA    SA    SA    SA
10. Oman         SA    A     A     A     A     SA
11. Palestine    A     D     A     A     SA    SA
12. Qatar        SA    A     A     A     A     SA
13. Sudan        A     SA    SA    A     SA    SA
14. Syria        SA    SA    SA    SA    SA    SA
15. Tunisia      SA    A     A     A     SA    SA
16. UAE          SA    SA    SA    SA    SA    SA
17. Yemen        SA    SA    SA    SA    SA    SA
Note: SA = Strongly Agree; A = Agree; D = Disagree; SD = Strongly Disagree; UT = Unable to Tell
* Mauritania also has a secured place within the directorate for confidential activities related to examinations during the exam period
5. Human Resources
Regarding availability of human resources for the
examinations, six countries reported lacking an adequate
number of permanent or full-time staff in the agencies or
institutions responsible for examinations. These include
Jordan, Lebanon, Oman, Palestine, Qatar, and UAE.
Lebanon and Yemen have mainly temporary or part-time
staff in the agencies. In Oman, the department in charge
of examinations management refers to the support
of specialized senior supervisors and teachers from
related directorates such as the General Directorate for
Curriculum Development and the General Directorate for
Human Resources Development (Education Supervision Department), whereas examinations execution management is carried out by specialized committees.
In Yemen, although the Higher Committee for
Examinations and its various sub-committees are in
charge of examinations from the preparation phase
to the announcement of the results, the participation
of teachers, supervisors and other educators in this
process is mandatory. As for UAE, assistance to the
Administration of Assessment and Examinations is
provided by technical teams from the Educational
Supervision Department at the Ministry level, or at the
regional level, depending on examination needs.
Although most countries reported that there are no
issues in the performance of the human resources
responsible for the examinations, a number of issues
were still identified for some countries (see Table 2.7).
For example, poor training of test administrators or
unclear instructions and guidelines in the administration
of the examination is common in Libya, Oman, Palestine
and Yemen. Other issues include errors in scoring
(Mauritania and Yemen), weakness in test design
(Mauritania, Palestine and Syria), omission of curricular
topics (Libya, Mauritania and Syria) and frequent errors
in examination questions (Egypt, Syria and Yemen).
Table 2.7: Issues in the performance of human resources (q19)
Country
Delays in
Poor training
Errors in
Weaknesses
Omission
Frequent
administering
of test
scoring
in test design
of
errors in
examination
administrators
lead to
curricular
examination
due to issues
or unclear
delays in
topics
questions
with design of
instructions in
reporting
examination
administering
results
questions
examination
1. Bahrain
None
ü
2. Egypt
ü
3. Iraq
ü
4. Jordan
ü
5. KSA
ü
6. Kuwait
ü
7. Lebanon
ü
8. Libya
ü
9. Mauritania
ü
ü
10. Oman
ü
11. Palestine
ü
ü
ü
ü
12. Qatar
13. Sudan
ü
14. Syria
ü
ü
ü
15. Tunisia
ü
16. UAE
ü
17. Yemen
ü
ü
Various learning opportunities in educational
measurement and evaluation are provided in the surveyed
countries on an annual basis to prepare for work on
the examination. Table 2.8 presents the opportunities
available in the countries on an annual basis. Most
countries provide university graduate programs
specifically focused on educational measurement and
evaluation (except for Iraq, Mauritania, Qatar and
Tunisia) and/or university courses (except for Lebanon,
Libya, Mauritania, Oman, Qatar, and Tunisia).
A considerable number of countries also provide non-university training courses or workshops on educational measurement and evaluation (except for Iraq, Kuwait, Libya, Qatar and Tunisia). Funding for attending international programs or workshops on evaluation and internships in the examination office are also offered, but to a lesser degree.
In Yemen, there are attempts by the General Administration for Examinations to organize training sessions, but these have not taken place yet. In Sudan, opportunities are available but scarce. As for Mauritania, teaching assessment is practically nonexistent and those who have been trained have travelled abroad. The Directorate is starting to accept interns, who are usually students from the École Normale Supérieure.
Table 2.8: Learning opportunities in educational measurement and evaluation (q20)
Country
University
graduate
programs
specifically
focused on
EME*
University
courses
(graduate and
non-graduate)
on EME
Non-university
training courses
or workshops
on EME
1.
Bahrain
ü
ü
ü
2.
Egypt
ü
ü
ü
3.
Iraq
4.
Jordan
ü
ü
ü
5.
KSA
ü
ü
ü
6.
Kuwait
ü
ü
7.
Lebanon
ü
8. Libya
Funding for
Internships in
attending
the examination
international office
programs,
courses, or
workshops on
EME
ü
ü
ü
ü
ü
ü
ü
9. Mauritania
ü
10. Oman
ü
11. Palestine
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
12. Qatar
13. Sudan
ü
ü
ü
14. Syria
ü
ü
ü
ü
ü
15. Tunisia
ü
16. UAE
ü
ü
ü
17. Yemen
ü
ü
ü
*Note: EME: Educational measurement and evaluation
II. SYSTEM ALIGNMENT
1. Aligning Examinations with Learning
Goals and Opportunities to Learn
Fifteen of the surveyed countries reported that the
examination measures the national school curriculum
guidelines or standards. In Mauritania, the examination
measures an internationally recognized curriculum,
which is the Baccalaureate. Although the Baccalaureate
is a national diploma, its format and standards are those applied for this type of diploma in France, which guarantees its full recognition as equivalent in all French-speaking countries. It is also considered as an entrance
diploma to universities in other countries that recognize
the general value of the Baccalaureate. As for KSA, the
examination measures general verbal and quantitative
competencies.
In general, what is measured by the examination is
largely accepted by the stakeholders in most countries.
For example, in Egypt, although the examination
relies to a great extent on cognitive aspects, it still
has high credibility within the community because it
provides students with equitable access to universities.
Stakeholders in Palestine, Syria and Yemen reported,
however, that they were not very clear on what the
examination measures. In Yemen, the examination
appears to measure only information, facts, concepts,
etc. A similar issue is faced in Syria, where the
examination measures national curriculum standards,
but the focus is on measuring students’ knowledge and
information rather than their skills.
When asked whether all students have access to the
materials needed to prepare for the examination, all
countries, with the exception of Yemen, stated that the
material is widely accessible to over 90% of students in a
variety of learning contexts, such as in public schools or
online. For example, in Palestine, examination instructions
and schedules are disseminated to all registered students
enough time before the examination date, and the
competencies tested are included in the textbooks
provided by the Ministry at a low cost. In Bahrain, all
previous exam questions and their corresponding
answers are available on the Ministry’s website and that
of HM King Hamad Future School Project.
In Qatar, questions for all subjects are available on
the website of the Supreme Education Council, as
well as training for students on mock examinations
that emulate the questions of the Secondary School
Certificate. In Egypt, materials are available in public-domain textbooks distributed free of charge, in foreign
textbooks licensed by the Ministry and on CDs. As for
Oman, aside from the material available in the textbooks,
there are a variety of test models and questions available
at the educational portal of the Sultanate. Yemen
reported that examination-related material is available
to most students (50-90%), but the problem lies mainly
with the electronic availability of the necessary material.
With respect to what material on the examination is
publically available, and as can be seen in Table 2.9, all
countries reported that they offer sample examination
questions.
In Libya, there is a questions bank prepared by a group
of specialized teachers in all educational subjects that
is used by specialists and students for guidance. The
Questions Committee made some 5,000 questions in
each subject matter available to be used as reference
for teachers and the officers in charge of the Questions
and Examinations Committee. In addition to that, these
guidance questions are distributed to students sitting
for the general official tests (in grades 9 and 12) and
at schools. The distribution of these guidance questions
has been in place since 1998, and samples of tests from
previous years are published in the official newspapers.
As for Yemen, samples from previous examinations are
publically available, but only in the cities. Students from
rural and remote areas may not have access to them.
Almost all countries make available information on
how to prepare for the examinations, except for
Bahrain, Egypt, Palestine, and Yemen. In KSA, Kuwait,
Oman, Qatar, Sudan, and Tunisia, the examination
framework document explaining what is measured
on examination is publically available. Some countries
also provide reports on the strengths and weaknesses in
student performance. These include KSA, Kuwait, Libya,
Oman, Qatar, and UAE. For example, in Oman, during
Table 2.9: Publically available material on the examination (q24)
Country
Examples
of types of
examination
questions
Information
on how to
prepare for the
examination
1. Bahrain
ü
2. Egypt
ü
3. Iraq
ü
ü
4. Jordan
ü
ü
5. KSA
ü
ü
Framework
document
explaining
what is
measured on
examination
Report on the
strengths and
weaknesses
in student
performance
ü
ü
Other
Web links to prepare
students for the
examination
6. Kuwait
ü
ü
7. Lebanon
ü
ü
8. Libya
ü
ü
9. Mauritania
ü
ü
10. Oman
ü
ü
11. Palestine
ü
ü
ü
ü
ü
ü
Examination
instructions Averaging
methods
12. Qatar
ü
ü
ü
13. Sudan
ü
ü
ü
14. Syria
ü
ü
15. Tunisia
ü
ü
16. UAE
ü
ü
17. Yemen
ü
every academic term, the Ministry prepares a scientific
analysis of students’ results. The analysis is presented
and discussed with concerned parties at the Ministry
and concerned educational directorates. It reflects the
strengths and weaknesses of each subject. In Mauritania,
documents are essentially private, and at times of foreign
nature, due to the strong reference to Baccalaureate
programs criteria in the francophone space. Meanwhile,
the Directorate of Examinations, in cooperation with
the National Pedagogical Institute (NPI), has finished
preparing a document that puts together around 10 years
of corrected subjects of the Mauritanian Baccalaureate.
This document will be available soon on the Directorate’s
website.
2. Learning Opportunities for Teachers
All countries, except for Egypt and Iraq, indicated that
there are courses or workshops on examinations that
are either compulsory or voluntary. In Kuwait, teachers
must take pre-service training in assessment, and as the
exam system gets updated, teachers attend workshops
to receive training in the new practices. Even though
Egypt has non-university courses and workshops on
educational measurement and evaluation, these are
part of the training courses offered to teachers, but
not directly related to the examination nor available
to teachers participating in the implementation of the
examination. With the exception of KSA, teachers
in all countries perform a number of tasks related to
examinations, as summarized in Table 2.10.
Supervising examination procedures and administering
the examination are the main examination-related task
that teachers perform. In some countries, teachers
are actively involved in almost all tasks related to
examinations, including Bahrain, Iraq, Kuwait, Lebanon,
Mauritania, Qatar, and Tunisia. The Directorate General
of Examinations in Tunisia has a sufficient number of
full-time employees. However, teachers, inspectors and
academics support the Directorate particularly through
the participation in the various technical committees,
which are involved in the examinations from the first
preparation phases until the announcement of the results.
Table 2.10: Examination-related tasks performed by teachers (q26)
Country
Selecting
Selecting
Administering
Scoring
Acting
Supervising
Resolving
or creating
or creating
the
exam
as a
examination
inconsistencies
examination
examination
examination
judge
procedures
between
questions
scoring guides
(i.e., in
examination
orals)
scores and
Other
school
grades (i.e.,
moderation)
1. Bahrain
ü
2. Egypt
3. Iraq
ü
ü
ü
ü
ü
ü
ü
4. Jordan
ü
ü
Review
ü
corrections
ü
ü
ü
ü
ü
ü
ü
ü
ü
5. KSA
6. Kuwait
7. Lebanon
ü
8. Libya
ü
9. Mauritania
ü
ü
ü
ü
ü
ü
ü
ü
12. Qatar
ü
13. Sudan
ü
14. Syria
ü
ü
ü
ü
ü
Mock Bacc.
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
16. UAE
17. Yemen
ü
ü
11. Palestine
ü
ü
ü
10. Oman
15. Tunisia
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
III. ASSESSMENT QUALITY
1. Ensuring Quality
Countries adopt different systematic mechanisms to
ensure the quality of the examination. All the surveyed
countries reported that they have internal reviewers
or observers. In Oman, for example, examinations are
reviewed by specialized technical committees prior to
their execution. These examinations are then presented
to the assessment committee composed of highly
experienced specialists. Currently, an assessment of
general education diploma examinations is under way
by committees that are not part of the Ministry of
Education. In Mauritania, there is a Technical Committee
for the Baccalaureate made up of senior officials from
the Directorate and all of the Presidents of the Juries.
This committee centralizes reports generated by all the
Juries. This year, and at the request of the Minister,
the Directorate of Examinations has synthesized these
reports; an initiative that will be adopted systematically
in the coming years.
Some countries, such as Egypt, Iraq, KSA, Lebanon and
Libya also use external reviewers or observers. Iraq and
Tunisia have mechanisms for external certification or
audit. Iraq, KSA, Kuwait, Tunisia and Yemen conduct
pilot or field testing for the examination to ensure its
quality.
Although Yemen reported the use of internal reviewers,
this mainly concerns scoring questions and not quality
verification. In addition to that, with respect to
piloting the examination, a new pilot mechanism was
launched this year to check the implementation of the
examination rather than its quality. As for Mauritania,
monitoring is only hierarchical and administrative. Table
2.11 summarizes the different systematic mechanisms
that countries have put in place to ensure the quality of
the examinations.
Table 2.11: Mechanisms put in place to ensure examination quality (q28)
Columns: (1) Internal review or observers; (2) External review or observers; (3) External certification or audit; (4) Pilot or field testing
Country          (1)   (2)   (3)   (4)
1. Bahrain        ü
2. Egypt          ü     ü
3. Iraq           ü     ü     ü     ü
4. Jordan         ü
5. KSA            ü     ü           ü
6. Kuwait         ü                 ü
7. Lebanon        ü     ü
8. Libya          ü     ü
9. Mauritania     ü
10. Oman          ü
11. Palestine     ü
12. Qatar         ü
13. Sudan         ü
14. Syria         ü
15. Tunisia       ü           ü     ü
16. UAE           ü
17. Yemen         ü                 ü
Some countries have a comprehensive, high quality
technical report supporting the examination that is
available to the public (Iraq, KSA, Libya, and Oman)
or with restricted circulation (Bahrain, Egypt, Jordan,
Kuwait, Qatar, Sudan, Syria, Tunisia, and UAE). In
Lebanon, Mauritania, Palestine, and Yemen, there is
some documentation about technical aspects of the
examination, but it is not in a formal report format.
2. Fairness
There are a number of inappropriate behaviors that may
occur during the examination process and consequently
diminish the credibility of the examination. When
asked to identify those behaviors, Bahrain and Kuwait
reported not having any. All other countries confirmed
that copying from other candidates takes place during
examinations. In addition to this classical behavior,
all these countries, except Qatar, experienced at least
one other type of inappropriate behavior, such as collusion among candidates via mobile phones, passing of papers, or something of an equivalent nature; impersonation (when an individual other than the registered candidate takes the examination); or
using unauthorized materials such as prepared answers
and notes.
Seven countries, including Egypt, Iraq, KSA, Libya,
Mauritania, Tunisia, and Yemen, indicated that
intimidation of examination supervisors, makers or
officials is a behavior that affects the credibility of the
examination. Most of the countries, except KSA and Tunisia, also indicated that leakage of the content of the examination paper, or part of it, occurs before the administration of the examination. The provision of external assistance through supervisors or mobile phones was reported in all countries except Egypt and KSA. As for
forging certificates or altering results information, this
occurs in four countries, namely Iraq, Libya, Palestine
and Yemen.
As can be seen in Table 2.12, which summarizes the
inappropriate behaviors that diminish the credibility of
the examination, Bahrain and Kuwait did not report
any such behaviors. Iraq, Libya, Mauritania and Yemen
indicated all, or almost all, of the mentioned behaviors
as problems reoccurring in their countries during
examinations.
As for the mechanisms that have been put in place to
address these inappropriate behaviors, Egypt, Lebanon,
Palestine and Yemen reported that when students are
observed doing such behaviors they are expelled from
the examination, and sometimes are prohibited from
taking the exam another time. In Tunisia, examiners use
cell phone jammers and decrease the number of students
in the examination hall. Mauritania also suspends
cellphone connectivity because the use of cellphones is
a serious problem. In Sudan, the issue of impersonation
is dealt with by having student identification cards
with individual photographs. As for Qatar, they have
put in place strict regulations on punishment, and they
conduct trainings for observers.
In Jordan, the control process is intensified in the
examination room. This is done through having one
observer in the front of the room while another stays
at the back, in addition to observing a certain space
between students. Moreover, students are prohibited
from introducing any piece of paper to the exam room;
they are checked in case of suspicion, warned and
punished, and can even be prohibited from taking the
exam. Furthermore, it is entirely prohibited to enter
cellphones. In KSA, more than one sample of the
examination is used in the same hall and the student
caught cheating is given a warning and then expelled in
the event of recurrence. Moreover, a sufficient number
of proctors are hired to administer the test; anyone trying
to intimidate the proctors is prohibited from undertaking
the test and relevant authorities are informed. Similar
mechanisms have been put in place in Oman.
When asked about the credibility of the examination
results, all countries, except for Yemen, stated that
results are perceived as credible by all stakeholder groups.
Yemen indicated that the results are perceived as credible
by some stakeholder groups only, mainly due to the fact that
results may depend on the student’s social environment
and related geographical aspects. Credibility is greater in
urban areas compared to rural areas.
Jordan elaborated that the examinations are prepared
according to scientific principles and many measures are
in place to guarantee the examination’s confidentiality, precise execution, scoring and result extraction. Moreover, some of the studies conducted show a direct link between a student’s score in the secondary cycle and his or her performance at university.
Table 2.12: Inappropriate behaviors that diminish the credibility of the examination (q29)
Country
Leakage of
Copying
Using
Collusion
Intimidation
the content
Impersonation
from other
unauthorized
among
of examination forged
of external
of an
candidates
materials
candidates
supervisors,
certificates
assistance
markers or
or altering
officials
results
examination
Issuing
Provision
information
1. Bahrain
2. Egypt
ü
3. Iraq
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
4. Jordan
5. KSA
ü
ü
ü
ü
ü
ü
6. Kuwait
7. Lebanon
8. Libya
ü
ü
ü
ü
ü
ü
9. Mauritania
ü
ü
ü
ü
ü
ü
10. Oman
ü
ü
ü
ü
11. Palestine
ü
ü
12. Qatar
ü
14. Syria
15. Tunisia
ü
16. UAE
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
ü
All the surveyed countries reported that all students may
take the examination regardless of their background
(gender, ethnic group, etc.), location (urban, rural,
etc.), ability to pay (transportation, fees, etc.) and
similar factors. Therefore, issues of language, gender,
socioeconomic status and cost were not perceived as
existing barriers to take the examination.
3. Using Examination Information
Only Palestine and Tunisia reported an improper use of
the examination results by stakeholder groups. Tunisia
for example reported that results may be misused in
the media, accidentally or intentionally, since journalists
search for sensational news such as benchmarking the
results of one province with the national average and
focusing on the circumstances surrounding the learning
process, the caliber of teachers and the administration.
The same applies to some intellectual forums.
In terms of confidentiality of results, eight countries
reported that only the student and persons with a
legitimate, professional interest in the test taker, such
as his or her educators, parents or authorized potential
employers could access the results. These countries are
Bahrain, Egypt, Jordan, KSA, Libya, Oman, Qatar and
Sudan. For Jordan, however, it appears that scores are
sometimes accessed on some websites by entering the
student’s first name, father’s name, grandfather’s name
and last name.
Kuwait, Lebanon, Iraq, Mauritania, Syria, Tunisia, UAE,
and Yemen indicated that student names and results
are made public. In UAE, students are informed of
their results first by way of text message (SMS) from
the Ministry of Education e-learning system, and then
they become available to the media. As for Palestine,
students’ results are published by the schools but their
names are not disclosed.
4. Ensuring Positive Consequences
of the Examination
Some students who sit for the examination may not
perform well. All countries offer those students the option of retaking the exam. For students failing their first attempt, some countries offer remedial or preparatory courses in order to prepare to retake the
examination (Bahrain, KSA, Libya, Mauritania, Oman,
and Yemen). In most countries, except Bahrain, Jordan
and KSA, when students fail the examination in the
second chance, they can repeat the grade. Table 2.13
summarizes these available options.
In KSA, Kuwait and Yemen students may also opt for
less selective schools, universities or tracks. In Yemen,
should the student fail to pass the general examination
in two consecutive years, he or she is no longer entitled
to sit for it as an enrolled student and becomes a free
candidate. Should he or she fail in the Scientific section
for two consecutive years, he or she is transferred to
the Humanities section. In Syria, students with passing
grades wishing to improve their overall score can
choose to retake three subjects in the second round of
examinations of the same year. However, this may occur
only once. In Kuwait, failing no more than three subjects
allows students to retake the exam; otherwise they have
the option to repeat the class.
With respect to the various mechanisms in place to
monitor the consequences of the examination, results
show that, with the exception of Bahrain, Iraq, Lebanon,
and Mauritania, the surveyed countries have put in
place certain mechanisms to monitor the consequences
of the examination.
Table 2.13: Options for students who do not perform well (q35)
Columns: (1) Students may retake the examination; (2) Students may attend remedial or preparatory courses in order to prepare to retake the examination; (3) Students may opt for less selective schools/universities/tracks; (4) Students can repeat the grade
Country          (1)   (2)   (3)   (4)
1. Bahrain        ü     ü
2. Egypt          ü                 ü
3. Iraq           ü                 ü
4. Jordan         ü
5. KSA            ü     ü     ü
6. Kuwait         ü           ü     ü
7. Lebanon        ü                 ü
8. Libya          ü     ü           ü
9. Mauritania     ü     ü           ü
10. Oman          ü     ü           ü
11. Palestine     ü                 ü
12. Qatar         ü                 ü
13. Sudan         ü                 ü
14. Syria         ü                 ü
15. Tunisia       ü                 ü
16. UAE           ü                 ü
17. Yemen         ü     ü     ü     ü
Table 2.14: Mechanisms in place to monitor the consequences of the examination (q36)
Columns: (1) Funding for independent research on the impact of the examination; (2) Permanent oversight committee; (3) Studies (e.g., predictive validity) that are updated regularly; (4) Regular focus groups or surveys of key stakeholders; (5) Expert review groups; (6) Other
Country          (1)   (2)   (3)   (4)   (5)   (6)
1. Bahrain
2. Egypt                      ü
3. Iraq
4. Jordan         ü                 ü
5. KSA            ü     ü     ü           ü
6. Kuwait                     ü     ü
7. Lebanon
8. Libya                ü                 ü
9. Mauritania
10. Oman                ü
11. Palestine                                   Field follow-ups
12. Qatar         ü     ü
13. Sudan                                 ü
14. Syria                                 ü
15. Tunisia                   ü
16. UAE                                   ü
17. Yemen               ü
Jordan, KSA and Qatar have funding for independent
research on the impact of the examination. A permanent
oversight committee is found in KSA, Libya, Oman,
Qatar, and Yemen. Expert review groups are available
in KSA, Libya, Sudan, Syria, and UAE. Studies on related
issues, such as predictive validity, are performed in Egypt,
KSA, Kuwait and Tunisia. As for regular focus groups
or surveys of key stakeholders, these are only available
in Jordan and Kuwait. Palestine has field follow-ups
conducted by some teachers and researchers, but these
are neither organized nor based on a clear methodology.
In Oman, an additional practice is for students of the
general education diploma (grade 12) and equivalent to
be permitted to submit a request for the re-correction
of three educational subjects. Table 2.14 displays these
various mechanisms.
IV. BENCHMARKING FOR EXAMINATIONS
Overall View
Of the 17 surveyed countries, 14 were found to have
an Established development level on quality drivers
of Examinations, whereby each country had a stable
standardized examination in place. These countries show
institutional capacity and some limited mechanisms to
monitor the examination. In each of these countries, the
examination was found to be of acceptable quality and
was perceived as being fair for most students and free
from corruption. The remaining 3 countries – Palestine,
Qatar and Yemen – were found to have Emerging levels
of performance on Examinations. In these countries,
a partially stable standardized examination is in place,
with a need to develop institutional capacity to run the
examination. The examination itself typically is of poor
quality and is perceived as being unfair or corrupt. Table
2.15 shows the benchmarking results for all countries in
the examination area.
1. Enabling Context
This driver assesses “the broader context in which
the assessment activity takes place and the extent to
which that context is conducive to, or supportive of, the
assessment”. It covers such issues as the legislative or
policy framework for assessment activities; institutional and organizational structures for designing, carrying out, or using results from the assessment; the availability of sufficient and stable sources of funding; and the presence of trained assessment staff.

Table 2.15: Benchmarking results for examinations (by country and status)
Country          Latent   Emerging   Established   Advanced
1. Bahrain                               ü
2. Egypt                                 ü
3. Iraq                                  ü
4. Jordan                                ü
5. KSA                                   ü
6. Kuwait                                ü
7. Lebanon                               ü
8. Libya                                 ü
9. Mauritania                            ü
10. Oman                                 ü
11. Palestine                ü
12. Qatar                    ü
13. Sudan                                ü
14. Syria                                ü
15. Tunisia                              ü
16. UAE                                  ü
17. Yemen                    ü
Five indicators are included in this driver:
a. Setting clear policies
In this indicator, almost all the surveyed countries
seem to be developed (Established levels). With
the exception of Qatar, in all the countries the
examination is a stable program that has been
operating regularly (Established). For Qatar, the
standardized examination has been operating on
an irregular basis (Emerging). All the countries
have a formal policy document that authorizes the
examination (Established). However in Bahrain,
Kuwait, Sudan, and Yemen, this policy document
is not available to the public (Emerging). In all the
countries, the policy document addresses at least
some, if not all, key aspects of the examination.
b. Having strong leadership
The levels of development of the countries in this
indicator are varied. Except for the case of Palestine,
all the countries show support from stakeholder
groups for the examination (Established to
Advanced). In Palestine the situation is reversed, with
most stakeholder groups opposing the examination.
In Lebanon and Libya the situation is not clear. In
all 17 countries there are either independent or
coordinated attempts to improve the examination
by stakeholder groups (Established to Advanced).
With the exception of Palestine and UAE, efforts to
improve the examination are generally welcomed
by the leadership in charge of the examination
(Established level).
c. Having regular funding
All countries have regular funding allocated for the
examination (Established level). This funding covers
some or all of the core examination activities such as
design, administration, data processing or reporting
(Emerging to Established levels). In almost all the
countries, with the sole exception of Qatar, this
funding does not cover research and development
(Emerging level).
d. Having strong organizational structures
In all the countries, the examination office is a stable
organization (Established level). However in only six
countries – Egypt, KSA, Lebanon, Qatar, Sudan and
Yemen – is this office accountable to an external
board or agency. In Jordan, the examination results
are recognized by a local certification or selection
system (Emerging level), while in all remaining
countries the examination results are recognized
by selection systems in other countries (Advanced
levels). The examination offices in Lebanon, Libya
and Palestine have only some of the required
facilities to carry out the examination (Emerging
level), while the offices in the remaining countries
have all of the required facilities.
e. Having effective human resources
All the surveyed countries perform well in this area, with examination offices that are adequately staffed to carry out the assessment effectively, with minimal or no issues (Established to Advanced levels). All the countries also offer at least some opportunities that prepare staff for work on the examination (Established to Advanced levels).
Thus in this quality driver, the Arab countries
seem to be fairly established in the overarching
policy and resource framework that provides the
enabling context for examination activities to take
place in the country and that is conducive to, or
supportive of, the examination activities.
2. System Alignment
This quality driver assesses the “degree to which the
assessment is coherent with other components of the
education system”.
Two indicators are included here:
a. Aligning examinations with learning goals
and opportunities to learn
In all the countries there is a clear understanding
of what the respective examination measures
(Established level). With the exception of Jordan,
Palestine, Syria, and Yemen, where what the
examination measures is questioned by some
stakeholder groups (Latent level), the remaining
countries show large acceptance by stakeholder
groups of what is measured by the examination
(Established level). The countries vary in how accessible they make examination preparation material to students. Egypt, Jordan and Yemen make only some material accessible to students (Emerging level), while the rest of the countries make comprehensive preparation material accessible to most or all students (Established to Advanced levels).
b. Providing teachers with opportunities to
learn about the examination
In this indicator, countries are varied. In Egypt
and Iraq, there are no courses or workshops on
examinations available to the teachers (Latent level).
In Oman, Palestine, Sudan, Syria, and Yemen such
courses exist, but they are not up to date (Emerging
level). The rest of the countries have up-to-date courses or workshops, but these are voluntary in two countries and compulsory in six (Established to Advanced levels). In involving
teachers in examination-related tasks, the scope
ranges widely among the countries. Six countries
involved their teachers in most examination-related
tasks (Advanced level).
Thus, examinations in the surveyed countries appear generally well aligned with learning goals and opportunities to learn for students, but the opportunities for teachers to learn about the assessment and be involved in it vary widely across countries.
3. Assessment Quality
This quality driver assesses the “degree to which the
assessment meets quality standards, is fair, and is used
in an effective way”.
Four indicators are included:
a. Ensuring quality
Lebanon, Mauritania, Palestine and Yemen have
some documentation on the examination, but
it is not in a formal report form (Emerging level).
The remaining countries all have a comprehensive
technical report, but in most cases with restricted
circulation (Established). Iraq, KSA, Libya and Oman
have made this report available to the general
public (Advanced level). Only Yemen has no
mechanisms in place to ensure the quality of the
examination (Latent). Iraq has varied and systematic
mechanisms in place to ensure the quality of the
examination (Advanced level). The rest have limited
such systematic mechanisms in place (Established).
b. Ensuring fairness
In ensuring fairness of the examination, the
surveyed countries are varied in the occurrence of
inappropriate behavior surrounding the examination
process. Such behavior occurs at a high rate in
Iraq, Libya and Yemen (Latent level); at a moderate
rate in Egypt, Jordan, Palestine, Syria, and Tunisia
(Emerging); and at a low or marginal frequency in
the rest of the countries (Established to Advanced
levels). The examination results are credible for
some stakeholder groups in Yemen (Emerging level)
and for all groups in all other countries (Established).
In all 17 countries, all the students can take the
examination, with no language, gender or other
barriers (Advanced level).
c. Using examination information in a fair way
Examination results are used by most or all stakeholder
groups in a proper way in all countries (Established to
Advanced levels). In eight countries, however, student
names and results are public (Latent).
d. Ensuring positive consequences of the
examination
To ensure positive consequences of the Examination,
countries may provide options for students who
do not perform well on the examination. These
options vary in range from one country to the
other (Emerging to Advanced), with only Libya and
Yemen providing a variety of options in such cases
(Advanced level).
In Bahrain, Iraq, Lebanon, and Mauritania there
are no mechanisms in place to monitor the
consequences of the examination (Latent), while in
all remaining countries there are some mechanisms
in place for the purpose (Established level).
As seen in this driver of Assessment Quality for
Examinations, the surveyed countries range in the
degree to which their assessment meets quality
standards. In general, examinations ensure fairness
in the respective country assessment activities. The
examination information is more or less used in a
fair manner, but the ability to ensure positive consequences of the examination appears limited.
PART THREE
National Large-Scale Assessment (NLSA)

I. ENABLING CONTEXT
II. SYSTEM ALIGNMENT
III. ASSESSMENT QUALITY
IV. BENCHMARKING FOR NLSA
I. ENABLING CONTEXT
1. Setting a Clear Policy for NLSA
a. Existence and nature of the large scale
assessment
All countries except for Iraq, Kuwait and Libya use
certain forms of national large scale assessment.
Some countries have run several NLSAs but selected one for the purposes of the NLSA questionnaire.
These assessments have different names and
characteristics (see Table 3.1).
In Bahrain, the National Examinations are conducted
every year and assess all students in grades 3, 6,
9, and 12, covering a variety of subjects: grade 3
students (Arabic language and Mathematics), grade
6 and 9 students (Arabic and English languages,
Mathematics and Sciences), and grade 12 students
(Arabic, English, and problem solving).
In Egypt, the National Standardized Examination was launched less than five years ago and has been conducted only twice, covering Arabic and English languages, Mathematics, and Sciences in grades 4, 8 and 10. In Jordan, the National Examination (Education Quality Control Examination) is administered regularly, once every three years, for the same grades (4, 8 and 10), covering Arabic and English languages, Mathematics and Sciences.
In KSA, the National Tests were applied once only
in 2004 for the 6th grade covering four subjects. In
Lebanon, the “Measuring learning achievement”
survey was conducted in 2003, covering third grade
students in Languages, Mathematics, Sciences,
Social Studies and Civics. In Mauritania, the
Assessment of the second year of basic education
took place in 2001. It focused on students’
acquisition in the second year of basic education in
Arabic, French and Mathematics.
Oman’s Cognitive Development Tests are applied
every year (two to four times every five years) for
grades 5 to 10, covering Mathematics, Sciences, and
Environmental Geographic concepts. In Palestine,
the National Assessment survey has been conducted
every two years since 2008, and covers Arabic,
Mathematics and Sciences for the 4th grade (10 year
olds) and 10th grade (16 year olds) of basic education.
In Qatar, the Comprehensive Educational Evaluation
started more than five years ago and is conducted on
a regular basis every year. It covers Arabic, English,
Mathematics, Sciences, Chemistry, Biology, Physics,
Islamic Education, and Social Sciences.
In Sudan, a survey of the state of education in
some states (provinces) was undertaken less than
five years ago. It covered Mathematics and English
and Arabic languages for grades 4 and 5. The
study took place in four states (provinces) only
and was conducted by the Sudanese Organization
for Education Development under the supervision
of the World Bank and with the approval of the
Ministry of Education.
In Syria, the Standardized Examinations are
implemented yearly in all of the provinces whereby
each province chooses a specific subject for a specific
grade and a standardized examination is administered
to all students in that grade. The standardized
examinations include the transitional grades (the
grades that do not have general examinations).
In Tunisia, the Diagnostic Assessment of the
Student’s Acquisitions is administered every year at the beginning of the 7th grade, covering
Mathematics, Arabic and French. The first NLSA
was implemented in the academic year 2012/2013.
The United Arab Emirates National Assessment
Program (UAENAP) covers Arabic and English
languages, Mathematics, and Sciences in grades 3,
5, 7 and 9. It started more than five years ago, and
has been administered on a regular basis every year.
In Yemen, the National Assessment System of
Students (NASS) is still under development and
has not been applied yet. It will cover Sciences,
Mathematics, and Arabic language, and will take
place upon completion of the first and second
cycles, namely at the beginning of grades 4 and 7.
Table 3.1: Name of large-scale assessment, frequency and population (q3)

Country | Name of assessment | Frequency | Population
1. Bahrain | National Examinations | regular | Students of grades 3, 6, 9 and 12
2. Egypt | National Standardized Examination | not regular | A representative random sample of students
3. Iraq | – | – | –
4. Jordan | National Examination (Education Quality Control Examination) | regular | All students at the given grade(s)
5. KSA | National Tests | not regular | A representative random sample of students
6. Kuwait | – | – | –
7. Lebanon | Measuring Learning Achievement | not regular | A representative random sample of students
8. Libya | – | – | –
9. Mauritania | Assessment of the 2nd year of basic education 2001 | not regular | A representative random sample of students
10. Oman | Cognitive Development Tests | regular | All students at the given grade(s)
11. Palestine | National Assessment survey | regular | A representative random sample of students
12. Qatar | Comprehensive Educational Evaluation | regular | All students at the given grade(s)
13. Sudan | A survey of the state of education in some states (provinces) in Sudan | not regular | All students at the given grade(s)
14. Syria | Standardized Examinations | regular | All students at the given grade(s)
15. Tunisia | Diagnostic Assessment of the Student’s Acquisitions | regular | A representative random sample of students
16. UAE | UAE National Assessment Program (UAENAP) | regular | All students at the given grade(s)
17. Yemen | National Assessment System of Students (NASS) | not regular | A representative random sample of students
It can thus be noted from Table 3.1 that only eight countries are administering national large-scale assessments on a regular basis: Bahrain, Jordan, Oman, Palestine, Qatar, Syria, Tunisia, and UAE. Most of these countries apply the assessment to all students at the given grade(s). The stated purposes of the assessment, however, do not differ between countries, whether the NLSA is regular or irregular, or administered to all students or to a random sample. The NLSA in these countries generally
serves the following two purposes: (a) monitoring
education quality at the system level and (b)
policy design, evaluation, or decision making. In
some countries the NLSA also has the purpose
of: “holding government or political authority
accountable” (Egypt, Mauritania and Palestine),
“school or educator accountability” (Egypt,
Palestine and Qatar), or “supporting schools and
teachers” (Oman, Palestine, Qatar, Sudan, Tunisia,
and UAE).
All countries use multiple-choice format in their
NLSA, while eight use supply/open ended question
format (Bahrain, Egypt, Palestine, Oman, Qatar,
Tunisia, UAE, and Yemen). The essay format is used by nine of the countries (Bahrain, Egypt, Lebanon, Oman, Palestine, Qatar, Syria, Tunisia, and UAE).
b. Policy Documents
In most cases, there is no real formal policy document
related to national large scale assessments. Out of
the 14 countries which have any form of NLSA,
only four have a policy document related directly
to the subject: Oman, Qatar, Syria, and UAE (Table
3.2). The other documents are of a general nature.
In Oman, Qatar, and UAE these documents are
available to the public, while in Syria the document
is available in schools along with the relevant
guidelines. The general documents, for their part, are available to the public (Bahrain, KSA, Mauritania, Palestine, and Tunisia), except for the “study” in Sudan.
Table 3.2: Policy document nature and date of issuing (q6)

Country | Policy document | Authorizing body and date
1. Bahrain | Royal decree number 32 for 2008 and its amendments | The King, 2008 and 2010
2. Egypt | Informal or draft policy | –
3. Iraq | – | –
4. Jordan | – | –
5. KSA | Educational policy in the KSA | CM, 1969
6. Kuwait | – | –
7. Lebanon | – | –
8. Libya | – | –
9. Mauritania | The organizational chart of the Ministry of National Education | MOE, 1999
10. Oman | Cognitive Development Manual | MOE, 2009
11. Palestine | – | MOE, 2008
12. Qatar | Evaluation Policy for grades 4 to 11 | SEC, 2010
13. Sudan | An evaluation study for basic education in the states of Northern Darfur, the Red Sea, Southern Kordofan and the Blue Nile | MOE, 2010
14. Syria | Executive Instructions of Unified Exam for Basic Education and High School | MOE, 1998
15. Tunisia | The orientation law on education and school learning | MOE, 2002
16. UAE | Implementing the National Assessment Program (administrative decision) | MOE, 2010
17. Yemen | The Strategic Plan for Educational Development (2008-2012) | –
c. Large-scale Assessment Plan
Six countries report that they do not have a large-scale assessment plan for the coming years or for
future assessment rounds. These are Egypt, Iraq,
Kuwait, Lebanon, Libya, and Qatar.
Countries which indicated that they have such
a plan provided information implying that this
plan is still tentative or intentional. For instance,
Bahrain refers to the National Economic Strategy
of 2009-2014 and states that the establishment
of the National Authority for Qualification and
Quality Assurance of Education and Training is one
of the initiatives adopted to reform the National
Education System mentioned in the National
Economic Strategy to achieve the Economic Vision
2030 for the Kingdom of Bahrain. The Authority
was delegated the responsibility of standard setting
and quality control by using a series of national
examinations and periodic assessments in public
schools and of submitting general reports thereto.
Jordan clarifies that the plan is available at the
Department of Examinations but it could not be
accessed. Mauritania explains that a large number
of assessments are foreseen in the program
submitted to the GPE (Global Partnership for Education)
for basic education (regular assessment of students
and of the Teacher training School, participation
in TIMSS, EGRA, PASEC, etc.). Palestine reports
that discussion is still at its first phase within
the Ministry’s Education Policy Committee; the
preliminary information indicates a tendency to
apply a national assessment in 2014. Currently in
KSA, a relevant program is under implementation
in cooperation with the World Bank. Sudan reports
that there is a project to design the National
Learning Assessment System-Project (GPE). Syria
indicated standardized Examination Instructions for
the transitional grades. Tunisia states that in the
future, it plans to conduct the NLSA on an annual
basis. Yemen’s plan is linked to the country’s Basic
Education Development Project (BEDP) and Global
Partnership for Education (GPE).
When it comes to determining which of the following three options best describes the situation, the planning picture changes somewhat:
Option a: There is a publicly-available written plan
specifying who will be tested [e.g., 4th graders]
and in which subject areas [e.g., math, science].
The plan is available to, and easily accessible by,
the public. Five countries are concerned: Bahrain,
Mauritania, Oman, Sudan, and UAE.
Option b: There is a non-publicly available written
plan specifying who will be tested [e.g., 4th graders]
and in which subject areas [e.g., math, science].
The plan is available to, and accessible by, only
certain selected groups of people. Five countries are
concerned: Jordan, KSA, Syria, Tunisia and Yemen.
Option c: There is a common understanding that the
assessment will take place but there is no formally
written plan. The only country which selected this
option is Palestine.
2. Public Engagement for NLSA
In general stakeholders are engaged in NLSA either
within coordinating efforts (in seven countries) or within
independent efforts (in five countries: Egypt, Lebanon,
Oman, Palestine, and Tunisia). Only Bahrain, Mauritania
and Yemen reported not having such stakeholders’
engagement.
Among the stakeholders, policy-makers and educators seem supportive of the large-scale assessment program, while teacher unions, students, parents and employers tend to be neutral or their position is not yet known. The media, think tanks, NGOs and universities are divided between the two positions. Opposition to the NLSA from students and parents appears, however, in Palestine and Qatar, and from students only in Jordan.
Bahrain and UAE point out that the large-scale assessment does not count towards students' grades, promotion or failure decisions, which leads students to take the assessment less seriously.
3. Funding
The typical situation is that funding of large-scale
assessment programs is allocated by the government.
Only three countries obtain funding from non-governmental sources, on an irregular basis, combined with an
absence of any government funding: Mauritania, Sudan
and Yemen. Government funding is regularly available
in Bahrain, Jordan, Oman, Qatar, Syria, and UAE. There
is irregular government funding in Egypt, KSA, Lebanon,
Palestine, and Tunisia.
Funding allocated for the large-scale assessment
program covers all types of activities in all countries:
assessment administration and data analysis, assessment
design (except for Palestine), data reporting (except
for Palestine and Syria), and staff training (except for
Palestine, Jordan and Syria).
4. Organizational Structures
a. The agency
In most countries, the group in charge of the
large-scale national assessment is described
as a “permanent agency or institution or unit
created for running the assessment”: the
National Center for Examinations and Educational
Evaluation (Egypt), the Assessment and Evaluation
Department (Palestine), the Research Directorate
at the Ministry of Education (Syria), the National
Center for Pedagogical Innovation and Educational
Research (Tunisia), the Center for Measurement and
Assessment (Yemen), the Center for Educational
Research and Development (Lebanon).
b. Political considerations
Nine countries claim that political considerations
never hamper technical considerations for
the NLSA, although this may happen in some
countries (Egypt, Jordan, KSA, and Yemen). In
Egypt for example, the problems specifically
relate to sampling, shortcomings in logistics and
administrative complications.
In some cases, the publication of the results is
delayed or withheld because of political reasons. In
Egypt, dissemination of the results was limited to
a report related to the program. In Palestine, the
2010 and 2012 detailed assessment results have
not yet been published to the public. Special reports
on the national school sample are being developed
detailing school performances compared to the
national performance.
c. Accountability
Jordan, Oman and Yemen reported that the
group responsible for carrying out the large-scale
assessment is not accountable to a clearly recognized
body. The remaining countries reported that the
respective group in their country is held accountable
to a recognized body or provided specifications.
The assessment group is accountable to a higher
office in the Ministry of Education or another
sectorial authority in eight countries. Among these
is Mauritania, where the group is also accountable
to an external board or committee (government or
non-government). In Bahrain, the agency in charge
of the national assessment reports to the Cabinet of
Ministers. In Qatar, the unit reports to the Supreme
Executive Committee of the Supreme Education
Council.
5. Human Resources
Countries involved in national large scale assessments
have different situations regarding the availability of
human resources for the NLSA, as seen in Table 3.3.
Only five countries have an adequate number of
permanent or full-time staff, while five other countries
do have permanent or full-time staff, but their number
is insufficient to meet the needs of the assessment.
Lebanon, Mauritania and Sudan have only temporary
or part-time staff. In some cases the agency refers to
another unit for help, such as in Tunisia, where some
inspectors and teachers are required to carry out the
mission. In Mauritania, the assessment group is solid
and technically operational, but its own sustainability
and renewal are not assured. The members of the group
attended universities abroad in the 1990s and are now relatively old and close to retirement.
The issues that have been identified in some countries
regarding the performance of the human resources
responsible for the large-scale assessment are of different
nature. In Palestine, the NLSA faced a shortage in staff
members to follow up on field assessment activities, which forced the central team to resort to part-time members. It also faced some difficulties in obtaining the
latest data on schools, teachers and students, which
delayed the sampling process.
Table 3.3: Staffing adequacy (q20)

Staffing situation | Countries
Permanent or full-time staff (adequate in number in five countries, insufficient in five) | Bahrain, Egypt, Jordan, KSA, Oman, Palestine, Qatar, Syria, Tunisia, UAE
Temporary or part-time staff only | Lebanon, Mauritania, Sudan
No staff allocated to run the large-scale assessment (or no NLSA in place) | Iraq, Kuwait, Libya, Yemen
In Mauritania, the members have all been trained at the
Masters level in universities abroad and are specialized
in the field of assessment. They are, however, in need
of updating their knowledge in most fields of
assessment (multi-level data analysis, factor analysis). In
KSA, the problem concerns the limited use of results. In
Sudan, all technical work was undertaken by the World
Bank experts, so there was some criticism of how the data
was analyzed.
Opportunities available for professional development
are summarized in Table 3.4. For KSA and UAE, all kinds
of opportunities are offered to the assessment staff:
university graduate programs and courses specifically
focused on educational measurement and evaluation,
non-university training courses or workshops on
educational measurement and evaluation, funding
for attending international programs or courses or
workshops on educational measurement and evaluation,
and internships or short-term employment in the large-scale assessment office. Sudan reported four of these
learning opportunities. These are affected, however, by the weak spread of a measurement and evaluation culture and the scarcity of training sessions and workshops.
Lebanon and Mauritania provide only opportunities
for attending international programs or courses or
workshops on educational measurement and evaluation.
Mauritania hopes to compensate in the future through the GPE program by strongly re-launching the work of its evaluation team, coupled with training sessions for young professionals and openness to the latest assessment and psychometric techniques.
Table 3.4: Opportunities available for professional development on educational measurement and evaluation (q22)

Country | University graduate programs (masters or doctorate level) | University courses (graduate and non-graduate) | Non-university training courses or workshops | Funding for attending international programs, courses or workshops | Internships or short-term employment in the large-scale assessment office
1. Bahrain |  | ✓ |  |  |
2. Egypt | ✓ | ✓ | ✓ |  |
3. Iraq |  |  |  |  |
4. Jordan | ✓ | ✓ | ✓ |  |
5. KSA | ✓ | ✓ | ✓ | ✓ | ✓
6. Kuwait |  |  |  |  |
7. Lebanon |  |  |  | ✓ |
8. Libya |  |  |  |  |
9. Mauritania |  |  |  | ✓ |
10. Oman | ✓ |  | ✓ | ✓ |
11. Palestine |  | ✓ |  | ✓ | ✓
12. Qatar |  |  |  | ✓ | ✓
13. Sudan | ✓ | ✓ | ✓ |  |
14. Syria | ✓ | ✓ | ✓ |  |
15. Tunisia |  | ✓ | ✓ | ✓ |
16. UAE | ✓ | ✓ | ✓ | ✓ | ✓
17. Yemen |  | ✓ | ✓ |  |
In Egypt, Jordan, Sudan, and Syria, university graduate
programs, university courses and non-university training
courses or workshops are available. There are doubts,
however, about the quality of the university courses.
Syria’s questionnaire points to the fact that academic
studies are not up to the required level, and that
they offer theoretical knowledge more than practical
experience. As such, the Educational and Psychological
Measurement and Assessment Center was created to
address the needs of capacity building in the field of
assessment and measurement. As for Egypt, it states that non-university training courses related to educational measurement and assessment are considered to be of relatively high quality because they are intensive, in-service courses from which trainees can benefit directly on the job.
In Oman, university graduate programs, non-university
training courses and funding for international programs
are available. It is reported however that the available
training programs do not cover all required areas such
as advanced statistical analysis, and if available, they can
be general at times.
In Palestine, three opportunities are available: university courses, funding for attending international programs, and internships in the large-scale assessment office. However, no real assessment studies have been done on the university courses, and there are discrepancies between universities as to the quality of the courses offered.
In Tunisia, university courses, non-university training
courses and funding for international programs are all
available. In Yemen, university and non-university courses
and workshops are offered, whereas in Qatar, funding
for attending international programs or workshops and
internships are available for professional development.
Bahrain only provides some related university courses.
II. SYSTEM ALIGNMENT
1. Aligning NLSA with Learning Goals
In all 14 countries under consideration, the NLSA
measures performance against national/system or state-level curriculum guidelines or learning standards. In
Egypt, Jordan and Sudan, the assessment also measures
performance against internationally recognized
curriculum guidelines or learning standards.
Countries are divided, however, regarding the position
of stakeholders towards what the NLSA measures.
Bahrain, Lebanon, Mauritania, Oman, Qatar, Sudan,
and UAE indicate that what is measured by the large-scale assessment is largely accepted by the stakeholder
groups, while Egypt, Jordan, KSA, Palestine, Syria and
Tunisia state that some stakeholder groups question
what the assessment measures. Only in Egypt the
questionnaire states that these questions are raised by
some students and teachers. On the other hand, most
of the countries (with the exception of Sudan, Syria and
Yemen) confirm that mechanisms are in place to ensure
that the large-scale assessment accurately measures what
it is supposed to measure.
Countries may use “regular independent review”, “regular internal review” or “ad hoc review” to ensure that the large-scale assessment accurately measures what it is supposed to measure. The second option appears to be the most common mechanism in place, as it
is used by 10 countries. Only Lebanon applies solely the
third mechanism, while none of these mechanisms is in
place in Sudan, Syria or Yemen.
Out of the ten countries which use the regular internal
review, eight also use either one of the two other
mechanisms or both. In total, three countries use the
three mechanisms together (Jordan, Tunisia and UAE)
and six use two mechanisms: Egypt, KSA, Mauritania,
Oman, Palestine, and Qatar.
In Palestine, experts from the National Curricula Center,
education supervision teams and education assessment
teams take part in the working groups developing
the achievement tests of the National
Assessment, which makes the assessment tools highly
credible. Egypt adds that internal and external reviews
are carried out systematically during the questions
elaboration phase.
2. Providing Teachers with Opportunities to Learn about NLSA
Ten out of the 14 countries offer teacher training
courses, workshops, or presentations on the large-scale
assessment among other training opportunities (Table
3.5). There are no teacher training courses or workshops
on the large-scale assessment in Egypt, Jordan, Sudan
and Yemen.
Table 3.5: Teacher training provision on NLSA (q28)

Teacher training form | N | Countries
Courses or workshops are offered on a regular basis | 2 | Bahrain, Qatar
Courses or workshops are offered occasionally | 6 | KSA, Lebanon, Oman, Syria, Tunisia, UAE
Presentations are offered occasionally | 5 | KSA, Lebanon, Mauritania, Oman, Tunisia
Most teachers have access to live courses or workshops | 2 | Qatar, UAE
Most teachers have access to courses online | 1 | KSA
Most courses are of a high quality | 2 | Bahrain, UAE
Most courses provide teachers with relevant resources that they can use in their classrooms | 3 | Bahrain, Qatar, UAE
Other | 1 | Palestine
There are no teacher training courses or workshops on the large-scale assessment | 4 | Egypt, Jordan, Sudan, Yemen
The most common training is in the form of courses
or workshops offered occasionally, followed by
presentations also offered on an occasional basis. This
once again confirms the low rate of teacher training
provision on NLSA in the region. In total, the UAE
mentioned using four of the eight proposed forms
of training; Bahrain, KSA and Qatar mentioned three
forms; Lebanon, Oman, and Tunisia mentioned two
forms; Mauritania, Palestine and Syria mentioned only
one form, while Egypt, Jordan, Sudan, and Yemen have
no such teacher training provision.
III. ASSESSMENT QUALITY

1. Ensuring the Quality of NLSA
Nine countries reported that, to ensure a wide social coverage, the large-scale assessment is offered in the language of instruction for almost all student groups (Bahrain, Egypt, Jordan, Lebanon, Mauritania, Oman, Qatar, Tunisia, and UAE). Special plans to ensure that the large-scale assessment is administered to students in hard-to-reach areas exist in only four countries (Egypt, Lebanon, Mauritania, and Oman), and accommodations or alternative arrangements for students with disabilities are provided only in Bahrain.

Regarding mechanisms in place to ensure the quality of the large-scale assessment instruments, the most common are: “All booklets are numbered” (11 countries); “There is a standardized manual for large-scale assessment administrators” (10 countries); “All proctors or administrators are trained according to a protocol” (9 countries); “A pilot is conducted before the main data collection takes place” (8 countries); “Scorers are trained to ensure high inter-rater reliability” (8 countries); and “Internal reviewers or observers” (7 countries). Some other mechanisms are used in only one to four countries, as shown in Table 3.6.

Bahrain, Lebanon, Palestine, Qatar, and UAE use more than seven mechanisms each, while Egypt, Jordan, KSA, Mauritania, Sudan, and Tunisia use one to six mechanisms only. Yemen has none of these mechanisms in place.
Table 3.6: Frequency of mechanisms in place to ensure the quality of the NLSA (q30)

Mechanism | N
All proctors or administrators are trained according to a protocol | 9
There is a standardized manual for large-scale assessment administrators | 10
Discrepancies must be recorded on a standard sheet | 1
A pilot is conducted before the main data collection takes place | 8
All booklets are numbered | 11
There is double data scoring (if applicable, for example, for open-ended items) | 4
Scorers are trained to ensure high inter-rater reliability | 8
There is double processing of data | 3
External reviewers or observers | 3
Internal reviewers or observers | 7
External certification or audit | 2
Other | 2
None | 1
The countries appear to be less engaged in the technical
documentation of the NLSA. In selecting the best
option to describe this technical documentation, the
option of a comprehensive, high-quality technical report
available to the general public was selected by only
four countries - Jordan, Mauritania, Oman, and UAE,
against seven countries which chose the option “There
is a comprehensive technical report, but with restricted
circulation”. In Palestine there is some documentation
about the technical aspects of the assessment, but it is
not in a formal report format, while in Lebanon there is
no technical report or other documentation.
2. Ensuring the Effective Use of the NLSA
Dissemination of results is one of the effective uses
of the NLSA. Table 3.7 presents different options of
dissemination, with their respective frequencies. Only
Syria and Tunisia did not mention any of these options to
report on the NLSA results.
The most common action is to hold workshops and make presentations on the results for stakeholders; this is the easiest form of dissemination, but it keeps dissemination limited to specific groups (10 countries).
The second option in order of frequency does not relate
to the scope of dissemination but rather to the content
of the report, wherein the main reports on the results
contain information on the overall achievement levels
and subgroups (9 countries).
The most effective ways of dissemination, options “a”
and “b” in Table 3.7, are practiced in only six and seven
countries, respectively. Two countries mentioned the
option “other”: Palestine states that the overall results
are published in the yearly follow-up and assessment
report (2008, 2010 and 2012). A special brochure with
the results is sent to all educational institutions and
schools (2008). In Oman, the results are published on
the Ministry’s Educational Portal. Both cases pertain to
the category of wide dissemination.
Some countries have not disseminated the results at all
or have used only one or two forms of dissemination,
as is the case in Egypt, KSA, Sudan, Syria, and Tunisia.
Lebanon and Mauritania could be placed under the
category of medium dissemination scope (3-5 forms).
The third group of countries, with a wide scope of
dissemination (6-7 options) includes Bahrain, Jordan,
Oman, Palestine, Qatar, and UAE.
How is information from the NLSA used? The answers
to this question reveal limited benefit of this kind of
assessment. Only five countries responded that the
assessment information is used by all or most stakeholder
groups in a manner consistent with the stated purposes
or technical characteristics of the assessment. These are
Jordan, Oman, Qatar, Sudan, and UAE. Tunisia should
in fact be added to this group because it states that the
process has not yet been completed; however, the usage
will be on a large scale and available to all sponsoring
parties.
Table 3.7: Frequency of mechanisms to disseminate NLSA results (q32)

Way of dissemination | N
a. Results are disseminated within twelve months after the large-scale assessment is administered | 6
b. Reports with results are made available for all stakeholder groups | 7
c. The main reports on the results contain information on overall achievement levels and subgroups | 9
d. The main reports on the results contain information on trends over time overall and for subgroups | 6
e. The main reports on the results contain standard errors (measure of uncertainty) | 3
f. There is a media briefing organized to discuss results | 4
g. There are workshops or presentations for key stakeholders on the results | 10
h. Results are featured in newspapers, magazines, radio, or television | 2
i. Other | 2
The second group includes six countries which declared
that assessment information is used by some stakeholder
groups in a way that is consistent with the stated
purposes or technical characteristics of the assessment.
These are Bahrain, Egypt, KSA, Mauritania, Palestine and
Syria. Only Lebanon stated that assessment information
is not used by stakeholder groups or is used in ways
inconsistent with the stated purposes or the technical
characteristics of the assessment.
Four countries report that there are no mechanisms in
place at all to monitor the consequences of the NLSA
(Egypt, Lebanon, Mauritania and Sudan). The other
options, relating to funding, oversight committee, focus
groups, themed conferences, and expert review groups
were not selected by more than four countries each. Jordan,
Palestine, Syria, and Tunisia chose only one option.
Under the category “other”, Tunisia mentioned having
established specialized technical committees by subject
matter, while Jordan has been setting up remedial plans
to address student weaknesses in some basic learning
skills and following-up on the implementation process
by field supervisors.
Table 3.8 provides the country answers to the last
question in the questionnaire on the mechanisms in
place to monitor the consequences of the large-scale
assessment and confirms limited benefits from NLSA.
Table 3.8: Frequency of mechanisms in place to monitor consequences of the NLSA (q34)

Mechanism | N
Funding for independent research on the impact of the large-scale assessment | 2
A permanent oversight committee | 4
Regular focus groups or surveys of key stakeholders | 2
Themed conferences that provide a forum to discuss research and other data on the consequences of the large-scale assessment | 2
Expert review groups | 4
Other | 3
None | 4
IV. BENCHMARKING FOR NLSA

Overall View
In assessing the overall development levels of the surveyed countries on the drivers of National Large-Scale Assessment (NLSA), the majority of the countries were found to be Emerging. This indicates that an unstable NLSA is in place, with a need for the respective countries to develop institutional capacity to run the NLSA; assessment quality and impact are weak. Iraq, Kuwait, Libya, and Yemen showed Latent progress in this area, as they have no NLSA in place yet. Bahrain and UAE were the only countries found to be Established in this type of assessment, with a stable NLSA in place, institutional capacity and some limited mechanisms to monitor the assessment. The NLSA is of moderate quality and its information is disseminated, but not always used in effective ways.

Table 3.9: Benchmarking results for national large-scale assessment (by country and status)

Country | Latent | Emerging | Established | Advanced
1. Bahrain |  |  | ✓ |
2. Egypt |  | ✓ |  |
3. Iraq | ✓ |  |  |
4. Jordan |  | ✓ |  |
5. KSA |  | ✓ |  |
6. Kuwait | ✓ |  |  |
7. Lebanon |  | ✓ |  |
8. Libya | ✓ |  |  |
9. Mauritania |  | ✓ |  |
10. Oman |  | ✓ |  |
11. Palestine |  | ✓ |  |
12. Qatar |  | ✓ |  |
13. Sudan |  | ✓ |  |
14. Syria |  | ✓ |  |
15. Tunisia |  | ✓ |  |
16. UAE |  |  | ✓ |
17. Yemen | ✓ |  |  |

1. Enabling Context
This driver assesses "the overall framework of policies, leadership, organizational structures, financial and human resources in which NLSA activity takes place in a country or system and the extent to which that framework is conducive to, or supportive of, the NLSA activity".

Five indicators are included:

a. Setting clear policies for NLSA
The surveyed countries show a variety of results on the sub-indicators in this area. In Iraq, Kuwait, Libya, and Yemen, no NLSA exercise has taken place to date (Latent level). Bahrain, Mauritania, Palestine and Syria each have an NLSA, but it has been operating on an irregular basis (Emerging). The remaining countries have a stable NLSA program that has been operating regularly (Established). Nine countries have formal policy documents that authorize the NLSA (Established), while the rest either have none or have an informal or draft policy document (Latent to Emerging levels). Almost half of the countries have made this policy document available to the public (Established); the situation for Jordan is unclear. In planning for upcoming NLSAs, countries vary greatly: nine countries have a written NLSA plan for the coming years (Advanced), two countries have a general understanding that the NLSA will take place (Established), and the rest have no plan for NLSA activity (Latent).

b. Having strong public engagement for NLSA
Only in Jordan and Sudan do all stakeholder groups support the NLSA (Advanced level). The remaining countries which do have an NLSA show varying degrees of support from stakeholder groups. The situation in Mauritania in this regard is not clear.

c. Having regular funding for NLSA
Five countries have regular funding allocated to the NLSA (Established level), and the rest have none or irregular funding (Latent to Emerging levels). Funding in most countries covers some if not all
NLSA activities: design, administration, analysis and
reporting (Emerging to Established). In Bahrain, KSA,
Lebanon, Qatar, Tunisia, and UAE the funding covers
research and development activities (Advanced).
d. Having strong organizational structures
for NLSA
With the exception of Lebanon, the countries
that administer an NLSA have an NLSA office that
is a permanent agency, institution or unit
(Established). Lebanon has a temporary agency or
group of people for the NLSA office (Emerging).
Political considerations never hamper technical
considerations in nine countries (Advanced) or
sometimes do so, as is the case in four countries
(Established). In all these countries, the NLSA
office is accountable to a clearly recognized body
(Established).
e. Having effective human resources for NLSA
Staffing of NLSA offices to carry out the NLSA activity is adequate with no issues in five countries (Advanced), adequate with minimal issues in three countries (Established), or inadequate (five countries – Emerging). All countries undertaking NLSA offer some form of opportunities to prepare individuals for work on the NLSA (Established to Advanced levels).

Thus, it is clear that the enabling context for NLSA in the surveyed countries ranges in the provision of the framework and resources for the activity, with some countries having clearer, more widely available pertinent policies than others, and with a range of support from stakeholder groups and of funding. Organizational structures for NLSA are relatively strong in most countries. Staffing and learning opportunities range in scope.

2. System Alignment
This driver assesses the "degree to which the NLSA is coherent with other components of the education system".

Two indicators are included here:

a. Aligning the NLSA with learning goals
In all the countries with an NLSA in place, the NLSA presumably measures performance against curriculum or learning standards (Established). In only five countries is this questioned by some stakeholder groups (Established). In Sudan and Syria, however, there are no mechanisms in place to ensure that the NLSA accurately measures what it is supposed to measure (Latent). Lebanon performs ad hoc reviews for this purpose (Emerging), while the rest of the countries have regular internal reviews of the NLSA to do so (Established).

b. Providing teachers with opportunities to learn about NLSA
Only Bahrain offers its teachers widely available, high-quality courses or workshops on NLSA on a regular basis (Advanced), while Qatar offers only some such courses (Established). The remaining countries have either no courses or only occasional ones (Latent to Emerging).

Therefore, for this driver, it appears that in most countries the alignment of the NLSA with learning goals is clear; however, not all countries have a regular internal review process for the NLSA to ensure this alignment, and teachers are not systematically offered opportunities to learn about the NLSA in all countries.

3. Assessment Quality
This quality driver assesses the "degree to which NLSA meets technical standards, is fair, and is used in an effective way".

Two indicators are included:

a. Ensuring the quality of the NLSA
In this indicator, country performance showed a range, with six countries offering no options to include all groups of students in the NLSA (Latent level), while the rest offer at least one option to do so. Syria has no mechanisms in place to ensure the quality of the NLSA, but the other countries have either some such mechanisms or a variety of them (Established to Advanced levels).

b. Ensuring effective uses of the NLSA
The surveyed countries also vary in their methods of ensuring effective uses of NLSA results. In Syria and Tunisia, the assessment results are not disseminated at all (Latent level). Dissemination is poor in Egypt, KSA, Lebanon and Mauritania (Emerging). In the remaining seven countries, NLSA results are disseminated in an effective way (Established level). With the exception of Lebanon and Tunisia, where NLSA information is not used at all or is used in ways that are inconsistent with the purposes or the technical characteristics of the assessment (Latent), in the remaining countries NLSA information is used by either some or all of the stakeholder groups in a way consistent with the purposes and technical characteristics of the assessment (Established to Advanced). Only eight countries have some form of mechanisms in place to monitor the consequences of the NLSA (Established).

As seen in this driver of assessment quality for NLSAs, the countries surveyed fared averagely, with some attempts here and there to ensure the quality of the NLSA and its effective use.
PART FOUR
International Large-Scale Assessment (ILSA)

I. ENABLING CONTEXT
II. SYSTEM ALIGNMENT
III. ASSESSMENT QUALITY
IV. BENCHMARKING FOR ILSA
I. ENABLING CONTEXT
1. Participation
Of the 17 countries sampled for the survey, 14 reported having participated in international large-scale assessments (ILSA) in the past two decades. These countries are: Bahrain, Egypt, Jordan, KSA, Kuwait, Lebanon, Mauritania, Oman, Palestine, Qatar, Syria, Tunisia, UAE, and Yemen (see Table 4.1).

The most common international assessment in the region is by far the TIMSS. With the exception of Mauritania, all the other participating countries have taken part at least once in TIMSS. Jordan and Tunisia have participated four times, and seven other countries three times (Bahrain, KSA, Kuwait, Lebanon, Palestine, Syria, and Yemen).

For several countries, the most recent international assessment in which they had participated, and accordingly referred to in completing the questionnaire, was the TIMSS in 2011. This is the case for Bahrain, Jordan, KSA, Lebanon, Oman, Palestine, Syria, and Tunisia. In Yemen, given that the TIMSS 2011 results had not yet come out at the time the questionnaire was completed, the answers are based on the results of TIMSS 2007. The questionnaire responses for Egypt and Kuwait are also based on the results of TIMSS 2007; Egypt was unable to participate in the TIMSS 2011 study because of the revolution that started in early 2011. Mauritania's latest participation was the PASEC in 2004, as its survey results are widely disseminated in Mauritania. For both Qatar and UAE, the most recent international assessment undertaken was PISA in 2012, and that exercise accordingly formed the basis upon which each of the two countries completed its ILSA questionnaire.
Table 4.1: Country participation in previous international assessments (q2)
(Assessments covered, by country: PIRLS 2006 and 2011; TIMSS 1995, 1999, 2003, 2007 and 2011; PISA 2003, 2006 and 2009; PASEC 2004 and 2009; and other assessments, including PISA 2012, LAMP, TEDS-M and MLA II.)
A large number of countries have already taken
concrete steps to participate in upcoming international
assessments (see Table 4.2). Egypt, Iraq, Libya, Sudan,
and Yemen have not yet taken any measures to participate
in future large-scale international assessments.
Jordan had in fact already participated in PISA 2012.
Qatar and UAE both checked PISA 2012 as one of their
country’s upcoming international assessments despite
the fact that they had already completed the assessment.
The 12 countries which have planned to participate in
future assessments almost all plan to participate in the
TIMSS in 2015.
2. Policy Documents
Nine countries have some form of policy document
that addresses their participation in ILSA. Oman and
Tunisia have formal policy documents that are available
to the public. Oman’s document is related to ministerial
instructions and decisions regarding participation in
these studies and preparation and publication of relevant
national reports. Lebanon, Qatar and UAE have formal
policy documents but they are not made available to the
public. The UAE document is available only for strategic
partners such as educational councils and bodies
participating in the examinations implementation.
Palestine and Mauritania have an informal or draft
policy document. Yemen has an introductory document
to the program that is available to all stakeholders
but not available to the public. Egypt, Iraq, Jordan,
Kuwait, Libya, Sudan, and Syria all do not have any
policy documents that address country participation in
international assessments.
Table 4.2: Country participation in upcoming international assessments (q3)
(Assessments covered, by country: PASEC, PIRLS 2016, PISA 2012, PISA 2015, TIMSS 2015, and other assessments, including EGRA.)
Table 4.3: Country policy documents addressing participation in international assessments (q6)

Country | Official document citation | Authorizing body | Year of authorization
1. Bahrain | Policy document related to the participation in TIMSS 2003 | – | –
2. Egypt | – | – | –
3. Iraq | – | – | –
4. Jordan | – | – | –
5. KSA | – | – | –
6. Kuwait | – | – | –
7. Lebanon | Letter from the Head of the Center for Educational Research and Development and approval thereof from the Minister of Education and Higher Education | Ministry of Education and Higher Education | 2011
8. Libya | – | – | –
9. Mauritania | Basic Education Support Project: funding request to the GPE for program implementation | Ministry of Education | 2013
10. Oman | Ministerial decision number 140/2009 | Ministry of Education | 2009
11. Palestine | Strategic Plan for Education Development (2008-2012) | Ministry of Education | 2008
12. Qatar | Agreement of Participation | OECD | Renewed for every participation cycle
13. Sudan | – | – | –
14. Syria | – | – | –
15. Tunisia | Orientation Law on School Education and Learning (Law number 80-2002 of 23 July 2002) | Ministry of Education | 2002
16. UAE | Ministerial Council for Services, decision number (73/62/2) of 2010 | UAE Cabinet | 2010
17. Yemen | – | – | –
3. Funding
All countries participating in ILSA have regular sources of funding allocated for participation in international assessments. While the Gulf States are fully self-funded, the other countries rely completely or partially on external sources such as loans and external
donors. Egypt, Lebanon and Syria reported that there
is also a contribution from the Ministry of Education.
In some countries (Kuwait, Qatar and UAE) the funding
allocated for participation in ILSA is approved by law,
decree or norm.
This funding is used to cover a range of activities
(international participation fees; implementation of
the assessment exercise in the country; processing
and analyzing data collected from implementation of
the assessment exercise; reporting and disseminating
the assessment results in the country; attendance
at international expert meetings for the assessment
exercise). Only Jordan and UAE indicated using the
funding also for research and development.
In addition, specific uses are reported in some countries: funding for Bahrain covers the implementation
of a remedial action plan to improve students’
performance in the Kingdom of Bahrain after analyzing
their results and identifying their weaknesses in the
various educational competencies. In Jordan, funding
covers developing training guides for Mathematics
and Science teachers that identify the most common
mistakes of the grade 8 students based on the TIMSS.
The funding in Qatar is used for “scientific competitions”
which are conducted in independent schools specifically
designed for raising awareness about international
assessments. For Palestine, funding is usually provided
in line with the procedures of the Palestinian Ministry of
Finance in compliance with a one-year plan elaborated
by the Ministry of Education based on the five-year
general planning framework. Projects often cover some
activities related to international assessment studies
especially those pertaining to quality assessment or
impact studies of such programs on pilot schools in the
framework of these development projects.
4. Capacity: Team Staffing and Experience
All participating countries have assigned a team
responsible for carrying out the international assessment,
led by a national coordinator. In many cases the national
coordinator and some members of the team are working
at the Ministry of Education.
In Palestine, for example, the team also includes the
Data Manager from the Department of Performance
Measurement and Assessment within the Ministry who
is in charge of following up on school activities. In Syria,
the national coordinator is in fact the Senior Science
Supervisor at the Ministry of Education. The team includes
senior supervisors from the Ministry of Education and
specialized science and mathematics supervisors from
the provinces, in addition to a technical committee from
the IT directorate, whose members include specialists in
data entry and analysis. The supervisors are in charge
of visiting schools to evaluate and orient teachers. At
the central level, they are involved in curriculum and
textbooks matters. The coordination of the international
assessment in Tunisia is carried out by the Director of
the Evaluation Department at the National Center of
Pedagogical Renovation and Educational Research. The
assessment section team within this center is supported
by a number of inspectors and researchers.
In Mauritania, the team in charge of undertaking large-scale national assessments is responsible for PASEC and MLA. In Jordan, on the other hand, there is a separate national coordinator for the TIMSS and for PISA. Data collection from schools is the responsibility of Mathematics and Science supervisors from the Ministry of Education. They are trained in conducting international studies, and some have already gained experience through repeated participation.
Oman has an integrated team in place for carrying
out international assessments as part of the country’s
International Studies Program. Libya, which has not yet
participated in any large-scale international assessments,
has begun preparations for future participation. A
group of experts has visited the Ministry of Education
for coordination purposes related to Libya’s participation
in the TIMSS in 2015.
In all 14 countries which have recently participated in international large-scale assessments, the national teams have previous experience with international assessments. In some countries, however, the teams do not have the necessary training or experience to carry out the required assessment activities effectively, as is the case in Oman and Syria. For all participating countries, with the exception of Tunisia, the national coordinator is fluent in the language in which the international-level meetings are conducted and in which the related documentation is available. In KSA, Palestine, Syria, and Tunisia, the teams responsible for carrying out the international assessment are not sufficiently staffed. The team in Mauritania suffers from institutional instability, having been dissolved and re-created in past years. Moreover, its members, who were trained at the end of the 1990s, will be retiring in a few years without any identified or trained replacement team.
With regard to the training of the national team
members, not all participating countries were able to
have their teams attend all international meetings
related to the assessment. In Palestine, due to budget
constraints, no more than two team members were able
to attend the meetings. Furthermore, due to security
measures imposed by the Israeli occupation or because
of visa delays, none of the team members were able
to attend some of the meetings. In Syria also, there
were some obstacles that prevented the Syrian team
from attending certain meetings due to visa restrictions
(Germany) or the absence of a Syrian embassy in certain
countries (Australia). The teams from Egypt, Jordan, Kuwait, Oman, and Yemen were able to attend only some of the meetings.
5. Issues in Implementation
In carrying out the international assessment in their
countries, some teams were faced with difficulties.
There were issues with the translation of the assessment instruments, particularly in contacting the IEA to submit comments pertaining to the Arabic translations in Palestine, as well as errors and delays in scoring student responses to questions in Egypt. Mistakes in the Arabic translation of the questions were also identified in the UAE for the PISA examination.
The major issue in Syria and Yemen was related to
complaints about poor training and limited experience
of the test administrators. Furthermore, the Syrian team
working in the provinces was not dedicated full-time to
this task and was constantly complaining about the low
compensation they received for their assigned tasks.
The team in Yemen was faced with a large number
of issues, from errors or delays in printing and layout
of the test booklets, to delays in the administration of
the assessment, complaints about poor training of test
administrators, and a decrease in the participation rate
below 100% for the year 2011 as a result of the “Arab
Spring". The Bahraini team's work was disrupted by the general instability in the country, and the implementation of TIMSS 2011 was postponed until November 2012.
II. SYSTEM ALIGNMENT
1. National Learning Opportunities
Opportunities to learn about international assessments
in the respective countries or systems are offered
in one way or another in each of Bahrain, Egypt,
Jordan, KSA, Oman, Palestine, Qatar, Tunisia, UAE,
and Yemen. The opportunities offered are mainly in
the form of workshops or meetings or online courses
on using international assessment databases. Learning
opportunities benefit a large audience: individuals
working directly on the specific international assessment
exercise, university students studying assessment or a
related area, professionals or university staff interested
in the topic of assessment, and groups undertaking
education initiatives targeting quality improvement.
In Bahrain, information is also available in the press,
or in the form of leaflets, publications and posters.
University students studying assessment or a related
topic area also have opportunities to benefit from
these resources. In Yemen, Master’s students from the
faculties of Education in five universities were trained. In
Jordan, the national teams were given the opportunity
to obtain additional and continuous training on the
study implementation techniques. In Palestine, valuable
documents and user guides related to these studies
and their implementation challenges and assessment
framework are available on the website. The same
opportunities are offered in Qatar where educational
materials are available on the website of the Supreme
Education Council to inform about international studies,
their content and their importance.
In cooperation with the UNDP, the organization in
charge of the TIMSS study held a series of training
sessions for Arab countries to develop their technical
capacity in data analysis mechanisms for TIMSS 2003
and 2007. The World Bank, in cooperation with regional
organizations, launched a training initiative in several
Arab countries to activate the use of TIMSS indicators to
inform decision making processes in education.
III. ASSESSMENT QUALITY
1. Presentation in Official International
Report
All countries reported that they have met all technical
standards required to have their data presented in
the main displays of the international report, with the
exception of Lebanon, which judged that it had met only sufficient standards to have its data presented beneath the main display of the international report or in an annex.
2. Contribution to the Global Knowledge
Base
Bahrain, KSA, Mauritania, Oman, Palestine, Qatar,
and Yemen all claim that their country or system
has contributed to the global knowledge base
on international assessments by generating new
knowledge and making it available through publications
or presentations. In Palestine, research papers were
published in refereed reports within research initiatives,
such as the IEA initiative, the TIMSS repertoire, and
the World Bank regional initiative on policy research.
Qatar published its results in the PIRLS 2006 and 2011
encyclopedia, as well as in the TIMSS 2011 encyclopedia.
3. Process of Dissemination of Results
In Bahrain, Egypt, Jordan, KSA, Kuwait, Mauritania, Oman, Palestine, Qatar, Tunisia, and UAE, results from the most recent international assessments were disseminated in the country. In Bahrain, for example, the Kingdom published a report on the results of TIMSS 2007 and the achievement results of TIMSS 2011. In Jordan, it is common to disseminate a national and detailed report which includes the results of the international report and other variables that are important to the Ministry of Education, donors and partners.
The results from the international assessments were disseminated in variable formats. In Bahrain, Jordan, Kuwait, KSA, Mauritania, Palestine, Qatar, Oman, Tunisia, and UAE, the national report, brochures and presentations of the results are all made available online. Copies of the national report are distributed to key stakeholders, and the assessment results are communicated through a press release. In UAE, however, the national report was not made available online, while in Kuwait no national report was distributed to key stakeholders. In Palestine, preparation is underway to develop a "school report" for all schools that participated in the study and to hold meetings with specialized education supervisors for a detailed discussion of the results. Curricula developers were contacted so as to benefit from the results in the development of the sciences and mathematics curricula.
In Mauritania, the results were widely debated in the country and awareness was raised to press for urgent remedial measures. The assessment results are not fed back directly to schools and educators, but they have been systematically communicated to basic education inspectors.
The dissemination of results in Lebanon has been poor. The country's results have only been published in the international report, and the results have not been fed back to the participating schools and educators. In Egypt also, only copies of the international report are distributed to key stakeholders. The results of the international assessment are sometimes fed back to participating schools. A summary of the results is prepared and disseminated to specific directorates.
4. Media Coverage
The media coverage of the international assessment results in Egypt, Jordan, Palestine, Syria, and Tunisia was limited to a few small articles. The latest issue of the official newsletter of the Ministry of Education in Palestine was dedicated to the TIMSS and its results, while the results for Lebanon and Yemen have not been covered in the media at all.
Media coverage of the results of the ILSA in the respective participating countries thus varied in scope. In Bahrain, KSA, Kuwait, Mauritania, Oman, Qatar, and UAE, there were some editorials or columns commenting on the international assessment results. In Oman, the Ministry organized radio and television panels and interviews to shed light on the results of TIMSS and PIRLS 2011, which were published in a national report.
5. ILSA Effects
a. Effect on National Decision Making
The results of the international assessments in
all countries, with the exception of Lebanon,
have been used to inform decision making at the
national level. Furthermore, in six of these countries,
Bahrain, Egypt, KSA, Palestine, Qatar, and Tunisia,
there appears to be a positive impact on student
achievement levels from the use of the results of
the international assessment exercise to improve
the country’s education quality.
The results of the international assessment exercise
were used by policy makers in a wide variety of
formats. In Bahrain, Palestine and UAE, the results
were used to track the impact of reforms on student
achievement levels, and to inform the processes
of curriculum improvement, for teacher training
programs, and for other assessment activities in
the system. All of these apply to Syria as well, except for tracking the impact on students' achievement; in Tunisia the results are used for improving the curriculum and other assessments, and in Kuwait for the curriculum only. Lebanon expressed the hope of making most of these uses in the future. Four countries, Bahrain, Kuwait, Tunisia, and UAE, also reported using the results to inform resource allocation.
Enriching supporting material was developed in Palestine to cover mathematics skills encompassed in TIMSS and not included in national curricula. This material was disseminated to all Palestinian schools.
b. Effect on Student Achievement
Saudi students showed an improvement in their
overall results in 2011 compared to previous years.
The improvement in student performances for
Qatar was visible with each round of results.
In Palestine, an improvement in the education
achievement level in mathematics was noticed in
the national assessment study between 2010 and
2012. The improvement registered in the student
achievements in TIMSS 2011 could perhaps be
an indicator of the impact of activating the 2007
results at the national level. It is worth noting that
Palestinian students made the highest improvement
amongst participating countries (36 points)
between TIMSS 2007 and 2011.
There is evidence in Bahrain of improvement in the results of TIMSS 2007 and TIMSS 2011 as compared to TIMSS 2003. In its first participation at the fourth primary grade, the Kingdom of Bahrain achieved advanced results at the Arab level.
Of the 13 countries which provided information on this effect, six confirmed the existence of such a positive effect, while seven denied any positive impact of the use of ILSA results on student achievement levels.
The Ministry of Education in Tunisia has introduced some updates to the school programs. The hours of English have been increased in elementary school, as have those of mathematics and basic sciences in primary school.
Participation in PASEC left a significant impact on
education in Mauritania and was the reason behind
many changes in the educational system. In Yemen,
the assessment results helped elaborate the Basic
Classes Initiative program to improve the results.
In Egypt, the Center for Developing Curricula and Educational Materials of the Ministry of Education studied the results of TIMSS 2007 and subsequently developed the science and mathematics curricula in light of the standards on which the study was built.
IV. BENCHMARKING FOR ILSA
Overall View
The surveyed countries range widely in their attempts to
undertake International Large-Scale Assessment (ILSA)
activities. Four of the 17 countries (Egypt, Iraq, Libya
and Sudan) still show Latent levels of performance as
they have no history of participation in an ILSA or no
plans to undertake one in the future. The majority of the
countries (9) show Emerging levels, with participation in
an ILSA initiated but still needing to develop institutional
capacity to carry out the ILSA. Bahrain, Oman, and
the UAE were found to have Established levels in this
regard, as they have more or less stable participation
in an ILSA, with institutional capacity to carry out the
assessment. The information from ILSA is disseminated,
but not always used in effective ways. Qatar was the
sole country to show an Advanced level, with stable
participation in an ILSA and institutional capacity to
run it. The information from ILSA is effectively used to
improve education in Qatar.
1. Enabling Context
This driver assesses “the overall framework of policies,
leadership, organizational structures, financial and
human resources in which ILSA take place in a country
or system and the extent to which that framework is
conducive to, or supportive of, ILSA activity”.
Table 4.4: Benchmarking results for international large-scale assessment (by country and status)
Country | Latent | Emerging | Established | Advanced
1. Bahrain | - | - | ✓ | -
2. Egypt | ✓ | - | - | -
3. Iraq | ✓ | - | - | -
4. Jordan | - | ✓ | - | -
5. KSA | - | ✓ | - | -
6. Kuwait | - | ✓ | - | -
7. Lebanon | - | ✓ | - | -
8. Libya | ✓ | - | - | -
9. Mauritania | - | ✓ | - | -
10. Oman | - | - | ✓ | -
11. Palestine | - | ✓ | - | -
12. Qatar | - | - | - | ✓
13. Sudan | ✓ | - | - | -
14. Syria | - | ✓ | - | -
15. Tunisia | - | ✓ | - | -
16. UAE | - | - | ✓ | -
17. Yemen | - | ✓ | - | -
Three indicators are included:
a. Setting clear policies for ILSA
Of the 17 surveyed countries, 14 showed
participation in two or more ILSAs in the last ten
years (Advanced). Iraq, Libya, and Sudan have not
participated in any ILSA in the same period (Latent).
In planning for upcoming ILSAs, 12 countries have taken concrete steps to participate in at least one ILSA in the next five years (Established), while the remaining countries have no plans to do so. Only
six countries have formal policy documents that
address participation in ILSA (Established), four
countries have informal documents (Emerging), and
the rest have no document (Latent). Only Oman,
Palestine and Tunisia have made these documents
available to the public (Established).
b. Having regular funding for ILSA
Funding for ILSA is regular in seven countries
(Established to Advanced levels), and is available
from loans or external donors in seven countries
(Emerging). In only half the countries does the
funding cover research and development activities
(Advanced).
c. Having effective human resources for ILSA
Countries participating in ILSA activities all have
teams in place and national coordinators to
carry out the assessment (Established level). The
coordinators are all fluent in the language of the
assessment (Established). Only in Bahrain, Kuwait, Lebanon, Mauritania, and Qatar, however, is staffing of the ILSA office adequate and the team trained to carry out the assessment effectively with no issues (Advanced). Staffing is adequate but minimal issues arise in the ILSA offices of six countries (Established). In Palestine, Syria, and Tunisia, the ILSA office is inadequately staffed or trained to carry out the assessment effectively (Emerging).
Thus in this driver, with the exception of Iraq, Libya
and Sudan, the Arab countries are on the path to
establishing the overarching policy and resource
framework that provides the enabling context for
ILSA activities to take place in the country.
2. System Alignment
This driver assesses the “degree to which the ILSA is
coherent with other components of the education
system”.
Only one indicator is included here:
a. Providing opportunities to learn about ILSA
For this indicator, the countries were found to be at varying levels. The ILSA teams in seven countries
were able to attend all international workshops or
meetings (Established) while the rest of the teams
were able to attend only some of the learning
opportunities (Emerging). Only Oman offers a wide
range of opportunities for its teachers to learn about
ILSA (Advanced), while the remaining countries either
offer some opportunities or none at all. In 8 countries
opportunities to learn about ILSA are available to a wide
audience, in addition to the country’s team members
(Advanced).
3. Assessment Quality
This driver assesses the “degree to which the ILSA meets
technical quality standards, is fair, and is used in an
effective way”.
Two indicators are included:
a. Ensuring the quality of ILSA
Twelve countries were able to meet all technical
standards required to have their data presented
in the main displays of the international report
(Established), while the rest were able to meet
sufficient standards to have their data presented
beneath the main display of the international
report or in an annex (Emerging). In contributing
new knowledge on ILSA, the countries are split with
seven countries claiming that they have contributed
new knowledge (Advanced) and the rest claiming
otherwise (Latent).
b. Ensuring effective uses of ILSA
In those countries which have undertaken ILSA
activities, country-specific results and information
are regularly and widely disseminated in seven
countries (Advanced), regularly but not widely
disseminated in Jordan and KSA (Established),
irregularly disseminated in Egypt, Tunisia, and
Yemen (Emerging), and not disseminated in either Lebanon or Syria (Latent).
Six countries have products that are systematically
made available to provide feedback to schools
and educators about ILSA results (Advanced), four
countries make such products available frequently
(Established), while the remainder of the countries
do not make such products available at all (Latent).
Media coverage of the ILSA results varies considerably
among the countries, from none in Lebanon and
Yemen (Latent), to wide coverage in Kuwait,
Mauritania and UAE (Advanced), with the rest of
the countries falling in between.
In using results from ILSA to inform decision making, seven countries claim to do so in a variety
of ways (Advanced), four countries do so in some
ways (Established), two countries do so in limited
ways (Emerging), while in Lebanon, the results are
not used to inform decision making (Latent). In only
five countries have decisions based on ILSA results
had a positive impact on students’ achievement
levels (Advanced). In the remaining countries, it is
not clear whether decisions based on ILSA results
have had a similar impact or not (Latent).
In this driver of Assessment Quality for ILSA, the
surveyed countries appear to be fairly established
in attempting to ensure the quality of the ILSA
activity. Dissemination and media coverage of ILSA
results range in scope across the surveyed countries.
The countries also differ in ensuring effective uses
of ILSA results.
GENERAL
CONCLUSION
1. Overall picture ................................................................ 82
2. Enabling context ............................................................. 82
3. System alignment ............................................................ 84
4. Assessment quality .......................................................... 85
Seventeen Arab countries were engaged in this survey, which was conducted to map the national assessment systems in the region. This report presents a general description of various aspects of assessment policy, management and results, based on the information gathered through four questionnaires completed by the participating countries on classroom assessment, national examinations, and national and international large-scale assessments.
Analysis of results has been conducted in two phases, based first on the questionnaire responses and then on the benchmarking exercise. In this conclusion, the main results of the survey are recapitulated by indicator, encompassing the four questionnaires.
1. Overall Picture
The Arab countries could generally be judged as having "emerging" to "established" systems of assessment. They have already founded their systems, but they need more work to improve them. No country achieved the "advanced" level in all forms of assessment. Significantly, national examinations are the one form of assessment which is developed in the majority of the countries (14 out of 17), while for national large-scale assessment the majority (11) lies in "emerging" status and four countries in "latent" status (Table 5.1). In other terms, one can describe the assessment systems in the Arab countries as traditional, since examinations are the focus of national policies, while large-scale assessments have not so far gained an important place in these policies. Classroom assessment policies should be strengthened as well.
Bahrain is the sole country that appears to be placing equal efforts across the four assessment types and has thus reached "established" levels in all four. Qatar is the only country with an "advanced" level on one of the assessment types, namely the ILSA.
Table 5.1: Benchmarking - overall picture
Assessment type | Latent | Emerging | Established | Advanced | Total
CA | 0 | 9 | 8 | 0 | 17
EX | 0 | 3 | 14 | 0 | 17
NLSA | 4 | 11 | 2 | 0 | 17
ILSA | 4 | 9 | 3 | 1 | 17
The main regional trends under the three quality drivers of assessment policies are presented below: (a) enabling context, (b) system alignment and (c) assessment quality.
2. Enabling Context
Policy documents related to several forms of assessment
do exist in almost all countries. Of the seventeen
surveyed countries, 15 reported having an official state-level document that provides guidelines for classroom
assessment. Documents related to examination do
exist in all countries. The case is different for large-scale assessments. In most cases, there is no real formal
policy document related to national or international
assessments. Of the 14 countries that have any form of
NLSA, only four have a formal policy document related
directly to the topic. The other documents are of a
general nature or are informal or in draft form.
Accessibility to policy documents constitutes an issue.
The CA documents are available to the public in one
form or another in all concerned countries, while the
documents related to large-scale assessments are not
always made available to the public. Most countries refer
to paper documents, and less to electronic documents
available online. Written plans specifying who will be tested and in which subject areas in the NLSA are usually not available to, or easily accessible by, the public.
In terms of resources available to teachers on a system-wide basis for their CA activities, the majority of
countries surveyed provide teachers with textbooks and
workbooks that provide support for CA, and also provide
a document that outlines what students are expected to
learn in different subject areas at different grade levels.
A considerable number of countries provide teachers
with scoring criteria or rubrics for students’ work. Few
countries use item banks or online assessment resources
and none provide computer-based testing.
The countries adopt different system-level mechanisms
to ensure that teachers develop their skills and expertise
in classroom assessment. Nine countries reported
having an official curriculum or standards document
that specifies what students are expected to learn and
to what level of performance, at a minimum in language
and mathematics, at different age and grade levels.
All countries reported having mechanisms to ensure
that teachers develop skills and expertise in classroom
assessment, some at the system-level and some informal
or in the form of ad-hoc initiatives and activities.
Regarding human resources available in the respective
countries, most seem to have an adequate number
of staff for administering examinations, but are
understaffed to undertake large-scale assessments.
Eleven countries reported that there is an adequate
number of permanent or full-time staff in the agencies
or institutions responsible for examinations in their
respective countries. In six countries, however, the
number of permanent staff in the agencies is not
sufficient to meet the needs of the examination. Different
learning opportunities in educational assessment
and measurement are provided on an annual basis
in the surveyed countries to prepare for work on the
examination. A considerable number of countries also
provide training courses or workshops on educational
measurement and evaluation.
To undertake national large-scale assessments, eight countries stated that no issues were identified with the performance of the human resources responsible for the large-scale assessment, while five countries have permanent or full-time staff who are insufficient in number to meet the needs of the assessment. Opportunities for professional development for NLSA are available in all countries, in at least one form or more, ranging from university graduate programs, to university and non-university courses or workshops, funding for attending international courses or workshops, and internships available at assessment offices.
All countries that have been participating in international
assessments have assigned a team responsible for
carrying out the international assessment in the country,
led by a national coordinator and the teams have previous
experience working on international assessments. Not
all the teams are sufficiently staffed, however, or have
the necessary training or experience to carry out the
required assessment activities effectively. Moreover, not
all participating countries were able to have their teams
attend all international meetings related to international
assessment. In carrying out the international assessment
in their countries, some teams were faced with difficulties,
whether related to printing, translation, scoring of the
test booklets, delays in administration, or poor training
of the test administrators.
Opposition to the examination program is hardly visible.
In some cases, some degree of opposition was exerted
by educators, students, parents, the media, think tanks,
or universities. Palestine is the country showing the most opposition to the examination program from several stakeholders. The typical situation in most countries is that policymakers show support for the program.
All 14 countries engaged in NLSA state that stakeholders
are generally quite supportive of the program and
have attempted to reform it, while teacher unions, students,
parents and employers are more neutral in attitude.
Regarding available funding, governments have allocated
regular funding for the administration of examinations
only, while funding for large-scale assessments is not
always allocated on a regular basis. All the countries use
the funds to cover activities related to the design and
administration of the examination. Most countries also
use the funding to cover data analysis activities and data
reporting activities.
Funding of NLSAs is typically allocated by the
government, whether on a regular or irregular basis.
A large number of countries had variable sources of
funding allocated for participation in international
assessments. This funding was used to cover a range
of activities. Only in Bahrain, Egypt, Saudi Arabia,
Palestine, Qatar, and Tunisia does the funding cover
research and development activities.
Almost all countries reported that an office or a branch
within the Ministry of Education holds the primary
responsibility for running the examination in the country.
All the surveyed countries reported that the examination
results are officially recognized by certification and
selection systems in the country and by more than one
certification and selection system abroad.
The groups carrying out the NLSAs in the respective
countries are usually accountable to a clearly recognized
body. In 13 out of the 14 countries where large-scale
national assessments take place, the organization
that is in charge of the assessment is described as a
“permanent agency or institution or unit created for
running the assessment”. Political considerations never
hamper technical considerations in 10 of the countries.
Looking at enabling context from a benchmarking point of view, it is worth noting the following:
"Established" is the benchmark given to almost half of the cases under study. The remaining cases were unequally divided into "latent-emerging" (more than one quarter) and "advanced" (less than one quarter). Enabling context seems in need of special efforts at the policy level, in order to move all countries at least to the level of "established".
The critical problems of enabling context are found in Iraq and in the countries which did not participate in NLSA and/or ILSA, such as Kuwait, Libya, Sudan and Yemen. In fact, across the countries, latent and emerging statuses are found mainly in these two fields: national and international large-scale assessments. Classroom assessment suffers the least from enabling context shortcomings, as do examinations.
Enabling context has five indicators; surprisingly, "having regular funding" collected the highest percentage of cases judged "latent-emerging", followed by "setting clear guidelines", while the most advantaged aspect is "having effective human resources", followed by "having strong leadership". Table 5.2 shows the order of these indicators, by increasing percentage of cases classified latent-emerging (and decreasing percentage of established-advanced).
3. System Alignment
All of the surveyed countries reported that the examination in their respective countries measures performance against the national school curriculum guidelines or standards. In
general, what is measured by the examination is largely
accepted by the stakeholders in most countries, and the
materials needed to prepare for the examination are
widely accessible by over 90% of students in a variety of
learning contexts, such as in public schools or online. All
countries reported that they offer sample examination
questions. A large number of countries make available
information on how to prepare for the examinations.
In all 14 countries under consideration, the NLSA measures performance against national/system or state-level curriculum guidelines or learning standards. In three countries the assessment also measures performance against internationally recognized curriculum guidelines or learning standards. Countries are divided regarding stakeholders' positions towards what the NLSA measures. On
the other hand, the majority of countries confirm that
mechanisms are in place to ensure that the large-scale
assessment accurately measures what it is supposed to
measure, with regular internal reviews of the alignment
between the assessment instrument and its intended
aims being the most common measure.
Regarding the alignment with teacher learning
opportunities, eight countries indicate that there are
regularly updated compulsory courses or workshops for
teachers on examinations. Teachers perform a number
of tasks related to examinations. The main examination-related tasks that teachers perform in almost all
countries are supervising examination procedures and
administering the examination.
Table 5.2: Enabling context, sorting indicators based on benchmarking results
Indicator | Latent-Emerging (%) | Established-Advanced (%)
Having effective human resources | 11 | 89
Having strong leadership, public engagement | 14 | 86
Having strong organizational structures | 21 | 79
Setting clear guidelines | 32 | 68
Having regular funding | 43 | 57
Most countries also offer teacher training courses, workshops, or presentations on the large-scale national assessment; however, these are offered only occasionally.
Opportunities to learn about international assessments in the respective countries or systems are offered in one way or another in 10 countries. These are in the form of training workshops or university or online courses on the topic of international assessments and their use. The learning opportunities benefit those individuals working directly on the specific international assessment exercise as well as university staff or students interested in the subject.
Classroom assessment activities in all of the surveyed countries focus on knowledge and skills in core curriculum areas and are mainly concerned with recalling information. Only seven countries assess non-cognitive skills such as teamwork and self-discipline. Some countries are still challenging traditional views of assessment. Ten countries reported that classroom assessment activities provide little feedback to students. Aligning classroom assessment activities with pedagogical or curricular frameworks is common in most countries surveyed.
In terms of benchmarking, the following results are noticed:
"Established" status accounts for half of the judgments given to the cases under study. The two other quarters are divided equally between "latent-emerging" and "advanced". In other terms, in a quarter of the cases, across the countries and the assessment types, there is a need for improvement.
Efforts should be exerted in some countries, since problems of system alignment are found more frequently in Iraq, Egypt, Oman, Palestine, Sudan, and Syria. Across the countries, judgments about system alignment are rated lower as one goes from classroom assessment to examinations to national and international large-scale assessments.
As for the two indicators of system alignment, results show that the situation is critical regarding "providing opportunities to learn about", where a little more than half the cases are at "latent-emerging" levels, as shown in Table 5.3.
Table 5.3: System alignment, sorting indicators based on benchmarking results
Indicator | Latent-Emerging (%) | Established-Advanced (%)
Alignment with system learning goals | 14.2 | 85.8
Providing (teachers with) opportunities to learn about | 47.3 | 52.7
4. Assessment Quality
With the exception of Iraq and Libya, all surveyed countries carry out classroom assessment activities in order to inform their own teaching and their students' learning. A large number of countries also conduct classroom assessment to meet system or school-level requirements or information needs.
Most countries have system-level mechanisms in place
to monitor the quality of classroom assessment activities.
In all the surveyed countries, classroom assessment is
a required component of school inspection or teacher
supervision. It is also a required component of a teacher’s
performance evaluation in all countries except Iraq and
Lebanon. Nine countries have national or other system-wide reviews of the quality of education which include
a focus on classroom assessment.
Government funding for research on the quality of
classroom assessment activities and how to improve
classroom assessment is only available in the UAE,
Tunisia and Kuwait. Qatar is the only country that
reported having an external moderation system that
reviews the difficulty of classroom assessment activities,
the appropriateness of scoring criteria, etc.
In all countries, results of classroom assessment for
individual students are recorded in the teacher’s record
book, and in most cases, in the students’ own copybooks
too. With the exception of Iraq, all countries also have a
classroom or a school database where student results are
recorded. Nine countries have district-wide databases
or information systems to record student results, and
eight have system-wide student record databases or
information systems.
There are different required uses of classroom assessment activities to promote and inform students' learning. In all countries, except for Libya, it is required to use assessment to provide feedback to students on their learning. The majority use classroom assessment as a diagnostic tool for student learning issues, to inform parents about their children's learning, and for planning purposes.
For most countries, the standardized examinations at the secondary level have a double function: (1) student certification for grade or school cycle completion, and (2) student selection to university or another higher-education institution. Monitoring the level of education quality and planning education policy reforms are functions of the exams in other countries.
Only four countries have a comprehensive, high-quality technical report supporting the examination that is available to the public. Countries adopt different systematic mechanisms to ensure the quality of their examinations. All the surveyed countries reported that they have internal reviewers or observers. Some countries, such as Egypt, KSA, Lebanon, and Libya, also use external reviewers or observers.
There are a number of inappropriate behaviors that may occur during the examination process and consequently diminish the credibility of the examination, but each of the countries has mechanisms in place to attempt to address these inappropriate behaviors. In all countries, except Yemen, the results are perceived as credible by all stakeholder groups. All the surveyed countries reported that all students may take the examination regardless of their background (gender, ethnic group, etc.), location (urban, rural, etc.), ability to pay (transportation, fees, etc.) and similar factors. Very few countries reported an improper use of the examination results by stakeholder groups. Some students who sit for the examination may not perform well; all countries offer those students the option of retaking the exam.
Fourteen countries use certain forms of national large-scale assessment, with eight countries conducting the assessment on a regular annual or biennial (every two years) basis. All countries include among the purposes of their assessment the monitoring of education quality at the system level and policy design, evaluation, or decision making.
Five countries have done nothing to ensure a wide social coverage of their national large-scale assessment. All countries which have any form of NLSA use at least one mechanism to ensure the quality of the assessment instruments. The countries seem to be less engaged in technical documentation of NLSA. A high-quality technical report is available to the general public in only four countries. Countries were engaged differently in dissemination. The results are not reported or disseminated in two countries (Syria and Tunisia). The typical common action is to hold workshops and to make presentations to the stakeholders. Four countries reported that there are no mechanisms in place to monitor the consequences of the NLSA.
Fourteen countries reported having participated in
international large-scale assessments in the past two
decades. The majority of the assessments are related to
the TIMSS, most markedly the TIMSS 2007 and 2011
assessments. Of these 14 countries, only Mauritania has
not yet participated in a TIMSS assessment, but plans
to do so in 2015, along with 12 other countries which
have all already taken concrete steps for the purpose.
In their participation in international large-scale
assessments, 13 countries had met all technical standards
required to have their data presented in the main
displays of the international report. Only six countries
claim having contributed to the global knowledge
base on international assessments by generating new
knowledge and making it available through publications
or presentations. The results from the international
assessments were disseminated in variable formats in
11 countries, with varying scopes of coverage by the
media. The results of the international assessments in
13 countries have been used to inform decision making
at the national level, whether for tracking the impact of
reforms on student achievement levels, or for informing
curriculum improvement, teacher training programs,
resource allocation, or other assessment activities in
the system. In only six countries does there appear to be a positive impact on student achievement levels from the use of the results of the international assessment exercise by policy makers or education leaders to improve education quality in the country or system.
In terms of benchmarking, the following points are worth noting:
There are fewer “established” positions in assessment
quality (42%) compared to enabling context (53%)
and system alignment (49%), with more cases judged
“latent-emerging” (31%) as compared to the two latter
cases (27% and 28%). At the same time, Table 5.4
shows that more advanced positions could be found
here.
This means that, at one end, there are more problems
in this aspect of assessment policies, while at the other
end, some cases show advanced status levels.
Table 5.4: Enabling context, system alignment, and assessment quality comparison
Status | Enabling context (%) | System alignment (%) | Assessment quality (%)
1. Latent | 11.6 | 13.2 | 17.3
2. Emerging | 15.4 | 15.0 | 13.4
3. Established | 53.4 | 48.6 | 42.4
4. Advanced | 19.6 | 23.2 | 26.8
Total | 100.0 | 100.0 | 100.0
Table 5.5: Assessment quality, sorting indicators based on benchmarking results
Criterion | Latent-Emerging (%) | Established-Advanced (%)
Using examination information in a fair way | 23.5 | 76.5
Ensuring the quality of | 29.3 | 70.7
Ensuring positive consequences of the examination | 32.4 | 67.6
Ensuring effective uses of, Ensuring fairness of | 32.7 | 67.3
The disparity in assessment quality is observed among
the countries. In Egypt, Iraq, Lebanon, Libya, Mauritania,
Sudan, and Yemen, there are more latent cases than
in the other countries, while there are more advanced
cases in Bahrain, Kuwait, KSA, Palestine, Qatar, and
UAE.
In terms of assessment type, the quality shortcomings increase from Classroom Assessment (27%), through Examinations, to NLSA and ILSA (44%).
This disparity is not due to differences in indicator
positions; the situation is almost similar in the four
indicators used in the quality driver, as shown in Table
5.5.
ANNEXES
ANNEX I: List of National Researchers ............................. 90
ANNEX II: List of National Validation Workshops .............. 92
ANNEX III: SABER-SA Questionnaires ............................... 93
• Classroom Assessment ............................... 93
• Examinations .............................................. 99
• National Large-Scale Assessment ................ 114
• International Large-Scale Assessment .......... 128
ANNEX I: List of National Researchers
Country | Name | Title/Organization | E-mail
1. Bahrain | Maher Younes Aldarabi | Measurement and Assessment Consultant, Ministry of Education | [email protected]; [email protected]
2. Egypt | Hasib Mohamed Hasib Abdrabou | Assistant Professor, Educational Evaluation Department, National Center for Examinations and Educational Evaluation | [email protected]
3. Iraq | Hala Ibrahim Majid | Head of Assessment and Evaluation, Quality Assurance Department, Ministry of Education | [email protected]
4. Jordan | Sheren Hamed | Researcher, National Center for Human Resources Development (NCHRD) | [email protected]
5. KSA | Saleh ben AbdelAziz Zahrani | General Evaluation Supervisor, Directorate General of Evaluation, Ministry of Education | [email protected]
6. Kuwait | Sarah I. Portman | Consultant, National Center for Education Development (NCED) | [email protected]
7. Lebanon | Charlotte Hanna | Head of Planning Unit, Center for Educational Research and Development | [email protected]
8. Libya | Anies Hroub | University Professor in Education, American University of Beirut | [email protected]
9. Mauritania | Jean Pierre Jarousse | University Professor, Education Consultant | [email protected]
10. Oman | Mohamed Bin Rashid Bin Said Al Hadidi | Assistant Director General for Educational Assessment, Ministry of Education | [email protected]
11. Palestine | Mohamed Matar | Director of Monitoring and Evaluation, National Coordinator for TIMSS, Ministry of Education | [email protected]; [email protected]
12. Qatar | Maha Ali Mohamed Saadi | Evaluation Expert, Supreme Education Council - Evaluation Institute (at the time of data collection) | [email protected]
13. Sudan | Fayza Sayed Khalafallah | Manager, Questions Repository; Project Coordinator for Designing the National Learning Assessment System; Directorate of Examinations, Ministry of Education | [email protected]
14. Syria | Almouthana Khodour | Director of Curricula and Supervision, Ministry of Education | [email protected]
15. Tunisia | Al Hedi Al Saidi | Director General, General Directorate of Studies, Planning and Information Systems, Ministry of Education | [email protected]
16. UAE | Awatif Hammoud Bu Afra | Evaluation Specialist, Directorate of Evaluation and Examinations, Ministry of Education | [email protected]
17. Yemen | Nour Eddin Akil Othman | Training and Formation Consultant, Sector of Training and Formation, Ministry of Education (at the time of data collection) | [email protected]
ANNEX II: List of National Validation Workshops
Country | Date
1. Bahrain | 1 April 2014
2. Egypt | 17 March 2014
3. Iraq | 6 May 2014
4. Jordan | 28 August 2014
5. KSA | 25 June 2014
6. Kuwait | -
7. Lebanon | 12 February 2014
8. Libya | -
9. Mauritania | 3 December 2013
10. Oman | 20 February 2014
11. Palestine | 9 July 2014
12. Qatar | -
13. Tunisia | 21 March 2014
14. Sudan | 31 October 2013
15. Syria | -
16. UAE | 13 February 2014
17. Yemen | 19 March 2014
ANNEX III: SABER-SA Questionnaires
2011 QUESTIONNAIRE
Survey of Student Assessment Systems
Classroom Assessment
Name of Country or Education System
Date of data collection
Systems Approach for Better Education Results
The World Bank
Human Development Network
1. Is there a system-level document that provides guidelines for classroom assessment
(e.g., content, format, expectations, scoring criteria, uses)?
a. ( ) Yes, there is a formal document
b. ( ) Yes, there is an informal or draft document
c. ( ) No → Go to question 5
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
2. Please supply the following information on the document that provides guidelines for classroom assessment.
Official document citation: ...............................................................................................................................
Authorizing body: ...........................................................................................................................................
Year of authorization: ...................................................
Please provide the link or attach a copy of the document with your submission of the completed
questionnaire.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
3. Is the document identified in question 2 available to the public?
a. ( ) Yes
b. ( ) No
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
4. Where, specifically, is the document identified in question 2 available? Check all that apply.
a. ( ) Online
b. ( ) Public library
c. ( ) Teacher training colleges
d. ( ) In-service courses for teachers
e. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
5. Which of the following resources are typically available (i.e., on a system-wide basis) to teachers for
their classroom assessment activities? Check all that apply.
a. ( ) A document that outlines what students are expected to learn in different subject areas at different grade/
age levels
b. ( ) A document that outlines the level(s) of performance that students are expected to reach in different
subject areas at different grade/age levels
c. ( ) Textbooks or workbooks that provide support for classroom assessment
d. ( ) Scoring criteria or rubrics for students’ work
e. ( ) Item banks or pools with examples of selection/multiple-choice or supply/open-ended questions
f. ( ) Online assessment resources
g. ( ) Computer-based testing with instant reports on students’ performance
h. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
6. Is there an official curriculum or standards document that specifies what students are expected to
learn and to what level of performance, at a minimum in language and mathematics, at different
grade/age levels?
a. ( ) Yes, the document outlines what students at different grade/age levels are expected to learn and to what
performance level
b. ( ) Yes, the document outlines what student at different grade/age levels are expected to learn, but does not
specify to what performance level
c. ( ) No, there is no official document that specifies what students at different grade/age levels are expected
to learn and to what performance level
d. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
7. Do mechanisms exist to ensure that teachers develop skills and expertise in classroom assessment?
Check all that apply.
a. ( ) Yes, mechanisms exist at the system level
b. ( ) Yes, there are informal or ad-hoc initiatives/activities
c. ( ) No → Go to question 9
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
8. What system-level mechanisms exist to ensure that teachers develop skills and expertise in classroom
assessment? Check all that apply.
a. ( ) Pre-service teacher training
b. ( ) In-service teacher training
c. ( ) All teacher training programs include a required course on classroom assessment
d. ( ) On-line resources on classroom assessment
e. ( ) Opportunities to participate in conferences and workshops
f. ( ) Opportunities to participate in item development for, or scoring of, large-scale assessments or exams
g. ( ) School inspection or teacher supervision includes component focused on classroom assessment
h. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
9. What are the main reasons that teachers typically carry out classroom assessment activities?
Check all that apply.
a. ( ) To meet external (system-level) requirements or information needs
b. ( ) To inform their own teaching and their students’ learning
c. ( ) To meet school-level requirements or information needs
d. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
10. What type of knowledge and skills typically is the focus of classroom assessment activities?
Check all that apply.
a. ( ) Knowledge and skills in core curriculum areas such as mathematics, language arts (reading, writing)
b. ( ) Knowledge and skills in non-core curriculum areas such as civics, home economics
c. ( ) Non-cognitive skills such as team work, persistence, self discipline
d. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
11. According to key documents or informants, to what extent do classroom assessment activities have
the following characteristics? For each characteristic, mark Very Common (VC), Common (C), Not
Common (NC), Rarely (R), or Unable to Tell (UT).
a. ( ) Rely mainly on multiple-choice, selection-type questions .............................................. [VC-C-NC-R-UT]
b. ( ) Are mainly about recalling information ....................................................................... [VC-C-NC-R-UT]
c. ( ) Teachers do not use explicit or a priori criteria for scoring or grading students’ work ....... [VC-C-NC-R-UT]
d. ( ) It is common to observe errors in the scoring or grading of students’ work .................. [VC-C-NC-R-UT]
e. ( ) Uneven application of standards for grading students’ work is a serious problem ......... [VC-C-NC-R-UT]
f. ( ) Grade inflation is a serious problem ............................................................................ [VC-C-NC-R-UT]
g. ( ) Parents are poorly informed about students’ grades .................................................... [VC-C-NC-R-UT]
h. ( ) Provide little useful feedback to students .................................................................... [VC-C-NC-R-UT]
i. ( ) Mainly used as administrative or control tool rather than as pedagogical resource ......... [VC-C-NC-R-UT]
j. ( ) Not aligned with pedagogical or curricular framework ................................................ [VC-C-NC-R-UT]
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
12. What system-level mechanisms are in place to monitor the quality of classroom assessment activities?
Check all that apply.
a. ( ) Classroom assessment is a required component of a teacher’s performance evaluation
b. ( ) Classroom assessment is a required component of school inspection or teacher supervision
c. ( ) There is an external moderation system that reviews the difficulty of classroom assessment activities,
appropriateness of scoring criteria, etc.
d. ( ) National or other system-wide reviews of the quality of education include a focus on classroom assessment
e. ( ) Government funding is available for research on the quality of classroom assessment activities and how
to improve classroom assessment
f. ( ) Other, please specify: ..........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
13. Where are classroom assessment results for individual students typically recorded? Check all that apply.
a. ( ) Student’s own copy book
b. ( ) Teacher’s record book
c. ( ) Classroom or school database
d. ( ) District-wide database or information system
e. ( ) System-wide database or information system
f. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
14. To whom are schools or teachers required to report on individual student’s performance?
Check all that apply.
a. ( ) School district/Ministry of Education officials
b. ( ) Parents
c. ( ) Students
d. ( ) Other, please specify: ...........................................................................................................................
e. ( ) No one
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
15. According to key documents or informants, what are the required uses of classroom assessment
activities to promote and inform student learning? Check all that apply.
a. ( ) Diagnosing student learning issues
b. ( ) Providing feedback to students on their learning
c. ( ) Informing parents about their child’s learning
d. ( ) Planning next steps in instruction
e. ( ) Grading students for internal classroom uses
f. ( ) Providing input to an external examination program (e.g., school-based assessment with moderation and quality audit)
g. ( ) Other, please specify: ..........................................................................................................................
h. ( ) None
Comments:
THANK YOU FOR YOUR RESPONSES
Please ensure that you have answered all questions and that your responses are consistent
and accompanied by supporting evidence.
ANNEX III: SABER-SA Questionnaires
2011 QUESTIONNAIRE
Survey of Student Assessment Systems
Examinations
Name of Country of Education System
Date of data collection
Systems Approach for Better Education Results
The World Bank
Human Development Network
1. Please provide information on up to three major standardized examinations that the country has in
place. Please make sure to include examinations that determine high school graduation or university
entrance. Complete one table for each examination.
Examination 1A:
I. Name of the standardized
examination
a. ( ) Name: .................................................................
b. ( ) Check here if there is no standardized examination
II. Main purpose(s) of the
examination (check all that
apply)
a. ( ) Student certification for grade or school cycle completion
b. ( ) Student selection to secondary school
c. ( ) Student selection or promotion for grades/courses/tracks
in secondary school
d. ( ) Student selection to university or other higher-education institution
e. ( ) Monitoring education quality levels
f. ( ) Planning education policy reforms
g. ( ) Designing individualized instructional plan
h. ( ) School or educator accountability
i. ( ) Promoting competition among schools
j. ( ) Other, please specify: ..........................................................................
III. First year the examination
was administered
a. ( ) More than ten years ago
b. ( ) Five to ten years ago
c. ( ) Less than five years ago
IV. Subject(s) or area(s)
covered by the examination
V. Grade level(s) at
which students take the
examination
a. ( ) Grade 1
b. ( ) Grade 2
c. ( ) Grade 3
d. ( ) Grade 4
e. ( ) Grade 5
f. ( ) Grade 6
g. ( ) Grade 7
h. ( ) Grade 8
i. ( ) Grade 9
j. ( ) Grade 10
k. ( ) Grade 11
l. ( ) Grade 12
m. ( ) Grade 13
VI. Most common modal age(s) at which students take the examination
a. ( ) 10 years old
b. ( ) 11 years old
c. ( ) 12 years old
d. ( ) 13 years old
e. ( ) 14 years old
f. ( ) 15 years old
g. ( ) 16 years old
h. ( ) 17 years old
i. ( ) 18 years old
j. ( ) 19 years old
k. ( ) 20 years old
VII. Format of the examination (check all that apply)
a. ( ) Paper and pencil
b. ( ) Oral
c. ( ) Portfolio
d. ( ) Performance assessment
e. ( ) Computer-based
f. ( ) Computer-adaptive test
g. ( ) Other, please specify: ..........................................................................
VIII. Format(s) of the examination questions (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essays
d. ( ) Oral or performance assessment
e. ( ) Other, please specify: ..........................................................................
IX. Additional comments
Examination 1B:
I. Name of the examination
II. Main purpose(s) of the
examination (check all that
apply)
a. ( ) Student certification for grade or school cycle completion
b. ( ) Student selection to secondary school
c. ( ) Student selection or promotion for grades/courses/tracks in secondary
school
d. ( ) Student selection to university or other higher-education institution
e. ( ) Monitoring education quality levels
f. ( ) Planning education policy reforms
g. ( ) Designing individualized instructional plan
h. ( ) School or educator accountability
i. ( ) Promoting competition among schools
j. ( ) Other, please specify: .........................................................................
III. First year the examination
was administered
IV. Subject(s) or area(s)
covered by the examination
V. Grade level(s) at
which students take the
examination
a. ( ) Grade 1
b. ( ) Grade 2
c. ( ) Grade 3
d. ( ) Grade 4
e. ( ) Grade 5
f. ( ) Grade 6
g. ( ) Grade 7
h. ( ) Grade 8
i. ( ) Grade 9
j. ( ) Grade 10
k. ( ) Grade 11
l. ( ) Grade 12
m. ( ) Grade 13
VI. Most common modal age(s) at which students take the examination
a. ( ) 10 years old
b. ( ) 11 years old
c. ( ) 12 years old
d. ( ) 13 years old
e. ( ) 14 years old
f. ( ) 15 years old
g. ( ) 16 years old
h. ( ) 17 years old
i. ( ) 18 years old
j. ( ) 19 years old
k. ( ) 20 years old
VII. Format of the examination (check all that apply)
a. ( ) Paper and pencil
b. ( ) Oral
c. ( ) Portfolio
d. ( ) Performance assessment
e. ( ) Computer-based
f. ( ) Computer-adaptive test
g. ( ) Other, please specify: ..........................................................................
VIII. Format(s) of the examination questions (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essays
d. ( ) Oral or performance assessment
e. ( ) Other, please specify: ..........................................................................
IX. Additional comments
Examination 1C:
I. Name of the examination
II. Main purpose(s) of the examination (check all that apply)
a. ( ) Student certification for grade or school cycle completion
b. ( ) Student selection to secondary school
c. ( ) Student selection or promotion for grades/courses/tracks in secondary school
d. ( ) Student selection to university or other higher-education institution
e. ( ) Monitoring education quality levels
f. ( ) Planning education policy reforms
g. ( ) Designing individualized instructional plan
h. ( ) School or educator accountability
i. ( ) Promoting competition among schools
j. ( ) Other, please specify: .........................................................................
III. First year the examination was administered
IV. Subject(s) or area(s) covered by the examination
V. Grade level(s) at
which students take the
examination
a. ( ) Grade 1
b. ( ) Grade 2
c. ( ) Grade 3
d. ( ) Grade 4
e. ( ) Grade 5
f. ( ) Grade 6
g. ( ) Grade 7
h. ( ) Grade 8
i. ( ) Grade 9
j. ( ) Grade 10
k. ( ) Grade 11
l. ( ) Grade 12
m. ( ) Grade 13
VI. Most common modal age(s) at which students take the examination
a. ( ) 10 years old
b. ( ) 11 years old
c. ( ) 12 years old
d. ( ) 13 years old
e. ( ) 14 years old
f. ( ) 15 years old
g. ( ) 16 years old
h. ( ) 17 years old
i. ( ) 18 years old
j. ( ) 19 years old
k. ( ) 20 years old
VII. Format of the examination (check all that apply)
a. ( ) Paper and pencil
b. ( ) Oral
c. ( ) Portfolio
d. ( ) Performance assessment
e. ( ) Computer-based
f. ( ) Computer-adaptive test
g. ( ) Other, please specify: .........................................................................
VIII. Format(s) of the examination questions (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essays
d. ( ) Oral or performance assessment
e. ( ) Other, please specify: .........................................................................
IX. Additional comments
2. Please indicate the table that you completed for the main university entrance examination (or if you
did not complete a table for a main university entrance examination, please indicate the table with
the major examination for graduation from high school or secondary school) for which you will be
answering the remaining questions.
a. ( ) Table 1A
b. ( ) Table 1B
c. ( ) Table 1C
Please answer all remaining questions with respect to this examination.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
3. Is there a policy document that authorizes the examination?
a. ( ) Yes, there is a formal policy document
b. ( ) Yes, there is an informal or draft policy document
c. ( ) No –> Go to question 7
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
4. Please provide the following information on the policy document that authorizes the examination:
Official document citation: ...............................................................................................................................
Authorizing body: ...........................................................................................................................................
Year of authorization: ......................................................................................................................................
Comments: Please provide the link to the policy or attach a copy of the policy with your submission of the
completed questionnaire.
.......................................................................................................................................................................
.......................................................................................................................................................................
5. Is the policy document identified in question 4 available to and easily accessible by the public?
a. ( ) Yes
b. ( ) No
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
6. What does the content of the policy document authorizing the examination include?
Check all that apply.
a. ( ) It outlines governance, distribution of power, responsibilities among key entities
b. ( ) It describes the purpose of the examination
c. ( ) It describes authorized uses of results
d. ( ) It states funding sources
e. ( ) It outlines procedures to investigate and address security breaches, cheating, or other forms of inappropriate behavior
f. ( ) It outlines procedures for special/disadvantaged students
g. ( ) It specifies who can sit for the examination
h. ( ) It identifies rules about preparation
i. ( ) It explains alignment with curricula and standards
j. ( ) It explains the format of the examination questions
k. ( ) Other, please specify: .............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
7. Where does key leadership to guide the development of the examination questions come from?
Check all that apply.
a. ( ) A person or team in the examination office
b. ( ) A person or team from the group in charge of national large-scale assessment
c. ( ) A person or team in a university
d. ( ) A person or team from the government. Please specify: ...........................................................................
e. ( ) A non-government person or team. Please specify: ..................................................................................
f. ( ) Other, please specify: ............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
8. Based on publicly available evidence, how much do the following stakeholder groups support or
oppose the examination program? For each stakeholder, mark Strongly Support, Support, Neutral,
Oppose, Strongly Oppose, or Unable to Tell.
a. ( ) Policymakers .................................................................................. [SS – S – N – O – SO-Unable to Tell]
b. ( ) Teacher Unions .............................................................................. [SS – S – N – O – SO-Unable to Tell]
c. ( ) Educators ....................................................................................... [SS – S – N – O – SO-Unable to Tell]
d. ( ) Students ........................................................................................ [SS – S – N – O – SO-Unable to Tell]
e. ( ) Parents .......................................................................................... [SS – S – N – O – SO-Unable to Tell]
f. ( ) Media ............................................................................................ [SS – S – N – O – SO-Unable to Tell]
g. ( ) Think-tanks, NGOs or equivalent .................................................... [SS – S – N – O – SO-Unable to Tell]
h. ( ) Universities .................................................................................... [SS – S – N – O – SO-Unable to Tell]
i. ( ) Employers ...................................................................................... [SS – S – N – O – SO-Unable to Tell]
j. ( ) Other, please specify:
Comments: (please specify if the actions of stakeholder subgroups differ):
.......................................................................................................................................................................
.......................................................................................................................................................................
9. Have there been attempts to improve the examination by any of the stakeholder groups listed in
question 8?
a. ( ) Yes, coordinated efforts have been made by stakeholder groups
b. ( ) Yes, independent efforts by different stakeholder groups have been made
c. ( ) No
d. ( ) Other, please specify: .............................................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
10. Are efforts to improve the examination generally welcomed by the leadership in charge of the examination?
a. ( ) Yes
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
11. Is there funding allocated for the examination?
a. ( ) Yes, there is regular funding allocated by the government
b. ( ) Yes, there is regular funding allocated by non-government sources
c. ( ) Yes, there is irregular funding allocated by the government
d. ( ) Yes, there is irregular funding allocated by non-government sources
e. ( ) Yes, there is funding from student fees
f. ( ) Other, please specify: ...........................................................................................................................
g. ( ) No –> Go to question 13
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
12. What activities are covered by the funding allocated for the examination (include both in-house and
outsourced activities)? Check all that apply.
a. ( ) Examination design
b. ( ) Examination administration
c. ( ) Data analysis
d. ( ) Data reporting
e. ( ) Long- or medium-term planning of program milestones
f. ( ) Research and development
g. ( ) Staff training
h. ( ) Activities not related to examination
i. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
13. What type of agency or institution or unit has primary responsibility for running the examination?
Check all that apply.
a. ( ) Office or branch within the Ministry of Education
b. ( ) Semi-autonomous institute or examination council or agency, or quasi-government agency
c. ( ) University or university consortium or council
d. ( ) Private board
e. ( ) International consortium or board
f. ( ) Other, please specify: ............................................................................................................................
Comments: Please specify the name of the agency or institution or unit.
......................................................................................................................................................................
......................................................................................................................................................................
14. Since what year has the agency or institution or unit identified in question 13 had primary responsibility
for running the examination?
In charge since (year): .....................................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
15. Is the agency or institution or unit identified in question 13 accountable to an external body?
a. ( ) Yes
Please specify the name of the external body to which the agency or institution or unit identified in question 13 is
accountable: .................................................................................................................................................
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
16. Are the examination results officially recognized by broader certification or selection systems?
Check all that apply.
a. ( ) No
b. ( ) Yes, the examination results are officially recognized by certification and selection systems in the country
c. ( ) Yes, the examination results are officially recognized by only one certification and selection system abroad
d. ( ) Yes, the examination results are officially recognized by more than one certification and selection system
abroad
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
17. To what extent does the agency or institution or unit identified in question 13 have the following?
For each statement, indicate Strongly Agree, Agree, Disagree, Strongly Disagree, or Unable to Tell.
a. ( ) Computers for all technical staff ............................................................ [SA – A – D – SD-Unable to Tell]
b. ( ) Secure building .................................................................................... [SA – A – D – SD-Unable to Tell]
c. ( ) Secure storage facilities ........................................................................ [SA – A – D – SD-Unable to Tell]
d. ( ) Access to adequate computer servers .................................................... [SA – A – D – SD-Unable to Tell]
e. ( ) Ability to backup data .......................................................................... [SA – A – D – SD-Unable to Tell]
f. ( ) Adequate communication tools [phone, email, internet] ........................ [SA – A – D – SD-Unable to Tell]
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
18. Which best describes the availability of human resources for running the examination (including in-house or outsourced)?
a. ( ) There is an adequate number of permanent or full-time staff
b. ( ) There is permanent or full-time staff, but it is insufficient to meet needs of the examination
c. ( ) There is mainly temporary or part-time staff
d. ( ) There is no staff allocated to running the examination
e. ( ) Other, please specify: ............................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
19. What issues have been identified with the performance of the human resources that are responsible
for the examination? Check all that apply.
a. ( ) Delays in administering the examination due to issues with the design of the examination questions
b. ( ) Poor training of test administrators, or unclear instructions and guidelines for administering the examination
c. ( ) Errors in scoring that have led to delays in results being reported
d. ( ) Weaknesses in test design
e. ( ) Omission of curricular topics
f. ( ) Frequent errors in the examination questions
g. ( ) Frequent errors in data processing
h. ( ) Other, please specify: ............................................................................................................................
i. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
20. Which of the following opportunities are available in the country on an annual basis?
Check all that apply.
a. ( ) University graduate programs (masters or doctorate level) specifically focused on educational measurement
and evaluation
b. ( ) University courses (graduate and non-graduate) on educational measurement and evaluation
c. ( ) Non-university training courses or workshops on educational measurement and evaluation
d. ( ) Funding for attending international programs, courses, or workshops on educational measurement
and evaluation
e. ( ) Internships in the examination office
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
21. What does the examination measure? Check all that apply.
a. ( ) The national school curriculum guidelines or standards
b. ( ) Internationally recognized curriculum guidelines or standards
Please specify: ................................................................................................................................................
c. ( ) It is not clear what the examination measures
d. ( ) Other, please specify: ............................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
22. Is what is measured by the examination largely accepted by stakeholders?
a. ( ) Yes
b. ( ) Some stakeholder groups question what the examination measures
c. ( ) No
d. ( ) Other: ..................................................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
23. Do all students have access to the materials needed to prepare for the examination?
a. ( ) Definitely yes. The material is widely accessible by all students (over 90%) in a variety of learning contexts
(e.g., covered in public school, available for home schooling, available on line)
b. ( ) Yes. The material is accessible by most students (50% to 90% of students), but certain student subgroups
may have greater access than others (e.g., due to language issues, location)
c. ( ) The material is accessible only by some students (10% to 50% of students) who meet certain criteria (e.g.,
who have the ability to pay for supplemental study material or who are enrolled in special schools)
d. ( ) No. The material is only accessible by a small number (less than 10%) of students
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
24. What material on the examination is publicly available? Check all that apply.
a. ( ) Examples of the types of questions that are on the examination
b. ( ) Information on how to prepare for the examination
c. ( ) The framework document explaining what is measured on the examination
d. ( ) Report on the strengths and weaknesses in student performance
e. ( ) Other, please specify: ............................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
25. How would you characterize the quality of workshops or courses on the examinations available to
teachers? Check all that apply.
a. ( ) There are compulsory courses or workshops that are regularly updated
b. ( ) There are compulsory courses or workshops that are not regularly updated
c. ( ) There are voluntary courses or workshops that are regularly updated
d. ( ) There are voluntary courses or workshops that are not regularly updated
e. ( ) Other, please specify: ...........................................................................................................................
f. ( ) There are no courses or workshops
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
26. What examination-related tasks are mainly performed by teachers? Check all that apply.
a. ( ) Selecting or creating examination questions
b. ( ) Selecting or creating examination scoring guides
c. ( ) Administering the examination
d. ( ) Scoring the examination
e. ( ) Acting as a judge (i.e., in orals)
f. ( ) Supervising examination procedures
g. ( ) Resolving inconsistencies between examination scores and school grades (i.e., moderation)
h. ( ) Other, please specify: ............................................................................................................................
i. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
27. Which best describes the technical documentation supporting the examination?
a. ( ) There is a comprehensive, high quality technical report available to the general public
b. ( ) There is a comprehensive technical report but with restricted circulation
c. ( ) There is some documentation about the technical aspects of the examination, but it is not in a formal
report format
d. ( ) There is no technical report or other documentation
If available, please submit the technical documentation supporting your answer selection.
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
28. What systematic mechanisms are in place to ensure the quality of the examination?
Check all that apply.
a. ( ) Internal review or observers
b. ( ) External review or observers
c. ( ) External certification or audit
d. ( ) Pilot or field testing
e. ( ) Translation verification
f. ( ) Other, please specify: ..........................................................................................................................
g. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
29. Which of the following inappropriate behaviors that diminish the credibility of the examination
typically occur during the examination process? Check all that apply.
a. ( ) Leakage of the content of an examination paper or part of a paper prior to the examination
b. ( ) Impersonation when an individual other than the registered candidate takes the examination
c. ( ) Copying from other candidates
d. ( ) Using unauthorized materials such as prepared answers and notes
e. ( ) Collusion among candidates via mobile phones, passing of paper, or equivalent
f. ( ) Intimidation of examination supervisors, markers or officials
g. ( ) Issuing forged certificates or altering results information
h. ( ) Provision of external assistance via the supervisor, mobile phone, etc.
i. ( ) Other, please specify: ............................................................................................................................
j. ( ) None
Comments: For each selection, please indicate what mechanisms have been put in place to address the
inappropriate behavior.
......................................................................................................................................................................
......................................................................................................................................................................
30. How credible are the examination results?
a. ( ) The results are perceived as credible by all stakeholder groups
b. ( ) The results are perceived as credible by some stakeholder groups
c. ( ) The results lack credibility for all stakeholder groups
Comments: Please provide an explanation for your selection. Please comment if subgroups of stakeholder
groups have different views.
......................................................................................................................................................................
......................................................................................................................................................................
31. May all students take the examination, regardless of background (e.g., gender, ethnic group), location
(e.g., urban, rural), ability to pay (e.g., transportation, fees) or the like?
a. ( ) Yes –> Go to question 33
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
32. Which best describes the existing barriers to take the examination?
a. ( ) There are some small groups of students (less than 10%) that may not take the examination because of
language, gender, socioeconomic status, cost, or the like
b. ( ) There is a significant proportion of the students (between 10% and 50%) that may not take the
examination because of language, gender, socioeconomic status, cost, or the like
c. ( ) The examination is not an option for the majority of the population (over 50%) due to language, gender,
socioeconomic status, cost, or the like
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
33. Is there systematic evidence of improper use of examination results by any of the stakeholder groups?
For each stakeholder selected, please specify the improper use.
a. ( ) Policy makers. Please specify: ...............................................................................................................
b. ( ) Teacher unions. Please specify: ..............................................................................................................
c. ( ) Educators. Please specify: .....................................................................................................................
d. ( ) Students. Please specify: ......................................................................................................................
e. ( ) Parents. Please specify: .........................................................................................................................
f. ( ) Media. Please specify: ..........................................................................................................................
g. ( ) Think-tanks, NGOs or equivalent. Please specify: ....................................................................................
h. ( ) Universities. Please specify: ..................................................................................................................
i. ( ) Employers. Please specify: ....................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
34. Are student results confidential?
a. ( ) Yes, only the student and persons with a legitimate, professional interest in the test taker (e.g., his or her
educators, parents, authorized potential employers) can know the results
b. ( ) No, student names and results are public
c. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
35. What are the options for students who do not perform well on the examination?
Check all that apply.
a. ( ) Students may retake the examination
b. ( ) Students may attend remedial or preparatory courses in order to prepare to retake the examination
c. ( ) Students may opt for less selective schools/universities/tracks
d. ( ) Students can repeat the grade
e. ( ) Students must leave the education system
f. ( ) Other, please specify: ..........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
36. What mechanisms are in place to monitor the consequences of the examination?
Check all that apply.
a. ( ) Funding for independent research on the impact of the examination
b. ( ) A permanent oversight committee
c. ( ) Studies (e.g., predictive validity) that are updated regularly
d. ( ) Regular focus groups or surveys of key stakeholders
e. ( ) Expert review groups
f. ( ) Other, please specify: .........................................................................................................................
g. ( ) None
THANK YOU FOR YOUR RESPONSES
Please ensure that you have answered all questions and that your responses are consistent
and accompanied by supporting evidence.
ANNEX III: SABER-SA Questionnaires
2011 QUESTIONNAIRE
Survey of Student Assessment Systems
National Large-Scale Assessment
Name of Country of Education System
Date of data collection
Systems Approach for Better Education Results
The World Bank
Human Development Network
1. At what age do children usually start grade 1? Please provide modal age.
a. ( ) 5 years old
b. ( ) 6 years old
c. ( ) 7 years old
d. ( ) 8 years old
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
2. Which grades represent the end of an education cycle or stage? Check all that apply. Please indicate
which education cycle or stage the selected grades represent.
a. ( ) Grade 1
Education cycle or stage:
b. ( ) Grade 2
Education cycle or stage:
c. ( ) Grade 3
Education cycle or stage:
d. ( ) Grade 4
Education cycle or stage:
e. ( ) Grade 5
Education cycle or stage:
f. ( ) Grade 6
Education cycle or stage:
g. ( ) Grade 7
Education cycle or stage:
h. ( ) Grade 8
Education cycle or stage:
i. ( ) Grade 9
Education cycle or stage:
j. ( ) Grade 10
Education cycle or stage:
k. ( ) Grade 11
Education cycle or stage:
l. ( ) Grade 12
Education cycle or stage:
m. ( ) Grade 13
Education cycle or stage:
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
3. Please provide information on up to three major national (or sub-national) large-scale assessment
programs in the country/system. Please complete one table for each assessment program, starting
with the one that has produced the most recent data on student learning levels in the system.
Assessment 3A:
I. Name of large-scale assessment program
II. Main purpose(s) of large-scale assessment program (check all that apply)
a. ( ) Monitoring education quality at the system level
b. ( ) Holding government or political authority accountable
c. ( ) School or educator accountability
d. ( ) Student accountability
e. ( ) Supporting schools and teachers
f. ( ) Policy design, evaluation, or decision making
g. ( ) Other, please specify:
III. First year the large-scale assessment program was administered
a. ( ) More than ten years ago
b. ( ) Five to ten years ago
c. ( ) Less than five years ago
IV. Frequency of administering the large-scale assessment program
a. ( ) Every year
b. ( ) Two to four times every five years
c. ( ) One to two times every ten years
d. ( ) Other, please specify:
V. For each year in which the large-scale assessment program was administered in the last ten years, list the subject area(s) tested and the grade/age level(s) at which students were assessed
VI. Format of the questions used on the large-scale assessment program (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essay
d. ( ) Other, please specify:
VII. Who participates in the large-scale assessment program?
a. ( ) All students at the given grade(s) or age level(s)
b. ( ) A representative random sample of students
c. ( ) A non-random sample of students
VIII. Additional comments
Assessment 3B:
I. Name of large-scale assessment program
II. Main purpose(s) of large-scale assessment program (check all that apply)
a. ( ) Monitoring education quality at the system level
b. ( ) Holding government or political authority accountable
c. ( ) School or educator accountability
d. ( ) Student accountability
e. ( ) Supporting schools and teachers
f. ( ) Policy design, evaluation, or decision making
g. ( ) Other, please specify:
III. First year the large-scale assessment program was administered
a. ( ) More than ten years ago
b. ( ) Five to ten years ago
c. ( ) Less than five years ago
IV. Frequency of administering the large-scale assessment program
a. ( ) Every year
b. ( ) Two to four times every five years
c. ( ) One to two times every ten years
d. ( ) Other, please specify:
V. For each year in which the large-scale assessment program was administered in the last ten years, list the subject area(s) tested and the grade/age level(s) at which students were assessed
VI. Format of the questions used on the large-scale assessment program (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essay
d. ( ) Other, please specify:
VII. Who participates in the large-scale assessment program?
a. ( ) All students at the given grade(s) or age level(s)
b. ( ) A representative random sample of students
c. ( ) A non-random sample of students
VIII. Additional comments
Assessment 3C:
I. Name of large-scale assessment program
II. Main purpose(s) of large-scale assessment program (check all that apply)
a. ( ) Monitoring education quality at the system level
b. ( ) Holding government or political authority accountable
c. ( ) School or educator accountability
d. ( ) Student accountability
e. ( ) Supporting schools and teachers
f. ( ) Policy design, evaluation, or decision making
g. ( ) Other, please specify:
III. First year the large-scale assessment program was administered
a. ( ) More than ten years ago
b. ( ) Five to ten years ago
c. ( ) Less than five years ago
IV. Frequency of administering the large-scale assessment program
a. ( ) Every year
b. ( ) Two to four times every five years
c. ( ) One to two times every ten years
d. ( ) Other, please specify:
V. For each year in which the large-scale assessment program was administered in the last ten years, list the subject area(s) tested and the grade/age level(s) at which students were assessed
VI. Format of the questions used on the large-scale assessment program (check all that apply)
a. ( ) Multiple-choice
b. ( ) Supply/open-ended
c. ( ) Essay
d. ( ) Other, please specify:
VII. Who participates in the large-scale assessment program?
a. ( ) All students at the given grade(s) or age level(s)
b. ( ) A representative random sample of students
c. ( ) A non-random sample of students
VIII. Additional comments
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
4. Please indicate which national large-scale assessment program is viewed as the most important for
use by policy makers. A sub-national large-scale assessment may be substituted if there is no national
large-scale assessment program in the country/system.
a. ( ) Large-scale assessment in Table 3A
b. ( ) Large-scale assessment in Table 3B
c. ( ) Large-scale assessment in Table 3C
Comments: Please answer all remaining questions with respect to the assessment indicated in question 4.
.......................................................................................................................................................................
.......................................................................................................................................................................
5. Does the country/system have a policy that authorizes the large-scale assessment program?
a. ( ) Yes, a formal policy
b. ( ) Yes, informal or draft policy
c. ( ) No –> Go to question 8
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
6. Please provide the following information on the policy that authorizes the large-scale assessment
program:
Official document citation: ...............................................................................................................................
Authorizing body: ...........................................................................................................................................
Year of authorization: ......................................................................................................................................
Please provide the link or attach a copy of the policy with your submission of the completed questionnaire.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
7. Is the policy identified in question 6 available to the public?
a. ( ) Yes
b. ( ) No
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
8. Does the national/system’s government have a large-scale assessment plan for the coming years or
future assessment rounds?
a. ( ) Yes
b. ( ) No –> Go to question 10
Please provide the link or attach a copy of the plan with your submission of the completed questionnaire.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
9. Which of the following apply regarding the plan referred to in question 8?
a. ( ) There is a publicly-available written plan specifying who will be tested [e.g., 4th graders] and in which
subject areas [e.g., math, science]. The plan is available to, and easily accessible by, the public.
b. ( ) There is a non-publicly available written plan specifying who will be tested [e.g., 4th graders] and in which
subject areas [e.g., math, science]. The plan is available to, and accessible by, only certain selected groups of people.
c. ( ) There is a common understanding that the assessment will take place but there is no formally written plan.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
10. Have stakeholder groups attempted to reform the large-scale assessment program?
a. ( ) Yes, coordinated efforts have been made by stakeholder groups
b. ( ) Yes, independent efforts have been made by different stakeholder groups
c. ( ) Other, please specify: ...........................................................................................................................
d. ( ) No –> Go to question 13
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
11. Based on publicly-available evidence, how much do the following stakeholder groups support or
oppose the large-scale assessment program? For each stakeholder, mark Strongly Support (SS),
Support (S), Neutral (N), Oppose (O), Strongly Oppose (SO), or Unable to Tell.
a. ( ) Policymakers .................................................................................. [SS – S – N – O – SO – Unable to Tell]
b. ( ) Teacher Unions .............................................................................. [SS – S – N – O – SO – Unable to Tell]
c. ( ) Educators ...................................................................................... [SS – S – N – O – SO – Unable to Tell]
d. ( ) Students ........................................................................................ [SS – S – N – O – SO – Unable to Tell]
e. ( ) Parents .......................................................................................... [SS – S – N – O – SO – Unable to Tell]
f. ( ) Media ............................................................................................ [SS – S – N – O – SO – Unable to Tell]
g. ( ) Think-tanks, NGOs or equivalent ..................................................... [SS – S – N – O – SO – Unable to Tell]
h. ( ) Universities .................................................................................... [SS – S – N – O – SO – Unable to Tell]
i. ( ) Employers ...................................................................................... [SS – S – N – O – SO – Unable to Tell]
j. ( ) Other, please specify: ..................................................................... [SS – S – N – O – SO – Unable to Tell]
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
12. What actions have stakeholder groups engaged in that are critical of the large-scale assessment?
Check all that apply.
a. ( ) Educators claiming that they will not cooperate with the assessment or will go on strike if it takes place
b. ( ) Students protesting or boycotting the assessment or its uses
c. ( ) Newspaper or magazine editorials or columns criticizing the assessment or its uses
d. ( ) Policymakers criticizing the assessment or its uses
e. ( ) Parents criticizing the assessment or its uses
f. ( ) NGOs, think tanks, and other donors issuing reports critical of the assessment or its uses
g. ( ) Universities criticizing the assessment or its uses
h. ( ) Employers criticizing the assessment or its uses
i. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
13. Is there funding allocated for the large-scale assessment program? Check all that apply.
a. ( ) Yes, there is regular (continuous and predictable) funding allocated by the government
b. ( ) Yes, there is regular (continuous and predictable) funding allocated by non-government sources
c. ( ) Yes, there is irregular funding from the government
d. ( ) Yes, there is irregular funding from non-government sources
e. ( ) Other, please specify: ...........................................................................................................................
f. ( ) No, there is no funding allocated for the large-scale assessment program → Go to question 15
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
14. What activities are covered by the funding allocated for the large-scale assessment program (include
both in-house and outsourced activities)? Check all that apply.
a. ( ) Assessment design
b. ( ) Assessment administration
c. ( ) Data analysis
d. ( ) Data reporting
e. ( ) Long- or medium-term planning of program milestones
f. ( ) Research and development
g. ( ) Staff training
h. ( ) Activities not related to the large-scale assessment, please specify: ......................................................
i. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
15. Which best describes the organizational structure of the group in charge of the large-scale assessment?
a. ( ) It is a permanent agency or institution or unit created for running the assessment
b. ( ) It is a temporary agency or institution or unit created for running the assessment
c. ( ) It is a group of people temporarily assigned to carry out the assessment exercise
d. ( ) Other, please specify: ...........................................................................................................................
e. ( ) There is no group in charge of the large-scale assessment
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
16. What priority is given to technical and political considerations in the decision-making process for the
large-scale assessment exercise identified in question 4?
a. ( ) Political considerations never hamper technical considerations
b. ( ) Political considerations sometimes hamper technical considerations
c. ( ) Political considerations regularly hamper technical considerations
d. ( ) Other, please specify: ...........................................................................................................................
e. ( ) Unable to tell
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
17. Have large-scale assessment results from the assessment identified in question 4 ever been withheld
from publication because of political reasons?
a. ( ) Yes, please specify: ............................................................................................................................
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
18. Is the group responsible for carrying out the large-scale assessment accountable to a clearly recognized
body?
a. ( ) Yes
b. ( ) No → Go to question 20
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
19. To which body is the group responsible for carrying out the large-scale assessment accountable?
Check all that apply.
a. ( ) It is accountable to a higher office in the Ministry of Education or another sectoral authority
b. ( ) It is accountable to an external board or committee (government or non-government)
c. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
20. Which best describes the availability of human resources for the large-scale assessment identified in
question 4 (including in-house or outsourced)?
a. ( ) There is an adequate number of permanent or full-time staff
b. ( ) There is permanent or full-time staff, but it is insufficient to meet the needs of the assessment identified
in question 4
c. ( ) There is mainly temporary or part-time staff
d. ( ) There is no staff allocated to running the large-scale assessment
e. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
21. What, if any, issues have been identified with the performance of the human resources that are
responsible for the large-scale assessment? Check all that apply.
a. ( ) Delays in administering the assessment due to issues with the design of the questions
b. ( ) Poor training of test administrators or unclear instructions and guidelines for administering the assessment
c. ( ) Errors in scoring that have led to delays in results being reported
d. ( ) Weaknesses in test design
e. ( ) Omission of curricular topics
f. ( ) Frequent errors in the test questions
g. ( ) Frequent errors in data processing
h. ( ) Other, please specify: ...........................................................................................................................
i. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
22. Which, if any, of the following opportunities are available in the country/system on an annual basis?
Check all that apply.
a. ( ) University graduate programs (masters or doctorate level) specifically focused on educational measurement
and evaluation
b. ( ) University courses (graduate and non-graduate) on educational measurement and evaluation
c. ( ) Non-university training courses or workshops on educational measurement and evaluation
d. ( ) Funding for attending international programs or courses or workshops on educational measurement and
evaluation
e. ( ) Internships or short-term employment in the large-scale assessment office
f. ( ) Other, please specify: ...........................................................................................................................
g. ( ) No opportunities are offered
Comments (please specify the perceived quality of each of the available opportunities):
......................................................................................................................................................................
......................................................................................................................................................................
23. What does the large-scale assessment measure? Check all that apply.
a. ( ) Performance against national/system or state-level curriculum guidelines or learning standards
b. ( ) Performance against internationally recognized curriculum guidelines or learning standards
Please specify: ................................................................................................................................................
c. ( ) Other, please specify: ...........................................................................................................................
d. ( ) It is not clear
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
24. Is what is measured by the large-scale assessment largely accepted by stakeholder groups?
a. ( ) Yes
b. ( ) Some stakeholder groups question what the assessment measures
c. ( ) No
d. ( ) Other, please specify: ..........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
25. Are mechanisms in place to ensure that the large-scale assessment accurately measures what it is
supposed to measure?
a. ( ) Yes
b. ( ) No → Go to question 27
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
26. What are the mechanisms referred to in question 25? Check all that apply.
a. ( ) Regular independent review by qualified experts of the alignment between the assessment instrument
and what it is supposed to measure
b. ( ) Regular internal review of the alignment between the assessment instrument and what it is supposed
to measure
c. ( ) Ad-hoc review of the alignment between the assessment instrument and what it is supposed to measure
d. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
27. Are teacher training courses, workshops, or presentations on the large-scale assessment (e.g., domains
measured, how to read and use results) offered in the country/system?
a. ( ) Yes
b. ( ) No → Go to question 29
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
28. Which of the following best describe the teacher training courses, workshops, or presentations on the
large-scale assessment? Check all that apply.
a. ( ) Courses or workshops are offered on a regular basis
b. ( ) Courses or workshops are offered occasionally
c. ( ) Presentations are offered occasionally
d. ( ) Most teachers have access to live courses or workshops
e. ( ) Most teachers have access to courses online
f. ( ) Most courses are of a high quality
g. ( ) Most courses provide teachers with relevant resources that they can use in their classrooms
h. ( ) Other, please specify: ...........................................................................................................................
i. ( ) There are no teacher training courses or workshops on the large-scale assessment
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
29. What is done to include all student groups in the large-scale assessment exercise? Check all that apply.
a. ( ) Accommodations or alternative assessments are provided for students with disabilities
b. ( ) Special plans are made to ensure that the large-scale assessment is administered to students in hard-to-reach areas
c. ( ) The large-scale assessment is offered in the language of instruction for almost all student groups
d. ( ) Other, please specify: ..........................................................................................................................
e. ( ) Nothing is done
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
30. What mechanisms are in place to ensure the quality of the large-scale assessment instrument?
Check all that apply.
a. ( ) All proctors or administrators are trained according to a protocol
b. ( ) There is a standardized manual for large-scale assessment administrators
c. ( ) Discrepancies must be recorded on a standard sheet
d. ( ) A pilot is conducted before the main data collection takes place
e. ( ) All booklets are numbered
f. ( ) There is double data scoring (if applicable, for example, for open-ended items)
g. ( ) Scorers are trained to ensure high interrater reliability
h. ( ) There is double processing of data
i. ( ) External reviewers or observers
j. ( ) Internal reviewers or observers
k. ( ) External certification or audit
l. ( ) Other, please specify: ...........................................................................................................................
m. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
31. Which best describes the technical documentation of the large-scale assessment?
a. ( ) There is a comprehensive, high-quality technical report available to the general public
b. ( ) There is a comprehensive technical report, but with restricted circulation
c. ( ) There is some documentation about the technical aspects of the assessment, but it is not in a formal
report format
d. ( ) There is no technical report or other documentation
If available, please submit the technical documentation supporting your answer selection.
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
32. How are the large-scale assessment results reported or disseminated? Check all that apply.
a. ( ) Results are disseminated within twelve months after the large-scale assessment is administered
b. ( ) Reports with results are made available for all stakeholder groups
c. ( ) The main reports on the results contain information on overall achievement levels and subgroups
d. ( ) The main reports on the results contain information on trends over time overall and for subgroups
e. ( ) The main reports on the results contain standard errors (measure of uncertainty)
f. ( ) There is a media briefing organized to discuss results
g. ( ) There are workshops or presentations for key stakeholders on the results
h. ( ) Results are featured in newspapers, magazines, radio, or television
i. ( ) Other, please specify: ...........................................................................................................................
j. ( ) Large-scale assessment results are not reported or disseminated
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
33. How is the large-scale assessment information used?
a. ( ) Assessment information is used by all or most stakeholder groups in a way that is consistent with the
stated purposes or technical characteristics of the assessment
b. ( ) Assessment information is used by some stakeholder groups in a way that is consistent with the stated
purposes or technical characteristics of the assessment
c. ( ) Assessment information is not used by stakeholder groups or is used in ways inconsistent with the stated
purposes or the technical characteristics of the assessment
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
34. What mechanisms are in place to monitor the consequences of the large-scale assessment?
Check all that apply.
a. ( ) Funding for independent research on the impact of the large-scale assessment
b. ( ) A permanent oversight committee
c. ( ) Regular focus groups or surveys of key stakeholders
d. ( ) Themed conferences that provide a forum to discuss research and other data on the consequences of the
large-scale assessment
e. ( ) Expert review groups
f. ( ) Other, please specify: ...........................................................................................................................
g. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
THANK YOU FOR YOUR RESPONSES
Please ensure that you have answered all questions and that your responses are consistent
and accompanied by supporting evidence.
ANNEX III: SABER-SA Questionnaires
2011 QUESTIONNAIRE
Survey of Student Assessment Systems
International Large-Scale Assessment
Name of Country or Education System
Date of data collection
Systems Approach for Better Education Results
The World Bank
Human Development Network
1. Has the country/system participated in any international assessments?
a. ( ) Yes
b. ( ) No → Go to question 3
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
2. In which international assessment(s) has the country/system participated? Check all that apply.
CIVED/ICCS (IEA): ( ) 1996 ( ) 1997 ( ) 1999 ( ) 2000 ( ) 2009
PIRLS (IEA): ( ) 2001 ( ) 2006 ( ) 2011
TIMSS (IEA): ( ) 1995 ( ) 1999 ( ) 2003 ( ) 2007 ( ) 2011
PISA (OECD): ( ) 2000 ( ) 2003 ( ) 2006 ( ) 2009
LLECE (UNESCO): ( ) LLECE 1999 ( ) SERCE 2004-2008 ( ) TERCE 2011
SACMEQ (UNESCO): ( ) I (1995-1998) ( ) II (1999-2004) ( ) III (2005-2009)
( ) PASEC (CONFEMEN)1
( ) Other2
1 Please indicate the year(s) in which the country/system participated in PASEC.
2 Please specify the name(s) and year(s) of other international assessments in which the country/system has participated.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
3. In which upcoming international assessment(s), if any, has the country/system taken concrete steps
to participate? (Examples of concrete steps include writing a proposal and plan, allocating funding,
or participating in an international meeting related to the assessment exercise.) Check all that apply.
a. ( ) LLECE
b. ( ) PASEC
c. ( ) PIRLS 2016
d. ( ) PISA 2012
e. ( ) PISA 2015
f. ( ) SACMEQ
g. ( ) TIMSS 2015
h. ( ) Other, please specify: ............................................................................................................................
i. ( ) None
If you answered “no” for question 1 and “none” for question 3, please go to the end of the questionnaire for
information on how to submit your responses.
4. Please provide the name and the year of the most recent international assessment in which the
country/system participated. Please refer to this assessment for questions 5 through 27.
If the country/system has not participated in an international assessment, but has taken steps to participate in
its first international assessment, please provide the name and the year of that assessment. Please refer to this
assessment for questions 5 through 18.
Name: ...............................................................................................................
Year: ...............................................................................................................
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
5. Does the country/system have a policy document that addresses participation in international
assessments?
a. ( ) Yes, a formal policy document
b. ( ) Yes, an informal or draft policy document
c. ( ) No → Go to question 8
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
6. Please provide the following information on the policy document that addresses participation in
international assessments:
Official document citation: ..............................................................................................................................
Authorizing body: ...........................................................................................................................................
Year of authorization: ......................................................................................................................................
Please provide the link or attach a copy of the document with your submission of the completed questionnaire.
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
7. Is the document identified in question 6 available to the public?
a. ( ) Yes
b. ( ) No
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
8. Was funding allocated for participation in the international assessment identified in question 4?
a. ( ) Yes
b. ( ) No → Go to question 11
Comments:
.......................................................................................................................................................................
.......................................................................................................................................................................
9. Which option best describes the funding allocated for participation in the international assessment?
a. ( ) Regular funding program for international assessment participation, approved by law, decree or norm
b. ( ) Regular funding program for international assessment participation, allocated at discretion
c. ( ) Funding sourced from loans, external donors
d. ( ) Other, please specify: ............................................................................................................................
Comments: Please provide additional information if funds were earmarked partly from the regular government
budget and partly from other sources, such as donors.
.......................................................................................................................................................................
.......................................................................................................................................................................
10. What activities (in-house and outsourced) are covered by the funding for participation in the
international assessment identified in question 4? Check all that apply.
a. ( ) International participation fees
b. ( ) Implementation of the assessment exercise in the country/system (e.g., printing booklets, travel to schools)
c. ( ) Processing and analyzing data collected from implementation of the assessment exercise
d. ( ) Reporting and disseminating the assessment results in the country/system
e. ( ) Attendance at international expert meetings for the assessment exercise
f. ( ) Research and development
g. ( ) Other, please specify: ............................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
11. Is there a national/system coordinator responsible for the international assessment?
a. ( ) Yes
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
12. Is there a team responsible for carrying out the international assessment in the country/system?
a. ( ) Yes
b. ( ) No → Go to question 16
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
13. Which of the following describes the team responsible for carrying out the international assessment?
Check all that apply.
a. ( ) The national/system coordinator is fluent in the language in which the international-level meetings are
conducted and related documentation is available
b. ( ) The team is sufficiently staffed
c. ( ) The team has previous experience working on international assessments
d. ( ) The team has the necessary training or experience to carry out the required assessment activities effectively
e. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
14. Have members of the team in charge of the international assessment exercise in the country/system
attended international meetings related to the assessment?
a. ( ) Yes, team members have attended all of the meetings
b. ( ) Yes, team members have attended some of the meetings
c. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
15. What, if any, issues have been identified with the carrying out of the international assessment in the
country/system? Check all that apply.
a. ( ) There have been errors or delays in the printing or layout of the test booklets
b. ( ) There have been errors or delays in the administration of the assessment
c. ( ) There have been complaints about poor training of test administrators
d. ( ) There have been issues with translation of the assessment instruments (e.g., test booklets, background
questionnaires)
e. ( ) There have been errors or delays in scoring student responses to questions
f. ( ) Other, please specify: ...........................................................................................................................
g. ( ) None
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
16. Are opportunities to learn about international assessments offered in the country/system?
a. ( ) Yes
b. ( ) No → Go to question 19
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
17. What opportunities to learn about international assessments are offered in the country/system?
Check all that apply.
a. ( ) Workshops or meetings on using international assessment databases
b. ( ) University courses on the topic of international assessments
c. ( ) Funding for attending international workshops or training on international assessments
d. ( ) On-line courses on international assessments
e. ( ) Other, please specify: ...........................................................................................................................
Comments: Please comment on the regularity/frequency of such opportunities
......................................................................................................................................................................
......................................................................................................................................................................
18. Who benefits from the opportunities to learn about international assessments? Check all that apply.
a. ( ) Individuals working directly on the specific international assessment exercise
b. ( ) University students studying assessment or a related area
c. ( ) Professionals or university staff interested in assessment
d. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
19. Which best describes the way in which the country’s/system’s data from the international assessment
exercise was presented in the official international report?
[If the country/system has yet to participate to this stage in an international assessment exercise, please go
directly to the end of the questionnaire for information on how to submit your responses.]
a. ( ) The country/system met all technical standards required to have its data presented in the main displays
of the international report
b. ( ) The country/system met sufficient standards to have its data presented beneath the main display of the
international report or in an annex
c. ( ) The country/system did not meet the technical standards required to have its data published in the
international report
d. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
20. Has the country/system contributed to the global knowledge base on international assessments by
generating new knowledge and making it available through publications or presentations?
a. ( ) Yes (Please provide a reference, link, or PDF supporting your answer)
b. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
21. Were country/system-specific results from the most recent international assessment disseminated in
the country/system?
a. ( ) Yes
b. ( ) No → Go to question 24
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
22. How were country/system-specific results from the international assessment disseminated? Check all
that apply.
a. ( ) A national/system report was made available online
b. ( ) Copies of the national/system report were distributed to key stakeholders
c. ( ) Copies of the international report were distributed to key stakeholders
d. ( ) Country’s/system’s results were communicated through a press release
e. ( ) Results received coverage on television, radio or newspapers
f. ( ) Brochures and PowerPoint presentations with the country’s/system’s results were made available online
or distributed to key stakeholders
g. ( ) Products providing feedback to the schools or educators about the results were made available
h. ( ) Other, please specify: ............................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
23. Have the results of the international assessment been fed back to schools and educators?
a. ( ) Yes, systematically
b. ( ) Yes, but only sometimes
c. ( ) No
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
24. Which of the following describes how the international assessment results have been covered by the
media in the country/system? Check all that apply.
a. ( ) The assessment results are on the front page of the newspapers or the main story on the TV news
b. ( ) There are editorials or columns commenting on the international assessment results
c. ( ) Media coverage is limited to a few small articles
d. ( ) International assessment results have not been covered in the media
e. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
25. Have the results of the international assessment been used to inform decision making at the national/
system level?
a. ( ) Yes
b. ( ) No → End of questionnaire
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
26. Please indicate how the results of the international assessment exercise have been used by policy
makers or education leaders to improve education quality in the country/system. Check all that apply.
a. ( ) Tracking the impact of reforms on student achievement levels
b. ( ) Informing curriculum improvement
c. ( ) Informing teacher training programs
d. ( ) Informing other assessment activities in the system (e.g., classroom assessment, examinations)
e. ( ) Informing resource allocation
f. ( ) Other, please specify: ...........................................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
27. Is there evidence of a positive impact on student achievement levels from the uses identified in
question 26?
a. ( ) Yes
b. ( ) No
Please provide details to support your answer: .................................................................................................
Comments:
......................................................................................................................................................................
......................................................................................................................................................................
THANK YOU FOR YOUR RESPONSES
Please ensure that you have answered all questions and that your responses are consistent
and accompanied by supporting evidence.