
EDUCATION
AARMS Vol. 10, No. 2 (2011) 287–292
What’s behind the test?
ISTVÁNNÉ VADÁSZ
Miklós Zrínyi National Defence University, Budapest, Hungary
Since English is one of the official NATO languages and the one most often used in communication, attaining a high level of English proficiency is a must for many members of the armed services, and in many cases it determines their advancement opportunities. In 1976 NATO adopted a language proficiency scale, which became a milestone in military language assessment and has since been used by NATO members and Partner nations to evaluate the language proficiency of military personnel.
Introduction
Because of the expanding cooperation within NATO and between NATO and the PfP countries, the role of language proficiency has grown significantly, and accurate language assessment has come to the fore. The NATO STANAG 6001 level descriptors, adopted in 1976, serve as guidelines for testers assessing the proficiency level of military personnel. This article gives a brief account of the origin of the NATO language proficiency scale and its development to meet the needs of military language assessment.
The history of the NATO Language Proficiency Scale
Because of its broad international relations, foreign language proficiency has always been important for the U.S. Government services. The need for better assessment of Government employees was recognised at the beginning of the sixties, and the Civil Service Commission i was tasked with preparing an inventory of the Government employees. The first difficulties the project faced showed the need for an assessment system that "was objective, applicable to all languages and all positions, and unrelated to any particular language curriculum."1 An interagency committee created by the Foreign Service Institute (FSI) ii devised a scale ranging from 1 to 6.
i The United States Civil Service Commission was created in 1883 to administer the civil service of the U.S. Federal Government.
ii The Foreign Service Institute (FSI) is the U.S. Federal Government’s primary training institution preparing American diplomats and other professionals to advance U.S. foreign affairs interests overseas.
Received: May 16, 2011
Address for correspondence:
ISTVÁNNÉ VADÁSZ
E-mail: [email protected]
This scale was not based on tests of the listening, speaking, reading and writing skills, but simply rated "language."
Lessons learned from the first testing experience showed the need for separate scales for each skill, and the scale was standardised to six levels: level 0 meant no functional ability, and level 5 referred to the language proficiency of an educated native speaker. The newly established FSI Testing Unit developed a scoring system, which helped to reduce subjectivity in language testing and increased its reliability. In 1985 the Interagency Language Roundtable (ILR) iii revised the scale, and since then the Government Language Skill Level Descriptions have been called the ILR Scale. Although some minor changes in language testing were introduced depending on the needs of certain countries and agencies, it has remained the standard measuring tool of language proficiency. In 1976 NATO adopted a language proficiency scale based on the 1968 Interagency Language Roundtable document.
STANAG 6001 Language Proficiency Levels
NATO Standardization Agreement 6001 (STANAG 6001) describes language proficiency with a profile of four digits in the following order: listening (L), speaking (S), reading (R) and writing (W). SLP (Standardized Language Profile) code letters are used to indicate language proficiency: SLP 3321 means level 3 in listening, level 3 in speaking, level 2 in reading and level 1 in writing.2 The original STANAG 6001 defined the following proficiency levels:
Level 0 No Proficiency
Level 1 Elementary
Level 2 Limited Working
Level 3 Minimum Professional
Level 4 Full Professional
Level 5 Native/Bilingual3
The document provided descriptions of the four skills at each level. Because of the diversity of positions held by military and civilian personnel, the descriptors were very general and not related to any specific course or curriculum.
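As an illustration of the SLP notation described above, the minimal sketch below maps a four-digit profile to the four skills. The parse_slp helper and its output format are hypothetical conveniences, not part of STANAG 6001, and the sketch handles base levels only (the 'plus levels' introduced later are not covered).

```python
# Hypothetical helper: maps an SLP code such as "SLP 3321" to skill levels.
# The function name and return format are illustrative only; STANAG 6001
# defines the notation, not any particular parser.

SKILLS = ("listening", "speaking", "reading", "writing")  # digit order fixed by STANAG 6001

def parse_slp(code: str) -> dict:
    """Return a skill-to-level mapping for a four-digit SLP code (base levels only)."""
    digits = code.upper().removeprefix("SLP").strip()
    if len(digits) != 4 or not digits.isdigit():
        raise ValueError(f"expected four digits, got {code!r}")
    return dict(zip(SKILLS, (int(d) for d in digits)))

print(parse_slp("SLP 3321"))
# {'listening': 3, 'speaking': 3, 'reading': 2, 'writing': 1}
```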
iii The Interagency Language Roundtable is an organization comprising various agencies of the U.S. Federal Government with the purpose of coordinating and sharing information on foreign language activities at the federal level.
Table 1. Extract from the Table of Proficiency Levels (Written proficiency skills)4

Level 0: No practical proficiency
Level 1: …ability to write is limited to a few short sentences.
Level 2: Can draft routine social correspondence and meet limited professional needs.
Level 3: Can draft official correspondence and reports in a special field.
Level 4: Can draft all levels of prose pertinent to professional needs.
Level 5: …completely equal to a native speaker of the language.
STANAG 6001 aimed at ensuring a clear understanding of the language proficiency of military personnel for the purpose of meeting language requirements for international staff appointments, but it did not provide any practical testing tools, leaving it to the member countries to develop their own testing materials in each skill. The STANAG 6001 level descriptors were used in their original form for almost 20 years. Cooperation with the new NATO members and with the partner nations revealed the need to revise the descriptors. The Bureau for International Language Coordination (BILC) took the lead in reinterpreting the descriptors of the original STANAG 6001.
Bureau for International Language Coordination
Figure 1. The BILC organisation6
The Bureau for International Language Coordination (BILC) is a consultative and
advisory body within the NATO Training Group in charge of language training matters.
BILC was established in 1966 as there was a need for an organisation coordinating
language training efforts within NATO. The tasks of the organisation were defined as
“dissemination of information on developments in the field of language training to
participating countries.”5 BILC has the following component elements (see Figure 1).
All NATO members actively take part in the work of the organisation. Other organisations interested in language training and testing issues, such as Allied Command Operations and the Partnership Coordination Cell, may attend BILC conferences. All BILC activities are open to the Partnership for Peace and Mediterranean Dialogue countries, which may send observers to the conferences if they are interested in NATO training and testing policies. The BILC Secretariat is provided by one of the member countries and
coordinates the annual BILC conferences and seminars. The BILC policy-making body,
the Steering Committee, consists of the heads of full member states’ delegations.7
Language training and testing in focus
From the first days after its establishment, BILC proved to be an effective organisation. One
of its main tasks was defined as “the convening of an annual conference of participating
nations which would review the work done in the coordination and in the study of
particular language topics.”8 The first BILC conference, hosted by the UK, was held in
1967 and discussed the issues of language programme development, instructor training
and computer applications to language work. Subsequent conferences addressed a wide range of language training and testing issues:
o Defining and Meeting Language Training Requirements for Multi-National
Peace Support Operations (1998, UK)
o Evaluating the Quality of Language Programs (2001, Spain)
o Teaching According to the STANAG 6001 Scale (2003, UK)
o STANAG 6001 and the Common European Framework (2005, Germany)
o Standardizing language training programs (2006, Hungary)
o Language Teaching Issues for NATO Missions (2007, USA)
o Establishing a Successful Language Training Program in Afghanistan (2010, Turkey)
New forms of cooperation between the NATO members and the new partners revealed some inconsistencies among the STANAG ratings of different countries. Eleven participating countries contributed to the work of reinterpreting the descriptors of the original STANAG document, and in 2003 the Military Agency for Standardization
published STANAG 6001 Edition 3, and in 2010 Edition 4. These editions provide more detailed descriptors of the task, accuracy and content expected in the performance of language tasks, to guide language instructors and test developers. Responding to a request of the NATO Training Group, BILC developed a 'plus levels' scale (0, 0+, 1, 1+, and so on). A plus level denotes proficiency between two base levels: it exceeds the base skill level but does not fully meet the requirements of the next higher level (a minimal sketch of this ordering follows Table 2). Edition 3 also gave the proficiency levels new labels:
Table 2. STANAG 6001 Proficiency Levels9

Level    Original STANAG Label (1976)    New STANAG Label (2003)
0        No Proficiency                  No proficiency
1        Elementary                      Survival
2        Limited Working                 Functional
3        Minimum Professional            Professional
4        Full Professional               Expert
5        Native/Bilingual                Highly-articulate native
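To make the plus-level ordering concrete, the short sketch below (in the same illustrative spirit as the earlier SLP example) ranks levels so that a plus level sits between its base level and the next one. The half-step encoding is purely an assumption for illustration; STANAG 6001 does not assign numeric values to plus levels.

```python
# Hypothetical ranking of STANAG levels including plus levels (e.g. 2 < 2+ < 3).
# The 0.5 offset is an illustrative encoding only, not defined by STANAG 6001.

def level_rank(level: str) -> float:
    """Return a sortable rank for a base level ('2') or plus level ('2+')."""
    base = int(level.rstrip("+"))
    return base + (0.5 if level.endswith("+") else 0.0)

assert level_rank("2") < level_rank("2+") < level_rank("3")
print(sorted(["3", "1+", "0+", "2", "2+"], key=level_rank))
# ['0+', '1+', '2', '2+', '3']
```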
Because of the need to better standardise testing, BILC developed the Language Testing Seminar. These two-week seminars aim at developing competency in the development and administration of language proficiency tests based on STANAG 6001. The sessions take place at the Partnership Language Training Center Europe in Garmisch-Partenkirchen, Germany, in November, February and June. Since 2000 about 300 attendees from 38 countries10 have taken part in these seminars, which has helped to improve the quality of testing in their respective countries.
The Advanced Language Testing Seminar is another BILC initiative, intended to provide advanced training in testing and in the management of testing teams. The seminar is held once a year and is limited to eight participants. It deals with productive and receptive skills, test analysis and managerial issues. With its focus on testing level 3 candidates, the seminar is an exceptional opportunity for participants to exchange experience and improve their testing skills.
In 2005 BILC began to develop the Benchmark Advisory Test, answering the need for a calibrated test against which nations could compare the results attained on their national tests. The test was completed by January 2010 by the American Council for the Teaching of Foreign Languages, with the participation of eleven BILC countries that contributed test items and piloted the first versions of the test. The project has been another important step in the standardization of language testing in accordance with the NATO STANAG 6001 descriptors.
Conclusions
Created as a simple tool for testing government employees in the USA, the NATO
language proficiency scale has gone through significant changes to become the basic
document for development national testing systems in the military. The Bureau for
International Language Coordination in its attempt to tailor STANAG 6001 descriptors
to the needs of the military, has invested a lot in the scale development. The work on
coordinating language training and testing efforts of the NATO and the PfP countries
continues through regular conferences, seminars and research projects.
References
1. HERZOG, M., An Overview of the History of the ILR Language Proficiency Skill Level Descriptions and Scale. http://www.govtilr.org/Skills/IRL%20Scale%20History.htm (last accessed 2 February 2011).
2. STANAG 6001 (Edition 2) Language Proficiency Levels, 1976. NATO Training Group, Brussels.
3. Appendix 1 to Annex A to STANAG 6001 (Edition 2).
4. Annex A to STANAG 6001 (Edition 2).
5. Memorandum DS15/160/7, Enclosure 1, 26 July 1996. MOD, UK.
6. From: http://www.bilc.forces.gc.ca/org/index-eng.asp (last accessed 2 February 2011).
7. Constitution and Rules of Procedure for the Bureau for International Language Co-Ordination. http://www.bilc.forces.gc.ca/org/doc/BILC_Constitution_2007-eng.pdf (last accessed 15 February 2011).
8. Memorandum DS15/160/7, Enclosure 1, 26 July 1996. MOD, UK.
9. STANAG 6001 (Edition 3) Language Proficiency Levels. http://www.bilc.forces.gc.ca/stanag/doc/STANAG_6001_Edition_3-eng.pdf (last accessed 15 February 2011).
10. BILC Report to the Joint Services Sub Group, 2009. http://www.bilc.forces.gc.ca/conf/2009/documents/Dubeau-BILC.ppt (last accessed 20 March 2011).