DEPARTMENT OF HUMAN RESOURCES
Position Based Testing Manual
January 1, 2012
Part 1
POSITION BASED TESTING MANUAL
TABLE OF CONTENTS
Forms and Samples
Section I Examination Assignment and Research
Forms:
Request to Hire Form………………………………………………………………4-5
Position Based Testing Examination Plan Checklist……………………………10-12
Sample:
Request for Promotive-Only Exam………………………………………………..…8
Section II Job Analysis
Forms:
Subject Matter Expert Background & Qualifications Form………………………...12
Test Security Agreement and Statement of Responsibility for
Test Development and Administration Staff…………………………………….13-15
Task Rating Form…………………………………………………………..........25-26
Knowledge, Skills and Abilities Form…………………………………………..27-28
Task and KSA Linkup Form………………………………………………………...29
Qualifications Form…………………………………………………………………36
Examination Outline………………………………………………………………...43
Section III Recruitment
Sample:
Standard Text for All Job Announcements………………………………………...6-9
Forms:
Request to Cancel an Examination/Recruitment/Eligible List…………………..17-18
Section IV Exam Development
Samples:
Combined Point Method and Task (Skill) – Based Method T&E Supplemental...9-11
Point Method and Task Based Training and Experience Evaluation Sheet…...........12
Task Based Supplemental Questionnaire………………………………………..13-14
Behavioral Consistency Example Supplemental Questionnaires,
Instructions, Rating Guidelines and Scales……………………………………...17-23
Supplemental Questionnaire for Screening Committee, and Rating
Guidelines and Forms………………………………………………………..…..34-31
Oral Communication Scales……………………………………………………..51-53
Performance Exam Rating Guidelines and Instructions…………………………57-60
Written Communication Rating Sheet………………………………………………64
Section V Screening of Applications
Forms:
Human Resources Director’s Application Approval Form…………………………..7
Veteran’s Preference Application…………………………………………………...19
Employment Verification Form………………………………………………....21-22
Samples:
Letters to Applicants Who Do Not Qualify……………………………………...10-16
Section VI Examination Administration
Forms:
Request for Reasonable Accommodation in Civil Service
Examination Form………………………………………………………………….4-5
Identification Verification Form……………………………………………………...8
Written Examination Checklist……………………………………………………...12
Review of Ratings by Candidates in the Written Examination Form………………14
Checklist for Oral Examinations……………………………………………………23
Information for Candidates Form………………………………………………...…25
Rater Background Information and Qualifications Form…………………………...29
Expense Reimbursement Receipt Form………………………………………….….33
Request for Cash Advance to defray Business-Related Expense Form…………….34
Field Expense Report………………………………………………………………..35
Samples:
Proctor Instructions for Written-Multiple Choice Examination…………………10-11
Cover Letter to Oral/Performance Examination Raters……………………………..16
Proctor Instructions for Oral/Performance Examination……………………………21
Examination Instructions to Candidate for Oral/Performance Exam……………….22
Section VII Examination Scoring
Section VIII Eligible Lists
Sample:
Letter to Eligible………………………………………………………………..…….4
Inspection Instructions…………………………………………………………..13-21
Forms:
Authorization to Adopt a List of Eligibles………………………………………...…7
Restriction Removal Form……………………………………………………………9
Request for Extension of Eligible List Form……………………………………….23
Section IX Exam Documentation, Finalization and Storage
Forms:
Examination Final Report Form……………………………………………………...3
Checklist for Active File Folder…………………………………………………...…6
Checklist for Examination Materials Storage………………………………………...7
Section X Appeals
SECTION I
Examination Assignment and Research
Section I – Page 1
SECTION I – EXAM ASSIGNMENT AND RESEARCH
TABLE OF CONTENTS
REQUEST FOR POSITION BASED TESTING……………………………………………...3
Considerations to Determine Best Approach…………………………………………….4
REQUEST TO HIRE FORM…………………………………………………………………5
PROMOTIVE-ONLY EXAMINATION………………………………………………………7
Sample Request……………………………………………………………………………8
EXAMINATION PLAN CHECKLIST & EXAMINATION REPORTING FORM…………….9
POSITION BASED TESTING EXAMINATION PLAN CHECKLIST………………………10
Section I - Page 2
REQUEST FOR POSITION BASED TESTING
PURPOSE:
To request approval to conduct an exam under Rule 111A Position Based Testing and to document
the rationale for the request.
FORM:
Request to Hire Form
PROCEDURE:
The table that follows this section serves as a guide to help analysts determine whether a Class
Based or Position Based Testing approach is the best selection process to use to fill a vacant
position. If a personnel analyst believes that a Position Based Testing approach should be used,
s/he will need to submit a Request to Hire Form to DHR via his or her respective
supervisor/manager.
Before submitting a Request to Hire Form, the analyst should research personnel data so that the
hiring department and DHR will have the necessary information to determine whether Position
Based Testing is appropriate to use to fill a particular vacancy. Research in this regard may require
data from automated systems such as PeopleSoft and/or JobAps (e.g., to obtain appointment and
eligible list information, respectively).
Specifically, it is the analyst’s responsibility to obtain and provide answers to the following
questions when preparing a Request to Hire Form. DHR will provide assistance, if necessary.
1. Does the target classification belong to a pre-approved PBT program designated by DHR (e.g.,
Personnel Analyst)?
2. How many positions are there in the class overall and where is the class used?
3. Are there permanent (PCS), provisional (PV) and/or exempt (E) incumbents serving in the class
(if known)?
4. Do recruitments for this class tend to result in diverse candidate populations?
5. What was the applicant population size for the prior exam? How many satisfied the minimum
requirements and how many were tested?
6. Were there any departmental or union protests to any aspect of the previous selection process?
If yes, what did the protest(s) involve and how were they resolved?
7. Was the previous examination issued as an entrance, promotive, or combined promotive and
entrance examination?
8. Is there an existing PBT eligible list for this class? If so, is there a unique special condition
that would make this list inappropriate?
When the Request to Hire Form is completed, the analyst should give the form to his or her
supervisor/manager or personnel officer. That individual should review and, upon approval, sign
the form. It should then be submitted electronically to the appropriate Client Services representative
at DHR.
Section I - Page 3
When evaluating the request, DHR will consider issues such as:
• grievances associated with the class;
• protests/appeals filed in past recruitments (e.g., announcement or MQ protests);
• discrimination complaints involving the selection procedure(s);
• pending investigations related to the class;
• complaints from members in the class or class series;
• professional considerations related to Subject Matter Experts; and
• other factors related to merit principles.
The existence of past complaints, grievances, etc. may not necessarily prevent use of Position Based
Testing, but it may require implementation of certain strategies to ensure that a valid and defensible
process is used going forward.
DHR may require that a draft of the announcement be submitted for review prior to approving the
Request to Hire.
After the Recruitment and Assessment Services Unit (RAS) at DHR approves the request, copies of
the approved request will be provided to the hiring department and DHR’s Client Services.
If DHR denies a request for Position Based Testing, it will inform the department of the method the
department can use to fill the position. DHR will also notify other units as appropriate.
The department is to retain a copy of the approved Request to Hire Form in the Active File folder.
Section I - Page 4
Considerations to Determine Best Approach: City-Wide Test vs. Position Based Test

1. Does the hiring department have an existing eligible list or holdover roster appropriate for this position?
• City-Wide Test: If yes, a city-wide test is not appropriate; use the existing eligible list. If no, a city-wide test may be appropriate.
• Position Based Test: If yes, a PBT examination is not appropriate; use the eligible list.

2. Does this classification belong to a pre-determined PBT program designated by DHR?
• Position Based Test: If yes, a PBT shall be used.

3. Do many departments hire in this classification?
• City-Wide Test: If yes, a city-wide test may be more appropriate.
• Position Based Test: If no, a PBT may be an appropriate alternative to a city-wide test.

4. Does the work associated with this classification vary considerably from position to position? Is the work of the target position unique within this job classification?
• City-Wide Test: If the work varies little from position to position, a city-wide test is appropriate.
• Position Based Test: If the work varies considerably or if the position is unique, a PBT may be appropriate.

5. Is the classification highly competitive, and are very large applicant pools expected?
• City-Wide Test: If yes, a city-wide test may be more appropriate.
• Position Based Test: If no, a PBT may be an appropriate alternative to a city-wide test.

6. Is the hiring need extremely urgent (e.g., 311 Call Center)?
• City-Wide Test: If no, a city-wide test may be more appropriate.
• Position Based Test: If yes, a PBT may be an appropriate alternative to a city-wide test.

7. Are there complex issues associated with this classification, or does past exam history indicate high sensitivity when testing this classification?
• City-Wide Test: If yes, a city-wide test may be more appropriate.
• Position Based Test: If no, a PBT may be an appropriate alternative to a city-wide test.
City and County of San Francisco
DEPARTMENT OF HUMAN RESOURCES
REQUEST TO HIRE FORM
Position Based Testing
Provisional Recruitment
Section I – Department Information
Instructions: Department completes Sections I through III for PBT Requests, and Sections I, II and V (on second page) for
Provisional Recruitment Requests. Send form to your Client Service Representative.
Department:
Division:
Date:
Hiring Administrator:
Phone No.:
Supervisor:
Phone No.:
Section II – Position Information
Division:
Section:
Class Code and Title:
Working Title (if applicable):
DHR Requisition Number(s):
Requisition Approved?
Yes
No
Special Condition, if applicable:
Section III – Request for Position Based Testing
Justification for requesting PBT:
Section IV – Position Based Testing Review & Approval (DHR Use Only)
PBT Approved:
Denied:
List ID Number:
List Type:
E
CPE
P
Scope:
Comments:
Signature of DHR Examination Team Leader:
Date:
Section V – Request for Provisional Recruitment
Justification for requesting Provisional recruitment:
Note: Oral Authorization will be required (Section VII)
Section VI – Provisional Recruitment Review & Approval (for DHR Use Only)
Provisional Recruitment
Approved:
Denied:
List ID Number:
Comments:
Name of DHR, Examination Team Leader:
Date:
Section I - Page 5
Section VII – Request for Oral Authorization (Use only for Provisional Hire request)
At the completion of a provisional selection process and prior to appointment, please complete this form and send
as an attachment to an email to your DHR Department Representative (please, no hardcopies). Your
Representative will obtain approval and return the form to you electronically or call if clarification is required.
Your copy of the approved form should be maintained for five years. Use the tab key to move between fields.
Request date:
Submitted by:
Appointment Information
No eligible list or test in progress
Pending completion of medical or background
Exam in progress; selected from applicant pool
Continued PV in same class, but different requisition
Selected from eligible list pending referral
Transfer, reinstatement, or reappointment pending
PCS
Check box to confirm that the MQs are identical to the last permanent job announcement. If not, explain any differences
below.
Recruitment Process (required)
Opening Date:
Final Closing Date:
Announcement posted within department
Announcement posted at DHR
Announcement distributed city wide
Announcement posted on City’s website (SFGOV)
Selection Hurdles (not all required):
• Screen for MQs – Analyst Name:
• Training and Experience (T&E) ratings
• Written examination
• Performance test (e.g., writing sample)
• Oral Interview
• Other

Applicant Statistics:
• Number of applications received
• Number meeting minimum qualifications
• Number of candidates invited to participate in process
• Number of vacancies (approved requisitions)
Candidates Selected (list only those to be hired, in the same order as the requisitions listed above):
Rank | Name | Current or previous employee? | SSN | State briefly why each individual was deemed more qualified than others in the pool of candidates for this position.
Department Certification: This provisional process was conducted in a fair and
open manner and the selection was based on merit.
By: ________  Date: ________

DHR Concurrence: Based on the information and discussions with the
department (when necessary), DHR concurs with this selection decision.
By: ________  Date: ________

Section I - Page 6
PROMOTIVE-ONLY EXAMINATION
PURPOSE:
To ensure that promotive-only examinations are announced in compliance with Civil Service
Commission (CSC) policy.
AUTHORITY:
The Civil Service Commission re-issued its policy on “promotive only” examinations in a
memorandum dated 3/26/99 (see Appendix II). It delegated authority to the Human
Resources Director to issue “promotive only” examinations.
PROCEDURE:
Promotive-only examinations are prohibited except as mandated by law or when there is
agreement by the concerned Department(s) and Employee Organization(s), subject to review
by the DHR Equal Employment Opportunity Unit and the Human Resources Director.
Therefore, the personnel analyst shall review qualifications and appointment status of all
employees in promotive classes in the occupational series, including provisional and exempt
employees, to:
(1) determine if there would be any problems qualifying them under the terms of a
promotive-only announcement, and
(2) evaluate the ethnic and gender composition of employees within the promotive
class and lower level classes to ensure the workforce is diverse.
The analyst shall also obtain written confirmation from the union stating that it agrees to a
promotive-only examination.
Upon receipt, the analyst shall:
(1) prepare a memorandum that includes an analysis of the above factors,
(2) point out relevant issues (e.g., if there are provisional employees who may not
be eligible to compete for the promotional examination based on the MQ
requirements), and
(3) submit the draft memorandum to his/her supervisor/manager for review and
approval.
The approved memorandum should be initialed and submitted to the Human Resources
Director for approval. (See next page for sample memo to the Human Resources Director)
Section I - Page 7
City and County of San Francisco
Department of Human Resources
Mayor
Human Resources Director
__________________________________________________________________
SAMPLE
DATE:
Month x, 20xx
TO:
______________
Human Resources Director
THROUGH:
_____________
Senior Personnel Analyst
DHR Recruitment & Assessment Services Unit
FROM:
_________________
(Hiring Department’s Personnel Supervisor/Manager)
SUBJECT:
Request for a Promotive-Only Examination for Class 7334 Stationary
Engineer
I request permission to conduct a promotive-only examination for class 7334 Stationary
Engineer. Eight employees in class 7333, Apprentice Stationary Engineer will soon complete
their four-year apprenticeship programs and become eligible to work at the journey level.
The Department of _______ requests that these apprentices have an opportunity to promote to
the journey level, since four years have been invested in their training. Some well-qualified
people are graduating in this apprentice class.
The pool of potential promotional applicants will be more representative of gender and ethnic
diversity than the population of 7334 Stationary Engineers.
The union has agreed to a promotive-only examination for this classification. [See attached.]
Should this request be approved, we plan to post an eligible list by ___ (month/year).
Section I - Page 8
EXAMINATION PLAN CHECKLIST AND
EXAMINATION REPORTING FORM
PURPOSE:
To provide personnel analysts with a document to guide them through the Position Based
Testing process.
FORM:
Position Based Testing Examination Plan Checklist
PROCEDURE:
Position Based Testing Examination Plan Checklist
Use the Position Based Testing Examination Plan Checklist (see below) as a planning
tool to help guide you throughout the examination process. Make note of the points
in the process where approval by the Department of Human Resources is required.
Complete the Examination Planning Form and submit it to the DHR RAS for
consultation and approval.
o The shaded task statements are the responsibility of the Department of Human
Resources or the authorized Decentralized Exam Unit supervisor.
o If the department has an authorized Decentralized Exam Unit supervisor, the
supervisor may review and approve the Position Based Testing process
administered by the departmental analyst. The authorized Decentralized Exam
Unit supervisor must sign off on the form indicating that the supervisor reviewed
and approved the examination documents and process at the required points
indicated on the checklist.
o Additional review points may be required depending on the experience of the
analyst and/or the issues or complexities associated with the exam.
The analyst may wish to indicate the date when each specific examination task is
completed.
Upon approval of the eligible list, the checklist should be filed for exam record
purposes.
Section I - Page 9
POSITION BASED TESTING EXAMINATION PLAN CHECKLIST
The outline below identifies the general steps and procedures to follow when conducting PBT examinations.
Important: Please note each step (indicated by a box with an X in it) where DHR review or attention is required.
DHR Review Point | Notes (i.e., Target Dates, Completion Dates)

PLANNING
1. Identify funding & budgeted position.
2. Identify appropriate job class.
3. Obtain concurrence on classification, if needed.
4. Issue requisition.
5. Determine if there are provisional employees or anticipated hires, and contact respective departments.
6. Obtain approval from DHR to conduct a PBT process for the position.
7. Obtain agreement from the union on the certification rule, if necessary.
8. Identify Subject Matter Experts & schedule position analysis and exam development meetings.
JOB ANALYSIS AND DEVELOPMENT OF THE
RECRUITMENT PLAN
1. Review prior announcement, recruitment file
and exam active file, if available.
2. Review job duty information, KSAs, and
minimum qualifications based on existing
documentation. Determine if amendments to
the documentation are needed.
3. Complete position analysis documentation
forms and attachments.
4. If position analysis differs significantly from
the job analysis documentation, confer with
DHR before drafting announcement.
5. Assess recruitment needs (e.g., check
previous recruitment and workforce statistics
and confer with Subject Matter Experts, if
needed).
EXAM DEVELOPMENT AND
ANNOUNCEMENT
1. Develop test (i.e., T&E, oral exam, etc.).
2. Develop the proposed PBT announcement.
3. Upon request, submit to DHR that portion of the Compliance Review Form covering position analysis, recruitment plan, test information (e.g., KSA/test linkage, test questions, rating guidelines, etc.), rater information, announcement, and exam plan. DHR will review and respond.
4. DHR will check for holdovers prior to issuance of the list ID, and will consult with the department if provisional employees exist.
Section I - Page 10
5. DHR assesses the need for updates to the class specification and, if needed, posts the amended class specification in advance of the announcement posting.
6. The Department may wish to send a courtesy
copy of the announcement to the union
approximately five days prior to the
scheduled posting of the announcement.
[However, this is not a requirement.]
7. Issue and post the announcement in JobAps.
RECRUITMENT
1. Distribute the announcement to outside parties (media, professional organizations, professional magazines, etc.), if needed.
2. Respond to any announcement inquiries.
3. Notify and copy RAS with respect to any protests or appeals received.
4. Review protests and conduct research as necessary. Draft and forward a reply to RAS for review and approval. As appropriate, complainants are notified of rules relating to appeals.
5. DHR or CSC determines if exam process or
list issuance will continue with a pending
appeal.
APPLICATION SCREENING
1. Collect applications and/or resumes from
interested job seekers.
2. Screen applications for minimum
qualifications.
3. Notify applicants who do not meet the minimum qualifications via email (or
letter) and provide five days for clarification/verification.
4. Respond to questions and issues from
applicants.
5. Enter applicant disposition information in
JobAps.
6. Notify candidates of time/date/place for
exam process.
EXAMINATION ADMINISTRATION
1. Implement logistics of exam such as securing
exam site; recruiting proctors; contacting
and/or recruiting raters; and preparing exam
materials, etc.
2. Train raters and proctors.
3. Administer the exam process. Inform candidates in writing that protests involving
the manner in which the test is administered must be filed prior to the candidate
leaving the test center. This policy is designed to address issues on-the-spot rather
than after-the-fact, when there may be no remedy.
Section I - Page 11
4. Bring all test administration protests to the attention of RAS.

DATA ENTRY AND SCORING
1. Score exam results and apply the appropriate pass point.
2. Upon request, submit to DHR a Compliance
Review Form to document the exam process,
any problems encountered and the
justification for the pass point.
3. Add promotive and veteran points if needed. City employees with six consecutive
months (1040 hours) of verifiable experience in any job classification in any
appointment type qualify for promotive points.
4. Enter exam results into JobAps.
ELIGIBLE LIST AND RESPONDING TO
APPEALS
1. Send out exam results and inspection
materials to candidates. Inform eligibles of
the eligible list’s duration and/or the duration
of their eligibility. Exam participants have
five working days to verify the accuracy of
their score calculations.
2. Appeals to the CSC at this time can only be
about claims of inconsistency in exam
administration, bias of raters and/or failure of
raters to apply uniform standards. All other
protests regarding exam administration will
be submitted to the HR Director.
3. Notify DHR of any protest; confer with DHR
on response and forward draft response to
DHR for review.
4. Direct appeals of inconsistency of the exam
administration, bias of raters and failure to
apply uniform standards to the CSC. Notify
DHR of appeals directed to CSC, and
unresolved protests of any other matters.
5. Prepare materials for presentation to the
Civil Service Commission in response to
appeal and forward to DHR for review.
6. Conduct Review of Ratings.
7. Send Authorization to Adopt the Eligible
List via email to DHR’s EIS and Referral
Units. DHR adopts and posts the eligible list.
REFERRAL AND FINAL SELECTION
1. DHR issues Referral to departments within
15 business days from date of adoption.
2. Conduct hiring interview, make selection
decision and notify DHR.
DOCUMENT PROCESS
1. Document unusual circumstances, if
applicable.
2. Prepare and maintain active file.
3. Retain storage materials. Follow record retention schedule.
Section I - Page 12
SECTION II
Job Analysis
Section II – Page 1
SECTION II – JOB ANALYSIS
TABLE OF CONTENTS
JOB ANALYSIS………………………………………………………………………...3
Frequently Used Terms…………………………………………………………………4
Updating Job Analyses………………………………………………………………….5
New Job Analyses……………………………………………………………………….6
SUBJECT MATTER EXPERTS……………………………………………………….9
Subject Matter Expert Forms…………………………………………………………..11
SME Background and Qualifications Form……………………………………………12
Test Security Agreement and Statement of Responsibility Form……………………...13
Role of the Job Analyst………………………………………………………………...16
DESCRIBING JOB BEHAVIORS…………………………………………………….17
Writing Critical Incidents………………………………………………………………17
Writing Task Statements……………………………………………………………….18
WORKER CHARACTERISTICS……………………………………………………..19
Writing KSA Statements……………………………………………………………….19
Evaluating KSA Statements……………………………………………………………20
On-The-Job Learning or Brought-To-Job?…………………………………………….20
GENERAL INTERVIEWING GUIDE FOR AN ANALYST…………………………22
JOB ANALYSIS FORMS……………………………………………………………...24
MINIMUM AND DESIRABLE QUALIFICATIONS…………………………………30
Minimum Qualifications……………………………………………………………….30
Minimum Qualifications with Special Conditions…………………………………….32
Common Pitfalls Associated with Minimum Requirements…………………………...34
Qualifications Forms…………………………………………………………………...35
Desirable Qualifications……………………………………………………………….37
ESTABLISHING A CERTIFICATION RULE………………………………………..38
VERIFICATION AS PART OF THE APPLICATION PROCESS……………………39
SELECTION PLAN EVALUATION………………………………………………….40
Selection Plan Worksheet……………………………………………………………...42
Section II – Page 2
JOB ANALYSIS
Job analysis is a generic term for the process of critically examining job components to provide a
functional description of a job. The job analysis is a basic tool of personnel administration
which can serve a number of personnel needs including job classification and specification,
worker compensation, training, performance evaluation, and of course examination for
employment and promotion. Because of the diversity of jobs and job analysis applications, a
variety of job analysis strategies, methods and formats exist.
All job analysis techniques deal with behavior, either directly or indirectly. Behavior is dealt
with directly when the job analysis specifies as its elements those activities which are necessary
for achieving the objectives of the job. Behavior is dealt with indirectly when work products are
used as elements in place of behaviors which are not directly observable. The descriptions of the
job behaviors spelled out in the job analysis are the foundation blocks upon which the analyst
constructs the selection instrument. This is the basic linkage between the job and the test.
In the City and County of San Francisco, selection procedures are generally developed with a
content validity approach. This manual, therefore, shall focus on one type of job analysis which
is legally defensible and supports this type of validity. It supports validity because the job
analysis procedures capture the most important aspects of the position(s) which are then reflected
in what the selection instrument measures. In other words, the validity of the selection
instrument is inferred from the nature of the test content’s relationship to the job analysis
information. It should be noted that, although this manual focuses on job analysis for specific
positions under the Position Based Testing Program, concepts underlying the process are the
same as those for the analysis of all positions in a class.
Basic Principles
• Job analysis is a process; not a product.
• Job analysis is crucial to the establishment of content validity.
• Content validity job analysis begins with observable job behavior.
• Content validation requires that a rational and logical linkage be made between observable
job behaviors and the selection instrument.
Elements embodied in a job analysis, such as tasks and worker characteristics (KSAs), must
never be so vague or abstract that the behaviors, and hence the job itself, cannot be perceived.
Section II – Page 3
Frequently Used Terms
To avoid confusion, analysts should be clear on the descriptions of terms used to describe job
analysis and its components. Here are some definitions of common terms:
A position is a related set of purposeful activities for an individual. Only one individual
occupies a position, although it is possible for a position to be vacant. Several positions
may entail performance of the same job.
A task constitutes a discrete unit of purposeful (for the organization) activity. It is
behavior that produces a result. All the job tasks taken together constitute all the
activities of the job. “Task” is used synonymously with “behavior,” “function,” or
“activity”; to the extent that “task” differs from the other three terms, the difference is
that “task” connotes accomplishment of the organization’s purpose at a basic level.
A dimension is a defined category under which behavior can be reliably classified. The
set of behaviors classified under the dimension operationally define the dimension.
A duty is a set of related tasks. Tasks are elementary components; duties are components
taken at a higher level of abstraction. “Duty” and “responsibility” are taken as
synonymous.
The principal divisions of worker characteristics are knowledge, skills and abilities,
frequently referred to as KSAs. In order to allow for the possibility of other worker
characteristics (e.g., personality variables), some agencies include “other”; in this case,
the generic designation for worker characteristics is KSAO. Knowledge is a body of
information applied directly to the performance of a function. A skill is a present
observable competence to perform a learned psychomotor act. An ability is a present
competence to perform an observable behavior or a behavior which results in an
observable product.
Subject Matter Experts (SMEs) – are typically incumbents and supervisors who
provide important job information in the job analysis process. Although incumbents and
supervisors are termed “Subject Matter Experts” with respect to the job they do or
supervise, this term is something of a misnomer because incumbents, even superior
performers when in the entry level of their title series, are not necessarily “experts.” The
term “SME” is also used within the test development field to designate consultants who
develop test material.
Section II – Page 4
Updating Job Analyses
Jobs do change, at least in some of their aspects over time. For example, it sometimes happens
that a job will change radically in a short span of time, as when there is major organizational
restructuring or when there is implementation of new technology. In other cases, jobs may
change at a glacial rate, one minor aspect at a time. In these latter cases, a new job analysis need
not be completed each time a test is held for the title. In other words, a prior job analysis can be
verified to see if change has occurred since it was last done and, if necessary, it can be amended.
An expedited data gathering process or a previous job analysis may be employed to update a job
analysis conducted within the last five years if the basic duties and responsibilities and
requirements of the position as described by the last job analysis have not changed substantially
in that time. Analysts should consult their team leaders or DHR to determine if an expedited
process is appropriate.
For expedited data gathering, it is recommended that analysts, at a minimum, do the following:
Review previous examination files (i.e., Active Folder information) - to obtain prior job
announcements and job analysis reports which should include tasks determined to be
important to the job as well as the knowledge, skills, and abilities needed to perform these
tasks. These can provide a tentative list of job tasks and KSAs, as well as the names of
SMEs used for job analysis and/or test development. The analyst should pay particular
attention to how the prior job analyses were conducted and whether the job analysis elements
were clearly described. [Note: Although a prior file can serve as a good starting point, it
should be remembered that it documents the job only as it existed at the time of the last
examination administration. If this occurred some time ago, then the information may be
outdated and have limited utility.]
Review the class specification for this position – to obtain major job activities and the KSAs
to perform these activities. Also, the minimum qualifications (MQs) stated in the
specification should always be the same as that of the target announcement with certain
exceptions (e.g., manager titles, IT titles, special conditions). [It should be remembered that
job specifications should be considered as providing “bare bones” information for selection
purposes. Because job specifications are oriented toward classification and perhaps
compensation as well, descriptions of activities may be written so broadly that they do
not identify work behaviors specific to a particular position within that class. It is also quite
possible that the job description is out-of-date. Indeed the current job specification may be
quite different from the one that existed at the time of the original job analysis.]
Review position classification information - such as Job Analysis Questionnaires, recent job
descriptions or performance appraisals if available, and/or recent provisional announcements.
Review class specifications and announcements for other classes in the same series, both
above and below the classification for the position. The duties of the class tested under
Position Based Testing should be related and consistent when compared with duties and
responsibilities of the class in general.
Section II – Page 5
Conduct a Telephone or Personal Interview – Given the concerns noted with regard to the
review of active examination files, existing job specifications, etc., the job analyst should
gather and verify job analysis data by conducting a telephone or personal interview with a
Subject Matter Expert. Providing the SME time to review in advance the job analysis
questionnaire/booklet with draft statements of job behaviors and worker characteristics will
speed the process considerably. If a personal interview is planned, it is also recommended that
the analyst ask the SME in advance to bring, as appropriate, necessary job-related documents
(e.g., training materials, organizational charts) to the interview.
Ideally, the SME used for verification purposes will be the one who participated in the
original job analysis, although this is not a necessary condition. In any event, the SME is to
examine the prior job analysis for changes in job behaviors, worker characteristics, linkages,
and ratings. The SME should be alerted to specify new versions of manuals or other changes
that might not be reflected in the wording of the last job analysis. The SME might find minor
changes in the job since the initial job analysis but they may be of no consequence. What
constitutes “no consequence” is a matter of judgment; however, the following should be used
as a guide:
 Changes involving on-the-job learning status or ranker/qualifier status are never
trivial.
 Where job behaviors or worker characteristics are rated on scales, a fluctuation of one
scale point is trivial unless that change has impact on the test development plan.
 Any change in a job behavior or worker characteristic must be noted. Additions or
deletions must be noted. Any change in linkages must be noted.
New Job Analyses
For classes that have not been studied within the last five-to-seven years, a new job analysis
should normally be conducted. When a new job analysis is performed, the analyst will often need
to select the type of data collection method based on the amount of job-relevant information
already available from previous job analyses and the title’s sensitivity (e.g., number of appeals
associated with the title). The amount of time that an analyst has to spend on a job analysis,
however, is often the most significant factor in deciding which data collection method to use.
Typical methods used to collect job-relevant information in the job analysis process are
presented below. More than one method may be used in any given job analysis.
Questionnaire – Those knowledgeable about the job provide answers to questions presented on
a standardized form, known as either a job analysis questionnaire or a job analysis booklet.
Although the opportunity for personal interaction is lost, the time saved by avoiding such
interaction may be an important consideration. The questionnaire is particularly useful when
the duties of the position have not changed much since the previous job analysis. Although
highly unlikely in the case of PBTs, questionnaires are also very useful when there are numerous
SMEs to be contacted and many positions to be covered.
This procedure can be taken a step further, by mailing or electronically forwarding the job
behavior and worker characteristic statements to the SMEs in advance of a panel session or
interview (discussed later in Section II). Indeed, by drafting a tentative set of job behavior
and worker characteristic statements into a questionnaire, the analyst gives SMEs an initial
frame of reference, and the need to reconcile differences of language can largely be eliminated
where no substantive issues are involved. In this way, the SMEs can do much of the job analysis
before the formal session. The session then has as its primary purpose the integration, rather
than the generation, of information.
It is essential that the supervisors of the SMEs allow on-the-job time for completion of the
materials. It is always good practice for the analyst to notify the HR office in advance of any
plans to use SMEs from that department for job analysis purposes and to alert the HR office of
the analyst’s plans to forward the questionnaire to the SMEs. In many instances, the HR office
may be asked to recommend particular SMEs. Such notification, regardless of the job analysis
method, also allows the HR office to determine whether there are any concerns about using a
particular SME. It is also essential that SMEs who complete the questionnaire have the reading
and communication skills to work with the job analysis materials (if this is not the case, this
method should not be used). Detailed instructions must be prepared for the SMEs working on
the material. A “due date” for completion should also be provided, along with the job analyst’s
contact information.
Group discussion – The components of the job are brought out through a discussion or
brainstorming session with incumbents, their supervisors, or others knowledgeable about the job.
It is preferable to meet jointly with all SMEs when conducting job analyses because a group has
the advantage of providing information from different perspectives that can be immediately
discussed by all. That is, group discussion allows for immediate feedback on the information
generated. Consequently, it tends to produce quality information. However, meeting with a
group of SMEs can be quite time-consuming. Structuring the group or panel discussion (for
example, by starting the panel with a draft set of task statements and KSAs selected from job
specifications or other documentation) can often speed the process. Large SME groups and
multiple SME panels are not discussed further here, given that this approach is more likely to
be used in association with Class-Based Testing.
Interview – Incumbents and supervisors are asked a series of interactive questions about the job.
This method allows the analyst to gather details about the job which may be especially important
if the job is highly technical. In fact, an interview can probe in greater depth, yielding
information superior to that which can be obtained in a panel situation. The interview
may be done via a telephone call but a personal interview is preferable because it also allows the
analyst to examine job related material (procedure manuals, equipment, forms) while someone
familiar with that material explains its usage. When possible, the analyst should interview at
least two incumbents, and interview them together. This gives two perspectives on the job and
enables the incumbents to cross-check information with each other.
The job analyst, at a minimum, should have gathered information from the job specification or a
previous job analysis, prior to speaking with the SME. Ideally, the analyst should prepare a draft
questionnaire with tentative task and/or KSA statements in advance of the interview. This will
allow the SME to verify the accuracy of the conclusions drawn by the job analyst from his/her
review and synthesis of job-related data gathered.
The interview must have as a minimum goal a set of job behavior statements and a
differentiation of those statements by task frequency and importance. The generation of worker
characteristics should be done as well along with evaluations of those KSAs in terms of
importance at entry, usefulness in differentiating levels of work performance, and usefulness as a
qualifying or ranking element in selection. It should be noted that various other scales do exist
(e.g., Independence/Supervision of tasks performed), many of which are variations on a theme
(e.g., Learning Time/Difficulty). However, the task and KSA scales identified in the job analysis
forms presented later in this section are the ones recommended for the majority of titles that fall
within the PBT program.
It should be noted that there are several ways in which an analyst may finalize the task and KSA
statements:
 As indicated previously, it is recommended that the analyst do homework beforehand and
have a list of tentative tasks and/or KSAs available in advance of the interview; this
provides a structural guide that can be amended as the interview proceeds.
 It may be feasible in some cases to send a questionnaire to the interviewee, have it returned,
and then conduct the interview to extend and clarify the questionnaire information.
 The analyst always has the option of gathering questionnaire information during the
interview, editing the statements after the interview, and sending the edited statements back
to the SME for review and/or evaluation. Upon completion, the SME would return the
questionnaire to the analyst.
Observation – The job analyst observes the activities of one or more people and records the
observations. This method gives the analyst a direct understanding of what is done by someone in
the target position. It is especially helpful with jobs involving physical activity and the use of
tools and equipment. However, it is very time consuming. When possible, the analyst should
observe a complete work cycle, noting tools used and activities performed. Because of the time
commitment, direct observation may not always be a viable option for PBT recruitments.
Literature Review - As previously noted under the discussion of expedited data gathering,
various job descriptive material may be used. This can include job specifications, position
questionnaires, prior job analyses, research reports, announcements and class specifications from
other agencies, job descriptions outlined in the Dictionary of Occupational Titles, training
manuals, etc. This method is particularly useful when working with a new class or a class that
has only a small number of available SMEs.
SUBJECT MATTER EXPERTS
It is impossible to do a good job analysis without considering what people are actually doing on
the job, and virtually impossible to do a good job analysis without some form of interaction
between the analyst and those with first-hand knowledge of the job. As previously indicated,
incumbents and supervisors typically serve the purpose of providing this important job
information. Incumbents and supervisors are termed “Subject Matter Experts” (SMEs) with
respect to the job they do or supervise. This term is something of a misnomer; incumbents, even
superior performers at the entry level of their title series, are not necessarily “experts.” The
term is also used, within the test development field, to designate consultants who develop test
material.
The analyst needs SMEs to generate, add to, delete from, modify or confirm materials
comprising the job analysis, which could consist of job behaviors (task statements), critical
incidents, job elements/sub-elements, activity elements, process statements, or worker
characteristics (KSAs). Consequently, a SME should have first-hand knowledge about the job;
if possible, each SME should have at least one year of experience performing or supervising the
job. Of course, when supervisors are used, the general rule is to try to use supervisors who
supervise the position. However, for some jobs, a representative of higher management may be
appropriate (e.g., where there are suspected differences in job activity by unit, shift, location,
work crew, etc., and the employee may be subject to reassignment). Also, in Position Based
Testing, the hiring manager may sometimes be the only individual who can serve as a subject
matter expert when no permanent incumbents are available. Additional considerations in the
selection of SMEs:
Permanent incumbents and supervisors should be used to:
 develop job tasks and the KSAs required to perform those tasks
 rate those tasks and KSAs in terms of importance, frequency, etc.
 establish linkages between the important tasks and KSAs, and the content used in the
selection process.
If possible, it is often advantageous to use supervisors who were promoted in the not-too-distant past from the job being analyzed.
Provisional employees, as a rule, should not be used. However, if no Permanent Civil
Service or supervisory SMEs are available, a provisional employee may be used only to
identify job tasks. Provisional employees should never be used to rate or weight tasks, or
to identify, rate, or weight KSAs, since they may be eligible to take the examination.
DHR should be consulted prior to any consideration concerning use of provisional
incumbents in the job analysis or test development process. Exceptions may be made if
the provisional employee signs a form or statement electing not to participate in the
examination.
In unusual circumstances, SMEs from other departments or jurisdictions who are at or
above the level being tested may be used.
Incumbents should have permanent status and preferably should be superior performers.
When panels of SMEs are used in the job analysis, they should be representative of the
ethnic and gender composition of the workforce and of the range of experience, including
different position locations, shifts, etc., if and when appropriate.
While analysts may meet jointly with all SMEs when conducting job analyses, it is often
a good practice to meet with the supervisor first, separately from the employees to avoid
any distraction or inhibition on the part of incumbents. The supervisor can initially
provide an overview of the job; give a perspective on how the job under consideration fits
in with other jobs within the organization; indicate what is expected by management from
incumbents in the job; and discuss special conditions, problems, procedures, or tools
associated with the job.
An important point to keep in mind is that the interviewer should elicit a description of
the job behaviors from the supervisor, rather than have the supervisor merely confirm the
analyst’s summary of the duties and job behaviors, or merely confirm what was contained
in a previous job analysis or class specification.
Interviewing supervisors without interviewing employees should be done only when
there are no permanent incumbents or the position is structured so explicitly that it is not
subject to modification by the employee’s style or approach.
SUBJECT MATTER EXPERT FORMS
PURPOSE:
To ensure that participants in the job analysis and exam development
process are qualified and have no conflict of interest, and to document
participation and communication of confidentiality requirements.
FORM:
Subject Matter Expert forms
PROCEDURE:
All SMEs must complete the following two forms:
Subject Matter Expert Background & Qualifications
Test Security Agreement and Statement of Responsibility form.
The examination analyst provides the SME(s) with a copy of the
completed forms and files the originals with his/her examination records.
SUBJECT MATTER EXPERT
BACKGROUND & QUALIFICATIONS FORM
Classification Title _____________________________________________
Classification Number ___________________
Date:
________________________________
Name: _______________________________
Work Phone: __________________________
Current job class #: ______________________________
Title: ______________________________
Sex:  Male   Female
Ethnic group:  White   African-American   Hispanic   Asian/Pacific Islander
 Filipino   American Indian/Alaskan Native   Other _______________________
Department: ____________________________
Section/Division: ________________________________
Years of work experience in tested class:
_________________
Years of experience supervising tested class: ________________
Briefly describe your experience, training and education in this area:
________________________________________________________________________________
________________________________________________________________________________
Signature: _______________________________
Date signed: _______________________________
--------------------------------------------------------------------------------------------------------------------------------
FOR DHR USE ONLY
Participated in:
First Job Analysis Meeting
Second Job Analysis Meeting
Test development: indicate type(s) of test
Test review: indicate type(s) of test
City and County of San Francisco
Department of Human Resources
Test Security Agreement and Statement of Responsibility Form
For
Test Development and Administration Staff
I understand and expressly acknowledge that:
The loss or disclosure of examination information or material, unintentional or
otherwise, is a very serious matter as it can render a test invalid and useless. Since
examinations represent a significant investment in time and money to develop and
administer, any loss of test security can be very costly, disruptive and harmful to the
operations of the Department of Human Resources (DHR) and/or other City and
County of San Francisco (CCSF) departments. Further, the loss of examination and
exam-related material undermines the public’s trust and confidence in this
Department, the CCSF and the merit system.
Similarly, applicant or candidate information (including, but not limited to, test
answers, test scores, and personal information such as addresses, social security
numbers, disabilities, etc.) must be safeguarded and kept confidential.
Everyone who is involved in test development and administration therefore has a
special responsibility to uphold the public trust and merit system principles.
Moreover, test development and administration staff is legally and ethically obligated
to protect examination material and to maintain the confidentiality of applicant and
test-related information. Everyone who participates in test development or
administration must protect the value of secure examinations and observe security
precautions when working with tests and test-related information. Indeed, there is
no time during a test’s development or administration that the security of an
examination or examination-related material is not the responsibility of those who
are entrusted with these activities.
I am aware of the confidential nature of my work and therefore expressly acknowledge
that:
1. I may be given access to confidential data, examination material or test-related
information in association with work that I perform for the CCSF. Any
such material or information to which I am given access is the property of the
CCSF. Dissemination of this information or material to persons other than
designated, authorized CCSF representatives is strictly prohibited.
2. I may be a party to conversations or discussion wherein confidential
examination-related information is discussed. Dissemination of the content of
these conversations to persons other than authorized CCSF employees is
strictly prohibited. I am strictly prohibited from disseminating the contents of
these conversations to third parties without the express written consent of
authorized Human Resource representatives of the CCSF.
3. I must keep the content of all test questions and test-related material involving
examinations developed or administered by the DHR and/or other CCSF HR
departments in the strictest confidence. I may not discuss or otherwise make
available to anyone outside of my test development or administration duties
any test questions, answers or stimulus material.
4. I am responsible for maintaining the confidentiality of all examinations and
examination-related material and must never leave such material or
information unattended or unsecured. These materials must be properly
safeguarded at all times.
5. I may not participate in any way in any examination for which I, a relative, or
a close personal friend has applied. Should I discover that a relative or
close personal friend of mine intends to apply, or actually does apply, for a
particular examination in which I am or may be a participant in either test
development or administration activities, or in which I otherwise have access
to examination materials, I must immediately notify DHR’s Director of
Recruitment and Assessment Services.
6. It is against the law to have in one’s possession at any time a copyrighted test
item or protected intellectual property for which one does not have
permission. Test administration staff may only have such permission at the
test center. Simply memorizing a test question and/or answer and writing it
down later can be construed as a violation of copyright law. If test
administration staff is asked to describe a test, test item(s) or answer(s) and
the requested information is provided, this also may be construed as
copyright infringement or a violation of intellectual property rights.
7. Study groups, test preparation businesses, etc., are known to try to acquire
test material. I understand that I must be especially careful in my interactions
and conversations with others regarding my test-related work and report any
unusual occurrences regarding inquiries, etc. from anyone who is not an
authorized Human Resource representative. Neither I nor, to the best of my
knowledge, any of my relatives or close friends have any personal or
business affiliation or relations with any group or organization which may
have an interest in examinations developed or administered by DHR and/or
other CCSF departments that conflicts with the objectives of preserving test
security and maintaining the confidentiality of test-related information.
In addition, I understand that I am subject to Department policies and State and local
laws and rules governing the conduct of public officers and employees, including but not
limited to:
Political Reform Act, California Government Code § 87100 et seq.;
California Government Code § 1090;
San Francisco Charter;
San Francisco Campaign and Governmental Conduct Code;
San Francisco Sunshine Ordinance;
Statement of Incompatible Activities;
Civil Service Rules; and
Any applicable departmental policies.
I understand that, upon discovery, I am required to report any violations of the
provisions stated herein committed by others to the Director of Recruitment and
Assessment, Department of Human Resources.
I understand that engaging in the activities that are prohibited by this Agreement and
Statement of Responsibility, or non-adherence to the terms of this Agreement and
Statement of Responsibility, may subject me to discipline, up to and including possible
termination of employment or removal from office, as well as prosecution, monetary
fines and penalties.
By signing this agreement and statement of responsibility, I acknowledge that it has
been received, read and understood.
PRINT NAME_______________________________________
SIGNATURE_____________________________
DATE______________________
ROLE OF THE JOB ANALYST
When possible, the analyst should prepare draft job analysis documents for the Subject Matter
Experts to review in advance of the meeting. Possible changes to this information can be
discussed during the job analysis meeting. Analysts may also want to gather and provide each
SME with a copy of the following documents:
previous job analysis report
previous examination announcement
classification documents submitted for the position
Although an analyst may find an incumbent’s performance plan or appraisal useful to learn the
duties and responsibilities of a position, these documents should NOT be provided to the Subject
Matter Experts.
When first meeting with SMEs, generally, the analyst should explain:
what a job analysis is
what the components of a job analysis are
what the SME role in the process is
how the job analysis information will be used
the need for confidentiality and why SMEs must complete necessary forms (i.e.,
Background and Qualifications, and Test Security Agreement and Statement of
Responsibility forms)
In general, the analyst must carefully review and assess job analysis information provided by
SMEs and be in a position to confirm and/or contest the following:
General description of the position – Does it align with the class?
Essential duties and job behaviors – Are they appropriate to the class and position?
Knowledge, skills, and abilities – Are they linked to the duties and behaviors?
Minimum qualifications – Are they tied to the job duties?
Desirable qualifications (For possible use in the post-referral process) – Are they related
to the job?
Legally mandated licenses, certificates or other requirements
For documentation purposes, the analyst must include the following information and file it for
record purposes:
Who conducted the job analysis and contact information for that individual
Date(s) and location(s) of the job analysis meetings
A description of how the job was analyzed
Steps taken to ensure the accuracy and completeness of the collection, analysis and report
of the data and results (e.g., SME instructions)
Identification of all job analysis participants (e.g., incumbent or supervisor status, race,
gender, tenure) and how they were selected for participation
For each SME, a completed Background and Qualifications form and a Test Security
Agreement and Statement of Responsibility form.
Signed and dated job analysis rating forms completed by SMEs
If a questionnaire was used to collect job analysis data, a copy of that questionnaire and
all of its rating scales should be documented, along with data collected by that
questionnaire
Descriptions of the work behaviors, tasks, and/or work products identified as important,
along with measures of their criticality and/or relative importance and level of difficulty.
Where appropriate, description of the knowledge (in terms of a body of learned
information), skill or ability (in terms of observable behaviors) that are used in the work
behavior(s) and their relative importance
Evidence of how the KSAs are related to the job behaviors
When a job analysis is updated, documentation regarding each SME’s revision of the
tasks, KSAs, ratings, and linkups. [In some cases it may be important to note the reason
for the change(s)]
Significant changes to the job analysis often will require an official amendment to the job
specification. In particular, changes to the minimum requirements will require such an
amendment. In these cases, the analyst should consult with the Department of Human Resources.
Based on the results of the job analysis, some classes may be deemed suitable for flexible
staffing. Generally, flexible staffing allows employees in trainee or entry-level classes to
progress to the journey level without an additional Civil Service examination process. Consult
with the Department of Human Resources, Recruitment and Assessment Services Unit if you believe
that the recruitment you are working on may be appropriate for flexible staffing. Flexible
staffing will be a rare occurrence under Position Based Testing, but it may be possible in
certain situations involving highly specialized positions within a class series.
DESCRIBING JOB BEHAVIOR
As noted above, one of the job analyst’s key objectives is to obtain information about the worker
activities or behavior performed on a given job. Another is to seek information as to the job
requirements, which are measured in terms of needed worker characteristics.
Writing Critical Incidents
When the analyst wishes to focus on a worker’s primary activities or behavior, but not
necessarily on worker characteristics, the Critical Incident method of information gathering is
often used. This method is particularly applicable to physical or operational activities which are
part of a job but are not specified by a task approach to describing a job.
When writing a behavior description using this technique, the following basic components should
be included:
 The situation, circumstances or the setting of the “story”
 The behavior or activity performed
 The consequence or result of the behaviors
To elicit critical incidents, SMEs may be asked one of the following questions:
 “Think of an incident indicating outstanding (extremely effective) performance in the
___________position. Describe the situation, the behavior and the consequences. It may
help to think of a person who is outstanding, and then think of incidents observed.”
 “Think of an incident that indicated less than effective (poor) performance by a _______.
Describe specifically the situation, the behavior, and the consequence.”
Here are two examples of critical incidents, corresponding respectively to the questions above:
“A modification in the salary grades produced some discontent among employees. After
detecting this discontent, the manager brought his supervisors together for a meeting. In the
meeting he explained both the benefits and drawbacks of the new compensation
arrangements. He then asked his supervisors to return to their units and explain the new
compensation plan. As a result, morale was improved and the employees were more satisfied
with their treatment by management.”
“When changes in salary ranges were made, a supervisor explained to his subordinates that
the change was a matter of company policy and that no actions on their behalf could be
taken. As a result, morale deteriorated in this department.”
Writing Task Statements
A typical way of describing job behavior is by a task statement. A task statement should reflect
a single, discrete, purposeful activity. It is the smallest component of a purposeful job activity
but not so small that it no longer meets the definition of “purposeful work.” A common-sense
way to decide on the scope of a given statement is to ask oneself, “Would one normally be
considered to be paid to do that activity?”
There are no firm rules on the number of task statements to be used to describe a job. However,
a practical way to decide how many is to divide the job into broad functional areas (duties) and
then determine the minimum number of task statements necessary to cover the purposeful
activities in each area. An analyst has a sufficient number of statements when they are
collectively exhaustive and mutually exclusive. That is, the set is collectively exhaustive when
it includes all the significant purposeful activity that comprises the job, and mutually
exclusive when none of the task statements overlap.
At a very minimum, a task should contain an action and an object of that action. However, a
widely used format involving task statements includes five components that are combined to
form a simple declarative sentence, as seen below. Note that the task statement generally starts
with a verb in the third person singular; the subject of the sentence is not written, but is
always implied to be “the worker.” The following are the five commonly accepted components of a
task statement.
Performs what action? – This is the verb that tells what action is being done. The verb should
have a single, unambiguous meaning. If the verb is indefinite, unclear or ambiguous,
misunderstandings may occur, or else the verb might not really say anything. Below are some
common verbs that are usually not definite enough for task statements. Their use should be
avoided or clarified.
administers, analyzes, arranges, assesses, assists, assumes, assures, audits, collaborates,
cooperates, coordinates, counsels, develops, discusses, examines, facilitates, follows up,
handles, helps, interviews, investigates, keeps, maintains liaison, manages, participates,
prepares, processes, is responsible for, reviews, studies, supports, teaches.
To whom or what? – Grammatically, this is the indirect object, or a phrase that serves the same
purpose. Not all task statements will allow for this component.
To produce what?/Why? – This component tells the purpose of the activity and, grammatically,
is usually the direct object of the verb.
Using what tools, equipment, work aids, or procedures? – Things mentioned should be specific and
tangible, if possible. If there is no tangible and identifiable tool or procedure used, then this
component may be omitted from the task statement. KSAs should not be substituted for tools or
procedures.
Upon what instructions? – Instructions may be directions from a supervisor or procedures
specified in a document.
Two examples of task statements using the component format described above:
“Evaluates proposed construction projects to determine environmental impact according to laws
and regulations.”
“Initiates contact with client to explain program services either in person or by telephone taking
the names from ‘Intake Roster’.”
WORKER CHARACTERISTICS
Writing KSA Statements
Task statements describe behavior; KSA statements describe the inferred qualities which
underlie the behaviors. KSA statements do not have a standardized format, but there are a
number of considerations to keep in mind when writing them:
Scope and Number of KSA Statements – In any job, it is usually possible to infer a large number
of worker characteristics that have “something” to do with the job; how large a number depends
on the level of abstraction used. The key is to find those for which the degree of possession
will lead to discernible and significant differences in the quantity or quality of job performance
(ranking KSAs), as well as those that represent essential characteristics for meeting a minimum
level of job competency (qualifying KSAs).
Specificity – It is important that the KSA statement be specific, terse and direct, telling exactly
what characteristic is needed. This does not mean that a KSA needs to be specific to one task.
KSA statements should not be so nebulous that, although related to task performance, there is
confusion over exactly what is required. For example, writing “mathematical ability” may leave
us wondering what that means, especially when the true requirement is the “ability to add and
subtract whole numbers.”
Overlap – A KSA statement should never be overloaded by having unrelated characteristics
subsumed within it. As far as possible, the KSA statements, as a set, should be mutually
exclusive and collectively exhaustive, the same goal set for task statements.
Linkage – The KSA must be “linkable” to at least one job behavior; if it cannot be linked to any
behavior it definitely does not belong in the job analysis.
Intrinsic or Ancillary – Some KSAs may be linked to behaviors without having much bearing on
those behaviors. Such worker characteristics are not important to the performance of the
behavior (i.e., they are not intrinsic to the behavior). This is not to say that these worker
characteristics have no relation to the behavior; rather, they are "nice-to-know" but trivial,
ancillary details associated with the job. If a KSA has a direct bearing on performance of the
behavior, however, it should be included in the job analysis. [Another way of handling trivial or
ancillary KSAs is to include all KSAs in the job analysis, have them rated for importance, and
then drop out the trivial KSAs based on those ratings.]
Evaluating KSA Statements
When a worker characteristic is proposed during a job analysis it is necessary to evaluate that
worker characteristic in several ways:
1) To determine if the worker characteristic is really part of the job;
2) To edit the worker characteristic statement so that it constitutes a precise operational
definition in terms of job observables;
3) To determine if the worker characteristic can be used to rank candidates in order of merit
for the job (ranker) or if it is a pass/fail qualification for minimal job competency
(qualifier); and
4) To determine if the worker characteristic is acquired through on-the-job learning (OJL) or
whether it is something the worker must bring to the job (BTJ).
On-The-Job Learning or Brought-To-Job?
Worker characteristics which can be acquired after beginning the job (OJL) should not be
represented on a test. However, this doesn’t mean that OJL KSAs shouldn’t be stated in the job
analysis. To the contrary, there are at least three good reasons to include OJL KSAs in the job
analysis.
1. If nothing else, accounting for all KSAs provides protection against later appeals that the
test did not cover all relevant areas. Indeed, since OJL KSAs are part of the job and workers
are expected to acquire them at some point, they are needed to ensure completeness of the
job analysis.
2. If the title is not a trainee title, and if a large proportion of the KSAs are learned on the
job, perhaps the title is being used inappropriately. Conversely, if the KSAs are rated as
needed at job entry and the minimum qualifications suggest otherwise, there also may be
a classification issue.
3. Elimination of OJL KSAs can exaggerate the importance of the remaining KSAs if only
the remaining KSAs were rated for importance relative to a given behavior.
The type of questions asked by the job analyst may have a bearing on whether the KSA will be
classified as OJL or BTJ. The analysts should avoid asking SMEs if the KSA can be learned
during the probationary period. In most cases, the honest answer is that any worker
characteristic, considered in isolation, could be the object of intensive study or practice and possibly
learned by a new employee within three months. But the reality of the job is that one typically
does not receive three months’ salary to learn a single KSA. Hence, a better strategy would be
for the analyst to use the following pair of questions:
Is this worker characteristic going to be taught?
Is this worker characteristic something that most workers could learn on their own quickly?
In some situations the job analyst will have to use careful judgment to determine if a proposed
worker characteristic is actually OJL or BTJ, regardless of the initial claims of SMEs. Two such
situations are:
1. Although worker characteristics taught on the job are not proper material upon which to
base a test, there are situations where training programs presuppose a certain level of
competence in worker characteristics. In such cases, the training program builds upon
what is brought to the job. The training, then, provides an OJL extension for the BTJ
worker characteristic. It is permissible to test for the worker characteristic at the level
prerequisite for training. The content of the test, however, must not cover that content
pertaining to the worker characteristic which will be presented during training.
2. The job analyst must be alert to situations where worker characteristics are being
proposed as qualifying or important ranking KSAs when a more reasonable judgment
would be that these could be acquired on the job easily by anyone proficient in the other
aspects of the job. It is important, therefore, to consider how the present incumbents
acquired these worker characteristics, but it is also important that the job analyst make a
prudent judgment, regardless of the claims of the SME’s, of the reasonableness of
requiring these worker characteristics at entry.
GENERAL INTERVIEWING GUIDE FOR AN ANALYST
(Assumes a single employee interview performed without tasks and KSAs drafted in advance.)
Introduction
Introduce self, explain the purpose of the meeting and that you will be taking notes.
[Make notes unobtrusively.] Consider using a laptop/projector.
Reassure the employee that the interview does not concern performance evaluation or
reclassification.
Job Behaviors and Job Tools
Have the interviewee give an overview of the job. Ask the interviewee to describe the
major job duties in his or her own words; get purposes and broad duties. Do not go through the
job specification point by point.
Ask the employee to describe exactly what s/he does on the job during a typical day. This
is to break down the duties just described into specific behaviors.
Ask the employee how important these activities are and how frequently they are performed.
Ask the employee about the tools or manuals that s/he uses on the job. Ask whether manuals,
references and tool usage have to be known from memory, or whether they can be looked up as
needed.
If the interview does not uncover important expected activities or tasks, ask about them,
but do not “force” them upon the employee. Resolve every ambiguity immediately.
Place the job in the organization
Ask the employee to describe how the performance of this job relates to other jobs and
people.
Determine to whom the employee reports on a regular basis and whether there are others
to whom the employee may report on a sporadic basis. If a supervisory position,
determine who is supervised and how much of the work is done by directing others.
Determine how this job fits into the overall organization and whether interaction is
needed with other employees or outsiders to accomplish the purpose of the job, or
whether the work is “self-contained.”
Determine whether all work is performed at a work station, if there is fieldwork,
traveling, meetings, etc.
Determine if the job entails working under any “unusual” conditions.
Worker Characteristics
Ask about the knowledge that the employee needs to do this job.
Ask about the kinds of abilities and skills that a person must have to do this job.
Ask the incumbent about on-the-job learning and whether a person must already know or be
able to do these things when first hired or promoted into this job. Ask whether orientation is
normally given, whether courses are taken, etc.
Ask if there are "qualifying" KSAs: things which are necessary to do the job but
which do not lead to better performance.
Differentiating the Elements
Ask the employee for the hardest or most demanding part of the job.
Ask about what gives newcomers the most trouble and in what areas are newcomers
best/least prepared.
Ask the employee if there are things done on the job that an unprepared person could “let
slide” for a while.
Ask the employee if there are things which, if not done properly, will get the person in
trouble almost immediately.
Ask about the qualities or characteristics of the people who do well on this job. Explore
things such as volume of work, meeting deadlines, staying within budget, high-volume
productivity, absence of errors, speed in performance, or quality-of-product indicators.
Ask about the deficiencies of people who cannot do the job. Determine the things that
would lead to job failure.
Wrap-Up
Ask if there is additional information that hasn’t already been covered.
If the analyst's notes are adequate, have the employee rate job behavior or worker
characteristic statements taken from the notes. Otherwise, these ratings may be
postponed until the analyst has properly drafted the statements.
Secure, or make note of, forms or manuals which might be of value either to document
the job analysis or to provide test material.
Thank the employee and end the interview.
JOB ANALYSIS FORMS
As indicated earlier, analysts will need to document the job analysis. To do this, it is
recommended that analysts use the job analysis forms found on the following pages. [You may
also use the job analysis forms and instructions found in WRIPAC’s Job Analysis Manual.]
Procedure:
Ask each SME to complete the Task Rating Form and the Knowledge, Skills, and
Abilities Form.
For the Task Rating Form, list the activities to be performed by the position.
In preparing the draft of the form for SMEs to review, the analyst may want to extract
duties and responsibilities listed in the class specification.
The analyst may also extract the knowledge, skills, and abilities listed in the class
specification to populate the Knowledge, Skills, and Abilities Form.
Number each of the tasks and KSAs on the form.
Compile the results to determine the highest rated tasks and KSAs. Drop tasks/KSAs as
necessary based on cut-off criteria.
Complete the Task and KSA Linkup Worksheet. Identify the KSAs within each task and
then sum these identifications across the tasks listed on the worksheet to derive the
relative rank or weight of each KSA for exam development purposes. An alternate
approach is to rate the KSAs within each task (e.g., using values on a scale of how
important the KSA is for performing the task) and then sum these KSA ratings across
the tasks listed on the worksheet to derive relative ranks or weights of each KSA for
exam development purposes.
Keep the forms in the examination files.
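The compilation and linkup arithmetic described in the procedure above can be sketched in a few lines of code. This is only an illustrative sketch, not part of the official PBT procedure: the task names, KSA names, ratings, and cut-off value below are hypothetical, and in practice an analyst would typically do this tally in a spreadsheet.

```python
# Illustrative sketch of the PBT compilation arithmetic (hypothetical data).
# Step 1: average each task's SME importance ratings and drop tasks below a
#         cut-off. Step 2: sum KSA-to-task linkup ratings across the retained
#         tasks and rescale the totals so the relative weights sum to 100.

# Hypothetical SME importance ratings (1-5 scale) for three tasks.
task_ratings = {
    "Task 1": [5, 4, 5],
    "Task 2": [4, 4, 3],
    "Task 3": [2, 1, 2],
}
CUT_OFF = 3.0  # hypothetical cut-off criterion

retained_tasks = [
    task for task, ratings in task_ratings.items()
    if sum(ratings) / len(ratings) >= CUT_OFF
]

# Hypothetical linkup ratings: importance of each KSA for each retained task.
linkup = {
    "KSA 1": {"Task 1": 2, "Task 2": 1},
    "KSA 2": {"Task 1": 1, "Task 2": 2},
    "KSA 3": {"Task 1": 1, "Task 2": 0},
}

# Sum each KSA's ratings across the retained tasks (the worksheet's Sum/Total column).
totals = {
    ksa: sum(by_task.get(task, 0) for task in retained_tasks)
    for ksa, by_task in linkup.items()
}

# Rescale so the relative weights total 100, as the KSA form requires.
grand_total = sum(totals.values())
weights = {ksa: round(100 * t / grand_total) for ksa, t in totals.items()}

print(retained_tasks)
print(weights)
```

Note that with other data, simple rounding may leave the weights summing to slightly more or less than 100, in which case a small manual adjustment is needed to meet the form's requirement.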
City and County of San Francisco
DEPARTMENT OF HUMAN RESOURCES
Department:
Dept. #:
PBT JOB ANALYSIS
TASK RATING FORM
Division:
Section/Unit:
Phone No.:
Dept.’s HR Contact:
Name of Rater:
Rater Signature:
Date:
Position Information
Job # & Title:
Reports to (Job # & Title):
Working Title (if applicable):
Supervises (Job # & Title):
Task Statements
List the purposeful activities performed by this position as task statements. For each task statement, describe: (a) the action performed (i.e.,
"what the worker does"); (b) the object of the action (i.e., "to whom," "to what," or "under what circumstance"); (c) why the action is taken (i.e.,
"in order to produce what, or what is the expected output"); and (d) how the action is performed (i.e., "using what tools, equipment, methods,
processes, instructions or directions"). Rate the importance and frequency of each task using the scales presented below.
No.   Task Statement   Importance   Time Spent
1
2
3
4
5
6
7
8
9
Importance Scale
5 = Task is of CRITICAL IMPORTANCE with respect to overall job performance.
4 = Task is of MAJOR IMPORTANCE with respect to overall job performance.
3 = Task is of MODERATE IMPORTANCE with respect to overall job performance.
2 = Task is of MINOR IMPORTANCE with respect to overall job performance.
1 = Task is NOT PERFORMED or it is of NEGLIGIBLE IMPORTANCE with respect to overall job performance.
Relative Time Spent
5 = Task is CONSTANTLY performed on the job.
4 = Task is OFTEN performed on the job.
3 = Task is OCCASIONALLY performed on the job.
2 = Task is SELDOM performed on the job.
1 = Task is RARELY performed on the job or NOT PERFORMED at all.
(Use the additional space below, if necessary)
Task Statements
List the purposeful activities performed by this position as task statements. For each task statement, describe: (a) the action performed (i.e.,
"what the worker does"); (b) the object of the action (i.e., "to whom," "to what," or "under what circumstance"); (c) why the action is taken (i.e.,
"in order to produce what, or what is the expected output"); and (d) how the action is performed (i.e., "using what tools, equipment, methods,
processes, instructions or directions"). Rate the importance and frequency of each task using the scales presented below.
No.   Task Statement   Importance   Time Spent
10
11
12
13
14
15
16
17
18
Importance Scale
5 = Task is of CRITICAL IMPORTANCE with respect to overall job performance.
4 = Task is of MAJOR IMPORTANCE with respect to overall job performance.
3 = Task is of MODERATE IMPORTANCE with respect to overall job performance.
2 = Task is of MINOR IMPORTANCE with respect to overall job performance.
1 = Task is NOT PERFORMED or it is of NEGLIGIBLE IMPORTANCE with respect to overall job performance.
Relative Time Spent
5 = Task is CONSTANTLY performed on the job.
4 = Task is OFTEN performed on the job.
3 = Task is OCCASIONALLY performed on the job.
2 = Task is SELDOM performed on the job.
1 = Task is RARELY performed on the job or NOT PERFORMED at all.
City and County of San Francisco
DEPARTMENT OF HUMAN RESOURCES
Department:
Dept. #:
PBT JOB ANALYSIS
KNOWLEDGE, SKILLS, & ABILITIES FORM
Division:
Section/Unit:
Phone No.:
Departmental HR Contact:
Name of Rater:
Rater Signature:
Position Information
Date:
Job Code and Title:
Working Title (if applicable):
Reports to (Job Code and Title):
Supervises (Job Codes and Titles):
Knowledge, Skill and Ability Statements
List the knowledge, skills, abilities and personal characteristics that are necessary for someone in this position to possess in order to perform
the job's tasks. As far as possible, KSA statements should: (a) refer to only one characteristic; (b) be mutually exclusive, avoiding overlap; and
(c) be specific, describing exactly what the characteristic is. Please rate each KSA using the three rating scales listed below. Then,
indicate the relative weight (whole number) of each KSA's importance towards overall job success. The total of these weights should be 100.
No.   Knowledge, Skill or Ability Statements   Important to Possess at Entry?   Differentiate Job Performance?   Q/R?   Weight?
1
2
3
4
5
6
7
KSA Rating Scales
Important at Entry? How important is it for a newly hired employee to possess this KSA?
2 = Essential to possess at job entry.
1 = Desirable but not essential to possess at job entry.
0 = Unimportant or not needed at job entry (e.g., because it can be acquired or learned after a training period).
Differentiate Worker Performance? To what extent does possession of this KSA differentiate levels of job performance among workers?
2 = To a great extent.
1 = Somewhat.
0 = Little or not at all.
Qualifying KSA or Ranking KSA?
Q = This competency is essential to perform the job at a minimum level of competency, but possessing more of it does not contribute to the quality of job performance.
R = Possessing more of this competency will differentiate worker performance, since more of it will allow one to do a better job.
(Use the additional space below, if necessary)
No.   Knowledge, Skill or Ability Statements   Important to Possess at Entry?   Differentiate Job Performance?   Q/R?   Weight?
9
10
11
12
13
14
15
16
KSA Rating Scales
Important at Entry? How important is it for a newly hired employee to possess this KSA?
2 = Essential to possess at job entry.
1 = Desirable but not essential to possess at job entry.
0 = Unimportant or not needed at job entry (e.g., because it can be acquired or learned after a training period).
Differentiate Worker Performance? To what extent does possession of this KSA differentiate levels of job performance among workers?
2 = To a great extent.
1 = Somewhat.
0 = Little or not at all.
Qualifying KSA or Ranking KSA?
Q = This competency is essential to perform the job at a minimum level of competency, but possessing more of it does not contribute to the quality of job performance.
R = Possessing more of this competency will differentiate worker performance, since more of it will allow one to do a better job.
PBT TASK AND KSA LINKUP WORKSHEET
City and County of San Francisco
DEPARTMENT OF HUMAN RESOURCES
Instructions: Label each task and KSA. Then, for each task, identify each KSA which you consider important to possess in order to perform that task.
[Worksheet grid: one column per task across the top, one row per KSA down the left side, and a Sum/Total column at the right for each KSA.]
Rater Signature: ____________________________   Date: _____________________
MINIMUM & DESIRABLE QUALIFICATIONS
MINIMUM QUALIFICATIONS
The next step in the job analysis process is the review of minimum qualifications (MQs) for the
position(s).
Minimum qualifications (MQs) are formal statements of the experiences individuals are required
to have in order to compete further in the employee selection process. They are descriptions of
the education, training, work experience, licenses, certifications, etc., that one must have to
possess the competencies needed to perform a job at entry. In other words, they are the lowest
level of acceptable education and/or experience required of an individual such that the individual
reasonably could be expected to satisfactorily perform the duties of the position. One who does
not possess the stated MQs associated with the job classification is unlikely to perform the job
successfully. This is the primary purpose of minimum qualifications: to identify those job
seekers who clearly will not be able to perform a job successfully at entry. However, it should
also be understood that possession of MQs alone does not guarantee that someone has the
capacity to do a job successfully.
MQs are stated in the job announcement. Applicants who do not meet the stated MQs are
eliminated from the selection process. Therefore, satisfying the MQs is the first step that job
seekers must take in the selection process. Screening applications on the basis of MQs is
considered a selection procedure that is covered by the Federal Uniform Guidelines on Employee
Selection Procedures. To successfully withstand Title VII lawsuits that allege discrimination or
other illegal hiring practices, any MQ that is used to screen applicants must be job-related and
consistent with business necessity. Indeed, MQs should never be used to artificially or arbitrarily
reduce the size of the applicant pool.
Job-related refers to the degree to which KSAs that are measured by a selection procedure are
actually related to the requirements of the job. This must be demonstrated through a documented
review of the position which shows that the competencies (KSAs) measured in the screening
method are required to successfully perform the job. Business necessity means that there exists
an overriding legitimate business purpose such that the practice is necessary to the safe and
efficient operation of the business. For something to be measured under the condition of business
necessity, therefore, it must be truly essential and directly related to an essential job function.
Business necessity is a major legal defense to using an employment procedure that excludes
persons in protected classes. This is why it is critical to link the MQs to the job analysis. It is
recommended that the Qualifications form be used to help document this linkup.
As a general rule, minimum qualifications for the tested position will remain the same as the
minimum qualifications that are stated in the class specification. There are, however, some
exceptions to this rule:
a) For good cause, the Human Resources Director may approve an exception.
b) The tested position involves a manager classification that is represented by the
Municipal Executives’ Association (MEA).
c) The classification for the tested position has a special condition associated with it.
On occasion, a Subject Matter Expert may want to recommend changes to the established
minimum qualifications for a non-managerial classification. If the analyst believes that the
recommendation has merit, he or she should ask the following questions:
Is the recommended MQ reasonable in terms of the lowest level of acceptable
education and/or experience needed to satisfactorily perform in the classification?
Does it truly reflect the minimum amount of experience and/or training needed to do
the job at job entry or is it something that can be learned on the job in a short time
(e.g., 3 months)? If the latter, it should not be included.
Are there employees working in the classification who can perform the job
successfully even though they do not possess the proposed MQs?
Are the proposed MQs too high to recruit a sufficient pool of well qualified
applicants? Or are they so narrowly defined as to severely restrict the applicant pool?
How do the previous experience requirements compare with the SME
recommendations?
Are educational requirements, such as specific college course work or degrees, fully
supported by the job analysis?
Are all current legal requirements (e.g., licenses, certificates) accurately described,
including the licensing body?
Is it appropriate to allow substitutions or “equivalent combination” language?
If comfortable with the answers to the above questions, the analyst should then submit a proposal
to revise the MQs to DHR. That proposal should address the above questions. A Classification
Review Committee in DHR will review this documentation and assess the merits of the
recommendation. That review will consider, among other things, the relationship of the
recommended changes to other classifications that are higher, lower or similar to the target class.
If the change is approved, the job specification will be amended and posted, prior to the issuance
of the job announcement. If no valid challenges to the amended MQs are received, the job
announcement may then be posted.
For MEA-represented classifications, DHR should also review changes to the MQs if they differ
from those that were used previously for the same position or if they differ from those previously
approved by DHR when the Job Analysis Questionnaire for the position was submitted.
To summarize, minimum requirements:
Identify those applicants who clearly will not be able to perform a job at entry.
Serve as a device to realistically limit the number of candidates in the selection process.
Give job seekers the opportunity to evaluate their chances of being able to learn/perform
the job.
Increase the efficiency of the selection process (effort and dollars are invested only in
those applicants who have a reasonable chance of being hired).
May articulate promotional paths and provide a career path progression, where prior
experience in the occupational hierarchy leads to higher level jobs in the hierarchy.
MINIMUM QUALIFICATIONS WITH SPECIAL CONDITIONS
Minimum qualifications serve as the base requirements for all positions classified in a given job
class. As such, they always must be met by all applicants for any position in that job class.
However, the tasks and duties performed by those in the same classification may be quite
diverse on a position-by-position basis. Usually in these situations, a specific position will
require some technical competency to perform an essential function which is not performed, or is
performed less often, by incumbents in other positions within the same classification.
When a position-specific KSA or technical competency is determined to be required at job entry
and needed for an essential job function and the existing MQs (as expressed in terms of
experience and education) don’t adequately focus on this KSA, the Position-Based Testing
program allows the agency to introduce a special requirement known as a “special condition.”
When this “special condition” is attached to a requisition, the announcement generated as a result
of that requisition must include the special condition as part of the MQs. In fact, when there is a
requisition with a special condition attached, DHR will not approve a hiring agency’s PBT
request for that announcement without also reminding the agency that they need to specify the
special condition in the MQs when the announcement is posted.
When the existing MQs in the job specification do not speak to the specialized area associated
with the “special condition,” the announcement for the target job with the “special condition”
should add a new MQ statement to the existing MQs. In other cases, such as the Administrative
Analyst series, where the standard MQs are broader in scope than the MQs to be used to reflect
the “special condition,” the PBT announcement may include modification of the classification’s
MQs. That is, the modified MQs should include that part of the MQ which pertains to the
“special condition” but exclude any education or experience irrelevant to the position. For
example, if the existing MQ says that one must have "three years of experience in either X, Y
OR Z", and Z represents the special condition, then the MQ should be modified to read, "three
years of experience in Z."
When a “special condition” is added to the MQs of a PBT job announcement, applicants can be
screened on possession of the special condition, just as they are screened on the MQs. However,
it must be remembered that the “special condition” must be approved before it can be
incorporated in the MQs of a PBT job announcement. If it is not approved, agencies may not
screen down the applicant population based on the idea that a position’s special ‘requirement’
allows them to do so. Only when that special requirement rises to the level of importance and
significance of a “special condition” which is approved by DHR, may an agency screen
applicants using MQs that differ from the base MQs associated with the job class.
To avoid possible confusion among job seekers, every announcement with a special condition
should include the following language at the bottom of the MQ section, “The above minimum
qualifications reflect special conditions associated with the position(s) to be filled. They may
differ from the standard minimum qualifications associated with this class code.” This language
also will be useful to City personnel analysts, as it will mark the announcement and allow
them to readily recognize that its MQs differ from the usual or standard MQs associated with the
class code.
To summarize, “special conditions” are specific qualifications that are required in order for an
employee to successfully perform the duties of the position. Special conditions:
Must be identified in the job analysis.
Must be approved by DHR.
Must be included in the job requisition.
Are not valid for use if the job can be successfully performed without them.
Are not valid for use if the underlying KSAs can be learned in a reasonable period of
time on the job.
Must be incorporated into the minimum qualifications that are stated in the job
announcement.
May serve as a basis to eliminate applicants who do not possess them.
There are some exceptions to the “special condition” policy described above. Again, the City’s
manager classifications (0900 series) are based on a broad title consolidation effort. As such,
class distinctions are more about management level than specific programmatic experience or
technical competence. In fact, the job specifications for these classes do not include MQs. Since
work in these classifications often involves the administration of a highly-specialized program,
DHR allows agencies to apply position-specific MQs to each and every PBT announcement for
0900 classifications. Consequently, there is no need in these cases for agencies to specify
“special conditions” in the PBT announcements for these classifications.
COMMON PITFALLS ASSOCIATED WITH MINIMUM REQUIREMENTS
The following are some of the types of problems or issues confronted by job analysts when
drafting or reviewing MQs:
Language that is unclear, subjective, imprecise, or ambiguous. The MQ
statement should be clearly stated and interpretable by all stakeholders (i.e., the
hiring agency, the screening HR analyst, and the job seeker). It also should be
verifiable and quantitative where practical.
Not setting requirements as low as practicable. Are the requirements truly needed at
job entry? Is the number of years of required experience unreasonable? Is a degree
necessary at all? The MQs should provide a reasonable basis for recruiting candidates
who are likely to be able to do the job. The MQs should not be so restrictive as to
exclude candidates who might reasonably have the ability to do the job.
Limiting experience exclusively to the public sector, when private sector experience is
also relevant (e.g., instead of experience “in a large public agency”, it may be
appropriate to add “or business organization”).
Limiting experience to a specific Department, or even Division. For example, “Four
years of experience in systems analysis in a large public welfare agency” is tantamount
to making the requirement a description of the job itself, instead of a pure description of
the essential skills needed to perform it.
Misuse of professional jargon and/or definitions.
Including a subject in the education requirement that rationally does not belong there
(e.g., possession of a Bachelor’s Degree in business, management, public
administration or physical education).
Failing to recognize or credit equivalent substitution for experience, education,
certifications, licenses, etc. It should be remembered that some types of MQs may have
an adverse effect on certain protected groups depending on the particular labor market.
Consideration should be given to allow the substitution of experience for education or
education for experience requirements to mitigate this possibility.
Requiring experience that is not related to the class, position or the level of the position.
Requiring a type of experience, training or education that is really desired or preferred,
rather than truly a minimum qualification.
Requiring a driver’s license for convenience when it is not necessary to perform the
essential functions of the position.
Section II – Page 34
QUALIFICATION FORM
The key concept underlying MQs is that, through the experience an MQ identifies, a person is
presumed to have acquired the “know how” (e.g., the specialized knowledge, the enabling
competency or behavior, etc.) needed to perform a job at entry. When recommending new
MQs, each subject matter expert must provide documented justification to the job analyst by
identifying the KSA that underlies the training, education or experience being required.
This information should be recorded on the Qualifications form, which should be signed by both
the SME and the job analyst. [See below.]
PURPOSE:
To establish and document the required minimum qualifications that applicants
need to possess to pass the first step in the selection process.
FORM:
Qualifications form
PROCEDURE:
Analysts must list the KSAs in the appropriate column on the form.
SMEs recommend the appropriate amount of training/education and/or experience
for each KSA.
SMEs must indicate whether each experience is a minimum qualification
(something used to screen applicants at the beginning of the selection process) or
a desirable qualification (something used at the end of the selection process to
make the final hiring decision).
Section II – Page 35
POSITION BASED TESTING
JOB ANALYSIS
QUALIFICATIONS FORM
City and County of San Francisco
DEPARTMENT OF HUMAN RESOURCES
Department:
Dept. #:
Division:
Section/Unit:
Phone No.:
Departmental HR Contact:
Name of Rater:
Position Information
Job Code and Title:
Working Title (if applicable):
Reports to (Job Code and Title):
Supervises (Job Codes and Titles):
Qualifications
The following Knowledge, Skills and Abilities (KSAs) have been identified as important or critical for the performance of job activities. These KSAs will be translated into minimum
qualifications (MQs), which an applicant is required to possess in order to pass the first step of the selection process, and desirable qualifications (DQs), which may be used in the
post-referral selection process. Specify below for each KSA all the possible types of applicable training (education) and/or experience by which a person could acquire the minimum
necessary amount of the KSA. For each type of training specified, indicate the minimum amount needed by a newly hired employee in terms of time needed or type of education/training.
For each type of experience specified, indicate the minimum amount needed by a newly hired employee in terms of months or years. For each KSA, if either training or
experience is not needed, write “none”. Check whether the qualification is a minimum qualification or a desirable qualification.
Form columns: Knowledge, Skills and Abilities Statement | Type of Education/Training | Length of Education/Training | Type of Experience | How many years? | Check if MQ | Check if DQ
Signature
Signature of Rater:
Date:
Analyst Signature:
Section II – Page 36
DESIRABLE QUALIFICATIONS
Desirable qualifications (DQs) are statements that define what an employer would prefer a job candidate to
possess for a given job. They typically refer to a job seeker’s experience, training, education, character, work
habits, special skills or abilities (e.g., second languages, special computer skills), demonstrated achievements or
availability. DQs are considered to be useful or helpful, but not necessary, for an individual to have in order to
perform a job successfully. For example, an employee could perform important parts of a job upon entry
without satisfying a DQ, but possessing a DQ might save the employee time needed to learn certain job-related
knowledge, skills or abilities after being hired. Desirable qualifications may be stated in a given recruitment in
order to alert applicants to those qualifications that will likely be considered to identify job finalists. When
included, it is recommended that wording be used which states that the DQ may be considered at the end of the
selection process when candidates are referred for hiring. This will help address questions job seekers might
have about whether the DQ is necessary to apply for the job. Indeed, it is considered inappropriate to include
DQs for the purpose of discouraging applicants who only meet the minimum requirements or to attract only
superior applicants.
In merit system testing, satisfying the MQs means that an applicant has what it takes to participate in the
selection process and demonstrate that she or he has a proficient level of performance with respect to worker
characteristics identified in the job analysis as essential to do the job. It is inappropriate to screen or narrow
down the applicant pool on the basis of a paper review of DQs (i.e., review of application or resume) before
applicants are given an OPPORTUNITY to compete further in the selection process. That is, applicants who do
not possess the DQs should be allowed to demonstrate their ability to do a job through some method of testing
or evaluation. They should not be excluded prematurely from competing with others based on a paper review
of their credentials.
Indeed, when DQs are stated in terms of education, training and experience and are used to screen down the
applicant pool before other assessments are administered, they in essence become de facto MQs. Since DQs, by
definition, are not required or essential, but only useful or helpful to do the job, there is no legal basis for their
use.
Indeed, DQs cannot be justified as being consistent with business necessity. Even though they may be
“desired” because they are “useful” or “helpful” with respect to successful job performance, they cannot, by
definition, also be considered “required” or “essential” to job success. Applicants who do not possess the DQs
may be quite successful in their job performance, if hired. An employer’s decision to exclude such candidates
from further consideration by categorizing them as merely “qualified” and not “well-qualified” based on a paper
review of their credentials would likely be viewed as arbitrary, subjective, and unsupportable in a court of law.
Although the employer might argue that categorizing candidates in this way is a practical way of reducing large
candidate populations to a manageable or convenient size, court cases have demonstrated that numbers are
irrelevant in this regard.
Guidance relating to the use of DQs:
DQs should not usurp MQs in determining who may advance in the selection process.
DQs should be job-related, consistent with the position analysis and clearly written to avoid
misinterpretation.
DQs generally should not be stated in terms of traits, skills, abilities, knowledge or competencies. When
worker characteristics such as these are deemed desirable and deemed important for the job, they
Section II – Page 37
generally should be formally assessed in the selection process, since they cannot be measured
objectively via paper review.
DQs may be used to guide the final selection process following the certification of eligibles. This is
when the hiring department has discretion under the certification rule to use DQs in its selection
decision.
ESTABLISHING A CERTIFICATION RULE
Civil Service Commission (CSC) Rules authorize certification processes for appointments from eligible lists.
As indicated in CSC Rule 111A, certification rules vary for classes represented by different bargaining units.
The use of certification rules must comply with Civil Service Rules. Agreements with labor organizations to
use broader rules must be confirmed in writing and these records must be retained with the Exam Active File.
For most classes a certification Rule of Three Scores must be used in the absence of another agreement with the
labor organization. Expanded certification rules may allow for Rule of Five Scores, Rule of Seven Scores and/or
Rule of Ten Scores. The Civil Service Commission generally does not support the use of Rule of the List
because it does not conform to standards established by the State Personnel Board (some exceptions include
MCCP titles and certain public safety titles).
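Under a Rule of N Scores, every candidate whose score ties at one of the top N distinct score values on the eligible list is reachable for referral. The following Python sketch illustrates that reading; the function name and the (name, score) data layout are illustrative only, not part of any DHR system.

```python
def reachable_under_rule_of_n_scores(eligibles, n):
    """Return the eligibles reachable under a Rule of N Scores certification:
    every candidate whose score is among the top n distinct scores on the list."""
    distinct_scores = sorted({score for _, score in eligibles}, reverse=True)
    certifiable = set(distinct_scores[:n])
    return [name for name, score in eligibles if score in certifiable]

# Under a Rule of Three Scores, all candidates tied at a certified score are reachable.
pool = [("A", 95), ("B", 95), ("C", 90), ("D", 88), ("E", 85)]
```

Here `reachable_under_rule_of_n_scores(pool, 3)` returns A, B, C and D, since A and B tie at the top score; a broader rule (Five, Seven or Ten Scores) simply raises n.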
Section II – Page 38
VERIFICATION AS PART OF THE APPLICATION PROCESS
The most important factor to consider when deciding whether to request verification from job applicants
with regard to the minimum requirements is the utility of obtaining and assessing verification from everyone
in the applicant pool, as opposed to only assessing verification from those eligibles who are actually being
considered for employment. Some factors to consider:
Anticipated size of applicant pool
Anticipated number of vacancies
Expertise required for job; consequence of error
Recruitment difficulty
Resources required to examine all applicants (cost, staff time, rater availability) prior to
screening for verification
Source of applicant pool – it may be more difficult for applicants to verify outside experience
Out of class issues
City experience is defined by the class specification unless there are out of class issues
Strength and appropriateness of the minimum qualifications
There are a number of documents that can be requested and used to assess employment verification of
experience:
Transcripts, licenses or certifications
Standard letters (on letterhead) from employers
Performance evaluations that describe duties performed
DHR Employment Verification form
PeopleSoft records or appointment processing forms
The following provide limited information regarding experience and can be used when other verification is
not obtainable:
W2 forms
Income tax returns
Paycheck stubs
Section II – Page 39
SELECTION PLAN EVALUATION
When an analyst chooses among the different testing modes, s/he should bear in mind the tenet that “the greater
the degree of overlap between the behaviors measured by the test and those performed on the job, the more
valid will be the test.” This is why work sample assessments tend to be the most predictive of job success.
Further, research in the field reveals that written tests and structured oral examinations/interviews also have
high validity in terms of predicting job success, followed by Behavioral Consistency Examinations and
Assessment Centers. Point method T&E examinations, on the other hand, are generally poor predictors of job
success.
Test modes are discussed in more detail in PBT Manual, Section IV – Exam Development; however, the
following will serve as a brief guideline for deciding which test mode(s) an analyst should use:
1. Performance tests should be used to determine whether a candidate has developed skills needed for
satisfactory or outstanding performance on the job. Driving, dictation, and typing are skills which have
traditionally been tested in this manner. Many other kinds of developed skills can—and should—be
tested in this way when it is practical to do so.
2. Written tests are appropriate for use in determining whether candidates:
a. Possess knowledge needed for satisfactory or outstanding job performance, or
b. Can learn material which will be taught either in a classroom or on the job, or
c. Can demonstrate abilities or behaviors needed on the job.
3. Oral tests are appropriate for use when:
a. Characteristics identified in the job analysis may not be effectively evaluated via other testing
methods (e.g., interpersonal skills);
b. Past experience or expert judgment indicates that the English language literacy level required by
the job is so low that candidates will not be able to deal effectively with written materials.
4. Training and experience evaluations (T&Es) may be appropriate in some instances for practical reasons.
For example, when the applicant population is expected to be low, especially with respect to hard-to-fill
positions where there is limited competition, it is not very practical to screen down the applicant pool
further with a formal examination process. In such cases, it makes more sense to assess the credentials
(i.e., work experience, training and education) of the applicants, place them on an eligible list and allow
the post-referral interview process to determine the most qualified candidate among this limited group.
Similarly, if the total number of candidates is slightly higher, but a broad certification rule allows the
hiring manager to consider all of those candidates at the time of referral, again, it may not be practical to
administer a formal examination. Also, when a given classification requires a license or certification,
DHR’s formal testing of the necessary knowledge and abilities to do the job may be duplicative and
unnecessary.
The “appropriate” selection procedure should be based on the following considerations:
a. The relationship between test behaviors and job behaviors. The behaviors elicited by the selection
procedure should correspond closely to the actual behaviors needed to perform the job.
b. Whether the selection procedure is suited to the level of incumbents in the particular occupational field.
Section II – Page 40
c. The expected psychometric properties of the selection procedure in terms of the reliability and validity
of similar procedures in similar measurement situations.
d. The efficiency, practicality, and feasibility of the selection procedure compared to the resources of the
employer (staff, testing time, physical and financial limitations, etc.), as well as the characteristics of the
potential labor market (expected number of applicants, open vs. promotional recruiting, number of
vacancies, etc.)
e. Potential adverse impact that might result based on a particular selection instrument and the
demographics of the candidate population.
Again, a more detailed discussion of selection plan development may be found in Section IV.
Section II – Page 41
SELECTION PLAN WORKSHEET
PURPOSE:
To document which KSAs are to be tested and how they are to be tested.
FORM:
Examination Outline Worksheet
PROCEDURE:
List KSAs and weights.
Indicate in the column under Section C if KSA is a minimum qualification or if it is a
desirable qualification.
Determine in Section D the appropriate selection devices to be used and distribute the
weights appropriately.
Enter in Section E the total weight for each selection device.
Discuss the selection plan with DHR staff if proposing the use of a multiple component
examination.
Section II – Page 42
CITY AND COUNTY OF SAN FRANCISCO
EXAMINATION OUTLINE
Class #:
Title:
Date:
Analyst:
Team Leader:
Section A – KSA Letter and Title
Section B – KSA Weight
Section C – M.Q.’s
Section D – Test Weights: Written; Performance Test; Oral; Training and Experience Evaluation; Other (specify type)
Section E – Test Totals
Section II – Page 43
SECTION III
Recruitment
Section III – Page 1
SECTION III – RECRUITMENT
TABLE OF CONTENTS
Page
RECRUITMENT PLAN
3
ANNOUNCEMENT PREPARATION
4
Elements of the Exam Announcement
4
STANDARD TEXT FOR ALL JOB ANNOUNCEMENTS
6
INSTRUCTIONS FOR ANNOUNCEMENT CONTENT
9
PROCEDURE FOR AMENDING & REISSUING AN ANNOUNCEMENT
14
ANNOUNCEMENT APPEALS
15
CANCELLATION OF A RECRUITMENT, EXAMINATION OR ELIGIBLE LIST
16
Factors considered when evaluating a request to cancel
USEFUL LINKS
18
19
Section III – Page 2
RECRUITMENT PLAN
A variety of recruitment and assessment strategies are available for use. To determine the best
strategy to use in a given circumstance, the job analyst should gather additional information beyond
the job analysis. This will help the analyst make a number of key decisions that relate to:
the type of announcement (e.g., should it be a promotional announcement only OR a
combined promotional and entry-level announcement?)
the period of time the announcement should remain posted or open.
how extensive the recruitment should be (e.g., are job fairs or advertisements in
professional/technical publications necessary?)
the number and type of selection instruments
where the passing point should be established
the life or duration of the eligible list
Information useful to the analyst in making these decisions includes the:
Anticipated number of positions to be filled over the life of the eligible list.
Need for qualified candidates with special skills (special conditions).
Number of expected applicants and difficulty of recruitment.
Number of eligible candidates on last eligible list.
SME and appointing officer assessments of the quality of candidates on the last list
Appeals/lawsuits/complaints received as a result of prior recruitments.
Race and gender composition of the incumbents in the classification, along with the
composition of other classes within the title series (in particular, classes directly above and
below the target class). [This information is available in PeopleSoft.]
Diversity of the candidate pool resulting from the last recruitment(s).
Frequency of examination administration.
Competition from other jurisdictions.
Traditions and practices of the hiring department relative to the classification.
Analysts should document why they have chosen a particular recruitment strategy.
Section III – Page 3
ANNOUNCEMENT PREPARATION
City and County of San Francisco examination announcements should be informative, easy to read
and written in the clearest and simplest terms. General information presented in announcements
should be consistent across hiring departments. Job announcements often contain too much
information, which can make it difficult for the job seeker to discern essential information from
that which is less important. For this reason, essential information should be described completely,
while less important information, or information needed by only certain applicants, should be cited
with links to reference websites.
This section will:
identify required and optional elements of announcements for various types of exams
provide instructions for announcement content, and tips on handling specific situations
specify appropriate sources of language, citations or information from which announcement
content is obtained
provide standard announcement text for general content, coding, and website addresses
ELEMENTS OF THE EXAM ANNOUNCEMENT
Recruitment Number in JobAps: A-B-C
A = 3-character Job Type
B = 4-digit Job Code (i.e., 1241, 0931, etc.)
C = List ID # (i.e., 055000, etc.)
Job Type codes:
EXAMINATIONS
CBT = Class Based Test - Discrete
PBT = Position Based Test - Discrete
CCT = Continuous Class Based Test
REG = Employment Register
NON-EXAM RECRUITMENTS
TPV = Provisional
PEX = Permanent Exempt
TEX = Temporary Exempt
RTF = Transfer/Reassignment
IFO = Information Purposes Only (advertising in addition to the regular exam process above)
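A recruitment number in the A-B-C layout above can be checked mechanically. The sketch below assumes a hyphen separator and a numeric list ID, based on the examples shown; the helper name is illustrative, not a JobAps API.

```python
import re

# Job Type codes from the table above (exam and non-exam recruitments).
JOB_TYPES = {"CBT", "PBT", "CCT", "REG", "TPV", "PEX", "TEX", "RTF", "IFO"}

def parse_recruitment_number(recruitment_id):
    """Split an A-B-C recruitment number into job type, job code and list ID."""
    match = re.fullmatch(r"([A-Z]{3})-(\d{4})-(\d+)", recruitment_id)
    if match is None or match.group(1) not in JOB_TYPES:
        raise ValueError("not a valid recruitment number: " + recruitment_id)
    job_type, job_code, list_id = match.groups()
    return {"job_type": job_type, "job_code": job_code, "list_id": list_id}
```

For example, `parse_recruitment_number("PBT-1241-055000")` yields the job type PBT, job code 1241, and list ID 055000.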
Required and Optional Elements:
Each announcement section draws its content from one of the following sources: a JobAps drop-down menu; standardized text; a link to a description, form or CSC Rule; data entered manually in the job announcement; or text hard-coded on all job announcements.
Section III – Page 4
Required announcement sections:
Department
Analyst
Date Opened
Filing Deadline
Salary
Job Type
Employment Type
Position Description (description includes Essential Functions)
Minimum Qualifications (fully supported by job analysis)
How to Apply
Selection Procedure
Verification
Certification
Eligible List
Reasonable Accommodation Request
Veteran’s Preference (if applicable)
Seniority Credit in Promotional Examinations (if applicable)
General Information concerning City and County of San Francisco Employment Policies and Procedures
Terms of Announcement
Copies of Application Documents
Right to Work
Exam Type: E, P or CPE
ISSUED: Name of HR Director, Department of Human Resources
Department / Analyst’s Initials / Phone
DSW Note
Benefits Note
Working Conditions (tied to duties and confirmed by ERD)
Optional announcement sections:
Substitution (supported by job analysis)
Special Conditions (if applicable; fold into Minimum Qualifications)
Licensure (if applicable; statutory requirements, e.g., RN, PE, etc., or supported by job analysis, e.g., Driver’s License)
Desirable Qualifications (PBT only; use in post-referral selection process)
Transportation Security Administration (TSA) Security (if applicable)
Safety Sensitive Positions Requirements (if applicable)
Physical Examination
Notes (if applicable)
Section III – Page 5
STANDARD TEXT
Standard text is provided in the following tables. Electronic forms of this text may be copied from
the DHR website at: http://www.sfgov.org/site/sfdhr_page.asp?id=52512.
STANDARD TEXT FOR ALL JOB ANNOUNCEMENTS
How to Apply:
Applications for City and County of San Francisco jobs are being accepted through an online process. Visit
www.jobaps.com/sf to register an account (if you have not already done so) and begin the application process.
Select the desired job announcement
Select “Apply” and read and acknowledge the information
Select either “I am a New User” if you have not previously registered, or “I have Registered Previously”
Follow instructions on the screen
Computers are available for the public (from 8:00 a.m. to 5:00 p.m. Monday through Friday) to file online
applications in the lobby of the Dept. of Human Resources at 1 South Van Ness Avenue, 4th Floor, San Francisco.
Applicants may be contacted by email about this announcement and, therefore, it is their responsibility to ensure
that their registered email address is accurate and kept up-to-date. Also, applicants must ensure that email from
CCSF is not blocked on their computer by a spam filter. To prevent blocking, applicants should set up their email
to accept CCSF mail from the following addresses (@sfgov.org, @sfdpw.org, @sfport.com, @flysfo.com,
@sfwater.org, @sfdph.org, @asianart.org, @sfmta.com).
Applicants will receive a confirmation email that their online application has been received in response to every
announcement for which they file. Applicants should retain this confirmation email for their records. Failure to
receive this email means that the online application was not submitted or received.
All work experience, education, training and other information substantiating how you meet the minimum
qualifications must be included on your application by the filing deadline. Information submitted after the filing
deadline will not be considered in determining whether you meet the minimum qualifications.
Applications completed improperly may be cause for ineligibility, disqualification or may lead to lower scores.
If you have any questions regarding this recruitment or application process, please contact the exam analyst,
NAME, at PHONE.
Selection Procedure:
[Describe selection procedures.]…. Candidate scores on this examination may also be applied to other
announcements involving other job titles, when directed by the Human Resources Director.
Note: Applicants who meet the minimum qualifications are not guaranteed to advance through all of the steps in
the selection process.
Verification:
Applicants may be required to submit verification of qualifying education and experience at any point in the
application, examination and/or departmental selection process. Note: Falsifying one’s education, training, or work
Section III – Page 6
experience or attempted deception on the application may result in disqualification for this and future job
opportunities with the City and County of San Francisco.
Certification Rule:
The certification rule for this recruitment will be Rule of List/XX Scores. Additional selection processes may
be conducted by the hiring department prior to making final hiring decisions.
Eligible List:
The eligible list resulting from this examination is subject to change after adoption (e.g., as a result of appeals),
as directed by the Human Resources Director or the Civil Service Commission.
The duration of the eligible list resulting from this examination process will be XX months, and may be
extended with the approval of the Human Resources Director.
[Note to Analyst: Add to the above this section for PBT only:] Upon approval of the Human Resource Director
(see Civil Service Rule 111A.26.5), the eligible list resulting from this announcement may be used by other
departments that also use this classification or a similar classification. To find other Departments which use this
classification, please see http://www.sfdhr.org/Modules/ShowDocument.aspx?documentID=13693. Search that
document by title or job code to see which departments use the classification.
Requests:
Applicants with disabilities who meet the minimum eligibility requirements for this job announcement can find
information on requesting a reasonable ADA Accommodation at
http://www.sfdhr.org/index.aspx?page=20#applicantswithdisabilities
Information regarding requests for Veterans Preference can be found at:
http://www.sfdhr.org/index.aspx?page=20#veteranspreference
Requests for an alternate test date may be considered in limited circumstances and must be submitted to the
analyst listed in this announcement within five (5) calendar days of the announcement of the test date.
Seniority Credit in Promotional Exams:
Seniority credit information can be found at: http://www.sfdhr.org/index.aspx?page=20#senioritycredit
General Information concerning City and County of San Francisco Employment Policies and Procedures:
Important Employment Information for the City and County of San Francisco can be obtained at
http://www.sfdhr.org/index.aspx?page=20 or hard copy at 1 South Van Ness Avenue, 4th Floor.
Terms of Announcement:
Applicants must be guided solely by the provisions of this announcement, including requirements, time periods and
other particulars, except when superseded by federal, state or local laws, rules or regulations. Clerical errors
may be corrected by posting the correction on the Department of Human Resources website at
www.jobaps.com/sf.
[Note to Analyst: For PBT only, add to above this section:] The terms of this announcement may be appealed
under Civil Service Commission (CSC) Rule 111A.35.1. The standard for the review of such appeals is ‘abuse of
discretion’ or ‘no rational basis’ for establishing the position description, the minimum qualifications and/or the
certification rule. Appeals must include a written statement of the item(s) being contested and the specific
reason(s) why the cited item(s) constitute(s) abuse of discretion by the Human Resources Director. Appeals must
be submitted to the Executive Officer of the CSC within five (5) business days of the announcement issuance
date.
Copies of Application Documents:
Applicants should keep copies of all documents submitted, as these will not be returned.
Section III – Page 7
Right to Work:
All persons entering the City and County of San Francisco workforce are required to provide verification of
authorization to work in the United States.
Additional Information:
General information concerning getting a job and employment procedures may be found online at:
http://www.sfdhr.org/index.aspx?page=20.
Exam Type: E, P or CPE
Amended and/or Reissued: Month, Day, Year (Provide reason at top of announcement)
Issued: Month, Day, Year
Name of HR Director
Human Resources Director
Department of Human Resources
Recruitment ID Number: XXXXXX
Department/ Analyst’s Initials / Phone
SAMPLE TEMPLATE LANGUAGE FOR SPECIFIC ANNOUNCEMENTS
Desirable Qualifications:
If used, they should be fully supported by the job analysis and the announcement must include the
following statement: “The stated desirable qualifications may be used to identify job finalists at
the end of the selection process when candidates are referred for hiring.”
Physical Examination:
Before appointment, selected eligible candidates must pass a thorough physical examination by the
Department Physician. This exam includes a urine test to screen for the presence of drugs or
alcohol. Appointees must also pass an additional physical exam prior to the completion of their
probationary periods.
Safety Sensitive Positions Requirements:
In compliance with the Department of Transportation Omnibus Transportation Employee Testing
Act of 1991, as implemented through Federal Motor Carrier Safety Administration (FMCSA) and
Federal Transit Administration (FTA) regulations, drug and alcohol testing is required for
employees in “safety-sensitive” positions. Selected applicants for safety-sensitive positions will be
required to pass a Pre-Employment drug test prior to appointment and shall be subject to Random,
Post-Accident, Reasonable Suspicion, Return-To-Duty, and Follow-Up testing during employment.
Prior to appointment to an FMCSA position, each applicant who has participated in a DOT drug and
alcohol testing program within the immediately preceding two years will be required to sign a
consent form authorizing the City to contact his/her prior employers concerning his/her drug and
alcohol test history.
Transportation Security Administration (TSA) Security Clearance:
Candidates for employment with the San Francisco Airport Commission are required to provide a
complete employment history for the past ten (10) years and an explanation of all gaps in
employment during that period. The past ten (10) years of the candidate's employment will be
verified. In addition, candidates will be required to undergo a criminal history check, including FBI
fingerprints, in order to determine eligibility for security clearance and may be required to undergo
drug/alcohol screening. Per Civil Service Commission Rule Section 110.9.1, every applicant for an
examination must possess and maintain the qualifications required by law and by the examination
announcement for the examination. Failure to obtain and maintain security clearance may be a basis
for termination from employment with the Airport Commission.
Special Condition:
The above minimum qualifications reflect special conditions associated with the position(s) to be
filled. They may differ from the standard minimum qualifications associated with this class code.
Section III – Page 8
Special Condition - Bilingual Proficiency:
Some positions may require bilingual fluency in a variety of languages depending upon the
department's bilingual needs. Only those eligible candidates who pass the bilingual proficiency test
will be considered for bilingual positions. Applicants must indicate on the application form the
language(s) in which they claim proficiency.
INSTRUCTIONS FOR ANNOUNCEMENT CONTENT
Elements of the job announcement need to be entered into their respective JobAps fields:
Announcement Header
Introduction – This typically is where the position description is entered.
Minimum Qualifications
How to Apply
Selection Procedures
1. ANNOUNCEMENT HEADER – Use the JobAps format provided, most of which is straightforward.
Job Salary – Generally, the salary range information auto-populates based on the job code entered
into JobAps. If a new salary is expected to take effect shortly after the announcement is issued,
enter the projected salary and effective date.
Date Announcement is First Opened - A job posting may be issued any day of the week and
during any time of the day. The day an announcement is issued may not necessarily be the same
day that applications can be submitted. That is, the posting of the announcement may serve to
simply notify job seekers as to a future date when they can file their applications. Even though
applications may not be filed immediately, this is still considered the “official posting date.”
Filing Deadline (Last Day to Submit Application) - In accordance with CSC Rule 100, Promotive
or Combined Promotive and Entrance announcements must allow promotive applicants a
minimum of 10 calendar days for filing. However, Rule 111A does not require a specified
application-filing time period for promotive examinations. In fact, the duration of the recruitment
period for PBT announcements may vary from a minimum of one or two days up to four weeks or
more. For example, shorter filing periods may be permitted when a high volume of applications
is expected. An announcement of short duration, however, must still give interested job
seekers a reasonable opportunity to apply, and must receive DHR approval in advance.
Conversely, when it is difficult to recruit applicants, the filing period may be kept open for an
extended period of time.
Filing deadlines should be set at the end of the day: 11:59 PM on the closing date. In rare
situations, when paper applications are encouraged, the filing deadline should be set at the close
of business.
An open-ended or continuous filing period may be appropriate if the examination is for a
specialty occupation where there are few vacancies and few qualified applicants, OR the
classification has a high turnover rate and a continuous need to hire. When continuous filing is
used, DHR approval is required if the recruitment provides fewer than three qualified applicants
and the hiring agency is ready to make an appointment. However, another way to process such
recruitments would be to accept applications UNTIL there are a sufficient number to warrant
Section III – Page 9
holding an examination. In this case, the filing deadline should be “Continuous” and the
announcement should read:
“File immediately. This announcement may close at any time as applications will be accepted
until a sufficient number are received to satisfy the hiring need.”
Employment Type (Part-time, Full-time, etc.) – Select the appropriate employment type from the
job type drop-down in JobAps. If there are unusual work hours, or work locations outside of San
Francisco, or other unusual features of the job (e.g., working in confined spaces, bilingual
requirements, etc.) these should be listed in the body of the announcement.
2. INTRODUCTION - Please do not load all information into the Introduction section while
leaving the remaining sections blank.
Position Description - Include essential functions, description of duties identified through job
analysis. If the job family is flexibly staffed from the entry to the journey level without an
additional civil service examination process, consult your team leader for appropriate language
for entry announcements, to describe the specific flexible staffing situation.
3. MINIMUM QUALIFICATIONS (MQ) – As a general rule, MQ language used in the
announcement must be consistent with the job specification. Exceptions may occur if there are
special conditions associated with the position, or if the announcement is for a managerial
classification. Proposed changes to the official minimum qualifications that are established in the
class specification require the approval of the Director of Recruitment and Assessment Services
and the Manager of Classification and Compensation. All minimum qualifications must be
written in a clear and unambiguous manner so as to avoid any misinterpretation on the part of the
job seeker, the analyst and the hiring department. [For more information on minimum
qualification, please see PBT Manual - Section II.] The minimum qualifications must always be
supported by the job analysis.
The MQ section may also include information to indicate:
alternative ways of meeting the requirements
“special condition” requirements for certain positions
specific experience to clarify that which will not be accepted
if a date other than the final filing date will be used to determine possession of MQs
If appropriate, candidates may be allowed to participate in the exam conditionally and/or be
placed on the eligible list conditionally (under waiver of referral, until proof of possession of
the requirements is presented). This must be noted on the announcement.
If special conditions have been approved by the Department of Human Resources, they serve as
additional “minimum” qualifications, and should be noted here. Special conditions typically
include a statement of the type and quantity of required specialized experience, tasks, etc. When
the experience required for a special condition is more specialized than the general experience
required in the general minimum qualifications statement, the special condition should be
expressed as specialized experience included within the general requirement:
Two years of clerical experience, including one year of experience taking minutes at public
meetings.
Section III – Page 10
To avoid possible confusion among job seekers, every announcement with a special condition
should include the following language at the bottom of the MQ section, “The above minimum
qualifications reflect special conditions associated with the position(s) to be filled. They may
differ from the standard minimum qualifications associated with this class code.” This language
also will be useful to City personnel analysts as it will mark the announcement and allow analysts
to readily distinguish it as different from the usual or standard MQs associated with the class
code.
Administrative Analyst series: For the 1823 Senior Administrative Analyst and the 1824
Principal Administrative Analyst, the recognized specialty areas (budget, finance, grants,
contracts, legislative/policy analysis) must be incorporated into the minimum qualifications as
follows: each special condition for 1823 requires two years of experience, and for 1824 each
special condition requires four years of experience. When the minimum qualifications allow
several alternate ways to qualify, rewording them to include the special condition must be done
with care. Here is an example for the 1824 Principal Administrative Analyst with the contracts
special condition:
1. Possession of a graduate degree (Master's degree or higher) from an accredited college or university,
and five (5) years full-time equivalent experience performing professional-level analytical work as
described in Note A; OR
2. Possession of a graduate degree (Master's degree or higher) from an accredited college or university
with major college coursework as described in Note B, and four (4) years full-time equivalent
experience performing professional-level analytical work as described in Note A; OR
3. Possession of a baccalaureate degree from an accredited college or university, and six (6) years full-time equivalent experience performing professional-level analytical work as described in Note A; OR
4. Possession of a baccalaureate degree from an accredited college or university with major college
coursework as described in Note B, and five (5) years full-time equivalent experience performing
professional-level analytical work as described in Note A;
SUBSTITUTION: Applicants may substitute up to 2 years of the required education with additional
qualifying experience in budget analysis, financial analysis and reporting, legislative/policy analysis,
or contract/grant administration. One year (2000 hours) of additional qualifying experience will be
considered equivalent to 30 semester units/45 quarter units.
Notes on Qualifying Experience and Education:
A. Qualifying professional-level analytical experience must include at least four (4) years of experience
in the development of complex contracting systems and administration of competitive bid processes
and complex contractual agreements. Any remaining required experience must be in one or more of
the following functional areas: complex budget analysis, development and administration; complex
financial/fiscal analysis and reporting; development and evaluation of complex
management/administrative policy; complex grant administration and monitoring; complex program
evaluation and planning; complex legislative analysis; complex economic analysis; or other functional
areas related to the duties of positions in Class 1824, where the primary focus of the job is complex
professional-level analysis for evaluation, recommendation, development and implementation of major
programs and functions of department/organization. Analytical experience equivalent to the duties of
Class 1823 is considered qualifying.
B. Coursework applicable to a baccalaureate or higher degree in specialized subject matter areas such
as public or business administration, management, business law, contract law, public policy, urban
studies, economics, statistical analysis, finance, accounting or other fields of study closely related to
the essential functions of positions in Class 1824.
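The SUBSTITUTION paragraph above is simple unit arithmetic. As a purely illustrative sketch (the function name and constants below are ours, not the manual's; the two-year cap follows the substitution language quoted above):

```python
# Illustrative only: converting additional qualifying experience into
# semester units under the substitution rule quoted above, where one year
# (2,000 hours) of experience equals 30 semester units and at most two
# years of the required education may be substituted.

def substituted_semester_units(extra_experience_hours: float) -> float:
    HOURS_PER_YEAR = 2000.0
    UNITS_PER_YEAR = 30.0
    MAX_UNITS = 2 * UNITS_PER_YEAR  # substitution capped at 2 years of education

    units = (extra_experience_hours / HOURS_PER_YEAR) * UNITS_PER_YEAR
    return min(units, MAX_UNITS)

print(substituted_semester_units(2000))  # 30.0 — one year of experience
print(substituted_semester_units(6000))  # 60.0 — capped at the 2-year limit
```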
If specific MQs will be tested during the probationary period, such as typing or heavy lifting, this
should be noted in this section. Also, if there are unusual work hours, or work locations outside
of San Francisco, or other unusual aspects of the job (e.g., working in confined spaces, bilingual
requirements, etc.) these should be listed here.
4. HOW TO APPLY – Indicate if any material or information in addition to the standard application
must be submitted. For example, if a supplemental questionnaire is required, indicate that it must
be completed and submitted with the application. The following is sample language that may be
used when applicants are required to complete a supplemental questionnaire, along with their job
application.
Supplemental Questionnaire: All applicants must complete and submit the supplemental
questionnaire that is included with their job application. The purpose of the Supplemental
Questionnaire is to evaluate the experience, tasks, knowledge, skills and abilities that an
applicant might possess in important job-related areas.
5. SELECTION PROCEDURE(S) – This section notifies applicants of the type, scope and weight of
the selection procedure(s) used to establish the eligible list. A description of the test exercise(s)
or examination component(s) should be included, as well as the knowledge, skills and abilities
measured by each exercise. Additionally, this section should explain the weighting of each test
component, indicated as a percentage of the overall weight in the entire selection process. It
should also state whether the selection component is considered “Qualifying” (i.e., pass/fail only).
The test component weights and the KSAs being measured must be supported by the job analysis.
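The weighting described above is straightforward weighted averaging. As a hypothetical sketch only (the component names, scores and 60/40 weights below are invented for illustration; they are not prescribed by the manual):

```python
# Hypothetical illustration of combining weighted test components into a
# final score. Components designated "Qualifying" (pass/fail) gate
# participation but carry no weight, so they are not included here.

def final_score(components: dict[str, tuple[float, float]]) -> float:
    """components maps component name -> (score, weight); weights sum to 1.0."""
    total_weight = sum(w for _, w in components.values())
    assert abs(total_weight - 1.0) < 1e-9, "component weights must total 100%"
    return sum(score * weight for score, weight in components.values())

# e.g., an oral examination weighted 60% and a written test weighted 40%:
print(final_score({"oral": (85.0, 0.60), "written": (90.0, 0.40)}))  # 87.0
```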
Examples of language that have been used to describe common selection procedures are presented
below:
Oral Examination
“Candidates will be examined to determine their relative knowledge, skills and abilities in
job-related areas which may include, but not be limited to: knowledge of city and regional
planning; knowledge of laws and regulations regarding Port planning and development;
supervisory ability; analytical ability; and oral communication ability.”
OR
“Candidates will be assessed through one or more structured, job-related
questions or exercises to determine their knowledge, skills and abilities in the
areas of . . .”
OR
“Candidates will be interviewed to determine their relative knowledge, skill and ability levels
in job-related areas. These areas may include, but are not limited to: knowledge of property
management, knowledge of financial and accounting principles, contract preparation and
leasing; analytical, problem-solving and decision-making abilities; oral communication
ability, and human relations ability.”
Written Examination:
“Candidates will participate in a written multiple-choice examination that is designed to
determine their relative knowledge, skill and ability levels in job-related areas, including
but not limited to: knowledge of accounting principles and practices; ability to perform
basic mathematical calculations; and general clerical procedures.”
Written/Performance Tests:
“Candidates will be evaluated on the basis of one or more written questions and job-related
exercises to determine their relative knowledge, skill and ability levels in job-related areas,
including but not limited to: autopsy and investigative practices and procedures; standard
procedures for gathering, preserving and presenting data and physical evidence; interview
methods and techniques; basic human anatomy, physiology and forensic pathology
terminology; and written communications ability.”
Training and Experience Evaluation:
“Application materials will be evaluated to determine the candidate’s experience in job related
areas which may include, but not be limited to: . . .”
Performance Examination:
“Simulated job tasks or exercises will be used to assess a candidate’s knowledge, skill and
abilities in job-related areas, which may include but not be limited to….”
Physical Performance Test:
“Candidates will participate in test exercises/events that are designed to measure selected
physical performances required on the job. See the supplement for test event descriptions.”
PROCEDURE FOR AMENDING AND REISSUING AN ANNOUNCEMENT
The Director of Recruitment and Assessment Services must approve the amendment or reissuance of
an announcement.
Before amending or reissuing a job announcement, the analyst must print a hard copy of the original
announcement and file it in the exam active file. This will ensure that a record is retained which
shows how the announcement originally appeared prior to the change.
If the announcement is amended and reissued for any reason, this must be stated at the top of the
amended announcement, directly under the class number and title. A brief explanation for the
amendment or reissuance should be provided. Examples:
“This announcement has been amended to extend the filing date and allow for additional
applications. Applicants who already applied under Recruitment Number CBT-9999-555555,
issued on January x, 20xx will be included in the applicant pool and need not reapply.”
“This announcement has been reissued to reflect changes to the minimum qualifications for
this classification and to allow for additional applications on that basis. Applicants who
already applied under Recruitment Number PBT-8888-555555, issued on February x, 20xx,
need not reapply and will be included in the applicant pool.”
“This announcement has been amended to correct an error in the salary listed. Applicants
who already applied under Recruitment Number CBT-7777-555555, issued on March x, 20xx,
need not reapply and will be included in the applicant pool.”
The date of amendment or the reissuance date should be inserted at the bottom of the announcement.
When extending the filing deadline, be sure to modify both the deadline in the header of the
announcement, as well as any reference to the deadline in the body of the announcement.
Changing the terms or conditions of the announcement - If the announcement states that
disqualification will result if a requirement is not met (e.g., requiring submission of transcripts by a
specific date) and if a decision is later made to extend the filing date, the amended announcement
must reflect a change to this requirement (i.e., a new date requiring submission of transcripts should
be set). Again, the Director of Recruitment and Assessment Services must approve the amended or
reissued announcement.
If minimum qualifications are changed, the old minimum qualifications should be listed in the
comment field of the recruitment planner in JobAps.
A new appeal period is not established (i.e., does not start) when an announcement is amended to
correct a clerical-type error.
ANNOUNCEMENT APPEALS
Civil Service Commission Rule 111A, Article VIII, provides that appeals concerning the provisions
of an examination announcement must be received within five (5) business days from the issuance
date. If the announcement was first posted as an “Information Only” announcement, with no option
to apply, the five (5) business days begins from the first date of the “Information Only” posting.
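As an illustration of the five-business-day window (a sketch only: it counts Monday through Friday and ignores City holidays, which may affect the actual deadline):

```python
# Illustrative only: find the last day a timely appeal could be received,
# five business days (Mon-Fri; holidays ignored) after the issuance date.
from datetime import date, timedelta

def appeal_deadline(issued: date, business_days: int = 5) -> date:
    d = issued
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return d

# An announcement issued Monday, January 2, 2012:
print(appeal_deadline(date(2012, 1, 2)))  # 2012-01-09 (the following Monday)
```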
All protests/appeals must be brought to the attention of the Department of Human Resources’
Director of Recruitment and Assessment Services as soon as possible upon receipt, and before
any action is taken.
If a protest is submitted directly to a decentralized hiring department, the analyst must first assess the
timeliness of the appeal.
If a timely appeal pertains to the position description or minimum qualifications, the
analyst must forward a copy of the appeal to the Executive Officer of the Civil Service
Commission.
If a timely protest pertains to any other provision of the announcement, except for the
Certification Rule, the analyst must forward a copy of the protest to the Human Resources
Director.
A protest of the Certification Rule can be denied immediately and the appellant informed
that the Certification Rule is not appealable.
CANCELLATION OF A RECRUITMENT, EXAMINATION OR ELIGIBLE LIST
PURPOSE:
This section describes the process to cancel an examination, recruitment or eligible list. It
applies to Position Based Testing, as well as Class Based Testing and Continuous Testing.
AUTHORITY:
According to Civil Service Commission rules, the Human Resources Director shall rule on all
matters concerning the examination and recruitment programs. [See CSC Rules 110
Examination Announcements and Applicants; 111A.26 Position Based Testing, Management
of Eligible Lists; 112.4 Eligible Lists, Cancellation of Eligibility; and 111.2 Examination
Provisions.]
PROCEDURE:
If hiring needs change and vacancies will no longer be filled, a request may be submitted to
cancel an announcement, examination or an eligible list, whatever the case may be at the time.
To request cancellation, the analyst must complete the form, Request to Cancel an
Examination/Recruitment/Eligible List. The analyst should explain the basis for the
cancellation and include, as appropriate, information such as:
Problems encountered in the recruitment and selection process
Recent departmental reorganizations or title reclassifications that may affect hiring needs
Existing and anticipated hiring needs of other departments that may use the class
Any holdovers due to layoffs that may be related to the request
Ethnic and gender demographics of the pool of qualified applicants [Sec. 110.10]
Number of filled positions in the class and appointment type
The analyst may need to contact departments that use the class to obtain the information listed
above. The completed form is reviewed and signed by the team leader or Departmental
Personnel Officer, then forwarded to the Director of Recruitment and Assessment Services for
review, with a copy sent to the Client Services Representative. The final decision is made by
the Human Resources Director, upon recommendation of the Director of Recruitment and
Assessment Services. A copy of the form with the final disposition is returned to the
requesting department, the Client Services Representative and Referral Unit.
Upon approval, the analyst must notify all applicants and the recognized employee
organization of the cancellation. If the announcement is still open, the analyst must close it
immediately. No public notice is required. The analyst will enter updated information about
the cancellation into the JobAps recruitment planner, and will ensure that all records and
active files/storage files are properly annotated and secured.
If an eligible list is approved for cancellation, the analyst must request from the Referral Unit
the names and contact information of all active eligibles on the list, and notify them of the list
cancellation.
If the request is denied, the Director of Recruitment and Assessment Services Division or
designee will discuss with the analyst how to proceed with the recruitment and/or examination
process.
Request to Cancel an Examination/Recruitment/Eligible List
Date:
To:
John Kraus, Director, Recruitment and Assessment Services, Department of Human Resources
Cc:
Laura Dancer, Operations Manager, Recruitment and Assessment Services, Department of Human Resources
From:
Requesting Department Information
Department:
Date:
Analyst:
Phone No.:
Supervisor/Departmental Personnel Officer:
Phone No.:
Examination/Recruitment Information
CBT
Employment Register
PBT
Continuous Testing
Class Number:
Entrance
Combined Promotive & Entrance
Promotive Only
Title:
Examination/Recruitment No.:
Number in Current Applicant Pool or on Eligible List:
Special Conditions, if any:
List of Citywide Departments Affected by this Request:
(Please refer to PeopleSoft employment query)
Specify Which Department(s) Were Contacted and Their Response:
(Requesting Department is responsible for checking with all citywide departments that will be affected by the resulting decision of this
request.)
Justification/Comments
(For DHR use only)
Effective Date of Cancellation:
Recommended Date To Notify Affected Applicants:
Last Permanent Exam/Recruitment Date:
Expiration Date of the Last Eligible List:
Duration of Eligible List:
No. of Eligibles Remaining:
No. of Holdovers:
Approved:
Yes
No
Reviewed By:
Director, RAS
Date:
Approved:
Yes
No
Reviewed By:
Director, DHR
Date:
Comments/Reason For Disapproval:
Distribution: Client Services
Referral Unit
Requesting Department
Instructions for Completing this Request Form
INSTRUCTIONS:
1) The analyst responsible for the examination/recruitment initiates the formal request to
cancel by submitting this form to DHR, Recruitment and Assessment Services (RAS) for
review, and sending a copy to the DHR Client Services Representative.
2) The Director of Recruitment and Assessment Services will review the justification and
recommend appropriate action to the HR Director. The HR Director will make the final
determination and so indicate on this form.
3) A copy of the form with the disposition indicated will be returned to the requesting
department. An additional copy will be distributed to Client Services and Referral Unit.
Factors considered when evaluating a request to cancel

Support:
The requisition has either been deleted from the budget or frozen for salary savings. [“…based on the needs of the City…” Sec. 111A.26.4]
There are layoffs/holdovers in the classification.
The classification is being abolished through a classification action.
Reorganization and/or reclassification eliminates the need for any department’s vacancies in the short term (next 2 years).
A major/significant technology change affects how the classification functions citywide after the recruitment/exam administration has already begun.
The pool of applicants does not reflect the demographics of the relevant labor market. [Sec. 110.10]
Repeated attempts to recruit find that applicants are unable to meet specific job requirements or qualifications. [Sec. 111A.26.4]
Upon further assessment, the remaining eligibles on the list are found not to possess the minimum requirements, special condition, or testable skill set required for the position.

Don’t Support:
There has been a minor departmental change (not citywide) in the use of the classification, which should be identified during the job analysis process.
Changes in the minimum qualifications were noted during the job analysis process and the recruitment was opened using the old minimum qualifications. In this case, approval of revised minimum qualifications is required before making any change; amending or reissuing the recruitment is then appropriate.
Re-titling the job (more appropriately handled through amending or reissuing).
Continuous recruitment is no longer necessary because a current eligible list exists with active, available, qualified applicants.
The department goes through a reorganization process but the functions of the classification remain unchanged.
Applicants missed the filing deadline for submitting their applications.
Recruitment efforts can be extended and expanded (if the applicant pool is not of sufficient quantity or is deemed unqualified, it may be more appropriate as an initial action to amend or reissue rather than cancel).

As a general rule, cancellation requests will not be supported prior to list issuance if formal testing has begun.
7/09
Useful Links
CCSF Department of Human Resources
Website
Recruitment Resources
http://www.sfdhr.org/index.aspx?page=96
Current and past job announcements
http://www.jobaps.com/sf/sup/images/default.asp
Exam-related Forms & Documents
http://www.sfdhr.org/index.aspx?page=90
SECTION IV
Examination Development
SECTION IV – EXAM DEVELOPMENT
TABLE OF CONTENTS
DEVELOPING A SELECTION PLAN…………………………………………………4
Situational Considerations………………………………………………………………5
Measurement Considerations……………………………………………………………5
UNASSEMBLED EXAMINATIONS…………………………………………………..6
Point-Based T&E Method (Traditional T&E)…………………………………………...6
Modified T&E……………………………………………………………………………7
Advantages of Point Based & Modified T&Es…………………………………………..7
Disadvantages of Point Based & Modified T&Es………………………………………..7
Task-Based T&E…………………………………………………………………………7
Advantages of Task-Based T&Es over Point-Based or Traditional T&Es……………….8
T&E’s MAY be used when……………………………………………………………….8
T&E’s generally should not be used when……………………………………………….9
Example - Combined Point Method and Task (skill) Based T&E………………………..9
Example - Point Method & Task Based T&E Evaluation Sheet………………………...12
Example - Task-Based Supplemental Questionnaire……………………………………13
Behavioral Consistency Method…………………………………………………………14
Advantages of Behavioral Consistency Ratings…………………………………………16
Disadvantages/Criticisms/Drawbacks of Behavioral Consistency Ratings……………...16
Example 1: Supplemental Questionnaire & Rating Guideline…………………………..17
Example 2: Supplemental Questions for Budget Analyst Applicants…………………...20
Example 3: Instructions & Rating Scale for Director of Communication………………21
Behaviorally Anchored T&E Screening…………………………………………………24
Example – Sample Supplemental Questionnaire for Screening Committee…………….24
ORAL EXAMINATIONS / STRUCTURED INTERVIEWS…………………………..32
Distinction between Interviews and Oral Examinations………………………………...32
Oral Examinations……………………………………………………………………….33
Advantages of Oral Examinations……………………………………………………….33
Disadvantages/criticisms of Oral Examinations…………………………………………34
Linkage between the Job Analysis & the Oral Test Plan………………………………..34
Types of Oral Examination Questions…………………………………………………..36
Writing Oral Questions………………………………………………………………….39
General Considerations in Writing Oral Questions……………………………………..40
Use of Subject Matter Experts for Oral Examination Development……………………40
Oral Examination Development Meeting……………………………………………….41
Considerations in Planning the Oral Examination or Structured Interview…………….42
Rating Guidelines………………………………………………………………………..44
General Considerations Involving Rating Scales……………………………………….44
Prodding/Probing………………………………………………………………………..47
General oral examiner/rater behavior……………………………………………………49
Some Oral Rating/Scoring Issues………………………………………………………..49
Oral Communication Scoring……………………………………………………………50
Examples of Oral Communication Rating Scales……………………………………….51
PERFORMANCE & JOB SIMULATION TESTS……………………………………...54
Written Job Simulations…………………………………………………………………54
Advantages of Job Simulations and Performance Tests………………………………...55
Disadvantages/Criticisms of Job Simulations and Performance Tests………………….55
Developing a Job Simulation Exercise…………………………………………………..55
Job Simulation Development Meeting with SMEs……………………………………...55
Job Simulation Evaluation Factors………………………………………………………56
Scoring Dimensions and Rating Guidelines……………………………………………..56
Scoring Systems and Rating Scales……………………………………………………...57
Checklists………………………………………………………………………………..58
Instructions for raters, proctors and candidates………………………………………….58
Scheduling and Administration of Performance Tests…………………………………..60
MULTIPLE-CHOICE EXAMINATIONS………………………………………………61
Advantages of Multiple-Choice Examinations…………………………………………..61
Disadvantages/Criticisms of Multiple-Choice Examinations…………………………...62
Purchasing a Multiple-Choice Exam (or any standardized exam)………………………62
ESSAY TESTS…………………………………………………………………………..63
Advantages………………………………………………………………………………63
Disadvantages……………………………………………………………………………63
Sample Written Communication Rating Scale…………………………………………..64
SELECTION PLAN SUMMARY……………………………………………………….65
DEVELOPING A SELECTION PLAN¹
This section of the Position Based Testing Manual provides guidance on selecting and constructing
test components.
The effectiveness of a test is known as its validity: a valid test is one that accurately measures
whatever it is intended to measure. In occupational testing, validity refers to the extent to which the
test measures the KSAs or behaviors required to perform the job or predicts performance on the job.
It should be understood that a test has no inherent or intrinsic validity. It has validity only in regard
to a specific use.
There are different kinds of validity, although they all have much the same purpose. Criterion-related
validity is measured by comparing test scores with a criterion measure. In employment testing, the
criterion is usually some measure of job performance or job success. Construct validity involves
identifying a psychological trait (construct) that underlies successful performance on the job and then
devising a selection procedure to measure the presence and degree of the construct. However, the
method of validity used primarily by the City and County of San Francisco (and most jurisdictions
throughout the country) is content validity and this is the one of most concern with respect to the
development of PBT examinations.
Content validity (or “content-referenced validity”) is a demonstration of how well the content of a test
samples the knowledge and skills required to perform the job. It is the relationship between the
selection procedure and the job. In content validity, a selection procedure is justified by showing that
it representatively samples significant parts of the job. For example, job applicants may be given a
test of their ability to perform basic reading and writing if reading and writing are determined to be
required parts of the job. A performance or work sample test which measures the ability to do
precisely what must be done on the job (e.g., word processing for a clerk typist) has high validity
because it directly reflects the content of the job.
A well-planned and well-executed job analysis is particularly important when the content validity
approach is used. Since this method relies on a logical, rather than statistical, analysis to determine
the overlap between the behavioral domains of the job and the test, the examiner must be able to
show that a thorough job analysis was conducted, and that the test questions sample the important
behaviors required by the job, or attributes prerequisite to those behaviors.
When developing a selection tool, the most important information to consider is the job analysis.
Analysts should use the job analysis results and their professional judgment to determine the KSAs to
be measured and the most appropriate, cost-effective and timely selection device(s) to measure them.
¹ As a reminder, it is incumbent upon every analyst to alert his/her respective manager if s/he, a family member
(spouse, child, legal ward, grandchild, foster child, parent, legal guardian, grandparent, brother, sister, father-in-law,
mother-in-law, or other relative who lives with the analyst), a close friend, or a close co-worker plans to take an
examination in which the analyst may be involved or to which the analyst may be exposed. The manager/supervisor
will then make appropriate adjustments to the analyst’s work assignments.
An ideal PBT selection device will maximize test validity (job-relatedness), minimize adverse impact
and minimize delays in the delivery of the eligible list.
The analyst is well-advised to follow WRIPAC manual instructions on how to develop a selection
plan based on a job analysis and how to complete the selection plan outline form provided in the
WRIPAC manual. Outlining the examination content areas helps ensure that no relevant knowledge,
skills and abilities (KSAs) are omitted or inappropriately represented in the examination.
Phase IV, Step 12 of the WRIPAC manual provides a thorough methodology and explanation of the
process of matching the KSAs identified as important during the job analysis with appropriate
selection procedures. Completing this form involves selecting test types to measure KSAs. Phase IV,
Step 14 of the WRIPAC manual provides an explanation and a process for deciding whether to assess
KSAs on a ranking or pass/fail basis.
Factors to consider when determining the appropriate selection procedure(s) to use for specific
classes are:
Situational Considerations
Practicality – Are resources (time, staff, facilities, financial support) available to obtain
or develop the test and administer it to the expected applicant group?
Cost/utility ratio – What is the utility of using a particular selection tool in light of the
expected number of applicants/eligibles? What is the expected usage of the eligible list?
For example, if a given classification is hard to fill and generates a very low applicant
count, does it make sense to use a test that will require a huge investment in resources?
Applicant Type - Is the exam format appropriate? Would incumbents in the class be
expected to provide information in the way, and at the level, required by the exam
format (e.g., the oral presentation, writing, or reading level required on the test)? Is
the difficulty level of the exam appropriate for the exam population and the job?
Security – Will test security be a concern based on the applicant population size, test
type, test location, etc.?
Measurement Considerations
Content validity – Does the examination measure a representative sample of the
essential KSAs or work behaviors identified in the job analysis? To what extent do the
selection procedures measure these essential KSAs, work behaviors or work products?
Bear in mind the tenet that, “the greater the degree of overlap between the behaviors
measured by the test and those performed on the job, the more valid will be the test.”
Reliability – Can we expect test scores to be derived reliably? Will the test show
consistent results upon repeated measurement of the same individual?
Combination of measurement techniques – Is more than one selection instrument needed
to properly assess the qualifications of the candidates?
Minimum qualifications - Have candidates already been tested through a licensing or
certification exam?
Adverse effect – Where alternative tests are available which have been shown to be
equally valid for a given purpose, use the test which has been demonstrated to have the
lesser adverse effect.
Selection instruments may include written objective tests, written/performance (essay) tests,
performance tests, oral interviews, training and experience evaluations, assessment centers, etc.
No hard and fast rules exist for selecting appropriate test type(s). However, each job
classification has unique features that should be considered when determining test mode. The
following is a discussion of various test types including appropriate applications, advantages,
disadvantages, and cautions regarding their use.
UNASSEMBLED EXAMINATIONS
A variety of Training and Experience (T&E) [aka Education and Experience (E&E)] Evaluation
Methods belong to the family of Unassembled Examinations. The term Unassembled
Examination refers to the fact that applicants need not “appear or assemble” to participate in the
selection procedure. Instead, a paper review of their application, questionnaire or resume can be
performed to assign points to candidates’ education and experience and rank them based on a
given grading/scoring methodology. Fundamentally, T&E’s are a credentialing system that
examines the level and quantity of a candidate’s background or credentials. Candidates then can
be ranked in terms of their credentials. T&E’s can be used alone but are also used in conjunction
with other test components. There are several types of T&E evaluations. The most common are:
Point-Based T&E Method (Traditional T&E) – personnel analysts review each application or
resume and assign points based on the quantity and level of applicants’ education and experience. They
typically do this using a formula in which applicants receive a prescribed number of points for
each month or year of relevant training, education, and experience. In the traditional T&E, the
minimum requirements typically serve as a basis for developing the scoring standard or formula.
The scoring standard may allow for the assignment of full or partial credit. For example, under
the traditional T&E an applicant may be awarded full credit for each year of experience that s/he
possesses that is equivalent to the experience requirements that are stated in the minimum
qualifications for the position. Partial credit may be assigned for experience that is related to the
experience in the minimum qualifications but is not the exact experience. Only experience
obtained by the applicant within a certain period of time is typically credited. [This applies to all
unassembled examinations as well.] It is generally recommended for most titles that only relevant
experience earned within the ten years immediately preceding the announced closing date be
credited.2 For some classes, this time period may be shorter if it is an occupation with many
changes in technology, laws, regulations, etc.
2 It should be noted that experience older than ten years may be accepted to allow an applicant to meet the minimum
eligibility requirements, even though it may not be used for ranking purposes.
Here in the City and County of San Francisco, the Point-Method is typically conducted using
either the standard, online CCSF JobAps application or a combination of this application and a
supplemental questionnaire that includes specific questions designed to elicit information
regarding the applicant’s education, training and/or experience. When the applicant meets the
basic, minimum qualifications, the analyst awards the applicant 700 points, the lowest passing
score. If the applicant possesses education, training and/or experience beyond that defined by the
minimum requirements, the analyst awards additional points beyond the 700, using objective
rating guidelines (e.g., “x” years of ____ experience = “y” points up to a maximum of “z” points).
Please see the examples of T&E rating sheets later in this section.
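The arithmetic described above can be sketched as follows. This is an illustrative sketch only: the function name, the points-per-year value and the cap are hypothetical, and actual values must come from the objective rating guidelines developed for the specific class.

```python
# Illustrative sketch of Point-Method T&E scoring (all values hypothetical).
def score_point_method(meets_mqs, extra_years,
                       points_per_year=25.0, max_extra_points=300.0):
    """Return a candidate's score: 700 for meeting the minimum
    qualifications, plus capped credit for additional experience."""
    if not meets_mqs:
        return 0.0                      # candidate is rejected, not scored
    base = 700.0                        # lowest passing score
    extra = min(extra_years * points_per_year, max_extra_points)
    return base + extra

# A candidate meeting the MQs with 2 additional qualifying years:
print(score_point_method(True, 2))      # 750.0
```

The cap on additional points mirrors the "up to a maximum of 'z' points" language in the rating guidelines.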
Modified T&E – similar to the traditional T&E above, but in this case personnel analysts may
also acknowledge certain types of experience or education that are related to the KSAs required in
the job, but which are not addressed by the minimum requirements. Training, education and
experience rating elements may be differentially weighted to reflect their apparent relatedness to
the KSAs and the job.
For this type of rating, analysts work with Subject Matter Experts to establish appropriate,
objective, rating guidelines. For a Task-Based Method T&E to be content valid, the tasks must be
linked to KSAs and points for each task must be derived from KSA weights. To score this type of
T&E rating, the analyst should develop a spreadsheet and rating points to calculate candidates’
scores.
Advantages of Point Based & Modified T&Es:
(1) Efficiency – T&E’s are easy to develop and applicants can be processed quickly and
economically.
(2) Speeded list delivery – lists can be generated much faster than with “assembled”
examinations, and hiring agencies like this.
(3) Relatively low appeal rate – because candidates don’t need to appear for an examination,
and don’t need to demonstrate their abilities or face the risk of failing, they generally like
T&Es.
Disadvantages/Drawbacks of Point Based & Modified T&Es:
(1) Validity correlations for predicting job success tend to be lower than those of assembled
exams.
(2) They do not measure how well job tasks were performed.
(3) T&E’s rely on the integrity of the applicant – since people are evaluated on the basis of what
they put down on their applications, resumes or questionnaires, they may not always be honest
when reporting facts, events or achievements regarding their background.
Task-Based T&E – A task-oriented T&E is based on the premise that validity can be achieved
by obtaining detailed information about specific tasks which an individual has performed in the
past, regardless of the job in which the task was performed. For this method, personnel analysts
identify job tasks that are determined through job analysis to be essential aspects of the job. A
task list is built by the analyst. It is typically presented in the form of a questionnaire which is
completed by the job seeker at the time of application filing (e.g., supplemental questionnaire).
Candidates complete the questionnaire by indicating which tasks they have performed or have
been trained to do. Candidates may also be asked to provide some indication of their level of skill
or proficiency (e.g., the amount of time they were responsible for performing the task) and the
setting or job in which performance of the task occurred (for verification purposes). Points are
awarded when candidates select tasks. The assignment of these points can vary based on the
relative importance of each task and the level of independence exercised in performing the task.
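Scoring of a task checklist of this kind can be automated along the following lines. The task names, point values and independence factors below are hypothetical; in practice, the points for each task would be derived from the KSA weights established in the job analysis.

```python
# Hypothetical Task-Based T&E scoring: task points scaled by the level of
# independence the candidate reports for each task.
TASK_POINTS = {                 # points per task (derived from KSA weights)
    "conduct reference interviews": 40,
    "supervise support staff": 30,
    "manage collections": 30,
}
INDEPENDENCE_FACTOR = {         # credit multiplier by level of independence
    "performed independently": 1.0,
    "performed under supervision": 0.5,
    "trained but not performed": 0.25,
}

def score_task_based(responses):
    """responses: {task name: reported independence level}."""
    return sum(TASK_POINTS.get(task, 0) * INDEPENDENCE_FACTOR.get(level, 0.0)
               for task, level in responses.items())

print(score_task_based({
    "conduct reference interviews": "performed independently",
    "supervise support staff": "performed under supervision",
}))                              # 55.0
```

A spreadsheet implementing the same lookup-and-multiply logic accomplishes the scoring equally well.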
All T&E’s should be developed by the analyst in consultation with Subject Matter Experts. These
SMEs should identify the specific activities which, if successfully performed, would distinguish
more successful from less successful workers. When using SMEs, they should complete a
Background Information and Qualifications Form and a Test Security Agreement and Statement
of Responsibility Form. In addition, when working with the SMEs, the analyst should:
• explain the T&E methodology to be used;
• distribute the job analysis for review, with particular attention to be given to the KSAs;
• explain that questions or T&E criteria will be developed to assess certain job KSAs or
combinations of KSAs;
• ensure that adequate rating guidelines are developed (as discussed later in this document);
• develop a rating sheet for Point-Method T&Es; a rating sheet and spreadsheet for Task-Based
Method T&Es; or a rating grid for the Behaviorally Anchored T&Es (as discussed later in this
document); and
• document every step of the T&E process and file it with the Active File.
Task-Based T&Es share many of the advantages and disadvantages of Point-Based T&Es, as
indicated above, so they won’t be repeated. On the other hand, Task-Based T&Es have the
following advantages over Point-Based T&Es:
Advantages of Task-Based T&Es over Point-Based or Traditional T&Es
More directly describe the past behavior of an applicant which may be relevant to job
performance.
A closer link to the job, from a common sense perspective of content validity.
Easy to score and automate for scoring.
Easy for applicants to understand.
Because T&Es tend to have lower validity, they generally should not be a personnel analyst’s first
choice when deciding on a selection instrument to use for a given recruitment. However, they
may be appropriate in certain situations. The following summarizes, in general, when or when
not to use T&E’s.
T&E’s MAY be used when…
(1) A major eligibility requirement for a given job classification requires a license or
certification. [Applicants with such credentials have already been determined to be
qualified.]
(2) The recruitment is not competitive and tends to generate a very low number of applications
(e.g., there may be a limited number of people who meet the requirements or the job may be
undesirable).
(3) When there is high attrition associated with the classification (i.e., the hiring agency is
willing and able to hire just about anyone who meets the requirements because of the high
employee turnover rate).
(4) There are other selection components to be used in association with the recruitment.
T&E’s generally SHOULD NOT be used when…
(1) The situation is very competitive (i.e., there are many job seekers interested in the position
and large numbers of applications filed) and no other selection procedure is to be used.
(2) The minimum requirements are not clear or precise (which may cause applicants to provide
insufficient information to be properly credited).
(3) There is more than one way to be eligible (e.g., there are substitution clauses in the MQs or
multiple ways to satisfy the minimum requirements).
(4) There are no minimum requirements for the position or the positions to be filled are
entry-level (i.e., where most applicants have little or no previous work experience or where
experience really isn’t so important).
EXAMPLE
COMBINED POINT METHOD AND TASK - (SKILL) BASED T&E
CLASS 1776 ASSISTANT REPRODUCTION MANAGER
SUPPLEMENTAL QUESTIONNAIRE
INSTRUCTIONS
The purpose of this supplemental questionnaire is to obtain specific information regarding your
education and experience in relation to the position for which you are applying. Of interest are
the knowledge, skills and abilities you have acquired which relate to activities that have been
identified as essential in the performance of the job. The supplemental questionnaire will be used
to score and rank candidates on the eligibility list. List experience from all qualifying jobs.
There are five (5) questions contained in this supplemental questionnaire. You must enter your
responses in the spaces provided. This supplemental questionnaire must be submitted with the
application at the time of filing. Failure to do so may result in rejection of your application.
*******************************
CERTIFICATION
I hereby certify that my responses to this questionnaire are true and accurately reflect my
background, skills and experiences. I understand that any false, incorrect or deceptive responses
provided in this questionnaire may result in my disqualification for this, and possibly other, job
opportunities with the City and County of San Francisco. I understand and agree that all the
information that I provide is subject to verification.
_________________________________________
Applicant Name
Check the correct box to answer the following questions.
1. Do you have experience in printing production, including the operation of offset/digital
machinery and graphics equipment?
Yes
No
If yes, fill in the blanks below:
Examples of duties performed:
_______________________________________________________________________________
_______________________________________________________________________________
Employer: ____________________________________________________
Dates of Employment – From: __________ To: ___________
2. Do you have experience supervising staff/interns/students?
Yes
No
If yes, fill in the blanks below:
Your Job Title: _________________________________________
Number of staff Supervised: _________
Employer: _____________________________________________
Length of time supervising: Years_____ Months_____
3. Have you completed coursework in graphic communications, printing technology or a
related field?
Yes
No
If yes, fill in the blanks below:
Number of Semester/Quarter Units Completed: _____________
College/University: ____________________________________
4. Mark the appropriate boxes to indicate type(s) of software you have used and your level of
proficiency:

                                Level of Proficiency
                           Basic    Intermediate    Advanced
   Quark Xpress              ☐           ☐              ☐
   Adobe Photoshop           ☐           ☐              ☐
   Adobe Illustrator         ☐           ☐              ☐
   Adobe Pagemaker           ☐           ☐              ☐
   Filemaker Pro             ☐           ☐              ☐
   Other: ______________     ☐           ☐              ☐
5. Mark the appropriate boxes to indicate type(s) of hardware you have used and your level of
proficiency:

                                              Level of Proficiency
                                         Basic    Intermediate    Advanced
   Offset Printing Press                   ☐           ☐              ☐
   Make/Model _____________
   Digital Printing                        ☐           ☐              ☐
   Make/Model _____________
   Finishing Equipment (e.g., drills,      ☐           ☐              ☐
   cutters, padded presses, etc.)
   Make/Model _____________
Example
Point Method and Task Based Training and Experience Evaluation Sheet
CLASS & TITLE
Candidate Name: __________________
Instructions: To calculate your score, add the points assigned to the types of experience and training that
you possess, according to the responses on your Supplemental Questionnaire.

Section III Training and Experience                                          Candidate’s Points

1. Minimum Qualifications
   Candidate possesses the following:
   Five (5) years verifiable experience in printing production, including the operation of
   offset/digital machinery and graphics equipment.
   OR
   ____ years of experience, as described above
   ____ units (semester/quarter) or class hours of coursework as described on the
   job announcement
   QUALIFIED – Candidate meets the Minimum Qualifications (700 points)            _______
   REJECTED – Candidate does not meet the Minimum Qualifications (0 points)

2. Supervisory experience
   One (1) year of experience supervising staff/interns/students (100 points)     _______
   – Candidate possesses ____ year(s) of experience supervising staff/interns/students

3. Software experience
   Familiarity with at least 3 (out of the first four listed) software programs, with
   advanced proficiency (100 points)                                              _______
   OR
   Familiarity with 2 (out of the first four listed) software programs, with advanced
   proficiency (50 points)
   OR
   Familiarity with 1 (out of the first four listed) software programs AND any of the
   other software programs listed, both with advanced proficiency (25 points)

4. Hardware experience
   Familiarity with 2 or more of the hardware systems listed, with advanced proficiency
   (100 points)                                                                   _______
   OR
   Familiarity with 1 of the hardware systems, with advanced proficiency (50 points)

                                                              Total Points        _______
Example
Task-Based Supplemental Questionnaire
3630 LIBRARIAN I
SUPPLEMENTAL QUESTIONNAIRE (Task Checklist)
INSTRUCTIONS: The purpose of this supplemental questionnaire (Task Checklist) is to assess your
experience with each of the tasks listed. Check the box next to each task and indicate the number of
years of experience you have performing that task. [Note: In this example, no boxes will be included.]
1. Use a bibliographic utility, such as OCLC or RLIN, for searching bibliographic records for reference or research purposes.
2. Conduct searches of online reference/information databases appropriate to the information sought.
3. Use a library online catalog to locate bibliographic information by subject, author, title or keyword.
4. Use at least two of the following modules of an integrated library system: OPAC, circulation, acquisitions, cataloging and serials.
5. Find information using an electronic periodical database, such as Infotrac or Ebscohost.
6. Provide information on the inter-relation of different library functions such as circulation, cataloging, reference, children’s services.
7. Communicate with vendors providing services or materials for the library.
8. Analyze workflow in a unit of the library, such as circulation, cataloging, reference, children's services.
9. Manage collection through selection and evaluation of materials for condition, replacement and weeding.
10. Respond to public suggestions and inquiries.
11. Review and recommend ideas, procedures, guidelines & policies for effective library services.
12. Implement new and revised library systems and services.
13. Conduct reference interviews.
14. Locate information using reference sources.
15. Provide information to patrons through appropriate mode of communication, i.e. in person, over the phone, via e-mail.
16. Prepare bibliographies and/or resource lists on particular subjects or for reader's advisory.
17. Provide reader's advisory.
18. Provide direction, training, evaluation and support to subordinate staff.
19. Delegate tasks to library technicians and other support staff.
20. Complete regular and timely evaluations of subordinate staff.
21. Assist staff in resolving questions and problems related to library activities.
22. Conduct outreach to community groups, schools or other constituent groups.
23. Prepare programming and exhibitions, such as book talks and other events of interest to constituents.
24. Make public presentations.
25. Act as a library representative to the community.
26. Write information about library activities and services for local publications.
27. Prepare pathfinders, instructional and training materials for use by staff and the public.
28. Conduct formal staff training.
29. Conduct formal training for the public on the use of library, tools and resources.
30. Provide training in the use of on-line catalog, Internet and other electronic resources.
Employers and Contact Information
Now that you have seen the activities listed above, in this section you are to write the name of the employer(s) where
you performed these tasks or activities and the name(s) and phone number(s) of a supervisor(s) who can verify your
work experience with each employer. There is space to indicate up to 4 employers. Make note of the letter
designation for each employer that you list.
If your work experience involving the above activities has been with only one or two employers, enter that
information and skip to the next section of this questionnaire.
Please note that you may not receive credit for the work activities listed in this questionnaire unless you provide the
employer and contact information requested.
EMPLOYER "A"
Name of Employer:
Address of Employer:
Name of Contact Person:
Contact Person’s Phone Number:
CERTIFICATION
I hereby certify that my responses to this questionnaire are true and accurately reflect my background, skills and
experiences. I understand that any false, incorrect or deceptive responses provided in this questionnaire may result in
my disqualification for this, and possibly other, job opportunities with the City and County of San Francisco. I
understand and agree that all the information that I provide is subject to verification.
Agree
Applicant Name _________________________________________
BEHAVIORAL CONSISTENCY METHOD
Behavioral Consistency ratings also belong to the Unassembled Examination family and are
considered a form of T&E. This method is also known as Behaviorally Anchored T&Es,
Behavioral T&E and the Narrative Questionnaire Method. It will be treated separately here,
because unlike other forms of T&E, the Behavioral Consistency Method is more complex and
tends to demonstrate higher predictive validity with respect to job success (i.e., higher validity
coefficients). Indeed, the assumptions underlying behavioral consistency procedures differ from
other unassembled methods since they are based on theoretical and empirical research findings,
not unexamined credentials. Those other methods, for example, rely on broad or generalized
descriptions of experience, which represent mere “exposures” to job-related competencies. However, the strength
of the relationship between competencies and such exposures is limited. On the other hand,
when we look at explicit descriptions of past experience and behavioral achievements, as is the
case with the behavioral consistency method, these tend to be better indicators of competence
because they are more clear and specific, and are often exclusively associated with the source of
acquisition of the competence.
The Behavioral Consistency Method requires use of a supplemental questionnaire. Candidates
respond to the supplemental questionnaire by providing narrative descriptions of actual past or
current job behavior, performance, accomplishments/achievements, characteristics or examples of
everything they have done with respect to certain types of experience which has been determined
by the job analysis to be essential to successful job performance. In so doing, they are given an
opportunity to demonstrate their competence in performing, or readiness to perform, important
job tasks or functions.
Questions on a supplemental questionnaire are designed to elicit behaviors that are related to
important knowledge, skills and abilities relevant to the class being tested. Some examples of
questions are:
1. Describe the largest or most complex GIS project on which you have worked. Describe your specific role,
your duties, the development software used, and how the client is currently using the system in production.
2. Describe your experience in medical technology or a related field (e.g., one of the chemical, physical or
biological sciences). Discuss your experience with laboratory analysis or medical testing procedures.
3. Describe your experience using and maintaining standard equipment used in laboratory analysis or medical
testing procedures. List the equipment used and the types of tests you have performed using each type of
equipment.
The weights assigned to each question are linked to the weights of related KSAs that are
identified in the job analysis. It should be remembered that the assessment should be about only
those KSAs that truly differentiate between superior performers and all others. Most jobs have
anywhere from 5-10 of these KSAs. It is often useful to have SMEs provide critical incidents that
are related to the tasks/behaviors in the job analysis. These are excellent sources for Behavioral
Consistency questions.
Objective rating guidelines should be developed for the behaviorally anchored T&E. These
detailed rating guidelines can be similar to rating guidelines developed for oral examinations;
generally, different rating guidelines must be developed for each KSA being assessed with each
question. Here is a sample rating scale for the first example question presented above:
5 points – Best Qualified Candidate:
Will pick an example of moderate to large complexity in terms of number of users and/or amount of
functionality.
Will describe an independent, journey-level role and duties for him/herself.
Describes that s/he participated in the project from conception to final development.
Uses Intergraph, Arc/Info or ARCVIEW software.
The system is performing in production in a way that meets client needs.
3 points – Qualified Candidate:
Has experience but lacks length or depth; experience may be with a small project with limited functionality.
Participated in a closely supervised role rather than independent journey level.
Worked on a portion of a project rather than involvement in all phases.
Used software other than Intergraph, Arc/Info or ARCVIEW software.
1 point – Minimally Qualified Candidate:
Meets the minimum qualifications of degree in MIS or BIS plus 1 year of experience but experience has
been as a GIS user or in a support role rather than as a developer.
For certain general KSAs, the same rating guidelines can be used in each question rating that
KSA. The rating guidelines serve to focus the raters’ attention on the specific KSAs being
assessed. For KSAs that are heavily weighted, it is suggested that more than one question be
included in the T&E.
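One way to combine per-question ratings into an overall score, consistent with the weighting principle described above, is a weighted average in which each question's weight reflects the weight of the KSA it measures. This is an illustrative sketch, with hypothetical question names and weights:

```python
# Hypothetical weighted scoring of Behavioral Consistency questions.
def weighted_score(ratings, weights):
    """Weighted average of per-question ratings (same 1-5 scale as above)."""
    return sum(ratings[q] * weights[q] for q in ratings) / sum(weights.values())

ratings = {"q1": 5, "q2": 3, "q3": 3}         # one candidate's ratings
weights = {"q1": 0.5, "q2": 0.3, "q3": 0.2}   # KSA-derived question weights
print(weighted_score(ratings, weights))        # 4.0
```

Dividing by the total weight keeps the overall score on the same scale as the individual ratings, whatever weights are chosen.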
The basis for scoring this type of examination is slightly different from that of the traditional
T&E method. Rather than evaluating past “exposures” that applicants have had in terms of work or
education (e.g., work experience involving ________), this method emphasizes past behaviors or
achievements of the applicant. Since the focus is more on behaviors and achievements, and not
broadly-defined experience or “exposures”, this method is better geared to evaluate the quality of
what an applicant has done, achieved or accomplished (i.e., results achieved), than traditional
T&E’s. The term “behavioral consistency” therefore stems from the idea that people tend to be
consistent in their behaviors, and that future behaviors can be predicted by measuring past
behaviors of a similar nature. This is not to say that individuals don’t change. They can.
However, most of the important behaviors for most people do not change dramatically over
reasonable periods of time. Again, because the Behavioral Consistency method is based on
content validity, applicants are evaluated on only those behavioral dimensions that show large
differences between superior and barely acceptable workers.
Behaviorally anchored T&Es must be rated by more than one rater. The panel(s) of SMEs review
the narrative descriptions provided by candidates and match them with the rating guideline’s set
of detailed pre-established benchmarks, which exemplify performance at different levels of
proficiency. Ideally, the different proficiency benchmarks associated with each task or function
represent such markedly different levels of performance that they do not overlap. If there are
highly technical areas to be evaluated, then raters knowledgeable in the subject area should be
used. The scoring methodology for behaviorally anchored T&Es should follow the same, general
scoring guidelines that apply to oral examinations. [See “General Considerations Involving
Rating Scales” in the discussion on Oral Examinations presented later in Section IV.]
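Since more than one rater must score each candidate, the raters' benchmark-based ratings must be combined. A simple per-question average across raters (an assumption here, consistent with common oral-examination scoring practice, not a mandate of this manual) can be sketched as:

```python
# Averaging panel ratings: panel_ratings[r][q] is rater r's rating of
# question q; the result is the per-question mean across raters.
def panel_score(panel_ratings):
    return [sum(qs) / len(qs) for qs in zip(*panel_ratings)]

# Three raters, two questions each:
print(panel_score([[5, 3], [3, 3], [4, 3]]))   # [4.0, 3.0]
```

Large spreads between raters on the same question are a signal to reconcile ratings against the benchmarks before averaging.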
The supplemental questionnaire also should include instructions that are crafted carefully.
Definitions of KSAs being tested may be included. As with all types of unassembled
examinations, it is highly recommended that applicants provide names and contact information so
that any information they provide can be verified. They should also be reminded that any and all
information they provide is subject to verification. Other examples of questions for a
behaviorally anchored questionnaire are presented below.
Advantages of Behavioral Consistency Ratings:
1) Assesses quality of applicant’s accomplishments relative to critical job behaviors.
2) Tend to have good validity for predicting job success.
3) Useful as a first step in a selection process where subsequent testing components may be time
consuming or expensive or when out-of-state candidates may have applied.
4) Allows for the measurement of written communication skills.
5) Good to assess KSAs for highly technical and complex jobs.
6) Useful as a screening device (particularly with management jobs) and to select who is “most
qualified” to proceed in the selection process. [An example of a cover sheet, the questions for
a screening committee assessment, supplemental questionnaire and behaviorally anchored
rating guidelines are in this section.]
7) The crediting of job-relevant achievements regardless of the setting in which they were
accomplished may act to reduce adverse impact on minorities and women (since members of
these groups may have had less opportunity in the past to demonstrate achievements on the
same or similar jobs).
Disadvantages/Criticisms/Drawbacks of Behavioral Consistency Ratings:
1) Takes more time to develop questions and rating scales than self-rated T&Es and benchmarks
for scoring may be difficult to develop.
2) Relies heavily on written communication skill, so it is not appropriate for jobs that do not
require this skill; candidates with poor written communication skills may be at a disadvantage.
3) Shouldn’t be used when job requires limited reading, writing and analytical skills.
4) Requires staff/SME time to score, which may be costly and resource intensive when
there are large candidate populations.
5) As with other unassembled exams, not appropriate for entry level classes where you don’t
expect candidates to possess experience.
6) May be viewed as inconvenient or bothersome to applicants who don’t want to take the time
and effort, especially for higher level candidates with lots of experience.
Behavioral Consistency Example 1: Supplemental Questionnaire & Rating Guideline
INSTRUCTIONS
for
7263 MAINTENANCE MANAGER
Supplemental Questionnaire
The purpose of this supplemental questionnaire is to evaluate your experience in relation to the
7263 Maintenance Manager position for which you are applying. The supplemental questionnaire
will be used to: 1) assess candidates’ possession of minimum qualifications; and 2) determine
candidates’ score and rank on the eligible list. Of interest are the knowledge, skills and abilities
that you have acquired previously that relate to the essential functions associated with this class.
THIS SUPPLEMENTAL QUESTIONNAIRE MUST BE SUBMITTED WITH THE OFFICIAL
APPLICATION AT THE TIME OF FILING. FAILURE TO DO SO MAY LOWER YOUR
OVERALL SCORE ON THIS EXAMINATION OR RESULT IN YOUR
DISQUALIFICATION. FURTHER, PLEASE BE ADVISED THAT ANY FALSE,
INCOMPLETE, DECEPTIVE OR INCORRECT STATEMENTS MAY RESULT IN YOUR
DISQUALIFICATION OR DISMISSAL FROM EMPLOYMENT (IF AN EMPLOYEE OF THE
CITY AND COUNTY OF SAN FRANCISCO).
Answer the four (4) questions presented by giving specific examples of your most relevant
experiences. In order to understand the breadth and depth of your experiences, it is best that you
describe a variety of experiences, if appropriate. Include the name of the employer(s), the
employer’s address and phone number and dates of your experience.

SUPPLEMENTAL QUESTIONNAIRE
1. Describe your experience supervising or managing buildings and grounds. Indicate in your
response the:
• number and size of the buildings and grounds you supervised/managed;
• use of the building, e.g., residential, hospital, school; and
• job title(s) and number of craft and/or laborer employees you supervise(d), e.g., 1
Chief Stationary Engineer, 2 Senior Stationary Engineers, 3 Stationary Engineers,
2 Gardeners, 1 Carpenter, 2 Electricians, and 1 General Laborer.
2. Describe the annual preventive maintenance schedule for the buildings and grounds you
supervise or manage. Give a chronological schedule from either January-December or July-June.
3. Describe a specific major repair project that you supervised, detailing the different tasks
and craft workers involved in completing the project.
4. Attendance and productivity are two problem areas that sometimes arise in this field.
Describe one situation for each problem, without providing names of employees, that you
have encountered. Explain how you dealt with each problem situation.
Please provide the information requested below after your response to each question.
Job Title:
_________________________________________________________________________
Employer:
________________________________________________________________________
Can be verified by (Name) ____________________________Title:
___________________________
Address and phone number of person who can verify:
______________________________________________________________________________

7263 MAINTENANCE MANAGER RATING COMMITTEE GUIDELINES
1. Number, size, and usage of building(s) managed and variety of craft workers candidate supervised
(Supplemental Questionnaire, Question #1)
Strong: Candidate supervised/managed 11-20 medium-size or 1 institutional size building(s)
for complex use, such as a hospital, school or manufacturing plant. Supervised 6 or
more employees in different classes/crafts performing building maintenance.
Acceptable: Candidate supervised/managed 4-10 medium or 1 large sized building for
moderately complex use, such as an apartment or office building with 100+ units.
Supervised 3-5 employees in different classes/crafts performing building
maintenance.
Weak: Candidate supervised/managed 1-3 medium-size buildings for single use, such as
retail stores or movie theaters. Supervised 1-2 employees in different classes/crafts
performing building maintenance.
2. Knowledge of Building & Grounds Maintenance & Repair (Supplemental Questionnaire, Question
#2).
Strong: Preventive maintenance schedule that includes all aspects of buildings and grounds
maintenance.
Acceptable: Preventive maintenance schedule that includes varied building and grounds
maintenance.
Weak: Preventive maintenance schedule that is limited in scope and complexity.
3. Knowledge of Building & Grounds Operations Policies & Procedures (Supplemental Questionnaire,
Question #3).
Strong: Description of project that demonstrates difficult procedures that involve a full
range of skilled crafts and janitorial/grounds cleaning/maintenance.
Acceptable: Description of project that demonstrates moderately difficult procedures that
involve 3 or more crafts.
Weak: Description of project that does not involve varied crafts and procedures.
4. Supervisory ability (Supplemental Questionnaire, Question #4).
Strong: Explain expectations, inquire as to problems affecting attendance/
performance, monitor performance, take incremental actions if necessary.
Acceptable: Explain expectations, monitor and resolve problems.
Weak: Explain expectations.
5. Written communications ability (Supplemental Questionnaire-all questions).
Strong: Information is organized in a clear and concise manner, grammar and punctuation
are correct, layout and syntax of supplemental is easy to follow.
Acceptable: Each sentence focuses on making a specific point; information is presented in a
manner which is easy to follow.
Weak: Information is not presented in a clear manner; layout is difficult to follow.
3 point rating scale (3 = Strong, 2 = Acceptable; 1 = Weak)

Rating Sheet Sample
7263 Maintenance Manager
Candidate _________________________

KSAs (Weight)                                               QUESTIONS: #1   #2   #3   #4   AVERAGE OR COMPOSITE
A. Knowledge of Building & Grounds Operations Policies & Procedures [21%]
B. Knowledge of Building & Grounds Maintenance & Repair [16%]
C. Supervisory Ability [21%]
D. Human Relations Ability [26%]
F. Written Communications Ability [16%]
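The rating sheet above implies a simple weighted-average computation: each KSA's per-question ratings on the 3-point scale are averaged, and the per-KSA averages are combined using the listed weights. The following sketch is a hypothetical illustration only (the function and data names are invented; the actual scoring procedure is defined by the examination analyst):

```python
# Hypothetical sketch of a weighted composite for the 7263 rating sheet.
# Each KSA is rated 1-3 (Weak/Acceptable/Strong) on the questions that
# measure it; per-KSA averages are combined using the published weights.

def composite_score(ratings, weights):
    """ratings: {ksa: list of 1-3 ratings}; weights: {ksa: fraction}."""
    return sum(w * (sum(ratings[k]) / len(ratings[k]))
               for k, w in weights.items())

# Weights from the rating sheet (they sum to 100%).
weights = {
    "A. Operations Policies & Procedures": 0.21,
    "B. Maintenance & Repair": 0.16,
    "C. Supervisory Ability": 0.21,
    "D. Human Relations Ability": 0.26,
    "F. Written Communications Ability": 0.16,
}

# Illustrative ratings for one candidate.
ratings = {
    "A. Operations Policies & Procedures": [3, 2],
    "B. Maintenance & Repair": [2, 2],
    "C. Supervisory Ability": [3, 3],
    "D. Human Relations Ability": [2, 3],
    "F. Written Communications Ability": [2, 2],
}

score = composite_score(ratings, weights)  # falls between 1.0 and 3.0
```

A candidate rated Strong on every question would receive the maximum composite of 3.0; the weights determine how strongly each KSA pulls the composite up or down.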

Behavioral Consistency Example 2: Supplemental Questions for Budget Analyst
Applicants
1. Analytical and Quantitative Reasoning Abilities
Budget analysts must analyze complex technical data and other information, using their logic and
quantitative reasoning abilities. In doing this, they must be able to distinguish essential from unessential
information. In the space provided, please give examples of your past achievements demonstrating
these abilities.
2. Interpersonal and Organizational Skills
Budget analysts must be able to work with all kinds of people—different ethnic groups, personalities, age
groups and occupational levels. In addition, they must be able to determine where to go within their
organization for needed information and to judge what information should be passed on to different levels
of management. They must also be sensitive to the needs and requirements of people at different
organizational levels and realize the extent to which they can aggressively promote their own ideas. In the
space provided, please give examples of your past achievements which demonstrate that you possess
these skills.
3. Motivation, Initiative, and Ability to Organize Work
Budget analysts must possess initiative and motivation to learn new ideas and techniques. They must be
able to budget their time for accomplishing tasks and assignments within given guidelines. How willing
are you to seek out and assume additional responsibility and to explore better methods for accomplishing
your work? How well can you work on more than one complex project or assignment at a time, organizing
them according to their importance? In the space provided, please give examples of your past achievements
which demonstrate that you possess these skills.
4. Writing Ability
Budget analysts must be able to communicate well in writing. Can you write clearly and concisely? In
the space provided, please describe your past achievements demonstrating your writing ability.
5. Oral Communication Ability
Budget analysts must be able to react quickly, confidently, and with composure in stressful, interpersonal
situations and present ideas or information in an organized manner on short notice. How successful are
you in this type of oral communication? In the space provided, please give examples of your past
achievements demonstrating your ability to communicate effectively in such situations.
[Rating scales to evaluate applicant responses are not presented in this example.]
Behavioral Consistency Example 3: Instructions and Rating Scale for Director of
Communication Applicants
Instructions to Applicant: There are various factors presented below which describe the knowledge, skills
and abilities that are needed to do the job for which you are applying. These factors will be measured in
your examination. Read each factor to be measured, including the KSAs and the example experiences
associated with each factor. Spend time thinking about your experiences and training which show that you
have this particular knowledge, skill or ability. You may have acquired this KSA from your job
experiences, education, training or other activities or accomplishments.
Describe up to three jobs which gave you your BEST experience with respect to the factor indicated. Tell
us how you acquired the knowledge, skills and abilities associated with each factor by describing the duties
you have personally performed on the job which related to that factor. In detailing your duties and
accomplishments, be certain to follow any Special Instructions associated with each factor. You must
provide enough detail in your descriptions of your work experiences, education and training so that the
examiners reviewing your examination can get an accurate picture of how your background relates to this
area of the job for which you are applying. Be specific and clear in describing what you did and how you
did it. Failure to do so may result in the lowering of your score.
Also, for each job, record your official job title and the other information requested for verification purposes.
Please note that if the requested information is not provided, your examination will be considered incomplete
and will not be scored.
--------------------------------------------------------------------------------------------------------------------------
[The following is an example of the five factors that were evaluated as part of this recruitment.]
Sample factor evaluated: Management/Supervisory Ability (Factor weight: 20% of the final score)
This factor comprises the following KSAs associated with the job of Director of Communications:
Knowledge of and ability to apply management principles and techniques; administrative/supervisory ability;
ability to plan and direct promotional, informational and education communication and related projects;
ability to determine priorities and coordinate the completion of work and meet schedule deadlines;
organizational ability; ability to assign and review work; ability to provide training and technical assistance;
ability to screen and select staff; ability to evaluate the work of staff and conduct or assist in the conduct of
performance evaluations; ability to use professional judgment in performing sensitive and complex
assignments; ability to be an effective member of the management team in the problem-solving process.
Example experiences:
Experience coordinating the development of promotional, information and education services and/or leading
other professional and clerical staff in the completion of communication projects; experience planning
communication projects; experience determining priorities; experience assigning and reviewing work;
experience evaluating work performance; experience making recommendations regarding agency
communication policies and procedures; experience preparing or assisting in the preparation of unit plans,
policies and budgets.
Special Instructions: For each job, please describe your responsibilities managing programs, projects and
personnel. Indicate your specific management/administrative/supervisory responsibilities. Also include the
numbers and job titles of those persons you managed on projects or directly supervised or assisted in
supervising. Indicate any groups of individuals (e.g., vendors) that it was necessary for you to
coordinate/interact with to complete these projects.
--
Rating Scale for Factor: Management/Supervisory Ability
Definition – Experience coordinating the development of promotional, information and educational services and/or
leading other professional and clerical staff in the completion of communication projects; experience planning
communication projects; experience determining priorities; experience assigning and reviewing work; experience
evaluating work performance; experience making recommendations regarding agency communication policies and
procedures; experience preparing or assisting in the preparation of unit plans, policies and budgets.
[5 Rating] Very Highly Qualified – Candidate demonstrates outstanding experience in this area. Candidate has recent
experience with accountability for managing communications activities for a large and complex organization with
considerable public relations, public information/education, employee and media communication activities.
Experiences include: assigning and reviewing work of professional communications staff; evaluating employee
performance; training staff; coordinating the completion of work and meeting scheduled deadlines; supervising the
preparation and distribution of informational materials; conducting seminars and workshops to improve organization
communications. Candidates rated at this level should demonstrate the experience is within a similar position to the
class being examined for, (candidate is the highest ranking communications professional for the organizational) and that
supervision of communications staff is an on-going responsibility.
[4 Rating] Highly Qualified – Candidate demonstrates above average experience in this area. Candidate has experience
supervising communications activities for an organization with considerable public relations, public
information/education, employee and media communications activities as indicated above. Candidate demonstrates
experience planning and managing programs of public information and communications. Experiences include:
assigning and reviewing work of professional communications staff; participating in the evaluation of employee
performance; training staff; coordinating the completion of work and meeting scheduled deadlines.
Candidates rated at this level may not be the top level communications professional for the organization, but they
perform supervisory and management tasks in the communications area.
[3 Rating] Qualified – Candidate demonstrates average experience in this factor area. Candidate functions in a lead role
in a communications unit. Although this candidate does not have full supervisory responsibility, she/he has
responsibility for leading professional staff, providing training, evaluating work and may have input into performance
evaluations.
OR
Candidate has extensive experience as a team leader over communications projects including assigning and
coordinating the completion of work and meeting scheduled deadlines.
OR
Candidate has directly supervised and/or managed communications activities of some type, but the environment of the
job is very different from the position being examined for (for example, editor of a newspaper, supervising account
executive for an advertising agency, manager of a radio or television station, and other supervisory communications
positions which are not directly related to running an organization’s information functions).
[2 Rating] Minimally Qualified – Candidate demonstrates only marginally acceptable experience in this area.
Candidate has managed a directly related communications program including coordinating the completion of work and
meeting scheduled deadlines, but has not supervised or led staff.
OR
Candidate has only minimal supervisory experience (1 or 2 staff) in a position not directly related to the Director of
Communications Position.
OR
Candidate has some experience as a team leader over small scale communications project(s) including assigning and
coordinating the completion of work and meeting scheduled deadlines.
[1 Rating] Less Than Qualified – Candidate demonstrates less than acceptable experience in this factor area. Candidate
has no supervisory or lead level responsibilities over professional/technical level communications staff. Experience and
the jobs listed are not directly related to the Director of Communications position.
[0] Not At All Qualified – Candidate demonstrates no experience in this factor area at all.
BEHAVIORALLY ANCHORED T&E SCREENING
Rather than simply rank candidates on the basis of their score on a Behaviorally Anchored T&E,
these instruments may be used to also screen candidates on a pass/fail basis. Screening
committee assessments are generally used for management level classifications when an
assembled test (e.g., oral examination) is to be administered and it is not feasible to test the entire
applicant pool. However, DHR does have written exam instruments that can efficiently assess
large numbers of managerial candidates, so these populations can be screened (and ranked) with
those instruments.
The purpose of the screening committee assessment is to identify the most qualified applicants by
evaluating the quality of an applicant’s job-related training, experience, accomplishments and
knowledge, skills and abilities. Again, the focus here is on quality and proficiency level, not
“exposures”. That is, “exposures” described as broadly-defined experience or education are not
useful to determine proficiency level. Use of such broadly-defined experience for screening
purposes essentially constitutes replacement of the minimum requirements, a likely violation of
Title VII if those requirements are higher than the true minimum requirements. Use of any
Behaviorally Anchored T&E to screen candidate populations for titles other than manager,
therefore, must be approved by the DHR’s Director of Recruitment and Assessment Services.
As indicated previously, a supplemental questionnaire is required for this type of assessment and
responses to questions should be rated using behaviorally anchored rating guidelines. An example
of a cover sheet, the questions for a screening committee assessment supplemental questionnaire
and behaviorally anchored rating guidelines are presented below.
SAMPLE SUPPLEMENTAL QUESTIONNAIRE
FOR SCREENING COMMITTEE
SUPPLEMENTAL QUESTIONNAIRE
2594 EMPLOYEE ASSISTANCE COUNSELOR
READ THE ENTIRE PACKAGE CAREFULLY BEFORE COMPLETING IT.
The purpose of this supplemental questionnaire is to obtain a clear and concise summary of your
professional and educational background as it relates to the position of 2594 Employee Assistance
Counselor. The supplemental questionnaire and the regular application will be used to determine
if you meet the minimum qualifications as specified on the job announcement and determine the
best-qualified applicants who will be invited to continue in the examination process.
YOU MUST:
1. Provide all information and attach all documents of verification that are required. Unverified
experience, education, licensing, etc. will not be credited. Each page of the supplemental
questionnaire will direct you as to the type(s) of verification or information that you must
supply. They are:
a. Verification of Qualifying Experience and Education: The examination announcement
contains information about the requirements for verification. Please supply as much
verification as is necessary to meet the minimum requirements for this examination.
b. Additional Verification for the Supplemental Questionnaire: Parts of the supplemental
questionnaire will direct you to attach other documents if you wish to receive credit for
licenses, certifications, degrees, coursework, etc.
c. The Name of a Person Who Can Verify Information: The person should be someone such
as a supervisor, program director or personnel officer who has personal knowledge that
you either performed the specific activities or that your position required you to perform
such activities. You may use the same person, if appropriate.
2. Complete each section of the application and supplemental questionnaire as indicated even if
the information may appear redundant. Do not write, "See application."
3. Be thorough but concise. All of your information must be supplied in the spaces provided.
Provide your best or highest examples of work.
You may list unpaid work experience on the supplemental and regular applications. You may use
the same work experience or list the same employer more than once under each question or for
more than one question if you feel that it is appropriate. For example, you may have held different
positions or gained different work experiences while working for one employer. In that case, you
might want to list your experiences separately in order to demonstrate the diversity of experiences
you have gained.
CERTIFICATION
I hereby certify that my responses to this questionnaire are true and accurately reflect my
background, skills and experiences. I understand that any false, incorrect or deceptive responses
provided in this questionnaire may result in my disqualification for this, and possibly other, job
opportunities with the City and County of San Francisco. I understand and agree that all the
information that I provide is subject to verification.
Agree Name of Applicant ____________________________________
--------------------------------------------------------------------------------------------------------------------------
2594 EMPLOYEE ASSISTANCE COUNSELOR SUPPLEMENTAL QUESTIONNAIRE
1. For each relevant job position, or most relevant job positions, listed on your regular application,
delineate your experience in terms of the number of years/months worked, number of hours
worked per week and the percent of time devoted to direct services in the areas of substance abuse
and other mental health issues.
You are not required to supply letters of verification for every employment experience listed on
this page. However, you must supply information about a person who can verify that the
information you are supplying below is true and accurate.
Organization:
Your Position:
Address of organization:
Number of years/months:
Hours worked per week:
Percent of time spent providing direct substance abuse services:
Percent of time providing other direct mental health services:
Name & title of person who can verify information:
Day Phone of person who can verify information:
Organization:
Your Position:
Address of organization:
Number of years/months:
Hours worked per week:
Percent of time spent providing direct substance abuse services:
Percent of time providing other direct mental health services:
Name & title of person who can verify information:
Day Phone of person who can verify information:
Etc.
2. Briefly describe the types of training, educational workshops and/or group-process work that
you have conducted in the following areas: substance abuse, stress, communication,
supervision/management, assertiveness, habit abatement or other topics you believe are highly
relevant to the position of Employee Assistance Counselor. If you have the experience, list at
least two substance-abuse related trainings, workshops or groups. Briefly state the audience or
recipient of the service. Provide as many descriptions as space allows.
You must supply information about a person who can verify that the information you are
supplying below is true and accurate.
-------------------------------------------------------------------------------------------------------------------
Training/Workshop/Group:
Length of Training/Workshop/Group (in hours):
Number of times you conducted it:
Name of person who can verify information:
Position/Organization:
Training/Workshop/Group:
Length of Training/Workshop/Group (in hours):
Number of times you conducted it:
Name of person who can verify information:
Position/Organization:
Etc.
------------------------------------------------------------------------------------------------------------------
3. Briefly describe the nature and duration of your work experiences in the following functional
areas: organizational development, human resources management, outpatient/day treatment,
inpatient, residential, substance abuse, mental health, crisis intervention, case management and
other areas you feel are highly relevant to employee assistance programs.
List as many different types of experience as you can, particularly if you have a combined total of at
least 6 months of experience in the area. You may list several employers for one functional area.
You may use one employer to show experience in more than one functional area.
Include the name and agency where you gained the experience. Provide information about a
person who can verify the information.
Type of Experience / Agency
Number of hours worked per week:
Total length of experience:
Name, agency and day phone number of person(s) who can verify information:
Type of Experience / Agency
Number of hours worked per week:
Total length of experience:
Name, agency and day phone number of person(s) who can verify information:
Etc.
--------------------------------------------------------------------------------------------------------------------------
4. Place a check mark next to the degree(s), license(s), certification(s), etc. that you possess. You must
attach verification for all items that you check. Verifying documents must be those that are
commonly accepted as verification for professionals in this field, such as diplomas, transcripts,
licenses, certificates, etc.
A. Education: Check and attach verifying document(s) for ONE of the following that most
applies to you.
a. A baccalaureate degree in a related field
b. A master’s degree in a related field
c. No master’s degree, but two full years of relevant graduate-level coursework leading to a
master’s degree
d. A master’s degree in a related field plus at least one full year of related, advanced
coursework beyond the master’s-degree level
B. Licensing: Check and attach verifying document(s) for ALL of the following that apply to
you.
a. State registered internship for licensure as a mental health professional.
(You must attach a copy of the certificate with your registration number.)
Registration number:
b. Rehabilitation Counselor’s license/certificate
c. Marriage, Family and Children’s Counselor’s license
d. Clinical Social Worker’s license
e. Psychology or other related doctoral-level (Ph.D. level) license
f. Other license that you feel is highly related
Explanation:
C. Certificates: Check and attach verifying documents for ALL of the following that apply to
you.
a. Certification as an Employee Assistance Professional
b. Chemical Dependency Counselor’s certificate
c. Other certificate that you feel is highly related.
Explanation:
BENCHMARKS FOR ASSESSING 2594 APPLICATIONS
Screening Committee Guidelines
I. Types of training, workshops and/or process groups conducted
1. Substance abuse educational training/workshop
2. Substance abuse process group
3. Stress
4. Communication
5. Supervision/Management
6. Assertiveness
7. Habit Abatement
Superior response
Includes all of the types of training and groups above, or equivalent.
Acceptable response
Includes either 1 or 2 above and 3 of the items listed in 3-7, or equivalent.
Weak response
Includes 2 or fewer items listed above, or equivalent.
II. Length and diversity of experience beyond what was needed to qualify
Length of service beyond qualifying:
Superior: 4-5 years beyond qualifying
Acceptable: 2-3 years beyond qualifying
Weak: 1 year beyond qualifying
III. Diversity of Experience: candidate can be credited with experience in any of the
following functional areas if: (1) the applicant has at least 6 months of experience in the
functional area that is beyond and different from what was used to qualify and (2) the
experience the applicant has appears to be at a skill level equivalent to that needed to
provide professional-level service in mental health, substance abuse, employee assistance
or other direct-services programs.
1. Organizational Development
2. Human Resources Management
3. Outpatient/Day Treatment
4. Inpatient
5. Residential
6. Substance Abuse
7. Mental Health
8. Crisis Intervention
9. Case Management
10. Other, if related to EAP
Superior response
Includes 7 of the types of experience above or equivalent, including items 6 and 7.
Acceptable response
Includes either item 6 or 7, AND 4 other types of experience or equivalent listed above.
Unacceptable response
Includes 3 or fewer items listed above or equivalent.
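Read literally, the diversity-of-experience benchmark above reduces to a counting rule over creditable functional areas, anchored on items 6 (Substance Abuse) and 7 (Mental Health). The following is a hypothetical sketch of that rule; the guideline does not spell out every edge case, so the boundary interpretations here are assumptions, not DHR policy:

```python
# Hypothetical sketch of the diversity-of-experience benchmark.
# A response is Superior with 7+ creditable areas including both anchor
# items (6 and 7), Acceptable with one anchor plus at least 4 other
# areas, and Unacceptable with 3 or fewer creditable areas.

ANCHORS = {6, 7}  # 6 = Substance Abuse, 7 = Mental Health

def rate_diversity(creditable_areas):
    """creditable_areas: set of item numbers (1-10) in which the
    applicant has 6+ months of qualifying experience."""
    areas = set(creditable_areas)
    anchors = areas & ANCHORS
    if len(areas) >= 7 and anchors == ANCHORS:
        return "Superior"
    if anchors and len(areas - ANCHORS) >= 4:
        return "Acceptable"
    return "Unacceptable"
```

For example, an applicant credited in areas 1-7 would rate Superior, while one credited only in areas 1, 2 and 3 would rate Unacceptable.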
IV. Education
Superior: Master’s degree in related field
Acceptable: No master’s degree, but two full years of graduate-level coursework
leading to master’s degree
Weak: Baccalaureate degree (minimum qualifications)
V. Licensing and Certificates
Superior: Licensed Psychologist or other doctorate-level license, or Licensed Clinical
Social Worker
Acceptable: Licensed Marriage, Family and Children’s Counselor, Certified Employee
Assistance Professional, Licensed or Certified Rehabilitation Counselor or
Certified Chemical Dependency Counselor
Weak: No license or certificate
2594 EMPLOYEE ASSISTANCE COUNSELOR
SCREENING COMMITTEE RATINGS
Candidate ____________________________
Questions
KSAs (Weight)                                               I   II   III   IV   V   Average
A. Knowledge of interview, counseling, assessment and diagnostic tools [23%]
B. Knowledge of treatment modalities and group dynamics [20%]
E. Ability to develop treatment plans [18%]
G. Written communication ability [12%]
H. Human relations ability [17%]
I. Program development and implementation [10%]
ORAL EXAMINATIONS / STRUCTURED INTERVIEWS
Distinction between Interview and Oral Examination
In civil service work, the employment interview takes place after the candidate has already passed
the examination and reached a place high enough on the list to be certified. In most cases, this
type of interview corresponds roughly to the interview given in private industry when the job
seeker is interviewed by the employer on a hire or not-hire basis. Such an interview may be
unstructured (e.g., different questions are put to different candidates, the interviewer freely covers
information that s/he feels may be relevant). In unstructured interviews, the interviewees’ work
experience and personality are usually assessed, and the interviewer normally forms a global
impression of the interviewee without the benefit of rating scales or formalized rating criteria. In
fact, it is sometimes said that unstructured interviews tend to be a search for unfavorable
information (i.e., final evaluations are more closely related to unfavorable information obtained
than to favorable information obtained). Indeed, an interviewer may make a hiring decision early
in the interview based on minimal information. Also, the unstructured interview may be “one on
one” consisting of just the interviewee and one interviewer.
A structured interview, on the other hand, is a more formalized procedure which usually involves
multiple raters and which typically measures specific knowledge and/or abilities through the use
of rating scales, scoring guidelines and weighted scores. Further, with this procedure, applicants
do more talking than interviewers and decisions are made by the interviewers after all information
has been obtained. Most importantly, this type of selection procedure is standardized and all
candidates are treated the same way.
Some prefer the term “oral examination” in lieu of “structured oral interview” because it
eliminates any confusion as to whether the procedure is structured or unstructured. Also, the term
“interview” tends to connote questions that explore the candidate’s background (e.g., “tell me
about yourself”) whereas the “oral examination” makes it clear that the applicant is being
“examined” on some job-relevant worker characteristic.
Specifically, an oral examination consists of one or more questions usually presented orally to a
candidate (case study type orals may present information/questions in written form) to which the
candidate must respond orally. There are numerous variations on the kind of questions that may
be asked, the options concerning the time a candidate may have to prepare a response, and the
interaction that examiners may have with the candidate during the presentation of a response.
However, the basic component of an oral examination is that the candidate provides an oral
response that is evaluated according to the degree to which the response demonstrates the job-relevant worker characteristic(s).
SECTION IV – Page 32
Oral Examinations
Some knowledge, skills and abilities (KSAs) are most appropriately assessed through an oral
examination (or structured oral interview) process. When these KSAs are identified in the job
analysis as critical, the personnel analyst, with or without the help of SMEs may decide to
develop oral examination (or structured interview) questions and rating guidelines to assess them.
An oral examination can be of the question-and-answer type, as in the case of a structured
interview, or it can involve role-playing or a case study, or a combination of the three.
Regardless of type, the oral examination requires that all aspects of the examination be as similar
as possible for each applicant. In the question-and-answer type of examination, examiners or
raters ask a predefined set of questions. They may ask follow-up questions, but only if
the examination analyst has so specified, and when this occurs there are typically pre-determined
“triggers” that specify when those follow-up questions may be asked (such pre-determined
“triggers” ensure that everyone gets asked the same follow-up questions under the same
conditions).
Research has shown that structured, standardized interviews or oral examinations tend to have
significantly higher reliability and validity than unstructured interviews. Standardization ensures
that each applicant is evaluated in a consistent fashion. Without this consistency, it is impossible
to evaluate the relative qualifications of applicants. With a structured interview, job KSAs are
assessed by means of job-related questions.
Advantages of Oral Examinations:
(1) Tend to have good validity for predicting job success if well-structured.
(2) Cost effective for small candidate groups.
(3) Quick Turnaround time - can be developed relatively quickly and can efficiently process
recruitments with very low candidate counts (i.e., amount of time and work invested per
candidate).
(4) Allows for interaction between examiners (raters) and candidates.
(5) Good for assessing:
a. Oral communication ability – use when the job involves oral communication competence
in conjunction with the performance of job tasks.
b. Oral presentation ability – use when the job involves speaking and making presentations
before a group.
c. Interpersonal skills – use when the job involves oral interpersonal interaction in
conjunction with performance of job tasks.
d. Divergent thinking – use when the job requires broad, expansive or creative ways of
thinking (e.g., the job requires one to identify, consider, develop, etc. more than one or a
variety of ideas, causes, reasons, courses of action, procedures, solutions, etc.).
e. Multiple abilities – use when job requirements are so diverse that a synthesis of multiple
skills and abilities (characterized by dimensions) ought to be measured. Oral exams are
particularly suited for the assessment of characteristics/dimensions that are associated with
high-level titles (e.g., decision-making, leadership, planning, etc.).
f. Characteristics that cannot be effectively evaluated through other testing methods (e.g.,
foreign language skills).
g. Jobs for which English language literacy (reading proficiency) requirements are low.
h. Jobs for which the ability to “think on one’s feet” is important.
(6) Generally well-accepted – while it is not unusual to receive complaints from a candidate about
the basis for a particular rating, candidates are generally more accepting of exam content and
the actual test procedure.
Disadvantages/criticisms/drawbacks of Oral Examinations:
(1) Candidate population size – oral exams do not lend themselves to testing large numbers of
candidates because of time demands and security issues. Use should generally be limited to
small or medium applicant pools and/or with job classes where a written test is inappropriate
and inadequate. If the candidate population is large, it is common to put the total candidate
group through one or more hurdles (e.g., a written test or T&E) to reduce the number of
candidates who are invited to participate in the oral exam. [Adding an oral exam to other
selection measures for a given recruitment tends to enhance the overall validity of the
selection process.]
(2) Standardization – oral exams tend to be less objective than multiple-choice tests and may be
susceptible to undesirable subjective factors such as the halo effect, rater idiosyncrasies,
misinterpretation of candidate responses, etc. Oral exams must be administered consistently
and the test administrator must control for conditions in the exam setting, in administering the
questions, and in the application of rating standards. Hence, raters need to be trained and rating
guidelines need to be behaviorally anchored, etc. To maintain a record of candidate responses
and to document that the oral exam has been administered in a fair and standardized fashion, it
is always important to video and/or audio tape the oral examination administration.
(3) Not appropriate for solitary jobs that do not require candidates to be articulate or
communicative or for jobs that do not require a high level of interpersonal skills.
Linkage between the Job Analysis & the Oral Test Plan
Use the job analysis (JA) findings to identify the major job behaviors, dimensions or worker
characteristics to be evaluated. Table A below illustrates how the actual composition of an
examination can change depending upon the examination format selected. For example, in the
table it shows that the written examination will cover only 5 of the original 8 KSAs identified in
the job analysis. The oral examination, on the other hand, will measure only 4 of the 8 KSAs
identified in the job analysis. In this multi-test component selection process, 2 of the 8 original
KSAs happen to be measured by both the oral and the written examination. It should be noted
that, although both the number and the “mix” of KSAs differ across the two selection procedures,
both examination formats plan to address at least 60% (a majority) of the job’s KSA
requirements.
TABLE A

KSAs | JA % | Written % | Oral %
Reading Comprehension | 10% | 17% | not tested (Best tested via written)
Ability to Follow Written Instructions | 10% | 17% | not tested (Best tested via written)
Knowledge of Widget Processing | 16% | 26% | 26%
Written Communication Ability | 10% | 17% | not tested (Best tested via written)
Oral Communication Ability | 19% | not tested (Best tested via oral) | 30%
Knowledge of Widget Law | 14% | 23% | 22%
Knowledge of General Office Practices | 5% | not tested (Too low to test) | not tested (Too low to test)
Ability to Work Effectively with Others | 14% | not tested (Best tested via oral) | 22%
TOTAL | 100% | 100% | 100%
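The coverage figures cited in the paragraph above (a written exam covering 60% of the job's KSA weight, an oral exam covering 63%) follow directly from summing the JA weights of the KSAs each format measures. A minimal sketch in Python, using the figures from Table A:

```python
# Job analysis (JA) weights for the eight KSAs, from Table A
ja_weights = {
    "Reading Comprehension": 10,
    "Ability to Follow Written Instructions": 10,
    "Knowledge of Widget Processing": 16,
    "Written Communication Ability": 10,
    "Oral Communication Ability": 19,
    "Knowledge of Widget Law": 14,
    "Knowledge of General Office Practices": 5,
    "Ability to Work Effectively with Others": 14,
}

# KSAs each examination format plans to measure (per Table A)
covered = {
    "written": ["Reading Comprehension", "Ability to Follow Written Instructions",
                "Knowledge of Widget Processing", "Written Communication Ability",
                "Knowledge of Widget Law"],
    "oral": ["Knowledge of Widget Processing", "Oral Communication Ability",
             "Knowledge of Widget Law", "Ability to Work Effectively with Others"],
}

# Sum the JA weight captured by each format
coverage = {fmt: sum(ja_weights[k] for k in ksas)
            for fmt, ksas in covered.items()}
# coverage -> {"written": 60, "oral": 63}
```

Either format thus accounts for at least 60% of the job's KSA requirements, consistent with the text.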
Table B below illustrates how the KSA weights in the oral examination were recalculated from
the original job analysis weights. As shown in the table, the JA weights of those KSAs to be
measured by the oral exam are summed. Each KSA JA weight is then placed over this sum in a
ratio and multiplied by 100.
TABLE B

KSAs | Original JA Weight | Oral Weight
Knowledge of Widget Processing | 16% | 16/63 x 100 = 26%
Oral Communication Ability | 19% | 19/63 x 100 = 30%
Knowledge of Widget Law | 14% | 14/63 x 100 = 22%
Ability to Work Effectively with Others | 14% | 14/63 x 100 = 22%
TOTAL | 63% | 100%
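The recalculation in Table B can be sketched in a few lines of Python. Note that the table rounds 16/63 x 100 (approximately 25.4%) up to 26% so that the column totals exactly 100%; the largest-remainder rule used below reproduces that result, though the manual itself does not prescribe a particular rounding convention, so treat the tie-break as an assumption:

```python
def renormalize(ja_weights):
    """Rescale the JA weights of the KSAs selected for one exam component
    so they sum to exactly 100%, giving leftover percentage points to the
    KSAs with the largest fractional remainders."""
    total = sum(ja_weights.values())
    raw = {ksa: w / total * 100 for ksa, w in ja_weights.items()}
    weights = {ksa: int(v) for ksa, v in raw.items()}      # floor each weight
    leftover = 100 - sum(weights.values())
    # distribute leftover points by largest fractional part
    for ksa in sorted(raw, key=lambda k: raw[k] - weights[k], reverse=True)[:leftover]:
        weights[ksa] += 1
    return weights

# JA weights (from Table B) of the KSAs selected for the oral exam
oral_weights = renormalize({
    "Knowledge of Widget Processing": 16,
    "Oral Communication Ability": 19,
    "Knowledge of Widget Law": 14,
    "Ability to Work Effectively with Others": 14,
})
# oral_weights -> 26%, 30%, 22%, 22%, matching Table B
```

The same function can be applied to the written component's KSAs, or to any subset of JA weights chosen for a given selection device.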
Types of Oral Examination Questions
A variety of question types can be used to elicit information during oral examinations. Some are
more valuable for obtaining information than others. Following is a discussion of various types
including applications, advantages, and disadvantages.
1. Yes-or-No Questions. Simple “yes or no” questions are generally a poor choice for an oral
examination because they fail to provide in-depth information about the candidate and do not
encourage the candidate to talk. Also, quite often such questions may be leading as well (e.g.,
“You have never had an argument with a supervisor, have you?”). Avoid using yes or no
questions for oral examinations.
2. Direct Questions. Direct questions are generally used to obtain specific, factual information.
Such questions often start with the “W” words (who, what, when, where, why). Direct
questions are easily understood by the candidate and generally result in a concise answer that
produces specific information that may be useful.
It should be noted that certain types of direct questions are considered convergent questions.
These are designed to obtain one unique answer or conclusion to a problem or situation (e.g.,
“What type of roof construction is most hazardous in fire situations?” or “When is the
administration of atropine indicated?”). Typically, these questions measure free-recall of
knowledge. However, one limitation with these types of questions is that they tend to sample
only a small amount of a candidate’s overall knowledge of a given domain and, because oral
exams typically don’t have that many questions, the candidate’s true level of knowledge may
not be reliably assessed.
Direct questions, therefore, are most useful when they call for many responses to a problem.
For example, consider the following question: “You notice a pattern of sick leave abuse
among your employees. What should you do in this situation?” This question requires one to
address the various ways to investigate and analyze the problem and to take various courses of
action, which depend upon the cause of the problem and the conditions involved. These questions
are associated with divergent thinking, which refers to the production of a variety of ideas that
lead in different directions. They may require a response to a problem that has yet to be
defined, discovered and solved, and/or when there is no set way of solving the problem.
Having a candidate discuss, for example, various courses of action and various alternatives is
a better way to sample the thinking process of the candidate and such questions tend to better
distinguish the good from the poorer candidates.
Direct questions are also useful as follow-up questions to clarify responses to open-ended or
situational questions, which are discussed in a section below. Examples of direct questions
are: “What do you see as the primary responsibilities of a Station Supervisor?” “Why is
having a current inventory advantageous?”
3. Work Experience Questions. Work experience questions are typically open-ended questions
so that the candidate will do most of the talking and provide information about their
qualifications. The examiner will use this information to evaluate the candidate on job-related
KSAs. A possible disadvantage of open-ended, work experience questions is that candidates
may ramble and go into too much detail.
One type of open-ended question can tap into a candidate’s actual behavior if it asks for an
example of what the candidate has actually done to demonstrate the qualification(s) being
addressed – in other words, to ask about their background, training, work experiences and
accomplishments. Some examples of behaviorally anchored questions follow. You may want
to consider identifying the KSAs being tested and sharing them with the candidate so that she or he
can provide a more focused response.
Example 1: Open-Ended, Work Experience - Programmer
“This job calls for the ability to program in COBOL. Can you tell us what training you have
had in this area? Also, please give us a few examples of how you have used or worked with
COBOL on the job, to solve some business problem. Be specific about what you did and who
might have assisted you.”
Example 2: Open-Ended, Work Experience Question - Lineman Supervisor
“All line crews, at one time or another, are assigned an apprentice who is afraid of heights.
How have you handled such an apprentice on your crew?”
Example 3: Open-Ended, Work Experience - Cafeteria Worker
“What did you do the last time the grill you were using began malfunctioning and caused you
to stop working?”
4. Situational or Problem Solving Questions. Situational, hypothetical, or “what if?”
questions are open-ended questions which typically pose work-related problems. They are
often of the divergent type discussed above and usually require the candidate to analyze
information and then to describe how they would address the situation or problem presented.
These types of questions can often be designed so that they approach a job simulation.
The advantage of this type of question is that it requires the candidate to analyze the situation,
formulate a solution or course of action, and express this solution. The candidate’s response
provides information about his or her job-related KSAs, problem-solving abilities and depth
of experience. One disadvantage of situational questions is that a candidate may not actually do
on the job what is said. Also, considerable time may be required for the candidate to answer the
question fully. Some examples of situational questions:
Example 1: Situational Question - Labor Relations Manager
“I’m going to give you a hypothetical situation to see how you would handle this problem.
You are the Labor Relations Manager and have been handling a heated dispute between labor
and management. It is the afternoon of the third day and the parties do not seem to be getting
any closer to a solution. One of the representatives from the union and the Director of
Personnel for your department have broken into a heated argument and are now beginning to
exchange personal remarks. What would you as Labor Relations Manager do to ease this
confrontation?”
Example 2: Situational Question - Labor Relations Manager
“A supervisor wants to fire a subordinate due to the subordinate’s lack of motivation,
tardiness, and below-standard work. However, the subordinate has filed a grievance with the
union saying that she is being harassed by the supervisor because she is the only female in the
unit. How would you handle this situation?”
Example 3: Divergent, Problem-Solving Question Measuring Management Ability
“How can a fire chief reduce fire department costs and increase productivity in today’s
economy of budget crunches?”
Example 4: Situational Judgment – Fire Supervisor
“Assume you get this job and one of the firefighters you supervise comes to you and says that
another firefighter, Bob Jones, has the smell of alcohol on his breath. What would you do?”
The above question may include a second-part or follow-up question such as:
“When you talk to Bob Jones, you believe you smell alcohol on his breath, but he vehemently
denies it, and says he’s had nothing to drink. Just then, the fire alarm goes off. Jones is the
senior man on the shift, and is therefore the driver tonight. What would you do?”
Scoring guidelines for these types of questions should specify, to the degree appropriate:
1) key aspects of the problem with which the response must deal
2) key aspects for solutions that an adequate response should contain
3) how inappropriate or counterproductive strategies are to be graded
4) a complete model response, at least in outline form
With these types of questions, analysts must be precise in defining the problem; otherwise, the
candidate may provide responses that match the wording of the item, but not the examiner’s
intent. Examiners should also anticipate unique approaches to the problem which may differ
from the key response.
5. Evaluation Questions. In the evaluation or critique question, the candidate is presented with
a situation or problem and a possible solution or course of action and is asked to critique or
evaluate this solution. An advantage of this form is that the candidate’s attention can be
directed to a specific problem area. A disadvantage is that unless the problem is carefully
worded, the hypothetical question can be a leading question. An example of an evaluation
question is shown below.
Example 1: Evaluation Interview Question - Personnel Analyst
“A personnel analyst was conducting a meeting and was faced with one job expert who was
constantly complaining about what a busy man he was and what a waste of time it was to be
sitting in this meeting. After listening to the complaints for about 30 minutes, the analyst told
the job expert he could leave if he so desired. Please explain why the analyst handled this
situation appropriately OR inappropriately, whatever you believe to be the case.”
6. Explanation Questions. Variations of these items run from, “explain,” “explain how” to
“explain why.” Also, they are sometimes introduced by words such as “discuss” (often too
vague), “contrast,” “compare” or “describe.” Such questions are appropriate for assessing
complex knowledge areas and for tapping the understanding of underlying principles.
Typically the rater or examiner will need to be very familiar with the knowledge area to be
measured.
7. Self-Assessment Questions. Sometimes candidates are asked to tell the panel what they
consider to be their chief talents and abilities, as opposed to their training or accomplishments.
Although such questions are sometimes effective, candidates may inflate their abilities in an attempt to put their
best foot forward. To address this, such questions should also require the candidate to explain
why they rate themselves high in a given area.
Example 1: Self-Assessment Question – Social Worker
“How would you rate your ability to deal with elderly clients? Please explain why you rate
yourself that way and provide some concrete examples of what you have done.”
Example 2: Self-Assessment Question
“It is very important to be on time for this job every day. If we were to contact your present
and former employers, what would they say about your punctuality? (Follow-up) “What
would they say about your ability to get all your assignments done on schedule?”
Writing Oral Questions
To develop effective oral exam questions, one needs discipline and creativity. Most importantly,
one needs to understand the job so that the questions are job-related. The job description, job
analysis and SMEs are excellent sources for developing test questions. The test writer uses
information from these sources to ask, “What does it take to do this job and what does an
incumbent need to know and be able to do?” Other questions that can help stimulate the writing of
test items are:
What knowledge must an incumbent have in order to do the job properly? To find out whether
the candidate has this knowledge, is it better to ask them knowledge questions, or questions about
their background, or a combination of the two?
What new issues and trends in this field should a qualified person be aware of?
What kinds of decisions must the incumbent make on the job? [Think of specific examples of
difficult decisions to be made, and the key factors to be considered in making the right decision.
This can become the basis for several questions.]
What kinds of interpersonal situations arise in this job that an incumbent might be required to
handle?
Every test question should include a rationale (what is being measured and what is the correct
answer) to help determine its job-relevance and likely effectiveness. If there is no single correct
answer, indicate the principles or elements of the question that the candidate should recognize.
[Example: “Candidate’s response should recognize that a supervisor who doesn’t delegate has no
time to monitor and develop staff…]
General Considerations When Writing Oral Questions
Focus on those KSAs, dimensions, competencies or behaviors that are important for the job.
If the oral examination is the sole selection device, chances are that you will be measuring
fewer KSAs than what otherwise might be measured. Use the job analysis findings to assess
the most important or critical KSAs.
How well is the question comprehended on first hearing by the candidate? Avoid complicated
language or sentence structure. A candidate who finds a question confusing does not get a
chance to demonstrate his or her ability. A poorly worded question contaminates what is
being measured. That is, are you measuring the candidate’s ability to understand the question,
or the candidate’s knowledge? Ambiguity breeds errors.
Phrasing of questions should be “conversational,” not academic in tone. Where appropriate,
phrase the question in the working language of the job.
Asking questions about internal procedures or jargon of a specific department is not
appropriate if the announcement is open to the public.
Is the question or its difficulty level appropriate to the job? Familiarity with the job and a
good job analysis will help to avoid difficult, impractical or unrealistic questions.
For situational questions or problem-solving questions, is there enough information to “set
up” the question or is the analyst asking for more information than the candidate can grasp,
given the information provided? Be sure to provide the candidate with all necessary details
and background information. For case study types of situational questions, determine whether
the candidate will need preparation time to review the question or materials in advance of the
testing situation. Or, if the question simply requires the candidate to extensively “think
through” the problem, a preparation period may be in order.
Does the question ask the candidate what she/he WOULD DO or SHOULD DO?
Avoid leading questions (e.g., “Wouldn’t you rather use praise than discipline to motivate
workers?”).
When measuring knowledge, attempt to assess how well a candidate can apply the knowledge
as opposed to asking for knowledge in the abstract. Test-wise candidates may be able to tell
you what the textbook answer is to a question but applying that knowledge to an actual
situation is more difficult. “Test-wise” skills can be thwarted to some degree when a question
or item resembles a work sample.
Remember to use questions that are designed to get the candidate to talk. The more the
candidate talks the more you know about the candidate to evaluate.
Do not ask questions about race, sex, marital status, political affiliation, etc.
Use of Subject Matter Experts for Oral Examination Development
The development of an oral examination ideally involves close cooperation between the analyst
and Subject Matter Experts. The analyst may be quite capable of drafting test questions of a non-technical nature. However, the analyst may not have the technical knowledge of the job which may
be needed to develop test questions and responses. SMEs can frequently provide critical incidents
to use as stimulus material for test questions. They may also be able to provide training guides or
manuals which can be used to find ideas for test questions. In any case, it is highly recommended
that the analyst work with the job expert in the review and/or development of the test questions
and rating guidelines. For example, the analyst may decide to draft some questions and have the
job experts review them. It may be appropriate to use job experts outside the City for this
purpose. These might include supervisors from other jurisdictions; representatives from
equipment and manufacturing companies; and individuals who teach courses in community
colleges, trade schools, etc. It is recommended that extra questions be developed beyond the
number that will actually be administered, in the event that make-up examinations are necessary.
Oral Examination Development Meeting
Select and notify Subject Matter Experts and gather the following in preparation for the meeting
with the SMEs:
1. Job Expert Background Information and Qualifications Form.
2. Test Security Agreement and Statement of Responsibility Form.
3. Source Materials. The job analysis should help to identify activities, problems, and/or
situations encountered on the job. Ensure that the questions are appropriate for the
KSAs being assessed and at the appropriate difficulty level. Additional job materials
such as technical publications, manuals, training materials, etc., may also be helpful in
developing test questions.
4. It is helpful to have the job experts review the section of this manual titled
“General Considerations When Writing Oral Questions”.
At the time of the meeting, have the job expert(s) complete the appropriate background and test
security forms and then have them review the job analysis, with particular attention to those
KSAs for which interview questions will be developed. Explain that questions will be developed
to assess certain job KSAs or combinations of KSAs. The task/KSA linkup can also provide
ideas for question development. If necessary, present the Subject Matter Experts with a question
theme or tentative question and have them, as a group, suggest ideas for an interview question.
When you are satisfied that you have an acceptable interview question, rating guidelines must be
developed. It is very important that you take detailed and accurate notes during this meeting. The
importance of maintaining the confidentiality of the questions must be discussed with Subject
Matter Experts. Care should be taken that Subject Matter Experts leave all sensitive materials
with the analyst.
Interview questions and rating guidelines may need to be edited and revised. Completion of this
step may require that you contact Subject Matter Experts for additional information and/or
verification of information. It is the analyst’s responsibility to ensure the validity and
professionalism of final exam products. One key responsibility of the analyst is to prepare a
question/KSA matrix to ensure that each KSA is adequately measured in relation to its weight
(test content analysis).
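The question/KSA matrix described above lends itself to a quick arithmetic check: total the points each question contributes to each KSA, then compare every KSA's share of the exam against its target weight from the job analysis. The sketch below is illustrative only; the function name, the matrix layout, and the 5-point tolerance are assumptions, not a prescribed procedure:

```python
def content_check(question_matrix, target_weights, tolerance=5):
    """Sum each KSA's points across all questions and flag any KSA whose
    share of total exam points strays from its target JA weight (percent)
    by more than the tolerance."""
    points = {}
    for ksa_points in question_matrix.values():
        for ksa, pts in ksa_points.items():
            points[ksa] = points.get(ksa, 0) + pts
    total = sum(points.values())
    off_target = {}
    for ksa, target in target_weights.items():
        share = points.get(ksa, 0) / total * 100
        if abs(share - target) > tolerance:
            off_target[ksa] = round(share)
    return off_target  # an empty dict means every KSA is adequately covered
```

For example, with the Table B targets (26/30/22/22), a matrix whose per-KSA point totals match those percentages comes back clean, while a matrix that under-weights or omits a KSA flags it immediately.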
Considerations in Planning the Oral Examination or Structured Interview
General guidelines concerning the planning of the examination/interview are presented below.
Candidate Population Size. If the population is large, consider administering the oral exam
after the total candidate group has been screened or reduced by another selection device (e.g.,
a Behavioral Consistency instrument, a writing sample exercise, etc.). Also, it should be
remembered that, although PBTs are designed to speed list delivery, the validity of multiple
selection instruments is generally higher than that of any single component.
Raters - It is generally not recommended to use hiring managers to assess current employees
in an oral exam due to possible perceptions of impropriety. [This is not the case, though, for
certification interviews.] If hiring managers are used, candidates who are employees should
be assigned to other examiners who do not know the employees. Candidates should be made
known to their oral examiners only by their identification number, not by name. Examiners
should recuse themselves if a strong personal or familial relationship exists between a
candidate and the examiner or if there is any concern that the examiner may be unable to
evaluate the candidate impartially. In such cases, upon recognition of the candidate, the
examiner immediately must bring the matter to the attention of the test administrator who, in
turn, should find an alternate examiner to evaluate the performance of that candidate.
Rater Anonymity and Challenges – Civil Service Rules in the City and County of San
Francisco state that the identity of any examiner or board member giving any mark or grade
shall not be disclosed. However, candidates do have the right to object to or challenge a
particular examiner, provided that they make their objections known to the test administrator
prior to participation in the examination. Consequently, such an objection shall be made
based on the candidate’s physical recognition of the examiner upon entry into the examination
room, not on recognition of the examiner’s name. During the tape-recorded introductory
remarks of an oral examination, the candidate should be asked if s/he has any objections to
any of the members of the oral examination board. [Oral board members, similarly, should be
asked if they have any concerns about rating the candidate.] If the candidate voices an
objection, the test administrator should find an alternate examiner to rate the performance of
that candidate. If another rater is unavailable and the panel consists of three raters, remove the
rater who is the focus of the objection and allow the oral to proceed with two raters. [Please
note that the names of the candidates should not be known to the raters either. Rather,
candidates should be identified only on the basis of their candidate identification number.]
Length of Interview. An interview of 20 to 40 minutes is generally sufficient to obtain
information necessary to assess applicants with respect to certain job KSAs. This time period
typically allows interviewers to ask between 5 and 8 questions.
Number of Raters – Increasing the number of raters will increase the reliability of the ratings. It is
recommended that a minimum of two raters be used on a given rater panel. When the KSAs
to be assessed are not highly technical, it is acceptable to use one trained HR analyst and one
SME drawn from the field associated with the title, to serve on the oral panel. If the KSAs to
be assessed are not of a technical nature, it is acceptable to use two trained HR analysts to
serve on the panel.
Scheduling Format – Most analysts find that it is difficult for a given rater to process more
than 10 to 12 candidates a day. Fatigue sets in and may affect the reliability of the
assessments. Some strategies to use when many candidates need to be tested are:
• Double-Chain Oral – This consists of two oral examination panels or boards. The
examination questions are split between the panels so that, for example, “Panel A” is
responsible for questions #1, #2 and #3, while “Panel B” is responsible for questions #4,
#5 and #6. [Both panels, however, would assign Oral Communication ratings which
would then be averaged.] Thus, when a candidate is done with Panel A, s/he moves on to
Panel B. This staggered processing makes it possible to test more candidates within a
given time period than the traditional “straight” oral method. A primary consideration in
this type of scheduling is that candidates spend an equal amount of time with both panels.
This avoids “back-up” and candidates waiting between rooms for unequal periods of
time. Sometimes three panels are used in the same manner (“Triple-Chain” orals).
• Parallel Processing – This involves two or more panels. Each panel is responsible for the
same set of questions but administers the questions to different candidates. This format
invites problems in standardization (e.g., one panel being more harsh or lenient in its
ratings than another). For this reason, this approach is less preferred than the “Double-Chain”
approach. There are times, however, when there is no alternative to this approach,
and in such cases certain safeguards should be followed:
  o All raters should be provided extensive training on administration procedures and the
    rating scales.3
  o If practicable, all raters should “rehearse” by administering and rating the examination
    with “mock” candidates. Their ratings should then be reviewed to:
    - Determine the need for further training (i.e., is there strong inter-rater reliability?).
    - Eliminate those raters who are consistently assigning ratings which are not in
      agreement with the other raters.
    - “Balance” the composition of the panels in terms of how “lenient” or “harsh” the
      individual raters may be.
• Multiple-day testing – This can involve any of the above formats. The obvious problem
with multiple-day testing is test security and, for this reason, it is the least preferred option.
3 APPENDIX VI – ORAL PERFORMANCE EXAMINATION MANUAL – A GUIDE FOR THE RATING PANEL is useful to orient raters.
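The mock-candidate review described in the safeguards above can be supported with a simple calculation: compute each rater's mean rating (to spot lenient or harsh raters) and the average disagreement between every pair of raters who scored the same mock candidates. A minimal sketch; the rater names and scores below are hypothetical, and mean absolute disagreement is only a rough stand-in for a formal inter-rater reliability statistic:

```python
from statistics import mean

def rater_summary(ratings):
    """ratings maps each rater to the scores he or she assigned the same
    mock candidates, in the same order. Returns each rater's mean score
    (to spot lenient or harsh raters) and the mean absolute disagreement
    for every pair of raters (a rough inter-rater reliability check)."""
    means = {rater: mean(scores) for rater, scores in ratings.items()}
    raters = list(ratings)
    disagreement = {}
    for i, a in enumerate(raters):
        for b in raters[i + 1:]:
            disagreement[(a, b)] = mean(
                abs(x - y) for x, y in zip(ratings[a], ratings[b])
            )
    return means, disagreement
```

Raters whose means sit well above or below the others are candidates for further training or for “balancing” across panels, and large pairwise disagreement signals weak inter-rater reliability.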
Train SMEs. Allow sufficient time to train the raters. Untrained raters will likely yield less
reliable and less valid results.
Nature of KSAs. Various characteristics of KSAs are considered in planning the interview.
The number of questions developed for each KSA and interview length may vary depending
on the nature of the KSA being assessed.
Oral Communications Ability. No interview question needs to be developed specifically for
Oral Communications Ability. This ability is assessed on the basis of the candidate’s overall
response to all of the interview questions. Therefore, only one rating is needed (i.e.,
candidates don’t need to be assessed on their oral communication ability with respect to each
question).
KSA Weight. For heavily weighted KSAs, it may be desirable to develop and use more than
one question to assess that KSA.
Rating Guidelines
There are different types of rating scales but the type recommended by this Department is to
provide a fairly extensive checklist of items of behavior or responses. It is primarily used for
knowledge areas and problem-solving as well as for supervisory or administrative abilities.
During the interview the oral panel checks relevant items, and then later refers to the list when
making summary evaluations. This method tends to separate the function of observing and
recording from that of evaluating, in the belief that increased accuracy will result.
General Considerations Involving Rating Scales
Rating guidelines serve to focus the rater’s attention on the specific KSAs being assessed and on
specific responses that are acceptable and unacceptable. Consider the following when developing
rating scales:
1. A rating scale should contain enough points to differentiate among candidates and to
ensure reliability. Research indicates that the maximum number of reliable
discriminations is 7, with the optimum from 5 to 7 scale points. We normally use
5 points (5 being the highest and 1 the lowest) on an equal-interval scale.
2. Ideally, all points on the scale should be behaviorally defined – but this may depend on the
question. The “5” (optimum), “3” (marginally acceptable or barely passing), and “1” (nil)
responses should, at a minimum, always be behaviorally defined.
3. Typically, the points are defined in terms of areas to be covered by the candidate, the
quality of a problem-solving solution, the number of response elements mentioned or the
degree to which a candidate exhibits a certain ability (e.g., sensitivity).
4. Anchor points – The behavioral descriptions that define the points on the rating scale
"anchor" the scale so that the points always mean the same thing. This provides the
mechanism to promote inter- and intra-rater reliability. The general rules for anchor points
are:
SECTION IV – Page 44
a. The "5" point is viewed as the optimum response to the question. When
developing a 5-point scale, begin by writing the optimum response.
b. The "3" point delimits the barely passing response. This "3" point should be
defined so that responses that meet it are at the minimum level of acceptability.
That is, once the optimum response is defined, decide what aspects of the optimum
response must be represented by the candidate to constitute a barely passing
demonstration of the characteristic(s) being rated.
c. The "1" point is defined to include weaknesses of such magnitude that the
candidate's response is worthless or fatally flawed. It is also applied to candidates
who simply fail to respond or who fail to adequately demonstrate the characteristic
for which the rating is made.
d. The "2" point and the "4" point are defined implicitly by their neighboring points,
although the examiner may choose to define them explicitly.
5. The exam author may use a different model provided that:
a. there is a clear rationale for its use;
b. differentiation (i.e., separation of candidates based on performance) and reliability
issues are addressed; and
c. anchor point definitions are in accordance with the rules given above. The 1-3-5
points must be explicitly defined.
6. Each characteristic to be measured should be operationally defined. For convenience, it is
permissible to group KSAs into broader categories for the definition of the characteristic
provided that:
a. the test plan specifies which KSAs are covered by which questions, and
b. the definition of the characteristic used in the oral examination includes the
(abbreviated) description of the subsumed KSAs. For example, a characteristic
labeled "Technical Knowledge" can be defined as including knowledge of several
specified areas. Of course, these areas must be covered in questions within the
examination, with coverage consistent with the job analysis.
7. Characteristics should be defined so as to be mutually exclusive.
8. A question may be rated on more than one characteristic, if the question is appropriately
designed to elicit manifestation of more than one characteristic.
Examples of two different types of rating guidelines are presented below.
Example 1. KSA Assessed: Knowledge of Pole Climbing Procedures
Question: What factors are important when circling a pole without a safety strap on?
5 Rating – Correctly identifies the following 5 factors:
- maintain balance
- complete circle at one level (no upward or downward movement)
- maintain correct body posture
- drop correct foot (lead foot)
- take smooth, not choppy steps
3 Rating – Correctly identifies only 2 or 3 of the above factors.
1 Rating – Fails to identify any of the above five factors.
SECTION IV – Page 45
Example 2. KSA Assessed: Knowledge of Driver’s Permit Application Procedures for
Applicants with Medical Conditions
Question: Bill, age 17, has just applied for a learner’s driving permit. Although he takes
medication daily, he occasionally suffers from convulsive seizures, the last occurring 14
months ago. Since then he has been normal both physically and mentally.
a) Under what conditions will Bill be issued a learner’s permit?
b) What information may Bill be required to furnish to the Director?
c) If Bill is granted driving privileges, under what conditions may he retain these
privileges?
Suggested Response
5 – An Optimum Response includes all of the following:
(a) 1. Bill must establish to the satisfaction of the Director that he has been free
from recurrent convulsive seizures for a period of one year with or without
medication and
2. that he is physically qualified to operate a motor vehicle.
(b) Bill may be required to furnish the Director with:
1. a statement of his case history;
2. a statement by his treating physician, including diagnosis, treatment and
prognosis; and
3. any other information which the Director may deem necessary to evaluate his
qualifications to operate a motor vehicle.
(c) As a condition precedent to the issuance of driving privileges, Bill must:
1. agree in writing to submit to the Director periodic reports containing a
statement of the individual's case history and a statement by the treating
physician.
2. These reports shall be submitted every 6 months for a period of 2 years from the
date approval is given to hold a driver’s license.
3. Subsequent reports shall be submitted on a yearly basis.
4. The Director may, in his discretion, waive or change the interval report
requirements.
4 – Clearly Adequate Response: better than the Barely Passing Response but less
than an Optimum Response.
3 – Barely Passing Response: includes at least #2 from (a), and one part of (b), and
one part of (c).
2 – Clearly Inadequate Response: better than the Nil Response but less than a
Barely Passing Response.
SECTION IV – Page 46
1 – Nil Response: one that is so vague, inaccurate and/or incomplete as to indicate
a complete lack of knowledge or understanding of the subject.
Prodding/Probing
The definition of prod is “to incite to action; to stir; to stimulate”. For examination purposes,
however, the word refers to examiner remarks which “direct” or “re-direct” the candidate. In oral
interviews, the word “probe” is also used, which means “to examine or investigate penetratingly;
to delve into". Prodding and probing can be valuable tools in oral exams as they can be used to:
• Redirect a candidate who has misconstrued an item
• Seek clarification on an ambiguous response
• Challenge a response to see if the candidate can defend it
• Request that the candidate respond in more depth
• Start a candidate who has momentarily "frozen up"
However, prodding and probing introduce additional variables into what should be a standardized
testing situation. Consequently, some agencies choose not to prod or probe in order to avoid
possible protests about a lack of standardization. Some agencies, in fact, instruct their raters or
examiners to remain completely silent throughout the oral interview, save possibly the reading of
the actual test questions. Yet, complete silence by the members of the oral panel during the
administration of the interview presents somewhat of an artificial environment that can be quite
disconcerting to candidates. That is, an oral interview or examination can be anxiety provoking
as it is, let alone when one is required to talk before silent faces. Indeed, research has shown that
excessive anxiety can actually narrow one’s focus and ability to consider alternative courses of
action.
A no-prodding or "one-way communication" policy raises a greater concern, however,
than an artificial situation or candidate nervousness. That is, personnel analysts should
always try to minimize the possibility of rejecting candidates (false negative errors) who are in
fact prepared and ready to do the job. A knowledgeable and capable candidate can do poorly on
an exam if s/he has misconceptions on what is expected of him/her during the oral interview or if
s/he misinterprets a question, etc. Prodding and probing can mitigate these potential sources of
measurement error. If done well, prodding and probing can help to provide a truer picture of a
candidate’s knowledge and ability level without creating an unfair or unstandardized test
situation.
Given that standardization remains a concern, some agencies may allow for certain types of
prodding or probing. The following presents a variety of guidelines that may be followed. Of
course, all candidates should be treated the same way when following these guidelines.
• No Probe – Only Restatement. The oral examiner may restate the item only, do so "x"
number of times, and the restatement must closely follow the original phrasing of the item.
SECTION IV – Page 47
• No probe on key concepts. The candidate is expected to provide certain key aspects of the
response spontaneously; these aspects are identified for the oral examiner to avoid in
probing.
• Probe for reasons only. The item may call for the candidate to agree or disagree with a
particular position, or else to describe a course of action in response to a problem. If the
candidate does not provide reasons for his position spontaneously, the oral examiner may
be permitted to request them.
• List generation. Some items may call for listing of examples, reasons, etc. The candidate
may be required to generate a number of these elements. Some candidates may terminate
their response after providing a few elements, believing them to be sufficient.
Consequently, probing allows the candidate to provide additional response elements until
the response is truly exhausted.
• Challenge. The candidate might be expected to supply supporting reasons for a position
taken with respect to his/her response, even when confronted with an alternative position.
This is often useful to determine how "open" or "flexible", etc., the candidate is with
respect to alternative solutions/ideas. It is also a way to uncover the candidate's reasoning
and the strength of conviction behind his/her approach.
Do’s (general)
1. Letting candidates know that you are looking for additional information by saying "Is there
anything else you can add about…" or "Can you tell me more about…" is generally an
acceptable practice, provided all candidates are given the same opportunity. As a general rule,
give the candidate at least one but no more than two "shots" at providing more information.
2. Attempt to use the candidate’s own words when phrasing a prod, if greater clarity or detail is
sought (i.e., when the candidate doesn’t say enough to get full credit for something). This
encourages standardization and avoids giving extra cues. “You said a moment ago that
_______ is important for _____. Can you tell me more about that?”
3. “Situationalizing” the prod often helps candidates to “throw” themselves into the question.
For example, a question might read, “What should a supervisor do when…..?” However, the
candidate might have difficulty responding to such a hypothetical situation. In those cases, it
may be permissible to say “Let’s assume that YOU are THAT supervisor. What would YOU
do……?”
4. As appropriate, give the same or similar prod to all candidates, and to the same extent, for the
purposes of consistency and standardization. This always should be done for “simple
rephrases” of the question (e.g., which may be necessary if it becomes readily apparent that
the original question could have been worded better).
5. If you forget or if you didn’t hear a candidate’s answer, don’t be afraid to ask for a repeat.
6. Sometimes the candidate’s answer is egocentric, presenting responses in a way that makes it
difficult for the listener to (a) follow or (b) assess whether the candidate has the knowledge or
ability being measured. In such cases, it is often helpful to inform the candidate that s/he
should “assume no knowledge on the part of the examiner concerning ____ (e.g., the subject
matter), when answering the question.”
Don’ts
SECTION IV – Page 48
1. Don’t give the answer away with a strong cue or clue word (e.g., “What about training?”).
The level of association between the cue/clue and the desired response should not be
automatic or a “mental reflex”.
2. Don’t interrupt the candidate’s train of thought (maybe s/he is getting to the point down the
line).
3. Don't entertain preconceptions about the candidate's "knowledge" (e.g., "Well, you know
about ___. Can you please tell me about that?").
4. Don't prod to excess. Too many prods can lead to a "response set" where the candidate
becomes dependent on, and awaits, the prompt before providing additional information. Also,
prodding can irritate or make some candidates nervous if they see it as evidence of their poor
performance. [Candidates generally welcome prods so the former is actually more likely than
the latter.]
General oral examiner/rater behavior
1. Consistency is the key. All candidates must be treated equally.
2. Be courteous, make candidates comfortable and be empathetic to their stressful situation.
It is the examiner’s responsibility to give each candidate every consideration. This
includes not only offering a cordial greeting but also starting the examination on time (a
potential appeal issue).
3. Take a warm-to-neutral approach; remain professional throughout. Do not be overly
friendly.
4. Avoid the temptation to compare candidates to one another. Compare performance only
to the scoring criteria.
5. Avoid strong body signals or messages (e.g., eyes, posture, etc.). Even if you are bored or
unimpressed by the candidate’s responses, don’t send that message. Conversely,
enthusiasm for exceptional candidates should be suppressed. Remember—standardized
conditions.
6. Listen carefully to the candidate. This is easier said than done, especially after examiner
fatigue starts to set in after a long day. The concentration and mental energy needed to
maintain standardized conditions can be quite draining and may lead to “examiner
burnout.”
7. It is recommended that examiners take notes while the candidate is responding, although
extensive note-taking should not come at the expense of listening. Notes are an
extremely helpful aid – don't rely on memory alone to recall all of the candidate's
responses.
8. Examiners should be thoroughly familiar with the rating scales prior to the actual
administration, in order to be attuned to the candidate’s responses and to specifically
address those issues/items/areas covered in the rating scale(s).
Some Oral Rating/Scoring Issues
1. Halo Effect – This is the tendency to rate candidates high on all factors due to the forming
of a global impression. The “horns” effect refers to the tendency to rate candidates low on
all factors due to the forming of a global impression. Global impression is unduly
influenced by factors such as:
• One or two particularly good or poor answers/responses.
• Candidate's dress, posture, voice quality, physical looks, "likeability".
SECTION IV – Page 49
• Overall communicative ability.
2. Most popular techniques to determine scores:
• Average of examiners' ratings without scoring consensus.
• Average of ratings only if consensus is not obtained.
• Consensus or composite rating based on discussion.
3. Scoring/rating procedure – It is general practice for each oral board rater to make
independent ratings and record these in his/her notes. When all of the raters have
completed their ratings across all of the scales, the board or panel then discusses the
individual ratings in order to determine a consensus rating (i.e., ratings need to be at least
within one rating of each other and, in those cases when ratings differ by one rating, all of
those ratings are averaged).
a. Raters should avoid the temptation to remark on the candidate’s performance before
ratings are completed. If practicable, examiners should rate the candidate on each factor
as soon as s/he answers the last question pertaining to that factor. This will help to reduce
“halo.”
b. Research shows rater agreement often to be high even when discussion does not
occur. However, discussion may occur when ratings differ significantly and/or
when a candidate’s performance is borderline (no one wants to fail a candidate by
a point).
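The consensus rule described in item 3 – independent ratings must fall within one rating of each other, in which case they are averaged, otherwise the panel discusses – can be expressed as a short sketch. This is illustrative only; the function name is invented:

```python
def consensus_score(ratings):
    """ratings: the panel's independent 1-5 ratings on one scale.
    Ratings within one point of each other are averaged; a wider
    spread means the panel must discuss before a score is recorded."""
    if max(ratings) - min(ratings) <= 1:
        return sum(ratings) / len(ratings)  # consensus: average the ratings
    return None  # no consensus: discussion required

# e.g., consensus_score([4, 4, 3]) averages the ratings, while
# consensus_score([5, 3, 4]) returns None, signalling discussion.
```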
4. Notes - It is recommended that examiners take notes on the candidate’s performance. This
is especially true when the candidate displays revealing behaviors that can’t be recorded
on an audio-tape or when the candidate’s performance is below marginally acceptable.
Oral Communication Scoring
1. An overall, global evaluation of performance across the entire examination/interview is
the most popular approach to scoring oral communication ability. However, there may be
times when certain test items preclude proper assessment of oral communication, so
scoring may be limited to a set number of items. It should be noted that it is probably
unnecessary to score oral communicative ability separately on an item by item basis given
that global assessments appear to be more reliable. Also, such scoring runs counter to the
common belief that one’s ability level is fairly consistent and not dependent on what
information is being imparted.
2. It is the Department of Human Resources’ practice to view oral communicative ability as
"how" information is imparted, not "what" is imparted. That is, when assessing this skill,
we do not wish to contaminate the "how" with the "what", since they are independent and
can be treated as mutually exclusive factors. Indeed, the “what” aspect of the
communication should be measured by another scale. If not, contamination in scoring and
the halo effect can occur.
SECTION IV – Page 50
Examples of Oral Communication Rating Scales
ORAL COMMUNICATION SCALE
3 – Good to Excellent
Overall communicative ability was above the level required for this position.
Responses were very clear and concise.
Consistently used words appropriately and sentences were grammatically correct.
Presented ideas in a logical fashion and provided supporting information.
Rate of speech, pitch, syntax, intonation, pronunciation, inflection and volume were used
effectively to convey meaning or emphasis.
Consistently maintained an appropriate level of eye contact with the assessors.
Displayed a high level of confidence in his/her response.
Maintained a high level of interest and involvement throughout the exam.
2 – Adequate
Overall communicative ability was at the level required for this position.
Responses were, most of the time, clear and concise.
Generally used words appropriately and sentences were grammatically correct.
Presented ideas in a logical fashion, but didn't necessarily provide supporting information.
Rate of speech, pitch, syntax, intonation, pronunciation, inflection and volume were used
at times to convey meaning or emphasis.
Generally maintained an appropriate amount of eye contact with assessors.
Displayed an acceptable level of confidence in his/her responses.
Maintained an acceptable level of interest and involvement throughout the oral.
1 – Poor to Below Adequate
Overall communicative ability was not at the level required for this position.
Responses were unclear and/or not concise.
Responses were too brief to allow for an accurate assessment of communication ability.
At times, applicant used words inappropriately and/or sentences were grammatically
incorrect.
Ideas were not presented in a logical fashion.
Rate of speech, pitch, syntax, intonation, pronunciation, inflection and volume were not
used to convey meaning or emphasis.
SECTION IV – Page 51
ORAL COMMUNICATION RATING SCALE
CHECKLIST
( ) Diction
( ) Grammar
( ) Syntax
( ) Vocabulary Usage
VERBAL FACILITY – Familiarity with and ease of performance in the use of the
English language as indicated by diction (choice of words, especially with regard
to correctness and enunciation), grammar (e.g., change of word forms to indicate
distinctions of case, number, tense and person), vocabulary usage and syntax (the
harmonious arrangement of words, clauses and phrases).
( ) Organization
( ) Conciseness
( ) Brevity
( ) Persuasiveness
( ) Opening
( ) Closing
VERBAL ORGANIZATION – Relates to the cogency of the candidate’s response
(i.e., the expression of pertinent and fundamental points). For example, ideas
are presented in a well-organized and logical manner, without repetition
(other than normal summarizing statements). The candidate’s responses
are sequentially logical, contextually appropriate, informative and comprehensive.
Verbal organization also includes decisiveness and conciseness in the presentation
of ideas (e.g., no rambling, vacillation, excessive pauses, etc.). In a formal oral
presentation, the response should commence with an introductory statement, flow smoothly
into a relevant, stimulating and/or convincing discussion which captures the audience's
interest, and terminate with a summary/conclusion.
( ) Enthusiasm
( ) Confidence
( ) Independence
( ) Self-starting
INDEPENDENCE OF PRESENTATION – Responses are presented with
confidence. No undue assistance (i.e., in the form of reassurance, reconfirmation, or
direction) is needed from the examiner. Candidate does not rely on examiner's
auditory or visual feedback or on examiner's "cues" or "prompts" to develop or
continue his/her presentation.
( ) Clarity/Tone
( ) Volume/Rate
( ) Inflection
( ) Modulation
( ) Mannerisms
( ) Eye Contact
EXPRESSION – Articulations are clear and audible so that hearing, recognizing
and interpreting are accomplished without difficulty by the listener. Presentation is
free of distracting and/or annoying verbal and/or physical mannerisms (e.g., "uh,"
"you know," excessive use of hand gestures, poor eye contact, soft-spoken or
monotonous voice, hand over mouth, etc.).
RATING SCALE
5 – Candidate is consistently effective in his/her communication skills as delineated in the
above categories.
4 – Candidate's communication skills are typically effective, but s/he occasionally demonstrates
a weakness in one of the above categories which, at times, detracts from the presentation.
3 – Candidate's communication skills are acceptable, but weaknesses in two of the above
categories are pronounced and detract from the presentation. OR Candidate's communication
skills are acceptable, but a weakness in one of the above categories is of such proportion as to
consistently detract from the overall presentation.
2 – Candidate demonstrates pronounced weaknesses in three of the above categories which
seriously detract from the overall presentation.
1 – Candidate demonstrates weaknesses in all major areas, or candidate's responses to questions
are incoherent, unintelligible or expressed in a manner totally inconsistent with the testing
format.
SECTION IV – Page 52
Oral Communication Rating Scale
Effectively expresses ideas to audience; adjusts language or terminology to intended audience;
targets discussion to audience needs; communicates accurate information.
Rating
5 – Performance is MUCH MORE THAN ACCEPTABLE and significantly exceeds the
criteria required for job success.
4 – Performance is MORE THAN ACCEPTABLE and exceeds the criteria required for job
success. One weakness detracts from the communication.
3 – Performance is ACCEPTABLE and meets the criteria required for job success. Two
weaknesses detract from the communication.
2 – Performance is LESS THAN ACCEPTABLE and below the criteria necessary for job
success. Three weaknesses detract from the communication.
1 – Performance is MUCH LESS THAN ACCEPTABLE and significantly below the criteria
necessary for job success. Four or more weaknesses detract from the communication.
Effective Behaviors (check if area of weakness):
Clarity/Brevity: Speaks clearly and concisely.
Word Usage/Grammar: Uses appropriate words; uses sentences that are
grammatically correct, adjusts language or terminology to intended audience.
Organization: Presents ideas in a logical fashion; provides supporting
arguments; provides a conclusion.
Inflection/Modulation/Rate/Volume: Speaks at an appropriate rate; maintains
appropriate pitch and volume; properly uses pitch and volume to convey meaning
or emphasis.
Enthusiasm: Maintains interest and involvement throughout discussion.
Nonverbal: Uses gestures effectively without causing confusion or
distractions; makes eye contact when speaking; does not read directly from
notes, etc.
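One direct reading of the scale above is a simple tally: the rating is tied to the number of weakness areas checked. The following sketch is illustrative only; the function name is invented and the mapping assumes the count-based reading of the scale (0 weaknesses → 5, 1 → 4, 2 → 3, 3 → 2, 4 or more → 1):

```python
# The six checklist areas from the rating sheet above.
WEAKNESS_AREAS = ["Clarity/Brevity", "Word Usage/Grammar", "Organization",
                  "Inflection/Modulation/Rate/Volume", "Enthusiasm", "Nonverbal"]

def oral_communication_rating(checked):
    """checked: the weakness areas the rater marked on the checklist.
    Maps 0 weaknesses -> 5, 1 -> 4, 2 -> 3, 3 -> 2, and 4 or more -> 1."""
    unknown = set(checked) - set(WEAKNESS_AREAS)
    if unknown:
        raise ValueError(f"not checklist areas: {sorted(unknown)}")
    return max(1, 5 - len(set(checked)))
```

For example, a candidate with checked weaknesses in "Enthusiasm" and "Nonverbal" would receive a 3 under this reading.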
SECTION IV – Page 53
PERFORMANCE & JOB SIMULATION TESTS
The term performance test generally refers to a selection procedure that requires a candidate to do
and/or produce something physically so that his or her action(s) and/or product can be graded.
Performance tests measure physical agility/strength/endurance or motor skills such as
typing (i.e., physical performance or skills testing).
Sometimes candidates are asked to “perform” on a selection procedure (e.g. a written or oral
exam) which is designed to assess their competence involving an activity that is similar to a
workplace situation, assignment, and environment (i.e., the activity looks and feels like a piece of
the target job). These types of performance tests are better characterized as job simulations or
work samples, to distinguish them from physical or motor skill assessment. Sometimes a selection
process may include multiple job simulation exercises, such as those that comprise an assessment
center.
The critical factor differentiating job simulation exercises from all other test types, regardless of
their mode of presentation, is the high degree to which they approximate the job in reality. A test
can vary in terms of the degree of realism presented by the situation. For example, a test exercise
that requires a candidate to assume the role of a supervisor in a supervisor/subordinate role-play
involving the subordinate’s poor work performance might be considered highly realistic.
However, a question that asks a candidate to simply describe how s/he would interact with a
subordinate with a performance problem would be considered low on the realism scale.
Given that validity is maximized when the predictor used in selection (what is measured by the
test) is as similar as possible to the job performance behaviors to be predicted, it is not surprising
that job simulations or work samples are generally considered to be the best predictors of job
performance among all types of selection instruments.
Written Job Simulations - Written job simulations generally consist of writing exercises that are
designed to measure job-related abilities and/or the application of job-related knowledge.
Candidates may or may not be given an advance opportunity to study or review material and
prepare prior to the actual writing, which may be in the form of a report, letter(s), memo(s), or
essay question(s). This type of exercise is particularly effective at measuring multiple and
complex abilities such as analytical and administrative abilities. And, of course, it is the most
job-related way to assess written communication ability. Written performance tests should be
scored using behaviorally anchored rating guidelines similar to those used in oral exams.
SECTION IV – Page 54
Advantages of Job Simulations and Performance Tests:
1) Tend to have very high validity.
2) Good for measuring/assessing multiple KSAs, behaviors or competencies associated with
job-related activities, physical job skills and the application of job knowledge.
3) Tend to be well accepted by candidates because they are perceived to be job-related.
Disadvantages/Criticisms of Job Simulations and Performance Tests:
1) Can be time-consuming, expensive and resource-intensive to develop, administer and score,
and therefore not very efficient or cost-effective for large candidate pools.
Developing a Job Simulation Exercise - The design of a job simulation exercise and the extent
to which it realistically depicts job tasks and conditions will depend on various factors such as:
the scenario itself, availability of equipment, cost, time and safety. The job analysis serves as an
excellent source of ideas for potential tasks to be measured in the job simulation exercise. The
analyst should work with the Subject Matter Expert(s) to identify the tasks required to perform
job activities. Indeed, while the analyst might be familiar with the job activities and KSAs, s/he
is unlikely to possess detailed knowledge of job tasks. To acquire this knowledge, it is advisable
that the analyst visit the job site, observe the work being performed, talk to incumbents and
supervisors and obtain a comprehensive understanding of the tasks to be simulated in the exercise.
Consider the following when selecting tasks for inclusion in a job simulation exercise:
• Is performance of this job task required upon hiring? If not, then this job task should not
be included.
• Will training be provided on how to perform this job task? If so, this job task should not
be included.
• Is the job task an important one? If so, what is the consequence of error?
• Is it important to determine if the candidate can perform the job task, or is it important to
assess the manner in which the candidate performs the job task?
• The tasks selected for the exercise should generally result in the assessment of all the
brought-to-the-job KSAs that can be directly evaluated from observation of the
candidate's performance on the exercise. [Tasks that entail KSAs learned on the job
should not be used.]
Job Simulation Development Meeting with SMEs - The analyst should provide general training
to the SMEs in exercise development. This should include a brief discussion of the criteria for the
selection of job tasks for the job simulation exercise and development of objective rating
guidelines. For example, the analyst should explain to the SMEs that the tasks to be simulated
will be job tasks that meet the criteria mentioned above. The analyst should then provide the
SMEs with a copy of the job analysis, including the tasks, KSAs and task-KSA link-up
information. [Any other material that may be useful for development of the exercise should be
distributed as well.] A determination about the KSAs to be assessed by the simulation and the
SECTION IV – Page 55
job tasks to be simulated can be made at this time. It is important that detailed and accurate notes
be taken during this phase of the meeting.
Job Simulation Evaluation Factors – The candidate can be evaluated based on the product s/he
produces as a result of the simulation exercise (for example, assembling a parking meter
mechanism or reconciling financial statements), the process the candidate used to produce the
product or perform the activity, or both. Some types of simulations do not result in a tangible
product; in these cases, the process or procedure is evaluated. Examples are starting equipment
or climbing a ladder. Activities such as these require that the behavior be evaluated
“in progress,” with attention to important tasks and/or steps and proper sequencing. In other
simulations, the process is of little or no significance; the product or end result becomes the focus.
Examples are writing a computer program, drafting a report or preparing a work schedule. There
are also job simulations where both product and process (e.g., safety, speed, and efficiency) are
important and evaluated.
In consultation with the Subject Matter Experts, analysts should decide on the physical setting,
the availability and provision of necessary equipment and materials, safety precautions and
recommended time limits.
Scoring Dimensions and Rating Guidelines - Examples of factors that may be evaluated and
scored in a job simulation are presented below:
Process:
Quality
Quantity
Safety
Accuracy
Error rate
Choice of tools, equipment
Procedures used
Time to complete
Accident rate
Use of tools and equipment

Product:
Conformance to specifications
General appearance
Suitability for use
Quantity of output
Safety of completed product
The following questions should be asked once the scoring dimensions have been developed.
 Are the dimensions to be scored appropriate for the tasks that comprise the simulation?
 Can a rating scale or checklist be developed for the dimension to be scored?
 Is there a link between the rating guideline and the KSA(s) that comprise the dimension?
 Do the rating guidelines adequately cover important job tasks?
Thorough knowledge of the job, including the observable outcome of the job task, is required for
the development of rating guidelines. Rating guidelines for simulations must be designed to
ensure that raters or observers attend to the same aspects of performance and apply the same
standards to evaluate performance. Rating guidelines should specify, for each task in the
simulation, the scoring factors and KSAs associated with that task. Optimally, there should be at
least three scoring levels for each KSA being assessed on the performance test:
superior, acceptable and unacceptable performance.
Scoring Systems and Rating Scales - Performance tests can be scored with rating scales,
checklists or a combination of both. There are no hard and fast rules for selecting one or the
other. Whenever possible, use only one type of scoring system for a given test.
Rating scales can be developed for the evaluation of both product and process. When the product
is evaluated, the rating scales should describe the finished product at various levels of
acceptability. Each applicant’s product is then evaluated by comparing it to the rating scales and
determining where it most closely matches the descriptions associated with the ratings on the
scale. With process, the rating scales describe the behavior that can be demonstrated by
applicants at various levels.
Rating guidelines specify desirable and undesirable responses for each KSA being assessed by
each question. The degree of precision necessary for these rating guidelines varies with the job
class. Either a 3-point or a 5-point rating scale may be used; for practicality of scoring, the scale
should not exceed 5 points. In all scales, “1” indicates the lowest score.
The rating guidelines serve to focus the rater’s attention on the specific KSAs being assessed and
on the specific responses that are acceptable and unacceptable. Rating guidelines must include
descriptions of “superior”, “acceptable” and “weak” performance. It is desirable to include
examples of possible responses at all levels, with indications of the corresponding ratings.
EXAMPLE 1
CAFETERIA WORKER
Rating Guidelines (Rating Scale)
Task: Clean kitchen area including coffee machine and two soiled pots.
KSA(s) being evaluated: Ability to Clean and Sanitize Kitchen
Rating of 5:
Coffee machine is spotless
All surfaces including underneath the coffee machine are spotless
Pots are spotless
Burners are cleaned
Rating of 3:
Pots cleaned
Coffee grounds are dumped but coffee machine is not cleaned
All surfaces clean except under the coffee machine or burners
Rating of 1:
Fails to wipe machine
Fails to wipe burners
Fails to dump coffee grounds
Checklists - For checklists, either the task is broken down into steps, or the dimensions
characterizing the observable outcome are listed.
The checklist is a special type of rating scale: only ‘yes’ (acceptable) and ‘no’ (unacceptable)
ratings are possible. The observer or rater simply checks yes or no (or checks 1 point or 0
points) to indicate whether each item on the checklist was completed. Checklists work
particularly well when a sequence of individual steps needs to be assessed (see example below).
Analysts should determine how a checklist is to be scored in consultation with the Subject Matter
Experts. Sometimes it may be appropriate to assign higher weight to responses on the
checklist that are considered more important than others.
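The weighting idea above can be sketched in code. This is an illustrative sketch only: the checklist items and weights below are hypothetical and would be set in consultation with the Subject Matter Experts.

```python
# Illustrative sketch only: checklist items and weights are hypothetical
# and would be determined with the Subject Matter Experts.
def score_checklist(responses, weights):
    """Sum the weights of every checklist item marked 'yes' (True)."""
    return sum(weights[item] for item, checked in responses.items() if checked)

weights = {
    "governor control in start position": 2,   # judged more critical, so weighted higher
    "fuel valve in proper position": 1,
    "start switch held no more than 15 seconds": 1,
}
responses = {
    "governor control in start position": True,
    "fuel valve in proper position": True,
    "start switch held no more than 15 seconds": False,
}
print(score_checklist(responses, weights))  # 3 of a possible 4 points
```

An unweighted checklist is simply the special case where every weight is 1.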
EXAMPLE 2
ELECTRICAL MECHANIC
RATING GUIDELINES (Checklist)
TASK: Attempts to start engine electrically
KSA(s) being evaluated: Operating Procedures
Yes / No
1. Places fuel in tank position
2. Places oil pan baffle rod in proper position
3. Places air cleaner intake shutter in proper position
4. Places governor control in start position
5. Places remote /local switch in local position
6. Places emergency run/stop switch in normal position
7. Holds start switch in start position (no more than 15
second intervals)
Instructions for Raters, Proctors and Candidates - In addition to rating guidelines, instructions
are important components of performance tests. Instructions to both raters and candidates are
essential. These instructions should be very detailed and cover the testing procedures, equipment
and materials, time limits, test tasks, etc.
Instructions to proctors and raters should, as appropriate, outline test procedures, necessary
equipment and materials, describe physical setting and address safety precautions. They should
also include direction on what to instruct candidates, how to instruct candidates, what to do after
candidates are finished, the tasks on which performance is to be evaluated and the protest
procedure (if different from the general protest procedures). All test instructions should be
designed so that they are delivered to candidates in a standardized manner.
In a very simple performance test, instructions to candidates may be given orally. As a general
rule, though, candidate instructions should be written. They may be read to candidates and/or
given to them to read before taking the test. Instructions must be standardized (i.e., the same
presentation format and content for all candidates). Instructions generally should include: time
limits, purpose of the test, test procedures, equipment, a list of the test tasks on which
performance will be evaluated, and the test sequence if there are additional test
components. In addition, the protest or appeal policy and/or procedure should be explained,
preferably in writing. If it is provided in writing, candidates should sign that they have read and
understand the policy and/or procedure.
EXAMPLE
INSTRUCTIONS TO RATERS AND PROCTORS
Troubleshooting the Start Circuit of a 5 KW Generator Set
Raters and proctors shall not assist applicants during the performance test. They
are to watch applicant behavior closely for evaluation purposes and also to prevent
accidents and equipment damage. Raters will evaluate applicants on their
performance of tasks listed below.
Test Conditions:
This test will be administered indoors under fluorescent light.
Equipment Provided:
5 KW generator
Load bank
Hydrometer
Tool kit
Load cables
Schematic
Multimeter
Grease pencil
Equipment Setup: The cable and load banks will be connected for 120 volts,
single phase, and 60 cycle operation prior to the start of the performance test.
Wire # P29B18 going from #2 terminal of the S6 switch to terminal #2 of the
ammeter will be removed and replaced by a false wire.
Test Tasks:
1. Attempt to start engine electrically
2. Check battery voltage and specific gravity
3. Disconnect battery
4. Trace engine start circuit on schematic diagram using grease pencil. The
applicant will take each part of circuitry in sequence:
a. Energize the K4 coil
b. Energize the K1 coil
c. Energize the K4 coil and
d. Energize the start motor
5. Make continuity checks of the start circuit with a multimeter
6. Point out the defect to the observers on both the equipment and schematic
Time Limit: The maximum time allowed for the test is 30 minutes
EXAMPLE
INSTRUCTIONS TO APPLICANTS
Troubleshooting the Start Circuit of a 5 KW Generator Set
Test Purpose: The purpose of the test is to provide you with an opportunity to
demonstrate your ability to troubleshoot the start circuit of a 5 KW generator.
Using a prescribed procedure, you are to find any incomplete circuitry. You are to
furnish power for lighting the area. The output power is 120 volts, single phase,
and 60 cycle alternating current.
Equipment Provided:
5 KW generator
Load bank
Load cables
Multimeter
Hydrometer
Tool kit
Schematic
Grease pencil
Test Tasks:
1. Attempt to start engine electrically
2. Check battery voltage and specific gravity
3. Disconnect battery
4. Trace engine start circuit on schematic diagram using grease pencil. Take each
part of circuitry in sequence: a) energize the K4 coil, b) energize the K1 coil, c)
energize the K4 coil and d) energize the start motor
5. Make continuity checks of the start circuit with a multimeter
6. Point out any defects to the raters on both the equipment and schematic
Time Limit: You will be allowed 30 minutes to complete the test.
Scheduling and Administration of Performance Tests - In scheduling a performance test,
analysts must take into account the number of candidates, number of exercises to be completed by
each candidate, the scoring system, time limits, and the availability of raters. Examination events
or exercises should be sequenced to maximize the efficiency of the overall evaluation process.
The total number of candidates to be tested will affect the number of raters and analysts/proctors
needed for the administration of the test. Analysts must ensure that all equipment and materials
used are available for the test and in operating condition. Candidates should not be allowed to
bring in their own equipment or tools unless specified in the examination notice letter.
Analysts should emphasize to the Subject Matter Experts the importance of maintaining
confidentiality of the performance test tasks and rating guidelines. Care should be taken that the
Subject Matter Experts leave all materials with the analyst.
It is recommended that performance tests be pre-tested before their actual administration. The
performance test should be pre-tested or tried out with Subject Matter Experts, which will serve to
point out potential problems with instructions, equipment and materials, time limits, and scoring
procedures. This will enable the analyst to make the required revisions before the actual
administration.
MULTIPLE-CHOICE EXAMINATIONS
Multiple-choice exams take time to develop. Since the Position Based Testing Program focuses
on rapid delivery of eligible lists, developing a new multiple-choice examination may not be the
most appropriate selection method. For this reason, little will be said in this PBT manual about
developing new multiple-choice examinations. On the other hand, it may be entirely appropriate
to use a multiple-choice exam for Position Based Testing if it has already been developed and
standardized in advance of a need. By standardized, we mean that the content of the examination
and the rating key have already been validated and pre-tested by your agency or another approved
agency or vendor (e.g., DHR, WRIPAC, IPMA). It also should be noted that individual multiple-choice
items, by subtest area, may be obtained from the Western Regional Item Bank (WRIB).
Tests can be assembled using WRIB’s items. The content of such solicited items may be fairly
broad and generic, so care must be used to determine whether these items are, in fact,
appropriate to the class being tested. Sometimes they will need to be revised slightly to
make them appear more “face valid”. In any event, it is highly recommended that SMEs review
these items prior to use to (a) determine item relevance or appropriateness and (b) verify the
accuracy of the keyed answer.
Advantages of Multiple-Choice Examinations:
1) Tend to have excellent validity for predicting job success (particularly with regard to
knowledge tests and tests of mental abilities).
2) Highly efficient and economical for processing large candidate populations.
3) Objectively scored and generally have high reliability.
4) Appropriate for determining whether candidates:
a. Possess knowledge or mastery of subject matter needed for satisfactory or outstanding
performance on the job.
b. Can learn material which will be taught either in a classroom or on the job.
c. Can demonstrate abilities (e.g., analytical, judgment, comprehension or interpretation of
facts or information; etc.) or behaviors needed on the job.
5) Can be adapted for use with other types of assessment instruments to create hybrid exams
(e.g., multiple-choice in-basket, multiple-choice responses to a case study exercise, etc.).
Disadvantages/Criticisms of Multiple-Choice Examinations:
1) Often not appropriate for classes where the job does not use reading materials of
higher index (grade) levels, or where the vast majority of qualified applicants are likely to have
lower reading ability. [In these cases, the exam may measure reading ability rather than the
knowledge area intended for measurement.]
2) Multiple-choice tests that measure reasoning or cognitive skills may demonstrate
adverse impact; therefore, care must be taken that measurement of these skills is
job-related and a business necessity.
3) Time-consuming and costly to develop from scratch.
Purchasing a Multiple-Choice Exam (or any standardized exam) - The purchase of validated
examinations developed by outside companies or consultants may be appropriate under some
circumstances:
 high level classes
 specialized classes
 technically-complex classes
 large applicant pool
 lack of Subject Matter Experts
 time constraints
When considering the purchase of a vendor’s exam:
1. Determine the feasibility of using the purchased examination
 review the cost and determine the availability of funds
 review the structure, administration concerns, and the impact of using the purchased exam
on standard procedures, such as the terms of the announcement and inspection privileges
 determine whether there is a need to have the vendor approved by the Human Rights
Commission for the Equal Benefits Ordinance
2. Survey vendors
 determine sources for purchased exams
 request samples, demonstration materials, and/or a copy of an examination
 request reports of results (predictive validity, reliability, adverse impact, etc.)
 discuss payment (cost and mode of payment)
 discuss method of delivery, printing responsibilities, pick-up of exam booklets, etc.
 discuss who will administer the exam (i.e., the vendor or the Department of Human
Resources)
 determine if the vendor restricts use of the device or disclosure of any materials/results
 determine or review the vendor requirements for inventory, security, etc.
3. Assess appropriateness
 discuss with the vendor the appropriateness of using the purchased examination for the
department’s purposes
 compare what is measured by the examination with the important tasks and KSAs
identified in the City’s job analysis of the classification for which the exam would be used
 consider asking the vendor to develop a customized test specific to a City and County of
San Francisco position; this tends to be more costly
4. Recommend selection
 discuss findings, assessment and recommendations with the Department of Human
Resources.
ESSAY TESTS
Essay examinations require a written response from candidates. The response can be
short-answer, fill-in or long-answer. It is generally the latter that is used when written
communication skills are to be measured. [A sample written communication rating form is
presented below.]
Essay questions may be stand-alone written questions, or they may be multiple written questions
pertaining to a written stimulus material. Sometimes this stimulus material may require up to
an hour to review or analyze. Suggested response scoring criteria are determined prior to
test administration and may include a writing sample scoring guide, if writing skills are to be
assessed. In many respects the scoring guidelines/scales used with long-answer essay questions
are similar to the behaviorally anchored rating guidelines or scales that are used with oral
examinations.
Advantages:
1) Efficient - can be developed relatively quickly and can efficiently process small and medium-sized applicant pools.
2) Easier to administer than oral examinations and the schedule for rating essays is more flexible.
3) Good to evaluate written communication ability.
4) Good to measure knowledge recall and application of knowledge.
Disadvantages/Criticisms:
1) Since candidates with poor writing skills will be clearly at a disadvantage, written
communication ability must be an essential part of the job if essays are to be used.
2) May demonstrate adverse impact on protected classes.
3) Scoring can be time-consuming and resource intensive with large candidate populations.
SAMPLE WRITTEN COMMUNICATION RATING SHEET
Grammar/Syntax - Uses words and phrases correctly.
___ (2 points) Excellent - Candidate is consistently effective
___ (1.5 points) Good - Candidate is usually effective
___ (1.0 points) Acceptable - Candidate’s level of proficiency is barely acceptable
___ (0.5 points) Needs Improvement - Candidate demonstrates a deficiency that seriously detracts from communication
Spelling/Punctuation - Uses correct spelling and punctuation.
___ (2 points) Excellent - Candidate is consistently effective
___ (1.5 points) Good - Candidate is usually effective
___ (1.0 points) Acceptable - Candidate’s level of proficiency is barely acceptable
___ (0.5 points) Needs Improvement - Candidate demonstrates a deficiency that seriously detracts from communication
Clarity - Ideas are clearly/completely stated so as to be easily understood by the reader.
___ (2 points) Excellent - Candidate is consistently effective
___ (1.5 points) Good - Candidate is usually effective
___ (1.0 points) Acceptable - Candidate’s level of proficiency is barely acceptable
___ (0.5 points) Needs Improvement - Candidate demonstrates a deficiency that seriously detracts from communication
Organization of Ideas/Format/Length - Presents ideas in a logical and comprehensive manner; introduces a topic, provides supporting arguments, uses appropriate arrangement and uses no more than two pages.
___ (2 points) Excellent - Candidate is consistently effective
___ (1.5 points) Good - Candidate is usually effective
___ (1.0 points) Acceptable - Candidate’s level of proficiency is barely acceptable
___ (0.5 points) Needs Improvement - Candidate demonstrates a deficiency that seriously detracts from communication
Tone/Appropriate to Audience - Tone is professional and sincere; essay targeted to audience.
___ (2 points) Excellent - Candidate is consistently effective
___ (1.5 points) Good - Candidate is usually effective
___ (1.0 points) Acceptable - Candidate’s level of proficiency is barely acceptable
___ (0.5 points) Needs Improvement - Candidate demonstrates a deficiency that seriously detracts from communication
___ Total Points for Written Communications (10 Points Maximum)
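The point arithmetic of a rating sheet like the sample above can be sketched in code. This is an illustrative check only; the dimension names follow the sheet, and the candidate ratings shown are hypothetical.

```python
# Each of the five dimensions on the sample sheet is rated 0.5, 1.0,
# 1.5, or 2.0 points, giving a 10-point maximum total.
VALID_RATINGS = {0.5, 1.0, 1.5, 2.0}
DIMENSIONS = {"grammar", "spelling", "clarity", "organization", "tone"}

def total_written_score(ratings):
    """Sum the five dimension ratings; rejects any off-scale rating."""
    assert set(ratings) == DIMENSIONS
    assert all(r in VALID_RATINGS for r in ratings.values())
    return sum(ratings.values())

# Hypothetical candidate: Excellent grammar and tone, Good spelling and
# clarity, Acceptable organization.
ratings = {"grammar": 2.0, "spelling": 1.5, "clarity": 1.5,
           "organization": 1.0, "tone": 2.0}
print(total_written_score(ratings))  # 8.0 of a 10-point maximum
```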
SELECTION PLAN SUMMARY
TYPE: T&E (checklist or quantitative)
TIMEFRAME: 1 - 2 months
ADVANTAGES: Easier for candidates; fast delivery of eligible list; simple and economical to administer and rate; rarely appealed
DISADVANTAGES: Lower validity than other types of exams (can be difficult to defend if challenged); requires more formal assessment in dept. hiring interviews; not good for highly competitive jobs that draw large applicant populations

TYPE: Behaviorally anchored T&E
TIMEFRAME: 2 - 4 months
ADVANTAGES: Validity usually higher than simple T&E; easier to administer than assembled test; useful as a screen for large applicant pools when other test components are to be administered
DISADVANTAGES: Unknown if candidates are completing the supplemental without assistance and whether they are being truthful about their experience; written communication skills may influence what a candidate reports and may bias ratings; may be difficult to obtain raters to score

TYPE: Written, Multiple-choice (MC)
TIMEFRAME: 2 - 4 months
ADVANTAGES: Objective; generally good validity; good for testing job-related knowledge; easy to administer and score; standardized exams cannot be appealed; good for large applicant populations
DISADVANTAGES: Difficult to develop new, customized questions; not as suitable for non-technical classes; subject to inspection of rating key, which can lengthen process; mental ability exams may show adverse impact

TYPE: Written Performance (essay)
TIMEFRAME: 2 - 5 months
ADVANTAGES: Greater candidate confidence than self-report T&E; easier and less time-consuming to administer and score than oral
DISADVANTAGES: Difficult to obtain raters to score; requires extensive training to ensure raters are scoring for content and not just writing ability or lack thereof

TYPE: Work sample
TIMEFRAME: 2 - 5 months
ADVANTAGES: Highest validity and candidate confidence rate, as selection procedure closely replicates job functions or tasks
DISADVANTAGES: Difficult and time-consuming to administer and score; often time-consuming and expensive to administer to large applicant populations

TYPE: Oral Examination (structured interview)
TIMEFRAME: 2 - 5 months
ADVANTAGES: Can closely include/describe situations encountered on the job; high candidate confidence rate for testing oral and human relations abilities
DISADVANTAGES: Difficult to obtain raters to score; candidates more likely to appeal based on bias when they don’t score well; not good for large applicant populations

TYPE: Written (MC) and Oral Examination
TIMEFRAME: 4 - 7 months
ADVANTAGES: Higher candidate confidence when given multiple opportunities to demonstrate relative KSAs (assuming compensatory exam structure)
DISADVANTAGES: Time-consuming to develop, administer and rate multiple-component exams; difficult to obtain raters to score; more components to appeal; candidates more likely to appeal bias when they don’t score well; candidates inconvenienced by being required to appear for 2 exams
SECTION V
Screening of Applications
Section V Page 1
APPLICATION SCREENING
Analysts should refer to the official examination announcement while reviewing applications.
Unless otherwise specified on the announcement, all minimum requirements must be met by the
final filing date and applications must be postmarked by that date. Where candidates are allowed
to participate in an examination process pending completion of minimum qualifications, the
analyst should note that on the application for purposes of data-entry and eligible list annotation.
If the application is mailed, the postmarked envelope must be attached.
 Make no unnecessary marks on the application. Since the applicant may review the
application at a later date, critical or questioning notations should be made on a separate,
detachable paper. Use a pencil to make any mark on the application. Indicate any important
omitted information by circling the item; the applicant should later complete omitted items.
 Ensure that all paper applications contain an original signature and current date. Electronic
applications may be accepted without a signature because the online application system
requires a confirmation from the applicant. Unless otherwise specified on the announcement,
a photocopy of the application may be submitted, but the date and signature must be original.
If an applicant adds or changes any information on the application, it should be initialed and
dated by the applicant.
 In processing an application, review each item carefully:
POSITION: correct class number, full title.
SPECIALTY: if indicated
NAME: full name.
ADDRESS: If the applicant files a Change of Address form prior to the certification of the
eligible list, this item must be updated in the applicant tracking system and on the application.
Staple the Change of Address form to the application.
TELEPHONE NUMBER(S)
SOCIAL SECURITY NUMBER: Please note that applicants are not required to provide a
social security number.
PREVIOUS NAMES: This information may be useful in verifying employment or
educational background.
CURRENT CITY EMPLOYMENT: This should agree with the employment history section.
This information is very helpful in determining eligibility as a promotive candidate. (Caution:
Many applicants incorrectly mark their status.)
BILINGUAL: An applicant who specifies fluency in a language other than English may be
certified ahead of other eligibles to a position that has an approved special condition requiring
that language. Please be advised that the applicant must be tested for language proficiency.
LICENSES (Driver, Registered Nurse, etc.), CERTIFICATES (typing, etc.), or
REGISTRATIONS (Civil Engineer, etc.) may be requirements for some examinations.
Check expiration date.
CONVICTION HISTORY: On January 17, 2006, the Civil Service Commission adopted a
policy on the disclosure and review of criminal history records. Analysts should refer to the
DHR policy regarding review of arrests and convictions.
APPLICANT SURVEY: The purpose of the applicant survey is to obtain the applicant’s
ethnicity and gender data for legal and statistical purposes only. Failure to provide this
information is not a cause for rejection.
The data from the survey form is used to generate statistics on candidates who drop out and
those who successfully complete each stage of the selection process.
a. The survey form is detached from the application at any time that the application is to be
reviewed by persons other than the examination staff, e.g. screening assessment panel
members, hiring managers, or the eligibles.
b. A survey form for a rejected applicant can be left on the application.
c. At the conclusion of the examination process, all Applicant Survey forms are stored with
the materials that pertain to the examination in the examination storage box.
EDUCATION:
High School: Completion of high school, possession of GED Certificate or Certificate of
Proficiency may be a requirement for some examinations.
College: A degree from an accredited college or university, or a major in a specific subject
may be a requirement for some examinations. Photocopy of a transcript or other
documentation may be required based on the terms of the announcement. See DHR Policy on
Accreditation.
Special Training: Certificates from business school, trade school, community college, city-sponsored management training course, etc., may be a requirement for some examinations.
EMPLOYMENT HISTORY: Analysts should review this section carefully to determine if
the applicant meets the experience requirements of the announcement. Review:
a. The length and continuity of employment; make note of breaks in employment.
b. Number of hours per week: multiply the number of hours worked per week by the
number of weeks worked; 2,000 hours equals one year. Credit candidates for no more than
40 hours per week. There may be cases when an applicant would not be deemed qualified
for lack of a small amount of experience; consult with DHR in these cases.
c. Job title.
d. The specific duties listed.
e. Determine if stated duties correspond to salary and/or title.
f. The level of responsibility; salary and number of people supervised are sometimes helpful
indicators.
g. Identify discrepancies in dates of employment.
h. Anything questionable that might affect an applicant’s eligibility.
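The experience-credit arithmetic in item (b) above can be sketched as follows. The work history shown is hypothetical and for illustration only.

```python
# Sketch of the experience-credit arithmetic from item (b) above:
# credit no more than 40 hours per week, and 2,000 hours equals one year.
def years_of_experience(jobs):
    """jobs: list of (hours_per_week, weeks_worked) tuples."""
    total_hours = sum(min(hours_per_week, 40) * weeks
                      for hours_per_week, weeks in jobs)
    return total_hours / 2000

# Hypothetical history: 52 weeks at 50 hrs/week (credited as 40) plus
# 50 weeks at 20 hrs/week = 2,080 + 1,000 = 3,080 hours.
print(years_of_experience([(50, 52), (20, 50)]))  # 1.54 years
```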
CERTIFICATION: The applicant’s original signature and date are required in this section for
paper applications. All electronic applications are certified by indicating so on the electronic
application.
Documenting Status: In the space marked “For Official Use Only”, the analyst must carefully
note the decisions regarding each application. The analyst must initial and date all entries.
1. Eligible: Check appropriate section.
2. Ineligible: Indicate the reason(s).
After reviewing applications, the analyst should separate applications into groups according to
whether or not they meet the minimum qualifications. Arrange in alphabetical order within each
group of ‘Eligible’, ‘Ineligible’ and, if necessary, ‘Hold’.
RESTRICTIONS ON RE-EMPLOYMENT
After data entry has been completed, analysts must run the restriction register matching report.
CIVIL SERVICE RESTRICTION REGISTER
If the Civil Service Commission has set conditions for future employability, including requiring
the approval of the Human Resources Director to file an application after satisfactory completion
of a specified period of work experience outside the City, this action is noted in the “Restriction
Register.”
An analyst must check the Restriction Register reports by both Social Security number and name.
If an action is found requiring review, the Human Resources Director’s Application
Approval form (page xx) is prepared to make a recommendation for approval or disapproval.
Another condition for future employability may be participation in an appropriate substance
abuse program and documentation of a release to work from a DHR-designated professional in
substance abuse. Analysts should refer to the DHR policy regarding Restrictions on Future
Employment. (See Appendix V-C for memo concerning restrictions due to substance abuse).
If the Commission has ruled that a person may not be re-employed in a particular department,
indicate that action at the top of the application and amend the computer entry, e.g., “Not to
be employed by Police Department-CSC 9/27/01.”
The analyst should:
1. Make sure applicant meets all requirements of examination announcement before beginning
review.
2. Note the CSC action on application. Clearly indicate if the final action cannot be rescinded,
by either the Commission or the HR Director. For example: “No driving position,” “No
employment handling money,” etc. This information must be entered into applicant tracking
system restriction field.
3. Check the Restriction Register file online to determine whether there is documentation that the
applicant has already satisfied the restriction requirement. If the restriction has been
satisfied, note this on the application.
4. If there is no documentation that the applicant has satisfied the restriction requirement,
determine that the applicant meets the conditions of the Commission’s action; e.g. one year
of satisfactory outside work experience (2,000 hours).
5. If the restriction regarding outside experience was imposed five or more years ago by Civil
Service action, the restriction no longer applies.
OUT-OF-CLASS ASSIGNMENT ISSUES
Civil Service Rules and DHR policy require that City employees receive credit only for the duties
of the class to which appointed or assigned unless sufficient and creditable documentation is
provided to verify performance of other duties. Employees may receive credit for duties not
usually performed by incumbents in a class if their employee file contains contemporaneous
documentation that the duties were assigned and performed. Credit for duties not usually
performed by incumbents in a class based on non-contemporaneous documentation shall require
the approval of the Human Resources Director.
Acceptable forms of documentation include:
• Official out-of-class assignment forms, letters, or memoranda signed by the appointing officer or designee.
• Performance evaluations signed by the appointing officer or designee.
The analyst may accept documents that support work that was documented contemporaneously, i.e.,
signed by the supervisor of record for a normal annual appraisal within the normal time frame for
completing performance appraisals and contained in the employee's personnel file, or completed
because of a change in duties. All other documentation must have been recorded at the time
of the assignment and must reflect the appropriate level and type of duties.
If the documentation was not completed contemporaneously (i.e., it was prepared after the
assignment or documented after the announcement was issued), the analyst must consult with DHR.
This procedure must be uniformly enforced. Questions about appropriate documentation must be
brought to the attention of Examination Team Leaders, and if necessary, DHR.
Section V Page 5
HUMAN RESOURCES DIRECTOR’S APPLICATION APPROVAL
(FOR APPLICANTS WITH EMPLOYMENT RESTRICTIONS)
PURPOSE:
To obtain the Human Resources Director’s approval of an application when the
applicant has restrictions on re-employment.
FORM:
Human Resources Director’s Application Approval form
PROCEDURE:
• Fill in name of applicant, class number and title of examination.
• Summarize CSC actions and dates. Review the file in the CSC office if necessary to see why the applicant was terminated. Photocopy if necessary.
• Note any previous denials from prior applications.
• Review the seven items listed on the form to help make a recommendation. Remember that these are guides for you to determine if the applicant's behavior or work record (since the Commission's action) indicates s/he has the potential for being a competent and effective employee.
• Make a recommendation and submit it to the Team Leader for approval or disapproval. Write a brief justification.
• At the conclusion of the examination, a copy of this form and a photocopy of the front of the application is filed with the examination papers for storage.
• File the original of this form, the application, and supporting documents in the Restriction Register file. (See Client Services staff for assistance.)
Section V Page 6
Section V Page 7
DATA ENTRY
PURPOSE: To track applications for record keeping, creating notices, generating reports and
establishing eligible lists, applicant information is entered into the computerized applicant
tracking system.
PROCEDURE:
• Enter all identifying data and status of applications into the applicant tracking system.
• It is most efficient to enter applications after they are experted, not before. This can save the extra step later of having to go back in and enter a designation on each application.
• Analysts must ensure that all data is accurately entered into the applicant tracking system.
Section V Page 8
APPLICATION STATUS RECONSIDERATION REQUESTS
Civil Service Commission Rule 111A.11 provides that “every applicant for an examination must
possess and maintain the qualifications required by law and the examination announcement for
the examination.”
Analysts may provide applicants with an additional opportunity to submit qualifying information
or verification or deny the acceptance of the application. If a decision is made to provide
additional time for applicants to submit information, then this must be applied consistently to the
applicant pool. If the additional information demonstrates possession of the minimum
qualifications, applicants must be informed that their application is accepted for the examination.
If the additional information does not demonstrate the possession of minimum qualifications,
applicants must be sent a letter or e-notice explaining clearly why the original information and
additional information provided does not demonstrate possession of the minimum qualifications.
The applicant must be notified of appeal rights or referred to documents containing appeal rights.
As with all notices to candidates, these notices should be specific and informative, respectful in
tone and as clear and non-bureaucratic in language as possible.
Final notices, in which applicants are given appeal rights, should summarize issues, identify
applicable authorities (CSC Rules, terms of announcements, etc.), and respond to key issues in a
way that forms the foundation for possible future reports.
Sample language for notices to applicants can be found on the following pages.
Section V Page 9
SAMPLE LANGUAGE FOR A REJECT LETTER
Your application, supporting documentation, and the additional information you have provided in
response to the rejection of your application for class 5207 Associate Engineer have been
reviewed and evaluated by staff.
It is important to note here that experience requirements for the City and County of San Francisco
are titled Minimum Qualifications; this term means applicants must possess a minimum of the
exact specified experience by the final filing date. The experience requirement to participate in
this examination is, to paraphrase, 3 years of professional engineering experience in specific
discipline or specialty areas, which includes 2 years of experience equivalent to the 5203 Assistant
Engineer level or higher.
It was determined by personnel analysts and engineering managers that you do not possess 3
years of professional engineering experience. Your experience for the City & County of San
Francisco as a Transportation Engineering Assistant I & II is sub-professional engineering
experience. Your application materials were thoroughly reviewed and evaluated against the
criteria above. Additionally, documentation related to your work experience, such as
performance appraisals, was obtained from your personnel file and reviewed to determine if there
was any documentation to support an out-of-class assignment to a level equivalent to class 5203.
There is no documentation in your personnel file that would support a claim that you were
performing an out-of-class assignment. While your experience at Korve Engineering Inc. is
qualifying, you can only be credited for 2 years 9 months professional civil engineering
experience.
As you did not possess the required experience by the final filing date of December 1, 2000, your
application cannot be accepted for this examination. This decision is final and no further
consideration can be given to this matter by this Department. Any appeal of this decision must be
received by the Human Resources Director, 44 Gough Street, San Francisco, CA 94103, before
the close of business on the fifth working day (excluding Saturdays, Sundays and holidays)
following the postmarked mailing date of this notification. The Human Resources Director’s
action on the appeal shall be final and no reconsideration requests will be allowed.
The City & County of San Francisco is currently recruiting for class 5203 Assistant Engineer on a
continuous basis. The announcement for that class can be found on our website,
www.sfgov.org/dhr, or applications can be obtained at 44 Gough Street if you are interested.
Sincerely,
Analyst
Section V Page 10
SAMPLE LANGUAGE FOR A REJECT LETTER
The Department of Human Resources would like to extend our apologies for the delay in
processing your appeal for Class 1824 Principal Administrative Analyst. We had hoped that,
with more specific information, your experience might be qualifying. The additional information
you provided describing your experience performing legislative/administrative policy analysis
and all previous documentation you provided to support your application was reviewed by four
Subject Matter Experts: a Finance Manager for the Public Utilities Commission, a Director for the
Office of Contract Management & Compliance at the Department of Public Health, a Director of
Performance Management for the Controller’s Office, and a Senior Program Planner for the
Policy & Planning Division of the Department of Public Health. Two of these managers had
previously worked as Principal Administrative Analysts in legislative/policy positions.
Qualifying legislative/policy analysis experience must have included analysis and implementation
of policy or legislation which modifies the framework of an organization or agency, to change the
way an organization operates, rather than working on policy guidelines within an existing
framework; examples of qualifying experience include developing a body of regulations or
drafting legislation. Experience must have included analysis which required moving beyond
defined methods and practices, and applicants must demonstrate that they worked with some level
of independence, without close supervision; experience must have included presentation of
analysis and recommendations directly to the senior management of an organization.
All four Subject Matter Experts concluded that your work experience, as documented and verified
in your application materials and the detailed information you provided regarding your
experience, did not demonstrate that you possess the full range of qualifying experience as
described above. Further, the experience information that you have submitted does not provide
appropriate, out-of-class assignment documentation of qualifying duties and responsibilities. Per
Civil Service Commission rules, “City and County employees shall receive credit only for the
duties of the class to which appointed. Credit for experience obtained outside of the employee’s
class will only be allowed if recorded in accordance with the provisions of these Rules.”
Based on the above information, your request for inclusion on the 1824 Principal Administrative
Analyst registry must be denied and no further consideration can be given by the Department of
Human Resources at this time. Any appeal of this decision must be received by the Human
Resources Director before close of business on the fifth working day (excluding Saturdays,
Sundays and holidays) following the postmarked mailing date of this notification. The address of
the Human Resources Director is 44 Gough Street, San Francisco, CA 94103. If you choose to
appeal, a report to the Human Resources Director will include the determination of the Subject
Matter Experts.
If you have any questions regarding this matter, you may contact me at (phone #).
Sincerely,
Analyst
Section V Page 11
SAMPLE OF REJECT LETTER
Date
X
X
X
Dear X:
The additional information that was provided in conjunction with your application for Class 2720
Janitorial Services Supervisor has been received in this office. This information does not verify
that you possess the Minimum Qualifications as required by the examination announcement (see
below).
MINIMUM QUALIFICATIONS:
1. Two (2) years of verifiable experience as a custodial supervisor (equivalent to City class
2718) responsible for the supervision, through subordinate supervisory personnel, of a large
group of employees engaged in custodial work; OR
2. Four (4) years of verifiable experience as a custodial supervisor (equivalent to City class
2716) responsible for the direct supervision of a large group of employees engaged in
custodial work; AND
3. Possession of a valid Driver's License.
Your application and verification documentation state that you have 3 years of experience at XYZ
Company as a Janitor Supervisor, supervising 2 janitors. You cannot qualify under #1 of the
minimum qualifications because you do not supervise through subordinate supervisors, and you
lack 1 year of experience to qualify under #2 of the minimum qualifications of this
announcement. As you have no other custodial supervisor experience, your application cannot be
accepted for this examination. Any appeal of this decision must be received by the Human
Resources Director before close of business on the fifth working day (excluding Saturdays,
Sundays and holidays) following the postmarked mailing date of this notification. The address of
the Human Resources Director is 44 Gough Street, San Francisco, CA 94103. If you choose to
appeal, a report to the Human Resources Director will include the determination of the Subject
Matter Experts.
We thank you for your interest in employment with the City & County of San Francisco. If you
are interested in other job opportunities, please visit our website at www.sfgov.org/dhr.
Section V Page 12
SAMPLE OF REJECT LETTER
Date
X
X
X
Dear X:
Thank you for providing the additional details regarding your employment at XYZ-TV as an
Administrative Assistant. In your letter dated July 23, 2001, you describe duties you performed
related to the accounting conversion undertaken by XYZ-TV’s Controller.
You described your duties as the following:
Duties and determinations:
• Assisted your supervisor in drafting the initial budget and obtained approval from the CEO. (Not qualifying: budgeting)
• Canvassed other media properties who completed successful conversions. (Not qualifying: research)
• Obtained their MBE/WBE vendor lists and updated regulations. (Not qualifying: research)
• Obtained approximate costs for their entire conversion and compared these estimates with the budget draft. (Not qualifying: research)
• Checked the quality of equipment purchased, investigated length of warranties, etc. (Not qualifying: research)
• Researched maintenance contracts, compared those against warranties and maintenance contracts from other vendors. (Not qualifying: research)
• Arranged for each vendor to make a comprehensive presentation. (Not qualifying: scheduling)
• Requested one liaison from each vendor. (Not qualifying)
• Worked with each vendor's liaison to effect smooth presentations. (Not qualifying)
Section V Page 13
• Worked with in-house staff to set up presentation space and provide necessary equipment for presentations. (Not qualifying)
• Made arrangements for all accounting staff to attend each presentation. (Not qualifying: scheduling)
• Created, printed and disseminated rating forms for each staff member to rate the presentations relative to optimum productivity. (Not qualifying)
• Collected rating forms from each staff member and compiled that information and the three presentation/bid packages for supervisor review. (Not qualifying: report writing)
• Arranged a comprehensive meeting schedule with the selected vendor to cover details of the conversion. (Not qualifying: policy discussion)
While work on this project was performed at a highly responsible level, it does not qualify as
complex budget analysis, economic analysis, fiscal/financial analysis, contract administration, or
administrative/legislative policy analysis. An example of qualifying contract administration
experience consists of preparing contracts, conducting competitive bid selection, processing,
negotiating and awarding contracts and contract performance monitoring and evaluation.
Contract administration deals with negotiating deliverables and measurements to be tracked in the
performance of a contract between or among various parties. These deliverables can be quite
complex and may require significant negotiation among the various parties prior to the
establishment of the contract as well as during the term of the contract. The contract
administrator makes the decision on when these deliverables and measurements meet contract
requirements and disburses monies based upon those decisions.
Your work experience did not include the range or complexity of experience needed to qualify for
this examination. Therefore, your application for the 1824 Principal Administrative Analyst
examination has been rejected.
This decision may be reconsidered if you can provide additional information or explanation in
writing within five (5) business days of the postmark of this letter. Return a copy of this letter
with your response.
If you are interested in other job opportunities with the City & County of San Francisco, we
welcome you to visit our website at www.sfgov.org/dhr.
Section V Page 14
SAMPLE OF A REJECT LETTER
Date
X
X
X
Dear X:
The information you provided in response to our rejection of your application for Class 1824
Principal Administrative Analyst has been received in our office. This information and the
information provided in your application materials have been reviewed and do not verify that you
possess the minimum qualifications stated on the job announcement.
You have documented possession of a Bachelor of Science degree in Accounting and therefore
require five years of experience performing complex budget analysis, contract administration,
legislative/policy analysis, and/or financial analysis in order to qualify for the 1824 registry.
The experience information that was submitted in your December 27, 2002 letter provided
information on your current 1657 Senior Systems Accountant position with the Public Utilities
Commission’s Finance Bureau. This information did not document attainment of the minimum
qualifications specified on the job announcement. Your performance appraisal for the period of
7/1/00 to 6/30/01 cannot be considered qualifying because it does not verify the full range of
duties and responsibilities required for class 1824. The experience listed in this performance
appraisal is not at the appropriate scope of responsibility to provide qualifying experience.
Experience gained overseeing maintenance of debt service, maintaining fixed assets schedules,
and providing assistance to the General Ledger during fiscal year end audits does not constitute
complex budget analysis, contract administration, legislative/policy analysis, and/or financial
analysis.
Furthermore, your work experience prior to April 2000 is also not at the appropriate level and
scope to be qualifying for the 1824 Principal Administrative Analyst registry. Though performed
at a responsible level, this experience is comprised primarily of accounting-type duties and
responsibilities and therefore does not meet the minimum qualification set forth in the job
announcement.
Absent any additional verifying experience information, the Department of Human Resources is
therefore unable to rescind the rejection of your application. This decision is final unless you are
able to provide documentation of qualifying experience within five business days of the
postmark. Return a copy of this letter with your response.
Section V Page 15
RESPONSE TO REQUEST FOR RECONSIDERATION ON REJECTION OF
APPLICATION
The information you provided in response to our rejection of your application for Class 1823
Senior Administrative Analyst has been received in this office. This information has been
reviewed, and does not verify that you possess the minimum qualifications stated on the
announcement.
You have documented possession of a Bachelor’s Degree in Economics and therefore require
three years of experience performing complex budget analysis, financial analysis, contract
administration and/or legislative-administrative policy analysis in order to qualify for the 1823
Senior Administrative Analyst registry.
Your current position as Research Assistant (Job Code 1802) for the San Francisco Airport
Commission, from April 2002 to present, is not at the appropriate level and scope to be qualifying
for the 1823 Senior Administrative Analyst exam. The performance appraisal for the period of
4/5/02 to 6/7/02 that you submitted as verification does not document complex work involving
budget analysis, financial/fiscal analysis, economic analysis, contract administration, or
legislative/administrative analysis. As listed in your performance appraisal, experience collecting
and organizing air traffic statistics, maintaining and updating airport concession sales statistics
and data, analyzing monthly concession sales, and distributing annual surveys cannot be
considered qualifying for the 1823 exam. Additionally, the letter of verification from John Doe
does not document experience at the appropriate level and scope to be qualifying, nor does it
provide contemporaneous documentation of out-of-class assignment experience. Civil
Service Commission rules state that a City employee shall only be given credit for the duties and
responsibilities of the class to which appointed and that work performed outside of an employee’s
class must be documented in accordance with CSC rules and regulations.
The Department of Human Resources also analyzed your previous Research Assistant (Job Code
1802) position with the San Francisco Unified School District from August 1994 to March 2002,
as documented and verified in your application and application materials, and determined that it
is not qualifying for the Class 1823 Senior Administrative Analyst exam. The performance appraisal
that was submitted for 3/1/97 to 2/28/98 does not document experience at the appropriate scope
and level of complexity required by the minimum qualifications. Your performance appraisal
listed duties and responsibilities such as assisting in the development and processing of school
site plans, gathering and reviewing data for the consent decree annual report, conducting research
studies, assisting in applying statistical methods to determine trends and factors, and supplying
data for updating reports. These duties and responsibilities are appropriate to those of the Class
1802 Research Assistant and do not provide qualifying experience for the 1823 exam.
Absent any additional experience information, the Department of Human Resources is unable to
rescind the rejection of your application. This decision may be reconsidered if you can provide
three years of qualifying documentation in writing within five business days of the postmark.
Section V Page 16
PROMOTIVE POINTS
Applicants for promotive-only or combined promotive and entrance examinations shall meet the
requirements of the examination announcement under which they apply. If otherwise qualified,
City employees with six (6) consecutive months (1040 hours) of verifiable experience in any job
classification in any appointment type qualify as promotive applicants. Such employees are
entitled to up to 60 additional points for seniority and for satisfactory performance rating if
successful in the examination. Eligibility for promotive points is computed through the final filing
date for discrete exams and the date the application was submitted for continuous exams. In the
case of series testing, the computation date will be the date of the exam.
SERVICE POINTS – 30 Points
Thirty (30) points will be given to City employees with six (6) consecutive months (1040 hours)
of verifiable experience in any job classification in any appointment type as of the computation
date (last date for filing), unless otherwise noted on the announcement.
PERFORMANCE (MERIT) RATING – 30 Points
Thirty (30) points will be given to candidates with an overall evaluation of at least Competent and
Effective in their most recent Performance Appraisal Report. No credit will be given to
candidates with an overall evaluation of less than Competent and Effective. This includes ratings
of Unacceptable and Development Needed.
For all applicants who indicate that they are current City and County of San Francisco employees,
confirm in Peoplesoft that they meet the definition of a promotive applicant.
If any claim is made that an applicant is not entitled to the full sixty (60) points and points are to
be deducted, the analyst must research and determine if the candidate is not entitled to the thirty
(30) merit points due to either a less than Competent and Effective rating on a performance
appraisal or suspension(s) within the last twelve (12) months prior to the final filing date.
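The service and merit components described above can be sketched as a simple calculation. This is illustrative only; the function and rating names are assumptions, not DHR policy text.

```python
# Ratings below "Competent and Effective" earn no merit credit.
NON_QUALIFYING_RATINGS = {"Unacceptable", "Development Needed"}

def promotive_points(consecutive_hours: float,
                     overall_rating: str,
                     suspended_in_last_year: bool) -> int:
    """Illustrative sum of service and merit promotive credit (maximum 60)."""
    points = 0
    # Service points: 30 for six consecutive months (1040 hours) of City service.
    if consecutive_hours >= 1040:
        points += 30
        # Merit points: 30 for an overall rating of at least Competent and
        # Effective, with no suspension in the 12 months before the final filing date.
        if overall_rating not in NON_QUALIFYING_RATINGS and not suspended_in_last_year:
            points += 30
    return points
```

A qualifying promotive applicant with a satisfactory appraisal and no recent suspension would receive the full 60 points.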
Section V Page 17
VETERAN’S POINTS
For entrance employment only, any person who is otherwise qualified and has submitted the
Veteran’s Preference Application and verification at the time of application or as specified on the
examination announcement is eligible for veteran’s points. Veteran’s points are awarded only if
the candidate is successful in the examination.
Entrance employment means entrance into the City service. Veteran's points may be applied to
entrance examination scores and combined promotive and entrance examination scores.
• The standard Veteran's Entitlement is an additional credit of 5% of the qualifying score (35 points).
• Veterans with a permanent service-connected disability may apply for a disability credit of 10% of the qualifying score (70 points). The disability must be on record in the U.S. Veteran's Administration.
Candidates are no longer eligible for veteran's points after they have passed a probationary period
for a Permanent Civil Service appointment. If a veteran who is a current permanent City
employee applies for veteran's preference points, check with his/her department to determine
whether the veteran has passed the probationary period.
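A sketch of the credit calculation follows. It is illustrative only; the function and parameter names are assumptions, and a qualifying score of 700 is inferred from the 35- and 70-point figures above.

```python
def veterans_points(qualifying_score: float,
                    disabled: bool,
                    passed_exam: bool,
                    passed_pcs_probation: bool) -> float:
    """Illustrative veteran's preference credit for an entrance examination."""
    # Points are awarded only to successful candidates who have not yet passed
    # a probationary period for a Permanent Civil Service appointment.
    if not passed_exam or passed_pcs_probation:
        return 0.0
    rate = 0.10 if disabled else 0.05  # disability credit vs. standard entitlement
    return qualifying_score * rate

# With a qualifying score of 700, the standard credit is 35 points and the
# disability credit is 70 points.
```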
Section V Page 18
CITY AND COUNTY OF SAN FRANCISCO
VETERAN’S PREFERENCE APPLICATION
FOR OFFICE USE
ONLY
Eligible 5%
___
Eligible 10% ___
Not Eligible ___
Analyst ________
Date____________
For further information regarding veteran’s points, see Civil Service Commission Rule 111A.14,
Article III and Rule 111A.14, or the brochure on the DHR website entitled “Applicant
Information: Veteran’s Preference.”
Position you are applying for:
Class#
Title
Your Last Name
Your Social Security Number
Your First Name
Middle Initial
Check One:
☐ I am applying as an Eligible Veteran, as defined in CSC Rule 111.36 (complete items 1, 2, and 6).
☐ I am applying as a Disabled Veteran, as defined in CSC Rule 111.37 (complete items 1, 2, 3 and 6).
☐ I am applying as the un-remarried widow/widower or surviving domestic partner of an Eligible Veteran per CSC Rule 111.38.1 (complete items 1, 2, 4, 5 and 6).
☐ I am applying as the un-remarried widow/widower or surviving domestic partner of a Disabled Veteran per CSC Rule 111.38.2 (complete items 1, 2, 3, 4, 5 and 6).
1. Period of Veteran’s Qualifying Service: (attach legible copy of DD214)
DATE ENTERED ACTIVE DUTY: ______________
DATE SEPARATED FROM ACTIVE DUTY: ______________
TYPE OF SEPARATION/CHARACTER OF DISCHARGE: ______________
2. Have you ever been awarded Veteran’s Preference on a City and County of San Francisco
eligible list? No  Yes 
If Yes, indicate class number(s) and approximate date(s) __________________________
3. Disabled Veteran Preference: Complete this section if the Eligible Veteran has suffered a
permanent service-connected disability that is of record in the U. S. Veteran’s Administration.
Attach a copy of the award letter.
Claim number used by U.S. Veteran’s Administration C - _______________________
Veteran’s Administration Office where claim is now filed________________________
4. Deceased Veteran’s Information:
Veteran’s Last Name
Veteran’s First and Middle Name
Veteran’s Date of Birth
Veteran’s Social Security Number
Veteran’s Military Serial Number
5. Your relationship to deceased veteran at time of his/her death_____________________
Have you subsequently remarried or entered into another domestic partnership?______
Documentation of relationship must be submitted with this application, such as marriage certificate, registration as domestic partner, veteran’s death
certificate, etc.
6. CERTIFICATION OF APPLICANT (read carefully): I hereby certify that all statements made in this application are true and complete to
the best of my knowledge. I understand that any false, incomplete, or incorrect statement, regardless of when it is discovered, may result in my disqualification or dismissal
from employment with the City and County of San Francisco.
Date
Signature of Applicant
Section V Page 19
EMPLOYMENT VERIFICATION
PURPOSE:
To verify employment information or request further information.
FORM:
Employment verification form
PROCEDURE:
• Analyst completes the top portion of the form with the applicant's name.
• The applicant must sign the form on the reverse side. The form is sent to the company or agency from which information is desired. It is advisable to copy the incomplete form before mailing, date it, and attach it to the application.
Section V Page 20
Section V Page 21
Section V Page 22
SECTION VI
Examination Administration
Section VI – Page 1
EXAMINATION ADMINISTRATION
The City and County of San Francisco conducts examinations under structured administration
procedures that involve strict control of the examination environment. The types of examination
may include: written, oral/performance, and performance. Analysts are responsible for advance
planning of all phases of examination administration. It is important to set examination dates
which allow adequate time to finalize the examination, secure facilities and raters, prepare (print
and package) materials, and plan logistics.
Section VI – Page 2
SCHEDULING AND NOTIFYING CANDIDATES
When developing the examination schedule, analysts should consider the following:
 Number of qualified candidates
 Provisional & permanent employees in process
 Prevention of double booking (when conducting more than one exam in class series)
 Rater availability (City departments or other jurisdictions)
 Facilities availability (# of test rooms) and set up time
 Type and length of exam
 Breaks and lunch
 Rater and proctor orientation and training time
 Processing and movement of candidates
 Availability of tools and equipment to be used during the exam
After examination scheduling has been completed, analysts must notify candidates. However,
analysts must not send notices until the test site and all raters have been secured.
Examination notices should be sent early enough to permit candidates to schedule time to
participate in the exam. For most exams, 10 business days advance notice is appropriate. For
candidates who live out of the area, more advance notice may be needed.
Examination notices must include: the date, time and location of the exam; description of the
component(s), including duration (to allow candidates to assess if they require reasonable
accommodation); documents that candidates are required to bring to the exam site (e.g., driver’s
license, exam notification); statement informing candidates of how to request reasonable
accommodation; and information on what candidates may not bring to the test.
Other information/materials that are helpful to candidates include: directions and/or maps,
parking and public transportation information.
Reasonable Accommodation
In accordance with relevant federal, state, and local laws, a qualified individual with a disability
will be provided an equal opportunity to participate in Civil Service examinations.
If analysts receive a request for reasonable accommodation, the following procedures must be
followed:
• Candidates must complete the Request for Reasonable Accommodation in Civil Service Examination form.
• Analysts must review the request to determine if it is appropriate and the accommodation is possible without undue hardship.
• If analysts believe that the requested accommodation cannot be granted, they should discuss other options with the candidate.
• Analysts must document the disposition of the request on the back of the form.
• The form is to be placed in the central confidential file.
Section VI – Page 3
Request for Reasonable Accommodation
In Civil Service Examination
Candidate Name:
Examination for Class:
Title:
In accordance with relevant federal, state, and local laws, a qualified individual with a disability will be
provided an equal opportunity to participate in Civil Service examinations. If you need an accommodation in
order to compete in the examination, please complete this form and return it to the appropriate personnel
analyst.
I am requesting a reasonable accommodation in the above examination due to my disability.
Examination Component (written, multiple choice, essay, fill-in, oral, performance – specify):
Accommodation Requested (be specific):
I hereby certify that I am disabled as defined by the Americans with Disabilities Act and require accommodation.
I understand that I may be required to provide documentation of the disability if requested by the Department of
Human Resources and agree to cooperate fully with such request. I certify and agree that if at any point it is
determined or revealed that at the time I took this examination I did not have a disability it may result in my
disqualification or dismissal from employment with the City and County of San Francisco.
Date:
Signature:
Notice to Applicant: This form and the information it contains are needed by the Department of Human
Resources in order to attempt reasonable accommodation for you in the examination process. It will be
kept in a confidential file by the Department of Human Resources. It will not be filed with other
examination materials. All information you give is strictly confidential and will not be released to your
current or prospective employer. Please note that any accommodation necessary to perform essential job
functions must be separately requested from the employing department. Each department has an individual
designated to receive such requests. Contact the department’s personnel office for more information.
cc: DHR/EEO UNIT – ADA COORDINATOR
Section VI – Page 4
Request for Reasonable Accommodation
In Civil Service Examination – page 2
For Office Use Only
Accommodation considered or offered (be specific):
Response (i.e., accepted, refused, negotiated) and reason, if refused:
List any agencies contacted to implement this request (include phone # and response):
Analyst:___________________________________________________________Date: __________________
Section VI – Page 5
ADMINISTERING A WRITTEN EXAMINATION
Written objective exams are not encouraged for use under the Position Based Testing program unless the
examination is purchased from an approved vendor and the vendor has validated the examination and
rating key. An explanation of written objective examinations is provided here as it is a selection tool that
may be appropriate for some jobs being tested under Position Based Testing. If the analyst wishes to use
a written objective examination, then consultation and approval from DHR is required.
Written examinations must be conducted in a uniform manner and in accordance with Civil Service
Commission Rule 111A and Department of Human Resources policies and procedures.
Analysts must refer to the Written Examination Checklist in planning test administration in advance
of the test day.
At the test site:
1. Conduct proctor orientation; provide proctors with written instructions. For large
examinations, analysts might want to train proctors before the test day.
2. Set up test room(s), registration stations, post signs.
3. After making sure all proctors are ready, have them register and seat candidates. Respond to
candidates who do not have the proper identification.
4. Monitor examination administration; remain available to respond to questions and problems.
5. After the examination:
 Collect, reconcile (count), organize and package materials (make sure no test
materials are left behind)
 Return the test rooms to their original condition/order
 Have proctors sign out
 Locate custodian to secure the test site, if applicable
 Return test materials to a secure site
After the exam has been scored, analysts must schedule a five-day Review of Ratings for candidates
who fail the written exam. The sole purpose of this inspection is to determine that the computation
of scores is accurate (see CSC Rule 111A.22). Unsuccessful candidates are allowed to inspect
application materials, answer sheet and examination key.
Successful candidates are not allowed to inspect their papers until all components of the
examination are completed.
Section VI – Page 6
IDENTIFICATION VERIFICATION FORM
PURPOSE:
To verify identification of a candidate who appears for an examination without
acceptable picture identification.
FORM:
Identification Verification Form
PROCEDURE:
 Analyst in charge or representative completes the upper part of the form as follows:
a. Job code and job title.
b. Candidate’s name
c. Date
d. Fingerprint of candidate
e. Candidate’s signature on both copies
f. Analyst retains the original
g. Instruct candidate to bring acceptable identification to the exam unit within the prescribed
time frame
h. Inform candidate that if s/he is successful in the exam process, and identification is not
provided within this time frame, the name will be placed under waiver on the eligible
list until the identification is verified.
 The analyst in charge of the examination verifies identification presented by the candidate
and completes the lower half of the form as follows:
a. Fingerprint of candidate
b. Record type of identification presented
c. Signature and date
d. Candidate’s signature
e. Retain original
f. Return copy to candidate
 If the candidate fails to bring in the identification within the prescribed time frame, the
candidate’s name (if successful) will be placed under waiver of appointment until appropriate
identification is verified.
Section VI – Page 7
City and County of San Francisco
Department of Human Resources
Gavin Newsom
Mayor
Philip A. Ginsburg
Human Resources Director
DEPARTMENT OF HUMAN RESOURCES
IDENTIFICATION VERIFICATION FORM
Examination job code/title ________________________________________________________
Please READ the following carefully:
I understand that I am being admitted to this examination conditionally, and that I must submit
acceptable photo identification (California Driver License, California Identification Card, Alien
Registration Card, Military Identification Card, Employer Identification Card, or passport)
within 5 business days. Failure to submit this identification may result in my disqualification
from this examination and cancellation of my examination papers.
Bring identification to: 44 Gough Street, San Francisco, CA
Fingerprint
Name of candidate: _____________________________________
Date: ________________________________________________
Candidate signature: ____________________________________
COMPLETE in duplicate.
COPY to candidate. ORIGINAL to be retained by analyst or representative.
---------------------------------------------------------------------------------------------------------------------
Fingerprint
Type of identification submitted: ___________________________
Verified by: ____________________________________________
Date: __________________________________________________
Candidate signature: ______________________________________
EXM-10a
Section VI – Page 8
SAMPLE
WRITTEN EXAM PROCTOR INSTRUCTIONS
Test Room Setup:
 Set up 2 pencils and 5 sheets of paper for each candidate.
 Set up extra paper and answer sheets at head table.
At the Door:
 Check IDs against name on letters and have candidates sign letter (if candidate did not bring
letter, have candidate print name and sign the duplicate letter).
 Do not take letter from candidates – letters will be needed in the testing room.
In the Testing Room
 Have candidates place any bags, papers, purse, etc. on the floor next to them.
 Pass out exam sheet with instructions facing up.
 Tell candidates to turn the paper over until you tell them to begin.
 Read the instruction sheet out loud to/with candidates.
 Set timer for __ minutes. Then say: You may begin.
During the exam:
 Advise candidates when __ minutes are remaining.
 Advise candidates when __ minutes are remaining.
 Advise candidates that if they finish early, they should come to the front table and turn in their
work.
 When timer goes off ask everyone to stop and separate their scratch paper from the written
exercise they want to submit. Have them leave their pencils on the table and stand in line to
assist with labeling and stapling their exercise.
Section VI – Page 9
SAMPLE
TEST ROOM PROCTOR INSTRUCTIONS
CLASS & TITLE
WRITTEN - MULTIPLE CHOICE EXAMINATION
Welcome to the written examination for Class___________.
1. Please place all notebooks, books, pamphlets, guides or any other materials under your seats.
Leave your notification letters out. Be sure that all electronic devices – beepers, pagers, cell
phones, watch alarms or any other noise-making electronics – are turned off. Once the
examination has begun, you must keep your eyes on your own papers and you will not be
allowed to talk.
2. At your desk you should have one red Scantron sheet and two # 2 Pencils.
3. Follow my instructions to fill in the information requested on your Scantron answer sheet. In
the upper right hand corner of your answer sheet, please print your last name, first name and
middle initial in the space provided. In the upper left corner, fill in the boxes with your
identification number and fill in the corresponding bubbles. Start with the top row and work
your way down. This should leave the bottom row blank. If you do not know your
identification number leave this area blank. (Proctors should check to make sure candidates are
completing sheets properly.)
4. In the upper right corner of your Scantron answer sheet, there is an example of how to
properly mark your answer on the answer sheet. Follow this example. Remember to erase
completely if you change your answer. When you are told to begin, be sure to make only one
mark for your choice of answer. There is no penalty for guessing.
5. Please sign the bottom of your notification letters. Do not print. Pass the letters to the end of
each row to be collected.
6. Now I will hand out the test booklets. Do not open them until told to do so.
7. Print your name on the top left corner of the cover of the test booklet. Find the test booklet
number in the upper right corner of your test booklet. (Ask if everyone has found it). Write that
number to the left of your name on the Scantron answer sheet.
8. I will now read the test instructions on the front of the test booklet. Read along to yourself as I
read them aloud – these are pages Roman numeral 1 thru 4 (pages I - IV). Do not open the
Question portion of the booklet. (Proctor should now read the instructions)
9. If you need to use the restroom during the examination, raise your hand to get a proctor's
attention. Only one candidate at a time will be allowed out of the room.
Section VI – Page 10
10. If you finish before time is called, raise your hand and a proctor will collect your test
materials. All test materials, including any notes, must be returned to the proctors. You will
then be dismissed.
11. If you are still working when time is called, you are to remain seated until dismissed. A
proctor will collect materials from each desk, one candidate at a time. You are to stay seated
until all test materials are collected from everyone in the room. No one may leave until all
materials have been collected.
12. You are to leave quietly because others may still be taking the exam. You are to leave the
building and not loiter in the corridors or in front of the building. (Direct them towards the
front door when they leave at the end of the exam).
13. Are there any questions? Once again, you will have _____ hours and ___ minutes to complete
the examination. Time remaining will not be announced.
14. The time is now ___. You may open your books and begin. (Be sure to note the start time and end
time on the chalkboard, or on a paper that you can place on the wall for all candidates to see).
Proctors should not comment on examination items. If a question is asked, direct the candidate to
perform to the best of his/her ability. If they insist that there is something wrong with a question,
call for one of the chief proctors, but have the candidate continue to take the test.
15. Write the beginning time and the ending time on the board for all candidates to see.
16. After the examination has begun, walk around the room and check to see that the candidates have
put ALL the correct information on the top of their Scantron answer sheet. Make sure the candidates
are filling in the boxes correctly and completely, erasing completely when changing an answer and
that answers are going down the sheet and not across.
17. When time is up, tell candidates, “STOP! Put your pencils down. Please remain seated until I
dismiss you.” See that all candidates stop when time is called. Pick up each candidate’s Scantron
answer sheet and test booklet. Be sure that you have collected a Scantron answer sheet and test
booklet from each candidate.
18. The number of candidates should match the number of test booklets and Scantron answer sheets.
Section VI – Page 11
WRITTEN EXAMINATION CHECKLIST
TEST MATERIALS
___1. Check with Reproduction Unit to ensure that materials will be printed by the target date
SECURE TESTING SITE
___1. Check with accounting/budget staff to obtain information on availability of funds and
documents needed for payment
___2. Obtain Dept. of Real Estate approval (if necessary)
___3. Secure testing site - call site contact person; arrange for site visit
___4. If using a school, submit school permit 10 days in advance (Note: payment must be in advance)
___5. Arrange room numbers, number of candidates per room
___6. After receiving approval(s), notify candidates of testing date, time, and location
___7. Contact school custodian or site contact person to verify opening time, room numbers and
seats per room
PROCTORS
___1. Contact proctors
___2. Plan proctor assignments and prepare assignment sheet
___3. Confirm with proctors
___4. Bring sign-in sheets for proctors
___5. Make sure new proctors have completed all necessary documentation prior to exam date.
___6. Conduct orientation for proctors at test site or earlier.
TEST PACKAGES FOR EACH ROOM
___1. Written examination booklets
___2. Answer sheets
___3. Change of Address forms
___4. Manila envelopes
___5. Scratch paper
___6. Sharpened pencils
___7. Red pens or pencils
___8. Proctor instructions
___9. Proctor report forms
__10. Timer
CHECK-IN MATERIALS
___1. I.D. Verification forms
___2. Carbon paper
___3. Clipboard
___4. Inkpad
___5. Paper towels
___6. Extra blank notification letters
___7. Check-in roster
___8. Extra pencils/pens
MISCELLANEOUS
___1. Rubber bands
___2. Chalk
___3. Coffee pot
___4. Coffee, tea, cups, cream, sugar, spoons, etc.
___5. Map of site
PREPARATION
___1. Prepare enough test packages for the number of candidates scheduled
___2. Bundle/box test packages and other supplies for each test room
___3. Bundle/box materials and supplies for other test stations (e.g., registration room, etc.)
AFTER EXAMINATION
___1. Recount examination booklets
___2. Alphabetize signed notification letters for storage box
___3. Compute proctor payroll and submit for payment after approval
___4. Store or shred examination booklets
___5. Return extra materials, coffee pot, etc.
Section VI – Page 12
REVIEW OF RATING FOR WRITTEN OBJECTIVE EXAMINATIONS
Candidates who are unsuccessful in the written component of a multiple-component examination are
allowed a five-day Review of Rating period. Generally, successful candidates should not be allowed to
see their written examination scores until all components are completed. The Review of Rating period
is specified in the notice informing candidates of their scores. The sole purpose of this review of rating
period is to determine that the scores are accurate. The sign-in sheet for Review of Rating By
Candidates in the Written Examination is used by DHR Support Staff to maintain a record of the
candidates who review their own papers during this period.
FORM:
Sign-in sheet for Review of Ratings by Candidates in the Written Examination.
PROCEDURE:
 Notice is sent to candidates, informing them of the five-day Review of Rating period during which
test papers may be reviewed for errors in scoring.
 The sign-in sheet for Review of Ratings by Candidates in the Written Examination is used by the
Support Services staff to record the examination answer key number and the signature for each
candidate who reviews her/his papers.
 A copy of the candidate’s Scantron sheet and examination answer key is provided to the candidate
for review. If there is a large number of candidates, it may be necessary to provide multiple copies
of the key.
 Applications and all supporting documentation submitted by the candidate are also available during
the review period.
 Candidates are allowed to inspect their own papers only. Non-candidates must submit a public
records request in order to review any candidates’ papers. Such requests will be reviewed and
answered on a case-by-case basis.
Section VI – Page 13
DEPARTMENT OF HUMAN RESOURCES
City and County of San Francisco
REVIEW OF RATINGS BY CANDIDATES IN THE WRITTEN EXAMINATION
Class Number and
Title of Examination: ____________________________________________________________________________
Dates: __________________________________________________ Analyst: _____________________________
This Review of Rating is for the sole purpose of determining that the
computation of the score has been accurate.
PROTESTS OF ACCURACY OF SCORE MUST BE IN WRITING.
***********************************************************************
DATE | TIME IN | SIGNATURE | PRINT LAST NAME | STAFF INIT | KEY NBR
Section VI – Page 14
ORAL/PERFORMANCE EXAMINATION – RECRUITING RATERS
In recruiting appropriate raters to serve on the oral examination panel, analysts must:
 Determine what qualifications raters need to possess; i.e., field of expertise, level in organization,
licenses and certifications.
 Consider factors that may limit/restrict participation; i.e., conflicts of interest (knowledge of
significant number of candidates, participation in prior exams). Special care and consideration
should be given if using City employees as raters for exams where the candidate pool includes
City employees.
 Determine sources to be used for obtaining raters; e.g., professional organizations, other public
jurisdictions/agencies, private businesses and organizations with similar functions, colleges and
universities.
 Contact sources to obtain names of potential raters; then contact individuals to determine their
availability and to confirm qualifications.
 Seek diversity of raters (gender, ethnicity, sexual orientation, etc.) to reflect the City’s
population/candidate pool.
 Secure at least two raters for each panel. If possible, analysts should attempt to obtain three
raters per panel. For some examinations, a larger number may be desirable, due to the nature or
level of the class. Analysts should attempt to have alternate raters available should some raters
fail to appear.
 Ensure that each board has the same number of raters.
 Ensure that the examination supervisor reviews names of prospective raters.
 Call raters one or two days prior to the examination to confirm their participation.
After arranging rater participation, analysts must send rater orientation packets as soon as possible and in
sufficient time for the rater to receive and read the material (ideally at least 10 days prior to the examination).
The packet must include:
 Cover letter (See sample on next page)
 Oral/Performance Examination Manual
 Examination Announcement
 KSA Description Sheet (Optional)
 Sample Rating Sheet (Optional)
 Map, directions to test site
 Parking and transit recommendations
 Rater Background Information and Qualifications Form (Optional)
Section VI – Page 15
SAMPLE OF COVER LETTER TO ORAL/PERFORMANCE EXAMINATION RATERS
Thank you for agreeing to serve as a rater to appraise the candidates participating in the examination for
__________________________________ .
Enclosed are copies of the following:
1. Oral Examination Manual (A guide for raters)
2. Examination Announcement
3. KSA Description Sheet (Optional)
4. Rating Sheet (Optional)
5. Map (Optional)
6. Rater Background Information and Qualifications Form (Optional)
Please review these materials prior to the examination orientation.
The raters will convene at _______________, on ___________________________ .
The hours will be from ________ to approximately _________ . There will be an orientation to go over the
examination procedures, and to answer any questions you may have regarding the process. While serving as a
rater, we will provide you with lunch (remove for City employees) and will reimburse you for any parking,
public transportation or bridge toll fees which you may incur.
Please do not divulge the fact that you are serving as a rater so that neither the candidates nor the
department(s) may learn the identity of the raters prior to the interviews.
If you have any questions, please call me at _______________ . I am looking forward to working with you.
Thank you again for your assistance and cooperation.
Very truly yours,
Section VI – Page 16
ADMINISTERING THE ORAL AND PERFORMANCE EXAMINATIONS
The success of the oral and/or performance examination depends to a great extent upon the impartiality of the
raters as well as their ability to put candidates at ease. Candidates will base their opinion of the value of the
examination on their treatment by the interviewers; every effort should be made to ensure that the candidates
feel they have received a fair and adequate opportunity to demonstrate the relevant knowledge, skills and
abilities.
To ensure that all the aspects of the administration are planned adequately, analysts must refer to the Checklist
for Oral Examinations.
Guidelines for Administering the Oral/Performance examination:
 Refer to Checklist for Oral Exams for instructions for setting up test site before raters arrive.
 Conduct proctor orientation; provide written proctor instructions.
 Conduct rater orientation.
Departmental Briefing (Optional):
In some cases, a briefing by a departmental representative may be advisable. Departmental briefings
and all oral interviews must be taped. The person giving the briefing is to be cautioned not to refer to
any individual candidate, nor to suggest one specific class as being more “appropriate” experience
than another. We do not want to give any candidate or other party who may later listen to the tape
cause to claim that uniform standards and impartiality were not maintained. The briefing is to be
impartial to the candidates and is to present basic information to the panel about the duties and
responsibilities of the subject classification as well as those characteristics that will enable the
employee to perform the job successfully. In most cases, the departmental representatives leave after
the briefing is completed, unless they are needed to respond to raters’ questions.
Oral Board Orientation:
If a departmental briefing is not conducted, or the classification was not adequately covered in the
briefing, the analyst must describe the classification to the raters.
Analyst should tell raters that since this is a stressful situation for most candidates, raters should make
every effort to put candidates at ease.
Analyst should describe the structured examination process; the following points must be included:
1. Raters ask all candidates a pre-established set of questions designed to elicit responses upon which
an evaluation of the candidate’s knowledge, skills and abilities can be based.
2. Raters evaluate the responses of candidates based upon predefined, job-related criteria that are
called behavioral anchors.
3. Common rater errors, described on page 6 of the Oral/Performance Examination Manual.
4. Description and use of rating scale and rating sheet.
5. Raters are encouraged to take notes, but on a sheet separate from the Rating Sheet.
6. All ratings should initially be made independently, with no discussion among raters. After rating
independently, raters will review their ratings to determine if consensus has been reached.
Consensus: KSA ratings ideally should be within 1 point. If necessary, raters should discuss the
factors that led them to make certain ratings. Any discussion should concentrate on these factors in
relation to the rating guidelines and the discussion should focus on reaching consensus.
(Sometimes consensus cannot be reached in spite of best efforts.)
Section VI – Page 17
7. Preliminary ratings should be made in pencil at the conclusion of each performance test. A final
rating can be made after all applicants have been tested.
8. Raters must excuse themselves from rating any applicant who is so well known to them that it
would be difficult to rate the applicant impartially. No comments about this applicant should be
made to other raters.
9. Review contents of exam and guidelines with raters. If discussion results in any changes in
questions, changes must be made prior to first candidate’s exposure to questions.
10. Caution raters that questions must be asked in the same order and manner, no paraphrasing or
explaining is allowed.
11. Caution raters not to be inappropriately influenced by communication skills if that is not the KSA
being measured. For example, when testing technical knowledge for Animal Care Attendant, if the
candidate does not use proper grammar in responding to the question, raters should not allow this
to affect the rating.
12. Caution raters not to discuss examination with candidates prior to the final adoption of the eligible
list.
13. Instruct raters on the operation of tape recorder.
14. Instruct raters to start tape recorder, and ask candidate to state his/her name and if s/he has read the
notices posted. Explain content of notices to the raters. The names of the raters and appeal rights
are posted.
15. Instruct raters that if they have any concerns at any time during the process to let the analyst know.
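The “within 1 point” consensus rule in item 6 reduces to a simple spread check on the raters’ independent scores. The sketch below is purely illustrative and is not part of DHR procedure; the function name, KSA labels, and score values are invented for the example.

```python
# Hypothetical illustration of the consensus rule: raters score each KSA
# independently, then check whether their ratings fall within 1 point of
# each other before any discussion.

def needs_discussion(ratings, tolerance=1):
    """Return True when independent ratings differ by more than `tolerance` points."""
    return max(ratings) - min(ratings) > tolerance

# Example: three raters' independent scores for two KSAs (values invented).
ksa_ratings = {
    "Oral communication ability": [4, 4, 5],  # spread of 1 -> consensus
    "Analytical ability": [3, 5, 4],          # spread of 2 -> discuss
}

for ksa, scores in ksa_ratings.items():
    status = "discuss further" if needs_discussion(scores) else "consensus reached"
    print(f"{ksa}: {status}")
```

Any discussion triggered by the check would still follow the procedure above: raters revisit the rating guidelines and the factors behind their scores, not simply average their numbers.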
Analysts should brief the proctors on their responsibilities; the following directions must be included:
 Check I.D. and notification letter when candidate arrives, and direct candidate to read posted
examination notices.
 Verify that candidate has read the notices and ask whether there is an objection to any member of
the rating panel.
 If candidate has an objection, discuss reasons with candidate to determine the best course of action
(i.e., alternate panel, excuse panel member, etc.).
 If candidate is allowed to review questions prior to interview, escort candidate into a separate
room. Monitor candidate for allotted time. Ensure unauthorized materials are not used.
 If candidate is provided with a copy of the questions to refer to during the interview, make sure that
the candidate does not take any examination material with him/her.
 Escort candidate into interview room and introduce to raters.
 Escort candidate to exit after the interview.
During the examination, the analyst should:
 If possible, stay with the oral board panel through all interviews to monitor raters and ensure
that uniform procedures are being followed and that scoring is consistent with standards.
 Periodically monitor progress of panel(s).
 Review final ratings to ensure that raters have reached consensus (if necessary), all rating sheets
are signed and any necessary comments are recorded.
Section VI – Page 18
 Gather all test materials, including rater notes.
 Reimburse raters for expenses and thank them.
Performance Test Procedures
Applicants are shown to the testing area and provided with the “Instructions to Applicants”. Any general
questions about the testing procedure are answered before beginning the test.
A stopwatch should be used to ensure that time limits are accurately followed. The analyst must note the
starting and stopping time of each test administration and must note any irregularities.
Section VI – Page 19
ORAL BOARD RATER FOLDERS
Rater folders are used to provide each oral board rater with the most current information regarding the
classification to be tested and the administration of the examination. The following information should be
included in the folders:
 Rater Background Information and Qualifications Form
 Job Announcement (optional)
 KSA description from job analysis
 Examination questions and script (to be read by Board Members to each candidate)
 Examination rating guidelines
 Rating grids or tools (optional, to assist the Board Members with tabulating the candidate’s scores)
 Scantron sample if used (both sides)
 Note pad
 Pencils
 Tent card with rater’s name
Section VI – Page 20
SAMPLE
PROCTOR INSTRUCTIONS FOR ORAL/PERFORMANCE EXAMINATION
ALL PROCTORS
 Sign-in/sign-out
 Put up signs (workstation signs, check-in desk, & wall)
 Post notices (Civil Service Rule 111.14 and Rule 111A Appeal Rights, Information for Candidates
which includes the names of raters)
 Pass out packets to raters along w/name tag, clipboard, pencils, etc.
 Set up tape recorders & cassettes
 End of day – collect rater packets
CHECK-IN PROCTOR
SUPPLIES: Roster, candidate labels, inkpad, blank sheets, posting notices, list of board members, paper
towels, rating sheets
 Ask for schedule letter and tell candidate to hold onto it.
 Ask for driver’s license and check that it is current. If not current, note on roster. If no license,
fingerprint the person, write name and ID #.
 Ask candidate to read posted notices (Rule 111.14 and Information for Candidates).
 Ask if candidate has objections to board members (if they know them); if so, advise analyst.
 Ask candidate to have a seat until called.
 Check that there is one rating sheet per rater for all candidates.
 After written exam, give rating sheets to escort-proctor.
WRITTEN EXERCISE PROCTOR (SEE SEPARATE INSTRUCTIONS)
SUPPLIES: Timer, pencils, folder for schedule letters, written exam, scratch paper, roster.
ESCORT PROCTOR
 Keep checking to see when boards are finished (allow them time to fill out rating sheets after
candidate has left).
 Pick up rating sheets from check-in proctor and ask proctor if anyone has objections to the board
members.
 Call candidate’s name.
 Walk candidate to testing room.
 Hand rating sheets to board members.
 Ask candidate to introduce him/herself to the board members.
 When candidates are finished, escort them to the exit.
Section VI – Page 21
SAMPLE
EXAMINATION INSTRUCTIONS TO CANDIDATE
CLASS 4220 PERSONAL PROPERTY AUDITOR
ORAL/PERFORMANCE EXAMINATION
Please read carefully.
There will be two parts to this examination, a written exercise and a structured oral interview. The
examination board will be rating your responses to both the written exercise and the oral interview questions
as they reflect the knowledge, skills and abilities necessary to perform the duties of this class.
These include:
Knowledge of accounting and auditing principles and practices
Ability to interpret government codes and ordinances
Problem-solving/organizational ability
Analytical ability
Written communication ability
Oral communication ability
Human relations ability
The written exercise will be rated separately, and will measure your written communication ability only. The
final draft of your response must be on the single sheet of letterhead provided. The board will read only that
sheet when evaluating your response to the written exercise.
In the oral interview, you will be asked four questions, and will be given a printed copy of these questions so
that you may follow along as they are presented to you. The oral interview should last approximately 20
minutes. You will be rated only on your response to questions asked. Please answer each question clearly,
completely and concisely.
All briefcases, papers, calculators, etc. not provided by the examination proctor must be left with the proctor.
You may not take any examination materials or notes regarding the examination from the exam area.
Please refrain from discussing the contents of this examination with anyone prior to the posting of the eligible
list.
Section VI – Page 22
CHECKLIST FOR ORAL EXAMINATIONS

A. PRELIMINARY
 Oral date set (calendar noted)
 Reserve tape recorder(s) and locate tapes
 Raters notified – orientation packet sent
 Candidates notified
 Fails notified
 Desk signs (name signs for raters)
 Briefing by departmental representatives arranged (optional)
 Rating sheets printed
 Check-in roster prepared
 Questions and rating guidelines prepared
 KSA definitions
 Definition of Rating Scale
 Rating worksheets or grids (optional)
 Pencils, tablets
 Digest of Rule 111
 Background Forms (EXM-15)
 Receipts form for individual raters
 Obtain cash advance for expenses (lunch, travel, reimbursements, etc.)
 Arrangements for rater accommodations/travel, etc.
 Performance test equipment (i.e., easels, mechanical diagrams, etc.)

B. PREPARATION
SECURE TESTING SITE
 Secure approval from Human Rights Commission (if necessary)
 Check with accounting/budget staff to obtain information on availability of funds and documents needed for payment
 Obtain Dept. of Real Estate approval (if necessary)
 Secure off-site testing site, if necessary – call site contact person; arrange for site visit
 If using a school, submit school permit 10 days in advance (payment in advance)
 Prepare rater test packets (see p. 16):
   desk sign   scratch paper   pencils   exam announcement
   KSA definition   check-in roster
   questions and guidelines
   background forms (EXM-15)
   rating worksheets or grids (optional)
 Confirm time and place with raters and departmental representative(s)
 Ensure supply of extra tapes and tape recorders, and/or video recorders (for performance examinations)
 Check operation of tape recorder(s) and label tapes

NOTIFYING CANDIDATES
 After determining test site, notify candidates of testing date, time, and location
 Contact school custodian or site contact person to verify opening time, room numbers and seats per room

C. ADMINISTRATION
1. PROCTORS
 Contact proctors
 Plan proctor assignments and prepare assignment sheet
 Confirm with proctors
 Bring sign-in sheets for proctors
 For new proctors, ensure that they have completed all necessary documentation prior to exam date
Before raters arrive:
 Post: Information for Candidates (EXM-14) and
Digest of Rule 111/111A
 Set up tape recorder(s) and record date, examination
and names of raters
 Have ready:  change  receipts  stapler
 staple remover  ID forms  rating sheets
When raters arrive:
 Tape record departmental briefing, if any
 Distribute rater test packets and conduct rater
orientation/training
 Collect background forms (EXM-15)
Candidate Check-in:
 Check candidate ID
 Ask candidate if s/he has read notices and if
candidate has any objections to composition of rating
panel
 Tell candidate about oral examination procedures
Inside Exam Room:
 Introduce board members
 Advise candidate of taping; start tape, ask for name,
has s/he read notices; and if s/he has any objection to
composition of rating panel
 Note on candidate roster start time and tape count
number (optional)
 Monitor questioning
 Review ratings
After Exam:
 Collect packets
 Reimburse
 Thank board members
 Prepare expense report, attach receipts
 Remove signs, clean-up room(s)
 Return tape recorder(s)
POSTING NOTICES FOR ORAL/PERFORMANCE EXAMS & DIGEST OF CIVIL
SERVICE RULE 111A
PURPOSE:
To provide candidates with the name of raters and information regarding Civil Service
Commission rules and review of ratings.
FORMS:
 Information for Candidates
 Examination Policies, Procedure and Practices and Digest of Civil Service Rule 111A
 Job announcement (optional)
 Examination instructions to candidate
PROCEDURE:
 Analysts should complete top section of the Information for Candidates form and
provide the following information for each rater: name, position title, name of
organization/agency.
 Post forms at examination site.
INFORMATION FOR CANDIDATES
Class No.: ____________ Title: ___________________________________________________
Dates: ________________________________________________________________________
Analyst in Charge: ______________________________________________________________
The members of the Panel are:
The Department of Human Resources requests that you do not discuss the interview with other
candidates who have not yet been examined. Your cooperation in adhering to this request will
equalize opportunity for all candidates. You will be notified of the results of the exam by mail or
email.
City and County of San Francisco
Human Resources Director Policy, Procedure and Practices for Examination
Administration
Policy, Procedure and Practices
It is the policy of the Human Resources Director that all examinations shall be conducted in compliance with Civil
Service Rules, other applicable laws, and professional practices. Position Based Testing examinations will be
conducted in a standardized manner and utilize fair employment practices in the administration of merit based
examinations.
Civil Service Rule 111A. 1.2: It is the policy of the Civil Service Commission that examination processes in the City
and County of San Francisco under the Position-Based Testing Program are conducted in an efficient and fair
manner to ensure that the best-qualified individuals are selected to perform service for the City.
I. Policy, Procedures and Practices
The procedures and practices for examination administration are established to ensure fair and impartial conduct of
examinations.
1. The orientation of the raters may include a presentation by the department head or departmental
representative which includes a description of the class and/or position for which the examination is being
held, the setting of the class in the department, the critical elements of knowledge, skills and abilities and
other characteristics needed by employees in this class, and related information. The department head or
representative shall not discuss any candidate with any rater at this time or any other time prior to the
completion of the examination.
2. No fraternal rings, organization pins, or insignia of any kind shall be displayed by any rater.
3. No rater shall rate a candidate who is related to that person or rate a candidate if any strong personal
association exists between that candidate and the rater so that it would be difficult to make an impartial
rating. If possible, the excused rater shall be replaced by an alternate with similar qualifications.
4. Raters may only consider relevant documents from candidates that are required by the scheduling notice.
5. Uniform standards shall be applied to every candidate in each examination.
6. Except as otherwise permitted by law, applicants shall not be questioned regarding their race, religion, sex,
gender identity, political affiliation, age, creed, national origin, disability, ancestry, marital status,
parental status, domestic partner status, color, medical condition (cancer related), ethnicity,
Acquired Immune Deficiency Syndrome (AIDS), HIV and AIDS-related conditions, or other
non-merit factors; nor shall such factors be utilized in establishing minimum qualification
requirements or in developing examination procedures.
7. Recordings of an examination shall be retained until the eligible list is adopted. A defective recording shall
not invalidate the examination unless the Human Resources Director finds the omitted or unintelligible
material critically relevant to the examination, in which event the Human Resources Director may order a
new examination.
8. In the event of an appeal that could invalidate the examination, all candidates whose standing in the
examination may be affected may be notified of the appeal prior to final action being taken.
9. Any violation of the following procedures and practices by candidates may be cause for disqualification:
- no fraternal rings, organization pins or insignia of any kind shall be displayed by any candidate;
- no candidate shall discuss her or his candidacy or any relationship thereto with any rater prior to the
completion of all parts of the examination and the final adoption of the eligible list; and
- unless expressly directed by the notice to candidates to report for examination, no letters of reference
or recommendation, performance evaluations, work samples, work products, awards, certificates, or
other materials shall be presented to the raters.
II. Civil Service Rule 111A Examination Provisions Regarding Position Based Testing
Sec. 111A.18
Adequacy of Examinations
The Human Resources Director shall approve the adequacy of the examination to rate the capacity
of the applicants to perform the job. Examinations may include, but are not limited to one or more
testing devices such as written examinations, oral interviews, performance exercises, assessment
centers, successful completion of requirements imposed by other authorities for the award of
certification, licensure, academic recognition (e.g. degree, course completion), placement on a
roster as provided in Sec. 111A.27, or any other devices or methods to determine merit and fitness
for tested positions.
Sec. 111A.19
Examination Rating Panels
The Human Resources Director shall make every reasonable effort to ensure diversity of the
qualified raters.
Sec. 111A.20
Establishing Cutoff Scores and Number of Eligibles
The Human Resources Director shall establish a cutoff or passing score and shall determine the
number of persons who shall constitute the eligible list.
Sec. 111A.21
Cheating in Examinations Prohibited
111A.21.1
Any action that constitutes cheating, improper aid, hindrance, fraud, or collusion in any part of the
examination process is prohibited. The following are some specific actions that are expressly
prohibited: relevant false statements by applicants on the application or during the selection
process; the use or attempted use of materials not authorized by the scheduling notice to
candidates to report for the examination; defeating, deceiving or obstructing any person in respect
to his or her right of examination; falsely marking, grading, estimating, or reporting upon the
examination or proper standing of any person examined hereunder, or aid in so doing; making any
false representations concerning the examination or the person examined; or furnishing to any
person any special or secret information for the purpose of either improving or injuring the
prospects or chances of any person of being appointed, employed or promoted.
111A.21.2
Any person cheating, attempting to cheat, or assisting in cheating or hindering other persons in
any phase of the examination process shall be prosecuted to the full extent of the Charter and other
laws. Actions to be taken include elimination from the examination process, dismissal and
ineligibility for future employment, and such other appropriate action as may be recommended by
the Human Resources Director.
Sec. 111A.22
Review of Ratings by Examination Participants
111A.22.1
Examination participants shall have a minimum period of five (5) working days to review their
own examination ratings to confirm the accuracy of the calculation of their scores and/or rankings.
The identity of the examiner giving any mark or grade shall not be disclosed.
111A.22.2
The Human Resources Director shall establish the procedures for Review of Ratings.
RATER BACKGROUND INFORMATION AND QUALIFICATIONS FORM
PURPOSE:
To document qualifications of raters and provide notification of confidentiality of
examination process.
FORM:
Rater Background Information and Qualifications form
PROCEDURE:
 Analysts should distribute forms to raters at the beginning of the orientation to the
examination.
 Retain form in Active file for future reference.
RATER BACKGROUND INFORMATION
Examination for:
Class Number:
Title:
Department:
Working Title:
1. Name:
2. Sex (Check One):  Male  Female
3. Ethnic Group (Check One):  White  Black  Hispanic  Filipino
 Asian or Pacific Islander  American Indian or Alaskan Native
4. Employed by:
Address:
Telephone number, including area code:
5. Your position:
6. Briefly describe your experience, training, and education in this area:
Please complete the following (please print):
I understand that all information discussed concerning an examination is of a highly
confidential nature. I further understand that to discuss these matters is unfair to
candidates and illegal.
Signature:
Date:
File in Active File
EXAMINATION ADMINISTRATION APPEAL PROCEDURE
To process and determine the merits of an appeal of examination administration, the analyst
must:
 Ensure protest has been logged by DHR Support Services staff
 Ensure timeliness of protest (see stages of protests below)
 Determine whether claim is related to a subject that can be protested or
appealed. Discuss with supervisor before proceeding
 Investigate claim
If protest is being denied, and can be appealed, summarize the appeal issue(s), state
reason(s) for denial and advise candidate the appeal must be filed pursuant to CSC Rule
111A Article 8 Appeals of Examination Processes.
If there is merit to the protest, take steps to rectify the problem and advise candidate(s) of
corrective action.
If there is no merit to the protest, and it cannot be appealed, summarize the appeal issues,
state reason(s) for denial and advise candidate matter cannot be appealed.
All protests must be in writing unless otherwise specified in Civil Service Rule 111A. All
protests must state the specific grounds upon which they are based and provide facts
supporting allegation(s). Failure to do so may nullify the protest and subsequent appeal
rights.
CANCELLATION OF EXAMINATIONS
PURPOSE:
To allow for the canceling of an examination
AUTHORITY:
Civil Service Commission Rule 111A.6, Responsibilities of the
Human Resources Director: The Human Resources Director shall
administer and rule on all matters concerning the position-based
testing program.
PROCEDURE:
An examination in progress can be cancelled when hiring needs change and
anticipated vacancies are not to be filled. To request the cancellation of an
examination a letter should be submitted to the Human Resources Director clearly
explaining why the examination should be cancelled. Include the following
information in the request:
 Any problems encountered in the recruitment and selection process
 How the department intends to fill the position if not by permanent
appointment from an examination
The final decision on whether to cancel the examination is made by the Human
Resources Director. Analysts must contact any applicants and DHR, Support
Services regarding the cancellation.
The request letter is initiated by analysts; reviewed and signed by Team Leaders
or Departmental Personnel Officers; forwarded to the Deputy Director of
Recruitment and Assessment Services for review; and then forwarded to the
Human Resources Director for review and approval or denial. If approved, the
signed letter is sent to DHR, Support Services staff, who will use the information
to update the master examination record.
EXPENSE REIMBURSEMENT FORMS
PURPOSE: To request reimbursement of examination administration expenses.
FORM:
Expense Reimbursement Receipt form
Field Expense Report form
Request for Cash Advance to Defray Business-Related Expense form
Traveling Expense Voucher form
PROCEDURE:
Analysts must refer to the Department of Human Resources Purchasing
Guideline prior to incurring any examination administration expenses.
For reimbursement without cash advance:
 Each rater must complete and sign the form, attach original receipts or
provide explanation for missing receipts, and submit to analyst for
reimbursement of transportation and/or miscellaneous costs incurred.
 Analysts must itemize all examination expenses incurred on the Field
Expense Report form and attach the Expense Reimbursement Receipt
forms from the raters and all other original receipts. Team Leader and
manager must sign the Field Expense Report form before it is submitted
to DHR.
For cash advance to defray examination administration expenses:
 To receive a cash advance prior to the examination, analysts must
complete the Request for Cash Advance to Defray Business-Related
Expense form and submit the original and one copy to DHR three weeks
prior to the examination administration.
 After the examination administration, analysts must itemize all expenses
incurred on the Traveling Expense Voucher form and attach the Expense
Reimbursement Receipt forms from the raters and all other original
receipts. Team leaders and manager must sign the Traveling Expense
Voucher form before it is submitted to DHR.
EXPENSE REIMBURSEMENT RECEIPT
Examination
Class #:
Title:
This form is to be completed by the rater/interviewer for all reimbursable expenses.
RECEIPT ATTACHED?
I. Transportation Expenses
YES
Amount
NO
$
a. Parking
b. Bridge Toll
c. Transit
BART
MUNI
d. Other (explain in detail)
II. Miscellaneous Expenses
(Explain in detail)
$
TOTAL EXPENSES $
(Attach receipts or explanation of missing receipts)
Rater’s Name (Please Print)
Rater’s Signature
Rater’s Social Security Number (if reimbursement is to be mailed to rater)
Date
Rater’s Telephone
DATE:
TO:
FROM:
SUBJECT:
REQUEST FOR CASH ADVANCE TO DEFRAY
BUSINESS-RELATED EXPENSE
Amount Requested: $
Reason for Request:
Signature of Requestor
Date
Telephone No.
Request Approved:
Team Leader
Date
Assistant Division Manager
Date
INSTRUCTIONS
1. Route original and one copy of request.
2. Allow three weeks for processing.
3. Please justify request as fully as possible, e.g., if request is to offset rating panel
expenses, indicate classification, number of days, and number of raters.
4. Approval by Assistant Division Manager and Team Leader required.
5. Requests submitted without the signature of the requestor will not be processed.
SECTION VII
Examination Scoring
EXAMINATION SCORING
Introduction
Scoring a test is relatively simple, but applying the test results requires thoughtful analysis. The
use of test scores in personnel selection involves more than just adding up the correct answers.
Sound selection decisions should be based on what the different test scores mean (i.e., good
performance on a test must be distinguished from poor performance). Scores from different
test components (e.g., oral and written) are often combined at the end of a selection process to
determine a candidate’s final ranking score. The process of interpreting and combining test
scores can cause significant problems. The purpose of this chapter is to discuss
some of those problems and a psychometric procedure for addressing them. This procedure is
called Standard Scoring.
What is a Standard Score?
A standard score is the result of converting a raw score to a new value, usually called a z-score,
that compares a score to all other scores in its distribution. A standard score is calculated using
the mean and standard deviation values from the distribution of raw scores. All z-score
distributions have a mean of zero (0) and a standard deviation of one (1). The range of values for
z-scores is usually between –3.0 and +3.0. A z-score indicates the relative status of the score
within its distribution because the actual value of a z-score represents its distance from the mean
in units of the standard deviation. For example, a z-score of +1.0 represents a score that is one
standard deviation above the mean of the scores. A z-score of –2.3 represents a score that is 2.3
standard deviations below the mean of the scores.
Standard scores contain all the information necessary to determine the position of a score in a
distribution of scores. A standard score can be plotted against the normal (bell-shaped) curve
and thus the position of the score in a distribution can be precisely determined. The normal
curve is a hypothetical distribution with known characteristics and is useful in personnel testing
because it allows one to determine the expected proportion of scores above and below a
particular score.
Figure 1: The Normal Distribution
Note: One standard deviation on each side of the mean (±1.0) contains
approximately 68% of the total cases. Two standard deviations on each side of
the mean (±2.0) contain approximately 95% of the total cases. Three standard
deviations on each side of the mean (±3.0) contain approximately 99.7% of the
total cases.
As illustrated in Figure 1, a z-score (standard score) of 0 is always located exactly at the mean.
At this location, 50% of the scores would be above 0 and 50% of the scores would be below 0.
A z-score of 1.0 would have 84% of the scores below it (50% below the mean plus the 34%
contained one standard deviation above the mean) and 16% above it (see Figure 1). The
proportion of scores above and below a specific z-score is summarized in tables that can be
found in the appendix of most statistics books.
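These table lookups can also be done with the Python standard library. The sketch below uses `statistics.NormalDist` purely as an illustration of the proportions described above; it is not part of any DHR scoring tool.

```python
from statistics import NormalDist

# The standard normal distribution: mean 0, standard deviation 1.
std_normal = NormalDist(mu=0, sigma=1)

# Proportion of scores expected to fall below a given z-score.
print(round(std_normal.cdf(0.0), 2))      # 0.5  -> 50% of scores fall below the mean
print(round(std_normal.cdf(1.0), 2))      # 0.84 -> 84% fall below z = +1.0
print(round(1 - std_normal.cdf(1.0), 2))  # 0.16 -> 16% fall above z = +1.0
```

The `cdf` (cumulative distribution function) value at a z-score is exactly the "proportion of scores below" figure that the statistics-book tables summarize.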
Why Should We Standardize Scores?
Raw scores are converted to standard scores for three reasons: 1) To provide meaning to raw
scores; 2) To account for differences in test scores due to the use of multiple rating boards or
multiple administration of the same examination or panel; 3) To retain the effective weight of
test components.
Providing Meaning to Raw Scores
The interpretation of a raw test score (i.e., as either a “superior” or “poor” score) is extremely
difficult because the raw score has no intrinsic meaning. This is in contrast to other types of
measurement where the interpretation of measurement is based on an independent standard.
For example, consider the difference in stating that a person weighs 200 pounds and that the
person scored 80% on a reading ability test. The first measure has meaning but the second does
not. The reason for the difference has to do with the different measurement scales used to
evaluate weight and reading ability. We know how much a pound is, and we know that two
pounds weigh twice as much as one pound. We also know that a person who weighs 200 pounds
on one scale will weigh the same on another scale. We can arrive at those conclusions because
the unit of measurement (pound) is independent of the particular instrument (a scale) being used
to evaluate weight. Also, the measurement of weight is represented by equal intervals between
units of measurement. A pound is a pound no matter what scale is used, and two pounds always
weigh exactly twice as much as one pound.
In contrast, the evaluation of reading ability is considerably different. By itself, a reading ability
score of 80% is meaningless. This is because the evaluation of reading ability (the score) is
completely a function of the test that is used. There are no units of measurement of reading
ability that are independent of reading tests. A person who takes different reading tests may
score differently on each test because there is no absolute unit of reading ability. Without other
references, a score of 80 does not indicate twice the reading ability of 40.
Although psychological measures (such as raw test scores) do not have the clear and precise
meaning as the measures for physical objects, it does not mean that they do not have any
meaning. What is required is a different strategy for deriving meaning from scores obtained
from psychological measures. Standard Scoring represents the psychometric strategy for
converting raw scores into interpretable or meaningful scores.
A standard score also provides information regarding the level of performance compared to the
other test scores observed in the distribution. In fact, when standard scores are used, direct
comparisons of relative performance can be made across distributions (e.g., other tests or other
components). This information provides meaning to the test score and allows for the
interpretation that is needed to make personnel selection decisions.
Accounting for Differences in Test Scores Due to Multiple Oral Boards
Personnel selection tests are designed to measure individual differences that exist in a specific set
of knowledge, skills and abilities (KSAs). When multiple boards are used to rate candidates,
random assignment must be used to distribute candidates across the different boards. With
random assignment, all candidates have an equal chance of being assigned to any given board
and, therefore, it is assumed that candidates are evenly distributed across board with regards to
capability. If the assumed outcome of random assignment is achieved, it can be expected that the
differences that are observed in test score reflect actual differences between individual
candidates’ relative capabilities on the KSAs. However, appropriate analysis must be conducted
to assure that the ratings reflect true differences between candidates’ capabilities rather than
rating errors.
The DHR automated examination scoring system uses a statistical procedure called Analysis of
Variance (ANOVA) to assess the contribution of true individual differences versus rating errors
(e.g., leniency, harshness, use of different rating standards) to the observed test scores.
Significant results for the ANOVA statistic usually indicate that rating errors contributed
to the observed differences in test scores at an unacceptable level. Therefore, the assumption
that different raw test scores indicate different levels of performance or capability must be
seriously questioned.
Rating errors are usually the result of different oral boards applying different rating standards or
one oral board being more harsh or lenient than another oral board. Thus, rating errors result in
differences in test scores that are due to characteristics of the oral boards and, most likely, are
NOT due to true differences among candidates’ capabilities. The most critical consequences of
the rating errors described above become apparent when attempting to compare ratings or raw
scores given by different oral boards. The standards on which each oral board based its ratings
are simply not comparable.
For example, suppose that Chris and Lee take the same single component oral/performance
examination. They have comparable experience, skills, and abilities, but Chris scores 254.97 on
the examination and Lee scores 207.54. We can only assume that Chris is the better candidate if
the test scores reflect the true differences in candidate capabilities. The accuracy of this
assumption must be supported through appropriate statistical analyses of the test results to
establish the test’s reliability and validity.
Assume that Chris and Lee were just two candidates from a pool of 85 people taking the
examination. Two oral boards were used and candidates were randomly assigned to each board.
Chris was assigned to Board 1 and Lee was assigned to Board 2. Also assume the analysis
revealed a significant result for the ANOVA statistic and it was concluded that rating errors
contributed to the differences in test scores across boards. Due to the rating errors, the raw
scores would be standard scored by board.
          N     Mean      Std. Dev.
Board 1   41    202.18    41.57
Board 2   44    162.61    30.36
Given the descriptive statistics listed above, Chris’ raw score of 254.97 is 1.27 standard
deviations above Board 1’s mean and converts to a weighted standard score of 854.13. Lee’s
raw score of 207.54 is 1.48 standard deviations above the mean of board 2 and converts to a
weighted standard score of 895.72. Thus, compensating for the differences in rating standards
used by the boards results in a higher score for Lee.
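A minimal sketch of this by-board standardization in Python follows. The helper name is ours, and the 60/20 rescaling values are the City's standard score parameters described later in this chapter; treat it as an illustration of the arithmetic, not as the DHR scoring system.

```python
def weighted_standard_score(raw, board_mean, board_sd, weight=1.0,
                            new_mean=60, new_sd=20):
    """Standardize a raw score against its own board's mean and s.d.,
    rescale to a mean-60 / s.d.-20 scale, then apply the component weight
    (expressed as a proportion, so 1.0 = 100%)."""
    z = (raw - board_mean) / board_sd
    converted = z * new_sd + new_mean
    return (weight * 10) * converted

# Chris (Board 1) and Lee (Board 2), using the board statistics above.
chris = weighted_standard_score(254.97, board_mean=202.18, board_sd=41.57)
lee = weighted_standard_score(207.54, board_mean=162.61, board_sd=30.36)

# Once board effects are removed, Lee (about 896) outscores Chris (about 854).
print(round(chris, 2), round(lee, 2))
```

The results differ from 854.13 and 895.72 in the text only by intermediate rounding; the conclusion, that Lee ranks above Chris after standardization, is the same.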
Retaining the Effective Weight of Test Components
Personnel selection tests may sometimes consist of more than one exercise or component. In
these situations a score is typically calculated for each component and a total score is then
obtained by adding the component scores together. Each test component is also assigned a
weight that reflects the desired contribution or impact of that component score to the total score.
When weighted raw scores from multiple components are combined, the assigned weights of the
components may not have the expected contribution to the total score. For example, assume that
a test consists of Components A and B where Component A is assigned a weight of 65% and
Component B is assigned a weight of 35%. If weighted raw scores from Components A and B
are simply added together to reach a total score, the effective weight of Component A may not be
65% and the effective weight of Component B may not be 35%. This is because any difference
between the standard deviations of the component score distributions will influence their
effective weights.
To illustrate this point, Table 1 contains hypothetical data for a sample of five candidates on two
different test components, an oral test and written test. Assume that each component is weighted
50%. Also, assume that the means for the two test components are identical but the standard
deviation of the oral test is twice as large as the standard deviation of the written test. The scores
on the two tests have been combined, once using weighted raw scores (WRS) and once using
weighted standard scores (WSS) converted to a 500-point scale (i.e., standard scores with a mean
of 350 and a standard deviation of 50). The rankings of candidates, based on the different
procedures for combining test scores, are presented in the last two columns of the table.
Candidate    WRITTEN            ORAL               TOTAL              RANKING
             WRS      WSS       WRS      WSS       WRS      WSS       WRS   WSS
1            355.00   304.00    500.00   401.00    855.00   705.00    1     4
2            385.00   335.00    445.00   373.00    830.00   708.00    3     3
3            400.00   350.00    400.00   350.00    800.00   700.00    5     5
4            450.00   401.00    370.00   335.00    820.00   736.00    4     2
5            475.00   427.00    365.00   332.00    840.00   759.00    2     1

Written: X̄ = 400, S.D. = 48.81    Oral: X̄ = 400, S.D. = 97.62
Total = Written + Oral
Table 1: Comparison of Candidate Ranking Resulting from Different Procedures for Combining Test Scores
Comparing the rank order of candidates established by combining the scores expressed as
weighted raw scores and those scores expressed as weighted standard scores, you will see that
there are significant differences. The difference in rankings shows that, although its assigned
weight was 50%, the effective weight that the oral test contributed to the total score was much
greater than 50%. While this example was developed to represent an extreme case, it would not
be uncommon for 30 to 40 percent of the eligibles on an eligible list to shift rank when the
procedure for combining scores changes. The errors resulting from the use of raw scores could
easily be avoided by using standard scores.
The conversion of raw scores to standard scores assures that the assigned weight of a component
is the weight it actually contributes to the total score. The retention of a component’s weight is
possible because distributions of standard scores can be made to have the same standard
deviation. The equality of standard deviation values across component score distributions
assures that component scores retain the weight assigned to them.
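The rank shifts discussed above can be reproduced with a short script. This is a sketch only: the helper names `to_standard` and `ranks` are ours, and the distribution parameters (mean 400, standard deviations 48.81 and 97.62) are the ones the Table 1 example assumes.

```python
# Weighted raw scores for the five candidates in Table 1.
written = [355, 385, 400, 450, 475]   # assumed mean 400, s.d. 48.81
oral = [500, 445, 400, 370, 365]      # assumed mean 400, s.d. 97.62

def to_standard(scores, m, sd, new_mean=350, new_sd=50):
    """Convert raw scores to standard scores on the mean-350 / s.d.-50 scale."""
    return [((x - m) / sd) * new_sd + new_mean for x in scores]

def ranks(totals):
    """Rank candidates 1..n, highest total first."""
    order = sorted(range(len(totals)), key=lambda i: -totals[i])
    r = [0] * len(totals)
    for position, i in enumerate(order, start=1):
        r[i] = position
    return r

raw_totals = [w + o for w, o in zip(written, oral)]
std_totals = [w + o for w, o in zip(to_standard(written, 400, 48.81),
                                    to_standard(oral, 400, 97.62))]

print(ranks(raw_totals))  # [1, 3, 5, 4, 2] -- ranking by weighted raw scores
print(ranks(std_totals))  # [4, 3, 5, 2, 1] -- ranking by weighted standard scores
```

Note how candidate 1, ranked first on weighted raw scores because of the oral test's larger spread, drops to fourth once both components are standardized to equal standard deviations, while candidate 5 moves from second to first.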
To summarize, the use of standard scoring is critical for the appropriate interpretation of test
scores in personnel selection. A raw score by itself conveys very little meaningful information.
When using raw scores, the interpretation of test results is difficult and the combination of test
results is problematic. The procedure of standard scoring converts raw scores to a common,
independent scale. Standard scoring allows for the description and meaningful interpretation of
specific test scores, within the same or across different tests, using the same measurement
language. Finally, when combining scores from different components, standard scores ensure
that each component actually retains its assigned weight relative to the total test score.
How are Standard Scores Calculated?
There are different types of standard scores, but the simplest standard score is the z-score. A z-score distribution has a mean of 0 and a standard deviation of 1. The formula for computing a z-score is:
z = (X – X̄) / S.D.
where X = raw score
X̄ = the mean of the distribution
S.D. = the standard deviation for the distribution.
Test Scores Expressed          Conversion to z-scores
as Weighted Raw Scores
756.67                         (756.67 – 852.83) / 84.83 = -1.13
769.17                         (769.17 – 852.83) / 84.83 = -1.00
884.17                         (884.17 – 852.83) / 84.83 = .37
910.83                         (910.83 – 852.83) / 84.83 = .68
943.33                         (943.33 – 852.83) / 84.83 = 1.07
X̄ = 852.83                     X̄ = 0
S.D. = 84.83                   S.D. = 1
Table 2: Example of Converting Test Scores (Expressed as Weighted Raw Scores) to z-scores
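The conversion in Table 2 can be sketched in a few lines of Python. This is illustrative only; the sample (n – 1) standard deviation matches the manual's constants, and note that -0.99 appears as -1.00 in the table because of rounding:

```python
import statistics

# Weighted raw scores from Table 2.
scores = [756.67, 769.17, 884.17, 910.83, 943.33]

mean = statistics.mean(scores)   # ~852.83
sd = statistics.stdev(scores)    # ~84.83 (sample S.D., n - 1 denominator)

# z = (X - mean) / S.D. for each raw score
z_scores = [round((x - mean) / sd, 2) for x in scores]
print(z_scores)  # [-1.13, -0.99, 0.37, 0.68, 1.07]
```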
Raw test score distributions converted to z-scores will usually contain negative numbers which
are cumbersome to work with and difficult for candidates to understand. However, once z-scores
are calculated, the mean and standard deviation of the standard score distribution can be set to
any desired value. The City and County of San Francisco uses a standard score distribution that
has a mean of 60 and a standard deviation of 20. These values are used to approximate the
characteristics of a 1000-point raw score distribution when adding in promotive points.
Section VII – Page 7
Converted standard scores are easily computed from z-scores by multiplying the z-score by the
desired Standard Deviation and adding the desired Mean. The weighted standard score can then
be computed by multiplying the component weight (expressed as a percentage) by 10, and then
multiplying that figure by the converted standard score:
Converted Standard Score = (z-score x S.D.) + Mean
Weighted Standard Score = (component weight % x 10) x converted standard score
Weighted Raw Score   z-score    Conversion to New Standard Score   Conversion to Weighted Standard Score
756.67               -1.13      (-1.13 x 20) + 60 = 37.4           (1 x 10) x 37.4 = 374
769.17               -1.00      (-1.00 x 20) + 60 = 40.0           (1 x 10) x 40.0 = 400
884.17               .37        (.37 x 20) + 60 = 67.4             (1 x 10) x 67.4 = 674
910.83               .68        (.68 x 20) + 60 = 73.6             (1 x 10) x 73.6 = 736
943.33               1.07       (1.07 x 20) + 60 = 81.4            (1 x 10) x 81.4 = 814
X̄ = 852.83           X̄ = 0      X̄ = 60
S.D. = 84.83         S.D. = 1   S.D. = 20
Component Weight = 100%
Table 3: Example of converting weighted raw scores to weighted standard scores on a 1000- Point Scale
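The two conversion formulas can be combined into one small function; run against the z-scores in Table 3 (component weight 100%, mean 60, S.D. 20), it reproduces the weighted standard scores. A sketch for illustration only:

```python
def weighted_standard_score(z, weight_pct, mean=60.0, sd=20.0):
    """Convert a z-score to a weighted standard score per the manual's formulas."""
    converted = (z * sd) + mean                        # Converted Standard Score
    return round((weight_pct / 100 * 10) * converted)  # (weight % x 10) x converted

# z-scores from Table 3, component weight 100%
results = [weighted_standard_score(z, 100) for z in (-1.13, -1.00, 0.37, 0.68, 1.07)]
print(results)  # [374, 400, 674, 736, 814]
```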
Standardized by Component vs. Standardized by Board
The specific purpose for computing standard scores determines the values that should be used
in the z-score formula. For example, if the purpose for using standard scores is to account for
rating differences due to multiple boards, scores should be standardized by board, and board
mean and board standard deviation values should be used in the z-score formula.
If the purpose for using standard scores is to assure the retention of assigned weights, scores
should be standardized by component, and component mean and component standard deviation
values should be used in the z-score formula.
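A minimal sketch of the distinction, using made-up ratings for two hypothetical boards: standardizing by board uses each board's own mean and standard deviation, so a candidate who was average for their own board gets z = 0 regardless of how leniently or severely that board rated.

```python
import statistics

# Hypothetical oral ratings: board 1 rated more leniently than board 2.
boards = {1: [20.0, 25.0, 30.0], 2: [15.0, 20.0, 25.0]}

def z_by_board(score, board):
    """Standardize a rating using the candidate's own board's mean and S.D."""
    ratings = boards[board]
    return (score - statistics.mean(ratings)) / statistics.stdev(ratings)

# A 25 from lenient board 1 and a 20 from board 2 are both exactly average
# for their own boards, so both standardize to z = 0.
print(z_by_board(25, 1), z_by_board(20, 2))  # 0.0 0.0
```

Standardizing by component works the same way, except that the mean and standard deviation are computed over the whole candidate pool for that component rather than per board.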
Component Weight vs. Total Possible Points
It is important to make a distinction between the weight of a component and the maximum
possible points for a component. Examination results are typically reported on a 1000-point scale
and the weight of each component is expressed as a proportion of 100 percent (e.g., 50%, 60%,
etc.).
Component Weight
The weight of the component actually refers to the impact that a candidate’s performance on the
component will have in determining the total score. For example, if an examination has two
components that are weighted 50% each, a candidate’s performance on each component should
have equal impact in determining the total score. If an examination has two components
weighted 60% and 40% respectively, a candidate’s performance on the first component should
have 1½ times the impact of the performance on the second component.
Total Possible Points
‘Total possible points’ refers to the maximum raw scores that can be achieved for a component
and is not related to the weight of the component. For example, the total possible points for a
multiple choice examination component is obtained by determining the raw score that would be
achieved if every item on the examination were answered correctly. If the written component
consists of 150 questions, the total possible points for that component would be 150. For
oral/performance examinations, the total possible points would be obtained by determining the
score achieved if the highest possible rating were given on every KSA measured in the
examination.
Reporting Weighted Standard Scores
Since standard scores reflect performance in relationship to that of the entire candidate pool, the
maximum points that can be calculated for any given component is a function of three things:
1. The mean and standard deviation of the component’s raw score distribution;
2. The weight of the component; and,
3. The scale upon which the scores are listed.
Consequently, when reporting examination results as weighted standard scores, the maximum
scores that can be achieved for a component may be greater than, less than, or equal to the
component weight (when the weight is expressed as a proportion of the points on a 1000-point
scale.) For example, an oral examination that is rated on a 5-point scale and has a weight of 700,
a mean of 210, and a standard deviation of 60 could result in a maximum score of 770 for a
candidate whose z-score is 2.5.
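The 770 in that example can be checked directly. Once the z-score (2.5) has been computed from the raw distribution (mean 210, S.D. 60), only the conversion and weighting steps matter:

```python
# Checking the oral-exam example above (weight of 700 points = 70%).
z = 2.5                           # candidate's z-score in the raw distribution
converted = (z * 20) + 60         # converted standard score = 110.0
weight_factor = 70 / 100 * 10     # component weight 70% x 10 = 7
weighted = round(weight_factor * converted)
print(weighted)  # 770 -- exceeds the 700-point component weight
```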
To alleviate any confusion between component weights and the calculation and reporting of
candidate’s total scores or component scores, component weights are expressed as percentages
(e.g., 50%, 60%, etc.), and component and/or total scores are reported on an approximately
1000-point scale for all examinations.
Section VII – Page 9
EXAMINATION SCORING PROCEDURES
The scoring process for an examination involves several steps, from preparing and scanning the
answer and/or rating sheets to generating the tentative eligible list.
It is helpful for analysts to review the following reports and forms prior to scoring an
examination:
● Applicant Status Report
This ensures that all applicant data (i.e., social security number, ethnicity) have been recorded and that the qualification status for each applicant is accurate.
● Exam Scoring Profile
The purpose of the exam scoring profile is to assist the analyst in organizing requisite data for automated exam scoring. This document also functions as a record of each phase of the scoring process. All applicable sections of the profile for the component being scored must be completed.
● Answer/Rating Sheets
Check that all appropriate bubbles are marked, that erasures are as complete as possible, and that the scoring sheets are not crumpled or curled.
When you are ready to begin scoring, follow the instructions provided or consult with the
Department of Human Resources.
Section VII – Page 10
ANALYSIS OF EXAMINATION RESULTS
After the examination has been scored, the following reports are generated:
1. For examinations utilizing raters:
● Cutoff Score Analysis
● Scoring Reports
2. For multiple choice examinations:
● Item Analysis – identify questionable items and have them reviewed by Subject Matter Experts. There are three options for a questionable item: delete it, double key it, or leave it as is.
● Key Revisions
● Cutoff Score Analysis
● Scoring Reports
The reports are used to determine the passing point/cutoff score. For multiple-choice exams, the
passing point must be stated as a Weighted Standard Score.
When the passing point/cutoff score has been established and signed off by the Team Leader, the
scores can be uploaded.
Section VII – Page 11
SETTING THE PASSING POINT
The Federal Uniform Guidelines on Employee Selection state in General Principles Section V-H:
“Where cutoff scores are used, they should normally be set so as to be reasonable
and consistent with normal expectations of acceptable proficiency within the work
force. Where applicants are ranked on the basis of properly validated selection
procedures and those applicants scoring below a higher cutoff score than appropriate
in light of such expectations have little or no chance of being selected for employment,
the higher cutoff scores may be appropriate, but the degree of adverse impact
(as defined by EEOC guidelines) should be considered.”
The following factors should be considered in setting the passing point for examinations:
● Minimum level of competency required to perform the job
● Differential rates of performance by certain groups
● Anticipated number of vacancies during the life of the list
Analysts must review the examination scoring reports. The cutoff score analysis shows where
adverse impact occurs at the various score levels. Analysts should consult team leaders to
determine the passing point.
Analysts must submit the Examination Planning Form with the examination scoring reports to
the Team Leader for approval of the passing point. After the passing point is set and scores have
been uploaded, an Applicant Flow report should be generated.
Section VII – Page 12
CONVERSION TO THE 700 - 1000 POINT SCALE
Department of Human Resources policy requires that all eligible lists utilize a 1000-point scale,
with minimum acceptable scores set at no less than 700 points. To accomplish this, a linear
transformation formula is available to convert total weighted raw or standardized scores to the
new scale prior to the addition of promotive or veterans’ points. See the Sigma manual for
instructions on how to transform scores. If you are performing the conversion manually, the
following formula should be used for single-component exams. For multiple-component exams,
the constants will need to be changed in accordance with the component weights. In all cases,
scores should be rounded to the nearest whole numbers rather than extended to two decimal
places.
Conversion formula:
Converted Score = 700 + [(300 / Range of Scores) x (Observed Score – Minimum)]
Key:
Observed Score = total weighted raw or standard score as currently computed
Maximum = the highest observed score
Minimum = the observed score of the lowest-scoring candidate who passes (i.e., the passing point)
Range of Scores = Maximum – Minimum
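The single-component transformation can be sketched as follows; the passing point and top score below are made-up values for illustration:

```python
def convert_to_1000_scale(observed, minimum, maximum):
    """Linear transformation of a passing score onto the 700-1000 point scale."""
    score_range = maximum - minimum
    converted = 700 + (300 / score_range) * (observed - minimum)
    return round(converted)  # round to the nearest whole number per the manual

# Hypothetical scores: passing point 520, highest observed score 880.
print(convert_to_1000_scale(520, 520, 880))  # 700  (lowest passing candidate)
print(convert_to_1000_scale(880, 520, 880))  # 1000 (highest scoring candidate)
print(convert_to_1000_scale(700, 520, 880))  # 850
```

The transformation pins the passing point at 700 and the top observed score at 1000, with everything in between scaled linearly.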
Section VII – Page 13
ENTERING PROMOTIVE AND VETERAN’S POINTS
After converting examination scores to the 700 – 1000 point scale, analysts must enter merit and
service points for City employees entitled to them, and veteran’s points for applicants entitled to
them who submitted the “Application for Veteran’s Preference” form and verification with their
applications. These points are awarded only to passing candidates.
● Determine if candidates qualify for points and, if so, how many points.
● Enter points in the appropriate fields in Sigma.
● Total points should be checked before the list is ranked.
Section VII – Page 14
HOW THE MEAN AND STANDARD DEVIATION ARE COMPUTED
Note: Examination analysts will not need to do the following computations because the
automated examination scoring and analysis system will do them. The formulas are being
provided for informational purposes only.
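Since the formula sheet itself is not reproduced here, the following is a sketch of the standard definitions (the arithmetic mean, and the standard deviation based on squared deviations from the mean, as described in the glossary). The n – 1 denominator is an assumption; it is what matches the worked example in Table 2:

```python
import math

def mean(scores):
    """Arithmetic mean: the sum of the scores divided by the number of scores."""
    return sum(scores) / len(scores)

def standard_deviation(scores):
    """Sample S.D.: square root of the averaged squared deviations (n - 1)."""
    m = mean(scores)
    return math.sqrt(sum((x - m) ** 2 for x in scores) / (len(scores) - 1))

# The Table 2 distribution reproduces the manual's constants:
scores = [756.67, 769.17, 884.17, 910.83, 943.33]
print(round(mean(scores), 2), round(standard_deviation(scores), 2))  # 852.83 84.83
```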
Section VII – Page 15
GLOSSARY OF MEASUREMENT TERMS
Arithmetic Mean (Mean): A kind of average usually referred to as the mean. It is obtained by
dividing the sum of a set of scores by the number of scores.
Average: A general term applied to the various measures of central tendency. The three most
widely used averages are the arithmetic mean (mean), the median and the mode. When the term
‘average’ is used without specifying which one it is, the most likely assumption is that it is the
arithmetic mean.
Central Tendency: A measure of central tendency provides a single most typical score as
representative of a group of scores: the ‘trend’ of a group of measures as indicated by some type
of average, usually the mean or median.
Deviation: The amount by which a score differs from some reference value, such as the mean,
the norm, or the score on some other test.
Distribution (Frequency Distribution): A tabulation of the scores (or other attributes) of a
group of individuals to show the number (frequency) of each score, or of those within the range
of each interval.
Median: The middle score in a distribution or set of ranked scores; the point (score) that divides
the group into two equal parts; the 50th percentile. Half the scores are below the median and half
above it, except when the median itself is one of the obtained scores.
Mode: The score or value that occurs most frequently in a distribution.
N: The symbol commonly used to represent the number of cases in a group.
Normal Distribution: A distribution of scores or measures that in graphic form has a distinctive
bell-shaped appearance. In a normal distribution, scores or measures are distributed
symmetrically about the mean, with as many cases at various distances above the mean as below
it. Cases are concentrated near the mean and decrease in frequency, according to a precise
mathematical equation, the farther one departs from the mean. In a perfectly normal distribution,
the mean and median are identical.
Percentile: A point (score) in a distribution at or below which falls the percentage of cases
indicated by the percentile. Thus, a score coinciding with the 35th percentile (P35) is considered
as equaling or surpassing that of 35% of the persons in the group, and such that 65% of the
persons in the group exceed this score. ‘Percentile’ has nothing to do with the percentage of
correct answers an examinee makes on a test.
Range: For some specified group, the difference between the highest and lowest obtained score
on a test; thus it is a very rough measure of spread or variability, as it is based on only two
extreme scores. Range is also used in reference to the possible spread of measurement a test
provides, which in most instances is the number of items on the test.
Section VII – Page 16
Raw Score: The first quantitative result obtained in scoring a test.
Skewed Distribution: A distribution that is not symmetrical or balanced around the mean.
Scores pile up at one end and trail off at the other.
Standard Deviation (S.D.): A measure of the variability or dispersion of a distribution of
scores. The more the scores cluster around the mean, the smaller the standard deviation. For a
normal distribution, approximately two-thirds (68.2%) of the scores are within the range from
one S.D. below the mean to one S.D. above the mean. Computation of the S.D. is based on the
square of the deviation of each score from the mean.
z-score (Standard Score): The z denotes a standard score referenced to a normal distribution,
i.e., a z-score is a measure of deviation from the mean in terms of the standard deviation as the
unit. If x is a normally distributed variable with mean μ and standard deviation σ, then
z = (x – μ) / σ
Any value converted to a z-score is said to be normalized, i.e., rescaled to a value within a unit
normal distribution with mean 0 and standard deviation 1. The advantage of normalizing
disparate distributions is that doing so equates the various distributions to the same scale, thus
permitting direct comparison of previously non-homologous variables.
Variability: The spread or dispersion of test scores, best indicated by their standard deviation.
Variance: For a distribution, the average of the squared deviations from the mean – thus, the
square of the S.D.
Section VII – Page 17
SECTION VIII
Eligible Lists
Section VIII – Page 1
ELIGIBLE LISTS/REVIEW OF RATINGS
Eligible lists are produced through the Applicant Tracking System.
Analysts must:
● Generate the eligible list, noting any eligibles who are under examination-imposed restrictions (i.e., experience, license, or certificate) or candidates under waiver pending completion of the exam. Please see this section for the procedure for removal of restrictions and for information on testing candidates (such as deployed members of the armed forces) after the eligible list is adopted.
● Generate notices of examination results; note restrictions on the notices.
The purpose of the Review of Ratings is to confirm the accuracy of the calculation of
examination participants’ scores and/or rankings. The identity of the examiner giving any mark
or grade shall not be disclosed.
Review of Ratings may be conducted by mail, electronic mail, or on site. If conducted on site, the review period runs five business days, from 8:30 a.m. to 4:30 p.m. each day. The analyst may want to consult with DHR and/or the Examination Team Leader regarding the most appropriate method for conducting the Review of Ratings.
Unlike class based examinations, Position Based Testing Review of Ratings procedures do not
provide for review:
 By persons other than candidates in the examination process (except as required by the
Public Records Act)
 Of ratings other than your own (except as required by the Public Records Act)
 Of application packages
Under Position Based Testing, documents included for Review of Ratings generally include:
 A copy of the notice of examination results
 A copy of the written answer sheet, if applicable
 Copies of the oral rating sheets, if applicable
 Scoring key for written examinations, if applicable
 Information to candidates about how to calculate and convert their scores, if applicable
 A sign-in sheet (for onsite review only)
The Review of Rating period for Position Based Testing is five business days. If notices are to be
sent by postal service, mail notices of exam results two mailing days before the Review of
Rating period starts. If notices are to be sent by email, send notices of examination results a
minimum of one day prior to the Review of Rating period. If the candidates have not been given
an official notice of the duration of the eligible list, this is a good time to notify them of the
duration. Include the pamphlet describing the referral process with notices to successful
candidates. A sample of a Review of Rating letter is on the next page.
If you are conducting the Review of Rating by email or mail (off site review), include the
applicable documents.
If conducting an onsite Review of Rating, ensure that all materials are available at the review site
by 8:30 am on the first day of review period. Analyst should be ready to respond to questions.
After the period for Review of Rating:
The examination analyst must secure all review materials before posting the eligible list.
Prepare the Authorization to Adopt a List of Eligibles form and attachments, and post the eligible list after confirming that no appeals were received by DHR, the analyst, or the CSC; that any appeals received have been resolved; or that permission has been obtained from the Human Resources Director to post the eligible list pending resolution of the appeal(s). Submit the form and attachments to the DHR Support Services Unit for processing.
Section VIII – Page 3
SAMPLE
<DATE>
<APPLICANT NAME>
<ADDRESS>
<CITY>, <STATE> <ZIP>
<EMAIL ADDRESS>
Dear <TITLE> <LAST NAME>:
We are pleased to advise you that you have passed the examination for <CLASS> <TITLE>. Your score
is shown below. If you are entitled to promotive points, they are also shown. The certification rule applied
to this examination is <CERTIFICATION RULE> and the duration of the eligible list is <NUMBER>
months.
Position Based Test Points:
Service Points:
Merit Points:
Veteran Points:
TOTAL POINTS:
RANK ON THE LIST:
We have attached for your review the computation of your score as reflected on the Training and
Experience Evaluation sheet. Please note that your score is based on your responses on your application,
the Supplemental Application and any applicable verification documentation at the time that you filed for
this selection process. Any computational errors must be brought to my attention during the officially
designated review of rating period from <DATE> through <DATE>. In addition, you must notify us and
the Department of Human Resources of any change to your address, telephone numbers and/or email
address.
Please print and retain copies of this letter and the accompanying evaluation form for your records.
Congratulations and we wish you the very best in your career endeavors.
Sincerely,
<NAME>
<TITLE>
<PHONE NUMBER>
List ID:
Applicant ID:
Important Employment Information for Position Based Testing Examinations for the City and County of San Francisco, which specifies
announcement and application policies and procedures including applicant’s appeal rights, can be obtained at
http://www.sfgov.org/site/sfdhr_page.asp?id=46205 . Copies of this information can also be obtained at 44 Gough Street, San Francisco,
CA.
Section VIII – Page 4
REVIEW OF RATING APPEAL PROCEDURE
PURPOSE OF THE REVIEW OF RATING PERIOD:
The purpose of the review of ratings is to allow examination participants to confirm the accuracy
of the calculation of their score or ranking (Civil Service Rule 111A.22). Errors in calculations
should be corrected prior to posting and adopting the eligible list. Candidates who are negatively
affected by the correction of the calculation of a score should be notified of their new rank or
score.
Protests must be filed in writing in the office of the Department of Human Resources during the
review period of the eligible list. The decision of the Human Resources Director on appeals
regarding the accuracy of scores is final and may not be reconsidered.
OTHER POSSIBLE APPEALS DURING THE PERIOD FOR REVIEW OF RATING:
During the five (5) day period for Review of Rating, candidates may submit appeals regarding
inconsistencies in examination administration, bias of raters or failure of raters to apply uniform
standards.
Protests or appeals of this nature must be submitted directly to the Civil Service Commission. If
the analyst receives such protest or appeal, the analyst should inform the applicant to submit the
appeal directly to the Commission and/or the analyst may forward a copy of the appeal to the
Executive Officer, Civil Service Commission.
The analyst should attempt to resolve the appeal administratively. The analyst should consult
with their Examination Team Leader and/or DHR. If successful in resolving the appeal, the
analyst must notify the Executive Officer that the appeal has been resolved. The Executive
Officer will report the resolution of the appeal to the Commission.
If unsuccessful in resolving the appeal, the analyst must consult with the Examination Team
Leader and DHR to prepare for a presentation to the Civil Service Commission regarding a
response to the appeal. The analyst should bring to the Commission any documents relevant to
the appeal. The relevant documents may include specific job analysis documents, the
examination announcement, any letters or notices to applicants or candidates, examination
instructions to candidates, proctor instructions, etc. Analysts should not submit confidential
examination information to the Commission at the Commission meeting. The CSC will
determine if it wants to conduct an inspection of the examination. Consult with DHR and/or the
Executive Officer. The analyst should be prepared to submit nine copies of the materials with a
CSC Transmittal Form attached to each copy.
Analysts should refer to the section in this manual regarding appeals for information on
processing and responding to protests.
Section VIII – Page 5
AUTHORIZATION TO ADOPT A LIST OF ELIGIBLES
PURPOSE:
To initiate and document the adoption of the eligible list.
FORM:
Authorization to Adopt a List of Eligibles
PROCEDURE:
● Analyst must contact the Civil Service Commission, DHR Support Services and the Human Resources Director's office to determine if examination appeals were received during the Review of Rating period. If appeals were received, the analyst must consult the team leader to discuss the merits of the appeal(s) and decide the appropriate course of action.
● If no appeals were received, the analyst may continue to the next step.
● Analysts must complete the top of the Authorization to Adopt form and submit it to the team leader for signature.
● Submit the original and two (2) copies to the DHR/Support Services Unit with two (2) copies of the eligible list and one (1) copy of the announcement. The attached announcement should be the same as the one indicated on the eligible list, and should always be the last version of the announcement.
Section VIII – Page 6
AUTHORIZATION TO ADOPT A LIST OF ELIGIBLES
Submit to Support Services:
 Original and two copies of this form
 Two copies of eligible list
 One copy of announcement
Appeals received:  No   Yes
Comments: ____________________________________________________________
This list is located in:  Sigma   Job Apps   Converted from Sigma
Date: ___________ Analyst: _____________________Team/Department: ________
Class & Title: __________________________________________________________
List type:  Entrance   Combined Promotive and Entrance   Promotive
List #: _________ List ID#: _________________
List limit:  Limited   Unlimited   Scope: ________
# of Eligibles: _____________________________
Class Based Testing: Date posted: _________ Desired adoption date: ___________
Position Based Testing: Date posted and adopted:____________________________
(For Position Based Testing, the posting date and adoption date must be the same.)
Duration: _____________________________ Refusals allowed: ________________
Signature of Unit Manager: ___________________________ Date: ______________
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
(For Support Services use only)
Date rcvd: _____________________ Date adopted: __________________________
Class Based List Adoption Date:____________ PBT Posting and Adoption Date:_______
Forwarded to Certification: ________________________________________________
Adoption by: ________ Master Card by: _________ List Log: ________ File: _____
One copy of the Authorization to Adopt will be returned to the analyst, one copy will be
forwarded to Referral, and one copy retained by Support Services. Adoptions will be done
within 48 hours of receipt by DHR Support Services, or on the date desired if one is noted.
Section VIII – Page 7
RESTRICTION REMOVAL FORM
PURPOSE:
To notify Referral Office that requirements which had put the eligible under
waiver have been fulfilled and the restriction may now be removed.
FORM:
Restriction Removal Form
PROCEDURE:
● Analyst must prepare the form in duplicate.
● Original must be delivered to the Referral Office. Duplicate is attached to the application.
● If circumstances warrant, the Rules allow the immediate lifting of a waiver. This can be done only with approval of the Human Resources Director or designee (currently, the Referral Office manager). In this case, immediate removal of the waiver should be indicated on the form to call attention to it in the Referral Office.
Section VIII – Page 8
DEPARTMENT OF HUMAN RESOURCES
RESTRICTION REMOVAL FORM
ELIGIBLE'S NAME: __________________________________ PIN: ____________ RANK: ____
CLASS NO. & TITLE: _____________________________________ LIST ID NO: ___________
ANALYST NAME: ____________________________________ DATE: ______________________
The Department of Human Resources has determined that the general waiver _______________
may be removed from the eligible's record for the above-listed examination. This waiver was
imposed for:
EDUCATION:
 high school diploma
 transcript of college record
 baccalaureate degree
 transcript of graduate courses
 master's degree
 other: ______________________________________________________________
OTHER REQUIREMENTS:
 completion of experience requirement
 successful completion of physical agility test
 successful completion of background investigation
 submission of California Driver's license: type (class) of license: _______________
number: ____________________ expiration date: _______________________
 professional California registration or license:
type: ______________________________ expiration date: ________________
 keyboard skill
 shorthand skill
 other: ______________________________________________________________
Analyst Signature: ________________________________ Date Effective: ____________
Complete two copies of this form. Submit one either to DHR Support Services, if the list has not
yet been adopted, or to DHR Referral Unit if the list has been adopted. The other copy must be
attached to the application. If the application has already been sent to storage, copy should be
filed in the Active File.
Section VIII – Page 9
TESTING OF CANDIDATES AFTER AN ELIGIBLE LIST IS ADOPTED
Qualified candidates who cannot participate in examinations due to military service deployment,
disability, or other compelling reasons, may request to be tested at a later date.
Analysts must:
● Consult with team leader to determine if reason for not testing is significant enough to qualify
for a deferment of testing and if testing at a later date is feasible.
● Ensure appropriate documentation of the request and reason for deferment of testing is
obtained and retained for inclusion in the active file.
● Place the name of candidate(s) to be tested after the eligible list is adopted on the eligible list
below the lowest ranked candidate; place asterisks in the rank and points columns.
● Place a note on the bottom of the eligible list to indicate that the candidate is under waiver
pending successful completion of the examination and that the candidate will be ranked
according to total final score.
Section VIII – Page 10
CANDIDATE INFORMATION REQUIRED FOR SCORE INSPECTION
Examination analysts must supply calculation information sheets to candidates for score
inspection. If scores for an exam are standardized by board or by components, the following
information must be supplied to the candidate:
1. Cover sheet – Lists the type of component (i.e. oral, performance or multiple choice), the
component weight (as percentage), and the appropriate mean and standard deviation.
Standardization by board is done to account for rating differences caused by the use of
multiple boards. Candidates should be supplied with the component weight, their board
number, and the mean and standard deviation for their board only. Candidates should not be
shown the mean and standard deviation for boards other than their own. Thus, if six boards
were used, the analyst must create six separate cover sheets.
Standardization by component (population) is done to retain the effective weight of the
component in the calculation of the total score. Candidates should be supplied with the
component mean, the component standard deviation, and the component weight.
2. Instructions for computing scores - A step-by-step explanation of how the candidate’s score
was calculated.
3. Worksheet for computing scores – The exact formulas for the steps on the instruction sheet
and space for recording the calculation results.
When reporting results as weighted raw scores, information from Steps 2 and 3 above must be
supplied to candidates.
Sample copies of three different score inspection packets follow. The samples cover these
situations:
1. Oral/performance exam scores (rater generated) standardized by board
2. Oral/performance exam scores (rater generated) standardized by component
3. Multiple choice exam scores standardized by component
The worksheet can be edited to reflect the actual constants that will be used in the calculations
(e.g., means, standard deviations, KSA weights, and component weights). Step 6 of the
instruction sheet should only be used if the same standardization method is used across
components. Step 8 of the instruction sheet should not be used for entrance examinations.
Once inspection packets have been compiled, the accuracy of the figures that are being supplied
to the candidates must be verified. To do this, select two candidates per mean and standard
deviation supplied and follow the instructions in the inspection packet to hand calculate their
scores.
Section VIII – Page 11
Hand calculations should generate the same score as the computer; if they do not, the difference
should not affect a candidate's relative placement on the eligible list. The hand-calculated
score and the computer-generated score may differ by .01 to .02 because of the different
rounding criteria employed by the two methods. Analysts should review the eligible list scores
and be prepared to explain this difference to candidates whose scores are close to other
candidates' scores on the list. Differences greater than .02 generally indicate a computational
error. A common cause of computational error is the failure to carry negative numbers through
each step of the calculation.
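The rounding behavior described above can be illustrated with a short Python sketch. The raw score, board statistics, and rounding points below are illustrative assumptions, not values from an actual examination:

```python
# One pipeline rounds only at the end (as scoring software typically does);
# the other rounds the intermediate CSS (as a candidate working from the
# printed worksheet might). The final scores differ slightly.
raw_score, board_mean, board_sd = 25.0, 23.04, 6.33
component_weight = 0.55  # 55%

# "Computer" method: full precision until the final rounding step.
z = (raw_score - board_mean) / board_sd
computer_score = round(component_weight * 10 * (z * 20 + 60), 2)

# "Hand" method: round the Converted Standard Score before weighting.
css_hand = round(z * 20 + 60, 2)
hand_score = round(component_weight * 10 * css_hand, 2)

# Small discrepancies like this stem from rounding criteria, not error;
# differences larger than about .02 suggest a computational mistake.
difference = abs(computer_score - hand_score)
```

Running both pipelines side by side in this way is also a quick means of verifying an inspection packet before candidates arrive.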
It is not necessary to attach the three-sheet score inspection packet to each candidate's
application. Instead, contact the DHR Support Services Supervisor to determine the appropriate
number of copies of inspection packets needed (the number will vary depending on exam type and
size of the candidate pool). As candidates arrive for inspection, DHR Support Services staff
will give each candidate the correct paperwork.
Section VIII – Page 12
CANDIDATE INSPECTION INSTRUCTIONS
Class 0010 Mine Worker I
ORAL INTERVIEW
(Rater Generated Scores)
The attached sheets contain instructions for computing your score for the oral interview
component of the examination for class 0010 Mine Worker I. Scores for the oral interview were
standardized to account for differences caused by the use of multiple interview boards.
You were seen by Board 2 and should use the following information when computing your score
for this component.
Board 2 mean: 23.04
Board 2 standard deviation: 6.33
Component weight: 55%
SAMPLE
Section VIII – Page 13
Candidate’s Instructions for Computing
ORAL INTERVIEW
BOARD STANDARD SCORES
Use the following steps to compute your eligible list score:
Step 1:
Add the ratings for a KSA together and multiply this number by the weight of the
KSA to get your Weighted KSA Rating. Repeat for each KSA.
Step 2:
Add the Weighted KSA Ratings and divide this figure by the Number of Raters to
get your Raw Score.
Step 3:
Subtract the Board Mean from your Raw Score and divide this value by the Board
Standard Deviation to get your z-Score.
Step 4:
Multiply your z-Score by 20 and then add 60 to get your Converted Standard Score
(CSS).
Step 5:
Multiply the Component Weight by 10, and then multiply this figure by your CSS to
get your Weighted Standard Score for this component.
Step 6:
Repeat Steps 1 through 5 for each exam component.
Step 7:
Add the Weighted Standard Scores for each component together to get your Total
Examination Score.
Step 8:
Add promotive points and/or veteran’s preference points (if applicable) to your Total
Examination Score to get your Total Points.
Step 9:
Convert to the 700-1000 Point Scale
* 700 + [{300/Range of scores} x {Observed score–Minimum score}] = Converted Score
Maximum: The highest observed score
Minimum: The observed score of the lowest scoring candidate who passes (i.e. the passing point)
Range of Scores: Maximum – Minimum
Observed Scores: Candidate’s weighted raw or standard score
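For analysts assembling or verifying an inspection packet, Steps 1 through 5 and the Step 9 conversion can be sketched in Python. All ratings, KSA weights, and statistics below are hypothetical illustrations, not values from an actual examination:

```python
# Hypothetical example: four raters (A-D) scoring three KSAs.
ratings = {  # KSA -> ratings from Raters A, B, C, D
    "KSA-A": [4, 5, 4, 5],
    "KSA-B": [3, 4, 4, 3],
    "KSA-C": [5, 5, 4, 4],
}
ksa_weights = {"KSA-A": 2.0, "KSA-B": 1.5, "KSA-C": 1.0}
num_raters = 4
board_mean = 23.04       # from the candidate's cover sheet
board_sd = 6.33
component_weight = 0.55  # 55%

# Steps 1-2: weighted KSA ratings, then the raw score.
weighted = {k: sum(r) * ksa_weights[k] for k, r in ratings.items()}
raw_score = sum(weighted.values()) / num_raters

# Steps 3-5: z-score, converted standard score, weighted standard score.
# Note: the z-score may be negative; keep its sign through every step.
z_score = (raw_score - board_mean) / board_sd
css = z_score * 20 + 60
weighted_standard_score = component_weight * 10 * css

# Step 9: convert a passing Total Points figure to the 700-1000 scale.
def to_700_1000(observed, minimum, maximum):
    return 700 + (300 / (maximum - minimum)) * (observed - minimum)
```

Note that a candidate whose raw score falls below the board mean produces a negative z-score, which is the common source of hand-calculation errors mentioned earlier in this section.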
Section VIII – Page 14
Candidate’s Worksheet for Computing
ORAL INTERVIEW
BOARD STANDARD SCORES
(STEP 1)
RAW SCORE FORMULA

         Rater A    Rater B    Rater C    Rater D     Total Ratings    KSA Weight    Weighted KSA Rating
KSA-A    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-B    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-C    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-D    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-E    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-F    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-G    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-H    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-I    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-J    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________

(STEP 2)
Total of all Weighted KSA Ratings ÷ Number of Raters = Raw Score: _____
BOARD STANDARD SCORE FORMULA
(STEP 3)
(Raw Score - Board Mean) ÷ Board Standard Deviation = z-Score
(STEP 4)
(z-Score X 20) + 60 = Converted Standard Score (CSS)
(STEP 5)
(Component Weight X 10) X CSS = Weighted Standard Score (Observed Score)
(STEP 6):
Convert to the 700-1000 Point Scale
700 + [{300/Range of scores} x {Observed score–Minimum score}] = Converted Score
(STEP 7):
Add promotive or veteran’s points if applicable
Section VIII – Page 15
CANDIDATE INSPECTION INSTRUCTIONS
Class 0010 Mine Worker I
PERFORMANCE
(Rater Generated Scores)
The attached sheets contain instructions for computing your score for the performance exercise
component of the examination for class 0010 Mine Worker I. Scores for the performance
exercise were standardized to retain the effective weight of the component in the calculation of
your total score.
Please use the following information when computing your score for this component:
Component mean: 56.04
Component standard deviation: 10.3
Component weight: 15%
SAMPLE
Section VIII – Page 16
Candidate’s Instructions for Computing
PERFORMANCE
COMPONENT STANDARD SCORES
(Rater Generated Scores)
Use the following steps to compute your eligible list score:
Step 1:
Add the ratings for a KSA together and multiply this number by the weight of the
KSA to get your Weighted KSA Rating. Repeat for each KSA.
Step 2:
Add the Weighted KSA Ratings together and divide this figure by the Number of
Raters to get your Raw Score.
Step 3:
Subtract the Component Mean from your Raw Score and divide this value by the
Component Standard Deviation to get your z-Score.
Step 4:
Multiply your z-Score by 20 and then add 60 to get your Converted Standard Score
(CSS).
Step 5:
Multiply the Component Weight by 10, and then multiply this figure by your CSS to
get your Weighted Standard Score for this component.
Step 6:
Repeat Steps 1 through 5 for each exam component.
Step 7:
Add the Weighted Standard Scores for each component together to get your Total
Examination Score.
Step 8:
Add promotive or veteran’s points (if applicable) to your Total Examination Score to
get your Total Points.
Step 9:
Convert to 700-1000 point scale (* see formula).
* 700 + [{300/Range of scores} x {Observed score–Minimum score}] = Converted Score
Maximum: The highest observed score
Minimum: The observed score of the lowest scoring candidate who passes (i.e. the passing point)
Range of Scores: Maximum – Minimum
Observed Scores: Candidate’s weighted raw or standard score
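The arithmetic behind Steps 5 and 7, which is what retains each component's effective weight in the total score, can be sketched in Python. The CSS values and component labels below are hypothetical:

```python
# Hypothetical CSS values; each component has already been standardized
# to a mean of 60 and a standard deviation of 20 (Steps 3-4).
components = [
    ("oral interview",       58.2, 0.55),  # (name, CSS, weight)
    ("performance exercise", 63.7, 0.15),
    ("multiple choice test", 61.0, 0.30),
]

# Step 5 for each component: (Component Weight x 10) x CSS.
weighted_scores = {name: weight * 10 * css for name, css, weight in components}

# Step 7: sum the Weighted Standard Scores to get the Total Examination Score.
total_examination_score = sum(weighted_scores.values())
```

Because every component is on the same standardized scale before weighting, each one contributes to the total in proportion to its published weight.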
Section VIII – Page 17
Candidate’s Worksheet for Computing
PERFORMANCE
Component Standard Scores
(STEP 1)
RAW SCORE FORMULA

         Rater A    Rater B    Rater C    Rater D     Total Ratings    KSA Weight    Weighted KSA Rating
KSA-A    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-B    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-C    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-D    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-E    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-F    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-G    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-H    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-I    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________
KSA-J    _____  +   _____  +   _____  +   _____   =   _____        X   _____     =   __________

(STEP 2)
Total of all Weighted KSA Ratings ÷ Number of Raters = Raw Score: _____
COMPONENT STANDARD SCORE FORMULA
(STEP 3)
(Raw Score - Component Mean) ÷ Component Standard Deviation = z-Score
(STEP 4)
(z-Score X 20) + 60 = Converted Standard Score (CSS)
(STEP 5)
(Component Weight X 10) X CSS = Weighted Standard Score (Observed Score)
(STEP 6):
Convert to the 700-1000 Point Scale
700 + [{300/Range of scores} x {Observed score – Minimum score}] = Converted Score
(STEP 7):
Add promotive or veteran’s points if applicable
Section VIII – Page 18
CANDIDATE INSPECTION INSTRUCTIONS
Class 0010 Mine Worker I
MULTIPLE CHOICE TEST
The attached sheets contain instructions for computing your score for the multiple choice test
component of the examination for class 0010 Mine Worker I. Scores for the multiple choice test
were standardized to retain effective weight of the component in the calculation of your total
score.
Please use the following information when computing your score for this component:
Component mean: 76
Component standard deviation: 5
Component weight: 30%
SAMPLE
Section VIII – Page 19
Candidate’s Instructions for Computing
Component Standard Scores
MULTIPLE CHOICE TEST
Use the following steps to compute your eligible list score:
Step 1:
Subtract the Component Mean from your Number of Correct Items and divide this
value by the Component Standard Deviation to get your z-Score.
Step 2:
Multiply your z-Score by 20 and then add 60 to get your Converted Standard Score
(CSS).
Step 3:
Multiply the Component Weight by 10, and then multiply this figure by your CSS to
get your Weighted Standard Score for this component.
Step 4:
Add the Weighted Standard Scores for each component to get your Total
Examination Score.
Step 5:
Convert to 700-1000 point scale (*see formula).
Step 6:
Add promotive or veteran’s points (if applicable) to your Total Examination Score to
get your Total Points.
* 700 + [{300/Range of scores} x {Observed score–Minimum score}] = Converted Score
Maximum: The highest observed score
Minimum: The observed score of the lowest scoring candidate who passes (i.e. the passing point)
Range of Scores: Maximum – Minimum
Observed Scores: Candidate’s weighted raw or standard score
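The computational steps above reduce to a few lines of Python. The number of correct items below is a hypothetical candidate result; the mean, standard deviation, and weight are taken from the sample cover sheet:

```python
num_correct = 82         # hypothetical candidate result
component_mean = 76      # from the sample cover sheet
component_sd = 5
component_weight = 0.30  # 30%

z_score = (num_correct - component_mean) / component_sd  # Step 1
css = z_score * 20 + 60                                  # Step 2
weighted_standard_score = component_weight * 10 * css    # Step 3

# Steps 4-6: sum this with the other components' Weighted Standard
# Scores, convert to the 700-1000 scale, then add any promotive or
# veteran's preference points.
```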
Section VIII – Page 20
Candidate’s Worksheet for Computing
Component Standard Scores
MULTIPLE CHOICE TEST
STANDARD COMPONENT SCORE FORMULA
(STEP 1)
(Number of Correct Items - Component Mean) ÷ Component Standard Deviation = z-Score
(STEP 2)
(z-Score X 20) + 60 = Converted Standard Score (CSS)
(STEP 3)
(Component Weight X 10) X CSS = Weighted Standard Score (Observed Score)
(STEP 4):
Convert to the 700-1000 Point Scale
700 + [{300/Range of scores} x {Observed score – Minimum score}] = Converted Score
Section VIII – Page 21
EXTENSION OF ELIGIBLE LISTS
PURPOSE:
To allow for the extension of eligible lists nearing expiration.
FORM:
Eligible List Extension Form
AUTHORITY: The Human Resources Director may extend the duration of an eligible list or
eligibility periods for individuals on the eligible list based on the needs of the City or merit
factors. Any extension of the eligible list or eligibility period shall occur prior to the expiration
date with the exception of correcting errors. The maximum duration of the eligible list shall not
exceed forty-eight (48) months. Affected eligibles will be notified of the extension of the
eligible list or eligibility period.
PROCEDURE:
For classes with eligible lists nearing the expiration date, an assessment should be made of
the utility of extending lists. Factors to consider in making this determination are:
• Number of original eligibles
• Number of remaining eligibles
• Qualifications of remaining eligibles to meet the specific demands of the job
• Anticipated hiring needs of departments utilizing the class
The final decision on whether to extend the eligible list is made by the Human Resources
Director. A recommendation by the analyst should be included in the comments section
of the form.
The form is completed by analysts and forwarded to DHR, Recruitment and Assessment
Services for review, then forwarded to the Human Resources Director for review and
approval or denial. If approved, the signed form is sent to DHR Support Services staff.
Section VIII – Page 22
Request for Extension of Eligible List
Date:
To: Support Services
Through: Deputy Director, HR, Recruitment and Assessment Services
From:
Please process the extension of duration for the following eligible list: (copy of authorization attached)
1. Original List Information
Class Number:
List Type
and No.:
Title:
This list is: ***
Adoption Date:
Limit: ***
List ID No.
Scope(Specialty-if applicable):
Original Expiration Date:
Duration of Eligibility:
Number of Eligibles when adopted:
2. Current Request
Extend List/Eligible for:
New Expiration Date:
No. of Eligibles Remaining:
3. Eligible List Extension History
Previous Extension? ***
No. of Eligibles Remaining:
Duration of Extension:
Adjusted Expiration Date:
Approved By: ***
4. Comments: (Attach additional sheets if necessary)
5. Request Approved: Human Resources Director
By: ____________________________________________ Date: ________________
Print: __________________________________________
For use by Support Services only:
Processed By: ___________________
(Name/Phone No.)
□ Master Card _______
□ Master List _______
□ List Detail _______
□ Eligible Record _______
□ Extension Index _______
Date forwarded to Referral _______________
Notify RAS of Extension via e-mail ________
Section VIII – Page 23
SECTION IX
Exam Documentation, Finalization & Storage
Section IX – page 1
EXAMINATION REPORT
PURPOSE:
To document information about an examination that has been completed.
FORM:
Examination Final Report
PROCEDURE:
Fill out the form and include it with the active file.
Send a copy of the report to DHR.
Section IX – page 2
EXAMINATION FINAL REPORT
Examination for Class:
Dates Held:_________________________
Department:________________________________Division/Unit:________________________
Analyst:_______________________________________________________________________
Team Leader Approval: _________________________________________________________
Summary of the Examination Components:
Summary of the Examination Administration:
Summary of Problems, Protests and/or Appeals; Explanation of resolutions:
Summary of Recruitment and Applicant Flow:
Summary of Successes during the examination:
Recommendations for future exams:
Section IX – page 3
ACTIVE FILE AND STORAGE MATERIALS
The final step in the examination process is to organize the examination materials for inclusion
in the Active File and storage of supporting materials. This should be done immediately after the
adoption of the eligible list.
Analyst must:
1. Collect/compile all Review of Ratings materials.
2. Follow Checklist for Active File Folder to create the active file.
3. Follow Checklist for Examination Materials Storage to create storage (“sack”) file/box.
4. Submit completed folders to team leader for review and signature.
5. Secure folders and/or boxes in a locked file within the Human Resources division office.
6. The records must be retained for at least five years.
Section IX – page 4
CHECKLIST FOR ACTIVE FILE FOLDER AND EXAMINATION
MATERIALS STORAGE
PURPOSE:
To list all examination materials that must be retained in the active file and
storage file for future reference and for conformance with the DHR Record
Retention and Destruction policy.
FORMS:
Checklist for Active File Folder
Checklist for Examination Materials Storage
PROCEDURE:
• Analysts should use these checklists to organize examination materials.
• Submit to team leader for signature.
Section IX – page 5
Checklist for Active File Folder
Class #:
Title:
List #:
Department:
Division:
Unit:
Announcement #:
Announcement Issue Date:
Date List Adopted:
List ID#:
Items should be filed in this folder in the order listed below. Use this checklist to ensure that all materials are included.
1. Authorization to adopt eligible list
2. Final applicant flow data sheet
3. Copy of examination announcement, both original and amended
4. Copy of eligible list, both original and amended
5. Examination report or file memorandum
6. Exam Planning forms
7. Protests and responses; appeals and responses
8. CSC reports, including all correspondence and action taken
9. Copy of supplemental application (include rating guidelines)
10. Correspondence with union representatives, boards, commissions, or appointing officers
11. Job analysis (file supporting documentation in examination materials storage box)
12. Subject matter expert background qualification forms
13. Recruitment plan, if separate from the Exam Planning form
14. Rater background information and qualifications
15. Information for candidates (Notice of Names of Oral Board or Performance Raters)
16. List of potential raters – address, phone, race, sex
17. Questions and guidelines for oral/performance examination
18. Sample of rating grids
19. Proctor instructions for complex exams
20. Instructions for computing scores
21. Analysis of variance between rating panels, if applicable
22. Departmental briefing letter (letter sent to Oral Boards)
23. Multiple choice examination scoring key
24. Copy of written examination (including key) in locked file cabinets
25. Protests, counter-protests, responses, sign-in roster, notice to protestants in written examination
26. Item analysis
27. Cut-off score analysis
28. Research materials from other agencies, if applicable
29. Examples of rating sheets
30. Checklist for examination materials storage
31. Master Card Information:
    # of applicants: _____ # of qualified: _____ # of participants: ______
    # of eligibles on list: ______ Duration of list: ____ months Certification rule: _________________
32. SACK BOX NUMBER ___________ (To be completed, or obtain number from DHR, Support Services, if applicable)
Analyst Signature
Date
Team Leader Signature
Date
cc: Examination Material Storage Box
Section IX – page 6
Checklist for Examination Materials Storage
Class #:
Title:
Department:
Division/Unit:
Announcement #:
List #:
Issue Date:
Date Adopted:
List ID #:
The examination materials must be prepared for storage immediately following the
adoption of the eligible list. Remove and discard unnecessary materials, such as
envelopes, notices, etc. Use this checklist to ensure that all necessary materials are
included.
□ Copy of Eligible List
□ Copy of Announcement
□ Applicant Flow Data (final)
□ Inspection Sign-in Sheets
□ Job Analysis and back-up material (worksheets)
□ Copy of exam and rating guidelines
□ Applications of eligibles
□ Applications of persons not on eligible list
   □ Rejects/Not Best Qualified
   □ No-Show Written
   □ Failed Written
   □ No-Show Oral
   □ Failed Oral
   □ Others
□ Copy of Information for Candidates
□ Completed Request for Verification of Performance
□ Written Answer Sheets
□ Oral Rating Sheets
□ Other Rating Sheets (e.g., Performance, etc.)
□ Applicant Survey Tear-offs
□ Copy of Checklist for Active File Folder
Analyst’s Signature
Date
___________________________________________
Team Leader’s Signature
Date
Section IX – page 7
SECTION X
Appeals
Section X – Page 1
SECTION X – APPEALS
TABLE OF CONTENTS
Page
Overview – Appeals to the Civil Service Commission…………………………………………4
Appeal Points to the Civil Service Commission………………………………………………4
General Requirements for Appeals to the Civil Service Commission…………………………4
Procedural Requirements for Appeals to the Civil Service Commission………………………4
Appeals of the Examination Announcement……………………………………………………5
Additional Appellant Requirements when Filing Announcement Appeals……………………5
Standard of Review by the Civil Service Commission…………………………………………5
Strategies for Handling Appeals Related to the Exam Announcement…………………………5
Strategies for Handling Appeals Regarding Job Description…………………………………5
Strategies for Handling Appeals Regarding Minimum Qualifications…………………………6
Tips on How to Avoid an Appeal on the Examination Announcement…………………………6
Appeals to the Civil Service Commission after the Test Administration………………………7
Items Appealable to the Civil Service Commission after the Examination is Administered……7
Additional Appellant Requirements when Filing Appeals after the Test………………………7
Standard of Review by Civil Service Commission……………………………………………7
Strategies for Handling Appeals Involving Rater Bias, Failure to Apply
Uniform Standards or Inconsistency in Exam Administration…………………………………7
Tips on How to Avoid an Appeal Regarding Rater Bias, Failure to Apply
Uniform Standards in Exam Administration……………………………………………………8
Additional Tips for Protests or Appeals Involving either Bias or Failure
to Apply Uniform Standards……………………………………………………………………9
Section X – Page 2
Page
Appeals to the Human Resources Director……………………………………………………10
Strategies for Handling Protests of Rejection of Application based on
Failure to Meet Minimum Qualifications……………………………………………………10
Strategies for Handling Challenges to Adequacy of the Examination…………………………11
Tips on How to Avoid an Appeal of Adequacy of Examination………………………………12
Strategies for Handling Protests and Appeals Related to Scores………………………………12
Strategies for Handling Challenges to the Qualifications of Eligibles…………………………12
Section X – Page 3
OVERVIEW – APPEALS TO THE CIVIL SERVICE COMMISSION
Under Position Based Testing there are three points of appeal to the Civil Service
Commission. All other appeals are to the Human Resources Director, whose decision on
those matters is final.
For appeals filed to the Civil Service Commission, appellants must meet certain general
requirements. Additionally, each of the three appeal points has specific requirements
based on the type of appeal.
Appeal Points to the Civil Service Commission
(1) After the examination announcement has been issued;
(2) After the examination has been administered and prior to the posting of the
eligible list; and
(3) After the merging of eligible lists in different classes.
General Requirements for Appeals to the Civil Service Commission
The appellant must:
• state the specific grounds upon which the appeal is based;
• cite the specific Civil Service Commission Rule or Department of Human Resources
Policy that the appellant contends was violated by the action which is the subject of
the appeal;
• provide facts, including available documents, to support the appeal; and
• show a rational relationship between the alleged injury suffered by the appellant as a
result of the action being appealed and the alleged violation of Rule or Policy.
Failure to meet all of the above requirements MAY be sufficient grounds for the Civil
Service Commission to deny the appeal. The Civil Service Commission does, however,
reserve for itself the flexibility to hear an appeal even if the appellant does not meet all of
the requirements for stating the appeal.
Procedural Requirements for Appeals to the Civil Service Commission
• Must be submitted in writing.
• Must be submitted directly to the Executive Officer of the Civil Service Commission.
• Must be received in the Civil Service Commission office by close of business on the
fifth (5th) business day after the examination announcement issuance date.¹
Note: An appellant may submit an announcement appeal only to the HR analyst or DHR,
instead of to the CSC. In such cases, an attempt should be made to resolve the appeal at
that level. If unsuccessful, provide appellant with appeal rights to the CSC.
¹ See Appendix VII – Civil Service Commission Procedure One – for Appeals and Requests for Hearing
Section X – Page 4
APPEALS OF THE EXAMINATION ANNOUNCEMENT
Appeals of the examination announcement may be based only on challenges to the
position description and/or the minimum qualifications.
Additional Appellant Requirements when Filing Announcement Appeals
• Include a statement of the specific component(s) or item(s) of the examination
announcement being contested.
• Include specific reason(s) why inclusion of the cited portions of the examination
announcement constitutes abuse of discretion (see below) by the Human Resources
Director.
• To the extent possible, all supporting documentation must be submitted with the
written appeal.
Standard of Review by the Civil Service Commission
The standard of review for appeals under this Section shall be abuse of discretion in
establishing the position description, the minimum qualifications and/or the certification
rule when the certification rule was not reached by mutual agreement with the employee
organization representing the tested class. In determining abuse of discretion, the Civil
Service Commission must find that the Human Resources Director made decisions
beyond his/her authority or had no rational basis for his/her decision.
Strategies for Handling Appeals Related to the Exam Announcement
1. Try to determine the exact nature of the problem or concern from the appellant’s
perspective. Try to get an understanding of how the problem impacts the appellant.
2. Attempt to answer the appellant’s questions and provide explanations of
examination rules, policies and procedures relevant to the announcement.
3. Look for ways to resolve the appeal administratively.
4. Retrieve documentation which confirms notification was sent to the union (e.g., if
notification was about certification rule or if a courtesy copy of the announcement
was sent prior to announcement issuance). [A signed hardcopy of the notification
should be on file or, if the notification is sent via email, a ‘pdf’ of the scanned
signed copy should be retained.]
5. Confer with DHR’s Recruitment and Assessment Unit on all protests and appeals.
Strategies for Handling Appeals Regarding Job Description
1. Retrieve job analysis documents, JAQ, class specification, old examination
announcements, and/or classification posting notice, if relevant. Review the
SME(s) background form for the SME(s) who defined the job.
Section X – Page 5
2. Demonstrate that the position description is consistent with the class to which it
was allocated and if possible, demonstrate that there was no protest of the position
description (classification posting).
3. Where the position description deviates from the general class description,
demonstrate that the specific position description was developed by qualified and
knowledgeable Subject Matter Experts and based on operational needs of the
department.
4. Consult with Subject Matter Experts and union regarding disagreement with job
description.
Strategies for Handling Appeals Regarding Minimum Qualifications
1. Retrieve job analysis documents, JAQ, class specification, old examination
announcements, linkage documents, MQ form, and/or classification posting
notice, if relevant. Review the SME background form for the SME who
defined/approved the minimum qualifications.
2. Review the history of the minimum qualifications by reviewing previous class
based examination announcements. If the MQs have changed, demonstrate the
linkage between the MQs and the KSAs.
3. Demonstrate that the new or modified MQs were developed with qualified SMEs
and based on the KSAs related to the essential functions of the position.
4. Confer with DHR’s Recruitment and Assessment Unit on all protests and appeals.
Tips on How to Avoid an Appeal on the Examination Announcement
• Follow the “Standard Text for All Job Announcements” presented in Section III
of this manual.
• Keep language (terms, phrases, descriptions) simple; use short, simple sentences
and avoid overly long and complex ones.
• Ensure that terminology for licenses, certificates, degrees, etc. is accurate and
current.
• Ensure that all MQs link back to the KSAs required for the job.
• Challenge SMEs to demonstrate that additional years of experience, higher
education, or other additional requirements are necessary as MQs rather than
desirable qualifications.
• Check MQs against the next lower and next higher class, or other classes that are
equivalent in level.
• Have a colleague read the MQs for clarity and consistency in interpretation.
Section X – Page 6
APPEALS TO THE CIVIL SERVICE COMMISSION AFTER THE TEST
ADMINISTRATION
Items Appealable to the Civil Service Commission after the Examination is
Administered
• Claims of inconsistency in examination administration
• Bias of raters, and/or
• Failure of raters to apply uniform standards
Note: Appeals may NOT be based on objections to ratings or rankings based solely
on the candidate’s belief that he or she is entitled to a higher or passing score.
Neither the Human Resources Director nor the Civil Service Commission shall
substitute his, her or its judgment for the judgment of qualified raters.
Additional Appellant Requirements when Filing Appeals after the Test
1. Must provide specific facts that demonstrate how the alleged inconsistency affects
the validity or reliability of the selection procedure.
2. Cite the specific Civil Service Rule or Department of Human Resources Policy
that was violated.
Standard of Review by Civil Service Commission
In order to prevail on an appeal under this Section, the appellant must establish by a
preponderance of the evidence, i.e. more likely than not, that the Rule or Policy at
issue was violated and that the violation caused the validity or reliability of the
examination to be compromised.
Strategies for Handling Appeals Involving Rater Bias, Failure to Apply Uniform
Standards or Inconsistency in Exam Administration
1. Determine the exact nature of the problem or concern from the appellant’s
perspective. Try to get an understanding of how the problem impacts the appellant.
2. Attempt to answer the appellant’s questions and provide explanations of rules,
policies and procedures relevant to the examination. Gather information from the
candidate about what happened that led the candidate to believe that the rater was
biased.
3. Look for ways to resolve the appeal administratively.
4. Retrieve and review documentation that addresses the appellant’s complaint such
as the SME Qualifications forms, Security Agreement and Statement of
Responsibility forms, candidate notice letter or preparation materials, oral or
video recordings, rating sheets, proctor, candidate or rater instructions, examiner
training or orientation materials, behaviorally anchored rating scales. [Note: Keep
Section X – Page 7
test questions, answer keys and ratings scales confidential. Do not share these
materials with the CSC without first consulting DHR.]
5. Evaluate whether ratings or rater’s actions were appropriate or consistent based
on scoring guidelines, training, specific circumstances, etc. For example,
demonstrate that the same procedures were followed by (each) proctor/rater/exam
administrator for every candidate. Demonstrate that each rater participated in the
orientation and received instructions for administering the exam, if applicable.
6. Be prepared to explain (and show documentation if necessary) if the test
administration provided for some planned flexibility or if the minor changes in the
examination administration did not affect the validity or reliability or impede the
candidate from demonstrating his/her knowledge, skill and/or ability. (For
example, one candidate may perceive another candidate’s ADA accommodation
as being unfair and inconsistent administration of the examination.)
7. Confer with DHR’s Recruitment and Assessment Unit on all protests and appeals.
Tips on How to Avoid an Appeal Regarding Rater Bias or Failure to Apply Uniform
Standards in Exam Administration:
• Provide written instructions to proctors, candidates (if necessary) and raters.
• Follow common templates for administration of examinations (this helps to avoid
mistakes) and the general test administration and proctor guidelines outlined in the
RAS manuals.
• Train or orient proctors, raters, etc. This should include not only technical training
regarding the exam itself but also a discussion of proper rater etiquette, e.g.,
staying engaged with the candidate; staying alert; avoiding facial expressions that
may indicate approval or disapproval; avoiding unnecessary chatter with the
candidate or among raters. Also, have raters review the Oral Performance
Examination Manual, which includes how they can avoid common errors
(Appendix VI).
• Post all appropriate notices. This is a requirement, not a tip!
• Go to the exam site prior to, or as early as possible on, the day of the exam to get
an early warning of any problems with the physical layout.
• Carefully monitor and observe the test administration as it takes place and resolve
problems immediately.
• Be sure to inform candidates prior to the start of the exam, preferably in writing,
that they need to present any protest involving the manner in which the test is
administered at the test center before they leave. The purpose of this notification is
to allow the test administrator to address candidate concerns in a timely manner.
Once a candidate leaves the test center there may be no remedy to address the
candidate’s protest.
• If a candidate complains, immediately seek to understand and resolve the problem
in a way that does not compromise the exam. If the candidate is not satisfied with
the proposed resolution, have the candidate put the complaint in writing (as a
protest), but require the candidate to follow your instructions for continuing in the
exam.
Section X – Page 8
• Make sure all unusual circumstances involving the test are documented
immediately so that there is a record available in the event there is a need to
reconstruct what occurred during the test administration. [Use the Report on
Conduct of Examination form, available in the Proctor Manual, for this purpose.]
Additional Tips for Protests or Appeals Involving either Bias or Failure to Apply
Uniform Standards
Have ready the Subject Matter Expert and Rater Qualifications Background Form
in case there is a challenge to the qualifications of the rater that led to the bias of
the rater.
Use written, behaviorally anchored rating scales to help address claims of
bias stemming from a rater’s background (e.g., private sector vs. public
sector, or jurisdictional operational differences).
During the tape-recorded introductory remarks of an oral examination, ask the
candidate if s/he has any objections to any of the members of the oral examination
board. If the candidate voices an objection, the test administrator should find an
alternate examiner to rate the performance of that candidate. If another rater is
unavailable and the panel consists of three raters, remove the rater who is the
focus of the objection and allow the oral to proceed with two raters.
Whenever possible, raters should not know the names of candidates; instead,
candidates should be identified by their candidate identification numbers.
Review the ratings assigned by the allegedly biased rater(s) (or the raters
who failed to follow standards), and by other raters as well, to assess any
discrepancies.
Always record the oral performance examination.
Instruct the raters to allow the same amount of time for each candidate and to
keep the same environment to avoid perceptions of failure to follow uniform
standards.
NOTE: Appeals of merging of eligible lists will be addressed by the Department of
Human Resources. DHR staff may request full documentation of the exam process in
defending appeals regarding the merging of eligible lists. All examination analysts are
required to retain and store all the documents, as indicated in this examination manual.
APPEALS TO THE HUMAN RESOURCES DIRECTOR
Protests, complaints, and challenges regarding announcements and examinations are
often submitted first to the examination analyst. The analyst or the analyst’s supervisor
should notify and consult with RAS (Director or Managers) about the complaint prior to
responding. The response denying the protest should inform the
applicant/candidate of the right to appeal to the HR Director. The following
issues may be appealed to the Human Resources Director (but not to the CSC)
with respect to PBTs:
Rejection of applicant for not meeting the minimum qualifications.
Disqualification of applicant at some point during the selection process.
Qualifications of an applicant or eligible.
Score or ranking on the list (if not based on rater bias or inconsistency in test
administration).
Cheating, aid or hindrance.
Qualifications of raters.
Challenge of a selection from the eligible list.
Selection from the eligible list.
If the appellant is not satisfied with the analyst’s response and submits the protest to the
Human Resources Director, that protest must be submitted in writing and received by the
Human Resources Director no later than the fifth business day after the analyst’s
response. If the appellant does not wait for the analyst’s response and immediately
protests to the Human Resources Director, that protest must be submitted in writing and
received by the Human Resources Director no later than the fifth business day after the
occurrence or notice of the issue.
The analyst, analyst’s supervisor or analyst’s manager will prepare a draft of the response
and submit the draft to RAS (Director or Managers) for review. Once approved, the
formal reply will be signed by the HR Director, and that reply shall include
the following language:
“Civil Service Commission Rules for the City and County of San Francisco specify announcement,
application and examination policies and procedures, including applicant appeal rights. They can be found
on the Civil Service Commission website at http://www.sfgov3.org/index.aspx?page=300. Copies of
specific rules can also be obtained at 1 South Van Ness, 4th Floor, San Francisco, CA 94103.”
Strategies for Handling Protests of Rejection of Application based on Failure to
Meet Minimum Qualifications
1. Retrieve documents: M.Q. form showing linkage between KSA and minimum
qualifications; the announcement, the applicant’s letters of protest/appeal, all
responses to the applicant from the analyst, and a copy of the applicant’s
application materials.
2. Demonstrate that the minimum qualifications are linked to KSAs that are required
prior to appointment and/or at the time designated on the announcement. Explain
briefly why the applicant does not meet the minimum qualifications. Specify if
Subject Matter Expert(s) reviewed the application and concurred with the
analyst’s findings. Explain whether the applicant was provided opportunities
to submit additional information about his or her qualifications. If not,
provide a reason why it was not feasible to allow the applicant time to submit
additional information.
3. Review the following documents: announcement, the candidate’s application
package, tools or devices to assess the candidate’s application. Review the MQ
wording for clarity.
4. Explain the assessment and the reasons the applicant did not meet the MQs.
NOTE: Clearly worded MQs avoid ambiguity and misinterpretation. It is often
helpful to describe what type of work will NOT be deemed qualifying. Allow for
reasonable substitutions of equivalent experience, education, certifications,
licenses, etc., and state these on the announcement.
Strategies for Handling Challenges to Adequacy of the Examination
1. Retrieve documents: The announcement; the task/KSA linkup form (to
demonstrate job relatedness of test); the examination questions or exercises; the
behaviorally anchored rating guidelines; the explanation of the cut-off point;
explanation of the qualifications of raters or screening panel members; the
appellant’s rating (score) sheet (while not directly applicable to the appeal,
how well a candidate scored on specific part(s) of the exam often relates to
the part(s) the candidate claims are inadequate); the applicant’s letters of
protest/appeal; and all responses to the applicant from the analyst.
2. Demonstrate that the examination questions and exercises are job related. Review
the task/KSA link up and explain how the questions adequately test the KSAs
identified as essential to job performance.
Tips on How to Avoid an Appeal of Adequacy of Examination
The more closely the examination replicates the job functions of the position,
the more valid the test is and the greater its acceptance among applicants.
Do a thorough job analysis and ensure the linkage between the job analysis and
the test is accurate.
Work with qualified and knowledgeable Subject Matter Experts on the
development of the exam.
Do not measure job tasks where candidates will be trained or can learn the task
within a short period of time. Similarly, do not measure KSAs that can be learned
on the job.
Pre-test the exam on SMEs or other examination staff to eliminate errors,
ambiguity, inconsistencies, etc. in the exam material and scoring guidelines.
Strategies for Handling Protests and Appeals Related to Scores
1. Ensure that the instructions for calculating scores are clear and understandable.
2. Test the formula by following the instructions and manually computing
scores for a sample of candidates to ensure accuracy.
3. Have another analyst check the accuracy of the calculation of the scores prior to
sending the notices to the candidates.
4. Do not provide the candidate with the original rating sheets. For walk-in
candidate review of ratings, present a copy of the rating sheet.
5. If an error is found, correct the score and the eligible list. If this
correction results in a change of rank for other candidate(s), those
candidates must be notified.
6. The appeal must be submitted to the Human Resources Director within the time
period designated for the Review of Ratings.
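For step 2, the manual check of the scoring formula can be sketched as a short script. This is only an illustrative sketch: the component names, weights, passing rule, and sample scores below are hypothetical placeholders, not values from any actual examination plan.

```python
# Hypothetical spot-check of a weighted final-score calculation.
# The component names, weights, and raw scores are illustrative only;
# substitute the values from the actual examination plan.

def final_score(raw_scores, weights):
    """Combine component raw scores (0-100) using fractional weights."""
    # Weights should account for the whole exam; flag plan errors early.
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return round(sum(raw_scores[part] * w for part, w in weights.items()), 2)

weights = {"written": 0.40, "oral": 0.60}    # hypothetical plan weights
candidate = {"written": 85.0, "oral": 90.0}  # hypothetical raw scores

score = final_score(candidate, weights)
print(score)  # 85*0.40 + 90*0.60 = 88.0
```

Running a sample of candidates through a script like this and through the official instructions by hand should yield identical results; any discrepancy points to an error in the scoring instructions or in the calculation, which is exactly what steps 1 through 3 are meant to catch before notices go out.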
Strategies for Handling Challenges to the Qualifications of Eligibles
1. Retrieve documents: the letter of protest of candidate not being qualified, and a
copy of the application materials in question.
2. Explain briefly why the applicant does or does not meet the minimum
qualifications without disclosing anything about the third-party applicant.
Specify whether Subject Matter Expert(s) reviewed the application and
qualifications and whether they concurred with the analyst’s findings.
3. If the candidate is qualified, the analyst should write a letter to the complainant
and explain in general, not specific terms, why the candidate involved is qualified.
4. If an error has been made, send a letter explaining to the candidate why
his or her name is being removed from, or placed under waiver on, the eligible
list. Send a letter to the appellant explaining that the name has been removed
from the eligible list.
5. If the analyst is unsure whether the candidate meets the MQs, consult with SMEs
and DHR staff.