Assessing and Evaluating Services in Libraries and Information Centers Towards Sustained Progress and Development
Dennis A. Alonzo
Dean, College of Education
Director, Special Programs
Head, Curriculum Development Center
University of Southeastern Philippines
Davao City
Key Concepts
• Reasons for Evaluation
• The Framework of Evaluation
• Identifying Performance Issues for Evaluation
• Methods
• Performance Measurement for the eLib
Why do we need to Evaluate?
• To gather empirical data to inform decisions
• As an internal control mechanism to ensure that the resources are used efficiently and effectively
• To convince the funders and the clients that the service is delivering the benefits that were expected when the investment was made
Big Question in Evaluation
What do you want to know?
Ideal Library
• Set clear, tough and meaningful standards
• Tell users in a clear, straightforward way about services
• Consult widely about what services people need and how services can be improved
• Make services available to everyone who needs them
• Treat all people fairly. Have polite and helpful staff
Ideal Library
• Use resources effectively by budgeting carefully
• Continually make improvements
• Work with other providers to provide a better service
• Show that users agree that the services provided are really good
Issues in Evaluation
• Collect information to facilitate decision making
• Justify increasing expenditures
• Evaluate the quality of services provided
• Plan for future improvements and directions
• Identify the extent to which problems can be solved
• Identify differing needs of different user categories
Issues in Evaluation
• Plan public relations work and information dissemination
• Provide feedback to and evaluate contractors
• Involve users in management – allows users to rediscover a voice in library management and express views about service priorities
• Avoid “questionnaire fatigue”
Focus of Evaluation
• Appraisal of strengths and weaknesses?
• Effectiveness of the library’s educational services?
Identifying Performance Issues
• The way the management structure functions
• Internal operations relating to information materials, such as cataloguing and classification
• Library/information services to users
• New programs of service delivery
• Alternative possibilities for doing anything
• The functioning of a total system prior to planning change
Evaluation defined:
• Can and should enhance the quality of interventions (policies and programs) designed to solve or ameliorate problems in social and corporate settings (Owen, 2006)
• A process of knowledge production
• Uses rigorous empirical enquiry
Logic of Evaluation (Fournier, 1995)
• Establishing criteria of worth: On what dimensions must the evaluand do well?
• Constructing standards: How well should the evaluand perform?
• Measuring performance and comparing with standards: How well does the evaluand perform?
• Synthesizing and integrating evidence into a judgment of merit or worth: What is the worth of the evaluand?
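To make Fournier's four steps concrete, the short Python sketch below walks through them for an imaginary library service. All criteria, weights, standards and scores are invented for illustration; none come from the presentation.

```python
# Minimal sketch of Fournier's logic of evaluation; every value here is a
# hypothetical example, not data from this presentation.

# 1. Criteria of worth: dimensions on which the evaluand must do well,
#    with assumed weights for their relative importance.
weights = {"coverage": 0.4, "timeliness": 0.3, "user_satisfaction": 0.3}

# 2. Standards: how well the evaluand should perform on each dimension (0-1 scale).
standards = {"coverage": 0.80, "timeliness": 0.75, "user_satisfaction": 0.70}

# 3. Measured performance: how well the evaluand actually performs.
measured = {"coverage": 0.85, "timeliness": 0.60, "user_satisfaction": 0.78}

# 4. Synthesis: compare performance with the standards and integrate the
#    evidence into a single judgment of merit or worth.
def synthesize(measured, standards, weights):
    met = {dim: measured[dim] >= standards[dim] for dim in standards}
    score = sum(weights[dim] * measured[dim] for dim in weights)
    benchmark = sum(weights[dim] * standards[dim] for dim in weights)
    if all(met.values()):
        judgment = "meets all standards"
    elif score >= benchmark:
        judgment = "meets standards overall, with weaknesses on some dimensions"
    else:
        judgment = "falls short of the standards set"
    return met, score, judgment

met, score, judgment = synthesize(measured, standards, weights)
print(met)                        # which dimensions meet their standard
print(round(score, 2), judgment)  # overall weighted score and judgment
```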
Objects of an Evaluation
• Policies
  – Legislative policies
  – Large-scale policies
  – Local policies
• Programs
• Products
• Person/People
Criteria for Evaluation
• Success
• Efficiency
• Effectiveness
• Benefits
• Costs – which can be evaluated independently or in association with any of the above
Evaluation Forms and Approaches
[Diagram: the five evaluation forms applied to a program – Proactive, Clarificative, Interactive, Monitoring and Impact Evaluation (Owen, 2006)]
PROACTIVE EVALUATION
• Takes place before the program is designed
• Assists planners to make decisions about what type of program is needed
• Provides input about how best to develop a program in advance of the planning stage
Typical Issues
• Is there a need for a program?
• What do we know about the problem that the program will address?
• What is recognized as best practice in this area?
• Have there been attempts to find solutions to this problem?
• What does the relevant research or conventional wisdom tell us about this problem?
• What could we find out from external sources to rejuvenate an existing policy or program?
Major Approaches
1. Needs Assessment or Needs Analysis
2. Research Synthesis
3. Meta-analysis
4. Narrative Review
5. Review of Best Practices
Evaluation Forms and Approaches
[Diagram: the five evaluation forms (Owen, 2006), highlighting Clarificative Evaluation – conducted early after implementation]
CLARIFICATIVE EVALUATION
• Designed to assist stakeholders to conceptualize interventions and improve their coherence, and thus increase the chances that their implementation will lead to the desired outcomes
• Concentrates on making explicit the internal structure and functioning of an intervention
• Program logic or theory is developed/revised
Issues to be addressed
• What are the intended outcomes of this program and how is the program designed to achieve them?
• What is the underlying rationale for this program?
• What program structures or elements need to be modified to maximize program potential to achieve the intended outcomes?
• Is the program plausible?
• Which aspects of the program are amenable to a subsequent monitoring or impact assessment?
Approaches:
1. Evaluability Assessment (EA)
2. Program Logic
3. Ex-ante Evaluation
Evaluation Forms and Approaches
[Diagram: the five evaluation forms (Owen, 2006), highlighting Interactive Evaluation – conducted after the program design has been clarified/finalized]
INTERACTIVE EVALUATION
• Provides systematic evaluation findings through which local providers can make decisions about the future direction of the program
• Provides assistance in planning and carrying out self-evaluations
• Focuses evaluation on organizational change and improvement, in most cases on a continuous basis
• Empowers providers and participants
Typical Issues to be addressed
• What is this program trying to achieve?
• How is this program progressing?
• Is the delivery working?
• Is it consistent with the program plan?
• How could the delivery be changed so as to make it more effective?
• How could this organization be changed so as to make it more effective?
Approaches
• Responsive Evaluation (Stake, 1980)
• Action Research
• Development Evaluation
• Empowerment Evaluation
• Quality Review
Evaluation Forms and Approaches
[Diagram: the five evaluation forms (Owen, 2006), highlighting Monitoring Evaluation – conducted to determine the performance of each unit of the program]
MONITORING EVALUATION
• Appropriate when a program is well established and ongoing
• Involves the development of a system of regular monitoring of the progress of the program
• Includes a rapid response capability (Mangano, 1989) to provide timely information for organizational leaders (Owen & Lambert, 1998)
Typical Issues
• Is the program reaching the target population?
• Is implementation meeting program benchmarks?
• How is implementation progressing between sites?
• How is implementation progressing now compared to a month ago, or a year ago?
• Are our costs rising or falling?
• How can we fine-tune this program to make it more efficient?
• How can we fine-tune this program to make it more effective?
• Is there a site which needs attention to ensure more effective delivery?
Key Approaches
1. Component Analysis
2. Devolved Performance Evaluation
3. Systems Analysis
Evaluation Forms and Approaches
[Diagram: the five evaluation forms (Owen, 2006), highlighting Impact Evaluation – may be conducted during early implementation but is mostly done after program phase-out]
IMPACT EVALUATION
• Determines the range and extent of outcomes of a program
• Determines whether the program has been implemented as planned and how implementation has affected outcomes
• Provides evidence to funders, senior managers and politicians about the extent to which resources allocated to a program have been spent wisely
• Informs decisions about replication or extension of a program
Typical Issues
• Has the program been implemented as planned?
• Have the stated goals of the program been achieved?
• Have the needs of those served by the program been met?
• What are the unintended outcomes of the program?
• Does the implementation strategy lead to the intended outcomes?
• How do differences in implementation affect program outcomes?
• Is the program more effective for some participants than for others?
• Has the program been cost-effective?
Key Approaches
1. Objectives-based (Tyler, 1950)
2. Needs-based (Scriven, 1972)
3. Goal-free
4. Process-outcome
5. Realistic Evaluation
6. Performance Audit
Framework for Planning an Evaluation
1. Specifying the Evaluand
• What is the object of the evaluation?
• What is known about the evaluand?
• How was it developed?
• How long has it been in existence?
• What is the nature of the evaluand: policy/program/organization/product?
• Who are the key players in its development (actual or projected) and its implementation?
Framework for Planning an Evaluation

2. Purpose
• What is the fundamental reason for commissioning the evaluation?
• Consistent with the evaluation form, the evaluation is primarily concerned with:
  – Synthesis of information to aid program development;
  – Clarification of the program;
  – Improvement of the implementation of the program;
  – Monitoring program outcomes;
  – Determining program impact.
Framework for Planning an Evaluation
3. Clients/ Audiences
• To whom will the findings of the evaluation be directed?
• Identify your clients, the primary audience, and other people who will use the information to make decisions
Framework for Planning an Evaluation
4. Resources
• What human and material resources are available to undertake the evaluation?
• The resources available determine the extent of the data management and the range of evaluation findings that can be provided
Framework for Planning an Evaluation
5. Evaluation focus/ foci
• Which element(s) of the program will need to be investigated: program context, program design, program implementation, program outcomes, or a combination?
• What is the state of the development of the evaluand?
Framework for Planning an Evaluation
6. Evaluation issues and key questions
• Identify the issues to be addressed
• Questions lead the direction of the evaluation
7. Data management
• Identify the data collection strategy and analysis
• Is sampling important?
• Is anything known about this from other sources?
• How will the data be collected?
• How will the data be analyzed to address the key evaluation questions?
Framework for Planning an Evaluation
8. Dissemination of Findings
• What strategies for reporting will be used?
• When will reporting take place?
• What kind of information will be included (findings, conclusions, judgments, recommendations)?
• Who will make the recommendations?
Framework for Planning an Evaluation

9. Codes of Behavior
• What are the ethical conditions which underlie the evaluation report?
10. Budget and Timeline
• Given the resources, what will be achieved at key time-points during the evaluation?
Performance Measurement for the eLib
• Access to electronic journals
• Word processing packages
• Excel and other statistical packages
• Demonstration software
• Internet use
• Bibliographic software
• Digitized books and journals
• Electronic information databases
• OPACs
• Networked CD-ROMs on local area networks
• Full-text outputs via bibliographic searching
• Web-based training packages
Performance Indicators for eLib
• Informative content
• Reliability
• Validity
• Appropriateness
• Practicability
• Comparability
Performance Issues
• Skill levels
• Real use vs browsing
• Recreational use
• Provision of unwanted/unanticipated services
• Queuing/booking/walkouts
• Problems with remote logging in
• Problems of outputting data
Performance Issues
• No defined service period
• Quality and reliability of internet data
• Non-use
• Changes over time
• Distributed resources
• Problems with the library’s control
• The service-oriented culture
• PCs vs Macs
Proposed list of performance indicators
• Percentage of target population reached by eLib services
• Number of log-ins to eLib services per capita per month
• Number of remote log-ins to eLib services per capita per month
• Number of electronic documents delivered per capita per month
• Cost per log-in per eLib service
Proposed list of performance indicators
• Reference enquiries submitted electronically per month
• Library computer workstation use rate
• Number of library workstations per capita
• Rejected log-ins as a percentage of total log-ins
• Systems availability
• Mean waiting time for access to library computer workstations
• IT expenditure as a percentage of total library expenditure
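Most of these indicators are simple ratios. The Python sketch below shows how a few of them might be computed from one month's figures; all numbers and variable names are hypothetical assumptions for illustration, not data from the eLib programme.

```python
# Hypothetical monthly figures for an eLib service; every value is an
# illustrative assumption, not data from the eLib programme.
target_population = 4000        # users the service is intended to reach
users_reached = 1500            # distinct users who logged in this month
login_attempts = 6200           # all log-in attempts this month
rejected_logins = 310           # attempts refused (licence limit, authentication failure, etc.)
logins = login_attempts - rejected_logins     # successful log-ins
remote_logins = 900             # successful log-ins from outside the library network
documents_delivered = 2800      # electronic documents delivered this month
monthly_service_cost = 12400.0  # cost of running the service this month
it_expenditure = 52000.0        # annual IT spend
total_library_expenditure = 410000.0          # annual total library spend

indicators = {
    "% of target population reached": 100 * users_reached / target_population,
    "log-ins per capita per month": logins / target_population,
    "remote log-ins per capita per month": remote_logins / target_population,
    "documents delivered per capita per month": documents_delivered / target_population,
    "cost per log-in": monthly_service_cost / logins,
    "rejected log-ins as % of total": 100 * rejected_logins / login_attempts,
    "IT expenditure as % of total expenditure": 100 * it_expenditure / total_library_expenditure,
}

for name, value in indicators.items():
    print(f"{name}: {value:.2f}")
```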