
Determining Value: The
Development of Evaluation
Metrics for Shared Content
ACRL Conference 2017
Genya O’Gara, VIVA Associate Director
Anne Osterman, VIVA Director
Why a Value Metric Project?
• Libraries continually search for better, more informed ways to make
resource decisions.
• Budgets are tight or declining while the information universe is expanding.
• A difficult task at the institutional level becomes even more complex at the
consortial level.
• Diversity in the user populations of member institutions creates diversity in collection
priorities.
• The need for a system that can compare dissimilar formats.
• What is the value of an ebook collection compared to a streaming media
subscription?
What is VIVA?
Public Colleges and Universities
Public Community and Two Year Colleges
Private Nonprofit Institutions
Educational and Research Institutions
VIVA’s Resources
• 175 databases
• More than 7,500 videos
• Almost 50,000 journals
• More than 2,000,000 additional reports, proceedings, and newspapers
• More than 80,000 ebooks
Value Metric Task Force Project Origins
• In the 2014-2016 biennium, VIVA received a 5% budget cut and used data to inform its
cancellation decisions.
• Looking forward, the Collections Committee wanted standardized criteria to
apply to the evaluation of its resources.
• The VIVA Collections Committee formed the Value Metric Task Force (VMTF) to
develop a consortial approach.
Developing the Value Metric System
• The VMTF Charge:
• Design and apply a framework for the coherent and holistic evaluation of
VIVA products.
• Determine what the highest collection development priorities are for the
consortium and examine how these can be translated into quantifiable
values.
• The end result will be an assessment framework and value metric system for
the evaluation of shared resources, reflecting VIVA's overarching values.
Developing the Value Metric System
• Membership of task force was representative of the four major institution types
within VIVA.
• Examined priorities for the consortium from an "institution type" perspective.
• A persona/brainstorming exercise surfaced institutional priorities; instead of user
type (undergraduate, faculty), it used institution type (community college,
doctoral).
Developing the Value Metric System
• The persona/brainstorming exercise identified overlapping priorities.
[Pie chart: of the identified priorities, 21% were shared by all 4 institution types, 40% by 3 institution types, 9% by 2 institution types, and 30% belonged to a single institution type.]
• These priorities were used in a survey of member institutions that focused on how
institutions valued the identified facets depending on the specific format.
Developing the Value Metric System
• For all consortial resource format types, the top two concerns were cost savings
and alignment with curriculum. The other ranked facets varied widely by format.
[Bar chart: survey ranking of ebook facets: Alignment with curriculum; Cost; DRM restrictions on use and sharing; Interoperability with discovery systems; Easy, one-stop content delivery; Freeing up physical space; Stable access; Building a comprehensive core collection…; Accreditation requirements; Vendor's support of open initiatives.]
Developing the Value Metric System
• Needed accessible data that was both measurable and attainable in order to
create a tool that was easy to implement and sustain.
• It was very important to the group NOT to reinvent the wheel or add tasks for
already busy staff.
• Wanted to ensure that the tool/framework could be adapted at the local level.
Developing the Value Metric System
• The group conducted a data inventory.
• Included data such as degree and graduate counts, usage and cost data, etc.
• They then mapped pre-existing data to answerable questions from the survey-identified areas of need.
• For each product type the group asked:
• What data do we already collect?
• Does this data align with ways libraries measure value for users?
• Are there other factors we aren’t collecting that could answer this question?
Value Metric: Putting it All Together
• The group used the results of the survey to weight the included components
according to relative importance to the consortium.
• Two kinds of grids (current and prospective) were developed for each format type
(databases, ebooks, ejournals, and streaming media).
• Grids were divided into two parts:
• Demonstrated usefulness to the consortium (e.g. cost per use, alignment with
degrees awarded).
• VIVA “values” (e.g. an emphasis on open initiatives, COUNTER-compliant
usage statistics, usage rights, etc.).
• Each grid has a potential score of 100, allowing for cross-format comparisons.
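• As an illustrative sketch only (Python; the category names and survey weights below are hypothetical, not VIVA's actual allocations), this is one way survey-derived importance weights can be scaled into category point allocations that sum to 100 and then combined into a grid total:

# Hypothetical survey importance scores for one format type (illustrative values only).
survey_weights = {
    "Alignment with curriculum": 180,
    "Cost effectiveness": 175,
    "Interoperability with discovery systems": 90,
    "Easy, one-stop content delivery": 85,
    "Supports open initiatives": 60,
}

TOTAL_POINTS = 100  # each grid has a potential score of 100

# Scale the survey weights so the category maximums add up to roughly 100 points.
weight_sum = sum(survey_weights.values())
category_max = {
    name: round(TOTAL_POINTS * weight / weight_sum)
    for name, weight in survey_weights.items()
}

def grid_total(category_scores):
    """Sum a product's category scores, capping each at its category maximum."""
    return sum(
        min(score, category_max[name]) for name, score in category_scores.items()
    )

print(category_max)
print(grid_total({"Alignment with curriculum": 20, "Cost effectiveness": 28}))  # 48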
Value Metric: Database Example
CRITERIA (SCORE)
1. Alignment with Curriculum and/or Accreditation Requirements
a. Resource constitutes a high percentage of VIVA content within the subject area by format: 6
b. Resource belongs to subject area with high number of degrees awarded: 6
c. Percentage of total use coming from single public highest-use institution: 3
d. Percentage of total use coming from public highest-use institution type: 3
TOTAL CATEGORY Score: 18
2. Cost Effectiveness
a. Cost Avoidance: 5
b. Cost-per-Use: 6
c. Annual Increase: 4
d. Private Pooled Funds: 3
TOTAL CATEGORY Score: 18
3. Interoperability w/Discovery Systems
a. Discovery Tools in which Product is Indexed: 14
TOTAL CATEGORY Score: 14
4. Easy, One-Stop Content Delivery
a. Platform: 7
b. Full-Text Availability: 6
TOTAL CATEGORY Score: 13
Value Metric: Database Example
CRITERIA (SCORE)
6. Multidisciplinarity
a. Subject by Call Number: 8
TOTAL CATEGORY Score: 8
7. Usage Statistics
a. COUNTER compliant: 4
b. Institution-Level Statistics: 3
TOTAL CATEGORY Score: 7
8. Technical Issues
a. Frequency and Nature of Technical Issues: 3
b. Vendor Responsiveness: 3
TOTAL CATEGORY Score: 6
9. Supports Open Initiatives
Demonstrable commitment to open initiatives/exploring alternate open access publishing models: 3
TOTAL CATEGORY Score: 3
Value Metric: Rubric & Instructions Example
CRITERIA: b. Resource belongs to subject area with high number of degrees awarded
RUBRIC: < 10% = 0; 10-19% = 1; 20-29% = 2; 30-39% = 3; 40-49% = 4; 50-59% = 5; > 60% = 6
INSTRUCTIONS: To score, please refer to the "Format Breakdown" sheet's mapped subject areas. Filter "Degree-Type-LC-Mapping spreadsheet" subjects by those three subjects. This spreadsheet maps (at only the highest level) LC subjects to degree types awarded in Virginia. Add the percentages of relevant degrees, and assign the number according to the rubric.

CRITERIA: c. Percentage of total use coming from single public highest-use institution
RUBRIC: < 20% = 3 points; 20-40% = 2 points; 40-60% = 1 point; > 60% = 0 points
INSTRUCTIONS: Using the consortial usage statistics (http://library.gmu.edu/vivasafe/index.htm), find the total Record Views from public institutions from the most recent complete fiscal year of data. Then find the public institution that had the highest number of Record Views for this resource. The ratio of this individual institution to the whole is the percentage of total use coming from the single public highest-use institution. For example, if a resource had 150 Record Views from GMU and there were 500 total Record Views from all public institutions, the ratio would be 150/500 = 0.3 = 30%. This product would therefore get 2 points in this category.
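• As an illustrative sketch only (Python; the function and institution names are made up, not part of VIVA's tooling), the criterion 1c calculation above can be expressed as:

def highest_use_share_score(record_views_by_institution):
    """Criterion 1c: percentage of total public-institution use coming from
    the single public highest-use institution, scored per the rubric above."""
    total = sum(record_views_by_institution.values())
    highest = max(record_views_by_institution.values())
    share = highest / total

    # Rubric: < 20% = 3 points, 20-40% = 2, 40-60% = 1, > 60% = 0
    if share < 0.20:
        return 3
    elif share <= 0.40:
        return 2
    elif share <= 0.60:
        return 1
    return 0

# The slide's example: 150 of 500 public Record Views come from GMU -> 30% -> 2 points.
print(highest_use_share_score({"GMU": 150, "VCU": 120, "VT": 120, "ODU": 110}))  # 2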
Value Metric: Putting it into Operation
• Collections Committee Product Managers "tested" the grids by filling them out,
and the grids were further refined.
• VIVA Central then filled out the grids for each product.
• We created a database to store the grid data and ease comparison and reporting
of different evaluative sections across products.
• The Collections Committee used the data to identify cancellations to meet a state
budget reversion.
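• The presentation does not detail the database VIVA Central built; as an illustrative sketch only (Python with SQLite; the table and column names are assumptions), a minimal structure like the following could store per-category grid scores and support comparison and reporting across products:

import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.executescript("""
CREATE TABLE product (
    product_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    format     TEXT NOT NULL,   -- database, ebook, ejournal, streaming media
    grid_type  TEXT NOT NULL    -- current or prospective
);
CREATE TABLE category_score (
    product_id INTEGER REFERENCES product(product_id),
    category   TEXT NOT NULL,   -- e.g. Cost Effectiveness, Usage Statistics
    score      REAL NOT NULL,
    max_score  REAL NOT NULL
);
""")
conn.execute("INSERT INTO product VALUES (1, 'Database 1', 'database', 'current')")
conn.executemany(
    "INSERT INTO category_score VALUES (?, ?, ?, ?)",
    [(1, "Cost Effectiveness", 15, 18), (1, "Usage Statistics", 7, 7)],
)

# Report: total grid score per product (out of a possible 100) for cross-format comparison.
for name, fmt, total in conn.execute("""
    SELECT p.name, p.format, SUM(c.score)
    FROM product p JOIN category_score c USING (product_id)
    GROUP BY p.product_id
    ORDER BY SUM(c.score) DESC
"""):
    print(f"{name} ({fmt}): {total}")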
Value Metric: Sample Chart
[Stacked bar chart: Value Metric Totals for Relevant Products. Scores out of 100 for sample products (Database 1-4, Journal 1-5, Ebooks 1-3), each shown against the best-scoring and maximum possible product of its format, broken down by Cost Effectiveness, Content and Usage, Discovery and Access, Stability, Vendor Infrastructure, Open/Independent, and Faculty Output.]
Next Steps
• VIVA Central will continue to fill these out for each licensed resource in upcoming
years, in consultation with product managers.
• Although extensive, the grids are designed to be plug and play; they are already
being adapted by the Virginia Community College System.
• We will make grid adjustments as appropriate – they are meant to be living, not
static documents; as consortial and state priorities shift, so should the
assessment of our resources.
Outcomes
• Development of the framework has given the consortium a way to tell a fuller
story of what VIVA provides to members and to the state through thoughtful,
data-informed resource decisions.
• Specific outcomes include:
• A better understanding of how resources align with statewide curricular
needs.
• A standardized approach to the review of new and existing products.
• The use of data to strategically inform collection development and compare
dissimilar products.
Thank You to the Task Force!
• Beth Blanton-Kent (University of Virginia)
• Cheri Duncan (James Madison University)
• Summer Durrant (University of Mary Washington)
• Julie Kane (Washington & Lee University)
• Madeline Kelly (George Mason University)
• Crystal Newell (Piedmont Virginia Community College)
Credits
• Noun Project: “Question” by Jessica Lock, CA
• Noun Project: “Graph” by Chance Smith, US
• Noun Project: “EReader” by Amelia Edwards, US: from the Reading and
eBooks Collection
• Noun Project: “Measuring” by pictohaven: from the marketing - bold Collection
• Noun Project: “Choice” by Kirby Wu, TW: from the Business / Enterprise /
Management Collection
• Noun Project: “People” by Gregor Cresnar: from the Business: Marketing Vol.
2 Collection