
Standardized Information Literacy Assessment:
SAILS and Beyond
Juliet Rumble, Auburn University
Cheryl Cecil & Beth Ashmore,
Samford University Library
April 27, 2006
AACRL/CUS Best Practices
Alabama Library Association Annual Conference 2006
Project SAILS
Standardized Assessment of Information Literacy Skills

An initiative of Kent State University Libraries

Purpose: develop a test instrument for the assessment of
information literacy skills that:
--Is standardized
--Assesses at institutional level
--Provides for both external and internal
benchmarking
Project Structure

3-phase pilot testing of the instrument (2003-2005)
--Samford and AU participated in pilot testing
--82 institutions participated (including 7
Canadian)
--42,304 students tested

Diverse group of participating institutions
--Carnegie Doctoral/Research level universities
--2- and 4-year colleges
SAILS Test Instrument

Multiple choice test

45 questions randomly drawn from a bank of 252 items
(a rough sketch of this kind of random draw appears below)

Each test question addresses an ACRL Objective for
Information Literacy Instruction
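
As a rough illustration only (not the actual SAILS delivery software), drawing a 45-question test form at random from a 252-item bank might look like the Python sketch below; the item-bank structure and objective tags are hypothetical.

```python
import random

# Hypothetical item bank: 252 multiple-choice items, each tagged with an
# illustrative objective label (real SAILS item metadata is not shown here).
item_bank = [{"id": i, "objective": f"ACRL-Obj-{i % 12}"} for i in range(252)]

def build_test_form(bank, n_questions=45, seed=None):
    """Draw n_questions items at random, without replacement, to make one test form."""
    rng = random.Random(seed)
    return rng.sample(bank, n_questions)

form = build_test_form(item_bank)
print(f"{len(form)} questions drawn from a bank of {len(item_bank)} items")
```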
SAILS Administration at Samford

Sample:
--mostly freshmen; students enrolled in UCCA 102
--approximately 40 juniors and seniors from research
methods and senior seminar courses

Campus stakeholders:
--University Core Communication Arts Program
--Office of Institutional Research and Assessment
--Center for Teaching, Learning & Scholarship

Incentives: some instructors gave credit for taking the
assessment; others required it; others just encouraged students
to take it
SAILS Administration at Auburn

Sample:
--freshmen through seniors participated
--students from 12 disciplines/subject majors

Campus stakeholders:
--English Freshman Composition program
--Office of Institutional Research and Assessment
--Core Curriculum Oversight Committee

Incentives: participants were entered in a drawing for 3
Apple iPods
SAILS Data Reports
Results reported at two levels of specificity:
--4 ACRL Info Lit Competency Standards
--12 skill sets (derived from ACRL’s Objectives for
Information Literacy Instruction)

Measures the performance of groups, not individuals

Test questions plotted according to difficulty level
What we learned from the SAILS data….

On all standards and skill sets, the average student
at Samford and Auburn scored at about the same
level as the average student from all participating
institutions.

Both Samford & Auburn focused on person-item
maps for the 12 skill sets to identify areas of
strength and weakness in students' performance (a
minimal illustration of a person-item map follows).
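
For readers unfamiliar with person-item maps: they place student ability estimates and question difficulties on one shared scale, so gaps between the two distributions stand out. The sketch below is a minimal, hypothetical illustration of that idea; all ability and difficulty values are invented and are not SAILS results.

```python
# Minimal, hypothetical person-item map: persons (students) and items (questions)
# are binned on one shared scale, persons shown with '#', items with '*'.
# All numbers below are invented for illustration; they are not SAILS data.

def person_item_map(abilities, difficulties, lo=-3.0, hi=3.0, step=0.5):
    """Print a text histogram of person abilities (left) and item difficulties (right)."""
    n_bins = int(round((hi - lo) / step))
    for i in reversed(range(n_bins)):
        low, high = lo + i * step, lo + (i + 1) * step
        persons = sum(low <= a < high for a in abilities)
        items = sum(low <= d < high for d in difficulties)
        print(f"{low:5.1f} | {'#' * persons:<8}| {'*' * items}")

person_item_map(
    abilities=[-1.2, -0.4, 0.1, 0.3, 0.8, 1.5],   # invented student estimates
    difficulties=[-2.0, -0.5, 0.2, 1.1, 2.4],     # invented question difficulties
)
```

Rows where items cluster above most persons point to the kinds of difficult questions highlighted on the next slide.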
A Difficult Question for Our Students
If you have a research paper due, and the course instructor has not
advised you to use a particular citation style, which of the
following is the best thing to do?
CHOOSE ONLY ONE ANSWER.
--Select a citation style and use it consistently.
--Use various citation styles based on the type of resource.
--Use your own citation style and use it consistently.
--You should always use APA if no other style is requested.
--You should always use MLA if no other style is requested.
What the numbers don’t tell us….

The national “benchmarks” associated with standards and skill sets do
not indicate “mastery” of information literacy

Scores are not based on success in actually performing tasks
associated with learning outcomes (although they are intended to be
predictors of success).

Samford & Auburn’s test results do not track development of cohort
groups (no longitudinal studies were conducted).

There’s still a lot we don’t know about cohort groups that we’re
comparing.
-- E.g.: Did test takers receive library instruction?
-- E.g.: How much (and what kind of) instruction in research
skills did students receive in other classes?
What we learned about doing assessment

Sharing resources and expertise with other campus groups charged
with programmatic assessment is a key to success.

Programmatic assessment involves a serious commitment of time and
money. Support must come not only from individual faculty members
and departments but also from university administration.

Assessment can be done without a standardized tool. In all likelihood,
we need a variety of different assessment tools.
--E.g. Pre- and post- tests that address specific learning
objectives