Designing Pragmatic Approaches to Understand and Improve Writing (and Writing Instruction) Across Language and Cultural Difference

David S. Martins, University Writing Program
Leah Bradley, Student Learning Outcomes Assessment Office
1. General Education Writing Assessment at RIT
2. Using Relevant Source Data: A Global Assessment
3. Assessment Findings
4. Use of Results, Next Steps, and Lessons Learned
5. Discussion
• The University Writing Program and the Student Learning Outcomes Assessment Office collaborate on assessing each of the Gen Ed SLOs related to First-Year Writing (FYW)
• Writing faculty define the student learning outcomes, develop scoring guides, and conduct direct assessment of student writing
SLO: Use Relevant Evidence Gathered Through Accepted Scholarly Methods and Properly Acknowledge Sources of Information
Criteria                                                                    Range
Scope: Determines the Extent of Information Needed                          0-4
Context: Evaluates Information and its Sources Critically                   0-4
Purpose: Uses Information Effectively to Accomplish a Specific Purpose      0-4
Integrate: Integrates and Documents Sources                                 1-4
Variety: Relates Variety of Selected Sources Directly to Author’s Purpose   1-4
Total Score                                                                 2-20
Benchmark: 100% of students will receive a total score of 5 or better
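For reference, the rubric arithmetic can be illustrated with a minimal Python sketch; the data structures, function names, and sample scores below are hypothetical, not taken from the assessment itself.

```python
# Minimal sketch of the scoring-guide arithmetic above.
# All names and sample data are hypothetical.

RUBRIC_RANGES = {
    "Scope": (0, 4),
    "Context": (0, 4),
    "Purpose": (0, 4),
    "Integrate": (1, 4),
    "Variety": (1, 4),
}
BENCHMARK = 5  # a total score of 5 or better meets the benchmark

def total_score(scores: dict) -> int:
    """Sum the five criterion scores; the possible total range is 2-20."""
    for criterion, (low, high) in RUBRIC_RANGES.items():
        if not low <= scores[criterion] <= high:
            raise ValueError(f"{criterion} must be in the range {low}-{high}")
    return sum(scores.values())

def meets_benchmark(scores: dict) -> bool:
    return total_score(scores) >= BENCHMARK

paper = {"Scope": 2, "Context": 1, "Purpose": 3, "Integrate": 2, "Variety": 2}
print(total_score(paper), meets_benchmark(paper))  # -> 10 True
```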
• National Technical Institute for the Deaf (NTID)
• American University in Kosovo
• RIT Dubai
• RIT Croatia: Dubrovnik and Zagreb
• Final researched, “claim-based” essays were collected from FYW courses at all locations
• 17 faculty participated in a norming session prior to assessment
• Each essay was scored by two faculty readers (see the sketch below)
• 13 faculty members scored 231 student papers
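The slides do not say how the two readers’ scores were reconciled; the sketch below assumes one common convention, averaging the readers’ per-criterion scores before summing. All names and sample data are hypothetical.

```python
# Hypothetical combination of two readers' scores by averaging each
# criterion and summing; not necessarily the method used in this study.

def combined_total(reader_a: dict, reader_b: dict) -> float:
    """Average the two readers' scores per criterion, then sum."""
    return sum((reader_a[c] + reader_b[c]) / 2 for c in reader_a)

reader_a = {"Scope": 2, "Context": 1, "Purpose": 3, "Integrate": 2, "Variety": 2}
reader_b = {"Scope": 3, "Context": 2, "Purpose": 2, "Integrate": 2, "Variety": 3}
print(combined_total(reader_a, reader_b))  # -> 11.0
```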
Collection: essays from 5 RIT locations
Technology: facilitating communication; reporting scores; shared platform
Process: full- and part-time faculty, administrators, assessment office and ELC staff; all faculty were American (no ESL); developing a “community of practice”/faculty development
Student Writers
• Students struggled with similar issues (e.g., context, synthesis)
• Students’ educational background (potentially) had a more significant impact than language and cultural background
• Errors were not systematically addressed
Writing Instructors
• Emerging “community of practice”: an obvious need to share practices and terminology, and to compare assumptions
• Conflating cultural background and educational background
• Recognizing the need for instructional alignment, both personal and programmatic
• No location met the 100% benchmark
• Results were consistent with the pilot scoring in 2012

% of Papers Meeting the Benchmark by Location
All Locations (n=193): 90%
RIT (n=130): 91%
NTID (n=18): 92%
International (n=45): 87%
Average Score by Criteria
[Bar chart: average scores for Scope, Context, Purpose, Integrate, and Variety, ranging from 1.6 to 2.6]

• At all locations, students scored the lowest on “Context” (Evaluates Information and its Sources Critically)
• Finding consistent with the 2012 pilot
1. Comparability: how can your assessment design
anticipate comparison?
2. Sample size: how do you determine the sample size for each location? (one standard approach is sketched after this list)
3. Norming: what is the purpose of the norming?
4. Alignment: do data collection methods fit analysis
needs?
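On the sample-size question, one standard (not RIT-specific) approach is to size each location’s sample for estimating a proportion, such as the percentage of papers meeting the benchmark; the sketch below uses the usual normal-approximation formula, with all parameter values chosen for illustration.

```python
# One conventional sample-size formula for estimating a proportion:
# n = z^2 * p * (1 - p) / margin^2 (normal approximation).
# Parameter choices here are illustrative, not from the slides.
import math

def sample_size(margin: float, p: float = 0.5, z: float = 1.96) -> int:
    """Papers needed per location for a 95% CI of width +/- margin."""
    # For small locations, a finite-population correction would lower n.
    return math.ceil(z**2 * p * (1 - p) / margin**2)

print(sample_size(0.10))  # -> 97 papers for a +/- 10-point margin
```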
The process reveals as much about the community of practice as about student writing:
• Must identify what remains hidden by the scoring guide and reconsider use of the “benchmark” (e.g., critical thinking, evaluating assumptions)
• Must standardize the scoring guide and make it more coherent
• Must focus on program-wide instructional alignment (e.g., types of assignments, teaching students what we expect them to do)
• Must continue to develop the community of practice (e.g., sharing practices, values, and assumptions)
Improve Process and Methodology
• Scoring guide revision
• Benchmark adjustment
• Assignment review

Improve Teaching and Curriculum
• Workshop & data review
• Alignment
1. What have your assessment projects revealed about the impact of difference (linguistic, cultural, or educational) on writing and/or learning?
2. What have your experiences of conducting multi-site assessments taught you?