Journal of Health Communication, 14:228–245, 2009
Copyright © Taylor & Francis Group, LLC
ISSN: 1081-0730 print/1087-0415 online
DOI: 10.1080/10810730902805804
Bridging the Digital Divide by Increasing Computer
and Cancer Literacy: Community Technology Centers
for Head-Start Parents and Families
PETER SALOVEY AND PAMELA WILLIAMS-PIEHOTA
Health, Emotion, and Behavior Laboratory, Department of Psychology,
Yale University, New Haven, Connecticut, USA
LINDA MOWAD
New England Office, Cancer Information Service, Yale Cancer Center,
New Haven, Connecticut, USA
MARTA ELISA MORET
Urban Policy Strategies, LLC, New Haven, Connecticut, USA
DENIELLE EDLUND AND JUDITH ANDERSEN
Health, Emotion, and Behavior Laboratory, Department of Psychology,
Yale University, New Haven, Connecticut, USA
This article describes the establishment of two community technology centers
affiliated with Head Start early childhood education programs focused especially
on Latino and African American parents of children enrolled in Head Start. A
6-hour course concerned with computer and cancer literacy was presented to 120
parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis
for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70
nonparticipants at baseline, immediately after the course ended, and 3 months later
suggested that the program increased knowledge about computers and their use,
knowledge about cancer and its prevention, and computer use including health
Pamela Williams-Piehota is currently employed by RTI International, Research Triangle
Park, NC. Judith Andersen is now at the VA Center for Integrated Healthcare, VA Medical
Center, Syracuse, NY.
Research reported in this article was funded by the National Cancer Institute in the form
of a subcontract to the New England Cancer Information Service (N02-CO-01040-75). The
Health, Emotion, and Behavior (HEB) Laboratory is supported by grants from the Ethel
Donaghue Foundation Women’s Health Investigator Program at Yale University, the
National Cancer Institute (R01-CA68427), the National Institute of Mental Health (P01-MH/DA56826), and the National Institute on Drug Abuse (P50-DA13334). We thank our partners
for their contributions to this project: the LULAC Head Start programs in the New Haven
community who hosted the program and Computers 4 Kids for designing and teaching the
bilingual (Spanish and English), family-focused computer course.
Address correspondence to Peter Salovey, Department of Psychology, Yale University,
P.O. Box 208205, New Haven, CT 06520-8205, USA. E-mail: [email protected]
information-seeking via the Internet. The creation of community computer
technology centers requires the availability of secure space, capacity of a community
partner to oversee project implementation, and resources of this partner to ensure
sustainability beyond core funding.
Several years ago, the National Cancer Institute (NCI) designated cancer
communications as an Extraordinary Opportunity for Investment (NCI, 1999). This
designation reflects, in part, the wide and increasing use of the Internet as a source of
cancer-relevant information. In 2006, more than 113 million adults in the United States
searched the Internet for health and medical information (Fox, 2006). As of 2003, more than 61% of American households owned a personal computer (National Telecommunication and Information Administration, 2004), and in 2007, 71% of adults used
computers to access the Internet (Pew Internet & American Life Project, 2007).
Despite the proliferation of home computer ownership and access to the
Internet, a digital divide still exists between those Americans who do and who do
not use computers and access the Internet. Not surprisingly, lack of access is greatest
among individuals in traditionally underserved populations, including ethnic
minorities, low-income and low-literate individuals, and speakers of languages other
than English (Institute of Medicine [IOM], 2002; National Telecommunication and
Information Administration, 2004). For instance, about 73% of Whites are Internet
users, but this figure drops to 62% for African Americans (Pew Internet & American
Life Project, 2007), and 56% for Latinos (Fox & Livingston, 2007). A recent national
survey suggested that Latinos are narrowing the gap, with 78% of English-speaking
Hispanics using the Internet, but only 32% of Hispanic adults who speak only
Spanish doing so (Fox & Livingston, 2007). Even if these individuals own or have
access to a computer, a study commissioned by The Children’s Partnership revealed
that about 50 million Americans face what they termed "content barriers" to using
the Internet (Lazarus & Mora, 2000). These barriers include the following: (a) the
fact that 87% of documents on the Internet are in English, (b) a lack of relevant local
information, (c) the paucity of Internet content generated by ethnic communities
themselves or organized around their cultural interests and practices, and (d) the
high level of literacy needed to understand much of what is available on-line.
Being health literate is essential to the provision of quality health care (IOM,
2001). Healthy People 2010 and the National Library of Medicine define health
literacy as ‘‘the degree to which individuals have the capacity to obtain, process,
and understand basic health information and services necessary to make appropriate
health decisions’’ (Ratzan & Parker, 2000). Health literacy requires a variety of
skills, including reading and using technology (IOM, 2004). Rudd, Kirsch, and
Yamamoto (2004) found that individuals who had lower levels of education, were
members of racial/ethnic minority groups, were foreign born, or were older had
lower health literacy levels. Low computer and health literacy pose barriers to accessing and using health information (Lazarus & Mora, 2000).
One attempt to traverse the digital divide has been through the creation of
Community Technology Centers (CTCs), public computer facilities located in low-income neighborhoods that promote free access to computers and Internet resources
(Breeden, Cisler, Guilfoy, Roberts, & Stone, 1998). Other approaches to access have
attempted to bridge the digital divide by introducing computers to low-income, inner-city residents in the course of their health care. For example, in one project at a
community health clinic, African American and Latino children learned to control
their asthma and became more computer proficient by interacting with a computer
game that taught asthma-specific self-regulation skills (Bartholomew et al., 2000).
The NCI funded the Digital Divide Pilot Projects (DDPPs) to develop and test
strategies for communicating effectively through new communication technologies
with hard-to-reach populations that were especially vulnerable to cancer (Kreps
et al., 2007). This article describes one of these projects in which we formed a collaborative partnership that was built on several areas of expertise. The New England
Region Cancer Information Service (CIS) and the Department of Psychology’s
Health, Emotion, and Behavior (HEB) Laboratory at Yale University, with expertise
in cancer prevention and health communications, served as the coordinators of the
project and developed the protocols for program implementation. Expertise on
the provision of community-rooted services for low-income Latino and African
American families was provided by League of United Latin American Citizens
(LULAC) Head Start, a federally funded program with years of experience in the
provision of early childcare and education, and a citywide partner in health and social
services with two federally qualified community health centers. Computers 4 Kids, a
Connecticut-based nonprofit organization experienced in helping low-income
families access computer skills and technology, designed and taught a bilingual
(Spanish and English), family-focused computer course. Finally, Urban Policy Strategies, a research and policy organization specializing in the evaluation of community interventions for Latinos and African Americans, designed the evaluation.
We developed CTCs at two Head Start programs in the New Haven,
Connecticut community modeled after the CTC example. With Head Start’s
growing commitment to expanding health, education, and social services to other
members of a Head Start child's family, this setting provided an ideal opportunity to examine the efficacy of a Head Start-based link between a CTC and low-income, predominantly
Latino and African American community residents. There is a growing body of literature on health education and behavior change interventions targeting Latinos and
African Americans, such as studies that used lay health advisors (promotoras) to
deliver the intervention (Gary et al., 2003; Kim, Koniak-Griffin, Flaskerud, &
Guarnero, 2004; Staten et al., 2004) or interventions conducted through churches
(Campbell et al., 2000; Resnicow et al., 2000; Resnicow et al., 2004). More needs to
be done, however, to elucidate the most effective intervention modalities. This article
conveys findings concerning the effectiveness of the program in increasing both
computer and cancer literacy. It also offers some insights regarding the challenges
and the opportunities of sustaining CTCs in low-income, urban neighborhoods.
Method
Participants
Participants were 190 adults (parents and caretakers) recruited from LULAC Head
Start and the New Haven, Connecticut neighborhoods served by it. Power calculations conducted prior to the study suggested a sample of this size was sufficient to
detect medium-sized effects with power >.80. There were 120 participants assigned
to a computer training (intervention) group and 70 to a wait-list, assessments-only
(control) group. The 190 original participants ranged in age from 19 to 76 years
old, with a mean age of 34 years. The demographic characteristics of the sample
are summarized further in Table 1. Most participants were women (76%), and nearly all were non-White (97%). Ninety-one percent of participants (172 of 190) completed the postintervention assessment, and 65% (124 of 190) completed the 3-month assessment.

Table 1. Demographic characteristics of sample

Measure                           Original sample    Intervention group   Control group
                                  (N = 190)          (n = 119)            (n = 70)
                                  % [frequency]      % [frequency]        % [frequency]
Sex
  Male                            8.9 [17]           8.4 [10]             10.0 [7]
  Female                          75.8 [144]         68.9 [82]            88.6 [62]
  Missing                         15.3 [29]          22.7 [27]            1.4 [1]
Race/ethnicity
  Latino/Hispanic                 23.2 [44]          30.2 [36]            11.4 [8]
  African American                52.6 [100]         40.3 [48]            74.3 [52]
  Caucasian                       2.6 [5]            2.5 [3]              2.9 [2]
  Native American                 1.6 [3]            0 [0]                4.3 [3]
  Other                           3.7 [7]            2.5 [3]              5.7 [4]
  Missing                         16.3 [31]          24.4 [29]            1.4 [1]
Language spoken
  Spanish                         4.7 [9]            6.7 [8]              1.4 [1]
  English                         61.1 [116]         47.1 [56]            85.7 [60]
  Both                            15.8 [30]          20.2 [24]            8.6 [6]
  Missing                         18.4 [35]          26.1 [31]            4.3 [3]
Education
  Elementary                      14.2 [27]          16.0 [19]            11.4 [8]
  High school/GED                 34.2 [65]          32.8 [39]            37.1 [26]
  Career or tech                  12.1 [23]          9.2 [11]             17.1 [12]
  Some college                    13.7 [26]          10.1 [12]            20.4 [14]
  College graduate                5.8 [11]           5.0 [6]              7.1 [5]
  College, some graduate          1.1 [2]            0 [0]                2.9 [2]
  Grad school                     1.6 [3]            1.7 [2]              1.4 [1]
  Missing                         17.4 [33]          25.2 [30]            2.9 [2]
Marital status
  Married                         23.2 [44]          21.0 [25]            27.1 [19]
  Divorced                        7.9 [15]           6.7 [8]              10.0 [7]
  Separated                       4.7 [9]            2.5 [3]              8.6 [6]
  Single                          45.3 [86]          44.5 [53]            47.1 [33]
  Missing                         18.9 [36]          25.2 [30]            7.1 [5]
Children enrolled in Head Start
  Yes                             30.0 [57]          37.8 [45]            17.1 [12]
  No                              51.6 [98]          34.5 [41]            81.4 [57]
  Missing                         18.4 [35]          27.7 [33]            1.4 [1]
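The a priori power calculation noted above is not detailed in the article. As a hedged illustration only, a calculation along these lines could be run in Python, assuming a simple two-group comparison, a medium effect (Cohen's d = 0.5), a two-sided alpha of .05, and the study's 120:70 allocation; all of these inputs are assumptions, not the authors' actual parameters.

```python
# Hedged sketch of an a priori power check for a two-group comparison.
# Effect size, alpha, and the reliance on a t test are assumptions; the
# article reports only that power exceeded .80 for medium-sized effects.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().power(
    effect_size=0.5,   # "medium" effect in Cohen's d terms (assumed)
    nobs1=120,         # intervention group size
    ratio=70 / 120,    # control group size relative to intervention
    alpha=0.05,
)
print(f"Power to detect d = 0.5 with n = 120 vs. 70: {power:.2f}")
```

Under these assumed inputs the estimated power is roughly .9, consistent with the article's statement that the sample was sufficient to detect medium-sized effects with power greater than .80.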
Procedure
We assembled four focus groups with neighborhood parents and staff of LULAC
Head Start to determine the following: (a) baseline computer knowledge and
experience, as well as cancer prevention knowledge, (b) need for bilingual materials
and presentations, and (c) scheduling preferences and need for childcare and transportation. Participants in focus groups dedicated 60 minutes to this activity and were
provided food, beverages, and $10 compensation.
The curriculum, course logistics, and evaluation/research measures for the bilingual, 6-hour computer training course were based on information obtained from the
focus groups and cancer education guidelines of the NCI. The training manual, a
hardcover, loose-leaf notebook, was available in both English and Spanish and
was given to each participant to keep.
Computer Training
First, computer training was provided by Computers 4 Kids to staff, parents, and
community residents who had basic computer skills (i.e., used a computer in
their job or at home) and volunteered to serve as coaches and technical support for the participants who completed the basic computer training. These "coaches" signed
a written agreement with the HEB lab for 24 volunteer hours of helping their peers in
exchange for either a free refurbished computer or $229. They had to demonstrate
proficiency on a list of skills before they could begin coaching. We felt that peer-to-peer support would facilitate comfort with the program and sustainability when
the formal program ended.
Second, one trainer along with a bilingual trainer/translator provided 6 hours of
computer training. The computer training curriculum included the following topics:
1. Computer Hardware: Computer components, Using the keyboard, Using the
mouse;
2. Introduction to Windows: Opening a program, The program menu, Windows
Explorer, Operating system, The desktop, Taskbar & start menu;
3. Using Programs: StarOffice word processor, StarOffice activity;
4. Introduction to the Internet: What is the Internet, What do you do with the
Internet, What do I need to access the Internet, How to use a browser, Searching
for information on the Internet, Finding an address or phone number;
5. Fundamentals of Internet E-mail: What is E-mail, What do you do with E-mail,
What do you need to send and receive E-mail, E-mail basic functions, E-mail
attachments;
6. Setting Up Your Computer at Home: Computer setup at home, Computer
maintenance, How to install a printer; and
7. Understanding Cancer: What is cancer, Cancer web sites.
In addition to a brief presentation on cancer prevention and detection by the
trainer, participants learned to use the Internet by surfing cancer-relevant websites.
Print materials developed by (or adapted from) the NCI were distributed to participants at the end of the training. These packets included educational information
about cancer prevention and detection, as well as a list of local and national
resources for cancer information (e.g., Fact Sheet 5.16: Q & A: The Pap Test; Fact
Sheet 5.28: Screening Mammograms; Fact Sheet 6.32: Q & A: Screening, Early
Detection, and Treatment for Colorectal Cancer; Fact Sheet 8.13: Questions and
Answers About Smoking Cessation). In addition, we inserted cancer information
into class assignments, such as using search engines and word processing.
In order to facilitate computer training for individuals with basic or no computer literacy, we utilized a software program that allowed the trainer to freeze everyone’s screen and show them how to do something. This software also alerted the
trainer if someone was falling behind the rest of the class so she could approach them
quickly and help them get back on track.
Because the community partners would not allow any form of random assignment, participants signed up for the study on a first-come, first-served basis. The first
120 participants to sign up for the study were placed in the training program, and the
next 70 participants to sign up composed the wait-list, assessments-only control
group. Participants’ training enrollment dates were determined by lottery. Registration sessions were held at two Head Start locations; the first 10 names drawn composed the first class and so on. Participants chose to attend training on Saturday
mornings or Tuesday evenings. When the participants signed up for the study, they
agreed to complete two, 3-hour training sessions and several informational surveys.
Upon completion of the program, they received a free, refurbished computer.
At the beginning of the first training session, participants filled out a written consent
agreement and the preintervention survey and were paid $10. The postintervention survey was administered immediately after the second 3-hour training was completed. We
then briefly instructed participants on how to set up a computer at home, and they
received their free computers with e-mail and Internet access. The refurbished computers
came with a 90-day warranty that included repair or exchange for new parts.
We operated a telephone help-line for the duration of the study, during which
participants could call in to ask questions about their computer and receive free
trouble-shooting. The program coordinator, who was proficient in home computer
use and basic troubleshooting, attended to the helpline. A computer specialist from
Computers 4 Kids who set up and maintained the computer lab was available
for help-line consultation as well. In addition, a bulletin board website, on which
participants could post computer- and training-related questions, was maintained.
The instructor from Computers 4 Kids, who was formally trained in computer
instruction and web design, posted class updates on the bulletin board website.
Three months after the participants completed the computer-training course, we
administered follow-up surveys. These surveys were mailed to participants’ homes
with a stamped, return-addressed envelope. Participants were paid $5 upon the
return of their follow-up survey. The participants who did not return their survey
by mail were telephoned at home to complete the surveys over the phone. Those participants on the wait list (who served as controls) were offered computer-training
classes upon completion of their 3-month surveys and computers upon completion
of the training.
Evaluation Plan
We evaluated the program in two ways. First, we conducted focus groups and interviews to examine issues related to the design, implementation, and administration
of this project. Qualitative data were collected to provide insights regarding the
opportunities and challenges of building an effective CTC. Second, outcomes associated with increased knowledge, skills, and use of computer-related technology, as
well as changes in attitudes regarding cancer prevention, also were investigated.
These quantitative data were collected from training participants and the individuals
serving as controls.
Qualitative Evaluation. We focused on how successfully the core components of
the program were implemented. Core implementation components included the
following:
1. Creation and initiation of computer training and technology centers at LULAC
Head Start buildings;
2. Design and implementation of a bilingual computer literacy curriculum for
low-income families living in the neighborhoods surrounding LULAC;
3. Recruitment for, attendance at, and completion of 6-hour computer training
sessions for 120 individuals; and
4. Participants’ perception of increased knowledge and skills related to computer
technology, including word processing, e-mail, and Internet as well as increased
awareness of cancer prevention resources and behaviors.
At the onset of the project, we conducted four focus groups to help us design the
structure, scope, and content of the training sessions. These groups also helped us to
estimate the literacy levels and language preferences for the participants, a significant
number of whom were Latino. We invited three groups of individuals to the four
focus group sessions (4 to 8 participants each):
1. Program and administrative staff of LULAC Head Start, who were targeted to
conduct recruitment and registration, participate in computer training sessions,
and provide a source for coaches and technical support for community residents;
2. Residents from the community who would volunteer to provide up to 24 hours of
coaching and technical support to participants; and
3. Community residents and parents of LULAC Head Start children who were
likely to participate in computer training sessions.
We designed a moderator’s guide to facilitate the focus group discussions, which
examined the following:
1. Baseline knowledge of and interest in computers;
2. Baseline knowledge of and interest in cancer prevention;
3. Commitment to 6 hours of training;
4. Ideal times and locations for training that would take into consideration the need for childcare and transportation;
5. Language preference; and
6. Literacy level.
Focus groups were conducted in Spanish and English for about 60 minutes. The
feedback from the focus groups was used to guide the components of the curriculum, the literacy level and language of the materials, and the logistics of the training sessions.
Interviews of project partners were also conducted to examine challenges with
implementation and sustainability. Staff who supervised implementation, staff
who served as hosts of the project, and instructors were interviewed monthly throughout the project year. Data from interviews were used to address the following:
1. Perceived effectiveness of the project;
2. Strength of collaborative partnership in problem-solving and decision-making; and
3. Ability of the partnership to address methods for sustaining the project beyond
the funding period.
Quantitative Evaluation. We compared 119 participants enrolled in the training
program against their own baseline as well as to a sample of 70 individuals assigned
to the wait-list group. One participant in the training program dropped out. Because
this evaluation strategy did not use random assignment, it can be considered a quasi-experimental design with two groups (intervention, control) and three time points
(baseline prior to the intervention, immediately post-intervention, and 3-month
follow-up).
Evaluators administered questionnaires at the start of the first session (baseline),
at the end of the second session (immediate post-intervention), and 3 months after
the training ended. All materials were available in Spanish and English. If literacy
levels were low, evaluators worked interactively with participants to read aloud
the questions and response alternatives in Spanish or English.
Measures for Quantitative Evaluation
The survey items were largely based on measures the HEB Laboratory at Yale University has created for its program of research on message framing and tailoring
(Latimer, Katulak, Mowad, & Salovey, 2005; Salovey & Williams-Piehota, 2004).
Pre- and Post-Intervention Questions
Computer-Related Attitudes and Perceived Computer Knowledge. Two questions
assessed how important computers were to participants’ current work and for the
job they would like to have in the future. One question was included to assess the
importance of having a computer in the home. Ratings were made on 5-point scales,
ranging from 1 (not important) to 5 (extremely important). One question assessed
perceptions of current computer knowledge and was also rated on a 5-point scale
from 1 (nothing) to 5 (very much).
Cancer Knowledge, Perceptions, and Experience. Perceived Knowledge: First,
participants rated the importance of knowing about how to prevent cancer on a scale
from 1 (not important) to 5 (extremely important). Then, participants were asked
separate questions about how much they already knew about lung cancer, breast
cancer, cervical cancer, prostate cancer, skin cancer, and childhood cancers. Ratings
again were made on 5-point scales ranging from 1 (nothing) to 5 (very much).
Perceived Risk and Severity: We used four questions to assess participants’ feelings
about and perceptions of cancer risk, as perceived risk has been a reliable predictor of
preventive health behavior (Becker, 1974; Janz & Becker, 1984). The first question asked
how anxious participants were about cancer, and was rated on a scale from 1 (not
anxious) to 5 (extremely anxious). Distress and anxiety can provide the motivation
for positive health behavior changes, although they also may lead to maladaptive
health behaviors in some cases (McCaul et al., 2005). One question was included to
assess participants’ perceived likelihood that they would get cancer compared with other
people, from 1 (not likely) to 5 (extremely likely). One question asked how much
contact the participant had with people who have cancer, rated from 1 (no contact) to
5 (a great deal of contact). The next two questions asked participants how severe a
disease cancer is, and how easy it is to treat cancer. Both questions were rated on 5-point scales ranging from 1 (not severe [easy]) to 5 (extremely severe [easy]).
Self-Reported Internet Knowledge and Use. One question briefly explained what
the Internet was and then asked participants to choose from among several options
numbered 1 to 3: I don’t know what the Internet is, I am not sure what the Internet is,
or I know for sure what the Internet is. Two additional questions asked how often
participants used the Internet to look up anything in the last 12 months, and how
often they had used the Internet to find information on cancer. These questions were
rated on 4-point scales from 1 (never) to 4 (7 or more times). Two additional questions assessed willingness and intentions to use the Internet to find cancer information in the next 12 months. Willingness was rated on a 5-point scale from 1 (not at all
willing) to 5 (extremely willing). Intentions were rated on a 5-point scale from 1 (definitely not) to 5 (definitely yes).
3-Month Follow-Up Questions
Some items were assessed only after the intervention to keep the questionnaires brief, especially given the limited formal education of the sample.
Self-Reported Health Behaviors and Intentions. A set of seven questions regarding self-reported health behaviors included the following: "Since completing the
computer training course have you: used sunscreen, smoked cigarettes, talked to a
doctor about a cancer question, looked up health information on the Internet,
increased the amount of fruits and vegetables you eat, called 1-800-4-CANCER
(the NCI’s Cancer Information Service), and told a friend about 1-800-4-CANCER?" Each of these questions was answered 1 to 3, indicating No; Yes, once;
or Yes, more than once. The same seven questions were asked in terms of future
intentions for the next 12 months and answered No or Yes.
Self-Reported Computer and Internet Use. We asked participants a series of 16
questions regarding their self-reported computer behavior in the last 3 months. "In the last 3 months have you: Written a letter on a computer; played music on
a computer; downloaded music on a computer; used e-mail; purchased a product
on the Internet; used a search engine, like Google, to find information on the
Internet; looked at any websites; looked at the program’s bulletin board; posted anything on the program’s bulletin board; seen your child use a computer on his/her
own; taught your child anything on a computer; taught a family member how to
do anything on a computer; taught a friend how to do anything on a computer; used
a computer outside your home; obtained a new job that requires computer use?"
These questions were rated 1 to 3: No; Yes, once; or Yes, more than once.
Results
Analysis Plan
We focused the quantitative data analysis on differences between the two groups
(training intervention, control) over time (baseline prior to the program, immediately
post-intervention, and 3-month follow-up) using mixed-model, two-way analyses of
variance (ANOVAs). Significant time or group main effects, and significant interaction effects, were followed up with pairwise comparisons among the means. We
focused our attention primarily on significant time-by-group interactions, which provide the most direct test of the efficacy of the training program. For the 3-month follow-up data, t tests
and chi-square analyses were conducted as well to examine specific differences
between the two groups in computer- and health-related intentions and behaviors.
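As a rough illustration of this analysis plan, the sketch below fits one 2 (group) × 3 (time) model for a single outcome using a linear mixed model with a random intercept per participant, which approximates the mixed-model ANOVA described here; it is not the authors' analysis code, and the file name and column names (participant, group, time, score) are hypothetical.

```python
# Minimal sketch of a 2 (group) x 3 (time) analysis for one outcome,
# e.g., perceived computer knowledge. Data layout and names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per participant per assessment point.
df = pd.read_csv("outcomes_long.csv")

# A random intercept per participant stands in for the repeated-measures
# structure; the C(time):C(group) terms estimate the time-by-group
# interaction that the article focuses on.
model = smf.mixedlm("score ~ C(time) * C(group)", data=df,
                    groups=df["participant"])
result = model.fit()
print(result.summary())
```

Significant interaction terms would then be followed up with Bonferroni-corrected pairwise comparisons among the cell means, as in the note to Table 2.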
Computer-Related Knowledge
It was expected that the intervention would increase participants’ perceived
knowledge of computers, as compared with the control group participants who
did not receive the training program. The focus groups indicated that participants
had little experience with computers and limited knowledge about cancer prevention.
The means provided in Table 2 corroborate these qualitative findings and indicate
that participants started the training sessions with significantly less perceived knowledge about computers than the controls (p < .01). Participants’ perceived knowledge
of computers significantly increased over time, however, as compared with the controls (p < .05). Follow-up tests of the significant time-by-group interactions reported
in Table 2 showed that the intervention was effective in increasing participants’ perceived knowledge about computers and that this knowledge was retained 3 months
later (p < .05). The intervention also increased participants’ knowledge about the
Internet in comparison with the control group, although the time-by-group interaction was only marginally significant (p < .10).
Computer-Related Attitudes
We expected that the intervention would influence participants’ attitudes concerning
computers. Ratings of the importance of computers for work differed somewhat
over time between the two groups (p < .10), as shown in Table 2. The intervention
affected participants’ ratings of the importance of computers for the jobs they would
like to have in the future (p < .01). Similarly, the intervention affected participants’
ratings of the importance of having computers in their homes (p < .01).
Computer-Related Intentions and Behavior
We believed that the intervention would increase participants’ computer use. The
analyses reported in Table 2 comparing the intervention and control groups over
time demonstrated that, in fact, the intervention significantly increased participants’
self-reported use of the Internet (p < .001) and, specifically, to search for information
on cancer (p < .001). Individuals in both groups, however, reported that they were
somewhat unwilling to use the Internet to find information on cancer in the future.
Table 2. Computer-related knowledge, attitudes, intentions, and behavior

                                  Pre-test        Post-test       3-Month Follow-up
                                  M (SD)          M (SD)          M (SD)

Computer-related knowledge
How much do you already know about computers?
  Int.                            2.59a (0.95)    3.27b (0.93)    3.17b (0.89)
  Cntrl.                          2.87b (0.83)    3.05b (0.83)    3.13b (0.80)
  Time F = 15.62***; Group F = 0.00; Time × Group F = 4.19*
The Internet is a set of millions . . .
  Int.                            2.56a (0.60)    2.70b (0.64)    2.85b (0.40)
  Cntrl.                          2.76b (0.49)    2.78b (0.58)    2.73b (0.61)
  Time F = 2.07; Group F = 0.37; Time × Group F = 3.01†

Computer-related attitudes
How important is a computer to the work you do?
  Int.                            3.85b (1.06)    3.61a (1.23)    3.35a (1.28)
  Cntrl.                          3.89b (1.20)    3.62ab (1.16)   3.78b (1.27)
  Time F = 4.75**; Group F = 0.61; Time × Group F = 2.51†
How important is a computer to the job you would like to have in the future?
  Int.                            4.42a (0.66)    4.45a (0.69)    4.08b (0.87)
  Cntrl.                          4.37a (0.71)    4.26ab (0.72)   4.45ab (0.69)
  Time F = 2.41†; Group F = 1.10; Time × Group F = 6.70**
How important is it to have a computer in your home?
  Int.                            4.42a (0.62)    4.34ab (0.75)   4.16b (0.67)
  Cntrl.                          4.45a (0.55)    4.53ab (0.60)   4.55a (0.65)
  Time F = 0.57; Group F = 1.63; Time × Group F = 5.23**

Computer-related behavior
How often have you used the Internet to look up anything in the last 12 months?
  Int.                            2.01a (1.14)    2.26b (1.13)    3.26c (1.0)
  Cntrl.                          2.89c (1.18)    2.89c (1.06)    2.82c (1.18)
  Time F = 15.77***; Group F = 3.84†; Time × Group F = 20.69***
How often have you used the Internet to find information on cancer in the last 12 months?
  Int.                            1.12a (0.37)    1.52b (0.69)    2.19c (0.86)
  Cntrl.                          1.34b (0.58)    1.63a (0.91)    1.76a (1.0)
  Time F = 37.9***; Group F = 0.09; Time × Group F = 8.21***

Computer-related intentions
How willing are you to use the Internet to find information on cancer in the next 12 months?
  Int.                            3.99b (0.91)    3.86a (0.91)    3.69a (0.92)
  Cntrl.                          4.11ab (0.81)   4.00ab (0.91)   3.95ab (1.13)
  Time F = 2.52†; Group F = 1.39; Time × Group F = 0.26
Do you plan to use the Internet to find information on cancer in the next 12 months?
  Int.                            4.58b (0.83)    4.33a (0.82)    4.29a (0.75)
  Cntrl.                          4.08a (0.97)    4.16a (0.95)    4.00a (0.93)
  Time F = 1.93; Group F = 5.77*; Time × Group F = 1.55

Note. Different superscripts denote significant differences between the cells, using Bonferroni-corrected, pairwise comparisons. Comparisons can be made only within the same row or the same column. Degrees of freedom may vary due to some missing data, but are generally (2, 224) for Time, (1, 112) for Group, and (2, 224) for Time × Group.
†p < .10, *p < .05, **p < .01, ***p < .001. Int. = Intervention group. Cntrl. = Control group.

Perceived Cancer-Related Knowledge
We were interested not only in increasing participants’ computer-related knowledge; it also was expected that the intervention would increase participants’ knowledge about certain cancers, especially those emphasized in the training program (breast, skin, cervical). As expected, the intervention increased participants’ perceived knowledge
of some of the specific cancers that were emphasized in the training sessions, but not
the cancers that were not discussed. Specifically, participants but not controls reported
significant increases in perceived knowledge about breast and skin cancers, F(2, 180) = 12.46, p < .001 and F(2, 178) = 6.42, p < .01, respectively. Similarly, participants but not controls reported increases in perceived knowledge about cervical cancer, F(2, 178) = 2.42, p < .10. Over the course of the study, however, both participants and controls reported increases in perceived knowledge about cancers not specifically discussed during the training sessions, such as cancers of the lung (F(2, 180) = 3.18, p < .05), prostate (F(2, 180) = 2.34, p < .10), and childhood leukemia (F(2, 178) = 11.07, p < .001). Taking the surveys multiple times may have increased respondents’
awareness of these cancers.
Emotions, Attitudes, and Beliefs About Cancer
We found that the intervention did not significantly affect participants’ worry about
cancer. Both participants and controls reported some increase in feelings of anxiety
about cancer immediately after the intervention, F(2, 170) = 2.53, p < .09. Both groups also reported significant increases in their perceived likelihood of getting cancer at some point in their lives, F(2, 168) = 4.52, p < .05. Again, these results may be attributable to a heightened awareness brought about by responding to the surveys three
times. The intervention did not affect participants’ ratings of the importance of
knowing how to prevent cancer nor ratings of the severity of most cancers. Analyses
comparing the intervention and control groups over time indicated that the intervention did increase participants’ ratings of how easy it is to treat cancer, although
not significantly, F(2, 174) = 2.33, p < .10, perhaps due to explicit discussions of
mammography screening and subsequent sharing of personal life experiences with
cancer among the participants.
Health-Related Outcomes at Follow-Up
At the 3-month follow-up, we asked all participants about various health behaviors
performed since the end of the computer training (from the immediately postintervention to the 3-month follow-up). The intervention group reported that they
were more likely to have searched for health information on the Internet
(M = 2.05, SD = 0.86) than the control group (M = 1.74, SD = 0.79; t(94) = 1.76, p < .09). They also were more likely to report that they told a friend about 1-800-4-CANCER (M = 1.48, SD = 0.71 vs. M = 1.13, SD = 0.42; t(94) = 2.63, p < .01).
There were no significant differences between the groups in terms of sunscreen
use, cigarette smoking, talking to a doctor about a cancer question, increasing the
number of fruits and vegetables consumed, or actually calling 1-800-4-CANCER.
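These follow-up contrasts are independent-samples t tests on the 3-point behavior ratings. The sketch below shows the general form of one such comparison; the arrays are made-up ratings, not study data.

```python
# Sketch of one follow-up comparison: did the intervention group report
# more Internet health-information seeking than controls? The arrays are
# placeholders (1 = No, 2 = Yes once, 3 = Yes more than once).
import numpy as np
from scipy import stats

intervention = np.array([3, 2, 2, 1, 3, 2, 2, 3])
control = np.array([1, 2, 1, 2, 1, 1, 2, 1])

t_stat, p_value = stats.ttest_ind(intervention, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```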
At the 3-month follow-up, we also asked participants whether or not they
intended to perform various health behaviors over the next 12 months, and the
differences between the intervention and control groups were compared using chi-square tests. The intervention group was somewhat more likely to report having plans to use sunscreen over the next 12 months than the control group (58.5% vs. 42.1%; χ²(1) = 2.82, p < .10). They were significantly more likely to plan to tell a friend about 1-800-4-CANCER over the next 12 months than the control group (81.9% vs. 55.6%; χ²(1) = 9.07, p < .01). There were no significant differences between the groups in their plans to smoke cigarettes, talk to a doctor about a cancer question, look up health information on the Internet, increase the number of fruits
and vegetables eaten, or call 1-800-4-CANCER.
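Because the intention items are yes/no, each of these group comparisons reduces to a chi-square test on a 2 × 2 table of counts. A minimal sketch follows; the counts are placeholders, not the study's data.

```python
# Sketch of a 2 x 2 chi-square test of intentions (yes/no) by group.
# The counts below are placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

#                 plans: yes  no
table = np.array([[77,        17],   # intervention group
                  [35,        28]])  # control group

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.3f}")
```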
Computer-Related Outcomes at Follow-Up
At the 3-month follow-up, all participants were asked about their performance of
various computer-related behaviors in the past 3 months. The intervention group
was more likely than the control group to report that they played computer games
(M = 2.67, SD = 0.61 vs. M = 2.20, SD = 0.81; t(85) = 3.03, p < .01), used e-mail (M = 2.18, SD = 0.93 vs. M = 1.77, SD = 0.85; t(86) = 1.99, p < .05), and looked at the program’s bulletin board in the past 3 months (M = 1.56, SD = 0.80 vs. M = 1.17, SD = 0.53; t(85) = 2.43, p < .05). These activities were emphasized and practiced during the training sessions. More importantly, participants in the intervention group were more likely than control group members to report that they had seen their children use computers on their own (M = 2.61, SD = 0.75 vs. M = 1.94, SD = 0.93; t(86) = 3.72, p < .001), that they had taught their children to do something on the computer (M = 2.53, SD = 0.79 vs. M = 1.68, SD = 0.83; t(84) = 4.70, p < .001), that they had taught other family members to do something on the computer (M = 2.48, SD = 0.80 vs. M = 1.68, SD = 0.86; t(80) = 4.21, p < .001), and that they had taught their friends to do something on the computer (M = 2.21, SD = 0.89 vs. M = 1.53, SD = 0.73; t(84) = 3.59, p < .001).
There were no between-group differences in the reported frequency with which
participants wrote letters on a computer, played music on a computer, downloaded
music on a computer, or purchased a product on the Internet. The intervention
participants did not improve in these areas, perhaps because they were not a focus
of the computer-training program. In addition, there were not any differences
between groups in their reports of the frequency with which they used a computer
outside of their homes, perhaps because all training group participants were given
a home computer as part of their participation in the program, and so they no longer
needed to go outside of their homes to use one. There were not any differences
between groups in their reports of obtaining a new job that required computer
use. This is not surprising considering that this follow-up was after only 3 months
and that it typically takes longer than 3 months to find success in the job market.
Although participants did visit the program’s bulletin board website, it appears that
they did not post to the bulletin board frequently.
Discussion
The training program and associated CTC had a positive impact on computer
literacy, and it was possible to increase cancer literacy in the context of such a program. For example, significant changes were observed among the participants in the
program (but not among the control group) including increased use of the Internet
to find health information, as well as a greater likelihood of using a computer and
teaching someone else—a child, family member, or friend—something on a computer. These results are encouraging, as they suggest that the participants shared what
they learned during the training sessions with others. Therefore, the training sessions
affected not only participants but likely others in the community who also may
have lacked computer skills. The diffusion of computer skills through social
networks is worthy of further study. Cancer literacy outcomes observed among
the program participants but not among the control group included greater
perceived knowledge about some cancers and more positive response efficacy beliefs
about cancer prevention.
Some of the findings ran counter to expectations. Paradoxically, it appears that
participants lowered their ratings of the importance of computers for work and
home following training. Perhaps such ratings were unrealistically high prior to
training, due to excitement about the pending program. After training, the ratings
may have regressed to more realistic (but still high) levels. Findings from the focus
groups 3 months after training suggested that participants uniformly expressed great
pleasure in owning a computer and knowing how to operate it. These largely African
American and Latino participants indicated a strong sense of self-confidence
regarding their skills and how these skills might relate to home and work activities.
Similarly, both intervention and control participants reported that they were
somewhat less willing to use the Internet to find information on cancer in the future.
Perhaps the novelty of the Internet for finding cancer-related information faded over
time. Or, perhaps program participants determined that this effort was no longer
needed, as they already had received such information in their course. Focus group
discussions also revealed that participants felt confident that they knew how to
locate information about cancer. Although only a few had searched for such information, those who had done so indicated that it was on behalf of a friend or family member. Participants may have been reluctant to indicate on a survey that they would seek cancer information in the future, yet during the focus groups they shared readily; as one participant noted, "if I ever had a question on cancer, I would now know how to find the answer."
Despite the program’s apparent success with respect to the kinds of outcomes
summarized above, we must point out some of the limitations in trying to generalize
from these findings. First, the fact that participants were not randomly assigned to
the program versus the wait-list (assessments-only) control group limits the inferences we can make about differences between the groups after the program. Random
assignment was—quite simply—unacceptable to the Head Start staff or parents.
They viewed the program as a wonderful opportunity to learn skills and obtain a free
computer and felt that only a first-come, first-served registration procedure was fair.
Despite the benefits of using random assignment to avoid threats to the internal
validity of a study, it conveys arbitrariness to community participants, and it seems
to punish those individuals who may have been most excited about the program (and
who, therefore, signed up early). Without random assignment, there is little doubt
that the program participants were different from the nonparticipants right from
the start. For example, they were more likely to be Head Start parents rather than
other neighborhood residents, and they had more positive beliefs (perhaps unrealistically positive beliefs) at baseline about the value and importance of computers in
their lives and for future employment. A lack of controls placed on contamination
of information between the intervention and control groups further limits the
conclusions one may draw from the study.
Other limitations included the rather short follow-up period of 3 months following the end of the program. One wonders whether these new computer skills and
acquired health knowledge snowball in some way, such that after, say, one year
the differences between participants and controls would be even more apparent.
Of course, differences may fade over time as well. Moreover, we wished it were
possible to have obtained a greater range of behavioral data at follow-up—to have
observed actual computer use preferably in participants’ homes. We wonder, for
example, whether a bit of computer literacy changes interactions with family members and neighbors. Being the first person in one’s building to have a computer in
one’s apartment may, for example, also afford opportunities to broaden one’s social
network. In addition, this study relied on self-reported measures, including measures
of perceived knowledge as opposed to direct measures of knowledge. It is possible
that intervention participants over-reported computer knowledge and behavior
changes at post-test; social desirability bias cannot be ruled out as at least a partial explanation for the findings. Reported changes in health behavior are
less likely to be an artifact of social desirability bias, however, because the health
information was provided incidentally.
Despite their limitations, these findings have practical applications for health
communications research. There are a number of other lessons learned from the
experience of linking academic institutions and community agencies in partnerships
designed to enhance community capacity. One key lesson is that the relationship
between academic institutions and the communities in which they reside often is built
on wide disparities in perceptions of each other. Community partners perceive that academic institutions share their wealth of resources with the community only when research aims can be met. Conversely, academic institutions fail to see
that community partners need a great deal of support to be effective collaborators
in decision making and problem solving. In this project, Head Start was eager
to accept the resources offered by Yale University. Because a system was not established that consistently included Head Start in planning and implementation,
however, Head Start was not in a position to take responsibility for the long-term
sustainability of these CTCs. To overcome similar challenges, it would be beneficial
for others conducting this type of research to hold a series of regular meetings and
maintain open communication between the academic institution and community
agency partners in all phases of the project including planning, implementation,
and maintenance.
A second, important lesson of this project is that low-income community residents are eager to learn computer skills. Many do not have the time or the financial
resources to take traditional computer classes, but with a manual and a free 6-hour
course, participants demonstrated the capacity to continue using computers in novel
ways at least 3 months beyond their training. Most importantly, they shared their
newly acquired skills with their children, other family members, and friends.
Although it is not clear if this project would have been as successful if participants
did not receive a free refurbished computer at the end of the training period, it is the
case that even older refurbished computers were an important incentive. There is a
clue, however, that neighborhood residents are eager for computer training even if
they do not receive a computer. After the program’s completion, computer-training
classes, in the form of brief workshops on specific topics, continued at the Head Start
CTCs. These workshops involved a small registration fee and did not result in
participants receiving a free computer upon completion. Yet, they remained popular
with Head Start parents and others in the surrounding community. There seems little
doubt that there is demand for programs of this kind.
A final and important lesson is that community residents are open to learning
about cancer prevention (and health, more generally), even "incidentally" in the
context of computer skills training. Discussions at the training sessions revealed
that many participants had experiences with cancer and were eager to share these
experiences in the group. Computer skills training programs have potential as
vehicles for transmitting health-relevant messages and motivating health information seeking.
References
Bartholomew, L. K., Gold, R. S., Parcel, G. S., Czyzewski, D. I., Sockrider, M. M., Fernandez,
M., Shegog, R., & Swank, P. (2000). Watch, discover, think, and act: Evaluation of
computer-assisted instruction to improve asthma self-management in inner-city children.
Patient Education and Counseling, 39, 269–280.
Becker, M. H. (1974). The health belief model and personal health behavior. Health Education
Monographs, 2, 324–473.
Breeden, L., Cisler, S., Guilfoy, V., Roberts, M., & Stone, A. (1998). Computer and communications use in low-income communities: Models for the neighborhood transformation
and family development initiative. Baltimore, MD: Annie E. Casey Foundation.
Campbell, M. K., Motsinger, B. M., Ingram, A., Jewell, D., Makarushka, C., Beatty, B.,
Dodds, J., McClelland, J., Demissie, S., & Demark-Wahnefried, W. (2000). The North
Carolina Black Churches United for Better Health Project: Intervention and process
evaluation. Health Education & Behavior, 27, 241–253.
Fox, S. (2006, October 29). Online health search 2006. Washington, DC: Pew Internet and
American Life Project. Retrieved May 1, 2009, from http://www.pewinternet.org/
PPF/r/190/report_display.asp
Fox, S., & Livingston, G. (2007, March 14). Latinos online: Hispanics with lower levels of education and English proficiency remain largely disconnected from the Internet. Washington,
DC: Pew Internet and American Life Project & Pew Hispanic Center. Retrieved May 1,
2009, from http://www.pewinternet.org/~/media/Files/Reports/2007/Latinos_Online_
March_14_2007.pdf.pdf
Gary, T. L., Bone, L. R., Hill, M. N., Levine, D. M., McGuire, M., Saudek, C., & Brancati, F. L.
(2003). Randomized controlled trial of the effects of nurse case manager and community
health worker interventions on risk factors for diabetes-related complications in urban
African Americans. Preventive Medicine, 37, 23–32.
Institute of Medicine (IOM). (2001). Crossing the quality chasm: A new health system for the
21st century. Washington, DC: National Academy Press.
Institute of Medicine (IOM). (2002). Speaking of health: Assessing health communication
strategies for diverse populations. Washington, DC: National Academy Press.
Institute of Medicine (IOM). (2004). Health literacy: A prescription to end confusion. Washington,
DC: National Academy Press.
Janz, N. K., & Becker, M. H. (1984). The health belief model: A decade later. Health Education
Quarterly, 11, 1–47.
Kim, S., Koniak-Griffin, D., Flaskerud, J. H., & Guarnero, P. A. (2004). The impact of
lay health advisors on cardiovascular health promotion: Using a community-based
participatory approach. Journal of Cardiovascular Nursing, 19, 192–199.
Kreps, G. L., Gustafson, D., Salovey, P., Perocchia, R. S., Wilbright, W., Bright, M. A., &
Muha, C. (2007). The NCI digital divide pilot projects: Implications for cancer education.
Journal of Cancer Education, 22, S56–S60.
Latimer, A. E., Katulak, N. A., Mowad, L., & Salovey, P. (2005). Motivating cancer prevention and early detection behaviors using psychologically tailored messages. Journal of
Health Communication, 10 (Suppl. 1), 137–155.
Lazarus, W., & Mora, F. (2000). Online content for low-income and underserved Americans:
The digital divide’s new frontier. Santa Monica, CA: The Children’s Partnership.
McCaul, K. D., Peters, E., Nelson, W., & Stefanek, M. (2005). Linking decision-making
research and cancer prevention and control: Important themes. Health Psychology, 24,
S106–S110.
National Cancer Institute (NCI). (1999). The nation’s investment in cancer research: A budget
proposal for Fiscal Year 2001. Bethesda, MD: Author.
National Telecommunication and Information Administration. (2004). A nation online: Entering the broadband age. Retrieved May 1, 2009, from http://www.ntia.doc.gov/reports/
anol/index.html
Pew Internet & American Life Project. (2007). February 15 – March 7, 2007 tracking survey.
Retrieved May 1, 2009, from http://www.pewinternet.org/trends.asp
Ratzan, S. C., & Parker, R. M. (2000). Introduction. In C. R. Selden, M. Zorn, S. C. Ratzan,
& R. M. Parker (Eds.), National Library of Medicine Current Bibliographies in Medicine:
Health Literacy (NLM Pub. No. CBM 2000-1). Bethesda, MD: National Institutes of
Health.
Resnicow, K., Wallace, D. C., Jackson, A., Digirolamo, A., Odom, E., Wang, T., Dudley, W.
N., Davis, M., Mitchell, D., & Baranowski, T. (2000). Dietary change through African
American churches: Baseline results and program description of the eat for life trial.
Journal of Cancer Education, 15, 156–163.
Resnicow, K., Campbell, M. K., Carr, C., McCarty, F., Wang, T., Periasamy, S., Rahotep, S.,
Doyle, C., Williams, A., & Stables, G. (2004). Body and soul. A dietary intervention
conducted through African-American churches. American Journal of Preventive Medicine,
27, 97–105.
Rudd, R., Kirsch, I., & Yamamoto, K. (2004). Literacy and health in America. Princeton, NJ:
Educational Testing Service.
Salovey, P., & Williams-Piehota, P. (2004). Field experiments in social psychology: Message
framing and the promotion of health protective behaviors. American Behavioral Scientist,
47, 488–505.
Staten, L. K., Gregory-Mercado, K. Y., Ranger-Moore, J., Will, J. C., Giuliano, A. R.,
Ford, E. S., & Marshall, J. (2004). Provider counseling, health education, and community
health workers: The Arizona WISEWOMAN project. Journal of Women’s Health, 13,
547–556.