Spam, Spam, Spam, Spam: How Can We Stop It?

Jenny Preece
Dept. of Information Systems
UMBC
1000 Hilltop Circle
Baltimore, Maryland, 21250 USA
+1 410 455 6238
[email protected]
Jonathan Lazar
Dept. of Computer and Information Sciences
Towson University
8000 York Road
Towson, Maryland 21252 USA
+1 410 704 2255
[email protected]
ABSTRACT
How do we keep our channels of electronic
communication, both individual and group, open, while
keeping out inappropriate and unrelated materials, such as
spam? Does someone other than the intended recipient
have the right to control what electronic mail users see?
Might this lead to censorship? If others DO have the right
to control what e-mail users see, how should this filtering
or censorship occur? Are users aware of this filtering? If
others are NOT controlling what users receive, what can
users themselves do to control their environments to limit
the amount of incoming spam? These are some of the
topics that this CHI panel will address.
Keywords
Computer-Mediated Communication, Filtering, Censorship, Moderation, Electronic Mail, Spam, Teamwork, Ethics, Informed Consent, Online Communities, Human Values

INTRODUCTION
As computer-mediated communication (such as e-mail and listservers) becomes an increasingly important component of human-human communication, we rely on it and assume that it will always work and be failsafe. In reality, there are thousands of chain letters, flames, and other inappropriate e-mail messages circulating around the Internet. Spam is a growing problem: one estimate is that the amount of spam has increased 600% in the last year; another is that 12-15% of all e-mail traffic is spam [1,2].
How can users and managers deal effectively with spam and other unwanted e-mail? On the one hand, too much spam wastes users' time and can overload organizational information systems. On the other hand, automatically deleting e-mail that is perceived to be spam, or otherwise inappropriate, can also delete e-mail that is appropriate and intended for the recipient. Two commonly used techniques for dealing with spam are filters and moderation. But where does spam control stop and censorship start? To address these technical and ethical issues, our panel consists of experts in filtering techniques, moderation, and computer-mediated communication, as well as in ethical and policy issues.
Filtering
E-mail filters are programs or agents that automatically delete or file outgoing or incoming e-mail based on certain criteria. These filters can be set up to delete or file mail if it includes certain words, or if it comes from certain e-mail addresses, domains, or countries. Filtering can take place at an individual or an organizational level. For instance, an individual could use filtering to route e-mails on a specific topic directly to a mail folder (e.g., the bulk mail feature of Yahoo! Mail). Another option for an individual is a killfile, which deletes all incoming messages from a specific e-mail address.
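As a concrete illustration, the sketch below shows the kind of criteria-based filtering and killfile behavior described above. The rule lists, addresses, and message format are illustrative assumptions for this paper, not any particular mail client's API.

    # A minimal sketch of criteria-based e-mail filtering, under assumed
    # rule lists; a real client would operate on parsed message objects.
    KILLFILE = {"noreply@bulk-offers.example"}   # hypothetical blocked sender
    BLOCKED_WORDS = {"free offer", "act now"}    # hypothetical subject keywords

    def route_message(message):
        """Return 'delete', 'bulk', or 'inbox' for a message dict
        with 'sender' and 'subject' keys."""
        sender = message["sender"].lower()
        subject = message["subject"].lower()
        # Killfile: silently drop all mail from specific addresses.
        if sender in KILLFILE:
            return "delete"
        # Keyword rule: file suspected bulk mail into a separate folder
        # rather than deleting it, so false positives can be recovered.
        if any(word in subject for word in BLOCKED_WORDS):
            return "bulk"
        return "inbox"

    print(route_message({"sender": "noreply@bulk-offers.example",
                         "subject": "Hello"}))               # -> delete
    print(route_message({"sender": "colleague@example.edu",
                         "subject": "Free offer inside!"}))  # -> bulk

Note that the keyword rule files rather than deletes: as discussed below, silent deletion is exactly what makes organizational filtering risky.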
At the organizational level, all users should ideally be aware of organizational filtering, along with the criteria for filtering. In reality, this rarely happens. When users are not aware of the presence of filters, they may assume that all of their sent e-mail is reaching its recipients and, at the same time, that they are receiving all e-mail sent to them. However, some companies automatically filter any e-mail that comes from outside the U.S.A., since those companies view any non-US e-mail as non-business-related [2]. Other companies filter employee e-mail from certain domains without informing those users. This is a major problem for geographically distributed teams, who cannot always be confident that their e-mails are arriving. When mail is sent to an incorrect e-mail address, the sender may receive a "return to sender" message indicating that the e-mail did not arrive; when e-mails are automatically filtered, senders may be unaware that their messages are not arriving, and trust may break down between distributed team members. In addition, e-mail filters face many of the same problems as web site filters: appropriate messages may be filtered out, while inappropriate messages may still find their way through.
Moderation
Moderation is a process for deciding whether a message sent through a group communication tool (such as a listserver or newsgroup) is appropriate and should be distributed. Moderation can be done manually (by a human moderator) or with the help of software. Moderation software can be very similar to filtering software, since moderation is really a form of filtering for a group setting. However, if clear posting policies are not present, users can be confused about what should be posted and why their message was not posted to the community. Without some form of moderation, an online community may become overwhelmed with unwanted and inappropriate messages, making it impossible for the community to function [3]. The community may then lose members or die out. Effective moderation can help avoid these threats to the community.
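To ground the discussion, the following sketch shows one way software can assist rather than replace a human moderator: posts that clearly meet the posting policy are distributed, and everything else is held in a review queue instead of being silently deleted. The policy checks, term lists, and message format are hypothetical assumptions, not any specific list server's implementation.

    # A minimal sketch of software-assisted moderation for a group list,
    # under an assumed posting policy.
    MAX_LENGTH = 10000                                   # assumed policy limit
    ON_TOPIC_TERMS = {"usability", "spam", "filtering"}  # hypothetical topics

    def screen_post(post, review_queue):
        """Distribute a post that clearly meets the posting policy;
        queue everything else for a human moderator."""
        text = post["body"].lower()
        if len(text) > MAX_LENGTH:
            review_queue.append((post, "too long"))
            return "held"
        if not any(term in text for term in ON_TOPIC_TERMS):
            # Possibly off-topic: a human decides, since off-topic posts
            # can stimulate conversation as well as distract.
            review_queue.append((post, "possibly off-topic"))
            return "held"
        return "distributed"

    queue = []
    print(screen_post({"body": "A question about spam filtering"}, queue))  # distributed
    print(screen_post({"body": "Buy cheap watches now!"}, queue))           # held
    print(queue[0][1])                                   # -> possibly off-topic

The design choice here mirrors the posting-policy point above: holding a message for human review, with a stated reason, is more transparent than silent filtering.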
Some Questions Posed to the Panel
Should companies, government agencies, Internet service providers, online community managers, and universities have the right to stop spam and other e-mail that is considered inappropriate? Yes or no?
If some filtering by others is acceptable, how do we control what is filtered, and how filtering is done, so that there is no potential for unwanted censorship?
If filtering by others is unacceptable, what are the best tools for individuals to use to control their own incoming e-mail? What are the most current software methods for filtering inappropriate e-mail messages? What software tools are used, and what are the flaws in those tools?
What challenges do moderators and/or "human filters" face in group communication? What is the role of a collaborative filtering system?

RELATION TO CHI 2003 THEME
The conference theme is New Horizons, and one of the special topic areas is mass communication. Filtering and moderation are becoming major challenges as unwanted e-mail continues to plague users and the amount of spam only increases. Unfortunately, spam is on the horizon, and it is a threat to successful mass communication.
PANEL FORMAT
1. The moderator will introduce the panelists, who will each give a brief statement of their views on the topic (10 minutes). The moderator (Jonathan Lazar) will also discuss his own experiences with spam and moderation, which led to the formation of this panel (5 minutes).
2. The moderator will present a scenario related to spam and moderation; each panelist will have a few minutes to respond, followed by a free-for-all discussion among the panelists (20 minutes).
3. The audience will respond to the scenario (10 minutes).
4. A second scenario will be presented, and each panelist will have a few minutes to respond, followed by a free-for-all discussion among the panelists (20 minutes).
5. The audience will respond to the second scenario (10 minutes).
6. Jonathan Lazar will lead the panel in a summary of the day's discussion, followed by any outstanding questions from the audience (15 minutes).
PANELISTS
Jonathan Lazar, Towson University (panel moderator)
Elizabeth Churchill, FX Palo Alto Laboratory: We can't, and should not, stop spam; we can raise the costs for senders and give end-users more effective controls. Where institutional spam filters and moderation practices are in place, they should follow good design: make actions explicit and give clear feedback about what has been filtered.
Hans de Graaff, KPN Mobile: In two years' time we'll look back on spam as one of those curious growing pains of the Internet, but human moderation will stay with us forever.
Batya Friedman, University of Washington: Filters are an imperfect technology, at times censoring legitimate communication and at times allowing access to unwanted communication. Given these imperfections, I will argue that technical solutions to spam based on filter technology should also implement informed consent, so that users (1) are aware of the filter's strengths and limitations, and (2) have the opportunity to accept or decline use of the filter.
Joseph Konstan, University of Minnesota: The same technological advances that make it possible for one "speaker" to flood millions of mailboxes can also empower the "listeners" to individually and collectively filter and moderate content to close the door on unwanted messages. As a society, this leaves us with the same difficult challenge: devising economic and social structures that avoid theft of attention and despoiled common information spaces without overreacting and inviting censorship.
Jenny Preece, UMBC: Moderation is a well-established technique for filtering messages, but should moderators have sole power to decide what is posted and what is not? Off-topic messages can stimulate conversation or be distracting, and controversial comments can have positive or negative impacts. Tools are needed, but they are not perfect, and clear policies can help bridge the gap.
REFERENCES
[1] Lee, J. (2002, June 27). Spam: An escalating attack of the clones. The New York Times.
[2] Mayer, C., & Eunjung-Cha, A. (2002, June 9). Making spam go splat: Sick of unsolicited e-mail, businesses are fighting back. The Washington Post, p. H1.
[3] Preece, J. (2000). Online Communities: Designing Usability, Supporting Sociability. New York: John Wiley & Sons.