Are Intelligent E-Commerce Agents Partners or Predators?

Christian Wagner and Efraim Turban

Mobile agents are changing the face of e-business and reshaping business models. In the process these agents are also posing new concerns regarding who really owns information.

EBay made headlines three years ago by initiating a drastic new policy against third-party predatory search agents. The policy was directed against intelligent agents that would enter the auction site, search for items their issuers were looking for, and then notify the issuer about pricing and deadlines. EBay’s modified user agreement would prohibit third-party sites from collecting and sharing information found on eBay’s site. The problem, as reported, was the search agents were frequently accessing eBay, sifting through auction offers, harvesting the information, and placing it on alternate Web sites known as “aggregators.” The list of aggregators included names such as biddersEdge.com (now bankrupt), AuctionWatch.com (now a portal site on auction management), itrack.com (acquired by overBid), and Ruby Lane (now an auction portal). EBay claimed the aggregators’ search agents were harmful in multiple ways. They would slow down eBay’s transaction processing systems, thus reducing performance for all other eBay visitors. Moreover, outside search agents might not show the most up-to-date information and thus lower auction users’ purchasing experience. Executives from the third-party information aggregators were quick to point out their systems were actually benevolent in that they served as “repeaters” or mirrors of eBay information, thus actually lowering the load. They also stressed that while they were promoting the offers, purchasing transactions were still carried out at eBay’s site, so business was not taken away from the company, but, in fact, promoted.

The culprits in this situation were mobile intelligent agents [8], or more specifically, one type of mobile intelligent agent. In the eBay scenario, intelligent agents were harvesting information and sending it to their company’s computer, which collected, analyzed, and redistributed that information.

Are agents truly predators? EBay’s response clearly suggests so. Yet, Murch and Johnson [9] claim just the opposite, stating “It is in the interest of all companies that wish to sell over the Internet ... that their information is formatted and available in such a way that it can be easily accessed by ... these agents.” In other words, agents are viewed by some as having positive characteristics.

Despite recent court rulings, third-party companies are still assessing the legality of eBay’s action, and hundreds of Web users are still expressing their opinions in chat rooms and newsgroups (the vast majority criticizes eBay; see “Talkback post” at www.znet.com). Here, we discuss the various aspects related to intelligent agents and information aggregation, focusing on auction sites as well as implications for other e-commerce.
May 2002/Vol. 45, No. 5 COMMUNICATIONS OF THE ACM
What Do Intelligent Agents in E-Commerce Do?
Many types of agents exist in e-commerce. Maes [7]
organizes e-commerce agents into three categories
that correspond to stages in the buying process: product brokering, merchant brokering, and negotiation.
Wang [11] classifies e-commerce agents into eight
categories according to the various tasks they support. An overview of agents by application area (for
example, in email, competitive intelligence, banking,
and investment) can be found in [9], while
www.botspot.com provides free periodical reports on
innovative agent applications. Turban et al. [10] offer
numerous examples of real-world agent applications.
Intelligent agents can carry out numerous decision-making and problem-solving tasks that traditionally require human intelligence, such as diagnosis, data classification, planning, or negotiation. They can answer email messages, search the Internet for useful information, carry out comparisons, or even become electronic pets (such as the Tamagotchi). They can also buy and sell products or services. Among the many types of agents, the most relevant for our discussion are mobile agents that collect information from remote sites. Mobile agents are not bound in their operation to the server from which they originate. They are typically written in a platform-independent language such as Java, and can travel from host to host, where they execute as if they were local programs. Here, we mean “mobile agent” whenever we use the term “agent.”
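The travel-and-execute pattern described above can be sketched in miniature. The following toy Python sketch (host names, listings, and the search criterion are all invented for illustration) shows an agent carrying its own search know-how and accumulated findings from host to host:

```python
class MobileAgent:
    """Toy mobile agent: carries know-how (the search rule) and state
    (its findings) along as it visits hosts."""

    def __init__(self, keyword):
        self.keyword = keyword   # the issuer's search criterion
        self.findings = []       # state the agent carries with it

    def run_at(self, host_name, listings):
        # Executes "locally" at the host, like a local program.
        for item, price in listings.items():
            if self.keyword in item.lower():
                self.findings.append((host_name, item, price))

# Simulated auction hosts (hypothetical data).
hosts = {
    "auction-a": {"Antique clock": 120, "Laptop": 450},
    "auction-b": {"Clock radio": 15, "Camera": 90},
}

agent = MobileAgent("clock")
for name, listings in hosts.items():   # the agent "travels"
    agent.run_at(name, listings)

print(agent.findings)
# [('auction-a', 'Antique clock', 120), ('auction-b', 'Clock radio', 15)]
```

A real mobile-agent platform would serialize the agent’s code and state and ship them across the network; here the “travel” is simply a loop over simulated hosts.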
Hendler [4] differentiates four types of agents by function. Problem-solving agents do what many traditional planning expert systems did, namely gather data, analyze a situation, and make a corresponding decision on how to act on the user’s behalf. Purchasing agents fall into this category. User-centric agents facilitate interaction with the user. In essence, they provide a better user interface by learning about the user’s system use preferences and tailoring the interface to those preferences. Control agents control the operation of several agents in a multiagent environment. In this context one needs to remember that agents are not only mobile, but also small in size, each with a very specialized capability. Hence, the interaction of several agents might be necessary to provide sufficient intelligence and capability. These are very advanced agents used in research experiments. Finally, transaction agents translate information between different data standards within a heterogeneous database or file environment. Among these four types, the ones that create contention are problem-solving agents specializing in data harvesting. They may be assisted by transaction agents to access data from multiple data sources and may be controlled by control agents. Nevertheless, the critical functionality is the ability to collect and analyze information from remote sites.
From the perspective of computing paradigms, Web agents offer a new alternative that has evolved from the concepts of client/server computing and code-on-demand [6]. In client/server computing, services are implemented on a server and offered by that server to clients. Hence, the model is server-centric, and intelligence is not easily added. The server holds the know-how as well as processor capability and resources. In a code-on-demand environment, clients have resources and processing power (such as any user’s PC accessing the Internet), but often not all the necessary know-how. This can then be downloaded from a host (for example, in the form of Java applets). In the agent environment, all three—processing, resources, and know-how—can be flexibly distributed throughout the network environment. Agents containing know-how and possibly resources can travel from host to host, carry out tasks, gather information, and then move on again. Agents are (or should be) fairly light, thus creating only a light processing load on the network environment and consuming or occupying only few resources. Small agent size offers several advantages over other computing paradigms, namely low latency and little network load.
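The first two paradigms can be contrasted in a minimal sketch (the discount logic and function names are invented). In client/server, the know-how stays on the server; in code-on-demand, the client downloads the know-how and executes it with its own resources; the agent paradigm would additionally let code and state move from host to host:

```python
def discount_knowhow(price):
    # The "know-how": business logic for computing a discounted price.
    return round(price * 0.9, 2)

# Client/server: the client sends data, the server computes and replies.
def server_compute(price):
    return discount_knowhow(price)

# Code-on-demand: the server hands out the code itself (like a Java
# applet), and the client executes it locally with its own resources.
def server_send_code():
    return discount_knowhow

print(server_compute(100))         # server-centric: 90.0
downloaded = server_send_code()    # client now holds the know-how
print(downloaded(100))             # executed at the client: 90.0
```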
Although agents are light and not too resource intensive, they nevertheless require some resources. Furthermore, these resources are not provided by the host that instantiated them, but by some other server where they temporarily reside. In other words, third-party sites are appropriating resources from remote hosts without compensation. However, Jim Wilcoxson, CEO of Ruby Lane, quantifies the consumption: “Our programs ... are very sophisticated, automatically slowing down or stopping if the load on eBay gets too high. The agents represent only 0.025% of eBay’s traffic.” Whether 0.025% is an acceptable level of resource consumption is subject to debate.

Table 1. Internet business models.

The fact is, however, that every entity represented by a server on the Internet implicitly agrees to have some of its resources occupied or consumed by outside parties, whether they are buyers or not. This is, after all, one of the new realities of the Internet world, where companies are opening up their databases and transaction processing systems to anyone. The resource providers can obviously insist that their resources are only available for the direct and sole benefit of the end user but not for intermediaries, and can formulate contracts to restrict use by intermediaries. Enforcement of such agreements is difficult, however, especially if no user registration is required for site access.
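Wilcoxson’s description of agents that “automatically slow down or stop” when the remote host is busy amounts to a load-aware polling policy. A minimal sketch, assuming the remote host exposes some load signal (the thresholds and the load probe are invented):

```python
import time

def polite_fetch(fetch, get_load, high_load=0.8, base_delay=1.0):
    """Fetch once, but first check remote load: abort if the host is
    too busy, otherwise delay in proportion to the reported load."""
    load = get_load()
    if load >= high_load:
        return None                   # stop: host is too busy
    time.sleep(base_delay * load)     # slow down proportionally
    return fetch()

# Hypothetical host reporting 30% load and serving one listing page.
result = polite_fetch(
    fetch=lambda: ["item1", "item2"],
    get_load=lambda: 0.3,
    base_delay=0.0,                   # no real waiting in this demo
)
print(result)   # ['item1', 'item2']

busy = polite_fetch(lambda: ["x"], lambda: 0.9, base_delay=0.0)
print(busy)     # None: aborted because load exceeded the cutoff
```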
Is Information Sharing a Win-Win Situation?
One might argue that eBay—or other companies in the same situation—is not hurt by agents at work. Wilcoxson of Ruby Lane pointed out his company’s agents only consume a fraction of the resources consumed by a Web browser accessing the site. Furthermore, he stressed that by mirroring eBay’s listings, Ruby Lane would in fact take some of the search load off eBay’s site, while the transaction would still be completed with eBay.

To properly interpret this argument, it is necessary to understand different archetypal business models on the Internet. The several available classification models, for example [1], and the three business models shown in Table 1 are useful to illustrate the different arguments related to this topic.
As Table 1 illustrates, the first two types of Internet business models charge for their service, with the purpose of generating a profit from that service. For instance, an e-shop such as eBay will charge for products (services), while an application service provider (ASP) will charge a service fee for software use. The third model, “e-free,” offers something of value, typically information or an information-based product, without charge (one of the Internet’s original driving forces). However, in order to be commercially viable, e-free providers need to generate commissions or advertising fees by producing sales transactions and revenue elsewhere. Hence, they have to contain advertising, offer sales leads, or sell their customer base to others. An e-free provider may argue that all it wants are “eyeballs,” not the actual transaction. So, if eBay retains the transaction (and eBay does not need the eyeballs), the result is a win-win situation.
Unfortunately, the circumstances have changed
following the significant decline in online advertising
revenues during 2000 and 2001. Several of the companies with an e-free model, including third-party
agent sites, have had to change their business models
(or will need to do so very soon). As a result, auction aggregators changed into specialized auction sites (for example, Ruby Lane) or auction management companies (for example, AuctionWatch), or sold the business to a larger competitor (for example, iTrack), thus becoming potential or direct competitors.
To analyze the potential conflict between aggregators and the sites they harvest, it is also worthwhile to investigate the market capitalization contribution of the different models. At this point, few companies with a pure e-free model remain, and lack of sales transactions is strongly reflected in market value. At the same time, many of the e-shops use their brand strength to draw advertising revenues, notably Amazon and eBay. In fact, part of the relatively high valuations of e-shops can be attributed to brand strength and its leverage in advertising (rates), sales leads, or cross-selling. A table of stock valuations of different Internet entities illustrates this concept.

Table 2. Company stock valuation by revenue generation model and number of users.

Table 2 offers strong evidence that a company’s e-shop contributes significantly more to its valuation than any e-free offering, especially following the dramatic downturn in online advertising rates. In today’s market, the e-free model contributes, on a per-user basis, only single-digit ($4 for iVillage) or low double-digit dollar values ($18 for Ask) to a company’s valuation. This represents a drop of more than 95% from two years ago. At that time, a Web agent that rerouted visitors from the original data-producing site (such as eBay) to an alternate site could potentially increase the valuation of the agent site by as much as a few hundred dollars per visitor. Today, the monetary values are much lower, but with e-free sites desperately fighting for survival, the stakes are even higher. Furthermore, an alternate site can cherry-pick, using its Web agents and its own Internet-based information systems to handle primarily popular items that draw customer attention, whereas the original site has to handle all transactions. Hence, the resulting situation is that the losses to the data originator (for example, eBay) may not necessarily be made up by extra sales generated there by referrals from the alternate (agent) site.

An additional issue is the critical mass of e-shop sites. Any seller knows that being “big enough” is very important. Even in traditional markets, companies (for example, car dealerships) co-locate in order to draw a critical mass of potential buyers. Web sites, including auction Web sites, rely on this concept even more due to the large fixed cost in infrastructure and advertisement. EBay’s top rank as an auction site—in number of customers, number of visits, number of transactions, and time spent at the eBay site—clearly gives it that critical mass. It also provides a strong incentive for others to replicate that critical mass by simply mirroring eBay’s offerings and then augmenting them with those from other sources. The mirroring weakens the need for auction buyers or sellers to choose the eBay (or other original) site as the primary point to look for listed items. Buyers want to save time, so instead of visiting several auction sites, they will go to one site that aggregates information from several sources. Sellers can choose any auction site evaluated by Web agents, and may pick the one with the least commission, instead of the most popular site. Thus, mirroring weakens the reinforcement provided by critical mass and ultimately may erode it.
Are Agents a Security Risk?
A major concern voiced by opponents of (mobile) intelligent agent technology is that agents can pose a security risk not only to remote hosts, but also to their original host (and to themselves). A comprehensive discussion of these risks and possible countermeasures is provided by [8]. The following potential risks were identified in part from [6]:

Stealing data/Illegal access. Web agents may try to gain access to databases they are not supposed to access or for which there is an access fee.

Free use of resources (through masquerading). Agents always “steal” resources from remote hosts. As long as this is in line with accepted protocols, it is an acceptable practice. However, if agents masquerade as alternate processes, they may use unacceptable levels of resources. For example, a Web agent may even “borrow” resources from a remote host to send or receive email.

Unauthorized program execution (Trojan horse). Agents may also masquerade and then execute programs that are ultimately harmful to the remote hosts. Such Trojan horses have already been used repeatedly on the Internet. However, an open computing environment that freely accepts agents on remote hosts creates a much larger, riskier arena for such attacks.

Data stripping or alteration (by server). Technically it is possible to strip Web agents of their data. This is mostly a concern for a site that sends out agents to remote hosts, but it could also potentially affect other sites. For instance, suppose Buyer has a trusted relationship with both Seller 1 and Seller 2. However, there exists a competitive relationship between the two sellers. An intelligent agent that originates from Buyer and travels to Seller 1 and then to Seller 2 could be stripped by Seller 2 to obtain competitive data about Seller 1.

Resource exhaustion (trashing) resulting in denial-of-service. Web agents can exhaust remote host resources to the point where the remote host can no longer function properly. Ruby Lane’s CEO points out that companies spidering the eBay site only consume about 0.025% of eBay’s resources, and even that would only take place in off-peak load situations. Other agents may not be as considerate to the remote host: in fact, they can be designed to bring down the remote server, as has been aptly demonstrated by numerous denial-of-service attacks.
Deceitful agent behavior. Agents can mislead other agents or hosts about their intention and can lie about transactions. For example, in agent activities that go beyond information collection, such as transaction completion, a malicious behavior would be the denial of a transaction that actually took place. The agent essentially reneges on the deal, like a person might do in real-world transactions. This is a fundamental issue, since an increasing number of transactions will make the monitoring of each individual transaction less feasible and thus increase the need for trust in such transactions (for example, [3, 4]).

Benefits of Agents
Despite their limitations and risks, mobile Web agents are hailed as one of the most attractive technologies for the near future and are considered by many an absolute necessity for e-business in light of the exponential information load increase for buyers and suppliers. Hence, it is worthwhile to look at some positive effects of Web agents related to the debate.

Transaction costs. One of the benefits e-commerce can provide is lowered transaction costs. Yet, in order to achieve this goal, much of the transaction processing procedure needs to be automated. If closing the deal requires negotiation, search for information, or similar activities, it also requires intelligence, and thus provides a rich application area for Web agents.

Furthermore, the cost of processing transactions that do not add value per se needs to be lowered as much as possible, particularly customer inquiries about items or between-sales service requests. It has become so easy for customers to send out requests for information or service by email that companies are inundated with this type of mail. Companies such as Dell Computers receive tens of thousands of email messages per day, many of which are not purchase requests. Asking a question is generally quick, easy, and inexpensive; answering it may be just the opposite. Hence, an ability to handle 80% to 90% of inquiries through AI-based automatic procedures greatly reduces the cost of staying close to customers. Answer agents or advice agents that have this ability, possibly even in real time, are available from several vendors. Firepond, for instance, claims its answer agent (“eServicePerformer Answer”) can handle “up to 80% of a customer’s email with 98% accuracy” and respond within seconds.

An example of extensive use of agents can be seen at Cisco Systems. Cisco is using a suite of commerce agents that help customers/partners to do business electronically:

• Lead Time Agent gives customers and partners the current lead times for Cisco products.
• Service Order Agent provides access to information on service orders.
• Contract Agent provides information about service contracts.
• Upgrade Agent allows requests for software or hardware upgrades and documentation.
• Notification Agent lets users specify criteria that will result in receiving email automatically about changes in order status or pricing.
• Configuration Agent enables users to create and price online configurations.

These agents can be integrated with the customers’ information systems.

Turnaround time. In some e-commerce applications, quick turnaround (or cycle) time is absolutely
crucial. Customers of Internet brokerage firms want instantaneous order execution, and also expect a response to inquiries within a very short time, that is, no longer than 24 hours. A brokerage firm has to be able to provide such service levels regardless of high or low market volume. In fact, on high-volume days (that is, days with large market volatility), the number of information requests might even be disproportionately higher than on normal transaction days. Hence, the brokerage has to be able to respond to the peak load, while ideally not keeping too much overhead during low-load periods. Here again, Internet agents able to classify requests and answer routine inquiries can significantly lower the transaction volume and provide a high service level even in peak periods. An example is E*Trade’s “ask” agent.
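A request-classifying agent of this kind can be approximated, very crudely, with keyword rules. The rules and canned answers below are invented; commercial answer agents rely on far more sophisticated language processing, but the triage structure is the same:

```python
# Invented keyword-to-answer rules for routine inquiries.
CANNED = {
    "hours":   "Our customer service is available 24/7.",
    "status":  "You can check your order status under My Account.",
    "balance": "Your current balance is shown on the Accounts page.",
}

def triage(message):
    """Return (handled_automatically, reply_or_escalation)."""
    text = message.lower()
    for keyword, answer in CANNED.items():
        if keyword in text:
            return True, answer          # answered without a human
    return False, "escalate-to-human"    # route to a service rep

print(triage("What are your hours?"))
# (True, 'Our customer service is available 24/7.')
print(triage("My shipment arrived damaged"))
# (False, 'escalate-to-human')
```

Even a coarse triage like this illustrates how answering the routine 80% to 90% of inquiries automatically flattens the peak load on human staff.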
Closing the deal. As agents can greatly increase the efficiency of e-commerce transactions, they can also improve their effectiveness. The sheer ability to close a deal via an agent allows companies to make sales that are otherwise impossible. A small mom-and-pop Internet store can suddenly provide 24-hour customer service and sales on a global scale, not tethered by the limitations of local time zones. Businesses can also appear much more intelligent by boosting order taking through intelligent agents.
Lowest price purchase. Comparison shopping over the Internet has become one of the most popular applications for agent technology. Agents can make the search for the lowest cost almost effortless for the customer. While highly advantageous for buyers, this development has obviously raised concerns with retailers, who complain about the single-minded price orientation of intelligent agents.
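The core of such a comparison-shopping agent is simple: query each merchant for the same item and keep the best offer. A sketch with invented merchant names and prices:

```python
def cheapest(item, merchants):
    """merchants maps a shop name to a price-lookup function; a lookup
    returns None when the shop does not carry the item."""
    offers = {
        shop: lookup(item)
        for shop, lookup in merchants.items()
        if lookup(item) is not None
    }
    if not offers:
        return None                       # no shop carries the item
    shop = min(offers, key=offers.get)    # lowest-priced offer wins
    return shop, offers[shop]

# Hypothetical merchant catalogs.
merchants = {
    "shop-a": lambda item: {"widget": 19.99}.get(item),
    "shop-b": lambda item: {"widget": 17.50, "gadget": 5.0}.get(item),
}

print(cheapest("widget", merchants))   # ('shop-b', 17.5)
print(cheapest("gizmo", merchants))    # None
```

The single-minded price orientation retailers complain about is visible right in the code: `min` over price is the agent’s entire notion of value.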
Whose Information Is It?
One of the most contentious points in the discussion that followed the original eBay story was information ownership. Many individuals argued that since the information was “in the public domain” it was no longer eBay’s. Ruby Lane’s CEO pointed out his site was not copying eBay’s information, but simply providing URL links to it.

Settling the issue of ownership rights—and therefore the question of whether it might be a criminal offense for others to take the information from the original site—is tricky. As the information remains on the original site, no theft takes place, but an infringement of copyright is possible. However, information itself is never copyrighted, only a particular form of expression. Furthermore, the copyright may not reside with the original site, but with the customers who entered the information in the first place. These individuals might have the right to prohibit third-party sites from broadcasting their transaction data based on information privacy regulation, or conversely, may demand open accessibility.
More relevant for e-businesses than the legal concerns, however, may be some commercial issues. An e-business may contractually limit information use rights with its business partners. That is the path eBay has taken with its September 1999 user agreement change. Such a contractual agreement, however, does not limit nonsubscribers from exploiting any freely available data to the fullest, as nonsubscribers have no contractual agreement with eBay. To avoid giving subscribers effectively fewer rights than nonsubscribers, e-commerce sites may need (some already have) very different information access policies, with significant information access for registered users and very little access for other parties. Yet, such policies may reduce a site’s attractiveness and increase the difficulty of drawing new customers.
Conclusion
We have attempted to provide two contradictory views of intelligent agents, stressing the associated problems, but also pointing to great benefits. In drawing conclusions, it may be beneficial to reflect on the potential effects brought about by a “future with intelligent agents.”

First, agents will be game spoilers. Whenever companies are offering a free service and in exchange are trying to extract something from the customer (such as a lengthy site visit), agents can be created that take over the customer’s activities. Thus, the agents extract the value without the service provider receiving the expected returns. Hence, companies may ultimately need to charge a price for what is presently offered for free, at a level that will reflect the true cost of that service. Thus, e-free sites are ultimately going to disappear. We should expect this scenario to play out as soon as efficient micropayment systems become widely used.
Second, agents may create a situation that was traditionally known as the “tragedy of the commons.”
The “commons” were the free grazing areas to which
farmers in England could send their livestock. As the
access was free, the best strategy for each individual
farmer was to send as many animals as possible, thus
leading to overgrazing and diminished future returns.
Today’s “commons” are the Web sites that generate
information and make it available to customers free of
charge, while they generate the necessary revenue
from transaction processing or advertisements. As
more and more secondary sites harvest the information generated by primary sites, the primary sites will
not receive sufficient traffic to produce enough surplus, thus forcing them to reduce or eliminate their
free services. Thus, overharvesting by search sites may
result in poorer original data sites.
Third, although companies such as eBay may complain about data misuse by others, they may have little choice in the long run but to make the information freely available anyway. After all, one of eBay’s sources of success has been the creation of an open market where buyers and sellers could easily link up and where at least some information was available to assess the trustworthiness of partners in the sales transaction. In order to maintain this source of competitive advantage, there may be little option but to “keep the books open” and to accept some level of information poaching. “Having one’s cake and eating it, too,” by forbidding use of openly available information, may not be possible in the end. In the interim, however, eBay’s “no trespassing” rules may have led to casualties among aggregators. For instance, Bidder’s Edge shut down on Feb. 21, 2001, due to “market and financing conditions,” following a court ruling in May 2000 that barred the company from searching eBay’s information. Bidder’s Edge maintains that the eBay ruling did not cause its demise. Interestingly enough, companies with a market position such as that of eBay are in the best position to poach from others, augmenting their already large selection of goods with specialty offers harvested from other sites. In other words, the seemingly more feasible strategy would be not to exclude others from harvesting one’s site, but to beat them at their own game. This is a somewhat different strategy from the one suggested by [2], who maintain that in the face of disintermediation, the traditional intermediary has to shift strategy.
Given the impact of agents in changing the nature
of businesses, some e-businesses will likely try to
shield themselves from information harvesting by frequently changing the interface to their data, thus
undermining an agent’s capability to access information using old communication protocols. At the same
time, any change in interface must be either invisible
to human users or acceptable as an improvement in
information access.
Denial-of-service, whether due to overload by
benign ‘bots or due to malicious attacks, will remain
a key source of concern. E-business Web sites will
have to protect themselves against such attacks, and
may possibly introduce measures such as service
restriction to agents based on system load, or preferential treatment based on customer and agent profile.
For example, agents issued by trusted partners or by
customers with a track record may be handled on
more responsive servers than those from unknown
issuers.
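Such a load- and profile-based admission policy might look like this in outline (the trusted-issuer set and the load cutoff are invented):

```python
# Hypothetical profiles of issuers with a good track record.
TRUSTED = {"partner-agent", "customer-with-history"}

def admit(issuer, system_load, load_cutoff=0.5):
    """Decide whether to serve this agent's request right now."""
    if issuer in TRUSTED:
        return True                     # preferential treatment
    return system_load < load_cutoff    # best effort for unknowns

print(admit("partner-agent", system_load=0.9))   # True
print(admit("anonymous-bot", system_load=0.9))   # False
print(admit("anonymous-bot", system_load=0.2))   # True
```

In practice the trusted set would come from authenticated agent credentials, and the cutoff would adapt to measured server load rather than a fixed constant.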
Ultimately, agents provide very useful services, both to customers, who may benefit from buying at low cost, and to companies, which also lower their search costs. As long as agents do not pose an unacceptable load on remote servers, and as long as security problems can be curbed, the agents’ great benefits in lowering transaction costs, accelerating cycle time, and closing the sale will result in their widespread use. However, one class of agents—those that harvest free information and reroute it to other sites—may change the nature of the e-free business and will face a significantly more turbulent and perilous future.
References
1. Applegate, L.M., McFarlan, W.F., and McKenney, J.L. Corporate Information Systems Management: The Challenge of Managing in the Information Age. Irwin Professional Publications, 1999.
2. Chircu, A.M. and Kauffman, R.J. Reintermediation strategies in business-to-business electronic commerce. International Journal of Electronic Commerce 4, 4 (Summer 2000), 7–42.
3. Falconi, R. and Firozabadi, B.S. The challenge of trust. Knowledge Engineering Review 14, 1 (1999), 81–89.
4. Hendler, J. Making sense out of agents. IEEE Intelligent Systems (Mar./Apr. 1999), 32–37.
5. Hoffman, D.L., Novak, T.P., and Peralta, M. Building consumer trust online. Commun. ACM 42, 4 (Apr. 1999), 80–85.
6. Lange, D.B. and Oshima, M. Programming and Deploying Java Mobile Agents with Aglets. Addison-Wesley, Reading, MA, 1998.
7. Maes, P., Guttman, R.H., and Moukas, A.G. Agents that buy and sell. Commun. ACM 42, 3 (Mar. 1999), 81–91.
8. Mandry, T., Pernul, G., and Röhm, A. Mobile agents on electronic markets: Opportunities, risks, agent protection. International Journal of Electronic Commerce 5, 2 (Winter 2000–2001), 47–60.
9. Murch, R. and Johnson, T. Intelligent Software Agents. Prentice Hall PTR, 1999.
10. Turban, E., Lee, J., Lee, J.K., King, D., and Chung, H.M. Electronic Commerce: A Managerial Perspective. Prentice Hall, Upper Saddle River, NJ, 2002.
11. Wang, S. Analyzing agents for electronic commerce. Information Systems Management 16, 1 (Winter 1999), 40–47.
Christian Wagner ([email protected]) is an associate professor
of information systems at City University of Hong Kong and an
advisor to several technology start-up companies.
Efraim Turban ([email protected]) is a visiting professor of
information systems/e-commerce at City University of Hong Kong
and a consultant to major corporations worldwide.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for
profit or commercial advantage and that copies bear this notice and the full citation on
the first page. To copy otherwise, to republish, to post on servers or to redistribute to
lists, requires prior specific permission and/or a fee.
© 2002 ACM 0002-0782/02/0500 $5.00