
Proceedings of the 38th Hawaii International Conference on System Sciences - 2005
Comparing Customer Trust in Virtual Salespersons
With Customer Trust in Human Salespersons
Sherrie Komiak
Faculty of Business Administration, Memorial University of Newfoundland
[email protected]

Weiquan Wang
Sauder School of Business, University of British Columbia
[email protected]

Izak Benbasat
Sauder School of Business, University of British Columbia
[email protected]
Abstract
Virtual salespersons (computer agents) play a role in online stores similar to that of human salespersons in physical stores. Customer trust in a salesperson is key in
generating transactions and managing customer
relationships. In this exploratory study, 44 participants
used the services of both virtual and human salespersons
in the same commercial store. Written protocols were
collected by asking the participants open-ended questions
regarding their comparative trust. This paper finds that
similar to trust in a human salesperson, trust in a virtual
salesperson contains trust in competence, benevolence,
and integrity; however, the formation processes of trust in
virtual salespersons, trust in human salespersons, distrust
in virtual salespersons, and distrust in human
salespersons are different. Theoretically, this paper outlines to what extent research on trust in computer agents can draw from the literature on interpersonal trust. Practically, it contributes to our understanding of how to better design trustworthy virtual salespersons.
1. Introduction
Firms are still learning how to effectively market on
the Internet. In a physical store, if customers are shopping
for complex or unfamiliar products, or if customers are
confused by various product offerings, they can consult
with a human salesperson. In an online store, customers in
similar situations may consult with a virtual salesperson, an intelligent computer agent embedded in the online store. Virtual salespersons are increasingly
prevalent in online stores, such as www.amazon.com and
www.landsend.com. They are useful for reducing
information overload [1], providing online customers with
recommendations on suitable products [2], and facilitating
online customers’ shopping decision-making [3-5]. They
play a role in online stores similar to that of human salespersons in physical stores.
Customer trust in a salesperson is the key to generating
business transactions and building customer relationships
[6, 7]. Trust is becoming increasingly important in online
shopping environments due to the lack of proven
guarantees that the e-vendors or agent providers will
refrain from opportunistic behaviors (e.g., taking advantage of consumers by providing biased recommendations), and due to the lack of cues available
to assess the quality of recommendation services [8, 9]. In
the context of online stores, customers have to trust a
virtual salesperson before they are willing to use it [8, 9].
Customer trust in a virtual salesperson will also
significantly influence their attitude toward the web store
and their intention to shop online [10]. Thus, a key
question is how to design trustworthy virtual salespersons.
In order to design a trustworthy virtual salesperson, it
is natural for researchers and practitioners to draw upon
the rich literature on interpersonal trust. However, it is
controversial to what extent they can do so [9]. If trust in
a virtual salesperson (a computer agent) and trust in a
human salesperson (a person) are fundamentally the
same, then researchers and practitioners can largely reuse
our accumulated knowledge about interpersonal trust and
interpersonal interactions to design virtual salespersons. If
customer trust in a virtual salesperson is different from
customer trust in a human salesperson, we should ask new
questions. How are they different? Are the differences
beneficial or detrimental for a virtual salesperson to gain
trust? Are the differences fundamental or can they be
reduced by improving the design of the virtual
salesperson?
These questions are important both theoretically and
practically. Theoretically, the answers will affect the
boundary between the human-computer interaction area
and interpersonal interaction area. This will help clarify
the conceptualization of customer trust in a computer
agent, and determine to what extent research on trust in a
computer agent can draw from prior research on trust in a
person. Practically, the answers will inform how to better design virtual salespersons and how to market more effectively in online stores. This will contribute to our understanding of how to integrate, or differentiate,
electronic commerce and traditional commerce.
In this exploratory study, we intend to reveal, compare,
and contrast the processes of forming customer trust in a
virtual salesperson, customer trust in a human
salesperson, customer distrust in a virtual salesperson, and
customer distrust in a human salesperson. Based on the
results, suggestions will be given on how to better design
trustworthy virtual salespersons.
2. Literature review
The central theme of this paper is to understand customers' trust and distrust formation in virtual salespersons and to compare it with customers' trust and distrust formation in human salespersons. This section therefore reviews the literature on trust in people versus trust in technological artifacts (e.g., computer agents) and the literature on trust formation processes.
2.1 Trust in people versus in technological
artifacts
It is controversial whether trust in a person and trust in
a technological artifact (e.g. a computer agent) are
fundamentally the same or different. It is widely accepted
that trust in a person can be conceptualized as trust in the
person’s competence, benevolence, and integrity [e.g., 11,
12]. Regarding trust in a technological artifact, some
researchers doubt that trust in the benevolence and
integrity of a technological artifact exists [13], while
many other researchers believe that trust in a
technological artifact still contains trust in competence,
benevolence, and integrity [e.g., 8, 14].
Given this controversy, in the context of trust in a computer agent (e.g., a virtual salesperson), we take the position that trust in a technological artifact is not fundamentally different from trust in a person; thus, trust in a computer agent's benevolence and integrity exists alongside trust in the agent's competence. We take this position based on the Theory of Social Responses to Computers [15] and Sztompka's theory of trust [16].
The Theory of Social Responses to Computers [15]
argues that people treat computers as social actors and
apply social rules to them. People do so automatically and mindlessly, without consciously recognizing this behavior [15]. After conducting more than 30 empirical
studies on this issue, Nass, Reeves, and their colleagues
have found that even technologically sophisticated people
treat technological artifacts (e.g., computers) as if they
were other human beings, rather than just tools. People
are polite to computers, respond to praise they receive
from computers, and view them as teammates. People
easily assign personalities (e.g., dominance, friendliness
and helpfulness) to computers. Such social responses
apply not only to sophisticated conversational computer
agents [17], but also to computer systems with simple text
interfaces [15, 18].
Sztompka suggests that the difference between trust in
a person and trust in a technological artifact is “not so
striking and fundamental” [19, p.42], inasmuch as behind
all human-made technologies, there stand people who
design, operate, and control the technologies, and it is
these people whom a trustor ultimately endows with trust
[19]. A trustor may be acquainted with the people behind the technologies, may have some information about them, or may simply imagine them. Thus, trust in a person
and trust in a technological artifact operate according to
the same logic, because behind trust in a person or a
technology, “there looms the primordial form of trust – in
people, and their actions” [19, p.46]. Appearances
notwithstanding, both a person and a technological
artifact are reducible to human actions, and we ultimately
trust human actions, and derivatively their effects, or
products [19]. Thus, in the case of trust in a technological
artifact, “we trust those who design the technology, those
who operate them, and those who supervise the
operations” [19, p.46].
Furthermore, a variety of studies on information
technology have extended the attribute of trustworthiness
to abstract and technical systems, as well as intelligent
computer agents [20, 21]. For example, several studies by
Muir and his collaborators [e.g., 21, 22, 23] have included
a dimension of morality (e.g., responsibility) in their
definition of trust in machines and automation. In their
experiments, participants were able to evaluate the
responsibility of machines in the processes of building
users’ trust. Similarly, in a study of embodied
conversational agents by Cassell and Bickmore [17], trust
was defined as a composite of benevolence and
credibility. An agent’s benevolence was demonstrated
through past examples of benevolent behavior, referring
to third-party affiliations, or its participation in
interaction-based social rituals, such as greetings.
In this paper, we contribute by empirically testing for the existence of trust in the benevolence and integrity of a virtual salesperson (i.e., a computer agent). We asked customers open-ended questions regarding their trust in a virtual salesperson without indicating the contents of
trust. We then conducted a protocol analysis to see
whether customers would mention trust in the virtual
salesperson’s benevolence and integrity, together with its
competence.
2.2 Trust formation processes
As summarized in Table 1, prior literature
conceptualizes the trust formation processes in different
ways.
Table 1: Conceptualizations of Trust Formation Processes

Study | Input | Processes
[24] | Knowledge about a target: objective evidence, emotional bond; propensity to trust; social trust | Prediction; Attribution; Bonding; Reputation; Identification
[25] | Knowledge about a salesperson/firm: reputation, size, willingness to customize, information sharing, relationship length, likeability, similarity, frequent contacts, expertise | Calculative; Prediction; Capability; Intentionality; Transference
[20] | Knowledge about a trustee; trustor's characteristics | Competence Assessment; Expectation Confirmation; Control; Unknown; Integrity Assessment; Information Sharing; Verification; Interface; Benevolence Assessment
[26] | Knowledge-based familiarity; institution-based trust | Calculative-based trust
[27] | Evidence of trustworthiness; emotional bond with a target; a target's trust-implying actions | Cognitive base of trust; Emotional base of trust; Behavioral base of trust
[28] | Knowledge about a target (peer): citizenship behavior, interaction frequency, reliable role performance, cultural-ethnic similarity, professional credentials | Affect-based trust; Cognition-based trust
[29] | Propensity to trust (faith in humanity); first impression about a target; institution-based trust | Categorization (unit grouping, reputation categorization, stereotyping); Illusion of control
[30] | - | Calculus-based trust; Relational trust; Institution-based trust
[31] | - | Process-based trust; Characteristic-based trust; Institutional-based trust

3. Protocol coding scheme
We adopt Komiak's [20] coding scheme of trust formation, which classifies trust/distrust formation processes in terms of how customers subjectively construe their first-hand knowledge to develop trust or distrust. In addition, we added two processes, the connection process and the prediction process, because Komiak [20] mainly addresses the formation of trust and distrust in computer agents, whereas we are concerned with both trust in computer agents and trust in humans. The eleven resulting processes are described below; a compact code sketch of the scheme follows the list.
1). Competence Assessment: Customers ascribe
competence to a trustee based on their general evaluation
of the knowledge and expertise that the trustee possesses
to accomplish the tasks. This process is partially similar to the Capability process [25], the Attribution process [24], and the Cognitive Base of Trust [27, 28]. For example, "The
virtual salespersons are very knowledgeable of the
products that are being presented.”
2). Expectation Confirmation: When a trustee’s actions
and features confirm or exceed a customer’s expectations,
customer trust will develop. In contrast, when a trustee's actions and features fall below a customer's expectations, customer distrust will develop. Examples include: "Even
though virtual salesperson asks questions, it does not
allow me to ask questions or make input on what I am
looking for.”
3). Control Process: When customers feel that they
have more control over a trustee, this feeling builds trust
[32], while the feeling of less control may build distrust.
Trust-building involves illusions according to trust
theories [e.g. 33] and empirical studies [e.g. 34], and the
illusion of control is an unrealistically inflated perception
of personal control that helps to build trust [29]. In
addition, Komiak [20] shows that the feeling of being in
control is more than an illusion – it is real. It involves the
amount of choices provided by the trustee, the tendency
of the trustee to influence customers’ decision making,
and the customers’ opportunity to express their needs, to
list a few. Examples include: “I also found that you got
more of a choice with the virtual salesperson in terms of
supplying info and what features I would like on my DVD
player.”
4). Awareness of the "Unknown" Process: This process
deals with how customers process their awareness of the
“unknown” during their interactions with a trustee. “Trust
is particularly relevant in conditions of ignorance or
uncertainty with respect to unknown or unknowable
actions of others” [35], thus the impact of awareness of
the unknown on trust/distrust should be included in
trust/distrust research. Example protocols include:
“However for the company that I don’t know, I trust less
the virtual salesperson than the real person salesperson.”
5). Integrity Assessment: A customer ascribes integrity
to a trustee based on observable evidence. This process is
partially similar to the Intentionality process [25], the Attribution process [24], the Emotional Base of Trust [27], and Affect-based Trust [28]. Example protocols include: "The virtual
salesperson can gain you trust by being neutral, you are
just given the facts on a particular item.”
6). Information Sharing: When a trustee explicitly explains the reasoning process or shares detailed product information with customers, customers' trust will build.
However, too much information may confuse or
overwhelm the customers. Then customer distrust will
develop. For example, “The virtual salesperson is similar
to the human salesperson in terms of explaining the
different components and the need for each feature for a
specific product.”
7). Verification Process: When customers are able to
verify that the information provided by a trustee is true or
good, their trust builds. Lack of verification facilities
builds distrust. For example: “If the product I was buying
was a high end item I would also want to test out the sales
person regarding service related issues.”
8). Media Assessment: A pleasant interface helps to
build trust, while an unpleasant interface helps to build
distrust. The interface (appearance) of a medium has been suggested as an antecedent of trust and/or distrust [e.g., 36]. For example: "The presentation of this RA is
pleasing to the eye. I feel comfortable.”
9). Benevolence Assessment: A customer ascribes
benevolence to a trustee based on observable evidence. It
is similar to the Intentionality process [25], the Attribution process [24], the Emotional Base of Trust [27], and Affect-based Trust [28]. For example, "Salespeople are normally
on commission and of course want to sell higher ticketed
items in order to make more money.”
10). Connection Process: This process relates to
customers’ feeling of being connected to a salesperson.
To some extent, trust formation is a relationship-building process [30], and such a connection facilitates relationship building. Examples include: "I prefer the human
salesperson. There's a connection made with a human
that you just can't get with a computer.” Another example,
“I trust the human salesperson more because I have a
tendency to trust people more when they can look you in
the eye and you can gauge their response to questions by
their body language.”
11). Prediction Process: A customer assesses whether
or not the trustee is reliable, consistent, and predictable
[37, 38]. For example, “human salespeople can be too
variable.” Another example, “the [virtual salesperson’s]
answers to my questions are pre-determined; they are
consistent based on my personal preferences and
choices.”
4. Data collection and data analysis
Forty-four (44) subjects participated. They were all
senior undergraduate business students at a Canadian
university. All the subjects had prior experience dealing
with human salespersons. The participants were asked to access the RadioShack website (http://www.radioshack.ca/) to interact with RadioShack's
virtual salesperson (a computer agent that provides
recommendations on what kind of product fits the
customer’s personal needs best) as if they were shopping
for an electronic product. RadioShack is a well-known
brand name as an electronics retailer in Canada. All the
subjects were asked to compare RadioShack’s virtual
salesperson to RadioShack’s human salesperson when
they were answering three open-ended questions:
• Question 1: How SIMILAR is the virtual salesperson
to the human salesperson, in terms of gaining your
trust?
• Question 2: How DIFFERENT is the virtual salesperson from the human salesperson, in terms of gaining your trust?
• Question 3: Who do you trust more: the virtual
salesperson or the human salesperson? Why?
Each written protocol was broken into episodes and
each episode contained at most one trust or distrust
building process. The first two authors coded the episodes
independently. To assess the reliability of the coding
scheme and ensure the validity of the analysis, Cohen’s
Kappa coefficient [39] was used to measure inter-coder agreement [40]. The Kappa coefficient was .82, which indicates good inter-judge agreement [41].
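For illustration, Cohen's Kappa can be computed from two coders' process labels over the same set of episodes as in the sketch below; the coder1/coder2 lists are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's Kappa [39]: chance-corrected agreement between two coders
    who each assigned one categorical label to the same episodes."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    # Observed agreement: proportion of episodes coded identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq1, freq2 = Counter(coder1), Counter(coder2)
    p_e = sum(freq1[label] * freq2[label] for label in freq1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical usage: labels are the process names assigned by each coder.
coder1 = ["control", "benevolence", "connection", "control"]
coder2 = ["control", "benevolence", "information", "control"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.64
```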
5. Results of protocol analysis
Slightly more than half of the subjects (52%) judged the virtual salesperson more trustworthy than the human salespersons; 41% of the subjects trusted the human
salespersons more than the virtual salesperson. The other
7% of the subjects had similar levels of trust in the virtual
salesperson and in the human salespersons.
The results support the theoretical perspective that
trust in a virtual salesperson’s integrity and trust in the
virtual salesperson’s benevolence do exist, although the
virtual salesperson is a computer agent instead of a
person.
The averages of the two judges’ coding results were
used for further analysis aimed at unraveling the main
processes for 1) customers’ trust and distrust building in
virtual salespersons, and 2) customers’ trust and distrust
building in human salespersons.
We separated the episodes by trust/distrust object (virtual versus human salespersons). For each trust-building process, the episodes related to trust were summed across all participants; the same was done for each distrust-building process. Tables 2 and 3 show the frequencies of the processes involved in trust and distrust formation in virtual salespersons and in human salespersons, respectively.
Table 2. Trust/distrust formation processes in a virtual salesperson

Process | Trust building: number (%) | Distrust building: number (%)
Competence Assessment | 27.5 (14.8%) | 11.5 (13.1%)
Expectation Confirmation | 33 (17.8%) | 13 (14.9%)
Control Process | 34 (18.3%) | 3.5 (4.0%)
Unknown Process | 0.5 (0.3%) | 8 (9.1%)
Integrity Assessment | 13 (7.0%) | 1 (1.1%)
Information Sharing | 28 (15.1%) | 13.5 (15.4%)
Verification Process | 2.5 (1.3%) | 6 (6.9%)
Media Assessment | 3.5 (1.9%) | 8.5 (9.7%)
Benevolence Assessment | 37 (19.9%) | 1.5 (1.7%)
Connection Process | 1 (0.5%) | 19 (21.7%)
Prediction Process | 5.5 (3.0%) | 2 (2.3%)
Total | 185.5 (100%) | 87.5 (100%)
Table 3. Trust/distrust formation processes in a human salesperson

Process | Trust building: number (%) | Distrust building: number (%)
Competence Assessment | 14.5 (13.5%) | 24.5 (19.2%)
Expectation Confirmation | 12 (11.2%) | 5.5 (4.3%)
Control Process | 3 (2.8%) | 16 (12.5%)
Unknown Process | 1.5 (1.4%) | 3 (2.4%)
Integrity Assessment | 0.5 (0.5%) | 12.5 (9.8%)
Information Sharing | 13.5 (12.6%) | 5.5 (4.3%)
Verification Process | 8.5 (7.9%) | 3 (2.4%)
Media Assessment | 12.5 (11.6%) | 0.5 (0.4%)
Benevolence Assessment | 7.5 (7.0%) | 49.5 (38.8%)
Connection Process | 33 (30.7%) | 1 (0.8%)
Prediction Process | 1 (0.9%) | 6.5 (5.1%)
Total | 107.5 (100%) | 127.5 (100%)
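The tallies above follow from a simple aggregation: average the two coders' per-process episode counts, then express each process as a share of the column total. A minimal sketch of that arithmetic, with made-up counts rather than the study's data:

```python
def process_shares(counts1, counts2):
    """Average two coders' per-process episode counts, then convert each
    process to a percentage of the column total (as in Tables 2 and 3)."""
    processes = set(counts1) | set(counts2)
    avg = {p: (counts1.get(p, 0) + counts2.get(p, 0)) / 2 for p in processes}
    total = sum(avg.values())
    return {p: (n, round(100 * n / total, 1)) for p, n in avg.items()}, total

# Hypothetical coder counts for trust-building episodes about a virtual salesperson.
coder1 = {"benevolence": 38, "control": 33}
coder2 = {"benevolence": 36, "control": 35}
shares, total = process_shares(coder1, coder2)
print(shares, total)  # benevolence -> (37.0, 52.1), control -> (34.0, 47.9), total 71.0
```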
In order to examine the differences between trust and
distrust formation processes and those processes in
different objects, χ² tests were conducted. The results are
shown in Table 4. Statistically, trust and distrust in virtual
salespersons are formed via different processes and trust
and distrust in human salespersons are also formed via
different processes. Trust in virtual salespersons and trust
in human salespersons are formed via different processes,
and distrust in virtual salespersons and distrust in human
salespersons are also formed via different processes.
Table 4. χ² tests for trust/distrust building process comparisons

Comparison | χ² | df | p
Trust vs. distrust in virtual salespersons | 92.922 | 10 | <.0001
Trust vs. distrust in human salespersons | 106.69 | 10 | <.0001
Trust in virtual salespersons vs. in human salespersons | 104.72 | 10 | <.0001
Distrust in virtual salespersons vs. in human salespersons | 99.041 | 10 | <.0001
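For readers who wish to run this kind of test, the sketch below applies a χ² test of independence to the Table 2 counts using scipy; with 2 outcome columns and 11 processes, the degrees of freedom are (2-1) x (11-1) = 10, matching Table 4 (the statistic itself may differ slightly from the paper's depending on how the averaged counts were rounded):

```python
from scipy.stats import chi2_contingency

# Trust- and distrust-building counts per process for the virtual
# salesperson, in Table 2's row order (averages of the two coders).
trust    = [27.5, 33, 34, 0.5, 13, 28, 2.5, 3.5, 37, 1, 5.5]
distrust = [11.5, 13, 3.5, 8, 1, 13.5, 6, 8.5, 1.5, 19, 2]

chi2, p, dof, expected = chi2_contingency([trust, distrust])
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4g}")  # df = 10
```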
Furthermore, we analyzed the main processes that
were involved in trust and distrust building in virtual
salespersons as well as in human salespersons. Table 5
shows the top five processes for trust and distrust building
in virtual salespersons and human salespersons. These top
processes account for 75% to 86% of all the processes for
trust and distrust formation in virtual salespersons and
human salespersons.
Table 5 reveals the complementary nature of trust/distrust building in virtual salespersons and human salespersons. As Table 5 shows, the
main processes that lead to customers’ distrust in virtual
salespersons are the same as those that contribute to
customers’ trust in human salespersons. On the other
hand, the main processes that lead to customers’ distrust
in human salespersons are similar to those that contribute
to customers’ trust in virtual salespersons.
In addition, we also found that 1) benevolence
assessment and control process mainly contribute to trust
in the virtual salespersons, 2) the connection process and media assessment mainly contribute to distrust in
the virtual salespersons, and 3) the expectation
confirmation, information sharing, and competence
assessment contribute to both trust and distrust in the
virtual salespersons.
Table 5. Top 5 processes for trust/distrust formation

Object | Virtual salespersons | Human salespersons
Trust | Benevolence Assessment (20%); Control Process (18%); Expectation Confirmation (18%); Information Sharing (15%); Competence Assessment (15%) | Connection Process (31%); Competence Assessment (14%); Information Sharing (13%); Media Assessment (12%); Expectation Confirmation (11%)
Distrust | Connection Process (22%); Information Sharing (15%); Expectation Confirmation (15%); Competence Assessment (13%); Media Assessment (10%) | Benevolence Assessment (39%); Competence Assessment (19%); Control Process (13%); Integrity Assessment (10%); Prediction Process (5%)
6. Implications and discussion
Overall, customers trust virtual salespersons slightly
more than human salespersons. Therefore, in terms of
gaining customers’ trust, virtual salespersons can be used
as a good service channel to provide online shoppers with
recommendation services. This study reveals the main
processes that were involved in customers’ trust and
distrust building in virtual salespersons as well as in
human salespersons.
Before discussing the implications of this study, it is
important to consider the study’s limitations. First, this
exploratory study only investigates one virtual salesperson, from a well-known Canadian store's website (www.radioshack.ca). Readers are therefore advised to be cautious about generalizing the results of this study to computer agents from other sources. Second, the subjects were university students taking an IS course and might be more technologically savvy than typical web shoppers.
More research is needed to replicate this study in other
populations.
This exploratory study makes significant contributions
to research and practice. For researchers, the present study helps us understand the nature of trust and distrust building in virtual salespersons as well as in human salespersons. Customers' trust and distrust formation in technological artifacts is still an under-investigated area. In general, findings on interpersonal trust still apply to trust in technological artifacts such as virtual salespersons. Our coding scheme
works for both virtual salespersons and human
salespersons. Customers assess and perceive the
benevolence and integrity of agents when they are
forming their trust in virtual salespersons although
benevolence and integrity are inherently human
characteristics. However, we found that customers form their trust and distrust in virtual salespersons in quantitatively different ways from their trust and distrust in their human counterparts.
Moreover, this study reveals the asymmetric nature of
trust and distrust building in virtual salespersons.
Although trust and distrust are formed via some common
processes (e.g., information sharing and competence
assessment), there are unique processes that mainly
contribute to trust building and others that mainly contribute to distrust building. This calls for more research not only on trust in technological artifacts but also on distrust. This
study also reveals a complementary nature of trust/distrust
in virtual salespersons and human salespersons. It helps
us understand the relationships between customers’
trust/distrust in virtual salespersons and human
salespersons.
For practitioners, the results shed light on the design of
trustworthy virtual salespersons for online shopping. In
particular, three areas deserve attention.
First, we need to maintain the agent features and
capabilities that induce the benevolence assessment and
control processes. These two processes mainly lead to
customers’ trust in the virtual salespersons. Customers
recognize and appreciate the agents' goodwill in understanding their needs and finding suitable products. Customers realize that, unlike human salespersons, virtual salespersons are not motivated by commissions. For example, to further induce users' benevolence perceptions, recommendation agents should elicit users' needs by asking needs-based questions rather than attribute-based questions. Similarly,
customers enjoy the greater control offered by the virtual
salespersons in www.RadioShack.ca. Customers can
choose any questions to answer, freely change their
preferences, take their time to consider what they prefer,
and view the details of any products they like.
Second, more agent features and capabilities need to
be provided to inhibit the connection and media assessment processes that are related to distrust. The lack of connection between the virtual salespersons and customers, and the limited media richness that can be delivered through a website, lead to customers' distrust in the virtual salespersons. These two processes are indeed the
strengths of interpersonal interactions. They are among
the main processes that lead to customers’ trust in human
salespersons. We need to simulate the interpersonal
interactions to build the “connection” between virtual
salespersons and customers and increase the richness of
web-based virtual salespersons. For example, a computer
agent should be personalized so that it can recognize each
user, know the user’s background, and greet the user by
using computer cookies or user log-in information. With regard to media richness, computer voice and avatar technologies should be used to increase the interface richness [42].
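As one illustration of the cookie-based recognition suggested above, the following minimal Flask sketch greets a returning customer by name; the route and cookie names are hypothetical and not taken from RadioShack's site:

```python
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def greet():
    # Recognize a returning customer from a cookie and greet them by name.
    name = request.cookies.get("customer_name")
    if name:
        return f"Welcome back, {name}. Shall we continue where we left off?"
    return "Hello! I'm your virtual salesperson. What may I call you?"

@app.route("/introduce/<name>")
def introduce(name):
    # Remember the customer's name for future visits (30-day cookie).
    resp = make_response(f"Nice to meet you, {name}.")
    resp.set_cookie("customer_name", name, max_age=60 * 60 * 24 * 30)
    return resp
```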
Third, some agent features should be carefully designed, because some processes can engender both trust and distrust. Information sharing is one such process: well-designed explanations embedded in a virtual salesperson increase users' trust in the agent [43], whereas without appropriate explanations, customers come to distrust the virtual salesperson.
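To illustrate this information-sharing point, a recommendation can carry a concise, needs-based explanation alongside the product. The toy rule below (hypothetical products and needs, not from the study) shows the pattern:

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    product: str
    explanation: str  # why this product fits the user's stated needs

def recommend_dvd_player(needs: dict) -> Recommendation:
    """Toy needs-based rule: always pair the product with the reasoning,
    so the agent shares just enough information to be understood."""
    if needs.get("portability"):
        return Recommendation(
            "Compact portable DVD player",
            "You said you travel often; this model is light and battery-powered.",
        )
    return Recommendation(
        "Home-theater DVD player",
        "You prioritized picture quality; this model supports progressive scan.",
    )

print(recommend_dvd_player({"portability": True}).explanation)
```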
References

[1] P. Maes, "Agents that Reduce Work and Information Overload," Communications of the ACM, vol. 37, pp. 31-40, 1994.
[2] R. T. Rust and P. K. Kannan, "E-Service: A New Paradigm for Business in the Electronic Environment," Communications of the ACM, vol. 46, pp. 37-42, 2003.
[3] R. M. O'Keefe and T. McEachern, "Web-Based Customer Decision Support Systems," Communications of the ACM, vol. 41, pp. 71-78, 1998.
[4] P. Maes, R. H. Guttman, and A. G. Moukas, "Agents that Buy and Sell," Communications of the ACM, vol. 42, pp. 81-91, 1999.
[5] R. T. Grenci and P. A. Todd, "Solutions-Driven Marketing," Communications of the ACM, vol. 45, pp. 65-71, 2002.
[6] J. E. Swan, M. R. Bowers, and L. D. Richardson, "Customer Trust in the Salesperson: An Integrative Review and Meta-Analysis of the Empirical Literature," Journal of Business Research, vol. 44, pp. 93-107, 1999.
[7] R. M. Morgan and S. D. Hunt, "The Commitment-Trust Theory of Relationship Marketing," Journal of Marketing, vol. 58, pp. 20-38, 1994.
[8] D. H. McKnight, V. Choudhury, and C. Kacmar, "Developing and Validating Trust Measures for e-Commerce: An Integrative Typology," Information Systems Research, vol. 13, pp. 334-359, 2002.
[9] D. Gefen, E. Karahanna, and D. W. Straub, "Trust and TAM in Online Shopping: An Integrated Model," MIS Quarterly, vol. 27, pp. 51-90, 2003.
[10] G. L. Urban, F. Sultan, and W. J. Qualls, "Placing Trust at the Center of Your Internet Strategy," Sloan Management Review, vol. 42, pp. 39-48, 2000.
[11] R. C. Mayer, J. H. Davis, and F. D. Schoorman, "An Integrative Model of Organizational Trust," Academy of Management Review, vol. 20, pp. 709-734, 1995.
[12] L. A. Crosby, K. R. Evans, and D. Cowles, "Relationship Quality in Services Selling: An Interpersonal Influence Perspective," Journal of Marketing, vol. 54, pp. 68-81, 1990.
[13] B. Friedman, P. H. Kahn, Jr., and D. C. Howe, "Trust Online," Communications of the ACM, vol. 43, pp. 34-40, 2000.
[14] X. S. Komiak and I. Benbasat, "Understanding Customer Trust in Agent-Mediated Electronic Commerce, Web-Mediated Electronic Commerce, and Traditional Commerce," Information Technology and Management, vol. 5, pp. 181-207, 2004.
[15] B. Reeves and C. Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. New York, NY: Cambridge University Press, 1996.
[16] P. Sztompka, Trust: A Sociological Theory. Cambridge, UK: Cambridge University Press, 1999.
[17] J. Cassell and T. Bickmore, "External Manifestation of Trustworthiness in the Interface," Communications of the ACM, vol. 43, pp. 50-56, 2000.
[18] C. I. Nass, Y. Moon, J. Morkes, E. Y. Kim, and B. J. Fogg, "Computers Are Social Actors: A Review of Current Research," in Human Values and the Design of Computer Technology, B. Friedman, Ed. Stanford, CA: CSLI Publications, 1997, pp. 137-162.
[19] P. Sztompka, Trust: A Sociological Theory. Cambridge, UK: Cambridge University Press, 1999.
[20] X. S. Komiak, "The Impact of Internalization and Familiarity on Trust and Adoption of Recommendation Agents," Unpublished Doctoral Dissertation, MIS Division, University of British Columbia, Vancouver, Canada, 2003.
[21] B. M. Muir and N. Moray, "Trust in Automation: Part II. Experimental Studies of Trust and Human Intervention in a Process Control Simulation," Ergonomics, vol. 39, pp. 429-460, 1996.
[22] B. M. Muir, "Trust Between Humans and Machines, and the Design of Decision Aids," International Journal of Man-Machine Studies, vol. 27, pp. 527-539, 1987.
[23] B. M. Muir, "Trust in Automation: Part I. Theoretical Issues in the Study of Trust and Human Intervention in Automated Systems," Ergonomics, vol. 37, pp. 1905-1922, 1994.
[24] K. Chopra and W. A. Wallace, "Trust in Electronic Environments," in Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS'03), Hawaii: IEEE, 2003.
[25] P. M. Doney and J. P. Cannon, "An Examination of the Nature of Trust in Buyer-Seller Relationships," Journal of Marketing, vol. 61, pp. 35-51, 1997.
[26] D. Gefen, E. Karahanna, and D. W. Straub, "Trust and TAM in Online Shopping: An Integrated Model," MIS Quarterly, vol. 27, pp. 51-90, 2003.
[27] J. D. Lewis and A. Weigert, "Trust as a Social Reality," Social Forces, vol. 63, pp. 967-985, 1985.
[28] D. J. McAllister, "Affect-Based and Cognition-Based Trust as Foundations for Interpersonal Cooperation in Organizations," Academy of Management Journal, vol. 38, pp. 24-59, 1995.
[29] D. H. McKnight, L. L. Cummings, and N. L. Chervany, "Initial Trust Formation in New Organizational Relationships," Academy of Management Review, vol. 23, pp. 473-490, 1998.
[30] D. M. Rousseau, S. B. Sitkin, R. S. Burt, and C. Camerer, "Not So Different After All: A Cross-Discipline View of Trust," Academy of Management Review, vol. 23, pp. 393-404, 1998.
[31] L. G. Zucker, "Production of Trust: Institutional Sources of Economic Structure, 1840-1920," Research in Organizational Behavior, vol. 8, pp. 53-111, 1986.
[32] D. Ariely, "Controlling the Information Flow: Effects on Consumers' Decision Making and Preferences," Journal of Consumer Research, vol. 27, 2000.
[33] J. G. Holmes, "Trust and the Appraisal Process in Close Relationships," in Advances in Personal Relationships, vol. 2, D. Perlman, Ed. London: Jessica Kingsley, 1991, pp. 57-104.
[34] R. M. Kramer and A. M. Isen, "Trust and Distrust: Its Psychological and Social Dimensions," Motivation and Emotion, vol. 18, pp. 105-107, 1994.
[35] D. Gambetta, "Can We Trust Trust?," in Trust: Making and Breaking Cooperative Relations, D. Gambetta, Ed. Oxford, UK: Blackwell, 1988, pp. 154-175.
[36] J. E. Swan, M. R. Bowers, and L. D. Richardson, "Customer Trust in the Salesperson: An Integrative Review and Meta-Analysis of the Empirical Literature," Journal of Business Research, vol. 44, pp. 93-107, 1999.
[37] P. M. Doney and J. P. Cannon, "An Examination of the Nature of Trust in Buyer-Seller Relationships," Journal of Marketing, vol. 61, pp. 35-51, 1997.
[38] T. G. Brashear, J. S. Boles, D. N. Bellenger, and C. M. Brooks, "An Empirical Test of Trust-Building Processes and Outcomes in Sales Manager-Salesperson Relationships," Journal of the Academy of Marketing Science, vol. 31, pp. 189-200, 2003.
[39] J. Cohen, "A Coefficient of Agreement for Nominal Scales," Educational and Psychological Measurement, vol. 20, pp. 37-46, 1960.
[40] P. Todd and I. Benbasat, "Process Tracing Methods in Decision Support Systems Research: Exploring the Black Box," MIS Quarterly, vol. 11, pp. 493-512, 1987.
[41] J. R. Landis and G. G. Koch, "The Measurement of Observer Agreement for Categorical Data," Biometrics, vol. 33, pp. 159-174, 1977.
[42] J. Cassell, "Embodied Conversational Interface Agents," Communications of the ACM, vol. 43, pp. 70-78, 2000.
[43] W. Wang and I. Benbasat, "An Empirical Investigation of Intelligent Agents for E-Business Customer Relationship Management: A Knowledge Management Perspective," in Proceedings of the 11th European Conference on Information Systems, Naples, Italy, 2003.