
CHAPTER 24
Advisory Systems to Support Decision Making
Brandon A. Beemer and Dawn G. Gregg
Business School, University of Colorado, Denver, CO, USA
Both advisory systems and expert systems provide expertise to support decision making in
a myriad of domains. Expert systems are used to solve problems in well-defined, narrowly
focused problem domains, whereas advisory systems are designed to support decision
making in more unstructured situations which have no single correct answer. This paper
provides an overview of advisory systems, which includes the organizational needs that
they address, similarities and differences between expert and advisory systems, and the
supportive role advisory systems play in unstructured decision making.
Keywords: Advisory systems; Expert systems; Knowledge systems; Intelligent assistants
1 Introduction
Advisory systems provide advice and help to solve problems that are normally
solved by human experts; as such, advisory systems can be classified as a type of
expert system (e. g., Vanguard 2006, Forslund 1995). Both advisory systems and
expert systems are problem-solving packages that mimic a human expert in
a specialized area. These systems are constructed by eliciting knowledge from
human experts and coding it into a form that can be used by a computer in the
evaluation of alternative solutions to problems within that domain of expertise.
While advisory systems and expert systems share a similar architectural design,
they do differ in several significant ways. Expert systems are typically autonomous problem-solving systems used in situations where there is a well-defined
problem and expertise needs to be applied to find the appropriate solution
(Aronson and Turban 2001). In contrast, advisory systems do not make decisions
but rather help guide the decision maker in the decision-making process, while
leaving the final decision-making authority up to the human user. The human
decision maker works in collaboration with the advisory system to identify problems that need to be addressed, and to iteratively evaluate the possible solutions
to unstructured decisions. For example, a manager of a firm could use an advisory
system that helps assess the impact of a management decision on firm value (e. g.,
Magni et al. 2006) or an oncologist can use an advisory system to help locate brain
tumors (e. g., Demir et al. 2005). In these two examples, the manager and the oncologist are ultimately (and legally) accountable for any decisions/diagnoses
made. The advisory system, for ethical reasons, only acts as a tool to aid in the
decision-making process (Forslund 1995).
This paper provides an overview of both advisory systems and expert systems,
highlighting their similarities and differences. It provides a background on both
expert and advisory systems and describes the architectures and the types of decisions each system supports. It distinguishes between advisory systems which utilize
the case-based reasoning methodology and traditional expert systems that use rule-based reasoning. A review and classification of recent advisory/expert systems
research is included to show how both types of systems are currently being utilized.
The paper concludes with recommendations for further advisory system research.
2 Expert Systems
In response to the organizational need for intelligent decision support, expert systems
were developed by coupling artificial intelligence (AI) and knowledge management
techniques. Expert systems are designed to encapsulate the knowledge of experts
and to apply it in evaluating and determining solutions to well-structured problems.
2.1 Expert Systems Decisions
Expert systems have applications in virtually every field of knowledge. They are
most valuable to organizations that have a high level of knowledge and expertise
that cannot be easily transferred to other members. They can be used to automate
decision making or used as training facilities for non-experts (Aronson and Turban
2001). Expert systems were designed to deal with complex problems in narrow,
well-defined problem domains. If a human expert can specify the steps and reasoning by which a problem may be solved, then an expert system can be created to
solve the same problem (Giarranto and Riley 2005).
Expert systems are generally designed very differently from traditional systems
because the problems they are designed to solve have no algorithmic solution.
Instead, expert systems utilize codified heuristics or decision-making rules of
thumb which have been extracted from the domain expert(s), to make inferences
and determine a satisfactory solution. The decision areas expert systems are typically applied to include configuration, diagnosis, interpretation, instruction, monitoring, planning, prognosis, remedy, and control (Giarranto and Riley 2005). Expert systems research and development encompasses several domains, which
include but are not limited to: medicine, mathematics, engineering, geology, computer science, business, and education (Carroll and McKendree 1987).
Researchers and developers of the initial expert systems tried to address the
problem of lost or hard to transfer expertise by capturing the expert’s knowledge
and replicating their decision-making capacity. An example of this objective is
found in CATS-1, a pioneering expert system that addressed General Electric’s
eventual loss of their top expert in troubleshooting diesel electric locomotive engines (Bonnisone and Johnson 1983). The structural design of expert systems
reflects this ambition to completely replace the expert, and is inspired by the human information processing theory (Waugh and Norman 1965).
2.2 Expert Systems Architecture
An expert system has been defined as “a system that uses human knowledge captured in a computer to solve a problem that ordinarily needs human expertise”
(Aronson and Turban 2001). As is shown in Figure 1, expert system architecture
distinctly separates knowledge and processing procedures in the knowledge base
and inference engine, respectively (Bradley et al. 1995, Waterman 1986, Aronson
and Turban 2001).
The knowledge base of expert systems contains both tacit and explicit knowledge. Tacit knowledge exists in the mind and is difficult to articulate; it governs
explicit knowledge through mechanisms such as intuition (McGraw and Harbison-Briggs 1989).
Explicit knowledge is context specific, and is easily captured and codified (Bradley
et al. 2006). A knowledge engineer is often needed to help elicit tacit knowledge
from the expert and then to codify it into the knowledge base. The knowledge engineer uses various methods in structuring the problem-solving environment; these include interpreting and integrating the expert’s answers to questions, drawing analogies, posing counterexamples, and bringing conceptual difficulties to light
(Aronson and Turban 2001).
Figure 1. Expert system architecture (Aronson and Turban 2001, Bradley et al. 1995, Waterman 1986)

The knowledge representation formalizes and organizes the knowledge so that the inference engine can process it and make a decision. One widely used knowledge representation in expert systems is an IF-THEN rule. The IF part of the rule lists a set of conditions the rule applies to. If the IF part of the rule is satisfied, the THEN part of the rule can be executed, or its problem-solving action taken. Expert systems whose knowledge is represented in rule form are called rule-based systems.
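To make the IF-THEN representation concrete, the sketch below shows one minimal way a rule base and a set of known facts might be encoded; the rules, facts, and action names are hypothetical and are not taken from any system discussed in this chapter.

    # Minimal sketch of a rule-based knowledge representation (hypothetical rules).
    # Each rule pairs an IF part (a set of conditions) with a THEN part (an action).

    rules = [
        {"if": {"engine_temp": "high", "coolant_level": "low"}, "then": "add_coolant"},
        {"if": {"engine_temp": "high", "coolant_level": "normal"}, "then": "inspect_thermostat"},
    ]

    facts = {"engine_temp": "high", "coolant_level": "low"}

    def matching_rules(rules, facts):
        """Return the rules whose IF part is satisfied by the current facts."""
        return [r for r in rules if all(facts.get(k) == v for k, v in r["if"].items())]

    for rule in matching_rules(rules, facts):
        print("Rule fires, recommended action:", rule["then"])   # -> add_coolant

Real expert system shells provide far richer rule languages, but the separation between the encoded rules and the code that matches them mirrors the knowledge base and inference engine split described above.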
In expert systems, the inference engine organizes and controls the steps taken to
solve the problem. It uses rule-based reasoning to navigate through the rules,
which are stored in the knowledge base (Aronson and Turban 2001). When the
knowledge base is structured in this way, so as to support rule-based reasoning, it
is referred to as a decision tree. Each unique branch in the decision tree represents
a correct answer to the situational antecedents that lead to it. If the inference engine starts from a set of conditions and moves toward some conclusion, the
method is called forward chaining. If the conclusion is known (for example, a goal
to be achieved) but the path to that conclusion is not known, then the inference
engine reasons backwards using backward chaining (Giarranto and Riley 2005).
Once the inference engine determines a solution to the problem, it is presented to
the user through the user interface. In addition, explanation facilities in expert
systems trace the line of reasoning used by the inference engine to help end-users
assess the credibility of the decision made by the system (Feigenbaum et al. 1988).
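As a hedged illustration of forward chaining under the description above, the sketch below repeatedly fires any rule whose conditions are satisfied, adds its conclusion to the known facts, and records the fired rules so that an explanation facility could trace the line of reasoning; the rules and facts are invented for illustration.

    # Forward-chaining sketch: derive new facts from known facts until no rule fires.
    # Each rule maps a set of required conditions to a single conclusion (hypothetical).

    rules = [
        ({"fever", "rash"}, "measles_suspected"),
        ({"measles_suspected", "recent_travel"}, "notify_public_health"),
    ]

    def forward_chain(rules, facts):
        facts = set(facts)
        trace = []                      # fired rules, usable by an explanation facility
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    trace.append((sorted(conditions), conclusion))
                    changed = True
        return facts, trace

    facts, trace = forward_chain(rules, {"fever", "rash", "recent_travel"})
    print(facts)    # now includes 'measles_suspected' and 'notify_public_health'
    print(trace)    # the steps taken, in order

Backward chaining would instead start from a goal conclusion and work back through the rules whose THEN parts could establish it.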
Often the decisions made by expert systems are based on incomplete information about the situation at hand. Uncertainty increases the number of possible outcomes for each candidate solution, making it impossible to find a single, definable best
solution to the problem. For example, in the medical domain there are constraints
of both time and money. In many cases, running additional tests may improve the
probability of finding an appropriate treatment – but the additional tests may cost
too much money or take time the patient does not have (Giarranto and Riley
2005). In an attempt to accommodate uncertainty, many expert systems utilize
methods to perform inexact reasoning, which allows them to find an acceptable
solution to an uncertain problem. Two popular methods used to perform reasoning
under uncertainty are Bayesian probability (Castillo et al. 1997) and fuzzy theory
(Bellman and Zadeh 1970, Negoita 1985).
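A worked illustration of the Bayesian flavour of inexact reasoning mentioned above: a prior belief in a hypothesis is updated by the likelihood of the observed evidence. The probabilities are invented for illustration and do not come from any cited system.

    # Bayesian update sketch: P(H | E) = P(E | H) * P(H) / P(E), illustrative numbers only.

    prior = 0.10                    # P(H): prior probability that the condition is present
    p_e_given_h = 0.85              # P(E | H): probability of a positive test if present
    p_e_given_not_h = 0.20          # P(E | not H): false-positive rate

    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    posterior = p_e_given_h * prior / p_e

    print(round(posterior, 3))      # ~0.321: evidence raises, but does not settle, the question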
2.3 Expert Systems Limitations
Many expert systems are based on the notion that the process of solving unstructured decisions consists of five sequential phases: 1) problem identification; 2)
assimilating necessary information; 3) developing possible solutions; 4) solution
evaluation; 5) solution selection (Brim et al. 1962, Dewey 1910). These expert
systems perform the last four decision-making steps for the user and have been
applied successfully in a wide variety of highly specialized domains. Traditionally,
rule-based expert systems operate best in structured decision environments, since
solutions to structured problems have a definable right answer, and the users can
confirm the correctness of the decision by evaluating the justification provided by
the explanation facility (Gefen et al. 2003). However, researchers have identified
many limitations to current expert systems, which include (Luger 2005):
1. Difficulty in capturing the deep knowledge of the problem domain.
2. Lack of robustness and flexibility.
3. Inability to provide in-depth explanations of solution logic (instead, expert system explanations are generally restricted to a description of the
steps taken in finding a solution).
4. Difficulties in solution verification.
5. Little learning from experience.
The inflexibility of traditional expert systems reduces their ability to handle unstructured and more loosely defined problems. In the 1970s, decision theorists
discovered that the phases within the decision process are executed iteratively
until an acceptable decision is reached (Mintzberg et al. 1976, Witte 1972). When
a decision maker gathers and assimilates information, they subconsciously begin
to comparatively evaluate it with previously gathered information (Mintzberg et al.
1976). This comparative evaluation of information, coupled with an understanding
of the information’s contextual relevancy, results in decisions sufficient for unstructured problems which have no definable right solution because of the existence of outcome uncertainty (Mintzberg et al. 1976, Witte 1972). One reason that
the rule-based inference engines used in traditional expert systems have limited capacity to handle unstructured decisions is that they usually do not support
the required iterative process of decision making (Mintzberg et al. 1976, Witte
1972).
While many researchers agree with the preceding description of expert systems
and their limitations (e. g., Turban and Watkins 1986, Aronson and Turban 2001),
there is disagreement in the research community regarding the scope of expert
system functionality. For example, Quinlan (1980, 1988) describes expert systems
that incorporate the capability of addressing unstructured decision environments.
3 Advisory Systems
Advisory systems are advice-giving systems as opposed to systems that present
a solution to the decision maker (Aronson and Turban 2001). Research in advisory
systems has found that for many problems decision makers need the problem to be
identified and framed so that they can make decisions for themselves (e. g., Forslund 1995, Miksch et al. 1997, Gregg and Walczak 2006).
3.1 Advisory Systems Decision Support
Advisory systems support decisions that can be classified as either intelligent or
unstructured, and are characterized by novelty, complexity, and open-endedness
(Mintzberg et al. 1976). In addition to these characteristics, contextual uncertainty is
ubiquitous in unstructured decisions, which, when combined with these characteristics, exponentially increases
the complexity of the decision-making process (Chandler and Pachter 1998). Because of the novel antecedents and lack of definable solution, unstructured decisions
require the use of knowledge and cognitive reasoning to evaluate alternative courses
of action to find the one that has the highest probability of a desirable outcome (Chandler and Pachter 1998, Mintzberg et al. 1976). The more context-specific knowledge
acquired by the decision maker in these unstructured decision-making situations, the
higher the probability that they will achieve the desirable outcome (Aronson and
Turban 2001).
The decision-making process that occurs when users utilize advisory systems is
similar to that which is used for the judge-advisor model developed in the organizational behavior literature (Sniezek 1999, Sniezek and Buckley 1995, Arendt
et al. 2005). Under this model, there is a principal decision maker who solicits
advice from many sources; however, the decision maker “holds the ultimate authority for the final decision and is made accountable for it” (Sniezek 1999). The
judge-advisor model suggests that decision makers are motivated to seek advice
from others for decisions that are important, unstructured, and involve uncertainty.
Similarly, advisory systems help to synthesize knowledge and expertise related to
a specific problem situation for the user; however, the ultimate decision-making
power and responsibility lies with the user – not the system.
Advisory systems support decisions related to business intelligence, health diagnostics, mechanical diagnostics, pharmaceutical research, autonomous aviation
systems, infrastructure procurement, and many more (Chandler and Pachter 1998,
Rapanotti 2004, Sniezek 1999). Advisory systems can also support problem identification in unstructured decision-making environments. Without expert levels of
knowledge, most unstructured decisions often remain unidentified because “most
strategic decisions do not present themselves to the decision maker in convenient
ways; problems and opportunities in particular must be identified in the streams of
ambiguous, largely verbal data that decision makers receive” (Mintzberg et al.
1976, Mintzberg 1973, Sayles 1964). Additionally, decision makers who lack
access to the proper expertise “are constrained by cognitive limits to economically
rational behavior that induce them to engage in heuristic searches for satisfactory
decisions, rather than comprehensive searches for optimal decisions” (Blanning
1987, March and Simon 1958, Simon 1972).
3.2 Advisory System Architecture
Advisory systems differ from expert systems in that classical expert systems can
solve a problem and deliver a solution, while advisory systems are designed to
help and complement the human’s problem-solving process (Forslund 1995,
Mintzberg et al. 1976). In unstructured situations, which have no single correct
answer, cooperative advisory systems that provide reasonable answers to a wide
range of problems are more valuable and desirable than expert systems that produce correct answers to a very limited number of questions (Forslund 1995).
The changes in advisory systems from expert systems include giving the final
decision back to the user and utilizing the case-based reasoning methodology in
the inference engine (Forslund 1995). In contrast to the rule-based reasoning used
in traditional expert systems, which uses Boolean logic, case-based reasoning
accommodates uncertainty by using algorithms to compare the current situation to previous ones, and assigns probabilities to the different alternatives (Watson 1999). Once probabilities have been assigned, the advisory system inference
engine is then able to evaluate the alternatives; this iterative evaluation functionality resembles and supplements the cognitive process used by humans when making unstructured decisions, and thus it is more effective in supporting the users of
the system. Case-based reasoning is often mistakenly referred to as a technology,
but in fact is a methodology which is implemented through various technologies.
These technologies include nearest neighbor distance algorithms, induction, fuzzy
logic, and Structured Query Language (SQL) Online Analytical Processing (OLAP)
tools (Watson 1999). These intelligent suggestions, which are the result of the
case-based reasoning inference engine, are then incorporated into the iterative
decision-making process of the human decision maker, the user (Forslund 1995,
Witte 1972, Mintzberg et al. 1976). Figure 2 illustrates the iterative support of advisory systems in the decision-making process; this functionality contrasts with expert systems, which only provide a final answer with supportive justification.
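A minimal nearest-neighbour sketch of the case-based reasoning idea just described: the current situation is scored against stored cases on weighted features, and the resulting similarity scores play the role of the probabilities assigned to alternative courses of action. The cases, features, and weights are hypothetical.

    # Nearest-neighbour case retrieval sketch (hypothetical cases, features, and weights).

    cases = [
        {"features": {"demand": "high", "inventory": "low", "season": "winter"},
         "action": "increase_production"},
        {"features": {"demand": "low", "inventory": "high", "season": "winter"},
         "action": "run_promotion"},
    ]

    weights = {"demand": 0.5, "inventory": 0.3, "season": 0.2}

    def similarity(situation, case):
        """Weighted count of features on which the situation matches the stored case."""
        return sum(w for f, w in weights.items()
                   if situation.get(f) == case["features"].get(f))

    situation = {"demand": "high", "inventory": "low", "season": "summer"}

    for case in sorted(cases, key=lambda c: similarity(situation, c), reverse=True):
        print(case["action"], round(similarity(situation, case), 2))
    # -> increase_production 0.8, then run_promotion 0.0: ranked advice, not a final decision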
In addition to iterative user interaction, advisory systems include a monitoring
agent to help identify unstructured decisions that need to
be addressed. This is displayed in Figure 2 as the flow of information from domain variables to the inference engine (Mintzberg et al. 1976, Mintzberg 1973,
Sayles 1964, Forslund 1995). If environmental domain variables exceed expected
norms, then the system shell will notify the user that there is a situation which
needs to be addressed and will begin the iterative decision-making process by offering a suggested course of action.

Figure 2. Proposed advisory systems architecture, adapted from Forslund (1995) and Mintzberg et al. (1976)
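A hedged sketch of the monitoring idea described above: domain variables are compared against expected norms, and any out-of-range reading triggers a notification that starts the iterative decision-making process. The variable names and acceptable ranges are hypothetical.

    # Monitoring-agent sketch: flag domain variables that fall outside expected norms.

    expected_norms = {                      # hypothetical acceptable ranges
        "heart_rate": (50, 110),
        "systolic_blood_pressure": (90, 140),
    }

    readings = {"heart_rate": 128, "systolic_blood_pressure": 118}

    def out_of_range(readings, norms):
        """Return (variable, value) pairs whose value lies outside its expected range."""
        alerts = []
        for name, value in readings.items():
            low, high = norms[name]
            if not low <= value <= high:
                alerts.append((name, value))
        return alerts

    for name, value in out_of_range(readings, expected_norms):
        print(f"Notify user: {name} = {value} exceeds expected norms")   # starts the advisory dialogue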
The three main processes of expert systems are knowledge acquisition, inference, and interface (Aronson and Turban 2001); similarly, the three main processes in advisory systems are knowledge acquisition, cognition, and interface.
Because of the monitoring functionality that is adopted by advisory systems, the
term “cognition” better describes the middle process. To provide a visual aid, the
main processes of advisory systems have been labeled in Figure 2 and are described in the following sections.
3.2.1 Process 1: Knowledge Acquisition
The process of knowledge acquisition in advisory systems is similar to that of traditional expert systems, but it can be much more complicated because the unstructured nature of the problem domain can make the knowledge more difficult to capture and codify. In general, advisory systems are designed to support a broad
category of problems, too broad to exactly specify all of the knowledge necessary to
solve the problem (Forslund 1995). The eventual success or failure of an advisory
system is dependent upon the effectiveness of knowledge acquisition: the measure
of effectiveness lies in the structure and quality of the encoded knowledge, not the
quantity. The knowledge base structure and codification must be conducive to the
inference engine design. The knowledge representation scheme used in advisory
systems formalizes and organizes the knowledge so that it can be used to support
the type of case-based reasoning implemented in the system.
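One way to read the point about structure and quality over quantity: encoded cases should follow a schema that the case-based inference engine can rely on. The sketch below, with hypothetical field names, rejects cases that do not carry the required features before they enter the knowledge base.

    # Knowledge-base sketch: cases must conform to a feature schema before being stored,
    # so the case-based inference engine can depend on their structure (hypothetical fields).

    from dataclasses import dataclass, field

    REQUIRED_FEATURES = {"symptom", "duration_days", "severity"}

    @dataclass
    class Case:
        features: dict
        outcome: str
        source_expert: str = "unknown"

    @dataclass
    class KnowledgeBase:
        cases: list = field(default_factory=list)

        def add_case(self, case: Case):
            missing = REQUIRED_FEATURES - case.features.keys()
            if missing:
                raise ValueError(f"Case rejected, missing features: {sorted(missing)}")
            self.cases.append(case)

    kb = KnowledgeBase()
    kb.add_case(Case({"symptom": "cough", "duration_days": 5, "severity": "mild"},
                     outcome="advise_rest", source_expert="clinician_A"))
    print(len(kb.cases))   # 1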
3.2.2 Process 2: Cognition
The term “cognition” describes this process in advisory systems better than “inference” does, because it encapsulates the added functionality of active monitoring and problem recognition, which was introduced in the transition from expert
systems. Most unstructured decisions do not present themselves to the decision
maker in convenient ways, so advisory systems supplement the task of problem
identification by monitoring environmental variables (Mintzberg et al. 1976,
Mintzberg 1973, Sayles 1964). There are various methods used by advisory systems to perform this task, and the method used is dependent upon the environment
that the advisory system operates in. Advisory systems can either monitor for
problems (e. g., mortgage credit checks) or for opportunities (e. g., pharmaceutical
drug discovery) (Rapanotti 2004). In addition to monitoring for potential problems
or opportunities, advisory systems support the decision maker in the iterative
process of determining a solution to the problem. The inference engine uses the
environmental variables, user input, and the knowledge base to evaluate different
courses of action and make suggestions to the decision maker. Unlike the output of expert systems, the suggestions made by advisory systems do not always represent the final
answer to the problem. Instead, they represent advice used by the decision maker
as a part of the iterative problem solving process.
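A hedged sketch of the iterative loop implied by this description: the engine combines environmental variables, the knowledge base, and user feedback, offers its best suggestion, and refines the suggestion until the human decision maker accepts one or stops. The scoring function and the alternatives are placeholders rather than a real inference engine.

    # Iterative advisory loop sketch: suggest, gather user feedback, refine, repeat.

    def rank_alternatives(environment, knowledge_base, ruled_out):
        """Placeholder inference step: score remaining alternatives from environmental data."""
        return sorted((alt for alt in knowledge_base if alt not in ruled_out),
                      key=lambda alt: environment.get(alt, 0), reverse=True)

    def advise(environment, knowledge_base, get_user_feedback, max_rounds=5):
        ruled_out = set()
        for _ in range(max_rounds):
            suggestions = rank_alternatives(environment, knowledge_base, ruled_out)
            if not suggestions:
                return None                     # no acceptable course of action found
            feedback = get_user_feedback(suggestions[0])
            if feedback == "accept":
                return suggestions[0]           # the human makes the final decision
            ruled_out.add(suggestions[0])       # refine: drop the rejected option and retry
        return None

    knowledge_base = ["reorder_stock", "discount_price", "do_nothing"]
    environment = {"reorder_stock": 0.7, "discount_price": 0.5, "do_nothing": 0.1}
    choice = advise(environment, knowledge_base,
                    lambda s: "accept" if s == "discount_price" else "reject")
    print(choice)   # discount_price, reached after one rejected suggestion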
3.2.3 Process 3: Interface
This process encapsulates all subprocesses that facilitate information exchange
between the inference engine and the end-user. This includes the automated input
of environmental parameters that are used in monitoring functionality, the iterative
communication with the user throughout the decision-making process, and the
reasoning process the advisory system used in making the recommendation (the
explanation) as well as some expression indicating the advisory system’s evaluation of the quality of the advice (Sniezek 1999). Unlike more traditional expert
systems, user interaction with advisory systems can involve much more than entering the initial problem conditions and waiting for the system recommendation and
explanation. Advisory systems can have multiple intermediate stages in the decision process, which require user input to guide the overall decision-making process.
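The information exchanged at this interface can be pictured as a small advice record carrying the recommendation, the reasoning trace, and the system's own evaluation of the quality of the advice (cf. Sniezek 1999). The structure below is a hypothetical sketch, not a format prescribed by any of the systems cited here.

    # Sketch of an advice record passed from the inference engine to the user interface.
    # All field values are illustrative placeholders.

    from dataclasses import dataclass

    @dataclass
    class Advice:
        recommendation: str          # suggested course of action
        explanation: list            # reasoning steps the system followed
        confidence: float            # system's evaluation of advice quality, 0..1
        needs_user_input: bool       # True when the system requests a mid-stage response

    advice = Advice(
        recommendation="schedule a follow-up test",
        explanation=["lab value outside expected norms",
                     "similar past cases were resolved by retesting"],
        confidence=0.72,
        needs_user_input=True,
    )
    print(advice.recommendation, advice.confidence)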
Since the inception of advisory systems, there has been little research or
design literature concerning the new iterative functionality of the user interface.
While much attention is given to the cognition process components, the user interface is equally important because these types of systems are prone to a lack of user
acceptance. This problem was initially realized with the development of expert
systems because they were “perceived as a potential threat to an employee who
perceives that his or her most valuable skill is embodied within this system and
that job security is accordingly threatened as a result of system use” (Liker and
Sindi 1997). While this is not quite the same concern with advisory systems, it is still
prudent to design the user interface in such a way as to foster feelings of perceived
usefulness and ease of use by users (Davis et al. 1989).
4 Comparing, Contrasting, and Classifying Expert and Advisory Systems
The distinction between advisory systems and expert systems has historically not
been explicitly made by researchers (e.g., Negoita 1985). Advisory systems are an evolutionary extension of expert systems; evidence of this is found in the similarities between their architectural designs. Yet despite these similarities, there are critical differences between the two system architectures, which we believe merit the distinction.
4.1 Comparing Expert and Advisory Systems
Both expert systems and advisory systems provide numerous benefits to users operating in complex decision-making environments; some of these benefits are summarized in Table 1. The main factor that affects the realization of these benefits is
whether users accept, trust, and use the systems (Davis et al. 1989, Swanson 1988).
Table 1. Expert and advisory system benefits (Aronson and Turban 2001)

Decreased decision-making time: Using the system’s recommendations, a human can make decisions much faster. This property is important in supporting frontline decision makers who must make quick decisions while interacting with customers.

Enhancement of problem solving and decision making: Enhances the problem-solving process by incorporating the knowledge of top experts into the decision-making process.

Improved decision-making process: Provides rapid feedback on decision consequences, facilitates communication among decision makers on a team, and allows rapid response to unforeseen changes in the environment, thus providing a better understanding of the decision-making environment.

Improved decision quality: Reliability; advisory systems consistently monitor all the details and do not overlook relevant information, potential problems, or potential solutions.

Ability to solve complex problems: Some advisory and expert systems are already able to solve problems in which the required scope of knowledge exceeds that of any one individual. This allows decision makers to gain control over complicated situations and improve the operation of complex systems.
4.2 Contrasting Expert and Advisory Systems
Although advisory and expert systems do share some commonalities in their shell
structures, Table 2 highlights the major differences such as the decisions they are
each designed for (unstructured versus structured), the AI methodologies that each
uses (case-based versus rule-based), and the role they each play in the decision-making
process (decision support versus decision maker). In addition to these differences,
advisory systems incorporate new advancements such as the active monitoring functionality highlighted in Figure 2, and are designed to further supplement the human cognitive problem-solving process by incorporating iterative interaction with the user.
An example of a current expert system is a deicing system being developed by
the Colorado Department of Transportation (Denver 9 News 2006). In an effort to
reduce costs and wasted chemicals, the system is designed to decide the optimal
amount of magnesium chloride (a liquid deicer) to distribute on the roads based on
automated humidity and temperature inputs from sensors in the road, and manual
inputs from the truck drivers, which are entered via laptops in their trucks. These
inputs are sent wirelessly to a central computer which uses its artificial intelligence
and knowledge base components to provide snow removal truck drivers with the
appropriate amount of deicer to use. In this case, the system ultimately has the
ability to make better decisions than the snow removal professional.
Table 2. Advisory and expert system classification, adapted from Turban and Watkins (1986)

Attribute                  Advisory system         Expert system
Decision structure         Unstructured            Structured
AI methodology             Case-based reasoning    Rule-based reasoning
Role in decision process   Decision support        Decision maker
Query direction            Human ↔ System          Human ← System
Problem identification     User or system          User
An example of a current advisory system is a system developed to support hospital
operations called HELP (Health Evaluation through Logical Processes) (Pryor 2007). This system performs various functions to aid physicians in providing effective and expedient health care to hospital patients. The HELP system’s functionality includes:
1) reviewing manually inputted lab results and identifying patient issues which
need to be addressed, 2) using the knowledge base and case-based reasoning to
provide physicians with a preliminary diagnosis, 3) monitoring vitals for ICU
patients and identifying when urgent care is needed, 4) assisting physicians with
complex diagnoses. This advisory system incorporates a knowledge base which
works harmoniously with an artificial intelligence component, but unlike traditional expert systems the system is designed to be used as an advisor and not
a decision maker. Also, this system incorporates a monitoring capacity and provides problem identification; for example, it monitors a patient’s vital signs and lab result inputs and proactively identifies suggested courses of action for evolving problems. The query flow in the HELP system is
bidirectional, meaning that the system or the user can initiate the iterative decision-making process. Unlike the Colorado Department of Transportation deicing
system, the HELP system works with physicians, providing them with additional
information and insights into the problem at hand. However, the physician still has a great deal of control over the ultimate decision that is reached, and ethics demands that a human decision maker assume responsibility for the outcome of the care given.
4.3 Classifying Current Expert and Advisory Systems
Many advisory systems of the kind described above, which are designed to apply human expertise in supporting the decision maker (but not to solve the problem), are being
classified as expert systems by IS researchers. Table 3 contains a brief review of
systems classified as expert systems by IS researchers, along with our own classification of these systems using the criteria in Table 2. A high percentage of
these systems are actually advisory systems – not truly expert systems. Of the
systems, 41% (7/17) were expert systems according to our classification criteria
and 59% (10/17) were advisory systems. Thus, the majority of new systems in our
limited survey were actually advisory systems. This highlights the transition to the
new advisory system paradigm, and helps motivate the distinction between advisory and expert systems.
Table 3. Recent expert system and advisory system research

System name (source): Description of functionality. [System type]

AccuStrat (Rapanotti 2004): A predictive model for disease management; the model suggests which patients need additional care. [Advisory system]

PlacementPlus 4.0 (Rapanotti 2004): A business rule management application that is used to match delinquent accounts with collection agencies. [Expert system]

ClassPharmer (Rapanotti 2004): A knowledge-based system designed to assist computational chemists in the drug discovery process. [Advisory system]

Auction advisor (Gregg and Walczak 2005): Online Auction Recommendation and Bidding Decision Support System. [Advisory system]

Decision Script (Vanguard 2006): Decision Scripts allow the capture of business rules and build complex applications. [Expert system]

JRules 4.6 (Rapanotti 2004): Business rules management that lets policy managers and business analysts manage complex rule sets. [Advisory system]

Pathways analysis (Rapanotti 2004): Generates multiple biological networks with functional analysis to facilitate understanding of experiments. [Advisory system]

EZ-Xpert 3.0 (AI Developers 2006): A rule-based expert system that is designed for quick development and deployment; it generates code. [Expert system]

Buffer overflow control (Lin et al. 2006): Uses neural network and fuzzy logic controllers to rid internet buffer overflow at the user/server level. [Expert system]

Intelligent tutoring (Butz et al. 2006): An interactive knowledge-based system which is used for distributing circuit analysis knowledge to non-experts. [Advisory system]

Firm evaluation (Magni et al. 2006): Couples fuzzy logic with rule-based reasoning to support firm evaluation. [Advisory system]

Software design assistant (Moynihan et al. 2006): Prototype system which demonstrates the feasibility of using expert system technology to aid software design. [Advisory system]

Ultrasonography system (Hata et al. 2005): Uses an anatomical knowledge base to diagnose brain diseases and trauma. [Expert system]
Memory controller (Rubio and Lizy 2005): A server memory controller which decomposes database queries into simple operations to foster efficiency. [Expert system]

Recycling management (Fonseca 2005): Expert system that helps manufacturers assess and analyze their industrial residuals as potential road construction material. [Advisory system]

Reservoir management (Karbowski et al. 2005): Expert system that combines a rule-based inference engine and algorithmic calculations for reservoir flood gate control. [Expert system]

Design advisor (Chau 2004): Expert system that advises engineers in design of liquid-retaining structures. [Advisory system]

5 Future Research
The majority of current advisory systems research has consisted of applied studies
that developed advisory systems in specific application domains (e. g., Gregg and
Walczak 2005, Magni et al. 2006, Butz et al. 2006). While there is certainly an
ongoing need to explore the diverse array of potential applications of advisory
systems, there is also a need for basic research on the advisory system paradigm.
This includes research related to improving user interaction with advice-giving
systems, defining the objectives and types of advice given by these systems, and
improving the ability to acquire and represent the knowledge used in these systems (Roldán and Leal 2003).
Over the past few decades both successful and unsuccessful expert and advisory systems have been developed; improving user interaction with these systems
is necessary in order for them to be trusted and accepted, and to contribute to the decision-making process (Carroll and McKendree 1987). Improving user interaction with advisory systems requires additional understanding and research on the
role of advice in decision making, facilitating the iterative interaction between
decision makers and the system, and the impact of the advice given on the final
decision that is made (Sniezek 1999). Specifically, there is a need to determine
how systems can best give advice which is adaptive and conducive to the cognitive decision-making process of the user(s) (Sniezek 1999). Research is also
needed to examine how to enhance the iterative decision support functionality of
advisory systems (Brim et al. 1962, Mintzberg et al. 1976).
There is also a need for additional research in knowledge acquisition and representation. Eliciting the tacit knowledge held by an expert and coding it into explicit knowledge that is congruent with the AI technology in the inference engine is a complicated process that spans the following research disciplines: psychology, information systems, and computer science
(Bradley et al. 2006). This process differs from that found in traditional expert
systems because the tacit knowledge which is necessary for the system is much
more difficult to define, codify, evaluate, and represent than is rule-based explicit
knowledge (Bellman and Zadeh 1970, Bradley et al. 2006, McGraw and Harbison-Briggs 1989).
6 Conclusion
The goal of this paper is to extend previous publications that suggest and discuss
the advisory systems paradigm (Aronson and Turban 2001, Forslund 1995), by
incorporating insight from decision theory into the design of this emerging system
architecture (Mintzberg et al. 1976). For the past decade many advisory systems
have been classified as expert systems by IS researchers, even though they provide
advice instead of making a decision. It is our hope that this review of advisory
systems provides insight and fosters the acceptance of advisory systems as
a unique research paradigm, distinct from expert systems.
There is a distinct organizational need for advice-giving systems. However, additional research is needed to better define the role advisory systems should play
in supporting decision making and how best to improve their effectiveness and
acceptance within organizations.
References
AI Developers, EZ-Xpert 3.0, July 1, 2006. Accessed via
http://www.ez-xpert.com/index.html.
Arendt, L.A., R.L. Priem and H.A. Ndofor, “A CEO-Advisor Model of Strategic
Decision Making,” J Manage, 31(5), 2005, 680−699.
Aronson, J. and E. Turban, Decision Support Systems and Intelligent Systems.
Upper Saddle River, NJ: Prentice-Hall, 2001.
Bellman, R.E. and L.A. Zadeh, “Decision-Making in a Fuzzy Environment,”
Manage Sci, 17(4), 1970, 141−164.
Blanning, R.W, “Expert Systems as an Organizational Paradigm,” in Proceedings
of the International Conference on Information Systems, 1987, pp. 232−240.
Bonnisone, P.P. and H.E. Johnson, Expert System for Diesel Electric Locomotive
Repair: Knowledge-based System Report. New York, NY: General Electric,
1983.
Bradley, J.H., R. Paul and E. Seeman, “Analyzing the Structure of Expert Knowledge,” Inform Manage, 43(1), 2006, 77−91.
Bradley, J.H. and R. Hauser, “A Framework for Expert System Implementation,”
Expert Syst Appl, 8(1), 1995, 157−167.
Brim, O., G.C. David, C. Glass, D.E. Lavin and N. Goodman, Personality and
Decision Processes. Stanford, CA: Stanford University Press, 1962.
Carroll, J.M. and J. McKendree, “Interface Design Issues For Advice-Giving
Expert Systems,” Commun ACM, 30(1), 1987, 14−31.
Castillo, E., J.M. Gutiérrez and A.S. Hadi, Expert Systems and Probabilistic Network Models. Berlin: Springer, 1997.
Chandler, P. R. and M. Pachter, “Research Issues in Autonomous Control of Tactical UAVs,” in Proceedings of the American Control Conference, 1998,
pp. 394−398.
Chau, K., “An Expert System on Design of Liquid-Retaining Structures With
Blackboard Architecture,” Expert Syst, 21(4), 2004, 183−191.
Davis, F.D., R.P. Bagozzi and P.R. Warshaw, “User Acceptance of Computer
Technology: A Comparison of Two Theoretical Models,” Manage Sci, 35(8),
1989, 982−1003.
Demir, C., S.H. Gultekin and B. Yener, “Learning the Topological Properties of
Brain Tumors,” IEEE/ACM T Comput Biol Bioinform, 1(3), 2005, 262−270.
Denver 9 News, “Technology Helps Road Crews Fight Snow,” October 27, 2006.
Accessed www.9news.com/news/article.aspx?storyid=41091.
Dewey, J., How We Think. Mineola, NY: Dover, 1910.
Feigenbaum, E., P. McCorduck and H.P. Nii, The Rise of the Expert Company.
New York, NY: Times, 1988.
Fonseca, D. and E. Richards, “A Knowledge-based System for the Recycling of
Non-hazardous Industrial Residuals in Civil Engineering Applications,” Expert
Syst, 22(1), 2005, 1−11.
Forslund, G., “Toward Cooperative Advice-Giving Systems: A Case Study in
Knowledge Based Decision Support,” IEEE Expert, 1995, 56−62.
Giarranto, J.C. and G.D. Riley, Expert Systems: Principles and Programming.
Boston, MA: Thompson Course Technology, 2005.
Gefen, D., E. Karahanna and D.W. Straub, “Trust and TAM in Online Shopping:
An Integrated Model,” MIS Quart, 27(1), 2003, 51−90.
Gregg, D. and S. Walczak, “Auction Advisor: Online Auction Recommendation
and Bidding Decision Support System,” Decis Support Syst, 41(2), 2006,
449−471.
Hata, Y., S. Kobashi, K. Kondo, Y. Kitamura and T. Yanagida, “Transcranial
Ultrasonography System for Visualizing Skull and Brain Surface Aided by
Fuzzy Expert System,” IEEE T Syst, 35(6), 2005, 1360−1373.
Hill, T.R., “Toward Intelligent Decision Support Systems: Survey, Assessment
and Direction,” in Proceedings from International Conference on Information
Systems, 1987.
Karbowski, A. and N.S. Malinowski, “A Hybrid Analytic/Rule-based Approach to
Reservoir System Management During Flood,” Decis Support Syst, 38(4),
2005, 599−610.
Lin, W.W., A.K. Wong and T. Dillon, “Application of Soft Computing Techniques to Adaptive User Buffer Overflow Control on the Internet,” IEEE T
Syst, 36(3), 2006, 397−410.
Liker, J. and A. Sindi, “User Acceptance of Expert Systems: A Test of The Theory
of Reasoned Action,” Journal Eng Technol Manage, 14(1), 1997, 147−173.
Luger G., Artificial Intelligence: Structures and Strategies for Complex Problem
Solving. Addison Wesley, 2005.
Magni, C.A., S. Malagoli and G. Mastroleo, “An Alternative Approach to Firms’
Evaluation: Expert Systems and Fuzzy Logic,” Int J Int Tech Decis, 5(1), 2006,
195−225.
March, J.G. and H.A. Simon, Organizations. Hoboken, NJ: Wiley, 1958.
McGraw, K. and K.A. Harbison-Briggs, Knowledge Acquisition: Principles and
Guidelines. NJ: Prentice-Hall, 1989.
Miksch, S., K. Cheng and B. Hayes-Roth, “An intelligent assistant for patient
health care,” in Proceedings of the First International Conference on Autonomous Agents, 1997, pp. 458−465.
Mintzberg, H., The Nature of Managerial Work. New York, NY: Harper and Row,
1973.
Mintzberg, H., D. Raisinghani and A. Theoret, “The Structure of ‘Unstructured’
Decision Processes,” Admin Sci Quart, 21(2), 1976, 246−275.
Moynihan, G., A. Suki and D.J. Fonseca, “An Expert System for the Selection of
Software Design Patterns,” Expert Syst, 23(1), 2006, 39−52.
Negoita, C.V., Expert Systems and Fuzzy Systems. Menlo Park, CA: Benjamin/Cummings, 1985.
Pryor, A., “Health Evaluation through Logical Processes System,” Med Syst,
January 11, 2007. Accessed via www.med.utah.edu/index.html.
Quinlan, J.R., “An Introduction to Knowledge-Based Expert Systems,” Aust
Comput J, 12(2), 1980, 56−62.
Quinlan, J.R., “Induction, Knowledge, and Expert Systems,” in J.S., Gero and
Stanton, R. (eds.), Artificial Intelligence Developments and Applications. Amsterdam: North Holland, 1988, pp. 253−271.
Rapanotti, L., “News,” Expert Syst, 21(3−4), 2004, 229−238.
Roldán, J.L. and A. Leal, “Executive Information Systems in Spain: A Study of
Current Practices and Comparative Analysis,” in Mora, M., Forgionne, G., and
Gupta, J.N.D. (eds.), Decision Making Support Systems: Achievements, Trends
and Challenges for the New Decade. Hershey: Ideal Group, 2003, pp. 287−304.
Rubio, J. and J. Lizy-Kurian, “Reducing Server Data Traffic Using a Hierarchical
Computation Model,” IEEE T Parall Distr, 16(10), 2005, 933−943.
Sayles, L.R., Managerial Behavior: Administration in Complex Organizations.
New York, NY: McGraw- Hill, 1964.
Simon, H.A., “Theories of Bounded Rationality,” in McGuire, C.B. and Radner,
R. (eds.), Decision and Organization. Amsterdam, North Holland: 1972,
pp. 162−176.
Sniezek, J.A., “Judge Advisor Systems Theory and Research and Applications to
Collaborative Systems and Technology,” in Proceedings of the 32nd Hawaii
International Conference on System Sciences, 1999, pp. 1−2.
Sniezek, J.A. and T. Buckley, “Cueing and Cognitive Conflict in Judge-Advisor
Decision Making,” Organ Behav Hum Dec, 62(2), 1995, 159−174.
Turban, E. and P.R. Watkins, “Integrating Expert Systems and Decision Support
Systems,” MIS Quart, 10(2), 1986, 121−136.
Vanguard Software Corporation, Decision Script, 2006. Accessed via
www.vanguardsw.com/decisionscript/jgeneral.htm.
Waugh, N.C. and D.A. Norman, “Primary Memory,” Psychol Rev, 72, 1965,
89−104.
Waterman, D.A., A guide to Expert Systems. Reading, MA: Addison-Wesley,
1986.
Watson, I., “Case-Based Reasoning is a Methodology Not a Technology,” Knowl
Based Syst, 12(1), 1999, 303−308.
Witte, E., “Field Research on Complex Decision-Making Processes − The Phase
Theorem,” Int J Stud Manage Org, 2(2), 1972, 156−182.