CSMR - Computer Science Master Research, Vol. 1, No. 1 (2011)
Agent-based System for Emotion Generation
Alexandra Ciortan, Mihaela-Alexandra Puica
University POLITEHNICA of Bucharest
Faculty of Automatic Control and Computers, Computer Science Department
Emails: [email protected], [email protected]
Abstract
The domains of Artificial Intelligence and Multi-Agent Systems have always searched for
ways of developing systems that model human behaviour. They began by adding
intelligence to agent-based systems, but then realized that this is not enough. One
important factor that influences human behaviour is emotion. Machines have to learn not only to
recognize emotions, but also to generate them. In this paper we focus on the second
research direction: we have built a system that generates emotions for a general, universal
situation. Later on, by adding a knowledge base to this agent, the system can be applied to a
specific domain.
Keywords: agent-based system, emotion generation
1. Introduction
Humans are complicated. They are not always rational. This is because, among other reasons,
they act under the influence of emotions. Therefore, modelling human behaviour includes
modelling emotions - both their generation and their recognition. Currently, we are concentrating on
generating emotions. Several models of agents displaying emotions exist, but most often they
were developed for a specific domain - for example e-learning, where an artificial tutor
shows emotion based on its tutoring results. What we present in this paper is
an agent-based system that outputs emotions based on a set of general rules. By general rules
we mean context-independent rules, which can be extended with a domain-dependent knowledge
base to match a specific situation.
In what follows, section 2 of the paper will present the current state of the art in the
domain. Section 3 will show the framework of the system. Sections 4 and 5 will describe the
model of the agent and its architecture. In section 6 a first attempt at an implementation of the
system is illustrated, and finally in section 7 some conclusions will be drawn.
2. Related work
Modelling human behaviour has always been a constant preoccupation of scientists. They
have built various systems, but each had its limitations. Psychology came with a solution:
people do not act only on rational grounds; they also take into account
(unwillingly) the emotions they experience. Consequently, a new area of research appeared:
affective computing.
2.1. Theories of emotions
To begin with, we must first understand emotions. As we all know, people can experience a
variety of emotions and emotional states, from joy and happiness to anger and disgust.
Scientists have tried to categorize these using different criteria. One classification separates
emotions into basic and complex. The first category consists of emotions universally felt
by all humans, while the latter consists of emotions specific to an individual or to a culture.
According to [1], basic and complex emotions can be structured as follows:
Table 1. Basic emotions [1]

Basic emotion    Basic opposite
Joy              Sadness
Acceptance       Disgust
Fear             Anger
Surprise         Anticipation
Sadness          Joy
Disgust          Acceptance
Anger            Fear
Anticipation     Surprise

Table 2. Complex emotions [1]

Complex emotion    Composed of…            Complex opposite
Optimism           Anticipation + Joy      Disappointment
Love               Joy + Acceptance        Remorse
Submission         Acceptance + Fear       Contempt
Awe                Fear + Surprise         Aggressiveness
Disappointment     Surprise + Sadness      Optimism
Remorse            Sadness + Disgust       Love
Contempt           Disgust + Anger         Submission
Aggressiveness     Anger + Anticipation    Awe
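The compositions in Table 2 can be sketched as a symmetric lookup over pairs of basic emotions. The class and method names below are our own illustration, not part of any cited system:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: complex emotions as pairs of basic emotions (Table 2).
public class EmotionComposition {
    private static final Map<String, String> COMPLEX = new HashMap<>();

    private static void put(String a, String b, String complex) {
        // Store both orders so the lookup is symmetric.
        COMPLEX.put(a + "+" + b, complex);
        COMPLEX.put(b + "+" + a, complex);
    }

    static {
        put("Anticipation", "Joy", "Optimism");
        put("Joy", "Acceptance", "Love");
        put("Acceptance", "Fear", "Submission");
        put("Fear", "Surprise", "Awe");
        put("Surprise", "Sadness", "Disappointment");
        put("Sadness", "Disgust", "Remorse");
        put("Disgust", "Anger", "Contempt");
        put("Anger", "Anticipation", "Aggressiveness");
    }

    // Returns the complex emotion composed of two basic ones, or null if none.
    public static String compose(String a, String b) {
        return COMPLEX.get(a + "+" + b);
    }

    public static void main(String[] args) {
        System.out.println(compose("Joy", "Acceptance")); // prints "Love"
        System.out.println(compose("Surprise", "Fear"));  // prints "Awe"
    }
}
```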
Besides the types of emotion, there are several theories that try to explain how emotions
arise [8]. The process starts when an event occurs and ends when the emotion arises, but in
between the steps differ. Some say that the event causes physiological changes in our
body and, as we notice them, we interpret the situation based on these changes and we
experience the emotion. Other theories say that physiological changes and emotions arise at the
same time, without any thought, interpretation or reasoning about the event. Another theory
states that when an event occurs, the physiological changes in our body determine us to
reason about the situation and, deciding its nature, we experience a certain emotion. For
example, if one is alone on a dark street at night and hears footsteps behind, the
body will start to tremble, and one will reason that walking alone at night on a dark
street is dangerous, so one will experience fear. In this case, the process is based
on previous experience (personal or other people's). A person that
has never heard, seen or experienced the same situation would not know what to think about it
when it first happens. Only after the consequences are seen can the situation be appraised.
The situation is thus learned, and the next time it happens its possible consequences
will be foreseen and an emotion will be experienced.
2.2. Existing emotional models
Several models of emotion-enhanced agent-based systems have been developed. A number
of them are based on the OCC model, a cognitive appraisal theory developed by
Ortony, Clore and Collins [2]. This model states that an agent appraises an event based on its
goals, standards and attitudes, and that emotions arise as a result of this appraisal.
In the PETEEI project [3], the agent evaluates the event in the context of its expectation
and desirability. The expectation of the user's actions is measured probabilistically, based on the
frequency of those actions, and the desirability is learnt by reinforcement learning. Learning is
the main point of this project: the agent dynamically learns from the events and the user's
actions, using four types of learning. The model was shown to simulate emotions
reasonably well, but its structure and reasoning are very simple.
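The frequency-based expectation measure described for PETEEI can be approximated by a simple counter. This is a rough sketch under our own naming assumptions, not PETEEI's actual code:

```java
import java.util.HashMap;
import java.util.Map;

// Rough sketch of frequency-based expectation, in the spirit of PETEEI:
// the more often a user action has been observed, the more it is expected.
public class ActionExpectation {
    private final Map<String, Integer> counts = new HashMap<>();
    private int total = 0;

    public void observe(String action) {
        counts.merge(action, 1, Integer::sum);
        total++;
    }

    // Probability estimate of the action, from observed frequencies.
    public double expectation(String action) {
        if (total == 0) return 0.0;
        return counts.getOrDefault(action, 0) / (double) total;
    }

    public static void main(String[] args) {
        ActionExpectation exp = new ActionExpectation();
        exp.observe("feed");
        exp.observe("feed");
        exp.observe("pet");
        System.out.println(exp.expectation("feed")); // ~0.67
    }
}
```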
In [4], the authors developed a BDE model for behaviour anticipation. They started from
the BDI model (belief, desire, intention) and added the influence of emotions on the agent's
behaviour, obtaining the BDE model (belief, desire, emotion). BDE has been used in [5] to
conceptually describe an emotion-enhanced artificial tutor and in [6] to implement an
artificial tutor that recognizes and generates emotions. In these works, a set of emotions was
chosen to be represented and taken into consideration by the agent. Their disadvantage is that
they display emotions in the particular case of an e-learning system, so the agent is domain
specific. In this paper we describe a general agent, one that generates emotions based on
general rules and that can be further used in any specific situation by adding a knowledge
base for that specific domain. As far as we know, there is currently no such general
agent-based system. Our system includes the learning methods of PETEEI when generating
emotions, so we can say that it is a BDE system with evolving emotional intelligence.
3. Framework
The framework has a three-layer architecture, corresponding to the three steps that need to be
taken in order to achieve the final goal - an agent capable of generating emotions.
The first layer is that of the developer who creates the general reasoning of the system. It
takes as input the knowledge base of the specific domain, the set of all possible emotions,
actions and characteristics of the agent, and the sequence of events, and it generates the
emotions and the actions of the agent. The core of the first level is a CLIPS engine - a rule
interpreter that applies the given rules to obtain actions and emotions from facts and current
emotions.
Figure 1. Framework architecture
The second layer corresponds to a second developer, the one who applies the framework
to create a particular agent. The developer on level two sends all the input to the developer on
level one by creating an .xml file in which they write the rules and the set of all possible
emotions, actions, characteristics and events. The developer on level one does all the
reasoning based on these input parameters.
Finally, at the third layer stands the user who interacts with the agent. It is the testing layer,
where the agent shows what it does and what it feels. The user is the one who chooses what
event happens, from the list of possible events (defined by the developer on level two).
We have built both layer one and layer two, so that there is an example of an already
created agent that shows how the whole framework works.
4. Model of the agent
Starting from the system previously developed in [6], we have built a framework for affective
intelligent agents that generates emotions independently of the domain. This framework is
based on a rule-based system and uses information coming from three different sources:
the emotions knowledge base, the domain-specific knowledge base and the agent's emotional
memory.
The first one contains the list of emotions that the agent can experience and will be used
differently in the emotion generation process according to the specific agent characteristics
that can be found in the domain-specific knowledge base. The second one will be different for
every agent developed using the framework and shall contain the goals that the agent is trying
to achieve, together with its initial desires and beliefs. The integrated BDE architecture makes it
possible for this domain-specific knowledge base to be revised each time an event or agent
action occurs, giving the agent the opportunity to reconsider its initial desires and beliefs
and to update them in accordance with the current state of the world. The emotional memory
will be used for storing the past events that occurred in the lifetime of the agent, together with
the feelings it experienced at that time. The emotion generation process will take these
past experiences into account and will in this manner change its behaviour over time.
Figure 2. Agent model
The emotion generation module will extract the emotions that apply to the current state
of the agent by matching the emotion generation rules with the agent characteristics and the
event that causes the state of the agent to change. The resulting information will be further
processed by the learning module, which will store in the emotional memory the connection
between the event and the experienced emotion.
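The event-emotion connection stored by the learning module can be sketched as a small associative memory. This is an illustrative simplification with invented names, not the actual implementation:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the emotional memory: past events paired with the emotions
// experienced at the time, so that later appraisals can reuse them.
public class EmotionalMemory {
    private final Map<String, List<String>> memory = new HashMap<>();

    public void remember(String event, String emotion) {
        memory.computeIfAbsent(event, k -> new ArrayList<>()).add(emotion);
    }

    // Emotions previously associated with this event (empty if unseen).
    public List<String> recall(String event) {
        return memory.getOrDefault(event, List.of());
    }

    public static void main(String[] args) {
        EmotionalMemory memory = new EmotionalMemory();
        memory.remember("thunder", "fear");
        System.out.println(memory.recall("thunder")); // [fear]
    }
}
```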
5. System architecture
As we pointed out in the previous section, the agent framework is based on a BDE
architecture, extended so that the agent's behaviour and actions are also
influenced by its past experiences.
Figure 3. Agent architecture
The agent perceives the environment, and the percepts influence the belief revision and
the emotion generation processes. Belief revision leads to new beliefs which, in turn, will
influence the emotion generation process and the resulting agent behaviour. In this manner,
the agent's behaviour will be associated directly with an emotional state, and this will have an
impact on its future actions.
The BDE architecture has been modified in order to allow learning
emotion-event associations from previous experiences. We start from a basic emotional model
and use learning techniques similar to the ones described in the PETEEI project:
learning about event sequences associated with experienced emotions, learning about
other agents' actions and moods, and Pavlovian conditioning.
6. Implementation
The implementation was done in Java and CLIPS. The rule interpreter was written in CLIPS,
while the rest of the application was written in Java. The two environments were linked by
CLIPSJNI (CLIPS Java Native Interface), a Java API for CLIPS.
6.1. Level one - general framework
At level one there are two main points that need to be solved. The first one is the language in
which the developers on level two write their specifications, and the second one is the
rule interpreter which generates the rules.
6.1.1. Parser
We chose to let the developer write the specifications in an .xml file because XML is a
well-known language and a W3C recommendation, so there is no need to learn a new
language. The file should have the following form:
Figure 4. BNF form of the XML file
The knowledge base contains the current facts, characteristics and emotions, the rules that the
agent obeys, and the lists of all the characteristics that the agent can have, the emotions that
it can experience, the actions it can do and the events that can take place in the environment.
A rule tests facts, characteristics, emotions and events, and outputs facts, characteristics,
emotions and actions. A fact is a piece of information that the agent has about the world. The
state of the agent is given by its characteristics and by its emotions. The result of all these
evaluations gives the actions that the agent will take and may change its knowledge base by
adding or removing facts, characteristics and emotions.
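A hypothetical sketch of such an .xml file, covering the parts just described, could look as follows. All element names here are our own illustrative assumptions; the actual grammar is the one given in Figure 4:

```xml
<!-- Illustrative sketch only; not the actual schema of the framework -->
<agent>
  <characteristics>
    <characteristic>hungry</characteristic>
  </characteristics>
  <emotions>
    <emotion>joy</emotion>
    <emotion>sadness</emotion>
  </emotions>
  <actions>
    <action>eat</action>
  </actions>
  <events>
    <event>receive-food</event>
  </events>
  <facts>
    <fact>owner-present</fact>
  </facts>
  <rules>
    <rule>
      <if>
        <characteristic>hungry</characteristic>
        <event>receive-food</event>
      </if>
      <then>
        <action>eat</action>
        <emotion>joy</emotion>
      </then>
    </rule>
  </rules>
</agent>
```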
The .xml file is parsed using a DOM parser, and structures for each of the constructions
are created, as shown below:
Figure 5. Java data structures
There is a valid field for each of the fact, characteristic, emotion and action structures
because, once introduced in the knowledge base, they are never removed: when they are no
longer true, they are just marked as false.
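The valid flag can be sketched like this. The class below is an illustrative stand-in for the structures of Figure 5, not the actual code:

```java
// Sketch of a knowledge-base entry with the 'valid' flag: entries are
// never removed from the base, only marked false when they stop holding.
public class KbEntry {
    private final String name;
    private boolean valid = true;

    public KbEntry(String name) { this.name = name; }

    public String getName() { return name; }
    public boolean isValid() { return valid; }

    // "Removing" an entry just invalidates it; it stays in the base.
    public void invalidate() { valid = false; }

    public static void main(String[] args) {
        KbEntry hungry = new KbEntry("hungry");
        hungry.invalidate();                  // "removed", yet still present
        System.out.println(hungry.isValid()); // false
    }
}
```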
6.1.2. CLIPS engine
The CLIPS engine receives the parameters given in the .xml file and constructs the rules
based on them. The CLIPS program also defines structures similar to those constructed in
Java: it has templates for the rule, for the agent's current state, for the event that occurs in the
environment and for the changes brought to the agent. All these structures have the
following slots in common:
Figure 6. CLIPS data structures
This is because they constitute the central part of the reasoning: the beliefs and the state of
the agent determine its actions and its emotions. The facts, emotions and characteristics are
split into valid and invalid because in the current state they may or may not be true (see the
Java structures in the previous section). A rule removes a fact, characteristic or emotion by
making it false.
6.1.3. Graphical interface
The interface was designed using Java Swing and has the purpose of showing how the system
works. It is simple, easy to use and easy to understand.
The user loads the .xml file in the upper side of the window. When loaded, the file is
parsed and the structures are created and shown in the areas below, together with buttons for
each event. The leftmost area presents the rules that the agent obeys, the rightmost area
shows the event-emotion associations in the emotional memory, and in the middle the data
structures are illustrated: what the agent believes at the current moment (facts), what
the agent feels (emotions), what its characteristics are and what actions it is taking in the
current state. The lower side of the window presents a console where everything that happens
is shown chronologically. When the user pushes an event button, the event is sent to the CLIPS
engine, which reasons to generate the emotion and make the changes in the knowledge base.
At every moment the current state of the agent is updated and displayed.
Figure 7. User interface
6.2. Level two - domain specific agent
We decided to apply the framework to a pet, namely a dog agent. This is because a dog is
much simpler than a human: it does not experience the whole set of emotions that people do,
and it cannot perform the whole set of actions that people can. Put shortly, "dogs behave in
doggie ways and not human ways" [7].
For the purpose of our example, we had to choose a list of all possible parameters that a
rule can have. These are summarized below:
Figure 8. Parameters
Each rule respects the format specified earlier. For example, the following rule says that if
the agent is hungry and it receives food, then it eats, it is not hungry anymore and it
experiences joy:
Figure 9. Example of a rule
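For intuition, the same hungry-plus-food rule could be expressed directly in CLIPS roughly as below. The template and slot names are our own illustration, not the code the engine actually generates:

```clips
; Illustrative templates: entries carry a name and a valid flag.
(deftemplate characteristic (slot name) (slot valid))
(deftemplate event (slot name))
(deftemplate emotion (slot name) (slot valid))
(deftemplate action (slot name))

; If the agent is hungry and receives food: eat, stop being hungry, feel joy.
(defrule eat-when-hungry
   ?h <- (characteristic (name hungry) (valid TRUE))
   (event (name receive-food))
   =>
   (modify ?h (valid FALSE))          ; not hungry anymore (marked false)
   (assert (action (name eat)))
   (assert (emotion (name joy) (valid TRUE))))
```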
The specifications for the dog agent may seem trivial, and that is probably the case. But the
intention was not to create a real, complete emotional agent, but just to show how the system
works. The reasoning in the rule-based system is done on general, context-independent data.
It is true that the system cannot be demonstrated without the specifications in the .xml file,
but that does not make the framework less general.
7. Conclusions
Our work was inspired by the fact that, as far as we know, there is currently no other
emotional agent that generates emotions without considering a specific domain. Each
emotional agent architecture that we have studied models its structure to fit a domain-specific
pattern, which poses serious limitations when considering further developments.
What our work brings to the field of intelligent agents is the possibility of using a
common framework for the creation of individual and unique emotional intelligent agents,
each exhibiting different domain-specific behaviour according to the knowledge base that is
initially provided and the actions that it performs in the environment.
This development will facilitate a further study of how multiple individual emotional
agents, with different interests and knowledge bases, are able to influence each other
when placed in a common environment. This will give the opportunity to study the manner in
which emotions are generated within a network of emotional agents that continuously
influence each other while displaying different emotional behaviours.
References
[1] Arora, A. Emotions, http://www.anmolarora.com/?p=299, retrieved May 20th, 2010
[2] Ortony, A.; Clore, G.; Collins, A. The Cognitive Structure of Emotions, Cambridge
University Press, Cambridge, UK, 1988
[3] El-Nasr, M. S.; Ioerger, T. R.; Yen, J. PETEEI: A Pet with Evolving Emotional
Intelligence, Texas A&M University Technical Report
[4] Florea, A. M.; Kalisz, E. Behavior Anticipation Based on Beliefs, Desires and
Emotions, International Journal of Computing Anticipatory Systems, CHAOS,
Liege, Belgium, 2004, vol. 14, pp. 37-47
[5] Florea, A. M.; Kalisz, E. Embedding Emotions in an Artificial Tutor, Proceedings of
the Seventh International Symposium on Symbolic and Numeric Algorithms for
Scientific Computing (SYNASC 2005), 2005
[6] Puică, M. A. Affective Intelligent Agents, Diploma Project, 2009
[7] Dog and cat pet tips, Dog behaviour - understanding your dog's behaviour,
http://www.pets.ca/pettips/tips-50.htm, retrieved June 27th, 2010
[8] Heffner, C. L. Psychology 101, Chapter 7: Motivation and Emotion,
http://allpsych.com/psychology101/emotion.html, retrieved May 20th, 2010