
Bachelor in Computer Science Engineering
Universidad Politécnica de Madrid
Escuela Técnica Superior de
Ingenieros Informáticos
Final Project
Development of a behavior management system for the
aerial robot software framework Aerostack
Author: Alberto Camporredondo Portela
Director: Martín Molina
MADRID, JUNE, 2017
Acknowledgements
Thanks to my family and friends for all the support. Thanks to my tutor Martín Molina
and all my colleagues and friends at the laboratory for all the help given during this year.
RESUMEN
This project focuses on the executive layer, a key part when it comes to carrying out the various behaviors sent from higher layers such as the deliberative layer.
This document covers the analysis, design, implementation and validation of the different components of a behavior management system for aerial vehicles. These components carry out activities such as:
• Coordination and execution of behaviors, where execution decisions must be made quickly and errors occurring during that execution must be handled.
• Support for behavior integration, so that behaviors created by any developer are fully integrated into the system without requiring deep knowledge of how it works.
• Interpretation of a catalog, which allows any developer to easily specify information about all the behaviors so that the system knows how they interact with each other.
• Activation and management of the drone’s resources, such as planning processes.
ABSTRACT
This project focuses on the executive layer, a key component that allows the drone to execute behaviors received from higher layers such as the Deliberative Layer.
The document contains the analysis, design, implementation and validation of the different components of a behavior management system for aerial vehicles. These components perform the following tasks:
• Coordination and execution of behaviors. Execution decisions must be made quickly, and errors must be handled during execution.
• Behavior integration support. New behaviors written by developers must be fully integrated into the system without requiring deep knowledge of how the system works.
• Catalog interpretation. The catalog allows developers to specify behavior properties easily, so that the system knows how behaviors interact with each other.
• Activation and management of the drone’s resources, such as planning processes.
CONTENTS

Resumen
Abstract
1 Introduction
1.1 Project objectives
1.2 Document organization
2 Context of the project
2.1 The executive system of robot architectures
2.2 Robot Operating System (ROS)
2.3 The Aerostack framework
2.3.1 Software Architecture
2.3.2 The execution engine
3 Development of the behavior management system
3.1 Behavior Process
3.1.1 Process creation (Drone Process)
3.1.2 Design
3.1.3 Services and topics
3.1.4 UML diagram and algorithms
3.2 Behavior Coordinator
3.2.1 Design
3.2.2 Services and topics
3.2.3 UML diagram and algorithms
3.3 Behavior Specialist
3.3.1 Design
3.3.2 Services and topics
3.3.3 UML diagram and algorithms
3.4 Resource Manager
3.4.1 Design
3.4.2 Services and topics
3.4.3 UML diagram and algorithms
3.5 Example of a behavior
4 Software Validation
4.1 Testing using ROS
4.2 Validation inside Aerostack
4.2.1 Simulator
4.2.2 Real flight
5 Conclusions
5.1 Future work
6 Bibliography
7 Appendix A – Behavior Catalog
8 Appendix B – UMLs Behavior Specialist
9 Appendix C – Validated mission
1 INTRODUCTION
Nowadays, robots are more integrated into our society than ever. This integration is progressing gradually, and every sector is becoming more automated (such as the automotive or medical sectors). In the aerial field there are different lines of research, but there is a strong demand for better and safer autonomous vehicles.
The research group CVAR (Computer Vision and Aerial Robotics) at the Technical University of Madrid (UPM) is working on making aerial vehicles more autonomous using Artificial Intelligence techniques and computer vision. For this purpose, they are developing a general software framework for robots called Aerostack (aerostack.org). This software was originally created in 2013, and the first release as a mature, documented product was in June 2016. Different teams are working on developing, improving and maintaining new processes and documentation.
Aerostack proposes a hybrid architecture based on AI robotics. It is composed of different layers (such as reactive and deliberative). A key component of this architecture is the executive system, which translates high-level concepts into low-level actions of the drone. This transformation is not as simple as it seems, because there are different ways of managing these concepts and actions; this project represents one of these many possible approaches.
1.1 PROJECT OBJECTIVES
The main objective of this project is the development of a software solution that can be placed inside the executive system and that must be capable of coordinating and executing different processes, managing the errors that occur during flight, and activating the different processes associated with the vehicle. These characteristics have a common goal: to verify that high-level concepts are correctly translated into low-level commands.
The following stages are followed to achieve this objective:
• Analysis: This part contains a study of different architectures, software, state-of-the-art technologies, programming languages and methodologies to meet the established requirements.
• Design and implementation: This part presents the detailed definition of the processes and solutions developed. In addition, the implementation of the processes and the technology used are explained.
• Validation: This task verifies that the developed software works as expected and meets the requirements. For this part, an exhaustive software validation is carried out by testing each component individually as well as the whole system.
1.2 DOCUMENT ORGANIZATION
The document is divided into five chapters. First come the “Introduction” and “Context of the project” chapters, where the main idea and the concepts needed to understand this project are presented. Then there is an explanation of the design of the different processes and why they were created, together with the implementation of each C++ process. After this, the validation of the developed software is explained. Finally, some conclusions are drawn based on the results obtained.
2 CONTEXT OF THE PROJECT
This chapter gives the reader the background needed to understand the fundamental parts of this project.
2.1 THE EXECUTIVE SYSTEM OF ROBOT ARCHITECTURES
In robotics there are different types of architectures, but the one most used in recent years is the hybrid approach. It combines a Deliberative Layer and a Reactive Layer tied together by an Executive Layer. The Deliberative Layer (plan) [1] is in charge of the more cognitive actions, such as creating better trajectories, supervising its own performance or selecting the best behavior to activate in different situations. The Reactive Layer (act-sense) [1] can be considered a low-level layer in which sensor-actuator pairs activate different actuators based on the data received from different sensors. Figure 2.1 represents this idea.
There are different approaches to handling the sensor-actuator pairing. In one of them, each sensor is associated with a single actuator; in another, several sensors can activate one or more actuators.
With the above paragraphs and figure 2.1 in mind, the Executive Layer can be described as the layer that translates complex missions from the Deliberative Layer into commands for the Reactive Layer. This means that high-level commands, such as movement tasks or trajectories to follow, are translated into low-level commands that eventually call low-level controllers.
Figure 2.1: A schema of a hybrid system [1]
As explained in [2], this kind of architecture is the most widely used for two reasons:
• New technologies enable the simultaneous execution of both the Deliberative Layer and the Reactive Layer.
• It encourages software modularity: purely reactive applications are allocated in the Reactive Layer, apart from the Deliberative Layer. The same happens within the Deliberative Layer.
2.2 ROBOT OPERATING SYSTEM (ROS)
ROS (ros.org) is a middleware used in the robotics field that abstracts away the communication between processes. It follows the publisher/subscriber pattern: publishers are processes that publish messages, and subscribers are processes that receive them. There must also be an intermediary process, called rosmaster, that connects publishers to subscribers.
ROS has two ways of handling the communication between processes: (1) a blocking technique called services, which waits for a response (useful when the data sent to a process needs to return a result), and (2) a non-blocking method called topics, which just sends a message without waiting for a response. Figure 2.2 shows both types of communication. Furthermore, every time a message is received via a topic or a service, the callback function associated with it is called.
Another important aspect of these methods: (1) services are one-to-one, meaning other processes cannot read what was sent through the service; (2) topics are one-to-many, so anyone can read what is published.
Figure 2.2: Communication Process schema
In addition, ROS provides different commands to launch and test processes. Examples of ROS commands are: roslaunch, to launch any ROS process; rosservice, to manage ROS services; rostopic, to manage ROS topics; and rosmsg, to inspect ROS messages. These commands can be used with different arguments to perform different operations.
2.3 THE AEROSTACK FRAMEWORK
Aerostack [3] [4] is a software framework for aerial robotics. It uses ROS for communication between processes. It is also characterized by a modular architecture that makes it easy to create and integrate new processes. Other characteristics are:
• Multi-agent systems: several vehicles can perform a mission, communicating with each other via WLAN.
• Various operating modes: autonomous or teleoperated.
• Hardware independence: it can easily be adapted to different hardware (for example, the Parrot AR.Drone 2.0 or a custom drone).
2.3.1 Software Architecture
As can be seen in figure 2.3, the current architecture is an extension of the three-layer hybrid architecture mentioned in the first section. Aerostack has a total of five layers: the Social Layer, the Reflective Layer, the Deliberative Layer, the Executive Layer and the Reactive Layer.
Figure 2.3: Aerostack Architecture
The objectives and components of each layer are explained below:
• Reactive Layer: it is composed of the “Feature Extraction System” and the “Motor System”. The former receives information provided by sensors and the latter sends commands to the vehicle.
• Executive Layer: it is composed of the “Executive System”, which sends commands to the Reactive Layer based on what is sent by the Deliberative Layer, and the “Situation Awareness System”, which builds a model of the environment based on the information received from the “Feature Extraction System”.
• Deliberative Layer: it is composed of the “Planning System”, which is in charge of task planning.
• Reflective Layer: it reacts to any system failure and informs the Deliberative Layer about the current situation, which helps the Deliberative Layer in decision making. The Reflective Layer also monitors the tasks set by the Deliberative Layer and informs the Social Layer about their performance. It is composed of the “Action Monitor”, the “Process Monitor” and the “Problem Manager”.
• Social Layer: it is in charge of the communication with external elements, which can be operators or other vehicles.
2.3.2 The execution engine
In order to improve the usability of Aerostack, a new software component called the execution engine [5] has been specified. This engine provides the vehicle with effective execution capabilities and incorporates different technical solutions used in robotics and artificial intelligence. Figure 2.4 shows the architecture that places the executive layer within the Aerostack framework.
Figure 2.4: Execution layer in the architecture of an autonomous robot
The main objectives of this engine are simplicity and robustness. These concepts mean that plans should be easy for operators to write, and that robots should be able to react properly to contingencies that happen in unpredictable environments with effective performance.
This executive engine is composed of two parts (figure 2.5): (1) the behavior management system and (2) the belief management system.
Figure 2.5: Executive System architecture [5]
This project is focused on the behavior management system, which is composed of four processes (figure 2.6):
• Behavior Coordinator: it is in charge of coordinating and executing behaviors.
• Behavior Specialist: a process used to centralize information about how to use each behavior.
• Behavior Process: it shields behavior developers from the communication with the other components of this execution engine.
• Resource Manager: it is in charge of starting the processes that support the execution of behaviors, trying to use the robot’s resources efficiently.
Figure 2.6: Behavior management system architecture [5]
3 DEVELOPMENT OF THE BEHAVIOR MANAGEMENT SYSTEM
The following sections explain the design and development of the processes that make up the behavior management system: the behavior process, the behavior coordinator, the behavior specialist and the resource manager. Almost every process name includes the word behavior, because the most important concept of this system is the behavior.
The concept of behavior [5] represents anything the drone is able to do. Traditionally, this concept has been used in robotics as a way of encapsulating a set of perception algorithms and actuation controllers to generate a particular perception-actuation pattern. In this system, however, behaviors are capsules of code that provide an abstraction, hiding low-level technical details and complexity. For example, figure 3.1 shows some behaviors the vehicle can perform.
With this concept in mind, every mission can be specified as a set of behaviors that can be activated or deactivated.
Another aspect worth mentioning is capabilities, which are sets of available resources. This concept exists because a mission may use different kinds of physical devices on the vehicle, which need to be managed efficiently. For that reason, the resource manager is in charge of activating and deactivating them.
Figure 3.1: Example of behaviors [5]
3.1 BEHAVIOR PROCESS
3.1.1 Process creation (Drone Process)
This process acts as a state machine controlling the execution of a process. Figure 3.2 shows all the possible states. When a process is created, it must call the setUp function, which sets the state to “Ready to start”. The start and stop functions then switch the state between “Running” and “Ready to start”.
States can be changed by calling these functions inside the code (such as in the main function of a process) or through the callbacks associated with the advertised services (“process_name/start” and “process_name/stop”).
Figure 3.2: Drone Process flow diagram [6]
3.1.2 Design
Each behavior created must integrate with the system easily and without a deep knowledge of how the system works. For that reason, the Behavior Process class was created. It shields the developer from the operations related to the behavior management system and lets them focus on the development of the behavior itself.
As the system is developed in an object-oriented language, the most natural way to automate and control behaviors is through inheritance. The developer just needs to inherit from this class when a behavior is created and fill in the functions provided. That way, when the behavior is launched via roslaunch, the process advertises the required services: “behavior_name/start”, “behavior_name/stop” and “behavior_name/check_situation”.
However, this is not simple: there are some limitations related to inheritance and the programming language (C++). These problems, their limitations and the corresponding solutions are described below.
• Behaviors are processes, and every process must inherit from Drone Process because it abstracts away common process features. But as behaviors also share characteristics of their own, they cannot inherit from Drone Process directly; they do so through Behavior Process. For that reason, the functions provided by Drone Process must be overridden by Behavior Process.
• As explained in subsection 3.1.1, when a service is called, a callback is executed. Thus, when the service “/drone1/behavior_X/start” is called, the start function is executed. In this case, although the start and stop functions are defined within the Behavior Process class, because of inheritance they were not the ones being called; the Drone Process functions were called instead, which is not intended, since some behavior operations need to be done within the start function. The chosen solution was to redefine the functions available from Drone Process (start, setUp, run and stop) inside Behavior Process. This allows the Behavior Process functions to perform the needed operations before calling the Drone Process ones.
• One of the main elements of behaviors are events, which are sent for many reasons. As the main objective is to shield the developer, the Behavior Process needs to send these messages, and for that it needs to know the reason for the event. To expose the event code, some variables are accessible via public functions; the developer just needs to assign the correct value.
Another important aspect related to the last point is that each behavior automatically stops after sending its event, which sets the internal state of the process back to “Ready to start”. This also avoids sending more than one event. If, for some reason, the behavior coordinator needs to activate this behavior again, it first removes it from its memory and then activates the behavior received.
3.1.3 Services and topics
As explained in [5], behaviors offer the following operations as services: (1) “behavior/start”, (2) “behavior/stop” and (3) “behavior/check_situation”. Each operation has its corresponding message; figure 3.3 shows these messages.
Figure 3.3: Behavior Process messages
Apart from the operations available as services, when a behavior reaches its target it announces it via a ROS topic called “behavior_event”, which carries the message “behaviorEvent.msg”. The fields of this message are shown in figure 3.4. The “behavior_event_code” can only take one of the termination values shown; if another number is set, the system will not understand it and will discard it.
Figure 3.4: Behavior Event message
3.1.4 UML diagram and algorithms
Figure 3.5 shows the properties of the Behavior Process class. In the variables section, as mentioned, some of these variables are modified by each behavior at different moments: “started”, “finished_event” and “finished_condition_satisfied”. The first one changes to false if an error happens during the ownStart function. The other two change to their respective values when the objective of a behavior is reached, and only if that behavior requires supervision. Other variables, such as the arguments and the timeout, are set when the start callback is executed, which happens when the start service is called.
Figure 3.5: UML Behavior Process
In the functions section, there are several callbacks that manage the communication between the different processes of the behavior management system. There is also a callback called “timerCallback”, used to set the variable “timer_finished” to true when the timeout reaches zero.
Another important aspect: the Behavior Process functions (start, stop, run and setUp) do not directly replace the Drone Process ones. Behavior Process defines its own versions, and the Drone Process ones are called from them. In addition, start and stop return a std::tuple with a boolean indicating whether an error occurred and a string with the error message, if any.
Below is pseudocode explaining the operations performed by each of these functions (start, setUp, stop and run).
Algorithm 3.1: setUp()

    ros.advertise("/"+process_namespace+"/"+behavior_name+"/start", startCallback)
    ros.advertise("/"+process_namespace+"/"+behavior_name+"/stop", stopCallback)
    ros.advertise("/"+process_namespace+"/"+behavior_name+"/check_situation", situationCallback)

    ... // Different operations from DroneProcess
    ownSetUp()
    ... // Different operations from DroneProcess
The setUp function is executed when the behavior is launched with the roslaunch command. It just advertises the minimum services the behavior needs to make available in order to be used. Apart from these services, it performs some operations that change the status of the inherited class, Drone Process.
Algorithm 3.2: start()

    ros.advertise("behavior_event")
    ros.serviceClient("activate_resources")
    ros.serviceClient("cancel_resources")

    activated = activateCapabilities()
    if not activated:
        return Error

    timer.activate(timeout)
    DroneProcess.start()
    if not started:
        return Error
    return Ok
The start function is executed when the “/start” service is called. As the low-level processes need to be started, this function activates each capability associated with this behavior by sending the name of the behavior to the resource manager. After this activation, a timeout is armed to control termination, and the start function of Drone Process is called, which in turn calls the ownStart function. The function returns an error if something failed during ownStart.
Algorithm 3.3: run()

    DroneProcess.run()

    if hasFinished():
        msg = droneMsgsROS.behaviorEvent
        msg.name = getName()
        msg.behavior_event_code = getFinishedEvent()
        ros.publish(msg)
        stop()
The run function is executed in a low-rate loop after calling the setUp function. Thanks to the states of the Drone Process class and the started flag mentioned in the UML, the function only performs operations while in the “Running” state. Basically, the algorithm calls the run function of Drone Process, which calls the ownRun function; then, if the supervised finishing condition returns true, the event message is sent and the behavior stops.
Algorithm 3.4: stop()

    DroneProcess.stop()

    stopped = cancelCapabilities()
    if not stopped:
        return Error

    ros.shutdown("activate_resources")
    ros.shutdown("cancel_resources")
    return Ok
The stop function is simple: it calls the stop function of the Drone Process class, which calls the ownStop function defined in the behavior, and then it cancels every associated capability by sending the name of the behavior to the resource manager. Capabilities are canceled after calling Drone Process because it is preferable to shut down the topics set by the developer first. After this, the services associated with the resource manager are shut down, because the “behavior_event” topic could generate errors (such as discarded messages) if the publisher were shut down while the subscribers were still active.
3.2 BEHAVIOR COORDINATOR
3.2.1 Design
The Behavior Coordinator is in charge of activating any behavior it receives via service, but first it must check whether the received behavior can be activated. There are several reasons why a behavior cannot be activated:
• Situation: the activation of behaviors must follow some logic; some behaviors cannot be executed more than once in the same mission. For example, TAKE_OFF cannot be activated if the drone is already flying. Apart from this verification, behaviors can check other parameters, such as the battery level: if the battery is low, the drone will not take off.
• Incompatibilities: the architecture is capable of executing several behaviors at the same time; for that reason, the behavior coordinator needs to know whether the received behavior can run together with the behaviors already executing.
To resolve the situation issue, behaviors follow a standard naming convention when they are created, so that the Behavior Coordinator knows how to call them. This provides a scalable and modular solution: the behavior coordinator can easily call the different operations of each behavior, such as the situation check, the start operation and the stop operation. As an illustrative example, if a TAKE_OFF is received, the behavior coordinator will call “/drone1/behavior_take_off/check_situation” to run the situation checks coded by the developer of the behavior. The other operations are available as well: “/drone1/behavior_take_off/start” and “/drone1/behavior_take_off/stop”.
To resolve the second issue, the behavior coordinator delegates the task to the behavior specialist. In summary, the behavior coordinator sends the name of the behavior it wants to check together with all the behaviors already executing, and the Behavior Specialist responds with the inconsistent processes, if any.
Another aspect of this process is the need to manage the errors that may happen while a behavior is executing. For that, every response to a ROS service call includes a boolean flag indicating whether an error occurred and a string describing the error.
3.2.2 Services and topics
To allow the reception of behaviors (to activate or to cancel them), the process advertises two services, “activate_behavior” and “inhibit_behaviors”, respectively. Each service uses a custom message called “behaviorSrv.srv”, which contains another message called “behaviorCommand.msg” holding a string with the name of the behavior and the arguments the behavior needs at start (some behaviors have no arguments). As they are services, a response needs to be set; here, the response is the error message and the acknowledgement of the operation. Figure 3.6 shows the structure of the messages mentioned.
Figure 3.6: Messages advertised from Behavior Coordinator
Apart from the services advertised by this process, it also publishes the names of the active behaviors on the topic “list_of_active_behaviors”, whose associated message, “listOfBehaviors.msg”, contains an array of all active behaviors.
3.2.3 UML diagram and algorithms
Figure 3.7 contains every function and variable created. The behavior coordinator class inherits from the Drone Process class, as explained in subsection 3.1.1. In the variables part (first block), the diagram contains a variable called “active_behaviors” that stores the names of the active behaviors, as well as some ROS variables. Focusing on the functions part (second block), there are several callbacks and functions that make the process work. In addition, some of the functions return a std::tuple containing the returned data as well as the boolean and the string representing the error message, if any. The std::tuple allows a function to return more than one value without further changes in the code if the function changes.
Figure 3.7: Behavior Coordinator UML
To explain in greater detail how this process works, the following pseudocode is provided.
Algorithm 3.5: activateBehavior(string behavior, string arguments)

    msg = droneMsgsROS.situation
    valid = ros.call("behavior_"+behavior+"/check_situation", msg)
    if not valid:
        return Error

    msg = droneMsgsROS.checkConsistencyBehaviors
    msg.request.behavior = behavior
    msg.request.active_behaviors = active_behaviors
    consistent = ros.call("check_consistency", msg)
    if not consistent:
        for _behavior in behaviors:
            stopBehavior(_behavior)
            removeFromMemory(_behavior)  # Remove from active_behaviors

    msg = droneMsgsROS.defaultValues
    msg.request.behavior = behavior
    values = ros.call("default_values", msg)

    msg = droneMsgsROS.startBehavior
    msg.request.timeout = values.timeout
    msg.request.argument = arguments
    started = ros.call("behavior_"+lowercase(behavior)+"/start", msg)
    if not started:
        return Error

    addBehaviorToMemory(behavior)  # Add to active_behaviors
    ros.publish("list_of_active_behaviors", active_behaviors)
    return Ok
As it can be seen from the pseudo code above, when a behavior is received via “active_behaviors” the process does the following operations in sequence:
1. Check situation. The process asks the behavior whether it is consistent with the current beliefs by calling “/drone1/behavior_X/check_situation”, where X represents any behavior such as TAKE_OFF or GO_TO_POINT.
2. Check consistency with active behaviors. It sends the received behavior together with all active behaviors in a message called “checkConsistencyBehaviors.msg” for consistency verification. The service responds with a boolean: true means the behaviors are consistent and false means they are not.
3. Get catalog data. If every check so far has returned positive, the process asks the behavior specialist for the data stored in the catalog for the received behavior.
4. Execute (start) behavior. The data obtained in the previous step is now sent within the call. In addition, as introduced in previous sections, behaviors are started following a naming standard.
5. Add to memory. The process must save the executing behaviors for consistency checking. For this reason, when a behavior is activated it is also saved in a variable called “active_behaviors”, as mentioned before.
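The five steps above can be traced with a small Python sketch written in the same style as the pseudocode. The ROS service calls are replaced by stubs whose responses are hypothetical; only the service names and the order of the checks follow the algorithm.

```python
# Sketch of the activation sequence of Algorithm 3.5 with stubbed ROS calls.
# The stub responses below are hypothetical stand-ins for the real services.

active_behaviors = []

def ros_call(service, msg=None):
    # Stubbed services: every check succeeds in this sketch.
    return {"check_situation": True,
            "check_consistency": True,
            "default_values": {"timeout": 15},
            "start": True}[service.split("/")[-1]]

def activate_behavior(behavior, arguments):
    # 1. Check situation against the current beliefs.
    if not ros_call("behavior_" + behavior + "/check_situation"):
        return "Error"
    # 2. Check consistency with the already active behaviors.
    if not ros_call("check_consistency", {"behavior": behavior,
                                          "active_behaviors": active_behaviors}):
        # Inconsistent behaviors would be stopped and forgotten here.
        active_behaviors.clear()
    # 3. Retrieve catalog data (e.g. the timeout) from the specialist.
    values = ros_call("default_values", {"behavior": behavior})
    # 4. Start the behavior, following the naming standard.
    if not ros_call("behavior_" + behavior.lower() + "/start",
                    {"timeout": values["timeout"], "argument": arguments}):
        return "Error"
    # 5. Keep the behavior in memory for future consistency checks.
    active_behaviors.append(behavior)
    return "Ok"
```

Replacing the stubbed dictionary with real ROS service proxies would reproduce the actual coordinator flow.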
On the other hand, as mentioned before, behaviors can be inhibited via “inhibit_behaviors”. The following pseudocode shows how this is done.
Algorithm 3.6: inhibit_behavior(string behavior)

if isActive(behavior):  # if it is in active_behaviors
    stopped = ros.call("behavior_"+toLowerCase(behavior)+"/stop")
    if not stopped:
        return Error
    removeFromMemory(behavior)  # Remove from active_behaviors
    ros.publish("list_of_active_behaviors", active_behaviors)
    return Ok
return Error
The code is fairly simple. The coordinator checks whether the behavior is active and, if it is not, returns an error. If it is active, the behavior coordinator tries to stop the behavior and, if it succeeds, removes the behavior from the list of active behaviors.
In addition, when a behavior finishes, it sends an event with the termination cause. The behavior coordinator receives this event and removes from memory the behavior that sent it. This event is sent via the topic “behavior_event”.
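The event handling just described can be sketched as a simple callback; the message fields used below are hypothetical, since only the topic name “behavior_event” is fixed by the design.

```python
# Sketch of the "behavior_event" handling: when a behavior reports its
# termination, the coordinator forgets it. The field names are hypothetical.

active_behaviors = ["TAKE_OFF", "PAY_ATTENTION"]

def behavior_event_callback(event):
    # event is a dict standing in for the ROS message on "behavior_event"
    behavior = event["name"]
    if behavior in active_behaviors:
        active_behaviors.remove(behavior)  # drop it from memory
        # here the coordinator would republish "list_of_active_behaviors"
    return active_behaviors

behavior_event_callback({"name": "TAKE_OFF", "termination": "GOAL_ACHIEVED"})
```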
3.3 BEHAVIOR SPECIALIST
3.3.1 Design
Mainly, this process centralizes information about how to use each behavior [5]. All this data is represented in a file called “behavior_catalog.yaml”, written in the YAML language because it is very intuitive and allows inexperienced users to define new behaviors easily. This catalog can be found in Appendix A. Although it is easy to understand, some concepts should be explained. Not every tag within the catalog is equally important: some tags are optional and can be omitted. Tags are divided into three categories depending on how important they are, as the following lines explain.
• Error tags: These tags cannot be omitted; they must be present for the catalog to be correct. If one of these tags is missing, the process stops with an error. These tags are: (1) from “behavior_descriptors”: “behavior”; (2) from “behavior_lists”: “list” and “behavior”; and (3) from “capability_descriptors”: “capability” and “processes”. In addition, “behavior_descriptors” and “capability_descriptors” themselves are considered Error tags as well.
• Warning tags: The system must know the value of these tags but, if a tag is not set, its associated default value is used and a warning is displayed in the terminal. Mostly, these tags are properties of behaviors and capabilities: “timeout”, “recurrent” and “activate_by_default”.
• Dependency tags: These tags are only needed if the configuration implies it. For example, a behavior may not be incompatible with any other behavior, in which case the tag “incompatible_lists” should not be written. If the tag is not set, the process displays an info message stating that the tag is not set and the value assigned. Importantly, if a list of incompatibilities is assigned as a value to a behavior but is not defined inside “behavior_lists”, the process stops and an error is displayed. So, the tag only needs to be set if a list of incompatibilities is written.
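The three tag categories can be illustrated with a small validation routine over a parsed descriptor, shown here as a plain dict so the sketch needs no YAML library. The defaults used for the warning tags mirror the values that appear in the Appendix A catalog, but the exact routine is an illustrative assumption, not the implemented code.

```python
# Sketch of the three tag categories applied to one behavior descriptor.
ERROR_TAGS = ["behavior"]                              # missing -> hard error
WARNING_DEFAULTS = {"timeout": 15, "recurrent": "no"}  # missing -> default + warning

def validate_descriptor(desc, defined_lists):
    messages = []
    for tag in ERROR_TAGS:                       # Error tags: mandatory
        if tag not in desc:
            raise ValueError("missing mandatory tag: " + tag)
    for tag, default in WARNING_DEFAULTS.items():  # Warning tags: defaulted
        if tag not in desc:
            desc[tag] = default
            messages.append("warning: %s defaulted to %s" % (tag, default))
    # Dependency tag: only validated when it is actually present
    for lst in desc.get("incompatible_lists", []):
        if lst not in defined_lists:
            raise ValueError("undefined list: " + lst)
    return desc, messages

desc, msgs = validate_descriptor(
    {"behavior": "GO_TO_POINT", "incompatible_lists": ["motion_behaviors"]},
    ["motion_behaviors", "rotation_behaviors"])
```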
On the other hand, the data inside the file needs to be retrieved fast; for that reason, all the data is kept in memory to avoid the delay of the hard drive. To achieve this, three classes were created to represent the knowledge: the Behavior Descriptor, the Capability Descriptor, and the Argument Descriptor. The following lines explain the purpose of each of these classes, plus two more used to interpret the catalog and to present that data in the terminal in a readable way.
• Behavior Descriptor: This class structures all the information gathered about the behaviors.
• Capability Descriptor: This class saves all the information related to the capabilities, since a capability has information that needs to be managed.
• Argument Descriptor: This class saves all the information related to the arguments of a behavior.
• Behavior Catalog: This class is in charge of interpreting the catalog file. It also uses the Prettify class to display the information gathered.
• Prettify: It acts as a support class (it can also be seen as a library) to display the information loaded from the catalog in a readable way.
One important aspect of this process is that, while the catalog is loading (saving information into memory), errors cannot be reported to other processes due to the ROS technology used. To resolve this issue, if an error happens while the process is loading the configuration, it is displayed in the terminal as an error (in red) and the loading of the file stops. As a consequence, if the user does not notice the error, all attempts to start a behavior will be ineffective.
3.3.2 Services and topics
As can be seen in figure 2.6, the process offers different services. Although some of them were used in the previous section, this section explains the messages associated with each service. After the explanations, all the messages are shown as images to make the explanation easier to follow.
• “check_group_consistency”: This service uses the message “behaviorConsistency.srv” to encapsulate the information sent. It sends the behavior to activate and all the active behaviors, if any. Figure 3.8 represents this message.
• “default_values”: This service uses the message “behaviorDefaultValues.srv”. It contains the default data and some fields indicating whether an error happened while retrieving the information from memory, for example, if the behavior is not in the configuration file. Figure 3.8 represents this message.
• “initiate_behaviors”: This service uses the message “initiateBehaviors.srv”. It is used to ask the Resource Manager to activate the default capabilities via a service called “initiate_resources”. The “initiate_behaviors” service is very useful because it removes temporal dependencies caused by the ROS technology and by the way the processes are implemented. Figure 3.8 represents the message of this service.
• “check_behavior_format”: This service uses the message “behaviorSrv.srv”, which was explained in the previous section (figure 3.6). This service is used to check whether the arguments passed to a behavior are correctly written. If an incorrect argument is passed, the behavior returns an error.
• “check_capabilities_consistency”: This service uses the message “capabilityConsistency.srv”. As can be seen in figure 3.8, the message is composed of the capability to activate and all the active capabilities. If the capabilities are not consistent, or an error happened during the verification, the response contains the corresponding boolean and error message.
• “behavior_resources”: This service uses the message “queryResources.srv”. In the response field, it sends all the capabilities corresponding to the received behavior. Figure 3.8 shows this.
Figure 3.8: Behavior Specialist messages
3.3.3 UML diagram and algorithms
Figure 3.9 shows a UML diagram containing the associations between the classes that shape the behavior specialist. These are the classes mentioned before. The Behavior Specialist class, which inherits from Drone Process, has an instance of the behavior catalog that lets the process access the catalog interpretation. In addition, the behavior specialist, as well as the behavior catalog, uses the descriptor classes (Behavior Descriptor, Capability Descriptor, and Argument Descriptor) to store information.
Figure 3.9: General UML of the Behavior Specialist process
To gain a deeper understanding of how the different classes are defined, the following lines explain what each class consists of. Every figure mentioned can be found in Appendix B.
Behavior specialist class
The behavior specialist class contains as many callbacks as services the process advertises. As explained in section 2.2, this is because each available service is associated with a function that is executed when the service is called.
Descriptor classes
These classes represent the descriptors. They are used to manage information and, for that reason, they contain different get and set functions as well as functions to serialize the information sent via ROS services. The argument and capability descriptors have additional functions to verify the format of an argument and to increment the number of references associated with a capability.
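The format verification mentioned for the Argument Descriptor can be sketched as follows. The rules (a numeric range plus a number of dimensions, or a set of allowed symbols) mirror the “allowed_values” and “dimensions” fields of the catalog in Appendix A, while the class layout and method names are assumptions made for illustration.

```python
# Sketch of an argument-format check, mirroring the catalog fields
# "allowed_values" and "dimensions". Names here are hypothetical.

class ArgumentDescriptor:
    def __init__(self, name, allowed_values, dimensions=1):
        self.name = name
        self.allowed_values = allowed_values
        self.dimensions = dimensions

    def check_format(self, value):
        values = value if isinstance(value, list) else [value]
        if len(values) != self.dimensions:
            return False
        if all(isinstance(v, (int, float)) for v in values):
            low, high = self.allowed_values      # numeric range, e.g. [-100, 100]
            return all(low <= v <= high for v in values)
        return all(v in self.allowed_values for v in values)  # symbolic values

coords = ArgumentDescriptor("COORDINATES", [-100, 100], dimensions=3)
direction = ArgumentDescriptor("DIRECTION", ["BACKWARD", "FORWARD", "UP", "DOWN"])
```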
Behavior catalog class
The UML can be seen in Appendix B. The class is in charge of interpreting and managing the catalog file where every behavior is defined. It contains two variables, called behaviors_map and capabilities_map, that store the behaviors and capabilities. These variables are of type std::map instead of std::vector because: (1) the data inside them does not change and (2) this type finds and erases information quickly, making it one of the most efficient and easiest ways to retrieve information fast. The rest of the functions basically read and retrieve information.
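The role of behaviors_map and capabilities_map can be illustrated with Python dicts, the closest analogue of std::map for key-based lookup; the abbreviated descriptor contents below are taken from the Appendix A catalog, while the helper names are hypothetical.

```python
# Sketch: keyed storage gives direct retrieval by behavior name, which is
# why the catalog keeps behaviors and capabilities in maps, not vectors.
behaviors_map = {
    "TAKE_OFF": {"timeout": 15, "recurrent": False},
    "GO_TO_POINT": {"timeout": 120,
                    "capabilities": ["SETPOINT_BASED_FLIGHT_CONTROL",
                                     "PATH_PLANNING"]},
}

def get_behavior(name):
    # One keyed lookup instead of scanning a vector of descriptors.
    return behaviors_map.get(name)

def erase_behavior(name):
    behaviors_map.pop(name, None)
```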
Prettify
The last class, named Prettify, is used only by the behavior catalog to present the information displayed in the terminal in a readable way. It contains different functions to represent different types of information, from errors or titles to arrays of information. Figure 3.10 shows an example of how the behavior catalog is loaded and displayed using this class.
Figure 3.10: Example of a loaded catalog
3.4 RESOURCE MANAGER
3.4.1 Design
This process supports the execution of behaviors, trying to use the resources of the robot efficiently [5]. As every behavior calls this single process, it should execute operations fast. This can be done in different ways but, as the behavior specialist is the only process that knows all the information related to a behavior, the resource manager just receives the name of a behavior and retrieves its corresponding capabilities. Then, it executes them. This process is similar to the behavior coordinator, but with capabilities, because it performs some checks before activating them. The incompatibility of capabilities is determined by the platform used.
As the process is in charge of retrieving and saving capability data, a class called Capability has been created to manage this concept. This class stores just the name, the processes associated with the capability, and a reference number that acts as a counter of the number of times the capability has been needed since it started. When a Capability is active, it is saved in a std::map, as this data structure is quick for finding, erasing and storing new data.
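The reference-counting idea behind the Capability class can be sketched like this; the stored fields follow the description above, while the helper names (acquire, release, stop_unreferenced) are hypothetical.

```python
# Sketch of the Capability concept: name, associated processes, and a
# reference counter of how many active behaviors currently need it.

class Capability:
    def __init__(self, name, processes):
        self.name = name
        self.processes = processes
        self.references = 0       # how many behaviors currently need it

    def __eq__(self, other):      # redefined "==", as in the C++ class
        return self.name == other.name

active_capabilities = {}          # map: fast find / erase / insert

def acquire(capability):
    stored = active_capabilities.setdefault(capability.name, capability)
    stored.references += 1

def release(name):
    active_capabilities[name].references -= 1

def stop_unreferenced():
    # capabilities with zero references would be stopped via "<process>/stop"
    for key in [k for k, c in active_capabilities.items() if c.references == 0]:
        del active_capabilities[key]
```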
However, as this process interacts with low-level processes that depend on the platform used, a limitation was found: the sequence of process activation. To resolve this, the correct sequence must be specified inside the catalog of the behavior specialist. When this process asks the behavior specialist for the capabilities associated with a behavior, the specialist sends these capabilities in the same sequence as the one set in the catalog.
3.4.2 Services and topics
The process has the following services available:
• “activate_resources” and “cancel_resources”: both use the message “resourceSrv.srv”. This service is fairly simple: it has a field for the name of the behavior and two more fields in the response to return an error, if any. Figure 3.11 shows this in detail.
• “initiate_resources”: this service was mentioned in the behavior specialist section. It contains the basic capabilities to activate as the request, and the error message and acknowledgement as the response. Figure 3.11 shows it.
Figure 3.11: Resource Manager messages
3.4.3 UML diagram and algorithms
Figure 3.12 contains every function and variable of this process. It shows two different classes, the Capability class and the Resource Manager class.
• Capability class: An aspect of this class worth explaining is the redefinition of the “==” operator, which makes the code more legible. Apart from that, the class contains the typical members of a common class.
• Resource Manager class: It contains a variable called “active_capabilities” that stores every activated capability in a std::map. It also contains every callback associated with an advertised ROS service. The functions in the “Features” section are used to avoid code repetition. They return a std::tuple which, as in the previous process, is used to return the values and any errors occurred.
Figure 3.12: UML Resource Manager
As a deeper explanation of what the process does, below is a basic representation in pseudocode of the available operations. The code is not exactly the same as the implemented one, so some operations may not be literally possible.
Algorithm 3.7: activateResources(string behavior_name)

msg = droneMsgsROS.queryResources
msg.behavior_name = behavior_name
capabilities = ros.call("behavior_resources", msg)

if capabilities.isEmpty():  # behavior has no capability
    stopUnreferenceCapabilities()
    publishActiveResources()
    return Ok

for _capability in capabilities:
    if isActive(_capability):
        # increment reference number in active_capabilities
        incrementReferenceNumber(_capability)
        capabilities.remove(_capability)
if capabilities.isEmpty():
    stopUnreferenceCapabilities()
    publishActiveResources()
    return Ok

msg = droneMsgsROS.capabilityConsistency
msg.capabilities_to_activate = capabilities
msg.active_capabilities = active_capabilities
consistent = ros.call("check_capabilities_consistency", msg)
if not consistent:
    return Error

for _capability in capabilities:
    for _process in _capability.getProcesses():
        started = ros.call("/drone1/"+_process+"/start")
        if not started:
            return Error
    # increment field reference number
    incrementReferenceNumber(_capability)
    # adding into active_capabilities
    insertIntoMemory(_capability)

stopUnreferenceCapabilities()
publishActiveResources()
return Ok
This function is in charge of activating every capability associated with a behavior, as its name indicates. This is done in four steps:
1. Retrieve every capability associated with the received behavior by calling the behavior specialist process. It sends the name of the behavior and receives a message with every associated capability.
2. Check if the capabilities to activate are consistent with the already activated ones. For that purpose, it sends to the behavior specialist, via a ROS service, the capabilities received in the previous step and the already activated capabilities kept in memory by this process. The service returns a positive boolean value if no inconsistency was found, and a negative boolean value otherwise.
3. Start the processes associated with the capabilities. It just calls the start service of each process, which follows a naming standard, like the behaviors. If the service returns a positive value the process is correctly activated; if it returns a negative value, some problem happened.
4. After the activation of each process associated with a capability, the capability's reference number is incremented and the capability is stored in memory for future checks.
Another important aspect is that, before the function finishes, every unreferenced capability (reference number equal to zero) is stopped by calling the operation “process_name/stop”.
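The four steps can be traced with a stubbed Python version of the two specialist services; the capability and process data below is taken from the Appendix A catalog, while the plain-dict stubs standing in for the ROS services are assumptions made only for illustration.

```python
# Sketch of activateResources: retrieve capabilities -> check consistency
# -> start processes -> count references. ROS services are plain-dict stubs.

CATALOG = {"GO_TO_POINT": ["SETPOINT_BASED_FLIGHT_CONTROL", "PATH_PLANNING"],
           "FOLLOW_OBJECT_IMAGE": ["VISUAL_SERVOING"]}
PROCESSES = {"SETPOINT_BASED_FLIGHT_CONTROL": ["droneTrajectoryController"],
             "PATH_PLANNING": ["droneTrajectoryPlanner", "droneYawPlanner"],
             "VISUAL_SERVOING": ["trackerEye", "open_tld_translator",
                                 "droneIBVSController"]}
INCOMPATIBLE = {("SETPOINT_BASED_FLIGHT_CONTROL", "VISUAL_SERVOING")}

active_capabilities = {}   # capability name -> reference number
started_processes = []     # record of "/drone1/<process>/start" calls

def activate_resources(behavior_name):
    # Step 1: ask the behavior specialist for the associated capabilities.
    capabilities = CATALOG.get(behavior_name, [])
    # Step 2: consistency check against the already active capabilities.
    for cap in capabilities:
        for active in active_capabilities:
            if (cap, active) in INCOMPATIBLE or (active, cap) in INCOMPATIBLE:
                return "Error"
    # Step 3: start every process of every capability (naming standard).
    # Step 4: increment the reference number and keep it in memory.
    for cap in capabilities:
        for process in PROCESSES[cap]:
            started_processes.append("/drone1/" + process + "/start")
        active_capabilities[cap] = active_capabilities.get(cap, 0) + 1
    return "Ok"
```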
Algorithm 3.8: cancelResources(string behavior_name)

msg = droneMsgsROS.queryResources
msg.behavior_name = behavior_name
capabilities = ros.call("behavior_resources", msg)
if capabilities.isEmpty():
    return Ok

active = checkEveryCapabilityIsActive(capabilities)
if not active:
    return Error

for _capability in capabilities:
    # decrement references in active_capabilities
    decrementReferences(_capability.getName())

return Ok
The cancel resources function just decrements the reference number of the capabilities associated with the received behavior. To know which capabilities are associated with that behavior, the resource manager asks the behavior specialist. Importantly, if some capability is already stopped, an error is returned.
Another important algorithm is the one associated with the initiate behaviors service. This function is very simple: it just starts every capability received in the request. In this way, every default capability is activated. Default capabilities are the ones that need to be active at all times since the beginning of the mission.
3.5 EXAMPLE OF A BEHAVIOR
As a result of this implementation, figure 3.13 shows the UML diagram with the minimum requirements to be compatible with the behavior management system. The developer only has to know how to interact with the controllers in order to check the current pose or the current speed of the drone.
Figure 3.13: Example of a behavior
These operations are the minimum needed to develop a behavior. Of course, ROS subscriptions and callbacks are needed to make it useful. None of the functions returns a value except the ownCheckSituation function, which returns a std::tuple representing any error occurred. If no error has occurred, the boolean value is set to true.
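A minimal skeleton of such a behavior can be sketched in Python (the C++ base class is translated freely, so the exact signatures are assumptions); only ownCheckSituation returns a value, a tuple carrying the error, as described above.

```python
# Sketch of the minimal behavior interface: only ownCheckSituation returns
# a value, an (ok, error_message) pair standing in for the std::tuple.
# The TakeOff subclass and its flag are hypothetical illustrations.

class Behavior:
    def ownStart(self, arguments):
        pass                      # begin pursuing the behavior's objective

    def ownRun(self):
        pass                      # periodic self-supervision

    def ownStop(self):
        pass                      # release whatever the behavior uses

    def ownCheckSituation(self):
        return (True, "")         # (ok, error) tuple

class TakeOff(Behavior):
    def __init__(self):
        self.flying = False

    def ownStart(self, arguments):
        self.flying = True        # here it would command the controllers

    def ownCheckSituation(self):
        if self.flying:
            return (False, "Already flying")
        return (True, "")
```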
As for the communication of the behavior with low-level processes, figure 3.14 represents an example of the different topics and services the behavior uses to supervise itself (ownRun function) and to accomplish its objective (move to a point or send a message).
The behavior sends different references to the controller. First, it sends to the trajectory controller references such as the actual yaw and the actual position. Then it sends the target point to the trajectory planner, which in turn sends to the controller the trajectory to follow, fragmented into points. After this, the behavior sends the new mode to the drone command process, which acts as an interface to the controllers. These modes can be: HOVER, LAND, TAKE_OFF and MOVE. In addition, the odometry state estimator publishes the actual pose and velocity, which the behavior uses to supervise itself.
Figure 3.14: Communication between behaviors and low level processes
4 SOFTWARE VALIDATION
This chapter presents all the tests and validations performed on the implemented software. For that reason, it is divided into different parts according to the tests made. The first section is a minimum validation of the software, used to verify that the code does not break. The following sections can be seen as unit and integration tests, because they test each component as well as the whole system.
4.1 TESTING USING ROS
ROS provides different commands to debug a process, which were explained in section 2.2. These commands were used to debug the whole system. The following lines explain how the processes were debugged using ROS.
For ROS services: the command rosservice was used. As the processes are mostly composed of services, this command helps start a process by sending it the expected message. When the start is sent, each process displays a message in the terminal showing what it is doing. These messages help verify that everything works as expected.
For ROS topics: the command rostopic echo topic_name allows us to see what messages are being sent on a topic. This command was used to verify that, when a behavior ends, every process performs the correct operations, by reading the displayed information.
4.2 VALIDATION INSIDE AEROSTACK
As written in the introduction, the software is integrated within the Aerostack suite. Different tests were made to verify the correct execution of each component and its correct operation with the rest of the Aerostack framework. First, simulation tests are explained; then, real flights are shown. This validation also covers other components, such as behaviors and higher-level processes like the graphical interface.
4.2.1 Simulator
The simulator is a process that allows the execution of different ROS processes on a virtual platform. This process receives and sends the messages related to the drone, exchanging real actions for virtual ones. This speeds up testing and helps resolve basic issues easily.
Here, different tests were made to verify the correct operation of each developed process. The following lists summarize some of the tests made for each process.
Behavior process
1. Behavior starts: Test that when a behavior is started, the capabilities are sent and the ownStart function is called.
2. Behavior stops: Test that a behavior is stopped by canceling the associated capabilities and checking that the ownStop function is called.
3. Events are sent: Test that a behavior sends exactly one event, with the correct termination.
4. Checking situation: Test that the check situation operation works as expected by calling it at different moments.
Behavior coordinator
1. Non-launched behavior: Test that an error is returned when a non-launched behavior is received for activation.
2. Behavior consistency: Test that an inconsistent behavior is stopped before activating the newly received behavior.
3. Recurrent behaviors: Test that different mutually consistent behaviors are activated without errors.
4. Memory management: Test that behaviors are removed or added when they are stopped or activated.
Behavior specialist
1. Check consistency: Test that the operation verifying the consistency of different behaviors and capabilities works, by sending different messages.
2. Catalog interpretation: Test that the behavior catalog loads the catalog file correctly and creates as many behaviors and capabilities as the file contains.
3. Behaviors and capabilities are retrieved: Test that every concept loaded from the catalog is sent correctly.
4. Default capabilities: Test that capabilities assigned as default are correctly activated.
Resource Manager
1. Capabilities are retrieved: Test that the capabilities associated with the sent behavior are retrieved correctly.
2. Processes are activated and deactivated: Test that the processes associated with a capability are activated and deactivated when the operation is requested.
3. Consistency is checked: Test that inconsistent capabilities are not activated.
As the software is integrated into the Aerostack suite, other tools help validate the developed processes. These tools are a command line interface (figure 4.1) and a human machine interface (figure 4.2).
Although the Human Machine Interface is the best way to control and execute missions, the CLI was used for debugging in the first simulation tests because it exposes more data from low-level processes, which was useful during the development of the system. After some tests, the Human Machine Interface was used instead.
Figure 4.1: Command line interface
Figure 4.2: Human Machine interface
Another important tool provided by ROS is the simulator interface, which can be seen in figure 4.3. It is called Rviz and it represents the environment and all the movements the drone makes.
These tools were used to verify the correct integration of the developed software with the Aerostack framework. The following table summarizes these tests.
Figure 4.3: Rviz. Simulator interface
N  Test name                   Test definition
1  Behaviors are activated     Test if a behavior requested by high-level
                               processes is activated correctly, by reading
                               the command line interface.
2  Behaviors are stopped       Test if a behavior canceled by high-level
                               processes is correctly stopped and the pose
                               or speed does not change.
3  Belief manager integration  Test if, every time a behavior checks its
                               situation, the received response is treated
                               correctly.
4  Error management            Test if, when a behavior containing an
                               erroneous argument is sent, the behavior
                               coordinator returns the error and the
                               behavior is not activated.
4.2.2 Real flight
For the real flight tests, the Parrot AR Drone 2.0 was used. As can be seen in figure 4.4, a supervised scenario was set up to validate different missions. The environment contains some references dispersed throughout the map. With these references the drone knows where it is, which helps it reach a target point. These tests are useful because the real world presents many more problems than the simulator, so many more errors can appear.
Figure 4.4: Environment to test the software
The same checks mentioned before, and the rest of the Aerostack processes, were also tested in this environment. As an example of a mission, figure 4.5 shows how the drone takes off and goes to a point.
Figure 4.5: Drone moving to a target point
The whole tested mission was written using the Python interpreter and the human machine interface. Appendix C contains the real code of the whole performed mission, written for the Python interpreter.
For the development of the tests, a number of Aerostack tools that were developed in parallel were used. Therefore, these tests were also useful to validate the correct integrated operation of the complete set of tools. These tools include:
• Python mission language and execution engine [7]: An example of such a script can be seen in chapter 9. This software sends behaviors to activate via a Python API.
• Human machine interface and behavior trees [8]: Integrated within the human machine interface, this tool allows operators to create new missions easily and to supervise them.
• The library of behaviors: This contains a library of different behaviors that rely on the developed system. In addition, they use the Behavior Process as their parent class.
5 CONCLUSIONS
This project consists of developing the behavior management system, which is in charge of the coordination and execution of different behaviors. These behaviors use some developed classes that shield the developer from the communications of the system. This helps anyone, with or without deep knowledge of how the behavior management system works, to develop new behaviors easily, as can be seen in the examples presented in this document.
This project achieved the objectives proposed in the introduction: (1) the analysis of the established requirements helped find the best solution; (2) the design was object oriented because this approach helps manage the different concepts; (3) for the implementation, the C++ language was used because of its efficiency; (4) finally, for the validation objective, a total of 25 different tests were made, distributed among unit tests and integration tests. Every test was done following the established requirements.
The results of the validation indicate that the software works as expected and can be used, although some bugs that did not appear during validation may still be found.
This software will be published in the imminent release of the Aerostack 2.0 framework. This project is an important component of the executive engine, as it changes the way developers interact with this framework. In addition, the software will be used in the IMAV 2017 competition, held in Toulouse.
5.1 FUTURE WORK
In the future, the software can be improved in different aspects:
• Improve the behavior catalog interpreter, to incorporate more validation and configuration of different aspects.
• Create or improve processes to manage every error that occurs during flight.
• Improve the resource manager to make decisions when a sensor of the drone breaks.
• Simplify the Behavior Process.
• Create an API, compatible with all ROS-supported languages, that shields the executive system from high-level applications.
• Improve the way argument formats are checked.
6 BIBLIOGRAPHY
[1] R. R. Murphy, Introduction to AI Robotics. Massachusetts Institute of Technology,
2000, ISBN: 026213383.
[2] B. Siciliano and O. Khatib, Springer handbook of robotics, 2nd ed. Springer International Publishing, 2016, ISBN: 9783319325507.
[3] J. L. Sanchez-Lopez, R. A. Suarez-Fernandez, H. Bavle, C. Sampedro, M. Molina, J.
Pestana, and P. Campoy, “Aerostack: An Architecture and Open-Source Framework
for Aerial Robotics”, International Conference on Unmanned Aircraft Systems 2016,
Arlington, USA, 2016.
[4] J. L. Sanchez, M. Molina, H. Bavle, C. Sampedro, R. A. Suarez-Fernandez, and
P. Campoy, “A Multi-Layered Component-Based Approach for the Development
of Aerial Robotic Systems: The Aerostack Framework.”, Journal of Intelligent and
Robotic Systems, 1-27, 2017.
[5] M. Molina, “An Execution Engine for Aerial Robot Mission Plans”, Technical University of Madrid, Tech. Rep., 2017.
[6] D. Palacios, “Sistema de supervisión del comportamiento de un vehículo robótico aéreo no tripulado”, Final Project, Bachelor in Computer Science Engineering, Technical University of Madrid, 2016.
[7] G. de Fermín, “Mission Planner Interpretation for the Aerostack software framework”, Final Project, Bachelor in Computer Science Engineering, Technical University of Madrid, 2017.
[8] C. Valencia, “Interfaz gráfica de usuario para configuración de misiones de vehículos aéreos no tripulados”, Final Project, Bachelor in Computer Science Engineering, Technical University of Madrid, 2017.
7 A PPENDIX A – B EHAVIOR C ATALOG
1
2
3
4
5
behavior_descriptors:
- behavior: TAKE_OFF
timeout: 15
recurrent: no
incompatible_lists: [motion_behaviors, rotation_behaviors]
6
7
8
9
10
- behavior: LAND
timeout: 15
recurrent: no
incompatible_lists: [motion_behaviors, rotation_behaviors]
11
12
13
14
15
16
17
18
19
20
21
22
23
- behavior: GO_TO_POINT
timeout: 120
recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL, PATH_PLANNING]
  arguments:
    - argument: COORDINATES
      allowed_values: [-100,100]
      dimensions: 3
    - argument: RELATIVE_COORDINATES
      allowed_values: [-100,100]
      dimensions: 3

- behavior: ROTATE
  timeout: 15
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]
  arguments:
    - argument: ANGLE
      allowed_values: [-360,360]

- behavior: START_MOVING
  timeout: 15
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]
  arguments:
    - argument: SPEED
      allowed_values: [0,30]
    - argument: DIRECTION
      allowed_values: [BACKWARD, FORWARD, UP, DOWN, LEFT, RIGHT]

- behavior: FOLLOW_OBJECT_IMAGE
  timeout: 90
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [VISUAL_SERVOING]

- behavior: PAY_ATTENTION
  recurrent: yes
  capabilities: [VISUAL_MARKERS_RECOGNITION]

- behavior: KEEP_HOVERING
  timeout: 15
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]
  arguments:
    - argument: DURATION
      allowed_values: [1,1000]
    - argument: ARUCO
      allowed_values: [0,1023]

- behavior: WAIT
  timeout: 15
  recurrent: no
  arguments:
    - argument: DURATION
      allowed_values: [1,1000]
    - argument: UNTIL_OBSERVED_VISUAL_MARKER
      allowed_values: [0,1023]

- behavior: FLIP
  timeout: 15
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]
  arguments:
    - argument: DIRECTION
      allowed_values: [BACK, FRONT, LEFT, RIGHT]

- behavior: START_HOVERING
  timeout: 15
  recurrent: no
  incompatible_lists: [motion_behaviors, rotation_behaviors]
  capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]

- behavior: BROADCAST_MESSAGE
  timeout: 15
  recurrent: no
  arguments:
    - argument: TEXT
      allowed_values: TEXT

behavior_lists:
  - list: motion_behaviors
    behaviors:
      - TAKE_OFF
      - LAND
      - FLIP
      - KEEP_HOVERING
      - FOLLOW_OBJECT_IMAGE
      - START_MOVING
      - START_HOVERING
      - GO_TO_POINT
      - VISUAL_SERVOING

  - list: rotation_behaviors
    behaviors: [ROTATE, LOOK_AT_POINT]

capability_descriptors:
  - capability: SETPOINT_BASED_FLIGHT_CONTROL
    active_by_default: no
    processes: [droneTrajectoryController]
    incompatible_capabilities: [VISUAL_SERVOING]

  - capability: PATH_PLANNING
    active_by_default: no
    processes: [droneTrajectoryPlanner, droneYawPlanner]

  - capability: VISUAL_SERVOING
    active_by_default: no
    processes: [trackerEye, open_tld_translator, droneIBVSController]
    incompatible_capabilities: [SETPOINT_BASED_FLIGHT_CONTROL]

  - capability: SELF_LOCALIZATION_BY_ODOMETRY
    active_by_default: yes
    processes: [droneOdometryStateEstimator]

  - capability: SELF_LOCALIZATION_BY_VISUAL_MARKERS
    processes:
      - droneLocalizer
      - droneObstacleDistanceCalculator
      - droneObstacleProcessor

  - capability: VISUAL_MARKERS_RECOGNITION
    processes:
      - droneArucoEyeROSModule
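The catalog above is purely declarative; the behavior system must still enforce its constraints when a behavior is requested. The sketch below illustrates how the `allowed_values` ranges and `dimensions` fields could drive argument validation. It is not Aerostack code: the `validate_arguments` helper is hypothetical, and the `GO_TO_POINT` entry assumes the truncated behavior at the top of the listing (with `COORDINATES` and `RELATIVE_COORDINATES` arguments) is `GO_TO_POINT`, as suggested by the mission in Appendix C.

```python
# Illustrative sketch of catalog-driven argument validation.
# The catalog subset is transcribed from the YAML above; the
# validate_arguments helper is hypothetical, not Aerostack code.

CATALOG = {
    'GO_TO_POINT': {
        'timeout': 15,
        'arguments': {
            'COORDINATES': {'range': (-100, 100), 'dimensions': 3},
            'RELATIVE_COORDINATES': {'range': (-100, 100), 'dimensions': 3},
        },
    },
    'ROTATE': {
        'timeout': 15,
        'arguments': {'ANGLE': {'range': (-360, 360), 'dimensions': 1}},
    },
}

def validate_arguments(behavior, **kwargs):
    """Return True if every argument fits the catalog's allowed_values."""
    spec = CATALOG[behavior]['arguments']
    for name, value in kwargs.items():
        arg = spec[name.upper()]
        values = value if isinstance(value, (list, tuple)) else [value]
        if len(values) != arg['dimensions']:
            return False          # wrong number of components
        low, high = arg['range']
        if not all(low <= v <= high for v in values):
            return False          # outside the allowed_values interval
    return True
```

With this sketch, `validate_arguments('GO_TO_POINT', coordinates=[3, 5.5, 1.3])` passes, while `validate_arguments('ROTATE', angle=400)` is rejected because 400 falls outside `[-360,360]`.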
8 APPENDIX B – UMLS BEHAVIOR SPECIALIST

Figure 8.1: UML Behavior Specialist class
Figure 8.2: UMLs of different descriptor classes
Figure 8.3: UML Behavior Catalog class
Figure 8.4: UML Prettify class
9 APPENDIX C – VALIDATED MISSION

import executive_engine_api as api

def runMission():
    print "Starting mission"

    print "Taking off..."
    result = api.executeBehavior('TAKE_OFF')
    print "Take off finished with status: %s" % result

    print "Activating visual marker recognition..."
    api.activateBehavior('PAY_ATTENTION_TO_VISUAL_MARKERS')

    print "Move to a higher position"
    result = api.executeBehavior('GO_TO_POINT', relative_coordinates=[0, 0, 0.5])
    print "Go to point finished with status: %s" % result

    print "Memorizing current point..."
    success, unification = api.consultBelief('position(self, (?X, ?Y, ?Z))')
    if success:
        print "Position memorized correctly"
        x, y, z = unification['X'], unification['Y'], unification['Z']
    else:
        print "Position unknown, landing..."
        result = api.executeBehavior('LAND')
        print "Landed with status: %s" % result
        return

    print "Move to the target point"
    result = api.executeBehavior('GO_TO_POINT', coordinates=[3, 5.5, 1.3])
    print "Go to point finished with status: %s" % result

    print "Rotate -90 degrees"
    result = api.executeBehavior('ROTATE', angle=-90)
    print "Rotate finished with status: %s" % result

    print "Returning home"
    result = api.executeBehavior('GO_TO_POINT', coordinates=[x, y, z])
    print "Go to point finished with status: %s" % result

    print "Deactivating visual marker recognition..."
    api.inhibitBehavior('PAY_ATTENTION_TO_VISUAL_MARKERS')
    print "Deactivated"

    print "Landing..."
    result = api.executeBehavior('LAND')
    print "Land finished with status: %s" % result

    return True
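The capability descriptors in Appendix A declare pairwise conflicts (for example, `SETPOINT_BASED_FLIGHT_CONTROL` and `VISUAL_SERVOING` are mutually `incompatible_capabilities`), so when the mission above switches between setpoint-based motion and visual tracking, the behavior manager must deactivate one capability before activating the other. A minimal sketch of that resolution rule, with illustrative names rather than Aerostack internals:

```python
# Sketch of capability conflict resolution derived from the
# incompatible_capabilities entries in the catalog. The
# activate_capability helper is illustrative, not Aerostack code.

INCOMPATIBLE = {
    'SETPOINT_BASED_FLIGHT_CONTROL': {'VISUAL_SERVOING'},
    'VISUAL_SERVOING': {'SETPOINT_BASED_FLIGHT_CONTROL'},
}

def activate_capability(active, capability):
    """Return the new set of active capabilities, dropping any that
    the catalog declares incompatible with the one being activated."""
    conflicts = INCOMPATIBLE.get(capability, set())
    return (active - conflicts) | {capability}
```

For instance, activating `VISUAL_SERVOING` while `SETPOINT_BASED_FLIGHT_CONTROL` is active leaves only `VISUAL_SERVOING` active, whereas capabilities with no declared conflicts, such as `PATH_PLANNING`, are simply added.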