Warning Concerning Copyright Restrictions
The Copyright Law of the United States (Title 17, United States Code) governs the
making of photocopies or other reproductions of copyrighted materials.
Under certain conditions specified in the law, libraries and archives are authorized
to furnish a photocopy or other reproduction. One of these specified conditions is
that the photocopy or reproduction is not to be used for any purpose other than
private study, scholarship, or research. If electronic transmission of reserve
material is used for purposes in excess of what constitutes "fair use," that user may
be liable for copyright infringement.
University of Nevada, Reno
Unmanned Autonomous System Swarm
A thesis submitted in partial fulfillment
of the requirements for the degree of
Bachelor of Computer Science and Engineering and the Honors Program
by
David L. Anderson
Dr. Sushil Louis, Ph.D., Thesis Advisor
May, 2015
University of Nevada, Reno
The Honors Program
We recommend that the thesis
prepared under our supervision by
David L. Anderson
entitled
Unmanned Autonomous System Swarm
be accepted in partial fulfillment of the
requirements for the degree of
Bachelor of Science, Computer Science and Engineering
______________________________________________
Sushil Louis, Ph.D., Thesis Advisor
______________________________________________
Tamara Valentine, Ph.D., Director, Honors Program
May, 2015
Abstract
Unmanned Autonomous System Swarm (UASS) provides a means for a single operator to
control a swarm of unmanned autonomous systems (UAS). Each operator utilizes a
computer interface to control the UAS swarm. The currently supported drones are the
Parrot AR.Drone 2.0 and Pioneer ground robots. Each autonomous system runs the Robot
Operating System (ROS) to give an operator control, and communicates with a
Unity3D Game Engine application that provides the user interface. UASS can be expanded for further
drone research, or could be sold as a commercial product. This project is a collaboration
between David Anderson, Chris Forkner, Niki Silveria, Dr. Sushil J. Louis, and Dr.
Monica Nicolescu.
Table of Contents
Abstract ............................................................................................................................................. i
Table of Contents ............................................................................................................................... ii
Introduction ...................................................................................................................................... 1
Requirements ................................................................................................................................... 2
Requirements Elicitation: Interview Findings ............................................................................... 2
Requirements elicitation: Additional Findings .............................................................................. 3
System Requirements ...................................................................................................................... 4
Functional Requirements ............................................................................................................. 4
Non-Functional Requirements ..................................................................................................... 5
Use Case Modeling .......................................................................................................................... 7
Use Case Diagram ....................................................................................................................... 7
Use Case Descriptions................................................................................................................. 8
Detailed Use Case Descriptions of Select Use Cases .............................................................. 10
Requirement Traceability Matrix ................................................................................................ 11
High-level and Medium-level Design ............................................................................................. 12
High level System Diagram ........................................................................................................ 12
High Level Class Diagram: ........................................................................................................ 13
Program Units: ....................................................................................................................... 14
Detailed Design .............................................................................................................................. 19
UASS Simulation Activity Diagram ............................................................................................ 19
UAS Command Statechart ......................................................................................................... 20
ROS AR Drone Node Graph ...................................................................................................... 21
ROS to Unity Message Sending State Diagram ........................................................................ 22
UASS Message Protocol............................................................................................................ 23
User Interface Design .................................................................................................................... 24
Project Summary............................................................................................................................ 30
Implemented Functionality ......................................................................................................... 30
Functionality Not Implemented .................................................................................................. 34
Future Work ............................................................................................................................... 35
Contributions of Team Members .....................................................................................
Software Overview ......................................................................................................................... 35
Annotated References ................................................................................................................... 37
Glossary of terms ........................................................................................................................... 39
Introduction
Unmanned autonomous systems (UAS), such as air and ground drones, are
a growing topic of interest. An increasing number of customers, both commercial
and private, including Amazon, Google, farmers, and the Navy, are starting to take
interest in the capabilities of easily controllable drones. As general interest in
drones rises, so does the need to control multiple drones at the same time. UASS
satisfies that need to control multiple UAS.
UASS provides an interface where a single operator can control multiple
unmanned autonomous systems in real-time. A group of vehicles composed of both land
and aerial autonomous robots can be added to its controllable system. Given a command
from the graphical user interface, the system of UAS acts as a cohesive unit to follow the
operator’s instructions. To send commands to a drone, ROS, the Robot Operating
System, is used to create control nodes that can either be published to, for sending
commands, or probed to view and alter the data of individual robots. The
interconnections of ROS and related interfaces are described further in later design
sections. Information sent to and received from these nodes is incorporated
into the operator’s interface.
The user interface is designed using the Unity3D Game Engine. The interface
provides an intuitive means to command multiple robots. Operators can choose to control
individual units or groups of units via keyboard and mouse instructions. The interface
relays each unit’s commands to the associated UAS as well as displays where UAS are in
their environment. Each UAS follows basic commands such as
move, halt, and follow. Detailed illustrations of the user interface are present in the User
Interface section of this document.
Additionally, operators can network their Unity interface sessions. Once a server
is started, any number of operators can connect and the locations of all UAS on the
network will be shared. Operators can only control UAS that they have added to their
session.
Requirements
Requirements Elicitation: Interview Findings
Q: What will be the project’s primary use after its completion?
A: The project will mostly be used for academic research purposes, but this product
could be modified to fit the specific needs of a professional or private user.
Q: How will you make the robotic units not collide with walls or other units?
A: In the future, each robot will have its own sensors for detecting collision.
Additionally, the robots will be sent the relative positions of all other robots. Using all of
this data, potential fields will be constructed that guide the robots where to go so
that they do not collide.
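The repulsive side of the potential-field idea can be sketched in a few lines of Python. This is an illustrative example only, not the project's implementation: the function name, constants, and the exact repulsion formula are assumptions.

```python
import math

def repulsive_force(robot, others, influence=2.0, gain=1.0):
    """Sum of repulsive force vectors pushing `robot` away from nearby robots.

    Positions are (x, y) tuples in meters. Robots farther than `influence`
    contribute nothing; closer robots push harder the nearer they get.
    Names and constants here are illustrative assumptions.
    """
    fx, fy = 0.0, 0.0
    for ox, oy in others:
        dx, dy = robot[0] - ox, robot[1] - oy
        dist = math.hypot(dx, dy)
        if 0 < dist < influence:
            # Repulsion grows sharply as the other robot gets closer.
            magnitude = gain * (1.0 / dist - 1.0 / influence) / dist ** 2
            fx += magnitude * dx
            fy += magnitude * dy
    return fx, fy
```

Summing this force over all neighbors (plus an attractive term toward the goal) yields a direction the robot can follow without colliding.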
Q: How does this project make controlling multiple robots a more manageable endeavor?
A: This project eases control in two ways.
1. Different robots expect very different input in order to run. This could be
desired velocity, heading, or simply a desired position. UASS abstracts these
differences and allows control over the robots using simple mouse clicks with a
few additional buttons.
2. Users can easily select multiple robots by ctrl+clicking with the mouse or dragging
a selection box in a 3D virtual representation. Once the robots are selected, a user
can issue all common commands that the robots share via mouse or buttons.
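The drag-selection behavior reduces to a containment test over screen positions. The helper below is hypothetical, for illustration only; the data layout (unit id mapped to a 2D screen position, box given by two opposite corners) is an assumption.

```python
def units_in_selection_box(units, corner_a, corner_b):
    """Return the ids of units whose screen position lies inside the drag box.

    `units` maps unit id -> (x, y) screen position; the box is given by two
    opposite corners in any order. Hypothetical helper, not project code.
    """
    (ax, ay), (bx, by) = corner_a, corner_b
    x_lo, x_hi = min(ax, bx), max(ax, bx)
    y_lo, y_hi = min(ay, by), max(ay, by)
    return [uid for uid, (x, y) in units.items()
            if x_lo <= x <= x_hi and y_lo <= y <= y_hi]
```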
Q: What is the maximum payload in terms of number of clients and number of robots
that you expect the interface to support at one time?
A: At this point we do not have the hardware capability to fully test the load that our
system can handle (robots and laptops are expensive), but we feel confident that our
system can handle 10+ units per UASS Client program.
Q: How does the system react to emergency situations?
A: The system by itself does not automatically react to dangerous situations. There is an
emergency procedure that can be carried out by an operator to shut down the movement
of individual UAS or all UAS within their control.
Q: How will the system adapt to different environments?
A: Currently, the system expects to be given a 3D model of the environment.
However, the system can eventually be adapted to create the 3D model based on sensor
readings from the robots. To handle a dynamic environment, each robot should be
equipped with its own collision avoidance algorithms and sensors.
Q: What robots will be supported? Can a user easily add a new robot to the system?
A: The currently supported robots are the Pioneer ground robot and the Parrot AR.Drone. If time
permits, the ability to easily add various robots into the system will be implemented.
Q: How many physical machines are needed to run this system?
A: The smallest setup in the system consists of two machines, one that is running the
Robot Operating System to communicate with a UAS, and one that is running the Unity
Client.
Q: Where will the 3D environment maps for the robotic units come from?
A: As of now the maps have to be created and scaled by the developers. Eventually we
would like to have the robots map the environment for us, but this is something that we
will not have the time to develop before May.
Requirements elicitation: Additional Findings
Due to the nature of the project, many requirements workshops were held for various
parts of the project. The workshops pertained to the following sections of our
project: networking, command message format (communication amongst all entities),
where collision avoidance takes place (Unity or robot), and localization/error correction,
among others. However, we will only be discussing two of those workshops.
Networking:
Brainstorm: Initially our team had a few ideas for our network architecture. The first
was to use an external server and broadcasting. Essentially, each robot would connect to
the external server with position updates, and that server would broadcast all the position
updates to each Unity client. Additionally, commands would be sent from the client to
the server, which would then broadcast them and let each ROS and Unity client sort through
the broadcast for relevant information.
The second option was to use Unity networking to manage world state information and
keep all the Unity clients in sync. That information could then be sent to the robots
periodically. Most notably, the ROS clients would communicate directly with the
Unity clients using individual UDP packets.
Outcome Analysis:
Our group eventually went with the latter of the two options. By having each client host
its own robots, we not only simplify ownership of robots, but we also make the system more
distributed. The “Master server” of the first option would have been a single point of
failure: if the server were to fail, all robot communication would cease. In our design,
if one client fails, the robots belonging to other clients will still work.
Command Message Format:
Brainstorm: After determining what our network architecture would look like, we as a
team needed to decide on what protocol would be used for the system. We discussed what
message types were needed in the protocol, and what the messages would look like in use.
At this point we also weighed the pros and cons of sending different data to the robotic
units: desired speed and heading, or just desired position. Desired position is what the AR
Drone uses for motion, while the Pioneer uses desired speed and heading.
Outcome Analysis: At the end of the discussion the team came up with a working
protocol that is in use with the current system. We also decided to send desired
position, because this makes for easy integration with the AR
Drone. The Pioneer needs to make some calculations to convert this to desired speed
and heading, but doing the opposite would take far more development time.
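The conversion the Pioneer performs can be sketched as follows. This is a minimal illustration under assumed conventions (pose as (x, y, heading) in meters and radians, a hypothetical speed cap), not the project's actual controller.

```python
import math

def position_to_speed_heading(current, target, max_speed=0.5):
    """Convert a desired-position command into the desired speed and heading
    error a differential-drive robot like the Pioneer needs.

    `current` is (x, y, heading_radians); `target` is (x, y). Returns
    (speed_m_s, heading_error_radians). Illustrative sketch only.
    """
    x, y, heading = current
    tx, ty = target
    dx, dy = tx - x, ty - y
    distance = math.hypot(dx, dy)
    desired_heading = math.atan2(dy, dx)
    # Smallest signed angle between current and desired heading.
    error = math.atan2(math.sin(desired_heading - heading),
                       math.cos(desired_heading - heading))
    # Slow down near the target instead of always driving at full speed.
    speed = min(max_speed, distance)
    return speed, error
```

A robot straight ahead of the target drives at the capped speed with zero heading error; a target off to the side produces a turn command first.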
System Requirements
Functional Requirements
Functional requirements specify behaviors or functions that make up the software. They
are ordered by priority, from 1 (highest) to 3 (lowest).
FR - 1.01
UASS shall load a predefined map of the command environment.
FR - 1.02
Units shall be informed of their initial location in an environment map either by
default or user instruction.
FR - 1.03
UASS shall display units in real-time in an environment map.
FR - 1.04
All UASS units shall follow the same commands.
FR - 1.05
All air units shall be controlled using the same command protocol.
FR - 1.06
All ground units shall be controlled using the same command protocol.
FR - 1.07
UASS will have an emergency shutdown to stop all commands and movement to
all units.
FR - 1.08
UASS units shall be controlled through mouse and keyboard.
FR - 1.09
UASS units shall be selected individually or in groups.
FR - 1.10
UASS shall receive commands from the user through a Unity UI.
FR - 1.11
Parrot AR Drone commands from Unity UI will be processed and interpreted by
ROS tum_ardrone.
FR - 1.12
Connection must be available for AR Parrot Drones and P3-DX Pioneer Robots.
FR - 2.01
User shall be able to manipulate the command queue for an individual or group
of units.
FR - 2.02
User shall be able to add and subtract from the number of vehicles controlled
with UASS.
FR - 2.03
Units will know their current and initial locations in an environment map.
FR - 2.04
UASS shall allow user to interrupt currently running commands.
FR - 2.05
UASS shall reset unit positions.
FR - 3.01
UASS shall construct a map based on unit sensor inputs.
FR - 3.02
Users can command UASS units to move in predefined formations.
FR - 3.03
UASS shall allow common UAS to be added to its units.
Non-Functional Requirements
Non-functional requirements are criteria imposed on the software, as opposed to what the
software does. These criteria can relate to security, hardware, which language is used to
write it, etc.
NFR - 1.01
UASS can communicate with units within 15 meters of main access point.
NFR - 1.02
Time until complete connection to all units shall be under 2 seconds.
NFR - 1.03
Communication between Unity interface and units must be maintained for the
entire running time of UASS.
NFR - 1.04
Reaction time to user commands through the interface must be under 1 second.
NFR - 1.05
UASS will use components of Unity development engine and the Robot
Operating System (ROS).
NFR - 1.06
UASS will use ROS Hydro distribution.
NFR - 1.07
ROS components and add-ons will be written in C++ or Python.
NFR - 1.08
Unity scripts will be written in C#.
NFR - 1.09
UASS will run on Ubuntu 12.04.
NFR - 2.01
Units can exchange information through internet connection with Unity
interface.
NFR - 2.02
UASS will be accessible from Unix, Windows, and Mac OS.
NFR - 2.03
One computer station shall connect with multiple IP addresses.
NFR - 3.01
All units shall react to commands with no perceptible delay.
NFR - 3.02
UASS cannot be controlled from an access point the user did not initiate.
NFR - 3.03
UASS can communicate with units in an environment lacking open routers or
internet through local connection technology.
Use Case Modeling
Use Case Diagram
Use cases depict various actions that a user may take when using the software. The
following is a diagram that illustrates what actions a user may take using the UASS.
Use Case Descriptions
The following table contains a list and brief description of the use cases.
UC - 01
Select One Unit
The user can select a single unit while using the interface
by left clicking on the model representation. Commands
given after this point affect the single selected model/unit’s
position.
UC - 02
Select Multiple
Units
The user can select multiple units while using the interface
by left clicking and dragging the mouse over several
models before release. Commands given after this point
affect the selected models/units positions.
UC - 03
Command Move
The user commands their selected unit(s) to move to a
location specified by a right click in the simulated
environment map. If one unit is selected only that unit will
move to the location. If multiple units are selected all units
will move to the location.
UC - 04
Command Follow
The user commands their selected unit(s) to follow another
unit by a right clicking on that unit’s model. Units will then
move behind the “leader”. When the “leader” moves, units
told to follow will chain movements after it.
UC - 05
Command Halt
The user commands their selected unit(s) to stop moving. If
the command appears in a queue of commands, the unit(s) will
finish their last command and then halt. If the command is
given outside of a queue, unit(s) will stop wherever they are
in the middle of their current command. Alternatively, the
unit reaches the end of its last command and halts of its
own accord.
UC - 06
Command Look at
The user commands their selected unit(s) to turn towards a
point specified in the simulated environment map. Unit(s)
will change their heading until they face the point.
UC - 07
Override Command
The user overrides their selected unit(s) commands, halting
the unit(s), and clearing the command queue. The units
initiate hovering and wait for new commands.
UC - 08
Initiate Connection
The user adds a new unit to the UASS. This new unit is
added to the simulated environment map and is able to
accept further commands from the user. Other units will
recognize this unit’s existence and their avoidance
algorithms will have an additional object to keep track of.
Units are also added upon UASS startup.
UC - 09
Terminate
Connection
The user removes a unit from the UASS. This unit is then
removed from the simulated environment map and is no
longer able to receive communications.
UC - 10
Restart Simulation
The user can restart the simulation. This reinitializes all
non-positional parameters. The unit’s initial position can
then either be set to their current positions within the map,
or the initial positions can be changed according to a new
unit placement.
UC - 11
Load Map
The user can add a simulated environment map to UASS.
The 3d Model of the designated area would then be used
for the units to recognize the topology of their environment.
UC - 12
Set Initial Position
The user can set the initial position of a unit relative to the
origin of the simulated environment map. This position serves as
the unit’s reference point for subsequent movement.
UC - 13
Add to Queue
The user can add commands to a queue for their selected
unit(s) to follow. Each command will be completed before
moving onto the next command unless the user intervenes.
The user can choose to add a command to a Patrol queue if
the unit is currently patrolling.
UC - 14
Remove from
Queue
The user can reduce the number of commands in their
selected unit(s) queue. When a command is finished it is
automatically removed from a unit’s queue.
UC - 15
Emergency
Shutdown
The user can either put all units, or selected units into an
emergency shutdown state. Aerial units will land gracefully
and then turn off, while ground units will stop moving and
then turn off.
UC - 16
Command Patrol
The user commands their selected unit(s) to patrol amongst
a queue of positions. The unit(s) will repeatedly move
from position 1 to N, and from N back to 1. If the last position
coordinate is within a small threshold of the first position
coordinate, the order will instead represent a cycle going from 1 to
N, and then jumping from the Nth position back to the first
position.
UC - 17
Command Land
The user commands their selected Aerial units to land.
UC - 18
Command Launch
The user commands their selected Aerial unit(s) to launch.
UC - 19
Command Barrel
Roll
The user commands their selected unit(s) to perform a
barrel roll if possible.
UC - 20
Command Move
Relative
The user commands their selected unit(s) to move x meters
in direction y.
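The patrol ordering described in UC-16 can be sketched as a small function. The threshold value and the waypoint layout are illustrative assumptions, not values from the project.

```python
import math

def patrol_order(waypoints, threshold=0.1):
    """Return the repeating visit order (as indices) for a patrol queue
    of (x, y) waypoints.

    If the last waypoint lies within `threshold` of the first, the route is
    treated as a closed cycle 0..N-1 that wraps back to 0; otherwise the
    unit ping-pongs 0..N-1..0. Illustrative sketch, not project code.
    """
    n = len(waypoints)
    if n <= 1:
        return list(range(n))
    (x0, y0), (xn, yn) = waypoints[0], waypoints[-1]
    if math.hypot(xn - x0, yn - y0) <= threshold:
        return list(range(n))                              # closed cycle
    return list(range(n)) + list(range(n - 2, 0, -1))      # ping-pong
```

Repeating the returned index sequence forever yields the patrol path.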
Detailed Use Case Descriptions of Select Use Cases
The following contains a detailed description of a few use cases
Use Case: Select one unit
Use Case ID: UC - 01
Actor: User
Precondition: Simulation is in execution
Flow of Events:
1. User left clicks near, or on top of, a unit
Post condition: The unit is now selected, and ready for commands
Use Case: Command Move
Use Case ID: UC - 03
Actor: User
Precondition: One or more units are selected
Flow of Events:
1. User right clicks the mouse at a position where the drones need to move to (target)
2. Units are given commands (in real world)
3. The simulation updates where the units are as they travel to their new target
Post condition: Units have moved to designated location.
Use Case: Emergency Shutdown
Use Case ID: UC - 15
Actor: User
Precondition: Simulation is in execution
Flow of Events:
1. User commands all units to start emergency shutdown
2. Commands are sent to all units to undergo emergency shutdown
a. Ground units are cleared of all commands and are told to stop moving
b. Aerial units are cleared of all commands and are told to land gracefully
Post condition: All units are in a shutdown state
Requirement Traceability Matrix
The requirement traceability matrix maps functional requirements to use cases.
Functional Requirements
Use Cases
High-level and Medium-level Design
High level System Diagram
The following diagram illustrates the high level system diagram of UASS. The
core to this system is a set of UASS Clients that control the system’s robotic units.
High Level Class Diagram:
The high-level class diagram depicted in Figure 2 is for the UASS
Client system. The figure only includes classes that were developed by team members,
not classes that Unity provides its programmers.
Program Units:
The program units are the classes that make up the Class diagram. The following
is a more detailed description of each of these classes and the components that make
them up.
Class: Unit
The unit class stores information about individual UAS, such as a unique id,
orientation, and position. It is accessed by Unity3D to place and update units in the
simulation.
Method
Returns
Description
start()
void
Initializes starting parameters.
update()
void
Updates unit’s position by reading from its associated input
socket.
CopyAttributes()
void
Copies over robotic unit information from one Unit to
another.
Class: UnitManager
The UnitManager class maintains the collection of robotic units in the
simulation. It creates new units as they connect and provides lookup of
existing units by their unique IDs.
Method
Returns
Description
start()
void
N/A
update()
void
N/A
AddUnit(NewRobot)
void
Creates a new robotic unit and adds it to the world
unit list.
FindUnit(UnitID)
GameObject
Returns a reference to a unit based off of its unit
ID.
Class: CommandManager
The CommandManager is the interface used to give commands to selected units.
Its methods are used to send movement commands to selected units.
Method
Returns Description
start()
void
Initializes starting parameters.
update()
void
Searches for user input via the keyboard, mouse, or
other external devices.
Move(SelectedUnits,
Direction)
void
Move all selected units in a list in a specified direction.
GoTo(SelectedUnits,
Position)
void
Tell all selected units to move to a specified position.
SendDesiredPosition(Unit, void
DesiredPos, Desired Yaw)
Change the offset of the given unit to DesiredPos and
DesiredYaw.
Launch(SelectedUnits)
void
Send launch command to all aerial units in the selected
units list.
Land(SelectedUnits)
void
Send land command to all aerial units in the selected
units list.
Stop(SelectedUnits)
void
Tell all selected units to stop their current actions.
SendMessage( IP, Port,
cmdMsg)
void
Private method that sends a compiled message
command to a specific IP and port.
Class: SelectionManager
The selection manager takes information from the input manager indicating a
click on or near a unit and adds that unit to the list of selected units that will be affected
by the next command. If the input was to add a unit to the current set, the selection
manager will append that unit. If the input was to select a new unit or group of units,
current units will be deselected before appending the new units.
Method
Returns
Description
start()
void
Initializes starting parameters.
update()
void
Checks for new information from the input
manager.
AddUnitToSelectedList(Unit)
void
Adds a specific unit to the selected units list.
AddUnitsToSelectedList(Units)
void
Adds a list of units to the selected units list.
RemoveUnitFromSelectedList(Unit)
void
Removes a specific unit from the selected
units list.
RemoveAllUnits()
void
Removes all units from the selected units
list.
Class: NetworkManager
The network manager initializes and maintains connection with all units that will
take part in the simulation.
Method
Returns
Description
start()
void
N/A
update()
void
N/A
OnDestroy()
void
Disconnect from the Unity Network.
StartUnityServer()
void
Starts a Unity Server instance.
StartUserServer()
void
Starts a local Unity Server.
RefreshUnityList()
void
Polls the Unity Server for any game room instances.
RefreshUserList()
void
Polls for user-hosted server instances.
JoinServer(Host)
void
Join a server specified by host.
InitializeServerSettings()
void
Set server settings to default values.
TestConnection(IP)
bool
Checks connection with a given IP address, returns
test result.
Class: InputManager
The input manager is where all user input is checked and processed. The input
manager is the class that calls the command manager’s functions to send commands to
robotic units.
Method
Returns
Description
start()
void
N/A
update()
void
Checks user input each frame. Performs correct
actions based off of input mode and input type.
CheckMode()
void
Checks the input mode of the input system.
SetPositionModeUpdate()
void
Perform position offset input updates.
RegularUpdateMode()
void
Perform regular input updates (selection, movement
commands).
CheckKeysPressed()
void
Checks keyboard inputs, performs specific action
depending on input.
GetDirection()
int
Returns the correct integer representing direction,
based off of keyboard input.
Class: FlockingManager
This class deals with controlling robots in formation. For instance, if we
wanted the robots to move in a line, the flocking manager would help make that happen
by interacting with the command manager.
Method
Returns
Description
start()
void
N/A
update()
void
Checks user input each frame. Performs correct
actions based off of input mode and input type.
checkForCloseCalls()
void
This function looks at the current positions of the
robots and determines if a collision is bound to
happen. If so, it will adjust the position update to
compensate.
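The adjustment that checkForCloseCalls is described as making can be sketched as follows. The separation threshold, step size, and repulsion rule are illustrative assumptions, not values from the project code.

```python
import math

def adjust_for_close_calls(targets, min_sep=0.5, step=0.25):
    """Nudge each robot's next position away from neighbors that would end
    up too close, in the spirit of checkForCloseCalls.

    `targets` is a list of (x, y) next positions; returns adjusted targets.
    Constants and the repulsion rule are illustrative assumptions.
    """
    adjusted = list(targets)
    for i, (ix, iy) in enumerate(targets):
        for j, (jx, jy) in enumerate(targets):
            if i == j:
                continue
            dx, dy = ix - jx, iy - jy
            dist = math.hypot(dx, dy)
            if 0 < dist < min_sep:
                # Push robot i directly away from robot j.
                ax, ay = adjusted[i]
                adjusted[i] = (ax + step * dx / dist, ay + step * dy / dist)
    return adjusted
```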
Detailed Design
UASS Simulation Activity Diagram
The activity diagram below is a representation of the actions taken during the
simulation of UASS. First, the simulation has to initialize itself with the user-entered
information about the units and the map. Once that is complete, the simulation enters its
‘step’ loop. During this loop, the program will update unit positions based on the
information received from each unit’s ROS node. The ‘step’ loop will also take care of
any user input received by the program. User input could include moving the simulator
camera, selecting units, moving units, or ending the program. Should no user input be
detected, the simulation will simply update unit positions.
UAS Command Statechart
The operator of UASS has the ability to command individual and groups of
robots. This state chart shows how the UAS will react when they receive commands. The
simulation initializes the UAS to be in a landed state. This state applies directly to aerial
drones, but for the purposes of a ground UAS the state is equivalent to initial power on.
From this state the operator can command a takeoff, which initializes all of the UAS
reference points and localization processes and launches aerial drones into the air. The
positions of UAS at this point are fed into Unity3D to show the operator the locations of
the robots.
Commanding the UAS to move or follow another moves the robots into the move or
follow state, each of which ends with a cancel command, issued manually by the
user at any point, or when the action is completed. At any point in the program, operators
can send an emergency signal to one or all UAS. This signal cancels all commands in a
queue, initiates aerial landings, and halts all ground drones. Emergency signals must be
manually disabled to return to the landed state, which then begins reinitializing ROS
localization. UASS will stop sending commands and shut down when commanded to end
the simulation in the landed state.
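The statechart above can be expressed as a transition table. The state and command labels are illustrative names for the diagram's states, not identifiers from the project code; the post-takeoff state is assumed to be called "airborne" here.

```python
# State transition table for the UAS command statechart described above.
TRANSITIONS = {
    ("landed", "takeoff"): "airborne",
    ("airborne", "move"): "moving",
    ("airborne", "follow"): "following",
    ("moving", "cancel"): "airborne",
    ("following", "cancel"): "airborne",
    ("emergency", "clear_emergency"): "landed",
    ("landed", "end_simulation"): "shutdown",
}

def next_state(state, command):
    """Advance the command state machine one step. An emergency signal is
    accepted from any state; unrecognized commands leave the state unchanged.
    """
    if command == "emergency":
        return "emergency"
    return TRANSITIONS.get((state, command), state)
```

For example, a unit that is moving and receives an emergency signal enters the emergency state, and only returns to landed when the signal is manually cleared.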
ROS AR Drone Node Graph
The following screenshot is a representation of the nodes required to run the Parrot
AR.Drone. Essentially, each node provides some functionality for the drone. Nodes may
require specific information that another node contains, and they may also provide
services that can be utilized by other nodes. For instance, if one looks at the pub node, it
has an arrow pointing towards the node “/UASS_Offset”. Essentially this means that the
pub node is publishing to a channel named “/UASS_Offset”. The sub node has an arrow
pointing from “/UASS_Offset” to itself. This means that the sub node is subscribing to
the channel “/UASS_Offset.”
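The publish/subscribe pattern these nodes follow can be illustrated with a small stand-alone sketch. This is plain Python mimicking the ROS topic model, not the ROS API itself; the class and method names are illustrative.

```python
class TopicBus:
    """Toy publish/subscribe bus mirroring the ROS pattern described above:
    publishers push messages to a named channel such as "/UASS_Offset", and
    every callback subscribed to that channel receives them.
    """

    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        # Register a callback to be invoked for every message on `topic`.
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver `message` to every subscriber of `topic`.
        for callback in self._subscribers.get(topic, []):
            callback(message)
```

In ROS the pub node and sub node never reference each other directly; they are coupled only through the channel name, which is what the arrows in the node graph represent.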
ROS to Unity Message Sending State Diagram
The following contains the state diagrams for a UASS_ROS system sending
messages to a Unity client and a Unity client receiving messages from a UASS_ROS
system. The UASS_ROS side continually polls the positional data from
TUM_ARDRONE and relays it to its associated Unity client. The Unity client has a
receive thread that waits for a message and then updates the game world's state based on
the information received from ROS.
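The receive-thread pattern described above can be sketched in Python using a local socket pair in place of the ROS link. The newline-delimited framing and the "name x y z" payload are assumptions for illustration, not the actual UASS wire format.

```python
import socket
import threading

def receive_loop(sock, on_message):
    """Block on the socket and hand each newline-delimited message to a
    callback that updates the game-world state (pattern sketch only)."""
    buf = b""
    while True:
        data = sock.recv(1024)
        if not data:          # connection closed: end the thread
            break
        buf += data
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            on_message(line.decode())

# Demonstration with a local socket pair standing in for the ROS link.
ros_side, unity_side = socket.socketpair()
world_state = {}

def update_world(msg):
    # e.g. "drone1 1.0 2.0 0.5" -> store the reported position
    name, *coords = msg.split()
    world_state[name] = tuple(float(c) for c in coords)

t = threading.Thread(target=receive_loop, args=(unity_side, update_world))
t.start()
ros_side.sendall(b"drone1 1.0 2.0 0.5\n")
ros_side.close()              # closing the sender ends the receive loop
t.join()
```

Running the receive loop on its own thread keeps the rendering loop responsive while messages arrive, which is the reason for the design described above.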
UASS Message Protocol
Every message has a type, designated by the first character of the
message. The following list shows the base message types found in the
UASS system protocol:
0 - Request to connect
1 - Request to connect granted
2 - Send robot command
3 - Robot position update
4 - Send world state to ROS
Each message also contains additional information. For
example, message type 2 has a subset of command types:
0 - GoTo command
1 - Stop
2 - Launch
3 - Land
4 - Set new offset
5 - Arrow key move
Each command type may also carry information about the command itself. For
example, the GoTo command requires a desired position in order to be performed.
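A minimal Python sketch of this protocol, assuming messages are plain character strings with the base type in the first character and, for robot commands, the subtype in the second; the field layout after that (a space-separated position for GoTo) is an assumption for illustration.

```python
BASE_TYPES = {"0": "connect_request", "1": "connect_granted",
              "2": "robot_command", "3": "position_update",
              "4": "world_state"}
COMMAND_TYPES = {"0": "goto", "1": "stop", "2": "launch",
                 "3": "land", "4": "set_offset", "5": "arrow_move"}

def parse_message(msg):
    """Decode a UASS-style message string into a dict (layout assumed)."""
    base = BASE_TYPES[msg[0]]
    if base != "robot_command":
        return {"type": base, "payload": msg[1:]}
    command = COMMAND_TYPES[msg[1]]
    result = {"type": base, "command": command}
    if command == "goto":
        # A GoTo carries the desired position, e.g. "20 1.0 2.0 0.5"
        x, y, z = (float(v) for v in msg[2:].split())
        result["target"] = (x, y, z)
    return result
```

For example, "20 1.0 2.0 0.5" would decode as a robot command (type 2), subtype GoTo (0), with the target position (1.0, 2.0, 0.5).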
User Interface for UASS Client
The user interface for the UASS client is built using the Unity3D engine. It
provides an intuitive way to connect to or start a server, add UAS to the system, and
control their movement. Upon starting the UASS client, an operator has a choice to either
start a server or join an already open one. Joining or starting a server allows other
operators to see the locations of UAS in the 3D environment from anywhere. Through the
main menu, a message and debug log are available to obtain information about the
environment.
UAS units automatically begin communicating with the Unity environment after
the ROS side of UASS is started. Positions of each unit are initialized to zero, but the
operator can edit where the UAS believes its position to be through an interface window.
Units are selectable with the right mouse button. Units are controllable once selected, and
new desired positions are assigned using the left mouse button. The series of screenshots
below details the main screens and interactions available through the UASS client.
The screen that greets the operator upon starting the UASS client. Operators can choose
to start their own Unity-hosted or local server, or connect to a preexisting server.
After choosing to either join a server or create one, the 3D model of the current
environment is shown to the operator.
When a UAS connects, it assumes its position is at the origin, regardless of where it
actually is in the room. The above panel allows a user to tell the robot where it is in the world.
Hitting the escape key displays a main menu. The main menu provides a way to access
message or server operations.
The user can access the message log by selecting “Message Log” from the main menu.
This log contains a record of all communication via chat amongst clients. Currently, it
has some filler text.
This is a debugging panel. Information such as actions taken and messages sent to and
received from the robots is recorded here.
Upon adding UAS into the system, they automatically display themselves in the 3D
environment. Here, the AR.Drone and Pioneer are both updating their positions in real
time and receiving command information.
Selecting a UAS displays a selection circle. Once selected, the operator can command the
UAS or change its relative position.
This window is part of the tum_ardrone package. It allows the user not only to
control the robot manually, but also to view the commands it is currently processing.
Project Summary
Implemented Functionality
The following functional and non-functional requirements were met.
FR - 1.01
UASS shall load a predefined map of command environment.
FR - 1.02
Units shall be informed of their initial location in an environment map either by
default or user instruction.
FR - 1.03
UASS shall display units in real-time in an environment map.
FR - 1.04
All UASS units shall follow the same commands.
FR - 1.05
All air units shall be controlled using the same command protocol.
FR - 1.06
All ground units shall be controlled using the same command protocol.
FR - 1.07
UASS will have an emergency shutdown to stop all commands and movement to
all units.
FR - 1.08
UASS units shall be controlled through mouse and keyboard.
FR - 1.09
UASS units shall be selected individually or in groups.
FR - 1.10
UASS shall receive commands from the user through a Unity UI.
FR - 1.11
Parrot AR Drone commands from Unity UI will be processed and interpreted by
ROS tum_ardrone.
FR - 1.12
Connection must be available for Parrot AR.Drones and the P3-DX Pioneer.
FR - 2.01
User shall be able to manipulate the command queue for an individual or group
of units.
FR - 2.02
User shall be able to add and subtract from the number of vehicles controlled
with UASS.
FR - 2.03
Units will know their current and initial locations in an environment map.
FR - 2.04
UASS shall allow user to interrupt currently running commands.
FR - 2.05
UASS shall reset unit positions.
Non-Functional - Criteria placed on the software, as opposed to what the software
does. These criteria can relate to security, hardware, the implementation language, etc.
NFR - 1.01
UASS can communicate with units within 15 meters of main access point.
NFR - 1.02
Time until complete connection to all units shall be under 2 seconds.
NFR - 1.03
Communication between Unity interface and units must be maintained for the
entire running time of UASS.
NFR - 1.04
Reaction time to user commands through the interface must be under 1 second.
NFR - 1.05
UASS will use components of Unity development engine and the Robot
Operating System (ROS).
NFR - 1.06
UASS will use ROS Hydro distribution.
NFR - 1.07
ROS components and add-ons will be written in C++ or Python.
NFR - 1.08
Unity scripts will be written in C#.
NFR - 1.09
UASS will run on Ubuntu 12.04.
NFR - 2.01
Units can exchange information through internet connection with Unity
interface.
NFR - 2.02
UASS will be accessible from Unix, Windows, and Mac OS.
NFR - 2.03
One computer station shall connect with multiple IP addresses.
NFR - 3.03
UASS can communicate with units in an environment lacking open routers or
internet through local connection technology.
The following use cases were met:
UC - 01
Select One Unit
The user can select a single unit while using the interface
by left clicking on the model representation. Commands
given after this point affect the single selected model/unit’s
position.
UC - 02
Select Multiple
Units
The user can select multiple units while using the interface
by left clicking and dragging the mouse over several
models before release. Commands given after this point
affect the selected units' positions.
UC - 03
Command Move
The user commands their selected unit(s) to move to a
location specified by a right click in the simulated
environment map. If one unit is selected only that unit will
move to the location. If multiple units are selected all units
will move to the location.
UC - 04
Command Follow
The user commands their selected unit(s) to follow another
unit by right clicking on that unit's model. Units will then
move behind the “leader”. When the “leader” moves, units
told to follow will chain movements after it.
UC - 05
Command Halt
The user commands their selected unit(s) to stop moving. If
the command appears in a queue of commands, the unit(s) will
finish their last command and then halt. If the command is
given outside of a queue, the unit(s) will stop wherever they are
in the middle of their current command. Alternatively, the
unit reaches the end of its last command and halts of its
own accord.
UC - 06
Command Look at
The user commands their selected unit(s) to turn towards a
point specified in the simulated environment map. Unit(s)
will change their heading until they face the point.
UC - 07
Override Command
The user overrides their selected unit(s) commands, halting
the unit(s), and clearing the command queue. The units
initiate hovering and wait for new commands.
UC - 08
Initiate Connection
The user adds a new unit to the UASS. This new unit is
added to the simulated environment map and is able to
accept further commands from the user. Other units will
recognize this unit’s existence and their avoidance
algorithms will have an additional object to keep track of.
Units are also added upon UASS startup.
UC - 09
Terminate
Connection
The user removes a unit of the UASS. This unit is then
removed from the simulated environment map and is no
longer able to receive communications.
UC - 10
Restart Simulation
The user can restart the simulation. This reinitializes all
non-positional parameters. The units' initial positions can
then either be set to their current positions within the map,
or changed according to a new unit placement.
UC - 11
Load Map
The user can add a simulated environment map to UASS.
The 3D model of the designated area would then be used
for the units to recognize the topology of their environment.
UC - 12
Set Initial Position
The user can set the initial position of a unit relative to the
beginning of the simulated environment map. This position
UC - 13
Add to Queue
The user can add commands to a queue for their selected
unit(s) to follow. Each command will be completed before
moving onto the next command unless the user intervenes.
The user can choose to add a command to a Patrol queue if
the unit is currently patrolling.
UC - 14
Remove from
Queue
The user can reduce the number of commands in their
selected unit(s) queue. When a command is finished it is
automatically removed from a unit’s queue.
UC - 15
Emergency
Shutdown
The user can either put all units, or selected units into an
emergency shutdown state. Aerial units will land gracefully
and then turn off, while ground units will stop moving and
then turn off.
UC - 16
Command Patrol
The user commands their selected unit(s) to patrol amongst
a queue of positions. The unit(s) will repetitively move
from position 1 to N, and from N back to 1. If the first
position coordinate is within a small threshold of the last
position coordinate, the order instead represents a cycle
going from 1 to N and then jumping from the Nth position
back to the first.
UC - 17
Command Land
The user commands their selected Aerial units to land.
UC - 18
Command Launch
The user commands their selected Aerial unit(s) to launch.
UC - 19
Command Barrel
Roll
The user commands their selected unit(s) to perform a
barrel roll if possible.
UC - 20
Command Move
Relative
The user commands their selected unit(s) to move x meters
in direction y.
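The patrol behavior of UC-16 above can be sketched as a small routine that decides between the back-and-forth order and the cycle order; the threshold value below is illustrative, not the one used by UASS.

```python
import math

def patrol_order(positions, threshold=0.1):
    """Return one repetition of the visit order for a patrol queue.

    If the first and last positions are within `threshold` of each other,
    the queue is treated as a cycle (1..N, then back to 1); otherwise the
    unit ping-pongs (1..N, then N-1..2, repeating). The threshold value
    is an assumption for illustration.
    """
    first, last = positions[0], positions[-1]
    if math.dist(first, last) <= threshold:
        return list(positions)                              # cycle: wraps to the start
    return list(positions) + list(positions[-2:0:-1])       # ping-pong
```

Repeating the returned list gives the unit's full patrol path: for three distinct positions it visits 1, 2, 3, 2, 1, 2, 3, 2, and so on.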
Functionality Not Implemented
The following requirements had priority 3, meaning the project would not have enough
time to implement them. These features were listed with the intent of being fulfilled at a
later point in time. However, some of these requirements were partially met.
One use case was not implemented.
FR 3.01
UASS shall construct a
map based on unit
sensor inputs.
No Progress Was Made
FR 3.02
Users can command
UASS units to move in
predefined formations.
A user has the ability to queue commands and specify
which direction and distance to move a selected unit.
However, this is not predefined and is done during runtime.
FR 3.03
UASS shall allow
common UAS to be
added to its units.
The interface doesn’t allow a user to easily add models of
different robots, however, the system is set up such that
Unity sends a general command format, and the user can
specify how it treats each of the commands being received.
NFR - 3.01
All units react to commands with no delay.
Currently, the system operates with a delay of anywhere from
0.2 seconds to 1 second, depending on the load and
capabilities of the router.
NFR - 3.02
UASS cannot be controlled from an access point the user did
not initiate.
UASS contains some security measures, such as requiring an
ID when receiving any commands. However, many other
precautions must be met to ensure security.
UC - 06
Command Look at
The user commands their selected unit(s) to turn towards a
point specified in the simulated environment map. Unit(s)
will change their heading until they face the point.
Future Work
The Priority 3 requirements listed above are some of the primary features that will
be revisited. Adding further forms of localization for the robot units is another
aspect that could improve this project. Currently, the planned localization
methods used basic sensor readings from the robots, in addition to an
algorithm called SLAM, which simultaneously builds a map of the robot's environment
and determines where the robot is within that environment. Using additional markers and
GPS could make the project much more robust and general purpose, which is one of the
goals of UASS.
Software Overview
Unity Client: https://github.com/Zayik/UASS_UnityV3_24
 AerialUnit.cs - Derives from Unit class. Contains additional information
pertaining to aerial units.
 CameraMotion.cs - Allows a user complete movement control over the camera.
 CommandMgr.cs - Contains a list of command propagators which enqueues the
chosen command to each selected unit.
 DownLineControl.cs - This script helps project a line going to the ground from
aerial units.
 EnableScript.cs - Manages some text input fields and disables some panels upon
joining a game server.
 GroundCircleControl.cs - Used for aerial units. Aerial units have a line pointing
downwards that is projected from them. A circle is drawn where the line intersects
the ground.
 GroundUnit.cs - Derives from Unit class. Contains additional information
regarding ground robots.
 InputMgr.cs - This manager handles all of the input a user gives, via
keyboard or mouse, and assigns meaning to that input, such as a right click
translating to “make all selected units move to the position the user right clicked”.
 NetworkMgr.cs - This manager allows a user to create a room and host it on the
Unity server, on a personal server, or allows a user to join a room on either of the
two servers.
 RobotNetworkMgr.cs - Handles all messages received from robotic units. It can
connect units and receive position updates that update the virtual robotic
units in the simulation.
 SelectionMgr.cs - This manager keeps a list of currently selected units. This class
also has the functionality to add and remove units from the selected units list.
 UIController.cs - Handles pausing and unpausing the simulation environment.
 Unit.cs - Base class for all robotic units in the UASS system. Contains basic
information about the connected unit.
 UnitMgr.cs - This manager contains a list of all connected units. It is also able to
instantiate and remove units from the simulation environment.
 UpdatePositionScript.cs - This script is used for the “Update Position Panel”,
which allows a user to change where a robot is in the world. It manages the input
fields of the robot that contain x, y, z, and yaw. Once these values have been
changed, the user can adjust the robot's offset by calling the function
UpdateUnitPosAndOri().
 Aspect.cs - Base class for all aspects of a Unit class. Contains basic information
about the aspect.
 UnitAI.cs - This class is an aspect of a robot. It monitors the robot's command
queue to determine if it has reached its goal. Upon reaching a goal, it removes
the command from its queue and sends a new command to the real-life robot for it
to execute.
 Command.cs - This file contains a set of classes representing each type of
command. These classes can be placed on a unit's command queue. Each contains a
start method and an update method.
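The interaction between Command.cs and UnitAI.cs described above can be sketched in Python (the actual scripts are C#); the class names mirror the descriptions, and the goal test and unit representation are simplifications for illustration.

```python
class Command:
    """Base command with a start and an update method, as described above."""
    def start(self, unit):
        pass
    def update(self, unit):
        """Return True once the command has reached its goal."""
        return True

class GoTo(Command):
    def __init__(self, target):
        self.target = target
    def start(self, unit):
        unit["destination"] = self.target
    def update(self, unit):
        # Done once the unit's reported position matches the destination.
        return unit["position"] == self.target

def tick(unit, queue):
    """One UnitAI-style step: start the head command if it is new, and pop
    it from the queue once its goal is reached."""
    if not queue:
        return
    head = queue[0]
    if not getattr(head, "_started", False):
        head.start(unit)
        head._started = True
    if head.update(unit):
        queue.pop(0)
```

Each simulation tick advances only the command at the head of the queue, so queued commands execute strictly in order unless the user intervenes.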
UASS_Drone: https://github.com/forknerc/UASS_Drone
 ardrone_autonomy/ - Main driver for the ardrone. Gives simple control of the
robotic unit.
 tum_ardrone/ - a more complex driver built upon the ardrone_autonomy driver.
Uses computer vision to give localization to the drone and also has a built in
command scripting language.
 tum_changes/ - Changes that needed to be made to the tum_ardrone driver so that
the other ROS nodes could access previously hidden controls in the system.
 UASS_Drone/sub.cpp - This is a ROS node that takes care of initial connection to
a UASS Unity client and then sends position updates to the UASS Unity client.
 UASS_Drone/pub.cpp - This is a ROS node that receives and processes messages
from a connected UASS Unity client. Processed messages are then converted and
sent to tum_ardrone to control the drone unit.
 UASS_Drone/UASSflatTrim.cpp - A simple ROS node that calls a service
provider to flat trim (recalibrate the orientation) of a connected drone unit.
 UASS_Bash/ - Used to quickly start up the drivers needed for an ardrone to
connect to a UASS Unity client. The main bash script is UASS_Drone.sh.
Pioneer:
 pos_listener.cpp - A ROS node to read the estimated position of the Pioneer,
convert the position information into a Unity understandable format, and then
transmit that information to the client.
 commandListener.py - A ROS node that retrieves command messages from
Unity. Messages are processed and then issued to the Pioneer.
 controller.py - A ROS node that issues commands to the Pioneer. Its functions
are called by commandListener.py.
 RosAria.cpp - The ROS driver for the Pioneer P3-DX.
Annotated References
Robotics Operating System (ROS):
http://www.ros.org/
ROS will be the main framework on which TUM AR.Drone and Unity3D will
communicate with each other. Unity3D will obtain commands from the user, and then a
ROS node (process) will interpret the command and send the interpretation to TUM
AR.Drone and other ROS nodes that will control ground units.
TUM AR.Drone ROS Package:
http://wiki.ros.org/tum_ardrone
The TUM AR.Drone Package will be the driver for the aerial units in our system. TUM
AR.Drone comes with the ability to give commands to drones, and estimate where the
drone is in 3-dimensional space. TUM AR.Drone comes with the ability to also show a
single unit in a 3-D space, but we will not be using this for our system, since Unity3D
will allow for multiple unit display.
Unity3D Game Engine:
https://unity3d.com/
Unity3D game engine will be the user interface for the system. Unity3D will take
commands from the user, and then send the commands to a ROS node that will distribute
the commands to their given units. Unity3D will also receive information about the unit’s
positions for display.
From ROS to Unity: leveraging robot and virtual environment middleware for
immersive teleoperation
http://vgrserver.cs.yorku.ca/~jenkin/papers/2014/ros2unity.pdf
This paper deals with using the Unity 3D game engine to visualize the map created
by a SLAM algorithm. It looks into how a virtual reality system can be created on
the fly using sensor data.
Micro Aerial Vehicles (MAVs)
https://vision.in.tum.de/research/quadcopter
This website contains a wealth of information pertaining to the Parrot AR drones,
their sensors, and interesting uses for those sensors. It includes information on
implementing a SLAM algorithm to handle localization and mapping of the
environment.
Swarms:
http://www.swarms.org/
Swarms is a project that brings together a variety of different disciplines such as control
theory, biology, robotics, and AI to deal with how to control a multitude of robots
simultaneously. This project link contains information pertaining to creating artificial
networks inspired by biological models.
StarCraft 2:
http://us.battle.net/sc2/en/
StarCraft 2 is a real-time strategy game that contains a very simple yet powerful interface
that our interface is based on. It contains a clean method of controlling a variety of
different units all at the same time.
Glossary of terms
UAS (Unmanned Autonomous System): An unmanned robotic system that makes
choices without human intervention.
Parrot AR Drone 2.0: Quadricopter that can be controlled by user or autonomous
commands. The AR Drone 2.0 comes with two cameras (forward- and down-facing) and
multiple motion sensors.
P3-DX Pioneer Drone: Ground mobile robot that has 8 forward-facing sonar sensors and
optionally 8 rear-facing sonar sensors.
Localization: Estimating the place (and pose) of an object relative to a map.
Unity3D: a powerful rendering engine fully integrated with a complete set of intuitive
tools and rapid workflows to create interactive 3D and 2D content; easy multiplatform
publishing; thousands of quality, ready-made assets in the Asset Store; and a
knowledge-sharing community.
Robot Operating System (ROS): a flexible framework for writing robot software. It is
a collection of tools, libraries, and conventions that aim to simplify the task of creating
complex and robust robot behavior across a wide variety of robotic platforms.
User Interface: the means by which the user and a computer system interact, in
particular the use of input devices and software.
Simultaneous Localization and Mapping (SLAM): a technique used by digital
machines to construct a map of an unknown environment (or to update a map within a
known environment) while simultaneously keeping track of the machine's location in the
physical environment.
Augmented Reality (AR): a live direct or indirect view of a physical, real-world
environment whose elements are augmented (or supplemented) by computer-generated
sensory input such as sound, video, graphics or GPS data.
TUM_ARDrone: a system consisting of three components: a monocular SLAM system,
an extended Kalman filter for data fusion and state estimation and a PID controller to
generate steering commands.
Monocular SLAM System: employs a particle filter and top-down search to allow
real-time performance while mapping large numbers of landmarks.
Kalman filter: algorithm that uses a series of measurements observed over time,
containing noise (random variations) and other inaccuracies, and produces estimates of
unknown variables that tend to be more precise than those based on a single measurement
alone.
Frame of Reference: a coordinate system used to represent and measure properties of
objects, such as their position and orientation, at different moments of time.
Simulation: imitation of the operation of a real-world process or system over time.
Ubuntu: a Linux operating system that has the capability to run the ROS environment.
Minimap: A miniature version of the world map, giving the user a view of where units
are in the known world.
Master Server: A server that is either hosted by Unity or the user. The server acts as a
place for finding and connecting Unity clients together.
Socket Programming: An easy to use programming API for sending packets across a
network.