Functional Specifications

University of Portland
School of Engineering
5000 N. Willamette Blvd.
Portland, OR 97203-5798
Phone 503 943 7314
Fax 503 943 7316
Functional Specification
Project Alder: MapBot
Contributors:
CJ Cullen
Ryan Pila
Ningxuan Wang
Approvals

Name            √    Date
Dr. Crenshaw    √    10/7/09
Mr. Favors      √    10/16/09
Dr. Nuxoll      √    10/8/09
Mr. Foran       √    10/16/09

Insert checkmark (√) next to name when approved.
Revision History
Rev.   Date       Author   Reason for Changes
0.5    09/23/09   all      Initial draft
0.9    09/28/09   all      First full draft: completed first edit cycle; added Ethical Considerations; added Milestones; added Conclusion
0.91   10/4/09    all      Modified map display (Figures 2-5) to represent a more coherent view; removed redundancies; added three-part descriptions for all figures; added Challenge of Precision section; improved Environment Modeler detail (Figure 9); clarified parts of Environment Modeler; added to conclusion; edited use cases; added glossary; minor phrasing edits
0.92   10/6/09    all      Reworded Soar section in Background; corrected figure labels; modified Ethical Considerations; changed Challenge of Precision to Challenge of Perception and modified it; reordered figures and descriptions in Preliminary Design; added reference between multiple-sensors sections; clarified difference between "sketch" and "guess"; clarified definition of "low-level instruction"; clarified use cases; grammar and phrasing edits
0.93   10/7/09    all      Corrected inconsistent wording about perception; fixed spacing and formatting issues
0.95   10/8/09    -        Approved by Faculty Advisors
0.96   10/14/09   all      Fixed Mac-Windows formatting issues; fixed figure captions; added figure table; rephrased several sections
0.97   10/14/09   all      Added SCI example tables; added Module Origin table; corrected spacing and labeling issues
1.0    10/16/09   -        Approved by Industry Representatives
Table of Contents

Overview ..... 10
Environmental Specifications ..... 11
    General Setting ..... 11
    Room Shape ..... 12
    Floor Surface ..... 12
    No Slope ..... 12
    No Stairs ..... 12
    Empty Room ..... 12
Programming Platform ..... 12
    iRobot model ..... 13
    iRobot hardware interface ..... 13
    Hardware Driver ..... 13
    Demo PC specification ..... 13
    Development PC specification ..... 13
Software Specifications ..... 14
    Graphics API ..... 14
    Development Environment ..... 14
    Open Interface ..... 14
    Intelligent System Architecture ..... 15
Challenge of Perception ..... 17
Overview of System Architecture ..... 17
Component Details ..... 20
    iRobot Create ..... 20
    Sensors ..... 20
    Actuators ..... 21
    Create Interface ..... 21
    Environment Modeler ..... 21
    Decision Machine ..... 21
    Actuator Modeler ..... 22
    Create Perception Database ..... 22
Use Cases ..... 22
    Use Case 1: User Start Up and View ..... 22
    Use Case 2: Wall ..... 23
    Use Case 3: Rectangular Room with or without n obstacles ..... 23
    Use Case 4: Non-Rectangular Room with or without n obstacles ..... 24
General Approach ..... 26
    Progress Tracking ..... 27
    Personnel Assignment ..... 27
Assumptions ..... 27
Milestones ..... 28
    Functional Specifications 0.9 ..... 29
    Functional Specifications 0.95 ..... 29
    Functional Specifications 1.0 ..... 29
    Program Review Presentation ..... 29
    Measure Rotational Perception ..... 29
    Interpret Sensor Packet ..... 29
    Populate Create Perception Database ..... 29
    Design Document 0.9 ..... 29
    Design Document 0.95 ..... 30
    Design Document 1.0 ..... 30
    Program Create Interface ..... 30
    Program Environment Modeler: Model ..... 30
    Program Environment Modeler: View ..... 30
    Program Environment Modeler: Controller ..... 30
    Program Actuator Modeler ..... 30
    Program Wall Sketcher ..... 31
    Program Wall Guesser ..... 31
    Program Decision Machine ..... 31
    Integration Testing ..... 31
    Final Documentation 0.9 ..... 31
    Final Documentation 0.95 ..... 31
    Final Documentation 1.0 ..... 31
    Founder's Day Presentation ..... 31
Risks ..... 32
Resources ..... 33
    Personnel ..... 33
    Preliminary Budget ..... 34
    Equipment ..... 34
    Facilities ..... 34
    Soar ..... 36
    Procedural Memory ..... 36
    iRobot Create ..... 36
    RooStick ..... 37
    iRobot Serial Command Interface (SCI) ..... 37
    Microsoft Windows Graphics Device Interface (GDI) ..... 37
Figures
Figure 1. iRobot Create: The platform on which MapBot is built .................................................................8
Figure 2. Raw Perception: .......................................................................................................................... 10
Figure 3. Map Sketch .................................................................................................................................. 10
Figure 4. Map Guess................................................................................................................................... 11
Figure 5. Actual Map ................................................................................................................................... 11
Figure 6. Robot Top View ........................................................................................................................... 13
Figure 7. Robot Bottom View ...................................................................................................................... 13
Figure 8. MapBot System Architecture ...................................................................................................... 18
Figure 9. Environment Modeler Detail........................................................................................................ 19
Figure 10. Overall Development Process .................................................................................................. 26
Tables
Table 1. Environmental Specifications ....................................................................................................... 11
Table 2. System Hardware Specifications ................................................................................................. 12
Table 3. Software Specifications ................................................................................................................ 14
Table 4. Sample SCI Drive Command....................................................................................................... 14
Table 5. Sample SCI Sensor Packet.......................................................................................................... 15
Table 6. Origin of Modules .......................................................................................................................... 22
Table 7. Alder milestones............................................................................................................................ 28
Table 8. Alder project risks, severity, and likelihood. ................................................................................. 32
Table 9. Budget ........................................................................................................................................... 34
Chapter 1
Summary
This document describes the requirements of Team Alder's MapBot. MapBot is an autonomous, mobile, map-making robot. Built on the iRobot Create robotic platform, the MapBot software architecture incorporates intelligent guidance, extrapolative mapping, and visual plotting to efficiently and effectively map and display a reproduction of the two-dimensional geometry of a room. The software executes on a Micro PC with a direct connection to the Create, but its progress can be tracked through a remote connection with a desktop PC.
Chapter 2
Background
There are many situations where an accurate blueprint of a space is necessary to maximize productivity. An office needs a correct floor plan to make the best use of its space. A factory needs a map of its warehouse so that storage space is maximized. These maps can be generated manually, but the task can be accomplished more efficiently through automated means. Team Alder's MapBot is a practical way to automate this process.
MapBot relies on two main technologies:
1. iRobot Create
Figure 1 depicts the iRobot Create mobile robot platform.
Figure 1. iRobot Create: The platform on which MapBot is built
iRobot Create is an educational version of the popular Roomba floor vacuuming robot
(iRobot Corporation Website). We interface with the Create using the iRobot Serial Command Interface (SCI). SCI is the set of instructions that can be given to the Create; it is discussed further in the Software Specifications section.
2. Soar
An important piece of the design of MapBot is the method by which it chooses a route
to explore a room. The most efficient method is one where MapBot takes the least
amount of time possible to generate a perceived map of the room. However, this method is difficult because MapBot has no information about the room until it gathers it. The choice of algorithm for this method is important to the efficiency of MapBot.

Soar is a cognitive architecture that mimics the cognitive behavior of humans, and as an optional feature, MapBot can use Soar to help navigate a room. Soar has the potential to allow a robot to reason more like a human would, making high-level decisions based on recognized patterns. This approach allows MapBot to make guidance decisions more effectively. It also adds an interesting opportunity to investigate the reasoning methods of humans (Soar Website). The use of Soar is an optional feature for MapBot, but it could greatly increase efficiency.
Chapter 3
Requirements
This chapter discusses the requirements of MapBot. It provides an overview and details
the environment, programming platform, and software requirements.
Overview
This product uses an autonomous mobile robot to explore an indoor environment. A map-making program interfaced with the robot automatically generates and displays a map based on the robot's perception within a specified amount of time. It can be used to determine the geometric features of an area where human observation is restricted. The user places the robot on the ground and runs the map-making program, which prompts the user to specify a time duration for the robot to run.
The four figures below demonstrate how the MapBot’s software translates horizontal
bumps detected by the front bumpers of MapBot into a map display. Consider that the
MapBot is traversing the actual room shown in Figure 5. What the MapBot perceives is a
series of distances travelled and bumps experienced, as shown in Figure 2. From that
perception, the MapBot software creates a map sketch and a map guess as shown in
Figure 3 and Figure 4.
Figure 2. Raw Perception: Black dots indicate bumps and lines indicate path.
Figure 3. Map Sketch: Thick lines indicate wall sketches and thin lines indicate path.
Figure 4. Map Guess: Black area shows a guess of the map.
Figure 5. Actual Map: The actual map as a comparison against the map guess.
In Figure 2, the program displays a black dot when the robot reports a bump, and it
displays constantly updating lines to keep track of the robot’s path. In Figure 3, the
program displays thick lines to indicate wall sketches based on the sequence of the black
dots and their positions in Figure 2, as well as thin lines for the path. In Figure 4, the black area shows the program's best room guess, which is constantly updated. The program only shows Figure 2, Figure 3, and Figure 4. Figure 5 is the actual map, included as a comparison to show that the program is subject to mistakes; in this case, it misses part of the map.
Environmental Specifications
This section contains a list of the physical specifications and their required values.
Requirement        Value
General Setting    Office Setting
Room Shape         Rectilinear
Floor Surface      Carpet
No Slope           -
No Stairs          -
Empty Room         -

Table 1. Environmental Specifications
General Setting
The general environment setting is an office setting in an indoor, room-temperature
environment unaffected by outdoor weather.
Room Shape
The room must be rectilinear: a room with straight walls that intersect other walls only at 90-degree angles. A curved wall can potentially cause problems in sensor detection and map-drawing algorithms.
Floor Surface
The robot runs on carpeted floors. This ensures that the robot's speed stays constant and that the values for measuring distances are not skewed.
No Slope
The floor must be level and without slopes. This keeps the values for measuring
distances as accurate as possible.
No Stairs
Stairs must not be present within the environment. The robot is not able to create
maps that span multiple floors of a building.
Empty Room
The room must be empty during the entire run time duration. Furniture, persons, or
any other obstacles are not to be present in the room.
Programming Platform
This section contains a list of devices required to build the product. It uses an iRobot Create to
perceive the environment and a Micro PC that sits on the Create to run the program; they are
connected by a RooStick that requires the Micro PC to run a CP2103 driver. For debugging
purposes, a 15-foot serial cable is used initially to connect the Create to a development laptop.
For demonstration, the Micro PC is placed on the Create to run the program, while
simultaneously connecting with the laptop wirelessly to display results.
Requirement                  Value
iRobot Model                 iRobot Create
iRobot Hardware Interface    RooStick
Hardware Driver              CP2103
Development PC               IBM ThinkPad
Demo PC                      Sony Micro PC

Table 2. System Hardware Specifications
iRobot model
Consider the two figures below depicting the robot's sensors, connectors, and chassis.

Figure 6. Robot Top View: The robot has a receiver (omnidirectional IR) and two connectors (cargo bay and MiniDin) on top.
Figure 7. Robot Bottom View: The robot has 9 sensors (bump, wheel-drop, and cliff) and 3 wheels on the bottom.
The iRobot Create Programmable Robot is a preassembled mobile robot platform. It
allows users to program its movements and sounds. The Create consists of an
omnidirectional infrared receiver, a cargo bay connector, and an external MiniDin-6
connector at the top (as shown in Figure 6), three wheels (caster, left and right), two bump
sensors (left and right), three wheel drop sensors (caster, left and right), and four cliff
sensors (front left, front right, left and right) at the bottom (as shown in Figure 7). The robot
uses a rechargeable battery pack, which can power the robot for up to 1.5 hours.
iRobot hardware interface
The RooStick enables a computer to communicate serially with an iRobot Create through a
USB port. Programs capable of issuing serial commands to a COM port can use the
RooStick to control the robot. The RooStick consists of a MiniDIN-7 female connector to
connect to the robot and a standard USB type A connector to connect to a computer.
Hardware Driver
The CP2103 Driver is a USB-to-UART Virtual COM Port driver.
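As an illustration only, the following sketch shows how a program might open the RooStick's virtual COM port with the Win32 serial API. The port name and the 57600 baud setting are assumptions made for the sketch and would need to be confirmed against the SCI documentation and the machine's device settings.

// Illustrative sketch: open the RooStick's virtual COM port (assumed name and baud rate).
#include <windows.h>

HANDLE OpenCreatePort()
{
    // "\\\\.\\COM3" is a placeholder; the actual port number depends on the machine.
    HANDLE port = CreateFileA("\\\\.\\COM3", GENERIC_READ | GENERIC_WRITE,
                              0, NULL, OPEN_EXISTING, 0, NULL);
    if (port == INVALID_HANDLE_VALUE)
        return port;

    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_57600;   // assumed SCI baud rate
    dcb.ByteSize = 8;           // 8 data bits, no parity, 1 stop bit
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);
    return port;
}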
Demo PC specification
Sony VAIO VGN-UX180P 4.5-inch Ultraportable Micro PC, Intel Core Solo Processor
U1400, 512 MB RAM, Microsoft Windows XP Professional Edition service pack 3.
Development PC specification
IBM ThinkPad R60 14.1-inch Laptop PC, Intel T2300 1.66 GHz, 1.25 GB RAM. Microsoft
Windows XP Home Edition service pack 3.
Software Specifications

Requirement                      Value
Graphics API                     Windows GDI
Development Environment          Visual Studio 2008
Open Interface                   iRobot SCI
Intelligent System Architecture  Soar 9.2.0 Beta

Table 3. Software Specifications
Graphics API
The Microsoft Windows Graphics Device Interface (GDI) is an application programming interface accessible from C/C++. It allows Windows-based applications to display graphics and formatted text; it can be used to draw lines and curves and to adjust fonts and colors.
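As a minimal illustration of how the Environment Modeler's view might use GDI to plot one path segment and one bump point (the function name is hypothetical, and a valid device context from a WM_PAINT handler is assumed):

// Illustrative GDI sketch: draw one path segment (thin line) and one bump (small dot).
#include <windows.h>

void DrawPathAndBump(HDC hdc, POINT from, POINT to, POINT bump)
{
    MoveToEx(hdc, from.x, from.y, NULL);  // start of the robot's last path segment
    LineTo(hdc, to.x, to.y);              // thin line for the path
    // a small ellipse stands in for the black "bump" dot
    Ellipse(hdc, bump.x - 2, bump.y - 2, bump.x + 3, bump.y + 3);
}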
Development Environment
Visual Studio C++ 2008, Version 9.0.30729.1 SP
Open Interface
The iRobot Serial Command Interface (SCI) is a serial protocol that consists of a set of
commands which allow users to control the robot and monitor its sensors through its
external MiniDin connector. Users can send commands to the robot to actuate its wheels
and steering mechanism and to request its sensor data. Such commands are drive, song,
sensors and so on. A command is a sequence of bytes in which the first byte indicates the type of the command and the remaining bytes are arguments carrying parameters such as velocity, time duration, and radius. For example, "137 1 44 128 0" is a sequence of bytes that commands the Create to drive straight at 300 mm/sec. Table 4 shows the meaning of each byte when used with the "Drive" OpCode.
OpCode   Velocity High-Byte   Velocity Low-Byte   Radius High-Byte   Radius Low-Byte
137      1                    44                  128                0

Table 4. Sample SCI Drive Command
As shown in Table 4, the Drive command contains five bytes. The first byte indicates that it
is a drive command. The second and third bytes specify the velocity. The fourth and fifth
bytes specify the radius.
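A minimal sketch of how such a command might be assembled and written to the serial port in C++ follows. The helper name is hypothetical; the byte values mirror Table 4 (300 mm/sec splits into high byte 1 and low byte 44 because 1 x 256 + 44 = 300, and radius bytes 128, 0 form the "drive straight" special case).

// Illustrative sketch: build and send the Drive command from Table 4.
#include <windows.h>

void SendDriveStraight(HANDLE port, short velocityMmPerSec)
{
    unsigned char cmd[5];
    cmd[0] = 137;                              // Drive OpCode
    cmd[1] = (velocityMmPerSec >> 8) & 0xFF;   // velocity high byte (1 for 300 mm/sec)
    cmd[2] = velocityMmPerSec & 0xFF;          // velocity low byte (44 for 300 mm/sec)
    cmd[3] = 128;                              // radius high byte
    cmd[4] = 0;                                // radius low byte ("straight" special case)
    DWORD written = 0;
    WriteFile(port, cmd, sizeof(cmd), &written, NULL);
}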
Sensor data is polled in the form of sensor packets that contain data collected by the
robot’s sensors such as the bump sensors. A sensor packet is a byte in which each bit
corresponds to the state of a sensor. Table 5 shows the meaning of each bit when the
right bump sensor has been triggered.
Field   N/A   N/A   N/A   Wheeldrop Caster   Wheeldrop Left   Wheeldrop Right   Bump Left   Bump Right
Value   0     0     0     0                  0                0                 0           1

Table 5. Sample SCI Sensor Packet

As seen in Table 5, if the byte is 00000001, the rightmost bit (1) indicates that the right bump sensor has detected a horizontal bump.
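A minimal sketch of how the Create Interface might test this byte in C++ (the mask names are our own; the bit positions follow Table 5):

// Illustrative sketch: decode the bump/wheel-drop byte laid out in Table 5.
const unsigned char BUMP_RIGHT       = 0x01;  // bit 0
const unsigned char BUMP_LEFT        = 0x02;  // bit 1
const unsigned char WHEELDROP_RIGHT  = 0x04;  // bit 2
const unsigned char WHEELDROP_LEFT   = 0x08;  // bit 3
const unsigned char WHEELDROP_CASTER = 0x10;  // bit 4

bool RightBumpTriggered(unsigned char packet)
{
    return (packet & BUMP_RIGHT) != 0;  // true for the sample byte 00000001
}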
Intelligent System Architecture
The Soar architecture is designed for developing systems that generate intelligent
behaviors. It models human cognitive processes including understanding and anticipation.
It uses an internal structure called working memory modeled after human short-term
memory to build an agent's knowledge about its environment. It employs a process called the decision cycle to produce actions in pursuit of goals. This cycle consists of four phases: 1)
the agent receives information from the external environment, 2) it proposes possible
actions, 3) it decides on one of them based on its knowledge provided by the programmer,
4) it applies the chosen action to the environment. The current version is Soar 9.2.0 Beta
(Soar Software Website).
For example, in the following Soar code snippet, if there is a state named hello-world, a
print action will get proposed; if this action gets selected, it will be applied, which prints out
“Hello World.”
sp {hello-world*propose*print
(state <s> ^name hello-world)
-->
(<s> ^operator <o> +)
(<o> ^name print)}
sp {hello-world*apply*print
(state <s> ^name hello-world)
(<s> ^operator <o>)
(<o> ^name print)
-->
(write | Hello World |)
(halt)}
Soar contains a long-term, procedural memory of skills, which consists of sequences of procedures. It tells the agent how to do certain things without the agent knowing when or where it acquired this memory. With procedural memory, the agent does not need to know why it should follow the procedures to achieve the goal; instead, as long as the conditions of the procedural memory match, the agent takes the corresponding action spontaneously.
Chapter 4

Ethical Considerations
MapBot poses no serious direct danger to humans, but there are two minor hazards. MapBot is very low to the ground and can be difficult to see if one is not looking down. A person might not see MapBot while walking and could trip over it. Also, because it can be hard to see, MapBot could startle someone, possibly resulting in injury. Though minor, these two hazards are possible dangers of MapBot.

There is another ethical consideration: privacy. Even though the iRobot Create is not powerful enough to cause physical harm to a human, it can invade a person's privacy. If a user gains unauthorized access to private property, that user could run MapBot's map-making system on that property, violating the privacy of the rightful owner. However, laws are in place to prevent unauthorized access to private property, and Team Alder trusts those laws to prevent gross misuse of MapBot's functionality.
Chapter 5

Preliminary Design
Challenge of Perception
There are three challenges to consider when building a map-making robot: perception,
decision making, and motor control. For perception in particular, we have to rely on the
assumption that the robot will behave in the same manner every time a command is issued.
This means that in order for the MapBot to construct a map of its surroundings, the robot must travel at the same velocity when given a drive command and must rotate at a consistent rate when given a rotation command. During the course of this project, we refer to this challenge as the challenge of "perception."
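To make this assumption concrete, the sketch below shows the kind of dead-reckoning update the software could apply once a drive or rotation has been timed. The calibration constants are placeholders; in practice they would come from the Create Perception Database.

// Illustrative dead-reckoning sketch under the consistent-behavior assumption.
#include <cmath>

struct Pose { double x, y, headingRad; };

// Placeholder calibration values (to be measured, not specified here).
const double MM_PER_MS  = 0.3;    // e.g. 300 mm/sec forward speed
const double RAD_PER_MS = 0.002;  // rotation rate while turning in place

Pose AfterDrive(Pose p, double driveMs)
{
    double d = MM_PER_MS * driveMs;         // distance implied by the timed drive
    p.x += d * std::cos(p.headingRad);
    p.y += d * std::sin(p.headingRad);
    return p;
}

Pose AfterRotate(Pose p, double rotateMs)
{
    p.headingRad += RAD_PER_MS * rotateMs;  // heading change implied by the timed rotation
    return p;
}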
Overview of System Architecture
This section describes the preliminary overview of the system architecture for MapBot. As
shown in Figure 8 and Figure 9, the MapBot system architecture consists of five software
components and the iRobot Create platform.
Figure 8. MapBot System Architecture
This figure depicts the modules of the MapBot.
In Figure 8, there are six components. They are:
1. Environment Modeler: Stores and updates a model of the physical environment.
2. Decision Machine: Makes guidance decisions based on environment model.
3. Actuator Modeler: Converts a high-level decision into low-level instructions.
4. Create Interface: Provides an API (Application Programmer Interface) for transfer of
information between software and Create.
5. Create Perception Database: Contains conversion data about the perception of the
Create.
6. iRobot Create: The physical robot that interacts with the environment.
Each of these components is described in further detail on pages 20-22 in the subsection "Component Details."
Figure 9. Environment Modeler Detail
This figure shows the architecture of the Environment Modeler in more detail.
In Figure 9, there are five components. They are:
1. Environment Model: Contains all discovered information about the environment.
2. Environment View: Displays the Environment Model to the user.
3. Environment Controller: Modifies the Environment Model. Sends messages to other
modules.
4. Wall Sketcher: Produces lines (walls) based on points (bumps).
5. Wall Guesser: Produces a set of lines (room) based on walls and bumps.
Each of these components is described in further detail on page 21 in the subsection "Component Details" under the item "Environment Modeler."
Figure 8 and Figure 9 above describe the flow of information through the components of
the MapBot.
1. The Environment Controller sends its Current Environment Model to the Decision Machine.
2. The Decision Machine processes the Current Environment Model and generates a High-level Guidance Command, which it sends to the Actuator Modeler.
3. The Actuator Modeler converts the High-level Guidance Command into Low-level
Instructions and sends them to the Create Interface.
4. The Create Interface converts the Low-level Instructions into SCI Commands, and
sends them to the iRobot Create. Simultaneously, the Create Interface starts a Timer.
5. The iRobot Create executes the SCI Commands until a Sensor is triggered. This
causes the Create Interface to receive a Sensor Data Packet with pertinent
information.
6. The Create Interface stops its timer, and prompts the Create Perception Database to
convert from Time Data into Distance Data.
7. The Create Interface sends Last Path & Bump Data to the Environment Controller.
8. The Environment Controller updates the Environment Model, and repeats from step 1.
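The loop implied by steps 1 through 8 could be summarized roughly as follows. This is a structural sketch only; every type and method name is a hypothetical placeholder, not a commitment of this specification.

// Illustrative sketch of the Figure 8 information flow (steps 1-8).
// All types and method names below are hypothetical placeholders.
#include <vector>

struct EnvironmentModel {};
struct GuidanceCommand {};
struct LowLevelInstruction {};
struct PathAndBumpData {};

class MapBotLoop {
public:
    MapBotLoop() : done(false) {}
    void Run() {
        while (!done) {
            EnvironmentModel model = CurrentModel();                   // step 1
            GuidanceCommand command = Decide(model);                   // step 2
            std::vector<LowLevelInstruction> steps = Expand(command);  // step 3
            PathAndBumpData data = ExecuteOnCreate(steps);             // steps 4-7
            UpdateModel(data);                                         // step 8
        }
    }
private:
    bool done;  // set when the mapping run is judged complete
    EnvironmentModel CurrentModel();
    GuidanceCommand Decide(const EnvironmentModel& model);
    std::vector<LowLevelInstruction> Expand(const GuidanceCommand& command);
    PathAndBumpData ExecuteOnCreate(const std::vector<LowLevelInstruction>& steps);
    void UpdateModel(const PathAndBumpData& data);
};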
Component Details
This section describes each of the above components in detail.
iRobot Create
The iRobot Create is as described in the Programming Platform
section. Team Alder will not be modifying this component.
Sensors
The Create has three Sensors that Team Alder is interested in:
1. Bump Sensor
Bump Sensors are described in Figure 7 in the section "Programming Platform." These are the main sensors used by MapBot because they detect walls.
2. Cliff Sensors
Cliff Sensors are described in Figure 7 in the section "Programming Platform." Use of the Cliff Sensors is an optional addition, but they are useful to prevent MapBot from falling down stairs.
3. Heading Sensor
The Heading Sensor is an additional external component that allows the heading direction to be measured directly. This is an optional addition, but it is useful for ensuring accurate heading data. If this component is not used, heading data will be determined from the rotation of the Create.
Actuators
The Create has two types of movements it can make: it can drive (forward or backward), and it can rotate in place. The actuators are the registers that are modified in order to make the Create move.
Create Interface
The Create Interface sends SCI commands to the Create, and receives sensor data
packets from it. It is designed by Team Alder to provide a convenient abstraction of the
physical robot. It allows necessary information and functionality to be accessed easily.
Instead of sending commands directly to the Create, the other software modules interact
with the Create Interface. The Create Interface translates the instructions from the
Actuator Modeler into SCI commands and sends these to the Create. The Create
Interface also translates the sensor data packets it receives into messages relevant to the
Environment Modeler. This module is created by Team Alder.
Environment Modeler
The Environment Modeler converts the bump data into a set of points on a Cartesian
plane. It is based on the Model-View-Controller architecture. The specialized pieces of the
Environment Modeler are the Wall Sketcher and Wall Guesser. The Wall Sketcher infers
that two nearby bump-points must be a wall, so it adds a “sketch” of this wall to the
Environment Model. A “sketch” can be thought of as a low level inference that if multiple
points are close to forming a line, they can be sketched as a wall. The Wall Guesser
continuously examines the Environment Model and adds its “guess” of a set of walls to the
Environment Model. A “guess” can be thought of as a higher-level inference that if multiple
walls are close to forming a rectilinear room, they can be guessed to be a room.
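As an illustration of the lowest-level "sketch" inference, the snippet below turns two sufficiently close bump points into a wall segment. The distance threshold and the names are assumptions made for the example.

// Illustrative sketch: infer a wall "sketch" from two nearby bump points.
#include <cmath>

struct BumpPoint  { double x, y; };
struct WallSketch { BumpPoint a, b; };

const double NEARBY_MM = 500.0;  // placeholder threshold for "nearby" bumps

bool TrySketchWall(BumpPoint p1, BumpPoint p2, WallSketch& out)
{
    double dx = p2.x - p1.x;
    double dy = p2.y - p1.y;
    if (std::sqrt(dx * dx + dy * dy) > NEARBY_MM)
        return false;   // too far apart to assume the bumps share a wall
    out.a = p1;         // sketch the wall as the segment between the two bumps
    out.b = p2;
    return true;
}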
Decision Machine
There are three options for the Decision Machine:
1. Random Decision Machine
The movement directions to the Actuator Modeler would be generated randomly. An
implementation of MapBot that randomly picks a direction and moves until it reaches
an obstacle is a simple but effective way of generating the necessary data.
2. Simple Heuristic Decision Machine
The movement directions to the Actuator Modeler would be generated according to a
simple set of heuristics. Though more difficult to implement than the Random Decision Machine, the Simple Heuristic Decision Machine should generate data more efficiently.
3. Soar Decision Machine
The movement directions to the Create would be generated by the Soar cognitive
architecture. With Soar, MapBot would produce the necessary data more efficiently by acting on recognized patterns.
Actuator Modeler
The Actuator Modeler receives a High-level Guidance Command from the Decision Machine. It breaks this down into individual steps and sends them to the Create Interface as Low-level Instructions. These low-level instructions are an ordered list of basic directions such as: {(Move, -0.3 m), (Rotate, 55°), (Move, 2 m)}.
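One possible in-code representation of such a list is sketched below; the type names are illustrative only.

// Illustrative sketch of a low-level instruction list such as
// {(Move, -0.3 m), (Rotate, 55 degrees), (Move, 2 m)}.
#include <vector>

enum InstructionType { MOVE, ROTATE };

struct LowLevelInstruction {
    InstructionType type;
    double amount;  // metres for MOVE, degrees for ROTATE
};

std::vector<LowLevelInstruction> ExampleExpansion()
{
    std::vector<LowLevelInstruction> steps;
    LowLevelInstruction s1 = { MOVE,  -0.3 };
    LowLevelInstruction s2 = { ROTATE, 55.0 };
    LowLevelInstruction s3 = { MOVE,    2.0 };
    steps.push_back(s1);
    steps.push_back(s2);
    steps.push_back(s3);
    return steps;
}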
Create Perception Database
The Create Perception Database is a small database consisting of measured data
about the Create’s speed and rotational perception. This is used by the Create
Interface so that distance and degrees of rotation can be determined.
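A minimal sketch of the conversions this database would support (the field and method names are illustrative, and the stored values are placeholders to be replaced by measured calibration data):

// Illustrative sketch: convert between drive time and distance, and time and rotation,
// using calibration values that would be measured and stored in the database.
struct CreatePerceptionDatabase {
    double mmPerMs;   // measured forward speed (placeholder until calibrated)
    double degPerMs;  // measured rotation rate (placeholder until calibrated)

    double TimeToDistanceMm(double driveMs) const { return driveMs * mmPerMs; }
    double DistanceToTimeMs(double mm) const      { return mm / mmPerMs; }
    double TimeToDegrees(double rotateMs) const   { return rotateMs * degPerMs; }
    double DegreesToTimeMs(double deg) const      { return deg / degPerMs; }
};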
Table 6 displays the origin of each of the eight modules shown in Figure 8 and described in "Component Details."
Component                      Origin
iRobot Create                  Already Available
Sensors                        Already Available
Actuators                      Already Available
Create Interface               Made by Team Alder
Environment Modeler            Made by Team Alder
Decision Machine               Made by Team Alder
Actuator Modeler               Made by Team Alder
Create Perception Database     Made by Team Alder

Table 6. Origin of Modules
Table 6 specifies which modules are already available to Team Alder as well as which
modules Team Alder is required to develop.
Use Cases
This section describes possible use cases for MapBot.
Use Case 1: User Start Up and View
Goal in context: Start up project.
Preconditions: The Create and PC remote desktop are on and ready to perform
task.
Scenario:
The user runs the project executable.
Three windows pop up on the desktop screen as shown in Figure 2, Figure 3, & Figure 4:
Window A: displays the path the Create takes
Window B: displays wall sketches
Window C: displays wall guesses.
Create is ready to navigate.
Exceptions:
1. Project executable is not available.
2. Connection between Create and PC remote desktop is not well established.
Priority: Essential, must be implemented.
When available: By Founder’s Day. First release. Releases are described in the
Milestones under Founder’s Day presentation.
Frequency of use: Always.
Use Case 2: Wall
Goal in context: Map the length of a wall.
Preconditions: The Create and PC remote desktop are on and ready to perform
task.
Scenario:
1. The user runs the project executable.
2. Three windows pop up on the desktop as shown in Figure 2, Figure 3, & Figure 4:
Window A: displays the path the Create takes
Window B: displays wall sketches
Window C: displays wall guesses.
3. The Create travels forward.
4. Line segments displaying the path the Create takes appear periodically in Window A.
5. The Create bumper sensor hits the wall and a point is drawn in Window A.
6. Create moves in new direction until the wall is hit again and a point is drawn in
Window A.
7. Window B starts displaying short line segments representing wall
approximations.
8. Window C starts to display lines that are the program’s best wall placement.
9. Repeat steps 6 through 8 as Create continues to map room.
10. An alert will tell the user that the Create is done.
Exceptions:
a. No wall exists.
b. Map fails to display correctly because connection between Create and PC
remote desktop is not well established.
Priority: Essential, must be implemented.
When available: By Founder’s Day. First Release. Releases are described in the
Milestones under Founder’s Day presentation.
Frequency of use: Always.
Use Case 3: Rectangular Room with or without n obstacles
Goal in context: Navigate and map a rectangular room.
Preconditions: Room is rectangular. The Create and PC remote desktop are on and ready to perform the task. The Create is set on the ground in the room.
Scenario:
1. The user runs the project executable.
2. Three windows pop up on the desktop:
Window A: displays the path the Create takes
Window B: displays wall sketches,
Window C: displays wall guesses.
3. The Create travels forward.
4. Line segments displaying the path the Create takes appear periodically in Window A.
5. The Create bumper sensor hits a wall and a point is drawn in Window A.
6. Create moves in new direction until wall is hit again and a point is drawn in
Window A.
7. Window B starts displaying short line segments representing wall
approximations.
8. Window C starts to display lines that are the program's best guess of the room layout.
9. Obstacles that exist in the room are drawn in Windows B and Windows C using
lines to depict the outline of the obstacles.
10. Repeat steps 6 through 9 as Create continues to map room.
11. An alert will tell the user that the Create is done.
Exceptions:
1. Map fails to display correctly because connection between Create and PC
remote desktop is not well established.
2. In the case that there are obstacles present, Environment Modeler misinterprets
data points to be walls or obstacles when they are not.
Priority: Essential, must be implemented.
When available: Without Obstacles: On Founder’s Day, first release. With
Obstacles: second release or later. Releases are described in the Milestones under
Founder’s Day presentation.
Frequency of use: Always.
Use Case 4: Non-Rectangular Room with or without n obstacles
Goal in context: Navigate and map a non-rectangular room.
Preconditions: Room is empty. The Create and PC remote desktop are on and
ready to perform task. The Create is set on the ground in the room.
Scenario:
1. The user runs the project executable.
2. The project performs very similarly to the scenario described in Use Case 3, with the exception that non-rectilinear walls may be drawn.
3. An alert will tell the user that the Create is done.
Exceptions:
1. Map fails to display correctly because connection between Create and PC
remote desktop is not well established.
2. Environment Modeler misinterprets data points of a curved wall.
3. In the case that there are obstacles present, Environment Modeler misinterprets data points to be walls or obstacles when they are not.
Priority: Not essential.
When available: Third release. Releases are described in the Milestones under Founder's Day presentation.
Frequency of use: Seldom.
Chapter 6

Development Process
This chapter illustrates the anticipated development process for Team Alder.
General Approach
This section describes the general approach of development of MapBot. As shown in
Figure 10, there are dependencies among the software modules of MapBot.
Figure 10. Overall Development Process
This figure shows the dependencies and the anticipated order of modules of MapBot.
Figure 10 displays the nine software modules of MapBot and their order. They are:
1. Create Perception Database: The Create Perception Database is simple and is
necessary for other modules.
2. Create Interface: The Create Interface is crucial for debugging and testing as other modules are being developed.
3. Environment Modeler-Model: The Environment Modeler-Model is the basis for all elements of the Environment Modeler.
4. Environment Modeler-View: The Environment Modeler-View is useful for debugging
and testing elements of the Environment Modeler.
5. Environment Modeler-Controller: The Environment Modeler-Controller is necessary
for sending messages from within the Environment Modeler.
6. Actuator Modeler: The Actuator Modeler is necessary to allow the Decision Machine
to connect with the Create Interface.
7. Wall Sketcher: The Wall Sketcher has nothing depending on it.
8. Wall Guesser: The Wall Guesser has nothing depending on it.
9. Decision Machine: The Decision Machine has nothing depending on it.
Progress Tracking
Team Alder will report its progress in three different ways:
1. Student Team meeting:
The students of Team Alder will meet informally at least once a week to discuss
progress, challenges, and goals.
2. Full Team meeting:
Team Alder will meet in full (students and faculty advisors) once a week to formally
discuss progress, challenges and goals.
3. Bi-Weekly report to Industry Representative
Team Alder will write a brief status report at least every other week so that the
Industry Representative can track progress.
Personnel Assignment
Though all teammates are responsible for all pieces of MapBot, leadership of each component will be assigned. Some small components may be led by a single team member, while others may need the work of two members or even the full attention of the entire team.
Assumptions
This section describes the assumptions that Team Alder makes about the project.
- Replacement hardware is always available. This includes a replacement Create and RooStick.
- A microcomputer is available to be placed on top of the Create to be used as an interface for the Decision Machine, Environment Modeler, and Actuator Modeler.
- Connection between the PC remote desktop and the microcomputer is well-established.
Milestones
This section lays out the milestones that Team Alder will meet throughout the year.
Milestones for programming components do not just indicate code-completeness; they also include a test-and-fix phase.
Number   Description                                            Completion Date
1        Functional Specifications 0.9                          10/2
2        Functional Specifications 0.95                         10/9
3        Functional Specifications 1.0                          10/16
4        Program Review Presentation                            10/26
5        Measure Rotational Perception                          11/6
6        Interpret Sensor Packet                                11/13
7        Populate Create Perception Database                    11/20
8        Design Document 0.9                                    11/25
9        Design Document 0.95                                   12/4
10       Design Document 1.0                                    12/11
11       Program Create Interface                               1/15
12       Program Environment Modeler: Model                     1/22
13       Program Environment Modeler: View                      1/29
14       Program Environment Modeler: Controller                2/5
15       Program Actuator Modeler                               2/12
16       Program Wall Sketcher                                  2/19
17       Program Wall Guesser                                   2/26
18       Program Decision Machine; Integration Testing          3/5
19       Final Documentation 0.9; Integration Testing           3/19
20       Final Documentation 0.95; Integration Testing          3/26
21       Founder's Day Presentation; Final Documentation 1.0    4/2

Table 7. Alder milestones.
Functional Specifications 0.9
This version of the functional specifications is the initial draft. It is to be completed when all sections of the functional specification document are filled out. It is to be approved by our advisors.
Functional Specifications 0.95
Functional specification 0.95 is our revised version of the document. It is to be
approved by our advisors after all necessary revisions have been made and the
document looks acceptable to give to our industry representative.
Functional Specifications 1.0
This is our final version of the functional specifications. It is considered finished when
all final revisions have been included within the document and the industry
representative has approved it.
Program Review Presentation
The program review presentation is a monthly presentation given to the EE/CS 480
class about our progress in the senior design project. The presentation is 5-10
minutes long and must include a slide presentation.
Measure Rotational Perception
The robot is mobile in two different ways; one of them is rotating in place by up to 360 degrees. Measuring the rotational perception is finished when we can give the robot a set input and it will rotate to the specified angle.
Interpret Sensor Packet
In order to determine whether the robot has hit a wall or an object with its bump sensor, we need to request a sensor packet update at a regular time interval.
Create can send different types of sensor packets depending on which sensors were
used. This milestone is completed when the robot has successfully sent the correct
sensor packet after hitting a wall and the Create Interface has correctly interpreted
which sensor was used.
Populate Create Perception Database
The Create Perception Database is a module in the software portion of our project
that holds the conversions from robot distance traveled to robot time traveled in
milliseconds and vice versa. It is finished when all entries have been filled out.
Design Document 0.9
This version of the design document is the initial draft. It is to be completed when all
sections of the design document are filled out. It is to be approved by our advisors.
Design Document 0.95
The design document 0.95 is our revised version of the document. It is to be approved by our advisors after all necessary revisions have been made and the document looks acceptable to give to our industry representative.
Design Document 1.0
This is our final version of the design document. It is considered finished when all final
revisions have been included within the document and the industry representative has
approved it.
Program Create Interface
The Create Interface is a module in the software portion of our project that connects
the software and hardware portions of our project together. It processes the commands from the Actuator Modeler and turns them into serial commands with the help of the Create Perception Database. The Create Interface also interprets the sensor bumps into points and paths for the Environment Modeler to use. The Create
Interface is finished when it can send all necessary SCI commands as well as
interpret all sensor packets needed by the other components.
Program Environment Modeler: Model
The Environment Modeler: Model is a module that holds all necessary information
about walls and paths such as points, path lines, etc. The Environment Modeler:
Model is completed when it can successfully hold all necessary data such as the
robot’s path, wall bumps, wall sketches, and wall guesses, during a trial run of the
robot.
Program Environment Modeler: View
The Environment Modeler: View is a module that displays the robot’s perception, map
draft, and the best approximation of the map. The Environment Modeler: View is
done when it correctly displays the map based on the information from the
Environment Modeler: Model.
Program Environment Modeler: Controller
The Environment Modeler: Controller is a module that is the main communicator
between the Create Interface, Wall Sketcher, Wall Guesser, and the Decision
Machine. It is finished when it sends the correct information to the specified modules.
Program Actuator Modeler
The Actuator Modeler is a module that interprets the high level guidance command
such as Explore from the Decision Machine into a lower level command such as Drive
and Rotate. It is completed when it correctly interprets the high level command into a
lower level command.
Program Wall Sketcher
The Wall Sketcher is a module that makes a sketch of wall positions based on sensor hits. The Wall Sketcher is done when, given the data from the Environment Modeler: Model, it is able to interpret two nearby points (wall bumps) on the map as a potential wall of the room.
Program Wall Guesser
The Wall Guesser is a module that makes the best possible guess of the walls in the
room. It is finished when, given the data from the Environment Modeler: Model, the Wall Guesser creates a reasonable approximation of the map of the room.
Program Decision Machine
The Decision Machine is a module that decides on the best possible movements for the robot to make. We know it is completed when the Decision Machine reasonably navigates the robot.
Integration Testing
It is important that time be set aside to test both individual components and the project
as a whole. Testing ensures that each component is bug free and that all
components communicate with each other as described in the System Architecture.
Testing is done when the project runs as described in the Requirements Overview.
Final Documentation 0.9
This version of the final documentation is the initial draft. It is to be completed when
all sections of the final documentation document are filled out. It is to be approved by
our advisors.
Final Documentation 0.95
Final documentation 0.95 is our revised version of the document. It is to be approved
by our advisors after all necessary revisions have been made and the document looks
acceptable to give to our industry representative.
Final Documentation 1.0
This is our final version of the final documentation. It is considered finished when all
final revisions have been included within the document and the industry representative
has approved it.
Founder’s Day Presentation
The Founder's Day presentation on April 2nd, 2010 is the final presentation for the senior design project; it explains and demos the project. The goal is to finish the project by Founder's Day. There may be several releases of MapBot before the Founder's Day presentation, and we wish to demo at least the first release. In the first release, the MapBot can start up and map an empty rectilinear room using a simple heuristic decision
machine to navigate the robot. The second release of the MapBot includes the use of Soar as the decision machine to navigate the robot; this release may also deal with mapping obstacles. The third release maps a non-rectilinear room.
Risks
This section discusses the potential risks of this project.
Risk                                                                  Severity   Likelihood
Team members contract the flu                                         Low        Moderate
Team leader's course load and sports schedule hampers progress        Moderate   High
The challenge of perception is not addressed properly                 High       Low
Demo does not stand out during Founder's Day presentation             Low        Moderate
Algorithms for calculating direction and wall positions do not work   High       Moderate

Table 8. Alder project risks, severity, and likelihood.
Team members contract the flu
With the flu season approaching and the ongoing scare of the H1N1 virus, it is likely that
one of the team members will contract the flu. While sick, team members may be unable
to work with full effort, possibly slowing the progress of the project. To reduce the spread
of the flu, sick team members should avoid meeting with the healthy members, which will
make it difficult to communicate about the project when the team cannot fully meet.
To combat this risk, each of us could get a flu vaccination from the University Health Center. Flu
vaccination lowers the risk that one of us gets the flu. Should one of the team members
get sick, they could work on the software side of the project, since that can be done at
home. Communication between team members, both sick and healthy, can be done
through email or instant messaging.
Team leader’s course load and sports schedule hampers progress
CJ Cullen (the team leader) is currently taking a full course load and is also a member of
the school’s baseball team. While he has maintained a high GPA over the past three
years while balancing school and baseball, there is a possibility that the stress of
working on the senior design project could affect both his GPA and the project.
To detect this risk, the team will constantly keep track of CJ’s sports schedule and will
make any necessary adjustments to accommodate his absence or reduced work output
should he need to leave campus for a game.
The challenge of perception is not addressed properly
Since the Create is a machine that no team members have any experience working
with, the challenge of perception exists. This challenge must be addressed if we want
our map to display properly.
To adapt to this risk, the team will frequently check the perception of the Create. Should
the perception be slightly different from previous trials, we only need to change the
distance-to-time conversion. Should the perception be completely different from previous
trials, this Create may be defective and a new one would be used.
Demo does not stand out during Founder’s Day presentation
While it may not be a required team goal, we will strive to achieve the “Best Project” title
for our Senior Design project. This requires our demo during the Founder’s Day
presentation not only to work correctly, but also to stand out from its competitors.
We believe that since our project is the only one dealing with robotics, this project has a
good chance of standing out on its own. There are a couple of things that we should do to
increase the odds of achieving the “Best Project” title. First, our project must work and it
must work well. Second, we need to give a good presentation. We will strive to make
sure that both of these objectives are met.
Algorithms for calculating direction and wall positions do not work
Creating algorithms both for deciding which direction to navigate the Create and for
determining the positions of the walls is a challenging task. We want to make sure that
the Create can navigate and obtain all the data needed to create a map whose wall
positions are correct and as similar as possible to a real mapping of the room. These
algorithms are another important objective that our team needs to satisfy in order for our
project to work.
To mitigate this risk, our team will continually work to develop algorithms that can do these
jobs efficiently, and we will continue to improve these algorithms over the course of
the project.
Resources
This section lists the resources necessary to complete MapBot.
Personnel
• CJ Cullen: Fall Team Lead
• Ryan Pila: Spring Team Lead
• Ningxuan Wang: Chief Technical Officer
Preliminary Budget

Line   Category    Description                  # of parts   Rate      Amount
1      Materials
1.1                Backup RooStick               1            $30.00    $30.00
1.2                USB->5 Pin miniUSB Cable      1            $7.00     $7.00
                   TOTAL                                      Subtotal  $37.00
Table 9. Budget
Equipment
Team Alder will need access to the iRobot Creates that belong to the engineering
department.
Facilities
Team Alder will need use of the iRobot room where the Creates are stored and charged.
Team Alder will also need access to a simple room to debug MapBot.
Chapter
7
Conclusions
MapBot is an autonomous, mobile, map-making robot. MapBot consists of a specialized
software architecture built on the iRobot Create platform in order to efficiently model the
physical environment. This document describes the functional specifications of MapBot.
This document covers the required resources in terms of the environmental and software
specifications as well as the programming platform. The project runs in an office setting
using the Create and the Sony Micro PC.
The document details the team’s vision of the overall system architecture. The architecture
is broken down into different modules, each having its own purpose in the overall
creation of a map.
Lastly, the development process, milestones, and risks are included to ensure the success
of MapBot by Founder’s Day.
Appendix A
Glossary
Soar
The Soar architecture is designed for developing systems that generate intelligent
behaviors. It models human cognitive processes including understanding and anticipation.
It uses an internal structure called working memory, modeled after human short-term
memory, to build an agent’s knowledge about its environment. It employs a process called
the decision cycle to produce actions in pursuit of goals. In this cycle, the agent receives
information from the external environment, proposes possible actions, decides on one
of them based on knowledge provided by the programmer, and applies the chosen
action to the environment. The current version is Soar 9.2.0 Beta (Soar Software
Website).
For example, in the following Soar code snippet, if there is a state named hello-world, a
print action will get proposed; if this action gets selected, it will be applied, which prints out
“Hello World.”
sp {hello-world*propose*print
   (state <s> ^name hello-world)
-->
   (<s> ^operator <o> +)
   (<o> ^name print)}

sp {hello-world*apply*print
   (state <s> ^name hello-world)
   (<s> ^operator <o>)
   (<o> ^name print)
-->
   (write |Hello World|)
   (halt)}
Procedural Memory
A long-term memory of skills consisting of sequences of procedures. It tells the agent how
to do certain things without knowing when and where it got the memory. With procedural
memory, the agent doesn’t need to know why it should follow the procedures to achieve
the goal; instead, as long as the condition of the procedural memory matches, the agent
will take the corresponding action spontaneously.
iRobot Create
The iRobot Create Programmable Robot is a preassembled mobile robot platform. It
allows users to program its movements and sounds. This robot consists of three wheels
(caster, left and right), two bump sensors (left and right), three wheeldrop sensors (caster,
left and right), an omnidirectional infrared receiver, four cliff sensors (front left, front right,
left and right), a cargo bay connector, and an external MiniDin-6 connector. The robot
uses an alkaline battery pack consisting of 12 non-rechargeable “AA” batteries, which can
power the robot for up to 1.5 hours.
RooStick
The RooStick enables a computer to serially communicate with an iRobot Create through a
USB port. Programs capable of issuing serial commands to a COM port can use the
RooStick to control the robot. The RooStick consists of a MiniDIN-7 female connector to
connect to the robot and a standard USB type A connector to connect to a computer.
iRobot Serial Command Interface (SCI)
The iRobot Serial Command Interface (SCI) is a serial protocol that consists of a set of
commands which allow users to control the robot and monitor its sensors through its
external MiniDin connector. Users can send commands to the robot to actuate its wheels
and steering mechanism and to request its sensor data. Such commands are drive, song,
sensors, and so on. A command is a sequence of bytes in which the first byte indicates the
type of the command and the remaining bytes are arguments in which users include
specifications such as velocity, time duration, and radius. For example, “137 1 44 128 0” is
a sequence of bytes that commands the Create to drive straight at 300 mm/sec. Sensor
data is polled in the form of sensor packets that contain data collected by the robot’s
sensors such as the bump sensors. A sensor packet is a byte in which each bit
corresponds to the state of a sensor. For example, if a byte is 00000001, the rightmost bit
1 indicates that the right bump sensor has detected a horizontal bump.
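
To make the byte layout concrete, the C++ sketch below composes the Drive sequence quoted above and tests the right-bump bit of a sensor packet. The serial helpers serialWrite() and serialReadByte() are hypothetical placeholders for whatever COM-port library talks to the RooStick, and the Sensors opcode (142) and bumps-and-wheel-drops packet id (7) are taken from the SCI documentation as we understand it.

// Minimal sketch (not the project's actual code) of composing SCI byte
// sequences.  serialWrite() and serialReadByte() are hypothetical helpers.
#include <cstdint>
#include <vector>

void serialWrite(const std::vector<uint8_t>& bytes);   // hypothetical
uint8_t serialReadByte();                               // hypothetical

// Drive command: opcode 137 followed by velocity (mm/s) and radius (mm),
// each as a signed 16-bit value, high byte first.
// driveStraight(300) produces the example sequence 137 1 44 128 0.
void driveStraight(int16_t velocityMmPerSec)
{
    const uint16_t v = static_cast<uint16_t>(velocityMmPerSec);
    const uint16_t radius = 0x8000;                     // special value: drive straight
    serialWrite({ 137,
                  static_cast<uint8_t>(v >> 8), static_cast<uint8_t>(v & 0xFF),
                  static_cast<uint8_t>(radius >> 8), static_cast<uint8_t>(radius & 0xFF) });
}

// Request the bumps-and-wheel-drops packet and test bit 0, which the text
// above identifies as the right bump sensor (e.g. 00000001 -> right bump).
bool rightBumpDetected()
{
    serialWrite({ 142, 7 });                            // Sensors command, packet 7
    const uint8_t packet = serialReadByte();
    return (packet & 0x01) != 0;
}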
Microsoft Windows Graphics Device Interface (GDI)
The Microsoft Windows graphics device interface (GDI) is a class-based application
programming interface in C/C++. It allows users to display graphics and formatted text in
Windows-based applications. It can be used to draw lines and curves and to adjust fonts
and colors.
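
As a minimal sketch (not MapBot's actual drawing code), the fragment below draws one wall segment with classic Win32 GDI calls inside a window's WM_PAINT handler; the window handle and the coordinates are placeholders chosen for this example.

// Minimal sketch, for illustration only: draw one wall segment with GDI.
#include <windows.h>

void paintWallSegment(HWND hwnd)
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hwnd, &ps);

    // Create a solid black pen, 2 pixels wide, and select it into the DC.
    HPEN pen = CreatePen(PS_SOLID, 2, RGB(0, 0, 0));
    HGDIOBJ oldPen = SelectObject(hdc, pen);

    // Draw a line from (50, 50) to (250, 50), e.g. one guessed wall.
    MoveToEx(hdc, 50, 50, NULL);
    LineTo(hdc, 250, 50);

    // Restore the previous pen and release resources.
    SelectObject(hdc, oldPen);
    DeleteObject(pen);
    EndPaint(hwnd, &ps);
}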