MapMaker Robot - Swiss Talent Forum

Studienwoche Faszination Informatik 2013
MapMaker Robot
Gwendolin Rohner, Patrick Balestra
Tutor: Dr. Ivo Blöchliger
Task:
Our aim was to build a robot that perceives its environment and draws a map of it. To realize this we used an NXT brick from Lego MINDSTORMS, which we programmed in Java. The whole project has three parts: building the robot, writing the program for the robot, and writing the program for the computer so that it can draw a map from the information sent by the robot. The robot itself should drive around without crashing into obstacles and send its positions to the computer.
Materials:
- Eclipse: IDE
- Java: programming language
- leJOS: library
- Lego MINDSTORMS NXT: robot hardware (ultrasonic sensors, sound sensor/microphone, lights, motors) and the NXT brick

[Figure: Our robot with two ultrasonic sensors, a microphone, lights and two motors]
Procedure:
We built a robot out of a Lego MINDSTORMS NXT kit. First we defined the skills it should eventually have, and then we decided which sensors would be necessary to reach our aim. For driving we certainly needed two motors. To make sure the robot does not crash into walls and can detect obstacles, we installed an ultrasonic sensor on the top (later we added another one on the left side). In the beginning we also experimented with touch sensors, but during the development of the program we noticed that they are not sensitive enough: the robot had to hit obstacles far too hard before the "touch sensor is pressed" signal was triggered.
The code was written in Java in the Eclipse IDE, using the leJOS library. We started by getting familiar with the sensors and the motors, so that we learned how they work and how to control them. We received an example program in which the robot communicates with the computer via Bluetooth, and we modified it to suit our needs. Now the robot could send its information and measurements to the computer (and vice versa).

[Figure: System of drawing a map with the robot]
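On the NXT side, such an exchange can be sketched roughly as follows with the leJOS API (a minimal sketch only, not our actual program; the class name and the values sent here are purely illustrative):

import java.io.DataOutputStream;
import lejos.nxt.comm.BTConnection;
import lejos.nxt.comm.Bluetooth;

// Minimal sketch: the NXT waits for the PC to connect over Bluetooth
// and then streams its current pose to the map-drawing program.
public class PoseSender {
    public static void main(String[] args) throws Exception {
        BTConnection connection = Bluetooth.waitForConnection(); // blocks until the PC connects
        DataOutputStream out = connection.openDataOutputStream();

        // Example values; in a real program these would come from the odometry.
        float x = 0f, y = 0f, angle = 90f;

        out.writeFloat(x);      // current x position in cm
        out.writeFloat(y);      // current y position in cm
        out.writeFloat(angle);  // heading in degrees
        out.flush();

        out.close();
        connection.close();
    }
}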
After this we started to code the driving skills of the robot. It was supposed to drive in steps of five centimetres until there was an obstacle in its way, and then turn left.
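That driving behaviour can be sketched roughly like this (a simplified sketch: the sensor port, the motor assignment and all constants are illustrative values, not the ones we actually tuned):

import lejos.nxt.Motor;
import lejos.nxt.SensorPort;
import lejos.nxt.UltrasonicSensor;

// Rough sketch of the driving behaviour: move in 5 cm steps and turn
// left as soon as the front ultrasonic sensor reports an obstacle.
// Runs until the program is stopped on the brick.
public class StepDriver {
    // Illustrative constants; the real values depend on wheel size, gearing and wiring.
    static final int DEG_PER_CM = 20;   // motor degrees for 1 cm of travel
    static final int STEP_CM = 5;       // step length in cm
    static final int OBSTACLE_CM = 20;  // "wall ahead" threshold in cm
    static final int TURN_DEG = 180;    // motor degrees for roughly a 90-degree left turn

    public static void main(String[] args) {
        UltrasonicSensor front = new UltrasonicSensor(SensorPort.S1);

        while (true) {
            if (front.getDistance() > OBSTACLE_CM) {
                // Drive one 5 cm step: both wheels forward by the same amount.
                Motor.A.rotate(STEP_CM * DEG_PER_CM, true);
                Motor.B.rotate(STEP_CM * DEG_PER_CM);
            } else {
                // Obstacle ahead: turn left on the spot by counter-rotating the wheels.
                Motor.A.rotate(-TURN_DEG, true);
                Motor.B.rotate(TURN_DEG);
            }
        }
    }
}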
The main aim was to draw a map. For this we gave the robot a starting x/y position and a starting angle and then added up the movements it makes. Each step the robot drives is a vector, so we split it with sine and cosine into an x and a y component. Whenever the robot turns, the program updates the angle as well. Using the x and y position we draw the robot as a black square on the screen. For the obstacle points we add the distance measured by the ultrasonic sensors (again split into its x and y components) and finally draw them as red and yellow points on the map.
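The dead-reckoning step behind this can be written down as a small sketch (class and variable names are ours, chosen for illustration):

// Simplified dead-reckoning sketch: update the robot's pose after one
// step and compute the obstacle point reported by the front sensor.
public class Pose {
    double x, y;       // position in cm
    double angleDeg;   // heading in degrees, 0 degrees = positive x axis

    // Called after the robot has driven stepCm in its current direction.
    void moveForward(double stepCm) {
        double rad = Math.toRadians(angleDeg);
        x += stepCm * Math.cos(rad);  // x component of the movement vector
        y += stepCm * Math.sin(rad);  // y component of the movement vector
    }

    // Called after every turn so the heading stays up to date.
    void turn(double deltaDeg) {
        angleDeg += deltaDeg;
    }

    // Obstacle seen by the front sensor at distanceCm straight ahead.
    double[] frontObstacle(double distanceCm) {
        double rad = Math.toRadians(angleDeg);
        return new double[] {
            x + distanceCm * Math.cos(rad),
            y + distanceCm * Math.sin(rad)
        };
    }
}

For the sensor on the left side, the same calculation is used with the heading shifted by about 90 degrees.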
Results:
In the end we got a map created by the robot. The walls are recognizable, but a lot of work still has to be done to make the map more precise. The robot drives on its own with almost no problems and avoids the walls. We built the program out of seven Java classes, three for the robot and four for the computer, which interact with each other. In addition, we installed some lights that simply glow and a microphone that lets the robot start driving as soon as the noise level (for example from clapping) rises above a certain threshold.
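This start-on-clap trigger can be sketched like this (sensor port and threshold are illustrative; the leJOS SoundSensor reports the level as a percentage of its full scale):

import lejos.nxt.SensorPort;
import lejos.nxt.SoundSensor;

// Sketch of the start trigger: wait until the sound level read by the
// microphone exceeds a threshold (e.g. a clap), then start driving.
public class ClapStart {
    public static void main(String[] args) throws InterruptedException {
        SoundSensor mic = new SoundSensor(SensorPort.S2);
        int threshold = 60;  // illustrative value, in percent of the sensor's range

        while (mic.readValue() < threshold) {
            Thread.sleep(20);  // poll the microphone every 20 ms
        }
        // ... start the driving loop here ...
    }
}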
Discussion:
Our task is about 80% solved. The robot drives quite well and the map is recognizable. The communication between robot and computer also works very well. We successfully combined the different skills in the classes, so the robot is able to do what is demanded of it. Our own math skills were very useful for calculating the x and y coordinates.
[Figure: Map drawn from the robot's information. Black: robot's path; red: sensor on the top; yellow: sensor on the left side]
The map is not as precise as we wanted it to be: instead of a square we rather get a circle. Our supposed reason is that the ultrasonic sensor can only detect walls which are at a right angle (90°) to the robot. The signals sent by the sensor cannot be reflected back to it if the wall stands at a different angle. We would either need to make sure that the robot always faces a wall at 90° when approaching it, or we could work with additional sensors.
Furthermore, each movement of the robot introduces a small measurement error, and these errors add up with every move. So in the beginning the drawn map is quite precise and then becomes more and more inaccurate. If we had more time, we could try to enable the robot to get its bearings from fixed points (e.g. lights or black marks on the ground) to solve this accuracy problem.
Conclusion:
We found some solutions for our map-making problems, but we did not have enough time to realize them. Our robot is not perfectly finished yet and still needs some development time; nevertheless, the whole project taught us a lot of interesting things and gave us some moments of success.
Acknowledgements:
We would like to thank Mr. Blöchliger for his help and his medical care for our robot. We also thank the University of Fribourg for making it possible for us to work at their university and for showing us some perspectives for our future.
We would also like to thank our schools for allowing us to participate in this computer science week during school time, in particular Prof. Omar Gianora and Prof. Laura Rulli from the Liceo cantonale di Bellinzona and the Kantonsschule Baden.
And last but not least we want to thank Schweizer Jugend forscht for enabling this great, intensive and special learning week!