
International Journal of Applied Engineering Research ISSN 0973-4562 Volume 11, Number 14 (2016) pp 8259-8264
© Research India Publications. http://www.ripublication.com
Development of Underwater Terrain Map Building Method on Polar
Coordinates by Using 3D Sonar Point Clouds
Eon-ho Lee and Sejin Lee
MS Course Student, Assistant Professor,
Division of Mechanical and Automotive Engineering,
Kongju National University, Cheonan-si, Chungcheongnam-do, South Korea.
Abstract
This paper proposes an underwater terrain map building method based on occupancy grids, using a 3D point cloud from a polar sonar sensor system. Because current 3D underwater sonar sensors are expensive to use, low-cost 3D sensor systems are needed for general-purpose applications. A polar-style sonar sensor system was therefore designed and implemented, constructed from a 2D multi-beam sonar image sensor mounted on a rotating motor. We developed a depth map model based on an occupancy probability grid map using a Bayesian updating model. This approach can reduce the noise caused by the fundamental characteristics of a sonar sensor. An experiment was conducted to verify the validity and usefulness of the proposed approach.
Keywords: Underwater Sonar Sensor, Terrain Depth Map,
Bayesian Rule, Occupancy Grids, 3D Point Cloud.
INTRODUCTION
The year before last, in South Korea, a passenger ferry named Sewol, carrying 420 people, sank into the ocean, and as a result more than 300 people died. For months, the government had a difficult time searching for the bodies of the victims. One of the two main challenges was that the victims' bodies were washed away over time by the strong current, and it later became very difficult to locate them in the murky water [1]. In 2013, a citizen committed suicide by jumping from a bridge into the Han River, which runs through Seoul, the capital city of South Korea, and is known for its murky, fast-running water. To recover the body of the citizen, a number of resources were deployed, including a special-purpose helicopter for over-the-water search-and-rescue operations. The search operation continued for the next 4 days; however, the body was not recovered, as it stayed in mid-water and could not be seen by rescuers through the murky water [2]. According to a recent report on the status of underwater search equipment by the national fire service, as of 2009 only 33 out of 2,000 fire stations (1.44%) in South Korea possessed any kind of underwater search-and-rescue equipment.
The importance of underwater exploration has increased for the above reasons. The existing (reasonably priced and thus widely used) equipment such as NAVIGATOR [3] suffers from an inherent physical restriction of the sonar: the user can only recognize his or her surroundings within a reconstructed 2D space. On the other hand, the cost of high-end 3D sonars, e.g. the ARIS sonars of Sound Metrics, is so high that it is not practical to attach them to multiple robotic vehicles. Thus, providing 3D observability with underwater search equipment based on a 2D sonar sensor is of great practical value for constructing a rapidly deployable system for searching for people and objects in mid-water. In this research, therefore, we build a scanning sonar sensor system with the M900-90 of TELEDYNE BlueView to secure both 3D observability and practicality. The M900-90, a 2D multi-beam imaging sonar, is mounted on a motor and rotated; the two are synchronized to take accurate 3D range point cloud data, as shown in Figure 1.

Figure 1: Polar sonar sensor system with Blueview M900 and Dynamixel-Pro
Research on the three-dimensional mapping of underwater
terrain using sensors is an important factor in solving the
aforementioned issues, and the most common sensors used
underwater are ultrasonic sensors. Jakuba and Yoerger
developed an unmanned underwater vehicle called the
Autonomous Benthic Explorer (ABE), which mounted the
Simrad Mesotech SM2000 ultrasonic sensor, and used it to
create the feature maps of the Explorer Ridge and Lost City
[4]. Langer and Hebert proposed a method for reconstructing
an elevation map of Juan de Fuca Ridge’s submarine
topography using backscatter images of the ultrasound data
[5]. Fairfield’s research team introduced a three-dimensional
simultaneous localization and mapping (SLAM) technique
that uses ultrasonic sensors to navigate a submersible during
the exploration of the underwater cave Zacaton Cenote [6].
Williams proposed a method for using ultrasonic sensors to
extract the noticeable natural landmarks and then using the
collected data to track the position of the submersible [7]. Lee et al. proposed an advanced sonar sensor model capable of building an underwater terrain map based on the Bayes filter in the manner of the occupancy grid. The sonar sensor model considers the losses of the received ultrasound intensity according to underwater absorption and reflection on a terrain surface, and the Bayes filter including the advanced sonar sensor model is applied to build an occupancy grid map. The underwater sonar data are acquired by a pencil-beam-type sonar sensor mounted on an unmanned surface vehicle (USV) [8]. Table 1 lists the major studies on creating ultrasonic sensor-based underwater terrain maps.
Underwater Sonar Sensor
A sonar sensor generates ultrasonic waves and receives their echoes, as shown in Figure 3. The returned signals include a large amount of noise caused by reflection effects and the wide beam width, which increases the uncertainty of the data set. This problem is therefore handled by a stochastic method.
Table 1: Major Studies on Building Ultrasonic Sensor-based Underwater Terrain Maps

Research | Sonar Beam Shape | Map   | Resolution / Coverage (m) | Application
[4]      | Pencil           | Grid  | 2 / 2000                  | Terrain survey
[5]      | Wide             | Grid  | 0.25 / 112                | Terrain survey
[6]      | Pencil           | Grid  | 0.5 / 350                 | SLAM
[7]      | Pencil           | Point | - / 40                    | Terrain-based localization
[8]      | Pencil           | Grid  | 0.4 / 150                 | Terrain survey
[9]      | Wide             | Point | - / 200                   | Terrain-based SLAM
[10]     | Wide             | Point | - / 1000                  | Terrain-based SLAM
Figure 3: A concept of a sonar sensor
Table 2: Specification of the Blueview M900

Property       | Capability
Frequency      | 900 kHz
Beam Width     | 1° × 20°
Max. Range     | 1-100 m
Size           | 193 × 102 × 102 mm
Weight in Air  | 1,814 g
Supply Voltage | 12-48 V DC
This paper proposes a depth map representation method to visualize underwater geographic features. The 3D point clouds acquired underwater [11], [12] are refined by using occupancy probability grids [13]. Section 3 presents the results of the experiment, and finally, Section 4 draws conclusions based on the results of the experiment.

OCCUPANCY GRID MAP
Occupancy grid mapping is one of the methods for building an environmental map from sensor data whose distances are uncertain and mixed with noise. The environment through which the robot moves is divided into three-dimensional grids, and the terrain map is built by continuously updating the object's occupation probability within each grid cell using the distance sensor data [14, 15]. Figure 2 shows the flow chart of the algorithm proposed in this study. There are three steps: achieving the 3D point cloud, creating the occupancy grid, and representing the depth map.

Grid Map Building
The probability of an occupancy grid map can be expressed as (1):

p(m | x_{1:t}, z_{1:t})   (1)

In (1), m represents the grid that forms the map and consists of N cells, as in m_1, m_2, ..., m_i, ..., m_N, where m_i, the i-th cell, holds the occupancy probability of an object. The occupancy status of a grid cell is presented as a probability value between 0 and 1, and this probability is indicated as p(m_i). In (1), x_{1:t} indicates the location of the robot (USV). The directional angle of the robot and the x and y coordinates are used to indicate the location of the robot in this study; these data are obtained as GPS values. x_{1:t} refers to all the robot positions from time step 1 to t, whereas z_{1:t} is all the ultrasonic sensor distance measurements from time step 1 to t. In other words, the occupancy probability of the current grid can be calculated from the positions of the robot from the start of the operation to its current location, along with the corresponding ultrasound distance measurements.

Generally, a complicated calculation is required to solve (1). Therefore, to simplify the calculation, it was assumed that all the grid cells are independent of each other. Under this independence assumption, (1) can be simplified as follows:

p(m | x_{1:t}, z_{1:t}) = ∏_{i=1}^{N} p(m_i | x_{1:t}, z_{1:t})   (2)
Occupancy Probability Calculation
Each grid cell is assumed to be independent of the others, and only the occupancy probability of each cell is calculated. The Bayes filter is applied to (2) in order to represent the equation as a recursive equation applicable as an algorithm. Conditioning (2) on x_{1:t} and z_{1:t-1} gives:

p(m_i | x_{1:t}, z_{1:t}) = p(m_i | x_{1:t}, z_{1:t-1}, z_t)   (3)

Figure 2: The flow chart of the proposed algorithm
In (4), the Bayes rule is applied to the equation above as follows:

p(m_i | x_{1:t}, z_{1:t-1}, z_t) = p(z_t | m_i, x_t) p(m_i | x_{1:t}, z_{1:t-1}) / p(z_t | x_{1:t}, z_{1:t-1})   (4)

In the following equation, the Bayes rule is applied again to the measurement term of (4), relating the robot's state variables and the predicted sensor value for a grid cell:

p(z_t | m_i, x_t) = p(m_i | x_t, z_t) p(z_t | x_t) / p(m_i | x_t)   (5)

Here, p(m_i | x_t) is the probability that indicates the occupancy of each grid cell given the state of the robot. However, the occupancy of each grid cell is independent of the robot's state variables, because the robot cannot gain any information about the environment without receiving sensor values; hence p(m_i | x_t) = p(m_i). Therefore, the probability of the state of the cell can be expressed as follows:
pmi | x1:t , z1:t  
X
Figure 4: Sensor footprint shape and its variables
pmi | xt , zt  pzt | xt  pmi | x1:t -1 , z1:t 1 
pmi 
pzt | x1:t , z1:t 1 
Occupancy Probability Calculation
The sensor model value in (9) depends on the probabilities
with respect to distance ri, angle θi. This section explains the
probabilities with respect to distance ri and angle θi.
R in Figure 5 is the distance from the sensor position to the
location of an object recognized by the sensor. The range of
±ε from R, which is the distance recognized by the sensor, is
recognized as an occupied area, and the range below R-ε (i.e.,
below Rmin) is an empty area. The center of the sensor
measuring direction with a θ value of 0 in Figure 6 has a
probability of 1, and the probability finally decreases to 0 as
the object becomes further away from the center. This
equation is expressed as follows:
(6)
Using odds will simplify the calculation when creating a
regression equation using the Bayes filter, and consequently,
the application of the log odds ratio produces the following
equations:
pmi | x1:t , z1:t 
pmi | xt , zt 
pmi | x1:t-1 , z1:t 1  1  pmi 

pmi | x1:t , z1:t  1  pmi | xt , zt  1  pmi | x1:t -1 , z1:t 1  pmi 
(7)
Oddsmi | x1:t , z1:t   lt mi   log
rj
pmi | xt , zt 
 lt 1 mi   l0 mi 
1  pmi | xt , zt 
(8)
In other words, the log odds ratio of the current t step in
relation to a grid mi can be calculated through the log odds
ratio in relation to mi at the t-1 step, the log odds ratio in
relation to mi at the initial step, and from the sensor model.
The following section explains the underwater ultrasonic
sensor model in detail.
2

r R

 pocc ri   1   i
 2ε 


r2
pemp ri   i 2

2Rmin

2

θ 
pθi   1   

W 

Underwater Ultrasonic Sensor Model
The expression log(p(mi|xt, zt))/(1-p(mi|xt, zt)) must be
interpreted in order to calculate (8), which was obtained by
applying the log odds ratio in the previous section. Figure 1
shows that probability p(mi|xt, zt) can belong to an occupied
area or an empty according to the location of cell i depending
on the measurement conditions. Pocc is used to calculate
probability p(mi|xt, zt) when cell i belongs to an occupied area,
and Pemp is used when it is an empty area. Pocc and Pemp can be
expressed as follows:
 pocc  pocc ri   pθi 
(9)

 pemp  pemp ri   pθi 
Rmin  ri  Rmax 
0  ri  Rmin 
 W  θ  W 
(10)
Here, W is the value of the angle from the center of the sensor
measurement direction to the end of the maximum angle.
Therefore, W is the same as the maximum value of θ.
p(r)
1
0.5
Pocc(ri) in (9) is a probability calculated based on the distance
from the location of the sensor to cell i when cell i is within
the occupied area, and Pemp(ri) is the probability calculated
based on the distance from the location of the sensor to cell i
when cell i is within an empty area. p(θi) is the probability
calculated based on the angle of cell i to the center of the
sensor measurement direction.
2ε
0
R min R R max
r
Figure 5: Occupancy probability according to the distance
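Equations (8)-(10) can be combined into a per-cell update routine. The Python sketch below is illustrative only: the parameter values for ε, W, and the clipping bound are assumptions, not values from the paper. It computes the inverse sensor model of (9)-(10) for one cell and applies the log odds update of (8):

```python
import math

EPS = 0.5               # epsilon: half-width of the occupied band around R (assumed value)
W = math.radians(10.0)  # maximum half-angle of the beam (assumed value)

def p_occ_r(r_i, R):
    # Eq. (10): occupancy probability by distance inside the occupied band
    return 1.0 - ((r_i - R) / (2.0 * EPS)) ** 2

def p_emp_r(r_i, R_min):
    # Eq. (10): occupancy probability by distance inside the empty region
    return r_i ** 2 / (2.0 * R_min ** 2)

def p_theta(theta):
    # Eq. (10): angular term, maximal at the beam centre, zero at +/-W
    return 1.0 - (theta / W) ** 2

def inverse_sensor_model(r_i, theta, R):
    # Eq. (9): combine the distance and angle terms for cell i,
    # choosing p_occ or p_emp depending on where the cell lies.
    R_min = R - EPS
    if r_i < R_min:
        return p_emp_r(r_i, R_min) * p_theta(theta)
    return p_occ_r(r_i, R) * p_theta(theta)

def log_odds(p):
    return math.log(p / (1.0 - p))

def update_cell(l_prev, r_i, theta, R, l0=0.0):
    # Eq. (8): l_t = log-odds of the sensor model + l_{t-1} - l_0
    p = min(max(inverse_sensor_model(r_i, theta, R), 1e-6), 1.0 - 1e-6)
    return log_odds(p) + l_prev - l0
```

Repeated measurements of the same cell accumulate in its log odds value, so cells inside the return band drift toward occupied while cells in front of the return drift toward empty, which is how the filter suppresses individual noisy readings.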

Figure 6: Occupancy probability according to the angle
Polar Coordinates
In this study, a point in three-dimensional polar coordinates is represented by the measured distance z along the vertical axis, the beam angle ϕ in the y axis, and the rotation angle of the motor, θ, measured counter-clockwise in the x axis, as shown in Figure 7.
Figure 9: Raw data and resulting grid maps: (a) 3D raw data; (b) occupancy probability grids; (c) top view of raw data; (d) depth map
Figure 9(a) shows the 3D raw data of the rotating sonar sensor in polar coordinates. Figure 9(b) shows the occupancy probability grid map built from the 3D raw sonar data. Figure 9(c) shows the raw data on the θ-ϕ plane. Figure 9(d) is the image that represents the distance data by color on the θ-ϕ plane. As indicated by the red ellipses in Figures 9(c) and (d), the blue dots marked inside the red ellipse in Figure 9(c) are gone in Figure 9(d): although they were noise data from the sonar sensor, they disappeared in Figure 9(d) thanks to the occupancy grid mapping technique. This data-noise problem was handled by the occupancy probability grid map algorithm, in which each grid probability value is renewed by the Bayesian updating rule.
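The depth map of Figure 9(d) can be sketched as a reduction over the occupancy grid: for each (θ, ϕ) direction, the nearest grid cell whose occupancy probability exceeds a threshold is taken as the terrain return, and directions with no confident cell are left empty, which is how isolated sonar noise drops out. The function below is an illustrative sketch; the grid layout, names, and threshold are assumptions, not the paper's code.

```python
def depth_map(grid, r_resolution, threshold=0.7):
    """For each (theta, phi) direction, take the nearest range bin whose
    occupancy probability exceeds the threshold as the terrain depth.

    grid: nested lists indexed as grid[theta][phi][range_bin], holding
    occupancy probabilities (assumed layout for illustration)."""
    depth = []
    for theta_col in grid:
        row = []
        for phi_col in theta_col:
            d = None
            for k, p in enumerate(phi_col):
                if p > threshold:
                    d = k * r_resolution  # convert bin index to a distance
                    break
            row.append(d)  # None where no confident return (noise suppressed)
        depth.append(row)
    return depth
```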
Figure 7: Polar coordinates
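The conversion from the polar measurements of Figure 7 to Cartesian points can be sketched as follows. The axis convention below (forward x axis, fan angle ϕ, motor rotation θ about the forward axis) is an assumption for illustration; the actual convention depends on how the motor and sonar are mounted.

```python
import math

def polar_to_cartesian(r, theta, phi):
    # r: measured range along the beam
    # phi: beam angle within the 2D sonar fan (assumed convention)
    # theta: motor rotation angle about the forward x axis (assumed convention)
    x = r * math.cos(phi)
    y = r * math.sin(phi) * math.cos(theta)
    z = r * math.sin(phi) * math.sin(theta)
    return (x, y, z)

def build_point_cloud(measurements):
    # measurements: iterable of (r, theta, phi) tuples from the rotating sonar
    return [polar_to_cartesian(r, th, ph) for (r, th, ph) in measurements]
```

Each (r, θ, ϕ) triple maps to exactly one Cartesian point, so a full motor sweep of 2D fan scans yields the 3D point cloud used as input to the occupancy grid.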
Experimental Results
Figure 8 shows the polar sonar sensor system, which is capable of acquiring a 3D point cloud.
Figure 8: Polar sonar sensor system with USV
Figure 10: Occupancy probability grid maps of the real data: (a) grids 1 at USV position 1 without object; (b) grids 2 at USV position 1 without object; (c) grids at USV position 2 without object; (d) grids at USV position 1 & cube position 1; (e) grids at USV position 1 & cube position 2; (f) grids at USV position 2 & cube position 2; (g) grids at USV position 2 & box position 1; (h) grids at USV position 2 & box position 2; (i) grids at USV position 1 & box position 2

Figure 10 shows the occupancy probability grid maps created from the raw data. The 3D point clouds were acquired in a water tank with a depth of 8 m. Two objects, a cube and a box, were placed in the water tank for the experiment; their sizes were 1 m × 1 m × 1 m and 1 m × 0.5 m × 1 m, respectively. The USV, the cube, and the box were moved while the 3D point clouds were acquired. In Figure 10(a), the green circle marks a structure in the water tank, the blue circle marks the floor, and the black circle marks a wall; the orange circle in Figure 10(e) marks the cube or box. Figures 10(a)-(c) are the occupancy probability grid maps without the box or the cube: the USV was at position 1 in Figures 10(a) and 10(b), and at position 2 in Figure 10(c). Figures 10(d)-(f) are the maps with the cube: the USV was at position 1 in Figures 10(d) and 10(e) and at position 2 in Figure 10(f), while the cube was at position 1 in Figure 10(d) and at position 2 in Figures 10(e) and 10(f). Figures 10(g)-(i) are the maps with the box: the USV was at position 2 in Figures 10(g) and 10(h) and at position 1 in Figure 10(i), while the box was at position 1 in Figure 10(g) and at position 2 in Figures 10(h) and 10(i).

Figure 11: Depth maps of the real data: (a) USV position 1; (b) USV position 1; (c) USV position 2; (d) USV position 1 & cube position 1; (e) USV position 1 & cube position 2; (f) USV position 2 & cube position 2; (g) USV position 2 & box position 1; (h) USV position 2 & box position 2; (i) USV position 1 & box position 2
Figure 11 shows the depth maps created from the occupancy probability grid map data. As mentioned, the green, blue, black, and orange circles mark a structure, the floor, a wall, and the cube or box, respectively. Figures 11(a)-(c) are the depth maps without the box or the cube: the USV was at position 1 in Figures 11(a) and 11(b), and at position 2 in Figure 11(c). Figures 11(d)-(f) are the depth maps with the cube: the USV was at position 1 in Figures 11(d) and 11(e) and at position 2 in Figure 11(f), while the cube was at position 1 in Figure 11(d) and at position 2 in Figures 11(e) and 11(f). Figures 11(g)-(i) are the depth maps with the box: the USV was at position 2 in Figures 11(g) and 11(h) and at position 1 in Figure 11(i), while the box was at position 1 in Figure 11(g) and at position 2 in Figures 11(h) and 11(i).
As shown in Figures 10 and 11, the evidence of the cube and the box can be seen in the occupancy grid maps. The evidence can appear differently according to the object's position, even though it indicates the same object. Moreover, the sensor noise was reduced by using the Bayes filter.
CONCLUSION
This paper presented an underwater terrain map building method based on occupancy grids, using a 3D point cloud from a polar sonar sensor system. Because current 3D underwater sonar sensors are expensive to use, low-cost 3D sensor systems are needed for general-purpose applications. A polar-style sonar sensor system was therefore designed and implemented, constructed from a 2D multi-beam sonar image sensor mounted on a rotating motor. We developed a depth map model based on an occupancy probability grid map using a Bayesian updating model. This approach can reduce the noise caused by the fundamental characteristics of a sonar sensor. An experiment was conducted to verify the validity and usefulness of the proposed approach. In future work, a deep learning algorithm will be applied for the classification task.
ACKNOWLEDGEMENT
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2015R1C1A1A01054724).
REFERENCES
[1] http://www.koreaherald.com/view.php?ud=20140501000824
[2] http://www.hani.co.kr/arti/english_edition/e_national/599060.html
[3] http://www.sharkmarine.com/Products/HandHeld/Nav.html
[4] M. Jakuba and D. Yoerger, High-resolution multibeam sonar mapping with the Autonomous Benthic Explorer (ABE), Proc. of the 13th Unmanned Untethered Submersible Technology Conference, Durham, NH, 2003.
[5] D. Langer and M. Hebert, Building Qualitative Elevation Maps From Side Scan Sonar Data For Autonomous Underwater Navigation, Proc. of the IEEE International Conference on Robotics and Automation, Sacramento, CA, 1991, 2478-2483.
[6] N. Fairfield, G. Kantor, and D. Wettergreen, Real-time SLAM with octree evidence grids for exploration in underwater tunnels, Journal of Field Robotics, 24(1-2), 2007, 3-21.
[7] S. B. Williams, P. Newman, J. Rosenblatt, G. Dissanayake, and H. Durrant-Whyte, Autonomous underwater navigation and control, Robotica, 19, 2001, 481-496.
[8] S. Lee, D. Kim, and A. O. Tokuta, Development of Advanced Sonar Sensor Model for Underwater Terrain Mapping based on Occupancy Grids, International Journal of Applied Engineering Research, 10(5), 2015, 3979-3982.
[9] P. Newman, J. Leonard, and R. Rikoski, Towards constant-time SLAM on an autonomous underwater vehicle using synthetic aperture sonar, Proc. of the 11th Int. Symp. on Robotics Research, Siena, Italy, 2005, 409-420.
[10] I. T. Ruiz, S. Raucourt, Y. Petillot, and D. M. Lane, Concurrent Mapping and Localization Using Sidescan Sonar, IEEE Journal of Oceanic Engineering, 29(2), 2004, 442-456.
[11] The Navy Unmanned Undersea Vehicle (UUV) Master Plan, US Navy, 2004.
[12] Defense Robot Master Plan, Korea Agency for Defense Development, 2005.
[13] Report on Unmanned System for the Sea, Korea Agency for Defense Development, 2006.
[14] Yun-Kyu Choi and Se-Jin Lee, Development of Advanced Sonar Sensor Model using Data Reliability and Map Evaluation Method for Grid Map Building, Journal of Mechanical Science and Technology, 29(2), 2015, 485-491.
[15] Jong-Hwan Lim and Chul-Ung Kang, Grid-Based Localization of a Mobile Robot Using Sonar Sensors, Journal of Mechanical Science and Technology, 16(3), 2002, 302-309.