Research Article
Journal of Sensors
Volume 2016, Article ID 8594096, 16 pages
http://dx.doi.org/10.1155/2016/8594096
Vision-Based Autonomous Underwater Vehicle Navigation in
Poor Visibility Conditions Using a Model-Free Robust Control
Ricardo Pérez-Alcocer,¹ L. Abril Torres-Méndez,² Ernesto Olguín-Díaz,² and A. Alejandro Maldonado-Ramírez²

¹ CONACYT-Instituto Politécnico Nacional-CITEDI, 22435 Tijuana, BC, Mexico
² Robotics and Advanced Manufacturing Group, CINVESTAV Campus Saltillo, 25900 Ramos Arizpe, COAH, Mexico

Correspondence should be addressed to L. Abril Torres-Méndez; abriltorresm15@gmail.com
Copyright © 2016 Ricardo Pérez-Alcocer et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as sensors for control feedback is commonplace. However, robotic vision-based tasks are still not widely considered for underwater applications, as the images captured in this type of environment tend to be blurred and/or color depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.
submarine tasks such as navigation, surveying, and mapping. The system is robust in its mechanical structure and software components. Localization has also been addressed with vision systems. In [15] a vision-based localization system for an AUV with limited sensing and computation capabilities was presented. The vehicle pose is estimated using an Extended Kalman Filter (EKF) and a visual odometer. The work in [16] presents a vision-based underwater localization technique in a structured underwater environment. Artificial landmarks are placed in the environment and a visual system is used to identify the known objects. Additionally, a Monte Carlo localization algorithm estimates the vehicle position.

Several works for visual feedback control in underwater vehicles have been developed [17–28]. In [17], the authors present a Boosting algorithm used to identify features based on color. This method takes as input images in the RGB color space; a set of classifiers is trained offline to segment the target object from the background, and the visual error is used as the input signal for a PID controller. In a similar way, a color-based classification algorithm is presented in [18]. This classifier was implemented using the JBoost software package in order to identify buoys of different colors. Both methods require an offline training process, which is a disadvantage when the environment changes. In [19], an adaptive neural network image-based visual servo controller is proposed; this control scheme allows placing the underwater vehicle in a desired position with respect to a fixed target. In [20], a self-triggered position-based visual servo scheme for the motion control of an underwater vehicle was presented. The visual controller is used to keep the target in the center of the image, with the premise that the target always remains inside the camera field of view. In [21], the authors present an evolved stereo-SLAM procedure implemented on two underwater vehicles. They computed the pose of the vehicle using a stereo visual system, and navigation was performed following a dynamic graph. A visual guidance and control methodology for a docking task is presented in [22]. Only one high-power LED light was used for AUV visual guidance, without distance estimation. The visual information and a PID controller were employed to regulate the AUV attitude. In [23], a robust visual controller for an underwater vehicle is presented. The authors implemented genetic algorithms in a stereo visual system for real-time pose estimation, which was tested in environments under air bubble disturbance. In [24], the development and testing of a visual system for buoy detection is presented. This system used the HSV color space and the Hough transform in the detection process. These algorithms require adjusting internal parameters depending on the work environment, which is a disadvantage. In general, the visual systems used in these papers were configured for a particular environment, and when the environmental characteristics change it is necessary to readjust some parameters. In addition, robust control schemes were not proposed for attitude regulation.

In this work, a novel navigation system for an autonomous underwater vehicle is presented. The navigation system combines a visual controller with an inertial controller in order to define the AUV behavior in a semistructured environment. The AUV dynamic model is described and a robust control scheme is experimentally validated for attitude and depth regulation tasks. An important controller feature is that it can be easily implemented on the experimental platform. The main characteristics affecting the images taken underwater are described, and an adapted version of the perceptually uniform lαβ color space is used to find the artificial marks in a poor visibility environment. The exact positions of the landmarks in the vehicle workspace are not known, but an approximate knowledge of their localization is available.

The main contributions of this work include (1) the development of a novel visual system for the detection of artificial landmarks in poor underwater visibility conditions, which does not require the adjustment of internal parameters when environmental conditions change, and (2) a new simple visual navigation approach which does not require keeping the objects of interest in the field of view of the camera at all times, considering that only their approximate localization is given. In addition, a robust controller guarantees the stability of the AUV.

The remainder of this paper is organized as follows. In Section 2 the visual system is introduced. The visual navigation system and details of the controller are presented in Section 3. Implementation details and the validated experimental results are presented in Section 4. Finally, Section 5 concludes this work.

2. The Visual System

Underwater visibility is poor due to the optical properties of light propagation, namely, absorption and scattering, which are involved in the image formation process. Although a large amount of research has focused on using mathematical models for image enhancement and restoration [25, 26], the main challenge is the highly dynamic environment; that is, the limited number of parameters that are typically considered cannot represent all the actual variables involved in the process. Furthermore, for effective robot navigation, the enhanced images are needed in real time, which is not always possible to achieve. For that reason, we decided to explore directly the use of perceptually uniform color spaces, in particular the lαβ color space. In the following sections, we describe the integrated visual framework proposed for detecting artificial marks in aquatic environments, in which the lαβ color space was adapted for underwater imagery.

2.1. Color Discrimination for Underwater Images Using the lαβ Color Space. Three main problems are observed in underwater image formation [26]. The first is known as disturbing noise, which is due to suspended matter in water, such as bubbles, small particles of sand, and small fish or plants that inhabit the aquatic ecosystem. These particles block light and generate noisy images with distorted colors. The second problem is related to the refraction of light. When a camera set and objects are placed in two different
environments with different refractive indices, the objects in the picture have a different distortion in each environment, and therefore the position estimation is not the same in both environments. The third problem in underwater images is light attenuation. The light intensity decreases as the distance to the objects increases; this is due to the attenuation of light as a function of its wavelength. The effect of this is that the colors of observed underwater objects look different from those perceived in air. Figure 1 shows two images of the same set of different color objects taken underwater and in air. In these images, it is possible to see the characteristics of underwater images mentioned above.

A color space is a mathematical model through which perceptions of color are represented. The color space selection is an important decision in the development of an image processing algorithm, because it can dramatically affect the performance of the vision system. We selected the lαβ color space [27] because it has features that simplify the analysis of the data coming from underwater images. In underwater images, the background color (sea color) is usually blue or green; these colors correspond to the limits of the α and β channels, respectively, and, therefore, identifying objects whose colors contrast with blue or green becomes much easier. A modification of the original transformation from RGB to the lαβ color space was made: the logarithm operation was removed from the transformation, reducing the processing time while keeping the color distribution. Thus, the mapping between RGB and the modified lαβ color space is expressed as a linear transformation:

$$\begin{bmatrix} l \\ \alpha \\ \beta \end{bmatrix} = \begin{bmatrix} 0.3475 & 0.8265 & 0.5559 \\ 0.2162 & 0.4267 & -0.6411 \\ 0.1304 & -0.1033 & -0.0269 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix},$$

where $l \in [0.0, 1.73]$ is the achromatic channel, which determines the luminance value; $\alpha \in [-0.6411, 0.6429]$ is the yellow-blue opponent channel; and $\beta \in [-0.1304, 0.1304]$ is the red-cyan channel, with a significant influence of green. The data in these channels can encode a wide variety of colors; however, the information in aquatic images is contained in a very narrow interval. Figure 2 shows an underwater image and the frequency histogram for each channel of the lαβ color space. In this image, the data of the objects are concentrated in a small interval.

Therefore, in order to increase the robustness of the identification method, new limits for each channel are established. These values help to increase the contrast between objects and the background in the image. The new limits are calculated using the frequency histogram of each channel: the extreme values in the histogram with a frequency higher than a threshold value are computed. The advantage of using the frequency histogram, rather than only the minimum and maximum values, is that the former eliminates outliers.

Finally, a data normalization procedure is performed using the new interval in each channel of the color space. After this, it is possible to obtain a clear segmentation of the objects with colors located at the extreme values of the channels. Figure 3 shows the result of applying the proposed algorithm to the l, α, β channels. It can be observed that some objects are significantly highlighted against the greenish background; in particular, the red circle in the β channel presents a high contrast.
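For concreteness, the sketch below applies this linear mapping and the histogram-based renormalization to a NumPy image. The bin count and frequency threshold are illustrative assumptions, since the paper does not report the values it used.

```python
import numpy as np

# Linear RGB -> (l, alpha, beta) mapping from the paper (logarithm removed).
RGB_TO_LAB = np.array([
    [0.3475,  0.8265,  0.5559],   # l: achromatic (luminance) channel
    [0.2162,  0.4267, -0.6411],   # alpha: yellow-blue opponent channel
    [0.1304, -0.1033, -0.0269],   # beta: red-cyan opponent channel
])

def rgb_to_lab_linear(rgb):
    """Map an HxWx3 RGB image with float values in [0, 1] to l-alpha-beta."""
    flat = rgb.astype(np.float64).reshape(-1, 3)
    return (flat @ RGB_TO_LAB.T).reshape(rgb.shape)

def stretch_channel(chan, bins=256, freq_frac=0.001):
    """Renormalize one channel using histogram-based limits.

    The new limits are the outermost histogram bins whose frequency exceeds
    a threshold, which discards outlier pixels, unlike the raw min/max.
    (bins and freq_frac are illustrative; the paper does not report them.)
    """
    hist, edges = np.histogram(chan, bins=bins)
    keep = np.nonzero(hist > freq_frac * chan.size)[0]
    lo, hi = edges[keep[0]], edges[keep[-1] + 1]
    return np.clip((chan - lo) / (hi - lo + 1e-12), 0.0, 1.0)
```

The stretched β channel is then handed to the circle detector described next.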
2.2. Detection of Artificial Marks in Aquatic Environments. The localization problem for underwater vehicles requires identifying specific objects in the environment. Our navigation system relies on a robust detection of the artificial marks in the environment. Artificial red balls were selected as the known marks in the aquatic environment. Moreover, circle tags of different colors were attached to each sphere in order to determine the section of the sphere that is being observed.

The detection of circles in images is an important and frequent problem in image processing and computer vision. A wide variety of applications, such as quality control, classification of manufactured products, and iris recognition, use circle detection algorithms. The most popular techniques for detecting circles are based on the Circle Hough Transform (CHT) [28]. However, this method is slow, demands a considerable amount of memory, and identifies many false positives, especially in the presence of noise. Furthermore, it has many parameters that must be previously selected by the user. This last characteristic limits its use in underwater environments, since ambient conditions are constantly changing.
For this reason, a circle detection algorithm is desirable that has a fixed set of internal parameters and does not require adjustment even if small or large circle identification is required or if the ambient light changes. The circle detection algorithm presented by Akinlar and Topal [29] provides the desired properties. We have evaluated its performance in aquatic images with good results. Specifically, we apply the algorithm to the β channel image resulting from the procedure described in the previous section. As mentioned, the β channel presents the highest contrast between red objects and the background color of underwater images. This enables the detection algorithm to find circular shapes in the field of view with more precision. This is an important finding: to the best of our knowledge, this is the first time that this color space model has been used on underwater images for this purpose.
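A minimal sketch of this detection stage is shown below. Since the EDCircles detector [29] itself is not assumed to be available, a plain OpenCV contour-and-circularity test stands in for it; all thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_red_circles(beta_stretched, min_radius=5):
    """Find circular blobs in the renormalized beta channel.

    The paper uses the parameter-free EDCircles detector [29]; this sketch
    substitutes a simple contour/circularity test so that it runs with
    plain OpenCV. Threshold values below are illustrative assumptions.
    """
    img8 = (beta_stretched * 255).astype(np.uint8)
    # Red objects sit at the high end of the beta channel after stretching.
    _, mask = cv2.threshold(img8, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    circles = []
    for c in contours:
        (x, y), r = cv2.minEnclosingCircle(c)
        area = cv2.contourArea(c)
        if r >= min_radius and area > 0.7 * np.pi * r * r:  # roughly circular
            circles.append((x, y, r))
    return circles
```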
Figure 4 shows the obtained results. The images are organized as follows: the first column shows the original input image; the second column corresponds to the graphical representation of the β channel; and the third column displays the circles detected in the original image. The rows in the figure present the results obtained under different conditions. The first experiment analyzes a picture taken in a pool with clear water. Although the spheres are not close to the camera, they can be easily detected by our visual system. The second row is also a photograph taken in the pool, but in this case the visibility was poor; nevertheless, the method works appropriately and detects the circle. Finally, the last row shows the results obtained from a scene taken in the marine environment, in which visibility is poor. In this case, the presence of the red object in the image is almost imperceptible to the human eye; however, the detector identifies the circle successfully.

The navigation system proposed in this work is the integration of the visual system described above with a novel control scheme that defines the behavior of the vehicle based on the available visual information. The block diagram in Figure 5 shows the components of the navigation system and the interaction between them.
Figure 3: Result of converting the input image to the lαβ color space after adjusting the range of values.

Figure 4: Example results of applying the circle detection algorithm using the lαβ color space in underwater images with different visibility conditions.

Figure 5: Block diagram of the proposed navigation system for the autonomous underwater vehicle.
$$\mathbf{F}_{u} = \hat{H}_{v}\dot{\nu} + \hat{C}_{v}\nu - J^{-T}(\mathbf{q})\left(K_{d}\,\mathbf{s} + K_{i}\int_{0}^{t}\mathbf{s}\,d\sigma + \gamma\,\|\mathbf{s}\|^{2}\,\mathbf{s}\right),\tag{4}$$

where $K_{d}$, $K_{i}$, and $\Lambda$ are constant positive definite matrices, $\gamma$ is a positive scalar, and $\tilde{\mathbf{q}} = \mathbf{q} - \mathbf{q}_{d}$ is the pose error entering the extended error $\mathbf{s} = \dot{\tilde{\mathbf{q}}} + \Lambda\tilde{\mathbf{q}}$ of definition (6). Then, control law (4) adopts the following shape in the Lagrangian space:

$$\boldsymbol{\tau} = J^{T}(\mathbf{q})\,\mathbf{F}_{u} = \hat{H}(\mathbf{q})\ddot{\mathbf{q}} + \hat{C}(\mathbf{q},\dot{\mathbf{q}})\dot{\mathbf{q}} + \hat{D}(\mathbf{q})\dot{\mathbf{q}} - K_{d}\,\mathbf{s} - K_{i}\int_{0}^{t}\mathbf{s}\,d\sigma - \gamma\,\|\mathbf{s}\|^{2}\,\mathbf{s}.\tag{11}$$

The inertia and Coriolis matrices satisfy the well-known passivity property

$$\mathbf{s}^{T}\left(\tfrac{1}{2}\dot{H}(\mathbf{q}) - C(\mathbf{q},\dot{\mathbf{q}})\right)\mathbf{s} = 0.\tag{13}$$

Now consider that the left-hand side of Lagrangian formulation (9) can be expressed in the following regression-like expression:

$$H(\mathbf{q})\ddot{\mathbf{q}} + C(\mathbf{q},\dot{\mathbf{q}})\dot{\mathbf{q}} + D(\mathbf{q},\dot{\mathbf{q}},\nu)\dot{\mathbf{q}} + \mathbf{g}(\mathbf{q}) = Y(\mathbf{q},\dot{\mathbf{q}},\ddot{\mathbf{q}})\,\Theta,\tag{14}$$

so that the parameter mismatch term is bounded as

$$\left\|Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\,\tilde{\Theta}\right\| \le \beta_{9}\|\dot{\mathbf{q}}\| + \beta_{10} + \beta_{11}\|\ddot{\mathbf{q}}\|.\tag{17}$$

Then the closed-loop dynamics is found using control law (11) in the open-loop Lagrangian expression (9):

$$H(\mathbf{q})\dot{\mathbf{s}} + C(\mathbf{q},\dot{\mathbf{q}})\mathbf{s} + D(\mathbf{q})\mathbf{s} + K_{d}\mathbf{s} + K_{i}\!\int_{0}^{t}\!\mathbf{s}\,d\sigma = -\gamma\|\mathbf{s}\|^{2}\mathbf{s} - Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\tilde{\Theta} - \tilde{\mathbf{g}}(\mathbf{q}) + \boldsymbol{\varepsilon}(t).\tag{18}$$

Now consider the following Lyapunov candidate function:

$$V(\mathbf{s}) = \tfrac{1}{2}\,\mathbf{s}^{T}H(\mathbf{q})\,\mathbf{s} + \tfrac{1}{2}\,\tilde{\mathbf{a}}^{T}K_{i}^{-1}\,\tilde{\mathbf{a}},\tag{19}$$

with $\tilde{\mathbf{a}} \triangleq \mathbf{a}_{0} - K_{i}\int_{0}^{t}\mathbf{s}\,d\sigma$ for some constant vector $\mathbf{a}_{0} \in \mathbb{R}^{n}$. The time derivative of the Lyapunov candidate function along the trajectories of the closed-loop system (18), after property (13) and proper simplifications, becomes

$$\dot{V}(\mathbf{s}) = -\,\mathbf{s}^{T}\left[D(\mathbf{q})+K_{d}\right]\mathbf{s} - \gamma\|\mathbf{s}\|^{4} - \mathbf{s}^{T}\mathbf{a}_{0} + \mathbf{s}^{T}\left(\boldsymbol{\varepsilon}(t) - Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\tilde{\Theta} - \tilde{\mathbf{g}}(\mathbf{q})\right).\tag{20}$$

Assuming that $\dot{\nu}$ is bounded implies that both $\dot{\mathbf{q}}$ and $\ddot{\mathbf{q}}$ are also bounded. Then, assuming that the disturbance and its rate are also bounded, it yields $\|\boldsymbol{\varepsilon}(t)\| + \|Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\tilde{\Theta}\| \le c_{0} + c_{1}\|\dot{\mathbf{q}}\| + c_{2}\|\dot{\mathbf{q}}\|^{2}$, which can be expressed in terms of the extended error as

$$\|\boldsymbol{\varepsilon}(t)\| + \left\|Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\,\tilde{\Theta}\right\| \le \delta_{0} + \delta_{1}\|\mathbf{s}\| + \delta_{2}\|\mathbf{s}\|^{2}.\tag{21}$$

Then the last term in (20) is bounded as follows:

$$\left\|\boldsymbol{\varepsilon}(t) - Y(\mathbf{q},\dot{\mathbf{q}},\dot{\mathbf{q}},\ddot{\mathbf{q}})\,\tilde{\Theta} - \tilde{\mathbf{g}}(\mathbf{q})\right\| \le \beta_{6} + \beta_{7}\|\dot{\mathbf{q}}\|\,\|\mathbf{s}\| + \beta_{8}\|\mathbf{s}\|,$$

and it follows that

$$\lim_{t\to\infty}\|\mathbf{s}\| \to 0.$$

Finally, after definition (6), whenever $\mathbf{s} = \mathbf{0}$ it follows that $\dot{\tilde{\mathbf{q}}} = -\Lambda\tilde{\mathbf{q}}$, which means that $\mathbf{q}$ reaches the set point $\mathbf{q}_{d}$. Therefore, the stability of the system is proved. A detailed explanation and analysis of the controller can be found in [33].

The implementation of the control does not require knowledge of the dynamic model parameters; hence it is robust with respect to fluid disturbances and parametric uncertainty. However, it is necessary to know the relationship between the control input and the actuators.
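To make the loop concrete, here is a minimal discrete-time sketch of the feedback part of law (4)/(11) as read above, with the model-compensation terms omitted (the feedback part is model-free); the gain values and sample time in any deployment would come from the tuning reported in Section 4.

```python
import numpy as np

class ModelFreeRobustController:
    """Sketch of tau = -Kd s - Ki int(s) - gamma ||s||^2 s with extended
    error s = de + Lambda e (one reading of (4)/(11); gain names follow
    the text, numeric values used at construction are placeholders)."""

    def __init__(self, Kd, Ki, Lam, gamma, dt):
        self.Kd, self.Ki, self.Lam = Kd, Ki, Lam
        self.gamma, self.dt = gamma, dt
        self.integral = np.zeros(Kd.shape[0])  # running integral of s

    def update(self, q, qd, dq, dqd):
        e = q - qd                        # pose error q~
        s = (dq - dqd) + self.Lam @ e     # extended error of definition (6)
        self.integral += s * self.dt
        return -(self.Kd @ s + self.Ki @ self.integral
                 + self.gamma * np.dot(s, s) * s)
```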
3.2. Thrusters Force Distribution. The propulsion forces of Mexibot are generated by a set of six fins which move along a sinusoidal path defined as

$$\varphi_{i}(t) = \frac{A}{2}\,\sin\!\left(\frac{2\pi}{T}\,t + \psi_{i}\right) + \varphi_{c},\tag{26}$$

where $\varphi_{i}$ is the angular position of the fin, $A$ is the amplitude of the motion, $T$ is the period of each cycle, $\varphi_{c}$ is the central angle of the oscillation, and $\psi_{i}$ is the phase offset between the different fins of the robot.
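The sketch below transcribes (26) directly, under the amplitude convention used above; the demo values in any call are placeholders.

```python
import numpy as np

def fin_angle(t, A, T, psi_i, phi_c):
    """Sinusoidal fin trajectory of (26): amplitude A, period T,
    per-fin phase offset psi_i, and central oscillation angle phi_c."""
    return 0.5 * A * np.sin(2.0 * np.pi * t / T + psi_i) + phi_c
```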
Both Georgiades in [35] and Plamondon in [36] present models for the thrust generated by the symmetric oscillation of the fins used in the Aqua robot family. Plamondon gives a relationship between the thrust generated by the fins and the parameters describing the motion in (26). Thus, the magnitude of the force generated by each fin under the sinusoidal movement (26) is determined by the following equation:

$$F_{i} = 0.1963\,\rho\,w\,\frac{l_{1}+2l_{2}}{3}\,\frac{A^{2}}{T^{2}} - 0.1554,$$

where $w$, $l_{1}$, and $l_{2}$ correspond to the dimensions of the fins, $\rho$ represents the density of water, $A$ is the amplitude, and $T$ is the period of oscillation. Thus, the magnitude of the force generated by the robot fins can be established as a function of the period and the amplitude of the fin oscillation movement at runtime.

Figure 8 shows the force produced by a fin, where the angle $\gamma_{i}$ defines the direction of the force vector expressed in the body reference frame:

$$\mathbf{F}_{p_{i}} = \begin{bmatrix} F_{p_{i}x}\\ F_{p_{i}y}\\ F_{p_{i}z}\end{bmatrix} = \begin{bmatrix} F_{i}\cos\gamma_{i}\\ 0\\ F_{i}\sin\gamma_{i}\end{bmatrix}.$$

In addition, due to the kinematic characteristics of the vehicle, $F_{p_{i}y} = 0$. Therefore, the vector of forces and moments generated by the actuators, $\mathbf{F} = [F_{x}\; F_{y}\; F_{z}\; M_{x}\; M_{y}\; M_{z}]^{T}$, is defined as follows:

$$F_{x} = \sum_{i=1}^{6} F_{p_{i}x},\tag{30a}$$
$$F_{y} = 0,\tag{30b}$$
$$F_{z} = \sum_{i=1}^{6} F_{p_{i}z},\tag{30c}$$
$$M_{x} = \sum_{i=1}^{6} y_{i}\,F_{p_{i}z},\tag{30d}$$
$$M_{y} = \sum_{i=1}^{6} x_{i}\,F_{p_{i}z},\tag{30e}$$
$$M_{z} = \sum_{i=1}^{6} y_{i}\,F_{p_{i}x},\tag{30f}$$

where $x_{i}$ and $y_{i}$ are the distance coordinates of the $i$th fin joint with respect to the vehicle's center of mass, as shown in Figure 9. Note that the symmetry of the vehicle establishes that $x_{1} = x_{3} = -x_{4} = -x_{6}$, $x_{2} = -x_{5}$, $y_{1} = -y_{3} = y_{4} = -y_{6}$, and $y_{2} = y_{5} = 0$.

Figure 8: Diagram of forces generated by the fin movements, where the angle $\gamma_{i}$ establishes the direction of the force.

Figure 9: Fin distribution in the underwater vehicle.

System (30a), (30b), (30c), (30d), (30e), and (30f) has five equations with twelve independent variables. Among all possible solutions, the one presented in this work arises after the imposition of the following constraints:

$$\begin{aligned}
&\text{C1: } F_{1} = F_{2} = F_{3},\\
&\text{C2: } F_{4} = F_{5} = F_{6},\\
&\text{C3: } \gamma_{1} + \gamma_{3} = -\gamma_{4} - \gamma_{6},\\
&\text{C4: } \gamma_{1} - \gamma_{3} = \gamma_{4} - \gamma_{6},\\
&\text{C5: } \gamma_{2} = -\gamma_{5},\\
&\text{C6: } \gamma_{2} = 0.
\end{aligned}\tag{31}$$
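The following sketch evaluates this thrust model and the body-frame force decomposition; the grouping of terms in the empirical fit follows the reading of the equation above and should be checked against [36].

```python
import numpy as np

RHO = 1000.0  # water density, kg/m^3

def fin_thrust(A, T, w, l1, l2):
    """Empirical thrust magnitude of one fin as a function of the
    oscillation amplitude A and period T (a reading of the
    Plamondon-style fit quoted in the text)."""
    return 0.1963 * RHO * w * (l1 + 2.0 * l2) / 3.0 * (A / T) ** 2 - 0.1554

def body_force(F, gamma):
    """Force of one fin in the body frame: no lateral component
    (F_py = 0), with the angle gamma setting the x-z thrust direction."""
    return np.array([F * np.cos(gamma), 0.0, F * np.sin(gamma)])
```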
Then one system solution is found to be the linear relation (32), which computes the independent thrust magnitudes $F_{1}$ and $F_{4}$ and the fin angles $\gamma_{1}$, $\gamma_{2}$, $\gamma_{3}$, and $\gamma_{4}$ from the commanded force and moment components; the remaining thrusts and angles follow from constraints (31).

In (37), $\psi$ is the actual yaw angle, $\tilde{u}_{V}$ is the visual error in the horizontal axis, rows and columns are the image dimensions, and $r$ is the radius of the detected circle. This desired yaw angle is proportional to the visual error, but it also depends on the radius of the circle found. When the object is close to the camera, the sphere radius is larger, and therefore the change of the desired yaw angle also increases. Note that the resolution of the image enters this computation; the gain used to define the reference yaw angle in (37) was established as 300. This value was obtained experimentally, with a trial-and-error procedure.
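Since expression (37) itself is not reproduced above, the sketch below implements one plausible form consistent with this description (a correction proportional to the visual error, scaled up as the detected radius grows, with a gain of 300); the normalization by the image dimensions is an assumption.

```python
def desired_yaw(psi, u_tilde, radius, rows, cols, k=300.0):
    """One plausible form of the yaw reference (37): the correction grows
    with the horizontal visual error u_tilde (pixels) and with the
    detected radius (closer target -> larger correction)."""
    return psi + k * (u_tilde / cols) * (radius / rows)
```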
Figure 10: Diagram to illustrate the approximate location of visual marks.

The control gains in (4) and (36) were established after a trial-and-error process, as the vehicle dynamics are nonlinear and strongly coupled; the resulting gain matrices are diag{0.30, 0.20, 1.50} and diag{0.10, 0.07, 0.20}.
Figure 12: Navigation experiment when tracking one sphere. Desired values in red and actual values in black. (a) Depth (z); (b) roll (ϕ); (c) pitch (θ); (d) yaw (ψ).
this value is updated after the inertial navigation system starts. The corresponding depth and attitude error signals are depicted in Figure 13, where all of these errors have a considerably small magnitude, bounded by a value around 0.2 m for the depth error and 10° for the attitude error.

The time evolution of the visual error in the horizontal axis is depicted in Figure 14. Again, the first thirty seconds do not show relevant information because no visual feedback is obtained. Later, the visual error is reduced to a value in an acceptable interval represented by the red lines. This interval represents the values for which the desired yaw angle does not change, even when the visual system is not detecting the sphere. As mentioned before, the experiments show that when ũ_V ≤ 7 pixels, the AUV can achieve the assigned navigation task. Finally, a disturbance, generated by nearby swimmers when they displace water, moves the vehicle and the error increases, but the visual controller acts to reduce this error.

The previous results show that the proposed controller (4) under the thruster force distribution (32) provides good behavior in the set-point control of the underwater vehicle, with small depth and attitude error values. This performance enables the visual navigation system to track the artificial marks placed in the environment.

The navigation task assigned to the underwater vehicle in the second experiment includes the two spheres with the distribution shown in Figure 11. For this experiment the exact position of the spheres is unknown; only the approximate relative orientation and distance between them are known. The first sphere was positioned in front of the AUV at an approximate distance of 8 m. When the robot detects that the first ball is close enough, it should change the yaw angle by 45° in order to find the second sphere. Figure 16 shows the time evolution of the depth coordinate z(t), the attitude signals ϕ(t), θ(t), ψ(t), and the corresponding reference signals during the experiment. Similarly to the previous experiment, the actual depth, roll angle, and pitch angle are close to the desired values, even when small ambient disturbances are present. The yaw angle plot shows the different stages of the system. The desired value at the starting period is an arbitrary value that does not have any relation to the vehicle state. After the initialization period, a new desired value for the yaw angle is set, and this angle remains constant as long as the visual system does not provide information. When the visual system detects a sphere, the navigation system generates a smooth desired signal allowing the underwater vehicle to track the artificial mark. When the circle inside the first sphere was detected, the change in direction of 45° was applied. This reference value was fixed until the second sphere was detected and a new desired signal was generated with small changes.
Figure 13: Navigation tracking errors of one sphere when using controller (4). (a) Depth (z̃); (b) roll (ϕ̃); (c) pitch (θ̃); (d) yaw (ψ̃).
Figure 14: Navigation tracking of one sphere. Visual error (pixels) obtained from the AUV navigation experiment.
Finally, the second circle inside the sphere was detected and a new change of 45° was performed, and the desired value remained constant until the end of the experiment.

Figure 17 shows the depth and attitude error signals. Similar to the first experiment, the magnitude of this error is bounded by a value around 0.2 m for the depth error and 10° for the attitude error, except for the yaw angle, which presents higher values produced by the direction changes.

Finally, the graph of the time evolution of the visual error is depicted in Figure 15. It can be observed that, at the beginning, while the robot was moving forward, the error remained constant because the system was unable to determine the presence of the artificial mark in the environment. At a given time, the visual system detected the first sphere, with an estimated radius of about 30 pixels. Then, as the robot gets closer to the target, the visual error begins to decrease due to the improvement in visibility and the increase of the sphere radius. When the radius is bigger than a given threshold, a change-of-direction action is fired in order to avoid a collision and to search for the second sphere. Then, all variables are reset. Once again the error remains constant at the beginning due to the lack of visual feedback. In this experiment, when the second mark was identified, the visual error was bigger than 100 pixels, but this error rapidly decreased to the desired interval. At the end of the experiment, another change of direction was generated and the error remained constant, because no other sphere in the environment was detected. Note that, in this experiment, a significant amount of the error was produced by environmental disturbances.

Figure 15: Navigation tracking of two spheres. Visual error obtained from the AUV navigation experiment.

Figure 16: Navigation tracking of two spheres. Desired values in red and actual values in black. (a) Depth (z); (b) roll (ϕ); (c) pitch (θ); (d) yaw (ψ).
Figure 17: Navigation error when tracking two spheres when using controller (4). (a) Depth (z̃); (b) roll (ϕ̃); (c) pitch (θ̃); (d) yaw (ψ̃).

5. Conclusion

In this paper, a visual-based controller to guide the navigation of an AUV in a semistructured environment using artificial marks was presented. The main objective of this work is to provide an aquatic robot with the capability of moving in an environment where visibility conditions are far from ideal and artificial landmarks are placed with an approximately known distribution. A robust control scheme, applied under a given thruster force distribution and combined with a visual servoing control, was implemented. Experimental evaluations
for the navigation system were carried out in an aquatic environment with poor visibility. The results show that our approach was able to detect the visual marks and perform the navigation satisfactorily. Future work includes the use of natural landmarks and the relaxation of some restrictions, allowing, for example, more than one visual mark to be present in the field of view of the robot.
Competing Interests
The authors declare that they have no competing interests.
Acknowledgments
The authors thank CONACYT, Mexico, for their financial support.
References
[1] J. J. Leonard, A. A. Bennett, C. M. Smith, and H. J. S. Feder,
“Autonomous underwater vehicle navigation,” in Proceedings of
the IEEE ICRA Workshop on Navigation of Outdoor Autonomous
Vehicles, 1998.
[2] J. C. Kinsey, R. M. Eustice, and L. L. Whitcomb, "A survey of underwater vehicle navigation: recent advances and new challenges," in Proceedings of the IFAC Conference of Manoeuvering and Control of Marine Craft, vol. 88, 2006.
[3] L. Stutters, H. Liu, C. Tiltman, and D. J. Brown, “Navigation
technologies for autonomous underwater vehicles,” IEEE Trans-
actions on Systems, Man and Cybernetics Part C: Applications
and Reviews, vol. 38, no. 4, pp. 581–589, 2008.
[4] L. Paull, S. Saeedi, M. Seto, and H. Li, “AUV navigation and
localization: a review,” IEEE Journal of Oceanic Engineering,
vol. 39, no. 1, pp. 131–149, 2014.
[5] A. Hanai, S. K. Choi, and J. Yuh, “A new approach to a laser
ranger for underwater robots,” in Proceedings of the IEEE/RSJ
International Conference on Intelligent Robots and Systems
(IROS ’03), pp. 824–829, October 2003.
[6] F. R. Dalgleish, F. M. Caimi, W. B. Britton, and C. F. Andren,
“An AUV-deployable pulsed laser line scan (PLLS) imaging sensor,”
in Proceedings of the MTS/IEEE Conference (OCEANS ’07), pp. 1–5,
Vancouver, Canada, September 2007.
[7] A. Annunziatellis, S. Graziani, S. Lombardi, C. Petrioli, and
R. Petroccia, “CO2Net: a marine monitoring system for CO2
leakage detection,” in Proceedings of the OCEANS, 2012, pp. 1–
7, IEEE, Yeosu, Republic of Korea, 2012.
[8] G. Antonelli, Underwater Robots-Motion and Force Control
of Vehicle-Manipulator System, Springer, New York, NY, USA,
2nd edition, 2006.
[9] T. Nicosevici, R. Garcia, M. Carreras, and M. Villanueva, “A
review of sensor fusion techniques for underwater vehicle nav-
igation,” in Proceedings of the MTTS/IEEE TECHNO-OCEAN
’04 (OCEANS ’04), vol. 3, pp. 1600–1605, IEEE, Kobe, Japan,
2004.
[10] F. Bonin-Font, G. Oliver, S. Wirth, M. Massot, P. L. Negre, and J.-P. Beltran, "Visual sensing for autonomous underwater exploration and intervention tasks," Ocean Engineering, vol. 93, pp. 25–44, 2015.
[11] K. Teo, B. Goh, and O. K. Chai, "Fuzzy docking guidance using augmented navigation system on an AUV," IEEE Journal of Oceanic Engineering, vol. 40, no. 2, pp. 349–361, 2015.
[12] R. B. Wynn, V. A. I. Huvenne, T. P. Le Bas et al., "Autonomous Underwater Vehicles (AUVs): their past, present and future contributions to the advancement of marine geoscience," Marine Geology, vol. 352, pp. 451–468, 2014.
[13] F. Bonin-Font, M. Massot-Campos, P. L. Negre-Carrasco, G. Oliver-Codina, and J. P. Beltran, "Inertial sensor self-calibration in a visually-aided navigation approach for a micro-AUV," Sensors, vol. 15, no. 1, pp. 1825–1860, 2015.
[14] J. Santos-Victor and J. Sentieiro, "The role of vision for underwater vehicles," in Proceedings of the IEEE Symposium on Autonomous Underwater Vehicle Technology (AUV '94), pp. 28–35, IEEE, Cambridge, Mass, USA, July 1994.
[15] A. Burguera, F. Bonin-Font, and G. Oliver, "Trajectory-based visual localization in underwater surveying missions," Sensors, vol. 15, no. 1, pp. 1708–1735, 2015.
[16] D. Kim, D. Lee, H. Myung, and H.-T. Choi, "Artificial landmark-based underwater localization for AUVs using weighted template matching," Intelligent Service Robotics, vol. 7, no. 3, pp. 175–184, 2014.
[17] J. Sattar and G. Dudek, "Robust servo-control for underwater robots using banks of visual filters," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '09), pp. 3583–3588, Kobe, Japan, May 2009.
[25] M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, "True color correction of autonomous underwater vehicle imagery," Journal of Field Robotics, 2015.
[26] A. Yamashita, M. Fujii, and T. Kaneko, "Color registration of underwater images for underwater sensing with consideration of light attenuation," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 4570–4575, Roma, Italy, April 2007.
[27] D. L. Ruderman, T. W. Cronin, and C.-C. Chiao, "Statistics of cone responses to natural images: implications for visual coding," Journal of the Optical Society of America A: Optics, Image Science, and Vision, vol. 15, no. 8, pp. 2036–2045, 1998.
[28] T. D'Orazio, C. Guaragnella, M. Leo, and A. Distante, "A new algorithm for ball recognition using circle hough transform and neural classifier," Pattern Recognition, vol. 37, no. 3, pp. 393–408, 2004.
[29] C. Akinlar and C. Topal, "EDCircles: a real-time circle detector with a false detection control," Pattern Recognition, vol. 46, no. 3, pp. 725–740, 2013.
[30] G. Dudek, P. Giguère, C. Prahacs et al., "AQUA: an amphibious autonomous robot," Computer, vol. 40, no. 1, pp. 46–53, 2007.
[31] U. Saranli, M. Buehler, and D. E. Koditschek, "RHex: a simple and highly mobile hexapod robot," International Journal of Robotics Research, vol. 20, no. 7, pp. 616–631, 2001.
[32] T. I. Fossen, Guidance and Control of Ocean Vehicles, John Wiley & Sons, 1994.
[33] R. Pérez-Alcocer, E. Olguín-Díaz, and L. A. Torres-Méndez, "Model-free robust control for fluid disturbed underwater vehicles," in Intelligent Robotics and Applications, C.-Y. Su, S. Rakheja, and H. Liu, Eds., vol. 7507 of Lecture Notes in Computer Science, pp. 519–529, Springer, 2012.
[18] C. Barngrover, S. Belongie, and R. Kastner, “Jboost                  Springer, Berlin, Germany, 2012.
optimization of color detectors for autonomous underwater vehicle
                                                                          [34] E. Olgu´ın-D´ıaz and V. Parra-Vega, “Tracking of
naviga-tion,” in Computer Analysis of Images and Patterns, pp.
                                                                          constrained submarine robot arms,” in Informatics in Control,
155–162, Springer, 2011.
                                                                          Automation and Robotics, vol. 24, pp. 207–222, Springer, Berlin,
[19] J. Gao, A. Proctor, and C. Bradley, “Adaptive neural network
                                                                          Germany, 2009.
visual servo control for dynamic positioning of underwater
                                                                          [35] C. Georgiades, Simulation and control of an underwater
vehicles,” Neurocomputing, vol. 167, pp. 604–613, 2015.
                                                                          hexapod robot [M.S. thesis], Department of Mechanical
[20] S. Heshmati-Alamdari, A. Eqtami, G. C. Karras, D. V.
                                                                          Engineering, McGill University, Montreal, Canada, 2005.
Dimarog-onas, and K. J. Kyriakopoulos, “A self-triggered visual
servoing model predictive control scheme for under-actuated               [36] N. Plamondon, Modeling and control of a biomimetic under-
underwa-ter robotic vehicles,” in Proceedings of the IEEE                 water vehicle [Ph.D. thesis], Department of Mechanical Engi-
International Conference on Robotics and Automation (ICRA ’14),           neering, McGill University, Montreal, Canada, 2011.
pp. 3826– 3831, Hong Kong, June 2014.
[21] P. L. N. Carrasco, F. Bonin-Font, and G. O. Codina, “Stereo
graph-slam for autonomous underwater vehicles,” in Pro-ceedings
of the 13th International Conference on Intelligent Autonomous
Systems, pp. 351–360, 2014.
[22] B. Li, Y. Xu, C. Liu, S. Fan, and W. Xu, “Terminal navigation and
control for docking an underactuated autonomous underwater vehicle,” in
Proceedings of the IEEE International Conference on CYBER Technology
in Automation, Control, and Intelligent Systems (CYBER ’15), pp. 25–30,
Shenyang, China, June 2015.
[23] M. Myint, K. Yonemori, A. Yanou, S. Ishiyama, and M. Minami,
“Robustness of visual-servo against air bubble disturbance of
underwater vehicle system using three-dimensional marker and dual-
eye cameras,” in Proceedings of the MTS/IEEE Washington (OCEANS
’15), pp. 1–8, IEEE, Washington, DC, USA, 2015.
[24] B. Sut¨o,˝ R. Doczi,´ J. Kallo´ et al., “HSV color space based
buoy detection module for autonomous underwater vehicles,” in
Proceedings of the 16th IEEE International Symposium on
Computational Intelligence and Informatics (CINTI ’15), pp. 329–
332, IEEE, Budapest, Hungary, November 2015.