Simulation Environment for Mobile Robots Testing
Using ROS and Gazebo
Kenta Takaya∗, Toshinori Asai†, Valeri Kroumov‡ and Florentin Smarandache§
∗ Graduate School, Okayama University of Science, 1-1 Ridai-cho, Okayama City 700-0005, Japan
Email: kenta@kids.ee.ous.ac.jp
† FBR Technology Engineering Services Co., 2-2-2 Chuo-cho, Tsuruga City 914-0811, Japan
Email: asronx@gmail.com
‡ Dept. of Electrical & Electronic Engineering, Okayama University of Science, 1-1 Ridai-cho, Okayama City 700-0005, Japan
Email: val@ee.ous.ac.jp
§ University of New Mexico, 705 Gurley Ave., Gallup, New Mexico 87301, USA
Email: fsmarandache@gmail.com
Abstract—In the process of developing a control strategy for mobile robots, simulation is important for testing the software components, robot behavior, and control algorithms in different surrounding environments. In this paper we introduce a simulation environment for mobile robots based on ROS and Gazebo. We show that, after properly creating the robot models under Gazebo, the code developed for the simulation process can be directly implemented in the real robot without modifications. Autonomous navigation tasks and 3D-mapping simulations using control programs under ROS are presented. The simulation and experimental results agree very well and show the usability of the developed environment.

I. INTRODUCTION

Today's robot systems are complex hardware devices equipped with numerous sensors and computers, often controlled by complex distributed software. Robots must navigate and successfully perform specific tasks in various environments and under changing conditions. It is costly and time consuming to build different test fields and to test the robot behaviour under multiple conditions. Using a well-developed simulation environment allows safe and cost-effective testing of the robotic system under development. Simulation shortens the development cycle and can be applied flexibly to different environments.

In this paper we describe the design and implementation of an environment for the development and simulation of mobile robots using the ROS (Robot Operating System)[1] and Gazebo[2] software. Accurate models of the simulated robots and their working environment are designed. Simulation and experiments for mapping and control are presented as well. The software used during the simulations is successfully applied to the control of real robots without any modifications.

This paper is organized as follows. The next section describes related work. Section III briefly describes the Gazebo and ROS software and introduces the robot and working environment models. Sections IV and V detail the 2D and 3D simulation and experimental results, respectively. Section VI concludes the paper.

II. RELATED WORK

There are several commercial and open source simulation environments for the robotics field. Some common examples of such software are briefly listed here.

WEBOTS[3] supports C/C++, Java, Python, URBI, and MATLAB and has a TCP/IP interface for communication with other software products. It has many components which can be connected to easily create complex constructions. Robot Virtual Worlds[4] was primarily designed for educational purposes, but it seems that it can be used for some advanced applications. LabVIEW[5] is a complex software system with numerous libraries for the simulation of hardware components, and it supports most of the standard interfaces.

On the other hand, there exists a great number of open source simulators, and many of them have very advanced features. USARSim (Unified System for Automation and Robot Simulation)[6] is commonly used in the RoboCup rescue virtual robot competition as well as a research platform.

OpenHRP3[8] (Open-architecture Human-centered Robotics Platform version 3) is compatible with OpenRTM-aist[9], a robotic technology middleware. However, OpenHRP is designed to develop and simulate mainly humanoid robots.

OpenRAVE[10] is a tool for testing, development, and simulation of robotic systems. It uses high-level scripting such as MATLAB and Octave. OpenRAVE focuses mainly on humanoid robots and robot manipulators. A comparison between several open source simulation environments for mobile robots can be found in [11], [12] and the references therein.

There are many publications about robot simulation covering a variety of robots: manipulators, legged robots, underwater vehicles, and unmanned aerial vehicles (UAVs). Many of these developments are based on the ROS and Gazebo software packages, which proves their reliability and great usability. The Virtual Robotics Challenge[13], hosted by the Defense Advanced Research Projects Agency (DARPA) as part of the DARPA Robotics Challenge, led to the development and improvement of simulation software that runs nearly identically to the real robotic hardware[14]. Simulation of manipulation tasks, including grasp and place motions, is presented in [15]. Results on UAV simulation and its experimental challenges are covered in other publications[16], [17].
III. ROBOT SIMULATION UNDER ROS AND GAZEBO

A. Gazebo

Gazebo is a part of the Player Project[18] and allows the simulation of robotics and sensor applications in three-dimensional indoor and outdoor environments. It has a Client/Server architecture and a Publish/Subscribe model of interprocess communication.

Gazebo has a standard Player interface and, additionally, a native interface. The Gazebo clients can access its data through shared memory. Each simulation object in Gazebo can be associated with one or more controllers that process commands for controlling the object and generate the state of that object. The data generated by the controllers are published into the shared memory through Gazebo interfaces (Ifaces). The Ifaces of other processes can read the data from the shared memory, thus allowing interprocess communication between the robot controlling software and Gazebo, independently of the programming language or the computer hardware platform.

For dynamic simulation, Gazebo can access the high-performance physics engines ODE[19], Bullet[20], Simbody[21], and DART[22], which are used for rigid body simulation.

The Client sends control data and the simulated objects' coordinates to the Server, which performs the real-time control of the simulated robot. It is possible to realize a distributed simulation by placing the Client and the Server on different machines. Deploying the ROS plugin for Gazebo provides a direct communication interface to ROS, so that the simulated and the real robots can be controlled by the same software. This makes Gazebo an effective simulation tool for the testing and development of real robotic systems.
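As an illustration of this ROS-side interface, a robot description can be spawned into a running simulation through the model-spawning service exposed by the gazebo_ros package. The snippet below is not part of the software described in this paper; the service name /gazebo/spawn_urdf_model is the standard one provided by gazebo_ros, while the file path and model name are placeholders.

#!/usr/bin/env python
# Spawn a robot model into a running Gazebo instance through the ROS
# interface provided by gazebo_ros. The file path and model name below
# are placeholders used only for illustration.
import rospy
from gazebo_msgs.srv import SpawnModel
from geometry_msgs.msg import Pose

if __name__ == '__main__':
    rospy.init_node('spawn_example')
    rospy.wait_for_service('/gazebo/spawn_urdf_model')
    spawn = rospy.ServiceProxy('/gazebo/spawn_urdf_model', SpawnModel)

    with open('/path/to/robot.urdf') as f:      # placeholder path
        model_xml = f.read()

    pose = Pose()
    pose.position.x, pose.position.y = 0.0, 0.0
    pose.orientation.w = 1.0                    # identity orientation

    # Request fields: model_name, model_xml, robot_namespace,
    # initial_pose, reference_frame
    spawn('p3dx', model_xml, '', pose, 'world')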
B. ROS

ROS[1] is a collection of libraries, drivers, and tools for the effective development and building of robot systems. It has a Linux-like command tool, an interprocess communication system, and numerous application-related packages. A ROS executable process is called a Node, and interprocess communication follows a Publish/Subscribe model. The communication data is called a Topic. A Publisher process may publish one or more Topics, and processes which subscribe to a certain Topic receive its content. The interprocess communication library makes it easy to add user-developed libraries and ROS executables. Moreover, ROS-based software is language- and platform-independent: it is implemented in C++, Python, and LISP.

Process name resolution and execution are scheduled by the Master Server. The ROS packages include many sensor drivers, navigation tools, environment mapping, path planning, an interprocess communication visualization tool, a 3D environment visualization tool, and many others. ROS allows the effective development of new robotic systems and, when used together with a simulation middleware like Gazebo, the time for developing reliable and high-performance robotic control software can be dramatically decreased.
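The Node/Topic mechanism can be illustrated with a minimal rospy sketch; the node and topic names below are arbitrary examples and not part of the system described in this paper.

#!/usr/bin/env python
# Minimal illustration of the ROS Node/Topic Publish-Subscribe model.
# Run as "pubsub.py talk" in one terminal and "pubsub.py listen" in another;
# the node and topic names are illustrative only.
import sys
import rospy
from std_msgs.msg import String

def talk():
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(10)                      # publish at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello'))      # publish one message on the Topic
        rate.sleep()

def listen():
    # The callback fires for every message published on the subscribed Topic.
    rospy.Subscriber('chatter', String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()                               # keep the Node alive

if __name__ == '__main__':
    mode = sys.argv[1] if len(sys.argv) > 1 else 'talk'
    rospy.init_node('chatter_%s' % mode)       # register the Node with the Master
    talk() if mode == 'talk' else listen()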
Fig. 1. Pioneer3-DX (left) and PeopleBot (right) models in Gazebo.
Fig. 2. P3-DX and lab models in Gazebo.
Fig. 3. Map generation using Hector mapping in RViz.
C. Robot and Environment Modeling

The robot and environment models are represented in ROS using the URDF (Unified Robot Description Format). URDF is an XML file format used and standardized in ROS for the description of all elements (sensors, joints, links, etc.) of a robot model. Because URDF can only specify the kinematic and dynamic properties of a single robot in isolation, additional simulation-specific tags concerning the robot pose, friction, inertial elements, and other properties have to be added to make the URDF file work properly in Gazebo[23]. The addition of these properties makes the original URDF file compatible with SDF (Simulation Description Format), Gazebo's native model description format. SDF can fully describe the simulated world together with the complete robot model. The conversion from URDF to SDF is easily accomplished by adding the so-called gazebo_plugins to the URDF file. The gazebo_plugins attach the sensor outputs and the driving motor inputs to ROS messages and service calls[24], i.e. the gazebo_plugins create an interface (Topic) between ROS and Gazebo. The control process intercommunication under ROS is then achieved by publishing and subscribing to that Topic.

Several plugins are available in gazebo_plugins[24]: Camera (a ROS interface for simulating cameras), Multicamera (synchronizes multiple camera shutters to publish their images together, typically for stereo cameras), GPU Laser, F3D (for external forces on a body), IMU, Bumper, Differential Drive, Skid Steering Drive, Planar Move, and many others.

In this study we created models of the PeopleBot and Pioneer 3-DX robots. A model of the P3-DX is distributed together with Gazebo, but its dimensions and properties differ from the real robot. We have created a new, much more precise model of the P3-DX, including models of several sensors. Part of the P3-DX model was adopted in the process of designing the PeopleBot model. We added models of 2D and 3D laser rangefinders, sonars, odometry, a camera, IR sensors, and bumpers. Additionally, we properly defined the robot masses and frictions. The models created are shown in Fig. 1. In the simulations we use the already distributed Willow Garage model and the model of our laboratory, as shown in Fig. 3.
IV. 2D SIMULATION AND EXPERIMENTAL RESULTS

This section describes the simulation results obtained using ROS and Gazebo. The robot model and the real robot are equipped with a 2D laser rangefinder (Hokuyo UTM-30LX LIDAR), two web cameras (Logicool C615), sonars (16 for the P3-DX robot and 24 for the PeopleBot), an odometry system, and a laptop computer for controlling the robot. The sonars are used by the obstacle avoidance Node. Because there is no plugin for sonars in gazebo_plugins, we adapted the Laser plugin (ray, libgazebo_ros_laser.so) to make it behave approximately like a sonar sensor. Additionally, we designed an obstacle avoidance Node of the standard Braitenberg vehicle type. The first camera is used for environment monitoring, and the second one transmits the floor area directly in front of the robot, which increases reliability during remote control operations. The robot and the environment models are created in Gazebo and the simulation is performed under ROS control. As depicted in Fig. 3 and Fig. 4, during the simulation we use the RViz package for monitoring. By using the Camera plugin, the simulated environment is displayed in RViz. The goal position of the robot is set by pointing and clicking at it with the computer mouse.

The simulation results for map generation of an unknown environment and for robot navigation using the generated map are presented in the next subsections. For path generation we use the ROS Navigation Stack[25], which makes extensive use of costmaps[26] to store information about the obstacles situated in the robot working space. The Navigation Stack uses one costmap for global planning and another one for local planning and obstacle avoidance. The global planning is based on Dijkstra's algorithm[27], and in the local coordinates the path is additionally corrected using the Dynamic Window Approach (DWA)[28].

A. Unknown Environment Mapping Simulation

A simulation example of map generation of an unknown environment is shown in Fig. 3. To perform the map generation, the hector_mapping[29] package was used. hector_mapping implements a SLAM (Simultaneous Localization and Mapping) algorithm and provides robot pose estimates at the scan rate of the laser scanner (40 Hz for the UTM-30LX LIDAR). Generally, the package does not need odometry if the robot platform does not perform yaw motion. Because during the simulations our obstacle avoidance algorithm causes some yaw motions, we use the odometry data to properly estimate the robot pose.

One way to perform the map generation is to successively set goal positions until the whole working space has been covered by the robot. Another approach is to make the robot "explore" the environment until the complete map is constructed. The problem with the latter is that special care must be taken in the exploration algorithm so that the robot does not perform unnecessary turns and does not "re-explore" already covered areas; otherwise the map generation process may take a very long time. In our simulations we use the first approach, setting goals one after another as depicted in Fig. 3. Fig. 4 shows the completed map, which is further used in the navigation simulation.

B. Navigation Simulation

After performing the map generation with the hector_mapping package, we used the amcl[30] (Adaptive Monte Carlo Localization) package for navigation inside the generated map. To properly estimate the robot position inside the environment, this package uses the laser scan and odometry readings as well as a laser-based map. Most of the algorithms used by the amcl package are described in [31]. A simulation result in the known environment is shown in Fig. 4.

C. Experimental Results

Using the simulation control software, we performed experiments with real robots controlled under ROS. During the experiments there was no difference in the robots' behaviour compared to the simulations. However, due to differences in the interprocess communication and calculation speeds, some parameters of the Navigation Stack node had to be tuned.
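Goal poses that are set by clicking in RViz can equally be sent programmatically to the Navigation Stack's move_base action server. The sketch below is not taken from the software described in this paper; the frame name and the goal coordinates are illustrative.

#!/usr/bin/env python
# Send a single navigation goal to the Navigation Stack, equivalent to
# clicking a goal pose in RViz. Assumes the standard move_base action
# server and a TF frame called "map".
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y, yaw_w=1.0):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = yaw_w   # unit quaternion, w=1 means no rotation

    client.send_goal(goal)
    client.wait_for_result()                      # block until the planner reports success/failure
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('goal_sender')
    send_goal(2.0, 1.5)                           # coordinates are illustrative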
Fig. 4. Navigation using AMCL in RViz.
Fig. 5. 3D mapping using only the Octomap package.
Fig. 6. 3D mapping using Octomap and Hector mapping.
Fig. 7. 3D map generation using Gazebo in RViz (simulation).

V. 3D MAPPING SIMULATION AND EXPERIMENTAL RESULTS

To perform the 3D mapping we use the Octomap package[32] and a model of the Hokuyo YVT-X002 LIDAR. The Octomap package does not include a SLAM algorithm and relies on odometry measurements, which introduce a bias in the position estimate of the robot and, consequently, uncertainties in the map, as shown in Fig. 5.
In order to solve the above problem, we combined Octomap with Hector mapping. The result is the generation of a quite precise 3D map, as shown in Fig. 6 and Fig. 7.

Fig. 6 depicts an experimental result under ROS control and Fig. 7 shows the simulation result using Gazebo. From the figures it can be confirmed that the maps are precise enough. The map created during the experiment contains some noise, which can safely be neglected.
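The combination of Octomap and Hector mapping described above can be realized in several ways; one possibility, shown here only as an illustration (the paper does not list its glue code, and hector_mapping can also broadcast the required transform by itself), is to re-publish the SLAM pose estimate as the map-to-robot transform that octomap_server uses when inserting each point cloud.

#!/usr/bin/env python
# Use the SLAM pose from hector_mapping instead of raw odometry when
# registering point clouds: the pose published on /slam_out_pose
# (hector_mapping's default output) is re-broadcast as the
# map -> base_link transform. Frame and topic names are assumptions.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

def on_pose(pose, broadcaster):
    p, q = pose.pose.position, pose.pose.orientation
    broadcaster.sendTransform((p.x, p.y, p.z),
                              (q.x, q.y, q.z, q.w),
                              pose.header.stamp,
                              'base_link',           # child frame
                              pose.header.frame_id)  # SLAM map frame

if __name__ == '__main__':
    rospy.init_node('slam_pose_to_tf')
    br = tf.TransformBroadcaster()
    rospy.Subscriber('slam_out_pose', PoseStamped,
                     lambda msg: on_pose(msg, br))
    rospy.spin()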
VI. CONCLUSION

The purpose of this study is to develop a reliable environment for the simulation and control of mobile robots using the ROS and Gazebo software. It was shown that, after properly designing the models of the robot platforms and their working environments, the software used in the simulations can be directly used to control the real robots. Simulations and experimental results in 2D and 3D mapped environments prove the usability of the models. The chosen combination of ROS packages allowed sufficiently precise map generation in 3D space.

The final goal of this research is the design of a reliable guiding robot for the elderly for indoor and outdoor environments.

The paper describes in detail which packages were employed, and we hope that the results reported here will be useful at least for part of the robotics community.

REFERENCES

[1] Open Source Robotics Foundation, "ROS/Introduction," 2016, http://wiki.ros.org/ROS/Introduction.
[2] Gazebo, "Robot simulation made easy," 2016, http://gazebosim.org/.
[3] Webots, 2016, https://www.cyberbotics.com.
[4] Robot Virtual Worlds, 2016, http://www.robotvirtualworlds.com/.
[5] National Instruments, "What Can You Do With LabVIEW?," 2016, http://www.ni.com/labview/why/.
[6] S. Carpin, M. Lewis, J. Wang, S. Balakirsky, and C. Scrapper, "USARSim: a robot simulator for research and education," in 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 April 2007, pp. 1400–1405.
[7] Epic Games, "Unreal Engine 4," 2016, https://www.epicgames.com/.
[8] OpenHRP3 Official Site, "About OpenHRP3," 2016, http://fkanehiro.github.io/openhrp3-doc/en/about.html.
[9] Japan's National Institute of Advanced Industrial Science and Technology, "OpenRTM-aist," 2016, http://www.openrtm.org/openrtm/en/node/629.
[10] R. Diankov and J. Kuffner, "OpenRAVE: A planning architecture for autonomous robotics," Robotics Institute, Pittsburgh, PA, Tech. Rep. CMU-RI-TR-08-34, July 2008.
[11] P. Castillo-Pizarro, T. V. Arredondo, and M. Torres-Torriti, "Introductory Survey to Open-Source Mobile Robot Simulation Software," 2010 Latin American Robotics Symposium and Intelligent Robotics Meeting (LARS), pp. 150–155, 2010.
[12] D. Cook, A. Vardy, and R. Lewis, "A survey of AUV and robot simulators for multi-vehicle operations," 2014 IEEE/OES Autonomous Underwater Vehicles (AUV), Oxford, Mississippi, Oct. 2014, pp. 1–8.
[13] Defense Advanced Research Projects Agency (DARPA), "DARPA Robotics Challenge," 2016, http://theroboticschallenge.org/.
[14] C. E. Aguero, N. Koenig, I. Chen, H. Boyer, S. Peters, J. Hsu, B. Gerkey, S. Paepcke, J. L. Rivero, J. Manzo, E. Krotkov, and G. Pratt, "Inside the Virtual Robotics Challenge: Simulating Real-Time Robotic Disaster Response," IEEE Trans. Autom. Sci. Eng., Vol. 12, No. 2, pp. 494–506, April 2015.
[15] W. Qian, Z. Xia, J. Xiong, Y. Gan, Y. Guo, S. Weng, H. Deng, Y. Hu, and J. Zhang, "Manipulation Task Simulation using ROS and Gazebo," in 2014 IEEE Int. Conf. on Robotics and Biomimetics, Dec. 5-10, 2014, Bali, Indonesia, pp. 2594–2598.
[16] Q. Bu, F. Wan, Z. Xie, Q. Ren, J. Zhang, and S. Liu, "General Simulation Platform for Vision Based UAV Testing," in 2015 IEEE Int. Conf. Information and Automation, Lijiang, China, Aug. 2015, pp. 2512–2516.
[17] M. Zhang, H. Qin, M. Lan, J. Lin, S. Wang, K. Liu, F. Lin, and B. M. Chen, "A High Fidelity Simulator for a Quadrotor UAV using ROS and Gazebo," IECON 2015, Yokohama, Nov. 9-12, 2015, pp. 2846–2851.
[18] Player/Stage project, "The Player Project," 2016, http://playerstage.sourceforge.net/.
[19] Russell L. Smith, "Open Dynamics Engine," 2016, https://bitbucket.org/odedevs/ode/.
[20] Real-Time Physics Simulation, "BULLET Physics Library," 2016, http://bulletphysics.org/wordpress/.
[21] Simbody: Multibody Physics API, 2016, https://simtk.org/home/simbody/.
[22] Georgia Tech Graphics Lab and Humanoid Robotics Lab, "DART (Dynamic Animation and Robotics Toolkit)," 2016, http://dartsim.github.io/.
[23] URDF in Gazebo, "Tutorial: Using a URDF in Gazebo," 2016, http://gazebosim.org/tutorials/?tut=ros_urdf.
[24] Gazebo plugins in ROS, "Tutorial: Using Gazebo plugins with ROS," 2016, http://gazebosim.org/tutorials?tut=ros_gzplugins.
[25] ROS Navigation Stack, "navigation," 2016, http://wiki.ros.org/navigation.
[26] "costmap_2d Package," 2016, http://wiki.ros.org/costmap_2d.
[27] J. C. Latombe, Robot Motion Planning, Kluwer Academic Publishers, pp. 604–608, 1998.
[28] R. Siegwart and I. R. Nourbakhsh, Introduction to Autonomous Mobile Robots, The MIT Press, pp. 282–284, 2004.
[29] S. Kohlbrecher, "hector_mapping," 2016, http://wiki.ros.org/hector_mapping.
[30] B. P. Gerkey, "amcl," 2016, http://wiki.ros.org/amcl.
[31] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, 2006.
[32] "OctoMap - A probabilistic, flexible, and compact 3D mapping library for robotic systems," 2016, http://octomap.github.io/octomap/doc/md_README.html.