This is an "Industrial AMR" that combines different frameworks (Nav2, RTAB-Map, etc.) and fuses various sensors (IMU, Stereo Camera, 2D-LiDAR, etc.) to perform SLAM, Localization, and Navigation in challenging environments.

Multi Sensor SLAM

A framework to work with Industrial AMRs and perform SLAM using various sensors in challenging environments

In this repository, I have combined different frameworks to create a functional AMR that performs Robust SLAM in smart factories and similar environments:

For my AMR development, I’ve selected the following Open-Source Repositories:

  1. Linorobot2 as the base repository for creating my AMR,
  2. RTAB-Map as the main framework for sensor fusion and SLAM,
  3. Kinematic-ICP for future work on enhancing ICP-based odometry.

Additionally, there are two more packages I created:

  1. amr_main as the base package for running everything together,
  2. laser_fusion which contains the necessary 2D-LiDAR fusion node (to fuse measurements from different 2D LiDARs).
  • I’m still working on enhancing this project. It works well, but still needs some improvements to incorporate additional features (e.g., using Kinematic-ICP alongside RTAB-Map).
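Since amr_main is the package that ties everything together, a top-level ROS 2 launch file of that kind typically chains the sub-launches of the other packages. The sketch below shows the general pattern with the standard ROS 2 launch API; the argument values and the comment placeholders are assumptions for illustration, not the repo's actual launch_all.launch.py contents.

```python
# Hypothetical sketch of a top-level launch file in the style of
# amr_main/launch/launch_all.launch.py. The launch arguments shown
# here are examples, not the repository's actual configuration.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    rtabmap_launch = os.path.join(
        get_package_share_directory('rtabmap_launch'),
        'launch', 'rtabmap.launch.py')

    return LaunchDescription([
        # Bring up RTAB-Map with 2D-LiDAR subscription enabled
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(rtabmap_launch),
            launch_arguments={'subscribe_scan': 'true'}.items()),
        # ...further includes for the robot description, sensor
        # drivers, and the laser_fusion node would go here
    ])
```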

Main Files (To adapt the work to your needs)

  1. RTAB-Map parameter file: In this file, you can change the RTAB-Map parameters. You can also select which 2D LiDAR to use (front, back, or all) and which Stereo Camera (front, back, or both; "both" is not available yet).

    code ~/Multi_Sensor_SLAM/src/rtabmap_ros/rtabmap_launch/launch/rtabmap.launch.py
    
  2. Change the robot's and sensors' parameters, or add/remove sensors:

    code ~/Multi_Sensor_SLAM/src/linorobot2/linorobot2_description/urdf/4wd_properties.urdf.xacro
    code ~/Multi_Sensor_SLAM/src/linorobot2/linorobot2_description/urdf/robots/4wd.urdf.xacro
    code ~/Multi_Sensor_SLAM/src/linorobot2/linorobot2_description/urdf/sensors/laser_new.urdf.xacro
    code ~/Multi_Sensor_SLAM/src/linorobot2/linorobot2_description/urdf/sensors/stereo_camera.urdf.xacro
    code ~/Multi_Sensor_SLAM/src/linorobot2/linorobot2_description/urdf/sensors/imu.urdf.xacro
    
  3. 2D-LiDAR fusion algorithm:

    code ~/Multi_Sensor_SLAM/src/laser_fusion/laser_fusion/combine_laser_measurements.py
    
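For orientation, the kind of values set in rtabmap.launch.py can be sketched as a plain-Python parameter dictionary. The parameter names below are standard RTAB-Map ROS parameters, but the values are illustrative examples, not this repository's defaults.

```python
# Illustrative RTAB-Map parameter choices (example values only,
# not the repository's actual configuration).
rtabmap_params = {
    'frame_id': 'base_link',    # robot base frame for the map->odom chain
    'subscribe_scan': True,     # fuse the 2D-LiDAR scan
    'subscribe_stereo': True,   # fuse the stereo camera
    # RTAB-Map's internal parameters are passed as strings:
    'Reg/Strategy': '1',        # 0=visual, 1=ICP, 2=visual+ICP registration
    'RGBD/NeighborLinkRefining': 'true',  # refine odometry links with the scan
    'Grid/Sensor': '0',         # build the 2D occupancy grid from the LiDAR
}
```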

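The fusion node in laser_fusion operates on sensor_msgs/LaserScan messages. Its core idea (resampling scans onto a common angular grid and keeping one return per bin) can be sketched in plain Python. The function name and the "keep the nearest valid return" policy are assumptions for illustration, not necessarily the repo's exact algorithm; a real node would also TF-transform the second scan into the first scan's frame before merging.

```python
# Minimal sketch of 2D-LiDAR scan fusion: resample two scans that are
# already expressed in a common frame onto one angular grid, keeping
# the nearest valid return in each bin. (Policy assumed for illustration.)
import math


def merge_scans(angles_a, ranges_a, angles_b, ranges_b,
                angle_min=-math.pi, angle_max=math.pi, num_bins=360):
    increment = (angle_max - angle_min) / num_bins
    merged = [float('inf')] * num_bins
    for angles, ranges in ((angles_a, ranges_a), (angles_b, ranges_b)):
        for angle, r in zip(angles, ranges):
            if not math.isfinite(r):
                continue  # skip invalid (inf/NaN) returns
            idx = int((angle - angle_min) / increment)
            if 0 <= idx < num_bins:
                merged[idx] = min(merged[idx], r)  # nearest return wins
    return merged


# Two overlapping returns at angle 0.0: the nearer one (1.5 m) is kept
scan = merge_scans([0.0], [2.0], [0.0], [1.5], num_bins=8)
```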
Demo

(Demo GIF: the AMR mapping with RTAB-Map)

  • This demo uses only the front LiDAR (not both), together with the front stereo camera. It runs VIO and ICP odometry, plus visual and laser loop closures, to build a 2D map, while RTAB-Map performs graph-based SLAM.

Installation and Usage

  • This project requires ROS 2 (Recommended: ROS 2 Humble):
  1. Clone the Repository:
    git clone https://github.com/ali-pahlevani/Multi_Sensor_SLAM.git
    cd Multi_Sensor_SLAM
    
  2. Install the required Dependencies:
    rosdep update && rosdep install --from-paths src --ignore-src -y
    
  3. Build the Workspace:
    colcon build
    source install/setup.bash
    
  4. Launch the Simulation:
    ros2 launch amr_main launch_all.launch.py
    

If you have any questions about any part, please feel free to ask!
