ashayp22/SLAM
SLAM - Landmark Detection & Tracking

In this project, I implemented SLAM (Simultaneous Localization and Mapping) for a 2-dimensional world. SLAM is an algorithm that combines a robot's sensor measurements and movements, gathered over time, to build a map of its environment. SLAM provides a way to track the robot's location in the world in real time and to identify the locations of landmarks such as buildings, trees, rocks, and other world features.

All code was written in Jupyter Notebooks and was created for Udacity's "Computer Vision" course. The notebooks are fully documented with visuals, answers, and descriptions.

(1) Robot Moving and Sensing

Define the robot class for moving and sensing its environment.
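A minimal sketch of such a robot class is below. The exact attribute and parameter names (`world_size`, `measurement_range`, `motion_noise`, `measurement_noise`) are assumptions based on the description above, not the project's actual code; `sense()` returns noisy `[landmark_index, dx, dy]` measurements for landmarks within sensor range, and `move()` fails if the robot would leave the world.

```python
import random

class Robot:
    """Sketch of a 2D robot that moves and senses landmarks (names assumed)."""

    def __init__(self, world_size=100.0, measurement_range=30.0,
                 motion_noise=1.0, measurement_noise=1.0):
        self.world_size = world_size
        self.measurement_range = measurement_range
        self.motion_noise = motion_noise
        self.measurement_noise = measurement_noise
        self.x = world_size / 2.0   # start in the middle of the world
        self.y = world_size / 2.0
        self.landmarks = []         # list of (x, y) landmark positions

    def rand(self):
        # Uniform noise in [-1, 1); scaled by a noise parameter at the call site
        return random.random() * 2.0 - 1.0

    def move(self, dx, dy):
        """Attempt a noisy move; return False if it would leave the world."""
        x = self.x + dx + self.rand() * self.motion_noise
        y = self.y + dy + self.rand() * self.motion_noise
        if x < 0.0 or x > self.world_size or y < 0.0 or y > self.world_size:
            return False
        self.x, self.y = x, y
        return True

    def sense(self):
        """Return noisy [index, dx, dy] for each landmark within sensor range."""
        measurements = []
        for i, (lx, ly) in enumerate(self.landmarks):
            dx = lx - self.x + self.rand() * self.measurement_noise
            dy = ly - self.y + self.rand() * self.measurement_noise
            if abs(dx) <= self.measurement_range and abs(dy) <= self.measurement_range:
                measurements.append([i, dx, dy])
        return measurements
```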

(2) Omega and Xi

Create a matrix and a vector (omega and xi, respectively) in order to implement SLAM.
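To illustrate the idea, here is a 1-D sketch of how Graph SLAM accumulates constraints in omega and xi. Each motion or measurement relates two variables (`x_j - x_i = d`) and is added into the matrix and vector; the details (variable ordering, the `add_constraint` helper) are illustrative, not the notebook's actual code.

```python
import numpy as np

# Two pose variables x0, x1 and one landmark L, in 1-D for clarity
n = 3                       # variable order: [x0, x1, L]
omega = np.zeros((n, n))
xi = np.zeros(n)

# Initial position constraint x0 = 50 anchors the whole map
omega[0, 0] += 1.0
xi[0] += 50.0

def add_constraint(i, j, d, strength=1.0):
    """Encode the relative constraint x_j - x_i = d into omega and xi."""
    omega[i, i] += strength
    omega[j, j] += strength
    omega[i, j] -= strength
    omega[j, i] -= strength
    xi[i] -= strength * d
    xi[j] += strength * d

add_constraint(0, 1, 10.0)   # motion: x1 = x0 + 10
add_constraint(0, 2, 5.0)    # measurement from x0: L is 5 ahead
add_constraint(1, 2, -5.0)   # measurement from x1: L is 5 behind

# Best estimate of all variables at once
mu = np.linalg.inv(omega) @ xi
```

Solving `mu = omega^{-1} xi` yields the best estimate of every pose and landmark simultaneously; here the constraints are consistent, so `mu` recovers `[50, 60, 55]` exactly.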

(3) Landmark Detection and Tracking

Bring everything together by creating test environments, updating omega and xi with motion and measurement data, and building a 2D map of the environment.
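The overall pipeline can be sketched as a `slam()` function that fills omega and xi from the recorded data and solves for all poses and landmarks at once. This is a hypothetical sketch, not the notebook's exact implementation: it assumes `data[t] = (measurements, motion)` with measurements as `[landmark_index, dx, dy]`, and interleaves x/y coordinates in one linear system.

```python
import numpy as np

def slam(data, N, num_landmarks, world_size, motion_noise, measurement_noise):
    """Solve Graph SLAM: build omega/xi from data, then mu = omega^{-1} xi.
    Variable layout (assumed): [x0, y0, x1, y1, ..., Lx0, Ly0, ...]."""
    dim = 2 * (N + num_landmarks)
    omega = np.zeros((dim, dim))
    xi = np.zeros(dim)

    # Anchor the initial pose at the middle of the world
    omega[0, 0] = omega[1, 1] = 1.0
    xi[0] = xi[1] = world_size / 2.0

    def constrain(i, j, d, s):
        """Add constraint var_j - var_i = d with strength s (1/noise)."""
        omega[i, i] += s
        omega[j, j] += s
        omega[i, j] -= s
        omega[j, i] -= s
        xi[i] -= s * d
        xi[j] += s * d

    for t, (measurements, motion) in enumerate(data):
        pose = 2 * t
        # Measurement constraints between pose t and each observed landmark
        for lidx, dx, dy in measurements:
            lm = 2 * (N + lidx)
            constrain(pose, lm, dx, 1.0 / measurement_noise)
            constrain(pose + 1, lm + 1, dy, 1.0 / measurement_noise)
        # Motion constraint between pose t and pose t+1
        nxt = pose + 2
        constrain(pose, nxt, motion[0], 1.0 / motion_noise)
        constrain(pose + 1, nxt + 1, motion[1], 1.0 / motion_noise)

    mu = np.linalg.inv(omega) @ xi
    poses = mu[:2 * N].reshape(N, 2)
    landmarks = mu[2 * N:].reshape(num_landmarks, 2)
    return poses, landmarks
```

With noise-free, consistent data the solve is exact; with noisy data it returns the least-squares best estimate of the robot path and landmark map.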
