In this project, I implemented SLAM (Simultaneous Localization and Mapping) for a 2-dimensional world. SLAM combines robot sensor measurements and movement over time to build a map of an environment from only the sensor and motion data gathered by a robot. It provides a way to track the robot's location in the world in real time and to identify the locations of landmarks such as buildings, trees, rocks, and other world features.
All code was written in Jupyter Notebooks and was created for Udacity's "Computer Vision" course. The notebooks are fully documented with visuals, answers, and descriptions.
Define the robot class for moving through and sensing its environment.
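As a sketch of what this step involves, the class below can move with noise and return noisy measurements to landmarks within sensor range. The class name, attributes, and method signatures here are assumptions for illustration, not Udacity's exact starter code.

```python
import random

class Robot:
    def __init__(self, world_size=100.0, measurement_range=30.0,
                 motion_noise=1.0, measurement_noise=1.0):
        self.world_size = world_size
        self.measurement_range = measurement_range
        self.motion_noise = motion_noise
        self.measurement_noise = measurement_noise
        self.x = world_size / 2.0  # start at the center of the world
        self.y = world_size / 2.0
        self.landmarks = []        # list of (x, y) landmark positions

    def rand(self):
        # Uniform noise in [-1, 1), scaled by a noise parameter at the call site
        return random.random() * 2.0 - 1.0

    def move(self, dx, dy):
        # Attempt a noisy move; fail (return False) if it would leave the world
        x = self.x + dx + self.rand() * self.motion_noise
        y = self.y + dy + self.rand() * self.motion_noise
        if x < 0.0 or x > self.world_size or y < 0.0 or y > self.world_size:
            return False
        self.x, self.y = x, y
        return True

    def sense(self):
        # Return noisy [index, dx, dy] measurements to landmarks in sensor range
        measurements = []
        for i, (lx, ly) in enumerate(self.landmarks):
            dx = lx - self.x + self.rand() * self.measurement_noise
            dy = ly - self.y + self.rand() * self.measurement_noise
            if abs(dx) <= self.measurement_range and abs(dy) <= self.measurement_range:
                measurements.append([i, dx, dy])
        return measurements
```

The key design point is that both motion and sensing are corrupted by bounded random noise, which is exactly the uncertainty SLAM later has to average out.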
Create a matrix and a vector (omega and xi, respectively) in order to implement SLAM.
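A sketch of this initialization step is below, written for a 1-D world for clarity (the 2-D case in the project interleaves x and y entries, doubling the dimensions); the function name and signature are assumptions. With N poses and `num_landmarks` landmarks, omega is a square constraint matrix and xi a constraint vector, with the initial pose anchored at the center of the world.

```python
import numpy as np

def initialize_constraints(N, num_landmarks, world_size):
    # One row/column per pose and per landmark (1-D world)
    dim = N + num_landmarks
    omega = np.zeros((dim, dim))
    xi = np.zeros((dim, 1))
    # Constrain the initial pose to the center of the world with strength 1
    omega[0, 0] = 1.0
    xi[0, 0] = world_size / 2.0
    return omega, xi

omega, xi = initialize_constraints(N=5, num_landmarks=2, world_size=100.0)
print(omega.shape, xi.shape)  # (7, 7) (7, 1)
```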
Bring everything together by creating test environments, updating omega and xi with motion and measurement data, and creating a 2D map of the environment.
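The update-and-solve loop can be sketched as follows, again in 1-D for brevity. Each motion adds a constraint between consecutive poses, each measurement adds a constraint between a pose and a landmark, and the best estimate of all positions is `mu = omega⁻¹ · xi`. The function name and the `(measurements, motion)` data layout are simplifying assumptions, not the notebook's exact code.

```python
import numpy as np

def slam_1d(data, N, num_landmarks, world_size, motion_noise, measurement_noise):
    dim = N + num_landmarks
    omega = np.zeros((dim, dim))
    xi = np.zeros((dim, 1))
    omega[0, 0] = 1.0            # anchor the initial pose
    xi[0, 0] = world_size / 2.0  # ... at the center of the world

    for t, (measurements, motion) in enumerate(data):
        # Measurement constraints between pose t and each sensed landmark
        for lm_index, dx in measurements:
            l = N + lm_index
            w = 1.0 / measurement_noise
            omega[t, t] += w; omega[l, l] += w
            omega[t, l] -= w; omega[l, t] -= w
            xi[t, 0] -= dx * w
            xi[l, 0] += dx * w
        # Motion constraint between pose t and pose t+1
        w = 1.0 / motion_noise
        omega[t, t] += w; omega[t + 1, t + 1] += w
        omega[t, t + 1] -= w; omega[t + 1, t] -= w
        xi[t, 0] -= motion * w
        xi[t + 1, 0] += motion * w

    # mu stacks the N pose estimates followed by the landmark estimates
    mu = np.linalg.inv(omega) @ xi
    return mu

# Robot starts at 50, sees a landmark 5 units ahead, moves 10, then stays put
data = [([[0, 5.0]], 10.0), ([[0, -5.0]], 0.0)]
mu = slam_1d(data, N=3, num_landmarks=1, world_size=100.0,
             motion_noise=1.0, measurement_noise=1.0)
```

Because the constraints in this toy example are noiseless and consistent, the solve recovers the true poses (50, 60, 60) and the landmark position (55) exactly.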