A Gaussian Process Trajectory Representation with Closed-Form Kinematics for Continuous-Time Motion Estimation
To build with ROS1 (Noetic):

- Install Ubuntu 20.04 and ROS NOETIC.
- Check out the master branch.
- catkin build SFUISE in your workspace to obtain the `cf_msg` package, which is required by gptr.
- Install Ceres 2.0 and Sophus:

```bash
sudo apt install libfmt-dev    # may be required as a dependency of Sophus
sudo apt install ros-noetic-sophus
sudo apt install libceres-dev
```

- Git clone and catkin build the repo (a sketch is given below).
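A minimal sketch of the clone-and-build step, assuming a catkin workspace at `~/catkin_ws`; the repository URL is a placeholder to be replaced with this repo's actual address:

```bash
# Assumed workspace layout; substitute the actual URL of this repository.
cd ~/catkin_ws/src
git clone https://github.com/<user>/gptr.git
cd ~/catkin_ws
catkin build gptr
source devel/setup.bash
```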
To build with ROS2 (Humble):

- Install Ubuntu 22.04 and ROS HUMBLE.
- Check out the ros2 branch.
- colcon build SFUISE2 in your workspace to obtain the `cf_msg` package.
- Install Sophus and Ceres 2.0:

```bash
sudo apt install libfmt-dev    # may be required as a dependency of Sophus
sudo apt install ros-humble-sophus
sudo apt install libceres-dev
```

- Git clone and colcon build the repo (a sketch is given below).
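A minimal sketch of the ROS2 clone-and-build step, assuming a colcon workspace at `~/ros2_ws` and again using a placeholder URL:

```bash
# Assumed workspace layout; substitute the actual URL of this repository.
cd ~/ros2_ws/src
git clone -b ros2 https://github.com/<user>/gptr.git
cd ~/ros2_ws
colcon build --packages-select gptr
source install/setup.bash
```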
Ceres 2.2 replaces the LocalParameterization class with Manifold. We have also run some tests with Ceres 2.2 on Ubuntu 22.04 and ROS HUMBLE:

- Install Ceres from source.
- Install Sophus from source and check out the tested tag with `git checkout 1.24.6`. If CMake complains about its minimum version, you can manually change it in the CMakeLists.txt file, for example `cmake_minimum_required(VERSION 3.22)` instead of 3.24.
- Make sure `libfmt-dev` is installed.
- Check out the ceres.2.2 branch of this repo and colcon build it (a sketch of the Sophus step follows this list).
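A minimal sketch of the Sophus-from-source step, assuming a standard out-of-source CMake build and a system-wide install:

```bash
sudo apt install libfmt-dev            # fmt is required by Sophus
git clone https://github.com/strasdat/Sophus.git
cd Sophus
git checkout 1.24.6                    # the tag tested with the ceres.2.2 branch
# If CMake complains about its minimum version, lower it in CMakeLists.txt,
# e.g. cmake_minimum_required(VERSION 3.22) instead of 3.24.
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j"$(nproc)"
sudo make install
```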
Please raise an issue should you encounter any problem compiling the package.
You can download and unzip the file cloud_avia_mid_dynamic_extrinsics from here. It contains the pointclouds and the prior map for the experiment.
After that, modify the paths to the data and the prior map in run_sim.launch and launch it. You should see the following visualization in rviz.
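For example, assuming the ROS1 setup above and the package name gptr (as in the other launch examples), the launch step may look like:

```bash
# Edit the data and prior-map paths in run_sim.launch before launching.
roslaunch gptr run_sim.launch
```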
Similar to the synthetic dataset, please download the data and the prior map from here.
Then specify the paths to the data and the prior map in gptr/launch/run_lio_cathhs_iot.launch before roslaunch. You should see the following illustration.
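For example, again assuming the package name gptr:

```bash
roslaunch gptr run_lio_cathhs_iot.launch
```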
Please use the scripts analysis_cathhs.ipynb and analysis_sim.ipynb to evaluate the results.
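For instance, assuming Jupyter is installed, the notebooks can be opened with:

```bash
jupyter notebook analysis_sim.ipynb    # or analysis_cathhs.ipynb
```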
Please download the UTIL (TDoA-inertial) dataset.
Change `bag_file` and `anchor_path` in gptr/launch/run_util.launch according to your own paths.
For ROS1 users, please run

```bash
roslaunch gptr run_util.launch
```
For ROS2 users, please first convert the UTIL dataset to a ROS2 bag using ros2bag_convert_util.sh from SFUISE2 and run

```bash
ros2 launch gptr run_util.launch.py
```
Below is an exemplary run on the sequence const2-trial4-tdoa2.
Please set `if_save_traj` in gptr/launch/run_util.launch to 1 and change `result_save_path` accordingly. The saved trajectory can then be evaluated against the ground truth with evo:

```bash
evo_ape tum /traj_save_path/gt.txt /traj_save_path/traj.txt -a --plot
```
For comparison, a baseline approach based on ESKF is reported in the paper of the UTIL dataset.
Run the following command from the terminal:

```bash
roslaunch gptr run_vicalib.launch
```
This dataset is converted from the original one available here.
The heart of our toolkit is the GaussianProcess.hpp header file, which contains the abstraction of a continuous-time trajectory represented by a third-order Gaussian Process.
The GaussianProcess object provides methods to create, initialize, extend, and query information from the trajectory.
The toolkit contains three main examples:
- Visual-Inertial Calibration: a batch optimization problem where visual-inertial factors are combined to estimate the trajectory and the extrinsics of a camera-IMU pair, encapsulated in the `GPVICalib.cpp` file.
- UWB-Inertial Localization: a sliding-window Maximum A Posteriori (MAP) optimization problem featuring TDOA UWB measurements and IMU, presented in the `GPUI.cpp` file.
- Multi-lidar Coupled-Motion Estimation: a sliding-window MAP optimization problem with lidar-only observations, featuring multiple trajectories with extrinsic factors providing the connection between these trajectories, implemented in the `GPLO.cpp` file.
For the theoretical foundation, please find our paper on arXiv (arXiv:2410.22931).
If you use the source code of our work, please cite us as follows:
```bibtex
@article{nguyen2024gptr,
  title   = {GPTR: Gaussian Process Trajectory Representation for Continuous-Time Motion Estimation},
  author  = {Nguyen, Thien-Minh and Cao, Ziyu and Li, Kailai and Yuan, Shenghai and Xie, Lihua},
  journal = {arXiv preprint arXiv:2410.22931},
  year    = {2024}
}
```