TBDVO

Built with PyTorch Lightning and Hydra configuration.

Description

Implementation of the approach from the FRUCT conference paper "Transformer-Based Deep Monocular Visual Odometry for Edge Devices".

Prepare data

Create a `data` folder.

Download the KITTI odometry dataset into `data/kitti_dataset`.

Download the pretrained FlowNet weights (`flownets_bn_EPE2.459.pth.tar`) into `data/checkpoints` from its source repository.
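The steps above amount to the following directory layout; a minimal sketch (the dataset and the weights file must still be downloaded manually):

```shell
# create the folders the training code expects (paths taken from the steps above)
mkdir -p data/kitti_dataset   # KITTI odometry dataset goes here
mkdir -p data/checkpoints     # flownets_bn_EPE2.459.pth.tar goes here
```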

How to run

Install dependencies

```shell
# clone project
git clone https://github.com/toshiks/TBDVO.git
cd TBDVO

# create conda environment
conda env create -f conda_env_gpu.yaml -n myenv
conda activate myenv
```

Train the model with the default configuration (original DeepVO):

```shell
# default
python run.py

# train on CPU
python run.py trainer.gpus=0

# train on GPU
python run.py trainer.gpus=1
```

Train the model with a chosen experiment configuration from `configs/experiment/`:

```shell
python run.py experiment=experiment_name
```

You can override any parameter from the command line like this:

```shell
python run.py trainer.max_epochs=20 datamodule.batch_size=64
```

Run benchmarks:

```shell
export PYTHONPATH=$PWD
python util_scripts/benchmarks.py
```
