
[ICCV 2025] LongSplat: Robust Unposed 3D Gaussian Splatting for Casual Long Videos

Chin-Yang Lin · Cheng Sun · Fu-En Yang
Min-Hung Chen · Yen-Yu Lin · Yu-Lun Liu


Installation

  1. Clone LongSplat.
git clone --recursive https://github.com/NVlabs/LongSplat.git
cd LongSplat
  2. Create the environment.
conda create -n longsplat python=3.10.13 cmake=3.14.0 -y
conda activate longsplat
conda install pytorch torchvision pytorch-cuda=12.1 -c pytorch -c nvidia  # use the correct version of cuda for your system
pip install -r requirements.txt
pip install submodules/simple-knn
pip install submodules/diff-gaussian-rasterization
pip install submodules/fused-ssim
  3. (Optional but highly recommended) Compile the CUDA kernels for RoPE (as in CroCo v2).
# DUST3R relies on RoPE positional embeddings for which you can compile some cuda kernels for faster runtime.
cd submodules/mast3r/dust3r/croco/models/curope/
python setup.py build_ext --inplace
cd ../../../../../../
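
A quick sanity check of the environment (a minimal sketch; it only verifies that PyTorch can see a CUDA device, it does not test the compiled submodules):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"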

Dataset Preparation

DATAROOT is ./data by default. Please first create the data folder with mkdir data.
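
For reference, the layout of ./data expected after completing the steps below (a minimal sketch; folder names follow the dataset instructions in this section):

mkdir data
# data/
# ├── free/    # Free dataset
# ├── hike/    # Hike dataset
# └── tanks/   # Tanks and Temples dataset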

Free Dataset

Download our preprocessed Free dataset from Dropbox, and save it into the ./data/free folder.

Hike Dataset

Download our preprocessed Hike dataset from Google Drive, and save it into the ./data/hike folder.

Tanks and Temples

Download the data preprocessed by Nope-NeRF using the command below, and save it into the ./data/tanks folder.

wget https://www.robots.ox.ac.uk/~wenjing/Tanks.zip
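
After downloading, extract the archive into ./data/tanks (a minimal sketch; adjust if the unzipped folder structure differs from what the training scripts expect):

mkdir -p data/tanks
unzip Tanks.zip -d data/tanks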

Run

Training and Evaluation

The training scripts cover training, rendering, and evaluation.

Each .sh script runs three main Python scripts in sequence:

  • train.py: Trains the LongSplat model
  • render.py: Renders the trained model to generate novel views
  • metrics.py: Evaluates the rendering quality and computes metrics
# For Free dataset
bash scripts/train_free.sh

# For Hike dataset
bash scripts/train_hike.sh

# For Tanks and Temples dataset
bash scripts/train_tnt.sh

Run on your own video

  • To run LongSplat on your own video, first convert the video to frames and save them to ./data/$CUSTOM_DATA/images/ (see the ffmpeg sketch after the command below).

  • Before running the script, you need to modify the scene= parameter in scripts/train_custom.sh to point to your custom data directory. For example, change scene='./data/IMG_4190' to scene='./data/YOUR_CUSTOM_DATA'.

  • Run the following command:

bash scripts/train_custom.sh
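
For the frame-extraction step above, ffmpeg is one option. A minimal sketch, assuming ffmpeg is installed; my_video.mp4, the quality setting, and the frame naming pattern are placeholders rather than requirements of LongSplat:

mkdir -p data/YOUR_CUSTOM_DATA/images
ffmpeg -i my_video.mp4 -qscale:v 2 data/YOUR_CUSTOM_DATA/images/%05d.jpg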

Acknowledgement

Our renderer is built upon 3DGS. The data processing and visualization code is partially borrowed from Scaffold-GS. We thank all the authors for their great repos.

Citation

If you find this code helpful, please cite:

@inproceedings{lin2025longsplat,
  title={LongSplat: Robust Unposed 3D Gaussian Splatting for Casual Long Videos},
  author={Chin-Yang Lin and Cheng Sun and Fu-En Yang and Min-Hung Chen and Yen-Yu Lin and Yu-Lun Liu},
  booktitle={ICCV},
  year={2025}
}
