[NeurIPS 2025] PhysCtrl: Generative Physics for Controllable and Physics-Grounded Video Generation

arXiv Project Page

📦 Installation

python3.10 -m venv physctrl
source physctrl/bin/activate
# CAUTION: replace cu118 with your CUDA version (e.g. cu121)
pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu118 xformers
pip install torch-cluster -f https://data.pyg.org/whl/torch-2.5.1+cu118.html
pip install -r requirements.txt

🤖 Pretrained Models

Download checkpoints:

bash download_ckpts.sh

📂 Dataset

We are currently unable to release the full dataset because of its size. However, since our dataset is built on the open-source TRELLIS-500K, it is straightforward to recreate. Below we provide the scripts for generating data for the elastic, plasticine, and sand materials.

  1. Download the Objaverse Sketchfab dataset

    cd src/data_generation
    python3 dataset_toolkits/build_metadata.py ObjaverseXL --source sketchfab --output_dir data/objaverse
    python3 dataset_toolkits/download.py ObjaverseXL --output_dir data/objaverse
  2. Generate HDF5 (.h5) data with the MPM simulator for different materials

    # Use "--uid_list configs/objaverse_valid_uid_list.json" to include the full dataset
    python3 generate_mpm_data.py --material elastic --start_idx 0 --end_idx 1 --visualization
    python3 generate_mpm_data.py --material plasticine --start_idx 0 --end_idx 1 --visualization
    python3 generate_mpm_data.py --material sand --start_idx 0 --end_idx 1 --visualization

    You can view the simulated trajectories in src/data_generation/data/objaverse/visualization
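To give a sense of what a simulated trajectory looks like, here is a toy sketch of per-frame particle positions. This is purely illustrative: the real generate_mpm_data.py runs a full MPM simulation, and the (frames, particles, 3) array layout here is an assumption of this sketch, not the repo's on-disk format.

```python
import numpy as np

# Toy stand-in for a simulated particle trajectory: free-fall point
# particles integrated with explicit Euler, stored frame by frame.
def toy_trajectory(num_particles=64, num_frames=24, dt=1.0 / 24.0, g=9.8, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 1.0, size=(num_particles, 3))  # initial positions
    vel = np.zeros_like(pos)                               # start at rest
    frames = []
    for _ in range(num_frames):
        vel[:, 1] -= g * dt          # gravity acts along -y
        pos = pos + vel * dt         # explicit Euler position update
        frames.append(pos.copy())
    return np.stack(frames)          # shape: (num_frames, num_particles, 3)

traj = toy_trajectory()
print(traj.shape)  # (24, 64, 3)
```

Each frame of the real data is likewise a snapshot of all particle positions, so a trajectory is naturally a frames-by-particles-by-3 tensor.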

🎥 Image to Video Generation

We provide several examples in the examples folder. You can add your own examples there using the same format.

cd src
python3 inference.py --data_name "penguin"

🏋️‍♂️ Training and Evaluation

Inference Trajectory Generation

python3 eval.py --config configs/eval_base.yaml

Train Trajectory Generation

For the base model (supports elastic objects with different force directions; fast inference; works for most cases):

accelerate launch --config_file configs/acc/8gpu.yaml train.py --config configs/config_dit_base.yaml

For the large model (supports elastic, plasticine, sand, and rigid objects; the latter three support only gravity as the driving force):

accelerate launch --config_file configs/acc/8gpu.yaml train.py --config configs/config_dit_large.yaml

Evaluate Trajectory Generation

python3 volume_iou.py --split_lst EVAL_DATASET_PATH --pred_path PRED_RESULTS_PATH
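As a rough illustration of the metric behind volume_iou.py, here is a minimal volumetric IoU between two point clouds, voxelized onto a shared grid over the unit cube. The grid resolution and voxelization scheme are assumptions of this sketch, not the repo's implementation.

```python
import numpy as np

# Voxelize points in [0, 1]^3 onto a boolean occupancy grid.
def voxelize(points, res=32):
    idx = np.clip((points * res).astype(int), 0, res - 1)
    grid = np.zeros((res, res, res), dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid

# Volume IoU = |pred ∩ gt| / |pred ∪ gt| over occupied voxels.
def volume_iou(pred_pts, gt_pts, res=32):
    p, g = voxelize(pred_pts, res), voxelize(gt_pts, res)
    inter = np.logical_and(p, g).sum()
    union = np.logical_or(p, g).sum()
    return inter / union if union > 0 else 1.0

a = np.random.default_rng(0).uniform(0.1, 0.4, size=(500, 3))
b = a + 0.5  # shift the cloud into a disjoint region of the cube
print(volume_iou(a, a), volume_iou(a, b))  # identical -> 1.0, disjoint -> 0.0
```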

Estimating Physical Parameters

python3 -m utils.physparam --config configs/eval_base.yaml
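To illustrate the idea of estimating a physical parameter from an observed trajectory, here is a toy least-squares fit that recovers gravitational acceleration from noise-free free-fall positions. This is a sketch of the general technique only; utils.physparam estimates the model's material parameters and works differently.

```python
import numpy as np

# Observed free-fall trajectory: y(t) = y0 - 0.5 * g * t^2
g_true, y0 = 9.8, 1.0
t = np.linspace(0.0, 0.5, 25)
y = y0 - 0.5 * g_true * t**2

# The model is linear in [y0, g], so a closed-form least-squares
# solve recovers both parameters from the trajectory.
A = np.stack([np.ones_like(t), -0.5 * t**2], axis=1)
(y0_est, g_est), *_ = np.linalg.lstsq(A, y, rcond=None)
print(round(g_est, 3))  # ≈ 9.8
```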

📜 Citation

If you find this work helpful, please consider citing our paper:

@article{wang2024physctrl,
    title   = {PhysCtrl: Generative Physics for Controllable and Physics-Grounded Video Generation},
    author  = {Wang, Chen and Chen, Chuhao and Huang, Yiming and Dou, Zhiyang and Liu, Yuan and Gu, Jiatao and Liu, Lingjie},
    journal = {NeurIPS},
    year    = {2025}
}
