TriDi: Trilateral Diffusion of 3D Humans, Objects and Interactions

Ilya A. Petrov, Riccardo Marin, Julian Chibane, Gerard Pons-Moll

ICCV 2025

[Project teaser image]

Paper PDF | Project Page | YouTube video

Environment

The code was tested on Ubuntu 24.04 with Python 3.10, CUDA 13.0, and PyTorch 2.9.0. Use the following command to create a conda environment with the necessary dependencies:

conda env create -f environment.yml
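After creating and activating the environment, a quick sanity check (a minimal sketch, assuming a standard PyTorch installation) confirms that the expected versions are picked up:

import torch

# Report the installed PyTorch / CUDA versions and GPU visibility.
print("PyTorch:", torch.__version__)
print("CUDA:", torch.version.cuda)
print("GPU available:", torch.cuda.is_available())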

Data downloading and processing

The steps are described in docs/data.md.

Pre-trained models and evaluation

Pre-trained models can be downloaded with the following commands (make sure the ./assets directory exists first):

wget https://nc.mlcloud.uni-tuebingen.de/public.php/dav/files/bmsRACRqzCQ4QPq/gb_main.pth -O ./assets/gb_main.pth
wget https://nc.mlcloud.uni-tuebingen.de/public.php/dav/files/bmsRACRqzCQ4QPq/gb_contacts.pth -O ./assets/gb_contacts.pth
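To verify the downloads, the checkpoints can be loaded with plain PyTorch. This is a minimal sketch that assumes the files are standard torch checkpoints; the structure printed here is not confirmed by the repository:

import torch

# Load on CPU so the check does not require a GPU; weights_only=False
# allows checkpoints that pickle extra Python objects (trust the source).
for path in ["./assets/gb_main.pth", "./assets/gb_contacts.pth"]:
    ckpt = torch.load(path, map_location="cpu", weights_only=False)
    keys = list(ckpt.keys())[:5] if isinstance(ckpt, dict) else type(ckpt)
    print(path, "->", keys)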

The command below runs sampling. The parameter sample.mode controls the choice of modalities: the three digits correspond to human, object, and interaction, respectively; 1 means the modality is sampled, 0 means it is conditioned on. For example, sample.mode="sample_101" samples the human and interaction conditioned on the object (see the sketch after the command).

python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
  run.job=sample run.name=001_gb_main sample.target=hdf5 \
  resume.checkpoint="./assets/gb_main.pth" \
  dataloader.batch_size=1024 sample.mode="sample_101" \
  run.datasets=["grab","behave"] sample.dataset=normal sample.repetitions=3 \
  model.cg_apply=True model.cg_scale=2.0
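For reference, the three-digit convention can be decoded as in the following sketch. This is a hypothetical helper that mirrors the description above, not code from the repository:

# Hypothetical helper mirroring the documented sample.mode convention;
# it is not part of the TriDi codebase.
MODALITIES = ("human", "object", "interaction")

def decode_mode(mode: str):
    """Split e.g. "sample_101" into sampled vs. conditioned modalities."""
    bits = mode.split("_")[-1]
    assert len(bits) == len(MODALITIES) and set(bits) <= {"0", "1"}
    sampled = [m for m, b in zip(MODALITIES, bits) if b == "1"]
    conditioned = [m for m, b in zip(MODALITIES, bits) if b == "0"]
    return sampled, conditioned

print(decode_mode("sample_101"))  # (['human', 'interaction'], ['object'])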

Use the command below to run evaluation on the generated samples. The eval.sampling_target parameter controls which modalities are evaluated (possible values: sbj_contact, obj_contact):

python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
  run.job=eval run.name=001_gb_main resume.step=-1 \
  eval.sampling_target=['sbj_contact'] 

Training

Use the following command to run the training:

python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
  run.name=001_gb_main run.job=train

Citation

@inproceedings{petrov2025tridi,
   title={TriDi: Trilateral Diffusion of 3D Humans, Objects and Interactions},
   author={Petrov, Ilya A and Marin, Riccardo and Chibane, Julian and Pons-Moll, Gerard},
   booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
   year={2025}
}

Acknowledgements

This project benefited from the following resources:
