# TriDi: Trilateral Diffusion of 3D Humans, Objects and Interactions

Ilya A. Petrov, Riccardo Marin, Julian Chibane, Gerard Pons-Moll
## Environment

The code was tested under Ubuntu 24.04, Python 3.10, CUDA 13.0, and PyTorch 2.9.0.
Use the following command to create a conda environment with the necessary dependencies:

```bash
conda env create -f environment.yml
```

## Data

The data preparation steps are described in docs/data.md.
## Models

The pre-trained models can be downloaded with the following commands:

```bash
wget https://nc.mlcloud.uni-tuebingen.de/public.php/dav/files/bmsRACRqzCQ4QPq/gb_main.pth -O ./assets/gb_main.pth
wget https://nc.mlcloud.uni-tuebingen.de/public.php/dav/files/bmsRACRqzCQ4QPq/gb_contacts.pth -O ./assets/gb_contacts.pth
```

## Sampling

The command below is used to run sampling. The parameter `sample.mode` controls the choice of modalities, i.e.:
- the three numbers correspond to human, object, and interaction, respectively;
- 1 means the modality is sampled, 0 means it is conditioned on.
For example, `sample.mode="sample_101"` means sampling the human and the interaction conditioned on the object.
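The convention above can be sketched in a few lines of Python. The helper below is hypothetical and not part of the repository; it only illustrates how a mode string maps to sampled vs. conditioned modalities:

```python
# Hypothetical helper illustrating the sample.mode convention;
# not part of the TriDi codebase.
MODALITIES = ("human", "object", "interaction")

def decode_mode(mode: str):
    """Split a mode string like 'sample_101' into (sampled, conditioned) modalities."""
    digits = mode.removeprefix("sample_")
    assert len(digits) == len(MODALITIES) and set(digits) <= {"0", "1"}
    sampled = [m for m, d in zip(MODALITIES, digits) if d == "1"]
    conditioned = [m for m, d in zip(MODALITIES, digits) if d == "0"]
    return sampled, conditioned

print(decode_mode("sample_101"))
# (['human', 'interaction'], ['object'])
```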
```bash
python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
    run.job=sample run.name=001_gb_main sample.target=hdf5 \
    resume.checkpoint="./assets/gb_main.pth" \
    dataloader.batch_size=1024 sample.mode="sample_101" \
    run.datasets=["grab","behave"] sample.dataset=normal sample.repetitions=3 \
    model.cg_apply=True model.cg_scale=2.0
```

## Evaluation

Use the command below to run evaluation on the generated samples. The `eval.sampling_target` parameter controls which modalities are evaluated (possible values: `sbj_contact`, `obj_contact`):
```bash
python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
    run.job=eval run.name=001_gb_main resume.step=-1 \
    eval.sampling_target=['sbj_contact']
```

## Training

Use the following command to run the training:
```bash
python main.py -c config/env.yaml scenarios/gb_main.yaml -- \
    run.name=001_gb_main run.job=train
```

## Citation

```bibtex
@inproceedings{petrov2025tridi,
  title={TriDi: Trilateral Diffusion of 3D Humans, Objects and Interactions},
  author={Petrov, Ilya A and Marin, Riccardo and Chibane, Julian and Pons-Moll, Gerard},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2025}
}
```

## Acknowledgments

This project benefited from the following resources: