Jose Luis Ponton¹, Eduardo Alvarado², Lin Geng Foo², Nuria Pelechano¹, Carlos Andujar¹, Marc Habermann²

¹ Universitat Politècnica de Catalunya (UPC)
² Max Planck Institute for Informatics
Project page | Paper (arXiv) | Video | Data
Step2Motion is a system for reconstructing full-body locomotion from multi-modal insole sensors (pressure + IMU). It enables robust motion capture in unconstrained, real-world environments, overcoming the limitations of traditional mocap suits or optical systems.
## Requirements

- Python: 3.9+
- PyTorch: follow the official installation guide for your system.
## Installation

- Clone this repository.
- Create and activate a virtual environment:
  ```shell
  python -m venv env

  # On Windows:
  .\env\Scripts\activate

  # On macOS/Linux:
  source env/bin/activate
  ```
- Install dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Install PyTorch according to your compute platform.
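After installing the dependencies, a quick way to confirm that the active virtual environment meets the Python 3.9+ requirement (a small convenience snippet, not part of the repository):

```python
import sys

# The project requires Python 3.9 or newer; fail fast if the venv is older.
assert sys.version_info >= (3, 9), f"Python 3.9+ required, found {sys.version.split()[0]}"
print("Python version OK:", sys.version.split()[0])
```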
## Data

### UnderPressure

Unzip the provided preprocessed UnderPressure data into the `data/UnderPressure` directory.

Outcome: you should have a `data/UnderPressure/underpressure_test.pt` file.
To process raw BVH files manually (downloaded from the official repository):
```shell
python src/process_underpressure.py underpressure data/UnderPressure/
```

### Step2Motion

- Download the Step2Motion dataset.
- Unzip and place it in `data/step2motion/`. The structure should be:

  ```
  data/
  └── step2motion/
      ├── 00/
      │   ├── clip.bvh
      │   ├── clip.json
      │   └── clip.txt
      ├── ...
  ```

- Run the processing script:
  ```shell
  python src/process_step2motion.py step2motion data/step2motion/ --seed 14
  ```
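If the processing script complains about missing inputs, a quick structural check can help locate the problem. This is a sketch, not part of the repository; the expected per-clip file names follow the layout shown above:

```python
from pathlib import Path

# Files each clip folder is expected to contain (per the layout above).
EXPECTED = ("clip.bvh", "clip.json", "clip.txt")

def missing_files(root):
    """Return {clip_dir_name: [missing file names]} for every clip folder under root."""
    problems = {}
    for clip_dir in sorted(p for p in Path(root).iterdir() if p.is_dir()):
        missing = [f for f in EXPECTED if not (clip_dir / f).exists()]
        if missing:
            problems[clip_dir.name] = missing
    return problems

if __name__ == "__main__":
    # Demo with a throwaway directory; in practice point this at data/step2motion/.
    import tempfile
    with tempfile.TemporaryDirectory() as tmp:
        clip = Path(tmp) / "00"
        clip.mkdir()
        (clip / "clip.bvh").touch()
        print(missing_files(tmp))  # {'00': ['clip.json', 'clip.txt']}
```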
### Dancing

The dance dataset is provided in the `data/dancing/` directory and is already preprocessed.
## Testing

Generate a prediction for a specific clip:
```shell
python src/test.py models/UnderPressure/ skeletons/UPSkeleton_S4_AMASS.bvh --dataset data/UnderPressure/underpressure_test.pt --clip 0
```

Output: `models/UnderPressure/predictions/underpressure_test_c0_pred.bvh`. This file can be viewed in Blender or in the Unity Visualizer.
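For a quick scripted look at a predicted clip without opening Blender, the joint hierarchy can be read straight from the BVH header. A minimal sketch using only standard BVH syntax (the file path shown in the comment is the output produced above):

```python
import re

def bvh_joint_names(text):
    """Extract joint names from a BVH HIERARCHY section (ROOT and JOINT lines)."""
    return re.findall(r"^\s*(?:ROOT|JOINT)\s+(\S+)", text, flags=re.MULTILINE)

if __name__ == "__main__":
    # Tiny inline BVH header for illustration; in practice read the file, e.g.:
    # text = open("models/UnderPressure/predictions/underpressure_test_c0_pred.bvh").read()
    sample = """HIERARCHY
ROOT Hips
{
    JOINT Spine
    {
        JOINT Head
        {
        }
    }
}
"""
    print(bvh_joint_names(sample))  # ['Hips', 'Spine', 'Head']
```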
Generate predictions for an entire dataset:
```shell
# UnderPressure
python src/test_model.py models/UnderPressure/ data/UnderPressure/underpressure_test.pt skeletons/UPSkeleton_S4_AMASS.bvh --only_test

# Dancing
python src/test_model.py models/dancing/ data/dancing/dance_test.pt skeletons/UPSkeleton_S1_AMASS.bvh --only_test

# Step2Motion
python src/test_model.py models/step2motion/ data/step2motion/step2motion_test.pt skeletons/step2motion.bvh --only_test
```

### Metrics

Compute the metrics and generate the distribution plots reported in the paper. Choose the target dataset with the argument: `up` (UnderPressure), `dance`, or `step2motion`.
```shell
python src/visualize_metrics.py [up|dance|step2motion]
```

## Unity Visualizer

- Setup: Install Unity Hub and the Unity Editor (tested on version 2022.3).
- Open Project: Open `Unity/InsoleVisualization/`.
- Load Scene: Open `Assets/Scenes/Visualizer.unity`.
- Configuration:
  - Select the `GlobalManager_UP` GameObject.
  - Locate the referenced `UnderPressure` ScriptableObject.
  - Update the "ModelsPath" field to the absolute path of your local models directory (e.g., `C:/Users/user/Desktop/Step2Motion/models/`).
- Playback:
  - Press Play in the Editor.
  - To change clips, modify the prediction name (e.g., `underpressure_test_c0`) in the `GlobalManager_UP` script.
- Controls:
  - `SPACE`: Play/Pause
  - `R`: Restart motion
  - `G`: Focus on Ground Truth
  - `P`: Focus on Prediction
  - `S`: Scene View camera
  - `I`: Toggle insole visualization
  - `Scroll`: Zoom
  - `←` / `→`: Previous/Next frame
  - `↑` / `↓`: Increase/Decrease playback speed
## Training

To train a new model from scratch:
```shell
python src/train.py --config configs/config_underpressure.json
```

## Citation

If you use this project in your research, please cite:
```bibtex
@article{2026:ponton:step2motion,
    author    = {Ponton, Jose Luis and Alvarado, Eduardo and Foo, Lin Geng and Pelechano, Nuria and Andujar, Carlos and Habermann, Marc},
    title     = {Step2Motion: Locomotion Reconstruction from Pressure Sensing Insoles},
    journal   = {Computer Graphics Forum},
    booktitle = {Eurographics 2026},
    volume    = {45},
    number    = {2},
    year      = {2026},
    doi       = {10.1111/cgf.70405},
}
```

## License

This code is released under the MIT License.