An aggregation of human motion understanding research.
Updated Feb 25, 2025
KMM: Key Frame Mask Mamba for Extended Motion Generation
This is an open collection of state-of-the-art (SOTA) and novel Text-to-X (where X can be anything) methods: papers, code, and datasets.
The official implementation of MotionLab
PyTorch implementation of MoLA
NVIDIA-accelerated packages for arm motion planning and control
Arm manipulation workflows
🌴[CVPR 2024] OakInk2: A Dataset of Bimanual Hands-Object Manipulation in Complex Task Completion
(ECCV 2024) SignAvatars: A Large-scale 3D Sign Language Holistic Motion Dataset and Benchmark
Official PyTorch implementation of the paper "SINC: Spatial Composition of 3D Human Motions for Simultaneous Action Generation" [ICCV 2023]
InfiniMotion: Mamba Boosts Memory in Transformer for Arbitrary Long Motion Generation
Official repo for paper "[AAAI'25] MotionCraft: Crafting Whole-Body Motion with Plug-and-Play Multimodal Controls"
[ECCV 2024] Official PyTorch implementation of the paper "ParCo: Part-Coordinating Text-to-Motion Synthesis": http://arxiv.org/abs/2403.18512
🔥 [ECCV 2024] Motion Mamba: Efficient and Long Sequence Motion Generation
[CVPR 2024] OakInk2 baseline model: Task-aware Motion Fulfillment (TaMF) via Diffusion
[CVPR 2022] OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction
PyTorch implementation of Stable Vector Fields on Lie Groups through Diffeomorphism
Official repository for "BAMM: Bidirectional Autoregressive Motion Model (ECCV 2024)"
[BMVC 2024] Motion Avatar: Generate Human and Animal Avatars with Arbitrary Motion
A memory-efficient, realistic, real-time controllable locomotion generator.