By Yanjie Ze, Siheng Zhao, Weizhuo Wang, Angjoo Kanazawa†, Rocky Duan†, Pieter Abbeel†, Guanya Shi†, Jiajun Wu†, C. Karen Liu†, 2025 († Equal Advising)
- 2025-12-02. The first successful reproduction of TWIST2 has appeared. Check out their bilibili video.
- 2025-12-02. I will release a video tutorial next week showing how to use TWIST2. Please stay tuned.
- 2025-12-02. TWIST2 is open-sourced now. Give it a star on GitHub!
- Disclaimer 1: with the current repo, you should be able to control a Unitree G1 via a cable connection, in both sim and real, with a PICO VR headset.
- Disclaimer 2: I am still working on better documentation and on cleaning up some onboard streaming and inference code (the teleop pipeline is complex and requires some hardware setup). Please stay tuned.
- Disclaimer 3: the high-level policy learning part will be released in a separate repo. It is modified from iDP3, and I am working on releasing it soon.
- 2025-11-05. TWIST2 is released. Full code will be released within 1 month (mostly ready and going through our internal release process). Please stay tuned.
We use two conda environments for TWIST2. One is called twist2 and is used for controller training, controller deployment, and teleop data collection. The other is called gmr and is used for online motion retargeting. This is because isaacgym requires Python 3.8, while the newest mujoco requires Python 3.10.
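Once both environments are set up (steps below), a quick sanity check is to confirm each one's Python version:
# twist2 should report Python 3.8.x (for isaacgym), gmr should report 3.10.x (for mujoco)
conda run -n twist2 python --version
conda run -n gmr python --version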
1. Create conda environment:
conda env remove -n twist2
conda create -n twist2 python=3.8
conda activate twist2
2. Install isaacgym. Download it from the official link and then install it:
cd isaacgym/python && pip install -e .
3. Install packages:
cd rsl_rl && pip install -e . && cd ..
cd legged_gym && pip install -e . && cd ..
cd pose && pip install -e . && cd ..
pip install "numpy==1.23.0" pydelatin wandb tqdm opencv-python ipdb pyfqmr flask dill gdown hydra-core imageio[ffmpeg] mujoco mujoco-python-viewer isaacgym-stubs pytorch-kinematics rich termcolor zmq
pip install redis[hiredis] # for redis communication
pip install pyttsx3 # for voice control
pip install onnx onnxruntime-gpu # for onnx model inference
pip install customtkinter # for gui
If this is your first time using redis, install and start the redis server:
# sudo apt install redis-server
# redis-server --daemonize yes
sudo apt update
sudo apt install -y redis-server
sudo systemctl enable redis-server
sudo systemctl start redis-server
Edit /etc/redis/redis.conf:
sudo nano /etc/redis/redis.conf
Modify the following lines:
bind 0.0.0.0
protected-mode no
Then restart redis-server:
sudo systemctl restart redis-server
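To confirm the server is up and reachable, you can ping it with redis-cli (an optional check, not part of the original setup):
# PONG means the server is running; the -h form tests the remote access enabled above
redis-cli ping
# redis-cli -h <other-machine-ip> ping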
4. If you want to do sim2real with a laptop, you also need to install my modified version of the unitree sdk here. (If you want to do sim2real with the onboard robot computer, there is no need to install the unitree sdk on your laptop.)
# Clone the Unitree SDK2 repository
cd ..
git clone https://github.com/YanjieZe/unitree_sdk2.git
cd unitree_sdk2
# Install system dependencies
sudo apt-get update
sudo apt-get install build-essential cmake python3-dev python3-pip pybind11-dev
# Install Python dependencies
pip install pybind11 pybind11-stubgen numpy
# Build Python SDK binding
cd python_binding
export UNITREE_SDK2_PATH=$(pwd)/..
bash build.sh --sdk-path $UNITREE_SDK2_PATH
# Install the compiled module to your conda environment
# Get the site-packages path for your current conda environment
SITE_PACKAGES=$(python -c "import site; print(site.getsitepackages()[0])")
echo "Installing to: $SITE_PACKAGES"
# Copy the compiled module (rename to remove version-specific suffix)
sudo cp build/lib/unitree_interface.cpython-*-linux-gnu.so $SITE_PACKAGES/unitree_interface.so
# Verify installation
python -c "import unitree_interface; print('✓ Unitree SDK Python binding installed successfully')"
python -c "import unitree_interface; print('Available robot types:', list(unitree_interface.RobotType.__members__.keys()))"
cd ../..
5. [If you want to train your own controller, download the data; otherwise, skip this step] Download the TWIST2 dataset from my Google Drive. [Small note: if you use this dataset in your project, please also cite this work properly.] Unzip it anywhere you like, and set root_path in legged_gym/motion_data_configs/twist2_dataset.yaml to the unzipped folder.
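For example (a sketch; the path is hypothetical, and this assumes root_path is a top-level key in that yaml):
# Point root_path at your unzipped dataset folder
sed -i 's|^root_path:.*|root_path: /path/to/twist2_dataset|' legged_gym/motion_data_configs/twist2_dataset.yaml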
Note: we also provide a small set of example motions in assets/example_motions. You can use them to test the system. They were recorded by me, so there are no license issues.
Note: we also provide our controller checkpoint assets/ckpts/twist2_1017_20k.onnx so you can test the system directly.
6. Install GMR for online retargeting and teleop. We use a separate conda environment for GMR/online retargeting because it requires Python 3.10+.
conda create -n gmr python=3.10 -y
conda activate gmr
git clone https://github.com/YanjieZe/GMR.git
cd GMR
# install GMR
pip install -e .
cd ..
conda install -c conda-forge libstdcxx-ng -y
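To sanity-check the gmr environment (an optional check; mujoco is the dependency that motivated the separate Python 3.10 env):
conda activate gmr
python -c "import mujoco; print('MuJoCo', mujoco.__version__)"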
7. Install the PICO SDK:
- On your PICO, install PICO SDK: see here.
- On your own PC,
- Download the deb package for Ubuntu 22.04, or build from the repo source.
- To install, use the command:
sudo dpkg -i XRoboToolkit_PC_Service_1.0.0_ubuntu_22.04_amd64.deb
Then you should see xrobotoolkit-pc-service in your apps. Remember to start this app before you do teleoperation.
- Build the PICO PC Service SDK and Python SDK for PICO streaming:
conda activate gmr
git clone https://github.com/YanjieZe/XRoboToolkit-PC-Service-Pybind.git
cd XRoboToolkit-PC-Service-Pybind
mkdir -p tmp
cd tmp
git clone https://github.com/XR-Robotics/XRoboToolkit-PC-Service.git
cd XRoboToolkit-PC-Service/RoboticsService/PXREARobotSDK
bash build.sh
cd ../../../..
mkdir -p lib
mkdir -p include
cp tmp/XRoboToolkit-PC-Service/RoboticsService/PXREARobotSDK/PXREARobotSDK.h include/
cp -r tmp/XRoboToolkit-PC-Service/RoboticsService/PXREARobotSDK/nlohmann include/nlohmann/
cp tmp/XRoboToolkit-PC-Service/RoboticsService/PXREARobotSDK/build/libPXREARobotSDK.so lib/
# rm -rf tmp
# Build the project
conda install -c conda-forge pybind11
pip uninstall -y xrobotoolkit_sdk
python setup.py install
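After the build, you can verify that the binding imports (the module name xrobotoolkit_sdk comes from the pip uninstall line above):
python -c "import xrobotoolkit_sdk; print('xrobotoolkit_sdk imported OK')"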
8. Ready for training & deployment!
We provide a trained student checkpoint in assets/ckpts/twist2_1017_20k.onnx, which you can use directly for deployment. If you want to deploy our checkpoint directly, go to step 4 below.
We have also provided the full motion datasets so you can successfully train our teacher & student policies.
1. Training TWIST2 general motion tracker:
bash train.sh 1021_twist2 cuda:0
- arg 1: policy expid
- arg 2: cuda device id
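For example, to train with a different experiment tag on a second GPU (the expid is a free-form name you choose):
# arg 1 = experiment id (any tag), arg 2 = CUDA device
bash train.sh my_experiment cuda:1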
2. Export policy to onnx model:
bash to_onnx.sh $YOUR_POLICY_PATH
- arg 1: your policy path. You should pass the .pt file's path.
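For instance (the checkpoint path below is hypothetical; use the .pt file produced by your own training run):
# Hypothetical path to a trained checkpoint
bash to_onnx.sh logs/1021_twist2/model_20000.pt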
3. Sim2sim verification:
[If this is your first time running this script] you need to warm up the redis server by first running the high-level motion server:
bash run_motion_server.sh
You can also select a different motion file from our motion dataset by modifying motion_file in run_motion_server.sh.
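For example (a sketch; the filename is a placeholder, and the exact variable syntax depends on how run_motion_server.sh is written):
# Inside run_motion_server.sh, point motion_file at any downloaded motion,
# e.g. one of the motions shipped in assets/example_motions
motion_file=assets/example_motions/your_motion_file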
Then, you can run the low-level controller server in simulation.
bash sim2sim.sh
- This will start a simulation that runs the low-level controller only.
- This is because we separate the high-level control (i.e., teleop) from the low-level control (i.e., RL policy).
- You should now see the robot standing still. It stands still because the redis server sends a stand pose by default.
You can also see the policy execution FPS in the terminal. It should be around 50 Hz. If your laptop's GPU/CPU is not strong enough, the FPS may be lower, which hurts policy execution.
=== Policy Execution FPS Results (steps 1-1000) ===
Average Policy FPS: 38.88
Max Policy FPS: 41.55
Min Policy FPS: 24.72
Std Policy FPS: 1.92
Expected FPS (from decimation): 50.00
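The expected rate comes from dividing the simulation frequency by the control decimation; the numbers below are illustrative, not taken from the repo's config:
# e.g. a 200 Hz simulation stepped with decimation 4 yields a 50 Hz policy
sim_hz=200; decimation=4
echo "expected policy FPS: $((sim_hz / decimation))"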
And now you can control the robot via high-level motion streaming.
Note: you need to open another terminal to run the high-level motion streaming.
We have provided two choices for you:
- for offline motion streaming:
bash run_motion_server.sh
- for online PICO teleop:
bash teleop.sh
4. Sim2real verification. If you are not familiar with deployment on the physical robot, refer to unitree_g1.md or unitree_g1.zh.md for more details.
More specifically, the pipeline for sim2real deployment is:
- Start the robot and connect it to your laptop via an Ethernet cable.
- Configure the corresponding network interface on your laptop by setting the IP address to 192.168.123.222 and the netmask to 255.255.255.0 (see the sketch after this list).
- Now you should be able to ping the robot via ping 192.168.123.164.
- Then use Unitree G1's remote control to enter dev mode, i.e., press the L2+R2 key combination.
- Now you should see the robot joints in the damping state.
- Then you can run the low-level controller:
bash sim2real.sh
- Please set the network interface name in sim2real.sh to the one that connects to the robot.
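A minimal sketch of the laptop-side network setup (the interface name enp0s31f6 is an example; substitute your own):
# Static IP + /24 netmask (255.255.255.0) as described above, then check connectivity
sudo ip addr add 192.168.123.222/24 dev enp0s31f6
ping -c 3 192.168.123.164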
Similarly, run the low-level controller first and then control the robot via the high-level motion server, i.e.,
- offline motion streaming:
bash run_motion_server.sh
- or online PICO teleop:
bash teleop.sh
5. GUI interface for everything. Check gui.sh for more details.
bash gui.sh
You should be able to
- run the low-level controller in simulation
- run the low-level controller on physical robot
- run the high-level motion streaming in offline mode
- run the high-level motion streaming in online PICO teleop mode
- run the data collection script
- run the neck controller script
- run the ZED streaming script
all in this GUI. This GUI is also what I use for data collection and teleoperation.
If you find this work useful, please cite:
@article{ze2025twist2,
title={TWIST2: Scalable, Portable, and Holistic Humanoid Data Collection System},
author= {Yanjie Ze and Siheng Zhao and Weizhuo Wang and Angjoo Kanazawa and Rocky Duan and Pieter Abbeel and Guanya Shi and Jiajun Wu and C. Karen Liu},
year= {2025},
journal= {arXiv preprint arXiv:2511.02832}
}
And also consider citing the related works:
@article{ze2025twist,
title={TWIST: Teleoperated Whole-Body Imitation System},
author= {Yanjie Ze and Zixuan Chen and João Pedro Araújo and Zi-ang Cao and Xue Bin Peng and Jiajun Wu and C. Karen Liu},
year= {2025},
journal= {arXiv preprint arXiv:2505.02833}
}
@article{joao2025gmr,
title={Retargeting Matters: General Motion Retargeting for Humanoid Motion Tracking},
author= {João Pedro Araújo and Yanjie Ze and Pei Xu and Jiajun Wu and C. Karen Liu},
year= {2025},
journal= {arXiv preprint arXiv:2510.02252}
}
If you have any questions, please contact me at yanjieze@stanford.edu.
We use AMASS and OMOMO motion datasets for research purposes only. Our code is built upon the TWIST repository.