Robot helper functions and CLI tools for Unitree Go2. This package provides a high-level helper class (Go2RobotHelper) for simplified robot control, a command-line tool for executing robot actions, and a collection of examples and applications.
- Go2RobotHelper: High-level helper class for simplified robot control with automatic connection management, state monitoring, and emergency cleanup
- CLI Tool: `go2action` command-line tool for executing robot actions
- Examples: Comprehensive examples for audio, video, data channels, and LIDAR
- Apps: Ready-to-use applications for gamepad control, gesture recognition, and visualization
```
pip install go2tools
```

Or install from source:
```
git clone <repository-url>
cd go2tools
pip install -e .
```

```python
import asyncio

from go2tools.robot_helper import Go2RobotHelper

async def main():
    async with Go2RobotHelper() as robot:
        await robot.ensure_mode("mcf")
        await robot.execute_command("Hello")
        await robot.handstand_sequence()

asyncio.run(main())
```

```
# List available actions
go2action --list

# Execute an action
go2action StandUp

# Execute with monitoring
go2action Hello --monitor
```

The Go2RobotHelper class provides:
- Automatic connection management with context manager support
- Built-in state monitoring and display
- Emergency cleanup and safe robot positioning
- Simplified command execution with error handling
- Obstacle detection control and status querying
- Mode switching (MCF/Sport)
See go2tools/robot_helper.py for full documentation.
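The async context-manager pattern is what makes the emergency cleanup reliable: the exit hook runs even when a command raises. A minimal, self-contained sketch of the idea (the `RobotSession` class below is a hypothetical stand-in, not the actual Go2RobotHelper internals):

```python
import asyncio

class RobotSession:
    """Hypothetical stand-in for Go2RobotHelper's context management."""

    def __init__(self):
        self.cleaned_up = False

    async def __aenter__(self):
        # The real helper would open the WebRTC connection here.
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Runs on normal exit AND on exceptions: the safe place for
        # emergency positioning (e.g. StandDown) before disconnecting.
        self.cleaned_up = True
        return False  # do not swallow the exception

async def demo():
    session = RobotSession()
    try:
        async with session:
            raise RuntimeError("command failed mid-sequence")
    except RuntimeError:
        pass
    return session.cleaned_up

print(asyncio.run(demo()))  # True: cleanup ran despite the exception
```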
The go2action command-line tool allows you to execute robot actions directly from the terminal:
```
go2action [action] [options]
```

Options:

- `--list`: List all supported actions
- `--wait <seconds>`: Wait time after action (default: 3.0)
- `--monitor`: Enable state monitoring output
Supported Actions:
- StandUp, StandDown, RecoveryStand
- Sit, RiseSit
- Hello, Stretch, Content, Scrape, Heart
- Dance1, Dance2
- FrontFlip, LeftFlip, BackFlip
- FrontJump, FrontPounce, Handstand
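These actions can also be scripted from Python by shelling out to the CLI. A small sketch that only assembles the argv list, using the flags documented above (assumes `go2action` is on PATH; the helper name is ours):

```python
import shlex

def go2action_cmd(action, wait=3.0, monitor=False):
    # Assemble a go2action invocation from the documented flags.
    cmd = ["go2action", action, "--wait", str(wait)]
    if monitor:
        cmd.append("--monitor")
    return cmd

# With a robot attached you would run, e.g.:
#   subprocess.run(go2action_cmd("Hello", monitor=True), check=True)
print(shlex.join(go2action_cmd("Hello", monitor=True)))
```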
- `examples/audio/internet_radio/stream_radio.py` - Stream internet radio to the robot
- `examples/audio/live_audio/live_recv_audio.py` - Play robot audio live through host speakers
- `examples/audio/save_audio/save_audio_to_file.py` - Record robot audio to a WAV file
- `examples/video/camera_stream/display_video_channel.py` - Display live video frames with OpenCV
- `examples/data_channel/move_test.py` - Minimal Move command tester
- `examples/data_channel/lowstate/lowstate.py` - Comprehensive low-level state monitoring
- `examples/data_channel/sportmodestate/sportmodestate.py` - Monitor sport mode state values
- `examples/data_channel/vui/vui.py` - Control LED brightness, color, and flashing via VUI APIs
- `examples/data_channel/lidar/lidar_stream.py` - Basic LIDAR subscription and data printing
- `examples/data_channel/lidar/lidar_performance_test.py` - Measure LIDAR decoding performance
- `examples/data_channel/lidar/plot_lidar_stream.py` - Web-based LIDAR visualization via Flask/Socket.IO/Three.js
- `examples/data_channel/lidar/record_lidar_pcd.py` - Record LIDAR point clouds to PCD files
- `examples/data_channel/lidar/view_pcd.py` - View recorded PCD files
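The PCD files used by the recording and viewing scripts follow a simple ASCII layout. A minimal writer sketch for (x, y, z) clouds (the actual example may well use a point-cloud library instead):

```python
def write_pcd(path, points):
    # Minimal ASCII PCD v0.7 writer for (x, y, z) float points.
    header = (
        "VERSION 0.7\n"
        "FIELDS x y z\n"
        "SIZE 4 4 4\n"
        "TYPE F F F\n"
        "COUNT 1 1 1\n"
        f"WIDTH {len(points)}\n"
        "HEIGHT 1\n"
        "VIEWPOINT 0 0 0 1 0 0 0\n"
        f"POINTS {len(points)}\n"
        "DATA ascii\n"
    )
    with open(path, "w") as f:
        f.write(header)
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")
```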
- `examples/data_channel/robot_odometry/robot_odometry.py` - Display robot odometry: pose and velocities
- `examples/data_channel/robot_odometry/analyze_timestamp_drift.py` - Analyze timestamp drift
- `examples/data_channel/robot_odometry/timestamp_drift.py` - Timestamp drift analysis tool
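Timestamp drift here means how the robot-clock vs host-clock offset changes over a session. One common way to compute it, shown as a sketch (our own formulation, not necessarily what the scripts do):

```python
def timestamp_drift(robot_ts, host_ts):
    """Offset of each sample relative to the first sample's offset.

    robot_ts / host_ts: parallel lists of timestamps in seconds.
    A steadily growing result indicates the two clocks are drifting apart.
    """
    offsets = [h - r for r, h in zip(robot_ts, host_ts)]
    return [o - offsets[0] for o in offsets]
```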
Control the Go2 robot using a USB/Bluetooth gamepad:
- `apps/gamepad/gamepad_controller.py` - Main gamepad control application
- `apps/gamepad/gamepad_visualizer.py` - Visualize and discover gamepad button/axis mappings
- `apps/gamepad/gamepad_config.py` - Gamepad configuration schema
- `apps/gamepad/gamepad_mapping.yaml` - Gamepad button/axis mapping configuration
Features:
- Joystick axes for continuous motion (forward/back, sidestep, yaw)
- Buttons and D-pad for discrete actions
- Configurable via YAML mapping file
- Optional obstacle-avoidance toggle
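At its core, continuous gamepad control maps stick deflection to velocity commands through a deadzone and a scale factor. A sketch of that mapping (axis signs, ranges, and speed limits here are assumptions; the real mapping is configured in `gamepad_mapping.yaml`):

```python
def axes_to_velocity(lx, ly, rx, deadzone=0.1, max_speed=0.6, max_yaw=1.0):
    """Map raw stick axes in [-1, 1] to (vx, vy, yaw) commands.

    lx/ly: left stick (sidestep / forward-back), rx: right stick (yaw).
    """
    def dz(v):
        # Zero out small readings so stick noise doesn't creep the robot.
        return 0.0 if abs(v) < deadzone else v

    # Pushing a stick up usually reads negative, hence the sign flips.
    return (-dz(ly) * max_speed, -dz(lx) * max_speed, -dz(rx) * max_yaw)
```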
Control the Go2 using hand gestures captured from your webcam:
- `apps/gesture/hand_gestures.py` - Hand gesture recognition and robot control
Features:
- MediaPipe-based hand tracking
- Motion-to-action mapping:
- Push hand forward → move back
- Pull hand backward → move forward
- Push hand down → StandDown
- Push hand up → StandUp
- Swipe left/right → side step
- Simulation mode (no robot)
- Two-hand requirement option to reduce false triggers
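The motion-to-action table above amounts to picking the dominant axis of hand displacement between frames. A simplified classifier sketch (the axis conventions and threshold are assumptions; the real app works on MediaPipe hand landmarks):

```python
def classify_motion(dx, dy, dz, threshold=0.15):
    # Map a hand displacement (dx right, dy down, dz toward camera --
    # assumed conventions) to an action; the dominant axis wins.
    if abs(dz) >= max(abs(dx), abs(dy), threshold):
        return "move_back" if dz > 0 else "move_forward"
    if abs(dy) >= max(abs(dx), threshold):
        return "StandDown" if dy > 0 else "StandUp"
    if abs(dx) >= threshold:
        return "sidestep_right" if dx > 0 else "sidestep_left"
    return None  # below threshold on every axis: no action
```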
Combined video and LIDAR visualization using Rerun:
- `apps/rerun/rerun_video_lidar_stream.py` - Real-time video and LIDAR point cloud visualization
Features:
- Real-time video stream from robot camera
- Real-time 3D LIDAR point cloud visualization
- CSV read/write support for offline analysis
- Y-value filtering for LIDAR points
- Modular design (can disable video or LIDAR independently)
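Y-value filtering is a simple band-pass on point coordinates, useful for slicing a LIDAR cloud to a height range before rendering. A sketch of the idea (function name and default band are ours):

```python
def filter_points_by_y(points, y_min=-1.0, y_max=1.0):
    # Keep only the (x, y, z) points whose y lies inside the band.
    return [(x, y, z) for x, y, z in points if y_min <= y <= y_max]
```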
- Python 3.8+
- `unitree_webrtc_connect` (automatically installed as a dependency)
- Go2 robot with WebRTC enabled
For examples and apps, install optional dependencies:
```
# For examples with visualization
pip install ".[examples]"

# For apps (gamepad, gesture, rerun)
pip install ".[apps]"

# For development
pip install ".[dev]"
```

Set the robot IP address as an environment variable:
```
export ROBOT_IP="192.168.8.181"
```

Then run any example or app:
```
python examples/data_channel/move_test.py
python apps/gamepad/gamepad_controller.py
```

MIT
Based on the work from legion1581/unitree_webrtc_connect.