🤖 Control your Unitree Go2 robot with hand gestures!
This project enables gesture-based control of the Unitree Go2 quadruped robot using MediaPipe hand tracking and the Unitree SDK2 Python API.
- 🤚 Hand Gesture Recognition - Detect wrist flips and hand poses
- 🐕 Robot Control - Execute commands without remote controller
- 📹 Real-time Video Stream - Process robot's front camera feed
- 🎮 Multiple Gestures:
  - Wrist flip → Stand / Sit
  - ✌️ Gesture 1 (index + middle finger up) → Hello
  - 🤟 Gesture 2 (love-you sign) → Heart
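The finger-pose gestures above can be classified directly from MediaPipe's 21 hand landmarks by comparing fingertip and joint heights. The landmark indices below (wrist = 0, fingertips = 8/12/16/20, PIP joints = 6/10/14/18) are MediaPipe's standard hand-landmark layout, but the exact gesture-to-finger mapping is a simplified assumption, not the project's actual logic:

```python
# Sketch: classify gestures from MediaPipe hand landmarks, given as (x, y)
# pairs in normalized image coordinates (y grows downward).
TIP_IDS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
PIP_IDS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def fingers_up(landmarks):
    """Return the set of non-thumb fingers whose tip is above its PIP joint."""
    return {
        name
        for name, tip in TIP_IDS.items()
        if landmarks[tip][1] < landmarks[PIP_IDS[name]][1]  # smaller y = higher
    }

def classify(landmarks):
    """Map raised fingers to a gesture name (simplified, assumed mapping)."""
    up = fingers_up(landmarks)
    if up == {"index", "middle"}:
        return "hello"   # Gesture 1: index + middle up
    if up == {"index", "pinky"}:
        return "heart"   # Gesture 2: assumed finger pattern for the love-you sign
    return None
```

In the real scripts the landmarks would come from `mediapipe.solutions.hands` per frame; here they are plain lists so the logic can be checked in isolation.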
```bash
# Python 3.8+
pip install -r requirements.txt
```

See `requirements.txt` for the full list:

- `mediapipe` - Google's hand-tracking solution
- `opencv-python` - computer vision
- `numpy` - numerical computing
- `unitree-sdk2-python` - Unitree robot SDK
- Clone the repository

  ```bash
  git clone https://github.com/lez666/Unitree-Go2-GestureDetection.git
  cd Unitree-Go2-GestureDetection
  ```

- Install dependencies

  ```bash
  pip install -r requirements.txt
  ```

- Install the Unitree SDK

  ```bash
  # Follow the official guide: https://github.com/unitreerobotics/unitree_sdk2_python
  ```
Run on a laptop with a display:

```bash
python3 hand_dog_cv.py <network_interface>
```

Example:

```bash
python3 hand_dog_cv.py enp7s0
```

Or SSH into the robot's Jetson extension board and run headless:

```bash
python3 hand_dog_nonscreen.py <network_interface>
```

Example:

```bash
python3 hand_dog_nonscreen.py eth0
```

Find your network interface:

```bash
# Linux
ip addr

# macOS
ifconfig
```

Common interfaces:

- `enp7s0` - Ethernet (Linux)
- `eth0` - Ethernet (Jetson)
- `wlan0` - WiFi (Linux)
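If you prefer to find the interface name from Python rather than `ip addr`/`ifconfig`, the standard library can enumerate interfaces on Unix-like systems (a convenience sketch, not part of this project's scripts):

```python
# List available network interface names via the standard library.
# socket.if_nameindex() is available on Unix (Python 3.3+).
import socket

def list_interfaces():
    """Return interface names, e.g. ['lo', 'enp7s0', 'wlan0']."""
    return [name for _, name in socket.if_nameindex()]

if __name__ == "__main__":
    print(list_interfaces())
```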
- Video Capture - get frames from the Go2's front camera
- Hand Detection - MediaPipe processes the video stream
- Gesture Recognition - analyze the hand landmarks:
  - wrist position determines the flip direction
  - finger positions identify specific gestures
- Robot Control - send commands via the Unitree SDK
Robot not responding?
- Check network connectivity
- Verify SDK initialization
- Ensure robot is in sport mode
Gesture not detected?
- Ensure good lighting
- Keep hand within camera frame
- Make distinct gestures
Contributions welcome! Please open an issue or submit a PR.
MIT License
Enze Li
- GitHub: @lez666
- Email: [email protected]
⭐ Star us if this helps!