Real-time hand tracking and teleoperation system for RUKA robotic hands using MediaPipe and Aria2Manus conversion.
RUKA (Rethinking the Design of Humanoid Hands with Learning) is a dexterous robotic hand system that enables intuitive human-to-robot hand pose transfer through computer vision-based tracking.
```bash
# Clone the repository
git clone <repository-url>
cd RUKA-Mediapipe

# Install dependencies
pip install -r requirements.txt

# Install the package in development mode
pip install -e .

# Download RUKA training data and checkpoints from OSF
bash download_data.sh
```
Note: You'll need OSF credentials to download the training data. Contact the maintainers for access.
Motor Calibration:
```bash
# Reset motors to initial state
python scripts/reset_motors.py

# Calibrate motor limits (interactive process)
python calibrate_motors.py
```
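As a sanity check between resetting and calibrating, you can read each servo's present position directly over the bus. The sketch below uses the Dynamixel SDK; the port, baud rate, motor ID range, and control-table address are assumptions for XL330 servos and may differ from what `reset_motors.py` and `calibrate_motors.py` actually use.

```python
# Minimal sketch: read present positions from XL330 servos via the Dynamixel SDK.
# Port, baud rate, motor IDs, and control-table address are assumptions; adjust
# them to your setup (the repo's scripts may use different values).
from dynamixel_sdk import PortHandler, PacketHandler, COMM_SUCCESS

PORT = "/dev/tty.usbserial-XXXXXXXX"   # from port_check.py
BAUDRATE = 57600                       # assumed XL330 default
ADDR_PRESENT_POSITION = 132            # XL330 (Protocol 2.0) control table
MOTOR_IDS = range(1, 12)               # hypothetical ID range

port = PortHandler(PORT)
packet = PacketHandler(2.0)
if not port.openPort() or not port.setBaudRate(BAUDRATE):
    raise RuntimeError(f"Could not open {PORT}")

for dxl_id in MOTOR_IDS:
    pos, result, error = packet.read4ByteTxRx(port, dxl_id, ADDR_PRESENT_POSITION)
    if result == COMM_SUCCESS and error == 0:
        print(f"motor {dxl_id}: position {pos}")
    else:
        print(f"motor {dxl_id}: no response")

port.closePort()
```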
USB Port Detection:
```bash
# Auto-detect USB serial ports for hand controllers
python port_check.py
```
Copy the output port assignments to `ruka_hand/utils/constants.py`:

```python
USB_PORTS = {
    "left": "/dev/tty.usbserial-XXXXXXXX",
    "right": "/dev/tty.usbserial-YYYYYYYY"
}
```
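`port_check.py` handles detection for you; for reference, USB serial devices can also be listed directly with pyserial (a minimal sketch, assuming pyserial is installed):

```python
# Minimal sketch: list candidate USB serial devices with pyserial.
from serial.tools import list_ports

for port in list_ports.comports():
    print(port.device, "-", port.description)
```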
Basic Teleoperation (Terminal):
```bash
python mediapipe_teleop.py --hand_type left
```
Web Interface (Recommended):
```bash
python mediapipe_teleop.py --web --hand_type left
```
Then open your browser to http://localhost:5000 to see the real-time hand tracking interface.
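The web mode serves the tracking view in a browser; the port (5000) and the `templates/` folder suggest a Flask app, though that is an assumption. For orientation only, a minimal pattern for streaming webcam frames to a browser with Flask looks like this (not the repo's actual interface):

```python
# Illustrative MJPEG-streaming pattern with Flask; not the repo's web interface.
import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)

def frames():
    # Yield each webcam frame as a JPEG part of a multipart HTTP response.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n")

@app.route("/video")
def video():
    return Response(frames(), mimetype="multipart/x-mixed-replace; boundary=frame")

@app.route("/")
def index():
    return '<img src="/video">'

if __name__ == "__main__":
    app.run(port=5000)
```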
```bash
# Control left hand with web interface
python mediapipe_teleop.py --web --hand_type left

# Control right hand in terminal mode
python mediapipe_teleop.py --hand_type right

# Collect hand motion data for training
python collect_data.py --hand_type left --duration 300

# Test trained controllers
python examples/test_controllers.py
```
```
RUKA-Mediapipe/
├── aria2manus/          # Hand pose retargeting system
├── ruka_hand/           # Core hand control system
│   ├── control/         # Motor control and hand operation
│   ├── learning/        # Neural network training pipeline
│   ├── teleoperation/   # Teleoperation interfaces
│   └── utils/           # Utilities and constants
├── model/               # MediaPipe hand tracking models
├── motor_limits/        # Calibrated motor limit files
├── configs/             # Training configurations
├── examples/            # Usage examples
└── templates/           # Web interface templates
```
Motor limits and USB ports are configured in `ruka_hand/utils/constants.py`:

- `USB_PORTS`: Serial port assignments for the left/right hands
- `MOTOR_RANGES_LEFT` / `MOTOR_RANGES_RIGHT`: Calibrated motor position ranges
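The exact contents depend on your calibration run; as a purely hypothetical illustration of the motor-range format (names and values are assumptions, not shipped defaults):

```python
# Hypothetical illustration only -- the real MOTOR_RANGES_LEFT/RIGHT entries are
# produced by calibrate_motors.py and may use a different structure.
MOTOR_RANGES_LEFT = {
    1: (1024, 3072),   # motor ID -> (min, max) encoder ticks
    2: (980, 2900),
    # ...one entry per motor
}
```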
MediaPipe tracking parameters can be adjusted in `mediapipe_teleop.py`:

- Detection confidence thresholds
- Tracking confidence levels
- Hand landmark processing settings
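These correspond to the options of MediaPipe's `Hands` solution. A minimal, standalone sketch of webcam hand tracking with those parameters (not the exact pipeline in `mediapipe_teleop.py`):

```python
# Minimal sketch: track one hand from a webcam with MediaPipe Hands.
# The confidence values mirror the parameters listed above; mediapipe_teleop.py
# may process the landmarks differently before retargeting them to the robot hand.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

hands = mp_hands.Hands(
    max_num_hands=1,
    min_detection_confidence=0.7,   # detection confidence threshold
    min_tracking_confidence=0.5,    # tracking confidence level
)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            mp_draw.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
hands.close()
```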
- Servos: Dynamixel XL330-M288-T actuators
- Controllers: USB2Dynamixel or similar Dynamixel interfaces
- Hands: RUKA dexterous robotic hands (left/right configurations)
The system supports data collection for training improved control policies:

- Collect Data: Use `collect_data.py` to record human demonstrations
- Process Data: Preprocess collected data using the tools in `ruka_hand/learning/`
- Train Controllers: Use `train_controller.py` with configuration files in `configs/`
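The controller architecture and training setup live in `ruka_hand/learning/` and `configs/`. Purely for intuition, a learned controller can be as small as an MLP that regresses motor targets from the 21 MediaPipe hand landmarks; the sketch below uses hypothetical input/output sizes and is not the model this repo trains.

```python
# Illustrative only: a tiny MLP mapping 21 hand landmarks (x, y, z each) to
# motor position targets. Sizes and architecture are assumptions, not the
# model defined in ruka_hand/learning/.
import torch
import torch.nn as nn

NUM_LANDMARKS = 21   # MediaPipe hand landmarks
NUM_MOTORS = 11      # hypothetical motor count per hand

class LandmarkToMotorMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_LANDMARKS * 3, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, NUM_MOTORS),
        )

    def forward(self, landmarks: torch.Tensor) -> torch.Tensor:
        # landmarks: (batch, 21, 3) -> flattened -> (batch, NUM_MOTORS)
        return self.net(landmarks.flatten(start_dim=1))

model = LandmarkToMotorMLP()
dummy = torch.randn(8, NUM_LANDMARKS, 3)
print(model(dummy).shape)   # torch.Size([8, 11])
```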
Common Issues:
- Port Detection: Run `python port_check.py` and update `constants.py` with the correct USB ports
- Motor Calibration: Ensure motors are properly calibrated using `calibrate_motors.py`
- Camera Access: Grant camera permissions for MediaPipe tracking
- Dependencies: Install all requirements with `pip install -r requirements.txt`
Motor Issues:
- Check USB connections and power supply
- Verify motor IDs match the configuration in `constants.py`
- Run the motor reset script if servos are unresponsive
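If you suspect an ID mismatch, a quick bus scan with the Dynamixel SDK's `ping` shows which motors respond. Port, baud rate, and ID range below are assumptions; match them to your `constants.py`.

```python
# Minimal sketch: ping a range of Dynamixel IDs to see which motors respond.
# Port, baud rate, and ID range are assumptions; adjust to your setup.
from dynamixel_sdk import PortHandler, PacketHandler, COMM_SUCCESS

PORT = "/dev/tty.usbserial-XXXXXXXX"
BAUDRATE = 57600

port = PortHandler(PORT)
packet = PacketHandler(2.0)
if not port.openPort() or not port.setBaudRate(BAUDRATE):
    raise RuntimeError(f"Could not open {PORT}")

for dxl_id in range(1, 21):
    model_number, result, error = packet.ping(port, dxl_id)
    if result == COMM_SUCCESS:
        print(f"ID {dxl_id}: responding (model {model_number})")

port.closePort()
```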