
RUKA: MediaPipe Hand Tracking Integration

Real-time hand tracking and teleoperation system for RUKA robotic hands using MediaPipe and Aria2Manus conversion.

RUKA (Rethinking the Design of Humanoid Hands with Learning) is a dexterous robotic hand system that enables intuitive human-to-robot hand pose transfer through computer vision-based tracking. This repository is the official implementation for the RUKA project; see the project website at https://ruka-hand.github.io for more details.

RUKA Demo

🚀 Quick Start

1. Installation

# Clone the repository
git clone https://github.com/NYU-robot-learning/RUKA-Mediapipe.git
cd RUKA-Mediapipe

# Install dependencies
pip install -r requirements.txt

# Install the package in development mode
pip install -e .

2. Download Pre-trained Models and Data

# Download RUKA training data and checkpoints from OSF
bash download_data.sh

Note: You'll need OSF credentials to download the training data. Contact the maintainers for access.

3. Hardware Setup

Motor Calibration:

# Reset motors to initial state
python scripts/reset_motors.py

# Calibrate motor limits (interactive process)
python calibrate_motors.py

USB Port Detection:

# Auto-detect USB serial ports for hand controllers
python port_check.py

Copy the output port assignments to ruka_hand/utils/constants.py:

USB_PORTS = {
    "left": "/dev/tty.usbserial-XXXXXXXX",
    "right": "/dev/tty.usbserial-YYYYYYYY"
}
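If you want a quick look at which serial devices your machine exposes before running the script, here is a minimal sketch using pyserial (assuming pyserial is installed; the "usb" substring filter is an assumption about how the adapter enumerates). port_check.py remains the supported way to get the assignments:

# Illustrative only: list candidate USB serial ports with pyserial.
# The "usb" filter is an assumption about the adapter's device name.
from serial.tools import list_ports

for port in list_ports.comports():
    # Each entry exposes the device path and a human-readable description.
    if "usb" in port.device.lower():
        print(f"{port.device}  ({port.description})")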

4. Run the System

Basic Teleoperation (Terminal):

python mediapipe_teleop.py --hand_type left

Web Interface (Recommended):

python mediapipe_teleop.py --web --hand_type left

Then open your browser to http://localhost:5000 to see the real-time hand tracking interface.

🎮 Usage Examples

Real-time Hand Control

# Control left hand with web interface
python mediapipe_teleop.py --web --hand_type left

# Control right hand in terminal mode
python mediapipe_teleop.py --hand_type right

Data Collection

# Collect hand motion data for training
python collect_data.py --hand_type left --duration 300

Controller Testing

# Test trained controllers
python examples/test_controllers.py

📁 Project Structure

RUKA-Mediapipe/
├── aria2manus/              # Hand pose retargeting system
├── ruka_hand/               # Core hand control system
│   ├── control/             # Motor control and hand operation
│   ├── learning/            # Neural network training pipeline
│   ├── teleoperation/       # Teleoperation interfaces
│   └── utils/               # Utilities and constants
├── model/                   # MediaPipe hand tracking models
├── motor_limits/            # Calibrated motor limit files
├── configs/                 # Training configurations
├── examples/                # Usage examples
└── templates/               # Web interface templates

🔧 Configuration

Motor Configuration

Motor limits and USB ports are configured in ruka_hand/utils/constants.py:

  • USB_PORTS: Serial port assignments for left/right hands
  • MOTOR_RANGES_LEFT/RIGHT: Calibrated motor position ranges
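As a point of reference, here is a hypothetical sketch of how the motor range entries can be laid out (the actual keys and value format in constants.py may differ; the IDs and numbers below are placeholders, not real calibration results):

# Hypothetical layout; replace with the values produced by calibrate_motors.py.
MOTOR_RANGES_LEFT = {
    1: (1200, 2900),  # motor ID -> (min position, max position), placeholder values
    2: (1100, 3000),
    # one entry per actuator
}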

Hand Tracking Parameters

MediaPipe tracking parameters can be adjusted in mediapipe_teleop.py:

  • Detection confidence thresholds
  • Tracking confidence levels
  • Hand landmark processing settings
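For orientation, these map onto the constructor arguments of the standard MediaPipe Hands solution. A minimal sketch (the values and variable name here are illustrative, not necessarily what mediapipe_teleop.py uses):

import mediapipe as mp

# Illustrative MediaPipe Hands setup; mediapipe_teleop.py may use different values.
hands = mp.solutions.hands.Hands(
    static_image_mode=False,       # video stream, so landmarks are tracked across frames
    max_num_hands=1,               # teleoperate one hand at a time
    min_detection_confidence=0.7,  # detection confidence threshold
    min_tracking_confidence=0.5,   # tracking confidence level
)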

🤖 Supported Hardware

  • Servos: Dynamixel XL330-M288-T actuators
  • Controllers: USB2Dynamixel or similar Dynamixel interfaces
  • Hands: RUKA dexterous robotic hands (left/right configurations)
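The project's own motor control lives in ruka_hand/control/. For a standalone connectivity check with the upstream dynamixel_sdk package, a minimal sketch looks like this (the port path, baud rate, and motor ID are assumptions for your setup):

from dynamixel_sdk import PortHandler, PacketHandler, COMM_SUCCESS

# Standalone Dynamixel ping; port path, baud rate, and motor ID are assumptions.
port = PortHandler("/dev/tty.usbserial-XXXXXXXX")
packet = PacketHandler(2.0)  # XL330 servos speak Protocol 2.0

if port.openPort() and port.setBaudRate(57600):
    model, result, error = packet.ping(port, 1)  # ping motor ID 1
    if result == COMM_SUCCESS:
        print(f"Found Dynamixel model {model}")
    port.closePort()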

📊 Data Collection & Training

The system supports data collection for training improved control policies:

  1. Collect Data: Use collect_data.py to record human demonstrations
  2. Process Data: Preprocess collected data using tools in ruka_hand/learning/
  3. Train Controllers: Use train_controller.py with configuration files in configs/

🔍 Troubleshooting

Common Issues:

  • Port Detection: Run python port_check.py and update constants.py with correct USB ports
  • Motor Calibration: Ensure motors are properly calibrated using calibrate_motors.py
  • Camera Access: Grant camera permissions for MediaPipe tracking
  • Dependencies: Install all requirements with pip install -r requirements.txt

Motor Issues:

  • Check USB connections and power supply
  • Verify motor IDs match configuration in constants.py
  • Run motor reset script if servos are unresponsive
