A comprehensive markerless gait analysis system that uses computer vision and machine learning to analyze human walking patterns.
This system provides a complete pipeline for markerless gait analysis using:
- Computer Vision: Real-time pose estimation with MediaPipe
- Machine Learning: Temporal Convolutional Networks (TCN) for gait pattern analysis
- Data Processing: Advanced preprocessing and feature extraction
- Visualization: Real-time pose visualization with trail effects
- Unified Pose Estimation: Extensible architecture supporting multiple pose estimation backends
- TCN Architecture: Temporal sequence modeling for gait analysis
- Advanced Analytics: Gait event detection, phase analysis, and performance metrics
- Real-time Visualization: Interactive pose visualization with trail effects
- Modular Design: Easy to extend with new pose models and analysis methods
- Cross-validation: Robust evaluation pipeline with comprehensive metrics
- Organized Outputs: All results centralized in the `outputs/` directory
The unified pose processor manager makes it easy to add new pose estimation models:
- Create a new processor class inheriting from `PoseProcessor`
- Implement the required abstract methods
- Add the model to the `AVAILABLE_MODELS` dictionary
- Update the `create_processor` method
```
gait_analysis/
├── core/                                  # Core system modules
│   ├── utils/                             # Utility modules
│   │   ├── constants.py                   # Core constants
│   │   ├── config.py                      # Configuration management
│   │   └── logging_config.py              # Logging configuration
│   ├── pose_processor_manager.py          # Unified pose processor manager
│   ├── mediapipe_integration.py           # MediaPipe pose estimation
│   ├── gait_data_preprocessing.py         # Data preprocessing and feature extraction
│   ├── gait_training.py                   # Training and evaluation module
│   └── tcn_gait_model.py                  # Temporal Convolutional Network model
├── usecases/                              # Use case implementations
│   ├── gait_analysis/                     # Main gait analysis use case
│   │   ├── features/                      # Feature-specific implementations
│   │   │   └── realtime_pose_visualization.py  # Real-time visualization
│   │   ├── utils.py                       # Utilities for quick analysis
│   │   └── main_gait_analysis.py          # Main pipeline orchestrator
│   └── testing/                           # Testing and validation
│       ├── test_pose_models.py            # Pose model testing and comparison
│       └── test_system.py                 # System testing and validation
├── scripts/                               # Utility scripts
│   ├── pose_model_comparison.py           # Pose model comparison tool
│   └── run_gait_analysis.py               # Gait analysis runner
├── configs/                               # Configuration files
│   ├── default.json                       # Default configuration
│   └── gait_analysis.json                 # Configuration for pose models
├── docs/                                  # Documentation
│   ├── visualizations/                    # Generated visualizations
│   ├── README_RealTime_Visualization.md   # Real-time visualization docs
│   ├── README_TCN_Gait_Analysis.md        # TCN system documentation
│   ├── README_Installation.md             # Installation guide
│   └── README_Changelog.md                # Project changelog and history
├── archive/                               # Legacy scripts (see archive/README.md)
├── data/                                  # Input data directory
│   └── models/                            # Trained models
├── videos/                                # Video files directory
│   ├── raw/                               # Raw video files
│   └── sneak/                             # Sneak gait videos
└── outputs/                               # Output results directory
    ├── gait_analysis/                     # Gait analysis results
    ├── mediapipe/                         # MediaPipe outputs
    ├── test_results/                      # Test results
    ├── logs/                              # Application logs
    ├── models/                            # Trained models
    └── visualizations/                    # Generated visualizations
```
On macOS/Linux:
```bash
./setup_environment.sh
```
On Windows:
```bash
setup_environment.bat
```
Then activate the virtual environment:
```bash
source .venv/bin/activate   # macOS/Linux
# or
.venv\Scripts\activate      # Windows
```
Run the test scripts to verify the setup:
```bash
# Test the complete system
python3 usecases/testing/test_system.py

# Test pose models specifically
python3 usecases/testing/test_pose_models.py

# Show available models
python3 scripts/pose_model_comparison.py --info
```
```bash
# Basic gait analysis with MediaPipe
python3 usecases/gait_analysis/main_gait_analysis.py \
    --videos videos/raw/sample.mp4 \
    --output outputs/gait_analysis/

# Pose detection only
python3 usecases/gait_analysis/main_gait_analysis.py \
    --videos videos/raw/sample.mp4 \
    --pose-detection-only

# With real-time visualization
python3 usecases/gait_analysis/main_gait_analysis.py \
    --videos videos/raw/sample.mp4 \
    --with-visualization
```

Characteristics of the MediaPipe backend:
- Speed: Fast, real-time processing
- Accuracy: Good for most applications
- Resource Usage: Low, works on CPU
- Best For: Real-time applications, mobile/edge devices
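
For reference, the sketch below shows how a video frame is typically run through MediaPipe's Python pose solution. It is a generic illustration of the backend idea, not the code in `core/mediapipe_integration.py`, and the video path and thresholds are placeholders.

```python
# Generic MediaPipe pose-estimation sketch (illustrative only; the project's
# actual backend is core/mediapipe_integration.py).
import cv2
import mediapipe as mp

def extract_keypoints(video_path, model_complexity=1):
    """Yield (frame_index, landmarks) for every frame with a detected pose."""
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(model_complexity=model_complexity,
                                min_detection_confidence=0.5,
                                min_tracking_confidence=0.5) as pose:
        frame_idx = 0
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                yield frame_idx, results.pose_landmarks.landmark
            frame_idx += 1
    cap.release()
```

Each landmark carries normalized x, y, z coordinates and a visibility score, which is the kind of per-joint time series the preprocessing and gait analysis stages can work from.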
The system is designed to easily support additional pose estimation models:
- Create a new processor class that inherits from `PoseProcessor`
- Implement the required abstract methods
- Add the model to the `AVAILABLE_MODELS` dictionary in `PoseProcessorManager`
- Update the `create_processor` method to handle the new model type
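
As a rough sketch of what such an extension could look like (the method names `initialize` and `process_frame` are assumptions for illustration, not the actual abstract interface defined in `core/pose_processor_manager.py`):

```python
# Hypothetical new backend; method names and registration notes are assumptions
# about the PoseProcessor interface, not the repository's real API.
from core.pose_processor_manager import PoseProcessor

class OpenPoseProcessor(PoseProcessor):
    """Example backend wrapping a hypothetical OpenPose binding."""

    def initialize(self, config: dict) -> None:
        # Load weights, set confidence thresholds, etc.
        self.confidence_threshold = config.get("confidence_threshold", 0.5)

    def process_frame(self, frame):
        # Run inference and return keypoints in the common format the
        # gait preprocessing pipeline expects.
        raise NotImplementedError("plug in the actual OpenPose inference here")

# Remaining steps: register "openpose" in AVAILABLE_MODELS inside
# PoseProcessorManager and have create_processor() return OpenPoseProcessor
# for that key.
```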
```bash
# Compare available models on the same video
python3 scripts/pose_model_comparison.py --video videos/raw/sample.mp4 --compare

# Process with a specific model
python3 usecases/gait_analysis/main_gait_analysis.py \
    --videos videos/raw/sample.mp4 \
    --pose-model mediapipe
```

The system includes an interactive real-time pose visualization tool that displays pose keypoints as colored dots with trail effects (a minimal sketch of the trail idea follows the keyboard controls below).
```bash
# Basic visualization with trail effect
python3 usecases/gait_analysis/features/realtime_pose_visualization.py videos/raw/sample.mp4

# Show confidence values
python3 usecases/gait_analysis/features/realtime_pose_visualization.py videos/raw/sample.mp4 --show-confidence

# Fast performance mode
python3 usecases/gait_analysis/features/realtime_pose_visualization.py videos/raw/sample.mp4 --model-complexity 0 --no-trail
```

Keyboard controls:
- 'q': Quit visualization
- 't': Toggle trail effect
- 'c': Toggle connections
- 'r': Reset trail
- SPACE: Pause/resume
- '1', '2', '3': Change model complexity
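
The trail effect itself amounts to redrawing the last few frames' keypoints with fading size and color. A minimal OpenCV sketch of that idea (not the tool's actual implementation; the trail length is an arbitrary example value):

```python
# Illustrative trail-drawing helper; the real tool is
# usecases/gait_analysis/features/realtime_pose_visualization.py.
from collections import deque
import cv2

TRAIL_LENGTH = 15                    # example value: frames kept in the trail
trail = deque(maxlen=TRAIL_LENGTH)   # oldest entries are dropped automatically

def draw_trail(frame, keypoints_px):
    """Draw the current keypoints plus a fading trail of previous positions.

    keypoints_px: list of (x, y) pixel coordinates for the current frame.
    """
    trail.append(keypoints_px)
    for age, past_points in enumerate(trail):
        # Oldest entries (age 0) get the smallest, dimmest dots.
        fade = (age + 1) / len(trail)
        radius = max(1, int(4 * fade))
        color = (0, int(255 * fade), int(255 * fade))  # BGR, fades toward yellow
        for x, y in past_points:
            cv2.circle(frame, (int(x), int(y)), radius, color, thickness=-1)
    return frame
```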
All results are organized in the outputs/ directory:
```
outputs/
├── gait_analysis/                 # Main gait analysis results
│   ├── cv_metrics.json            # Cross-validation metrics
│   ├── fold_scores.json           # Per-fold performance
│   ├── training_histories.json    # Training curves data
│   ├── classification_report.txt  # Detailed classification report
│   ├── confusion_matrix.png       # Confusion matrix visualization
│   ├── training_curves.png        # Training curves plot
│   └── detailed_results.json      # Complete results summary
├── mediapipe/                     # MediaPipe pose detection outputs
├── test_results/                  # Testing and validation results
├── logs/                          # Application logs
├── visualizations/                # Charts, graphs, and visual outputs
└── models/                        # Trained models and artifacts
```
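
Because the metric files are plain JSON, they are easy to inspect programmatically. A minimal example using the file names from the listing above (the exact keys depend on the training run):

```python
# Print whatever cross-validation metrics a training run produced.
import json
from pathlib import Path

metrics_path = Path("outputs/gait_analysis/cv_metrics.json")
cv_metrics = json.loads(metrics_path.read_text())

for name, value in cv_metrics.items():
    print(f"{name}: {value}")
```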
- Real-time Visualization: Interactive pose visualization guide
- TCN Gait Analysis: Comprehensive TCN system documentation
- Installation Guide: Detailed setup instructions
- Core Modules: Core system modules documentation
- Changelog: Project history and changes
- Archive: Legacy scripts and migration notes
The system uses JSON configuration files for customization:
```json
{
  "pose_model": "mediapipe",
  "task_type": "phase_detection",
  "num_classes": 4,
  "num_filters": 64,
  "kernel_size": 3,
  "num_blocks": 4,
  "dropout_rate": 0.2,
  "learning_rate": 0.001,
  "epochs": 100,
  "batch_size": 32
}
```
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- MediaPipe team for the pose estimation framework
- TensorFlow/Keras community for the deep learning framework
- OpenCV community for computer vision tools
Note: Legacy scripts from the initial development phase have been moved to the archive/ directory. See archive/README.md for details about the archived files and migration notes.