An up-to-date, curated list of awesome IMU-based Human Activity Recognition (Ubiquitous Computing) papers, methods, and resources. Please note that most of the collected research is based on IMU data.
Many thanks to the following useful publications and repos: Jingdong Wang, Awesome-Deep-Vision, Awesome-Deep-Learning-Papers, Awesome-Self-Supervised-Learning, Awesome-Semi-Supervised-Learning and Awesome-Crowd-Counting.
Please feel free to contribute to this list.
- IJCAI, ACM Multimedia, AAAI, KDD, ICDM, TKDE, TIP, TNNLS, TPAMI, TMM, Pattern Recognition, AI, Nature Communications, Nature Digital Medicine, ICPR, Sensors, UbiComp (IMWUT journal)
 - https://github.com/OxWearables/Oxford_Wearables_Activity_Recognition
 
- Capture-24 [link]
 - mHealth [link]
 - HHAR [link]
 - Opportunity [link]
 - PAMAP2 [link]
 - GOTOV [link]
 - REALDISP [link]
 - UCIDSADS [link]
 - MMAct [link]
 - TotalCapture [link]
 - WISDM [link]
 - MotionSense [link]
 - MobiAct [link]
 - Fenland [link]
 - Salad 50 [link]
 - DIP [link]
 - LARa [link]
 - Human Inertial Pose [link]
 - Kinetics-400 dataset [link]
 - UCF-101 dataset [link]
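
Most of the datasets above ship raw multi-channel IMU streams with per-sample activity labels. As a quick orientation, here is a minimal sketch of the fixed-length sliding-window segmentation that most benchmarks use; the sampling rate, window length, step, and majority-label convention are illustrative assumptions, not any dataset's official protocol.

```python
import numpy as np

def sliding_windows(signal, labels, win_len, step):
    """Segment a (T, C) sensor stream into fixed-length windows.

    Each window is labelled by majority vote over the per-sample labels
    it covers -- the most common convention in the datasets listed above.
    """
    X, y = [], []
    for start in range(0, len(signal) - win_len + 1, step):
        end = start + win_len
        X.append(signal[start:end])
        y.append(np.bincount(labels[start:end]).argmax())  # majority label
    return np.stack(X), np.array(y)

# Synthetic example: 10 s of 100 Hz tri-axial accelerometer data
acc = np.random.randn(1000, 3)
lab = np.random.randint(0, 4, size=1000)
X, y = sliding_windows(acc, lab, win_len=200, step=100)  # 2 s windows, 50% overlap
print(X.shape, y.shape)  # (9, 200, 3) (9,)
```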
 
- EEG analysis/prediction/modelling: https://github.com/meagmohit/EEG-Datasets
 
- Large-Scale/Diverse Dataset Research
 - Multi-Modality: sensor-vision, sensor-skeleton, sensor-3D-pose, sensor-motion
 - Window selection
 - Generative Model: e.g., cross modality data generation, IMU2Skeleton
 - Handling the NULL-Class problem
 - Open-World, Real-World: complex/non-repetitive activities
 - Advanced models
 - Data-centric: active learning, unsupervised learning, semi-supervised learning, self-supervised learning
 - Action Segmentation
 - Are the existing settings/models reliable? (see the subject-wise evaluation sketch below)
 - Graph Representation
 - Motion-Capture, Kinetic
 - Privacy-related
 - Interpretability
 - Data Imbalance
 - Domain Adaptation
 - Fine-Grained
 - Multi-Label
 - Federated Learning
 - Ensemble
 - Knowledge Integration/Distillation
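
Several topics above (Domain Adaptation, "Are the existing settings/models reliable?") come down to how the evaluation split is made: random window-level splits leak near-identical windows from the same subject into both train and test sets (see "Let's (not) stick together" further down this list). Below is a minimal subject-wise (leave-one-subject-out) evaluation sketch; the random-forest classifier and synthetic features are placeholders, not a recommended model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut

# Placeholder data: 300 windows, 24 hand-crafted features, 5 activities, 6 subjects
X = np.random.randn(300, 24)
y = np.random.randint(0, 5, size=300)
subjects = np.repeat(np.arange(6), 50)  # per-window subject IDs

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    scores.append(f1_score(y[test_idx], pred, average="macro"))

print(f"mean macro-F1 across held-out subjects: {np.mean(scores):.3f}")
```
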
 
- Body-Area Capacitive or Electric Field Sensing for Human Activity Recognition and Human-Computer Interaction: A Comprehensive Survey
 - A Survey on Deep Learning for Human Activity Recognition (ACM Computing Surveys (CSUR)) [paper]
 - Applying Machine Learning for Sensor Data Analysis in Interactive Systems: Common Pitfalls of Pragmatic Use and Ways to Avoid Them (ACM Computing Surveys (CSUR)) [paper]
 - [DL4SAR] Deep Learning for Sensor-based Activity Recognition: A Survey (Pattern Recognition Letters) [paper][code]
 - Deep Learning for Sensor-based Human Activity Recognition: Overview, Challenges, and Opportunities (ACM Computing Surveys (CSUR)) [paper]
 - Human Action Recognition from Various Data Modalities: A Review (IEEE TPAMI 2022, top AI journal) [paper]
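
As a companion to the deep-learning surveys above, here is a minimal DeepConvLSTM-style baseline sketch: temporal convolutions over an IMU window, a recurrent layer, and a linear classifier. The window length, channel count, class count, and layer sizes are assumptions for illustration; this is not a re-implementation of any surveyed model.

```python
import torch
import torch.nn as nn

class ConvLSTMBaseline(nn.Module):
    """Small convolutional-recurrent classifier for windowed IMU data."""

    def __init__(self, n_channels=3, n_classes=6, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, window_len, n_channels)
        x = x.transpose(1, 2)        # -> (batch, n_channels, window_len) for Conv1d
        x = self.conv(x)             # temporal features: (batch, 64, window_len)
        x = x.transpose(1, 2)        # -> (batch, window_len, 64) for the LSTM
        _, (h, _) = self.lstm(x)     # h: (1, batch, hidden), final hidden state
        return self.head(h[-1])      # class logits: (batch, n_classes)

model = ConvLSTMBaseline()
logits = model(torch.randn(8, 200, 3))  # 8 windows of 200 samples, 3 axes
print(logits.shape)                     # torch.Size([8, 6])
```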
 
- IMUGPT 2.0: Language-Based Cross Modality Transfer for Sensor-Based Human Activity Recognition
 - Self-supervised Learning for Accelerometer-based Human Activity Recognition: A Survey
 - Ask Less, Learn More: Adapting Ecological Momentary Assessment Survey Length by Modeling Question-Answer Information Gain
 - GoalTrack: Supporting Personalized Goal-Setting in Stroke Rehabilitation with Multimodal Activity Journaling
 - Temporal Action Localization for Inertial-based Human Activity Recognition
 - WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition
 - Collecting Self-reported Physical Activity and Posture Data Using Audio-based Ecological Momentary Assessment
 - EarSleep: In-ear Acoustic-based Physical and Physiological Activity Recognition for Sleep Stage Detection
 - AutoAugHAR: Automated Data Augmentation for Sensor-based Human Activity Recognition
 - CrossHAR: Generalizing Cross-dataset Human Activity Recognition via Hierarchical Self-Supervised Pretraining
 - Changing Your Tune: Lessons for Using Music to Encourage Physical Activity
 - The EarSAVAS Dataset: Enabling Subject-Aware Vocal Activity Sensing on Earables
 - Self-supervised learning for Human Activity Recognition Using 700,000 Person-days of Wearable Data
 - HARMamba: Efficient Wearable Sensor Human Activity Recognition Based on Bidirectional Selective SSM
 - HyperHAR: Inter-sensing Device Bilateral Correlations and Hyper-correlations Learning Approach for Wearable Sensing Device Based Human Activity Recognition
 - Lateralization Effects in Electrodermal Activity Data Collected Using Wearable Devices
 - Body-Area Capacitive or Electric Field Sensing for Human Activity Recognition and Human-Computer Interaction: A Comprehensive Survey
 - exHAR: An Interface for Helping Non-Experts Develop and Debug Knowledge-based Human Activity Recognition Systems
 - Kirigami: Lightweight Speech Filtering for Privacy-Preserving Activity Recognition using Audio
 - Co-Designing Sensory Feedback for Wearables to Support Physical Activity through Body Sensations
 - Semantic Loss: A New Neuro-Symbolic Approach for Context-Aware Human Activity Recognition
 - CAvatar: Real-time Human Activity Mesh Reconstruction via Tactile Carpets
 - Deep Heterogeneous Contrastive Hyper-Graph Learning for In-the-Wild Context-Aware Human Activity Recognition
 - SF-Adapter: Computational-Efficient Source-Free Domain Adaptation for Human Activity Recognition
 - Spatial-Temporal Masked Autoencoder for Multi-Device Wearable Human Activity Recognition
 - Optimization-Free Test-Time Adaptation for Cross-Person Activity Recognition
 - TextureSight: Texture Detection for Routine Activity Awareness with Wearable Laser Speckle Imaging
 - TS2ACT: Few-Shot Human Activity Sensing with Cross-Modal Co-Learning
 
- Integrating Gaze and Mouse Via Joint Cross-Attention Fusion Net for Students' Activity Recognition in E-learning
 - VAX: Using Existing Video and Audio-based Activity Recognition Models to Bootstrap Privacy-Sensitive Sensors
 - LAUREATE: A Dataset for Supporting Research in Affective Computing and Human Memory Augmentation
 - MMTSA: Multi-Modal Temporal Segment Attention Network for Efficient Human Activity Recognition
 - HMGAN: A Hierarchical Multi-Modal Generative Adversarial Network Model for Wearable Human Activity Recognition
 - TAO: Context Detection from Daily Activity Patterns Using Temporal Analysis and Ontology
 - HAKE: Human Activity Knowledge Engine [link]
 - PhysiQ: Off-site Quality Assessment of Exercise in Physical Therapy [link]
 - SAMoSA: Sensing Activities with Motion and Subsampled Audio [link]
 - Physical-aware Cross-modal Adversarial Network for Wearable Sensor-based Human Action Recognition [link]
 - IMU2CLIP: Multimodal Contrastive Learning for IMU Motion Sensors from Egocentric Videos and Text
 - Real-time Context-Aware Multimodal Network for Activity and Activity-Stage Recognition from Team Communication in Dynamic Clinical Settings
 - X-CHAR: A Concept-based Explainable Complex Human Activity Recognition Model
 - Hierarchical Clustering-based Personalized Federated Learning for Robust and Fair Human Activity Recognition
 - AMIR: Active Multimodal Interaction Recognition from Video and Network Traffic in Connected Environments
 - Narrative-Based Visual Feedback to Encourage Sustained Physical Activity: A Field Trial of the WhoIsZuki Mobile Health Platform
 - Human Parsing with Joint Learning for Dynamic mmWave Radar Point Cloud
 - RF-CM: Cross-Modal Framework for RF-enabled Few-Shot Human Activity Recognition
 - PrISM-Tracker: A Framework for Multimodal Procedure Tracking Using Wearable Sensors and State Transition Information with User-Driven Handling of Errors and Uncertainty
 - Self-supervised Learning for Human Activity Recognition Using 700,000 Person-days of Wearable Data
 - GLOBEM: Cross-Dataset Generalization of Longitudinal Human Behavior Modeling
 - TransFloor: Transparent Floor Localization for Crowdsourcing Instant Delivery
 - Understanding the Mechanism of Through-Wall Wireless Sensing: A Model-based Perspective
 - Unveiling Causal Attention in Dogs' Eyes with Smart Eyewear
 - MHCCL: Masked Hierarchical Cluster-wise Contrastive Learning for Multivariate Time Series
 
- Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency
 - MaeFE: Masked Autoencoders Family of Electrocardiogram for Self-supervised Pre-training and Transfer Learning
 - A Simple Self-Supervised IMU Denoising Method For Inertial Aided Navigation
 - Adaptive Memory Networks with Self-supervised Learning for Unsupervised Anomaly Detection
 - MHCCL: Masked Hierarchical Cluster-wise Contrastive Learning for Multivariate Time Series
 - FLAME: Federated Learning across Multi-device Environments
 - Longitudinal cardio-respiratory fitness prediction through wearables in free-living environments
 - Self-supervised transfer learning of physiological representations from free-living wearable data
 - Learning Generalizable Physiological Representations from Large-scale Wearable Data
 - Application-Driven AI Paradigm for Human Action Recognition
 - A hybrid accuracy-and energy-aware human activity recognition model in IoT environment
 - Predicting Performance Improvement of Human Activity Recognition Model by Additional Data Collection
 - SAMoSA: Sensing Activities with Motion and Subsampled Audio
 - Towards Ubiquitous Personalized Music Recommendation with Smart Bracelets
 - Augmented Adversarial Learning for Human Activity Recognition with Partial Sensor Sets
 - Bootstrapping Human Activity Recognition Systems for Smart Homes from Scratch
 - Towards a Dynamic Inter-Sensor Correlations Learning Framework for Multi-Sensor-Based Wearable Human Activity Recognition [link]
 - Cosmo: Contrastive Fusion Learning with Small Data for Multimodal Human Activity Recognition
 - What Makes Good Contrastive Learning on Small-Scale Wearable-based Tasks?
 - ClusterFL: a similarity-aware federated learning system for human activity recognition
 - Human Action Recognition from Various Data Modalities: A Review (IEEE TPAMI 2022 (top AI Journal))
 - Semantic-Discriminative Mixup for Generalizable Sensor-based Cross-domain Activity Recognition
 - Are You Left Out?: An Efficient and Fair Federated Learning for Personalized Profiles on Wearable Devices of Inferior Networking Conditions
 - Progressive Cross-modal Knowledge Distillation for Human Action Recognition [link]
 - Leveraging Sound and Wrist Motion to Detect Activities of Daily Living with Commodity Smartwatches
 - I Want to Know Your Hand: Authentication on Commodity Mobile Phones Based on Your Hand's Vibrations
 - CSI:DeSpy: Enabling Effortless Spy Camera Detection via Passive Sensing of User Activities and Bitrate Variations
 - Acceleration-based Activity Recognition of Repetitive Works with Lightweight Ordered-work Segmentation Network
 - IF-ConvTransformer: A Framework for Human Activity Recognition Using IMU Fusion and ConvTransformer
 - Quali-Mat: Evaluating the Quality of Execution in Body-Weight Exercises with a Pressure Sensitive Sports Mat
 - Non-Bayesian Out-of-Distribution Detection Applied to CNN Architectures for Human Activity Recognition
 - Resource-Efficient Continual Learning for Sensor-Based Human Activity Recognition
 - Beyond the Gates of Euclidean Space: Temporal-Discrimination-Fusions and Attention-based Graph Neural Network for Human Activity Recognition
 - LiteHAR: Lightweight Human Activity Recognition from WiFi Signals with Random Convolution Kernels
 - A Review on Topological Data Analysis in Human Activity Recognition
 - Deep CNN-LSTM with Self-Attention Model for Human Activity Recognition using Wearable Sensor
 - Zero-Shot Learning for IMU-Based Activity Recognition Using Video Embeddings
 - Deep Transfer Learning with Graph Neural Network for Sensor-Based Human Activity Recognition
 - Meta-learning meets the Internet of Things: Graph prototypical models for sensor-based human activity recognition
 - Federated Multi-Task Learning
 - Unsupervised Human Activity Recognition Using the Clustering Approach: A Review
 - Hierarchical Self Attention Based Autoencoder for Open-Set Human Activity Recognition
 - Assessing the State of Self-Supervised Human Activity Recognition using Wearables
 - Robust and Efficient Uncertainty Aware Biosignal Classification via Early Exit Ensembles
 - Machine learning detects altered spatial navigation features in outdoor behaviour of Alzheimer’s disease patients
 - Evaluating Contrastive Learning on Wearable Timeseries for Downstream Clinical Outcomes
 - Segmentation-free Heart Pathology Detection Using Deep Learning
 - Anticipatory Detection of Compulsive Body-focused Repetitive Behaviors with Wearables
 - Method and system for automatic extraction of virtual on-body inertial measurement units
 - Enhancing the Security & Privacy of Wearable Brain-Computer Interfaces
 - Detecting Smartwatch-Based Behavior Change in Response to a Multi-Domain Brain Health Intervention
 - ColloSSL: Collaborative Self-Supervised Learning for Human Activity Recognition
 - Multi-scale Deep Feature Learning for Human Activity Recognition Using Wearable Sensors
 - Improving Wearable-Based Activity Recognition Using Image Representations
 - Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges
 - A recurrent neural network architecture to model physical activity energy expenditure in older people
 - Application of artificial intelligence in wearable devices: Opportunities and challenges
 - A Close Look into Human Activity Recognition Models using Deep Learning
 - YONO: Modeling Multiple Heterogeneous Neural Networks on Microcontrollers
 - CogAx: Early Assessment of Cognitive and Functional Impairment from Accelerometry
 - Deep Temporal Conv-LSTM for Activity Recognition
 - Human Activity Recognition from Wearable Sensor Data Using Self-Attention
 - Combined deep centralized coordinate learning and hybrid loss for human activity recognition
 - Real-time human activity recognition using conditionally parametrized convolutions on mobile and wearable devices
 - Proposing a Fuzzy Soft-max-based classifier in a hybrid deep learning architecture for human activity recognition
 - HAR-GCNN: Deep Graph CNNs for Human Activity Recognition From Highly Unlabeled Mobile Sensor Data
 - Sensor-based human activity recognition using fuzzified deep CNN architecture with λmax method
 - WearRF-CLA: Continuous Location Authentication with Wrist Wearables and UHF RFID
 - Improving the Performance of Open-Set Classification in Human Activity Recognition by Applying a Residual Neural Network Architecture
 - UBIWEAR: An End-To-End Framework for Intelligent Physical Activity Prediction With Machine and Deep Learning
 - High-Precision and Personalized Wearable Sensing Systems for Healthcare Applications
 - DANA: Dimension-Adaptive Neural Architecture
 - DeXAR: Deep Explainable Sensor-Based Activity Recognition in Smart-Home Environments
 - Latent Independent Excitation for Generalizable Sensor-based Cross-Person Activity Recognition
 - The Severity Prediction of The Binary And Multi-Class Cardiovascular Disease -- A Machine Learning-Based Fusion Approach
 - An Unsupervised User Adaptation Model for Multiple Wearable Sensors Based Human Activity Recognition
 - Machine Learning on Clinical Time Series: Classification and Representation Learning
 - Learning Disentangled Behaviour Patterns for Wearable-based Human Activity Recognition
 - Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data
 - IMU2Doppler: Cross-Modal Domain Adaptation for Doppler-based Activity Recognition Using IMU Data
 - A CNN-based Human Activity Recognition System Combining a Laser Feedback Interferometry Eye Movement Sensor and an IMU for Context-aware Smart Glasses
 - Winect: 3D Human Pose Tracking for Free-form Activity Using Commodity WiFi
 - KATN: Key Activity Detection via Inexact Supervised Learning
 - Fusing Visual and Inertial Sensors with Semantics for 3D Human Pose Estimation
 - Multi-gat: A graphical attention-based hierarchical multimodal representation learning approach for human activity recognition
 - Semantics-aware adaptive knowledge distillation for sensor-to-vision action recognition
 - Eldersim: A synthetic data generation platform for human action recognition in eldercare applications
 - Home action genome: Cooperative compositional action understanding
 - Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition
 - Sensor-Augmented Egocentric-Video Captioning with Dynamic Modal Attention
 - Disentanglement Approach for Video Action Recognition
 - Fusion-GCN: Multimodal Action Recognition using Graph Convolutional Networks
 - Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art
 
- Approaching the Real-World: Supporting Activity Recognition Training with Virtual IMU Data [paper]
 - Can You See It? Good, So We Can Sense It! [paper]
 - An Ensemble of ConvTransformer Networks for the Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge [paper]
 - Fast Deep Neural Architecture Search for Wearable Activity Recognition by Early Prediction of Converged Performance [paper]
 - Human Activity Recognition Based on Acceleration Data From Smartphones Using HMMs [paper]
 - On the Role of Context Length for Feature Extraction and Sequence Modeling in Human Activity Recognition [paper]
 - ObscureNet: Learning Attribute-invariant Latent Representation for Anonymizing Sensor Data [paper]
 - SenseCollect: We Need Efficient Ways to Collect On-body Sensor-based Human Activity Data! [paper]
 - Self-supervised Learning for Reading Activity Classification [paper]
 - Reducing Muscle Activity when Playing Tremolo by Using Electrical Muscle Stimulation to Learn Efficient Motor Skills [paper]
 - Pushing the Limits of Long Range Wireless Sensing with LoRa [paper]
 - CardiacWave: A mmWave-based Scheme of Non-Contact and High-Definition Heart Activity Computing [paper]
 - Multimodal Federated Learning [paper]
 - A Deep Learning-Based Framework for Human Activity Recognition in Smart Homes [paper]
 - Interactive Hybrid Intelligence Systems for Human-Ai/Robot Collaboration: Improving the Practices of Physical Stroke Rehabilitation [paper]
 - Continual Activity Recognition with Generative Adversarial Networks [paper]
 - A multibranch CNN-BiLSTM model for human activity recognition using wearable sensor data [paper]
 - Unsupervised User Adaptation Model for Multiple Wearable Sensors Based Human Activity Recognition [paper]
 - ClusterFL: A Similarity-Aware Federated Learning System for Human Activity Recognition (MobiSys) [paper]
 - Improving Deep Learning for HAR with shallow LSTMs (ISWC/ubicomp) [paper]
 - Contrastive Predictive Coding for Human Activity Recognition (IMWUT/ubicomp) [paper]
 - Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data (IMWUT/ubicomp) [paper]
 - Watching Your Phone's Back: Gesture Recognition by Sensing Acoustical Structure-borne Propagation (IMWUT/ubicomp) [paper]
 - ApneaDetector: Detecting Sleep Apnea with Smartwatches (IMWUT/ubicomp) [paper]
 - NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables (IMWUT/ubicomp) [paper]
 - We Hear Your PACE: Passive Acoustic Localization of Multiple Walking Persons (IMWUT/ubicomp) [paper]
 - mTeeth: Identifying Brushing Teeth Surfaces Using Wrist-Worn Inertial Sensors (IMWUT/ubicomp) [paper]
 - Acoustic-based Upper Facial Action Recognition for Smart Eyewear (IMWUT/ubicomp) [paper]
 - Two-Stream Convolution Augmented Transformer for Human Activity Recognition (AAAI 2021) [paper]
 - Unsupervised Human Activity Representation Learning with Multi-task Deep Clustering (IMWUT/ubicomp) [paper]
 - Attend and Discriminate: Beyond the State-of-the-Art for Human Activity Recognition Using Wearable Sensors (IMWUT/ubicomp) [paper]
 - SelfHAR: Improving Human Activity Recognition through Self-training with Unlabeled Data (IMWUT/ubicomp) [paper]
 - Latent Independent Excitation for Generalizable Sensor-based Cross-Person Activity Recognition (AAAI 2021) [paper]
 - Weakly-Supervised Sensor-based Activity Segmentation and Recognition via Learning from Distributions (Artificial Intelligence (AIJ)) [paper]
 
- GlobalFusion: A Global Attentional Deep Learning Framework for Multisensor Information Fusion (IMWUT/ubicomp) [paper]
 - METIER: A Deep Multi-Task Learning Based Activity and User Recognition Model Using Wearable Sensors (IMWUT/ubicomp) [paper]
 - Instance-Wise Dynamic Sensor Selection for Human Activity Recognition (AAAI 2020) [paper]
 - Cross-Dataset Activity Recognition via Adaptive Spatial-Temporal Transfer Learning (IMWUT/ubicomp) [paper]
 - MARS: Mixed Virtual and Real Wearable Sensors for Human Activity Recognition with Multi-Domain Deep Learning Model [arXiv]
 - Towards Deep Clustering of Human Activities from Wearables (ISWC/ubicomp) [paper]
 - [UDA4HAR] A Systematic Study of Unsupervised Domain Adaptation for Robust Human-Activity Recognition (IMWUT/ubicomp) [paper]
 - Adversarial Multi-view Networks for Activity Recognition (IMWUT/ubicomp) [paper]
 - Weakly Supervised Multi-Task Representation Learning for Human Activity Analysis Using Wearables (IMWUT/ubicomp) [paper]
 - [IMUTube] IMUTube: Automatic Extraction of Virtual on-body Accelerometry from Video for Human Activity Recognition (IMWUT/ubicomp) [paper]
 - Robust Unsupervised Factory Activity Recognition with Body-worn Accelerometer Using Temporal Structure of Multiple Sensor Data Motifs (IMWUT/ubicomp) [paper]
 - Masked reconstruction based self-supervision for human activity recognition (ISWC/ubicomp) [paper]
 - Digging deeper: towards a better understanding of transfer learning for human activity recognition (ISWC/ubicomp) [paper]
 - IndRNN based long-term temporal recognition in the spatial and frequency domain (ISWC/ubicomp) [paper]
 - Tackling the SHL challenge 2020 with person-specific classifiers and semi-supervised learning (ISWC/ubicomp) [paper]
 - DenseNetX and GRU for the sussex-huawei locomotion-transportation recognition challenge (ISWC/ubicomp) [paper]
 
- A Novel Distribution-Embedded Neural Network for Sensor-Based Activity Recognition (IJCAI) [paper][code]
 - Learning Bodily and Temporal Attention in Protective Movement Behavior Detection
 - [AttnSense] AttnSense: Multi-level Attention Mechanism For Multimodal Human Activity Recognition (IJCAI) [paper]
 - Multi-agent Attentional Activity Recognition (IJCAI) [paper][code]
 - Distribution-based Semi-Supervised Learning for Activity Recognition (AAAI) [paper][code]
 - On the Role of Features in Human Activity Recognition (ISWC/ubicomp) [paper]
 - Handling Annotation Uncertainty in Human Activity Recognition (ISWC/ubicomp) [paper]
 - Leveraging Active Learning and Conditional Mutual Information to Minimize Data Annotation in Human Activity Recognition (IMWUT/ubicomp) [paper]
 
- [Vision2Sensor] Vision2Sensor: Knowledge Transfer Across Sensing Modalities for Human Activity Recognition (IMWUT/ubicomp) [paper]
 - How Does a Nation Walk? Interpreting Large-Scale Step Count Activity with Weekly Streak Patterns (IMWUT/ubicomp) [paper]
 
- Understanding and Improving Recurrent Networks for Human Activity Recognition by Continuous Attention (ISWC/ubicomp) [paper]
 - On specialized window lengths and detector based human activity recognition (ISWC/ubicomp) [paper]
 - Adding structural characteristics to distribution-based accelerometer representations for activity recognition using wearables (ISWC/ubicomp) [paper]
 - On Attention Models for Human Activity Recognition (ISWC/ubicomp) [paper]
 - [AROMA] AROMA: A Deep Multi-Task Learning Based Simple and Complex Human Activity Recognition Method Using Wearable Sensors (IMWUT/ubicomp) [paper]
 
- [EnsemblesLSTM] Ensembles of Deep LSTM Learners for Activity Recognition using Wearables (IMWUT/ubicomp) [paper] [Tensorflow]
 - Deep Learning for Sensor-based Activity Recognition: A Survey (Pattern Recognition Letters) [paper]
 - Activity Recognition for Quality Assessment of Batting Shots in Cricket using a Hierarchical Representation (IMWUT/ubicomp) [paper]
 - Label Propagation: An Unsupervised Similarity Based Method for Integrating New Sensors in Activity Recognition Systems (IMWUT/ubicomp) [paper]
 - CNN-based sensor fusion techniques for multimodal human activity recognition (ISWC/ubicomp) [paper]
 
- Learning from less for better: semi-supervised activity recognition via shared structure discovery (ubicomp) [paper]
 - Wearable sensor based multimodal human activity recognition exploiting the diversity of classifier ensemble (ubicomp) [paper]
 
- Beyond activity recognition: skill assessment from accelerometer data (ubicomp) [paper]
 - I did not smoke 100 cigarettes today!: avoiding false positives in real-world activity recognition (ubicomp) [paper]
 - Let's (not) stick together: pairwise similarity biases cross-validation in activity recognition (ubicomp) [paper]
 - Improved activity recognition by using enriched acceleration data (ubicomp) [paper]
 - A field study comparing approaches to collecting annotated activity data in real-world settings (ubicomp) [paper]
 - Personalization revisited: a reflective approach helps people better personalize health services and motivates them to increase physical activity (ubicomp) [paper]
 
- Monitoring household activities and user location with a cheap, unobtrusive thermal sensor array (ubicomp) [paper]
 - Connecting personal-scale sensing and networked community behavior to infer human activities (ubicomp) [paper]
 - Using electrodermal activity to recognize ease of engagement in children during social interactions (ubicomp) [paper]
 
- Fine-Grained Sharing of Sensed Physical Activity: A Value Sensitive Approach (ubicomp) [paper]
 - Towards zero-shot learning for human activity recognition using semantic attribute sequence model (ubicomp) [paper]
 - Personalized mobile physical activity recognition (ubicomp) [paper]
 - A Hybrid Unsupervised/Supervised Model for Group Activity Recognition (ubicomp) [paper]
 - Confidence-based Multiclass AdaBoost for Physical Activity Monitoring (ubicomp) [paper]
 - An exploration with online complex activity recognition using cellphone accelerometer (ubicomp) [paper]
 - [UniPad] UniPad: Orchestrating collaborative activities through shared tablets and an integrated wall display (ubicomp) [paper]
 - Human Activity Recognition Using Heterogeneous Sensors (ubicomp) [paper]
 - A probabilistic ontological framework for the recognition of multilevel human activities (ubicomp) [paper]
 - Ubiquitous support for midwives to leverage daily activities (ubicomp) [paper]
 - Combining Embedded Accelerometers with Computer Vision for Recognizing Food Preparation Activities (ubicomp) [paper]
 
- A Spark Of Activity: Exploring Information Art As Visualization For Physical Activity (ubicomp) [paper]
 - [BodyScope] BodyScope: A Wearable Acoustic Sensor for Activity Recognition (ubicomp) [paper]
 - An Integrated Framework for Human Activity Classification (ubicomp) [paper]
 
- The Place for Ubiquitous Computing in Schools: Lessons Learned from a School-Based Intervention for Youth Physical Activity (ubicomp) [paper]
 - [CSN] Enabling Large-scale Human Activity Inference on Smartphones using Community Similarity Networks (ubicomp) [paper]
 
- Using Wearable Activity Type Detection to Improve Physical Activity Energy Expenditure Estimation (ubicomp) [paper]
 

