DisCovHAR: Contrastive Attention for Human Activity Recognition Under Distribution Shifts

IEEE Internet of Things Journal

Publication Date

June 1, 2025

Author(s)

Luke Chen, Mohanad Odema, Mohammad Abdullah Al Faruque

Abstract

Advances in Internet of Things (IoT) wearable sensors and edge-artificial intelligence (Edge-AI) have enabled practical realizations of machine learning (ML)-enabled mobile sensing applications like human activity recognition (HAR). The effective deployment of these data-driven models necessitates learning robust representations capable of handling prevalent distribution shifts (DS), including new users, device positions, rotations, and more. In that respect, contrastive learning (CL) has shown promise in learning transformation-invariant features, outperforming traditional HAR methods. However, recent findings reveal that the contrastive loss induces shrinkage and expansion of the feature space, which may limit the generalization capacity of the model. To address this, we propose DisCovHAR, a contrastive attention method that selectively applies the contrastive loss to a subset of the feature space through the transformer encoder's attention mechanism. Extensive experiments on three HAR datasets (DSADS, PAMAP2, and USCHAD) demonstrate its superiority over state-of-the-art methods. Specifically, our approach yields up to 4.47% and 7.82% average accuracy improvements in subject-wise and position-wise generalization settings. Furthermore, DisCovHAR demonstrates up to 5.07% increased robustness compared to prior methods under multivariate distribution shift scenarios.
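The core idea of applying a contrastive loss only to a subset of the feature space, with that subset chosen by attention weights, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name, the top-k selection rule, and the InfoNCE-style loss are illustrative assumptions standing in for the paper's contrastive attention mechanism.

```python
import numpy as np

def selective_contrastive_loss(z1, z2, attn, k, tau=0.5):
    """Hypothetical sketch: apply an InfoNCE-style contrastive loss to only
    the k feature dimensions with the highest attention scores, instead of
    the full feature space (selection rule and names are illustrative)."""
    # Keep only the top-k dimensions ranked by attention weight.
    idx = np.argsort(attn)[-k:]
    a, b = z1[:, idx], z2[:, idx]
    # L2-normalize the selected sub-features of both augmented views.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    # Temperature-scaled cosine similarity between all cross-view pairs.
    sim = a @ b.T / tau                                   # shape (N, N)
    # InfoNCE: sample i's positive is its counterpart i in the other view.
    logits = sim - sim.max(axis=1, keepdims=True)         # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 16))   # view-1 embeddings, batch of 8, 16 dims
z2 = rng.normal(size=(8, 16))   # view-2 embeddings (augmented counterparts)
attn = rng.random(16)           # per-dimension attention scores (assumed given)
loss = selective_contrastive_loss(z1, z2, attn, k=8)
```

Dimensions outside the selected subset receive no contrastive gradient in this sketch, which is one way the feature-space shrinkage and expansion described in the abstract could be confined to part of the representation.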

Suggested Citation
Luke Chen, Mohanad Odema and Mohammad Abdullah Al Faruque (2025) “DisCovHAR: Contrastive Attention for Human Activity Recognition Under Distribution Shifts”, IEEE Internet of Things Journal, 12(12), pp. 21973–21983. doi: 10.1109/JIOT.2025.3551263.