Paper Title

Attend And Discriminate: Beyond the State-of-the-Art for Human Activity Recognition using Wearable Sensors

Paper Authors

Alireza Abedin, Mahsa Ehsanpour, Qinfeng Shi, Hamid Rezatofighi, Damith C. Ranasinghe

Paper Abstract

Wearables are fundamental to improving our understanding of human activities, especially for an increasing number of healthcare applications from rehabilitation to fine-grained gait analysis. Although our collective know-how to solve Human Activity Recognition (HAR) problems with wearables has progressed immensely with end-to-end deep learning paradigms, several fundamental opportunities remain overlooked. We rigorously explore these new opportunities to learn enriched and highly discriminating activity representations. We propose: i) learning to exploit the latent relationships between multi-channel sensor modalities and specific activities; ii) investigating the effectiveness of data-agnostic augmentation for multi-modal sensor data streams to regularize deep HAR models; and iii) incorporating a classification loss criterion to encourage minimal intra-class representation differences whilst maximising inter-class differences to achieve more discriminative features. Our contributions achieve new state-of-the-art performance on four diverse activity recognition benchmarks by large margins, with up to 6% relative improvement. We validate the contributions of our design concepts through extensive experiments, including activity misalignment measures, ablation studies, and insights shared through both quantitative and qualitative studies.
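
The third contribution describes a loss that pulls same-class feature representations together while keeping different classes apart. One common criterion with exactly this behaviour is center loss (Wen et al., 2016) combined with cross-entropy; the sketch below illustrates that idea under this assumption and is not the authors' exact implementation. All names and hyper-parameter values (e.g. `lambda_center`, `feat_dim`) are illustrative.

```python
# Minimal sketch of a center-loss style joint objective (assumption:
# a Wen et al. 2016 center loss; not necessarily the paper's criterion).
import torch
import torch.nn as nn

class CenterLoss(nn.Module):
    """Pulls each sample's feature vector toward a learnable per-class center,
    shrinking intra-class variance; cross-entropy handles class separation."""
    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        # One learnable center vector per activity class.
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Squared Euclidean distance between each feature and its class center.
        centers = self.centers[labels]                      # (batch, feat_dim)
        return 0.5 * ((features - centers) ** 2).sum(dim=1).mean()

cross_entropy = nn.CrossEntropyLoss()
center_loss = CenterLoss(num_classes=10, feat_dim=128)
lambda_center = 0.003  # hypothetical weighting hyper-parameter

def total_loss(logits, features, labels):
    # Joint objective: discriminative logits plus compact features.
    return cross_entropy(logits, labels) + lambda_center * center_loss(features, labels)

# Usage with dummy data:
feats = torch.randn(4, 128)           # penultimate-layer features
logits = torch.randn(4, 10)           # classifier outputs
labels = torch.randint(0, 10, (4,))   # activity labels
loss = total_loss(logits, feats, labels)
```

In practice the class centers are optimized jointly with the network, so the weighting term trades off feature compactness against classification accuracy.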
