Research Article
Open Access
CC BY

SNAP-HAR: Signal-Neural Adaptive Processing for Robust Human Activity Recognition

Huazhen Liu 1*
1 Branksome Hall, Toronto, Ontario, Canada
*Corresponding author: jennyliu080306@gmail.com
Published on 11 November 2025

Abstract

Human Activity Recognition (HAR) using wearable sensors is essential for healthcare and smart-home applications, yet real-world deployment remains challenging due to sensor noise. Current self-supervised methods apply uniform architectures regardless of signal quality, leading to poor performance on noisy data. This paper presents SNAP-HAR, a framework implementing Signal-Neural Adaptive Processing that jointly optimizes signal preprocessing and neural architectures based on dataset-specific noise characteristics. Power spectral density analyses quantify orders-of-magnitude differences in noise levels between laboratory-preprocessed (UCI-HAR) and real-world (USC-HAD, MotionSense) datasets. Our adaptive processing achieves consistent improvements across all configurations, with gains of up to 17.0% on clean data and 24.6% under noisy conditions. Most significantly, SNAP-HAR raises real-world performance to a 0.9121 F1-score, closely approaching the 0.9276 laboratory baseline. This convergence demonstrates that robust HAR is achievable through adaptive signal-neural processing, closing the deployment gap that has historically limited practical applications. The consistency of these improvements indicates that our approach provides architecture-agnostic enhancements applicable across self-supervised paradigms.
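To illustrate the kind of power-spectral-density analysis described above, the following minimal sketch (in Python, using SciPy's Welch estimator) computes a simple high-frequency noise indicator for an accelerometer trace. The function name, the 50 Hz sampling rate, and the 10 Hz cutoff are illustrative assumptions for this sketch, not the paper's actual SNAP-HAR pipeline.

# Minimal sketch (not the paper's implementation): estimating a per-dataset
# noise indicator from the power spectral density of raw accelerometer data.
# Sampling rate, cutoff, and names are assumptions chosen for illustration.
import numpy as np
from scipy.signal import welch

def high_freq_noise_power(x: np.ndarray, fs: float = 50.0,
                          cutoff_hz: float = 10.0) -> float:
    """Return the mean PSD above `cutoff_hz`, a rough proxy for sensor noise.

    Most body-motion energy in HAR lies below roughly 10 Hz, so power above
    that band is dominated by sensor noise and motion artifacts.
    """
    freqs, psd = welch(x, fs=fs, nperseg=min(256, len(x)))
    return float(psd[freqs > cutoff_hz].mean())

# Example: compare a laboratory-filtered trace with a raw, noisy one.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 50.0)                      # 10 s at 50 Hz
clean = np.sin(2 * np.pi * 1.5 * t)                 # walking-like 1.5 Hz motion
noisy = clean + 0.5 * rng.standard_normal(t.size)   # same motion plus sensor noise

print(high_freq_noise_power(clean), high_freq_noise_power(noisy))
# The noisy trace's high-frequency power is orders of magnitude larger,
# which is the kind of gap the paper's PSD analysis quantifies across datasets.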

Keywords:

Human Activity Recognition, Signal Denoising, Self-supervised Learning, Dataset Adaptation, Noise Robustness

Cite this article

Liu, H. (2025). SNAP-HAR: Signal-Neural Adaptive Processing for Robust Human Activity Recognition. Applied and Computational Engineering, 204, 38-45.

Data availability

The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.

About volume

Volume title: Proceedings of CONF-MLA 2025 Symposium: Intelligent Systems and Automation: AI Models, IoT, and Robotic Algorithms

ISBN: 978-1-80590-517-2 (Print) / 978-1-80590-518-9 (Online)
Editor: Hisham AbouGrad
Conference date: 12 November 2025
Series: Applied and Computational Engineering
Volume number: Vol.204
ISSN: 2755-2721 (Print) / 2755-273X (Online)