OG-PCL: Efficient Sparse Point Cloud Processing for Human Activity Recognition
By: Jiuqi Yan, Chendong Xu, Dongyu Liu
Potential Business Impact:
Lets radar see what people are doing.
Human activity recognition (HAR) with millimeter-wave (mmWave) radar offers a privacy-preserving and robust alternative to camera- and wearable-based approaches. In this work, we propose the Occupancy-Gated Parallel-CNN Bi-LSTM (OG-PCL) network to process the sparse 3D radar point clouds produced by mmWave sensing. Designed for lightweight deployment, OG-PCL has only 0.83M parameters and achieves 91.75% accuracy on the RadHAR dataset, outperforming existing baselines such as 2D CNN, PointNet, and 3D CNN methods. Through ablation studies, we validate the advantages of the tri-view parallel structure in preserving spatial information across all three dimensions while maintaining efficiency. We further introduce the Occupancy-Gated Convolution (OGConv) block and demonstrate the necessity of its occupancy compensation mechanism for handling sparse point clouds. The proposed OG-PCL thus offers a compact yet accurate framework for real-time radar-based HAR on lightweight platforms.
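The abstract does not spell out how OGConv's occupancy compensation works, but one plausible reading (a common pattern for sparse inputs, similar in spirit to partial convolutions) is to rescale each convolution response by the inverse occupancy of its receptive field, so responses over mostly-empty regions are not diluted by zero cells. The sketch below is a hypothetical NumPy illustration of that idea, not the paper's actual implementation; the function name, scaling rule, and single-channel 2D setting are all assumptions.

```python
import numpy as np

def occupancy_gated_conv2d(feat, kernel):
    """Hypothetical sketch of an occupancy-compensated 2D convolution:
    the plain convolution response is rescaled by (window size / number
    of occupied cells), so sparse receptive fields are not underweighted.
    This is an illustrative guess at OGConv's mechanism, not the paper's code."""
    kh, kw = kernel.shape
    h, w = feat.shape
    occ = (feat != 0).astype(float)              # binary occupancy mask
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = feat[i:i + kh, j:j + kw]
            n_occ = occ[i:i + kh, j:j + kw].sum()
            if n_occ > 0:
                # compensate for sparsity: scale up by the empty fraction
                out[i, j] = (patch * kernel).sum() * (kh * kw / n_occ)
    return out

# Example: a 4x4 grid with a single occupied cell; a plain 3x3 sum conv
# would return 2.0, while the gated version compensates to 2.0 * 9 = 18.0.
grid = np.zeros((4, 4))
grid[1, 1] = 2.0
result = occupancy_gated_conv2d(grid, np.ones((3, 3)))
```

Under this reading, the tri-view parallel structure would apply such a branch to each of the three 2D projections (e.g. XY, XZ, YZ) of the voxelized point cloud before fusing features for the Bi-LSTM over time.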
Similar Papers
Enhanced Sparse Point Cloud Data Processing for Privacy-aware Human Action Recognition
CV and Pattern Recognition
Radar sees your actions without cameras.
Open-Set Gait Recognition from Sparse mmWave Radar Point Clouds
CV and Pattern Recognition
Helps computers recognize people by how they walk.
Multi-Head Adaptive Graph Convolution Network for Sparse Point Cloud-Based Human Activity Recognition
CV and Pattern Recognition
Helps robots understand actions without cameras.