Lightweight Channel Attention for Efficient CNNs

Published: January 2, 2026 | arXiv ID: 2601.01002v1

By: Prem Babu Kanaparthi, Tulasi Venkata Sri Varshini Padamata

Potential Business Impact:

Lets image-recognition models keep competitive accuracy while using fewer parameters and less inference time, easing deployment on phones, cameras, and other resource-constrained devices.

Business Areas:
Image Recognition, Data and Analytics, Software

Attention mechanisms have become integral to modern convolutional neural networks (CNNs), delivering notable performance improvements with minimal computational overhead. However, the efficiency-accuracy trade-off of different channel attention designs remains underexplored. This work presents an empirical study comparing Squeeze-and-Excitation (SE), Efficient Channel Attention (ECA), and a proposed Lite Channel Attention (LCA) module across ResNet-18 and MobileNetV2 architectures on CIFAR-10. LCA employs adaptive one-dimensional convolutions with grouped operations to reduce parameter usage while preserving effective attention behavior. Experimental results show that LCA achieves competitive accuracy, reaching 94.68% on ResNet-18 and 93.10% on MobileNetV2, while matching ECA in parameter efficiency and maintaining favorable inference latency. Comprehensive benchmarks, including FLOPs, parameter counts, and GPU latency measurements, are provided, offering practical insights for deploying attention-enhanced CNNs in resource-constrained environments.
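
The abstract names two ingredients but not the exact module definition, so the following PyTorch sketch is only an illustration of the general idea: a global average-pooled channel descriptor passed through a grouped one-dimensional convolution whose kernel size adapts to the channel count (the heuristic is borrowed from the ECA paper). The class name LiteChannelAttention, the default group count, and the kernel-size formula are assumptions for illustration, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn


class LiteChannelAttention(nn.Module):
    """Hypothetical sketch of a grouped, ECA-style channel-attention block."""

    def __init__(self, channels: int, groups: int = 4,
                 gamma: float = 2.0, beta: float = 1.0):
        super().__init__()
        assert channels % groups == 0, "channels must be divisible by groups"
        # Adaptive kernel size (ECA-style heuristic, assumed here):
        # grows logarithmically with the channel count, forced odd.
        k = int(abs((math.log2(channels) + beta) / gamma))
        k = k if k % 2 else k + 1
        self.groups = groups
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Grouped 1D conv over the channel descriptor: each group of
        # channels shares one small kernel, keeping parameters at groups * k.
        self.conv = nn.Conv1d(groups, groups, kernel_size=k,
                              padding=k // 2, groups=groups, bias=False)
        self.gate = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # (B, C, 1, 1) -> (B, groups, C // groups): descriptor split into groups.
        y = self.pool(x).view(b, self.groups, c // self.groups)
        y = self.conv(y)                    # mix neighboring channels per group
        y = self.gate(y).view(b, c, 1, 1)   # per-channel gates in [0, 1]
        return x * y                        # rescale feature maps channel-wise


if __name__ == "__main__":
    # Gate a ResNet-style feature map: shape is preserved, channels reweighted.
    feats = torch.randn(2, 64, 8, 8)
    lca = LiteChannelAttention(channels=64)
    print(lca(feats).shape)  # torch.Size([2, 64, 8, 8])
```

In a design like this, the gate adds only a handful of weights per stage (groups times the kernel size), which is consistent with the abstract's claim of matching ECA's parameter efficiency while keeping the per-channel rescaling behavior of SE.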

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
6 pages

Category
Computer Science:
Computer Vision and Pattern Recognition