Efficient Aspect Term Extraction using Spiking Neural Network
By: Abhishek Kumar Mishra, Arya Somasundaram, Anup Das, and more
Potential Business Impact:
Helps computers understand opinions using less power.
Aspect Term Extraction (ATE) identifies aspect terms in review sentences, a key subtask of sentiment analysis. While most existing approaches frame ATE as sequence labeling with energy-intensive deep neural networks (DNNs), this paper proposes a more energy-efficient alternative using Spiking Neural Networks (SNNs). Through sparse activations and event-driven inference, SNNs capture temporal dependencies between words, making them well suited to ATE. The proposed architecture, SpikeATE, employs ternary spiking neurons and direct spike training, fine-tuned with pseudo-gradients. Evaluated on four benchmark SemEval datasets, SpikeATE achieves performance comparable to state-of-the-art DNNs with significantly lower energy consumption, positioning SNNs as a practical and sustainable choice for ATE.
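The summary describes SpikeATE only at a high level: ternary spiking neurons, sequence labeling over review tokens, and direct spike training with pseudo-gradients (surrogate gradients). The sketch below is a minimal illustration of those ingredients, not the paper's implementation; it assumes PyTorch, and the layer names, threshold, membrane decay, and rectangular surrogate window are hypothetical choices.

```python
import torch
import torch.nn as nn


class TernarySpike(torch.autograd.Function):
    """Ternary spike activation in {-1, 0, +1} with a pseudo-gradient
    (rectangular surrogate) used in place of the non-differentiable step."""

    @staticmethod
    def forward(ctx, membrane, threshold):
        ctx.save_for_backward(membrane)
        ctx.threshold = threshold
        pos = (membrane >= threshold).float()
        neg = (membrane <= -threshold).float()
        return pos - neg  # spike values: +1, 0, or -1

    @staticmethod
    def backward(ctx, grad_output):
        (membrane,) = ctx.saved_tensors
        threshold = ctx.threshold
        window = 0.5  # hypothetical surrogate width
        # Pass gradient only when the membrane is near either firing threshold.
        near = ((membrane - threshold).abs() < window) | ((membrane + threshold).abs() < window)
        return grad_output * near.float(), None


class TernaryLIFLayer(nn.Module):
    """Leaky integrate-and-fire layer with ternary spikes, processing a
    word-embedding sequence one token (time step) at a time."""

    def __init__(self, in_dim, hidden_dim, threshold=1.0, decay=0.5):
        super().__init__()
        self.fc = nn.Linear(in_dim, hidden_dim)
        self.threshold = threshold
        self.decay = decay

    def forward(self, x):  # x: (batch, seq_len, in_dim)
        batch, seq_len, _ = x.shape
        mem = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(seq_len):
            mem = self.decay * mem + self.fc(x[:, t])
            s = TernarySpike.apply(mem, self.threshold)
            mem = mem - s * self.threshold  # soft reset after firing
            spikes.append(s)
        return torch.stack(spikes, dim=1)  # (batch, seq_len, hidden_dim)


if __name__ == "__main__":
    # Toy usage: embed a 12-token "sentence" and tag each token (e.g. B/I/O aspect tags).
    emb = nn.Embedding(100, 32)
    spiking = TernaryLIFLayer(32, 64)
    tagger = nn.Linear(64, 3)
    tokens = torch.randint(0, 100, (1, 12))
    logits = tagger(spiking(emb(tokens)))
    print(logits.shape)  # torch.Size([1, 12, 3])
```

Treating each token as a time step is what lets the event-driven membrane dynamics carry temporal dependencies between words; the sparse ternary spikes are what the summary credits for the energy savings, since most values passed between layers are zero.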
Similar Papers
STAS: Spatio-Temporal Adaptive Computation Time for Spiking Transformers
Machine Learning (CS)
Makes AI see faster and use less power.
Temporal Dynamics Enhancer for Directly Trained Spiking Object Detectors
Computer Vision and Pattern Recognition
Helps AI see moving things better and faster.
Efficient Eye-based Emotion Recognition via Neural Architecture Search of Time-to-First-Spike-Coded Spiking Neural Networks
Neural and Evolutionary Computing
Glasses read your feelings using less power.