Lightweight Hopfield Neural Networks for Bioacoustic Detection and Call Monitoring of Captive Primates

Published: November 4, 2025 | arXiv ID: 2511.11615v1

By: Wendy Lomas, Andrew Gascoyne, Colin Dubreuil and more

Potential Business Impact:

Automatically listens to animal calls to monitor their welfare.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Passive acoustic monitoring is a sustainable method of monitoring wildlife and environments, but it generates large datasets and, currently, a processing backlog. Academic research into automating this process has focused on resource-intensive convolutional neural networks, which require large pre-labelled datasets for training and lack flexibility in application. We present a viable alternative relevant in both wild and captive settings: a transparent, lightweight and fast-to-train associative memory AI model with Hopfield neural network (HNN) architecture. Adapted from a model developed to detect bat echolocation calls, this model monitors vocalisations of captive endangered black-and-white ruffed lemurs (Varecia variegata). Lemur social calls of interest for welfare monitoring are stored in the HNN in order to detect further call instances across the larger acoustic dataset. We make significant model improvements by storing an additional signal caused by movement and achieve an overall accuracy of 0.94. The model can perform 340 classifications per second, processing over 5.5 hours of audio data per minute, on a standard laptop running other applications. It has broad applicability and trains in milliseconds. Our lightweight solution reduces data-to-insight turnaround times and can accelerate decision making in both captive and wild settings.
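To give a sense of why an associative-memory approach trains so quickly, below is a minimal sketch (not the authors' code) of a classical Hopfield network that stores a few reference call patterns via a Hebbian rule and flags an audio segment as a call when recall converges close to a stored pattern. The binarised "call feature" vectors are hypothetical stand-ins for whatever representation the paper derives from spectrograms; the threshold and pattern length are illustrative assumptions.

```python
# Illustrative Hopfield associative memory for call detection (sketch only).
import numpy as np

def train_hopfield(patterns):
    """Build a Hebbian weight matrix from bipolar (+1/-1) stored patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / patterns.shape[0]

def recall(W, probe, steps=10):
    """Synchronously update the probe state until it settles (or steps run out)."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

def is_call(W, stored, probe, threshold=0.9):
    """Flag a segment as a call if recall lands near any stored pattern."""
    settled = recall(W, probe)
    overlaps = stored @ settled / stored.shape[1]   # normalised similarity to each stored call
    return overlaps.max() >= threshold

# Toy usage: store two "call" patterns, then test a noisy copy of one.
rng = np.random.default_rng(0)
stored = rng.choice([-1, 1], size=(2, 256))          # stand-ins for binarised call features
W = train_hopfield(stored)                           # "training" is a single outer-product pass
noisy = stored[0] * rng.choice([1, -1], 256, p=[0.9, 0.1])  # ~10% of bits flipped
print(is_call(W, stored, noisy))                     # expected: True
```

Because "training" is just an outer-product accumulation over a handful of stored patterns, it completes in milliseconds, and classification reduces to a few matrix-vector products, which is consistent with the hundreds of classifications per second reported in the abstract.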

Country of Origin
🇬🇧 United Kingdom

Page Count
16 pages

Category
Computer Science: Sound