Interaction-via-Actions: Cattle Interaction Detection with Joint Learning of Action-Interaction Latent Space
By: Ren Nakagawa, Yang Yang, Risa Shinoda, and more
This paper introduces a method and application for automatically detecting behavioral interactions between grazing cattle from a single image, which is essential for smart livestock management in the cattle industry, such as detecting estrus. Although interaction detection for humans has been actively studied, cattle interaction detection poses a non-trivial challenge: no comprehensive behavioral dataset that includes interactions exists, because interactions between grazing cattle are rare events. We therefore propose CattleAct, a data-efficient method for interaction detection that decomposes interactions into combinations of actions by individual cattle. Specifically, we first learn an action latent space from a large-scale cattle action dataset. We then embed rare interactions by fine-tuning the pre-trained latent space with contrastive learning, constructing a unified latent space of actions and interactions. On top of the proposed method, we develop a practical working system integrating video and GPS inputs. Experiments on a commercial-scale pasture demonstrate that our method detects interactions more accurately than the baselines. Our implementation is available at https://github.com/rakawanegan/CattleAct.
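The contrastive fine-tuning step described above can be illustrated with a minimal sketch. The abstract does not specify the exact loss, so the InfoNCE-style objective, the embedding dimensionality, and all function names below are assumptions for illustration: an interaction embedding (anchor) is pulled toward the embedding of its constituent action pair (positive) and pushed away from unrelated embeddings (negatives) in the shared latent space.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative sketch, not the
    paper's exact objective). Lower loss means the anchor is closer
    to its positive than to the negatives in the latent space."""
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Similarities of the anchor to the positive and each negative.
    sims = [cos(anchor, positive)] + [cos(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature

    # Numerically stable softmax; the positive sits at index 0.
    logits -= logits.max()
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

# Hypothetical 2-D embeddings for illustration only.
interaction = np.array([1.0, 0.0])          # anchor: rare interaction
matched_actions = np.array([0.9, 0.1])      # positive: constituent action pair
unrelated = [np.array([0.0, 1.0]),          # negatives: unrelated behaviors
             np.array([-1.0, 0.0])]

aligned_loss = info_nce_loss(interaction, matched_actions, unrelated)
misaligned_loss = info_nce_loss(interaction, unrelated[0], [matched_actions] + unrelated[1:])
```

In a real pipeline this loss would drive gradient updates to the pre-trained action encoder; the sketch only evaluates the objective to show that an aligned interaction/action pair yields a lower loss than a misaligned one.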
Similar Papers
Cattle-CLIP: A Multimodal Framework for Cattle Behaviour Recognition
CV and Pattern Recognition
Helps farmers monitor cattle health with cameras.
Automatic Retrieval of Specific Cows from Unlabeled Videos
CV and Pattern Recognition
Identifies cows automatically from videos without deep learning.
Classification of Cattle Behavior and Detection of Heat (Estrus) using Sensor Data
Machine Learning (CS)
Helps farmers detect when cows are in estrus and ready to breed.