Slot Attention-based Feature Filtering for Few-Shot Learning
By: Javier Rodenas, Eduardo Aguilar, Petia Radeva
Potential Business Impact:
Filters out junk to help computers learn from few examples.
Irrelevant features can significantly degrade few-shot learning performance. Few-shot learning must match query and support images based on meaningful similarities despite the limited data. However, in this process, non-relevant features such as background elements can easily lead to confusion and misclassification. To address this issue, we propose Slot Attention-based Feature Filtering for Few-Shot Learning (SAFF), which leverages slot attention mechanisms to discriminate and filter weak features, thereby improving few-shot classification performance. The key innovation of SAFF lies in its integration of slot attention with patch embeddings, unifying class-aware slots into a single attention mechanism to filter irrelevant features effectively. We introduce a similarity matrix computed across support and query images to quantify the relevance of filtered embeddings for classification. Through experiments, we demonstrate that Slot Attention performs better than other attention mechanisms, capturing discriminative features while reducing irrelevant information. We validate our approach through extensive experiments on few-shot learning benchmarks: CIFAR-FS, FC100, miniImageNet and tieredImageNet, outperforming several state-of-the-art methods.
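The abstract describes two pieces: slots attending over patch embeddings to pool away irrelevant features, and a similarity matrix between filtered support and query embeddings. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the slot vectors, dimensions, and cosine-similarity choice are illustrative assumptions.

```python
import numpy as np

def slot_filter(patches, slots):
    """Pool patch embeddings into slot-filtered embeddings.

    patches: (N, D) patch embeddings from a backbone (assumed).
    slots:   (K, D) learned class-aware slot vectors (assumed).
    Each slot softmax-attends over the N patches, so patches that
    match no slot contribute little to the pooled output.
    """
    d = patches.shape[1]
    logits = slots @ patches.T / np.sqrt(d)            # (K, N)
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)            # rows sum to 1
    return attn @ patches                              # (K, D) filtered embeddings

def similarity_matrix(support, query):
    """Cosine similarity between filtered support and query embeddings."""
    s = support / np.linalg.norm(support, axis=1, keepdims=True)
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    return s @ q.T                                     # (K, K) similarity scores

rng = np.random.default_rng(0)
slots = rng.standard_normal((4, 16))                   # 4 hypothetical slots
support_patches = rng.standard_normal((9, 16))         # 9 patches per image
query_patches = rng.standard_normal((9, 16))

sup_emb = slot_filter(support_patches, slots)
qry_emb = slot_filter(query_patches, slots)
sim = similarity_matrix(sup_emb, qry_emb)              # per-slot support/query scores
```

In the paper the slots are trained end-to-end and the similarity matrix feeds the few-shot classifier; this sketch only shows the shape of the computation.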
Similar Papers
Stochastic-based Patch Filtering for Few-Shot Learning
CV and Pattern Recognition
Helps computers identify food from few pictures.
Smoothing Slot Attention Iterations and Recurrences
CV and Pattern Recognition
Makes AI better at spotting objects in videos.
Unsupervised Structural Scene Decomposition via Foreground-Aware Slot Attention with Pseudo-Mask Guidance
CV and Pattern Recognition
Finds objects in pictures better by ignoring background.