Exploring Complementarity and Explainability in CNNs for Periocular Verification Across Acquisition Distances
By: Fernando Alonso-Fernandez, Kevin Hernandez Diaz, Jose M. Buades and more
Potential Business Impact:
Identifies people by the region around their eyes, even from far away.
We study the complementarity of different CNNs for periocular verification at different distances on the UBIPr database. We train three architectures of increasing complexity (SqueezeNet, MobileNetv2, and ResNet50) on a large set of eye crops from VGGFace2. We analyse performance with cosine and chi-square (χ²) metrics, compare different network initialisations, and apply score-level fusion via logistic regression. In addition, we use LIME heatmaps and Jensen-Shannon divergence to compare attention patterns of the CNNs. While ResNet50 consistently performs best individually, the fusion provides substantial gains, especially when combining all three networks. Heatmaps show that networks usually focus on distinct regions of a given image, which explains their complementarity. Our method significantly outperforms previous works on UBIPr, achieving a new state-of-the-art.
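To make the pipeline concrete, the sketch below illustrates the three ingredients named in the abstract: comparison scores between embeddings (cosine and χ²), score-level fusion via logistic regression, and Jensen-Shannon divergence between attention heatmaps. It is a minimal, hypothetical example with random data standing in for the real CNN embeddings, scores, and LIME heatmaps; the score orientation (higher = more similar), the χ² assumption of non-negative features, and all variable names are our own assumptions, not the paper's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def cosine_score(a, b):
    """Cosine similarity between two embedding vectors (higher = more similar)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def chi2_score(a, b, eps=1e-10):
    """Negated chi-square distance, assuming non-negative features (e.g. post-ReLU),
    so that higher means more similar, matching the cosine score orientation."""
    return float(-0.5 * np.sum((a - b) ** 2 / (a + b + eps)))

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two attention heatmaps,
    flattened and normalised to probability distributions."""
    p = p.ravel().astype(float) / (p.sum() + eps)
    q = q.ravel().astype(float) / (q.sum() + eps)
    m = 0.5 * (p + q)
    kl = lambda x, y: np.sum(x * np.log((x + eps) / (y + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# --- Toy illustration with random data in place of real CNN outputs ---
rng = np.random.default_rng(0)

# Per-network comparison scores for labelled genuine (1) / impostor (0) pairs;
# columns would correspond to SqueezeNet, MobileNetv2 and ResNet50.
train_scores = rng.normal(size=(200, 3))
train_labels = rng.integers(0, 2, size=200)

# Score-level fusion via logistic regression: one learned weight per network.
fusion = LogisticRegression().fit(train_scores, train_labels)
test_scores = rng.normal(size=(5, 3))
fused = fusion.predict_proba(test_scores)[:, 1]   # fused similarity per test pair
print(fused)

# Compare attention maps of two networks on the same image.
heatmap_a = rng.random((64, 64))
heatmap_b = rng.random((64, 64))
print(js_divergence(heatmap_a, heatmap_b))
```

In this reading, a low Jensen-Shannon divergence would mean two networks attend to similar regions, while a high value indicates distinct attention patterns, which is the property the abstract links to fusion complementarity.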
Similar Papers
Leveraging Large-Scale Face Datasets for Deep Periocular Recognition via Ocular Cropping
CV and Pattern Recognition
Identifies people by the area around their eyes.
nnMobileNet++: Towards Efficient Hybrid Networks for Retinal Image Analysis
CV and Pattern Recognition
Helps doctors find eye diseases faster and better.
CLRecogEye: Curriculum Learning towards exploiting convolution features for Dynamic Iris Recognition
CV and Pattern Recognition
Makes eye scans work even when blurry or tilted.