Self-Supervised Neural Architecture Search for Multimodal Deep Neural Networks
By: Shota Suzuki, Satoshi Ono
Neural architecture search (NAS), which automates the architectural design of deep neural networks (DNNs), has attracted increasing attention. Multimodal DNNs, which must fuse features from multiple modalities, particularly benefit from NAS because of their structural complexity; however, searching an architecture for a multimodal DNN with NAS requires a substantial amount of labeled training data. This paper therefore proposes a self-supervised learning (SSL) method for architecture search of multimodal DNNs. The proposed method applies SSL throughout both the architecture search and model pretraining processes. Experimental results demonstrated that the proposed method successfully designed DNN architectures from unlabeled training data.
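The abstract does not spell out the search procedure, so the following is a rough illustrative sketch only: one common way to combine these ingredients is a DARTS-style differentiable search over candidate fusion operations, optimized with a SimCLR-style contrastive loss on unlabeled multimodal pairs. The candidate operations, the NT-Xent loss, and the single-level optimization below are assumptions for illustration, not the authors' actual method.

```python
# Hypothetical sketch: self-supervised search over multimodal fusion ops.
# All op choices, dimensions, and the loss are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedFusion(nn.Module):
    """DARTS-style mixed op: softmax-weighted sum of candidate fusion ops."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU()),  # concat + MLP
            nn.Identity(),                                      # element-wise sum (applied in forward)
            nn.Bilinear(dim, dim, dim),                         # bilinear fusion
        ])
        # Architecture parameters: one logit per candidate fusion op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, a, b):
        w = F.softmax(self.alpha, dim=0)
        outs = [
            self.ops[0](torch.cat([a, b], dim=-1)),
            a + b,
            self.ops[2](a, b),
        ]
        return sum(wi * oi for wi, oi in zip(w, outs))

def nt_xent(z1, z2, tau=0.5):
    """SimCLR-style contrastive loss between two augmented views."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / tau
    n = z1.size(0)
    sim.fill_diagonal_(float("-inf"))  # mask self-similarity
    # Positive for row i is its other view: i+n for the first half, i-n for the second.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy search step: per-modality encoders + searched fusion, no labels needed.
dim = 32
enc_a, enc_b = nn.Linear(64, dim), nn.Linear(48, dim)
fusion = MixedFusion(dim)
params = list(enc_a.parameters()) + list(enc_b.parameters()) + list(fusion.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

xa, xb = torch.randn(16, 64), torch.randn(16, 48)   # unlabeled multimodal batch
view = lambda x: x + 0.1 * torch.randn_like(x)      # stand-in augmentation
z1 = fusion(enc_a(view(xa)), enc_b(view(xb)))
z2 = fusion(enc_a(view(xa)), enc_b(view(xb)))
loss = nt_xent(z1, z2)
opt.zero_grad(); loss.backward(); opt.step()
print("searched fusion weights:", F.softmax(fusion.alpha, dim=0).tolist())
```

In a full bilevel search the architecture logits alpha would typically be updated on held-out data while the operation weights train on the rest; the single optimizer above collapses that into one step for brevity.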
Similar Papers
Efficient Global Neural Architecture Search
CV and Pattern Recognition
Searches the global space of neural architectures more efficiently.
HHNAS-AM: Hierarchical Hybrid Neural Architecture Search using Adaptive Mutation Policies
Machine Learning (CS)
Searches hierarchical hybrid neural architectures for text tasks using adaptive mutation policies.
LLM-Driven Composite Neural Architecture Search for Multi-Source RL State Encoding
Machine Learning (CS)
Uses an LLM to search composite architectures that encode multi-source states for reinforcement learning.