Network of Theseus (like the ship)
By: Vighnesh Subramaniam, Colin Conwell, Boris Katz, and more
Potential Business Impact:
Changes computer brains without losing their smarts.
A standard assumption in deep learning is that the inductive bias introduced by a neural network architecture must persist from training through inference: the architecture you train with is the architecture you deploy. This assumption deters the community from selecting architectures that may have desirable efficiency or design properties but are difficult to optimize. We challenge this assumption with Network of Theseus (NoT), a method for progressively converting a trained, or even untrained, guide network architecture part-by-part into an entirely different target network architecture while preserving the performance of the guide network. At each stage, components of the guide network architecture are incrementally replaced with target architecture modules and aligned via representational similarity metrics. This procedure largely preserves the functionality of the guide network even under substantial architectural changes, for example, converting a convolutional network into a multilayer perceptron, or GPT-2 into a recurrent neural network. By decoupling optimization from deployment, NoT expands the space of viable inference-time architectures, opening opportunities for better accuracy-efficiency tradeoffs and enabling more directed exploration of the architectural design space.
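To make the stage-wise replacement idea concrete, here is a minimal, hypothetical sketch in NumPy (not the authors' implementation): a fixed two-stage network stands in for the trained guide, a linear layer stands in for a target module replacing its first stage, and the module is aligned to the guide's intermediate activations with a simple mean-squared-error similarity objective before being stitched into the rest of the guide. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Guide" network: a fixed two-stage MLP standing in for a trained model.
W1 = rng.normal(size=(8, 16)) * 0.5
W2 = rng.normal(size=(16, 4))
def guide_stage1(x): return np.tanh(x @ W1)
def guide_stage2(h): return h @ W2

# Target module (hypothetical choice): a linear layer replacing stage 1.
Wt = rng.normal(size=(8, 16)) * 0.01

X = rng.normal(size=(256, 8))
H_guide = guide_stage1(X)  # reference representations from the guide

# Align the replacement module to the guide's activations (MSE objective),
# standing in for the paper's representational similarity alignment.
lr = 0.1
losses = []
for _ in range(300):
    H_t = X @ Wt
    diff = H_t - H_guide
    losses.append(float(np.mean(diff ** 2)))
    Wt -= lr * (X.T @ diff) / len(X)  # gradient step on the MSE loss

# After alignment, stitch the target module into the remaining guide stages.
y_guide = guide_stage2(H_guide)
y_stitched = guide_stage2(X @ Wt)
print(f"alignment loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In the full method each replaced component would itself be a nontrivial module (e.g. an RNN block substituting for a transformer block), and the procedure repeats part-by-part until the entire guide has been converted to the target architecture.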
Similar Papers
PLATONT: Learning a Platonic Representation for Unified Network Tomography
Machine Learning (CS)
Finds hidden internet problems using many clues.
On Effectiveness of Graph Neural Network Architectures for Network Digital Twins (NDTs)
Networking and Internet Architecture
Simulates networks to predict and boost user speed.
Finite-Agent Stochastic Differential Games on Large Graphs: II. Graph-Based Architectures
Machine Learning (CS)
Helps computers solve complex games faster.